Domain age | 18 years |
Expiry date | Registration expired |
IKS (Yandex Site Quality Index) | n/a |
Pages in Google | 2 |
Pages in Yandex | 1 |
Dmoz | No |
Yandex Catalog | No |
Alexa Traffic Rank | No data |
Alexa Country | No data |
New Page 1
n/a
n/a
WINDOWS-1252
0.38 KB
2
10 chars.
8 chars.
Domain name:
borntokill.co.uk
Data validation:
Nominet was able to match the registrant's name and address against a 3rd party data source on 17-Dec-2015
Registrar:
123-Reg Limited t/a 123-reg [Tag = 123-REG]
URL: http://www.123-reg.co.uk
Relevant dates:
Registered on: 18-Sep-2005
Expiry date: 18-Sep-2023
Last updated: 22-Aug-2018
Registration status:
Registered until expiry date.
Name servers:
ns1.fasthostingdirect.co.uk 79.170.40.2
ns2.fasthostingdirect.co.uk 79.170.43.3
WHOIS lookup made at 01:34:00 16-Oct-2018
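The WHOIS record above is plain text with `Key: value` lines; a minimal sketch (field names taken from the record above, function name hypothetical) that pulls the relevant dates out of such a response:

```python
import re

def parse_whois_dates(text: str) -> dict:
    """Extract the 'Relevant dates' fields from a Nominet-style WHOIS response."""
    fields = {}
    for key in ("Registered on", "Expiry date", "Last updated"):
        # Match the key followed by a colon and the date token.
        m = re.search(rf"{key}:\s*(\S+)", text)
        if m:
            fields[key] = m.group(1)
    return fields

record = """\
Relevant dates:
    Registered on: 18-Sep-2005
    Expiry date: 18-Sep-2023
    Last updated: 22-Aug-2018
"""
print(parse_whois_dates(record))
# {'Registered on': '18-Sep-2005', 'Expiry date': '18-Sep-2023', 'Last updated': '22-Aug-2018'}
```

The same pattern extends to any other `Key:` line in the record, e.g. `Registrar` or `Registration status`.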
--
# $Id: robots.txt,v 1.9 2007/06/27 22:37:44 goba Exp $
#
# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used: http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html
# File identity -
# So that the contents of this file are copied only once to DocumentRoot/robots.txt at
# installation time the file must contain a unique string.
# The copy_robots_text.pl program looks for a line with a string of the form "hhh ddd vvv ooo"
# where hhh is '#'
# ddd is 'Drupal'
# vvv is 'version'
# ooo is some other chars to make it unique. For instance, using the version and installation directory (see 3 lines down).
# If copy_robots_text.pl finds that string in DocumentRoot/robots.txt it assumes that the data is already there and does not copy again.
# DO NOT REMOVE THE NEXT LINE - it prevents copying to DocumentRoot more than once.
# Drupal version 6.2 /drupal/
User-agent: *
Crawl-delay: 10
# Directories
Disallow: /*
Disallow: /drupal/database/
Disallow: /drupal/includes/
Disallow: /drupal/misc/
Disallow: /drupal/modules/
Disallow: /drupal/sites/
Disallow: /drupal/themes/
Disallow: /drupal/scripts/
Disallow: /drupal/updates/
Disallow: /drupal/profiles/
Disallow: /drupal/_files/
# Files
Disallow: /drupal/xmlrpc.php
Disallow: /drupal/cron.php
Disallow: /drupal/update.php
Disallow: /drupal/install.php
# Paths (clean URLs)
Disallow: /drupal/admin/
Disallow: /drupal/comment/reply/
Disallow: /drupal/contact/
Disallow: /drupal/logout/
Disallow: /drupal/node/add/
Disallow: /drupal/search/
Disallow: /drupal/user/register/
Disallow: /drupal/user/password/
Disallow: /drupal/user/login/
# Paths (no clean URLs)
Disallow: /drupal/?q=admin/
Disallow: /drupal/?q=comment/reply/
Disallow: /drupal/?q=contact/
Disallow: /drupal/?q=logout/
Disallow: /drupal/?q=node/add/
Disallow: /drupal/?q=search/
Disallow: /drupal/?q=user/password/
Disallow: /drupal/?q=user/register/
Disallow: /drupal/?q=user/login/
# Drupal version 6.12 /josh/
User-agent: *
Crawl-delay: 10
# Directories
Disallow: /josh/database/
Disallow: /josh/includes/
Disallow: /josh/misc/
Disallow: /josh/modules/
Disallow: /josh/sites/
Disallow: /josh/themes/
Disallow: /josh/scripts/
Disallow: /josh/updates/
Disallow: /josh/profiles/
Disallow: /josh/_files/
# Files
Disallow: /josh/xmlrpc.php
Disallow: /josh/cron.php
Disallow: /josh/update.php
Disallow: /josh/install.php
Disallow: /josh/_files/INSTALL.txt
Disallow: /josh/_files/INSTALL.mysql.txt
Disallow: /josh/_files/INSTALL.pgsql.txt
Disallow: /josh/_files/CHANGELOG.txt
Disallow: /josh/_files/MAINTAINERS.txt
Disallow: /josh/_files/LICENSE.txt
Disallow: /josh/_files/UPGRADE.txt
# Paths (clean URLs)
Disallow: /josh/admin/
Disallow: /josh/comment/reply/
Disallow: /josh/contact/
Disallow: /josh/logout/
Disallow: /josh/node/add/
Disallow: /josh/search/
Disallow: /josh/user/register/
Disallow: /josh/user/password/
Disallow: /josh/user/login/
# Paths (no clean URLs)
Disallow: /josh/?q=admin/
Disallow: /josh/?q=comment/reply/
Disallow: /josh/?q=contact/
Disallow: /josh/?q=logout/
Disallow: /josh/?q=node/add/
Disallow: /josh/?q=search/
Disallow: /josh/?q=user/password/
Disallow: /josh/?q=user/register/
Disallow: /josh/?q=user/login/
# Drupal version 6.15 /drupal/
User-agent: *
Crawl-delay: 10
# Directories
Disallow: /drupal/database/
Disallow: /drupal/includes/
Disallow: /drupal/misc/
Disallow: /drupal/modules/
Disallow: /drupal/sites/
Disallow: /drupal/themes/
Disallow: /drupal/scripts/
Disallow: /drupal/updates/
Disallow: /drupal/profiles/
Disallow: /drupal/_files/
# Files
Disallow: /drupal/xmlrpc.php
Disallow: /drupal/cron.php
Disallow: /drupal/update.php
Disallow: /drupal/install.php
Disallow: /drupal/_files/INSTALL.txt
Disallow: /drupal/_files/INSTALL.mysql.txt
Disallow: /drupal/_files/INSTALL.pgsql.txt
Disallow: /drupal/_files/CHANGELOG.txt
Disallow: /drupal/_files/MAINTAINERS.txt
Disallow: /drupal/_files/LICENSE.txt
Disallow: /drupal/_files/UPGRADE.txt
# Paths (clean URLs)
Disallow: /drupal/admin/
Disallow: /drupal/comment/reply/
Disallow: /drupal/contact/
Disallow: /drupal/logout/
Disallow: /drupal/node/add/
Disallow: /drupal/search/
Disallow: /drupal/user/register/
Disallow: /drupal/user/password/
Disallow: /drupal/user/login/
# Paths (no clean URLs)
Disallow: /drupal/?q=admin/
Disallow: /drupal/?q=comment/reply/
Disallow: /drupal/?q=contact/
Disallow: /drupal/?q=logout/
Disallow: /drupal/?q=node/add/
Disallow: /drupal/?q=search/
Disallow: /drupal/?q=user/password/
Disallow: /drupal/?q=user/register/
Disallow: /drupal/?q=user/login/
# Drupal version 6.15 /slt2/
User-agent: *
Crawl-delay: 10
# Directories
Disallow: /slt2/database/
Disallow: /slt2/includes/
Disallow: /slt2/misc/
Disallow: /slt2/modules/
Disallow: /slt2/sites/
Disallow: /slt2/themes/
Disallow: /slt2/scripts/
Disallow: /slt2/updates/
Disallow: /slt2/profiles/
Disallow: /slt2/_files/
# Files
Disallow: /slt2/xmlrpc.php
Disallow: /slt2/cron.php
Disallow: /slt2/update.php
Disallow: /slt2/install.php
Disallow: /slt2/_files/INSTALL.txt
Disallow: /slt2/_files/INSTALL.mysql.txt
Disallow: /slt2/_files/INSTALL.pgsql.txt
Disallow: /slt2/_files/CHANGELOG.txt
Disallow: /slt2/_files/MAINTAINERS.txt
Disallow: /slt2/_files/LICENSE.txt
Disallow: /slt2/_files/UPGRADE.txt
# Paths (clean URLs)
Disallow: /slt2/admin/
Disallow: /slt2/comment/reply/
Disallow: /slt2/contact/
Disallow: /slt2/logout/
Disallow: /slt2/node/add/
Disallow: /slt2/search/
Disallow: /slt2/user/register/
Disallow: /slt2/user/password/
Disallow: /slt2/user/login/
# Paths (no clean URLs)
Disallow: /slt2/?q=admin/
Disallow: /slt2/?q=comment/reply/
Disallow: /slt2/?q=contact/
Disallow: /slt2/?q=logout/
Disallow: /slt2/?q=node/add/
Disallow: /slt2/?q=search/
Disallow: /slt2/?q=user/password/
Disallow: /slt2/?q=user/register/
Disallow: /slt2/?q=user/login/
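The directives dumped above can be evaluated programmatically with Python's standard `urllib.robotparser`; a minimal sketch, run against a small excerpt of the rules (paths taken from the /josh/ block above):

```python
import urllib.robotparser

# A short excerpt of the robots.txt rules dumped above.
robots_txt = """\
User-agent: *
Crawl-delay: 10
Disallow: /josh/admin/
Disallow: /josh/user/login/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallow rules are prefix matches against the URL path.
print(rp.can_fetch("*", "/josh/admin/settings"))  # False: under a disallowed prefix
print(rp.can_fetch("*", "/josh/node/42"))         # True: no rule matches
print(rp.crawl_delay("*"))                        # 10
```

Note that the first block in the dump begins with `Disallow: /*`, which (for crawlers that honor wildcards) excludes the entire site and would explain why only a couple of pages are indexed.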
United Kingdom - Leeds - 79.170.40.177
Host Europe GmbH
Heart Internet Ltd
HTTP/1.1 200 OK
Date: Sat, 16 Feb 2019 13:23:53 GMT
Server: Apache/2.4.37 (Unix)
Last-Modified: Mon, 21 Jul 2008 11:20:21 GMT
ETag: "187-45286e4966740"
Accept-Ranges: bytes
Content-Length: 391
Content-Type: text/html
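The raw HTTP response above is a status line followed by `Name: value` header lines; a minimal sketch (variable names hypothetical) parsing such a block into a dict:

```python
# Raw response headers, as captured above (abridged).
raw = """\
HTTP/1.1 200 OK
Date: Sat, 16 Feb 2019 13:23:53 GMT
Server: Apache/2.4.37 (Unix)
Content-Length: 391
Content-Type: text/html
"""

# First line is the status line; the rest are headers.
status_line, *header_lines = raw.strip().splitlines()
headers = {}
for line in header_lines:
    name, _, value = line.partition(": ")  # split at the first ': ' only
    headers[name] = value

print(status_line)               # HTTP/1.1 200 OK
print(headers["Content-Type"])   # text/html
```

Splitting with `partition` rather than `split(":")` keeps header values that themselves contain colons (such as the `Date` timestamp) intact.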