Domain age | 22 years |
Expiry date | Registration expired |
PR | 3 |
Yandex SQI (ИКС) | |
Pages indexed in Google | 28 |
Pages indexed in Yandex | 13 |
DMOZ | No |
Yandex Catalog | No |
Alexa Traffic Rank | 9239470 |
Alexa Country | No data |
Scuba Diving Nanaimo Vancouver Island British Columbia Canada
dive, vancouver island, bc, british columbia, scuba
Rodale's Readers' Choice Awards have consistently ranked British Columbia as 'the' best place to dive in North America! Come on in and see how we can help plan your next diving trip
ISO-8859-1
39.17 KB
1 202
8 644 chars
7 170 chars
Data provided by SEMrush
Site | Shared phrases | PR | tIC | Alexa Rank | Alexa Country
---|---|---|---|---|---
wikipedia.org | 10 | 9 | 0 | 5 | 6
tripadvisor.com | 10 | 7 | 0 | 264 | 81
scubadiving.com | 9 | 7 | 0 | 127335 | 42431
youtube.com | 9 | 9 | 0 | 2 | 2
hellobc.com | 8 | 7 | 0 | 220012 | 9535
harbourliving.ca | 6 | 4 | 0 | 1532225 | 38248
bcferries.com | 6 | 6 | 20 | 59436 | 1662
padi.com | 5 | 6 | 600 | 30474 | 15320
viu.ca | 5 | 6 | 50 | 45249 | 1426
vancouverisland.travel | 5 | 5 | 0 | 1062283 | 112561
Domain Name: DIVINGBC.COM
Registry Domain ID: 87833614_DOMAIN_COM-VRSN
Registrar WHOIS Server: whois.enom.com
Registrar URL: http://www.enom.com
Updated Date: 2018-06-22T22:08:11Z
Creation Date: 2002-06-24T22:32:14Z
Registry Expiry Date: 2019-06-24T22:37:27Z
Registrar: eNom, Inc.
Registrar IANA ID: 48
Registrar Abuse Contact Email:
Registrar Abuse Contact Phone:
Domain Status: clientTransferProhibited https://icann.org/epp#clientTransferProhibited
Name Server: NS1.MATTIJEWEL.COM
Name Server: NS2.MATTIJEWEL.COM
URL of the ICANN Whois Inaccuracy Complaint Form: https://www.icann.org/wicf/
>>> Last update of whois database: 2018-10-15T21:10:57Z <<<
For more information on Whois status codes, please visit https://icann.org/epp
NOTICE: The expiration date displayed in this record is the date the
registrar's sponsorship of the domain name registration in the registry is
currently set to expire. This date does not necessarily reflect the expiration
date of the domain name registrant's agreement with the sponsoring
registrar. Users may consult the sponsoring registrar's Whois database to
view the registrar's reported date of expiration for this registration.
TERMS OF USE: You are not authorized to access or query our Whois
database through the use of electronic processes that are high-volume and
automated except as reasonably necessary to register domain names or
modify existing registrations; the Data in VeriSign Global Registry
Services' ("VeriSign") Whois database is provided by VeriSign for
information purposes only, and to assist persons in obtaining information
about or related to a domain name registration record. VeriSign does not
guarantee its accuracy. By submitting a Whois query, you agree to abide
by the following terms of use: You agree that you may use this Data only
for lawful purposes and that under no circumstances will you use this Data
to: (1) allow, enable, or otherwise support the transmission of mass
unsolicited, commercial advertising or solicitations via e-mail, telephone,
or facsimile; or (2) enable high volume, automated, electronic processes
that apply to VeriSign (or its computer systems). The compilation,
repackaging, dissemination or other use of this Data is expressly
prohibited without the prior written consent of VeriSign. You agree not to
use electronic processes that are automated and high-volume to access or
query the Whois database except as reasonably necessary to register
domain names or modify existing registrations. VeriSign reserves the right
to restrict your access to the Whois database in its sole discretion to ensure
operational stability. VeriSign may restrict or terminate your access to the
Whois database for failure to abide by these terms of use. VeriSign
reserves the right to modify these terms at any time.
The Registry database contains ONLY .COM, .NET, .EDU domains and
Registrars.
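The WHOIS record above carries machine-readable RFC 3339 timestamps (Creation Date, Registry Expiry Date). A minimal sketch of turning those fields into dates you can compute with, using only the standard library — the dates are copied verbatim from the record above:

```python
from datetime import datetime, timezone

def parse_whois_date(value: str) -> datetime:
    """Parse an RFC 3339 timestamp as it appears in WHOIS output,
    e.g. '2002-06-24T22:32:14Z'."""
    return datetime.strptime(value, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

# Dates taken verbatim from the WHOIS record above.
created = parse_whois_date("2002-06-24T22:32:14Z")   # Creation Date
expires = parse_whois_date("2019-06-24T22:37:27Z")   # Registry Expiry Date

# Age of the registration at its (then-current) expiry date, in whole years.
age_years = (expires - created).days // 365
print(age_years)  # 17
```

Note that the WHOIS snapshot itself is dated 2018, so the registration span it describes (2002 to 2019) is older data than the headline metrics at the top of the report.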
################################################################################
# For more examples see http://www.robotstxt.org/wc/exclusion-admin.html
# Default settings below allow all agents access to all objects in the webserver
################################################################################
User-agent: *
################################################################################
# LATEST CHANGES TO THE ROBOTS.TXT STANDARD INCLUDE #
################################################################################
################################################################################
# Crawl-delay:
################################################################################
# Crawl-delay: 60 would tell any robots named in the preceding User-agent:
# section that they must wait 60 seconds between requests
################################################################################
Crawl-delay: 120
################################################################################
# Sitemap:
################################################################################
# Specifying the Sitemap location in your robots.txt file
# You can specify the location of the Sitemap using a robots.txt file. To do
# this, simply add the following line:
# Sitemap: <sitemap_location>
# The <sitemap_location> should be the complete URL to the Sitemap, such as:
# http://www.example.com/sitemap.xml
# This directive is independent of the user-agent line, so it doesn't matter
# where you place it in your file. If you have a Sitemap index file, you can
# include the location of just that file. You don't need to list each individual
# Sitemap listed in the index file.
################################################################################
################################################################################
# Disallow: /cgi-bin/
# Disallow: /images/
# Disallow: basket.aspx
################################################################################
################################################################################
# The Format
################################################################################
# The format and semantics of the "/robots.txt" file are as follows:
#
# The file consists of one or more records separated by one or more blank lines
# (terminated by CR,CR/NL, or NL). Each record contains lines of the form
# <field>:<optionalspace><value><optionalspace>
# The field name is case insensitive.
################################################################################
################################################################################
# Comments
################################################################################
# Comments can be included in file using UNIX bourne shell conventions: the '#'
# character is used to indicate that preceding space (if any) and the remainder
# of the line up to the line termination is discarded. Lines containing only a
# comment are discarded completely, and therefore do not indicate a record
# boundary.
################################################################################
################################################################################
# Records
################################################################################
# The record starts with one or more User-agent lines, followed by one or more
# Disallow lines, as detailed below. Unrecognised headers are ignored.
################################################################################
################################################################################
# User-agent
################################################################################
# The value of this field is the name of the robot the record is describing
# access policy for.
# If more than one User-agent field is present the record describes an identical
# access policy for more than one robot. At least one field needs to be present
# per record.
# The robot should be liberal in interpreting this field. A case insensitive
# substring match of the name without version information is recommended.
# If the value is '*', the record describes the default access policy for any
# robot that has not matched any of the other records. It is not allowed to
# have multiple such records in the "/robots.txt" file.
################################################################################
################################################################################
# Disallow
################################################################################
# The value of this field specifies a partial URL that is not to be visited.
# This can be a full path, or a partial path; any URL that starts with this
# value will not be retrieved. For example,
# Disallow: /help disallows both /help.html and /help/index.html
# Disallow: /help/ would disallow /help/index.html but allow /help.html.
# An empty value indicates that all URLs can be retrieved.
# At least one Disallow field needs to be present in a record.
################################################################################
################################################################################
# The presence of an empty "/robots.txt" file has no explicit associated
# semantics, it will be treated as if it was not present, i.e. all robots will
# consider themselves welcome.
################################################################################
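The effective directives in the file above boil down to `User-agent: *` plus `Crawl-delay: 120` (there are no Disallow lines, so all URLs are fetchable). A minimal sketch of how a crawler would honour this, using Python's standard `urllib.robotparser`; the user-agent string `MyBot/1.0` is a hypothetical example:

```python
import urllib.robotparser

# The effective content of the robots.txt shown above, comments stripped.
robots_txt = """\
User-agent: *
Crawl-delay: 120
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# No Disallow lines, so every URL may be fetched...
print(rp.can_fetch("MyBot/1.0", "http://www.divingbc.com/"))  # True
# ...but a polite crawler should wait 120 seconds between requests.
print(rp.crawl_delay("MyBot/1.0"))  # 120
```

Note that `Crawl-delay` is a de facto extension, not part of the original robots.txt standard quoted in the comments above; not all crawlers respect it.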
USA - 70.87.90.194
Softlayer
ThePlanet.com Internet Services
HTTP/1.1 200 OK
Date: Sun, 22 Dec 2019 00:23:11 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
Set-Cookie: __cfduid=dc50fa762410797fd13d8c11ea6ed439c1576974191; expires=Tue, 21-Jan-20 00:23:11 GMT; path=/; domain=.divingbc.com; HttpOnly; SameSite=Lax
Last-Modified: Thu, 29 Nov 2012 18:58:21 GMT
Accept-Ranges: bytes
CF-Cache-Status: DYNAMIC
Server: cloudflare
CF-RAY: 548df215c89ee1ce-ORD
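A raw header block like the one above is easy to split into a status line and name/value pairs; partitioning each line on the first `:` keeps colons inside values (such as the timestamp in `Date`) intact. A minimal sketch over a subset of the headers shown:

```python
# A subset of the raw response shown above.
raw = """HTTP/1.1 200 OK
Date: Sun, 22 Dec 2019 00:23:11 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
Server: cloudflare"""

# First line is the status line; the rest are header fields.
status_line, _, header_block = raw.partition("\n")
version, code, reason = status_line.split(" ", 2)

headers = {}
for line in header_block.splitlines():
    # Split on the first colon only, so values containing ':' survive.
    name, _, value = line.partition(":")
    headers[name.strip()] = value.strip()

print(code)               # 200
print(headers["Server"])  # cloudflare
```

A plain dict cannot hold repeated header names (e.g. multiple `Set-Cookie` lines), so production code should use `http.client`'s message object or a multidict instead of this sketch.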