Domain age | n/a |
Expiry date | n/a |
PR | 4 |
IKS | |
Pages in Google | 1090 |
Pages in Yandex | 57 |
Dmoz | |
Yandex Catalog | No |
Alexa Traffic Rank | 21089470 |
Alexa Country | No data |
INAP - The International Network for Acid Prevention
n/a
INAP is an organization of international mining companies dedicated to reducing liabilities associated with sulphide mine materials.
UTF-8
36.46 KB
661
4 979 chars
4 190 chars
Data provided by the semrush service
Site | Common phrases | PR | tIC | Alexa Rank | Alexa Country |
---|---|---|---|---|---|
n/a | 7 | 9 | 0 | 9 | n/a |
n/a | 5 | 8 | 0 | 113 | n/a |
n/a | 5 | 9 | 0 | 5 | n/a |
n/a | 4 | 9 | 0 | 2 | n/a |
n/a | 4 | 5 | 0 | 26477 | n/a |
n/a | 3 | 7 | 0 | 610 | n/a |
n/a | 3 | 5 | 0 | 38810 | n/a |
n/a | 3 | 8 | 0 | 7483 | n/a |
n/a | 3 | 4 | 0 | 9450661 | n/a |
n/a | 3 | 1 | 0 | No data | n/a |

Data provided by the semrush service
[Querying whois.ausregistry.net.au]
[whois.ausregistry.net.au]
Domain Name: inap.com.au
Last Modified: 02-Dec-2012 20:57:00 UTC
Registrar ID: Melbourne IT
Registrar Name: Melbourne IT
Status: ok
Registrant: INAP Ltd
Registrant ID: OTHER 084 719 184
Eligibility Type: Other
Registrant Contact ID: Z115923734916794
Registrant Contact Name: Franca Tomaras
Registrant Contact Email: Visit whois.ausregistry.com.au for Web based WhoIs
Tech Contact ID: Z115923734916794
Tech Contact Name: Franca Tomaras
Tech Contact Email: Visit whois.ausregistry.com.au for Web based WhoIs
Name Server: ns5.ns0.com
Name Server: ns480.pair.com
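The whois record above is flat `Key: Value` text. A minimal Python sketch (assuming that line format, and using a shortened copy of the record rather than a live port-43 query) of turning it into a dict, with repeated keys such as `Name Server` collected into a list:

```python
# Parse a flat "Key: Value" whois record into a dict.
# Repeated keys (e.g. "Name Server") accumulate into lists.
record = """\
[Querying whois.ausregistry.net.au]
Domain Name: inap.com.au
Last Modified: 02-Dec-2012 20:57:00 UTC
Registrar Name: Melbourne IT
Status: ok
Registrant: INAP Ltd
Name Server: ns5.ns0.com
Name Server: ns480.pair.com
"""

def parse_whois(text: str) -> dict:
    data: dict = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "[", "%")):
            continue  # skip comments and "[Querying ...]" banners
        key, sep, value = line.partition(":")
        if not sep:
            continue  # not a Key: Value line
        key, value = key.strip(), value.strip()
        if key in data:  # repeated key -> promote to list
            if not isinstance(data[key], list):
                data[key] = [data[key]]
            data[key].append(value)
        else:
            data[key] = value
    return data

info = parse_whois(record)
print(info["Domain Name"])   # inap.com.au
print(info["Name Server"])   # ['ns5.ns0.com', 'ns480.pair.com']
```

Note that `partition(":")` splits only at the first colon, so values containing colons (like the `Last Modified` timestamp) survive intact.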
#
# robots.txt for http://www.wikipedia.org/ and friends
#
# Please note: There are a lot of pages on this site, and there are
# some misbehaved spiders out there that go _way_ too fast. If you're
# irresponsible, your access to the site may be blocked.
#
# advertising-related bots:
User-agent: Mediapartners-Google*
Disallow: /
# Wikipedia work bots:
User-agent: IsraBot
Disallow:
User-agent: Orthogaffe
Disallow:
# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /
User-agent: DOC
Disallow: /
User-agent: Zao
Disallow: /
# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /
User-agent: Zealbot
Disallow: /
User-agent: MSIECrawler
Disallow: /
User-agent: SiteSnagger
Disallow: /
User-agent: WebStripper
Disallow: /
User-agent: WebCopier
Disallow: /
User-agent: Fetch
Disallow: /
User-agent: Offline Explorer
Disallow: /
User-agent: Teleport
Disallow: /
User-agent: TeleportPro
Disallow: /
User-agent: WebZIP
Disallow: /
User-agent: linko
Disallow: /
User-agent: HTTrack
Disallow: /
User-agent: Microsoft.URL.Control
Disallow: /
User-agent: Xenu
Disallow: /
User-agent: larbin
Disallow: /
User-agent: libwww
Disallow: /
User-agent: ZyBORG
Disallow: /
User-agent: Download Ninja
Disallow: /
#
# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /
#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /
#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /
#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /
# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /
# Don't allow the wayback machine to index user-pages
#User-agent: ia_archiver
#Disallow: /wiki/User
#Disallow: /wiki/Benutzer
#
# Friendly, low-speed bots are welcome viewing article pages, but not
# dynamically-generated pages please.
#
# Inktomi's "Slurp" can read a minimum delay between hits; if your
# bot supports such a thing using the 'Crawl-delay' or another
# instruction, please let us know.
#
User-agent: *
Disallow: /gardwiki/
Disallow: /trap/
Disallow: /gardwiki/Special:Random
Disallow: /gardwiki/Special%3ARandom
Disallow: /gardwiki/Special:Search
Disallow: /gardwiki/Special%3ASearch
Disallow: /gardwiki/mediawiki:Checkuser/
Crawl-delay: 1
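The rules above can be exercised with Python's standard `urllib.robotparser`. A small sketch (the user-agent names and paths below are illustrative, and only a condensed excerpt of the file is fed to the parser) showing how a blanket ban, the catch-all `*` section, and `Crawl-delay` are interpreted:

```python
from urllib.robotparser import RobotFileParser

# A condensed excerpt of the robots.txt above, parsed directly
# instead of being fetched over the network.
rules = """\
User-agent: WebZIP
Disallow: /

User-agent: *
Disallow: /gardwiki/
Disallow: /trap/
Crawl-delay: 1
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("WebZIP", "/wiki/Main_Page"))    # False: banned outright
print(rp.can_fetch("SomeBot", "/wiki/Main_Page"))   # True: articles are allowed
print(rp.can_fetch("SomeBot", "/gardwiki/Recent"))  # False: dynamic pages blocked
print(rp.crawl_delay("SomeBot"))                    # 1, from the '*' section
```

A well-behaved crawler would sleep `crawl_delay()` seconds between hits, which is exactly what the comments in the file ask of bots like wget.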
USA - 66.96.160.153
The Endurance International Group
The Endurance International Group
HTTP/1.1 200 OK
Server: nginx/1.14.1
Date: Tue, 24 Dec 2019 01:42:22 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
X-Pingback:
Link: ; rel="https://api.w.org/", ; rel=shortlink
X-Server-Cache: true
X-Proxy-Cache: EXPIRED
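A raw response head like the one above splits into a status line plus an RFC 822-style header block, so Python's standard library can parse it without any HTTP client. A minimal sketch (using `email.message_from_string` on a shortened copy of the headers above; `http.client` uses the same `email` machinery internally):

```python
from email import message_from_string

# Shortened copy of the response head shown above.
raw = """\
HTTP/1.1 200 OK
Server: nginx/1.14.1
Date: Tue, 24 Dec 2019 01:42:22 GMT
Content-Type: text/html; charset=UTF-8
Transfer-Encoding: chunked
Connection: keep-alive
X-Proxy-Cache: EXPIRED
"""

# Split off "HTTP/1.1 200 OK", then parse the rest as headers.
status_line, _, header_block = raw.partition("\n")
version, code, reason = status_line.split(" ", 2)
headers = message_from_string(header_block)

print(code, reason)                   # 200 OK
print(headers["Server"])              # nginx/1.14.1
print(headers.get_content_charset())  # utf-8, from the Content-Type header
```

Header lookup on the resulting `Message` object is case-insensitive, matching HTTP semantics.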