Domain age | 14 years
Expiration date | Registration expired
PR | 2
IKS (Yandex SQI) | n/a
Pages in Google | 39
Pages in Yandex | 0
Dmoz | No
Yandex Catalog | No
Alexa Traffic Rank | 29079714
Alexa Country | No data
Where to Play | Pickleball Calgary
n/a
Where to play pickleball in Calgary. A list of independent places to play recreational and competitive pickleball at all levels.
UTF-8
38.17 KB
1 570
11 917 chars
10 006 chars
Domain Name: PICKLEBALLCALGARY.COM
Registry Domain ID: NA
Registrar WHOIS Server: whois.dreamhost.com
Registrar URL: www.dreamhost.com
Updated Date: 2014-05-01 00:30:10Z
Creation Date: 2010-04-29 12:23:51Z
Registrar Registration Expiration Date: 2015-04-29 19:23:51Z
Registrar: DREAMHOST
Registrar IANA ID: 431
Registrar Abuse Contact Email: domain-abuse@dreamhost.com
Registrar Abuse Contact Phone: +1.2132719359
Domain Status: clientTransferProhibited
Registry Registrant ID: NEW
Registrant Name: BRENT JOHNER
Registrant Organization: N/A
Registrant Street: 10015 OAKFIELD DRIVE SW
Registrant City: CALGARY
Registrant State/Province: AB
Registrant Postal Code: T2V 1S9
Registrant Country: CA
Registrant Phone: +1.4032380687
Registrant Phone Ext:
Registrant Fax:
Registrant Fax Ext:
Registrant Email: RACQUETNETWORK@RACQUETNETWORK.COM
Registry Admin ID: NEW
Admin Name: BRENT JOHNER
Admin Organization: N/A
Admin Street: 10015 OAKFIELD DRIVE SW
Admin City: CALGARY
Admin State/Province: AB
Admin Postal Code: T2V 1S9
Admin Country: CA
Admin Phone: +1.4032380687
Admin Phone Ext:
Admin Fax:
Admin Fax Ext:
Admin Email: RACQUETNETWORK@RACQUETNETWORK.COM
Registry Tech ID: NEW
Tech Name: BRENT JOHNER
Tech Organization: N/A
Tech Street: 10015 OAKFIELD DRIVE SW
Tech City: CALGARY
Tech State/Province: AB
Tech Postal Code: T2V 1S9
Tech Country: CA
Tech Phone: +1.4032380687
Tech Phone Ext:
Tech Fax:
Tech Fax Ext:
Tech Email: RACQUETNETWORK@RACQUETNETWORK.COM
Name Server: NS1.DREAMHOST.COM
Name Server: NS2.DREAMHOST.COM
Name Server: NS3.DREAMHOST.COM
Last update of WHOIS database: 2014-05-01 00:30:10Z
DreamHost whois server terms of service: http://whois.dreamhost.com/
DreamHost is a global Web hosting and cloud services provider with over 375,000 customers and 1.2 million blogs, websites and apps hosted. The company offers a wide spectrum of Web hosting and cloud services including Shared Hosting, Virtual Private Servers (VPS), Dedicated Server Hosting, Domain Name Registration, the cloud storage service, DreamObjects, and the cloud computing service DreamCompute. Please visit http://DreamHost.com for more information.
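The WHOIS dates above can be turned into the "domain age" figure from the summary table. A minimal sketch using only the Python standard library; the date strings are copied from the record, while the reference date of 2024-06-01 is an assumption (the summary's "14 years" implies the check ran around 2024, even though the captured response headers below date from 2019):

```python
from datetime import datetime, timezone

# Dates copied verbatim from the WHOIS record above.
CREATED = "2010-04-29 12:23:51Z"
EXPIRES = "2015-04-29 19:23:51Z"

def parse_whois_date(s: str) -> datetime:
    """Parse the 'YYYY-MM-DD HH:MM:SSZ' format used in this record."""
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%SZ").replace(tzinfo=timezone.utc)

created = parse_whois_date(CREATED)
expires = parse_whois_date(EXPIRES)

# Hypothetical reference date; a tool computing "domain age" would use
# the time it ran the check.
reference = datetime(2024, 6, 1, tzinfo=timezone.utc)
age_years = (reference - created).days // 365
print(age_years)            # 14
print(expires < reference)  # True: registration has expired
```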
User-agent: Baiduspider
Disallow: /
User-agent: Baiduspider-video
Disallow: /
User-agent: 8484 Boston Project
Disallow: /
User-agent: Atomic_Email_Hunter
Disallow: /
User-agent: Attentio/Nutch
Disallow: /
User-agent: China Local Browse
Disallow: /
User-agent: ContentSmartz
Disallow: /
User-agent: DBrowse
Disallow: /
User-agent: DOC
Disallow: /
User-agent: DSurf15a 01
Disallow: /
User-agent: DSurf15a 71
Disallow: /
User-agent: DSurf15a 81
Disallow: /
User-agent: DSurf15a VA
Disallow: /
User-agent: DataCha0s
Disallow: /
User-agent: Demo Bot DOT 16b
Disallow: /
User-agent: Demo Bot Z 16b
Disallow: /
User-agent: Download Ninja
Disallow: /
User-agent: Educate Search VxB
Disallow: /
User-agent: EldoS TimelyWeb
Disallow: /
User-agent: EmailSiphon
Disallow: /
User-agent: EmailSpider
Disallow: /
User-agent: EmailWolf
Disallow: /
User-agent: Extreme Picture Finder
Disallow: /
User-agent: Fetch
Disallow: /
User-agent: Full Web Bot 0416B
Disallow: /
User-agent: Full Web Bot 0516B
Disallow: /
User-agent: Full Web Bot 2816B
Disallow: /
User-agent: Guestbook Auto Submitter
Disallow: /
User-agent: HTTrack
Disallow: /
User-agent: IUPUI Research Bot
Disallow: /
User-agent: LMQueueBot
Disallow: /
User-agent: LWP::Simple
Disallow: /
User-agent: LetsCrawl.com
Disallow: /
User-agent: Lincoln State Web Browser
Disallow: /
User-agent: MFC Foundation Class Library
Disallow: /
User-agent: MSIECrawler
Disallow: /
User-agent: MVAClient
Disallow: /
User-agent: Mac Finder
Disallow: /
User-agent: Microsoft.URL.Control
Disallow: /
User-agent: Missauga Locate
Disallow: /
User-agent: Missigua Locator
Disallow: /
User-agent: Missouri College Browse
Disallow: /
User-agent: Mizzu Labs
Disallow: /
User-agent: Mo College
Disallow: /
User-agent: NASA Search
Disallow: /
User-agent: NPBot
Disallow: /
User-agent: NameOfAgent
Disallow: /
User-agent: NationalDirectory-WebSpider
Disallow: /
User-agent: Nsauditor
Disallow: /
User-agent: Offline Explorer
Disallow: /
User-agent: PBrowse
Disallow: /
User-agent: PSurf15a 11
Disallow: /
User-agent: PSurf15a 51
Disallow: /
User-agent: PSurf15a VA
Disallow: /
User-agent: Poirot
Disallow: /
User-agent: Port Huron Labs
Disallow: /
User-agent: Production Bot 0116B
Disallow: /
User-agent: Production Bot 2016B
Disallow: /
User-agent: Production Bot DOT 3016B
Disallow: /
User-agent: Program Shareware
Disallow: /
User-agent: RSurf15a
Disallow: /
User-agent: SSurf15a
Disallow: /
User-agent: ShablastBot
Disallow: /
User-agent: SiteSnagger
Disallow: /
User-agent: Snapbot
Disallow: /
User-agent: Sogou web spider
Disallow: /
User-agent: TSurf15a
Disallow: /
User-agent: Teleport
Disallow: /
User-agent: TeleportPro
Disallow: /
User-agent: Under the Rainbow
Disallow: /
User-agent: User-Agent: Mozilla/4.0
Disallow: /
User-agent: VadixBot
Disallow: /
User-agent: WEP Search 00
Disallow: /
User-agent: WebCopier
Disallow: /
User-agent: WebReaper
Disallow: /
User-agent: WebStripper
Disallow: /
User-agent: WebZIP
Disallow: /
User-agent: Xenu
Disallow: /
User-agent: Zao
Disallow: /
User-agent: Zealbot
Disallow: /
User-agent: ZyBORG
Disallow: /
User-agent: autoemailspider
Disallow: /
User-agent: bwh3_user_agent
Disallow: /
User-agent: egothor
Disallow: /
User-agent: grub-client
Disallow: /
User-agent: iVia Page Fetcher
Disallow: /
User-agent: infoConveraCrawler
Disallow: /
User-agent: k2spider
Disallow: /
User-agent: larbin
Disallow: /
User-agent: libwww
Disallow: /
User-agent: linko
Disallow: /
User-agent: psycheclone
Disallow: /
User-agent: sogou spider
Disallow: /
User-agent: sohu agent
Disallow: /
User-agent: wget
Disallow: /
Disallow: /wp-admin
Disallow: /*.zip$
Disallow: /*.Zip$
Disallow: /*.ZIP$
User-agent: Slurp
Crawl-delay: 10
User-agent: *
Crawl-delay: 3600
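The directives above can be checked programmatically. A minimal sketch with Python's stdlib `urllib.robotparser`, run against a small inlined excerpt of the file; the excerpt behaves the same as the full file for these agents, and note the stdlib parser does not understand the non-standard `/*.zip$` wildcard patterns:

```python
from urllib.robotparser import RobotFileParser

# An excerpt of the robots.txt shown above, inlined so the sketch is
# self-contained and needs no network access.
ROBOTS = """\
User-agent: HTTrack
Disallow: /

User-agent: Slurp
Crawl-delay: 10

User-agent: *
Crawl-delay: 3600
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

# "/courts/" is a hypothetical path used purely for illustration.
print(rp.can_fetch("HTTrack", "/courts/"))      # False: blocked agent
print(rp.can_fetch("Mozilla/5.0", "/courts/"))  # True: falls through to *
print(rp.crawl_delay("Slurp"))                  # 10
print(rp.crawl_delay("SomeOtherBot"))           # 3600 (default entry)
```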
# Wikipedia work bots:
User-agent: IsraBot
Disallow:
User-agent: Orthogaffe
Disallow:
# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /
User-agent: DOC
Disallow: /
User-agent: Zao
Disallow: /
# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /
User-agent: Zealbot
Disallow: /
User-agent: MSIECrawler
Disallow: /
User-agent: SiteSnagger
Disallow: /
User-agent: WebStripper
Disallow: /
User-agent: WebCopier
Disallow: /
User-agent: Fetch
Disallow: /
User-agent: Offline Explorer
Disallow: /
User-agent: Teleport
Disallow: /
User-agent: TeleportPro
Disallow: /
User-agent: WebZIP
Disallow: /
User-agent: linko
Disallow: /
User-agent: HTTrack
Disallow: /
User-agent: Microsoft.URL.Control
Disallow: /
User-agent: Xenu
Disallow: /
User-agent: larbin
Disallow: /
User-agent: libwww
Disallow: /
User-agent: ZyBORG
Disallow: /
User-agent: Download Ninja
Disallow: /
# Misbehaving: requests much too fast:
User-agent: fast
Disallow: /
#
# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /
#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /
#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /
#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /
# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /
# Don't allow the Wayback Machine to index user-pages
#User-agent: ia_archiver
#Disallow: /wiki/User
#Disallow: /wiki/Benutzer
#
# Friendly, low-speed bots are welcome viewing article pages, but not
# dynamically-generated pages please.
#
# Inktomi's "Slurp" can read a minimum delay between hits; if your
# bot supports such a thing using the 'Crawl-delay' or another
# instruction, please let us know.
#
# There is a special exception for API mobileview to allow dynamic
# mobile web & app views to load section content.
# These views aren't HTTP-cached but use parser cache aggressively
# and don't expose special: pages etc.
#
# Another exception is for REST API documentation, located at
# /api/rest_v1/?doc.
#
HTTP/1.1 200 OK
Date: Wed, 11 Dec 2019 06:29:15 GMT
Server: Apache
Link: ; rel="https://api.w.org/", ; rel=shortlink
Content-Length: 39091
Content-Type: text/html; charset=UTF-8
X-Pad: avoid browser bug
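The captured response headers above can be inspected programmatically. A minimal sketch that parses the header block with Python's stdlib `email.parser` (HTTP/1.1 header fields share the RFC 822 field syntax, so the email parser handles them); the values are copied from the response above:

```python
from email.parser import Parser

# Header lines copied from the captured response above.
RAW_HEADERS = """\
Date: Wed, 11 Dec 2019 06:29:15 GMT
Server: Apache
Content-Length: 39091
Content-Type: text/html; charset=UTF-8
X-Pad: avoid browser bug
"""

headers = Parser().parsestr(RAW_HEADERS)
print(headers["Server"])                 # Apache
print(int(headers["Content-Length"]))    # 39091
print(headers.get_content_charset())     # utf-8
```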