| Metric | Value |
|---|---|
| PR | 6 |
| ИКС (Yandex Site Quality Index) | n/a |
| Pages indexed in Google | 457,000 |
| Pages indexed in Yandex | 1,000 |
| DMOZ | No |
| Yandex Catalog | No |
| Alexa Traffic Rank | 50,542 |
| Alexa Country | not captured (flag image) |
DLI Home
n/a
n/a
UTF-8
25.08 KB
473
3,755 chars
3,150 chars
| Counter | Visitors in 24 hours | Page views | Views per visitor |
|---|---|---|---|
| not captured | No access | No access | n/a |
# The FULL URL to the DSpace sitemaps
# The http://www.dli.ernet.in:8081/dli will be auto-filled with the value in dspace.cfg
# XML sitemap is listed first as it is preferred by most search engines
Sitemap: http://www.dli.ernet.in:8081/dli/sitemap
Sitemap: http://www.dli.ernet.in:8081/dli/htmlmap
##########################
# Default Access Group
# (NOTE: blank lines are not allowable in a group record)
##########################
User-agent: *
# Disable access to Discovery search and filters
Disallow: /discover
Disallow: /search-filter
#
# Optionally uncomment the following line ONLY if sitemaps are working
# and you have verified that your site is being indexed correctly.
# Disallow: /browse
# Disallow: /handle/2015/*/browse
#
# If you have configured DSpace (Solr-based) Statistics to be publicly
# accessible, then you may not want this content to be indexed
# Disallow: /statistics
#
# You also may wish to disallow access to the following paths, in order
# to stop web spiders from accessing user-based content
# Disallow: /contact
# Disallow: /feedback
# Disallow: /forgot
# Disallow: /login
# Disallow: /register
##############################
# Section for misbehaving bots
# The following directives to block specific robots were borrowed from Wikipedia's robots.txt
##############################
# advertising-related bots:
User-agent: Mediapartners-Google*
Disallow: /
# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /
User-agent: DOC
Disallow: /
User-agent: Zao
Disallow: /
# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /
User-agent: Zealbot
Disallow: /
User-agent: MSIECrawler
Disallow: /
User-agent: SiteSnagger
Disallow: /
User-agent: WebStripper
Disallow: /
User-agent: WebCopier
Disallow: /
User-agent: Fetch
Disallow: /
User-agent: Offline Explorer
Disallow: /
User-agent: Teleport
Disallow: /
User-agent: TeleportPro
Disallow: /
User-agent: WebZIP
Disallow: /
User-agent: linko
Disallow: /
User-agent: HTTrack
Disallow: /
User-agent: Microsoft.URL.Control
Disallow: /
User-agent: Xenu
Disallow: /
User-agent: larbin
Disallow: /
User-agent: libwww
Disallow: /
User-agent: ZyBORG
Disallow: /
User-agent: Download Ninja
Disallow: /
# Misbehaving: requests much too fast:
User-agent: fast
Disallow: /
#
# If your DSpace is going down because of someone using recursive wget,
# you can activate the following rule.
#
# If your own faculty is bringing down your dspace with recursive wget,
# you can advise them to use the --wait option to set the delay between hits.
#
#User-agent: wget
#Disallow: /
#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /
#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /
#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /
# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /
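To sanity-check how a compliant crawler would read the rules above, the file can be fed to Python's standard urllib.robotparser. A minimal sketch, assuming the robots.txt is served from the conventional site-root location (that URL, and the sample handle path, are assumptions for illustration, not values stated in the report):

```python
# Minimal sketch: interpret the robots.txt above with Python's standard
# urllib.robotparser. The robots.txt URL is an assumed conventional
# location, not taken from the report.
from urllib.robotparser import RobotFileParser

BASE = "http://www.dli.ernet.in:8081"            # assumed host and port

rp = RobotFileParser()
rp.set_url(BASE + "/robots.txt")
rp.read()                                        # fetch and parse the live file

# /discover and /search-filter are disallowed for every agent;
# /handle/2015/1 is a hypothetical item path used only for contrast.
for path in ("/discover", "/search-filter", "/handle/2015/1"):
    print(path, "allowed for Googlebot:", rp.can_fetch("Googlebot", BASE + path))

# A blanket-blocked agent such as HTTrack is denied the whole site.
print("/ allowed for HTTrack:", rp.can_fetch("HTTrack", BASE + "/"))
```

With the rules shown above, the first two paths should report False, the hypothetical handle path True, and HTTrack should be denied for /.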
HTTP/1.1 200 OK
Date: Tue, 29 Jan 2019 00:35:27 GMT
Server: Apache/2.4.6 (CentOS) OpenSSL/1.0.1e-fips PHP/5.4.16 mod_wsgi/3.4 Python/2.7.5 mod_jk/1.2.41
Set-Cookie: JSESSIONID=93E631374E06D31C4E3A139CCCD60D09; Path=/; HttpOnly
X-Cocoon-Version: 2.2.0
Content-Language: en
Content-Length: 25686
Content-Type: text/html;charset=utf-8
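The response snapshot above can be reproduced with a plain GET request; a minimal sketch using only the Python standard library (values such as Date, Set-Cookie, and Content-Length will naturally differ between requests):

```python
# Minimal sketch: repeat the header check with a plain GET request using
# Python's standard library; the URL is the one the report was generated for.
from urllib.request import Request, urlopen

URL = "http://www.dli.ernet.in:8081/dli"

with urlopen(Request(URL), timeout=30) as resp:
    print(resp.status, resp.reason)            # status line, e.g. "200 OK"
    for name, value in resp.getheaders():      # Server, Content-Type, ...
        print(f"{name}: {value}")
```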