How to Block Search Engine Spiders from Crawling Sensitive Data

Server Tips
2018-1-18

If a visit comes from a normal search engine spider, blocking it is not recommended; otherwise the site will lose its indexing and ranking in Baidu and other search engines, which in turn costs you visitors and customers. In many cases, however, a site is being crawled by unknown, abnormal spiders, and the webmaster may need to block them. Likewise, a newly built or freshly redesigned site may need to let the legitimate search engines crawl it quickly, so that its content is indexed and searchable as soon as possible. This is where .htaccess/httpd.conf/web.config rules (for Apache, IIS 6, and IIS 7 respectively) come in to intercept unwanted spider crawling.
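Before turning to server-level rules, note that well-behaved spiders already obey robots.txt, and the rules below deliberately leave /robots.txt reachable so that they can keep reading it. A minimal robots.txt sketch follows; the blocked user-agent name and the /admin/ path are purely illustrative placeholders:

# robots.txt, placed in the site root; polite crawlers fetch this before crawling
# (the user-agent name and the path below are examples only)
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow: /admin/

robots.txt only restrains cooperative crawlers; the rewrite rules below enforce the block at the server level for spiders that ignore it.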

Note: in the .htaccess/httpd.conf/web.config rules below, the spider (user-agent) names are the strings listed inside the pattern, separated by the | character.

 

On Linux (Apache), the rule file is .htaccess (create the .htaccess file by hand in the site root directory). The .htaccess rules are:

<IfModule mod_rewrite.c>

RewriteEngine On

#Block spider

RewriteCond %{HTTP_USER_AGENT} "Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu" [NC]

RewriteRule !(^robots\.txt$) - [F]

</IfModule>
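A quick way to confirm the rules are active (the domain below is a placeholder for your own site): because the pattern matches "curl" case-insensitively, a request sent with curl's default User-Agent should be refused, robots.txt should stay reachable, and a browser-like User-Agent should pass. The same check works for the IIS 6 and IIS 7 variants further down.

curl -I http://www.example.com/                    # expect 403: curl's default UA matches the pattern
curl -I http://www.example.com/robots.txt          # expect 200: robots.txt is exempted by the RewriteRule
curl -I -A "Mozilla/5.0" http://www.example.com/   # expect 200: a browser-like UA is not in the list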


On Windows 2003 (IIS 6), the rule file is httpd.conf. (In the virtual host control panel, use "ISAPI Filter Custom Settings" to enable the custom rewrite engine ISAPI_Rewrite 3.1.) The httpd.conf rules are:

#Block spider

RewriteCond %{HTTP_USER_AGENT} (Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|mail.RU|curl|perl|Python|Wget|Xenu|ZmEu) [NC]

RewriteRule !(^/robots.txt$) - [F]


On Windows 2008 (IIS 7), the rule file is web.config. The web.config rules are:

<?xml version="1.0" encoding="UTF-8"?>

<configuration>

    <system.webServer>

        <rewrite>

            <rules>

                <rule name="Block spider">

                    <match url="(^robots.txt$)" ignoreCase="false" negate="true" />

                    <conditions>

                        <add input="{HTTP_USER_AGENT}" pattern="Webdup|AcoonBot|AhrefsBot|Ezooms|EdisterBot|EC2LinkFinder|jikespider|Purebot|MJ12bot|WangIDSpider|WBSearchBot|Wotbox|xbfMozilla|Yottaa|YandexBot|Jorgee|SWEBot|spbot|TurnitinBot-Agent|curl|perl|Python|Wget|Xenu|ZmEu" ignoreCase="true" />

                    </conditions>

                    <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Forbidden" />

                </rule>

            </rules>

        </rewrite>

    </system.webServer>

</configuration>


Note: the rules above block a number of obscure spiders by default; to block other spiders, simply add their names to the pattern in the same way, as shown below.
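For example, to also block a spider named SemrushBot (the name is used here only as an illustration; substitute whatever user-agent actually shows up in your logs), append it to the existing name list with a |. The sketch below uses the .htaccess form with the name list shortened for readability; in practice keep all the names from the rule above:

#Block spider (SemrushBot appended to the end of the existing list)
RewriteCond %{HTTP_USER_AGENT} "AhrefsBot|MJ12bot|ZmEu|SemrushBot" [NC]
RewriteRule !(^robots\.txt$) - [F]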

Appendix: names of the major spiders:

Google spider: googlebot

Baidu spider: baiduspider

Yahoo spider: slurp

Alexa spider: ia_archiver

MSN spider: msnbot

Bing spider: bingbot

AltaVista spider: scooter

Lycos spider: lycos_spider_(t-rex)

AllTheWeb spider: fast-webcrawler

Inktomi spider: slurp

Youdao spider: YodaoBot and OutfoxBot

热土 spider: Adminrtspider

Sogou spider: sogou spider

SOSO spider: sosospider

360 Search spider: 360spider