We often run third-party vulnerability scanners or web scanning tools against a site, and the most common findings include:
1. X-Frame-Options header not set
2. [Low] The web server has the OPTIONS method enabled
3. The target server has the TRACE method enabled
4. HttpOnly flag not set on cookies
This article provides configuration rules to fix these findings.
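Before and after applying the rules, each item can be verified with any HTTP client: send a TRACE or OPTIONS request and check the status code, and inspect the response headers for X-Frame-Options and for the HttpOnly attribute on Set-Cookie.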
1. Rules for IIS 7 on a Windows server. Copy the rules below and save them as web.config in the site root:
<?xml version="1.0"?>
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- Reject the OPTIONS and TRACE verbs; all other verbs stay allowed -->
        <verbs allowUnlisted="true">
          <add verb="OPTIONS" allowed="false" />
          <add verb="TRACE" allowed="false" />
        </verbs>
      </requestFiltering>
    </security>
    <httpProtocol>
      <customHeaders>
        <!-- Allow framing only by pages from the same origin -->
        <add name="X-Frame-Options" value="SAMEORIGIN" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
  <system.web>
    <!-- Mark cookies issued by ASP.NET as HttpOnly -->
    <httpCookies httpOnlyCookies="true" />
  </system.web>
</configuration>
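By default Request Filtering answers the blocked verbs with a 404 response, which is enough to satisfy most scanners. Note that the httpOnlyCookies switch only applies to cookies issued by ASP.NET applications; other runtimes on IIS need their own HttpOnly setting.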
2. Rules for Apache. Copy the rules below and save them as .htaccess in the site's document root:
# Replace the default error pages with a generic message so detailed errors are not disclosed
ErrorDocument 400 "security"
ErrorDocument 401 "security"
ErrorDocument 403 "security"
ErrorDocument 404 "security"
ErrorDocument 500 "security"
# Disable directory listing
Options -Indexes
# Set the HttpOnly flag on PHP session cookies (mod_php 5; for PHP 7 the module name is php7_module)
<IfModule php5_module>
    php_flag session.cookie_httponly on
</IfModule>
# Send X-Frame-Options on every response; use SAMEORIGIN instead of DENY if the site frames its own pages
<IfModule mod_headers.c>
    Header always set X-Frame-Options DENY
</IfModule>
# Answer TRACE and OPTIONS requests with 403 Forbidden
# (if you control httpd.conf, "TraceEnable off" is the cleaner fix for TRACE)
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{REQUEST_METHOD} ^(TRACE|OPTIONS)$ [NC]
    RewriteRule .* - [F]
</IfModule>
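If the site runs on nginx instead, the following minimal sketch covers the same findings; it assumes a PHP-FPM backend and that the directives are placed in the appropriate server or location block of your own configuration:
# Allow framing only from the same origin, on every response including error pages
add_header X-Frame-Options SAMEORIGIN always;
# Reject OPTIONS requests (nginx already answers TRACE with 405 by default)
if ($request_method ~ ^(TRACE|OPTIONS)$) {
    return 403;
}
# HttpOnly session cookies for PHP-FPM; alternatively set session.cookie_httponly = 1 in php.ini
fastcgi_param PHP_VALUE "session.cookie_httponly=1";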
3. Sensitive file vulnerabilities
Delete any phpinfo() test pages from the site, and turn off detailed error output (the verbose 500 error pages) on the server.
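For a PHP site, a minimal php.ini sketch for the second point looks like this (the file location and existing values depend on your installation):
; Hide errors and stack traces from visitors; write them to the error log instead
display_errors = Off
log_errors = On
; Stop advertising the PHP version in the X-Powered-By response header
expose_php = Off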
4. robots.txt: use rules to block access to it from anything that is not a search-engine spider. Three variants follow.
(1) httpd.conf / .htaccess
RewriteEngine On
# Return 403 for robots.txt when the User-Agent does not look like a known crawler
# (pattern shown for .htaccess in the site root; in httpd.conf use ^/robots\.txt$)
RewriteCond %{HTTP_USER_AGENT} !(spider|bot|slurp|ia_archiver|Python-urllib|pycurl) [NC]
RewriteRule ^robots\.txt$ - [F]
(2) web.config / IIS 7 (requires the URL Rewrite module; place the block below inside <system.webServer> in web.config)
<rewrite>
  <rules>
    <!-- Answer robots.txt requests with 403 unless the User-Agent looks like a known crawler -->
    <rule name="allow_spider">
      <match url="^robots\.txt$" ignoreCase="false" />
      <conditions>
        <add input="{HTTP_USER_AGENT}" pattern="spider|bot|slurp|ia_archiver|Python-urllib|pycurl" ignoreCase="false" negate="true" />
      </conditions>
      <action type="CustomResponse" statusCode="403" statusReason="Forbidden" statusDescription="Forbidden" />
    </rule>
  </rules>
</rewrite>
(3) nginx
# Inside the server block: return 403 for robots.txt unless the User-Agent looks like a known crawler
location = /robots.txt {
    if ($http_user_agent !~* "spider|bot|slurp|ia_archiver|Python-urllib|pycurl") {
        return 403;
    }
}
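The exact-match location (location = /robots.txt) takes precedence over regex locations, so it can coexist with existing rules; reload nginx after editing and re-run the scan to confirm the findings are resolved.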