Blocking certain user agents can save traffic and prevent malicious access, especially from search-engine crawlers. For example, if your site is a local one, there is no need for it to be indexed by foreign search-engine crawlers, so they can be blocked. The specific steps are as follows:
1. Edit the file:
# vi /usr/local/nginx/conf/vhosts/yourpool.conf
2. Add the following content (example):
# Block fetching by tools such as Scrapy (remove curl from the list if curl access is needed)
if ($http_user_agent ~* (Scrapy|curl|HttpClient)) {
    return 403;
}
# Block access by malicious user agents
if ($http_user_agent ~ "FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
    return 403;
}
# Block request methods other than GET|HEAD|POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
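The second rule's pattern can be sanity-checked offline before reloading Nginx. The sketch below is not Nginx itself: it emulates the case-sensitive `~` match with `grep -E`, using the same pattern string; the sample User-Agent values are made up for illustration.

```shell
# Emulate the case-sensitive "malicious user agent" rule with grep -E.
# The pattern mirrors the Nginx rule; sample UA strings are fabricated.
pattern='FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$'

check_ua() {
    # printf adds a trailing newline so an empty UA becomes an empty
    # line, which the "^$" alternative can match (as in the Nginx rule).
    if printf '%s\n' "$1" | grep -qE "$pattern"; then
        echo "403 (blocked)"
    else
        echo "200 (allowed)"
    fi
}

check_ua 'ApacheBench/2.3'                  # → 403 (blocked)
check_ua 'Mozilla/5.0 (X11; Linux x86_64)'  # → 200 (allowed)
check_ua ''                                 # → 403 (blocked): empty UA matches ^$
```

This also makes it easy to confirm that a legitimate browser UA slips through before the rule goes live.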
Note that some of the entries contain spaces (e.g. "Indy Library"), so the whole pattern must be wrapped in double quotation marks. The rules above block a number of search-engine crawlers as well as several malicious bots; you can analyze your access log and block additional malicious user agents as your situation requires.
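As a starting point for that log analysis, one common approach is ranking user agents by request count. The sketch below assumes Nginx's default "combined" log format, where the UA is the sixth double-quote-delimited field; the sample log lines are fabricated, and on a real server you would point the pipeline at your actual access log instead.

```shell
# Rank User-Agents by request count (assumes the default "combined"
# log format, where the UA is the 6th "-delimited field).
# The sample log below is fabricated for illustration.
log=$(mktemp)
cat > "$log" <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "ApacheBench/2.3"
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "ApacheBench/2.3"
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET /a HTTP/1.1" 200 512 "-" "Mozilla/5.0 (X11; Linux x86_64)"
EOF

# Most frequent UAs first; on a real server, replace "$log" with
# your access log path, e.g. /var/log/nginx/access.log.
ranking=$(awk -F'"' '{print $6}' "$log" | sort | uniq -c | sort -rn)
printf '%s\n' "$ranking"
rm -f "$log"
```

User agents that appear thousands of times with no referrer are good candidates for the block list above.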
Nginx: Disabling Malicious User-Agent Access