Nginx configuration steps
Go to the conf directory under the nginx installation directory and save the following code as agent_deny.conf.
cd /usr/local/nginx/conf
vi agent_deny.conf

# Block Scrapy and other scraping tools
if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
    return 403;
}

# Block blacklisted and empty User-Agent values
if ($http_user_agent ~ "WinHttp|WebZIP|FetchURL|node-superagent|java/|FeedDemon|Jullo|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|Java|Feedly|Apache-HttpAsyncClient|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|BOT/0.1|YandexBot|FlightDeckReports Bot|Linguee Bot|^$") {
    return 403;
}

# Block request methods other than GET, HEAD and POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
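To make the effect of these rules concrete, here is an illustrative Python sketch (not nginx itself) of the matching they perform: nginx's `~*` operator is a case-insensitive PCRE match, approximated below with `re.IGNORECASE`, and the UA strings are made-up examples.

```python
import re

# Same alternation as the first nginx rule above; ~* is case-insensitive
TOOL_PATTERN = re.compile(r"Scrapy|Curl|HttpClient", re.IGNORECASE)

def is_blocked(user_agent: str) -> bool:
    # An empty UA, or one containing a blacklisted tool name, gets a 403
    if user_agent == "":
        return True
    return TOOL_PATTERN.search(user_agent) is not None

print(is_blocked("Scrapy/2.11 (+https://scrapy.org)"))       # True
print(is_blocked("Mozilla/5.0 (Windows NT 10.0; Win64)"))    # False
```

Note that the pattern is a substring search, so `curl/8.0` is blocked just like `Curl`.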
Then add the following line inside the server block of your site's configuration:
include agent_deny.conf;
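For context, a minimal sketch of where that line sits in a server block (the port, server_name and root are placeholder values, not from the original article):

```nginx
server {
    listen 80;
    server_name example.com;      # placeholder domain
    include agent_deny.conf;      # path is relative to the conf directory
    location / {
        root /var/www/html;       # placeholder web root
    }
}
```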
After saving, run the following command to gracefully reload nginx:
/usr/local/nginx/sbin/nginx -s reload
Of course, you can also block these clients directly in PHP, without touching the server configuration.
// Get the UA string
$ua = $_SERVER['HTTP_USER_AGENT'];
// Blacklist of malicious USER_AGENT values
$now_ua = array('FeedDemon', 'BOT/0.1 (BOT for JCE)', 'CrawlDaddy', 'Java', 'Feedly', 'UniversalFeedParser', 'ApacheBench', 'Swiftbot', 'ZmEu', 'Indy Library', 'oBot', 'jaunty', 'YandexBot', 'AhrefsBot', 'MJ12bot', 'WinHttp', 'EasouSpider', 'HttpClient', 'Microsoft URL Control', 'YYSpider', 'Python-urllib', 'lightDeckReports Bot');
// Block empty USER_AGENT values: dedecms and other mainstream scrapers, as well as some SQL injection tools, send an empty USER_AGENT
if (!$ua) {
    header("Content-type: text/html; charset=utf-8");
    wp_die('Please do not scrape this site!');
} else {
    foreach ($now_ua as $value) {
        // Check whether the UA contains a blacklisted entry
        // (eregi() was removed in PHP 7; stripos() does a case-insensitive substring check)
        if (stripos($ua, $value) !== false) {
            header("Content-type: text/html; charset=utf-8");
            wp_die('Please do not scrape this site!');
        }
    }
}
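The PHP loop above boils down to a case-insensitive substring check against the blacklist; this Python sketch shows the same logic (with a shortened, illustrative blacklist):

```python
# Shortened, illustrative subset of the blacklist above
BLACKLIST = ["FeedDemon", "CrawlDaddy", "ZmEu", "Python-urllib", "lightDeckReports Bot"]

def ua_blacklisted(ua: str, blacklist=BLACKLIST) -> bool:
    if not ua:
        return True  # empty UA is rejected outright, like the if (!$ua) branch
    ua_lower = ua.lower()
    # Case-insensitive substring check, mirroring stripos() in the PHP loop
    return any(entry.lower() in ua_lower for entry in blacklist)

print(ua_blacklisted("Python-urllib/3.11"))       # True
print(ua_blacklisted("Mozilla/5.0 (Macintosh)"))  # False
```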
Blocking malicious User Agents in Apache
There are many ways to block User Agents with .htaccess. Here, rewrite rules redirect the matched User Agents to achieve the blocking effect.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ".*EmbeddedWB.*" [OR]
RewriteCond %{HTTP_USER_AGENT} ".*QunarBot.*" [OR]
RewriteCond %{HTTP_USER_AGENT} ".*Windows 98.*" [OR]
RewriteCond %{HTTP_USER_AGENT} "^Mozilla/4.0$"
RewriteRule ^(.*)$ http://www.111cn.net/
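The conditions above mix two pattern styles: `".*QunarBot.*"` behaves as an unanchored substring test, while `"^Mozilla/4.0$"` only matches that exact UA string. This Python sketch (an approximation; Apache's RewriteCond uses a PCRE search, not Apache itself) illustrates the difference:

```python
import re

# Same four patterns as the RewriteCond lines above (dot escaped for strictness)
PATTERNS = [r".*EmbeddedWB.*", r".*QunarBot.*", r".*Windows 98.*", r"^Mozilla/4\.0$"]

def matches_rules(ua: str) -> bool:
    # re.search approximates Apache's unanchored regex matching
    return any(re.search(p, ua) for p in PATTERNS)

print(matches_rules("QunarBot/1.0"))              # True  (substring match)
print(matches_rules("Mozilla/4.0"))               # True  (exact match)
print(matches_rules("Mozilla/4.0 (compatible)"))  # False (anchors reject it)
```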
To use .htaccess, you need to know some regular expression syntax to match strings correctly.
If you find this useful, analyze your website logs first and tailor the blacklist to your own needs before blocking malicious User Agents. You can test the effect with Firefox plus the User Agent Switcher extension; Chrome also has similar User-Agent switching extensions.