Use the .htaccess file to block malicious attacks from certain IP addresses
If you read the article "My website has been attacked by hackers" a few days ago, you know that someone recently used the 180.97.106.* IP segment to launch a large number of malicious, targeted scans against the "WEB hacker (www.website.com)" site, attempting to obtain some of the site's internal configuration files and information by brute-force probing. I used .htaccess to defend against the attack, adding the following configuration to the file:

order allow,deny
deny from 180.97.106
allow from all

.htaccess is a powerful configuration file for a website: the more you know about its capabilities, the more precisely you can control your site's configuration. Blocking access from an IP address is one of its basic functions, and the configuration above is only one use of it. Below I summarize more usage under this topic.

Block access from specific IP addresses:

order allow,deny
deny from 192.168.44.201
deny from 224.39.163.12
deny from 172.16.7.92
allow from all

The code above blocks access to the website from three different IP addresses. If you have many IP addresses to block, listing them one at a time is too troublesome; the following shows how to block an entire IP segment at once:

order allow,deny
deny from 192.168
deny from 10.0.0
allow from all

Block access from a domain name:

order allow,deny
deny from some-evil-isp.com
deny from subdomain.another-evil-isp.com
allow from all

The code above blocks access to the website from a specific ISP.

Using .htaccess to block bot crawlers (spiders): in China, I think you only need the Google and Baidu search engines; small ones such as Sogou and 360 can be ignored. Crawlers from these unimportant engines bring you little benefit while still crawling your website and consuming its resources. The following shows how to block them:
#get rid of the bad bot
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot
RewriteRule ^(.*)$ http://go.away/
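A small variant worth knowing: instead of redirecting the bot to a dead address, you can have the server answer with a 403 Forbidden directly by using the [F] flag. This is a minimal sketch, reusing the same assumed BadBot user-agent as above:

# return 403 Forbidden to the bot instead of redirecting it
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot
RewriteRule ^(.*)$ - [F]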
Each of the snippets above blocks a single crawler. If you want to block multiple crawlers at once, you can configure .htaccess as follows:
#get rid of bad bots
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [OR]
RewriteCond %{HTTP_USER_AGENT} ^FakeUser
RewriteRule ^(.*)$ http://go.away/
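As a concrete application of the same pattern, here is how the smaller Chinese spiders mentioned earlier might be blocked. The user-agent tokens Sogou and 360Spider are assumptions for illustration; check your access logs for the exact strings. Note that such tokens often appear in the middle of the user-agent string, so the ^ anchor is dropped here:

#get rid of minor Chinese spiders (assumed user-agent tokens)
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} Sogou [NC,OR]
RewriteCond %{HTTP_USER_AGENT} 360Spider [NC]
RewriteRule ^(.*)$ - [F]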
The first of these snippets blocks three different crawlers at the same time; pay attention to the "[OR]" flag, which must appear on every RewriteCond except the last. Using .htaccess to prevent hotlinking: if your website is popular, it certainly hosts resources such as images or videos, and people without professional ethics may embed them in their own pages, occupying or wasting your bandwidth and affecting the stability of your server. .htaccess can easily block such hotlinking, as shown below:
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC]
RewriteRule .* - [F]
After the code above is added to .htaccess, when somebadforum.com hotlinks your site's resources, the server returns a 403 Forbidden error and your bandwidth is no longer wasted. The following code blocks multiple websites at once:
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://.*somebadforum\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*example\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://.*lastexample\.com [NC]
RewriteRule .* - [F]
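Blacklisting referers one by one only scales so far. A common complementary approach is a whitelist: block every referer except your own domain, while still allowing empty referers (direct visits, some browsers and proxies). This is a sketch; yourdomain.com is a placeholder, and the extension list is an assumption to adapt to your own resources:

RewriteEngine on
# let through requests with an empty referer (direct visits, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# let through requests coming from your own site (yourdomain.com is a placeholder)
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?yourdomain\.com [NC]
# everything else gets 403 for the listed media types
RewriteRule \.(jpe?g|png|gif|mp4)$ - [F,NC]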
As you can see, .htaccess is a powerful web server configuration tool. With it you gain rich, flexible control over your server, yet the solutions are usually simple and elegant; there is basically no need to restart the server, and changes take effect immediately. If your server does not have this configuration file yet, create one!
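One closing caveat: the order/deny/allow directives used above are Apache 2.2 syntax; on Apache 2.4 they only work if the mod_access_compat module is loaded. The 2.4-native equivalent of the first example would look roughly like this (a sketch, assuming mod_authz_core is available):

# Apache 2.4 equivalent of "deny from 180.97.106, allow everyone else"
<RequireAll>
    Require all granted
    Require not ip 180.97.106
</RequireAll>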