.htaccess Rules

Source: Internet
Author: User
Tags: 403, forbidden error, password protection

.htaccess Rules

1. Introduction
File name: .htaccess; permissions: 644 (rw-r--r--)
An .htaccess file affects the directory it is placed in and all of its subdirectories.
Note that most directives must be kept on a single line; do not wrap them, or an error will occur.

2. Error Documents
Official documentation: ErrorDocument Directive
ErrorDocument code document
Examples:
ErrorDocument 400 /errors/badrequest.html
ErrorDocument 404 http://yoursite/errors/notfound.html
ErrorDocument 401 "Authorization Required"

(Note that if the quoted message itself contains double quotation marks, they must be escaped as \")
Common HTTP Status Codes
Successful Client Requests
200 OK
201 Created
202 Accepted
203 Non-Authoritative Information
204 No Content
205 Reset Content
206 Partial Content
Client Request Redirected
300 Multiple Choices
301 Moved Permanently
302 Moved Temporarily
303 See Other
304 Not Modified
305 Use Proxy
Client Request Errors
400 Bad Request
401 Authorization Required
402 Payment Required (not used yet)
403 Forbidden
404 Not Found
405 Method Not Allowed
406 Not Acceptable (encoding)
407 Proxy Authentication Required
408 Request Timed Out
409 Conflicting Request
410 Gone
411 Content Length Required
412 Precondition Failed
413 Request Entity Too Large
414 Request URI Too Long
415 Unsupported Media Type
Server Errors
500 Internal Server Error
501 Not Implemented
502 Bad Gateway
503 Service Unavailable
504 Gateway Timeout
505 HTTP Version Not Supported

3. Password Protection
Official documentation: Authentication, Authorization and Access Control
Assume the password file is .htpasswd:
AuthUserFile /usr/local/safedir/.htpasswd (a full path must be used here)
AuthName EnterPassword
AuthType Basic
Two common authorization methods:
Require user windix
(only the user windix may log in)
Require valid-user
(any valid user may log in)
Tip: how to generate the password file
Use the htpasswd command (shipped with Apache).
Create the password file and the first user:
htpasswd -c .htpasswd user1
Then add further users:
htpasswd .htpasswd user2
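A compatible password line can also be produced without the htpasswd binary. Here is a minimal Python sketch of the {SHA} scheme (the same format `htpasswd -s` emits); the user names are only illustrative:

```python
import base64
import hashlib

def htpasswd_sha_line(user: str, password: str) -> str:
    """Build an .htpasswd line in the {SHA} format: user:{SHA}base64(sha1(password))."""
    digest = hashlib.sha1(password.encode("utf-8")).digest()
    return f"{user}:{{SHA}}{base64.b64encode(digest).decode('ascii')}"

print(htpasswd_sha_line("windix", "secret"))
```

Note that {SHA} entries are unsalted; for production use, prefer the default bcrypt/MD5 schemes that htpasswd itself offers.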

4. Enabling SSI (Server Side Includes) via .htaccess
AddType text/html .shtml
AddHandler server-parsed .shtml
Options Indexes FollowSymLinks Includes
DirectoryIndex index.shtml index.html

5. Blocking Users by IP Address
Order allow,deny
Deny from 123.45.6.7
Deny from 12.34.5. (blocks the entire class C network)
Allow from all
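The partial-address form "Deny from 12.34.5." behaves like blocking the whole class C (/24) network. A small sketch of the same membership test (the addresses are the illustrative ones from above):

```python
import ipaddress

# "Deny from 12.34.5." behaves like blocking the class C network 12.34.5.0/24
BLOCKED_NET = ipaddress.ip_network("12.34.5.0/24")
# "Deny from 123.45.6.7" blocks a single host
BLOCKED_HOSTS = {ipaddress.ip_address("123.45.6.7")}

def is_denied(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return addr in BLOCKED_HOSTS or addr in BLOCKED_NET
```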

6. Blocking Users/Sites by Referrer
Requires the mod_rewrite module.
Example 1. Block a single referrer: badsite.com
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite\.com [NC]
RewriteRule .* - [F]
Example 2. Block multiple referrers: badsite1.com and badsite2.com
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} badsite1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} badsite2\.com
RewriteRule .* - [F]
[NC] - case-insensitive
[F] - 403 Forbidden
Note that the "Options +FollowSymlinks" line above is commented out. If the server does not set FollowSymLinks in the relevant section of httpd.conf, uncomment it; otherwise you will get a "500 Internal Server Error".
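Since RewriteCond patterns are unanchored regular expressions, the matching logic of Example 2 can be sketched in Python (for simplicity this applies [NC] to both patterns, although the example only sets it on the first):

```python
import re

# The two RewriteCond patterns from Example 2; [NC] = case-insensitive
BLOCKED_REFERRERS = [re.compile(p, re.IGNORECASE)
                     for p in (r"badsite1\.com", r"badsite2\.com")]

def referrer_forbidden(referrer: str) -> bool:
    # RewriteCond patterns are unanchored, hence re.search rather than re.match
    return any(p.search(referrer) for p in BLOCKED_REFERRERS)
```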

7. Blocking Bad Bots and Site Rippers (aka Offline Browsers)
Requires the mod_rewrite module.
What is a bad bot? For example, crawlers that harvest email addresses, and crawlers that ignore robots.txt (such as Baidu?).
They can be identified by their HTTP_USER_AGENT.
(But there are even more shameless ones, such as zhongsou.com's crawler, which sets its agent to "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)"; it is too rogue to do anything about.)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus
RewriteRule ^.* - [F,L]
[F] - 403 Forbidden
[L] - Last rule (stop processing further rewrite rules)
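Most of the patterns above are anchored with ^, so they match only at the start of the User-Agent string (HTTrack and Indy Library are unanchored exceptions). A minimal sketch of this matching, using a few entries from the list:

```python
import re

# A handful of patterns from the list above; "\ " in mod_rewrite is a literal space
BAD_BOT_PATTERNS = [
    re.compile(r"^BlackWidow"),
    re.compile(r"^Download Demon"),
    re.compile(r"HTTrack", re.IGNORECASE),  # [NC], unanchored
    re.compile(r"^Wget"),
]

def is_bad_bot(user_agent: str) -> bool:
    return any(p.search(user_agent) for p in BAD_BOT_PATTERNS)
```

Note that this only deters the lazy rippers; as the zhongsou.com example shows, any client can forge its User-Agent.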

8. Changing the Default Directory Page
DirectoryIndex index.html index.php index.cgi index.pl

9. Redirects
A single file:
Redirect /old_dir/old_file.html http://yoursite.com/new_dir/new_file.html
An entire directory:
Redirect /old_dir http://yoursite.com/new_dir
Effect: as if the directory had been moved to the new location.
http://yoursite.com/old_dir -> http://yoursite.com/new_dir
http://yoursite.com/old_dir/dir1/test.html -> http://yoursite.com/new_dir/dir1/test.html
Tip: making Redirect work under user directories
When using Apache default user directories such as http://mysite.com/~windix, if you want to redirect http://mysite.com/~windix/jump, you will find that the following Redirect does not work:
Redirect /jump http://www.google.com
The correct way is to change it to:
Redirect /~windix/jump http://www.google.com
(Source: ".htaccess Redirect in 'Sites' not redirecting: why?")
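The prefix behavior of Redirect, including why /jump fails to match the user-directory path /~windix/jump, can be sketched as:

```python
def redirect_target(path: str, old: str, new: str):
    """Mimic Apache's Redirect: match the URL-path prefix, append the remainder."""
    if path == old or path.startswith(old + "/"):
        return new + path[len(old):]
    return None  # the Redirect does not apply

# The directory example from above
print(redirect_target("/old_dir/dir1/test.html", "/old_dir",
                      "http://yoursite.com/new_dir"))
```

Because "/~windix/jump" does not start with "/jump", only the full "Redirect /~windix/jump ..." form matches.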

10. Preventing Viewing of the .htaccess File
<Files .htaccess>
Order allow,deny
Deny from all
</Files>

11. Adding MIME Types
AddType application/x-shockwave-flash .swf
Tip: a type of application/octet-stream will prompt the browser to download the file.

12. Preventing Hot Linking of Images and Other File Types (Anti-Leech)
Requires the mod_rewrite module.
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain.com/.*$ [NC]
RewriteRule \.(gif|jpg|js|css)$ - [F]
Explanation:
If HTTP_REFERER is not empty (the request comes from another site rather than being a direct hit), and
if HTTP_REFERER does not start with (www.)mydomain.com (case-insensitive, [NC]), i.e. the source is not this site,
then return 403 Forbidden [F] for any file ending in .gif/.jpg/.js/.css.
You can also specify a replacement response instead, as in the following example:
RewriteRule \.(gif|jpg)$ ... [R,L]
[R] - Redirect
[L] - Last rule
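The combined condition logic can be sketched in Python (domain and extensions as in the example above; note that an empty referrer, i.e. a direct visit, is allowed through):

```python
import re

SITE_RE = re.compile(r"^http://(www\.)?mydomain\.com/", re.IGNORECASE)  # [NC]
PROTECTED_RE = re.compile(r"\.(gif|jpg|js|css)$")

def hotlink_blocked(referrer: str, path: str) -> bool:
    # Block only when: referrer non-empty, referrer off-site, and file protected
    if not referrer:              # RewriteCond %{HTTP_REFERER} !^$
        return False
    if SITE_RE.match(referrer):   # RewriteCond %{HTTP_REFERER} !^http://(www\.)?...
        return False
    return bool(PROTECTED_RE.search(path))
```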
