Squid is mostly used as a caching server for HTTP services; serving static files such as images from its cache speeds up responses to clients.
Squid can act as both a forward proxy and a reverse proxy. As a forward proxy, Squid sits between the client and the internet, and all of the client's web traffic passes through it. When a user requests a page, Squid connects to the requested site, fetches the page, passes it to the user, and keeps a copy; when another user requests the same page, Squid serves the saved copy immediately, which makes responses feel much faster.
In short: with a forward proxy, Squid stands in front of the client, and the client's outbound traffic goes through Squid; with a reverse proxy, Squid stands in front of the server, and the data the server returns to users passes through Squid.
I. Building a Squid forward proxy
1. Install Squid
# yum install -y squid
2. Edit the configuration file
# vim /etc/squid/squid.conf    # modify one line and add two more
Find the line beginning with cache_dir ufs /var/spool/squid
Change it to: cache_dir ufs /var/spool/squid 1024 16 256    # 1024 MB of cache space, 16 first-level subdirectories, 256 second-level directories each
Then add: cache_mem 128 MB    # how much memory the cache may occupy; 128 MB is an example value
Finally, add at the end: refresh_pattern \.(jpg|png|gif|js|css|mp3|mp4)$ 1440 20% 2880 ignore-reload    # cache these static objects; 1440 is the minimum cache time in minutes, 2880 the maximum, and ignore-reload tells Squid to ignore client reloads
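Taken together, the edited section of squid.conf looks roughly like this sketch (the cache_mem size is an illustrative value; pick one suited to your RAM):

```
# 1024 MB of disk cache, 16 first-level and 256 second-level directories
cache_dir ufs /var/spool/squid 1024 16 256
# memory cache size (example value)
cache_mem 128 MB
# cache static objects for 1440-2880 minutes, ignoring client reloads
refresh_pattern \.(jpg|png|gif|js|css|mp3|mp4)$ 1440 20% 2880 ignore-reload
```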
After saving, test the configuration file for syntax errors:
# squid -k check
Start Squid:
# /etc/init.d/squid start
init_cache_dir /var/spool/squid... Starting squid: .  [  OK  ]
We can view the cache log:
# cat /var/log/squid/cache.log
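Besides cache.log (the service log), Squid records one line per request in /var/log/squid/access.log; the fourth field (e.g. TCP_HIT/200) shows whether an object was served from the cache. A quick way to gauge cache effectiveness is to count those tags with awk. The two log lines below are made-up samples in the standard native format, standing in for the real file:

```shell
# Sample access.log lines (hypothetical): timestamp, duration, client,
# result-code/status, bytes, method, URL, user, hierarchy, content-type
log='1442325529.123     45 192.168.0.110 TCP_MISS/200 10245 GET http://www.baidu.com/ - DIRECT/115.239.211.112 text/html
1442325530.456      2 192.168.0.110 TCP_HIT/200 10245 GET http://www.baidu.com/logo.png - NONE/- image/png'

# Split field 4 on "/" and count each result code (hit vs. miss)
printf '%s\n' "$log" | awk '{split($4, a, "/"); n[a[1]]++} END {for (k in n) print k, n[k]}'
```

On a real server, replace the sample with `awk '...' /var/log/squid/access.log`.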
You can also go to the cache directory to see the generated files:
# ls /var/spool/squid/    # 16 first-level subdirectories, each containing 256 second-level directories
00  01  02  03  04  05  06  07  08  09  0A  0B  0C  0D  0E  0F  swap.state
3. Testing
(1) Testing on Windows
Open Internet Explorer -> Tools -> Internet Options -> Connections -> LAN Settings -> check "Use a proxy server" -> Advanced -> HTTP: proxy address 192.168.0.109, port 3128 -> OK -> OK.
Then open www.baidu.com or any other site in the browser, and capture packets on the server to verify that the proxy is being used:
# tcpdump -nn port 3128    # many packets are captured
We can also look inside the first-level subdirectories of the cache directory: each contains 256 second-level directories, and cached objects are stored as files inside them, filling one directory before moving on to the next.
# ls /var/spool/squid/00/00
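The 16 and 256 in the cache_dir line multiply out to the total number of leaf directories that objects are spread across, which can be checked with a one-line calculation:

```shell
# With "cache_dir ufs /var/spool/squid 1024 16 256", Squid creates
# 16 first-level directories (00-0F), each holding 256 second-level
# directories (00-FF); cached objects live as files in those leaves.
l1=16
l2=256
echo "first-level dirs:  $l1"
echo "second-level dirs: $((l1 * l2)) in total"
```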
(2) Testing on Linux
# curl -x127.0.0.1:3128 www.baidu.com -I
HTTP/1.0 200 OK
Date: Tue, Sep 13:58:49 GMT
4. Set blacklist/whitelist
Now that clients can reach any website through the proxy server, we can also control their access and block specific sites.
(1) Whitelist: only Baidu may be accessed; everything else is denied.
Edit the configuration file and set up ACLs:
# vim /etc/squid/squid.conf    # below the existing acl lines, add the following:
acl http proto HTTP
acl good_domain dstdomain .baidu.com    # good_domain is a custom name; dstdomain defines the whitelisted destination domains
http_access allow http good_domain
http_access deny http !good_domain
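Squid evaluates http_access rules top-down and applies the first match, so the order of allow/deny lines matters; a common safety net is a catch-all deny at the end. A sketch of the whitelist with that final rule:

```
acl http proto HTTP
acl good_domain dstdomain .baidu.com
http_access allow http good_domain
http_access deny http !good_domain
# first match wins; deny anything the rules above did not allow
http_access deny all
```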
After saving and exiting, check the configuration file for errors and restart the service.
# squid -k check    # can be abbreviated to -k ch
# /etc/init.d/squid restart
Stopping squid: .........  [  OK  ]
Starting squid: .  [  OK  ]
Note: instead of restarting, we can simply reload the configuration:
# squid -k reconfigure    # can be abbreviated to -k re
Testing under Windows:
Open some web pages: only Baidu loads; other sites cannot be reached.
Testing under Linux:
# curl -x127.0.0.1:3128 www.baidu.com -I
HTTP/1.0 200 OK
# curl -x127.0.0.1:3128 www.qq.com -I
HTTP/1.0 403 Forbidden
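When checking several sites by hand, the only part of the `curl -I` output that matters is the status line: 2xx means the ACLs allowed the request, 403 means Squid denied it. A small helper sketch that classifies such status lines (the inputs below are sample strings, not live captures):

```shell
# Classify an HTTP status line as printed by "curl -I" through the proxy.
classify() {
    code=$(printf '%s\n' "$1" | awk '{print $2}')
    case "$code" in
        2*)  echo "allowed" ;;
        403) echo "blocked by ACL" ;;
        *)   echo "other ($code)" ;;
    esac
}

classify "HTTP/1.0 200 OK"          # allowed
classify "HTTP/1.0 403 Forbidden"   # blocked by ACL
```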
(2) Blacklist: everything except Baidu may be accessed.
# vim /etc/squid/squid.conf    # replace the whitelist rules above with the following:
acl http proto HTTP
acl bad_domain dstdomain .baidu.com
http_access deny http bad_domain
http_access allow http !bad_domain
The test is the same as above.
II. Building a Squid reverse proxy
A reverse proxy is mainly used to cache static content: static objects, especially images and streaming media, consume the most bandwidth. In China, access from the Unicom network to resources hosted on Telecom networks is slow, so fetching bandwidth-heavy images or streams across that boundary is slower still. If a Squid reverse proxy is deployed on the Unicom side, Unicom clients talk directly to that Squid, and since the slow-to-fetch static objects are already cached there, access is greatly accelerated.
The reverse-proxy setup is not very different from the forward proxy; only one place in the configuration file needs to change. (To see the effect more clearly, first comment out the forward-proxy settings added above and remove the proxy settings from the browser.)
1. Edit the configuration file
# vim /etc/squid/squid.conf
Change http_port 3128 to: http_port 80 accel vhost vport
Then add the backend real servers to be proxied (qq.com and baidu.com serve as examples here):
cache_peer 101.226.103.106 parent 80 0 originserver name=a
cache_peer_domain a www.qq.com
cache_peer 115.239.211.112 parent 80 0 originserver name=b
cache_peer_domain b www.baidu.com
Here cache_peer configures a backend server's IP (obtained by pinging the real site) and port, while name= is just a label that ties the entry to the cache_peer_domain line below it, which lists the domains that backend serves. In real deployments the IPs are usually internal addresses, and one server may host many domains; if Squid is to proxy every domain of a single web server, a single line suffices:
cache_peer 192.168.10.111 parent 80 0 originserver
and the cache_peer_domain line can be omitted altogether.
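Putting the reverse-proxy settings together, the relevant part of squid.conf looks roughly like this sketch (the IPs are the ones resolved above; port 80 is assumed for the backends):

```
http_port 80 accel vhost vport
# backend web servers; "a" and "b" are arbitrary labels linking each
# cache_peer to its cache_peer_domain line
cache_peer 101.226.103.106 parent 80 0 originserver name=a
cache_peer_domain a www.qq.com
cache_peer 115.239.211.112 parent 80 0 originserver name=b
cache_peer_domain b www.baidu.com
```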
2. Test Reverse Proxy
Since the configuration file has been modified, first check it for errors:
# squid -k check
squid: ERROR: No running copy
An error appears saying no Squid is running, so let's see what is listening on port 80:
# netstat -lnp
We find that port 80 is occupied by nginx, which was configured on this machine earlier, so kill the nginx processes:
# killall nginx
Then start Squid again:
# /etc/init.d/squid start
Starting squid: .  [  OK  ]
(1) Testing under Windows
First locate the C:\Windows\System32\drivers\etc\hosts file and add the following:
192.168.0.109 www.baidu.com www.qq.com
Open the browser and test: only www.baidu.com and www.qq.com (which now resolve to the Squid host) can be accessed through it; other sites cannot.
(2) Testing under Linux
# curl -x127.0.0.1:80 www.baidu.com    # returns the Baidu page
# curl -x127.0.0.1:80 www.qq.com       # returns the QQ page
# curl -x127.0.0.1:80 www.sina.com     # not in cache_peer_domain, so Squid cannot serve it
This article is from the "M April Days" blog; please keep this source when reproducing: http://1015489314.blog.51cto.com/8637744/1695166