There are many ways to load balance web services, but deploying Nginx as the load balancer is both efficient and popular.
I mostly do .NET development, but for load-balanced deployments I have always used Nginx and have not looked much into other approaches. I once tested load balancing with a server farm, though never in a real project. Recently I saw a colleague configure load balancing with a server farm, but I don't know how it performs; I'd welcome comments from anyone who does.
Enough talk; here are the steps:
1. Install and deploy the Nginx service
See the earlier article "Nginx Service Introduction".
2. Write the configuration file
Open the vhost directory under the Nginx installation directory (this directory is already included from nginx.conf; if the load balancing applies only to a certain directory of one site, you can modify the include path) and create a new load-balancing configuration file, testfz.conf, as follows:
    upstream backend {
        server 192.168.1.106:8001 weight=1;
        server 192.168.1.107:8001 weight=2;
        #ip_hash;
    }
    server {
        listen       80;
        server_name  www.test.com;
        location ~ ^/* {
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_buffering off;
            proxy_pass http://backend;
        }
    }
3. Testing
Run ./nginx -t from the installation directory; if the output reports "syntax is ok" and "test is successful", the configuration is valid.
4. Reload the configuration file
/etc/init.d/nginx reload
If reloading reports a PID error, reload the configuration from the installation path using the -c option.
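For reference, the two reload paths might look like the following (the /usr/local/nginx paths are assumptions for a default source install; adjust to your environment):

```shell
# Normal reload via the init script
/etc/init.d/nginx reload

# If reload fails with a PID error, point the binary at the
# configuration file explicitly with -c:
/usr/local/nginx/sbin/nginx -c /usr/local/nginx/conf/nginx.conf
```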
5. Notes on the Nginx configuration parameters
a. Round-robin (the default): each request is distributed in turn to the different back-end servers listed in the Nginx configuration file; if a server goes down, it is automatically detected and removed;
b. weight: Nginx distributes requests according to the configured weights, sending more requests to highly provisioned back-end servers and fewer to lightly provisioned ones;
c. ip_hash: each request is assigned according to a hash of the client IP, so a given client always connects to the same back-end server; this solves the session problem;
d. Least connections: forwards each request to the server with the fewest active connections. Just add least_conn to the upstream block.
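As a sketch of the non-default methods (server addresses are reused from the example above; the upstream names are illustrative), each is enabled by a single directive at the top of the upstream block:

```nginx
# ip_hash: pin each client IP to one server (session affinity)
upstream backend_hash {
    ip_hash;
    server 192.168.1.106:8001;
    server 192.168.1.107:8001;
}

# least_conn: prefer the server with the fewest active connections
upstream backend_least {
    least_conn;
    server 192.168.1.106:8001;
    server 192.168.1.107:8001;
}
```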
In addition, the following can be configured on each load-balanced server:
a. down: the current server temporarily does not participate in the load balancing;
b. max_fails: the number of failed requests allowed, 1 by default; when the maximum is exceeded, the error defined by the proxy_next_upstream directive is returned;
c. fail_timeout: the time the server is paused after max_fails failures;
d. backup: the backup machine receives requests only when all the non-backup machines are down or busy, so it can be used for failover.
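Putting the per-server parameters together, a minimal sketch (the third server address and the timing values are assumptions for illustration, not from the article):

```nginx
upstream backend {
    # Higher weight, and mark failed after 3 errors within 30s
    server 192.168.1.106:8001 weight=2 max_fails=3 fail_timeout=30s;
    # Temporarily removed from rotation
    server 192.168.1.107:8001 down;
    # Receives traffic only when the other servers are down or busy
    server 192.168.1.108:8001 backup;
}
```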