Strategy:
1) Use the ngx_http_limit_req_module module to limit the rate of requests.
Configuration reference: http://nginx.org/en/docs/http/ngx_http_limit_req_module.html#limit_req_zone
2) Use the ngx_http_limit_conn_module module to limit the number of concurrent connections.
Configuration reference: http://nginx.org/en/docs/http/ngx_http_limit_conn_module.html#directives
The configuration is given as follows:
http {
    limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;
    limit_conn_zone $binary_remote_addr zone=addr:10m;
    server {
        listen       80;
        server_name  210.10.5.102;
        location / {
            root   html;
            index  index.html index.htm;
            limit_req zone=one burst=5;
            limit_conn addr 1;
        }
    }
}
Other configuration is omitted; only the anti-DDoS limit settings are shown here.
With this configuration in place, the following restrictions apply:
No more than 1 request per second is processed (rate=1r/s);
At most 5 excess requests per client IP are queued (burst=5); requests beyond the queue are rejected with 503;
Only 1 concurrent connection per client IP is allowed (limit_conn addr 1); additional concurrent connections are rejected with 503.
3) Test the configured policies with Apache ab (ApacheBench):
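The exact ab invocations are not shown in what follows; presumably they took the usual form, with the total number of requests given by -n and the concurrency by -c, for example for the first test:

    ab -n 20 -c 10 http://www.baidu.com/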
3.1 First, verify that ab itself works:
Server Software:        BWS/1.1
Server Hostname:        www.baidu.com
Server Port:            80
Document Path:          /
Document Length:        96527 bytes
Concurrency Level:      10
Time taken for tests:   1.952 seconds
Complete requests:      20
Failed requests:        19 (Connect: 0, Length: 19, Exceptions: 0)
A total of 20 requests at a concurrency of 10, of which 19 failed, indicating that Baidu has a defense along the lines of burst=1 and limit_conn addr 1.
3.2 Test the local nginx: 20 requests at a concurrency of 10; 20 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /test.html
Document Length:        168 bytes
Concurrency Level:      10
Time taken for tests:   0.109 seconds
Complete requests:      20
Failed requests:        0
3.3 Test the local nginx: 2000 requests at a concurrency of 1000; 2000 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /test.html
Document Length:        168 bytes
Concurrency Level:      1000
Time taken for tests:   12.900 seconds
Complete requests:      2000
Failed requests:        0
This indicates that local throughput is excellent; everything gets through.
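For reference, a run like this would presumably have been invoked as follows (command reconstructed from the numbers above, not shown in the original):

    ab -n 2000 -c 1000 http://210.10.5.189/test.html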
3.4 Test the local nginx: 200 requests at a concurrency of 100; 200 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      100
Time taken for tests:   0.983 seconds
Complete requests:      200
Failed requests:        0
Non-2xx responses:      200
This test requests a JSP page, which goes through the reverse proxy; the earlier static HTML was served directly by nginx.
3.5 Test the local nginx: 2000 requests at a concurrency of 1000; 2000 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      1000
Time taken for tests:   9.858 seconds
Complete requests:      2000
Failed requests:        0
Non-2xx responses:      2000
The result is very good: static or dynamic, everything gets through at full load.
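The dynamic-page runs presumably differ only in the requested path, e.g.:

    ab -n 2000 -c 1000 http://210.10.5.189/index.jsp

The Non-2xx responses here presumably come from the backend behind the proxy rather than from nginx, since the rate and connection limits have not been added yet at this point.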
3.6 Test the local nginx: 200 requests at a concurrency of 10 versus a concurrency of 1; is there a difference in processing time?
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      10
Time taken for tests:   1.001 seconds
Complete requests:      200
Failed requests:        0
Non-2xx responses:      200

Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      1
Time taken for tests:   1.792 seconds
Complete requests:      200
Failed requests:        0
Non-2xx responses:      200
The run at a concurrency of 1 takes roughly 1.7 times as long as at a concurrency of 10, so there certainly is a difference.
3.7 Add the policy of processing 1 request per second with a waiting queue of burst=5, then test the local nginx: 10 requests at a concurrency of 1; 10 succeeded, 0 failed, but the run takes more than 9 seconds.
limit_req_zone $binary_remote_addr zone=one:10m rate=1r/s;
limit_req zone=one burst=5;
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      1
Time taken for tests:   9.014 seconds
Complete requests:      10
Failed requests:        0
Non-2xx responses:      10
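Reconstructing the presumed command from the numbers above:

    ab -n 10 -c 1 http://210.10.5.189/index.jsp

With rate=1r/s and no nodelay, every request after the first is delayed until it fits the rate, so 10 sequential requests take roughly 9 seconds, which matches the measured 9.014 s.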
3.8 With the same policy (1 req/s, burst=5), test the local nginx: 10 requests at a concurrency of 6; 6 succeeded, 4 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      6
Time taken for tests:   5.019 seconds
Complete requests:      10
Failed requests:        4 (Connect: 0, Length: 4, Exceptions: 0)
My guess is that the first batch of 6 concurrent requests had 1 failure and the second batch of 4 had 3 failures, though I am not certain of the exact breakdown. Either way, burst=5 clearly took effect.
3.9 With the same policy (1 req/s, burst=5), test the local nginx: 10 requests at a concurrency of 5; 10 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      5
Time taken for tests:   9.016 seconds
Complete requests:      10
Failed requests:        0
Non-2xx responses:      10
Everything succeeds presumably because a concurrency of 5 never exceeds the burst=5 queue, in contrast to the failures at a concurrency of 6.
3.10 With the same policy (1 req/s, burst=5), test the local nginx: 10 requests at a concurrency of 7; 6 succeeded, 4 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      7
Time taken for tests:   5.009 seconds
Complete requests:      10
Failed requests:        4 (Connect: 0, Length: 4, Exceptions: 0)
Non-2xx responses:      10
A concurrency of 7 gives the same result as 6, again 4 failures, which puzzled me.
3.11 With the same policy (1 req/s, burst=5), test the local nginx: 10 requests at a concurrency of 10; 6 succeeded, 4 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      10
Time taken for tests:   5.023 seconds
Complete requests:      10
Failed requests:        4 (Connect: 0, Length: 4, Exceptions: 0)
Non-2xx responses:      10
A concurrency of 10 also gives 4 failures, the same as at 6 and 7. (One likely explanation: limit_req serves one request within the rate and queues at most burst=5 more, so from any near-simultaneous batch no more than 6 of the 10 requests survive and the other 4 are rejected, regardless of whether the concurrency is 6, 7, or 10; draining the queue at 1 r/s also accounts for the roughly 5-second run time.)
3.12 Keep the 1 req/s policy with burst=5 and additionally allow each IP only 1 concurrent connection, then test the local nginx: 5 requests at a concurrency of 1; 5 succeeded, 0 failed, since nothing exceeds the limits.
limit_conn addr 1;
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      1
Time taken for tests:   4.025 seconds
Complete requests:      5
Failed requests:        0
Non-2xx responses:      5
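Presumed command for this run, again reconstructed from the numbers above:

    ab -n 5 -c 1 http://210.10.5.189/index.jsp

The roughly 4 seconds again match the 1 r/s rate: the 4 requests after the first are each delayed by about a second.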
3.13 With the same policies (1 req/s, burst=5, limit_conn addr 1), test the local nginx: 5 requests at a concurrency of 2; 5 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      2
Time taken for tests:   4.012 seconds
Complete requests:      5
Failed requests:        0
Non-2xx responses:      5
This is not what I expected; a concurrency of 2 is handled every time, which is hard to understand. Never mind, continue testing and come back to it later.
3.14 With the same policies, test the local nginx: 5 requests at a concurrency of 5; 2 succeeded, 3 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      5
Time taken for tests:   4.010 seconds
Complete requests:      5
Failed requests:        3 (Connect: 0, Length: 3, Exceptions: 0)
Non-2xx responses:      5
This shows that the concurrency limit limit_conn addr 1 is in effect; otherwise 5 concurrent requests would not have failed. But it contradicts the earlier run where a concurrency of 2 was handled, since by the same logic 2 concurrent requests should not have been handled either. Never mind, keep testing.
3.15 With the same policies, test the local nginx: 5 requests at a concurrency of 3; 5 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      3
Time taken for tests:   4.009 seconds
Complete requests:      5
Failed requests:        0
Non-2xx responses:      5
So a concurrency of 3 can also be handled.
3.16 With the same policies, test the local nginx: 5 requests at a concurrency of 4; 5 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      4
Time taken for tests:   4.025 seconds
Complete requests:      5
Failed requests:        0
Non-2xx responses:      5
So a concurrency of 4 can also be handled.
3.17 With the same policies, test the local nginx: 10 requests at a concurrency of 4; 6 succeeded, 4 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      4
Time taken for tests:   13.057 seconds
Complete requests:      10
Failed requests:        4 (Connect: 0, Length: 4, Exceptions: 0)
Non-2xx responses:      10
5 requests at a concurrency of 4 are all handled, yet 10 requests at a concurrency of 4 are not. Completely baffling! Never mind, keep going.
3.18 With the same policies, test the local nginx: 10 requests at a concurrency of 3; 7 succeeded, 3 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      3
Time taken for tests:   11.049 seconds
Complete requests:      10
Failed requests:        3 (Connect: 0, Length: 3, Exceptions: 0)
Non-2xx responses:      10
3.19 With the same policies, test the local nginx: 10 requests at a concurrency of 2; 10 succeeded, 0 failed.
Server Software:        nginx/1.2.6
Server Hostname:        210.10.5.189
Server Port:            80
Document Path:          /index.jsp
Document Length:        168 bytes
Concurrency Level:      2
Time taken for tests:   9.001 seconds
Complete requests:      10
Failed requests:        0
Non-2xx responses:      10
I'll stop here. I've looked at other people's test write-ups and still cannot pin down why the results don't match the plan exactly, but the testing was not pointless: at the very least we know that this configuration will throttle access and, to some extent, protect against DDoS attacks.
That is how nginx can be used to fend off part of a DDoS attack.