Since the site now needs to withstand high concurrency, it is best to stress test the code before it goes live.
Run:
Under Windows, open a CMD window and change into the bin directory of the Apache installation:
cd "C:\Program Files (x86)\Apache Software Foundation\Apache2.2\bin"
Type the command:
ab -n 800 -c 800 http://192.168.0.10/
(-n issues 800 requests in total; -c simulates 800 concurrent clients, i.e. 800 users at once; the test URL comes last)
ab -t 60 -c 100 http://192.168.0.10/
(runs the test for 60 seconds, keeping 100 requests in flight at a time)
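As a rough mental model (an illustration, not ab's exact scheduling), -n and -c together mean ab keeps c requests in flight until n have completed, i.e. roughly n/c waves of c requests:

```shell
# Rough mental model (a sketch, not ab's exact scheduling): -n total
# requests with -c concurrent clients means roughly n/c "waves" of c
# requests each.
n=800
c=100
waves=$(awk -v n="$n" -v c="$c" 'BEGIN { printf "%d", n / c }')
echo "about $waves waves of $c concurrent requests"
```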
Explanation of the result fields:
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking 192.168.0.10 (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Finished 800 requests
Server Software:        Microsoft-HTTPAPI/2.0
Server Hostname:        192.168.0.10
Server Port:            80
Document Path:          /
Document Length:        315 bytes (the body length of the HTTP response)
Concurrency Level:      800
Time taken for tests:   0.914 seconds (total time for all requests to complete)
Complete requests:      800 (number of completed requests)
Failed requests:        0 (number of failed requests)
Write errors:           0
Non-2xx responses:      800 (every response carried a non-2xx status code, so the 315-byte document is likely an error page)
Total transferred:      393600 bytes (total network traffic)
HTML transferred:       252000 bytes (HTML content transferred)
Requests per second:    875.22 [#/sec] (mean) (throughput: requests served per second)
Time per request:       914.052 [ms] (mean) (average time for a full batch of concurrent requests to be answered)
Time per request:       1.143 [ms] (mean, across all concurrent requests) (average time consumed per request across all concurrency)
Transfer rate:          420.52 [Kbytes/sec] received (average network traffic per second; helps identify response times inflated by excessive network traffic)
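The headline figures in the report are simple functions of one another; recomputing them from the run above makes the relationships clear (small differences against ab's own numbers come from rounding of the elapsed time):

```shell
# Recompute the derived fields from "Complete requests", "Time taken for
# tests" and "Concurrency Level" (values copied from the report above).
requests=800
time_taken=0.914   # seconds
concurrency=800

# Requests per second = requests / time taken
rps=$(awk -v r="$requests" -v t="$time_taken" 'BEGIN { printf "%.2f", r / t }')
# Time per request (mean) = time taken * 1000 * concurrency / requests
tpr=$(awk -v r="$requests" -v t="$time_taken" -v c="$concurrency" \
          'BEGIN { printf "%.1f", t * 1000 * c / r }')

echo "Requests per second: $rps"
echo "Time per request (mean): $tpr ms"
```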
Breakdown of time spent on the network:
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.5      1       3
Processing:   245  534 125.2    570     682
Waiting:       11  386 189.1    409     669
Total:        246  535 125.0    571     684
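The Total row is simply Connect plus Processing (Waiting is the part of Processing spent waiting for the first byte of the response); checking the mean column from the table above:

```shell
# Sanity check on the Connection Times table: Total = Connect + Processing
# (mean values copied from the table above).
connect=1
processing=534
total=535
sum=$((connect + processing))
echo "connect + processing = $sum, reported total = $total"
```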
This section covers the response time of every request in the test run: 50% of requests were answered in under 571 milliseconds, 80% in under 652 milliseconds, and the slowest request took 684 milliseconds.
Percentage of the requests served within a certain time (ms)
  50%    571
  66%    627
  75%    646
  80%    652
  90%    666
  95%    677
  98%    681
  99%    682
 100%    684 (longest request)
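When you save ab's output to a file, the percentile table is easy to post-process; a sketch using awk (the here-doc stands in for real `ab ... > ab.out` output):

```shell
# Extract the median (50th percentile) from a saved ab report.
# The here-doc is a trimmed copy of the percentile table above; in practice
# you would redirect real ab output into ab.out first.
cat > ab.out <<'EOF'
Percentage of the requests served within a certain time (ms)
  50%    571
  90%    666
 100%    684 (longest request)
EOF
median=$(awk '$1 == "50%" { print $2 }' ab.out)
echo "median response time: $median ms"
```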
This is just one of the available test methods.
After several rounds of testing, I observed the following:
1. The more complex the page, the lower the success rate under concurrency.
lottery.php (a relatively complex page)
Load concurrency: 1000; simultaneous online users: 500
Complete requests: 1000
Failed requests: 965
index.php (a relatively simple page)
Load concurrency: 1000; simultaneous online users: 500
Complete requests: 1000
Failed requests: 15
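Expressed as failure rates, the contrast between the two pages is stark (simple awk arithmetic on the counts above):

```shell
# Failure rate = failed requests / complete requests, using the counts
# from the two runs above.
lottery_rate=$(awk 'BEGIN { printf "%.1f", 965 / 1000 * 100 }')
index_rate=$(awk 'BEGIN { printf "%.1f", 15 / 1000 * 100 }')
echo "lottery.php: ${lottery_rate}% failed"
echo "index.php: ${index_rate}% failed"
```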
Apache ab concurrent load stress test