http_load Learning Notes:
Testing the average amount of traffic (throughput) the site can handle per second:
http_load -parallel 5 -fetches 1000 urls.txt
This command uses 5 parallel connections to fetch the URLs listed in urls.txt at random, for a total of 1000 fetches. Results after the run:
1000 fetches, 5 max parallel, 6e+06 bytes, in 58.1026 seconds
6000 mean bytes/connection
17.2109 fetches/sec, 103266 bytes/sec
msecs/connect: 0.403263 mean, 68.603 max, 0.194 min
msecs/first-response: 284.133 mean, 5410.13 max, 55.735 min
HTTP response codes:
  code 200 -- 1000
Judging from these results, the target site can only handle about 17 requests per second; not particularly strong.
Testing whether the site can withstand the expected load:
http_load -rate 2 -seconds 300 urls.txt
This keeps hitting the target URLs at a fixed rate (2 requests per second) for 300 seconds.
Note:
- urls.txt holds the list of URLs to fetch, one per line
- Do not run load tests against a live production site; crashing it is no fun
For example:
1. http_load -parallel 5 -fetches 1000 urls.txt
2. http_load -rate 2 -seconds 300 urls.txt
3. http_load -p 30 -s urllist.txt
4. http_load -parallel 50 -s 10 urls.txt
The last command uses 50 concurrent connections to fetch the URLs in urls.txt at random, for a total of 10 seconds.
Parameter description:
-parallel (short form -p): the number of concurrent connections
-fetches (short form -f): the total number of fetches
-rate (short form -r): the number of requests started per second
-seconds (short form -s): the total test duration in seconds
The parameters are combined in pairs: pick one concurrency option (-parallel or -rate) and one stop condition (-fetches or -seconds).
urls.txt holds the list of URLs to fetch, one URL per line. For meaningful results, a list of 50-100 URLs works better than just a few entries. The file format is as follows:
http://iceskysl.1sters.com/?action=show&id=336
http://iceskysl.1sters.com/?action=show&id=335
http://iceskysl.1sters.com/?action=show&id=332
http://iceskysl.1sters.com/?action=show&id=32
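For convenience, a URL list like the one above can be generated with a short shell loop. This is only a sketch: the host iceskysl.1sters.com and the id range are taken from the sample entries above, so substitute URLs from your own site.

```shell
# Generate a urls.txt with 61 entries (http_load works best with 50-100 URLs).
# The host and query-string format come from the sample list above;
# replace them with URLs from your own site.
: > urls.txt
for id in $(seq 300 360); do
    echo "http://iceskysl.1sters.com/?action=show&id=${id}" >> urls.txt
done
wc -l urls.txt
```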
With the parameters understood, let's run a command and look at its output.
Command: % ./http_load -rate 5 -seconds 10 urls
Explanation: run a 10-second test at an access frequency of 5 requests per second.
49 fetches, 2 max parallel, 289884 bytes, in 10.0148 seconds
5916 mean bytes/connection
4.89274 fetches/sec, 28945.5 bytes/sec   (key performance indicator: throughput)
msecs/connect: 28.8932 mean, 44.243 max, 24.488 min   (key indicator: response time)
msecs/first-response: 63.5362 mean, 81.624 max, 57.803 min
HTTP response codes:
  code 200 -- 49
Result analysis:
1. 49 fetches, 2 max parallel, 289884 bytes, in 10.0148 seconds
   49 requests were completed, with at most 2 concurrent connections; 289884 bytes of data were transferred over a run time of 10.0148 seconds.
2. 5916 mean bytes/connection
   The average amount of data transferred per connection: 289884/49 = 5916.
3. 4.89274 fetches/sec, 28945.5 bytes/sec (throughput: requests per unit of time)
   The server answered 4.89274 requests per second and transferred 28945.5 bytes per second. This value is calculated as 49 fetches / 10.0148 seconds.
4. msecs/connect: 28.8932 mean, 44.243 max, 24.488 min (response time per request: mean, max, min)
   The mean response time per connection was 28.8932 msecs, the maximum 44.243 msecs, and the minimum 24.488 msecs.
5. msecs/first-response: 63.5362 mean, 81.624 max, 57.803 min
   The time until the first byte of the response arrived: mean, max, and min.
6. HTTP response codes: code 200 -- 49
Watch the distribution of response codes: if many non-200 codes such as 403 appear, the system may be hitting a bottleneck.
Special note: the indicators we generally pay attention to here are fetches/sec and msecs/connect.
They correspond to the common performance indicators QPS (the number of requests served per second) and the response time per connection.
Test results are judged mainly by these two values. Of course, these two indicators alone do not complete a performance analysis; we also need to analyze the server's CPU and memory before drawing conclusions.
In addition, the main indicator in the test results is fetches/sec, the number of requests the server can answer per second, which is used to measure performance. It seems somewhat more accurate and convincing than Apache's ab.
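The derived numbers in the report can be reproduced by hand from the raw totals. A small awk sketch, using the figures from the run above, shows how each metric falls out:

```shell
# Recompute the derived metrics from the raw totals reported above:
# 49 fetches, 289884 bytes, 10.0148 seconds.
fetches=49; bytes=289884; seconds=10.0148
awk -v f="$fetches" -v b="$bytes" -v s="$seconds" 'BEGIN {
    printf "mean bytes/connection: %d\n", b / f   # bytes per fetch
    printf "fetches/sec: %.2f\n", f / s           # throughput
    printf "bytes/sec: %.1f\n", b / s
}'
```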
http_load test parameter comparison (June 11th, posted in http_load by Johnny Woo)
./http_load -parallel 200 -seconds urls
Ends the test after a fixed time, which compares how fast the server under test responds within the same period.
./http_load -parallel 200 -fetches urls
Tests with a fixed number of requests, which compares response speed under the same total traffic.
Both approaches measure the server's response speed, but -fetches makes it easier to actually put the server under pressure. Because -seconds only controls the test duration, a short run may finish before the client has issued enough requests, that is, before the server feels any real load. Problems such as memory leaks, poor resource reclamation, or responses that degrade over time are unlikely to surface under those conditions. With -fetches, by contrast, the client guarantees that the full number of requests is processed. With time as the control parameter, an impatient tester may set -seconds too small and render the results meaningless. It is therefore recommended to use -fetches as the test parameter and as the benchmark for comparison.
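The point about short -seconds runs is easy to quantify: with -rate, the total number of requests issued is roughly rate * seconds. A back-of-the-envelope sketch, using the -rate 2 -seconds 300 example from earlier:

```shell
# Approximate total requests for a rate-driven run: rate * seconds.
# With -rate 2 -seconds 300 (the earlier example), that is only ~600 requests;
# shrink -seconds and the server may never feel any real pressure.
rate=2
seconds=300
echo "approx total requests: $((rate * seconds))"
```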
On the "byte count wrong" error during http_load tests:
if the page data http_load fetches differs in length from what it received previously, it reports "byte count wrong". For dynamic pages, whose returned content differs between requests, this error can be ignored.
Webbench
Webbench is a web site stress-testing tool for Linux that can simulate up to 30,000 concurrent connections to test a site's load capacity. It is easy to find with a search engine; one download link:
http://cid-9601b7b7f2063d42.skydrive.live.com/self.aspx/Public/webbench-1.5.tar.gz
The program is small: less than 50K after extraction.
Installation is very simple:
# tar zxvf webbench-1.5.tar.gz
# cd webbench-1.5
# make && make install
This builds the webbench executable, which can be used directly.
Usage:
webbench -c <concurrency> -t <test duration in seconds> <URL>
For example:
webbench -c 5000 -t http://www.askwan.com
ab
ab is a powerful benchmarking tool from Apache.
It usually comes preinstalled alongside Apache.
See its help output for usage:
$ ./ab
./ab: wrong number of arguments
Usage: ./ab [options] [http://]hostname[:port]/path
Options are:
    -n requests     Number of requests to perform
    -c concurrency  Number of multiple requests to make
    -t timelimit    Seconds to max. wait for responses
    -p postfile     File containing data to POST
    -T content-type Content-type header for POSTing
    -v verbosity    How much troubleshooting info to print
    -w              Print out results in HTML tables
    -i              Use HEAD instead of GET
    -x attributes   String to insert as table attributes
    -y attributes   String to insert as tr attributes
    -z attributes   String to insert as td or th attributes
    -C attribute    Add cookie, eg. 'Apache=1234'. (repeatable)
    -H attribute    Add arbitrary header line, eg. 'Accept-Encoding: gzip'
                    Inserted after all normal header lines. (repeatable)
    -A attribute    Add Basic WWW authentication, the attributes
                    are a colon separated username and password.
    -P attribute    Add Basic Proxy authentication, the attributes
                    are a colon separated username and password.
    -X proxy:port   Proxyserver and port number to use
    -V              Print version number and exit
    -k              Use HTTP KeepAlive feature
    -d              Do not show percentiles served table.
    -S              Do not show confidence estimators and warnings.
    -g filename     Output collected data to gnuplot format file.
    -e filename     Output CSV file with percentages served
    -h              Display usage information (this message)
Of the many options, -n and -c are the ones generally used.
For example:
./ab -c 1000 -n 100 http://www.askwan.com/index.php
Here -c 1000 sets the number of concurrent connections and -n 100 the total number of requests to index.php. Note that ab requires the total request count (-n) to be at least the concurrency level (-c), so in practice the larger number goes with -n.