Experience and comparison of AB and JMeter in GET/POST stress testing


Introduction: Stress testing is a very important way to evaluate an Internet service. ab, webbench, and JMeter are all popular testing tools in the industry. ab and webbench are lightweight command-line tools, while JMeter is a more advanced tool with a GUI; each has its own characteristics. Since ab offers more features than webbench, we select ab and JMeter for comparison.

[Installation of test environment]
[AB]

ab is short for ApacheBench. As the name implies, it is a web stress-testing tool developed by the Apache project, with the advantages of convenient use and powerful statistics.

ab is such a popular stress-testing tool that I will not describe it in detail here; instead, I will share my personal experience.

First, installation. Both Ubuntu and CentOS provide one-command installation from their package repositories (at least on Ubuntu 14 and CentOS 6):

Ubuntu: sudo apt-get install apache2-utils

CentOS: yum install httpd-tools

After installation, you can start testing.
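A quick way to confirm the installation worked is to print the version (the exact banner varies by version):

ab -V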

Common ab parameters include -n, -t, and -c:

-c (concurrency) sets the number of concurrent requests used in the test;

-t sets the duration of the test;

-n sets the number of test requests to send.

Generally, you use either -t or -n, not both.

Simulating a GET request with ab is very simple:

ab -n 100 -c 10 'http://testurl.com/xxxx?para1=aaa&para2=bbb'

Simulating a POST request is a little more complicated: you need to put the data to be POSTed (generally JSON) in a file. For example, suppose a POST interface is normally accessed like this:

curl -H 'Content-Type: application/json' -X POST -d '{"actionType": "collect", "appId": 1, "contentId": "1770730744", "contentType": "musictrack", "did": "866479025346031", "endType": "mobile", "recommendId": "104169490_1_0_1434453099#1770730744#musictrack#USER_TO_SONG_TO_SONGS#gsql_similarity_content2content", "tabId": 0, "uid": "104169490"}' http://localhost:8083/query/leui/v0/post/user/behavior/content

You need to put the JSON data that follows -d into a file, for example post_data.txt:

{"ActionType": "collect", "appId": 1, "contentId": "1770730744", "contentType": "musictrack", "did": "866479025346031 ", "endType": "mobile", "recommendId": "comment #1770730744 # musictrack # USER_TO_SONG_TO_SONGS # gsql_similarity_content2content", "tabId": 0, "uid": "104169490 "}

Then send the file's contents as the POST body with the -p parameter, setting the Content-Type header with -T:

ab -n 100 -c 10 -p post_data.txt -T 'application/json' http://localhost:8083/query/leui/v0/post/user/behavior/content

[JMeter]
JMeter is a very powerful yet user-friendly GUI tool for configuring HTTP tests. The user manual bundled in the Help menu is richly illustrated and approachable for beginners.

Configurable parameters for HTTP testing include: (1) HTTP request configuration: target host, port, URL path, HTTP request parameters, POST data, and HTTP headers; (2) global test strategy: the number of concurrent threads and the number of loops (JMeter has no test-duration setting, only the number of times the request is repeated, or infinite repetition). Every parameter in (1) can be replaced with a variable written as ${variable_name}. Variables can be fed from an external CSV file: in the GUI, right-click the Test Plan, then Add -> Config Element -> CSV Data Set Config, where you set the CSV file path and the variable name bound to each CSV column.

Beyond the features mentioned above, JMeter has many other powerful settings; they are described in detail in the help documentation and in many online articles, such as guides to the CSV input settings.

Generally, the following settings are used.

Test Plan

Test Plan ---- Thread Group

Thread Group ---- Config Element ---- HTTP Header Manager

Thread Group ---- Config Element ---- CSV Data Set Config

Thread Group ---- Sampler ---- HTTP Request, which contains two tabs: a "Parameters" tab for HTTP request parameters, and a post-data tab where the POST body (generally a JSON string) is configured. Every field in the JSON string can be represented by a variable such as ${xxx}; see the sketch below.
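To make the variable mechanism concrete, here is a minimal sketch; the file name, column bindings, and field choices are hypothetical. Suppose post_paras.txt contains two columns that CSV Data Set Config binds to the variables uid and did:

104169490,866479025346031
104169491,866479025346032

The POST body in the HTTP Request sampler can then reference them:

{"uid": "${uid}", "did": "${did}", "appId": 1, "actionType": "collect"}

Each thread reads the next CSV line on each loop iteration, so every request carries a different uid/did pair.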

Once these items are configured and saved, JMeter generates a .jmx file, which is used not only for saving and modifying the configuration but also for running in a non-GUI (command-line/shell) environment.

Because our stress-testing environment is usually a Linux system, and to get the maximum performance out of the stress-testing tool, it is best to run it in non-GUI mode.

JMeter is written in Java and needs no installation; in a Linux shell, after extracting the JMeter archive, run:

${jmeter_install_dir}/bin/jmeter -n -t $target -l xxxx.jtl

-n runs in non-GUI (silent) mode;

-t is followed by the path of the .jmx configuration file;

-l writes an output file that records the timing of every request; you can later open it in the JMeter GUI to generate the final Aggregate Report.

When running JMeter in command-line mode, you can also pass parameters in from the shell: just append -JXXXX=value to the jmeter command line, and JMeter recognizes XXXX as an external input variable whose value is value. In the JMeter configuration, this external variable is referenced as ${__P(XXXX)}; note that __P starts with two underscores.
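As a concrete sketch (the property names IP and PORT are just examples), the target host and port can be passed in from the shell:

jmeter -n -t test.jmx -JIP=127.0.0.1 -JPORT=8083 -l result.jtl

In the .jmx, the HTTP Request sampler's server name field would then be ${__P(IP)} and the port field ${__P(PORT)}.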

[Test setup and result comparison]

Next, a practical example to compare the results of ab and JMeter stress tests. First, the GET request.

[AB]

Stress test with 10 concurrent connections for 100 seconds:

# ab -t 100 -c 10 'http://localhost:8083/xxxx?uid=1233435&did=123456789&appId=1'

This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking xxx.xxx (be patient)
Finished 733 requests

Server Software:        CppCMS-Embedded/1.0.4
Server Hostname:        xxx.xxx
Server Port:            8083

Document Path:          /xxx?uid=79057533&did=123456789&appId=1
Document Length:        4601 bytes

Concurrency Level:      10
Time taken for tests:   100.137 seconds
Complete requests:      733
Failed requests:        732
   (Connect: 0, Receive: 0, Length: 732, Exceptions: 0)
Write errors:           0
Total transferred:      3672653 bytes
HTML transferred:       3572232 bytes
Requests per second:    7.32 [#/sec] (mean)
Time per request:       1366.124 [ms] (mean)
Time per request:       136.612 [ms] (mean, across all concurrent requests)
Transfer rate:          35.82 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        1    2   2.4      2      40
Processing:   342 1352 636.3   1183    6046
Waiting:      342 1351 636.2   1183
Total:        345 1354 636.8   1186

Percentage of the requests served within a certain time (ms)
  50%   1185
  66%   1333
  75%   1460
  80%   1564
  90%   1835
  95%   2357
  98%   3248
  99%   5205
 100%   6049 (longest request)

[JMeter]

Run JMeter with the same configuration (10 threads, a ramp-up time of 1 second, and a timeout threshold of 3000 ms).
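The run below was executed in non-GUI mode; the invocation presumably looked something like this (a sketch, assuming the thread count and ramp-up were exposed as -J properties as described earlier):

${jmeter_install_dir}/bin/jmeter -n -t music_api_uid.jmx -JTHREAD=10 -JRAMP=1 -l get_test.jtl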

The running result is as follows:
Creating summariser <summary>
Created the tree successfully using music_api_uid.jmx
Starting the test @ Thu Nov 19 11:19:43 CST 2015 (1447903183454)
Waiting for possible shutdown message on port 4445
Summary + 90 in 16 s = 5.7/s Avg: 1677 Min: 959 Max: 3757 Err: 0 (0.00%) Active: 10 Started: 10 Finished: 0
Summary + 202 in 31.1 s = 6.5/s Avg: 1477 Min: 912 Max: 2727 Err: 0 (0.00%) Active: 10 Started: 10 Finished: 0
Summary = 292 in 46 s = 6.4/s Avg: 1539 Min: 912 Max: 3757 Err: 0 (0.00%)
Summary + 164 in 31 s = 5.3/s Avg: 1830 Min: 972 Max: 5009 Err: 5 (3.05%) Active: 10 Started: 10 Finished: 0
Summary = 456 in 76 s = 6.0/s Avg: 1643 Min: 912 Max: 5009 Err: 5 (1.10%)

Finally, opening the detailed request log in the GUI and generating the Aggregate Report gives:

Samples: 576
Average: 1713
Median: 1496
90% Line: 2353
Min: 912
Max: 5009
Throughput: 5.8/sec
KB/sec: 27.8
Error %: 2.08%

Comparison of GET API stress-test results

                                      ab       JMeter
Total requests sent                   733      576
Average request time (ms)             1366     1713
Median request time (50% <) (ms)      1185     1496
90% request time (ms)                 1835     2353
Error rate                            -        2.08%
QPS                                   7.32     6

In these two tests, ab completed 733 requests while JMeter completed 576. This figure is not precise, because JMeter does not support an exact time limit for a test; I terminated JMeter manually at the 100-second mark, so some requests may have been missed. However, I later ran both tools with the same total number of requests (-n for ab, threads * loops for JMeter; see the sketch below), and JMeter was still about 15% slower than ab, which may be related to JMeter's own statistics overhead.
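For reference, the equal-request-count follow-up would look roughly like this (the total of 1000 is purely illustrative; the article does not record the actual count):

ab -n 1000 -c 10 'http://localhost:8083/xxxx?uid=1233435&did=123456789&appId=1'
# JMeter: 10 threads x 100 loops in the Thread Group gives the same 1000 requests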

Because the test durations were not strictly equal, we mainly look at the response-time distribution: ab's numbers are generally lower. Both tools ran in the same environment with the same parameters, so I suspect JMeter calculates response time differently; its reported average time is also higher.

I watched the backend logs of the web interface and confirmed that both tests used the same parameters. The difference in results can only be attributed to differences in the tools' statistical methods (for example, how the response is measured) and in their HTTP access behavior (for example, with the same setting of 10 concurrent connections, both are generally understood to open 10 threads that continuously issue requests, but their thread-scheduling policies differ, so the pressure on the server differs and so does the measured performance).

After the GET comparison, let's compare the POST API test results:

[AB]

# ab -t 100 -c 10 -p post_data.txt -T 'application/json' http://localhost:8083/xxxxx

This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking xxx.xxx (be patient)
Completed 5000 requests
Completed 10000 requests
Finished 12937 requests

Server Software:        CppCMS-Embedded/1.0.4
Server Hostname:        xxx.xxx
Server Port:            8083

Document Path:          /xxxxx
Document Length:        92 bytes

Concurrency Level:      10
Time taken for tests:   100.001 seconds
Complete requests:      12937
Failed requests:        0
Write errors:           0
Total transferred:      2962573 bytes
Total POSTed:           4828858
HTML transferred:       1190204 bytes
Requests per second:    129.37 [#/sec] (mean)
Time per request:       77.299 [ms] (mean)
Time per request:       7.730 [ms] (mean, across all concurrent requests)
Transfer rate:          28.93 [Kbytes/sec] received
                        47.16 kb/s sent
                        76.09 kb/s total

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        1    2   8.9      1    1001
Processing:    31   76  78.5     69    2452
Waiting:       31   75  77.7     69    2452
Total:         33   77  79.0     71    2454

Percentage of the requests served within a certain time (ms)
  50%     71
  66%     80
  75%     88
  80%     91
  90%    101
  95%    113
  98%    124
  99%    140
 100%   2454 (longest request)

[JMeter]

The result of JMeter configured with the same parameters is:

# ../apache-jmeter-2.11/bin/jmeter -n -t post.jmx -JCSV=1_post_paras.txt -JIP=xxx.xxx.xxx.xxx -JPORT=8083 -JTHREAD=10 -JRAMP=1 -l "post_test.log"

Creating summariser <summary>
Created the tree successfully using post_to_recommend_user_action_server.jmx
Starting the test @ Tue Nov 17 20:49:37 CST 2015 (1447764577991)
Waiting for possible shutdown message on port 4445
Summary + 3978 in 21.1 s = 188.5/s Avg: 51 Min: 32 Max: 1049 Err: 0 (0.00%) Active: 10 Started: 10 Finished: 0
Summary + 3796 in 30.1 s = 126.2/s Avg: 78 Min: 34 Max: 1596 Err: 0 (0.00%) Active: 10 Started: 10 Finished: 0
Summary = 7774 in 51.1 s = 152.1/s Avg: 64 Min: 32 Max: 1596 Err: 0 (0.00%)
Summary + 3273 in 30.1 s = 108.8/s Avg: 91 Min: 37 Max: 3091 Err: 1 (0.03%) Active: 10 Started: 10 Finished: 0
Summary = 11047 in 81.1 s = 136.2/s Avg: 72 Min: 32 Max: 3091 Err: 1 (0.01%)

Aggregate Report Analysis for post_test.log

Samples: 11899
Average: 58
Median: 52
90% Line: 76
Min: 27
Max: 3091
Error %: 0.01%
Throughput: 7.6/sec
KB/sec: 1.9

Comparison of POST API stress-test results

                                      ab       JMeter
Completed requests                    12937    11899
Average response time (ms)            77       58
Maximum response time (ms)            -        3091
Minimum response time (ms)            -        27
Median request time (50% <) (ms)      71       52
90% of requests below (ms)            101      76
Error rate (basically timeouts)       0        0.01%
QPS                                   129      136

[Usage comparison summary]
My personal experience is:

In terms of statistical reporting, ab has the edge: its results are more readable and more helpful for analysis. Discrepancies in some of the figures come down to differences in source-code implementation, but the error is within an acceptable range. What a stress test really needs to show is the server's general capacity to cope with load and how its performance trends as pressure increases, so the exact numbers from ab and JMeter do not matter much; as long as they do not differ greatly, they serve to validate each other.

As for test-plan flexibility, JMeter is dominant: it supports parameter variables and CSV data-set input, so it can build more complex test samples and has wider applicability.

ab needs no configuration file; a few command-line parameters are enough to run a stress test, which makes it well suited to HTTP services with simple interface logic.
