JMeter Performance (Stress) Testing


1. First, download JMeter from the official website: http://jmeter.apache.org/download_jmeter.cgi

2. Extract the package, open the apache-jmeter-5.0\bin directory, and double-click jmeter.bat to launch it. The following interface appears:

3. Next, create an HTTP request to test. As an example, we will use http://www.baidu.com/s?ie=UTF-8&wd=jmeter performance test:

Test Plan -> Add -> Threads (Users) -> Thread Group

Name: the name of the thread group.

Number of Threads (users): the number of virtual users. Each virtual user occupies one thread, so setting the number of virtual users here sets the number of threads.

Ramp-Up Period (in seconds): the total time taken to start all virtual users. If the number of threads is 10 and the ramp-up period is 2, it takes 2 seconds to start all 10 threads, i.e. 5 threads are started per second.

Loop Count: the number of requests each thread sends. If the number of threads is 10 and the loop count is 100, each thread sends 100 requests, for a total of 10*100 = 1000 requests. If Forever is checked, all threads keep sending requests until you stop the script.

Delay Thread creation until needed: threads are created only when they are actually needed, rather than all at start-up.

Scheduler: specifies the start time and end time of the thread group. When the scheduler is enabled, the loop count must be set to Forever.
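The arithmetic behind these settings can be sketched with two small helper functions (a plain illustration, not JMeter code; the function names are invented here):

```python
# Illustration of the thread-group arithmetic described above.
# These are plain calculations, not part of JMeter itself.

def start_rate(num_threads, ramp_up_seconds):
    """Threads started per second during the ramp-up period."""
    return num_threads / ramp_up_seconds

def total_requests(num_threads, loop_count):
    """Total requests sent by the whole thread group."""
    return num_threads * loop_count

print(start_rate(10, 2))        # -> 5.0 threads started per second
print(total_requests(10, 100))  # -> 1000 requests in total
```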

4. Add the request to be tested: Thread Group -> Add -> Sampler -> HTTP Request

The parameters are as follows:

ie: the character encoding; the default is UTF-8.

wd: the search term, "jmeter performance test" here.
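For reference, the full request URL can be assembled from these two parameters. This sketch uses Python's urllib.parse, purely to illustrate how the query string is encoded (JMeter does this for you):

```python
from urllib.parse import urlencode

# Build the Baidu search URL used in this example; the parameter
# names (ie, wd) come from the request described above.
params = {"ie": "UTF-8", "wd": "jmeter performance test"}
url = "http://www.baidu.com/s?" + urlencode(params)
print(url)  # -> http://www.baidu.com/s?ie=UTF-8&wd=jmeter+performance+test
```

Note that urlencode percent-encodes the parameters (spaces become "+"), which is what actually goes over the wire.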

5. Add a View Results Tree listener: right-click the thread group and choose Add -> Listener -> View Results Tree.

Change the response data format to "HTML Source Formatted" to see the data returned by this request under Response data:

6. Add user-defined variables: right-click the thread group and choose Add -> Config Element -> User Defined Variables.

Set wd (the search term) as a user-defined variable:

You can then reference this variable in the HTTP request as ${wd}; here its value is "jmeter performance test".
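JMeter's ${wd} reference is simple template substitution. Python's string.Template happens to use the same ${...} syntax, so the idea can be sketched as follows (an analogy only, not JMeter's implementation):

```python
from string import Template

# User-defined variables, as in step 6 (illustrative values only)
variables = {"wd": "jmeter performance test"}

# A request path containing a ${wd} reference, as JMeter would see it
path = Template("/s?ie=UTF-8&wd=${wd}")
print(path.substitute(variables))  # -> /s?ie=UTF-8&wd=jmeter performance test
```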

7. Add an assertion

Right-click the HTTP request -> Add -> Assertions -> Response Assertion

Select Main sample only, Document (text), and Contains, and add the user-defined variable under Patterns to Test. The search results must contain the search term; otherwise the request fails.
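The Contains assertion amounts to a substring check against the response document — roughly like this (a sketch of the idea, not JMeter's actual implementation; the response body is made up):

```python
def response_assertion(document_text, pattern):
    """'Contains' assertion: pass if the pattern appears in the document."""
    return pattern in document_text

# A made-up response body, standing in for the search results page
body = "Baidu search results for jmeter performance test ..."

print(response_assertion(body, "jmeter performance test"))     # -> True (assertion passes)
print(response_assertion(body, "jmeter performance test111"))  # -> False (assertion fails)
```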

8. Add an Assertion Results listener

Right-click the HTTP request -> Add -> Listener -> Assertion Results

If the request succeeds, only the HTTP request name is shown; otherwise the Response Assertion failure is reported, e.g. "Test failed: document expected to contain /.../" (I deliberately changed the pattern to ${wd}111 so that the assertion fails).

9. Add an Aggregate Report

Right-click the thread group -> Add -> Listener -> Aggregate Report

After the run completes (you may first adjust the performance-related parameters: number of threads, loop count, and duration; here I am still running a single thread), the Aggregate Report looks like this:

Detailed description of the Aggregate Report columns:

Label: every JMeter element has a Name attribute; its value is displayed here.

# Samples: the total number of requests sent in this test. If 10 users are simulated and each user iterates 10 times, 100 is displayed here.

Average: the average response time. By default this is the average per request; when a Transaction Controller is used, it can be shown per transaction instead.

Median: the median, i.e. the response time within which 50% of requests completed.

90% Line: the response time within which 90% of requests completed.

Min: the minimum response time.

Max: the maximum response time.

Error %: the error rate, i.e. the number of failed requests divided by the total number of requests.

Throughput: by default, the number of requests completed per second.

KB/sec: the amount of data received from the server per second, equivalent to Throughput/sec in LoadRunner.

In general, the figures to focus on in performance testing are: # Samples (request count), Average (average response time), Min (minimum response time), Max (maximum response time), Error % (error rate), and Throughput.
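How these columns are derived from raw samples can be sketched in Python with made-up response times (the numbers, the two-second duration, and the simple 90%-line index below are illustrative assumptions, not JMeter's exact percentile algorithm):

```python
import statistics

# Hypothetical per-request response times (ms) and error flags,
# standing in for one small test run of 10 samples.
times_ms = [120, 95, 110, 300, 105, 98, 130, 250, 115, 102]
errors   = [False, False, False, True, False, False, False, False, False, False]
duration_s = 2.0  # assumed total test duration in seconds

samples    = len(times_ms)                         # "# Samples"
average    = sum(times_ms) / samples               # "Average"
median     = statistics.median(times_ms)           # "Median"
line_90    = sorted(times_ms)[int(0.9 * samples) - 1]  # simple "90% Line"
minimum    = min(times_ms)                         # "Min"
maximum    = max(times_ms)                         # "Max"
error_rate = sum(errors) / samples * 100           # "Error %"
throughput = samples / duration_s                  # "Throughput" (req/s)

print(samples, average, median, line_90, minimum, maximum, error_rate, throughput)
```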

