This article is reproduced from TuiCool: HTTP://WWW.TUICOOL.COM/ARTICLES/BNVUEZR
Before JMeter 3.0, the official distribution could only display some dimensions of the test results graphically in the tool's UI, which caused me two problems:
- In practice, after integrating JMeter into a platform we need to display TPS curves, average-response-time curves, and other charts, so we had to build them by hand with front-end chart libraries such as Highcharts/ECharts.
- To view historical test results, you have to start the JMeter GUI and import the saved CSV results. The process is cumbersome, and when the result set is large, JMeter needs a considerable amount of time to render the graphical reports.
The new feature discussed in this article offers a better answer to both problems:
- It visualizes the result data well: the generated report is an HTML page that covers most of the measured dimensions of the test, can easily be embedded into a platform, and lets every test run be viewed from a browser.
- You only need to keep the generated HTML pages; to review the results later, simply open them in a browser.
II. Introduction to the New Feature
JMeter 3.0 provides an extension module for generating graphical reports as HTML pages. The module supports creating multidimensional graphical test reports in two ways:
- Automatically generate the HTML graphical report at the end of a JMeter performance test
- Generate the HTML graphical report from an existing result file (such as a CSV file)
The metric dimensions provided by default include:
- APDEX (Application Performance Index)
- Aggregate report
  - Similar to the Aggregate Report listener on the UI
- Errors report
  - Shows the number and percentage of each error type
- Response times over time
  - Shows how the average response time changes over time
  - Similar to the JMeter Plugins listener jp@gc - Response Times Over Time
- Bytes throughput over time
  - Shows how the data throughput per second changes over time
  - Similar to the JMeter Plugins listener jp@gc - Bytes Throughput Over Time
- Latencies over time
  - Shows how latency changes over time
  - Similar to the JMeter Plugins listener jp@gc - Response Latencies Over Time
- Hits per second
  - Similar to the JMeter Plugins listener jp@gc - Hits per Second
- Response codes per second
  - Shows the distribution of HTTP response status codes over time
  - Similar to the JMeter Plugins listener jp@gc - Response Codes per Second
- Transactions per second (TPS)
  - Shows the number of transactions processed per second over time
  - Similar to the JMeter Plugins listener jp@gc - Transactions per Second
- Response time vs. requests per second
  - Shows the relationship between average response time and the number of requests per second (which can be read as QPS)
- Latency vs. requests per second
  - Shows the relationship between latency and the number of requests per second
- Response time percentiles
  - Percentile distribution of the response times
- Active threads over time
  - Shows the number of active threads over time during the test
- Response time vs. threads
  - Shows the relationship between average response time and the number of threads
  - Similar to the JMeter Plugins listener jp@gc - Response Times vs Threads
- Response time distribution (bar chart)
  - Shows the number of requests falling into each response-time interval
Note 1: "latency" is left untranslated; here is how it is calculated:
Latency = the point in time at which the first byte of the response is received − the point in time at which the request started to be sent
From just before sending the request to just after the first response has been received
– Apache JMeter Glossary
Response time ("elapsed" in JMeter terminology) = the point in time at which all response content has been received − the point in time at which the request started to be sent
From just before sending the request to just after the last response has been received
– Apache JMeter Glossary
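The two definitions above can be sketched as simple timestamp arithmetic; the millisecond values below are purely illustrative:

```python
# Illustrative timestamps in milliseconds (hypothetical values).
request_sent = 1_000          # just before the request is sent
first_byte_received = 1_120   # first byte of the response arrives
last_byte_received = 1_450    # last byte of the response arrives

# Latency: from just before sending the request to the first byte received.
latency = first_byte_received - request_sent

# Response time (JMeter's "elapsed"): up to the last byte received.
elapsed = last_byte_received - request_sent

print(latency, elapsed)  # 120 450
```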
Note 2: The Apdex standard quantifies, from the user's point of view, satisfaction with an application's response-time performance as a score in the range 0–1.
Apdex (Application Performance Index) is an open standard developed by an alliance of companies. It defines a standard method for reporting and comparing the performance of software applications in computing.
– Wikipedia
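The standard Apdex formula is (satisfied + tolerating / 2) / total samples. A minimal sketch, assuming the usual two thresholds (the defaults of 500 ms and 1500 ms mirror JMeter's apdex_satisfied_threshold and apdex_tolerated_threshold, discussed later):

```python
def apdex(response_times_ms, satisfied_ms=500, tolerated_ms=1500):
    """Apdex = (satisfied + tolerating / 2) / total samples.

    A sample is "satisfied" if its response time is <= satisfied_ms,
    "tolerating" if it is above that but <= tolerated_ms, and
    "frustrated" otherwise.
    """
    satisfied = sum(1 for t in response_times_ms if t <= satisfied_ms)
    tolerating = sum(1 for t in response_times_ms
                     if satisfied_ms < t <= tolerated_ms)
    return (satisfied + tolerating / 2) / len(response_times_ms)

# Four samples: two satisfied, one tolerating, one frustrated.
print(apdex([200, 450, 900, 3000]))  # 0.625
```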
III. Quick Start
1. Confirm the basic configuration
Confirm the following configuration items in jmeter.properties or user.properties:
jmeter.save.saveservice.bytes = true
jmeter.save.saveservice.label = true
jmeter.save.saveservice.latency = true
jmeter.save.saveservice.response_code = true
jmeter.save.saveservice.response_message = true
jmeter.save.saveservice.successful = true
jmeter.save.saveservice.thread_counts = true
jmeter.save.saveservice.thread_name = true
jmeter.save.saveservice.time = true
# The timestamp format must include the time and should include the date.
# For example the default, which is milliseconds since the epoch:
jmeter.save.saveservice.timestamp_format = ms
# Or the following would also be suitable:
jmeter.save.saveservice.timestamp_format = yyyy/MM/dd HH:mm:ss
If you want the errors report to show more detailed data, you also need to ensure the following configuration:
jmeter.save.saveservice.assertion_results_failure_message = true
- If a Transaction Controller is used, verify that its Generate parent sample option is not checked.
2. Generate Reports
A. Generating the report at the end of a stress test
- Basic command format:
jmeter -n -t <test JMX file> -l <test log file> -e -o <Path to output folder>
- Examples:
jmeter -n -t F:\PerformanceTest\TestCase\script\getToken.jmx -l testLogFile -e -o ./output
B. Generating a report using an existing stress test CSV log file
- Basic command format:
jmeter -g <log file> -o <Path to output folder>
- Examples:
jmeter -g D:\apache-jmeter-3.0\bin\testLogFile -o ./output
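The two command forms above differ only in their flags; a sketch that assembles either command line (the jmeter binary and all file paths are placeholders):

```python
def report_command(output_dir, jmx=None, log_file=None, existing_log=None):
    """Build a JMeter command line for HTML report generation.

    Mode A: run a test non-GUI (-n -t <jmx> -l <log>) and generate the
            report at the end (-e -o <output dir>).
    Mode B: generate a report from an existing result file (-g <log> -o).
    """
    if existing_log is not None:
        return ["jmeter", "-g", existing_log, "-o", output_dir]
    return ["jmeter", "-n", "-t", jmx, "-l", log_file, "-e", "-o", output_dir]

# Mode A: run getToken.jmx and write the report to ./output
print(report_command("./output", jmx="getToken.jmx", log_file="testLogFile"))
# Mode B: reuse an existing log file
print(report_command("./output", existing_log="testLogFile"))
```

Such a list can be passed directly to subprocess.run when driving JMeter from a platform.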
The two examples above produce the following files in the \apache-jmeter-3.0\bin\output directory:
You can view the various graphical reports by opening the index.html file in your browser:
IV. Custom Configuration
JMeter 3.0 adds a reportgenerator.properties file to the bin directory that holds all default configuration for the graphical HTML report generation module. To change the configuration, it is recommended not to edit this file directly but to override the settings in user.properties.
1. Overall configuration
Overall configuration keys are prefixed with jmeter.reportgenerator., for example: jmeter.reportgenerator.overall_granularity=60000
overall_granularity
: Defines the sampling granularity; the default is 60000 ms. In tests other than stability tests we may need a finer granularity, such as 1000 ms, which can be set by adding the following line at the end of the user.properties file:
jmeter.reportgenerator.overall_granularity=1000
report_title
: Defines the title of the report; you may want to set it to the actual name of the test project.
apdex_satisfied_threshold
: Defines the "satisfied" threshold in the Apdex evaluation (in ms).
apdex_tolerated_threshold
: Defines the "tolerated" threshold in the Apdex evaluation.
Apdex_T = (Satisfied Count + Tolerating Count / 2) / Total Samples
In addition, jmeter.properties contains default percentile values for the aggregate report:
aggregate_rpt_pct1 = 90
aggregate_rpt_pct2 = 95
aggregate_rpt_pct3 = 99
You can override them in user.properties, for example: aggregate_rpt_pct1 = 70
; the effect is as follows:
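These settings only change which percentiles the aggregate report displays; the underlying computation is ordinary percentile math. A rough sketch using the nearest-rank method on hypothetical elapsed times (JMeter's exact interpolation may differ):

```python
import math

def percentile(samples_ms, pct):
    """Nearest-rank percentile: the smallest sample value such that at
    least pct% of all samples are at or below it."""
    ordered = sorted(samples_ms)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical elapsed times in milliseconds.
elapsed = [100, 120, 150, 180, 200, 250, 300, 400, 800, 1200]
print(percentile(elapsed, 70))  # 300
print(percentile(elapsed, 90))  # 800
```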
2. Chart configuration
Each chart's configuration keys are prefixed with jmeter.reportgenerator.graph.<chart name>.
classname
: The implementation class of the chart; if you have your own custom implementation, set this to the class name of that custom class.
title
: The chart title; if you want it in Chinese, for example, configure the Chinese title here.
property.set_granularity
: Sets the sampling granularity of the chart; if not configured, the granularity from the overall configuration is used by default.
3. Output configuration
Output configuration keys are prefixed with jmeter.reportgenerator.exporter.
property.output_dir
: Configures the default report output path. On the command line, you can override this with the -o option to set a specific path.
html.series_filter
: Used to filter what is displayed. For example, add the following configuration to user.properties:
jmeter.reportgenerator.exporter.html.series_filter=^(Login)(-success|-failure)?$
The final report will then show only the data for the sampler named Login. The configuration consists of two parts: the first part accepts a regular expression used for filtering, and the (-success|-failure)? part is required by the charts that depend on those series suffixes, such as Transactions per second.
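The filter's behavior can be checked with an ordinary regular-expression engine before putting it into user.properties; a sketch using Python's re, with the pattern anchored at both ends and hypothetical sampler names:

```python
import re

# Filter for a sampler named "Login", keeping the -success/-failure
# series that some charts (e.g. Transactions per second) depend on.
series_filter = re.compile(r"^(Login)(-success|-failure)?$")

names = ["Login", "Login-success", "Login-failure", "Logout", "Search"]
kept = [n for n in names if series_filter.match(n)]
print(kept)  # ['Login', 'Login-success', 'Login-failure']
```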