JMeter Performance Test 3.0 - Multi-dimensional Graphical HTML Report

In the previous post, JMeter Performance Test 3.0 - New JMeter Plugin Management, I said I would write about the genuinely new features of JMeter 3.0. Two weeks later, I have finally set aside other plans this weekend to continue this unfinished series.
This article mainly introduces a new feature of JMeter 3.0: the Dashboard Report, a multi-dimensional graphical test report in HTML format. With this feature, we can greatly reduce the difficulty of building a performance testing platform on top of JMeter, and put more energy into the platform's back-end functionality rather than learning a front-end chart library on the fly.

I. Why talk about this new feature

Before JMeter 3.0, the official distribution only provided graphical display of some dimensions of the test results in the tool's UI, which caused me two problems:

    1. In actual use, once JMeter is integrated into a platform that needs to display the TPS curve, the average response time curve and other charts, we have to work with front-end chart libraries such as Highcharts/ECharts ourselves.
    2. To view historical test results, we had to start the JMeter GUI and import the saved CSV results. The process is cumbersome, and when the result set is large, JMeter needs a considerable amount of time to render the graphical reports in the UI.

The new feature discussed in this article provides a better solution to both problems:

    • The new feature visualizes the result data well: the generated report is an HTML page that covers most of the measurement dimensions of real-world tests, and it can easily be embedded into a platform so each test run can be viewed from the browser.
    • Just keep the generated HTML pages; to review a test's results later, simply open them in the browser, easy and quick.
II. Introduction to the new feature

JMeter 3.0 provides an extension module for generating graphical reports as HTML pages. This module supports creating multi-dimensional graphical test reports in two ways:

    1. Automatically generate an HTML graphical report for the current test at the end of a JMeter performance test run
    2. Use an existing result file (such as a CSV file) to generate an HTML graphical report from the results

The metric dimensions provided by default include:

  1. Apdex (Application Performance Index)
  2. Aggregate report
    • Similar to the Aggregate Report listener in the UI
  3. Errors report
    • Shows the number and percentage of the different error types
  4. Response time over time curve
    • Shows how the average response time changes over time
    • Similar to the JMeter plugin listener jp@gc - Response Times Over Time in the UI
  5. Bytes throughput over time curve
    • Shows how the data throughput per second changes over time
    • Similar to the JMeter plugin listener jp@gc - Bytes Throughput Over Time in the UI
  6. Latency over time curve
    • Shows how latency changes over time
    • Similar to the JMeter plugin listener jp@gc - Response Latencies Over Time in the UI
  7. Hits per second curve
    • Similar to the JMeter plugin listener jp@gc - Hits per Second in the UI
  8. Response codes per second curve
    • Shows the distribution of response status codes over time
    • Similar to the JMeter plugin listener jp@gc - Response Codes per Second in the UI
  9. Transactions per second (TPS) curve
    • Shows how the number of transactions processed per second changes over time
    • Similar to the JMeter plugin listener jp@gc - Transactions per Second in the UI
  10. Average response time vs. requests per second graph
    • Shows the relationship between average response time and the number of requests per second (which can be understood as QPS)
  11. Latency vs. requests per second graph
    • Shows the relationship between latency and the number of requests per second
  12. Response time percentiles graph
    • Percentile distribution of response times
  13. Active threads over time curve
    • Shows how the number of active threads changes over time during the test
  14. Response time vs. threads graph
    • Shows the relationship between average response time and the number of threads
    • Similar to the JMeter plugin listener jp@gc - Response Times vs Threads in the UI
  15. Response time distribution bar chart
    • Shows the number of requests falling into each response time interval

Note 1: The term latency time is kept as-is; here is a note on how it is calculated, contrasted with elapsed time:

Latency time = the time the first byte of the response is received - the time the request starts to be sent

From just before sending the request to just after the first response has been received
--Apache JMeter Glossary

Elapsed time = the time the last byte of the response is received - the time the request starts to be sent

From just before sending the request to just after the last response has been received
--Apache JMeter Glossary

Note 2: The Apdex standard quantifies, from the user's point of view, satisfaction with an application's response-time performance as a score in the range 0-1.

Apdex (Application Performance Index) is an open standard developed by an alliance of companies. It defines a standard method for reporting and comparing the performance of software applications in computing.
--Wikipedia

III. Quick Start

1. Confirm basic configuration

    • Confirm the following configuration entries in jmeter.properties or user.properties (a reference snippet follows this list)
    • If you want the Errors report to show more detailed data, make sure the following configuration is set:
      jmeter.save.saveservice.assertion_results_failure_message = true
    • If a Transaction Controller is used, verify that "Generate parent sample" is not checked on the Transaction Controller
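For reference, these are the sample-saving defaults that the official JMeter documentation lists for dashboard generation; this snippet is reproduced from the documentation rather than from the article's original screenshot, so verify it against your own jmeter.properties:

      jmeter.save.saveservice.bytes = true
      jmeter.save.saveservice.label = true
      jmeter.save.saveservice.latency = true
      jmeter.save.saveservice.response_code = true
      jmeter.save.saveservice.response_message = true
      jmeter.save.saveservice.successful = true
      jmeter.save.saveservice.thread_counts = true
      jmeter.save.saveservice.thread_name = true
      jmeter.save.saveservice.time = true
      jmeter.save.saveservice.timestamp = true
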
2. Generate reports

A. Reporting at the end of a stress test

    • Basic command format:
      jmeter -n -t <test JMX file> -l <test log file> -e -o <Path to output folder>
    • Examples:
      jmeter -n -t F:\PerformanceTest\TestCase\script\getToken.jmx -l testLogFile -e -o ./output

B. Generating a report using an existing stress test CSV log file

    • Basic command format:
      jmeter -g <log file> -o <Path to output folder>
    • Examples:
      jmeter -g D:\apache-jmeter-3.0\bin\testLogFile -o ./output

Both examples produce a set of files in the \apache-jmeter-3.0\bin\output directory (screenshot not reproduced here).

You can then view the various graphical reports by opening the index.html file in your browser.

Note: In version 3.0, due to a character-encoding problem in the source code, Chinese labels in the generated report may be displayed as garbled text. For reasons of space, please see my other article for the solution.

IV. Custom Configuration

JMeter 3.0 adds a reportgenerator.properties file in the bin directory to hold all the default configuration of the graphical HTML report generation module. To change the configuration, it is recommended not to edit this file directly, but to set and override the values in user.properties.

1. Overall configuration

Overall configuration entries are prefixed with jmeter.reportgenerator., for example: jmeter.reportgenerator.overall_granularity=60000

    • overall_granularity: Defines the sampling granularity; the default is 60000 ms. For tests other than stability (soak) tests we may need a finer granularity, which we can set by adding the following configuration at the end of the user.properties file:
      # Change this parameter if you want to change the granularity of over time graphs.
      jmeter.reportgenerator.overall_granularity=6000
    • report_title: Defines the title of the report; we may want to set it to the actual name of the test project
    • apdex_satisfied_threshold: Defines the satisfied threshold in the Apdex evaluation (in ms)
    • apdex_tolerated_threshold: Defines the tolerated threshold in the Apdex evaluation (in ms); a worked example follows this list
      Apdex_T = (Satisfied Count + Tolerating Count / 2) / Total Samples
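As a quick worked example (hypothetical numbers, assuming apdex_satisfied_threshold = 500 and apdex_tolerated_threshold = 1500): if 1000 samples include 800 that respond within 500 ms (satisfied) and 150 that respond between 500 ms and 1500 ms (tolerating), then:

      Apdex_T = (800 + 150 / 2) / 1000 = 0.875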

In addition, jmeter.properties contains default values for the percentiles shown in the aggregate report:

aggregate_rpt_pct1 : Defaults to 90
aggregate_rpt_pct2 : Defaults to 95
aggregate_rpt_pct3 : Defaults to 99

These can be overridden in user.properties, for example aggregate_rpt_pct1 = 70; the effect is shown in the original screenshot (not reproduced here).

2. Chart configuration

Each chart's configuration is prefixed with jmeter.reportgenerator.graph.<chart name>.

    • classname: The implementation class of the chart; if you have a custom implementation, set this value to the class name of your custom implementation class
    • title: The chart title; for example, configure a Chinese title here if you want one
    • property.set_granularity: Sets the sampling granularity of the chart; if it is not configured, the granularity setting from the overall configuration is used (an example follows this list)
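For example, a minimal override in user.properties might look like the following; this is a sketch assuming the default graph name responseTimesOverTime from reportgenerator.properties, so adjust it to the chart you actually want to change:

      # Give the response-times-over-time chart a custom title
      jmeter.reportgenerator.graph.responseTimesOverTime.title=Login API Response Times Over Time
      # Use a finer sampling granularity (in ms) for this chart only
      jmeter.reportgenerator.graph.responseTimesOverTime.property.set_granularity=1000
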
3. Output configuration

Output configuration entries are prefixed with jmeter.reportgenerator.exporter.

    • property.output_dir: Configures the default report output path. On the command line, you can override it with the -o option to set a specific path.
    • html.series_filter: Used to filter which series are displayed. For example, add the following configuration in user.properties:
      jmeter.reportgenerator.exporter.html.series_filter=(^Login)(-success|-failure)?
      The final report will then show only data for samplers whose names start with Login. The configuration consists of two parts: the first part is a regular expression used for filtering, and the (-success|-failure)? part is required by the Transactions per Second chart, which appends those suffixes to transaction names. A further illustrative example follows this list.
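As a further illustration (the sampler names Login and Search are hypothetical, assumed for this sketch only), a filter that keeps two samplers could be written as:

      jmeter.reportgenerator.exporter.html.series_filter=^(Login|Search)(-success|-failure)?$
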
V. Summary

In essence, the Dashboard Report brings Apache JMeter's visualization of test result data up to date. Although belated and not particularly flashy, it is still a boon for those who need to do performance testing with JMeter. Finally, thanks to the Apache JMeter project contributors for their ongoing updates.

References
    1. Apache JMeter Dashboard Report
    2. Apache JMeter Glossary

2/F · 2016.09.20 17:00

An error occurred: Error while processing samples: Consumer failed with message: Consumer failed with message: Consumer failed with message: No column <threadName> found in sample metadata <timeStamp,elapsed,label,responseCode,responseMessage,dataType,success,failureMessage,bytes,grpThreads,allThreads,Latency,IdleTime>, check jmeter.save.saveservice.* properties to add the missing column

Hi OP, with your configuration I still ended up with this error. Is there a solution?

3/F · 2016.09.21 00:10

The key information in the error: No column <threadName> found in sample metadata
Solution: Verify that the ./bin/jmeter.properties file has the following configuration: jmeter.save.saveservice.thread_name = true. You can also check "Save thread name" in the Configure options of your listener in the UI.


11a80f8fc74f: @ I have confirmed the setting but still get this error. Which listener is it related to?

2016.09.21 14:57 reply.

11a80f8fc74f: @ An error occurred: Data exporter "html" was unable to export data. Re-running gives the same error. Is there a problem with my settings?

2016.09.21 15:21 reply.

11a80f8fc74f: @ Latest progress: it runs now, but the output directory is empty... I am out of ideas, please help ~ ~

2016.09.21 16:19 reply.

Girls · 4/F · 2016.10.18 16:02

jmeter -n -t <test JMX file> -l <test log file> -e -o <path to output folder>
Where should this command be run?


Arro: @Girls Run it at the command line under Windows


5/F · 2016.11.06 12:56

May I ask why I get this error? The report folder is generated but there is no index.html:
Tidying up ... @ Sun 12:09:41 CST 2016 (1478405381170)
The JVM should have exited but did not.
The following non-daemon threads are still running (DestroyJavaVM is OK):
Thread[StandardJMeterEngine,5,main], stackTrace:sun.nio.fs.WindowsNativeDispatcher#copyFileEx0
sun.nio.fs.WindowsNativeDispatcher#copyFileEx
sun.nio.fs.WindowsFileCopy#copy
sun.nio.fs.WindowsFileSystemProvider#copy
java.nio.file.Files#copy
org.apache.jmeter.report.dashboard.TemplateVisitor#visitFile at line:126
org.apache.jmeter.report.dashboard.TemplateVisitor#visitFile at line:48
java.nio.file.Files#walkFileTree
java.nio.file.Files#walkFileTree
org.apache.jmeter.report.dashboard.HtmlTemplateExporter#export at line:487
org.apache.jmeter.report.dashboard.ReportGenerator#exportData at line:348
org.apache.jmeter.report.dashboard.ReportGenerator#generate at line:256
org.apache.jmeter.JMeter$ListenToTest#generateReport at line:1144
org.apache.jmeter.JMeter$ListenToTest#testEnded at line:1089
org.apache.jmeter.engine.StandardJMeterEngine#notifyTestListenersOfEnd at line:215
org.apache.jmeter.engine.StandardJMeterEngine#run at line:436
java.lang.Thread#run

Thread[DestroyJavaVM,5,main], stackTrace:
... end of run

