Micro Benchmarking with JMH: Measure, don't guess!
Author: Antonio
Translated by: holis
Please indicate the source when reprinting.
Many Java developers once learned that setting a variable to null helps garbage collection: explicitly nulling unused object references was supposed to help the collector identify garbage and improve GC efficiency. Others learned that declaring a method final allows it to be inlined and improves performance. But the JVM keeps evolving, and a rule of thumb that held yesterday does not necessarily hold today. (Translator's note: thanks to continuous JVM optimization, in most cases there is no need to null out variables immediately; the collector can quickly determine whether an object is reclaimable, so setting variables to null rarely brings a noticeable performance gain.) So how do we know whether the code we write is efficient? We should not guess; we should measure it.
Measure, don't guess!
As my friend Kirk Pepperdine once said, "Measure, don't guess." When our code has a performance problem, we tend to make small (often random) changes in the hope of improving it. Instead, we should set up a stable performance-testing environment (operating system, JVM, application server, database, and so on), define performance goals, and test against those goals continuously until they are met. Just as we practice continuous integration and continuous delivery, we should also practice continuous performance testing.
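To make "measure, don't guess" concrete, here is a minimal, plain-Java timing sketch (no JMH involved) comparing two ways of building a string. The class and numbers are illustrative only; naive `System.nanoTime()` timing like this is exactly what JMH exists to replace, since it ignores JIT warm-up, dead-code elimination, and run-to-run variance.

```java
// A deliberately naive "measure, don't guess" sketch. It only illustrates
// the idea of measuring before optimizing; for real numbers, use JMH.
public class NaiveTiming {

    // Build a string with repeated concatenation (copies on every +=).
    static String concat(int n) {
        String s = "";
        for (int i = 0; i < n; i++) s += i;
        return s;
    }

    // Build the same string with StringBuilder (amortized appends).
    static String builder(int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append(i);
        return sb.toString();
    }

    public static void main(String[] args) {
        int n = 10_000;
        concat(n); builder(n);          // crude warm-up pass

        long t0 = System.nanoTime();
        String a = concat(n);
        long t1 = System.nanoTime();
        String b = builder(n);
        long t2 = System.nanoTime();

        System.out.println("concat  : " + (t1 - t0) / 1_000_000 + " ms");
        System.out.println("builder : " + (t2 - t1) / 1_000_000 + " ms");
        System.out.println("equal results: " + a.equals(b));
    }
}
```

Both methods produce the same string, so any timing difference is purely the cost of the construction strategy; JMH applies the same principle with proper statistical rigor.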
In any case, performance is a dark art and not the main topic of this article. Here I just want to focus on micro-benchmarking and show you how to use JMH on a real use case (this article uses logging as the example).
Micro-benchmarking log output
I believe many of you, like me, have seen all of the following styles of debug logging across different logging frameworks:
```java
logger.debug("Concatenating strings " + x + y + z);

logger.debug("Using variable arguments {} {} {}", x, y, z);

if (logger.isDebugEnabled())
    logger.debug("Using the if debug enabled {} {} {}", x, y, z);
```
In typical applications the log level is set to INFO or WARNING, so none of the three statements above actually writes a debug message. Yet these debug statements can still hurt application performance. To prove it, we will micro-benchmark the three variants using the Java Microbenchmark Harness (JMH). The three styles can be summarized as: string concatenation, variable arguments, and an if guard on debug availability.
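Why does a debug statement cost anything when debug is disabled? Because in Java, method arguments are evaluated before the call: the concatenated message string is built whether or not the logger ever uses it. The sketch below uses a hypothetical stand-in logger (not SLF4J itself, though SLF4J behaves the same way for argument evaluation) to show when the cost is paid.

```java
// Hypothetical minimal "logger" with debug disabled, to show *when*
// message arguments are evaluated. The class and method names are
// illustrative, not part of any real logging API.
public class EagerEvaluationDemo {

    static int concatenations = 0;               // counts argument builds

    static final boolean DEBUG_ENABLED = false;  // like a root level of INFO

    static void debug(String message) {
        if (DEBUG_ENABLED) System.out.println(message);  // never printed
    }

    static String expensive(String x) {
        concatenations++;                        // we pay this cost eagerly
        return "value=" + x;
    }

    public static void main(String[] args) {
        // 1) Concatenation: expensive() runs even though debug is off,
        //    because the argument is evaluated before debug() is called.
        debug("concatenated " + expensive("a"));

        // 2) if guard: expensive() is skipped entirely.
        if (DEBUG_ENABLED) debug("guarded " + expensive("b"));

        System.out.println("argument built " + concatenations + " time(s)");
        // prints: argument built 1 time(s)
    }
}
```

The variable-arguments form of SLF4J (`logger.debug("{} {}", x, y)`) sits between the two: the references are passed cheaply and the message is only formatted if debug is enabled, though the call itself (and any autoboxing) still happens.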
JMH settings
JMH is a harness for building, running, and analysing nano/micro/milli/macro benchmarks written in Java and other JVM languages. Using the Maven archetype, we can quickly create a JMH project:
```shell
mvn archetype:generate -DinteractiveMode=false \
    -DarchetypeGroupId=org.openjdk.jmh \
    -DarchetypeArtifactId=jmh-java-benchmark-archetype \
    -DarchetypeVersion=1.4.1 \
    -DgroupId=org.agoncal.sample.jmh -DartifactId=logging -Dversion=1.0
```
The project created by the Maven archetype has the following structure:

- A pom.xml containing the JMH dependencies and the maven-shade-plugin
- A `MyBenchmark` class with an empty method annotated with `@Benchmark`
At this point, even though we haven't written anything yet, the freshly created micro-benchmark project can already be built and run. The Maven build produces a benchmarks.jar:
```shell
mvn clean install
java -jar target/benchmarks.jar
```
When we run this jar, some interesting output appears on the console: JMH iterates in a loop, warming up the JVM, executing the empty `@Benchmark`-annotated method, and reporting the number of operations per second.
```
# Run progress: 30,00% complete, ETA 00:04:41
# Fork: 4 of 10
# Warmup Iteration 1: 2207650172,188 ops/s
# Warmup Iteration 2: 2171077515,143 ops/s
# Warmup Iteration 3: 2147266359,269 ops/s
# Warmup Iteration 4: 2193541731,837 ops/s
# Warmup Iteration 5: 2195724915,070 ops/s
# Warmup Iteration 6: 2191867717,675 ops/s
# Warmup Iteration 7: 2143952349,129 ops/s
# Warmup Iteration 8: 2187759638,895 ops/s
# Warmup Iteration 9: 2171283214,772 ops/s
# Warmup Iteration 10: 2194607294,634 ops/s
# Warmup Iteration 11: 2195047447,488 ops/s
# Warmup Iteration 12: 2191714465,557 ops/s
# Warmup Iteration 13: 2229074852,390 ops/s
# Warmup Iteration 14: 2221881356,361 ops/s
# Warmup Iteration 15: 2240789717,480 ops/s
# Warmup Iteration 16: 2236822727,970 ops/s
# Warmup Iteration 17: 2228958137,977 ops/s
# Warmup Iteration 18: 2242267603,165 ops/s
# Warmup Iteration 19: 2216594798,060 ops/s
# Warmup Iteration 20: 2243117972,224 ops/s
Iteration 1: 2201097704,736 ops/s
Iteration 2: 2224068972,437 ops/s
Iteration 3: 2243832903,895 ops/s
Iteration 4: 2246595941,792 ops/s
Iteration 5: 2241703372,299 ops/s
Iteration 6: 2243852186,017 ops/s
Iteration 7: 2221541382,551 ops/s
Iteration 8: 2196835756,509 ops/s
Iteration 9: 2205740069,844 ops/s
Iteration 10: 2207837588,402 ops/s
Iteration 11: 2192906907,559 ops/s
Iteration 12: 2239234959,368 ops/s
Iteration 13: 2198998566,646 ops/s
Iteration 14: 2201966804,597 ops/s
Iteration 15: 2215531292,317 ops/s
Iteration 16: 2155095714,297 ops/s
Iteration 17: 2146037784,423 ops/s
Iteration 18: 2139622262,798 ops/s
Iteration 19: 2213499245,208 ops/s
Iteration 20: 2191108429,343 ops/s
```
Adding SLF4J to the benchmark
As mentioned above, the use case we want to benchmark is logging. In this project I will use SLF4J with Logback, so we add the dependencies to the pom.xml:
```xml
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.7</version>
</dependency>
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.0.11</version>
</dependency>
```
Then we add a logback.xml configuration file with the log level set to INFO:
```xml
<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%highlight(%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n)</pattern>
        </encoder>
    </appender>
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder><pattern>%msg%n</pattern></encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="CONSOLE" />
    </root>
</configuration>
```
The advantage of the maven-shade-plugin is that when we package the application with Maven, all dependencies and configuration files are bundled into the single executable jar under the target directory.
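For reference, the shade-plugin configuration that the JMH archetype generates looks roughly like the sketch below; the exact version number and details may differ in your generated pom.xml. The key parts are the `benchmarks` final name (which is why the jar is called benchmarks.jar) and `org.openjdk.jmh.Main` as the main class.

```xml
<!-- Sketch of the shade-plugin setup generated by the JMH archetype;
     version and details may differ in your pom.xml. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.2</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <finalName>benchmarks</finalName>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>org.openjdk.jmh.Main</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```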
Using string concatenation in logs
Let's start the first micro-benchmark: logging with string concatenation. We simply write the code to measure in a method annotated with `@Benchmark`, and JMH takes care of the rest.
In this code, we create three String variables x, y, and z, then emit a debug log built by string concatenation inside a loop:
```java
import org.openjdk.jmh.annotations.Benchmark;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyBenchmark {

    private static final Logger logger = LoggerFactory.getLogger(MyBenchmark.class);

    @Benchmark
    public void testConcatenatingStrings() {
        String x = "", y = "", z = "";
        for (int i = 0; i < 100; i++) {
            x += i; y += i; z += i;
            logger.debug("Concatenating strings " + x + y + z);
        }
    }
}
```
Then package and run the micro-benchmark as before, and note the iteration output (we will compare the numbers later).
Using variable arguments in logs
In this micro-benchmark we replace the string concatenation with variable arguments. Change the method as follows, then package and run again:
```java
@Benchmark
public void testVariableArguments() {
    String x = "", y = "", z = "";
    for (int i = 0; i < 100; i++) {
        x += i; y += i; z += i;
        logger.debug("Variable arguments {} {} {}", x, y, z);
    }
}
```
Using an if statement in logs
The last and most interesting variant guards the log statement with `isDebugEnabled()`:
```java
@Benchmark
public void testIfDebugEnabled() {
    String x = "", y = "", z = "";
    for (int i = 0; i < 100; i++) {
        x += i; y += i; z += i;
        if (logger.isDebugEnabled())
            logger.debug("If debug enabled {} {} {}", x, y, z);
    }
}
```
Micro-benchmark results
After running the three micro-benchmarks we get the results below (remember: don't guess, measure). The more operations per second, the better the performance. Looking at the averages in the last row of the table, the `isDebugEnabled()` variant performs best and string concatenation worst. But variable arguments are not far behind `isDebugEnabled()`, and they are more readable (less boilerplate code). So I would go for variable arguments!
|              | String concatenation | Variable arguments | If isDebugEnabled   |
|--------------|----------------------|--------------------|---------------------|
| Iteration 1  | 57108,635 ops/s      | 97921,939 ops/s    | 104993,368 ops/s    |
| Iteration 2  | 58441,293 ops/s      | 98036,051 ops/s    | 104839,216 ops/s    |
| Iteration 3  | 58231,243 ops/s      | 97457,222 ops/s    | 106601,803 ops/s    |
| Iteration 4  | 58538,842 ops/s      | 100861,562 ops/s   | 104643,717 ops/s    |
| Iteration 5  | 57297,787 ops/s      | 100405,656 ops/s   | 104706,503 ops/s    |
| Iteration 6  | 57838,298 ops/s      | 98912,545 ops/s    | 105439,939 ops/s    |
| Iteration 7  | 56645,371 ops/s      | 100543,188 ops/s   | 102893,089 ops/s    |
| Iteration 8  | 56569,236 ops/s      | 102239,005 ops/s   | 104730,682 ops/s    |
| Iteration 9  | 57349,754 ops/s      | 94482,508 ops/s    | 103492,227 ops/s    |
| Iteration 10 | 56894,075 ops/s      | 101405,938 ops/s   | 106790,525 ops/s    |
| Average      | 57491,4534 ops/s     | 99226,5614 ops/s   | 104913,1069 ops/s   |
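A quick arithmetic check on the averages in the table makes the gap concrete (the commas in the JMH output are decimal separators from a European locale):

```java
import java.util.Locale;

// Ratios computed from the average ops/s reported in the table above.
public class SpeedupCheck {
    public static void main(String[] args) {
        double concat  = 57491.4534;   // ops/s, string concatenation
        double varargs = 99226.5614;   // ops/s, variable arguments
        double guarded = 104913.1069;  // ops/s, if isDebugEnabled()

        System.out.printf(Locale.ROOT, "varargs vs concat : %.2fx%n", varargs / concat);
        System.out.printf(Locale.ROOT, "guarded vs concat : %.2fx%n", guarded / concat);
        System.out.printf(Locale.ROOT, "guarded vs varargs: %.2fx%n", guarded / varargs);
    }
}
```

So on this run, variable arguments are roughly 1.7x faster than concatenation, the `isDebugEnabled()` guard roughly 1.8x, and the guard buys only about 6% over variable arguments.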
Conclusion
The JVM has evolved enormously over the last decades. Optimizing our code with patterns that were valid only a decade ago no longer makes sense. The only way to compare the quality of two pieces of code is to measure them, and JMH is a simple and efficient tool for micro-benchmarking. Of course, micro-benchmarking a small portion of code is only a first step, since we usually need to profile the performance of the entire application. But with JMH, that first step is easy.
The JMH samples are also worth reading; they are full of interesting ideas.
References
JMH Samples
Java Performance Tuning Guide
Using JMH for Java Microbenchmarking
Writing Java Micro Benchmarks with JMH: Juicy