Go Language Practical Notes (22) | Go Benchmark Test


"Go language Combat" reading notes, not to be continued, welcome to sweep code attention flysnow_org to the public or website http://www.flysnow.org/, the first time to see follow-up notes. If you feel helpful, share it with your friends and thank you for your support.

What is benchmark testing

Benchmarking is a way to measure the performance of code. Suppose you have several different solutions to the same problem and want to know which one performs best; this is where benchmark testing comes in handy.

Benchmarking evaluates the code under test in terms of CPU time and memory usage, helping you find the better solution. For example, a bigger connection pool is not always better; finding the best pool size requires repeated tuning guided by benchmarks.

How to write Benchmark tests

Benchmark code is written much like unit test code, and it also follows a few rules. Let's look at an example first.

itoa_test.go

func BenchmarkSprintf(b *testing.B) {
	num := 10
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		fmt.Sprintf("%d", num)
	}
}

This is an example of a benchmark, from which we can see the following rules:

    1. The benchmark code file must end with _test.go
    2. A benchmark function must start with Benchmark, which also makes it exported
    3. A benchmark function must take a single parameter of type *testing.B
    4. A benchmark function cannot have a return value
    5. b.ResetTimer resets the timer, so that initialization code before the for loop does not distort the measurement (see the sketch after this list)
    6. The final for loop is essential: the code being measured goes inside it
    7. b.N is supplied by the benchmark framework and represents the number of iterations; the code under test must be called repeatedly so its performance can be evaluated
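For example, if a benchmark needs expensive setup before the loop, resetting the timer keeps that setup out of the measurement. Below is a minimal sketch; the buildTestData helper, its size, and the strings.ToUpper call are my own illustration, not from the book:

package hello // package name assumed, to match flysnow.org/hello used below

import (
	"strings"
	"testing"
)

// buildTestData is a hypothetical helper standing in for expensive setup work.
func buildTestData(n int) string {
	return strings.Repeat("x", n)
}

func BenchmarkWithSetup(b *testing.B) {
	data := buildTestData(1 << 20) // setup we do not want to measure
	b.ResetTimer()                 // restart the timer so the setup above is excluded
	for i := 0; i < b.N; i++ {
		_ = strings.ToUpper(data) // the code actually being measured
	}
}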

Now let's run the Sprintf benchmark and look at the output.

➜  go test -bench=. -run=none
BenchmarkSprintf-8      20000000               117 ns/op
PASS
ok      flysnow.org/hello       2.474s

Benchmarks are also run with the go test command, but we need to add the -bench flag. It takes an expression as its argument, which is matched against benchmark function names; . means run all benchmarks.

Because go test runs unit tests by default, and their output would interfere with the benchmark results, we use -run= to filter them out by matching a unit test name that does not exist. Here we use none, since we are very unlikely to have a unit test with that name.

Now let's look closely at the output. See the -8 after the function name? It is the value of GOMAXPROCS at the time of the run. 20000000 is the number of times the for loop ran, that is, how many times the measured code was called, and the final 117 ns/op means each call took 117 nanoseconds.

By default each benchmark targets about 1 second of measured time; here that resulted in 20 million calls at roughly 117 nanoseconds each. If you want the test to run longer, you can specify a duration with -benchtime, for example 3 seconds.

➜  go test -bench=. -benchtime=3s -run=none
BenchmarkSprintf-8      50000000               109 ns/op
PASS
ok      flysnow.org/hello       5.628s

We extended the test time and the number of iterations changed, but the final result, the time per call, barely changed. In general there is little point in going beyond 3 seconds.

Performance comparison

The benchmark above actually converts an int to a string. The standard library offers several ways to do this, so let's see which one performs best.

func BenchmarkSprintf(b *testing.B) {
	num := 10
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		fmt.Sprintf("%d", num)
	}
}

func BenchmarkFormat(b *testing.B) {
	num := int64(10)
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		strconv.FormatInt(num, 10)
	}
}

func BenchmarkItoa(b *testing.B) {
	num := 10
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		strconv.Itoa(num)
	}
}

Run the benchmarks and look at the results:

➜  go test -bench=. -run=none
BenchmarkSprintf-8      20000000               117 ns/op
BenchmarkFormat-8       50000000                33.3 ns/op
BenchmarkItoa-8         50000000                34.9 ns/op
PASS
ok      flysnow.org/hello       5.951s

From the results, strconv.FormatInt is the fastest, followed by strconv.Itoa, while fmt.Sprintf is the slowest: the first two are more than three times as fast as the last. So why is the last one so slow? We can use -benchmem to find the root cause.

➜  go test -bench=. -benchmem -run=none
BenchmarkSprintf-8      20000000                     ns/op          16 B/op          2 allocs/op
BenchmarkFormat-8       50000000                31.0 ns/op             2 B/op          1 allocs/op
BenchmarkItoa-8         50000000                33.1 ns/op             2 B/op          1 allocs/op
PASS
ok      flysnow.org/hello       5.610s

-benchmem reports the number of memory allocations per operation and the number of bytes allocated per operation. From the results we can see that the two fast functions make 1 allocation per operation, while the slowest makes 2; the fast ones allocate 2 bytes per operation, while the slow one allocates 16 bytes each time. Now we know why it is so slow: it allocates more memory.
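
As a side note, the same allocation statistics can be switched on from inside a benchmark by calling b.ReportAllocs(), so they are printed even when -benchmem is not passed. A minimal sketch, assuming it lives in the same itoa_test.go file as the benchmarks above (the function name is my own):

func BenchmarkSprintfAllocs(b *testing.B) {
	num := 10
	b.ReportAllocs() // report allocs/op and B/op for this benchmark without needing -benchmem
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		fmt.Sprintf("%d", num)
	}
}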

In day-to-day development, it pays to write benchmarks for the code paths where performance matters; they help us write better-performing code. That said, performance has to be weighed against usability, reusability, and so on; do not over-optimize purely in pursuit of performance.

"Go language Combat" reading notes, not to be continued, welcome to sweep code attention flysnow_org to the public or website http://www.flysnow.org/, the first time to see follow-up notes. If you feel helpful, share it with your friends and thank you for your support.
