How to stress test a server in Linux


http_load is a performance measurement tool for the Linux platform. It runs as a single process using multiplexed I/O, is intended only for performance testing of web pages (not database access), offers only limited analysis of test results, and is tied to Linux. http_load drives an HTTP server using a list of URLs recorded in a plain text file. So how do you stress test a server with it? This tutorial explains how to install http_load and use it to stress test a server in Linux.

  The steps are as follows:

  1. Download

Official website: http://acme.com/software/http_load/

The code is as follows:

cd /root

wget http://acme.com/software/http_load/http_load-12mar2006.tar.gz

tar xzf http_load-12mar2006.tar.gz

  2. Installation

The code is as follows:

cd http_load-12mar2006

make

After make completes, an http_load binary is generated in the current directory.
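
If you want to invoke http_load from any directory, you can optionally copy the binary somewhere on your PATH (the destination /usr/local/bin below is a common choice, not something the tool requires):

The code is as follows:

cp http_load /usr/local/bin/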

  3. Usage

The code is as follows:

root@www:~/http_load-12mar2006# ./http_load --help

usage: ./http_load [-checksum] [-throttle] [-proxy host:port] [-verbose] [-timeout secs] [-sip sip_file]

-parallel N | -rate N [-jitter]

-fetches N | -seconds N

url_file

One start specifier, either -parallel or -rate, is required.

One end specifier, either -fetches or -seconds, is required.

Main parameter descriptions:

-parallel (short form -p): the number of concurrent user processes.

-rate (short form -r): the number of requests issued per second.

-fetches (short form -f): the total number of fetches to perform.

-seconds (short form -s): the total duration of the test, in seconds.

When choosing parameters, use exactly one of -parallel and -rate, and exactly one of -fetches and -seconds.
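
The url_file argument is a plain text file listing one URL per line; http_load picks URLs from it at random on each fetch. A minimal urls.txt might look like this (example.com is a placeholder, substitute your own server):

The code is as follows:

http://example.com/
http://example.com/index.html
http://example.com/about.html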

  4. Examples

The code is as follows:

./http_load -parallel 50 -seconds 10 urls.txt

This command runs 50 concurrent processes, fetching URLs at random from urls.txt for a total of 10 seconds.

The code is as follows:

./http_load -rate 50 -fetches 5000 urls.txt

Issues 50 requests per second, stopping after 5,000 requests in total.

To test the average number of fetches per second the site can sustain:

The code is as follows:

./http_load -parallel 5 -fetches 1000 urls.txt

This command uses 5 concurrent processes to fetch URLs at random from urls.txt, for a total of 1000 fetches. Output after the run:

1000 fetches, 5 max parallel, 6e+06 bytes, in 58.1026 seconds

6000 mean bytes/connection

17.2109 fetches/sec, 103266 bytes/sec

msecs/connect: 0.403263 mean, 68.603 max, 0.194 min

msecs/first-response: 284.133 mean, 5410.13 max, 55.735 min

HTTP response codes:

code 200 -- 1000

From the results above, the target site can sustain only about 17 fetches per second (1000 fetches in 58.1 seconds), which is not very strong.
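
A simple way to find where throughput levels off is to repeat the test at increasing concurrency and compare the fetches/sec line of each run. A rough sketch (the concurrency levels chosen here are illustrative):

The code is as follows:

for p in 5 10 20 50 100; do
  echo "== parallel $p =="
  ./http_load -parallel $p -fetches 1000 urls.txt | grep fetches/sec
done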
