1. Test purpose

This time Jetty and Tomcat were tested for performance, primarily to select the appropriate Java web server for the development of the new version of Webportal. We have already run performance tests on the old REST service and the new TMMI. REST sits behind Akamai's cache, so its performance is basically fine. The content of Webportal, however, is updated constantly and is not well suited to Akamai caching, so its performance depends entirely on Webportal itself and on load balancing. So when developing the new version of Webportal we want to optimize everything we can, including the hardware, the web server, database access, and so on. Jetty and Tomcat are the two main open-source Java web servers, and both are mature: Google App Engine uses Jetty as its web container, and JBoss uses Tomcat. Which one better suits our needs is most reliably decided by testing them ourselves. In this test the Jetty version is 6.1.19 and the Tomcat version is 6.0.35. The application under test is the Akamai log application (AkamaiLogParser), which was partially modified for the test; the changes are described later.

2. Deployment and configuration of Jetty and Tomcat

Our previous versions of REST and Webportal were deployed with embedded Jetty. Both Jetty and Tomcat support stand-alone and embedded deployment. The following describes the deployment and configuration of Jetty and Tomcat in stand-alone mode.

Deployment and configuration of Jetty

1. Copy the Jetty zip file to the server and unzip it. In the resulting directory, bin contains the scripts and jar files that start Jetty; contexts holds the configuration for each context, where a context corresponds to an application (for example, AkamaiLogParser can correspond to one context); etc holds Jetty's own configuration; and webapps is the default directory in which Jetty looks for deployed applications. In this test, however, I created a separate folder, myapp, for storing AkamaiLogParser.
2. Copy the AkamaiLogParser package into the myapp directory. Note that all dependent jar files must be placed in {jetty_home}/myapp/akamailog/webapps/WEB-INF/lib; Jetty loads the required jar files from this directory. The web.xml and Spring configuration files can also be placed under WEB-INF.
3. Create a new file Akamai.xml under {jetty_home}/contexts (a sketch of such a context file is shown after this list). In it, contextPath is the URL prefix used to access AkamaiLogParser; it is set to "/", so the access URL is http://domianname:port/***.html. jetty.home refers to the path of the AkamaiLogParser package relative to {jetty_home}.
4. Place log4j.properties under {jetty_home}/resources; Jetty reads the log configuration from this directory.
5. AkamaiLogParser's additional configuration file, conf.properties, can be placed directly under {jetty_home}, which is the current working directory when Jetty is started.
6. Under {jetty_home}, start Jetty with java -jar start.jar.
7. You can use java -XmsXXXm -XmxXXXm -jar start.jar to adjust the amount of memory allocated to Jetty.
8. You can modify Jetty's configuration in {jetty_home}/etc/jetty.xml, for example the Jetty port, minThreads, and maxThreads. The performance tuning of Jetty later in the test consisted mainly of adjusting these two parameters, minThreads and maxThreads (see the thread-pool sketch below).

Deployment and configuration of Tomcat

1. Deploying Tomcat is relatively simple, and since this is only a test, the log configuration and so on are ignored. The deployment method is to copy all the contents of AkamaiLogParser's webapps directory into {tomcat_home}/webapps/ROOT. As with Jetty, all dependent jar packages must be placed in the WEB-INF/lib directory.
2. Start Tomcat with {tomcat_home}/bin/startup.sh.
3. To adjust the amount of memory allocated to Tomcat, modify {tomcat_home}/bin/catalina.sh and add the line JAVA_OPTS="-server -Xms512m -Xmx2048m" before the line echo "Using CATALINA_BASE: $CATALINA_BASE".
4. To adjust connection handling, add the following attributes to the Connector node in {tomcat_home}/conf/server.xml: connectionTimeout="...", maxThreads="...", acceptCount="..." (a sketch of the Connector element is shown below).
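A minimal sketch of the Akamai.xml context file mentioned in step 3 of the Jetty setup, assuming a Jetty 6 style context descriptor and the application directory used above (the exact directory name should be adjusted to the actual layout):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE Configure PUBLIC "-//Mort Bay Consulting//DTD Configure//EN"
    "http://jetty.mortbay.org/configure.dtd">
<!-- Jetty 6 context descriptor: serves the application at contextPath "/" -->
<Configure class="org.mortbay.jetty.webapp.WebAppContext">
  <Set name="contextPath">/</Set>
  <!-- application path relative to jetty.home; the directory name is an assumption -->
  <Set name="war"><SystemProperty name="jetty.home" default="."/>/myapp/akamailog/webapps</Set>
</Configure>
```

For step 8, the thread pool in {jetty_home}/etc/jetty.xml is configured on the Server element; in Jetty 6 it looks roughly like the excerpt below. The values shown are placeholders; the values actually used appear in the Jetty results table in section 5.

```xml
<!-- excerpt from etc/jetty.xml: the thread pool shared by all connectors -->
<Set name="ThreadPool">
  <New class="org.mortbay.thread.QueuedThreadPool">
    <Set name="minThreads">100</Set>   <!-- placeholder -->
    <Set name="maxThreads">5000</Set>  <!-- placeholder -->
  </New>
</Set>
```

For step 4 of the Tomcat setup, a sketch of the Connector element in {tomcat_home}/conf/server.xml; maxThreads and acceptCount here mirror one row of the Tomcat results table, and connectionTimeout is a placeholder:

```xml
<!-- excerpt from conf/server.xml: HTTP connector with tuned connection settings -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           maxThreads="2000"
           acceptCount="2000"
           redirectPort="8443" />
```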
3. Test environment

Server under test: CentOS 5.6 64-bit; CPU: two 2.93 GHz CPUs; memory: 2 GB; JDK: 1.6.0_29, Java HotSpot(TM) 64-Bit Server VM; Jetty: 6.1.19; Tomcat: 6.0.35.

Machines sending the test requests: three machines running JMeter, using JMeter's remote test method (described below), with two machines acting as JMeter servers and one as the client. The server under test and the three JMeter machines are virtual machines carved out of the same physical machine, so network transmission is fast and should not be a factor affecting the test results. During the test, AkamaiLogParser's code was also modified so that each request directly returns a string held in memory, which masks the effect of disk read/write speed on the test results. In addition, Jetty and Tomcat were given the same memory allocation ("-Xms512m -Xmx2048m"). This ensures that, under identical hardware conditions, the processing capability of Jetty and Tomcat is the only factor affecting the test results.

4. JMeter remote testing

JMeter remote testing uses two roles, server and client: the servers send the requests to the server under test, and the client controls the test and collects the results. The steps for remote testing with JMeter are (a condensed sketch of the setup follows this list):

1. Start the JMeter server with the command {jmeter_home}/bin/jmeter-server. Because a large number of requests must be sent simultaneously, the Java heap allocated to JMeter needs to be enlarged by adding JVM_ARGS="-Xms512m -Xmx2048m" in {jmeter_home}/bin/jmeter.sh. If you encounter the error "server failed to start: java.rmi.RemoteException: Cannot start. XXX is a loopback address.", modify the /etc/hosts file so that 127.0.0.1 points to localhost.localdomain and the machine's actual IP points to the real machine name.
2. Add the addresses of the JMeter servers to the JMeter client's properties file: edit bin/jmeter.properties on the client machine, locate the remote_hosts property, and set it to the IP addresses of the JMeter remote servers. Multiple server IP addresses can be listed, separated by commas.
3. Start the JMeter client and add a thread group, where "Number of Threads" is the number of simultaneous requests to send, "Ramp-Up Period" is how long it takes to reach that concurrency, and "Loop Count" is how many rounds of requests are sent.
4. Create a new HTTP Request sampler.
5. Create a test result listener.
6. Start the test via Run -> Remote Start All.
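A condensed sketch of the remote-test setup from steps 1, 2, and 6 above; JMETER_HOME, the IP addresses, and the hostname are placeholders:

```sh
# On each JMeter server machine: enlarge the heap as described in step 1
# (JVM_ARGS="-Xms512m -Xmx2048m" in $JMETER_HOME/bin/jmeter.sh), then start
# the remote server.
$JMETER_HOME/bin/jmeter-server

# If jmeter-server fails with "Cannot start. XXX is a loopback address",
# adjust /etc/hosts so the real hostname resolves to the real IP, e.g.:
#   127.0.0.1     localhost.localdomain
#   192.168.0.11  jmeter-srv-1

# On the client machine: register the servers in $JMETER_HOME/bin/jmeter.properties,
#   remote_hosts=192.168.0.11,192.168.0.12
# then start the JMeter GUI and launch the test with Run -> Remote Start All.
```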
5. Analysis of test results

Here is the data collected in the test.

Tomcat Test Results

| Concurrency | maxThreads | acceptCount | CPU | Memory | Error % | Throughput (requests/sec) |
|---|---|---|---|---|---|---|
| 4000 | 2000 | 2000 | 47.5% | 7.5% | 0.34% | 987 |
| 6000 | 3000 | 3000 | 56.2% | 7.5% | 0.63% | 1150 |
| 8000 | 4000 | 4000 | 63.2% | 7.7% | 0.15% | 1120 |
| 10000 | 5000 | 5000 | 54.4% | 7.4% | 0.31% | 1188 |
| 10000 | 10000 | 10000 | 66.2% | 7.9% | 0.53% | 1148 |
Jetty Test Results
| Concurrency | maxThreads | minThreads | CPU | Memory | Error % | Throughput (requests/sec) |
|---|---|---|---|---|---|---|
| 4000 | 5000 | 100 | 53.9% | 29.5% | 0.32% | 1009.6 |
| 4000 | 8000 | 800 | 48.8% | 19.8% | 0.25% | 1000.1 |
| 10000 | 5000 | 100 | 58.5% | 29.4% | 3.23% | 1130 |
| 10000 | 10000 | 1000 | 65.1% | 20.2% | 0.99% | 1106 |
In the two sets of data above, the best results are the configurations that deliver the highest throughput while keeping the error rate low. Comparing them, Tomcat's best throughput is 1188 requests/second and Jetty's is 1009.6 requests/second, so the difference is not large; for Jetty, however, the error rate rises as the concurrency increases, and Jetty also consumes more resources. In summary, in this test environment Tomcat performs slightly better than Jetty, is more stable, and consumes fewer resources. For the development of the new version of Webportal, Tomcat is therefore the recommended web server. Because the hardware environment of this test does not match the hardware of the Mars production environment, the test results do not necessarily reflect performance in the production environment.

Original: http://www.cnblogs.com/zhangxiaojun/archive/2013/02/07/2908619.html