A 100 Mbps link can theoretically transfer 12.5 MB of data per second; allowing for interference and protocol overhead, sustaining anything above 10 MB/s is normal in practice. A gigabit link transfers roughly 100 MB per second in practice.
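The arithmetic above can be sketched as a small helper (a hypothetical function, not from the original post): link rates are quoted in megabits per second, so dividing by 8 gives the theoretical megabytes per second.

```python
def mbps_to_mbytes_per_sec(mbps):
    """Convert a link rate in megabits/s to megabytes/s (theoretical maximum)."""
    return mbps / 8.0

print(mbps_to_mbytes_per_sec(100))   # 100 Mbps link -> 12.5 MB/s theoretical
print(mbps_to_mbytes_per_sec(1000))  # 1 Gbps link -> 125.0 MB/s theoretical
```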
http://www.cnblogs.com/candle806/archive/2011/04/02/2003828.html
The analysis shows that at peak the network bandwidth utilization exceeded 90%; the measured throughput here was 95 MB/s, while a 1 Gbps link has a theoretical transfer rate of 128 MB/s. This indicates that the throughput was so large that it consumed most of the bandwidth, so subsequent virtual users could not reach the server's resources and their requests were rejected. The response times of the final pages show that the pressure never actually reached the pages; rather, the excessive throughput swallowed the network bandwidth, so the test task could not be completed effectively.
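A quick sanity check for this situation is to compare measured throughput against the link's theoretical capacity (a hypothetical helper for illustration; the function name is an assumption):

```python
def link_utilization(throughput_mbytes, link_mbps):
    """Fraction of a link's theoretical capacity consumed by the measured throughput (MB/s)."""
    capacity_mbytes = link_mbps / 8.0  # theoretical MB/s of the link
    return throughput_mbytes / capacity_mbytes

# 95 MB/s on a gigabit link is already 76% of the theoretical 125 MB/s,
# and well over 90% of the ~100 MB/s usually achievable in practice.
print(round(link_utilization(95, 1000), 2))  # -> 0.76
```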
http://www.xinfengit.com/200907/1848581.html
During performance testing, you often encounter the problem that database CPU utilization does not go up. Common causes:
1. Network bandwidth problems
1.1 Both the tested environment and the LR machines are on 100 Mbps bandwidth.
1.2 The tested environment and the LR machine are on different bandwidths: the tested environment is on gigabit, while the LR machine is on 100 Mbps.
2. The Controller machine is on 100 Mbps bandwidth, while the tested environment and the LR load generators are on gigabit or higher bandwidth.
You can check whether the switch in the tested environment supports a transfer rate of 100 Mbps or 1000 Mbps.
TP-Link TL-SF1016, transfer rate: 10/100 Mbps
3. Data volume problems
3.1 The network is not the problem (throughput even exceeds 100 MB/s), but resource utilization on the back-end servers is still low.
The database holds very little base data and is almost empty, so database CPU utilization does not go up.
3.2 The database holds a large amount of data (more than 1 million records), but the users actually used in the performance test are linked to very little of it, or to none at all. For example: there are more than 1.5 million transaction-flow records, and the user table currently holds 500 user numbers, of which 200 are linked to data in the flow table, while the test uses only 50 users. If the database CPU does not go up, first rule out network and data-volume limitations, then check whether those 50 concurrent users are actually linked to the flow table, and how many flow records are associated with each customer number (it should be more than 2,000 and less than 100,000; too many would be unrealistic).
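The check above can be sketched as a per-user flow count. This is a hedged illustration using an in-memory SQLite database; the table and column names (user_info, trans_flow, user_no) are assumptions, not the original schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE user_info (user_no TEXT PRIMARY KEY)")
cur.execute("CREATE TABLE trans_flow (flow_id INTEGER PRIMARY KEY, user_no TEXT)")
cur.executemany("INSERT INTO user_info VALUES (?)", [("u001",), ("u002",), ("u003",)])
# u001 has plenty of flow records, u002 has almost none, u003 has none at all
cur.executemany("INSERT INTO trans_flow (user_no) VALUES (?)",
                [("u001",)] * 3000 + [("u002",)] * 5)

# Flow count per test user: a user with too few (or zero) linked records
# will not generate realistic database load during the test.
cur.execute("""
    SELECT u.user_no, COUNT(f.flow_id) AS flow_cnt
    FROM user_info u LEFT JOIN trans_flow f ON u.user_no = f.user_no
    GROUP BY u.user_no ORDER BY u.user_no
""")
for user_no, cnt in cur.fetchall():
    print(user_no, cnt)   # u001 3000 / u002 5 / u003 0
```

Users like u002 and u003 here are exactly the ones that make database CPU stay low even under many virtual users.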
4. JDBC connection pool limit
If there is no problem with the network or the data volume, consider whether the number of database connections is limited, so that the SQL requests of transactions operating on the database never reach the database server at all. The maximum capacity of the JDBC pool (the maximum number of physical connections the connection pool can hold) can be viewed through the middleware's console.
4.1 The database JDBC connection pool limit is set too small; WebLogic's default maximum capacity is 50.
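Why a small pool caps database load can be shown with a minimal sketch (an assumption for illustration, not WebLogic itself): once the pool's maximum capacity of connections is checked out, further requests simply wait, so the database never sees more concurrent work no matter how many virtual users are running.

```python
import queue

class ConnectionPool:
    """Toy fixed-capacity connection pool, standing in for a JDBC pool."""

    def __init__(self, max_capacity):
        self._free = queue.Queue()
        for i in range(max_capacity):          # pre-create the "physical" connections
            self._free.put(f"conn-{i}")

    def acquire(self, timeout=0.1):
        try:
            return self._free.get(timeout=timeout)
        except queue.Empty:
            return None                        # pool exhausted: the caller queues up

    def release(self, conn):
        self._free.put(conn)

pool = ConnectionPool(max_capacity=50)         # pool capped at 50 connections
held = [pool.acquire() for _ in range(60)]     # 60 concurrent users each want one
granted = [c for c in held if c is not None]
print(len(granted))                            # only 50 requests ever reach the database
```

The 10 leftover virtual users wait on the pool rather than loading the database, which is exactly what low database CPU under heavy virtual-user load looks like.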
4.2 If multiple applications are deployed on one application server, check whether the JDBC connections are balanced among them.
5. Application issues
The application's processing capacity really has reached its limit = =
6. Performance test script and data issues
The above is a cause analysis for server load failing to go up during testing.