10 Tips for Writing High-Performance Web Applications

Source: Internet
Author: User
Tags: connection pooling, garbage collection, interface, SQL, .NET, resource, web services, access

This article discusses:
Common myths about ASP.NET performance
Useful ASP.NET performance tips and tricks
Suggestions for working with databases in ASP.NET
Caching and background processing in ASP.NET
This article uses the following technologies: ASP.NET, .NET Framework, IIS

It's incredibly easy to write a Web application with ASP.NET. It is so easy that many developers achieve very good performance without spending much time building their applications. In this article, I'll give you 10 tips for writing high-performance Web applications. My comments are not limited to ASP.NET applications, because they are only one subset of Web applications. This article is not an authoritative guide to Web application performance tuning; that topic could fill an entire book. Instead, think of it as a good starting point.
Outside of sleep and work, I often go rock climbing. Before a climb, I always review the routes in the guidebook and read the tips and advice from people who have climbed them before. But no matter how good the guidebook is, you cannot really climb a challenging route until you have attempted it yourself. Likewise, you can only truly learn to write high-performance Web applications after you have faced performance problems or worked on a high-throughput site.
My personal experience comes from working as an infrastructure program manager on the Microsoft ASP.NET team, running and managing www.asp.net, and helping to architect Community Server, which is the next version of several well-known ASP.NET applications (it integrates ASP.NET Forums, .Text, and nGallery into one platform). I am sure that these techniques, which have helped me, will benefit you as well.
You should consider separating an application into several logical tiers. You may have heard of the term 3-tier (or n-tier) physical architecture. These are prescribed architectural patterns that physically partition functionality across processes and/or hardware. More hardware can be added when the system needs to scale. However, crossing process and machine boundaries carries performance costs that you should almost always avoid. So, whenever possible, run the ASP.NET pages and their associated components together in the same application.
Because of the serialization of data and marshaling across boundaries between code and tiers, using Web services or remote calls can reduce performance by 20 percent or more.
The data tier is a bit different, because the database usually runs on dedicated hardware. The cost of talking to the database is still high, however, so the data tier should be the first place you look when optimizing code.
Before you start solving your application's performance problems, be sure to profile the application to determine exactly where the problems lie. Key performance counter values, such as the counter that measures the percentage of time spent in garbage collection, are also very useful for finding out where an application spends most of its time. Intuition about where the time goes is often wrong.
This article describes two types of performance improvements: large optimizations, such as using the ASP.NET Cache, and small micro-optimizations. These micro-optimizations are sometimes the most interesting: a small change to code that is called thousands and thousands of times can have a big cumulative effect. With a large optimization, you may see a big jump in overall performance. With a micro-optimization, a given request may be trimmed by only milliseconds, but across all the requests made per day the resulting improvement can be significant.

Performance of the data tier

When tuning an application's performance, there is a simple litmus test you can use to prioritize: does the code access the database? If so, how often? Note that the same test can be applied to code that uses Web services or remote calls, but that is not covered in this article.
If a database request is required in a particular code path and you see other areas you could optimize first, such as string handling, set them aside for the moment and follow the priority established above. Unless you have an unusual performance problem, your time is best spent optimizing how long it takes to connect to the database, how much data is returned, and how many round-trips to the database are made.
With this general information in mind, let's look at 10 tips that can help your application perform better. I'll begin with the changes that can make the biggest difference.

Tip 1--Return Multiple Result Sets

Review your database code to see if any code paths access the database more than once. Each such round-trip reduces the number of requests per second your application can serve. By returning multiple result sets in a single database request, you can cut the total time spent communicating with the database. You will also make your system more scalable, because you reduce the work the database server must do to manage requests.
Although you can return multiple result sets using dynamic SQL, I prefer stored procedures. Whether business logic should reside in stored procedures is debatable, but I think that if the logic in a stored procedure constrains the data returned (reducing the size of the dataset, the time spent on the network, and the filtering work the logic tier must do), it's a good thing.
Using a SqlCommand instance and its ExecuteReader method to populate strongly typed business classes, you can move the result-set pointer forward by calling NextResult. Figure 1 demonstrates a sample session that populates several ArrayLists with typed classes. Returning only the data you need from the database also reduces memory allocations on the server.
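The pattern above can be sketched as follows. This is a minimal illustration, not the article's Figure 1: the stored procedure name "GetUserAndOrders" and the User and Order classes are hypothetical placeholders assumed to exist in your application.

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumed: "GetUserAndOrders" returns two result sets --
// the user row first, then that user's orders.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("GetUserAndOrders", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@UserId", userId);
    conn.Open();

    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())               // first result set: the user
            user = new User((string)reader["UserName"]);

        reader.NextResult();                // advance to the second result set

        while (reader.Read())               // second result set: the orders
            orders.Add(new Order((int)reader["OrderId"]));
    }
}
```

Both result sets come back on a single round-trip, so the per-request database cost is paid once instead of twice.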

Tip 2--Paged Data Access

The ASP.NET DataGrid provides a great capability: data paging support. When paging is enabled in the DataGrid, only a fixed number of records is shown at a time. In addition, a paging user interface is displayed at the bottom of the DataGrid for navigating through the records. The paging UI lets you move backwards and forwards through the records, displaying a fixed number at a time.
One drawback is that DataGrid paging requires all of the data to be bound to the grid control. That is, your data tier must return all of the data, and the DataGrid then filters out the records to display based on the current page. If 100,000 records are returned as you page through the DataGrid, 99,975 records are thrown away on each request (assuming a page size of 25). As the number of records grows, the application's performance suffers, because more and more data must be sent on each request.
One good approach to writing better paging code is to use stored procedures. Figure 2 demonstrates a sample stored procedure that pages through the Orders table in the Northwind database. It is quite simple: just pass in the page index and the page size. The appropriate result set is computed first and then returned.
In Community Server, we wrote several paging controls to accomplish data paging. As you'll see, I used the idea discussed in Tip 1 and returned two result sets from one stored procedure: the total number of records and the requested data.
The total number of records returned can vary depending on the query being executed. For example, a WHERE clause can be used to constrain the data returned. To calculate the total number of pages to display in the paging UI, the total number of records must be known. For example, if there are 1,000,000 total records and a WHERE clause filters them down to 1,000 records, the paging logic needs to know that total in order to render the paging UI correctly.
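The data-access side of this idea can be sketched as below. This is not Figure 2 itself but a hedged illustration: the "GetOrdersPaged" stored procedure name is a placeholder assumed to return two result sets, the total record count followed by one page of rows, as described above.

```csharp
using System.Data;
using System.Data.SqlClient;

// Assumed: "GetOrdersPaged" takes a page index and page size and returns
// two result sets -- the total record count, then the requested page.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("GetOrdersPaged", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@PageIndex", pageIndex);
    cmd.Parameters.AddWithValue("@PageSize", pageSize);
    conn.Open();

    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        reader.Read();
        int totalRecords = (int)reader[0];              // first result set: total count
        int totalPages = (totalRecords + pageSize - 1) / pageSize;

        reader.NextResult();                            // second result set: one page of rows
        while (reader.Read())
            orders.Add(new Order((int)reader["OrderId"]));
    }
}
```

Only one page of data ever crosses the wire, and the total count needed by the paging UI arrives in the same round-trip.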

Tip 3--Connection Pooling

Setting up a TCP connection between your Web application and SQL Server is an expensive operation. Developers at Microsoft have been using connection pooling for a long time, a technique that allows them to reuse connections to the database. Rather than creating a new TCP connection on each request, a new connection is established only when none is available in the connection pool. When a connection is closed, it is returned to the pool, where it remains connected to the database, as opposed to completely tearing down the TCP connection.
Of course, you need to watch out for leaking connections. Always close your connections when you're finished with them. I repeat: no matter what anyone says about the garbage collection feature in the Microsoft .NET Framework, you must always explicitly call the Close or Dispose method of your Connection object when you are finished with it. Do not count on the common language runtime (CLR) to clean up and close your connection for you at a predetermined time. The CLR will eventually destroy the class and force the connection closed, but you have no guarantee when garbage collection of the object will actually happen.
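The simplest way to guarantee this is the C# using statement, a minimal sketch of which follows; the query string here is only illustrative.

```csharp
using System.Data.SqlClient;

// The using statement guarantees Dispose (and therefore Close) is called
// even if an exception is thrown, so the connection always goes back
// to the pool instead of leaking.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
    {
        int orderCount = (int)cmd.ExecuteScalar();
    }
} // conn.Dispose() runs here, returning the connection to the pool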
To use connection pooling optimally, there are a couple of rules to live by. First, open the connection, do the work, and then close the connection. It's okay to open and close the connection multiple times per request if you have to, rather than keeping it open and passing it around between different methods. Second, use the same connection string (and the same thread identity, if you are using integrated authentication). If you don't use the same connection string, for example by customizing it based on the logged-in user, you won't get the same optimization that connection pooling provides. And if you use integrated authentication while impersonating a large set of users, your pooling will also be much less effective. The .NET CLR Data performance counters can be very useful when you attempt to track down any performance issues related to connection pooling.
Whenever your application connects to a resource running in another process, such as a database, you should optimize by focusing on the time spent connecting to the resource, the time spent sending and receiving data, and the number of round-trips. Optimizing any kind of process hop in your application is the first step toward better performance.
The application tier contains the logic that connects to your data tier and transforms data into meaningful class instances and business processes. In Community Server, for example, this is where you populate the Forums and Threads collections and apply business rules such as permissions; most importantly, it is where the caching logic is performed.

Tip 4--ASP.NET Cache API

One of the very first things you should do before writing a line of application code is to architect the application tier to maximize and exploit the ASP.NET Cache feature.
If your components run within an ASP.NET application, you simply need a reference to System.Web.dll in your application project. When you need access to the Cache, use the HttpRuntime.Cache property (the same object is also accessible through Page.Cache and HttpContext.Cache).
There are several guidelines for caching data. First, if the data can be used more than once, caching is a good candidate. Second, if the data is general rather than specific to a given request or user, it's a great candidate for the cache. If the data is user- or request-specific but is long-lived, it can still be cached, even if it may not be used frequently. Third, an often-overlooked rule is that sometimes you can cache too much. Generally speaking, to reduce the chance of out-of-memory errors on an x86 machine, you should avoid running a process with more than 800MB of private bytes, so the cache should have a ceiling. In other words, you may be able to reuse the result of a computation, but if that computation takes 10 parameters, you might attempt to cache on 10 permutations, and that can get you into trouble. The most common support call for ASP.NET is for out-of-memory errors caused by over-caching, especially of large datasets.
There are several important features of the Cache that you must understand. The first is that the Cache implements a least-recently-used algorithm, allowing ASP.NET to force cache purges, automatically removing unused items from the cache if memory drops to a low level. The second is that the Cache supports dependency-based expiration, which can force invalidation by time, by key, or by file. Time-based expiration is the most commonly used, but ASP.NET 2.0 introduces a more powerful invalidation type: database cache invalidation. That is, entries in the cache are automatically removed when data in the database changes. For more information on database cache invalidation, see Dino Esposito's Cutting Edge column in the July 2004 issue of MSDN Magazine. For the architecture of the cache, see Figure 3.
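A typical use of the Cache API follows the check-then-insert pattern sketched below. This is a minimal illustration, assuming a hypothetical "Products" cache key and a LoadProducts() helper standing in for an expensive database call.

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

// Look in the cache first; on a miss, load the data and insert it
// with a five-minute absolute expiration.
DataTable products = HttpRuntime.Cache["Products"] as DataTable;
if (products == null)
{
    products = LoadProducts();  // hypothetical expensive database call
    HttpRuntime.Cache.Insert(
        "Products",
        products,
        null,                           // no CacheDependency
        DateTime.Now.AddMinutes(5),     // absolute expiration
        Cache.NoSlidingExpiration);
}
```

The null dependency argument is where a file, key, or (in ASP.NET 2.0) SQL cache dependency would go to get the invalidation behaviors described above.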


Figure 3 asp.net Cache

Tip 5--Per-Request Caching


Earlier in this article, I mentioned that small improvements to frequently traversed code paths can produce big overall performance gains. One of my absolute favorites of these I call per-request caching.
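The idea can be sketched with HttpContext.Items, which stores data only for the lifetime of a single request. This is a hedged illustration: the SiteSettings class, the "SiteSettings" key, and the GetSettingsFromDatabase() loader are hypothetical names, not part of the original article.

```csharp
using System.Web;

// Per-request caching: within one request, repeated calls to GetSettings()
// hit the database at most once; the item is discarded when the request ends.
public static SiteSettings GetSettings()
{
    HttpContext context = HttpContext.Current;
    SiteSettings settings = context.Items["SiteSettings"] as SiteSettings;
    if (settings == null)
    {
        settings = GetSettingsFromDatabase();   // hypothetical loader
        context.Items["SiteSettings"] = settings;
    }
    return settings;
}
```

Unlike the application-wide Cache in Tip 4, nothing here outlives the request, so this is appropriate for data that is needed many times per request but may differ between requests.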


