Did your boss or your customers complain again in the New Year that the company website keeps getting slower? These days everyone can build a website: with Visual Studio, even Xiaomiao downstairs or Wang Dayun next door can write an ASP.NET site. Two sites may look identical on the surface, yet their behind-the-scenes performance can be worlds apart, especially for enterprise websites that many people use at the same time, and this is also where the difference in a programmer's value shows. This article offers some ideas for improving website performance, covering hardware, software, and programming techniques. You are welcome to share your own experience or tips as well.
1. Re-adjust or re-design the database schema and indexes
When an online system performs poorly, the root cause usually lies at the level of database design and SQL statements; how the .NET program itself is written is secondary.
First, normalize the database appropriately. For example, avoid putting frequently used fields and rarely used fields in the same table, which slows down data scans; split the rarely used fields out into a separate table.
2. Rewrite SQL statements and check whether queries actually use the indexes
The performance gap between a poorly written "correlated subquery" and a well-written "independent subquery" can be the difference between under one second and several minutes.
Some SQL keywords, merely by appearing in a statement, can render a table's index completely or partially useless, forcing a row-by-row scan of the entire table. Examples include NOT, NOT IN, !=, <>, OR, and similar keywords.
A fuzzy query such as LIKE '%keyword' can also defeat the index, whereas LIKE 'keyword%' will not.
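As a minimal sketch of the two points above (the Products table and its index are hypothetical examples, not from any system in this article):

```sql
-- Assumed table and index, for illustration only:
-- CREATE INDEX IX_Products_Name ON Products (ProductName);

-- Leading wildcard: the index on ProductName cannot be used
-- for a seek, so the whole table (or index) gets scanned.
SELECT * FROM Products WHERE ProductName LIKE '%chocolate';

-- Trailing wildcard only: an index seek on the prefix is possible.
SELECT * FROM Products WHERE ProductName LIKE 'chocolate%';

-- Negations such as <> or NOT also tend to defeat the index:
SELECT * FROM Products WHERE ProductName <> 'chocolate';  -- likely a scan
```

Checking the query's actual execution plan (for example with SET SHOWPLAN in SQL Server) is the reliable way to confirm whether the index is really being used.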
3. Use a native DataProvider
Abandon OleDb in favor of ADO.NET's native DataProviders, such as SqlClient and OracleClient. However, if your company insists on using a database like Sybase, whose DB driver has not been updated since 2003, you will have to keep using the poorer-performing OleDb.
I used Visual Studio's built-in stress testing tool to measure the gap between OleDb and SqlClient: simulating 30 concurrent users fetching 10 thousand records through the browser, the two already differed by nearly a second, and the gap becomes more obvious as the database holds more data or more users are online at the same time.
4. Cache with programs or software
Cache within the program, for example with ASP.NET's built-in Cache mechanism (available since the 1.x era), or with third-party helper software and frameworks.
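A minimal sketch of the built-in Cache mechanism (the "Products" key and the LoadProductsFromDb helper are hypothetical): the query result is kept in memory for a fixed period, so repeated requests skip the database entirely.

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static DataTable GetProducts()
    {
        Cache cache = HttpRuntime.Cache;
        DataTable products = cache["Products"] as DataTable;
        if (products == null)
        {
            // Cache miss: hit the database once, then keep the
            // result for 10 minutes (absolute expiration).
            products = LoadProductsFromDb();
            cache.Insert("Products", products, null,
                         DateTime.Now.AddMinutes(10),
                         Cache.NoSlidingExpiration);
        }
        return products;
    }

    private static DataTable LoadProductsFromDb()
    {
        // Placeholder: run the real query here.
        return new DataTable("Products");
    }
}
```

Unlike Session, this Cache region is shared by all users of the server, so one database round trip can serve thousands of requests.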
5. Use hardware for caching or buffering, and install an AP (application) server
The following are two examples provided by netizens:
1) iThome: how a major gaming site improved its web page performance
Quoted from the original article:
Various shortcomings led to a sharp drop in the number of website users. Facing a pile of problems, Chen xx decided to make major adjustments to the site, rewriting all of the previous web programs and SQL query statements, which took the team three months.
Chen xx also added a group of application servers to the original web-server-plus-database-server architecture, to serve as the cache data source for the web servers.
After the revision, query speed on the new site improved greatly. Previously, daily statistics showed that out of more than 500,000 queries processed, some exceeded 3 seconds; after the revision, fewer than 10 queries per week take more than 3 seconds, and those are mostly triggered by heavy internal batch jobs.
Because the previously used L4 switch was old and had poor load capacity, Chen xx chose to replace it with a new device to increase capacity. Since the application-server architecture was about to go live anyway, he took the opportunity to update the network architecture as well. Chen xx said this architecture, combined with a high-capacity L4 switch, boosts the site's processing performance and helps defend against network attacks. The site still suffers sporadic attacks afterwards, but they no longer have much impact.
2) Digital Wall: personal experience in website operation (part 2)
Quoted from the original article:
At its peak, my US-hosted blog's traffic reached 800,000 requests per day. That number is not especially high, and would be a piece of cake for a real programmer, but I am only a half-baked engineer. With my limited knowledge the program was probably not well written, and the hosting provider repeatedly warned me, demanding that I improve the site's performance. In the end I decided to build a cache system.
After the cache system went online, database reads and writes dropped from 800,000 to 160,000 per day. During that period I also invited expert friends to help optimize the database structure, which was very helpful. Along the way I learned that a good cache system is extremely important for websites that provide a Widget function.
Being able to relocate the entire website to another hosting provider at any time was due not only to adjustments in the program itself, but also to the prevalence of website management software. Here I recommend a package called Plesk. Some hosting providers will install Plesk for you directly, either free or for an extra charge. All of Plesk's management functions are exposed through a web interface, so they can be used without any extra setup, greatly lowering the technical skill required.
Besides Plesk, there are other website management options; the combination of WHM and cPanel is also a common solution. I prefer Plesk because it is easy to use; no wonder its market share has long been the largest. More technically inclined engineers may prefer WHM + cPanel, since it is more flexible. Either way, these tools can save you a great deal of time.
6. Install a physical load-balancing device. Some server operating systems also provide this kind of configuration.
7. Programming techniques: ADO.NET
Prefer DataReader over DataSet/DataTable whenever you can. The former reads quickly without consuming much memory; the latter is flexible but slower and eats a chunk of memory per user. If you accept the SqlDataSource control's default, a DataSet, as the data source for DropDownList controls, performance will naturally suffer once a page carries a pile of drop-down menus.
The premise, though, is that the programmer understands ADO.NET to a certain degree. If one only ever drags TableAdapter, DataTable, and .xsd files around Visual Studio's graphical designer, there is nothing to discuss.
If you create a Primary Key on a DataTable, the DataTable builds an index to track whether newly added rows satisfy the constraint. ADO.NET 2.0 processes these indexes with a red-black tree algorithm, which makes the index cheaper to maintain when a DataTable holds a large amount of data; the downside is that creating the index itself costs some performance.
In addition, try to complete database access and retrieval within a single DB connection. One connection can serve multiple DbCommand objects, so there is no need to construct a new DbConnection every time.
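A hedged sketch of both points (the connection string and the Orders queries are placeholders): one SqlConnection is opened once and reused by several SqlCommand objects, and SqlDataReader streams rows forward-only instead of buffering them all in memory.

```csharp
using System;
using System.Data.SqlClient;

class DataReaderDemo
{
    static void Main()
    {
        // Placeholder connection string, for illustration only.
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Northwind;Integrated Security=True"))
        {
            conn.Open();  // open once, reuse for every command below

            SqlCommand countCmd = new SqlCommand(
                "SELECT COUNT(*) FROM Orders", conn);
            int total = (int)countCmd.ExecuteScalar();

            SqlCommand listCmd = new SqlCommand(
                "SELECT OrderID, CustomerID FROM Orders", conn);
            using (SqlDataReader reader = listCmd.ExecuteReader())
            {
                // Forward-only, read-only: fast and low on memory.
                while (reader.Read())
                {
                    Console.WriteLine("{0} / {1}",
                        reader.GetInt32(0), reader.GetString(1));
                }
            }
        }   // Dispose closes the single connection
    }
}
```

Contrast this with filling a DataSet, which would materialize every row in server memory for each user before the page can render.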
Let me recommend a good ADO.NET book:
Programming Microsoft ADO. NET 2.0 Core Reference:
http://www.amazon.com/Programming-Microsoft%C2%AE-ADO-NET-Core-Reference/dp/073562206X/ref=sr_1_1?ie=UTF8&s=books&qid=1230971264&sr=1-1
It explores some in-depth topics rarely covered by books on the market, such as various Oracle + ADO.NET applications, the Connection Pooling feature, and various database bulk (batch) job techniques.
8. Programming techniques: .NET syntax
Avoid the practice shown in some books of stuffing a DataTable or a large amount of data directly into the Session; a system launched that way is certain to die. When many users are online at once, the memory consumption is considerable, because Session data is stored per user in the server's memory, unlike the Cache, which is a single region of memory shared by all users on the server. In many ASP.NET scenarios, Session can be replaced by the HiddenField control or ViewState.
Use "generics" instead of the old collection types. Besides type safety, generics avoid the Boxing/Unboxing conversions of .NET value types that hurt performance. For example:
Use List<T> instead of the old ArrayList, and Dictionary<TKey, TValue> instead of Hashtable. Because ArrayList and Hashtable store their elements as the object type, storing or retrieving a value type such as int triggers boxing/unboxing.
In most cases, generic classes such as List<T> and Dictionary<TKey, TValue> are both more efficient and type-safe.
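A small self-contained comparison of the two (illustrative only):

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

class BoxingDemo
{
    static void Main()
    {
        // Old: ArrayList stores object, so every int is boxed on Add
        // and unboxed (with a cast) on retrieval.
        ArrayList oldList = new ArrayList();
        oldList.Add(1);               // boxing: int -> object
        int first = (int)oldList[0];  // unboxing + runtime cast

        // New: List<int> stores ints directly; no boxing, and the
        // compiler rejects wrong element types at build time.
        List<int> newList = new List<int>();
        newList.Add(1);               // no boxing
        int same = newList[0];        // no cast needed

        Console.WriteLine(first + same); // prints 2
    }
}
```

In a tight loop over thousands of elements, skipping those per-element heap allocations is where the measurable gain comes from.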
Of course, the premise of all the above is that the system is developed in .NET; legacy systems still maintained in classic ASP or other non-OOP languages are outside this discussion.
9. Programming techniques: database "transactions"
Did you know that SQL Server's default "transaction isolation level" is ReadCommitted? When you use SqlTransaction in ADO.NET with the defaults, the moment one person modifies a record, everyone else reading that record is blocked: all the other users sit in front of their browsers waiting, unable to do anything else.
Oracle's transaction behavior never blocks readers this way. Its read consistency, similar to the Snapshot Isolation newly added in SQL Server 2005, at least lets users read the previously committed version of a record while another transaction's change is still waiting to be committed or rolled back, instead of leaving them stuck in front of the browser.
However, "Snapshot Isolation" is disabled by default in SQL Server 2005. If a system built on SQL Server must avoid locking users out, you can, where the project requirements allow, switch to the loose ReadUncommitted isolation level. It takes no locks at all, but it may produce Dirty Reads. ADO.NET's IsolationLevel enumeration defines the following seven values; the interested reader can consult ADO.NET books or the MSDN Library:
Chaos
ReadCommitted // SQL Server's default
ReadUncommitted // loose; allows Dirty Reads
RepeatableRead
Serializable // the strictest; takes the most locks
Snapshot
Unspecified
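Switching the level is a one-argument change when starting the transaction (the connection string and Orders query below are placeholders):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class IsolationDemo
{
    static void Main()
    {
        // Placeholder connection string, for illustration only.
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Northwind;Integrated Security=True"))
        {
            conn.Open();

            // Explicitly loosen the default ReadCommitted level.
            // ReadUncommitted takes no shared locks, so readers are
            // never blocked by writers, at the cost of Dirty Reads.
            SqlTransaction tx =
                conn.BeginTransaction(IsolationLevel.ReadUncommitted);

            SqlCommand cmd = new SqlCommand(
                "SELECT COUNT(*) FROM Orders", conn, tx);
            int count = (int)cmd.ExecuteScalar();

            tx.Commit();
            Console.WriteLine(count);
        }
    }
}
```

Whether a Dirty Read is acceptable is a per-project decision; reports and dashboards often tolerate it, while financial updates usually cannot.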
10. ASP.NET paging
The default behavior of GridView + SqlDataSource is to retrieve every record again each time a user changes pages or sorts. With one million rows in the database, every page change by every user hauls all of them back, consuming web/AP server memory, database system resources, and network bandwidth; you can imagine the resulting website performance.
To save money, many companies outsource their small websites to lowest-bid studios, inexperienced students, or SOHO freelancers. The most dreadful part is that during development, and right after launch while the data volume is small, nobody feels anything; like cancer, it suddenly erupts later.
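One common cure is server-side paging, so the database returns only the rows for the requested page. A sketch using ROW_NUMBER(), available since SQL Server 2005 (the Orders table and page size are hypothetical):

```sql
-- Fetch page 2 (rows 21-40) of a hypothetical Orders table.
-- Only 20 rows cross the wire, not all 1,000,000.
WITH NumberedOrders AS
(
    SELECT ROW_NUMBER() OVER (ORDER BY OrderDate DESC) AS RowNum,
           OrderID, CustomerID, OrderDate
    FROM   Orders
)
SELECT OrderID, CustomerID, OrderDate
FROM   NumberedOrders
WHERE  RowNum BETWEEN 21 AND 40;
```

The page then binds the GridView to this small result set and renders its own pager, instead of letting SqlDataSource drag the whole table back on every click.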
11. The ASP.NET AJAX UpdatePanel control is not omnipotent
The following references are from MSDN Magazine:
For better or worse, the UpdatePanel control is the darling of the ASP.NET AJAX community. I say "better" because UpdatePanel makes partial-page rendering quite simple, and "worse" because its simplicity and ease of use come at the cost of efficiency and wasted bandwidth.
UpdatePanel brings AJAX's magical goodness to ordinary web pages, but it cannot deliver the efficiency normally associated with AJAX. For example, did you know that when an UpdatePanel control makes an asynchronous AJAX callback to the server to update its content, the request carries the page's regular ASP.NET postback data, including its ViewState? Pages with bloated ViewState suffer reduced performance, and such pages are all too common in ASP.NET applications.
If you use the UpdatePanel control, you need to know what you are doing. In many cases, from a performance standpoint, an application is better off making asynchronous calls to WebMethods or page methods instead of using UpdatePanel.
When you use UpdatePanel to refresh a page without flicker, you might think you are building an efficient site; after all, UpdatePanel uses AJAX, doesn't it? Unfortunately, if you inspect the network traffic during an UpdatePanel update, you will find you have saved almost nothing, at least on the upload side. The ViewState (and other data) transmitted to the server during a normal postback is also transmitted during an UpdatePanel callback. In fact, the data sent in UpdatePanel's asynchronous XML-HTTP request is nearly identical to that of a standard ASP.NET postback. Here is one of ASP.NET AJAX's little secrets: UpdatePanel is easy to use, but its communication is not efficient.
There is little you can do to make UpdatePanel itself more efficient, but you can stop using it and update page content with other ASP.NET AJAX features that achieve the same result far more efficiently. It takes a bit more effort, but the outcome is usually judged worthwhile, because you can dramatically reduce the amount of data transmitted between client and server.
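A minimal page-method sketch (the page class and GetServerTime method are hypothetical names): the static method is called from client script directly, so no ViewState travels with the request.

```csharp
using System;
using System.Web.Services;

// Code-behind of a hypothetical Products.aspx page.
public partial class Products : System.Web.UI.Page
{
    // A [WebMethod] on the page, callable from JavaScript
    // via PageMethods without a full postback.
    [WebMethod]
    public static string GetServerTime()
    {
        return DateTime.Now.ToString("HH:mm:ss");
    }
}
```

On the .aspx side this requires a ScriptManager with EnablePageMethods="true"; client script then calls PageMethods.GetServerTime(onSuccess), sending only the call itself rather than the entire form and ViewState.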
12. Design Patterns
Although "Design Patterns" were not invented to solve performance problems, they can keep inexperienced newcomers from doing something foolish. Studying .NET architecture, OOAD, Design Patterns, and the related frameworks in depth will also help raise your worth and your salary.
All of the above come from my personal experience and observations. Thank you for reading, and you are welcome to share your own experiences or tips.
- Build high-performance and scalable ASP. NET websites
- Top ten essential tools for building ASP. NET websites
- Best practices for website performance optimization
- Practice
- Detailed descriptions of website performance test indicators