10 Secrets of ASP.NET for Improving Performance and Scalability (Part I)


Introduction

ASP.NET has many "secrets" worth exploring; once you discover them, they can greatly improve the performance and scalability of your website. For example, the membership and profile providers have some hidden bottlenecks that are easy to fix, making authentication and authorization faster. In addition, the ASP.NET HTTP pipeline can be trimmed so that each request avoids executing unnecessary code and is better protected against attack. More than that, the ASP.NET worker process can exert its full power once its default limits are lifted. Caching page fragments on the browser side (not on the server side) can significantly cut the download time a request requires. Loading the UI on demand gives your website a fast and responsive feel. Finally, a content delivery network (CDN) and the correct use of HTTP cache headers can make your website respond much more quickly.

In this article you will learn these techniques for greatly improving the performance and scalability of your ASP.NET application. The techniques discussed are:

    • ASP.NET pipeline optimization
    • ASP.NET process configuration optimization
    • What to do before an ASP.NET website goes live
    • Content delivery network (CDN)
    • Caching Ajax calls in the browser
    • Making the best use of the browser cache
    • Loading the required UI progressively for a fast, smooth experience
    • Optimizing the ASP.NET 2.0 profile provider
    • How to query the ASP.NET 2.0 membership tables without taking the site offline
    • Preventing denial-of-service (DoS) attacks

These techniques can be applied to any ASP.NET website, especially sites that use the ASP.NET 2.0 membership and profile providers.

ASP.NET pipeline optimization

Several default HttpModules sit in the ASP.NET request pipeline and participate in every request. For example, SessionStateModule handles every request, parses the session cookie, and loads the appropriate session into HttpContext. Not all of these modules are always needed. For example, if you do not use the membership and profile providers, you do not need FormsAuthenticationModule. If you do not authenticate your users with Windows authentication, you do not need WindowsAuthenticationModule. These modules simply sit in the pipeline, executing unnecessary code for every request.

These default modules are defined in the machine.config file (located in the $WINDOWS$\Microsoft.NET\Framework\$VERSION$\CONFIG directory).
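The relevant section looks roughly like the following (list abridged and type names simplified; the exact entries vary by framework version, so treat this as illustrative):

```xml
<httpModules>
  <add name="Session" type="System.Web.SessionState.SessionStateModule" />
  <add name="WindowsAuthentication" type="System.Web.Security.WindowsAuthenticationModule" />
  <add name="FormsAuthentication" type="System.Web.Security.FormsAuthenticationModule" />
  <add name="PassportAuthentication" type="System.Web.Security.PassportAuthenticationModule" />
  <add name="RoleManager" type="System.Web.Security.RoleManagerModule" />
  <add name="UrlAuthorization" type="System.Web.Security.UrlAuthorizationModule" />
  <add name="FileAuthorization" type="System.Web.Security.FileAuthorizationModule" />
  <add name="AnonymousIdentification" type="System.Web.Security.AnonymousIdentificationModule" />
  <add name="Profile" type="System.Web.Profile.ProfileModule" />
</httpModules>
```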


You can remove these default modules from your web application by adding <remove> nodes to your web.config file.
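For example, a site that uses database-backed forms authentication and no session state could drop the unused modules like this (a sketch; keep any module your site actually depends on):

```xml
<httpModules>
  <!-- Remove pipeline modules this site does not use -->
  <remove name="Session" />
  <remove name="WindowsAuthentication" />
  <remove name="PassportAuthentication" />
  <remove name="AnonymousIdentification" />
  <remove name="UrlAuthorization" />
  <remove name="FileAuthorization" />
</httpModules>
```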


The above configuration suits websites that use database-backed, forms-based authentication and need no session support, so all of the modules listed above can be safely removed.

ASP.NET process configuration optimization

The ASP.NET process model configuration defines process-level properties such as the number of threads ASP.NET uses, how long a thread may block before timing out, and how many requests may wait for I/O to complete. In many cases these defaults are too restrictive. Hardware has become quite cheap, and dual-core servers with gigabytes of RAM are now a very common choice. The process model configuration can therefore let ASP.NET threads use more system resources and provide better scalability per server.

A typical ASP.NET installation creates a machine.config with the following configuration:
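In ASP.NET 2.0 this typically amounts to a single auto-configured element (a sketch; earlier versions spell out each attribute explicitly):

```xml
<processModel autoConfig="true" />
```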


You need to replace this automatic configuration with explicit values for the various attributes in order to customize how the ASP.NET worker process behaves. For example:
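A tuned configuration along the lines described below might look like this (the values mirror the discussion that follows; treat them as a starting point rather than a prescription):

```xml
<processModel
    enable="true"
    timeout="Infinite"
    idleTimeout="Infinite"
    shutdownTimeout="00:00:05"
    requestLimit="Infinite"
    requestQueueLimit="5000"
    memoryLimit="60"
    autoConfig="false"
    maxWorkerThreads="100"
    maxIoThreads="100"
    minWorkerThreads="40"
    minIoThreads="30"
    responseDeadlockInterval="00:03:00" />
```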


All values other than the following are left at their defaults:

  • maxWorkerThreads - The default is 20 per process. On a dual-core computer, 40 threads are therefore available to ASP.NET, meaning ASP.NET can process at most 40 requests concurrently at a time on a dual-core machine. I increased it to 100 to give ASP.NET more threads per process. If your application is not CPU-intensive and can easily handle more requests, increase this value, especially if your web application makes many web service calls or uploads/downloads a lot of data without putting pressure on the CPU. When ASP.NET exhausts its allowed worker threads, it stops processing incoming requests; requests are queued and kept waiting until a worker thread is freed. This usually happens when your site receives more traffic than you anticipated. In that case, if your CPU is idle, increase the number of worker threads per (ASP.NET) process.
  • maxIoThreads - The default is 20 per process. On a dual-core computer, 40 threads are allocated to ASP.NET for I/O operations, meaning ASP.NET can process 40 I/O requests in parallel on a dual-core server. I/O requests include file reads/writes, database operations, and HTTP requests generated by web service calls from your web application. You can set it to 100 if your server has enough system resources to handle more I/O requests. This helps significantly when your web application frequently downloads or uploads data and calls many external web services in parallel.
  • minWorkerThreads - When the number of available ASP.NET worker threads falls below this value, ASP.NET starts pushing incoming requests into a queue. You can therefore set this value to a relatively low number to increase the number of concurrent requests that can be processed. Do not set it too low, however, because web application code may need to do background work, and parallel processing requires a certain number of free worker threads.
  • minIoThreads - The same as minWorkerThreads, but for I/O threads. This value can be lower than minWorkerThreads because no parallel processing of your own happens on I/O threads.
  • memoryLimit - Specifies the maximum allowed memory size, as a percentage of total system memory, that the worker process may consume before ASP.NET launches a new process and reassigns in-flight requests. If your web application runs alone on a dedicated box and no other process needs the RAM, you can set it high, such as 80. However, if you have a leaky application that keeps consuming memory, set it lower so that the leaky process is recycled quickly before the leak becomes unmanageable. This is especially useful when you use a COM component that leaks memory.

In addition to the processModel node, there is another very important node, system.net, where you can specify the maximum number of simultaneous requests that can be made to a single IP address.
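A sketch of the setting described below, raising the per-IP connection limit to 100 for all addresses:

```xml
<system.net>
  <connectionManagement>
    <!-- Allow up to 100 simultaneous outbound connections per IP -->
    <add address="*" maxconnection="100" />
  </connectionManagement>
</system.net>
```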

The default value is 2, which is far too low: it means your web application cannot open more than two simultaneous connections to any one IP address. Websites that need to fetch external content are severely constrained by this default. Here I set it to 100; if your web application makes many calls to one specific server, consider setting it even higher.

What to do before an ASP.NET website goes live

If you are using the ASP.NET 2.0 membership provider, you should make some adjustments to your web.config:

    • Add the applicationname attribute to profileprovider. If you do not add a special name, profileprovider uses a guid. Therefore, on your local machine, you will have a guid and another guid on the published server. If you copy your local database to the Publishing Server, you will not be able to reuse existing records in your local database, and Asp.net will create a new application on the Publishing Server. You need to add it here:
    • No matter when a page request is complete, profileprovider automatically saves the profile. Therefore, this may lead to an unnecessary update of your database, which has a significant performance loss! Disable auto save and use profile. Save () in your code to do so explicitly.
    • Role manager keeps querying the database to obtain the role of a user. This also has a significant performance loss. You can allow role manager to cache role information into cookies to avoid this. However, it also allocates approximately 2 kb cookies to users who do not have many roles, but this is not a common scenario. Therefore, you can store role information in cookies safely.
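Combined, the three adjustments might look like this in web.config (provider names, connection string name, and application name are placeholders):

```xml
<!-- Explicit applicationName + no automatic profile saves -->
<profile enabled="true" automaticSaveEnabled="false">
  <providers>
    <clear />
    <add name="AspNetSqlProfileProvider"
         type="System.Web.Profile.SqlProfileProvider"
         connectionStringName="LocalSqlServer"
         applicationName="YourAppName" />
  </providers>
</profile>

<!-- Cache role information in a cookie instead of hitting the database -->
<roleManager enabled="true" cacheRolesInCookie="true" />
```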

The above three settings are mainly for high-traffic websites.

Content Delivery Network

Every request from a browser reaches your server over backbone networks that span the world. A request may have to cross a number of countries, continents, and oceans to reach your server, which makes it slow. For example, if you host your server in the USA and someone visits your website from Australia, each request travels from one side of the planet to the other to reach your server and then back to the browser. If your site has many static files, such as images, CSS, and JavaScript, sending requests for them across the world and downloading them takes a lot of time. If you could place a server in Australia and redirect Australian users to it, each request would take a fraction of the time it takes to reach the United States. Not only would network latency be lower, the data transmission route would be faster, so static content would download much more quickly.

Note: since this section is not specific to ASP.NET and concerns network planning, we will not discuss it further here.


Cache Ajax calls in a browser

A browser can cache images, JavaScript files, and CSS files on the hard disk, and it can also cache XMLHTTP calls made via HTTP GET. The cache is URL-based: if the URL is the same, then when the request is made again the response is loaded from the local cache instead of from the server. Basically, the browser can cache any HTTP GET call and return the cached data based on the URL. If you make an XMLHTTP call as an HTTP GET and the server returns certain response headers that instruct the browser to cache the response, future calls will return the data directly from the cache. This eliminates the network round-trip latency and download time.

We cache the user's state, so when the user visits the website again over the next few days, the page comes straight from the browser cache rather than from the server, making the second load very fast. We also cache parts of the page depending on the user's actions; when the user performs the same action again, the cached result is loaded directly from the local cache, saving the network round trip.

The browser caches an XMLHTTP response when you return an Expires header with it. There are two response headers you need to return with the response in order to cache the data:
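For indefinite caching, the headers look roughly like this (the date is an arbitrary far-future example):

```
HTTP/1.1 200 OK
Expires: Tue, 01 Jan 2030 00:00:00 GMT
Cache-Control: public
```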


This tells the browser to cache the response data until January 2030; even if you make the same XMLHTTP request with the same parameters, you will get the cached response from your local computer. There are, of course, more refined ways to control browser caching. For example, response headers can tell the browser to cache data for 60 seconds and then reconnect to the server for fresh data once those 60 seconds are up, and they can also prevent proxies from returning cached responses once the browser's local copy is more than 60 seconds old.
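That policy corresponds to a header along these lines (max-age governs the 60-second window; must-revalidate and proxy-revalidate force revalidation once it lapses):

```
Cache-Control: private, must-revalidate, proxy-revalidate, max-age=60
```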


Let's try to produce such response headers from an ASP.NET web service call.
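A sketch of a script-callable web method that applies this cache policy (the method name and return value are illustrative):

```csharp
[WebMethod]
[ScriptMethod(UseHttpGet = true)]
public string CachedGet()
{
    TimeSpan cacheDuration = TimeSpan.FromMinutes(1);

    // Ask ASP.NET to emit a public, 60-second cache policy
    Context.Response.Cache.SetCacheability(HttpCacheability.Public);
    Context.Response.Cache.SetExpires(DateTime.Now.Add(cacheDuration));
    Context.Response.Cache.SetMaxAge(cacheDuration);
    Context.Response.Cache.AppendCacheExtension(
        "must-revalidate, proxy-revalidate");

    // Return the server time so caching is easy to observe
    return DateTime.Now.ToString();
}
```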


This produces response headers of the following form:
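On ASP.NET 2.0, what actually comes back looks something like this (exact values illustrative; note max-age):

```
HTTP/1.1 200 OK
Cache-Control: private, max-age=0
Expires: Thu, 08 Nov 2007 10:16:50 GMT
Content-Type: application/json; charset=utf-8
```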


The Expires header is set correctly. However, Cache-Control is the problem: you can see that max-age is set to 0, which prevents the browser from caching in any form. If you actually wanted to prevent caching, this is exactly the Cache-Control header you would set; it makes every call hit the server as if everything happened in real time.

As a result, repeated calls return fresh output every time; nothing is cached.


ASP.NET 2.0 has a bug: you cannot change the max-age header. Because max-age is stuck at 0, ASP.NET 2.0 sets Cache-Control to private, since max-age=0 means no caching is wanted. There is therefore no way to make ASP.NET 2.0 return the proper response headers for caching. The cause is that the ASP.NET AJAX framework intercepts web service calls before the request executes and incorrectly sets max-age to 0 by default.

"Hacker" is coming! After decompiling the source code of the httpcachepolicy class (class of context. response. cache object), I found the following code:


Somehow this._maxAge has already been set to 0, and the check if (!this._isMaxAgeSet || (delta < this._maxAge)) prevents it from ever being set to a larger value. We therefore need to bypass the SetMaxAge method and set the value of _maxAge directly. Of course, this requires reflection:
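A sketch of the reflection workaround (the private field name _maxAge comes from the decompiled class; requires System.Reflection):

```csharp
// Overwrite the private _maxAge field that SetMaxAge refuses to raise
TimeSpan cacheDuration = TimeSpan.FromMinutes(1);
FieldInfo maxAgeField = Context.Response.Cache.GetType().GetField(
    "_maxAge", BindingFlags.Instance | BindingFlags.NonPublic);
maxAgeField.SetValue(Context.Response.Cache, cacheDuration);
```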


This will return the following headers:
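With the field patched, the response should carry a header along these lines:

```
Cache-Control: private, must-revalidate, proxy-revalidate, max-age=60
```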


Now max-age is set to 60, so the browser will cache the response for 60 seconds. If you make the same request again within 60 seconds, it returns the same response. A simple test is to display the date/time string returned from the server and confirm that it does not change within the window.


One minute later, the cache expires and the browser sends another request to the server.


There is one more problem to solve. In web.config, you will see that the ASP.NET AJAX template adds a trust level entry.


This prevents us from setting the _maxAge field of the response object, because doing so requires reflection. You will have to delete this entry or change the value to Full.
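Assuming the standard trust element, the change looks like this:

```xml
<!-- Before: Medium trust blocks reflection on non-public members -->
<trust level="Medium" />

<!-- After: Full trust allows the reflection workaround -->
<trust level="Full" />
```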


To be continued...



