ASP.NET Website Performance Improvement: Shorten the Homepage Generation Time

Article Directory
    • Build and deploy the project in release mode
    • Publish the website
    • Disable debug mode
    • Reduce the number of assemblies
    • Use Server.Transfer instead of Response.Redirect
    • Specify the default page in the URL
    • Permanent redirection
    • Search Engine

The following bottlenecks affect the homepage generation time:

    • Memory pressure
    • Cache
    • CPU utilization
    • Thread Utilization
    • External resource wait time
How to identify the bottlenecks

Memory

First, determine whether the server has exhausted its memory. If it has, CPU usage and disk I/O will also increase, because when memory is exhausted the swap file on disk is used. Solving a memory pressure problem can therefore also reduce CPU and disk pressure.

Run Perfmon and add the Memory -> Pages/sec counter to Performance Monitor. This counter records the number of pages swapped to or from disk per second. If the page file is used frequently, a memory bottleneck exists.
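If you want to read this counter from code rather than the Perfmon UI, here is a minimal sketch using the System.Diagnostics API (the category and counter names are the standard Windows ones; the one-second sampling interval is an arbitrary choice):

using System;
using System.Diagnostics;
using System.Threading;

class MemoryPressureCheck
{
    static void Main()
    {
        // Memory -> Pages/sec counts pages read from or written to disk per second.
        using (var pagesPerSec = new PerformanceCounter("Memory", "Pages/sec"))
        {
            pagesPerSec.NextValue();   // rate counters need two samples; prime the first
            Thread.Sleep(1000);
            Console.WriteLine("Pages/sec: {0:F1}", pagesPerSec.NextValue());
        }
    }
}

A sustained high value here, together with high disk I/O, suggests the page file is being worked hard and memory is the bottleneck.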

Cache

Caching improves the running efficiency of a program, but it occupies memory.

CPU

In Perfmon, use the Processor -> % Processor Time counter.

Thread

The number of threads in the thread pool is limited. In .NET 2.0, the limit is 12 threads per CPU. In .NET 3.5 and 4.0, it is 12 threads per CPU under IIS 7 Classic mode and 100 threads per CPU under Integrated mode.

You can use the ASP.NET -> Request Execution Time (milliseconds spent processing each request) and ASP.NET -> Requests Current (number of requests, including queued requests and requests in progress) counters, together with the ASP.NET Applications -> Requests Executing (requests currently being executed) counter.

If Requests Current stays much larger than Requests Executing, many requests are waiting for a thread.
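The same comparison can be scripted. A sketch, assuming the unversioned ASP.NET counter categories are installed and using the __Total__ instance (the 2x threshold is only an illustration):

using System;
using System.Diagnostics;

class ThreadPressureCheck
{
    static void Main()
    {
        using (var current = new PerformanceCounter("ASP.NET", "Requests Current"))
        using (var executing = new PerformanceCounter("ASP.NET Applications", "Requests Executing", "__Total__"))
        {
            float cur = current.NextValue();   // queued + executing requests
            float exe = executing.NextValue(); // requests holding a thread right now
            Console.WriteLine("Requests Current: {0}, Requests Executing: {1}", cur, exe);
            if (cur > 2 * exe)
                Console.WriteLine("Many requests appear to be waiting for a thread.");
        }
    }
}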

Long wait

If the Request Execution Time value is high while CPU and memory pressure are low, each executing thread is probably waiting for an external resource, such as a database.

Other measures

There are some other measures that reduce server load.

Build and deploy the project in release mode

Release mode reduces CPU and memory usage.
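For example, from the command line (the project file name is a placeholder):

msbuild MyWebApplication.csproj /p:Configuration=Release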

Publish the website

If you use a website project instead of a web application project, publishing the website compiles it in release mode.
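The Publish Web Site command is based on the aspnet_compiler tool; a roughly equivalent command line (both paths are placeholders) is:

aspnet_compiler -v / -p C:\Source\MySite C:\Publish\MySite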

Disable debug mode

Disable debug mode in Web.config.
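The relevant setting is the debug attribute of the compilation element:

<configuration>
  <system.web>
    <compilation debug="false" />
  </system.web>
</configuration>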

Reduce the number of assemblies

When you publish a website, each folder generates its own assembly. The following two links describe ways to reduce the number of assemblies:

http://msdn.microsoft.com/en-us/library/aa479568.aspx

http://www.microsoft.com/downloads/en/details.aspx?familyid=0aa30ae8-c73b-4bdd-bb1b-fe697256c459&displaylang=en

Reduce round-trip time: use Server.Transfer instead of Response.Redirect

The disadvantage of using Server.Transfer is that two different pages end up with the same URL. This may confuse visitors, especially when they bookmark the URL or share it with others.

Therefore, only transfer when the target page is clearly related to the current page. For example, if a visitor submits a form, you can transfer to a page that displays the result of the submission, but do not transfer to a completely unrelated page. If you know that a page's postback always moves to another page, use the PostBackUrl property to post back directly to that page.
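A minimal sketch of the form-submission case (the button handler and Result.aspx are placeholder names):

protected void SubmitButton_Click(object sender, EventArgs e)
{
    // ... store the submitted form data ...

    // Server.Transfer runs Result.aspx within the same server round trip;
    // Response.Redirect("Result.aspx") would instead send the browser a 302
    // and cost an extra round trip. Note the browser keeps the original URL.
    Server.Transfer("Result.aspx");
}

The PostBackUrl alternative is set in markup, for example <asp:Button ID="SubmitButton" runat="server" Text="Submit" PostBackUrl="~/Result.aspx" />.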

Specify the default page in the URL

IIS automatically redirects http://mydomain/myfolder to http://mydomain/myfolder/. In addition, http.sys does not cache default pages requested without their file name. Linking to http://mydomain/myfolder/default.aspx avoids the redirect and allows caching.

Permanent redirection

With a permanent (301) redirect, browsers and proxies update their caches, and search engines use it as well. This reduces traffic to the old page.

Implementing a 301 redirect in code:

 
Response.StatusCode = 301;
Response.AddHeader("Location", "NewPage.aspx");
Response.End();

.NET 4 and later versions:

Response.RedirectPermanent("NewPage.aspx");

You can also configure permanent redirection in IIS.
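In IIS 7 and later, one way is the httpRedirect element in Web.config (this assumes the HTTP Redirection module is installed; the destination URL is a placeholder):

<configuration>
  <system.webServer>
    <httpRedirect enabled="true"
                  destination="http://mydomain/newpage.aspx"
                  httpResponseStatus="Permanent" />
  </system.webServer>
</configuration>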

Reduce CNAME records

In DNS, an A record is better than a CNAME record; resolving a CNAME may require an extra round trip.
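In zone-file notation, the two alternatives look like this (the names and address are placeholders):

; a CNAME answer may force the resolver to perform a second lookup:
www.example.com.   3600  IN  CNAME  server1.example.com.

; an A record answers in one step:
www.example.com.   3600  IN  A      203.0.113.10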

SSL

Using SSL has a big impact on small static pages, because the time the server spends negotiating with the browser exceeds the time needed to generate and transmit the page. The impact on expensive pages, such as pages involving database access, is not as great.

When using SSL, use absolute link addresses whenever an encrypted page references a non-encrypted page or a non-encrypted page references an encrypted page. Alternatively, use absolute link addresses across the entire site.

If there are images on an encrypted page, the images must also be served over HTTPS. Otherwise, the visitor will receive a warning.
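For example (the domain is a placeholder):

<!-- absolute URLs keep the scheme explicit when crossing between http and https -->
<a href="https://www.example.com/checkout.aspx">Check out</a>
<img src="https://www.example.com/images/padlock.png" alt="Secure checkout" />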

Unwanted access requests

All requests increase server load, especially those involving expensive operations such as database access. You may want to block requests that you do not need.

Search Engine

It is a good thing to have search engines crawl your website, but you do not need them to access files that are irrelevant to search listings or that you want to keep private. Such files may include:

    • Images
    • JavaScript and CSS files
    • Pages that require authentication

To stop search engines from crawling certain parts of the website, put a plain-text file named robots.txt in the root directory of the website.

To block search engine access completely, use:

 
User-agent: *
Disallow: /

To prevent search engines from accessing specific folders:

 
User-agent: *
Disallow: /images/
Disallow: /js/
Disallow: /css/
Disallow: /private/

Some mainstream search engines support a crawl-delay directive that slows their crawlers down:

 
User-agent: *
Crawl-delay: 10

Another method is to provide a site map. A site map is an XML file placed in the root directory of the website that lists the pages of the website.

The complete site map specification: http://www.sitemaps.org/.
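A minimal sitemap.xml following that specification (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-01-01</lastmod>
  </url>
</urlset>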

Leeching

A module that prevents leeching (hotlinking): http://www.iis.net/community/default.aspx?tabid=34&i=1288&g=6.

CAPTCHA

If the website has forms that update the database or perform other expensive actions, make sure each form is submitted by a person rather than by a robot.

One way to verify that a form is submitted by a person is to use a CAPTCHA.

CAPTCHA libraries that can be inserted into a form:

    • reCAPTCHA: http://recaptcha.net
    • BotDetect CAPTCHA: http://captcha.biz
Scraper program

If the website has a lot of useful content, someone may use a scraper program to read the content automatically, for example to republish it on their own website and run advertisements against it.

Because simple scraper programs do not spoof their source IP address and read pages at a high rate, a small piece of code can block their accesses.

This code cannot prevent DoS attacks, in which the source IP address is spoofed, making detection much harder.

The following code blocks simple scraper programs:

using System;
using System.Web;
using System.Web.Caching;

public class BotDefence
{
    private const int IntervalSeconds = 5;
    private const int MaxRequestsInInterval = 100;
    private const int BlockedPeriodSeconds = 20;

    public static bool IsBotAttack()
    {
        string visitorIp = HttpContext.Current.Request.UserHostAddress;
        VisitorInfo visitorInfo = HttpContext.Current.Cache[visitorIp] as VisitorInfo;

        if (visitorInfo == null)
        {
            // First request from this IP: start counting hits within the interval.
            HttpContext.Current.Cache.Insert(visitorIp, new VisitorInfo(), null,
                DateTime.Now.AddSeconds(IntervalSeconds), Cache.NoSlidingExpiration);
        }
        else
        {
            if (visitorInfo.Blocked)
            {
                return true;
            }

            visitorInfo.NbrHits++;

            if (visitorInfo.NbrHits > MaxRequestsInInterval)
            {
                // Too many hits within the interval: block this IP for a while.
                visitorInfo.Blocked = true;
                HttpContext.Current.Cache.Insert(visitorIp, visitorInfo, null,
                    DateTime.Now.AddSeconds(BlockedPeriodSeconds), Cache.NoSlidingExpiration);
                return true;
            }
        }

        return false;
    }

    private class VisitorInfo
    {
        public int NbrHits;
        public bool Blocked;

        public VisitorInfo()
        {
            NbrHits = 1;
            Blocked = false;
        }
    }
}
 
 
Add the check to the page's Page_Load event:

protected void Page_Load(object sender, EventArgs e)
{
    if (BotDefence.IsBotAttack())
    {
        Response.End();
        return;
    }
}
 

This is just simple sample code; do not add it to a production environment as-is. You will probably need a whitelist so that you do not block Google's crawler or other search engines.

Finally, remember that an IP address is not reliable evidence of a visitor's identity:

    • An ISP or company may use a proxy or firewall, so that all of its visitors appear to have the same IP address.
    • Some ISPs assign dial-up users dynamic IP addresses, so another user may later appear with the same IP address.
Usability Testing

Another source of unwanted traffic is visitors who cannot find what they want, or who do not know what you want them to do. If your website is easier to use, this kind of traffic decreases.

 
 