Why does IIS7/7.5 Gzip not work?


In an earlier article on IIS7 gzip compression configuration, I covered how to set up gzip in IIS7. By default IIS7 treats Javascript as dynamic content for compression purposes, and because dynamic compression backs off under CPU load, gzip-compressed content may not be returned. However, I found that even after configuring Javascript for static compression, requests for Javascript files still sometimes came back without gzip. This article shares the problem I ran into and the process and reasoning I used to solve it, in the hope that it helps you.

Random gzip Behavior

The symptom was simple but strange. Sometimes, requesting the same script several times in a row, the first response came back without gzip and the next one was gzipped; after waiting a while and requesting again, gzip was gone once more. At other times, no matter how I accessed it, the response was always gzipped.

Problem Analysis

Excessive CPU load could be ruled out, because the problem also occurred in the middle of the night, when the machine's CPU load was normal.

The following Warning entries in the system log caught my attention:

Warning 2010/4/13 6:32:41 IIS-W3SVC-WP 2264 None

The directory specified for caching compressed content C:\inetpub\temp\IIS Temporary Compressed Files\XXX is invalid. Static compression is being disabled.

Filtering the log showed that this Warning had been appearing since March 7, roughly once every 1-3 days.

IIS uses this folder to store its cache of compressed static content. According to the warning, the directory is invalid, so static compression gets disabled. But I could open the folder just fine, and it did indeed contain cached scripts.

There are many posts asking about this on the IIS forums, and some gave me useful ideas. Some suggested it might be a folder permission problem, but the permissions on the compression cache directory did include the application pool identity, so that was not the issue. I then used Process Monitor to watch the w3wp.exe process; no errors occurred when it accessed the file system.

Later I found an article by Kanwaljeet Singla (a member of the IIS development team) describing how IIS7's compression module changed relative to IIS6. It mentioned a key point: to reduce overhead, IIS only compresses static resources that are frequently accessed. IIS7 introduces two new configuration attributes, frequentHitThreshold and frequentHitTimePeriod: a resource counts as "frequently accessed" once it has been requested frequentHitThreshold times within the frequentHitTimePeriod window. The default is two consecutive requests to the same URL within 10 seconds. Only when a URL qualifies as frequently accessed does IIS compress the response and drop the result into the cache directory.

This explains why the server does not return gzip-compressed content the first time the script is requested, but does on the next access.
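To make this behavior concrete, here is a small Python sketch (my own illustration, not IIS source code) of a rolling-window "frequent hit" check with the default of 2 hits on the same URL per 10 seconds:

```python
import time
from collections import defaultdict


class FrequentHitTracker:
    """Hypothetical model of IIS7's frequentHitThreshold /
    frequentHitTimePeriod check, for illustration only."""

    def __init__(self, threshold=2, period_seconds=10.0):
        self.threshold = threshold          # frequentHitThreshold
        self.period = period_seconds        # frequentHitTimePeriod
        self.hits = defaultdict(list)       # url -> recent hit timestamps

    def is_frequently_hit(self, url, now=None):
        """Record a hit; return True once the URL has been requested
        `threshold` times within the rolling `period` window."""
        now = time.monotonic() if now is None else now
        window = [t for t in self.hits[url] if now - t <= self.period]
        window.append(now)
        self.hits[url] = window
        return len(window) >= self.threshold


tracker = FrequentHitTracker()
print(tracker.is_frequently_hit("/app.js", now=0.0))   # first hit: not frequent, no gzip
print(tracker.is_frequently_hit("/app.js", now=3.0))   # second hit within 10 s: compress
print(tracker.is_frequently_hit("/app.js", now=60.0))  # long gap: window reset, no gzip again
```

The first request always fails the check, which matches the observed "first response uncompressed, second response gzipped" pattern.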

However, according to the design described in that article, once a resource has been compressed and cached, the "frequent access" check should no longer apply, since serving the cached copy has no compression overhead. I was confident the script was already cached on the server's disk, so why was gzip still missing when I tried again later?

IIS7 has a feature called Failed Request Tracing, which lets you trace what happens to problematic requests inside IIS. To answer my question, I enabled it for this website.

IIS7 Bug ?!

Open IIS7 Manager, locate the Actions panel on the right for the website in question, and enable "Failed Request Tracing". By default the trace log files are stored in the %SystemDrive%\inetpub\logs\FailedReqLogFiles directory.

Then, in the Features view, find "Failed Request Tracing Rules" and configure a rule:

To avoid interference from other scripts, I restricted the rule to only the script under test.

Enter 200 for the response status code, since we want to trace these "successful" responses.

The next step is to select the trace providers, i.e. the sources that emit trace events. Since our script goes through only the static file handler, selecting the WWW Server provider is enough here.
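For reference, a tracing rule like the one described above ends up in web.config roughly as follows (the path "test.js" is a placeholder for whatever script you are tracing; the selected areas are illustrative):

```xml
<configuration>
  <system.webServer>
    <tracing>
      <traceFailedRequests>
        <!-- Trace only the script under test to avoid noise -->
        <add path="test.js">
          <traceAreas>
            <!-- WWW Server provider, compression and static-file events -->
            <add provider="WWW Server" areas="Compression,StaticFile" verbosity="Verbose" />
          </traceAreas>
          <!-- Trace "successful" 200 responses, since those are the ones missing gzip -->
          <failureDefinitions statusCodes="200" />
        </add>
      </traceFailedRequests>
    </tracing>
  </system.webServer>
</configuration>
```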

Then we can exercise the problematic script. Whenever IIS hits one of the defined conditions while processing a request, it records a log in XML format. Open the FailedReqLogFiles directory, which contains files named like fr000001.xml, one per traced request. Open the file, search for the "Compression" keyword, and you find the following:

<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
 <System>
  <Provider Name="WWW Server" Guid="{3A2A4E84-4C21-4981-AE10-3FDA0D9B0F83}"/>
  <EventID>0</EventID>
  <Version>1</Version>
  <Level>4</Level>
  <Opcode>3</Opcode>
  <Keywords>0x40</Keywords>
  <TimeCreated SystemTime="2010-05-13T16:59:08.086Z"/>
  <Correlation ActivityID="{00000000-0000-0000-FD2B-008041000061}"/>
  <Execution ProcessID="10752" ThreadID="14504"/>
  <Computer>DMZ</Computer>
 </System>
 <EventData>
  <Data Name="ContextId">{00000000-0000-0000-FD2B-008041000061}</Data>
  <Data Name="Reason">14</Data>
 </EventData>
 <RenderingInfo Culture="zh-CN">
  <Opcode>STATIC_COMPRESSION_NOT_SUCCESS</Opcode>
  <Keywords>
   <Keyword>Compression</Keyword>
  </Keywords>
  <freb:Description Data="Reason">NOT_FREQUENTLY_HIT</freb:Description>
 </RenderingInfo>
 <ExtendedTracingInfo xmlns="http://schemas.microsoft.com/win/2004/08/events/trace">
  <EventGuid>{E60CEE96-4472-448D-A13C-2170B18220EC}</EventGuid>
 </ExtendedTracingInfo>
</Event>

Note the NOT_FREQUENTLY_HIT text above. So the response is not gzipped because the request still fails the "frequent access" check, even though the resource has already been compressed and stored on disk. Apparently the compression cache entry has some validity period, after which the "frequently accessed" logic is re-evaluated.

After many tests I found this expiry to be about 5 minutes. That is, once you access a resource frequently enough, IIS compresses it and stores it in the local compression cache directory; if you come back roughly five minutes later, the "frequent access" check is evaluated all over again.

To confirm this, I asked Kanwaljeet Singla (@kjsingla) on Twitter. To my surprise he replied quickly, probably because I was testing in the middle of the night, which was daytime for him. He tested it and confirmed that the "frequent access" check happens before the "has this response already been compressed?" check. This is indeed a bug in IIS7/7.5.

The workaround is simple: define "frequent access" more loosely, for example a single request within one minute:

<configuration>
    <system.webServer>
        <serverRuntime frequentHitThreshold="1" frequentHitTimePeriod="00:01:00" />
    </system.webServer>
</configuration>

Is the problem solved?

Just when I thought the problem was solved and went to sleep satisfied, I found the next day that the script responses were missing gzip again, and this time they stayed uncompressed even after several requests. That could not be explained by the previous night's conclusion. Very depressing -_-||

Calm down!

In the system log I again saw the "invalid compression directory" warning, but this time the compression directory was really gone. Recalling what had happened: the live website had thrown an unhandled exception, and after fixing it I had restarted the application pool. Could it be that when a website is in a faulted state, forcibly recycling the application pool deletes the static compression directory? Stranger still, when I recycled the application pool once more, the compression directory was re-created.

To verify this conjecture, I repeated the sequence: make the website throw an exception => restart the application pool => request the script again. The problem reproduced. I then created a new test website with the same configuration and ran the same steps, but there the compression directory stayed intact, with none of the above symptoms.

Comparing carefully, the live website's environment differed from my test site in one respect: the live site constantly receives external requests and is under steady use, while the test site had no interfering traffic at all.

To simulate the real situation, I wrote an iMacros macro to request the script from the test site continuously. At first the compression directory existed; then I manually recycled the application pool in the middle of the run, and the compression cache directory was indeed deleted.

After careful testing, I finally reached a conclusion:

In IIS7/7.5, if the application pool runs under an administrator account, and the pool is recycled while it is busy compressing static files, the compression cache directory is deleted. The directory is not re-created until the application pool is next restarted in a healthy state.

This conclusion has not been confirmed by the IIS team, but tests on several of my machines running IIS7 and 7.5 all give the same result. Thinking back to the earlier system logs, the warning appeared every 1-3 days simply because IIS7's default application-pool recycle interval is 1740 minutes, i.e. one day and five hours. I checked the timestamps of all the static compression directory warnings in the system log, and they line up well with multiples of that recycle interval.
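The interval arithmetic is easy to verify; a quick Python sanity check of the 1740-minute default:

```python
# IIS7's default application-pool recycle interval, in minutes.
minutes = 1740

# Convert to days/hours/minutes to match the "one day and five hours" figure.
hours, rem_min = divmod(minutes, 60)
days, hours = divmod(hours, 24)
print(f"{minutes} min = {days} day(s), {hours} h, {rem_min} min")
# -> 1740 min = 1 day(s), 5 h, 0 min
```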

There are two solutions:

  • Unless it is really necessary, do not run the application pool under an administrator account; use a built-in account, and this problem will not occur.
  • Disable fixed-interval recycling and instead schedule recycling at a relatively idle time, for example in the middle of the night.
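As an illustration of both options, here is a sketch of the relevant applicationPool settings in applicationHost.config (the pool name "MyAppPool" and the 3 AM schedule are placeholders; ApplicationPoolIdentity is the built-in identity available on IIS 7.5):

```xml
<applicationPools>
    <add name="MyAppPool">
        <!-- Built-in identity instead of an administrator account -->
        <processModel identityType="ApplicationPoolIdentity" />
        <recycling>
            <!-- time="00:00:00" disables the fixed-interval (1740-minute) recycle -->
            <periodicRestart time="00:00:00">
                <schedule>
                    <!-- Recycle once a day at an idle hour instead -->
                    <add value="03:00:00" />
                </schedule>
            </periodicRestart>
        </recycling>
    </add>
</applicationPools>
```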
I hope the conclusions from these two late nights help you~

--Kevin Yang
