Virus detection mechanism: a virus is a piece of code, and its final form is a file. Any data stream transmitted over the Internet must be reassembled into files before an anti-virus scanning engine can detect anything in it. When browsing a Web page, we actually receive many HTML files, GIF images, and script files over HTTP, and these files eventually arrive on the user's machine. To filter viruses out of HTTP, a gateway anti-virus product must reassemble the data in HTTP traffic into files, that is, reconstruct it up to layer 7 of the OSI model. All current solutions do this with proxy technology, whether a built-in proxy or ICAP: when a user's browser requests a Web page, the gateway anti-virus device first downloads all the page's files and GIF images, scans them, and only then passes them on to the user. We need to be clear that restoring HTTP traffic to files through a proxy is one process, and scanning those files for viruses is another; the two can be carried out separately.

This highlights the first problem: enterprise users on an internal LAN are unwilling to configure a proxy in IE just for HTTP virus scanning, and many applications that do not support proxies would become unusable, so HTTP virus scanning must also be transparent to users.

The second problem shows up directly: after HTTP virus scanning, the user may perceive a very serious delay. Normally page latency should not exceed about 5 seconds, while the default HTTP timeout is 60 seconds (or is it 30?). We can estimate the theoretical latency added by HTTP scanning. Assume a company's Internet egress is 10 Mb/s; in theory the download speed exceeds 1 MB/s, but once we account for inter-carrier peering (with China Netcom, for example) and per-site rate limiting, a single connection to an arbitrary website usually manages only a speed measured in KB/s.
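The two separate processes described above can be sketched in a few lines. This is only an illustration of the idea, not any vendor's implementation; the function names and the tiny signature list are hypothetical stand-ins.

```python
# Process 1: reassemble an HTTP response stream into one complete file.
# Joining all packets first means a signature split across packet
# boundaries is still found -- the reason the gateway must buffer
# the whole file before scanning.
def reassemble_http_body(chunks):
    return b"".join(chunks)

# Hypothetical signature database: byte patterns standing in for real
# virus signatures (here, a fragment of the EICAR test string).
SIGNATURES = [b"EICAR-STANDARD-ANTIVIRUS-TEST-FILE"]

# Process 2: simple signature matching over the reassembled file.
def scan_file(blob):
    return any(sig in blob for sig in SIGNATURES)

def proxy_fetch(chunks):
    """The gateway downloads everything first, scans, and only then
    releases the file to the user -- the source of the latency
    discussed below."""
    blob = reassemble_http_body(chunks)
    if scan_file(blob):
        raise ValueError("virus detected; response blocked")
    return blob
```

A clean page is returned unchanged, while a response containing a known pattern is blocked before the user ever sees a byte of it.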
According to the scanning method described above, the gateway anti-virus device must download a file completely before scanning it for the user. Ignoring round-trip delay, the latency of fetching the small GIF files on a page is under 1 s. But what if the user wants to download a 10 MB patch or other large file? Downloading 10 MB at a single-connection speed measured in KB/s already takes a long time, and the virus scan of the 10 MB file adds more time on top for decompression, signature matching, and the rest. Long before that completes, the HTTP connection has timed out and been dropped.

Faced with this, the anti-virus vendors' compromise is to set a size limit on scanned files, so that large files pass through directly and only small files are actually scanned. Admittedly this is a workable solution: the scanning delay can be eliminated. But it does not resolve the conflicts of fetching files through a proxy; worse, multi-threaded HTTP download tools either cannot be used in this proxy mode or do not work properly.

For the proxy side of the problem, proxy vendors offer another solution: while the proxy downloads a file on the user's behalf, it keeps sending connection-preserving packets to the user so that the HTTP connection is not dropped. This is a problem every proxy runs into, and on the surface this workaround does avoid the conflict.

The third problem is hard to overcome. As Internet technology develops, more and more multi-protocol applications, IM programs in particular, tunnel their traffic over HTTP to guarantee compatibility and connectivity. As a result, a great deal of MSN chat data, because it rides on HTTP, goes through the proxy and is scanned by the virus gateway; not only does MSN then become unusable or unreliable, it also adds to the virus gateway's load.
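The back-of-the-envelope timeout argument above can be made concrete. All numbers here are illustrative assumptions, not measurements; in particular, the single-connection speed was left unspecified in the original discussion, so 100 KB/s is purely a placeholder.

```python
# Estimate the seconds a user waits under download-then-scan proxying:
# the gateway must finish the full download, then finish the scan,
# before the first byte reaches the browser.
def http_scan_latency(file_mb, link_kb_per_s, scan_s):
    download_s = file_mb * 1024 / link_kb_per_s
    return download_s + scan_s

# A 10 MB patch over an assumed 100 KB/s single connection, plus an
# assumed 5 s for decompression and signature matching:
wait = http_scan_latency(10, 100, 5)
print(wait)           # roughly 107 s
print(wait > 60)      # exceeds even the 60 s default HTTP timeout
```

Under these assumptions the connection times out long before the scan finishes, which is exactly why vendors fall back to size limits and keep-alive tricks.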
At the same time, network viruses are rampant. Many viruses and botnets inside a LAN continuously send out large numbers of scattered connection requests, many of them on port 80. The proxy and virus gateway cannot ignore this flood of port-80 requests to non-existent external addresses, yet spending large amounts of resources responding to them achieves nothing: the network-layer attack traffic of these viruses cannot be reassembled into files, so the gateway's HTTP virus scan is powerless against it.

What can we do about this glaring contradiction? One idea: before HTTP traffic reaches the proxy and the anti-virus gateway, have a device that can detect network-layer virus attacks filter out the mass of abnormal HTTP requests first, and only then hand the traffic to the gateway for processing. Yes, that would solve all of these problems, but how much equipment and money would it cost? And if we already have a device that detects network viruses, what do we still need the HTTP proxy gateway for? Even malicious code on a Web page can be fully detected by the anti-virus client installed on the endpoint, and bad websites can be blocked through blacklists or whitelists, which greatly reduces the hidden risks of HTTP access. Spending enormous resources on extremely inefficient HTTP virus scanning, which may catch only a handful of Java viruses in ages while greatly reducing the efficiency of internal staff, is simply not worth it!
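The blacklist-based pre-filtering mentioned above amounts to very little code, which is part of the argument: a cheap filter in front of the gateway spares it from wasting resources on traffic it can never scan anyway. The hostnames and blacklist below are hypothetical examples.

```python
# Hypothetical blacklist of hosts known to be malicious or non-existent
# C2 targets that botnet traffic keeps hammering on port 80.
BLACKLIST = {"bad.example.com", "worm-c2.example.net"}

def prefilter(request_hosts):
    """Drop abnormal port-80 requests before they reach the proxy and
    the AV gateway, so the gateway only spends resources on traffic
    that can actually be reassembled into files and scanned."""
    return [host for host in request_hosts if host not in BLACKLIST]

incoming = ["www.example.org", "bad.example.com", "news.example.org"]
allowed = prefilter(incoming)
print(allowed)  # the blacklisted host never reaches the AV gateway
```

A whitelist variant is the same one-liner with the membership test inverted against an allow set.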