How can we optimize PHP to solve the problems of high website traffic and high concurrency?
First, check whether the server hardware is sufficient to support the current traffic.
Common P4 servers generally support up to 100,000 independent IP addresses per day. If the traffic volume is larger than this, you must first deploy a higher-performance dedicated server; otherwise, no amount of software optimization will completely solve the performance problem.
Second, optimize database access.
Making the front-end fully static is ideal, since it avoids accessing the database at all. However, for frequently updated websites, static pages alone often cannot provide every feature.
Caching technology is another solution: store dynamic data in cache files, and have dynamic web pages read those files directly instead of querying the database. WordPress and Z-Blog both use this caching technique extensively, and I wrote a Z-Blog counter plug-in based on the same principle. If database access is unavoidable, try to optimize the query SQL: avoid statements such as SELECT * FROM, return only the columns you need from each query, and avoid issuing a large number of SQL queries in a short period of time.
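The file-caching idea above can be sketched in a few lines of PHP. This is a minimal illustration, not the actual WordPress or Z-Blog implementation; the function name `cache_get` and the 60-second lifetime are assumptions for the example.

```php
<?php
// Minimal file-cache sketch: serve a value from a cache file while it
// is fresh, and only fall back to the expensive generator (e.g. a
// database query) when the file is missing or stale.
function cache_get(string $key, int $ttl, callable $generate)
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key) . '.dat';

    // Serve from the cache file while it is younger than $ttl seconds.
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }

    // Otherwise regenerate the value and rewrite the cache file.
    $value = $generate();
    file_put_contents($file, serialize($value), LOCK_EX);
    return $value;
}

// Usage: the page reads the counter from the cache file and only
// touches the database (the hypothetical query inside the closure)
// once per 60 seconds, no matter how many visitors arrive.
$count = cache_get('visit_counter', 60, function () {
    // In a real plug-in this would be a database query; a stand-in
    // value keeps the sketch self-contained.
    return 42;
});
echo $count;
```

The same pattern works for any expensive query result: serialized arrays, rendered HTML fragments, or counter values.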
Third, prohibit hotlinking (external leeching).
When external websites hotlink your images or files, they impose significant load on your server, so you should strictly restrict hotlinking. Fortunately, the HTTP Referer header can currently be used to control it: Apache can block hotlinking through configuration, and IIS has third-party ISAPI filters that implement the same function. Of course, a forged Referer can still bypass such checks in code, but deliberate Referer spoofing for hotlinking is rare at present. You can either skip this concern or handle it by non-technical means, for example by adding a watermark to your images.
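As a sketch of the Apache approach mentioned above, a `.htaccess` rule like the following blocks image requests whose Referer is neither empty nor your own site (assuming mod_rewrite is enabled; `example.com` stands in for your domain):

```apache
RewriteEngine On
# Allow an empty Referer (direct visits, some proxies and firewalls).
RewriteCond %{HTTP_REFERER} !^$
# Allow requests referred from your own domain.
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Refuse image requests from everywhere else.
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```

Allowing the empty Referer avoids breaking visitors whose browsers or proxies strip the header, at the cost of letting Referer-less hotlinks through.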
Fourth, control the download of large files.
Large file downloads consume a lot of bandwidth, and on servers without SCSI hard disks they also consume considerable CPU, reducing the website's responsiveness. Therefore, we recommend not offering downloads of files larger than 2 MB.
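One way to enforce such a limit is in the download script itself. This is a minimal sketch under the article's 2 MB recommendation; the constant and the chunked streaming loop are illustrative choices, not a prescribed implementation.

```php
<?php
// Sketch: refuse downloads above a size limit, and stream permitted
// files in small chunks so PHP never holds the whole file in memory.
const MAX_DOWNLOAD_BYTES = 2 * 1024 * 1024; // 2 MB, per the recommendation

function serve_download(string $path): bool
{
    if (!is_file($path) || filesize($path) > MAX_DOWNLOAD_BYTES) {
        http_response_code(403);
        echo 'File unavailable or too large to download.';
        return false;
    }

    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');

    // Stream in 8 KB chunks instead of loading the whole file at once.
    $fh = fopen($path, 'rb');
    while (!feof($fh)) {
        echo fread($fh, 8192);
    }
    fclose($fh);
    return true;
}
```

For files that must exceed the limit, offloading them to a separate download host (as the next section suggests) is usually the better answer.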
Fifth, use different hosts to distribute major traffic.
Place files on different hosts and offer mirrored downloads for users. For example, if you find that your RSS feed consumes a large amount of traffic, use a service such as FeedBurner or FeedSky to serve the feed from another host. That way, most of the traffic from feed readers is concentrated on the FeedBurner host, and the RSS output no longer ties up your own resources.
Sixth, use traffic analysis and statistics software.
Install traffic analysis and statistics software on the website so you can see at once where traffic is being consumed and which pages need optimization; solving traffic problems requires this kind of precise statistical analysis. The traffic analysis software I recommend is Google Analytics.