Looking for an efficient cache class: the server cannot handle the page visits per second

Source: Internet
Author: User
However efficient the cache class, the server cannot keep up with the page requests per second. Each page is different, and there may be millions of pages, so even with cache processing the server cannot handle it. I found a cache class, but it does not seem to solve the load problem.





<?php
// Simple page cache: serve GET responses from files for CACHE_LIFE seconds.
define('CACHE_ROOT', dirname(__FILE__) . '/cache');
define('CACHE_LIFE', 3600);

$cache_file = CACHE_ROOT . '/' . md5($_SERVER['REQUEST_URI']) . '.html';

if ($_SERVER['REQUEST_METHOD'] == 'GET') {
    // serve the cached copy while it is still fresh
    if (file_exists($cache_file) && time() - filemtime($cache_file) < CACHE_LIFE) {
        readfile($cache_file);
        exit;
    }
    // remove expired cache files
    foreach (glob(CACHE_ROOT . '/*.html') as $file) {
        if (time() - filemtime($file) > CACHE_LIFE) { unlink($file); }
    }
    // callback function auto_cache: write the finished page to the cache file
    function auto_cache($content) {
        global $cache_file;
        file_put_contents($cache_file, $content, LOCK_EX);
        return $content;
    }
    ob_start('auto_cache');
} else {
    // delete the cache file if it is not a GET request
    if (file_exists($cache_file)) { unlink($cache_file); }
}
?>
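A side note on the expired-file cleanup in the snippet above: running `unlink` over the whole cache directory on every request scans every cached file, which is exactly the kind of resource drain described later in the thread. A minimal sketch (the cache location is a placeholder) that amortizes the sweep so only a small fraction of requests pay for it:

```php
<?php
// Probabilistic sweep: on average only 1 request in $chance scans the
// cache directory for expired files, instead of every request doing it.
// CACHE_ROOT below is a placeholder location for illustration.
define('CACHE_ROOT', sys_get_temp_dir() . '/page_cache_demo');
define('CACHE_LIFE', 3600); // seconds a cached page stays valid

function sweep_expired($chance = 100)
{
    // Skip the sweep on most requests.
    if (mt_rand(1, $chance) !== 1) {
        return 0;
    }
    $removed = 0;
    foreach (glob(CACHE_ROOT . '/*.html') as $file) {
        if (time() - filemtime($file) > CACHE_LIFE) {
            unlink($file);
            $removed++;
        }
    }
    return $removed;
}
```

A cron job that deletes stale files outside the request path achieves the same thing with even less per-request cost.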




Thank you!


Reply to discussion (solution)

A file cache cannot keep up with throughput that high.

So how should I cache, then?

A file cache cannot keep up with throughput that high.
So what should we do?

Then of course you add servers. At that rate you would have more traffic than a major portal site.

There are memory-based cache servers and in-memory databases.
Pick one based on your hardware configuration and operating system.
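On the memory-cache suggestion: a minimal sketch of one get/set interface, backed by the PHP Memcached extension when asked for and available, otherwise by a plain in-process array. The server address 127.0.0.1:11211 is an assumed default, not a requirement.

```php
<?php
// Sketch: the same cache interface over memcached or a local array.
// Assumes a memcached server on 127.0.0.1:11211 when $useMemcached is
// true and the extension is loaded; the array fallback lets the
// interface be exercised anywhere.
class PageCache
{
    private $mc = null;
    private $local = array();

    public function __construct($useMemcached = false)
    {
        if ($useMemcached && class_exists('Memcached')) {
            $this->mc = new Memcached();
            $this->mc->addServer('127.0.0.1', 11211); // assumed server
        }
    }

    public function get($key)
    {
        if ($this->mc !== null) {
            return $this->mc->get($key);
        }
        return isset($this->local[$key]) ? $this->local[$key] : false;
    }

    public function set($key, $value, $ttl = 3600)
    {
        if ($this->mc !== null) {
            return $this->mc->set($key, $value, $ttl);
        }
        $this->local[$key] = $value; // $ttl ignored in the array fallback
        return true;
    }
}
```

The point of the memory backend is that get/set never touch the filesystem, so the cost per request stays flat no matter how many pages are cached.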

There are memory-based cache servers and in-memory databases.
Pick one based on your hardware configuration and operating system.

Actually I was exaggerating; there are not that many requests. The real symptom is that page loads get very slow once traffic is heavy, and CPU usage runs high. After I deleted the cache files things looked better, so please check whether the problem is in how my cache is written.

There are memory-based cache servers and in-memory databases.
Pick one based on your hardware configuration and operating system.

During peak hours a site gets about 20 visits per second; with 30 websites in total, the combined load reaches 60 visits per second.

First optimize the code and the SQL, then separate out static files and cache them; that covers most high-concurrency cases. If that still is not enough, your volume is beyond ordinary technical solutions, and you should look at how Yahoo and Google do it.
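On "separate and cache the files": once millions of pages are cached into one directory, the directory lookup itself gets slow on many filesystems. A common remedy, sketched here with an assumed cache root, is to shard cache files into subdirectories keyed by a prefix of the URI's hash:

```php
<?php
// Shard cache files across 256 subdirectories using the first two hex
// characters of md5(URI), so no single directory accumulates millions
// of entries. $root is an assumed cache location.
function cache_path($uri, $root)
{
    $hash = md5($uri);
    $dir  = $root . '/' . substr($hash, 0, 2);
    if (!is_dir($dir)) {
        mkdir($dir, 0777, true);
    }
    return $dir . '/' . $hash . '.html';
}
```

With 256 buckets, a million cached pages averages roughly 4,000 files per directory instead of a million in one.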

During peak hours a site gets about 20 visits per second; with 30 websites in total, the combined load reaches 60 visits per second.

Many factors affect site speed: database load, server load, and bandwidth. If it really were millions of page views per second, as the original post suggests, no single server could handle any of those and you would need a server cluster. If you mean 20 requests per second, then in my experience neither the database nor the server is under any pressure at that volume; only bandwidth could become a bottleneck. Multiply the total size of all resources on your page by 20 and check whether that exceeds your bandwidth.
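The multiplication in the reply above, worked through with an assumed 200 KB total page weight:

```php
<?php
// Worked example of the bandwidth check: total page weight times
// requests per second. The 200 KB page weight is an assumed figure.
$page_bytes  = 200 * 1024; // HTML + CSS + JS + images for one page view
$req_per_sec = 20;         // the peak rate quoted in the thread

$bits_per_sec = $page_bytes * $req_per_sec * 8;
$mbps = $bits_per_sec / 1000000; // about 32.8 Mbps
```

So at 20 requests per second, a 200 KB page already needs roughly 33 Mbps of uplink, which would saturate a typical 10 Mbps line long before the CPU or the database struggles.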

Many factors affect site speed: database load, server load, and bandwidth. If it really were millions of page views per second, as the original post suggests, no single server could handle any of those and you would need a server cluster. If you mean 20 requests per second, then in my experience neither the database nor the server is under any pressure at that volume; only bandwidth could become a bottleneck. Multiply the total size of all resources on your page by 20 and check whether that exceeds your bandwidth.

What I have found is that the number of cache files is too large, so every access that touches the cache files ties up resources.

I hope you can answer this. Thank you!


Use FreeMarker to generate static pages.
