I have previously read about the efficiency of PHP include versus I/O streams, notably the well-known "Portal" article at www.ccvita.com and its PHP file cache performance test. I didn't agree with it: how can an include statement be slower than an I/O stream read? Common sense says include should be at least slightly faster, even with serialization added to the I/O side, yet the tests bear the claim out! The code below benchmarks PHP include against an I/O stream read.
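The benchmark reads two cache files that are never shown being created. A minimal setup sketch, assuming a hypothetical sample payload (the file names come from the benchmark code; the array contents are invented for illustration):

```php
<?php
// Hypothetical sample payload -- the original post does not show the cached data.
$data = array('title' => 'Hello', 'body' => str_repeat('x', 1024));

// Cache file read back with file_get_contents() + unserialize().
file_put_contents('CacheTest_SerializeData.php', serialize($data));

// Cache file loaded with include(): a plain PHP script that assigns $x_r.
file_put_contents(
    'CacheTest_IncludeData.php',
    '<?php $x_r = ' . var_export($data, true) . ';'
);
```

Both files carry a .php extension, as in the benchmark, although only the second one actually contains PHP code.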
Test 1: I/O stream + unserialization
function read_cache($filename) {
    if ($datas = file_get_contents($filename)) {
        return $datas;
    }
}

$t1 = gettimeofday();
for ($i = 0; $i < 10000; $i++) {
    $x = read_cache("CacheTest_SerializeData.php");
    $x_r = unserialize($x);
}
$t2 = gettimeofday();
// elapsed time in milliseconds
echo ($t2['sec'] - $t1['sec']) * 1000 + ($t2['usec'] - $t1['usec']) / 1000 . "\n";
Test 2: include
$t1 = gettimeofday();
for ($i = 0; $i < 10000; $i++) {
    include("CacheTest_IncludeData.php");
}
$t2 = gettimeofday();
// elapsed time in milliseconds
echo ($t2['sec'] - $t1['sec']) * 1000 + ($t2['usec'] - $t1['usec']) / 1000 . "\n";
The result: the two runs differ by about 1 second over 10,000 iterations. There is a catch in the middle, though: the length of the serialized string is limited (bound by PHP's array limits), so I/O + serialization cannot meet the requirements when a large number of articles need to be cached.