For a long time, we have stored some infrequently modified data in files rather than in a database, on the assumption that reading a file directly is more efficient than a database query — especially since connect and disconnect time is not even counted here.

The question came up again recently: which is faster, reading a file or reading the database, and by how much? Tianyuan searched for this question and found no replies from other users; perhaps it is too simple. Since VC is not installed at the moment, Tianyuan tested with PHP first and will try C/C++ another time. Because PHP's underlying implementation is also written in C, the results in both environments should be similar — and even a small question like this can yield a useful insight. Here are the test process and results.
The test procedure is as follows:
Note 1: Because the database read goes through a simple wrapper function that is called twice, the file read is likewise wrapped in two nested calls. The database record with ID 1 is the first row in the table and has a unique index.

Note 2: 4 KB data and integer data were each tested in separate runs.
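The listing below calls fnGetMicroTime() and fnWriteCache() without showing their definitions. These are hypothetical reconstructions of what such helpers typically look like, not the article's originals:

```php
<?php
// Assumed timing helper: returns the current Unix timestamp as a float
// with microsecond precision (microtime(true) is available since PHP 5).
function fnGetMicroTime()
{
    return microtime(true);
}

// Assumed cache writer: dumps the fetched content into a file so the
// file-read half of the benchmark has something to read back.
function fnWriteCache($filename, $content)
{
    file_put_contents($filename, $content);
}
```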
set_time_limit(0);

function fnGet($filename)
{
    $content = file_get_contents($filename);
    return $content;
}

function fnGetContent($filename)
{
    $content = fnGet($filename);
    return $content;
}
$times = 100000;

echo 'Database query results:<br />';
//---------------------------------
$begin = fnGetMicroTime();
for ($i = 0; $i < $times; $i++)
{
    $res = $dbcon->mydb_query("SELECT log_Content FROM blog WHERE log_ID = '1'");
    $row = $dbcon->mydb_fetch_row($res);
    $content = $row[0];
}
echo 'fetch_row ' . $times . ' times: ' . (fnGetMicroTime() - $begin) . ' seconds<br />';
//---------------------------------
$begin = fnGetMicroTime();
for ($i = 0; $i < $times; $i++)
{
    $res = $dbcon->mydb_query("SELECT log_Content FROM blog WHERE log_ID = '1'");
    $row = $dbcon->mydb_fetch_array($res);
    $content = $row['log_Content'];
}
echo 'fetch_array ' . $times . ' times: ' . (fnGetMicroTime() - $begin) . ' seconds<br />';
//---------------------------------
$begin = fnGetMicroTime();
for ($i = 0; $i < $times; $i++)
{
    $res = $dbcon->mydb_query("SELECT log_Content FROM blog WHERE log_ID = '1'");
    $row = $dbcon->mydb_fetch_object($res);
    $content = $row->log_Content;
}
echo 'fetch_object ' . $times . ' times: ' . (fnGetMicroTime() - $begin) . ' seconds<br />';
//---------------------------------
$dbcon->mydb_free_results();
$dbcon->mydb_disconnect();

fnWriteCache('test.txt', $content);

echo 'Direct file read test results:<br />';
//---------------------------------
$begin = fnGetMicroTime();
for ($i = 0; $i < $times; $i++)
{
    $content = fnGetContent('test.txt');
}
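The file-read half of the benchmark can be tried on its own without any database. This is a minimal, self-contained sketch using only built-in PHP functions; the 4 KB payload follows Note 2 above, and the iteration count is reduced so it runs quickly:

```php
<?php
// Standalone sketch of the file-read benchmark (no database required).
$times = 1000;  // far fewer iterations than the article's 100000
$file  = tempnam(sys_get_temp_dir(), 'bench');
file_put_contents($file, str_repeat('x', 4096));  // ~4 KB payload, as in Note 2

$begin = microtime(true);
for ($i = 0; $i < $times; $i++) {
    $content = file_get_contents($file);
}
$elapsed = microtime(true) - $begin;

echo 'file_get_contents ' . $times . ' times: ' . $elapsed . " seconds\n";
unlink($file);  // clean up the temporary file
```

On repeated runs the second pass is usually much faster, since the OS page cache serves the reads from memory — worth keeping in mind when comparing against database figures.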