PHP: a question about reading large files
I want to analyze a 6 GB log file to check whether each line meets my requirements. The program is as follows:
$file_path = 'd:\work\workplace\test\file\system.log';
$file = fopen($file_path, 'r');
$key = md5(0);
$i = 1;
while (!feof($file)) {
    $buff = fgets($file);
    if ($buff == $key . "\r\n") {
        echo "find 0 at Line {$i}\r\n";
    }
    $i++;
}
fclose($file);
How is the performance? Will there be memory leaks or other problems? Is there any way to optimize it further?
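One way to answer the memory question directly is to measure peak memory usage around the scan. Below is a minimal, self-contained sketch (it builds its own small sample file in the temp directory, since the real 6 GB log is not available here); with fgets, only one line is held at a time, so the delta should stay small:

```php
<?php
// Sketch: check that a line-by-line scan keeps memory flat.
// Builds a small sample log, scans it, reports the peak-memory delta.
$path = tempnam(sys_get_temp_dir(), 'log');
$fh = fopen($path, 'w');
for ($n = 0; $n < 10000; $n++) {
    fwrite($fh, md5($n) . "\r\n");
}
fclose($fh);

$before = memory_get_peak_usage();
$file = fopen($path, 'r');
$key = md5(0);
$i = 1;
while (($buff = fgets($file)) !== false) {
    if (rtrim($buff, "\r\n") === $key) {
        echo "find 0 at Line {$i}\n";
    }
    $i++;
}
fclose($file);
$after = memory_get_peak_usage();

// fgets holds only the current line, so the delta stays small
echo "peak memory delta: " . ($after - $before) . " bytes\n";
unlink($path);
```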
------ Solution --------------------
First split the file into several smaller files, then process each small file in a loop!
------ Solution --------------------
Use `split -b` on Linux to cut the file into pieces.
------ Solution --------------------
I suggest you pass a length argument to fgets instead of reading a whole row at a time; a 6 GB file may contain a very long row!
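A minimal sketch of that suggestion (the helper name `scan_log` and the 4096-byte cap are made up for illustration): cap each fgets read and reassemble rows, so a single huge row arrives in bounded pieces instead of one giant string.

```php
<?php
// Sketch (hypothetical helper): scan a log line by line with a
// capped read size, so one very long row cannot exhaust memory.
function scan_log($file_path, $key) {
    $matches = [];
    $file = fopen($file_path, 'r');
    $i = 1;
    $line = '';
    // fgets with a length stops at 4095 bytes OR a newline,
    // whichever comes first, so long rows arrive in pieces.
    while (($buff = fgets($file, 4096)) !== false) {
        $line .= $buff;
        if (substr($line, -1) !== "\n") {
            continue; // row not finished yet, keep accumulating
        }
        if (rtrim($line, "\r\n") === $key) {
            $matches[] = $i;
        }
        $line = '';
        $i++;
    }
    // handle a final row that has no trailing newline
    if ($line !== '' && rtrim($line, "\r\n") === $key) {
        $matches[] = $i;
    }
    fclose($file);
    return $matches;
}
```

Usage would be something like `scan_log('d:\work\workplace\test\file\system.log', md5(0))`, which returns the matching line numbers.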
------ Solution --------------------
A 6 GB text file......
How did you end up with such a large file?
Logs should be rotated by day, week, or month, or a new file should be created once the log exceeds a certain size.
It should be split into multiple files.
------ Solution --------------------
Yes, there is no problem, but it will be time-consuming.
The code can be trimmed a little:
$file_path = 'd:\work\workplace\test\file\system.log';
$file = fopen($file_path, 'r');
$key = md5(0);
$i = 1;
while ($buff = fgets($file)) {
    if ($buff == $key . "\r\n") {
        echo "find 0 at Line {$i}\r\n";
    }
    $i++;
}
fclose($file);
Reading a bit more at a time (say 1 MB) may be faster, but the algorithm is more complex.
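A sketch of that idea (the helper name `scan_log_chunked` is made up; 1 MB is the figure suggested above): read a large chunk with fread, split it on newlines, and carry any unfinished trailing row over into the next chunk so rows that straddle a chunk boundary still match.

```php
<?php
// Sketch (hypothetical helper): read ~1 MB at a time with fread,
// split on newlines, and carry the incomplete trailing row into
// the next chunk so boundary-straddling rows are still checked.
function scan_log_chunked($file_path, $key, $chunk_size = 1048576) {
    $matches = [];
    $file = fopen($file_path, 'r');
    $i = 1;
    $carry = '';
    while (!feof($file)) {
        $buff = $carry . fread($file, $chunk_size);
        $lines = explode("\n", $buff);
        $carry = array_pop($lines); // last piece may be an unfinished row
        foreach ($lines as $line) {
            if (rtrim($line, "\r") === $key) {
                $matches[] = $i;
            }
            $i++;
        }
    }
    // final row without a trailing newline
    if ($carry !== '' && rtrim($carry, "\r") === $key) {
        $matches[] = $i;
    }
    fclose($file);
    return $matches;
}
```

This trades a little code complexity for fewer read calls; the chunk size is a tuning knob, not a requirement.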
------ Solution --------------------
This is not PHP's strong suit.
If you are not doing web work, you should switch to another language.
Reference:
A 6 GB text file......
How did you end up with such a large file?
Logs should be rotated by day, week, or month, or a new file should be created once the log exceeds a certain size.
It should be split into multiple files.
Our company's behavior log is 6 GB a day.