A question about reading large files in PHP

I want to analyze a 6 GB log file to check whether each line meets my requirements. The program is as follows:

$file_path = 'd:\work\workplace\test\file\system.log';
$file = fopen($file_path, 'r');
$key = md5(0);
$i = 1;
while (!feof($file)) {
    $buff = fgets($file);
    if ($buff == $key . "\r\n")
    {
        echo "find 0 at Line {$i}\r\n";
    }
    $i ++;
}
fclose($file);

I would like to ask what the performance will be like. Will there be memory leaks or other problems? Is there any way to optimize it further?
------ Solution --------------------
You need to split the file into several smaller files first,
then read each small file in a loop.
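For what it's worth, here is a minimal sketch of that approach, assuming the log has already been split into chunk files matching a hypothetical naming pattern system.log.part*; the line counter is carried across chunks so the reported line numbers still refer to the original file.

$key = md5(0);
$i = 1;
// Hypothetical chunk names produced by a prior split step
foreach (glob('d:\work\workplace\test\file\system.log.part*') as $chunk) {
    $fh = fopen($chunk, 'r');
    while (($buff = fgets($fh)) !== false) {
        // Strip the line ending instead of appending "\r\n" to the key
        if (rtrim($buff, "\r\n") === $key) {
            echo "find 0 at Line {$i}\r\n";
        }
        $i++;
    }
    fclose($fh);
}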
------ Solution --------------------
On Linux, use split -b
to cut the file into pieces.
------ Solution --------------------
I suggest you pass a length argument to fgets instead of reading a whole line at a time; a 6 GB file may contain very long lines!
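A minimal sketch of that suggestion, assuming an 8 KB cap per read (an arbitrary choice); with a length argument, fgets may return only a fragment of a long line, so a new line is counted only when the fragment ends with a newline.

// Cap each read at 8 KB so a single huge line cannot exhaust memory.
$file = fopen('d:\work\workplace\test\file\system.log', 'r');
$key = md5(0);
$i = 1;
while (($buff = fgets($file, 8192)) !== false) {
    if (rtrim($buff, "\r\n") === $key) {
        echo "find 0 at Line {$i}\r\n";
    }
    if (substr($buff, -1) === "\n") {
        $i++;   // this fragment completed a line
    }
}
fclose($file);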
------ Solution --------------------
A 6 GB text file...

How did you end up with such a large file?
Logs should be rotated by day, week, or month, or a new file should be started once the log exceeds a certain size.

It should be split into multiple files.
------ Solution --------------------
Yes, there is no problem with it, but it is time-consuming.

The code can be trimmed a little:
$file_path = 'd:\work\workplace\test\file\system.log';
$file = fopen($file_path, 'r');
$key = md5(0);
$i = 1;
while ($buff = fgets($file)) {
    if ($buff == $key . "\r\n")
    {
        echo "find 0 at Line {$i}\r\n";
    }
    $i++;
}
fclose($file);

If you read a larger chunk at a time (say 1 MB), it may be faster, but the algorithm becomes more complex.
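Here is a minimal sketch of that block-reading idea, assuming 1 MB fread() calls (an arbitrary size) and a small carry buffer for the line fragment that straddles two blocks.

// Read about 1 MB per call and split it into lines manually,
// carrying the trailing partial line over to the next block.
$file = fopen('d:\work\workplace\test\file\system.log', 'r');
$key = md5(0);
$i = 1;
$carry = '';
while (($block = fread($file, 1048576)) !== false && $block !== '') {
    $lines = explode("\n", $carry . $block);
    $carry = array_pop($lines);          // last piece may be a partial line
    foreach ($lines as $line) {
        if (rtrim($line, "\r") === $key) {
            echo "find 0 at Line {$i}\r\n";
        }
        $i++;
    }
}
// A final line without a trailing newline ends up in $carry.
if ($carry !== '' && rtrim($carry, "\r") === $key) {
    echo "find 0 at Line {$i}\r\n";
}
fclose($file);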

------ Solution --------------------
This is not PHP's strong point.

If the task is not web-related, you should switch to another language.

Reference:

A 6 GB text file...

How did you end up with such a large file?
Logs should be rotated by day, week, or month, or a new file should be started once the log exceeds a certain size.

It should be split into multiple files.

Our company's behavior log is 6 GB a day.
