A small question about reading a large file in PHP

I want to analyze a 6 GB log file, comparing each line against a condition. The program is as follows:
$file_path = 'D:\work\workplace\test\file\system.log';
$file = fopen($file_path, 'r');
$key = md5(0);
$i = 1;
while (!feof($file)) {
    $buff = fgets($file);
    if ($buff == $key . "\r\n") {
        echo "find 0 at line {$i}\r\n";
    }
    $i++;
}
fclose($file);

How is the performance of this? Are there memory leaks or other problems? And is there any way to optimize it further?


Reply to discussion (solution)

You should split this file into several smaller files first,
then loop through each small file!

On Linux, use $ split -b ... to split it.
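For example (the chunk size and output prefix here are just placeholders):

$ split -b 500M system.log part_

or, to avoid cutting a line in half at a chunk boundary, split by line count instead:

$ split -l 1000000 system.log part_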

On Linux, use $ split -b ... to split it.
Why split it at all? fgets reads only one line at a time; it doesn't load the whole file into memory.

I suggest that when you use fgets you set the maximum number of bytes to read, rather than reading whole lines; a 6 GB file may well contain a very long line!
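A minimal sketch of that idea (the path is a placeholder): fgets($file, $length) returns at most $length - 1 bytes per call, so even a huge line cannot blow up the buffer:

$file = fopen('system.log', 'r'); // placeholder path
while (($buff = fgets($file, 4096)) !== false) {
    // at most 4095 bytes arrive per call; a long line simply comes in pieces
}
fclose($file);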

In this case it is certain that no line will be very long, and the log is written in a fixed format.

Oh, then that's fine! I was just making a suggestion!

Oh, then that's fine! I was just making a suggestion!
I have seen people do that, but as long as you don't read the whole file into memory at once, splitting should be no better; cutting the file up and deleting the temporary files has its own cost. That's just my personal feeling; please point out anything I've got wrong.

A 6 GB text file...

How did you end up with such a big file?
Logs should be rotated by day, or by week or month, with a new file started once a certain size is exceeded.

It should be split into multiple files.
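A minimal sketch of size-based rotation (the helper name, path, and limit are all made up for illustration):

// Hypothetical write helper: archive the file once it exceeds $max bytes.
function write_log($line, $path = 'app.log', $max = 104857600) // 100 MB
{
    if (file_exists($path) && filesize($path) >= $max) {
        rename($path, $path . '.' . date('Ymd-His')); // start a fresh file
    }
    file_put_contents($path, $line . "\n", FILE_APPEND);
}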

A 6 GB text file...

How did you end up with such a big file?
Logs should be rotated by day, or by week or month, with a new file started once a certain size is exceeded.

It should be split into multiple files.
Our company's behavior log is 6 GB a day.

Yes, there's no problem with it. It's just time-consuming.

Looking at the code alone, you can trim it a little:
$file_path = 'd:\work\workplace\test\file\system.log';
$file = fopen($file_path, 'r');
$key = md5(0);
$i = 1;
while ($buff = fgets($file)) {
    if ($buff == $key . "\r\n") {
        echo "find 0 at line {$i}\r\n";
    }
    $i++;
}
fclose($file);

If you read a bigger chunk at a time (say 1 MB), it might be a bit faster, but the algorithm gets more complicated.
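A rough sketch of that chunked approach (untested; the path and chunk size are placeholders). fread() pulls in 1 MB at a time, and the code re-splits it into lines itself, carrying the incomplete last line over to the next chunk:

$file = fopen('system.log', 'r'); // placeholder path
$key  = md5(0);
$i    = 1;
$tail = '';
while (!feof($file)) {
    $chunk = $tail . fread($file, 1048576); // 1 MB per read
    $lines = explode("\n", $chunk);
    $tail  = array_pop($lines);             // incomplete last line, saved for the next round
    foreach ($lines as $line) {
        if (rtrim($line, "\r") === $key) {
            echo "find 0 at line {$i}\r\n";
        }
        $i++;
    }
}
if (rtrim($tail, "\r") === $key) {          // a final line with no trailing newline
    echo "find 0 at line {$i}\r\n";
}
fclose($file);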

This kind of work isn't PHP's forte.

Use PHP for the web; for a job like this, switch to another program.

Quoting the 9th-floor reply by Baiyuxiong:

A 6 GB text file...

How did you end up with such a big file?
Logs should be rotated by day, or by week or month, with a new file started once a certain size is exceeded.

It should be split into multiple files.

Our company's behavior log is 6 GB a day.

How would you write this in the shell? Could some expert help me out? Thank you very much.
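One possible shell version (just a sketch; the path is a placeholder) is to grep for the precomputed hash and print the matching line numbers, which is roughly what the PHP loop does:

$ grep -n "$(php -r 'echo md5(0);')" system.log

Note that grep -n matches the hash anywhere on the line. (-x would demand a whole-line match, but with \r\n line endings the trailing \r gets in the way.)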

And then what? How did it work out?

I also use fgets to read a file, though not such a big one: a 150 MB CSV file, read line by line with fgets (it's known that no very long lines appear), takes 18 seconds. Do I need fseek to set the file pointer? Can it be made more efficient?
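For a straight sequential read, fseek shouldn't be needed; fgets advances the file pointer itself after every call. One alternative sometimes tried (a sketch, not a benchmark; the path is a placeholder) is stream_get_line, which reads up to a delimiter and returns the line without it:

$fp = fopen('data.csv', 'r'); // placeholder path
while (($line = stream_get_line($fp, 65536, "\n")) !== false) {
    // $line is one row, at most 64 KB, without the trailing "\n"
}
fclose($fp);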
