Reading and writing large files in PHP rarely comes up in everyday development, because we seldom need to process a large amount of data at once. So when the need suddenly arises, it is tempting to reach for the usual quick methods, such as file_get_contents and fread, to read the file. With large files, however, that causes problems. While researching how to read and write large files I found plenty of material online, but some of the examples did not meet my needs, so I am writing this summary post based on the existing examples on the Internet.
So what exactly goes wrong? Let's start with how PHP implements its file-reading functions. file_get_contents and fread work the same way in principle: both read the file into system memory. (If all you want is the content of a file as a string, use file_get_contents(); its performance is much better than fread's.) For files that are not very large this is no problem, but with a large file (say, a 2 GB log) on a machine with only 4 GB of RAM, reading the whole file into a string can exhaust system memory and freeze the machine, because part of that memory is needed to keep the system and other processes running. So we need other methods that avoid reading too much content at a time.
PHP file reading:
The following example, taken from the Internet, reads a large file and illustrates how memory blows up.
________________________________________________________________________________
The requirement: given a 1 GB log file with roughly five million lines, use PHP to return the last few lines of its content.
Implementation method:
1. Use the file function directly.
Note: the file function reads the entire content into memory at once. To prevent poorly written programs from consuming so much memory that the system runs out and the server goes down, PHP limits a script's memory usage; by default memory_limit = 16M in php.ini. If this value is set to -1, memory usage is unlimited.
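For reference, the limit can also be changed per script with ini_set. A minimal sketch (the 512M value is an arbitrary example, not a recommendation):

```php
<?php
// Raise the limit for this script only; '512M' is an arbitrary example value.
ini_set('memory_limit', '512M');
echo ini_get('memory_limit'), "\n";   // prints "512M"

// Removing the limit entirely is possible but risky, as noted above:
// ini_set('memory_limit', '-1');
```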
Below is a piece of code that uses file to retrieve the last line of the file.
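The original snippet did not survive, so here is a minimal sketch of what such a file()-based approach looks like; the path and file contents are placeholders standing in for the real 1 GB log:

```php
<?php
// Demo file standing in for the real 1 GB log (path is a placeholder).
$path = '/tmp/demo.log';
file_put_contents($path, "line 1\nline 2\nline 3\nline 4\n");

// WARNING: file() loads the ENTIRE file into memory as an array of lines --
// fine here, disastrous for a 1 GB log (see the timing below).
$lines = file($path);
$last = array_slice($lines, -2);   // keep only the last 2 lines
echo implode('', $last);           // prints "line 3" and "line 4"
```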
Executing this code took 116.9613 s.
My machine has 2 GB of RAM. When I pressed F5 to run it, the system grayed out completely and only recovered after about 20 minutes. Clearly, reading such a large file entirely into memory has serious consequences, so never do it. And memory_limit must not be set too high, otherwise you may end up calling the data center and asking them to reset the machine.
________________________________________________________________________________
Although the example above only returns the last few lines, it still traverses the entire file content, so it costs the same as reading the whole file.
To read just the last few lines, you can instead use fseek to position the file pointer and read only part of the content.
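The fseek idea can be sketched as follows: seek to the end of the file and read backwards chunk by chunk until enough newlines have been seen. This is a sketch under assumed parameters (chunk size, demo path), not a production-tuned implementation:

```php
<?php
// Return the last $n lines of $path without loading the whole file:
// seek to the end, then read backwards chunk by chunk.
function tail($path, $n = 10, $chunk = 4096) {
    $fp = fopen($path, 'r');
    fseek($fp, 0, SEEK_END);
    $pos = ftell($fp);
    $buffer = '';
    // Walk backwards until we have seen at least $n newlines (or hit the start).
    while ($pos > 0 && substr_count($buffer, "\n") <= $n) {
        $read = min($chunk, $pos);
        $pos -= $read;
        fseek($fp, $pos);
        $buffer = fread($fp, $read) . $buffer;
    }
    fclose($fp);
    $lines = explode("\n", rtrim($buffer, "\n"));
    return implode("\n", array_slice($lines, -$n)) . "\n";
}

// Demo on a small file standing in for the real log (path is a placeholder).
file_put_contents('/tmp/tail_demo.log', "a\nb\nc\nd\ne\n");
echo tail('/tmp/tail_demo.log', 2);   // prints "d" and "e"
```

Because only a few chunks at the tail are ever read, memory use stays constant no matter how large the log grows.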
Next we will discuss how to read and write large files.
Reading large files:
Since we only need part of the file, there are options. If the file is not extremely large, file_get_contents or fread can read just a portion of it via their offset/length parameters. (Sleeping between reads may help flatten the I/O peaks, but I am not sure this is correct; corrections are welcome.) Another method is a while loop with fgets for line-by-line reading: because fgets reads one line at a time through the file pointer, it is relatively efficient.
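The "read only part of the file" idea above can be sketched with file_get_contents()'s offset and maxlen parameters, or equivalently with fseek plus fread; the path and offsets are placeholders:

```php
<?php
// Demo file standing in for the real large file (path is a placeholder).
$path = '/tmp/partial_demo.log';
file_put_contents($path, str_repeat('x', 100) . 'HELLO' . str_repeat('y', 100));

// Read 5 bytes starting at byte offset 100, instead of the whole file
// (the 4th and 5th parameters of file_get_contents are offset and maxlen).
$part = file_get_contents($path, false, null, 100, 5);
echo $part, "\n";   // prints "HELLO"

// The same thing with fseek + fread:
$fp = fopen($path, 'r');
fseek($fp, 100);
$part2 = fread($fp, 5);
fclose($fp);

// When looping over many chunks, a short sleep between reads may smooth out
// I/O peaks, as suggested above (effectiveness not verified):
// usleep(100000);  // 100 ms
```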
The following example uses fgets to read a large file line by line and convert its encoding (UTF-8 -> GBK). The code is as follows:
$file = fopen($old_file_path, "r");
$result = fopen($temporary_file_path, "a");
$re_sign = 0;
while (!feof($file)) {
    $content = fgets($file);
    $encode = mb_detect_encoding($content, array('ASCII', 'UTF-8', 'GB2312', 'GBK', 'BIG5'));
    if ($encode == 'UTF-8') {
        // Convert this line from UTF-8 to GBK, dropping unconvertible characters.
        $str = iconv($encode, "GBK//IGNORE", $content);
        fwrite($result, $str);
        $re_sign = 1;
    } else {
        fwrite($result, $content);
    }
}
fclose($file);
fclose($result);
if ($re_sign == 1) {
    // At least one line was converted: keep the original as a .bak
    // and swap the converted file into its place.
    rename($old_file_path, $old_file_path . '.bak');
    rename($temporary_file_path, $old_file_path);
} else {
    unlink($temporary_file_path);
}
Writing large files:
Writing a large file does not carry the memory cost that reading one does, because the data is flushed out to the hard disk; writing too much in one call mainly stresses disk I/O. If raw efficiency is the goal, a single large write spends the least total time, but when a file is being read in chunks it is simplest to write each chunk out as soon as it has been read.
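A chunked read-and-write along those lines might look like this; the paths and the 1 MB buffer size are assumptions to tune for your own workload:

```php
<?php
// Copy a file in fixed-size chunks: constant memory use, steady disk I/O.
// Paths and the 1 MB buffer size are placeholders.
$src = '/tmp/copy_src.log';
$dst = '/tmp/copy_dst.log';
file_put_contents($src, str_repeat("some log line\n", 1000));   // demo input

$in  = fopen($src, 'rb');
$out = fopen($dst, 'wb');
while (!feof($in)) {
    $chunk = fread($in, 1024 * 1024);   // read at most 1 MB
    fwrite($out, $chunk);               // write it straight back out
}
fclose($in);
fclose($out);

echo filesize($dst), "\n";   // same size as the source file
```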