Original post: http://www.cnblogs.com/aiweixiao/p/7535351.html
Welcome to follow my WeChat official account, "Programmer's Civic Feelings": http://t.cn/RotyZtu
"Background": Because the file function is a one-time to read all the content into memory, and PHP in order to prevent some poorly written programs to take up too much memory and cause the system memory shortage, so that the server is down, so think of a good way.
"Ideas":
01 Idea 1: Use PHP to execute a Linux command that copies one file's contents (a.log) into another file (b.log): cat a.log >> b.log
02 Idea 2: Use PHP to execute Linux commands that read the file line by line and append each line to another file (b.log): cat a.log | wc -l to get the line count, sed -n '{$i}p' a.log to fetch line $i, then file_put_contents('b.log', $content, FILE_APPEND) to write it out (see the sketch right after this list for how PHP captures command output).
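Both ideas rest on PHP's ability to run a shell command and capture its output, via the backtick operator or shell_exec(). A minimal sketch of that mechanism, reusing the a.log/b.log names from the ideas above (this snippet is not part of the original code):

<?php
// Idea 1: the copy happens entirely on the shell side; PHP only triggers it.
shell_exec('cat a.log >> b.log');

// Idea 2 relies on capturing command output in PHP. Backticks and
// shell_exec() are equivalent: both return the command's stdout as a string.
$lineNum   = (int) trim(`wc -l < a.log`);     // number of lines in a.log
$firstLine = shell_exec("sed -n '1p' a.log"); // contents of line 1

echo "a.log has {$lineNum} lines\n";
echo "first line: {$firstLine}";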
"Code Implementation":
//02. Read the contents of a 1 GB file
public function testGetLarge()
{/*{{{*/
    #01. Method 1: cat directly and redirect
    #if ('usecat' == 'usecat')
    if (0 && 'usecat' == 'usecat')
    {
        `cat a.log >> b.log`;
    }

    #02. Method 2: read line by line and append with file_put_contents
    if ('usefileputcontents' == 'usefileputcontents')
    {/*{{{*/
        $lineNum = `cat a.log | wc -l`;
        for ($i = 1; $i <= $lineNum; $i++)
        {
            $content = `sed -n '{$i}p' a.log`;
            //file append
            file_put_contents('b.log', $content, FILE_APPEND);
        }
    }/*}}}*/
}/*}}}*/
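One caveat with method 2: every `sed -n '{$i}p' a.log` call re-scans a.log from the start, so copying an N-line file this way costs roughly N full passes, which is very slow on a 1 GB log. Not part of the original post, but as an alternative sketch, the same line-by-line copy can be done in pure PHP with fgets()/fwrite(), streaming the file in a single pass while holding only one line in memory at a time:

<?php
// Sketch of a constant-memory, single-pass copy (assumes the same a.log/b.log).
$in  = fopen('a.log', 'rb');
$out = fopen('b.log', 'ab');   // 'a' = append, matching FILE_APPEND above

if ($in === false || $out === false) {
    die("failed to open a.log or b.log\n");
}

while (($line = fgets($in)) !== false) {
    // only the current line is held in memory
    fwrite($out, $line);
}

fclose($in);
fclose($out);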
"Other solutions": http://www.jb51.net/article/40847.htm
"Combat Code" PHP implementation to read a 1G file size