This article introduces three main methods for reading large files in PHP. Interested readers may find it a useful reference.
Reading large files has always been a headache. For small files, any of PHP's convenience functions will do the job directly, but with a large file the usual methods either stop working or take far too long. Below we look at how to solve the problem of reading big files in PHP; I hope the examples help.
Scenario: PHP needs to read a very large file, for example a 1 GB log file. A 400 MB access.log file is used for the tests here.
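If you do not have a log of that size to hand, you can generate a throwaway test file first. This is a minimal sketch, not part of the original article; the filename testfile.txt matches the examples below, and the line count is purely illustrative.

```php
<?php
// Generate a multi-line test file (the line count here is illustrative only).
$fp = fopen('testfile.txt', 'w');
for ($i = 1; $i <= 1000000; $i++) {
    fwrite($fp, "line $i: some log payload\n");
}
fclose($fp);
echo filesize('testfile.txt'), " bytes\n";
```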
1. Reading the file directly with file()
<?php
$startTime = microtime_float();
ini_set('memory_limit', '-1');

$file = 'testfile.txt';
$data = file($file);                 // loads the whole file into an array, one line per element
$line = $data[count($data) - 1];     // the last line

$endTime = microtime_float();
echo count($data), "<br />";
echo $endTime - $startTime;

function microtime_float()
{
    list($usec, $sec) = explode(" ", microtime());
    return (float) $usec + (float) $sec;
}
Result: 10,127,784 lines in total, taking 7.8764359951 s.
My machine has 3 GB of RAM. This method is not recommended, because it loads the entire file into memory.
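You can confirm that file() materialises the whole file in memory by comparing memory_get_peak_usage() before and after the call. This is a small sketch added for illustration, not part of the original benchmark; it assumes the same testfile.txt as above.

```php
<?php
$before = memory_get_peak_usage();
$data   = file('testfile.txt');      // loads every line into an array at once
$after  = memory_get_peak_usage();

// The growth is at least the file size, usually much more,
// because each array element carries per-string overhead.
echo "peak memory grew by ", $after - $before, " bytes\n";
```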
2. Using the Linux command tail
<?php
$startTime = microtime_float();

$file = 'testfile.txt';
$file = escapeshellarg($file);   // securely escape the command-line argument
$line = `tail -n 1 $file`;       // backticks run the shell command; -n 1 reads the last line
echo $line, "<br />";

$endTime = microtime_float();
echo $endTime - $startTime;

function microtime_float()
{
    list($usec, $sec) = explode(" ", microtime());
    return (float) $usec + (float) $sec;
}
Result: only a few milliseconds. Simple and fast, but this method cannot be used on Windows.
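Because the backticks shell out to the external tail program, it is worth guarding the call so the script fails cleanly on systems without it. A defensive sketch (the Windows fallback branch is left up to you):

```php
<?php
$file = escapeshellarg('testfile.txt');

if (stripos(PHP_OS, 'WIN') === 0) {
    // A stock Windows install has no tail command; fall back to another method.
    echo "tail is not available on Windows\n";
} else {
    echo `tail -n 1 $file`;      // read the last line via the shell
}
```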
3. Using the fseek function
This is the most common approach. It does not need to read the whole file into memory: because PHP's file functions are thin wrappers around C's, reading works much as it does in C, by moving the file pointer directly, so it is very efficient. There are several different ways to manipulate a file with fseek, and their efficiency can differ slightly.
Here are a few common methods.
Method one: open the file with fopen() and scan backwards from the end, one character at a time, via the file pointer.
<?php
$startTime = microtime_float();

$file = 'testfile.txt';
$fp   = fopen($file, "r");
$line = 10;       // number of lines to read from the end
$pos  = -2;       // start just before the final byte
$t    = " ";
$data = "";

while ($line > 0) {
    while ($t != "\n") {             // scan backwards until a newline is found
        fseek($fp, $pos, SEEK_END);  // move the pointer relative to the end of the file
        $t = fgetc($fp);             // read one character
        $pos--;                      // step one byte further back
    }
    $t = " ";
    $data .= fgets($fp);             // read the line after the newline; lines arrive in reverse order
    $line--;
}
fclose($fp);
echo $data, "<br />";

$endTime = microtime_float();
echo $endTime - $startTime;

function microtime_float()
{
    list($usec, $sec) = explode(" ", microtime());
    return (float) $usec + (float) $sec;
}
Result: 0.338493108749 s.
Method two: read the end of the file in blocks.
<?php
$startTime = microtime_float();

$file  = 'testfile.txt';
$fp    = fopen($file, "r");
$num   = 10;         // number of lines to read from the end
$chunk = 4096;       // 4 KB block
$readData = '';
$fs  = sprintf("%u", filesize($file));
$max = (intval($fs) == PHP_INT_MAX) ? PHP_INT_MAX : $fs;

for ($len = 0; $len < $max; $len += $chunk) {
    // Read one more block from the end of the file, prepending it to the buffer.
    $seekSize = ($max - $len > $chunk) ? $chunk : $max - $len;
    fseek($fp, ($len + $seekSize) * -1, SEEK_END);
    $readData = fread($fp, $seekSize) . $readData;

    if (substr_count($readData, "\n") >= $num + 1) {
        // Enough newlines buffered: skip the leading (possibly partial) lines
        // and capture the last $num lines.
        $ns = substr_count($readData, "\n") - $num;
        preg_match('/^(?:[^\n]*\n){' . $ns . '}((?:[^\n]*\n){' . $num . '})/', $readData, $match);
        $data = $match[1];
        break;
    }
}
fclose($fp);
echo $data, "<br />";

$endTime = microtime_float();
echo $endTime - $startTime;

function microtime_float()
{
    list($usec, $sec) = explode(" ", microtime());
    return (float) $usec + (float) $sec;
}
Result: 0.00199198722839 s.
Using fgets to read line by line:
<?php
$file = fopen("testfile.txt", "r");
while (!feof($file)) {
    echo fgets($file);
}
fclose($file);
Using the SPL class SplFileObject:
<?php
try {
    foreach (new SplFileObject('testfile.txt') as $line) {
        echo $line, '<br />';
    }
} catch (Exception $e) {
    echo $e->getMessage();
}
There are also many other block-based reading approaches published online, which interested readers can try. I tried some without success; it seems the file must contain newline characters ("\n") for them to work.
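For completeness, here is one way to read a file forwards in fixed-size blocks while still yielding whole lines, by buffering the partial line left over at the end of each chunk. This is a sketch using a generator, not one of the article's original methods; the function name readLines and the chunk size are my own choices.

```php
<?php
// Yield one line at a time while reading the file in fixed-size blocks.
function readLines($path, $chunk = 4096) {
    $fp = fopen($path, 'r');
    $buffer = '';
    while (!feof($fp)) {
        $buffer .= fread($fp, $chunk);
        // Emit every complete line currently held in the buffer.
        while (($nl = strpos($buffer, "\n")) !== false) {
            yield substr($buffer, 0, $nl + 1);
            $buffer = substr($buffer, $nl + 1);
        }
    }
    fclose($fp);
    if ($buffer !== '') {
        yield $buffer;   // the final line, if the file has no trailing "\n"
    }
}

foreach (readLines('testfile.txt') as $line) {
    echo $line;
}
```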
That is all for this article. I hope it has been helpful for your study.