Hello everyone,
I need to read a large log file, about 1.6 million lines.
How can I speed up reading the file?
I want to read it and write the contents into a database.
Right now I'm reading it with fopen and fgets.
Any advice would be appreciated.
Thank you.
Reply to discussion (solution)
fgets reads one line at a time.
If that's too slow, then:
use fread to read a chunk (2048 bytes or larger),
split the chunk into lines after reading,
and hold back the last piece (which may be an incomplete line) unprocessed, then prepend it to the next chunk before processing.
The fastest way is to use MySQL's LOAD DATA INFILE statement to load the file directly into a staging table, and then process it there.
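As a sketch of the LOAD DATA INFILE approach (assumptions: MySQL, a hypothetical staging table named log_staging with a single text column, a file path readable by the MySQL server, and newline-terminated lines; adjust all of these to the actual log format):

```sql
-- Hypothetical one-column staging table; add columns to match the log format.
CREATE TABLE log_staging (line TEXT);

-- Load the whole file in a single statement.
-- '/path/to/logfile' and the line terminator are assumptions.
LOAD DATA INFILE '/path/to/logfile'
INTO TABLE log_staging
LINES TERMINATED BY '\n';
```

If the file lives on the client machine rather than the server, LOAD DATA LOCAL INFILE can be used instead, provided the server and client both allow it.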
Could you show a code example?
Thank you so much.
$fn = 'filename';
$fp = fopen($fn, 'r');
$last = '';
while (!feof($fp)) {
    // Prepend the leftover piece from the previous chunk, then split into lines.
    $a = preg_split("/[\r\n]+/", $last . fread($fp, 2048));
    for ($i = 0; $i < count($a) - 1; $i++) {
        // process $a[$i] here
    }
    $last = $a[$i]; // keep the last (possibly incomplete) line for the next chunk
}
if ($last) {
    // process $last (the final line)
}
Reading a chunk at a time with fread can increase the speed.
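For the database side of the question, one common way to speed up the writes is to batch many rows into a single INSERT inside a transaction, instead of one INSERT per line. This is a sketch only, assuming a PDO connection and a hypothetical table log_lines with a single line column; adjust the table and columns to your schema:

```php
<?php
// Insert a batch of lines as one multi-row INSERT.
// Assumptions: $pdo is a connected PDO instance, and the table
// `log_lines` (with a `line` column) exists -- both are hypothetical.
function insertBatch(PDO $pdo, array $lines): void
{
    if (count($lines) === 0) {
        return;
    }
    // Build "(?),(?),(?)..." -- one placeholder group per line.
    $placeholders = implode(',', array_fill(0, count($lines), '(?)'));
    $stmt = $pdo->prepare("INSERT INTO log_lines (line) VALUES $placeholders");
    $stmt->execute(array_values($lines));
}

// Usage sketch: inside the fread loop above, collect processed lines into
// $batch, and flush with insertBatch($pdo, $batch) every ~1000 rows,
// wrapping the whole load in $pdo->beginTransaction() / $pdo->commit().
```

Batch size is a trade-off: larger batches mean fewer round-trips but bigger statements; a few hundred to a few thousand rows per INSERT is a typical starting point.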
Thanks for the help, everyone.