How can PHP read a large CSV file and import it into a database?
For a CSV file with millions of rows, the file size can reach hundreds of megabytes. If you simply read it in one go, the script is likely to time out or hang.
To import the data from such a CSV file into a database successfully, it has to be processed in batches.
The following function reads a specified number of rows from a CSV file:
/**
 * csv_get_lines reads a given number of rows from a CSV file
 * @param string $csvfile CSV file path
 * @param int    $lines   number of rows to read
 * @param int    $offset  number of rows to skip before reading
 * @return array|false
 */
function csv_get_lines($csvfile, $lines, $offset = 0) {
    if (!$fp = fopen($csvfile, 'r')) {
        return false;
    }
    $i = $j = 0;
    // Skip the first $offset lines to position the file pointer.
    while ($i++ < $offset && fgets($fp) !== false) {
        // fgets() has already advanced the pointer; nothing else to do
    }
    $data = array();
    // Read up to $lines rows, stopping at end of file.
    while ($j++ < $lines && !feof($fp)) {
        $row = fgetcsv($fp);
        if ($row !== false) {
            $data[] = $row;
        }
    }
    fclose($fp);
    return $data;
}
Example call:

$data = csv_get_lines('path/bigfile.csv', 10, 2000000);
print_r($data);
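Since each call returns only one slice of the file, a full import can loop over the file in fixed-size batches. The sketch below is not part of the original article; the batch size of 10000 and the process_batch() handler are assumptions to be adapted to your own import logic.

// A minimal driver loop (a sketch): walk the whole CSV in batches
// using csv_get_lines() defined above.
$csvfile   = 'path/bigfile.csv';
$batchSize = 10000;   // assumed value; tune to fit the memory limit and timeout
$offset    = 0;

while (true) {
    $rows = csv_get_lines($csvfile, $batchSize, $offset);
    if ($rows === false || count($rows) === 0) {
        break;   // file could not be opened, or no rows are left
    }
    process_batch($rows);   // hypothetical handler, e.g. inserts the batch into the database
    $offset += $batchSize;  // advance to the next slice of the file
}

Note that csv_get_lines() reopens the file and skips $offset lines on every call, so the loop re-reads earlier parts of the file each time; for a single full import it can be faster to keep one file handle open, but the loop above sticks to the interface described in this article.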
The function is based on the idea of line positioning: it advances the file pointer by skipping the specified number of starting lines, then reads the requested number of rows from that position.
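As an aside that is not in the original article, the same line positioning can also be expressed with PHP's built-in SplFileObject, whose seek() method jumps to a line number directly. This is only an alternative sketch, not the author's method:

// Alternative sketch: line positioning via SplFileObject::seek().
function csv_get_lines_spl($csvfile, $lines, $offset = 0) {
    $file = new SplFileObject($csvfile, 'r');
    $file->setFlags(SplFileObject::READ_CSV);
    $file->seek($offset);            // position the pointer at line $offset

    $data = array();
    while ($lines-- > 0 && !$file->eof()) {
        $row = $file->current();
        if (is_array($row) && $row !== array(null)) {
            $data[] = $row;          // blank lines parse as array(null); skip them
        }
        $file->next();
    }
    return $data;
}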
As for how the data is then inserted into the database, this article does not go into detail.
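For illustration only, one common approach is a multi-row prepared INSERT per batch, as in the PDO sketch below. The DSN, credentials, the big_table table and its (col1, col2, col3) columns are all assumptions and must be replaced with the real schema; a function like this could serve as the process_batch() handler used in the loop above.

// Illustrative only: bulk-insert one batch of CSV rows with PDO.
function insert_batch(PDO $pdo, array $rows) {
    if (count($rows) === 0) {
        return;
    }
    // Build one multi-row INSERT: VALUES (?,?,?),(?,?,?),...
    $placeholders = implode(',', array_fill(0, count($rows), '(?,?,?)'));
    $stmt = $pdo->prepare("INSERT INTO big_table (col1, col2, col3) VALUES $placeholders");

    // Flatten the rows into one positional-parameter list.
    $params = array();
    foreach ($rows as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
        $params[] = $row[2];
    }
    $stmt->execute($params);
}

// Assumed connection details; adjust to the real database.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');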
The function above has been tested with files up to about 500 MB and ran smoothly; it has not been tested with larger files, so use it as-is or improve it to suit your needs.