This article introduces a PHP code example for reading and processing a large CSV file line by line; readers who need it can use it as a reference.
A CSV file with millions of rows can easily reach several hundred MB, and reading it in a single pass is likely to time out or hang the script. To import the data from such a file into a database successfully, it has to be processed in batches. The following function reads a specified number of rows from a CSV file, starting at a given offset. The code is as follows:

/**
 * csv_get_lines Read a number of rows from a CSV file
 * @param string $csvfile Path to the CSV file
 * @param int    $lines   Number of rows to read
 * @param int    $offset  Number of rows to skip before reading
 * @return array|false
 */
function csv_get_lines($csvfile, $lines, $offset = 0) {
    if (!($fp = fopen($csvfile, 'r'))) {
        return false;
    }
    $i = $j = 0;
    // Position the file pointer by skipping the first $offset lines
    while ($i++ < $offset && false !== fgets($fp)) {
        // keep skipping
    }
    $data = array();
    // Read up to $lines rows, stopping at end of file
    while ($j++ < $lines && !feof($fp)) {
        if (($row = fgetcsv($fp)) !== false) {
            $data[] = $row;
        }
    }
    fclose($fp);
    return $data;
}

Call it like this:

$data = csv_get_lines('path/bigfile.csv', 10, 2000000);
print_r($data);

The function is based on the idea of line positioning: it positions the file pointer by skipping the given number of starting lines. The function above has been tested on files of about 500 MB and runs smoothly; it has not been tested on larger files, so please test or improve it before relying on it.
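Since the article's motivation is importing a large CSV into a database in batches, a minimal batch-import sketch using csv_get_lines is shown below. The PDO connection details, the table name my_table, and its two columns are illustrative assumptions, not taken from the article; adapt them to your own schema. The sketch code is as follows:

// Minimal batch-import sketch (assumed: PDO connection, table my_table with
// columns col1 and col2 are placeholders for illustration only).
$pdo       = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt      = $pdo->prepare('INSERT INTO my_table (col1, col2) VALUES (?, ?)');
$batchSize = 1000;
$offset    = 0;

while (true) {
    // Read the next batch of rows from the CSV file
    $rows = csv_get_lines('path/bigfile.csv', $batchSize, $offset);
    if ($rows === false || count($rows) === 0) {
        break; // no more data
    }
    foreach ($rows as $row) {
        $stmt->execute($row); // row length must match the placeholders
    }
    $offset += $batchSize;
}

Note that because csv_get_lines reopens the file and skips lines from the beginning on every call, later batches become progressively slower; for a one-off import this is usually acceptable, and the loop can also be split across multiple requests or cron runs in the spirit of the batching the article describes.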