For a CSV file with millions of rows, the file size can reach hundreds of megabytes; reading it in one go is likely to cause a timeout or freeze the script. To successfully import the data from such a CSV file into a database, batch processing is necessary.
The following function reads a specified number of rows from a CSV file:
The code is as follows:
/**
 * csv_get_lines reads a number of rows from a CSV file
 * @param string $csvfile CSV file path
 * @param int    $lines   number of rows to read
 * @param int    $offset  number of rows to skip from the start
 * @return array|false
 */
function csv_get_lines($csvfile, $lines, $offset = 0) {
    if (!$fp = fopen($csvfile, 'r')) {
        return false;
    }
    // Skip the first $offset lines to position the file pointer
    $i = 0;
    while ($i++ < $offset && fgets($fp) !== false) {
        // nothing to do, just advance the pointer
    }
    // Read up to $lines rows, each parsed as a CSV record
    $data = array();
    $j = 0;
    while ($j++ < $lines && !feof($fp)) {
        $data[] = fgetcsv($fp);
    }
    fclose($fp);
    return $data;
}
Example call:
The code is as follows:
$data = csv_get_lines('path/bigfile.csv', 10, 2000000);
print_r($data);
The function is based on the idea of line positioning: it advances the file pointer by skipping the specified number of starting lines.
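For comparison, the same positioning can be done with PHP's built-in SplFileObject, whose seek() method jumps to a zero-based line number (internally it still scans the file line by line, so performance is comparable). A minimal sketch:

// Alternative sketch using SplFileObject (built into PHP since 5.1).
// With the READ_CSV flag, current() returns the parsed row as an array.
function csv_get_lines_spl($csvfile, $lines, $offset = 0) {
    $file = new SplFileObject($csvfile, 'r');
    $file->setFlags(SplFileObject::READ_CSV);
    $file->seek($offset); // position to the starting line (zero-based)
    $data = array();
    while ($lines-- > 0 && !$file->eof()) {
        $data[] = $file->current();
        $file->next();
    }
    return $data;
}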
The csv_get_lines function above has been tested on files up to 500 MB and runs smoothly. It has not been tested on larger files; use it as-is or improve it to suit your needs.
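To import the whole file, the function can be called in a loop so that each batch of rows is written to the database before the next one is read. A minimal sketch, assuming a PDO connection and a hypothetical users (name, email) table whose columns match the CSV (adjust the DSN, credentials, and column list to your own schema):

$pdo    = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$batch  = 1000; // rows per call; tune to your memory and time limits
$offset = 0;
$stmt   = $pdo->prepare('INSERT INTO users (name, email) VALUES (?, ?)');
// An empty result array is falsy in PHP, so the loop stops at end of file
while ($rows = csv_get_lines('path/bigfile.csv', $batch, $offset)) {
    $pdo->beginTransaction(); // one transaction per batch keeps inserts fast
    foreach ($rows as $row) {
        if ($row === false || $row === array(null)) {
            continue; // skip blank lines returned by fgetcsv
        }
        $stmt->execute(array($row[0], $row[1]));
    }
    $pdo->commit();
    $offset += $batch;
}

Note that each call reopens the file and skips $offset lines again, so the total work grows quadratically with the file length; for a one-off import that is usually acceptable, but a single streaming pass would avoid it.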