PHP: reading a large CSV file and importing it into the database
How can PHP read a very large CSV file and import its data into the database?
A CSV file with millions of records can easily reach several hundred MB. If you try to read such a file in one go, the script is likely to time out or hang. To import the data into the database successfully, the file has to be processed in batches.
The following function reads a specified number of rows from a CSV file, starting at a given offset:
/**
 * csv_get_lines reads a range of rows from a CSV file
 * @param string $csvfile csv file path
 * @param int    $lines   number of rows to read
 * @param int    $offset  number of rows to skip before reading
 * @return array|false    parsed rows, or false if the file cannot be opened
 */
function csv_get_lines($csvfile, $lines, $offset = 0) {
    if (!$fp = fopen($csvfile, 'r')) {
        return false;
    }
    $i = $j = 0;
    // Skip the first $offset lines to move the file pointer to the starting row.
    while ($i < $offset && false !== fgets($fp)) {
        $i++;
    }
    $data = array();
    // Read up to $lines rows (or until the end of the file) as parsed CSV arrays.
    while (($j++ < $lines) && !feof($fp)) {
        $data[] = fgetcsv($fp);
    }
    fclose($fp);
    return $data;
}
Example call:
$data = csv_get_lines('path/bigfile.csv', 10, 2000000);
print_r($data);
The function locates the starting position by reading and discarding lines until the offset is reached, then collects the requested number of rows with fgetcsv().
Batch-inserting the returned rows into the database is not covered in detail in this article.
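As a rough starting point, though, the sketch below shows one way a batch-import loop could drive csv_get_lines(). The PDO connection string, the table name items and its two columns are assumptions made purely for illustration and are not part of the original example.

// Minimal batch-import sketch (assumed table `items` with columns `name` and `price`).
// Adjust the DSN, credentials, table and column mapping to your own schema.
$pdo  = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO items (name, price) VALUES (?, ?)');

$batchSize = 1000;   // rows per batch
$offset    = 0;      // current starting row

while (true) {
    $rows = csv_get_lines('path/bigfile.csv', $batchSize, $offset);
    if ($rows === false || count($rows) === 0) {
        break;                          // nothing left to read
    }
    $pdo->beginTransaction();           // one transaction per batch keeps the inserts fast
    foreach ($rows as $row) {
        if ($row === false || $row === null) {
            continue;                   // skip unreadable lines near the end of the file
        }
        $stmt->execute(array($row[0], $row[1]));
    }
    $pdo->commit();
    $offset += $batchSize;
}

Note that every call to csv_get_lines() re-opens the file and scans from the first line to reach the offset, so later batches become progressively slower on a very large file.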
The above function has been tested on files of this size and runs smoothly. For even larger files, use or improve it as appropriate.
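One possible improvement, not part of the original article, is to read the file only once and hand out batches from a generator, so that no batch has to re-scan the lines before it. The helper name csv_each_batch() below is hypothetical, introduced only for this sketch.

/**
 * csv_each_batch streams a CSV file and yields batches of parsed rows.
 * The file is read exactly once, so the cost no longer grows with the offset.
 * @param string $csvfile   path to the CSV file
 * @param int    $batchSize number of rows per yielded batch
 * @return Generator        yields arrays of parsed CSV rows
 */
function csv_each_batch($csvfile, $batchSize = 1000) {
    if (!$fp = fopen($csvfile, 'r')) {
        return;
    }
    $batch = array();
    while (false !== ($row = fgetcsv($fp))) {
        $batch[] = $row;
        if (count($batch) >= $batchSize) {
            yield $batch;
            $batch = array();
        }
    }
    if (count($batch) > 0) {
        yield $batch;       // remaining rows at the end of the file
    }
    fclose($fp);
}

// Usage: each $rows holds at most 1000 parsed rows.
foreach (csv_each_batch('path/bigfile.csv', 1000) as $rows) {
    // insert $rows into the database here
}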
That is all for this example of reading a large CSV file with PHP and importing it into the database. I hope it serves as a useful reference.