PHP: importing the data in an Excel file into the database in batches. During project development over the past few days, you may encounter the need to import data in batches: after the user submits an Excel table, we need to insert all of its contents into a data table in the background. The premise, of course, is that the information in the user's Excel table corresponds to the field information in our table. Below are the implementation steps for batch data import in the background.
First, we need to download PHPExcel.
PHPExcel is a PHP class library for working with Office Excel documents. It is based on Microsoft's OpenXML standard and the PHP language, and you can use it to read and write workbooks in different formats.
If you need this class library, you can contact me by email at 823410261@qq.com.
After the PHPExcel library is downloaded, the following section describes the coding implementation. First, let's talk about the overall solution.
First, obtain the data in the Excel table. Second, assemble that data into a format that complies with SQL syntax; the detailed processing is shown in the code below. Finally, combine the assembled data with an SQL statement and insert all of it into the data table in one go.
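The second step above can be sketched in isolation. In this hedged example the table name, column names, and sample rows are all illustrative assumptions, not taken from the original code: each row becomes a parenthesized value group, and the groups are joined so that a single INSERT statement covers every row.

```php
<?php
// Illustrative sketch: turn rows read from Excel into one multi-row INSERT.
// The $rows data and the diseases table/columns are assumptions for this example.
$rows = [
    [1, 'influenza', 'viral'],
    [2, 'measles',   'viral'],
];

$groups = [];
foreach ($rows as $cols) {
    // Quote every value; real code should also escape user-supplied input.
    $quoted = array_map(function ($v) {
        return "'" . addslashes((string)$v) . "'";
    }, $cols);
    $groups[] = '(' . implode(',', $quoted) . ')';
}

$sql = "INSERT INTO diseases (id, name, type) VALUES " . implode(',', $groups);
echo $sql;
// INSERT INTO diseases (id, name, type) VALUES ('1','influenza','viral'),('2','measles','viral')
```

Joining the groups with a comma is what lets the database receive all rows in a single statement, which is the point of the batch approach described below.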
function getFileExcle($value) {
    $cgsdiseasesinterface = new Cgs_diseases_interface();
    $filePath = $value; // $value is the Excel file for data import
    $PHPExcel = new PHPExcel();
    /** Try the Excel 2007 reader first; if the format is wrong, fall back to the older reader */
    $PHPReader = new PHPExcel_Reader_Excel2007();
    if (!$PHPReader->canRead($filePath)) {
        $PHPReader = new PHPExcel_Reader_Excel5();
        if (!$PHPReader->canRead($filePath)) {
            echo 'no Excel';
            return;
        }
    }
    $PHPExcel = $PHPReader->load($filePath);
    /** Read the first worksheet in the Excel file */
    $currentSheet = $PHPExcel->getSheet(0);
    /** Obtain the highest column letter */
    $allColumn = $currentSheet->getHighestColumn();
    /** Get the total number of rows */
    $allRow = $currentSheet->getHighestRow();
    $row = array();
    /** Start from the second row, because the first row holds the column names */
    for ($currentRow = 2; $currentRow <= $allRow; $currentRow++) {
        /** Start from column A */
        $col = array();
        for ($currentColumn = 'A'; $currentColumn <= $allColumn; $currentColumn++) {
            /** ord() converts the column letter to its decimal code; ord('A') is 65 */
            $val = $currentSheet->getCellByColumnAndRow(ord($currentColumn) - 65, $currentRow)->getValue();
            array_push($col, $val);
        }
        array_push($row, $col);
    }
    /** Wrap the second column (index 1) of each row in single quotes */
    foreach ($row as $key => $value) {
        $row[$key][1] = '\'' . $value[1] . '\'';
    }
    /** Join each row's values with commas */
    foreach ($row as $key => $value) {
        $row[$key] = implode(",", $value);
    }
    /** Join the rows into one VALUES list: (...),(...),(...) */
    $row = implode("),(", $row);
    $row = '(' . $row . ')';
    $res = $cgsdiseasesinterface->insertDiseasesInformation($row); // This function inserts the data into the database
    if ($res) {
        echo json_encode(['code' => CODE_SUCCESS, 'result' => 'batch import succeeded']);
    } else {
        echo json_encode(['code' => CODE_ERROR, 'result' => 'batch import failed']);
    }
}
Note that $currentRow starts from 2 because the first row holds the names of the corresponding fields, while the data we need begins on the second row; you can change this based on your actual needs.
Here, I splice all the data into a single string. The advantage is that when the database uses short connections, you only need to connect to the database once, which saves time and system resources; if the database is set to persistent connections, the advantage of string concatenation is less obvious. Of course, you can also insert the data into the table row by row in a loop. The advantage of that method is that you do not need to splice the data in a complicated way, but the disadvantage, as just said, is that it increases the running time and consumes more system resources.
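For comparison, here is a hedged sketch of the loop-insert alternative mentioned above. The mysqli connection parameters, table name, and columns are assumptions for illustration only; the point is that each row costs one statement execution, which is why it is slower than the single spliced INSERT when every statement pays for its own round trip.

```php
<?php
// Illustrative sketch: insert row by row with a prepared statement.
// The connection details and the diseases table are assumptions for this example.
$mysqli = new mysqli('localhost', 'user', 'password', 'test');
$stmt = $mysqli->prepare("INSERT INTO diseases (id, name, type) VALUES (?, ?, ?)");

foreach ($rows as $cols) {
    // One round trip per row; the spliced-string version needs only one in total.
    $stmt->bind_param('iss', $cols[0], $cols[1], $cols[2]);
    $stmt->execute();
}
$stmt->close();
$mysqli->close();
```

A side benefit of this variant is that prepared statements handle quoting and escaping for you, so the manual single-quote wrapping done in the batch version is not needed.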