An old topic... Asking for help with exporting Excel from PHP: the PHPExcel class is very powerful but slow and inefficient, while fputcsv is fast but I don't know how to stop long numbers from turning into scientific notation. Hoping someone experienced can share how they handle this.
This post was last edited by anyilaoliu on 2014-08-06 at 16:39:12
As the title says...
At present the data volume is large and may reach about a million records next month; in that case the final export is roughly 20 MB. Because there are many fields, each record goes through some processing and calculation inside the loop. Exporting with PHPExcel takes about 3 minutes; writing the file with PHP's fputcsv takes about 1 minute. The problem is that extra-long numeric fields such as ID card numbers end up displayed in scientific notation.
Adding \t to each field when using fputcsv does avoid scientific notation, but then every field in the CSV file ends up wrapped in double quotation marks, which the customer will not accept.
With PHPExcel a column can be set to text format (roughly as in the sketch below), but then the export takes too long for the customer to accept.
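What was tried looks roughly like this minimal sketch (the coordinate and the sample ID-card value are placeholders, not the real data):

require_once 'PHPExcel.php';

$objPHPExcel = new PHPExcel();
$sheet = $objPHPExcel->getActiveSheet();
// Write the long number as an explicit string so PHPExcel stores it as text,
// not as a number that Excel would show in scientific notation.
$sheet->setCellValueExplicit('C2', '110101199001011234', PHPExcel_Cell_DataType::TYPE_STRING);
$writer = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$writer->save('php://output');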
The loop in the program itself has already been optimized as far as it will go, so I am stuck. Help in either of the following two directions would be welcome:
1. Optimize or trim down the use of PHPExcel so that it becomes efficient enough while still producing the output described above.
2. Export a CSV file in which long numbers are not shown in scientific notation, or in which the column's cell format can be set to text.
Any workable ideas are much appreciated.
------ Solution ----------------------
Quote:
An optimization method:
http://bbs.youyax.com/Content-5058
Suppose there are 10,000 rows of data:
export them in 10 passes, 1,000 rows each time.
The advantage of this approach is that you avoid one very long wait,
but the problem is that memory usage is still high,
and if the data set is too large you will still hit an out-of-memory error.
The principle is:
export the first 1,000 rows,
query again, positioning at row 1001,
export the next 1,000 rows,
query again, positioning at row 2001,
......
The reduction in total time is not significant, though.
Post your method or code so it can be looked at; without studying it, it is hard to say more.
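For illustration, a minimal sketch of that batching idea, assuming a mysqli connection in $db and a hypothetical orders table with placeholder field names (not code from this thread):

$fp     = fopen('php://output', 'w');
$limit  = 1000;                // rows per batch
$offset = 0;
while (true) {
    // fetch the next batch: rows $offset+1 .. $offset+$limit
    $res = $db->query("SELECT id, id_card, amount FROM orders LIMIT $offset, $limit");
    if ($res === false || $res->num_rows === 0) {
        break;                 // no more rows, export finished
    }
    while ($row = $res->fetch_assoc()) {
        fputcsv($fp, $row);    // one CSV line per record
    }
    $res->free();
    $offset += $limit;         // next pass starts at row 1001, 2001, ...
    ob_flush();
    flush();                   // push the buffered output to the client
}
fclose($fp);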
------ Solution ----------------------
1. Mind the Excel row limit: Excel 2003 supports at most 65,536 rows, while Excel 2007 and later support 1,048,576 rows (about 1.04 million). If your users open the file in Excel 2003, 65,536 rows is the hard ceiling.
For reference: generate the CSV data in a loop, flushing the output buffer every $limit rows:
$fp = fopen('php://output', 'a');
// Output the Excel column header
$head = array("email");
foreach ($head as $i => $v) {
    // Excel reads CSV as GBK, so the text must be converted or it will be garbled
    $head[$i] = iconv('utf-8', 'gbk', $v);
}
// Write the header row to the file handle with fputcsv
fputcsv($fp, $head);
// Counter
$cnt = 0;
// Flush the output buffer every $limit rows; do not make it too large or too small
$limit = 100000;
// Take the data out row by row so memory is not wasted
$count = count($email);
for ($t = 0; $t < $count; $t++) {
    $cnt++;
    if ($limit == $cnt) { // flush the output buffer to avoid problems from too much buffered data
        ob_flush();
        flush();
        $cnt = 0;
    }
    $row[] = $email[$t];
    foreach ($row as $i => $v) {
        $row[$i] = iconv('utf-8', 'gbk', $v);
    }
    fputcsv($fp, $row);
    unset($row);
}
2. When exporting a long number, prepend a space to it; the value is then written out as text rather than as a number, so it will not collapse into scientific notation. A small sketch follows.
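For illustration only, a minimal sketch of tip 2, assuming $fp is the open CSV handle and the ID-card value is a placeholder (not code from this thread):

$idCard = '110101199001011234';        // placeholder over-long number
// A leading space should keep Excel from parsing the value as a number,
// but fputcsv will typically wrap such a field in double quotes.
fputcsv($fp, array(' ' . $idCard));
// If the quotes are unacceptable, write the line manually with a tab prefix;
// note this skips fputcsv's escaping, so it only suits fields without commas or quotes.
fwrite($fp, "\t" . $idCard . "\n");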