Comparison of three methods, and their speed, for mass database insertion in PHP
Method 1: Use INSERT INTO, one row at a time. The code is as follows:
$params = array('value' => '50');
set_time_limit(0);                         // remove the script execution time limit
echo date("H:i:s");
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);       // one INSERT statement per row
}
echo date("H:i:s");
The output was 23:25:05 and 01:32:05; in other words, inserting the 2,000,000 rows took more than two hours!
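The $connect_mysql object above is the author's own database wrapper. For reference, a minimal sketch of the same benchmark using plain PDO might look like this (the DSN, the credentials, and the table name twenty_million, borrowed from the Method 3 code below, are all assumptions):

$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'password');
$stmt = $pdo->prepare('INSERT INTO twenty_million (value) VALUES (?)');
set_time_limit(0);
echo date("H:i:s");
for ($i = 0; $i < 2000000; $i++) {
    $stmt->execute(array('50'));   // one network round-trip and one implicit commit per row
}
echo date("H:i:s");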
Method 2: Use transactions and insert in batches (committing every 100,000 records). The time consumed was 22:56:13 to 23:04:00, about 8 minutes in total. The code is as follows:
echo date("H:i:s");
$connect_mysql->query('BEGIN');            // open the first transaction
$params = array('value' => '50');
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
    if ($i % 100000 == 0) {                // commit every 100,000 rows
        $connect_mysql->query('COMMIT');
        $connect_mysql->query('BEGIN');
    }
}
$connect_mysql->query('COMMIT');           // commit the final partial batch
echo date("H:i:s");
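With PDO, the same batched-commit pattern can be written using its native transaction methods; a sketch, reusing the assumed $pdo connection and $stmt prepared statement from the sketch above:

$pdo->beginTransaction();
for ($i = 0; $i < 2000000; $i++) {
    $stmt->execute(array('50'));
    if ($i % 100000 == 0) {                // commit every 100,000 rows, as in the original
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();                            // commit the final partial batch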
Method 3: Optimize the SQL statement itself. Concatenate all the rows into a single statement of the form insert into table (...) values (...), (...), ... and send it in one go. If the resulting string is too long, you need to reconfigure MySQL by running the following on the MySQL command line: set global max_allowed_packet = 2*1024*1024*10;
The time consumed was 11:24:06 to 11:25:06: inserting the 2,000,000 test records took only about one minute! The code is as follows:
$sql = "insert into twenty_million (value) values";
for ($i = 0; $i < 2000000; $i++) {
    $sql .= "('50'),";
}
$sql = substr($sql, 0, strlen($sql) - 1);  // strip the trailing comma
$connect_mysql->query($sql);               // one huge multi-row INSERT
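If you would rather not raise max_allowed_packet, one variant (a sketch, with an assumed chunk size) is to flush the concatenated statement every few thousand rows so that each packet stays small:

$chunk = 10000;                            // rows per INSERT statement (assumed size)
$values = array();
for ($i = 0; $i < 2000000; $i++) {
    $values[] = "('50')";
    if (count($values) == $chunk) {
        $connect_mysql->query("insert into twenty_million (value) values " . implode(',', $values));
        $values = array();
    }
}
if (count($values) > 0) {                  // flush the remainder
    $connect_mysql->query("insert into twenty_million (value) values " . implode(',', $values));
}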
In conclusion: when inserting a large volume of data, the first method is undoubtedly the worst; the second method is the one most widely used in practice; the third method is well suited to inserting test data and other undemanding jobs, and it really is fast.
The fastest way for PHP to insert large amounts of data into a MySQL database
Inserting 1,000 records per statement is many times faster than inserting one record at a time. There is nothing difficult about it; the main trick is in how you write the SQL statement. Write

INSERT INTO table1 VALUES (v1, v2, v3), (x1, x2, x3), ...

instead of issuing one statement per row:

INSERT INTO table1 VALUES (v1, v2, v3);
INSERT INTO table1 VALUES (x1, x2, x3);

A minimal PHP sketch of this batching follows below. Hope this helps.
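A sketch of that batching in PHP, assuming a PDO connection $pdo, a table named table1, and $rows as an array of three-element value arrays (all names here are assumptions for illustration):

foreach (array_chunk($rows, 1000) as $batch) {   // 1,000 rows per INSERT
    $values = array();
    foreach ($batch as $row) {
        // quote each value so the concatenated SQL stays safe
        $values[] = '(' . implode(',', array_map(array($pdo, 'quote'), $row)) . ')';
    }
    $pdo->exec('INSERT INTO table1 VALUES ' . implode(',', $values));
}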
References: Niu Ren Xiaohai
PHP + MySQL: how do you import large amounts of Excel data into the database? I used to handle this by first converting the spreadsheet to CSV.
About the CSV format: if a field's content contains a comma, you can enclose the whole field in quotation marks; for details, see the csv entry in Baidu Encyclopedia. For example:

Field 1, "Field 2 with, number", Field 3

The key lies in the rules PHP uses to read the CSV file. A CSV file does not have to use commas as the delimiter; other symbols, such as semicolons, work too. Just modify the delimiter in the corresponding PHP reading code, as in the sketch below.
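For instance, a sketch of reading a semicolon-delimited CSV with fgetcsv(), which also handles the quoted-field case shown above (the file name data.csv is an assumption):

$fh = fopen('data.csv', 'r');
while (($fields = fgetcsv($fh, 0, ';')) !== false) {   // the third argument sets the delimiter
    // $fields is an array of the row's columns, enclosing quotes already stripped;
    // insert them into the database here, e.g. with the batching shown earlier
}
fclose($fh);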