When you run performance tests, you often need to insert data in batches. There are three common approaches: first, inserting rows one by one, which is obviously the slowest; second, committing in transactions; and third, optimizing the SQL statement itself. We test each approach below; the goal is to insert two million records into an empty table.
Method 1: insert row by row with plain INSERT INTO statements. The code is as follows:
$params = array('value' => '50');
set_time_limit(0);
echo date("H:i:s");
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
}
echo date("H:i:s");
The output is 23:25:05 and 01:32:05; that is, it took more than two hours!
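Since the article's `$connect_mysql` wrapper is not shown, the row-by-row pattern can be sketched in Python with the built-in sqlite3 module (the table name `t` and the 10,000-row count are illustrative, scaled down from the article's two million). In autocommit mode, every INSERT is its own transaction, which is exactly what makes this approach slow:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # autocommit: each INSERT is its own transaction
conn.execute("CREATE TABLE t (value TEXT)")

start = time.time()
for _ in range(10000):  # scaled down from 2,000,000 for a quick demo
    conn.execute("INSERT INTO t (value) VALUES ('50')")
elapsed = time.time() - start

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 10000
```

On a real MySQL server over the network, the per-statement round trip and per-transaction flush dominate, which is why this method scales so badly.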
Method 2: use transaction commits to insert the data in batches (committing every 100,000 rows). The timing output is 22:56:13 and 23:04:00, roughly 8 minutes in total. The code is as follows:
echo date("H:i:s");
$connect_mysql->query('BEGIN');
$params = array('value' => '50');
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
    if ($i % 100000 == 0) {
        $connect_mysql->query('COMMIT');
        $connect_mysql->query('BEGIN');
    }
}
$connect_mysql->query('COMMIT');
echo date("H:i:s");
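The same commit-in-batches pattern, sketched again in Python with sqlite3 (batch size and row count are illustrative; the article commits every 100,000 rows). Grouping many inserts into one transaction amortizes the per-commit flush across the whole batch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions manually
conn.execute("CREATE TABLE t (value TEXT)")

BATCH = 1000  # commit every 1,000 rows (the article uses 100,000)
conn.execute("BEGIN")
for i in range(10000):
    conn.execute("INSERT INTO t (value) VALUES ('50')")
    if (i + 1) % BATCH == 0:
        # close the current batch and start a new transaction
        conn.execute("COMMIT")
        conn.execute("BEGIN")
conn.execute("COMMIT")  # commit whatever remains in the last batch

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 10000
```

Committing on `(i + 1) % BATCH` rather than `i % BATCH` avoids the pointless commit of an empty transaction at `i == 0` that the article's original condition triggers.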
Method 3: optimize the SQL statement itself. Concatenate one multi-row statement of the form INSERT INTO table () VALUES (), (), ... and insert everything at once. If the resulting string is too long, you need to raise MySQL's packet limit by running, at the MySQL command line: set global max_allowed_packet = 2*1024*1024*10; The timing output is 11:24:06 and 11:25:06: it takes only about one minute to insert the two million test rows! The code is as follows:
$sql = "INSERT INTO twenty_million (value) VALUES ";
for ($i = 0; $i < 2000000; $i++) {
    $sql .= "('50'),";
}
$sql = substr($sql, 0, strlen($sql) - 1);
$connect_mysql->query($sql);
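The multi-row INSERT can also be sketched in Python with sqlite3 (the 500-row count is illustrative and kept small; SQLite has no max_allowed_packet setting, but it imposes its own statement-size limits, so a real run against MySQL would need the packet change described above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE twenty_million (value TEXT)")

n = 500  # illustrative; the article builds a two-million-row statement
# Build one statement: INSERT INTO twenty_million (value) VALUES ('50'),('50'),...
sql = "INSERT INTO twenty_million (value) VALUES " + ",".join(["('50')"] * n)
conn.execute(sql)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM twenty_million").fetchone()[0]
print(count)  # 500
```

Using `",".join(...)` instead of repeated string concatenation in a loop also sidesteps the trailing-comma trim that the PHP version needs.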
In conclusion, when inserting a large volume of data, the first method is undoubtedly the worst. The second method is the most widely used in practice, while the third is well suited to inserting test data and other low-stakes scenarios, and it is indeed fast.