Assume that our table structure is as follows:
The code is as follows:

CREATE TABLE example (
    example_id INT NOT NULL,
    name VARCHAR(50) NOT NULL,
    value VARCHAR(50) NOT NULL,
    other_value VARCHAR(50) NOT NULL
);
To insert a single row, we usually write an INSERT statement like this:
The code is as follows:

INSERT INTO example (example_id, name, value, other_value)
VALUES (100, 'name 1', 'value 1', 'other 1');
MySQL also allows you to insert multiple rows with a single SQL statement:
The code is as follows:

INSERT INTO example (example_id, name, value, other_value)
VALUES (100, 'name 1', 'value 1', 'other 1'),
       (101, 'name 2', 'value 2', 'other 2'),
       (102, 'name 3', 'value 3', 'other 3'),
       (103, 'name 4', 'value 4', 'other 4');
If the values are listed in the same order as the table's columns, the column list can be omitted, as in the following SQL:
The code is as follows:

INSERT INTO example
VALUES (100, 'name 1', 'value 1', 'other 1'),
       (101, 'name 2', 'value 2', 'other 2'),
       (102, 'name 3', 'value 3', 'other 3'),
       (103, 'name 4', 'value 4', 'other 4');
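In application code, a multi-row VALUES list like the one above is usually generated from an array of rows. The sketch below is a minimal illustration (the helper name `build_multi_insert` is my own, not from any library); it escapes values with addslashes() only to stay self-contained, and real code should prefer prepared statements or a driver-level escaping function.

```php
<?php
// Hypothetical helper: build one multi-row INSERT statement from
// an array of rows. addslashes() is used only to keep the sketch
// self-contained; use prepared statements in production.
function build_multi_insert($table, $columns, $rows) {
    $tuples = array();
    foreach ($rows as $row) {
        $escaped = array();
        foreach ($row as $v) {
            $escaped[] = "'" . addslashes($v) . "'";
        }
        $tuples[] = '(' . implode(', ', $escaped) . ')';
    }
    return 'INSERT INTO ' . $table
         . ' (' . implode(', ', $columns) . ')'
         . ' VALUES ' . implode(', ', $tuples) . ';';
}

$sql = build_multi_insert(
    'example',
    array('example_id', 'name', 'value', 'other_value'),
    array(
        array(100, 'name 1', 'value 1', 'other 1'),
        array(101, 'name 2', 'value 2', 'other 2'),
    )
);
echo $sql, "\n";
```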
All of the above works, but how fast is it? Next, I apply some SQL optimization tips and benchmark each approach separately. The goal is to insert two million records into an empty table.
Method 1: plain INSERT statements, one row at a time. The code is as follows:

$params = array('value' => '50');
set_time_limit(0);
echo date("H:i:s");
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
}
echo date("H:i:s");
The output was 23:25:05 and 01:32:05; in other words, it took more than two hours!
Method 2: use transactions, inserting in batches and committing every 100,000 rows. The timestamps were 22:56:13 and 23:04:00, about 8 minutes in total. The code is as follows:

echo date("H:i:s");
$connect_mysql->query('BEGIN');
$params = array('value' => '50');
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
    if ($i % 100000 == 0) {
        $connect_mysql->query('COMMIT');
        $connect_mysql->query('BEGIN');
    }
}
$connect_mysql->query('COMMIT');
echo date("H:i:s");
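The commit-every-100,000 pattern is easy to get subtly wrong (for example, a `$i % 100000 == 0` test also fires on the very first iteration). One way to reason about it is to compute the batch boundaries separately; the helper below, `commit_batches`, is my own illustration, not part of the article's code:

```php
<?php
// Illustrative helper: split $total inserts into transaction
// batches of at most $size rows each. Returns the batch sizes.
function commit_batches($total, $size) {
    $batches = array();
    for ($done = 0; $done < $total; $done += $size) {
        $batches[] = min($size, $total - $done);
    }
    return $batches;
}

// 2,000,000 rows committed every 100,000 inserts = 20 transactions.
$batches = commit_batches(2000000, 100000);
echo count($batches), " transactions\n";
```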
Method 3: optimize the SQL statement itself: concatenate everything into one big INSERT INTO table (...) VALUES (...), (...), ... statement and send it in a single query. Because the resulting string is very long, MySQL's packet limit must be raised first; in the mysql command line, run: SET GLOBAL max_allowed_packet = 2*1024*1024*10; Time consumed: 11:24:06 to 11:25:06, that is, inserting the two million test rows took only one minute!
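Note that SET GLOBAL only lasts until the server restarts (and only new connections pick up the change). To make the limit permanent, set it in the server configuration file (my.cnf / my.ini) instead, for example:

```ini
[mysqld]
# 2*1024*1024*10 bytes = 20M, matching the SET GLOBAL value above
max_allowed_packet = 20M
```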
The code is as follows:

$sql = "INSERT INTO twenty_million (value) VALUES ";
for ($i = 0; $i < 2000000; $i++) {
    $sql .= "('50'),";
}
$sql = substr($sql, 0, strlen($sql) - 1);
$connect_mysql->query($sql);
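Rather than raising max_allowed_packet indefinitely, the concatenated statement can be flushed in chunks that each stay under the packet limit. The sketch below is my own variation on this idea (the function name `chunked_values` and the tiny 200-byte budget are illustrative assumptions):

```php
<?php
// Split repeated value tuples into INSERT statements that each
// stay under $max_bytes, so no statement exceeds max_allowed_packet.
function chunked_values($prefix, $tuple, $count, $max_bytes) {
    $statements = array();
    $sql = $prefix;
    for ($i = 0; $i < $count; $i++) {
        if ($sql !== $prefix && strlen($sql) + strlen($tuple) + 1 > $max_bytes) {
            $statements[] = $sql;   // flush before exceeding the budget
            $sql = $prefix;
        }
        $sql .= ($sql === $prefix ? '' : ',') . $tuple;
    }
    if ($sql !== $prefix) {
        $statements[] = $sql;
    }
    return $statements;
}

// Each resulting statement would then be sent with a single query call.
$stmts = chunked_values("INSERT INTO twenty_million (value) VALUES ", "('50')", 1000, 200);
echo count($stmts), " statements\n";
```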
In conclusion, when inserting a large volume of data, the first method is undoubtedly the worst. The second method is the most widely used in practice. The third method is well suited to inserting test data and other low-stakes workloads, and it is indeed the fastest.