The first approach is to use a MySQL stored procedure:
The code is as follows:
mysql> delimiter $$
mysql> set autocommit = 0$$
mysql> CREATE PROCEDURE test()
BEGIN
    DECLARE i DECIMAL(10) DEFAULT 0;
    dd: LOOP
        INSERT INTO `million` (`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `add_date`, `picture_url`, `thumb_url`, `is_display_front`, `create_html_time`, `hit`, `buy_sum`, `athor`, `templete_style`, `is_hot`, `is_new`, `is_best`) VALUES
        (268, 2, '0,262,268,', 0, '2342', '423423', '123123', '2012-01-09 09:55:43', 'upload/product/20111205153432_53211.jpg', 'upload/product/thumb_20111205153432_53211.jpg', 1, 0, 0, 0, 'admin', '0', 0, 0, 0);
        COMMIT;
        SET i = i + 1;
        IF i = 1000000 THEN
            LEAVE dd;
        END IF;
    END LOOP dd;
END$$
mysql> delimiter ;
mysql> call test();
Result:
mysql> call test();
Query OK, 0 rows affected (min 30.83 sec)
This is very time-consuming, so I looked for another way: first generate the data with PHP, then import it.
The code is as follows:
<?php
$t = mktime();
set_time_limit(1000);
$myFile = "e:/insert.sql";
$fhandler = fopen($myFile, 'wb');
if ($fhandler) {
    $sql = "268\t2\t'0,262,268,'\t0\t'2342'\t'423423'\t'123123'\t'23423423'\t'2012-01-09 09:55:43'\t'upload/product/20111205153432_53211.jpg'\t'upload/product/thumb_20111205153432_53211.jpg'\tNULL\tNULL\t38\t'pieces'\t''\t123\t123\t0";
    $i = 0;
    while ($i < 1000000) { // 1,000,000 rows
        $i++;
        fwrite($fhandler, $sql . "\r\n");
    }
    echo "Write successful, time elapsed: ", mktime() - $t;
}
Then import it.
The code is as follows:
LOAD DATA LOCAL INFILE 'e:/insert.sql' INTO TABLE tenmillion (`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `description`, `add_date`, `picture_url`, `thumb_url`, `shop_url`, `shop_thumb_url`, `brand_id`, `unit`, `square_meters_unit`, `market_price`, `true_price`, `square_meters_price`);
Note that the fields are now separated by tabs rather than commas, and records are separated by \r\n. With this approach I inserted 10 million rows, and each batch of 1,000,000 (100W) took only about 1 minute on average.
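As a rough sketch of the same idea (in Python rather than the article's PHP, and with placeholder column values rather than the real schema), a file in the format LOAD DATA expects here can be generated like this:

```python
# Sketch: write a data file for LOAD DATA INFILE, with fields separated
# by tabs and records terminated by \r\n, as described above.
# The column values are placeholders, not the article's real schema.

def write_rows(path: str, n: int) -> None:
    row = "\t".join(["268", "2", "'0,262,268,'", "0", "'2342'"])
    # newline="" stops Python from translating our explicit \r\n terminators
    with open(path, "w", newline="") as f:
        for _ in range(n):
            f.write(row + "\r\n")

write_rows("insert_sample.txt", 1000)
```

The resulting file would then be loaded with a LOAD DATA LOCAL INFILE statement like the one above (tab-separated fields and \r\n line endings are MySQL's defaults for LOAD DATA).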
The second way is faster than the first because MySQL skips many intermediate steps (such as parsing a separate SQL statement per row); I have not researched the specifics.
Quickly generating millions of rows of test data in MySQL
For testing purposes the original table had only 10,000 rows, so I randomly copied its own records back into it to quickly reach 1 million.
itemid is the primary key.
Run the following statement several times; each run randomly picks a starting point and inserts 1000 rows:
INSERT INTO Downitems (chid, catid, softid, ...)
SELECT chid, catid, softid, ... FROM `Downitems`
WHERE itemid >= (SELECT FLOOR(RAND() * (SELECT MAX(itemid) FROM `Downitems`)))
ORDER BY itemid LIMIT 1000;
You can then change the 1000 to 5000 or 10,000, and the table will soon reach 1 million rows.
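As a back-of-envelope check (assuming each run of the INSERT ... SELECT above inserts exactly its LIMIT count of rows, which holds while enough rows match the WHERE clause), the number of runs needed works out like this:

```python
import math

def runs_needed(start_rows: int, target_rows: int, rows_per_run: int) -> int:
    # Each execution of the INSERT ... SELECT adds (up to) rows_per_run
    # copied rows to the table.
    return math.ceil((target_rows - start_rows) / rows_per_run)

print(runs_needed(10_000, 1_000_000, 1_000))   # 990 runs at LIMIT 1000
print(runs_needed(10_000, 1_000_000, 10_000))  # 99 runs at LIMIT 10000
```

This is why raising the LIMIT from 1000 to 10,000 makes the table grow to 1 million rows so much sooner.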
Reposted from: http://www.jb51.net/article/30099.htm
If you just need to insert millions of random rows using a stored procedure, see: http://fc-lamp.blog.163.com/blog/static/17456668720108305520576/
If you are importing a large external data file, try LOAD DATA INFILE; see: http://fc-lamp.blog.163.com/blog/static/17456668720121021104916483/
Or see the official documentation: http://dev.mysql.com/doc/refman/5.1/zh/optimization.html#insert-speed