A quick way to insert millions of test records into MySQL

The first approach uses a MySQL stored procedure. The code is as follows:

mysql> delimiter $$
mysql> set autocommit = 0$$
mysql> create procedure test()
begin
  declare i decimal(10) default 0;
  dd: loop
    insert into `million` (`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `add_date`, `picture_url`, `thumb_url`, `is_display_front`, `create_html_time`, `hit`, `buy_sum`, `athor`, `templete_style`, `is_hot`, `is_new`, `is_best`) values
    (268, 2, '0,262,268,', 0, '2342', '423423', '123123', '2012-01-09 09:55:43', 'upload/product/20111205153432_53211.jpg', 'upload/product/thumb_20111205153432_53211.jpg', 1, 0, 0, 0, 'admin', '0', 0, 0, 0);
    commit;
    set i = i + 1;
    if i = 1000000 then
      leave dd;
    end if;
  end loop dd;
end$$
mysql> delimiter ;
mysql> call test();

The result:
mysql> call test(); Query OK, 0 rows affected (min 30.83 sec)
Very time-consuming.
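A big part of that cost is likely the COMMIT after every single row. Below is a minimal sketch of the same loop with batched commits, assuming a hypothetical two-column table t_test; the INSERT and column list would be adapted to your own schema:

delimiter $$
create procedure test_batch()
begin
  declare i int default 0;
  dd: loop
    -- hypothetical table and columns; replace with your own insert statement
    insert into `t_test` (`col1`, `col2`) values (1, 'test');
    set i = i + 1;
    -- commit once per 10,000 rows instead of once per row
    if mod(i, 10000) = 0 then
      commit;
    end if;
    if i = 1000000 then
      leave dd;
    end if;
  end loop dd;
  commit;
end$$
delimiter ;

call test_batch();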
So I looked for another way: first generate the data with a PHP script, then import it.
The code is as follows:

<?php
$t = mktime();
set_time_limit(1000);
$myFile = "e:/insert.sql";
$fhandler = fopen($myFile, 'wb');
if ($fhandler) {
    // one tab-separated row, repeated for every line of the file
    $sql = "268\t2\t'0,262,268,'\t0\t'2342'\t'423423'\t'123123'\t'23423423'\t'2012-01-09 09:55:43'\t'upload/product/20111205153432_53211.jpg'\t'upload/product/thumb_20111205153432_53211.jpg'\tNULL\tNULL\t38\t'pieces'\t''\t123\t123\t0";
    $i = 0;
    while ($i < 1000000) { // 1,000,000 rows
        $i++;
        fwrite($fhandler, $sql . "\r\n");
    }
    fclose($fhandler);
    echo "Write successfully, time-consuming: ", mktime() - $t;
}

Then import the file. The code is as follows:

LOAD DATA LOCAL INFILE 'e:/insert.sql' INTO TABLE tenmillion (`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `description`, `add_date`, `picture_url`, `thumb_url`, `shop_url`, `shop_thumb_url`, `brand_id`, `unit`, `square_meters_unit`, `market_price`, `true_price`, `square_meters_price`);

Note that the fields are no longer separated by commas but by \t, and the records are separated by \r\n. I ran the import 10 times; each 1,000,000-row (100W) file averaged about 1 minute.
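MySQL's defaults for LOAD DATA are fields terminated by \t and lines terminated by \n, so a file written with \r\n line endings is safer to load with the separators stated explicitly. A sketch of the same import with the terminators spelled out (same file, table, and columns as above):

LOAD DATA LOCAL INFILE 'e:/insert.sql'
INTO TABLE tenmillion
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\r\n'
(`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `description`, `add_date`, `picture_url`, `thumb_url`, `shop_url`, `shop_thumb_url`, `brand_id`, `unit`, `square_meters_unit`, `market_price`, `true_price`, `square_meters_price`);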
The second approach skips a lot of MySQL's intermediate processing, so it inserts far faster than the first; I have not looked into the details.

Quickly generate millions of test records in MySQL
For testing purposes, the original table had only 10,000 rows; by randomly copying existing records back into the table, it quickly reached 1 million.

itemid is the primary key.

Run the following statement several times; each run randomly picks 1,000 rows and re-inserts them:

INSERT INTO downitems (chid, catid, softid, ....)
SELECT chid, catid, softid, .... FROM `downitems`
WHERE itemid >= (SELECT FLOOR(RAND() * (SELECT MAX(itemid) FROM `downitems`)))
ORDER BY itemid LIMIT 1000;

You can then change the 1000 to 5,000 or 10,000, and the table will quickly reach 1 million rows.
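If you do not want to run the statement by hand over and over, the same random-copy insert can be wrapped in a small stored procedure. A minimal sketch; the columns after chid, catid, softid are elided ("....") in the original statement, so only those three appear here, and in practice you would list every column except the itemid primary key:

delimiter $$
create procedure grow_downitems(in rounds int)
begin
  declare n int default 0;
  while n < rounds do
    -- list all columns except itemid here, as in the statement above
    insert into downitems (chid, catid, softid)
    select chid, catid, softid
    from `downitems`
    where itemid >= (select floor(rand() * (select max(itemid) from `downitems`)))
    order by itemid
    limit 1000;
    set n = n + 1;
  end while;
end$$
delimiter ;

call grow_downitems(100);  -- 100 rounds of 1,000 rows each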
