How to rapidly insert millions of test data records in MySQL


For comparison, the first approach uses a MySQL stored procedure. The code is as follows:

mysql> delimiter $
mysql> set autocommit = 0 $
mysql> create procedure test()
begin
    declare i decimal(10) default 0;
    dd: loop
        insert into `million` (`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `add_date`, `picture_url`, `thumb_url`, `is_display_front`, `create_html_time`, `hit`, `buy_sum`, `attr`, `templete_style`, `is_hot`, `is_new`, `is_best`) values
        (268, 2, '0,262,268,', 0, 'test', 'test', 'test', '2342-01-09 09:55:43', 'upload/product/20111205153432_53211.jpg', 'upload/product/thumb_20111205153432_53211.jpg', 1, 0, 0, 0, 'admin', '0', 0, 0, 0);
        commit;
        set i = i + 1;
        if i = 1000000 then
            leave dd;
        end if;
    end loop dd;
end; $
mysql> delimiter ;
mysql> call test;

Result:

mysql> call test; Query OK, 0 rows affected (58 min 30.83 sec)

Very time-consuming: almost an hour for 1,000,000 single-row inserts.
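Much of that time goes to the commit that runs once per loop iteration, so every row becomes its own transaction. As a minimal sketch, not part of the original test, here is the same loop committing in batches of 10,000 rows against a throwaway table (the table name and batch size are illustrative choices):

create table if not exists `bench` (
    id int primary key auto_increment,
    val int
);

delimiter $
create procedure test_batched()
begin
    declare i int default 0;
    dd: loop
        insert into `bench` (val) values (0);
        set i = i + 1;
        -- commit once per 10,000 rows instead of once per row
        if mod(i, 10000) = 0 then
            commit;
        end if;
        if i = 1000000 then
            commit; -- flush the final partial batch
            leave dd;
        end if;
    end loop dd;
end; $
delimiter ;

set autocommit = 0;
call test_batched;

Batching the commits removes most of the per-row transaction overhead, but every row still travels through the SQL layer one statement at a time.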
So I found another method.
Use PHP to generate the data as a file first, and then import the file. The code is as follows:

<?php
$t = mktime(); // start timestamp
set_time_limit(1000);
$myFile = "e:/insert.sql";
$fhandler = fopen($myFile, 'wb');
if ($fhandler) {
    // one tab-separated record; note the literal single quotes around the string values
    $sql = "268\t2\t'0,262,268,'\t0\t'000000'\t'000000'\t'000000'\t'000000'\t'2342-01-09 09:55:43'\t'upload/product/20111205153432_53211.jpg'\t'upload/product/thumb_20111205153432_53211.jpg'\tNULL\tNULL\t38\t'pieces'\t''\t123\t123\t0";
    $i = 0;
    while ($i < 1000000) { // 1,000,000 rows
        $i++;
        fwrite($fhandler, $sql . "\r\n");
    }
    echo "written successfully, time consumed: ", mktime() - $t;
}
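The article never shows the DDL of the import target. As an assumption-laden sketch, a tenmillion table compatible with the generated file might look like the following; every column type here is a guess, chosen only to accept the tab-separated test values:

-- hypothetical schema; the real table is not given in the article
CREATE TABLE `tenmillion` (
    `categ_id`            INT,
    `categ_fid`           INT,
    `sortpath`            VARCHAR(255),
    `address`             VARCHAR(255),
    `p_identifier`        VARCHAR(64),
    `pro_specification`   VARCHAR(64),
    `name`                VARCHAR(128),
    `description`         VARCHAR(255),
    `add_date`            VARCHAR(32),   -- left as text: the generated values carry literal quotes
    `picture_url`         VARCHAR(255),
    `thumb_url`           VARCHAR(255),
    `shop_url`            VARCHAR(255),
    `shop_thumb_url`      VARCHAR(255),
    `brand_id`            INT,
    `unit`                VARCHAR(32),
    `square_meters_unit`  VARCHAR(32),
    `market_price`        DECIMAL(10,2),
    `true_price`          DECIMAL(10,2),
    `square_meters_price` DECIMAL(10,2)
);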

Then import the file. The code is as follows:

LOAD DATA LOCAL INFILE 'e:/insert.sql' INTO TABLE `tenmillion` FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r\n' (`categ_id`, `categ_fid`, `sortpath`, `address`, `p_identifier`, `pro_specification`, `name`, `description`, `add_date`, `picture_url`, `thumb_url`, `shop_url`, `shop_thumb_url`, `brand_id`, `unit`, `square_meters_unit`, `market_price`, `true_price`, `square_meters_price`);

Note that the fields are now separated by tabs (\t) instead of commas, and records are separated by \r\n. The result: I ran the import 10 times, and inserting 1,000,000 records took only about 1 minute on average.
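One practical note that is not in the original article: on newer MySQL servers, LOAD DATA LOCAL is disabled by default, so the import may be rejected until the local_infile variable is enabled on the server (and the client also allows LOCAL):

-- check whether the server accepts LOAD DATA LOCAL, and enable it
-- if necessary (SET GLOBAL requires a privileged account)
SHOW VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;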
The second method bypasses many of MySQL's intermediate steps (per-statement parsing and per-row transaction handling), which makes the insertion much faster than the first method.
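The same principle can be pushed further inside plain SQL: a single INSERT can carry many rows in one VALUES list, which amortizes parsing and commit costs across the whole batch. A minimal sketch, reusing the throwaway `bench` table from the earlier sketch:

-- one statement, four rows; real batches typically carry
-- hundreds or thousands of rows per statement
INSERT INTO `bench` (val) VALUES (1), (2), (3), (4);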

Quickly generate millions of MySQL test data records
In this test, the original table contained only 10,000 records; existing rows are copied at random so that the row count quickly reaches 1,000,000.

itemid is the primary key (presumably AUTO_INCREMENT), so it is left out of the column list and new key values are generated for the copied rows.

Run the following statement several times; each run copies 1,000 rows starting at a random itemid position:

INSERT INTO downitems (chid, catid, softid, ....)
SELECT chid, catid, softid, .... FROM `downitems` WHERE itemid >= (SELECT FLOOR(RAND() * (SELECT MAX(itemid) FROM `downitems`))) ORDER BY itemid LIMIT 1000;

Then raise the LIMIT from 1000 to 5000 or 10,000; the table soon reaches a volume of 1,000,000 rows.
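A related variant, not from the original article, removes the randomness entirely: selecting the whole table and inserting it back into itself doubles the row count on every run, so 10,000 rows pass 1,000,000 after seven runs (again assuming itemid is AUTO_INCREMENT, so the copies receive fresh key values):

-- each run doubles the table: 10,000 * 2^7 = 1,280,000 rows after 7 runs
INSERT INTO downitems (chid, catid, softid /* , ... remaining non-key columns */)
SELECT chid, catid, softid /* , ... */ FROM downitems;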
