Often, to verify that a database design is appropriate or to optimize a SQL statement, you need to insert a large amount of data into a table, and inserting that much data quickly is a problem in itself.
The first idea is to write a program that inserts rows one at a time in a big loop, something like this:
int i = LOOP_COUNT;
while (i-- >= 0) {
    // insert one row here
}
But when I tried that, I found insertion was very slow, less than 100 rows per second, so instead of inserting one row at a time I switched to the multi-row form

INSERT INTO tbl_name VALUES (...),(...),(...),...

and modified the program accordingly:
int i = LOOP_COUNT;
StringBuilder stringBuilder = new StringBuilder();
while (i-- >= 0) {
    // append one row's values to stringBuilder
    if (LOOP_COUNT != i && i % 5000 == 0) {
        // flush these 5,000 rows with one multi-row INSERT,
        // then empty the StringBuilder
    }
}
// insert the remaining rows
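The batching idea above can be reduced to the part that builds the multi-row statement. This is a minimal sketch; the table name and the integer columns are placeholders I chose for illustration, not from the original post:

```java
import java.util.List;

public class MultiValuesInsert {
    // Builds one "INSERT ... VALUES (...),(...)" statement for a batch of rows.
    // Table and column layout here are hypothetical placeholders.
    static String buildBatchInsert(String table, List<int[]> rows) {
        StringBuilder sb = new StringBuilder("INSERT INTO " + table + " VALUES ");
        for (int i = 0; i < rows.size(); i++) {
            if (i > 0) sb.append(',');          // comma between row tuples
            int[] r = rows.get(i);
            sb.append('(');
            for (int j = 0; j < r.length; j++) {
                if (j > 0) sb.append(',');      // comma between columns
                sb.append(r[j]);
            }
            sb.append(')');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        List<int[]> rows = List.of(new int[]{1, 2}, new int[]{3, 4});
        System.out.println(buildBatchInsert("t", rows));
        // prints: INSERT INTO t VALUES (1,2),(3,4)
    }
}
```

Sending one such statement per 5,000 rows replaces 5,000 network round trips with one, which is where the speedup comes from.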
This is much faster, but if you need to insert a very large number of rows, say hundreds of millions, it still takes a very long time.
Searching the MySQL documentation, I found the page for LOAD DATA INFILE. The name alone looked promising, so I read it carefully.
The official syntax of the command is:
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [CHARACTER SET charset_name]
    [{FIELDS | COLUMNS}
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number LINES]
    [(col_name_or_user_var,...)]
    [SET col_name = expr,...]
The command is not complex; the meaning and usage of each parameter are explained in the official documentation: http://dev.mysql.com/doc/refman/5.5/en/load-data.html
Now all that remains is generating the data. I habitually use \t as the field delimiter and \n as the line delimiter, so the code to generate the data file is as follows:
long start = System.currentTimeMillis() / 1000;
try {
    File file = new File(FILE);
    if (file.exists()) {
        file.delete();
    }
    file.createNewFile();
    FileOutputStream outStream = new FileOutputStream(file, true);
    StringBuilder builder = new StringBuilder(10240);
    DateFormat dateFormat = new SimpleDateFormat(DATE_FORMAT);
    Random rand = new Random();
    String tmpDate = dateFormat.format(new Date());
    long tmpTimestamp = System.currentTimeMillis() / 1000;
    int i = 0;
    while (i++ < LOOP) {
        if (i > 0 && i % 30000 == 0) {
            // flush the buffer to disk every 30,000 rows
            System.out.println("write offset:" + i);
            outStream.write(builder.toString().getBytes(CHAR_CODE));
            builder = new StringBuilder(10240);
        }
        if (tmpTimestamp != System.currentTimeMillis() / 1000) {
            // only reformat the date string when the second changes
            tmpDate = dateFormat.format(new Date());
            tmpTimestamp = System.currentTimeMillis() / 1000;
        }
        builder.append(tmpDate);
        builder.append("\t");
        builder.append(rand.nextInt(999));
        builder.append("\t");
        // Encrypt.md5 is the author's own MD5 helper
        builder.append(Encrypt.md5(System.currentTimeMillis() + "" + rand.nextInt(99999999)));
        builder.append("\t");
        builder.append(rand.nextInt(999) % 2 == 0 ? "AA" : "BB");
        builder.append("\t");
        builder.append(rand.nextFloat() * 2000);
        builder.append("\t");
        builder.append(rand.nextInt(9));
        builder.append("\n");
    }
    System.out.println("write data:" + i);
    outStream.write(builder.toString().getBytes(CHAR_CODE));
    outStream.close();
} catch (Exception e) {
    e.printStackTrace();
}
System.out.println(System.currentTimeMillis() / 1000 - start);
This code generates a data file with one record per line. Importing it with the LOAD DATA command mentioned above, my office machine (2 GB RAM, a low-end dual-core CPU, MySQL installed on Windows with no tuning, developer configuration) reached close to a million rows inserted per second, far faster than the other methods.
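The import can also be driven from Java over JDBC. Below is a hedged sketch: the file path and table name are placeholders, and running it for real requires a reachable MySQL server with local-infile enabled (for Connector/J, the `allowLoadLocalInfile=true` URL property), so the execution part is left as a comment:

```java
public class LoadDataImport {
    // Builds the LOAD DATA statement for a tab-delimited, newline-terminated
    // file, matching the format produced by the generator above.
    // File path and table name are hypothetical placeholders.
    static String buildLoadData(String filePath, String table) {
        return "LOAD DATA LOCAL INFILE '" + filePath + "' INTO TABLE " + table
                + " FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\n'";
    }

    public static void main(String[] args) {
        String sql = buildLoadData("/tmp/test_data.txt", "test_data");
        System.out.println(sql);
        // To actually run the import (assumes a MySQL server and a JDBC URL
        // with allowLoadLocalInfile=true):
        // try (Connection conn = DriverManager.getConnection(url, user, pass);
        //      Statement st = conn.createStatement()) {
        //     st.execute(sql);
        // }
    }
}
```

Note that the delimiters in the statement must match whatever the generator wrote, or every row will load into a single column.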
Alternatively, if you prefer a GUI tool such as SQLyog, right-click the table and choose Import > Import CSV Data Using Load Local, then set the encoding and the delimiters, and the file can be imported directly.
Quickly insert large amounts of test data in MySQL