Read and write files
Background and goal: a single table f_log already holds 10 million (1000w) rows of data; for a stress test, 100 million rows need to be prepared.
Steps:
1. Export the existing 10 million records (all columns except the ID) into multiple files:
DELIMITER //
DROP PROCEDURE IF EXISTS createmanytable;
CREATE PROCEDURE createmanytable()
BEGIN
    DECLARE i INT;
    DECLARE fileName VARCHAR(50);
    SET i = 1;
    WHILE i < 251 DO
        SET fileName = CONCAT('f_log_', i, '.txt');
        SET @STMT := CONCAT("SELECT 'xx', 'xx', 'xx', 'xx', .... INTO OUTFILE 'temp/", fileName,
                            "' LINES TERMINATED BY '\r\n' FROM f_log WHERE id >= ", 40000 * (i - 1),
                            " AND id < ", 40000 * i);
        PREPARE STMT FROM @STMT;
        EXECUTE STMT;
        SET i = i + 1;
    END WHILE;
END;
//
DELIMITER ;

CALL createmanytable();
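The export procedure above can be invoked from the mysql client (as shown) or from Java. Below is a minimal JDBC sketch of the latter; the JDBC URL, user name, and password are placeholders, and it assumes MySQL Connector/J is on the classpath. Note also that INTO OUTFILE writes the files on the database server, and the server's secure_file_priv setting may restrict which directory is allowed.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class ExportChunks {
    public static void main(String[] args) throws Exception {
        // Placeholder connection settings; adjust to the real database.
        String url = "jdbc:mysql://localhost:3306/test";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             CallableStatement stmt = conn.prepareCall("{call createmanytable()}")) {
            stmt.execute(); // each loop iteration in the procedure writes one f_log_<n>.txt file
        }
    }
}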
2. Merge the files above into a single file, prepending the ID as the first column:
import java.io.*;

public class MergeFiles {

    public static void main(String[] args) throws IOException {
        int i = 10000000;                  // IDs of the existing rows end at 10 million
        int step = 40000;                  // each exported chunk file holds 40,000 rows
        File out = new File("E:/data/f_log_data.txt");
        for (int k = 1; k < 251; k++) {
            File file = new File("E:/data/temp/f_log_" + k + ".txt");
            StringBuffer sb = new StringBuffer();
            if (file.exists()) {
                sb = readFile(file, i + step * k);   // new IDs start above the existing range
                writeFile(out, sb);
            }
        }
    }

    // Read one chunk file and prepend a running ID to every non-empty line.
    public static StringBuffer readFile(File file, int start) throws IOException {
        StringBuffer sb = new StringBuffer();
        BufferedReader reader = new BufferedReader(new FileReader(file));
        String line = "";
        while (line != null) {
            line = reader.readLine();
            if (line == null) {
                break;
            }
            if (line.trim().equalsIgnoreCase("")) {
                continue;
            }
            start++;
            sb.append(start + "\t" + line.trim() + "\r\n");
        }
        reader.close();
        return sb;
    }

    // Append one chunk's contents to the merged output file.
    public static void writeFile(File file, StringBuffer sb) throws IOException {
        BufferedWriter writer = new BufferedWriter(new FileWriter(file, true));
        writer.write(sb.toString());
        writer.close();
    }

    // Unrelated helper kept from the original: generates a simple surname/score test file.
    public void writeFile11() throws IOException {
        BufferedWriter writer = new BufferedWriter(new FileWriter(new File("D:/driver/data.txt"), true));
        String[] surnames = {"Zhao", "Qian", "Sun", "Li", "Zheng", "Wu", "Zhou", "Wang", "Zhang", "Liu"};
        for (int i = 0; i < 1000000; i++) {
            writer.write(surnames[i % 10] + (i / 10) + "\t" + (int) (Math.random() * 100) + "\n");
        }
        writer.close();
    }
}
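The readFile/writeFile pair above buffers one 40,000-line chunk at a time, so memory use stays modest even though a StringBuffer is used. If desired, the merge can also be written in a streaming style with a single writer held open for the whole run; the sketch below is one such variant (same placeholder paths as above). It numbers the new rows consecutively from 10,000,001, which differs slightly from the start + step*k arithmetic in the original but likewise avoids colliding with the existing IDs.

import java.io.*;

public class StreamingMerge {
    public static void main(String[] args) throws IOException {
        long id = 10000000L; // continue numbering after the existing 10 million rows
        try (BufferedWriter writer = new BufferedWriter(new FileWriter("E:/data/f_log_data.txt"))) {
            for (int k = 1; k < 251; k++) {
                File chunk = new File("E:/data/temp/f_log_" + k + ".txt");
                if (!chunk.exists()) {
                    continue;
                }
                try (BufferedReader reader = new BufferedReader(new FileReader(chunk))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        if (line.trim().isEmpty()) {
                            continue;
                        }
                        // Prepend a fresh id so the merged file can be loaded directly.
                        writer.write(++id + "\t" + line.trim() + "\r\n");
                    }
                }
            }
        }
    }
}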
3. Import the merged file into the data table:
LOAD DATA INFILE '/tmp/finance_log_data.txt' INTO TABLE f_log (`id`, `xx`, `xx`, ....);
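Because LOAD DATA cannot be wrapped in a stored procedure (see the caveat below), the statement has to be issued directly, either from the mysql client or from application code. A minimal JDBC sketch, assuming local-infile is enabled on both client and server and using placeholder connection settings and an illustrative column list:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class LoadMergedFile {
    public static void main(String[] args) throws Exception {
        // allowLoadLocalInfile=true lets Connector/J send a file from the client machine.
        String url = "jdbc:mysql://localhost:3306/test?allowLoadLocalInfile=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement()) {
            stmt.execute("LOAD DATA LOCAL INFILE '/tmp/finance_log_data.txt' "
                       + "INTO TABLE f_log (id, xx, xx)"); // column list is illustrative
        }
    }
}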
Caveats: The initial idea was to use a stored procedure to incrementally import the data into the table, but the LOAD DATA command cannot be used inside a stored procedure.
In addition, merging the data files could have been done with a shell script, but out of habit Java was used, which makes the job more complicated. On the other hand, it is a good opportunity to review reading and writing files in Java.
Q&A
Timing: generating the 100 million rows (with an index in place) took about 3 hours. With plain INSERT statements, the estimated time would be maddening!
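For comparison, the INSERT route would normally be written as a batched prepared statement; even with batching and rewriteBatchedStatements=true in Connector/J, it is typically far slower than LOAD DATA at this volume. A rough sketch of that approach, with placeholder connection settings and column names:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsert {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://localhost:3306/test?rewriteBatchedStatements=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO f_log (id, xx) VALUES (?, ?)")) {
                for (int i = 1; i <= 1000000; i++) {
                    ps.setInt(1, 10000000 + i);
                    ps.setString(2, "xx");
                    ps.addBatch();
                    if (i % 10000 == 0) { // flush every 10,000 rows
                        ps.executeBatch();
                        conn.commit();
                    }
                }
                ps.executeBatch();
                conn.commit();
            }
        }
    }
}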
MySQL: building a single table with 100 million records (big data table)