It is well known that when inserting a large volume of data into MySQL, importing from a file with LOAD DATA LOCAL INFILE is much faster than using INSERT statements, roughly 20 times faster.
However, this approach has a drawback: before importing, you must have a file to import from. That means writing files and then maintaining and deleting them.
In some cases, for example when the data is produced concurrently, coordinating concurrent writes to the file becomes difficult to handle.
Is there a way to import data directly from memory (an I/O stream) with the same efficiency, without writing a file at all?
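For contrast, the conventional file-based workflow described above can be sketched as follows. This is my own minimal illustration, not code from the original article; the `buildLoadSql` helper and temp-file names are hypothetical, and the actual JDBC execution against MySQL is omitted:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

// Sketch of the file-based LOAD DATA LOCAL INFILE workflow: write a temporary
// TSV file, build the statement that references it, then clean the file up.
// The table test.test(a..f) matches the test table used later in this article.
public class FileBasedLoadSketch {

    // Builds the LOAD DATA statement for a given file path (hypothetical helper).
    static String buildLoadSql(String path) {
        return "LOAD DATA LOCAL INFILE '" + path
                + "' IGNORE INTO TABLE test.test (a, b, c, d, e, f)";
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("bulk", ".tsv");
        try {
            // Each line is one row; columns are tab-separated.
            Files.write(tmp, Arrays.asList("1\t2\t3\t4\t5\t6", "7\t8\t9\t10\t11\t12"));
            String sql = buildLoadSql(tmp.toAbsolutePath().toString());
            System.out.println(sql);
            // ...here a java.sql.Statement would execute sql against MySQL...
        } finally {
            Files.deleteIfExists(tmp); // the maintenance step the stream approach removes
        }
    }
}
```

The try/finally cleanup (plus any locking needed when several writers share one file) is exactly the bookkeeping that importing from a stream makes unnecessary.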
Some time ago, in the MySQL community I found such a method: setLocalInfileInputStream() in the com.mysql.jdbc.PreparedStatement class.
The specific implementation is as follows:
Use the setLocalInfileInputStream method of MySQL Connector/J to run LOAD DATA LOCAL INFILE from a Java InputStream into a MySQL database.
Prepare a test table
The SQL statement is as follows:

USE test;
CREATE TABLE `test` (
  `id` int(10) unsigned NOT NULL AUTO_INCREMENT,
  `a` int(11) NOT NULL,
  `b` bigint(20) unsigned NOT NULL,
  `c` bigint(20) unsigned NOT NULL,
  `d` int(10) unsigned NOT NULL,
  `e` int(10) unsigned NOT NULL,
  `f` int(10) unsigned NOT NULL,
  PRIMARY KEY (`id`),
  KEY `a_b` (`a`, `b`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8;
The Java code is as follows:
package com.seven.dbtools.dbtools;

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

import javax.sql.DataSource;

import org.apache.log4j.Logger;
import org.springframework.jdbc.core.JdbcTemplate;

/**
 * @author seven
 * @since 07.03.2013
 */
public class BulkLoadData2MySql {

    private static final Logger logger = Logger.getLogger(BulkLoadData2MySql.class);

    private JdbcTemplate jdbcTemplate;
    private Connection conn = null;

    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    // Builds tab-separated test rows entirely in memory.
    public static InputStream getTestDataInputStream() {
        StringBuilder builder = new StringBuilder();
        for (int i = 1; i <= 10; i++) {
            for (int j = 0; j <= 10000; j++) {
                builder.append(4);
                builder.append("\t");
                builder.append(4 + 1);
                builder.append("\t");
                builder.append(4 + 2);
                builder.append("\t");
                builder.append(4 + 3);
                builder.append("\t");
                builder.append(4 + 4);
                builder.append("\t");
                builder.append(4 + 5);
                builder.append("\n");
            }
        }
        byte[] bytes = builder.toString().getBytes();
        return new ByteArrayInputStream(bytes);
    }

    /**
     * Load bulk data from an InputStream into MySQL.
     */
    public int bulkLoadFromInputStream(String loadDataSql, InputStream dataStream) throws SQLException {
        if (dataStream == null) {
            logger.info("InputStream is null; no data imported");
            return 0;
        }
        conn = jdbcTemplate.getDataSource().getConnection();
        PreparedStatement statement = conn.prepareStatement(loadDataSql);
        int result = 0;
        if (statement.isWrapperFor(com.mysql.jdbc.Statement.class)) {
            com.mysql.jdbc.PreparedStatement mysqlStatement =
                    statement.unwrap(com.mysql.jdbc.PreparedStatement.class);
            // Redirect LOAD DATA LOCAL INFILE to read from the stream
            // instead of the file named in the SQL statement.
            mysqlStatement.setLocalInfileInputStream(dataStream);
            result = mysqlStatement.executeUpdate();
        }
        return result;
    }

    public static void main(String[] args) {
        String testSql = "LOAD DATA LOCAL INFILE 'sql.csv' IGNORE INTO TABLE test.test (a, b, c, d, e, f)";
        InputStream dataStream = getTestDataInputStream();
        BulkLoadData2MySql dao = new BulkLoadData2MySql();
        try {
            long beginTime = System.currentTimeMillis();
            int rows = dao.bulkLoadFromInputStream(testSql, dataStream);
            long endTime = System.currentTimeMillis();
            logger.info("Imported " + rows + " rows into MySQL in " + (endTime - beginTime) + " ms!");
        } catch (SQLException e) {
            e.printStackTrace();
        }
        System.exit(1);
    }
}
Tip:
String testSql = "LOAD DATA LOCAL INFILE 'sql.csv' IGNORE INTO TABLE test.test (a, b, c, d, e, f)";
When the setLocalInfileInputStream method is used, the file name in the statement is simply ignored, and the data is imported into the database directly from the I/O stream.
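As a small illustration of what the stream must contain (my own sketch, not code from the article): the InputStream simply carries tab-separated columns and newline-terminated rows matching the column list in the LOAD DATA statement, and it can be built and inspected without any database connection:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Builds an in-memory TSV payload like the one setLocalInfileInputStream()
// feeds to the server: 6 columns per row for (a, b, c, d, e, f), tab-separated,
// each row ending with a newline. The values 4..9 mirror the article's test data.
public class InMemoryTsv {

    public static String buildRows(int rows) {
        StringBuilder builder = new StringBuilder();
        for (int r = 0; r < rows; r++) {
            for (int col = 0; col < 6; col++) {
                builder.append(4 + col);
                builder.append(col < 5 ? "\t" : "\n");
            }
        }
        return builder.toString();
    }

    public static InputStream asStream(String tsv) {
        return new ByteArrayInputStream(tsv.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // Two rows, ready to hand to setLocalInfileInputStream().
        System.out.print(buildRows(2));
    }
}
```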
References:
http://assets.en.oreilly.com/1/event/21/Connector_J%20Performance%20Gems%20Presentation.pdf
http://jeffrick.com/2010/03/23/bulk-insert-into-a-mysql-database/
Original article; you are welcome to reprint it, but please credit the source: http://write.blog.csdn.net/postedit/9237495