Previous notes used JDBC and iBatis to implement bulk inserts into the database. The corresponding interfaces in the Hibernate framework can also perform batch operations on data. Hibernate caches newly inserted objects in the Session-level (first-level) cache, so two things are needed. First, set a reasonable JDBC batch size via the hibernate.jdbc.batch_size parameter in the configuration file; it specifies how many SQL statements are grouped into a single JDBC batch per round trip. The larger this value, the fewer round trips are made to the database, and the faster the insert runs (up to a point). Second, call flush() and clear() on the Session at regular intervals, so that after a certain amount of data has been inserted it is promptly evicted from the session cache, freeing the memory it occupies. The Session implements a transparent write-behind, which allows Hibernate to group the write operations into batches. Here is an example of using Hibernate to bulk insert data:
The code is as follows:
/**
 * Bulk insert data with Hibernate.
 * @param recordList the records to save
 */
public void saveRecordByList(final List<Record> recordList) {
    getHibernateTemplate().execute(new HibernateCallback() {
        @Override
        public Object doInHibernate(Session session) throws HibernateException, SQLException {
            session.beginTransaction();
            // Maximum number of rows per commit
            final int batchSize = 200;
            int count = 0;
            for (Record record : recordList) {
                session.save(record);
                // Commit once every 200 rows
                if (++count % batchSize == 0) {
                    session.flush();  // push the pending inserts to the database
                    session.clear();  // evict the saved objects, freeing session memory
                    session.getTransaction().commit();
                    session.beginTransaction();
                }
            }
            // Commit the remaining rows
            session.getTransaction().commit();
            return null;
        }
    });
}
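The commit-every-batchSize arithmetic in the loop above can be checked in isolation, without Hibernate or a database. The sketch below (class and method names are hypothetical, not from the original article) counts how many commits the loop issues for a given number of records:

```java
public class BatchMath {
    // Mirrors the loop above: one commit per full batch of batchSize rows,
    // plus one trailing commit for any leftover rows.
    static int countCommits(int recordCount, int batchSize) {
        int count = 0;
        int commits = 0;
        for (int i = 0; i < recordCount; i++) {
            if (++count % batchSize == 0) {
                commits++; // flush(), clear(), commit() would happen here
            }
        }
        return commits + 1; // the unconditional final commit
    }

    public static void main(String[] args) {
        // 1000 records, batch size 200: 5 in-loop commits + 1 final commit
        System.out.println(countCommits(1000, 200)); // prints 6
        // 250 records: 1 in-loop commit + 1 final commit
        System.out.println(countCommits(250, 200));  // prints 2
    }
}
```

With batchSize set to 200, a list of 1000 records triggers exactly five flush/clear/commit cycles inside the loop, so the session never holds more than 200 pending objects at a time.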
In the Hibernate configuration file, set the hibernate.jdbc.batch_size parameter:
The code is as follows:
<!-- Set the hibernate.jdbc.batch_size parameter -->
<session-factory>
    ......
    <property name="hibernate.jdbc.batch_size">200</property>
    ......
</session-factory>
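Two related Hibernate properties are often set alongside the batch size; the fragment below is an illustrative sketch (the values are examples, not from the original article). Note that Hibernate silently disables JDBC batching for entities whose ID uses the identity generator, since it must read back each generated key.

```xml
<session-factory>
    <!-- group up to 50 inserts into one JDBC batch -->
    <property name="hibernate.jdbc.batch_size">50</property>
    <!-- order SQL statements so rows for the same table batch together -->
    <property name="hibernate.order_inserts">true</property>
    <property name="hibernate.order_updates">true</property>
</session-factory>
```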