In ITOO, virtually every subsystem has an import feature: users fill a large amount of data into an Excel template and then use the Import function to load it into the database, which greatly improves productivity. Importing this way raises the problem of saving records to the database in bulk.
Typically, a Session object is closed once it has finished processing a transaction. All data loaded within the session is kept in the Session's in-memory cache, and that cache is cleared only when the transaction commits. When Hibernate performs a bulk operation this way, two problems arise:
(1) Heavy memory consumption: thousands of records must all be loaded into memory before they can be updated one by one.
(2) Too many UPDATE statements: each UPDATE statement modifies only one row, so updating ten thousand rows requires ten thousand UPDATE statements. Such frequent database round trips greatly reduce application performance.
For this situation, we can use the Session to perform bulk operations. The Session's save() method keeps every persisted object in its own cache, so when a single Session handles a large number of persistent objects, objects that have been processed and will not be accessed again should be evicted from the cache promptly. This is done by calling flush() immediately after processing each object (or each small batch of objects) to synchronize the cache with the database, and then calling clear() to empty the cache.
Using a Session for bulk operations is subject to the following constraints:
1. The JDBC batch size must be set in Hibernate's configuration file; a reasonable value is usually between 10 and 50, for example hibernate.jdbc.batch_size=30. You should then ensure that the number of SQL statements sent to the database in each batch matches this batch_size property.
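For reference, this property can be set in hibernate.cfg.xml roughly as follows (a minimal sketch; the other session-factory settings are omitted):

```xml
<hibernate-configuration>
  <session-factory>
    <!-- Send INSERT/UPDATE statements to the database in batches of 30 -->
    <property name="hibernate.jdbc.batch_size">30</property>
    <!-- Optional: group inserts for the same entity together, which helps batching -->
    <property name="hibernate.order_inserts">true</property>
  </session-factory>
</hibernate-configuration>
```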
2. If the persistent object uses the "identity" identifier generator, Hibernate cannot perform JDBC-level batch inserts: with identity, each INSERT must be executed immediately so that Hibernate can read back the database-generated key, which defeats statement batching.
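When batch inserts matter, a sequence-based generator can be used instead. Below is a hypothetical JPA mapping fragment (the entity name StudentScore and sequence name score_seq are invented for illustration, not taken from the ITOO code):

```java
@Entity
public class StudentScore {   // hypothetical entity name
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "score_seq")
    @SequenceGenerator(name = "score_seq", sequenceName = "score_seq",
                       allocationSize = 30) // hand out ids in blocks of 30
    private Long id;
    // ... other fields
}
```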
3. When doing bulk operations, it is recommended to turn off Hibernate's second-level cache. The Session cache is Hibernate's first-level cache, which is typically transaction-scoped: each transaction has its own first-level cache. The SessionFactory's external cache is Hibernate's second-level cache, which is application-scoped: all transactions share the same second-level cache. The first-level cache is always enabled, while the second-level cache is off by default; it can also be turned off explicitly in Hibernate's configuration file:
hibernate.cache.use_second_level_cache=false
The bulk-save code in ITOO is as follows:
```java
/**
 * Bulk save
 *
 * @param list the entities to persist
 * @return true if all entities were saved successfully
 */
public <T> boolean saveEntitys(List<T> list) {
    boolean flag = false;
    int batchSize = 30;
    int i = 0;
    getDataBaseName(list.get(0));
    try {
        for (Object entity : list) {
            getEntityManager().persist(entity);
            i++;
            // Flush and clear the persistence context every batchSize entities
            if (i % batchSize == 0) {
                getEntityManager().flush();
                getEntityManager().clear();
            }
        }
        flag = true;
    } catch (Exception e) {
        // Swallowing the exception leaves flag false; consider logging it
    }
    return flag;
}
```
In the program above, each call to flush() inserts the batch of records cached in memory into the database, and the following clear() call evicts the just-saved objects from the cache.
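To make the flush cadence concrete, here is a small self-contained Java sketch with no Hibernate dependency (the class BatchFlushDemo and method countFlushes are names invented for illustration). It counts how many flush/clear cycles the loop above would perform for a given number of entities:

```java
public class BatchFlushDemo {
    /**
     * Mirrors the flush/clear cadence of the saveEntitys loop above:
     * counts how many times the persistence context would be flushed
     * for 'total' entities with the given batch size. Any remainder
     * is flushed once more when the transaction commits.
     */
    static int countFlushes(int total, int batchSize) {
        int flushes = 0;
        int i = 0;
        for (int n = 0; n < total; n++) {
            i++;
            if (i % batchSize == 0) {
                flushes++; // the loop would call flush() and clear() here
            }
        }
        if (i % batchSize != 0) {
            flushes++; // the remaining entities go out with the commit
        }
        return flushes;
    }

    public static void main(String[] args) {
        // 100 entities, batch size 30: flushes at 30, 60, 90, plus the final 10
        System.out.println(countFlushes(100, 30)); // prints 4
    }
}
```

With batch size 30, 100 entities produce three in-loop flushes plus one final flush at commit, so the database sees four round trips instead of one hundred.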
Whenever a design includes batch import, bulk saving is involved; the goal of this post is to help your bulk saves achieve the best possible performance.
Copyright notice: this is the author's original article; please do not reproduce it without the author's permission.
"Java" ITOO project: Hibernate batch save optimization