Hibernate manipulates the database in an object-oriented way: when a program operates on persistent objects, those operations are automatically translated into database operations. For example, if we call the Session's delete() method on a persistent object, Hibernate deletes the corresponding data record; when we call a setter on a persistent object, Hibernate generates the underlying UPDATE statement that modifies the corresponding record in the database.
The problem is this: if we need to update 100,000 records at once, must we load all 100,000 records one by one and then call their setters in sequence? That is not only cumbersome but also gives poor data-access performance. For such batch-processing scenarios, Hibernate provides batch-processing solutions. The following describes how to handle this situation for batch insert, batch update, and batch delete.
Batch insert
If you need to insert 100,000 records into the database, a Hibernate program might naively do the following:
```java
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
// Loop 100,000 times to insert 100,000 records
for (int i = 0; i < 100000; i++)
{
    User u = new User(...);
    session.save(u);
}
tx.commit();
session.close();
```
However, as this program runs, it always fails at some point by throwing an OutOfMemoryError (memory overflow). This is because a Hibernate Session maintains a mandatory first-level cache, so all of the User instances accumulate in the Session-level cache.
To solve this problem, the idea is very simple: periodically flush the Session's cached data to the database, rather than letting everything accumulate in the Session-level cache. One approach is to use a counter: each time a User instance is saved, the counter is incremented by 1, and its value determines when the Session's cached data is flushed into the database.
The following is a code snippet that adds 100,000 User instances.
Program listing: codes\06\6.3\batchInsert\src\lee\UserManager.java
```java
private void addUsers() throws Exception
{
    // Open the Session
    Session session = HibernateUtil.currentSession();
    // Begin the transaction
    Transaction tx = session.beginTransaction();
    // Loop 100,000 times to insert 100,000 records
    for (int i = 0; i < 100000; i++)
    {
        // Create a User instance
        User u1 = new User();
        u1.setName("xxxxx" + i);
        u1.setAge(i);
        u1.setNationality("china");
        // Cache the User instance at the Session level
        session.save(u1);
        // When the counter is a multiple of 20, flush the Session's
        // cached data to the database and clear the Session cache.
        if (i % 20 == 0)
        {
            session.flush();
            session.clear();
        }
    }
    // Commit the transaction
    tx.commit();
    // Close the Session
    HibernateUtil.closeSession();
}
```
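The flush-every-20 logic above can be checked in isolation, independently of Hibernate. In the following standalone sketch, the hypothetical `flushBatch` method stands in for `session.flush()` plus `session.clear()`, and the loop is scaled down from 100,000 to 100 records for the demonstration:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCounterDemo {
    // Counts how many times the cache was flushed to the "database"
    static int flushCount = 0;

    // Stand-in for session.flush() + session.clear(): push pending
    // records to the database, then drop them from the cache.
    static void flushBatch(List<String> cache) {
        flushCount++;
        cache.clear();
    }

    public static void main(String[] args) {
        flushCount = 0;
        List<String> sessionCache = new ArrayList<>();
        int total = 100;     // scaled down from 100,000 for the demo
        for (int i = 0; i < total; i++) {
            sessionCache.add("user" + i);   // stands in for session.save(u1)
            if (i % 20 == 0) {              // same condition as addUsers()
                flushBatch(sessionCache);
            }
        }
        flushBatch(sessionCache);           // flush the remaining records
        System.out.println("flushes=" + flushCount
                + " pending=" + sessionCache.size());
    }
}
```

Note one quirk carried over from the original code: `i % 20 == 0` is also true at `i == 0`, so the very first flush covers a single record; testing `(i + 1) % 20 == 0` instead would produce uniform batches of 20.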
In the code above, whenever i % 20 == 0, the data cached in the Session is manually flushed to the database and the Session cache is cleared. Besides handling the Session-level cache, you should also disable the SessionFactory's second-level cache through the following configuration:
hibernate.cache.use_second_level_cache=false
Note: In addition to manually clearing the Session-level cache, it is best to disable the SessionFactory-level second-level cache. Otherwise, even if the Session-level cache is flushed manually, an exception may still occur because of entities held in the SessionFactory's second-level cache. For more information about level-2 caching, see the content that follows.
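In an XML-based configuration, the setting above might look like the following sketch of a hibernate.cfg.xml fragment. The `hibernate.jdbc.batch_size` property is an additional, optional Hibernate setting (not mentioned in the original text) that tells Hibernate how many statements to group into one JDBC batch; the value 20 here is illustrative, chosen to match the flush interval used in addUsers():

```xml
<!-- Fragment of the session-factory section of hibernate.cfg.xml; a sketch,
     not a complete configuration file. -->
<property name="hibernate.cache.use_second_level_cache">false</property>
<!-- Optional: send inserts to the database in JDBC batches of 20,
     matching the flush interval in addUsers(). -->
<property name="hibernate.jdbc.batch_size">20</property>
```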
This article is from the "Crazy Java Li Gang" blog; please keep this source: http://javaligang.blog.51cto.com/5026500/910675