Some business systems need to insert a large amount of data into the database every minute. I am developing a NetFlow management system that receives several thousand records per minute for the database. Although the volume is not huge, the system could not process it well and the database became unstable. When this many SQL statements must be inserted every minute, many people suggest batch submission, and I adopted that technique. But there was still too much data: even when only 1000 cached rows were committed at a time, each commit took a long time and the database crashed.
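To make the batching idea concrete, here is a minimal sketch of the chunking logic on its own: statements accumulate in a buffer, and every 1000 statements form one batch. The class name `BatchChunker` and the counter are my own illustration, not part of the original system; a real flush would call JDBC's `Statement.addBatch()`/`executeBatch()` instead of just counting.

```java
import java.util.ArrayList;
import java.util.List;

/** Groups SQL statements into fixed-size batches so each commit stays small. */
public class BatchChunker {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private int flushCount = 0; // number of batches handed to the database so far

    public BatchChunker(int batchSize) {
        this.batchSize = batchSize;
    }

    /** Buffer one statement; flush automatically when the buffer is full. */
    public void add(String sql) {
        buffer.add(sql);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    /** In a real system this would run Statement.addBatch()/executeBatch(). */
    public void flush() {
        if (buffer.isEmpty()) {
            return;
        }
        flushCount++;   // stand-in for the actual database round trip
        buffer.clear();
    }

    public int getFlushCount() {
        return flushCount;
    }

    public static void main(String[] args) {
        BatchChunker chunker = new BatchChunker(1000);
        for (int i = 0; i < 5006; i++) {
            chunker.add("insert into test1(test) value('" + i + "')");
        }
        chunker.flush(); // push the final partial batch of 6 statements
        System.out.println("batches flushed: " + chunker.getFlushCount()); // 6
    }
}
```

Note that the last, partial batch still needs an explicit flush, otherwise the trailing rows are silently lost.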
Later, we came up with a new solution: multi-threaded, buffered submission.
First, start 20 worker threads that wait for tasks.
Then buffer the incoming data. Once 1000 SQL statements have been cached, hand the batch to a worker thread. The hand-off does not block, so the main thread immediately goes back to buffering and waiting for more data.
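The two steps above can be sketched with `java.util.concurrent`: a fixed pool of 20 workers stands in for the waiting threads, and the main thread hands each full batch of 1000 statements to the pool without waiting for the database. The class `AsyncBatchSubmitter` and the counter are my own illustration of the pattern, not the author's actual code; a real worker would open a connection and run the batch via JDBC instead of just counting statements.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

/** Main thread buffers statements; full batches go to a worker pool so buffering never blocks. */
public class AsyncBatchSubmitter {
    private static final int BATCH_SIZE = 1000;
    private static final int WORKERS = 20; // the 20 waiting threads from the text

    private final ExecutorService pool = Executors.newFixedThreadPool(WORKERS);
    private final List<String> buffer = new ArrayList<>(BATCH_SIZE);
    private final AtomicInteger executed = new AtomicInteger(); // statements "run" by workers

    /** Called on the main thread; hands off a full batch without waiting for the database. */
    public void add(String sql) {
        buffer.add(sql);
        if (buffer.size() >= BATCH_SIZE) {
            submit();
        }
    }

    private void submit() {
        final List<String> batch = new ArrayList<>(buffer); // copy, so the main thread keeps buffering
        buffer.clear();
        pool.submit(() -> {
            // A real worker would run addBatch()/executeBatch() on a JDBC connection here.
            executed.addAndGet(batch.size());
        });
    }

    /** Flush the final partial batch, then wait for all workers to drain. */
    public int shutdownAndCount() throws InterruptedException {
        if (!buffer.isEmpty()) {
            submit();
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
        return executed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        AsyncBatchSubmitter s = new AsyncBatchSubmitter();
        for (int i = 0; i < 5006; i++) {
            s.add("insert into test1(test) value('" + i + "')");
        }
        System.out.println("statements executed: " + s.shutdownAndCount()); // 5006
    }
}
```

Copying the buffer before submission is the key design choice: it lets the main thread continue accepting data while workers drain earlier batches, which is exactly why this version does not stall the way single-threaded batch commits did.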
package com.shine.dbutil;

public class BatchExample {
    /**
     * Batch submit
     * @param args
     */
    public static void main(String[] args) {
        System.out.println("BatchExample...");
        DBUtil.getInstance().init("e:\\workspace\\javaframework2.0\\src\\com\\shine\\framework\\dbutil\\config\\dbxml.xml");
        for (int i = 0; i < 5006; i++) {
            // System.out.println(i);
            String sql = "insert into test1(test) value('" + i + "')";
            DBUtil.getInstance().addBatchUpdate("jdbc/test", sql);
        }
    }
}