When building a large web site or system, you often run into the problem of bulk INSERTs or bulk updates against the database. Handling such workloads one record at a time is too inefficient, so it is worth batching the inserts or updates instead.
Today's topic is not SqlBulkCopy, just a simple approach.
The code, found on the internet, hasn't been verified yet. Its XML doc comment describes a method that bulk-inserts multiple rows into SQL, with parameters for the name of the table to insert into, the name of the table's primary-key column, a flag that is true if the primary key is allocated automatically by the DB, the POCO objects that supply the column values to be inserted, and the number of POCOs per batch.
Projects often run into the following scenario: after you bulk-insert a batch of data into the database, you need to know which inserts succeeded and which failed. There are generally two approaches: check whether an identical record already exists before each insert and filter out the duplicates up front, or insert and check as you go, filtering dynamically. The first
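A minimal sketch of the second idea (insert and check as you go), using Python's `sqlite3` as a stand-in database. The table and column names are invented for the demo; `INSERT OR IGNORE` plus the cursor's `rowcount` tells us which rows went in and which were filtered as duplicates.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT PRIMARY KEY, name TEXT)")

rows = [("a@x.com", "Ann"), ("b@x.com", "Bob"), ("a@x.com", "Ann again")]

succeeded, failed = [], []
for row in rows:
    cur = conn.execute("INSERT OR IGNORE INTO users VALUES (?, ?)", row)
    # rowcount is 1 when the row was inserted, 0 when it was skipped
    # because the primary key already existed
    (succeeded if cur.rowcount == 1 else failed).append(row)
conn.commit()

print(len(succeeded), len(failed))
```

The same shape works with MySQL's `INSERT IGNORE`; the point is that the driver reports per-statement affected-row counts, so no separate existence query is needed.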
It is often said that bulk inserts and updates are more efficient than row-by-row operations, but how much more? I hadn't measured it, so today I test it concretely.
(1) Three ways to perform the insert
1.1 Insert row by row with a for loop
Sample XML
Sample code:
for (int i = 1; i
1.2 using JDBC
Sample code
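The JDBC sample itself is cut off in this excerpt. As a language-agnostic sketch of the same two techniques, here is a Python `sqlite3` version: the plain loop of section 1.1, then the batched call that plays the role of JDBC's `addBatch()`/`executeBatch()`. Table and column names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")

# 1.1 -- a plain for loop, one INSERT statement per row
for i in range(1, 501):
    conn.execute("INSERT INTO t VALUES (?, ?)", (i, f"row{i}"))

# 1.2 -- the batched equivalent of JDBC addBatch()/executeBatch():
# hand the driver all parameter tuples at once
rows = [(i, f"row{i}") for i in range(501, 1001)]
conn.executemany("INSERT INTO t VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)
```

With a networked database the batch version also saves one round trip per row, which is where most of the speedup comes from.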
Transferred from: http://blog.csdn.net/starywx/article/details/23268465. Some time ago the project was rushed, and we hit performance issues during development; while optimizing the code we found that operating on data row by row was slow when data volumes were large, which brought the database's batch DML operations to mind.
It also reminds
An SQLite database is essentially a disk file, so every database operation ultimately becomes a file operation, and frequent file I/O greatly slows database access. For example, suppose you insert 1 million rows and, under the default settings, only execute sqlite3_exec(db, "INSERT INTO name VALUES ('lxkxf', ...);", 0, 0, &zErrMsg); for each row. The database
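The point above is that each statement outside a transaction forces its own journal/fsync cycle, so wrapping the whole batch in one explicit transaction is dramatically faster on disk. A small sketch using Python's `sqlite3` against a throwaway temp file (row counts and column values are illustrative; the exact speedup depends on the disk):

```python
import os
import sqlite3
import tempfile
import time

path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path, isolation_level=None)  # autocommit mode
conn.execute("CREATE TABLE name (who TEXT, val INTEGER)")

N = 300

t0 = time.perf_counter()
for i in range(N):                       # one implicit transaction per INSERT
    conn.execute("INSERT INTO name VALUES ('lxkxf', ?)", (i,))
per_stmt = time.perf_counter() - t0

t0 = time.perf_counter()
conn.execute("BEGIN")                    # one transaction for the whole batch
for i in range(N):
    conn.execute("INSERT INTO name VALUES ('lxkxf', ?)", (i,))
conn.execute("COMMIT")
one_txn = time.perf_counter() - t0

total = conn.execute("SELECT COUNT(*) FROM name").fetchone()[0]
print(total, f"per-stmt {per_stmt:.3f}s, one txn {one_txn:.3f}s")
```

The same idea applies verbatim to the C API: issue `BEGIN`/`COMMIT` around the loop of `sqlite3_exec` calls.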
Recently I needed to process a batch of data: transform fields from a table in the database and export the results to a new table. But the table holds nearly 5 million rows (500w), and at that volume the processing time becomes the problem. First thought: insert one statement at a time; with this much data that takes far too long, so rule it out. Next thought: multi-threaded INSERTs; but database connections need synchronization, so that felt not ve
A while ago in an interview I was asked how to efficiently insert 100,000 records into a database. I had never dealt with a similar problem or read the relevant material, so I couldn't answer. Today I looked into it and summarized three approaches. The test database is MySQL.
Method one: public static void insert() { // record the start time: long begin = new Date().getTime(); //
Because the project needs to generate multiple records and save them to the database, the program wraps them in a List collection, and the entities in that collection then need to be inserted into the database. The project uses Spring + MyBatis, so the plan is to use a MyBatis bulk insert, which should perform better than inserting in a loop, because previously d
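MyBatis's `<foreach>` ultimately renders a single multi-row INSERT statement. A hedged sketch of that statement shape in Python with `sqlite3` (table and column names invented for the demo; MyBatis builds the same `VALUES (...),(...),...` text from the List of entities):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (id INTEGER, label TEXT)")

entities = [(i, f"label-{i}") for i in range(1, 6)]

# Build INSERT INTO item VALUES (?,?),(?,?),... -- one statement for the
# whole list, the shape <foreach> generates in the mapper XML.
placeholders = ",".join(["(?, ?)"] * len(entities))
flat = [value for row in entities for value in row]
conn.execute(f"INSERT INTO item (id, label) VALUES {placeholders}", flat)
conn.commit()

inserted = conn.execute("SELECT COUNT(*) FROM item").fetchone()[0]
print(inserted)
```

Note that most databases cap statement length or parameter count, so a very large List must still be chunked into several multi-row statements.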
The SqlBulkCopy class lives under System.Data.SqlClient. We rarely use it in development, and many don't even know it exists, yet it is much faster than plain SQL INSERTs, transactional bulk inserts, or concatenated bulk SQL.
Previous notes used JDBC and iBatis to bulk-insert data into the database. The corresponding interfaces in the Hibernate framework can also perform batch operations. Hibernate caches newly inserted data in the Session-level cache, so first set a reasonable JDBC batch size via the hibernate.jdbc.batch_size parameter in the configuration file to specify the number of
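The Hibernate pattern behind `hibernate.jdbc.batch_size` is: accumulate entities, and every N rows flush them to the database and clear the session cache so memory stays bounded. A language-agnostic sketch of that accumulate-and-flush loop in Python (`BATCH_SIZE`, table, and column names are illustrative):

```python
import sqlite3

BATCH_SIZE = 50   # analogue of hibernate.jdbc.batch_size

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ev (id INTEGER, payload TEXT)")

buffer = []

def save(row):
    """Queue a row; flush to the DB whenever the buffer reaches BATCH_SIZE."""
    buffer.append(row)
    if len(buffer) >= BATCH_SIZE:
        flush()

def flush():
    if buffer:
        conn.executemany("INSERT INTO ev VALUES (?, ?)", buffer)
        conn.commit()
        buffer.clear()   # like session.flush(); session.clear() in Hibernate

for i in range(130):
    save((i, f"p{i}"))
flush()  # don't forget the final partial batch

stored = conn.execute("SELECT COUNT(*) FROM ev").fetchone()[0]
print(stored)
```

The final `flush()` matters: without it, the last partial batch (30 rows here) never reaches the database.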
loop
  fetch cur bulk collect into l_target limit l_limit;
  exit when l_target.count = 0;
  forall i in l_target.first .. l_target.last
    insert into tmp_lbx (id, name)
    values (l_target(i).object_id, l_target(i).object_name);
  l_count := l_count + l_target.count;
  if l_count >= l_commit then
    commit;
    l_count := 0;
  end if;
end loop;
commit;
close cur;
end;
2. Exception Handling Example Fo
Now execute a simple BULK INSERT task to practice the topic just discussed. First, create an SSIS project. Rename Package.dtsx to Bulkloadzip.dtsx; if a dialog box asks whether to rename, choose Yes.
First create a folder C:\ssisdemos and put the file (/files/tylerdonet/zipcode.txt) in it. Create a database locally, using the name of a common Microsoft sample database, AdventureWorks, and use the following code to create a table:
CREATE TABLE InsertContent (
    ZipCode CHAR(5)
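Conceptually, the SSIS bulk-load task reads the delimited file and pushes its rows into the table. The real zipcode.txt layout isn't shown in this excerpt, so this Python `sqlite3` sketch invents a tiny comma-delimited sample and loads it the way a bulk-load step would:

```python
import csv
import io
import sqlite3

# Hypothetical file contents -- the actual zipcode.txt format is not shown.
sample = "00501,Holtsville\n10001,New York\n90210,Beverly Hills\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE InsertContent (ZipCode CHAR(5), City TEXT)")

rows = list(csv.reader(io.StringIO(sample)))
conn.executemany("INSERT INTO InsertContent VALUES (?, ?)", rows)
conn.commit()

loaded = conn.execute("SELECT COUNT(*) FROM InsertContent").fetchone()[0]
print(loaded)
```

In production you would stream the real file (`open("zipcode.txt")`) instead of an in-memory string, committing every few thousand rows.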
Rational use of batch inserts and updates has a great effect on performance optimization; the speedup is obvious, often severalfold (N times).
Note the new addition to the database connection string: allowMultiQueries=true, which allows one SQL string to be split by semicolons into multiple independent statements.
The maximum batch size is limited mainly by the total length of the generated SQL, so yo
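The statement shape that `allowMultiQueries=true` permits a MySQL driver to send in one round trip is one string holding several semicolon-separated statements. As a stand-in, Python's `sqlite3.executescript` accepts the same shape (table and values are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE acct (id INTEGER PRIMARY KEY, bal INTEGER)")
conn.executemany("INSERT INTO acct VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])
conn.commit()

# One string, several semicolon-separated UPDATEs -- the batch-update shape
# that allowMultiQueries=true enables on a MySQL connection.
conn.executescript("""
    UPDATE acct SET bal = bal + 1 WHERE id = 1;
    UPDATE acct SET bal = bal + 2 WHERE id = 2;
    UPDATE acct SET bal = bal + 3 WHERE id = 3;
""")

balances = [r[1] for r in conn.execute("SELECT id, bal FROM acct ORDER BY id")]
print(balances)
```

The practical caveat from the text applies: the whole string counts toward the server's statement-length limit (max_allowed_packet on MySQL), so very large batches must be split.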
You may encounter two problems when updating large amounts of data:
if each update executes its own SQL statement, performance is low and blocking is likely;
and batch updates may run into primary-key duplication.
Using ON DUPLICATE KEY UPDATE in a single SQL statement resolves both the bulk-update and the primary-key-duplication problems (with ID as the primary key).
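A minimal sketch of the upsert idea. SQLite (3.24+) spells MySQL's `INSERT ... ON DUPLICATE KEY UPDATE` as `INSERT ... ON CONFLICT ... DO UPDATE`; the table and values here are invented for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE score (id INTEGER PRIMARY KEY, pts INTEGER)")
conn.execute("INSERT INTO score VALUES (1, 100)")

batch = [(1, 150), (2, 200)]  # id 1 already exists, id 2 is new

# One statement inserts new keys and updates existing ones, so the batch
# can never fail with a primary-key duplicate error.
conn.executemany(
    "INSERT INTO score (id, pts) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET pts = excluded.pts",
    batch,
)
conn.commit()

result = conn.execute("SELECT id, pts FROM score ORDER BY id").fetchall()
print(result)
```

On MySQL the equivalent tail is `ON DUPLICATE KEY UPDATE pts = VALUES(pts)` (or `... = new.pts` with an alias on 8.0.20+).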