During project development there was a feature that bulk-inserts data into the database. At first I didn't think much about it and simply looped over the records, inserting them into the database one at a time. With small amounts of data the efficiency was acceptable, but once the volume reached the thousands or even tens of thousands, this approach became far too slow: the user has to wait for every insert to finish, which makes for a terrible experience. So I decisively dropped that method.
The next approach was to concatenate a multi-row INSERT string, i.e. INSERT INTO TableName VALUES ('...', '...', '...'), ('...', '...', '...'), .... At first I was quite pleased: the inserts were fast and efficient. But once the data ran into the thousands, a problem appeared. SQL Server raised an error: "The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values." So a single INSERT can only carry 1000 rows at a time — what a trap! To keep using this method I would have had to loop over the data and insert 1000 rows per statement, adding various checks along the way, which was cumbersome enough that I lazily gave up on it too.
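For reference, the 1000-rows-per-statement workaround described above could be sketched like this. The table name, column names, and row shape are assumptions for illustration; real code should use parameters rather than string concatenation to avoid SQL injection:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using System.Text;

public static class ChunkedInsert
{
    // SQL Server allows at most 1000 row value expressions in one
    // INSERT ... VALUES statement, so split the data into chunks.
    private const int MaxRowsPerInsert = 1000;

    public static void Insert(string connectionString, List<string[]> rows)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            for (int offset = 0; offset < rows.Count; offset += MaxRowsPerInsert)
            {
                var chunk = rows.Skip(offset).Take(MaxRowsPerInsert);

                // Build one multi-row INSERT for this chunk.
                // (Illustrative only: concatenating values like this is
                // vulnerable to SQL injection; prefer parameterized SQL.)
                var sb = new StringBuilder(
                    "INSERT INTO TableName (Col1, Col2, Col3) VALUES ");
                sb.Append(string.Join(", ", chunk.Select(
                    r => $"('{r[0]}', '{r[1]}', '{r[2]}')")));

                using (var cmd = new SqlCommand(sb.ToString(), conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
```

This works, but as the post notes, the chunking and bookkeeping make it clumsy compared to what follows.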
That led me to a third method: using the SqlBulkCopy class to insert data in bulk. After reading up on it I found this method is seriously impressive — reportedly millions of rows can be inserted in a little over ten seconds, sometimes just a few seconds. That kind of efficiency is through the roof! So I decisively switched to it. Below is a simple example of using the SqlBulkCopy class:
public void BatchInsert(List<T> list)
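The snippet above is truncated in the source, so here is a minimal sketch of the typical SqlBulkCopy pattern. It assumes the data has already been loaded into a DataTable; the destination table name, batch size, and timeout are placeholder values, not from the original:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class BulkInserter
{
    // Bulk-copies every row of a DataTable into the destination table.
    // "TableName" and the connection string are placeholders.
    public static void BatchInsert(string connectionString, DataTable table)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "TableName";
            bulkCopy.BatchSize = 10000;      // rows sent per round trip
            bulkCopy.BulkCopyTimeout = 60;   // seconds before timing out

            // Map source columns to destination columns by name, so the
            // copy does not depend on column ordering.
            foreach (DataColumn col in table.Columns)
                bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);

            bulkCopy.WriteToServer(table);
        }
    }
}
```

WriteToServer streams the rows to SQL Server using the bulk-copy protocol, which is what makes it so much faster than issuing individual INSERT statements.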
This method is not only efficient but also flexible about the data source: as long as the data is loaded into a DataTable instance or a DataRow array, it can be bulk-inserted into the database.
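Since WriteToServer accepts a DataTable (or a DataRow array), an in-memory list first has to be converted. One common way — an assumption on my part, not from the original post — is to build the table via reflection over the item type's public properties:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

public static class DataTableConverter
{
    // Builds a DataTable whose columns mirror T's public properties,
    // so the result can be passed straight to SqlBulkCopy.WriteToServer.
    public static DataTable ToDataTable<T>(IEnumerable<T> items)
    {
        var table = new DataTable(typeof(T).Name);
        PropertyInfo[] props =
            typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);

        // One column per property; unwrap Nullable<T> so DataTable
        // accepts the underlying type.
        foreach (var p in props)
            table.Columns.Add(p.Name,
                Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

        foreach (var item in items)
        {
            var row = table.NewRow();
            foreach (var p in props)
                row[p.Name] = p.GetValue(item, null) ?? DBNull.Value;
            table.Rows.Add(row);
        }
        return table;
    }
}
```

With a helper like this, bulk-inserting a List&lt;T&gt; is just a conversion followed by one WriteToServer call.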
And with that, the batch-insertion problem was happily solved — no more worrying about insertion efficiency even with tens of thousands of rows~~~
Summary of issues encountered when batch-importing data into a database