In practice, you sometimes need to import a batch of data into the database, often originating from another data source. The common approach is to read the dataset, loop over it, build an INSERT INTO statement for each row, and execute it. Anyone who has done this knows how inefficient it is. Is there a better way? Yes: use the System.Data.SqlClient.SqlBulkCopy class. I believe this is also what SQL Server's own import tooling uses under the hood. The code is as follows:
-----------------------------------------------I'm a split line for the code-----------------------------------------------------
private void InsertData(DataTable dt)
{
    if (dt.Rows.Count > 0)
    {
        // If the table has an identity (auto-increment) column, add that column
        // here and give the first row an initial value. Be careful: I have not
        // tested this, but a duplicate initial value should raise an error.
        dt.Columns.Add("Innerid", Type.GetType("System.Int32"));
        dt.Rows[0]["Innerid"] = 1;

        // connStr is the connection string.
        using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(connStr, SqlBulkCopyOptions.UseInternalTransaction))
        {
            sqlBulkCopy.DestinationTableName = "TableName"; // the table name in the database
            sqlBulkCopy.WriteToServer(dt);
        }
    }
}
-----------------------------------------------I'm a split line for the code-----------------------------------------------------
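For completeness, here is a slightly fuller sketch of the same technique. The connection string, table name, and the BatchSize/BulkCopyTimeout values below are hypothetical placeholders, and since it needs a live SQL Server it is a sketch rather than a drop-in solution:

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkImportDemo
{
    // Hypothetical connection string; replace with your own.
    const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true";

    static void BulkInsert(DataTable dt)
    {
        using (SqlBulkCopy bulk = new SqlBulkCopy(ConnStr, SqlBulkCopyOptions.UseInternalTransaction))
        {
            bulk.DestinationTableName = "TableName"; // target table (hypothetical)
            bulk.BatchSize = 5000;                   // rows sent per round trip
            bulk.BulkCopyTimeout = 60;               // seconds before the copy times out

            // Explicit column mappings avoid surprises when the DataTable's
            // column order differs from the destination table's.
            foreach (DataColumn col in dt.Columns)
                bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);

            bulk.WriteToServer(dt);
        }
    }
}
```

Wrapping the SqlBulkCopy in a using block disposes it (and its internal connection) even if the copy throws.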
This approach is highly efficient, and the difference becomes dramatic as the amount of data (mainly the number of rows) grows.
It feels like my first ten years of programming were wasted: I only just discovered this simple, efficient approach.