Inserting a single row into SQL Server uses the INSERT statement, but looping over INSERT to load a large amount of data is inefficient and creates a performance problem for SQL Server. SQL Server supports two bulk-insert mechanisms: bulk copy (SqlBulkCopy) and table-valued parameters (TVPs).
Run the following script to create the test database, the test table, and the table-valued type.
--Create the test database
CREATE DATABASE BulkTestDB;
GO
USE BulkTestDB;
GO
--Create the test table
CREATE TABLE BulkTestTable
(
    Id INT PRIMARY KEY,
    UserName NVARCHAR(32),
    Pwd VARCHAR(16)
)
GO
--Create the table-valued type
CREATE TYPE BulkUDT AS TABLE
(
    Id INT,
    UserName NVARCHAR(32),
    Pwd VARCHAR(16)
)
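As a quick sanity check before touching any client code, the new table type can be exercised directly from T-SQL. This is a minimal sketch (the variable name @rows and the sample values are illustrative, not part of the test above):

```sql
-- Declare a variable of the table type, fill it, and read it back
DECLARE @rows dbo.BulkUDT;

INSERT INTO @rows (Id, UserName, Pwd)
VALUES (1, N'User-1', 'Pwd-1'),
       (2, N'User-2', 'Pwd-2');

SELECT Id, UserName, Pwd FROM @rows;
```

If this returns the two rows, the type is in place and ready to be used as a parameter from ADO.NET.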
First, let's insert 1 million rows using the plain INSERT statement:
Stopwatch sw = new Stopwatch();
SqlConnection sqlConn = new SqlConnection(ConfigurationManager.ConnectionStrings["ConnStr"].ConnectionString); //connect to the database
SqlCommand sqlComm = new SqlCommand();
sqlComm.CommandText = string.Format("INSERT INTO BulkTestTable (Id, UserName, Pwd) VALUES (@p0, @p1, @p2)"); //parameterized SQL
sqlComm.Parameters.Add("@p0", SqlDbType.Int);
sqlComm.Parameters.Add("@p1", SqlDbType.NVarChar);
sqlComm.Parameters.Add("@p2", SqlDbType.VarChar);
sqlComm.CommandType = CommandType.Text;
sqlComm.Connection = sqlConn;
sqlConn.Open();
try
{
    //insert 1 million rows: 10 passes of 100,000 rows each
    for (int multiply = 0; multiply < 10; multiply++)
    {
        for (int count = multiply * 100000; count < (multiply + 1) * 100000; count++)
        {
            sqlComm.Parameters["@p0"].Value = count;
            sqlComm.Parameters["@p1"].Value = string.Format("User-{0}", count * multiply);
            sqlComm.Parameters["@p2"].Value = string.Format("Pwd-{0}", count * multiply);
            sw.Start();
            sqlComm.ExecuteNonQuery();
            sw.Stop();
        }
        //after every 100,000 rows, print the cumulative elapsed time
        Console.WriteLine(string.Format("Elapsed Time is {0} Milliseconds", sw.ElapsedMilliseconds));
    }
}
catch (Exception)
{
    throw;
}
finally
{
    sqlConn.Close();
}
Console.ReadLine();
The timing results were as follows:
Because this was far too slow (inserting the first 100,000 rows alone took 72,390 milliseconds), I stopped the run manually.
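Part of the cost here is that each INSERT commits individually, so the transaction log is flushed once per row. A common mitigation, not used in the test above, is to wrap each batch in an explicit transaction so the commit happens once per 100,000 rows. A minimal sketch, reusing the sqlConn and sqlComm objects from the code above:

```csharp
//one explicit transaction per 100,000-row batch instead of one implicit commit per row
using (SqlTransaction tran = sqlConn.BeginTransaction())
{
    sqlComm.Transaction = tran; //the command must be enlisted in the transaction
    for (int count = 0; count < 100000; count++)
    {
        sqlComm.Parameters["@p0"].Value = count;
        sqlComm.Parameters["@p1"].Value = string.Format("User-{0}", count);
        sqlComm.Parameters["@p2"].Value = string.Format("Pwd-{0}", count);
        sqlComm.ExecuteNonQuery();
    }
    tran.Commit(); //single log flush for the whole batch
}
```

This typically speeds up the INSERT loop noticeably, but it still executes one statement per row, so it cannot approach the two bulk methods below.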
Next, let's look at bulk copy with SqlBulkCopy:
The main idea of the bulk-copy approach is to accumulate the data in a DataTable on the client, then push it into the database in one operation with SqlBulkCopy.
The code is as follows:
public static void BulkToDB(DataTable dt)
{
    SqlConnection sqlConn = new SqlConnection(ConfigurationManager.ConnectionStrings["ConnStr"].ConnectionString);
    SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConn);
    bulkCopy.DestinationTableName = "BulkTestTable";
    bulkCopy.BatchSize = dt.Rows.Count;
    try
    {
        sqlConn.Open();
        if (dt != null && dt.Rows.Count != 0)
            bulkCopy.WriteToServer(dt);
    }
    catch (Exception)
    {
        throw;
    }
    finally
    {
        sqlConn.Close();
        if (bulkCopy != null)
            bulkCopy.Close();
    }
}

public static DataTable GetTableSchema()
{
    DataTable dt = new DataTable();
    dt.Columns.AddRange(new DataColumn[]
    {
        new DataColumn("Id", typeof(int)),
        new DataColumn("UserName", typeof(string)),
        new DataColumn("Pwd", typeof(string))
    });
    return dt;
}

static void Main(string[] args)
{
    Stopwatch sw = new Stopwatch();
    for (int multiply = 0; multiply < 10; multiply++)
    {
        DataTable dt = Bulk.GetTableSchema();
        for (int count = multiply * 100000; count < (multiply + 1) * 100000; count++)
        {
            DataRow r = dt.NewRow();
            r[0] = count;
            r[1] = string.Format("User-{0}", count * multiply);
            r[2] = string.Format("Pwd-{0}", count * multiply);
            dt.Rows.Add(r);
        }
        sw.Start();
        Bulk.BulkToDB(dt);
        sw.Stop();
        Console.WriteLine(string.Format("Elapsed Time is {0} Milliseconds", sw.ElapsedMilliseconds));
    }
    Console.ReadLine();
}
The timing results were as follows:
As you can see, efficiency improves dramatically with SqlBulkCopy: inserting just 100,000 rows with INSERT took 72,390 milliseconds, while inserting all 1 million rows with SqlBulkCopy took only 17,583 milliseconds.
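One caveat worth noting: the BulkToDB method above relies on the DataTable columns lining up with the destination table by ordinal position. When the column orders differ, SqlBulkCopy lets you map source columns to destination columns by name through its ColumnMappings collection. A minimal sketch of the same method with explicit mappings (the connStr parameter stands in for the configuration lookup used above):

```csharp
//same idea as BulkToDB, but robust to column-order differences
public static void BulkToDBMapped(DataTable dt, string connStr)
{
    using (SqlConnection sqlConn = new SqlConnection(connStr))
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConn))
    {
        bulkCopy.DestinationTableName = "BulkTestTable";
        bulkCopy.BatchSize = dt.Rows.Count;
        //map each DataTable column to its destination column by name
        bulkCopy.ColumnMappings.Add("Id", "Id");
        bulkCopy.ColumnMappings.Add("UserName", "UserName");
        bulkCopy.ColumnMappings.Add("Pwd", "Pwd");
        sqlConn.Open();
        if (dt != null && dt.Rows.Count != 0)
            bulkCopy.WriteToServer(dt);
    }
}
```

The using blocks also dispose the connection and the bulk copier automatically, which replaces the manual Close calls in the version above.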
Finally, let's look at the efficiency of table-valued parameters; the result may surprise you.
Table-valued parameters, or TVPs for short, are a feature introduced in SQL Server 2008. If you are not familiar with them, you can refer to Books Online; I plan to write a separate post about table-valued parameters, so I won't go into the concept here. On to the code:
public static void TableValuedToDB(DataTable dt)
{
    SqlConnection sqlConn = new SqlConnection(ConfigurationManager.ConnectionStrings["ConnStr"].ConnectionString);
    const string TSqlStatement =
        "INSERT INTO BulkTestTable (Id, UserName, Pwd)" +
        " SELECT nc.Id, nc.UserName, nc.Pwd" +
        " FROM @NewBulkTestTvp AS nc";
    SqlCommand cmd = new SqlCommand(TSqlStatement, sqlConn);
    SqlParameter catParam = cmd.Parameters.AddWithValue("@NewBulkTestTvp", dt);
    catParam.SqlDbType = SqlDbType.Structured;
    //the table-valued type is named BulkUDT; it was created in the test-environment SQL above
    catParam.TypeName = "dbo.BulkUDT";
    try
    {
        sqlConn.Open();
        if (dt != null && dt.Rows.Count != 0)
        {
            cmd.ExecuteNonQuery();
        }
    }
    catch (Exception)
    {
        throw;
    }
    finally
    {
        sqlConn.Close();
    }
}

public static DataTable GetTableSchema()
{
    DataTable dt = new DataTable();
    dt.Columns.AddRange(new DataColumn[]
    {
        new DataColumn("Id", typeof(int)),
        new DataColumn("UserName", typeof(string)),
        new DataColumn("Pwd", typeof(string))
    });
    return dt;
}

static void Main(string[] args)
{
    Stopwatch sw = new Stopwatch();
    for (int multiply = 0; multiply < 10; multiply++)
    {
        DataTable dt = TableValued.GetTableSchema();
        for (int count = multiply * 100000; count < (multiply + 1) * 100000; count++)
        {
            DataRow r = dt.NewRow();
            r[0] = count;
            r[1] = string.Format("User-{0}", count * multiply);
            r[2] = string.Format("Pwd-{0}", count * multiply);
            dt.Rows.Add(r);
        }
        sw.Start();
        TableValued.TableValuedToDB(dt);
        sw.Stop();
        Console.WriteLine(string.Format("Elapsed Time is {0} Milliseconds", sw.ElapsedMilliseconds));
    }
    Console.ReadLine();
}
The timing results were as follows:
That's about 5 seconds faster than SqlBulkCopy.
"ADO. net-Intermediate "Two ways to test millions data in bulk INSERT