MSSQL Batch Write data scheme

Source: Internet
Author: User
Tags: server, memory

A recent project required writing data to the database in batches. I found a similar implementation in an earlier project of mine, and while refactoring it I organized the relevant approaches into this short summary for easy reference.

    1. Loop write (simple and crude; I only did this for my graduation project) (not recommended)
    2. Bulk copy write (recommended for one-time writes of large batches, e.g. 1000k records)
    3. Table-valued parameter (TVP) write (new in MSSQL 2008) (highly recommended)

SQL Server 2008 introduced table-valued parameters. Before that, passing multiple rows of data to a stored procedure or a parameterized SQL command generally required one of the following methods:

    1. Use a series of individual parameters to represent the values of multiple columns and rows. This is limited by the number of allowed parameters: a SQL Server command can have at most 2,100. The server must then assemble these parameters into a temporary table or table variable for subsequent processing.
    2. Concatenate the rows into a delimited string, or serialize them into an XML string, and pass that string to the server, which parses it back into rows.
    3. Wrap multiple INSERT statements into a single batch. This approach, like the batched implementation of SqlDataAdapter.Update, reduces the number of round trips, but each statement is still executed individually on the server.
    4. Use the bcp utility or the SqlBulkCopy object to load many rows into a table. This technique is very efficient, but it does not support server-side processing unless the data is first loaded into a temporary table or table variable.
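As a concrete illustration of method 2, here is a minimal sketch of the old XML-string approach. It assumes a Student(Name, Age) table like the one used in the tests below, and parses the XML on the server with the XML type's built-in nodes()/value() methods:

```sql
-- Sketch of the pre-2008 XML approach (method 2 above).
-- The client would pass the XML string as a single parameter;
-- here it is declared inline for illustration.
DECLARE @StudentXml XML =
    N'<Students><S Name="Tom" Age="20"/><S Name="Ann" Age="21"/></Students>';

INSERT INTO Student (Name, Age)
SELECT s.value('@Name', 'NVARCHAR(50)'),  -- shred each <S> element back into a row
       s.value('@Age',  'INT')
FROM @StudentXml.nodes('/Students/S') AS T(s);
```

The server pays the cost of parsing the XML on every call, which is part of why this approach loses to the newer options discussed below.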

Programme I: Loop write

This is the method everyone meets first when learning, so it needs no special explanation. Straight to the code and test data:

// Requires: using System.Data; using System.Data.SqlClient;
public static void NormalInsertData(DataTable dt)
{
    using (var sqlConn = new SqlConnection(_testDataConnectionString))
    {
        var sql = "INSERT INTO Student (Name, Age) VALUES (@Name, @Age)";
        using (var cmd = new SqlCommand(sql, sqlConn))
        {
            sqlConn.Open();
            cmd.Parameters.Add("@Name", SqlDbType.NVarChar, 50); // length garbled in the source; 50 assumed
            cmd.Parameters.Add("@Age", SqlDbType.Int);
            for (int i = 0; i < dt.Rows.Count; i++)
            {
                cmd.Parameters["@Name"].Value = dt.Rows[i]["Name"];
                cmd.Parameters["@Age"].Value = dt.Rows[i]["Age"];
                cmd.ExecuteNonQuery(); // one round trip per row -- the source of the slowness
            }
        }
    }
}

Test one: 10k rows per write, 10 writes (100k rows total): 15,329 ms

Test two: 100k rows per write, 10 writes (1000k rows total): 184,395 ms

Programme II: Bulk copy write

The savior of early batch writing; for bulk writes I would rate it four stars. Test data below:

// Requires: using System.Data; using System.Data.SqlClient;
public static void BulkInsertData(DataTable dt)
{
    using (var sqlConn = new SqlConnection(_testDataConnectionString))
    {
        using (var bulkCopy = new SqlBulkCopy(sqlConn)
        {
            DestinationTableName = "Student",
            BatchSize = dt.Rows.Count
        })
        {
            sqlConn.Open();
            bulkCopy.WriteToServer(dt);
        }
    }
}

Test one: 10k rows per write, 10 writes (100k rows total): 1,848 ms

Test two: 100k rows per write, 10 writes (1000k rows total): 21,584 ms
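One caveat about the BulkInsertData method above: with no explicit mappings, SqlBulkCopy maps columns by ordinal position, so it silently depends on the DataTable's column order matching the target table. If that is not guaranteed, explicit column mappings make the copy robust. A sketch, reusing the sqlConn and dt variables from the method above:

```csharp
// Hedged sketch: explicit column mappings for SqlBulkCopy, so the copy
// does not depend on the DataTable's column order matching the table's.
using (var bulkCopy = new SqlBulkCopy(sqlConn)
{
    DestinationTableName = "Student",
    BatchSize = dt.Rows.Count
})
{
    bulkCopy.ColumnMappings.Add("Name", "Name"); // source column -> destination column
    bulkCopy.ColumnMappings.Add("Age", "Age");
    bulkCopy.WriteToServer(dt);
}
```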

Programme III: Table-valued parameter write

With table-valued parameters, a client application can wrap rows of data and send them to the server in a single parameterized command. The incoming rows are stored in a table variable, which you can then manipulate with Transact-SQL.

Column values in a table-valued parameter can be accessed with standard Transact-SQL SELECT statements.

Note: a table-valued parameter can only be an input parameter; it cannot be used as an output parameter.
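If you prefer a stored procedure over inline SQL, note that a table-valued parameter must be declared READONLY in the procedure signature. A sketch (the procedure name is illustrative), assuming a StudentUDT table type like the one created in the next section:

```sql
-- Sketch: consuming a table-valued parameter inside a stored procedure.
-- The READONLY keyword is mandatory for table-valued parameters.
CREATE PROCEDURE dbo.InsertStudents
    @StudentTVPS StudentUDT READONLY
AS
BEGIN
    INSERT INTO Student (Name, Age)
    SELECT Name, Age FROM @StudentTVPS;
END
```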

The following are relevant implementations:

1. Create the target table and the table-valued parameter type (UDT)

USE Test
GO

-- Create the target table
CREATE TABLE Student (
    Id   INT IDENTITY(1,1) PRIMARY KEY,
    Name NVARCHAR(50),  -- length garbled in the source; 50 assumed
    Age  INT
)
GO

-- Create the table-valued parameter type
CREATE TYPE StudentUDT AS TABLE (
    Name NVARCHAR(50),
    Age  INT
)
GO
2. Write through the table-valued parameter from C#

// Requires: using System.Data; using System.Data.SqlClient;
public static void TableParameterInsertData(DataTable dt)
{
    using (var sqlConn = new SqlConnection(_testDataConnectionString))
    {
        // The table-valued parameter can be queried directly in the statement
        var sql = "INSERT INTO Student (Name, Age) SELECT Name, Age FROM @StudentTVPS";
        using (var cmd = new SqlCommand(sql, sqlConn))
        {
            var catParam = cmd.Parameters.AddWithValue("@StudentTVPS", dt);
            catParam.SqlDbType = SqlDbType.Structured;
            catParam.TypeName = "StudentUDT"; // our custom table-valued parameter type
            sqlConn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

Test one: 10k rows per write, 10 writes (100k rows total): 390 ms

Test two: 100k rows per write, 10 writes (1000k rows total): 4,451 ms

Finally, comparing the three schemes side by side:

In my local tests, the loop write took roughly 9 times as long as bulk copy and roughly 42 times as long as TVP (Normal = 9 × Bulk = 42 × TVP).

I also tested bulk copy and TVP with a single large one-shot write: writing 100k rows at once, the two schemes were basically tied at about 490 ms.

The gap reopens when writing 1000k rows at once, with bulk copy pulling ahead of TVP by roughly 1.5×.

The test data show that bulk copy still has a clear advantage for one-time, very large writes; after all, Microsoft built it specifically for that job.

However, when the data is written in chunks, the TVP approach comes out ahead on total elapsed time.

So, for scenarios that write a very large amount of data in one go, use bulk copy directly; it will do the job efficiently.

For ordinary business batch-write scenarios, don't hesitate to use TVP. Compared with the older XML-parameter approach or wrapping multiple statements into one batch, it is an order-of-magnitude improvement.

It is absolutely worth having.

Given variations in client hardware, these tests are not very precise, so the numbers above are for reference only.

You are welcome to share and discuss.

Appendix, local test hardware: Intel i7-4770, 128 GB SSD, 8 GB RAM.
