Extreme Challenge: Importing 1 million rows into a SQL Server database with C# in only 4 seconds (with source code)

Tags: bulk insert, SQL 2008

In practice, you sometimes need to import a large amount of data into a database and then use it for various calculations. In this experiment, we complete that task with five different methods and record in detail the time each one takes.

The tools used in this experiment are VS2008, SQL Server 2000, and SQL Server 2008. One million rows are imported into SQL 2000 and SQL 2008 using each of the five methods. The test environment is a Dell 2850 server with a 2.0 GHz CPU and 2 GB of memory. If you are interested, you can download the source code and verify the timings yourself.

Note that SQL statements in this experiment are executed through the DataProvider module of the IsLine FrameWork. This module only reads and encapsulates the SQL configuration and does not affect the final results. For details about the IsLine FrameWork, see the "IsLine FrameWork" series of articles.

Next, let's get to the point: use basic Insert statements, Bulk Insert statements, Bulk Insert on multiple threads, SqlBulkCopy, and SqlBulkCopy on multiple threads to challenge the 4-second limit.

For the databases, SQL Server 2000 and SQL Server 2008 are used. The table name is TableB and the field name is Value1. You can modify the database name in App.config; the default is test.

 

Figure 1: The five methods in the test

 

Method 1. Use basic Insert statements

This method is the most basic one, and the first that most people will think of. However, the Insert statement does not seem well suited to large-scale operations. Is that true?

In this method, the 1 million rows are divided into 10 batches of 100,000 rows each, with one transaction per 100,000 rows, so the data is imported into the database in 10 rounds.

Basic statement: Insert Into TableB (Value1) values ('" + i + "')

Note: i in the statement is a loop counter in the host program, used to fill the value of the database field.
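As a sketch, the batched Insert loop might look like the following C#. The connection string and type names are illustrative, not taken from the original source, and a parameterized command is used here instead of string concatenation:

```csharp
using System;
using System.Data.SqlClient;

class BasicInsertDemo
{
    static void Main()
    {
        // Sketch: 1,000,000 rows in 10 batches of 100,000, one transaction per batch.
        // The connection string is a placeholder; adjust it for your environment.
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=test;Integrated Security=true"))
        {
            conn.Open();
            for (int batch = 0; batch < 10; batch++)
            {
                using (SqlTransaction tx = conn.BeginTransaction())
                using (var cmd = new SqlCommand(
                    "Insert Into TableB (Value1) values (@v)", conn, tx))
                {
                    cmd.Parameters.Add("@v", System.Data.SqlDbType.VarChar, 50);
                    for (int i = batch * 100000; i < (batch + 1) * 100000; i++)
                    {
                        cmd.Parameters["@v"].Value = i.ToString();
                        cmd.ExecuteNonQuery();  // one row per statement, 100,000 per commit
                    }
                    tx.Commit();
                }
            }
        }
    }
}
```

Issuing one round trip per row is exactly why this method is so much slower than the bulk approaches that follow.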

SQL 2000 elapsed time: 901,599 ms

SQL 2008 elapsed time: 497,638 ms

Method 2. Use the Bulk Insert statement

This method produced the most satisfying results in the experiment: it is easy to use, flexible, and fast.

The Bulk Insert statement does not seem to be very widely used. Aicken has heard that Oracle can map an external file to a temporary Oracle table and then import the data directly from the temporary table into other Oracle tables, with very satisfying speed. Is SQL Server's Bulk Insert equally satisfying?

Basic statement: Bulk Insert TableB FROM 'c:\sql.txt' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '|', BATCHSIZE = 100000)

Note: "c:\sql.txt" is a pre-generated file containing the data. Rows are separated by the "|" symbol, and every 100,000 rows form one transaction (BATCHSIZE = 100000).
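The statement can be issued from the host program like any other command. A minimal sketch, with a placeholder connection string and file path:

```csharp
using System.Data.SqlClient;

class BulkInsertDemo
{
    static void Main()
    {
        // Sketch: issuing the Bulk Insert statement from the host program.
        // Connection string and file path are placeholders for illustration.
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=test;Integrated Security=true"))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                @"Bulk Insert TableB FROM 'c:\sql.txt' " +
                "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '|', BATCHSIZE = 100000)",
                conn))
            {
                cmd.CommandTimeout = 0;  // no timeout; let the import run to completion
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

Note that the file path is resolved on the database server, not on the client machine, so the file must be reachable from the server.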

SQL 2000 elapsed time: 4,009 ms

SQL 2008 elapsed time: 10,722 ms

Method 3. Use Bulk Insert on multiple threads

Building on Method 2, the 1 million rows are divided among five threads. Each thread is responsible for 200,000 rows, committed in batches of 50,000 rows each, and all five threads are started at the same time. Let's see the effect.
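The threading pattern can be sketched as follows. The file names, connection string, and data split are illustrative assumptions; only the pattern of five concurrent Bulk Insert commands is the point:

```csharp
using System.Data.SqlClient;
using System.Threading;

class MultiThreadBulkInsertDemo
{
    static void Main()
    {
        // Sketch: five threads, each running a Bulk Insert over its own part
        // of the data, on its own connection.
        var threads = new Thread[5];
        for (int t = 0; t < 5; t++)
        {
            int part = t;  // capture the loop variable for the closure
            threads[t] = new Thread(delegate()
            {
                using (var conn = new SqlConnection(
                    "Data Source=.;Initial Catalog=test;Integrated Security=true"))
                {
                    conn.Open();
                    using (var cmd = new SqlCommand(
                        "Bulk Insert TableB FROM 'c:\\sql_part" + part + ".txt' " +
                        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '|', BATCHSIZE = 50000)",
                        conn))
                    {
                        cmd.CommandTimeout = 0;
                        cmd.ExecuteNonQuery();
                    }
                }
            });
            threads[t].Start();
        }
        foreach (Thread th in threads) th.Join();  // wait for all five imports
    }
}
```

Because all five threads write to the same table, they can contend for locks on TableB, which is one plausible reason the multi-threaded numbers below are not better than the single-threaded ones.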

SQL 2000 elapsed time: 21,099 ms

SQL 2008 elapsed time: 10,997 ms

Method 4. Use the SqlBulkCopy class

This method is fast, but it depends on memory: for complex data with tens of millions of rows and many fields, memory consumption can be very high. However, a 64-bit solution can alleviate this problem.

Tens of millions of rows with many fields do occur in real business scenarios. For example, when calculating the consumption amounts of global consumers over a business cycle, you first need to take a snapshot of the member consumption records in the primary database table and store that snapshot in a temporary table for the computing program to use. Furthermore, the consumption data is sometimes not on a single database server but comes from multiple servers in multiple countries. In that case the data must be staged in memory or on external storage, then cleaned, merged, and checked before finally being imported into the dedicated table for the computing program.

Basic statement:

Code

    using (System.Data.SqlClient.SqlBulkCopy sqlBC = new System.Data.SqlClient.SqlBulkCopy(conn))
    {
        sqlBC.BatchSize = 100000;
        sqlBC.BulkCopyTimeout = 60;
        sqlBC.DestinationTableName = "dbo.TableB";
        sqlBC.ColumnMappings.Add("valueA", "Value1");
        sqlBC.WriteToServer(dt);
    }

Note:

BatchSize = 100000; commits a transaction every 100,000 rows.

BulkCopyTimeout = 60; sets a 60-second timeout for the operation.

DestinationTableName = "dbo.TableB"; imports the data into TableB.

ColumnMappings.Add("valueA", "Value1"); maps the valueA column in memory to the Value1 field in TableB.

WriteToServer(dt); writes the data to the database. dt is a pre-built DataTable that contains the valueA column.
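The dt passed to WriteToServer is an ordinary in-memory DataTable. A minimal sketch of how it could be built (the row contents are an illustrative assumption; only the valueA column name is taken from the code above):

```csharp
using System.Data;

class BuildDataTableDemo
{
    static void Main()
    {
        // Sketch: building the DataTable that WriteToServer(dt) consumes.
        // The column name valueA matches the ColumnMappings call; the
        // 1,000,000 generated rows are for illustration.
        DataTable dt = new DataTable();
        dt.Columns.Add("valueA", typeof(string));
        for (int i = 0; i < 1000000; i++)
        {
            dt.Rows.Add(i.ToString());
        }
    }
}
```

This is where the memory dependence mentioned above comes from: the entire table must be materialized in memory before SqlBulkCopy streams it to the server.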

SQL 2000 elapsed time: 4,989 ms

SQL 2008 elapsed time: 10,412 ms

Method 5. Use the SqlBulkCopy class on multiple threads

Building on Method 4, the 1 million rows are divided among five threads. Each thread is responsible for 200,000 rows, committed in batches of 50,000 rows each, and all five threads are started at the same time. Let's see the effect.

SQL 2000 elapsed time: 7,682 ms

SQL 2008 elapsed time: 10,870 ms

Summary (elapsed times in milliseconds)

Test Method               | SQL 2000 | SQL 2008
--------------------------|----------|---------
Basic Insert              |  901,599 |  497,638
Single-thread Bulk Insert |    4,209 |   10,722
Multi-thread Bulk Insert  |   21,099 |   10,997
Single-thread SqlBulkCopy |    4,989 |   10,412
Multi-thread SqlBulkCopy  |    7,682 |   10,870

Those are the results of the experiments over the past few days. What is disappointing is that SQL Server 2008's performance does not seem to be as good as we expected.

In addition, if you download the source code, please help me figure out why the multi-threaded versions perform worse than the single-threaded ones. Is it improper use of static resources, or something else?

Source Code address: http://files.cnblogs.com/isline/sqltest.rar

Test data: http://files.cnblogs.com/isline/Data.rar

I'm Li Ming (Aicken). See you in my next article.
