During a recent internal training session, students had a business requirement to insert large batches of data from a DataTable into a database.
So I put together a simple demo to analyze and compare the options.
Test Environment
OS: Windows 7 Ultimate
CPU: Intel (R) Pentium (R) Dual CPU E2180 @ 2.00 GHz
RAM: 2.00 GB
Data is inserted using the following methods:
1. Insert records one at a time
2. Concatenate SQL statements and insert in batches
3. Concatenate SQL statements and use Transaction
4. Concatenate SQL statements and use SqlTransaction
5. Use DataAdapter
6. Use TransactionScope and SqlBulkCopy
7. Use table-valued parameters
The database is SQL Server. The table creation script is as follows:
Create table TestTable
(
Id int
, Name nvarchar (20)
)
The class that generates the test table structure and the test data is as follows:
public class Tools
{
    public static DataTable MakeDataTable()
    {
        DataTable table = new DataTable();
        // Build the DataTable schema
        table.Columns.Add("Id", Type.GetType("System.Int32"));
        table.Columns.Add("Name", Type.GetType("System.String"));
        // Set the primary key
        table.PrimaryKey = new DataColumn[] { table.Columns["Id"] };
        table.Columns["Id"].AutoIncrement = true;
        table.Columns["Id"].AutoIncrementSeed = 1;
        table.Columns["Id"].ReadOnly = true;
        return table;
    }

    public static void MakeData(DataTable table, int count)
    {
        if (table == null) return;
        if (count <= 0) return;

        DataRow row = null;
        for (int i = 1; i <= count; i++)
        {
            // Create a new DataRow (generate a new row)
            row = table.NewRow();
            row["Name"] = "Test" + i.ToString();
            // Add the new DataRow to the table
            table.Rows.Add(row);
        }
    }
}
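As a quick sanity check, the two helpers can be exercised like this (a minimal sketch of my own; the console loop is not part of the original project):

```csharp
using System;
using System.Data;

class ToolsDemo
{
    static void Main()
    {
        // Build the schema and fill it with 5 sample rows
        DataTable table = Tools.MakeDataTable();
        Tools.MakeData(table, 5);

        foreach (DataRow row in table.Rows)
        {
            // Id auto-increments from 1; Name is "Test1" .. "Test5"
            Console.WriteLine("{0}: {1}", row["Id"], row["Name"]);
        }
    }
}
```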
The test program uses Windows Forms. The interface is as follows:
Log4net is used to record the logs. By default, 40,000 records are inserted, one record at a time; both values can be changed on the form. System.Diagnostics.Stopwatch is used to measure the insert time. After each test, the original table is dropped and recreated.
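The drop-and-recreate step between tests can be sketched as follows. This is a minimal sketch, assuming a helper named `RecreateTestTable` and an inline connection string, neither of which appears in the original project:

```csharp
using System.Data.SqlClient;

static class TableReset
{
    // Hypothetical helper: drop and rebuild TestTable so each run starts clean
    public static void RecreateTestTable(string connectionString)
    {
        const string sql = @"
IF OBJECT_ID('TestTable', 'U') IS NOT NULL
    DROP TABLE TestTable;
CREATE TABLE TestTable
(
    Id int,
    Name nvarchar(20)
);";
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```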
The form code is as follows:
public delegate bool InsertHandler(DataTable table, int batchSize);

public partial class FrmBatch : Form
{
    private Stopwatch _watch = new Stopwatch();

    public FrmBatch()
    {
        InitializeComponent();
    }

    private void FrmBatch_Load(object sender, EventArgs e)
    {
        txtRecordCount.Text = "40000";
        txtBatchSize.Text = "1";
    }

    // Insert records one at a time
    private void btnInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteInsert, "Use SqlServer Insert");
    }

    // Concatenate SQL statements and insert in batches
    private void btnBatchInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteBatchInsert, "Use SqlServer Batch Insert");
    }

    // Concatenate SQL statements and use Transaction
    private void btnTransactionInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteTransactionInsert, "Use SqlServer Batch Transaction Insert");
    }

    // Concatenate SQL statements and use SqlTransaction
    private void btnSqlTransactionInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteSqlTransactionInsert, "Use SqlServer Batch SqlTransaction Insert");
    }

    // Use DataAdapter
    private void btnDataAdapterInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteDataAdapterInsert, "Use SqlServer DataAdapter Insert");
    }

    // Use TransactionScope and SqlBulkCopy
    private void btnTransactionScopeInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteTransactionScopeInsert, "Use SqlServer TransactionScope Insert");
    }

    // Use table-valued parameters
    private void btnTableTypeInsert_Click(object sender, EventArgs e)
    {
        Insert(DbOperation.ExecuteTableTypeInsert, "Use SqlServer TableType Insert");
    }

    private DataTable InitDataTable()
    {
        DataTable table = Tools.MakeDataTable();
        int count = 0;
        if (int.TryParse(txtRecordCount.Text.Trim(), out count))
        {
            Tools.MakeData(table, count);
            // MessageBox.Show("Data Init OK");
        }
        return table;
    }

    public void Insert(InsertHandler handler, string msg)
    {
        DataTable table = InitDataTable();
        if (table == null)
        {
            MessageBox.Show("DataTable is null");
            return;
        }

        int recordCount = table.Rows.Count;
        if (recordCount <= 0)
        {
            MessageBox.Show("No Data");
            return;
        }

        int batchSize = 0;
        int.TryParse(txtBatchSize.Text.Trim(), out batchSize);
        if (batchSize <= 0)
        {
            MessageBox.Show("batchSize <= 0");
            return;
        }

        bool result = false;
        _watch.Reset();
        _watch.Start();
        result = handler(table, batchSize);
        _watch.Stop();

        string log = string.Format("{0}; RecordCount: {1}; BatchSize: {2}; Time: {3};",
            msg, recordCount, batchSize, _watch.ElapsedMilliseconds);
        LogHelper.Info(log);
        MessageBox.Show(result.ToString());
    }
}
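The DbOperation class itself is covered in the later parts of this series. As a hedged sketch of what the simplest handler (one INSERT per record) might look like, matching the InsertHandler delegate signature — the connection string and the body below are my assumptions, not the original implementation:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class DbOperation
{
    // Hypothetical connection string; the real project would read it from configuration
    private const string ConnectionString =
        @"Data Source=.;Initial Catalog=Test;Integrated Security=True";

    // Insert one record at a time; batchSize is unused in this simplest variant
    public static bool ExecuteInsert(DataTable table, int batchSize)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "INSERT INTO TestTable (Id, Name) VALUES (@Id, @Name)", connection))
        {
            command.Parameters.Add("@Id", SqlDbType.Int);
            command.Parameters.Add("@Name", SqlDbType.NVarChar, 20);
            connection.Open();
            foreach (DataRow row in table.Rows)
            {
                // One round trip per row -- the slowest approach in the comparison
                command.Parameters["@Id"].Value = row["Id"];
                command.Parameters["@Name"].Value = row["Name"];
                command.ExecuteNonQuery();
            }
        }
        return true;
    }
}
```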
Links to the full series:
.NET Bulk Data Insert Performance Analysis and Comparison (1. Preparation)
.NET Bulk Data Insert Performance Analysis and Comparison (2. Plain Insert and Batch Concatenated-SQL Insert)
.NET Bulk Data Insert Performance Analysis and Comparison (3. Using Transactions)
.NET Bulk Data Insert Performance Analysis and Comparison (4. Batch Insert Using DataAdapter)
.NET Bulk Data Insert Performance Analysis and Comparison (5. Using SqlBulkCopy)
.NET Bulk Data Insert Performance Analysis and Comparison (6. Using Table-Valued Parameters)