# C#: Use the SqlBulkCopy class to batch copy data to the database
Recently, the company needed to optimize a data import. The previous approach generated an Insert statement for each row and executed it against the database, which was acceptable while the data volume was small, but as the volume grew its performance could not keep up, so we turned to the SqlBulkCopy class.
The SqlBulkCopy class can only write data to SQL Server tables. However, the data source is not limited to SQL Server; any source can be used, as long as the data can be loaded into a DataTable instance or read through an IDataReader instance.
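Because WriteToServer also accepts an IDataReader, the source rows do not even have to be loaded into memory first. Below is a minimal sketch of that variant; it is not part of the original example, and the two connection strings and the SourceTable/TargetTable names are placeholders.

```csharp
// Requires System.Data.SqlClient. Connection strings and table names are placeholders.
static void CopyViaReader(string sourceConnectionString, string targetConnectionString)
{
    using (SqlConnection source = new SqlConnection(sourceConnectionString))
    using (SqlConnection target = new SqlConnection(targetConnectionString))
    {
        source.Open();
        target.Open();

        using (SqlCommand cmd = new SqlCommand("SELECT * FROM SourceTable", source))
        using (SqlDataReader reader = cmd.ExecuteReader())
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(target))
        {
            bulkCopy.DestinationTableName = "TargetTable";
            // WriteToServer accepts an IDataReader, so rows are streamed
            // from the reader into the target table without a DataTable in between.
            bulkCopy.WriteToServer(reader);
        }
    }
}
```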
```csharp
public class Conn
{
    private static string StrConn
    {
        get
        {
            return ConfigurationManager.ConnectionStrings["StrConn"].ToString();
            // return ConfigurationManager.AppSettings["StrConn"].ToString();
        }
    }

    public static SqlConnection SqlConn
    {
        get { return new SqlConnection(StrConn); }
    }
}

public class SqlHelper
{
    public DataTable GetDataTable(string sql)
    {
        DataTable dt = new DataTable();
        SqlConnection conn = null;
        SqlDataAdapter sda = null;
        try
        {
            conn = Conn.SqlConn;
            sda = new SqlDataAdapter(sql, conn);
            conn.Open();
            sda.Fill(dt);
        }
        catch (Exception ex) { }
        finally
        {
            if (conn != null) { conn.Close(); conn.Dispose(); }
            if (sda != null) { sda.Dispose(); }
        }
        return dt;
    }

    public DataSet GetDataSet(string sql)
    {
        DataSet ds = new DataSet();
        SqlConnection conn = null;
        SqlDataAdapter sda = null;
        try
        {
            conn = Conn.SqlConn;
            sda = new SqlDataAdapter(sql, conn);
            conn.Open();
            sda.Fill(ds);
        }
        catch (Exception ex) { }
        finally
        {
            if (conn != null) { conn.Close(); conn.Dispose(); }
            if (sda != null) { sda.Dispose(); }
        }
        return ds;
    }

    /// <summary>
    /// Insert using SqlBulkCopy's internal transaction
    /// </summary>
    /// <param name="dt">source data</param>
    /// <param name="tableName">target table name</param>
    public void InsertO(DataTable dt, string tableName)
    {
        using (SqlConnection conn = Conn.SqlConn)
        {
            using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(conn.ConnectionString,
                SqlBulkCopyOptions.CheckConstraints | SqlBulkCopyOptions.Default | SqlBulkCopyOptions.UseInternalTransaction))
            {
                try
                {
                    // Set the target table name, that is, the database table name
                    sqlBulkCopy.DestinationTableName = tableName;
                    // Number of rows per batch; when this count is reached, the batch is sent to the database
                    sqlBulkCopy.BatchSize = 100000;
                    // Timeout in seconds
                    sqlBulkCopy.BulkCopyTimeout = 3600;
                    for (int i = 0; i < dt.Columns.Count; i++)
                    {
                        // Map the source data column to the target table column:
                        // the first parameter is the source column, the second is the target column
                        sqlBulkCopy.ColumnMappings.Add(dt.Columns[i].ColumnName, dt.Columns[i].ColumnName);
                    }
                    sqlBulkCopy.WriteToServer(dt);
                }
                catch (Exception) { }
            }
        }
    }

    /// <summary>
    /// Insert without using the internal transaction
    /// </summary>
    /// <param name="dt">source data</param>
    /// <param name="tableName">target table name</param>
    public void InsertT(DataTable dt, string tableName)
    {
        using (SqlConnection conn = Conn.SqlConn)
        {
            using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(conn))
            {
                try
                {
                    conn.Open();
                    // Set the target table name, that is, the database table name
                    sqlBulkCopy.DestinationTableName = tableName;
                    // Number of rows per batch; when this count is reached, the batch is sent to the database
                    sqlBulkCopy.BatchSize = 100000;
                    // Timeout in seconds
                    sqlBulkCopy.BulkCopyTimeout = 3600;
                    for (int i = 0; i < dt.Columns.Count; i++)
                    {
                        // Map the source data column to the target table column
                        sqlBulkCopy.ColumnMappings.Add(dt.Columns[i].ColumnName, dt.Columns[i].ColumnName);
                    }
                    sqlBulkCopy.WriteToServer(dt);
                }
                catch (Exception)
                {
                    conn.Close();
                    conn.Dispose();
                }
                finally
                {
                    conn.Close();
                    conn.Dispose();
                }
            }
        }
    }
}
```
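InsertO relies on SqlBulkCopyOptions.UseInternalTransaction, which wraps each batch in its own transaction, so a failure only rolls back the current batch. If the whole import must succeed or fail as one unit, the SqlBulkCopy constructor that takes an external SqlTransaction can be used instead. The following is only a sketch of that variant, not part of the original helper; it assumes the same Conn class shown above.

```csharp
// Requires System.Data and System.Data.SqlClient. Sketch only; not part of the original SqlHelper.
public void InsertWithExternalTransaction(DataTable dt, string tableName)
{
    using (SqlConnection conn = Conn.SqlConn)
    {
        conn.Open();
        using (SqlTransaction tran = conn.BeginTransaction())
        {
            try
            {
                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn, SqlBulkCopyOptions.Default, tran))
                {
                    bulkCopy.DestinationTableName = tableName;
                    bulkCopy.BatchSize = 100000;
                    bulkCopy.BulkCopyTimeout = 3600;
                    foreach (DataColumn col in dt.Columns)
                    {
                        bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                    }
                    bulkCopy.WriteToServer(dt);
                }
                // Commit only after every batch has been written successfully.
                tran.Commit();
            }
            catch
            {
                // Any failure rolls back all batches, not just the current one.
                tran.Rollback();
                throw;
            }
        }
    }
}
```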
My source data comes from an Excel import; the import itself is not covered here, and I will summarize the Excel import method in a later post. After loading the data, query the fields of the target table, adjust the field names and types of the source DataTable to match, and then call the batch insert method.
```csharp
protected void btnImport_Click(object sender, EventArgs e)
{
    try
    {
        // Get the imported data
        DataSet ds = BI.ExecleDs(savePath, "");
        if (ds != null && ds.Tables.Count > 0)
        {
            DataTable dt = ds.Tables[0];
            // Query the fields to be inserted in the target table
            string sql = "select U_No, U_Name, U_Pwd, P_Id from UserInfo";
            DataTable dt1 = sqlhelper.GetDataTable(sql);
            if (dt1 != null)
            {
                for (int i = 0; i < dt1.Columns.Count; i++)
                {
                    // Modify the field type and field name of the source data table
                    dt.Columns[i].DataType = dt1.Columns[i].DataType;
                    dt.Columns[i].ColumnMapping = dt1.Columns[i].ColumnMapping;
                    dt.Columns[i].ColumnName = dt1.Columns[i].ColumnName;
                }
                sqlhelper.InsertO(dt, "UserInfo");
            }
        }
    }
    catch (Exception ex)
    {
        throw;
    }
}
```
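To try the bulk insert without the Excel importer, the source DataTable can also be built by hand. The sketch below is only for testing and assumes the same UserInfo columns queried above (U_No, U_Name, U_Pwd, P_Id); the sample values and the int type for P_Id are made up.

```csharp
// Build a small in-memory DataTable with the same schema as the target table (assumed types).
DataTable dt = new DataTable();
dt.Columns.Add("U_No", typeof(string));
dt.Columns.Add("U_Name", typeof(string));
dt.Columns.Add("U_Pwd", typeof(string));
dt.Columns.Add("P_Id", typeof(int));

for (int i = 0; i < 1000; i++)
{
    dt.Rows.Add("No" + i, "Name" + i, "Pwd" + i, 1);
}

// Same call as in the button handler above.
new SqlHelper().InsertO(dt, "UserInfo");
```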
If the column order or column names differ between the source and the target table, you must use the ColumnMappings.Add() method to set the mapping relationship explicitly.
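For example, if the source DataTable arrives with differently named columns (the source column names below are hypothetical), each mapping has to be added explicitly instead of copying the column names one-to-one:

```csharp
// dt is the source DataTable; its column names differ from the target table in this sketch.
using (SqlConnection conn = Conn.SqlConn)
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
{
    conn.Open();
    bulkCopy.DestinationTableName = "UserInfo";
    // Source column name first, target table column name second.
    bulkCopy.ColumnMappings.Add("UserNo", "U_No");
    bulkCopy.ColumnMappings.Add("UserName", "U_Name");
    bulkCopy.ColumnMappings.Add("Password", "U_Pwd");
    bulkCopy.ColumnMappings.Add("ProjectId", "P_Id");
    bulkCopy.WriteToServer(dt);
}
```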
Reference:
http://www.cnblogs.com/zfanlong1314/archive/2013/02/05/2892998.html