A few days ago, my company asked me to write a data-import program: read large volumes of data from Excel and load it into the database, touching the database as few times as possible while keeping insert performance high. After searching online, I found a good solution: use SqlBulkCopy to store the data. SqlBulkCopy is extremely efficient at storing large batches of data. As its name suggests, it writes an in-memory data table to the database in one pass, instead of inserting rows one at a time. In a first test, a table with millions of rows was fully stored in the database in a matter of seconds, many times faster than the traditional INSERT approach. Below, I'll describe its usage with code.
    /// <summary>
    /// Import LaborReport data into the database
    /// </summary>
    /// <param name="laborReport">LaborReport data table</param>
    /// <param name="laborReportDetail">LaborReportDetail data table</param>
    public void LaborReportInsert(DataTable laborReport, DataTable laborReportDetail)
    {
        using (SqlConnection conn = new SqlConnection(this.Connection))
        {
            if (conn.State != ConnectionState.Open)
                conn.Open();

            using (SqlBulkCopy sqlbcLaborReport = new SqlBulkCopy(conn))
            {
                sqlbcLaborReport.BatchSize = laborReport.Rows.Count;
                sqlbcLaborReport.BulkCopyTimeout = 60; // timeout, in seconds
                sqlbcLaborReport.DestinationTableName = "LaborReport";
                sqlbcLaborReport.WriteToServer(laborReport);
            }

            using (SqlBulkCopy sqlbcLaborReportDetails = new SqlBulkCopy(conn))
            {
                sqlbcLaborReportDetails.BatchSize = laborReportDetail.Rows.Count;
                sqlbcLaborReportDetails.BulkCopyTimeout = 60; // timeout, in seconds
                sqlbcLaborReportDetails.DestinationTableName = "LaborReportDetails";
                sqlbcLaborReportDetails.WriteToServer(laborReportDetail);
            }

            if (conn.State != ConnectionState.Closed)
                conn.Close();
        }
    }
The example code above writes two in-memory DataTables to the database in a single pass each. This works as long as the structure of each in-memory table matches the structure of the corresponding table in the database. If a database table contains an identity (auto-increment) column, we do not need to define that column in the in-memory table; when the data is saved to the database, the identity column generates its values automatically.
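To illustrate the point about identity columns, here is a minimal sketch (the table name `LaborReport` and its column names are assumed for illustration, not taken from the article's actual schema): the in-memory DataTable simply omits the identity column, and mapping the remaining columns by name lets SQL Server fill the identity in on insert.

```csharp
// Sketch only. Assumes a destination table such as:
//   CREATE TABLE LaborReport (Id INT IDENTITY(1,1), Name NVARCHAR(50), Hours INT)
using System.Data;
using System.Data.SqlClient;

class IdentityColumnDemo
{
    static void Import(string connectionString)
    {
        // Build the in-memory table WITHOUT the identity column Id
        DataTable table = new DataTable();
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Hours", typeof(int));
        table.Rows.Add("Alice", 8);
        table.Rows.Add("Bob", 6);

        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "LaborReport";
            // Map by name so the unmapped identity column is skipped
            bulkCopy.ColumnMappings.Add("Name", "Name");
            bulkCopy.ColumnMappings.Add("Hours", "Hours");
            bulkCopy.WriteToServer(table); // Id values are generated by SQL Server
        }
    }
}
```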
    // Bulk-import data into SQL Server: create the instance
    System.Data.SqlClient.SqlBulkCopy sqlbulk = new System.Data.SqlClient.SqlBulkCopy(
        System.Configuration.ConfigurationManager.ConnectionStrings["ConnStr"].ToString());

    // Target database table name
    sqlbulk.DestinationTableName = "TableName";

    // Map DataSet column indexes to database column indexes
    sqlbulk.ColumnMappings.Add(0, 5);
    sqlbulk.ColumnMappings.Add(1, 4);
    sqlbulk.ColumnMappings.Add(2, 7);
    sqlbulk.ColumnMappings.Add(3, 1);
    sqlbulk.ColumnMappings.Add(4, 10);
    sqlbulk.ColumnMappings.Add(5, 6);
    sqlbulk.ColumnMappings.Add(6, 2);

    // Import (sqldb is the source DataTable)
    sqlbulk.WriteToServer(sqldb);
    sqlbulk.Close();
    public void SqlBulkCopyDemo()
    {
        Dal.DalSql dal = new Dal.DalSql(); // SQL helper class

        // Get the source data
        string gongpanSql = @"select a, b, c, d from T where <filter conditions>";
        DataTable dtGongpan = dal.QueryByFind(gongpanSql).Tables[0];

        // Copy the data
        using (SqlConnection conn = new SqlConnection(sqlConnString))
        {
            // Open the connection
            conn.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
            {
                // Set the destination table name
                bulkCopy.DestinationTableName = "HouseBase";
                bulkCopy.BulkCopyTimeout = 3600; // timeout, in seconds
                bulkCopy.BatchSize = 10000;      // rows submitted per batch; optional

                // Column-name mappings: Add("source column", "destination column")
                bulkCopy.ColumnMappings.Add("a", "HouseCode");
                bulkCopy.ColumnMappings.Add("b", "CommName");
                bulkCopy.ColumnMappings.Add("c", "FenqiName");
                bulkCopy.ColumnMappings.Add("d", "Seat");

                // Copy the data
                bulkCopy.WriteToServer(dtGongpan);
            }
        }
    }
"Big data processing": SqlBulkCopy, a high-performance solution for storing data in bulk