SqlBulkCopy is used for transferring large volumes of data between databases. It is typically used to migrate data from an old database to a new one: even when the table structures are completely different, the data can be exported smoothly by mapping the corresponding columns.
1. Initialize the SqlBulkCopy object, passing the connection string as the parameter.
SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString);
2. Map the columns of the data source to those of the destination table (by column name).
bulkCopy.ColumnMappings.Add("sourceColumn", "destColumn");
3. Set the target table name
bulkCopy.DestinationTableName = "TargetTableName";
4. Set the number of rows to process before the SqlRowsCopied event is raised. The default value is 0, which means the event is never raised.
bulkCopy.NotifyAfter = 10;
5. Transmit the data.
bulkCopy.WriteToServer(sdr);  // sdr: a SqlDataReader, DataTable, or DataRow[]
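Putting the five steps together, here is a minimal self-contained sketch. The connection string, table name, and column names are placeholder assumptions; adjust them to your environment.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopyDemo
{
    static void Main()
    {
        // Placeholder connection string -- an assumption for this sketch.
        string connectionString =
            "Data Source=.;Initial Catalog=TestDb;Integrated Security=True";

        // Build a small in-memory source table (step 5 also accepts a SqlDataReader).
        DataTable source = new DataTable();
        source.Columns.Add("source");
        source.Rows.Add("value1");

        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))  // step 1
        {
            bulkCopy.ColumnMappings.Add("source", "dest");                // step 2
            bulkCopy.DestinationTableName = "TargetTable";                // step 3
            bulkCopy.NotifyAfter = 10;                                    // step 4
            bulkCopy.SqlRowsCopied += (sender, e) =>
                Console.WriteLine("{0} rows copied so far", e.RowsCopied);
            bulkCopy.WriteToServer(source);                               // step 5
        }
    }
}
```

Subscribing to SqlRowsCopied is optional, but it is the only way to observe progress during a long-running copy, which is why NotifyAfter is set before calling WriteToServer.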
Suppose we have a txt file containing a large amount of data.
First, let's import it into the database using ordinary row-by-row INSERT statements.
OpenFileDialog ofd = new OpenFileDialog();
ofd.Filter = "Text files (*.txt)|*.txt";
if (ofd.ShowDialog() == DialogResult.OK)
{
    string[] lines = File.ReadAllLines(ofd.FileName);
    // The connection string name, the split delimiter, and the table/column
    // names below are assumptions reconstructed from context.
    using (SqlConnection con = new SqlConnection(
        ConfigurationManager.ConnectionStrings["connStr"].ConnectionString))
    {
        con.Open();
        DateTime startTime = DateTime.Now;
        for (int i = 0; i < lines.Length; i++)
        {
            string line = lines[i];
            string[] strs = line.Split('\t');
            string startNum = strs[0];
            string city = strs[1].Trim();
            string type = strs[2].Trim();
            string areaNum = strs[3].Trim();
            using (SqlCommand cmd = con.CreateCommand())
            {
                cmd.CommandText =
                    "INSERT INTO PhoneNum(StartNum, City, Type, AreaNum) " +
                    "VALUES(@startNum, @city, @type, @areaNum)";
                cmd.Parameters.Add(new SqlParameter("@startNum", startNum));
                cmd.Parameters.Add(new SqlParameter("@city", city));
                cmd.Parameters.Add(new SqlParameter("@type", type));
                cmd.Parameters.Add(new SqlParameter("@areaNum", areaNum));
                cmd.ExecuteNonQuery();
            }
            // Estimate the total running time from the progress so far.
            DateTime nowTime = DateTime.Now;
            TimeSpan ts = nowTime - startTime;
            double totalTime = ts.TotalSeconds * lines.Length / (i + 1);
        }
    }
}
Run the project and check the elapsed time: the row-by-row import takes 1671 seconds in total, nearly 28 minutes.
Then we use SqlBulkCopy to read and store the data.
DataTable table = new DataTable();
table.Columns.Add("StartNum");
table.Columns.Add("City");
table.Columns.Add("Type");
table.Columns.Add("AreaNum");
DateTime start = DateTime.Now;
for (int i = 0; i < lines.Length; i++)
{
    string line = lines[i];
    string[] strs = line.Split('\t');
    string startNum = strs[0];
    string city = strs[1].Trim();
    string type = strs[2].Trim();
    string areaNum = strs[3].Trim();
    DataRow row = table.NewRow();
    row[0] = startNum;
    row[1] = city;
    row[2] = type;
    row[3] = areaNum;
    table.Rows.Add(row);
}
// The connection string name and the table/column names are assumptions, as above.
using (System.Data.SqlClient.SqlBulkCopy copy = new System.Data.SqlClient.SqlBulkCopy(
    ConfigurationManager.ConnectionStrings["connStr"].ConnectionString))
{
    copy.DestinationTableName = "PhoneNum";
    copy.ColumnMappings.Add("StartNum", "StartNum");
    copy.ColumnMappings.Add("City", "City");
    copy.ColumnMappings.Add("Type", "Type");
    copy.ColumnMappings.Add("AreaNum", "AreaNum");
    copy.WriteToServer(table);
}
This time the import takes only 3 seconds. Clearly, SqlBulkCopy is the better choice for bulk inserts of large data sets!