Pay special attention to sqlBulkCopy.ColumnMappings.Add(dt.Columns[i].ColumnName, dt.Columns[i].ColumnName);
The column order of the source and the destination table may differ, but the column names and data types should be kept consistent. When the types differ, SqlBulkCopy will still perform the conversion where it can, but not always; for example, an invalid time string cannot be inserted into a datetime column.
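When the source and destination column names differ, the mapping has to be added explicitly by name. A minimal sketch, assuming a hypothetical source column "ExcelUserName" mapped to a hypothetical destination column "User_nicename" (connectionString and dt are assumed to be in scope):
// Hypothetical names: the source DataTable column "ExcelUserName" feeds the destination column "User_nicename".
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "Jk_users";                       // assumed destination table
    bulkCopy.ColumnMappings.Add("ExcelUserName", "User_nicename");    // source name -> destination name
    bulkCopy.WriteToServer(dt);                                       // dt is the in-memory source DataTable
}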
Reposted from: http://zhoufoxcn.blog.51cto.com/792419/166052
Reference: http://www.cnblogs.com/scottckt/archive/2011/02/16/1955862.html
Bulk inserting large amounts of data into several databases: http://www.cnblogs.com/netuml/p/3719298.html
This article is also good; it configures the column mappings through XML: http://www.cnblogs.com/pengyq/archive/2009/12/04/1616997.html
Application Description:
Some days ago the company asked me to write a data import program: Excel data had to be imported into the database in bulk, touching the database as little as possible while storing the data with high performance. After searching online I found a good solution: use SqlBulkCopy. SqlBulkCopy is very efficient at storing large batches of data; as its name suggests, it writes a data table held in memory to the database in one pass, instead of inserting rows into the database one at a time. In a first experiment, a table with millions of rows was written to the database in a few seconds, many times faster than the traditional INSERT approach.
MSDN Summary: Microsoft SQL Server provides a popular command prompt utility called BCP, which is used to move data from one table to another (the table can be either on the same server or on a different server). The SqlBulkCopy class allows you to write managed code solutions that provide similar functionality. There are other ways to load data into a SQL Server table, such as an INSERT statement, but in contrast SqlBulkCopy provides significant performance benefits.
The SqlBulkCopy class can only write data to a SQL Server table. However, the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance.
/// <summary>
/// Bulk-writes the rows of a DataTable into the destination table via SqlBulkCopy.
/// </summary>
/// <param name="connectionString">Target connection string</param>
/// <param name="tableName">Target table</param>
/// <param name="dt">Source data</param>
private void SqlBulkCopyByDataTable(string connectionString, string tableName, DataTable dt)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.UseInternalTransaction))
        {
            try
            {
                sqlBulkCopy.DestinationTableName = tableName;
                // Map every source column to the destination column of the same name.
                for (int i = 0; i < dt.Columns.Count; i++)
                {
                    sqlBulkCopy.ColumnMappings.Add(dt.Columns[i].ColumnName, dt.Columns[i].ColumnName);
                }
                sqlBulkCopy.WriteToServer(dt);
            }
            catch (System.Exception)
            {
                throw;   // rethrow without losing the original stack trace
            }
        }
    }
}
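As noted above, the source does not have to be a DataTable; anything exposing an IDataReader works, and the rows are then streamed to the server instead of being buffered in memory. A minimal sketch, assuming hypothetical connection strings, a hypothetical source query and a hypothetical destination table name:
// A minimal sketch: stream rows from a source query into the destination table through an IDataReader.
// sourceConnectionString, destinationConnectionString, source_table and destination_table are assumptions.
using (var srcConn = new SqlConnection(sourceConnectionString))
using (var srcCmd = new SqlCommand("SELECT * FROM source_table", srcConn))
{
    srcConn.Open();
    using (IDataReader reader = srcCmd.ExecuteReader())
    using (var bulkCopy = new SqlBulkCopy(destinationConnectionString))
    {
        bulkCopy.DestinationTableName = "destination_table";
        bulkCopy.BatchSize = 10000;       // send rows to the server in batches
        bulkCopy.BulkCopyTimeout = 60;    // timeout in seconds
        bulkCopy.WriteToServer(reader);   // rows are streamed from the reader to the server
    }
}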
An alternative to SqlBulkCopy is to bulk-write data by passing a custom table type as a stored procedure parameter.
SQL Server 2008 introduces a new feature, table-valued parameters, which lets you pass an entire table of data to a stored procedure or SQL statement as a single parameter. The performance cost to watch is assembling the data into that parameter, which is O(n) in the amount of data.
Now we modify the previous code: first we define our table type in SQL Server, as follows:
-- =============================================
-- Author:       jkhuang
-- Create date:  08/16/2012
-- Description:  Declares a user table parameter.
-- =============================================
CREATE TYPE Jk_users_bulk_insert AS TABLE (
    User_login varchar(60),
    User_pass varchar(64),
    User_nicename varchar(50),
    User_email varchar(100),
    User_url varchar(100),
    User_activation_key varchar(60),
    User_status int,
    Display_name varchar(250)
)
Above we defined the table type Jk_users_bulk_insert; next we define a stored procedure that accepts a Jk_users_bulk_insert parameter, as follows:
-- =============================================
-- Author:       jkhuang
-- Create date:  08/16/2012
-- Description:  Creates a stored procedure that receives
--               a Jk_users_bulk_insert argument.
-- =============================================
CREATE PROCEDURE Sp_insert_jk_users
    @usersTable Jk_users_bulk_insert READONLY
AS
INSERT INTO Jk_users (User_login, User_pass, User_nicename, User_email, User_url,
    User_activation_key, User_status, Display_name, User_registered)
SELECT User_login, User_pass, User_nicename, User_email, User_url,
    User_activation_key, User_status, Display_name, GETDATE()
FROM @usersTable
Then we call the stored procedure in the client code and pass the table as a parameter to the stored procedure.
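The client code below assumes a DataTable named dt whose columns match the Jk_users_bulk_insert table type. A minimal sketch of building it (the sample rows are placeholders):
// A minimal sketch, assuming the column layout of Jk_users_bulk_insert; the sample values are placeholders.
var dt = new DataTable();
dt.Columns.Add("User_login", typeof(string));
dt.Columns.Add("User_pass", typeof(string));
dt.Columns.Add("User_nicename", typeof(string));
dt.Columns.Add("User_email", typeof(string));
dt.Columns.Add("User_url", typeof(string));
dt.Columns.Add("User_activation_key", typeof(string));
dt.Columns.Add("User_status", typeof(int));
dt.Columns.Add("Display_name", typeof(string));
for (int i = 0; i < 10000; i++)
{
    dt.Rows.Add("user" + i, "pass" + i, "nick" + i, "user" + i + "@example.com",
        "http://example.com", "key" + i, 0, "User " + i);
}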
var sw = Stopwatch.StartNew();
using (var conn = new SqlConnection(ConfigurationManager.ConnectionStrings["SQLCONN2"].ToString()))
{
    conn.Open();
    // Invoke the stored procedure.
    using (var cmd = new SqlCommand("Sp_insert_jk_users", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        // Adding a "structured" parameter allows inserting tons of data with low overhead.
        var param = new SqlParameter("@usersTable", SqlDbType.Structured) { Value = dt };
        cmd.Parameters.Add(param);
        cmd.ExecuteNonQuery();
    }
}
sw.Stop();
Re-running the write operation, we find that the write performance is comparable to SqlBulkCopy.
C# bulk copying of big data using the SqlBulkCopy class