SqlBulkCopy with triggers: bulk insert into a table (rows already present are updated, missing rows are inserted)

Source: Internet
Author: User
Tags: bulk insert

Test table: Test

    /****** Object: Table [dbo].[Test]    Script Date: 05/10/2013 11:42:07 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE TABLE [dbo].[Test](
        [Id]       [int] IDENTITY(1,1) NOT NULL,
        [UserID]   [int] NOT NULL,
        [UserName] [nvarchar](50) COLLATE Chinese_PRC_CI_AS NULL,  -- length lost in the source; 50 assumed
        CONSTRAINT [pk_test] PRIMARY KEY CLUSTERED ([Id] ASC)
            WITH (IGNORE_DUP_KEY = ON) ON [PRIMARY]
    ) ON [PRIMARY]

Trigger on the Test table: Tri_edit

The trigger is declared INSTEAD OF INSERT, so every insert against Test is replaced by the logic below: rows whose UserID already exists are updated, all other rows are inserted. Note that a scalar variable (`SELECT @temp = inserted.UserID FROM inserted`) only captures one row, so the body is written set-based to handle multi-row bulk inserts as well. (The garbled `userqq` column is assumed to be `UserName`, the only updatable column in the table.)

    CREATE TRIGGER [tri_edit] ON [dbo].[Test]
    INSTEAD OF INSERT
    AS
    BEGIN
        -- Update rows whose key (UserID) already exists;
        -- add further SET clauses here to update more fields
        UPDATE [Test]
        SET UserName = inserted.UserName
        FROM [Test] JOIN inserted ON [Test].UserID = inserted.UserID

        -- Insert rows whose key is not present yet
        INSERT [Test] (UserID, UserName)
        SELECT inserted.UserID, inserted.UserName
        FROM inserted LEFT JOIN [Test] ON inserted.UserID = [Test].UserID
        WHERE [Test].Id IS NULL
    END
    GO
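To sanity-check the trigger, insert the same UserID twice with ordinary INSERT statements (a sketch; it assumes the Test table and tri_edit trigger above have been created):

```sql
-- Assumes dbo.Test and tri_edit exist as defined above
INSERT [Test] (UserID, UserName) VALUES (1, N'first');
INSERT [Test] (UserID, UserName) VALUES (1, N'second');  -- same key: the trigger turns this into an UPDATE

SELECT UserID, UserName FROM [Test] WHERE UserID = 1;
-- Expected: a single row with UserName = 'second'
```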

C# SqlBulkCopy method:

    /// <summary>
    /// SqlBulkCopy: bulk-insert data into the database.
    /// </summary>
    /// <param name="sourceDataTable">Data source table</param>
    /// <param name="targetTableName">Target table on the server</param>
    /// <param name="mapping">Column mappings relating source columns to destination columns</param>
    public static void BulkToDb(DataTable sourceDataTable, string targetTableName, SqlBulkCopyColumnMapping[] mapping)
    {
        /* Call example -- written May 10, 2013:
        DataTable dt = Get_All_RoomState_ByHid();
        SqlBulkCopyColumnMapping[] mapping = new SqlBulkCopyColumnMapping[4];
        mapping[0] = new SqlBulkCopyColumnMapping("xing_h_id", "xing_h_id");
        mapping[1] = new SqlBulkCopyColumnMapping("h_name", "h_name");
        mapping[2] = new SqlBulkCopyColumnMapping("h_sname", "h_sname");
        mapping[3] = new SqlBulkCopyColumnMapping("h_ename", "h_ename");
        BulkToDb(dt, "bak_tts_hotel_name", mapping);
        */

        // SqlBulkCopyOptions.FireTriggers makes the bulk insert fire triggers on the
        // destination table; by default SqlBulkCopy does NOT fire them.
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(DbHelper.ConnectionString, SqlBulkCopyOptions.FireTriggers))
        {
            bulkCopy.DestinationTableName = targetTableName;  // name of the destination table on the server
            // Seconds allowed for the operation to complete before timing out; raised to
            // 5 minutes for large loads after a "Timeout expired. The timeout period elapsed
            // before the operation completed or the server is not responding" error (2013-11-6 note).
            bulkCopy.BulkCopyTimeout = 300;

            if (sourceDataTable != null && sourceDataTable.Rows.Count != 0)
            {
                bulkCopy.BatchSize = sourceDataTable.Rows.Count;  // number of rows in each batch

                for (int i = 0; i < mapping.Length; i++)
                    bulkCopy.ColumnMappings.Add(mapping[i]);

                // Copy all rows from the supplied data source to the destination table
                bulkCopy.WriteToServer(sourceDataTable);
            }
        }
    }

C# call of the BulkToDb method:

    int kk = Environment.TickCount;

    DataTable dt = new DataTable();
    dt.Columns.Add(new DataColumn("UserID", typeof(string)));
    dt.Columns.Add(new DataColumn("UserName", typeof(string)));
    for (int k = 0; k < 10000; k++)  // 40000
    {
        DataRow dr = dt.NewRow();
        dr["UserID"] = k;
        dr["UserName"] = "8888-" + k;
        dt.Rows.Add(dr);
    }

    SqlBulkCopyColumnMapping[] mapp = new SqlBulkCopyColumnMapping[2];
    mapp[0] = new SqlBulkCopyColumnMapping("UserID", "UserID");
    mapp[1] = new SqlBulkCopyColumnMapping("UserName", "UserName");

    // Submit to the Test table
    BulkToDb(dt, "Test", mapp);

    int elapsed = Environment.TickCount - kk;
    MessageBox.Show(elapsed.ToString());
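Because the trigger upserts, running the loader above a second time should refresh the data without changing the row count. A quick check (a sketch; column names as in the Test table above):

```sql
-- Run after the C# loader has executed twice
SELECT COUNT(*) AS TotalRows, COUNT(DISTINCT UserID) AS DistinctUsers
FROM [Test];
-- TotalRows should equal DistinctUsers (10000): duplicate keys were updated, not re-inserted
```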

Building the column mappings dynamically from the DataTable's own column names:

    SqlBulkCopyColumnMapping[] mapping = new SqlBulkCopyColumnMapping[dt.Columns.Count];
    for (int j = 0; j < dt.Columns.Count; j++)
    {
        mapping[j] = new SqlBulkCopyColumnMapping(dt.Columns[j].ColumnName, dt.Columns[j].ColumnName);
    }

Two more trigger examples.

1. INSTEAD OF INSERT trigger that updates existing rows and inserts new ones:

    CREATE TRIGGER tr_insert ON [table]
    INSTEAD OF INSERT  -- note the trigger type
    AS
    -- Update rows whose primary key already exists (update all fields)
    UPDATE a
    SET name = b.name, sex = b.sex
    FROM [table] a JOIN inserted b ON a.id = b.id

    -- Insert rows whose primary key does not exist yet
    INSERT [table]
    SELECT a.*
    FROM inserted a LEFT JOIN [table] b ON a.id = b.id
    WHERE b.id IS NULL
    GO

2. Trigger that checks for an existing key with IF EXISTS (the machine-translated column name "Study Number" is written [StudentNo] here):

    CREATE TRIGGER Tri_edit ON tab
    INSTEAD OF INSERT
    AS
    IF EXISTS (SELECT col1, col2 FROM tab JOIN inserted ON tab.[StudentNo] = inserted.[StudentNo])
    BEGIN
        -- add whatever other modifications the specific functionality needs
        UPDATE tab SET col1 = 'num1'
        FROM tab JOIN inserted ON tab.[StudentNo] = inserted.[StudentNo]
    END
    ELSE
        INSERT tab SELECT * FROM inserted
    GO
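On SQL Server 2008 and later, the update-or-insert body of such a trigger can also be written as a single MERGE statement. A sketch against the Test table above (the trigger name tri_edit_merge is made up for illustration):

```sql
CREATE TRIGGER tri_edit_merge ON [dbo].[Test]
INSTEAD OF INSERT
AS
MERGE [Test] AS t
USING inserted AS s ON t.UserID = s.UserID
WHEN MATCHED THEN
    UPDATE SET UserName = s.UserName          -- key exists: update in place
WHEN NOT MATCHED THEN
    INSERT (UserID, UserName) VALUES (s.UserID, s.UserName);  -- new key: insert
GO
```

MERGE keeps the match test, the update, and the insert in one atomic statement, which avoids the separate JOIN and LEFT JOIN passes of the trigger above.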
