SqlBulkCopy.WriteToServer has 4 overloads:

SqlBulkCopy.WriteToServer(DataRow[])
Copies all rows from the supplied DataRow array to a destination table specified by the DestinationTableName property of the SqlBulkCopy object.

SqlBulkCopy.WriteToServer(DataTable)
Copies all rows in the supplied DataTable to a destination table specified by the DestinationTableName property of the SqlBulkCopy object.

SqlBulkCopy.WriteToServer(IDataReader)
Copies all rows in the supplied IDataReader to a destination table specified by the DestinationTableName property of the SqlBulkCopy object.

SqlBulkCopy.WriteToServer(DataTable, DataRowState)
Copies only rows that match the supplied row state in the supplied DataTable to a destination table specified by the DestinationTableName property of the SqlBulkCopy object.
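As a quick illustration of the last overload, here is a minimal sketch reusing the connection string and destination table from the import code further below; it copies only the rows that have been added to a DataTable since the last AcceptChanges() call:

using System.Data;
using System.Data.SqlClient;

// Minimal sketch: bulk copy only the newly added rows of a DataTable
// whose columns match the destination table.
private void WriteAddedRowsOnly(DataTable dt)
{
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
               "server=servername;database=test;trusted_connection=true;"))
    {
        bulkCopy.DestinationTableName = "dbo.testSQLBulkCopy";

        // Only rows whose RowState is Added are sent to the server;
        // unchanged, modified and deleted rows are skipped.
        bulkCopy.WriteToServer(dt, DataRowState.Added);
    }
}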
When importing text files with this method we have to create a DataTable first, import the text file into that DataTable, and then write the DataTable to the server.
With this we're actually performing 2 tasks in .NET:
1. Fill data from the text file into the DataTable in memory
2. Fill data from the DataTable in memory into SQL Server
Compare this to SQL Server's native bulk import methods, where we just import the text file directly.
I used the same file and the same table structure as in the bulk import methods described previously.
The time it took to complete the whole process was around - seconds.
The code I used for import:
private void StartImport()
{
    Stopwatch sw = new Stopwatch();
    sw.Start();

    // Bulk copy the DataTable built from the text file into the destination table.
    SqlBulkCopy bulkCopy = new SqlBulkCopy(
        "server=servername;database=test;trusted_connection=true;",
        SqlBulkCopyOptions.TableLock);
    bulkCopy.DestinationTableName = "dbo.testSQLBulkCopy";
    bulkCopy.WriteToServer(CreateDataTableFromFile());

    sw.Stop();
    txtResult.Text = (sw.ElapsedMilliseconds / 1000.00).ToString();
}

private DataTable CreateDataTableFromFile()
{
    DataTable dt = new DataTable();
    DataColumn dc;
    DataRow dr;

    // Create the 4 integer columns C1..C4 to match the destination table.
    dc = new DataColumn();
    dc.DataType = System.Type.GetType("System.Int32");
    dc.ColumnName = "C1";
    dc.Unique = false;
    dt.Columns.Add(dc);

    dc = new DataColumn();
    dc.DataType = System.Type.GetType("System.Int32");
    dc.ColumnName = "C2";
    dc.Unique = false;
    dt.Columns.Add(dc);

    dc = new DataColumn();
    dc.DataType = System.Type.GetType("System.Int32");
    dc.ColumnName = "C3";
    dc.Unique = false;
    dt.Columns.Add(dc);

    dc = new DataColumn();
    dc.DataType = System.Type.GetType("System.Int32");
    dc.ColumnName = "C4";
    dc.Unique = false;
    dt.Columns.Add(dc);

    // Read the pipe-delimited text file line by line into the DataTable.
    StreamReader sr = new StreamReader(@"D:\work\test.txt");
    string input;
    while ((input = sr.ReadLine()) != null)
    {
        string[] s = input.Split(new char[] { '|' });
        dr = dt.NewRow();
        dr["C1"] = s[0];
        dr["C2"] = s[1];
        dr["C3"] = s[2];
        dr["C4"] = s[3];
        dt.Rows.Add(dr);
    }
    sr.Close();

    return dt;
}
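As a side note (not part of the timing above): SqlBulkCopy implements IDisposable, so the copy can also be wrapped in a using block to make sure its connection is closed, and the BatchSize property controls how many rows are sent per batch. A minimal variation of the same call:

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
           "server=servername;database=test;trusted_connection=true;",
           SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = "dbo.testSQLBulkCopy";
    bulkCopy.BatchSize = 100000;   // 0 (the default) sends all rows as a single batch
    bulkCopy.WriteToServer(CreateDataTableFromFile());
}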
For comparison, the native bulk import methods tested earlier are listed below:
1. BCP
2. BULK INSERT
3. OPENROWSET with the BULK option
4. SQL Server Integration Services (SSIS)
I ran each bulk import option, disregarded the best and worst times, and averaged the remaining ten runs.
The results are:
1. SSIS - FastParse ON  =  7322 ms
2. SSIS - FastParse OFF =  8387 ms
3. BULK INSERT          = 10534 ms
4. OPENROWSET           = 10687 ms
5. BCP                  = 14922 ms
So the speed gain when using FastParse is quite large.
I was also surprised that SSIS with FastParse turned off was around 20% faster than BULK INSERT and OPENROWSET, and around 40% faster than BCP.
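To put numbers on that, using the averaged timings above:
SSIS with FastParse OFF vs. BULK INSERT: (10534 - 8387) / 10534 ≈ 20% faster
SSIS with FastParse OFF vs. OPENROWSET:  (10687 - 8387) / 10687 ≈ 22% faster
SSIS with FastParse OFF vs. BCP:         (14922 - 8387) / 14922 ≈ 44% faster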
Since my goal was to test how much faster importing flat files is when the FastParse option is used, I created a text file containing 4 bigint columns and 1,000,000 rows.
The C# code I used to create the sample test file:

string str;
StreamWriter sw = new StreamWriter(@"D:\work\test.txt");

// Write 1,000,000 pipe-delimited rows with 4 integer columns.
for (int i = 1; i <= 1000000; i++)
{
    str = i.ToString() + "|" + Convert.ToString(i * 2) + "|" +
          Convert.ToString(i * 3) + "|" + Convert.ToString(i / 2);
    sw.WriteLine(str);
}
sw.Close();
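For reference, the first few lines of the generated test.txt look like this (the fourth column uses integer division, so i / 2 is truncated):

1|2|3|0
2|4|6|1
3|6|9|1
4|8|12|2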
I also created this non-XML format file to use with BCP, BULK INSERT and OPENROWSET (the first line is the format file version, the second the number of fields; each remaining line lists the field order, data type, prefix length, data length, field terminator, destination column order, destination column name and collation):

9.0
4
1  SQLBIGINT  0  8  "|"     1  c1  ""
2  SQLBIGINT  0  8  "|"     2  c2  ""
3  SQLBIGINT  0  8  "|"     3  c3  ""
4  SQLBIGINT  0  8  "\r\n"  4  c4  ""
The SSIS package is a very simple one, with Flat File source and SQL Server destination objects.
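Note that FastParse is not exposed in the Flat File source's standard editor; it is set per output column through the Advanced Editor (right-click the Flat File source, Show Advanced Editor, Input and Output Properties tab, Flat File Source Output, Output Columns, then set the FastParse property to True).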
The SQL script I used is:
CREATE DATABASE Test
GO
USE Test
GO

-- ran for each SSIS test run
-- SSIS data type for each column is "eight-byte signed integer [DT_I8]"
DROP TABLE testFastParse
CREATE TABLE testFastParse (c1 bigint, c2 bigint, c3 bigint, c4 bigint)
GO

-- insert data using OPENROWSET
CREATE TABLE testOpenRowset (c1 bigint, c2 bigint, c3 bigint, c4 bigint)
GO
DBCC DROPCLEANBUFFERS
DECLARE @start datetime
SET @start = GETDATE()
INSERT INTO testOpenRowset (c1, c2, c3, c4)
SELECT t1.c1, t1.c2, t1.c3, t1.c4
FROM OPENROWSET(BULK 'D:\work\test.txt',
                FORMATFILE = 'D:\work\testImport-f-n.fmt') AS t1(c1, c2, c3, c4);
SELECT GETDATE() - @start AS ElapsedTime
DROP TABLE testOpenRowset

-- insert data using BULK INSERT
CREATE TABLE testBulkInsert (c1 bigint, c2 bigint, c3 bigint, c4 bigint)
GO
DBCC DROPCLEANBUFFERS
DECLARE @start datetime
SET @start = GETDATE()
BULK INSERT testBulkInsert
FROM 'D:\work\test.txt'
WITH (FORMATFILE = 'D:\work\testImport-f-n.fmt')
SELECT GETDATE() - @start AS ElapsedTime
DROP TABLE testBulkInsert
GO

-- insert data using BCP
CREATE TABLE testBCP (c1 bigint, c2 bigint, c3 bigint, c4 bigint)
GO
DBCC DROPCLEANBUFFERS
EXEC master..xp_cmdshell
    'bcp Test.dbo.testBCP in D:\work\test.txt -T -b1000000 -fD:\work\testImport-f-n.fmt'
DROP TABLE testBCP
GO
DROP DATABASE Test