A few days ago I received a .csv file of more than 400 MB. It took a long time to open on my computer, and once it finally opened the data was garbled. So I wrote a small program that first transcodes the file and then imports it into the database. Transcoding and importing more than a million rows took about 4 minutes, which feels acceptable. If anyone has a better approach, please share it in the comments so we can learn together. Thanks.
static void Main(string[] args)
{
    int count = 0;
    string readerPath = @"C:\Users\Administrator\Desktop\readerDemo.csv";
    string writerPath = @"C:\Users\Administrator\Desktop\writeDemo.csv";
    if (File.Exists(writerPath))
    {
        File.Delete(writerPath);
    }
    // Open the writer once, outside the loop. The original code created a new
    // StreamWriter for every line, which re-opens the file millions of times
    // and is far slower.
    using (StreamReader reader = new StreamReader(readerPath, Encoding.UTF8))
    using (StreamWriter writer = new StreamWriter(writerPath, true, Encoding.Default))
    {
        while (!reader.EndOfStream)
        {
            string line = reader.ReadLine();
            writer.WriteLine(line);
            count++;
            Console.WriteLine("Transcoding line {0}, please wait", count);
        }
    }
    Console.WriteLine("Transcoding complete, {0} rows in total", count);
    Console.WriteLine("Starting data import, please wait");
    string sql = "BULK INSERT Test.dbo.BagDataTable from 'C:\\users\\administrator\\desktop\\writedemo.csv' with (FIELDTERMINATOR = ',', BATCHSIZE = 100000, FIRSTROW = 2)";
    try
    {
        DbHelper.ExecuteSql(sql);
    }
    catch (Exception ex)
    {
        // Log any import failure to a text file instead of crashing.
        using (StreamWriter writerLog = new StreamWriter(@"C:\Users\Administrator\Desktop\Log.txt"))
        {
            writerLog.WriteLine(ex.ToString());
        }
    }
    Console.WriteLine("Data import complete");
    Console.ReadKey();
}
BULK INSERT command details:
http://blog.csdn.net/jackmacro/article/details/5959321/
FIELDTERMINATOR specifies the column terminator used in the CSV file (a CSV's default column terminator is the comma, and its default row terminator is \r\n). For usage see:
http://www.cnblogs.com/sunice/p/6367332.html
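As a sketch, the import statement used in the program above could spell out both terminators explicitly (the table and file names are taken from the post; the ROWTERMINATOR shown is the default and can be omitted):

```sql
BULK INSERT Test.dbo.BagDataTable
FROM 'C:\users\administrator\desktop\writedemo.csv'
WITH (
    FIELDTERMINATOR = ',',    -- column separator
    ROWTERMINATOR   = '\n',   -- row separator; for BULK INSERT, '\n' implies \r\n
    BATCHSIZE       = 100000, -- commit every 100,000 rows
    FIRSTROW        = 2       -- skip the header row
);
```

Splitting the load into batches with BATCHSIZE keeps each transaction small, so a failure only rolls back the current batch rather than the whole import.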
Importing a large .csv file into a SQL Server database