Solution for quickly importing large CSV files into SQL Server (recommended)

Source: Internet
Author: User
Tags: bulk insert


Preface

In a forum, I answered a question: import CSV data and add a new column, date datetime, to the imported data, with the requirement that a million rows be imported within 10 seconds. I would like to share the idea and method used to solve the problem.

Analysis

Generally, bulk insert is faster than BCP, so we choose bulk insert. The plan: bulk insert the data into a SQL Server staging table first, then insert it into the target table. The statements are as follows:

bulk insert test07232 from 'D:\2017-7-22.csv' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

SELECT *, GETDATE() AS Date INTO ttt FROM test07232
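Note that bulk insert needs the staging table test07232 to exist before the load. The definition below is only a sketch for illustration: the real column names and types of 2017-7-22.csv were not given in the question, so the varchar columns here are assumed placeholders to be replaced with the actual file schema.

-- Staging table for the raw CSV rows; the columns are assumed placeholders.
-- Adjust the names, count, and types to match the actual file.
CREATE TABLE test07232
(
    col1 VARCHAR(100),
    col2 VARCHAR(100),
    col3 VARCHAR(100)
);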

However, when the CSV file he provided is imported, the following errors are reported.

Msg 4866, Level 16, State 1, Line 1: The bulk load failed. The column is too long in the data file for row 2, column 1. Verify that the field terminator and row terminator are specified correctly.

Msg 7399, Level 16, State 1, Line 1: The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.

Msg 7330, Level 16, State 2, Line 1: Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

This is because the row terminator is not recognized. Open the CSV file with Notepad++ and enable showing end-of-line characters in the View menu.

The line ending of the file is LF.

A normal CSV file uses CRLF as the line break by default.

Therefore, the above bulk insert statement cannot be executed normally.

Solution

1. The first thing that comes to mind is to modify the data source so that it generates normal data, but the source data cannot be modified.

2. Write a C# program to pre-process the file, but that is too time-consuming.

3. Finally, the correct method was found: specify the row terminator explicitly as the hex code for LF, as shown below.

bulk insert test07232 from 'D:\2017-7-22.csv' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a' )

SELECT *, GETDATE() AS Date INTO ttt FROM test07232

Everything is on SSD. The import into the staging table takes 2 s, and inserting into the production table takes 1 s, so the entire process completes in about 3 seconds.
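If even more throughput is needed, bulk insert also accepts options such as TABLOCK and BATCHSIZE. The variation below is only a sketch: TABLOCK requests a bulk-update table lock, which allows the load to be minimally logged under the simple or bulk-logged recovery model, and the BATCHSIZE value of 100000 is an assumed figure to be tuned against the real file.

-- Same load as above, with two speed-related options added.
-- TABLOCK: take a bulk-update table lock so the load can be minimally logged.
-- BATCHSIZE: commit every 100000 rows (assumed value, tune as needed).
BULK INSERT test07232
FROM 'D:\2017-7-22.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '0x0a',
    TABLOCK,
    BATCHSIZE = 100000
);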

Summary

To solve a problem, you must start from the problem itself and find its root cause; only then can it be solved properly.

The preceding section describes how to quickly import a large CSV file into SQL Server. I hope it is helpful to you. If you have any questions, please leave a message and we will reply in a timely manner.
