There are many ways to import text data into SQL Server; for text formats (CSV and TXT), BULK INSERT is the simplest to use.
1. The BULK INSERT command, in its simplest form:
BULK INSERT schema_name.table_name
FROM 'data_file'
WITH (FIELDTERMINATOR = 'field_terminator', ROWTERMINATOR = 'row_terminator')
2. Use BULK INSERT to import TXT data into the database. The text data uses a tab character to separate columns and a newline character to separate rows.
-- Create staging table
CREATE TABLE dbo.txt_staging
(
    col1 nvarchar(255),
    col2 nvarchar(255),
    col3 nvarchar(255)
)
GO

-- Populate data
BULK INSERT dbo.txt_staging
FROM N'D:\test.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
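After the load, a quick sanity check on the staging table confirms that the delimiters were interpreted as intended (dbo.txt_staging is the staging table created above):

```sql
-- Verify the import: row count and a sample of the loaded data
SELECT COUNT(*) AS rows_loaded FROM dbo.txt_staging;
SELECT TOP 10 col1, col2, col3 FROM dbo.txt_staging;
```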
3. Use BULK INSERT to import CSV data into the database
The CSV file uses ',' as the column delimiter and '\n' as the row delimiter.
-- Populate data
BULK INSERT dbo.txt_staging
FROM N'D:\abc.csv'
WITH (FIELDTERMINATOR = N',', ROWTERMINATOR = N'\n', CODEPAGE = N'RAW')
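If the CSV file begins with a header line, the FIRSTROW argument can skip it; a sketch, assuming the same staging table and file path as above:

```sql
-- Skip the header line by starting the import at row 2
BULK INSERT dbo.txt_staging
FROM N'D:\abc.csv'
WITH (
    FIELDTERMINATOR = N',',
    ROWTERMINATOR   = N'\n',
    FIRSTROW        = 2,        -- row 1 is the header, so begin at row 2
    CODEPAGE        = N'RAW'
);
```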
4. When importing Unicode data, save the .txt file with Unicode encoding.
When importing with BULK INSERT, you must set DATAFILETYPE = 'widechar'. The possible values are 'char' | 'native' | 'widechar' | 'widenative'; the default is 'char'.
If some columns have no value, specify the KEEPNULLS option so that those columns are set to NULL.
BULK INSERT dbo.txt_staging
FROM N'D:\abc.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', DATAFILETYPE = 'widechar', KEEPNULLS)
Appendix: BULK INSERT command syntax:
BULK INSERT [schema_name].[table_name]
FROM 'data_file'
[WITH (arguments)]
Notes on the important parameters:
- 'data_file': specifies the full path of the data file; the BULK INSERT command imports data from this file into the target table;
- ROWTERMINATOR = 'row_terminator': specifies the character that separates rows;
- FIELDTERMINATOR = 'field_terminator': specifies the character that separates fields (columns);
- DATAFILETYPE = { 'char' | 'native' | 'widechar' | 'widenative' }: specifies the encoding of the data file; 'widechar' is recommended;
- CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' }: if the data file contains single-byte (char or varchar) characters, use the CODEPAGE parameter to specify the code page of the character columns;
- BATCHSIZE = batch_size: specifies the number of rows per batch. Each batch is copied to the server as one separate transaction; if a batch fails, SQL Server rolls back that batch's transaction. By default, all data in the file is treated as a single batch;
- CHECK_CONSTRAINTS: specifies that the inserted data must be checked against all constraints on the target table during the bulk-insert operation. If CHECK_CONSTRAINTS is not specified, all CHECK and FOREIGN KEY constraints are ignored, and after the operation the constraints on the table are marked as not trusted;
- FIRE_TRIGGERS: specifies whether INSERT triggers fire. If this option is specified, INSERT triggers execute once for each successfully inserted batch; if it is not specified, INSERT triggers do not execute;
- KEEPIDENTITY: specifies that identity values from the data file are inserted into the identity column. If KEEPIDENTITY is not specified, the identity column in the target table automatically assigns unique identity values;
- KEEPNULLS: specifies that empty columns should keep NULL during the bulk-insert operation instead of receiving the columns' default values;
- TABLOCK: specifies that a table-level lock is taken for the duration of the bulk-insert operation; this reduces lock contention and improves import performance.
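Several of the parameters above can be combined in one statement; a sketch, reusing the hypothetical staging table and file path from the earlier examples:

```sql
-- Combine batching, constraint checking, triggers, and table locking
BULK INSERT dbo.txt_staging
FROM N'D:\test.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    DATAFILETYPE    = 'widechar',
    BATCHSIZE       = 10000,    -- each 10,000-row batch is its own transaction
    CHECK_CONSTRAINTS,          -- enforce CHECK and FOREIGN KEY constraints
    FIRE_TRIGGERS,              -- execute INSERT triggers per batch
    TABLOCK                     -- table-level lock to reduce lock contention
);
```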
Reference doc:
BULK INSERT (Transact-SQL)
Bulk Insert: Import text data (CSV and TXT) into the database