Bulk Insert Syntax

BULK INSERT imports a data file into a database table or view in a user-specified format.

BULK INSERT
   [ database_name . [ schema_name ] . | schema_name . ] [ table_name | view_name ]
      FROM 'data_file'
      [ WITH
       (
      [ [ , ] BATCHSIZE = batch_size ]
      [ [ , ] CHECK_CONSTRAINTS ]
      [ [ , ] CODEPAGE = { 'ACP' | 'OEM' | 'RAW' | 'code_page' } ]
      [ [ , ] DATAFILETYPE = { 'char' | 'native' | 'widechar' | 'widenative' } ]
      [ [ , ] FIELDTERMINATOR = 'field_terminator' ]
      [ [ , ] FIRSTROW = first_row ]
      [ [ , ] FIRE_TRIGGERS ]
      [ [ , ] FORMATFILE = 'format_file_path' ]
      [ [ , ] KEEPIDENTITY ]
      [ [ , ] KEEPNULLS ]
      [ [ , ] KILOBYTES_PER_BATCH = kilobytes_per_batch ]
      [ [ , ] LASTROW = last_row ]
      [ [ , ] MAXERRORS = max_errors ]
      [ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
      [ [ , ] ROWS_PER_BATCH = rows_per_batch ]
      [ [ , ] ROWTERMINATOR = 'row_terminator' ]
      [ [ , ] TABLOCK ]
      [ [ , ] ERRORFILE = 'file_name' ]
       ) ]
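For example, a minimal statement built from this syntax might look like the sketch below; the table dbo.Sales, the file path, and the delimiters are hypothetical and should be adjusted to your environment.

BULK INSERT dbo.Sales
FROM 'C:\Data\sales.txt'
WITH
(
    FIELDTERMINATOR = ',',   -- field delimiter used in the data file
    ROWTERMINATOR = '\n'     -- row delimiter used in the data file
);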


Critical Arguments

'data_file'

Is the full path of the data file that contains the data to import into the specified table or view. BULK INSERT can import data from a disk (including a network drive, floppy disk, hard disk, and so on).

data_file must specify a valid path from the server on which SQL Server is running. If data_file is a remote file, specify the Universal Naming Convention (UNC) name. A UNC name has the form \\SystemName\ShareName\Path\FileName.

For example, \\SystemX\DiskZ\Sales\update.txt.
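A statement that reads that remote file could be sketched as follows; the target table dbo.SalesUpdate and the tab delimiter are assumptions, and the account SQL Server uses to read the file needs read permission on the share.

BULK INSERT dbo.SalesUpdate
FROM '\\SystemX\DiskZ\Sales\update.txt'   -- UNC path to the remote data file
WITH
(
    FIELDTERMINATOR = '\t',   -- assumed tab-delimited file
    ROWTERMINATOR = '\n'
);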

BATCHSIZE = batch_size

Specifies the number of rows in a batch. Each batch is copied to the server as one transaction; SQL Server commits or rolls back the transaction for each batch. By default, all data in the specified data file is one batch. For information about performance considerations, see "Performance Considerations," later in this topic.
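For example, the following sketch (hypothetical table and file) copies the file in batches of 10,000 rows, so each batch commits as its own transaction:

BULK INSERT dbo.Sales
FROM 'C:\Data\sales.txt'
WITH
(
    BATCHSIZE = 10000,        -- each 10,000-row batch is one transaction
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);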

CHECK_CONSTRAINTS

Specifies that all constraints on the target table or view must be checked during the bulk-import operation. Without the CHECK_CONSTRAINTS option, any CHECK and FOREIGN KEY constraints are ignored, and after the operation the constraints on the table are marked as not-trusted.

UNIQUE and PRIMARY KEY constraints are always enforced. When importing into a character column that is defined with a NOT NULL constraint, BULK INSERT inserts a blank string when there is no value in the text file.

At some point, you must examine the constraints on the whole table. If the table was non-empty before the bulk-import operation, the cost of revalidating the constraints may exceed the cost of applying CHECK constraints to the incremental data.

A situation in which you might want constraints disabled (the default behavior) is when the input data contains rows that violate constraints. With CHECK constraints disabled, you can import the data and then use Transact-SQL statements to remove the invalid data.
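Both approaches can be sketched as follows; the table, file, column, and CHECK rule are hypothetical.

-- Option 1: enforce CHECK and FOREIGN KEY constraints during the load.
BULK INSERT dbo.Sales
FROM 'C:\Data\sales.txt'
WITH (CHECK_CONSTRAINTS, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Option 2: load with constraints ignored (the default), remove invalid rows,
-- then revalidate so the constraints are trusted again.
BULK INSERT dbo.Sales
FROM 'C:\Data\sales.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

DELETE FROM dbo.Sales
WHERE Quantity < 0;   -- hypothetical cleanup of rows that violate the CHECK rule

ALTER TABLE dbo.Sales WITH CHECK CHECK CONSTRAINT ALL;   -- revalidate all constraints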

KEEPIDENTITY

Specifies that the identity value or values in the imported data file are to be used for the identity column. If KEEPIDENTITY is not specified, the identity values for this column are verified but not imported, and SQL Server automatically assigns unique values based on the seed and increment values specified during table creation. If the data file does not contain values for the identity column in the table or view, use a format file to specify that the identity column in the table or view should be skipped when importing data; SQL Server automatically assigns unique values for the column. For more information, see DBCC CHECKIDENT (Transact-SQL).

For more information about keeping identity values, see Keep Identity Values When Bulk Importing Data (SQL Server).
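A sketch of both behaviors, assuming a hypothetical table dbo.Orders whose OrderID identity column appears in the data file:

-- Keep the identity values supplied in the data file.
BULK INSERT dbo.Orders
FROM 'C:\Data\orders.txt'
WITH (KEEPIDENTITY, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Without KEEPIDENTITY, the values in the file are verified but not imported,
-- and SQL Server assigns new values from the column's seed and increment.
BULK INSERT dbo.Orders
FROM 'C:\Data\orders.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');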

KEEPNULLS

Specifies that empty columns should retain a null value during the bulk-import operation, instead of having any default values inserted for the columns. For more information, see Keep Nulls or Use Default Values During Bulk Import (SQL Server).
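For example, assuming a hypothetical table in which some columns have DEFAULT definitions:

-- Empty fields in the file are loaded as NULL instead of the column defaults.
BULK INSERT dbo.Sales
FROM 'C:\Data\sales.txt'
WITH (KEEPNULLS, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');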

Performance Considerations

If the number of pages to be flushed in a single batch exceeds an internal threshold, a full scan of the buffer pool might occur to identify which pages to flush when the batch commits. This full scan can hurt bulk-import performance. A likely case of exceeding the internal threshold occurs when a large buffer pool is combined with a slow I/O subsystem. To avoid buffer overflows on large machines, either do not use the TABLOCK hint (which removes the bulk optimizations) or use a smaller batch size (which preserves the bulk optimizations).

Because computers vary, we recommend that you test various batch sizes with your data load to find out what works best for you.
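One rough way to run such a test is to time the same load with different BATCHSIZE values and compare the results; the table, file, and chosen values in the sketch below are hypothetical.

DECLARE @start datetime2 = SYSDATETIME();

BULK INSERT dbo.Sales
FROM 'C:\Data\sales.txt'
WITH (TABLOCK, BATCHSIZE = 50000, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Repeat with other BATCHSIZE values and compare the elapsed times.
SELECT DATEDIFF(SECOND, @start, SYSDATETIME()) AS ElapsedSeconds;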

Reference Documentation:

MSDN BULK INSERT (Transact-SQL)
