BCP Export and Import of SQL Server Bulk Data: A Practical Tutorial


This tutorial introduces the BCP utility, a powerful tool for exporting and importing large volumes of data. It also covers BULK INSERT for bulk data import, and practical techniques combining BCP with BULK INSERT for data interfaces (tested on SQL Server 2008 R2).

1. BCP usage

The BCP utility can copy large volumes of data between a Microsoft SQL Server instance and a data file in a specified format. It can be used to import a large number of new rows into SQL Server tables, or to export table data to data files. Unless used with the queryout option, the utility requires no knowledge of Transact-SQL. BCP can be run either at the CMD prompt or from SSMS.



bcp {[[database_name.][schema].]{table_name | view_name} | "query"} {in | out | queryout | format} data_file [-m max_errors] [-f format_file] [-x] [-e err_file] [-F first_row] [-L last_row] [-b batch_size] [-d database_name] [-n] [-c] [-N] [-w] [-V (70 | 80 | 90)] [-q] [-C {ACP | OEM | RAW | code_page}] [-t field_term] [-r row_term] [-i input_file] [-o output_file] [-a packet_size] [-S server_name[\instance_name]] [-U login_id] [-P password] [-T] [-v] [-R] [-k] [-E] [-h "hint [,...n]"]

Example 1: (original screenshot of a bcp export command omitted)

Example 2: (original screenshot omitted)

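The screenshots for the two examples are missing from the source. As a rough illustration of the syntax above, here is a small Python helper that assembles a bcp command line from its parts; the table name, output path, and server\instance below are hypothetical, and only a few of the switches are modeled.

```python
# Minimal sketch: assemble a bcp command line from its parts.
# The table name, output path, and server\instance are hypothetical.

def build_bcp_command(table, direction, data_file, **options):
    """Build a command line like:
    bcp TestDB_2005.dbo.T1 out E:\\T1_01.txt -c -Sken\\SQLSERVER08R2 -T
    """
    parts = ["bcp", table, direction, data_file]
    if options.get("char_mode"):      # -c: character data type
        parts.append("-c")
    if options.get("unicode_mode"):   # -w: Unicode (nchar) data type
        parts.append("-w")
    if "field_term" in options:       # -t: field terminator
        parts.append("-t" + options["field_term"])
    if "server" in options:           # -S: server\instance to connect to
        parts.append("-S" + options["server"])
    if options.get("trusted"):        # -T: trusted (integrated) connection
        parts.append("-T")
    return " ".join(parts)

cmd = build_bcp_command("TestDB_2005.dbo.T1", "out", r"E:\T1_01.txt",
                        char_mode=True, server=r"ken\SQLSERVER08R2", trusted=True)
print(cmd)  # bcp TestDB_2005.dbo.T1 out E:\T1_01.txt -c -Sken\SQLSERVER08R2 -T
```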
In SSMS, you can also execute:

EXEC master..xp_cmdshell 'bcp TestDB_2005.dbo.T1 out E:\T1_02.txt -c -t'
GO



EXEC master..xp_cmdshell 'bcp "SELECT * FROM TestDB_2005.dbo.T1" queryout E:\T1_03.txt -c -t'
GO




Personally, I prefer the second method, with the queryout option, because it gives more flexible control over the data to be exported. You may see the following error message when running the bcp command:

Msg 15281, Level 16, State 1, Procedure xp_cmdshell, Line 1
SQL Server blocked access to procedure 'sys.xp_cmdshell' of component 'xp_cmdshell' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'xp_cmdshell' by using sp_configure. For more information about enabling 'xp_cmdshell', see "Surface Area Configuration" in SQL Server Books Online.

For security reasons, the xp_cmdshell option is disabled by default. Use the following statements to enable it:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
GO


After use, you can disable xp_cmdshell again:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 0
RECONFIGURE
GO


BCP import data

Change "out" to "in" in the command from Example 2 to import the data.




Use bulk insert to import data

BULK INSERT dbo.T1 FROM 'E:\T1.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')



For a more detailed description of BULK INSERT, refer to: https://msdn.microsoft.com/zh-cn/library/ms188365%28v=sql.105%29.aspx

Compared with BCP import, bulk insert provides more flexible options.
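To see what kind of file the BULK INSERT statement above expects, here is a small Python sketch that writes and then re-parses a tab-delimited, newline-terminated data file; the rows and values are made up for illustration.

```python
import os
import tempfile

# Write a data file in the shape BULK INSERT above expects:
# fields separated by tabs (\t), rows terminated by newlines (\n).
rows = [
    (1, "blog service", "3F2504E0-4F89-11D3-9A0C-0305E82C3301"),
    (2, "blog service", "A1B2C3D4-0000-0000-0000-000000000000"),
]

path = os.path.join(tempfile.mkdtemp(), "T1.txt")
with open(path, "w", encoding="utf-8", newline="") as f:
    for r in rows:
        f.write("\t".join(str(v) for v in r) + "\n")

# Re-parse the file, splitting it the same way the terminators would.
with open(path, encoding="utf-8") as f:
    parsed = [line.rstrip("\n").split("\t") for line in f]

print(parsed[0])  # ['1', 'blog service', '3F2504E0-4F89-11D3-9A0C-0305E82C3301']
```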


Descriptions of several common BCP parameters:

database_name — The database in which the specified table or view resides. If not specified, the user's default database is used.

in | out | queryout | format
  • in — Copies from a file into a database table or view.
  • out — Copies from a database table or view to a file. If an existing file is specified, it is overwritten. When extracting data, note that the bcp utility represents an empty string as NULL and a NULL value as an empty string.
  • queryout — Copies from a query. This option must be specified only when bulk copying data from a query.
  • format — Creates a format file based on the specified option (-n, -c, -w, or -N) and the delimiters of the table or view. When bulk copying data, the bcp command can reference the format file to avoid re-entering format information interactively. The format option requires the -f option; to create an XML format file, also specify -x.

-c — Performs the operation using a character data type. This option does not prompt for each field; it uses char as the storage type, without prefixes, with \t (tab) as the field separator and \r\n (newline) as the row terminator.

-w — Performs the bulk-copy operation using Unicode characters. This option does not prompt for each field; it uses nchar as the storage type, without prefixes, with \t (tab) as the field separator and \n (newline) as the row terminator.

-t field_term — Specifies the field terminator. The default is \t (tab). Use this parameter to override the default field terminator.

-r row_term — Specifies the row terminator. The default is \n (newline). Use this parameter to override the default row terminator.

-S server_name[\instance_name] — Specifies the SQL Server instance to connect to. If no server is specified, the bcp utility connects to the default instance of SQL Server on the local computer. This option is required when running bcp from a remote computer or against a named instance. To connect to the default instance of SQL Server on a server, specify only server_name; to connect to a named instance, specify server_name\instance_name.

-U login_id — The login ID used to connect to SQL Server.

-P password — The password for the login ID. If this option is not used, the bcp command prompts for the password. If this option is used at the end of the command line without a password, bcp uses the default password (NULL).

-T — Specifies that the bcp utility connects to SQL Server with a trusted connection, using integrated security. The network user's security credentials, login_id, and password are not required. If -T is not specified, you must specify -U and -P to log in successfully.
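The practical difference between -c and -w shows up with non-ASCII data: -c writes single-byte character data, while -w writes Unicode (UTF-16). A Python sketch of the same distinction follows; the sample string is made up, and BCP's exact single-byte behavior depends on the code page chosen with -C.

```python
text = "blog service 博客"  # mixed English and Chinese, like the test data later on

# -w style: Unicode. Every character round-trips (BCP -w writes UTF-16LE).
unicode_bytes = text.encode("utf-16-le")
assert unicode_bytes.decode("utf-16-le") == text

# -c style: single-byte character data. Characters outside the
# target code page are lost (here simulated with ASCII + replacement).
char_bytes = text.encode("ascii", errors="replace")
print(char_bytes.decode("ascii"))  # 'blog service ??' -- the Chinese characters are gone
```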

More detailed parameters: https://msdn.microsoft.com/zh-cn/library/ms162802%28v=sql.105%29.aspx


2. Practice

2.1 export data

This section demonstrates export and import with BCP and BULK INSERT. To stay close to a real environment, create a table with 10 fields covering several common data types, and construct 20 million rows of data, including Chinese and English text. To insert the test data faster, no index is created at first. Before executing the following code, make sure the database recovery model is set to bulk-logged or simple, and that there is sufficient disk space (in my test, the data file and log file required about 40 GB after the data was generated).

USE AdventureWorks2008R2
GO
IF OBJECT_ID(N'T1') IS NOT NULL
BEGIN
    DROP TABLE T1
END
GO
CREATE TABLE T1 (
    id_ INT,
    col_1 NVARCHAR(50),
    col_2 NVARCHAR(40),
    col_3 NVARCHAR(40),
    col_4 NVARCHAR(40),
    col_5 INT,
    col_6 FLOAT,
    col_7 DECIMAL(18, 8),
    col_8 BIT,
    input_date DATETIME DEFAULT (GETDATE())
)
GO
;WITH CTE1 AS (
    SELECT a.[object_id]
    FROM master.sys.all_objects AS a, master.sys.all_objects AS b, sys.databases AS c
    WHERE c.database_id <= 5
), CTE2 AS (
    SELECT ROW_NUMBER() OVER (ORDER BY [object_id]) AS row_no FROM CTE1
)
INSERT INTO T1 (id_, col_1, col_2, col_3, col_4, col_5, col_6, col_7, col_8)
SELECT row_no,
       REPLICATE(N'blog service', 10),
       NEWID(),
       NEWID(),
       NEWID(),
       CAST(row_no * RAND() * 10 AS INT),
       row_no * RAND(),
       row_no * RAND(),
       CAST(row_no * RAND() AS INT) % 2
FROM CTE2
WHERE row_no <= 20000000
GO

The process takes several minutes to complete. Please wait.
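As a cross-check of the row shape the T-SQL above produces, here is a scaled-down Python sketch that builds a few rows with the same column layout; the values and the row count are made up for illustration.

```python
import random
import uuid
from datetime import datetime

random.seed(42)  # deterministic values for the example

def make_row(row_no):
    # Mirrors T1's columns: id_, col_1 .. col_8, input_date
    return (
        row_no,
        "blog service" * 10,                 # col_1: REPLICATE(N'blog service', 10)
        str(uuid.uuid4()),                   # col_2: NEWID()
        str(uuid.uuid4()),                   # col_3: NEWID()
        str(uuid.uuid4()),                   # col_4: NEWID()
        int(row_no * random.random() * 10),  # col_5: INT
        row_no * random.random(),            # col_6: FLOAT
        round(row_no * random.random(), 8),  # col_7: DECIMAL(18, 8)
        int(row_no * random.random()) % 2,   # col_8: BIT (0 or 1)
        datetime.now().isoformat(sep=" "),   # input_date: GETDATE()
    )

rows = [make_row(i) for i in range(1, 6)]
print(len(rows), len(rows[0]))  # 5 10
```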

Use the usage described above to export data:

EXEC master..xp_cmdshell 'bcp AdventureWorks2008R2.dbo.T1 out E:\T1_04.txt -w -T -S ken\SQLSERVER08R2'
GO


The -w parameter is used here. Exporting the 20 million records with BCP from the CMD prompt took about 8 minutes on my notebook. BCP can also be executed in SSMS, where it took a little over six minutes, faster than in CMD. The generated files are the same size, each nearly 5 GB.




For complex bulk imports, a format file is often used. A format file is required in the following cases:

  • Multiple tables with different schemas use the same data file as the data source.

  • The number of fields in the data file differs from the number of columns in the target table, for example when:

    The target table contains at least one column that defines a default value or allows NULL.
    You do not have SELECT/INSERT permission on one or more columns of the target table.
    Two or more tables with different schemas use the same data file.

  • The column order of the data file and the table differs.

  • The terminating characters or prefix lengths of the data file columns differ.

This demonstration does not use a format file for export and import. For more information, see Books Online.


2.2 import data

Use BULK INSERT to import the data into the target table. To improve performance, you can temporarily drop the indexes and re-create them after the import. Be sure to reserve enough disk space. The import took about 15 minutes to complete.


3. Expansion

3.1 Data export and import automation and data interfaces

Due to my work, I sometimes need to develop customer data interfaces that automatically import a large amount of data every day. Constrained by the application and other factors, SQL Server's BULK INSERT is used directly to read the intermediate files from the relevant directories each day. Although the directory is dynamic, the intermediate file has a fixed format, so by writing dynamic SQL, wrapping it in a stored procedure, and putting it into a job with a schedule, the whole process can be automated. The following is a simple demonstration:

3.1.1 write an import script

CREATE PROCEDURE sp_import_data
AS
BEGIN
    DECLARE @path NVARCHAR(500)
    DECLARE @sql NVARCHAR(MAX)
    /* The S_PARAMETERS table holds the path, which can be configured in the application */
    SELECT @path = value_ + CONVERT(NVARCHAR, GETDATE(), 23) + '.txt'
    FROM S_PARAMETERS WHERE [type] = 'import'
    /* T4 is a temporary intermediate table. First read the data from the file into the
       intermediate table, then insert the data from T4 into the actual business table */
    SET @sql = N'BULK INSERT T4 FROM ''' + @path + ''' WITH (FIELDTERMINATOR = ''*'', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
END
GO

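The key trick in the procedure is building the day's file name from the date: CONVERT(NVARCHAR, GETDATE(), 23) yields a yyyy-mm-dd string. The same path construction in Python, with a hypothetical base directory:

```python
from datetime import date

def import_path(base, day=None):
    # CONVERT(NVARCHAR, GETDATE(), 23) produces an ISO date: yyyy-mm-dd
    day = day or date.today()
    return base + day.isoformat() + ".txt"

# Hypothetical base path, fixed date for a reproducible example.
print(import_path(r"E:\interface\data_", date(2015, 6, 1)))
# E:\interface\data_2015-06-01.txt
```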

3.1.2 configure a JOB

First, SQL Server must have permission to read the relevant directories and files. In Windows Services, open the SQL Server service properties and, on the Log On tab, start SQL Server under a user with sufficient permissions to read the relevant directories, such as network drives.



Create a job on the SQL Server Agent


On the General page, select Owner. Select sa here.



On the Steps page, enter the call to the stored procedure in the Command box.


On the Schedules page, configure the execution time and frequency. Complete.




3.2 Downgrading a database from a higher version to a lower version

Generally, a database backed up in an earlier version can be restored directly in a later version. For example, a SQL Server 2000 backup can be restored in SQL Server 2005 or 2008, unless the version span is too large: a SQL Server 2000 backup cannot be restored directly in SQL Server 2012; it must first be restored to SQL Server 2008, backed up from there, and finally restored to SQL Server 2012.

A backup from a later version cannot be restored in an earlier version; for example, a SQL Server 2008 backup cannot be restored in SQL Server 2005 or 2000. In reality, however, this requirement does come up. The best approach is to connect the two different versions of the database through the later version's SSMS and move the data from the higher version to the lower one using export/import between the databases or scripts; this is fast and safe. If the two databases cannot be connected, the data can only be exported and then imported. For a small amount of data, use SSMS's export/import function or generate a script that contains the data. For big data, that is a disaster: for the 20-million-row table above, the generated data script is several GB and cannot be executed directly in SSMS. Only bulk export and import tools such as BCP and BULK INSERT will work.



4. Summary

You can use BCP and BULK INSERT to quickly export and import large volumes of data and to automate the work. The operations are not complicated, even for small amounts of data. They are very practical tools alongside the graphical tools in SSMS.

Use the bcp utility to import and export large data volumes

This topic describes how to use the bcp utility to export data from anywhere in a SQL Server database that a SELECT statement works, including partitioned views.

The bcp utility (bcp.exe) is a command-line tool that uses the Bulk Copy Program (BCP) API. The bcp utility can perform the following tasks:

  • Bulk export data from a SQL Server table into a data file.
  • Bulk export data from a query.
  • Bulk import data from a data file into a SQL Server table.
  • Generate format files.

The bcp utility is accessed via the bcp command. When using the bcp command to bulk import data, you must understand the table schema and the data types of its columns, unless you use an existing format file.

The bcp utility can export data from a SQL Server table into a data file for use by other programs. It can also import data from other programs, typically another database management system (DBMS), into a SQL Server table. The data is first exported from the source program to a data file, and then, in a separate operation, copied from the data file into a SQL Server table.

The bcp command can take switches that specify the data types of the data file and other information. If these switches are not specified, the command prompts for formatting information, such as the type of each data field in the data file. It then asks whether you want to create a format file containing your interactive responses. A format file is usually useful if you want flexibility in subsequent bulk-import or bulk-export operations: you can specify it in later bcp commands against equivalent data files. For more information, see "Specify Data Formats for Compatibility when Using bcp" in Books Online.


Starting with Microsoft SQL Server 7.0, the bcp utility is written using the ODBC bulk copy API. Earlier versions of bcp were written using the DB-Library bulk copy API.
