BCP Export Import


Objective

SQL Server provides a variety of tools for exporting and importing data. You can also write SQL scripts, use stored procedures, generate the required data files, or even generate a script file containing both SQL statements and data. Each approach has its advantages and disadvantages and suits different needs. The following introduces the bulk data export tool, the bcp utility, together with BULK INSERT for importing bulk data, and a practical example of using bcp with BULK INSERT as a data interface (practiced on SQL Server 2008 R2).

1. Use of bcp

The bcp utility can bulk copy data between an instance of Microsoft SQL Server and a data file in a user-specified format. Use the bcp utility to import a large number of new rows into a SQL Server table, or to export table data into a data file. Unless used with the queryout option, the utility requires no knowledge of Transact-SQL. bcp can be run either at the CMD prompt or from SSMS.

Figure-1

Syntax:

bcp {[[database_name.][schema].]{table_name | view_name} | "query"}
    {in | out | queryout | format} data_file
    [-mmax_errors] [-fformat_file] [-x] [-eerr_file]
    [-Ffirst_row] [-Llast_row] [-bbatch_size]
    [-ddatabase_name] [-n] [-c] [-N] [-w] [-V (70 | 80 | 90)]
    [-q] [-C {ACP | OEM | RAW | code_page}] [-tfield_term]
    [-rrow_term] [-iinput_file] [-ooutput_file] [-apacket_size]
    [-S [server_name[\instance_name]]] [-Ulogin_id] [-Ppassword]
    [-T] [-v] [-R] [-k] [-E] [-h"hint [,...n]"]

Simple Export Example 1:

Figure-2

Simple Export Example 2:

Figure-3
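Both examples run bcp at the CMD prompt. A minimal sketch of such export commands, assuming the testdb_2005.dbo.T1 table used in the SSMS examples below, a trusted connection, and placeholder output paths (the exact commands are in the figures):

bcp testdb_2005.dbo.T1 out E:\T1.txt -c -T
bcp testdb_2005.dbo.T1 out E:\T1_unicode.txt -w -T -S ken\sqlserver08r2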

The same can be done in SSMS:

EXEC [master]..xp_cmdshell 'bcp testdb_2005.dbo.T1 out E:\T1_02.txt -c -T'
GO

Code-1

Figure-4

EXEC [master]..xp_cmdshell 'bcp "SELECT * FROM testdb_2005.dbo.T1" queryout E:\T1_03.txt -c -T'
GO

Code-2

Figure-5

Personally, I prefer the second form, with the queryout option, because it gives you more flexibility in controlling which data is exported.
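For example, the query can select specific columns, filter rows, or sort before exporting. A rough sketch (the column names and filter condition here are hypothetical, not taken from the actual table):

EXEC [master]..xp_cmdshell 'bcp "SELECT col_a, col_b FROM testdb_2005.dbo.T1 WHERE col_a > 100 ORDER BY col_a" queryout E:\T1_filtered.txt -c -T'
GO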

If executing the bcp command produces an error message like the following:

Msg 15281, Level 16, State 1, Procedure xp_cmdshell, Line 1
SQL Server blocked access to procedure 'sys.xp_cmdshell' of component 'xp_cmdshell' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'xp_cmdshell' by using sp_configure. For more information about enabling 'xp_cmdshell', see "Surface Area Configuration" in SQL Server Books Online.


For security reasons, the system does not have the xp_cmdshell option turned on by default. Use the following statement to turn this option on.

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
GO

Code-3

After use, you can turn xp_cmdshell off again.

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'xp_cmdshell', 0
RECONFIGURE
GO

Code-4

BCP Import data

Change out in Figure-2 to in to import the data; a sketch of such a command follows.
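A minimal sketch of the import command, assuming the E:\T1.txt file produced by the export above and a trusted connection (the actual command and its output are shown in the figures):

bcp testdb_2005.dbo.T1 in E:\T1.txt -c -T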

Figure-6

Figure-7

Import data using BULK INSERT

BULK INSERT dbo.T1
FROM 'E:\T1.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
)

Code-5

Figure-8

For more detailed instructions on BULK INSERT, refer to: https://msdn.microsoft.com/zh-cn/library/ms188365%28v=sql.105%29.aspx

BULK INSERT offers more flexible options than importing with bcp.
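A sketch showing a few of the options BULK INSERT supports; the header row, batch size, error-file path, and error limit here are illustrative assumptions rather than values from the original example:

BULK INSERT dbo.T1
FROM 'E:\T1.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2,                      -- skip a header row, if the file has one
    BATCHSIZE = 100000,                -- commit every 100,000 rows
    MAXERRORS = 10,                    -- tolerate up to 10 bad rows
    ERRORFILE = 'E:\T1_errors.txt',    -- write rejected rows to this file
    TABLOCK                            -- take a table lock for a faster, minimally logged load
)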

Descriptions of a few common bcp parameters:

database_name   The name of the database in which the specified table or view resides. If not specified, the user's default database is used.

in | out | queryout | format

  • in   Copies from a file into the database table or view.

  • out   Copies from the database table or view into a file. If an existing file is specified, the file is overwritten. When extracting data, note that the bcp utility represents an empty string as NULL and a null string as an empty string.

  • queryout   Copies from a query; this option must be specified only when bulk copying data from a query.

  • format   Creates a format file based on the specified option (-n, -c, -w, or -N) and the delimiters of the table or view. When bulk copying data, the bcp command can reference the format file, avoiding the need to re-enter formatting information interactively. The format option requires the -f option; to create an XML format file, also specify the -x option.
-c   Performs the operation using a character data type. This option does not prompt for each field; it uses char as the storage type, without prefixes, with \t (tab) as the field separator and \r\n (newline) as the row terminator.
-w   Performs the bulk copy operation using Unicode characters. This option does not prompt for each field; it uses nchar as the storage type, without prefixes, with \t (tab) as the field separator and \n (newline) as the row terminator.
-t field_term   Specifies the field terminator. The default is \t (tab). Use this parameter to override the default field terminator.
-r row_term   Specifies the row terminator. The default is \n (newline). Use this parameter to override the default row terminator.
-S server_name[\instance_name]   Specifies the instance of SQL Server to connect to. If no server is specified, the bcp utility connects to the default instance of SQL Server on the local computer. This option is required when running a bcp command from a remote computer on the network or against a local named instance. To connect to the default instance on a server, specify only server_name. To connect to a named instance, specify server_name\instance_name.
-U login_id   Specifies the login ID used to connect to SQL Server.
-P password   Specifies the password for the login ID. If this option is not used, the bcp command prompts for a password. If this option is used at the end of the command without a password, bcp uses the default password (NULL).
-T   Specifies that the bcp utility connects to SQL Server with a trusted connection using integrated security. The network user's security credentials, login_id, and password are not required. If -T is not specified, -U and -P must be specified to log in successfully.

For more detailed parameters, please refer to: https://msdn.microsoft.com/zh-cn/library/ms162802%28v=sql.105%29.aspx

2. Practice

2.1 Exporting data

Having introduced bcp export/import and BULK INSERT, here is some hands-on practice. To get closer to the real world, create a table with 10 fields covering several commonly used data types and generate 20 million rows of data, including both Chinese and English text. To insert the test data faster, no indexes are created at first. Before executing the code below, make sure the database recovery model is set to bulk-logged or simple, and that there is enough disk space (in my practice, the data and log files may take up to 40 GB).

USE AdventureWorks2008R2
GO
IF OBJECT_ID(N'T1') IS NOT NULL
BEGIN
    DROP TABLE T1
END
GO
CREATE TABLE T1 (
    id_ INT,
    col_1 NVARCHAR(100),    -- column width assumed
    col_2 NVARCHAR(100),
    col_3 NVARCHAR(100),
    col_4 NVARCHAR(100),
    col_5 INT,
    col_6 FLOAT,
    col_7 DECIMAL(18,8),
    col_8 BIT,
    input_date DATETIME DEFAULT (GETDATE())
)
GO
;WITH CTE1 AS (
    -- cross join system views to generate enough rows
    SELECT a.[object_id]
    FROM master.sys.all_objects AS a, master.sys.all_objects AS b, sys.databases AS c
    WHERE c.database_id <= 5
),
CTE2 AS (
    SELECT ROW_NUMBER() OVER (ORDER BY [object_id]) AS row_no
    FROM CTE1
)
INSERT INTO T1 (id_, col_1, col_2, col_3, col_4, col_5, col_6, col_7, col_8)
SELECT row_no,
       REPLICATE(N'Blog Park', 10),
       NEWID(),
       NEWID(),
       NEWID(),
       CAST(row_no * RAND() * 100 AS INT),    -- multiplier assumed
       row_no * RAND(),
       row_no * RAND(),
       CAST(row_no * RAND() AS INT) % 2
FROM CTE2
WHERE row_no <= 20000000
GO

Code-6

The process will take a few minutes to finish, so please be patient. For more on constructing test data, refer to my other blog post: http://www.cnblogs.com/fishparadise/p/4781035.html

Export the data using the approach described above:

EXEC [master]..xp_cmdshell 'bcp AdventureWorks2008R2.dbo.T1 out E:\T1_04.txt -w -T -S ken\sqlserver08r2'
GO

Code-7

The -w parameter is used here. bcp can export the data at the CMD prompt; exporting the 20 million records took my notebook nearly 8 minutes. bcp can also be executed from SSMS, which took a little over 6 minutes, faster than CMD, and both runs produced files of the same size, nearly 5 GB each.

Figure-9

Figure-10

For complex bulk import cases, a format file is often required. You must use a format file in the following situations:

    • Multiple tables with different schemas use the same data file as the data source.

    • The number of fields in the data file differs from the number of columns in the target table; for example:

      • The target table contains at least one column for which a default value is defined or NULL is allowed.

      • The user does not have SELECT/INSERT permissions on one or more columns of the target table.

    • The column order of the data file and the table is different.

    • The terminating characters or prefix lengths of the data file columns are different.

Exporting and importing with a format file is not demonstrated here; a minimal sketch of generating one appears below. Please refer to Books Online for more information and usage.
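A rough sketch of generating a format file for the table above with bcp's format option and then referencing it from BULK INSERT (the paths are placeholders; add -x to produce an XML format file instead):

EXEC [master]..xp_cmdshell 'bcp AdventureWorks2008R2.dbo.T1 format nul -w -f E:\T1.fmt -T'
GO

BULK INSERT dbo.T1
FROM 'E:\T1_04.txt'
WITH (FORMATFILE = 'E:\T1.fmt')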

2.2 Importing Data

Use BULK INSERT to import the data into the target table. To improve performance, you can temporarily drop the indexes and rebuild them after the import completes. Be aware that you need to reserve enough disk space. The whole import took about 15 minutes.
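A minimal sketch of what this import might look like, assuming the Unicode file E:\T1_04.txt exported earlier with -w and an empty copy of the T1 table as the target (the batch size is an illustrative choice):

BULK INSERT dbo.T1
FROM 'E:\T1_04.txt'
WITH (
    DATAFILETYPE = 'widechar',    -- the file was exported with -w (Unicode)
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n',
    BATCHSIZE = 1000000,
    TABLOCK
)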

Figure-11

3. Expansion

3.1 Data export import automation and data interface

Because of my work, I sometimes have to develop data interfaces for customers that automatically import large amounts of data every day. Limited by factors such as the applications involved, I chose to have SQL Server use BULK INSERT directly to read the intermediate files from the relevant directory automatically every day. Although the directory changes, the intermediate files have a fixed format, so by writing dynamic SQL, wrapping it in a stored procedure, putting it in a job, and configuring a schedule, the work can be fully automated. Below is a simple demonstration of the procedure:

3.1.1 Writing an import script

CREATE PROCEDURE sp_import_data
AS
BEGIN
    DECLARE @path NVARCHAR(500)    -- length assumed
    DECLARE @sql NVARCHAR(MAX)

    /* The s_parameters table holds the path, which can be configured from the application */
    SELECT @path = value_ + CONVERT(NVARCHAR, GETDATE(), 112) + '.txt'    -- date style assumed (yyyymmdd)
    FROM s_parameters
    WHERE [type] = 'import'

    /* T4 is a temporary intermediate table. The data is first read from the file into the
       intermediate table, and finally inserted from T4 into the actual business tables by script */
    SET @sql = N'BULK INSERT T4 FROM ''' + @path + ''' WITH (
        FIELDTERMINATOR = ''*'',
        ROWTERMINATOR = ''\n''
    )'
    EXEC (@sql)
END
GO

Code-8

3.1.2 Configuring the Job

The first thing to verify is that SQL Server has permission to read the relevant directories and files. In Windows Services, open the properties of the SQL Server service and, on the Log On tab, set it to start as a user with sufficient permissions to read the relevant directories, for example a network drive.

Figure-12

Create a new job in SQL Server Agent:

Figure-13

On the General page, select the owner; sa is used here.

Figure-14

On the Steps page, set the command to execute the stored procedure written above.

Figure-15

On the Schedules page, configure the execution time, frequency, and so on. That completes the setup.

Figure-16
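As an alternative to clicking through the dialogs, the same job can be created with T-SQL. A rough sketch, assuming the sp_import_data procedure above lives in a database named MyDB and the job should run daily at 02:00 (all names and times are placeholders):

USE msdb
GO
EXEC dbo.sp_add_job @job_name = N'Daily data import'
EXEC dbo.sp_add_jobstep @job_name = N'Daily data import',
     @step_name = N'Run import procedure',
     @subsystem = N'TSQL',
     @database_name = N'MyDB',
     @command = N'EXEC dbo.sp_import_data'
EXEC dbo.sp_add_schedule @schedule_name = N'Daily at 2am',
     @freq_type = 4,                 -- daily
     @freq_interval = 1,             -- every 1 day
     @active_start_time = 020000     -- 02:00:00
EXEC dbo.sp_attach_schedule @job_name = N'Daily data import', @schedule_name = N'Daily at 2am'
EXEC dbo.sp_add_jobserver @job_name = N'Daily data import', @server_name = N'(local)'
GO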

3.2 Moving data from a higher-version database to a lower version

In general, a database backed up on a lower version can be restored directly on a higher version: for example, SQL Server 2000 backups can be restored on SQL Server 2005 or 2008, unless the version span is too large. A SQL Server 2000 backup, for instance, cannot be restored directly on SQL Server 2012; it can only be restored to SQL Server 2008, backed up again from SQL Server 2008, and finally restored on SQL Server 2012.

A backup from a higher version generally cannot be restored on a lower version; for example, a SQL Server 2008 backup cannot be restored on SQL Server 2005 or 2000. In reality, however, this need does come up. The best approach is to connect to both instances directly from the higher version's SSMS and use its import/export features, or write scripts that move the data between the databases, pushing the higher-version data straight into the lower-version database. This is faster and safer. If the two databases cannot connect to each other, the only option is to export the data and then import it. For a small amount of data, use the SSMS export/import feature or generate a script that contains the data. For big data it is a disaster: for a large table like the one above with 20 million rows, the generated script is several GB in size and cannot be executed directly in SSMS. The only choice is bulk export/import tools such as bcp and BULK INSERT.

Figure-17
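A minimal sketch of moving a single table this way with bcp at the CMD prompt (the server, database, and path names are placeholders, and the table must already exist on the lower-version instance):

bcp HighVersionDB.dbo.T1 out E:\T1.dat -w -T -S high_version_server
bcp LowVersionDB.dbo.T1 in E:\T1.dat -w -T -S low_version_server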

4. Summary

Using bcp together with BULK INSERT enables fast export and import of bulk data, and the work can be automated. For small amounts of data, the operation is not very complicated either. They are very useful tools alongside the graphical tools in SSMS.

