If your work involves databases, you may need to load data from external data files into SQL Server. This article shows how to use the BULK INSERT command to import data, and how to adjust some of its options to make imports easier and more efficient.
BULK INSERT
In SQL Server, BULK INSERT is a T-SQL command that loads an external file in a specified format into a database table. It lets developers load data directly into tables without relying on external tools such as Integration Services. Although BULK INSERT does not support complex logic or transformations, it offers formatting-related options that control how the import is performed. One limitation is that BULK INSERT can only import data into SQL Server; it cannot export it.
The following example shows how to use the BULK INSERT command. First, create a table named sales; we will insert data from a text file into this table.
CREATE TABLE [dbo].[sales]
(
    [SaleID]    [int],
    [Product]   [varchar](10) NULL,
    [SaleDate]  [datetime] NULL,
    [SalePrice] [money] NULL
)
By default, the BULK INSERT command does not fire triggers on the target table, because firing triggers would slow down the data import.
In the next example, we will create a trigger on the sales table to print the number of records inserted into the table.
CREATE TRIGGER tr_sales
ON sales
FOR INSERT
AS
BEGIN
    PRINT CAST(@@ROWCOUNT AS varchar(5)) + ' rows inserted.'
END
Here, the source data file is a text file whose values are separated by commas. The file contains 1,000 records, and its fields map directly to the columns of the sales table. Because the values are comma-separated, we only need to specify FIELDTERMINATOR. Note that when the following statement is run, the trigger we just created is not fired:
BULK INSERT sales FROM 'C:\salestext.txt' WITH (FIELDTERMINATOR = ',')
When importing a large amount of data, we sometimes do want the triggers to fire. The following script uses the FIRE_TRIGGERS option to specify that any insert trigger on the target table should be fired:
BULK INSERT sales FROM 'C:\salestext.txt' WITH (FIELDTERMINATOR = ',', FIRE_TRIGGERS)
The BATCHSIZE option sets how many rows are inserted into the table in a single transaction. In the previous example, all 1,000 rows were inserted in one transaction. In the following example, we set BATCHSIZE to 2, so the import runs as 500 separate insert transactions. That also means the trigger fires 500 times, so 500 print messages are written to the screen.
BULK INSERT sales FROM 'C:\salestext.txt' WITH (FIELDTERMINATOR = ',', FIRE_TRIGGERS, BATCHSIZE = 2)
BULK INSERT is not limited to drives mapped locally on the SQL Server 2005 machine. The following statement shows how to import the salestext file from drive D of a server named fileserver:
BULK INSERT sales FROM '\\fileserver\d$\salestext.txt' WITH (FIELDTERMINATOR = ',')
Sometimes it is a good idea to examine the data before importing it. The following statement uses the OPENROWSET function with the BULK option to read the source data from the salestext text file. The statement also requires a format file (its contents are not listed here) that describes the layout of the data in the text file.
SELECT *
FROM OPENROWSET(BULK 'C:\salestext.txt',
                FORMATFILE = 'C:\salesformat.xml') AS mytable;
GO
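Once the preview looks correct, the same rowset can feed an ordinary INSERT so the checked data lands in the sales table. This is only a sketch: the column names exposed by OPENROWSET depend on the format file, so the source names below are assumptions.

-- Sketch: load the previewed rows into the sales table.
-- The source column names (SaleID, Product, SaleDate, SalePrice) are assumptions;
-- they must match whatever the format file defines.
INSERT INTO sales (SaleID, Product, SaleDate, SalePrice)
SELECT SaleID, Product, SaleDate, SalePrice
FROM OPENROWSET(BULK 'C:\salestext.txt',
                FORMATFILE = 'C:\salesformat.xml') AS src;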
Recently, while analyzing the database for a project, I needed to import a very large amount of data: up to 2 million rows into SQL Server at a time. With ordinary INSERT statements the job would probably have taken more than an hour. I first considered BCP, but it is command-line based, too unfriendly for users, and unlikely to be adopted. In the end we chose BULK INSERT: it also handles very large imports, it can be driven from code so the interface can be made very friendly, and it is fast, loading 1 million rows in under 20 seconds. In terms of speed, little else can match it.
However, this approach also has several disadvantages:
1. The destination table must be available exclusively while the data is loaded.
2. A large volume of log records is generated.
3. There are restrictions on the format of the source data file.
Compared with the speed gained, these shortcomings are acceptable, and if you are willing to give up a little speed you can exercise finer control, even over the insertion of individual rows.
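As one illustration of finer control, BULK INSERT itself offers the MAXERRORS and ERRORFILE options, which divert bad rows instead of failing the whole load. This is only a sketch; the paths and the error threshold below are hypothetical examples.

-- Sketch: tolerate a limited number of bad rows and capture them in an error file.
BULK INSERT sales FROM 'C:\salestext.txt'
WITH (FIELDTERMINATOR = ',',
      MAXERRORS = 10,                        -- allow up to 10 rejected rows
      ERRORFILE = 'C:\salestext_errors.txt'  -- rejected rows are written here
     )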
As for the large amount of log space consumed, we can switch the database to the bulk-logged recovery model before the import, so the bulk load is only minimally logged, and then restore the original recovery model once the import is complete.
The specific statement can be written as follows:
ALTER DATABASE taxi
SET RECOVERY BULK_LOGGED

BULK INSERT taxi..detail FROM 'e:\out.txt'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK
)

ALTER DATABASE taxi
SET RECOVERY FULL
These statements import the data file E:\out.txt into the detail table of the taxi database.
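As a quick sanity check afterwards, you can count the rows that arrived and confirm that the recovery model was switched back. This is just a sketch using the table and database names from the example above.

-- Sketch: verify the import and the restored recovery model.
SELECT COUNT(*) AS imported_rows FROM taxi..detail;

SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'taxi';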