1. Some methods for importing and exporting bulk data
SQL Server provides a variety of tools for importing data from and exporting data to a range of data sources, including ODBC data sources, OLE DB data sources, ASCII text files, and Excel spreadsheets.
2. Common Tools
DTS: use the Data Transformation Services Import/Export Wizard or DTS Designer to create a DTS package
Replication: use SQL Server replication to distribute published data
bcp: a command prompt utility that imports and exports data between an instance of SQL Server and a data file
BULK INSERT: imports data from a data file into an instance of SQL Server
Distributed queries: select data from another data source and insert it into an instance of SQL Server
SELECT INTO: inserts the result of a query into a new table
3. Notes on importing and exporting data
1. The destination table for imported data must exist. If the destination file for exported data already exists, its contents are overwritten; if it does not exist, bcp creates the file automatically.
2. Data in the data file must be in character format, or in the native format previously generated by the bcp tool; a small sketch contrasting the two formats follows this list.
3. You must have sufficient permissions on the table involved.
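As a quick illustration of the two file formats, here is a minimal sketch, assuming the tempdb.dbo.test table created later in this article and an sa login with password 123456 (adjust the authentication switches to your environment; xp_cmdshell must be enabled, as shown in the bcp section below):
SQL Code

-- Character format (-c): human-readable text, portable to other tools
EXEC master..xp_cmdshell 'bcp tempdb.dbo.test out e:/test_char.txt -c -Usa -P123456'
-- Native format (-n): binary format produced by bcp itself, faster to reload into SQL Server
EXEC master..xp_cmdshell 'bcp tempdb.dbo.test out e:/test_native.dat -n -Usa -P123456'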
4. Simple usage of the data import and export tools
A. DTS
DTS is a set of graphical tools and programmable objects that developers can use to extract, transform, and merge data from disparate sources into one or more destinations.
Its strength is integrating completely heterogeneous data sources, which makes it very useful in enterprise environments.
This work is organized as a DTS package: an organized collection of connections, DTS tasks, DTS transformations, and workflow constraints.
Refer to the relevant documentation for the operation of DTS.
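Although DTS is mostly operated graphically, a saved package can also be run from the command prompt with the dtsrun utility. A minimal sketch, where MYSERVER and MyPackage are placeholder names for a package stored in msdb, run under Windows authentication:
SQL Code

-- /S = server, /E = use a trusted (Windows) connection, /N = package name
EXEC master..xp_cmdshell 'dtsrun /S MYSERVER /E /N MyPackage'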
B. bcp
bcp is often used to transfer large amounts of data from another program into a SQL Server table; it can equally be used to transfer data from a table out to a data file.
Here are some simple uses of bcp (it has many more options; see the related documentation):
SQL Code

-- Preparation: enable xp_cmdshell
-- (for background on xp_cmdshell see http://blog.csdn.net/feixianxxx/archive/2009/08/14/4445603.aspx)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- Environment
CREATE TABLE test (ID int, value varchar(10))
GO
INSERT test VALUES (1, 'S1')
INSERT test VALUES (2, 'S2')
INSERT test VALUES (3, 'S3')
INSERT test VALUES (4, 'S4')
GO

-- 1. Export the table's data to the file test.txt
EXEC master..xp_cmdshell 'bcp tempdb.dbo.test out e:/test.txt -c -Usa -P123456'
-- with Windows authentication, simply:
EXEC master..xp_cmdshell 'bcp tempdb.dbo.test out e:/test.txt -T -c'

-- 2. Copy the data from the test.txt file into the test1 table
SELECT * INTO test1 FROM test WHERE 1 = 2
EXEC master..xp_cmdshell 'bcp tempdb.dbo.test1 in e:/test.txt -c -Usa -P123456'
SELECT * FROM test1

-- 3. Copy the ID column of the test table to test.dat
EXEC master..xp_cmdshell 'bcp "SELECT ID FROM tempdb.dbo.test" queryout e:/test.dat -T -c'

-- 4. Copy the first row of the test table to test.txt
EXEC master..xp_cmdshell 'bcp "SELECT TOP 1 * FROM tempdb.dbo.test" queryout e:/test.txt -c -Usa -P123456'

-- Close xp_cmdshell
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 0;
RECONFIGURE;
C. BULK INSERT
BULK INSERT can only be used to import data into an instance of SQL Server (it cannot export), but it is often the preferred choice because it is faster than bcp.
Small example:
SQL Code

-- TRUNCATE TABLE test
BULK INSERT tempdb..test
FROM 'e:/test.txt'
WITH (
    FIELDTERMINATOR = ',',  -- field delimiter
    ROWTERMINATOR   = '\n'  -- row delimiter
)
SELECT * FROM test
/*
ID          value
----------- -----------------
1           S1
2           asds
3           sadsa
100         2asda
*/
PS: only the simplest usage is shown here; there are many more parameters, refer to MSDN. A sketch with a few of the more common options follows.
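For instance, a hedged sketch of a few more of the documented BULK INSERT options (the file path and values are placeholders):
SQL Code

BULK INSERT tempdb..test
FROM 'e:/test.txt'
WITH (
    FIELDTERMINATOR = ',',  -- field delimiter
    ROWTERMINATOR   = '\n', -- row delimiter
    FIRSTROW        = 2,    -- skip a header row
    MAXERRORS       = 10,   -- tolerate up to 10 bad rows before the load fails
    CHECK_CONSTRAINTS       -- check CHECK and FOREIGN KEY constraints during the load
)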
D. Distributed queries
SQL Code

-- Contains all the connection information required to access remote data from an OLE DB data source.
-- This method is an alternative to accessing tables through a linked server and is a one-time,
-- ad hoc way of connecting to and accessing remote data using OLE DB.
-- For more frequent references to OLE DB data sources, use a linked server instead.

-- a. Using OPENROWSET with SELECT and the SQL Server Native Client OLE DB provider (MSDN)
-- The following example uses the SQL Server Native Client OLE DB provider to access table TEST.A,
-- located in the Poofly database on the remote server SERVER1.
SELECT a.*
FROM OPENROWSET('SQLNCLI',
     'Server=SERVER1;Trusted_Connection=yes;',
     'SELECT GroupName, Name, DepartmentID FROM Poofly.TEST.A ORDER BY GroupName, Name') AS a;

-- b. Using the Microsoft OLE DB Provider for Jet (MSDN)
-- The following example accesses the Customers table in the Microsoft Access Northwind database
-- through the Microsoft OLE DB Provider for Jet.
SELECT CustomerID, CompanyName
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
     'C:/Program Files/Microsoft Office/OFFICE11/Samples/Northwind.mdb';
     'admin'; '', Customers)
GO

-- c. Using OPENROWSET BULK to insert file data into a varchar(max) column
/*
To import large object data, the OPENROWSET BULK clause supports three options that let the user
import the contents of a data file as a single-row, single-column rowset. Instead of using a format
file, you can specify one of the large object options:
SINGLE_BLOB  reads the contents of data_file as binary and returns it as a single-row,
             single-column rowset of type varbinary(max).
SINGLE_CLOB  reads the contents of the specified data file as characters (for example a text or
             Microsoft Word document) and returns it as a single-row, single-column rowset of type
             varchar(max), using the collation of the current database.
SINGLE_NCLOB reads the contents of the specified data file as Unicode and returns it as a
             single-row, single-column rowset of type nvarchar(max), using the collation of the
             current database.
*/
-- The following example creates a small table for demonstration and inserts the data from the file
-- e:/test.txt into the varchar(max) column.
CREATE TABLE my_test (Document varchar(max))
GO
INSERT INTO my_test
SELECT * FROM OPENROWSET(BULK N'e:/test.txt', SINGLE_CLOB) AS Document
GO
SELECT * FROM my_test
/*
Document
------------------------------------------------------------
asdsadasdsadsadsafkjhfas hklasjhashbkdsahkjdhsakjdhsakdhsakdhsa
*/
E. SELECT INTO
Everyone is presumably already familiar with its usage: it creates a new table from the result of a query.
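For completeness, a minimal sketch, reusing the test table from above (test2 and test3 are placeholder names; the target table must not already exist):
SQL Code

-- Copy data and structure into a new table
SELECT * INTO test2 FROM tempdb..test
-- Copy the structure only, without any rows
SELECT * INTO test3 FROM tempdb..test WHERE 1 = 2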
5. Some ways to optimize data import and export
1. Use minimal logging:
A. The recovery model is simple or bulk-logged. If the database uses the full recovery model, you can switch it to bulk-logged before the insert and switch it back afterward (see the sketch after this list).
B. The destination table has no triggers, no indexes, and TABLOCK is specified.
2. To import data from multiple clients in parallel into a single table:
A. If the database uses the full recovery model, change it to bulk-logged
B. TABLOCK is specified
C. No indexes on the table
3. Use batching: set the bcp or BULK INSERT option that specifies the number of rows per batch sent to SQL Server during the operation.
4. Disable triggers and constraints: they are disabled by default during bulk loads. If you still need the checks, you can run a no-op UPDATE (setting columns to their own values, so nothing actually changes) once the copy is complete.
5. Sort the data in the data file: performance improves when you declare the ordering via the ORDER hint; by default the data file is assumed to be unsorted.
6. Control locking behavior: specify the option that makes the bulk operation take a bulk-update table-level lock, which reduces lock contention on the table.
7. Avoid defaults: with the relevant option set, empty values in the data file are loaded as NULL instead of the column's default value. A sketch combining several of these points follows.
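As a rough, hedged sketch of how points 1, 3, 4, 5, 6, and 7 can look in T-SQL (MyDb and the file path are placeholders, the table reuses the test schema from above, and ORDER assumes the destination has a clustered index on ID):
SQL Code

-- Point 1: switch a full-recovery database to bulk-logged for the load
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED
GO
USE MyDb
GO
BULK INSERT dbo.test
FROM 'e:/test.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE = 10000,  -- point 3: rows per batch
    ORDER (ID ASC),     -- point 5: the data file is sorted by the clustered key
    TABLOCK,            -- point 6: take a bulk-update table-level lock
    KEEPNULLS           -- point 7: keep NULLs instead of applying column defaults
)
GO
-- Point 4: a no-op update so constraints and triggers can validate the new rows
UPDATE dbo.test SET value = value
GO
-- Point 1 again: switch the recovery model back
ALTER DATABASE MyDb SET RECOVERY FULL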
Original: http://blog.csdn.net/leixg/article/details/6256063