With the progress of the times and the development of society, all kinds of technologies multiply in endless, chaotic variety... (what an essay opening!!!)
Jokes aside: many companies use several databases at the same time, so moving data between different databases becomes a painful problem. For recurring, scheduled needs you can build a dedicated program, but occasional one-off requests fall to the DBA. So what do you do when MySQL data needs to be imported into SQL Server?
Of course, using a tool is the simplest way, but let's ignore the tools!!!
Create test data in MySQL:
CREATE TABLE tb001 (
    c1 INT AUTO_INCREMENT PRIMARY KEY,
    c2 VARCHAR(2000)  -- column length garbled in the source; 2000 is a placeholder
);
INSERT INTO tb001 (c2) SELECT 'abc\r\n';
INSERT INTO tb001 (c2) SELECT 'Hello,\r\n';
INSERT INTO tb001 (c2) SELECT 'Oh, hello.';
INSERT INTO tb001 (c2) SELECT 'double quote "';
INSERT INTO tb001 (c2) SELECT 'full-width double quote "';
INSERT INTO tb001 (c2) SELECT 'single quote \'';
INSERT INTO tb001 (c2) SELECT 'full-width single quote '';
##=====================================================================##
Use mysqldump to export an INSERT statement that is compatible with MS SQL Server:
mysqldump --host='192.168.166.169' --port=3358 --user='mysql_admin' --password='[email protected]@Pwd' --skip-add-locks --compatible=mssql --complete-insert --compact --extended-insert=false --default-character-set=utf8 -t --databases 'test' --table 'tb001' > /tmp/t4.sql
Notes on the options used above:
--compatible=mssql ## make the exported SQL compatible with MS SQL Server
--complete-insert ## include column names in the INSERT statements
--compact ## compact output, omitting MySQL-specific comments and headers
--extended-insert=false ## one INSERT statement per row of data
--default-character-set=utf8 ## character set for the export
-t ## -t exports data only; -d exports structure only
--databases 'test' ## database name
--table 'tb001' ## table name
The exported result is:
INSERT INTO "tb001" ("c1", "c2") VALUES (1,'abc\r\n');
INSERT INTO "tb001" ("c1", "c2") VALUES (2,'Hello,\r\n');
INSERT INTO "tb001" ("c1", "c2") VALUES (3,'Oh, hello.');
INSERT INTO "tb001" ("c1", "c2") VALUES (4,'double quote \"');
INSERT INTO "tb001" ("c1", "c2") VALUES (5,'full-width double quote "');
INSERT INTO "tb001" ("c1", "c2") VALUES (6,'single quote \'');
INSERT INTO "tb001" ("c1", "c2") VALUES (7,'full-width single quote '');
For the double-quoted column names, you can run SET QUOTED_IDENTIFIER ON before executing the script, or run the script through SQLCMD with the -I option.
However, the single quotes inside the values remain unsolved: MySQL uses backslash as its escape character, while SQL Server represents a single quote inside a string by doubling it.
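To make the difference concrete, here is a minimal sketch of translating the body of a MySQL single-quoted literal (backslash escapes) into SQL Server form (doubled single quotes). The helper name and the set of escapes handled are my own assumptions; it operates on an already-extracted literal body, not on a full SQL script.

```python
import re

# Hypothetical helper: convert the BODY of a MySQL single-quoted string
# literal (backslash escapes) to SQL Server form (doubled single quotes).
def mysql_to_mssql_literal(s: str) -> str:
    # Map common MySQL backslash escapes to their literal characters.
    escapes = {'\\\\': '\\', "\\'": "'", '\\"': '"',
               '\\n': '\n', '\\r': '\r', '\\t': '\t', '\\0': '\0'}

    def repl(m):
        # Unknown escapes \x simply become x, as MySQL does.
        return escapes.get(m.group(0), m.group(0)[1])

    s = re.sub(r'\\.', repl, s)          # undo MySQL backslash escaping
    return s.replace("'", "''")          # re-escape for SQL Server

print(mysql_to_mssql_literal(r"single quote \'"))  # -> single quote ''
```

A real converter would also have to tokenize the script to find string literals first; this sketch only shows the escape-rule mismatch itself.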
mysqldump can export data as INSERT statements and offers options for compatibility with other databases, but because escape characters differ between databases, even --compatible=mssql does not guarantee that the exported script will execute properly in SQL Server.
##===========================================================================##
Use SELECT ... INTO OUTFILE to export the data:
SELECT * INTO OUTFILE '/tmp/tb001.txt'
FIELDS TERMINATED BY '||--||'
LINES TERMINATED BY '||==||'
FROM test.tb001;
What you see under Linux is this:
It looks a bit messy, but bear with it!
Then download the file, open it in Notepad++, choose Encoding >> Convert to ANSI, save it as a new file, and import it into SQL Server with BULK INSERT:
CREATE TABLE tmp_tb001 (id NVARCHAR(2000), c1 NVARCHAR(2000))  -- lengths garbled in the source
GO
BULK INSERT tmp_tb001 FROM 'D:\tb002.txt'
WITH (FIELDTERMINATOR = '||--||', ROWTERMINATOR = '||==||')
GO
SELECT * FROM tmp_tb001
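The custom delimiters exist precisely so that no real data collides with them. As a sanity check, a file in this format can be parsed with a few lines of code; this is a minimal sketch with a made-up sample string, using MySQL's convention of writing NULL as the two characters \N:

```python
# Field and row separators chosen in the SELECT ... INTO OUTFILE above.
FIELD_SEP = "||--||"
ROW_SEP = "||==||"

def parse_dump(text):
    """Split a custom-delimited dump into rows of fields.

    MySQL writes NULL as the two-character marker \\N; we map it to None.
    """
    rows = []
    for raw in text.split(ROW_SEP):
        if raw == "":          # trailing separator leaves an empty chunk
            continue
        fields = raw.split(FIELD_SEP)
        rows.append([None if f == "\\N" else f for f in fields])
    return rows

sample = "1||--||abc||==||2||--||\\N||==||"
print(parse_dump(sample))  # [['1', 'abc'], ['2', None]]
```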
You can also use the SQL Server Import and Export Wizard to do this; the main thing is to adjust the delimiters.
Note that when you export with SELECT ... INTO OUTFILE, NULL is written as \N, and on import into SQL Server it arrives as the literal string "\N". It is therefore recommended to first load into a "staging" table whose columns are all NVARCHAR, clean the data, and only then insert it into the real table. For the lazy who don't want to type a conversion per column, you can generate the conversions for every column of the table with a bit of SQL:
SELECT 'CASE WHEN ['+T1.name+']=''\N'' THEN NULL ELSE ['+T1.name+'] END AS ['+T1.name+'],'
FROM sys.all_columns T1
WHERE T1.object_id = OBJECT_ID('tmp_tb001')
Since we forcibly convert \N to NULL, collateral damage is unavoidable: any value whose real data is the string '\N' also becomes NULL, so you need to check the data after the import completes.
The final statement is:
SELECT CASE WHEN [id]='\N' THEN NULL ELSE [id] END AS [id],
       CASE WHEN [c1]='\N' THEN NULL ELSE [c1] END AS [c1]
FROM tmp_tb001
The result of the execution is:
##=======================================================================##
Summary: exporting an INSERT script runs into the escape-character problem with single quotes, and since the exported script contains no GO batch separators, importing a large amount of data into SQL Server has serious performance problems. You can try my PowerShell script for splitting a SQL file into transactions and batches, but it has plenty of issues of its own.
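The batching idea mentioned above can be sketched in a few lines: insert a GO separator after every N statements so SQL Server executes the script in batches rather than one enormous batch. This is only an illustration of the idea, not the PowerShell script itself; the function name and batch size are my own choices.

```python
# Sketch: add a GO separator after every batch_size statements so the
# script runs as many small batches instead of one huge one.
def add_go_batches(statements, batch_size=1000):
    out = []
    for i, stmt in enumerate(statements, 1):
        out.append(stmt)
        if i % batch_size == 0:
            out.append("GO")
    # Terminate the final partial batch as well.
    if statements and out[-1] != "GO":
        out.append("GO")
    return out

script = add_go_batches(
    [f"INSERT INTO tb001 (c2) VALUES ('row {i}');" for i in range(5)],
    batch_size=2)
print("\n".join(script))
```

Note that GO is a batch separator, not a transaction boundary; to also commit per batch you would wrap each batch in BEGIN TRAN / COMMIT.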
As for the export-file-then-import approach, you need to convert the file's encoding; when the file is large, Notepad++ may not even open it. UltraEdit helps a little, but faced with a text file of several GB it is also powerless, and the NULL handling also needs to be treated with caution.
##========================================================================##
Well, that's it. Time to go see my girl.