I recently needed to migrate data from a subset of the tables in an Oracle database to a SQL Server database. The source and target tables, however, have somewhat different structures.
Method one: use the SQL Developer data export tool to export SQL INSERT statements, manually edit the details, and then execute the result on the SQL Server. Because the table structures differ, I would have to delete some fields, and their values, from every statement. That is fine for a few rows, or even a few dozen, but with 400,000 rows it is far too much manual work, so this method does not work.
Method two: I wrote a data-migration console program in C#. It reads all the rows of the Oracle table into a DataTable, takes the required fields, splices them into SQL INSERT statements, and writes the data to the SQL Server database. This method is fully automated, but it has a performance problem: neither database is local, both are on remote database servers, and while fetching the data is fast enough, inserting it is slow. At first I used one SqlConnection and created a new SqlCommand for every statement, 400,000 SqlCommand objects in total. Hoping to improve performance, I switched to a single SqlCommand in a loop that ran 400,000 times, changing its command text on each iteration, but that gained little. Then I tried concatenating all 400,000 INSERT statements into one string and sending it to the database in a single call. The command did execute, but because the batch was so large, ExecuteNonQuery was unresponsive for a long time and the program threw a timeout exception.
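A middle ground between one round-trip per statement and one enormous batch is to send the statements in fixed-size chunks, so that each ExecuteNonQuery call stays well under the command timeout. The sketch below shows only the chunking logic, in Java rather than the C# of the original program; the batch size of 1,000 is an assumption, and in the real program each returned script would be assigned to the SqlCommand's text and executed once.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {
    // Split a list of SQL statements into scripts of at most batchSize
    // statements each, joined by ";\n". Each script is small enough to
    // execute in one call without hitting the command timeout.
    static List<String> toBatches(List<String> statements, int batchSize) {
        List<String> batches = new ArrayList<>();
        for (int i = 0; i < statements.size(); i += batchSize) {
            int end = Math.min(i + batchSize, statements.size());
            batches.add(String.join(";\n", statements.subList(i, end)) + ";");
        }
        return batches;
    }
}
```

With 400,000 statements and a batch size of 1,000 this would mean 400 round trips instead of 400,000, while each individual call remains short.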
Method three: since the program could not insert the data fast enough, I used the C# program only to export the data, splicing it into SQL statements and writing them to a text file. Because I do the splicing myself, I have full control over the output and none of the manual editing of method one is needed. I then opened SQL Server Management Studio, connected to the remote database, and executed the SQL script file, which is much faster than inserting from C# code.
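The heart of this approach is building each INSERT statement safely by hand, in particular doubling single quotes so a value cannot break the spliced statement. A minimal sketch, in Java rather than the C# of the original program; the table and column names are hypothetical placeholders, and all values are assumed to be strings.

```java
import java.util.List;

public class ScriptWriter {
    // Double single quotes so spliced string values cannot terminate
    // the quoted literal early; the main risk when building SQL text
    // by concatenation.
    static String escape(String value) {
        return value.replace("'", "''");
    }

    // Build one INSERT statement for the target table. One such line
    // per source row is appended to the .sql script file.
    static String insertFor(String table, List<String> columns, List<String> values) {
        StringBuilder sb = new StringBuilder("INSERT INTO ").append(table).append(" (");
        sb.append(String.join(", ", columns)).append(") VALUES (");
        for (int i = 0; i < values.size(); i++) {
            if (i > 0) sb.append(", ");
            sb.append("'").append(escape(values.get(i))).append("'");
        }
        return sb.append(");").toString();
    }
}
```

When writing the statements out, inserting a line containing only `GO` after every few thousand statements lets Management Studio execute the file as a series of separate batches rather than one huge one.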