PowerShell: Bulk splitting of SQL statements into transactions and batches

Source: Internet
Author: User

As a DBA, you occasionally need to import data into a database. Suppose the business or development team hands you a script file containing millions of INSERT statements, with no GO separators to split them into batches. If you execute it directly with SQLCMD, you will find that the process consumes a lot of physical memory and writes data slowly; even if you add GO after every single line, the insert rate is still unbearably poor. What can you do? The small script below may help:

```powershell
$old_file_path = 'C:\insert_scripts.sql'
$new_file_path = 'C:\insert_scripts_new.sql'
$tran_rows = 50

# File header: suppress rows-affected messages and open the first transaction.
$line_content = "SET NOCOUNT ON
GO
BEGIN TRAN
BEGIN TRY"
$line_content | Out-File $new_file_path

$line_num = 1
$sr1 = [IO.StreamReader]$old_file_path
$line_content = ""
while (-not $sr1.EndOfStream)
{
    $tmp_content = $sr1.ReadLine()
    $line_content = $line_content + "`r`n" + $tmp_content
    $line_num_mode = $line_num % $tran_rows
    if ($line_num_mode -eq 0)
    {
        # Every $tran_rows statements: commit the current transaction,
        # report the outcome, and open the next batch.
        $line_content = $line_content + "
COMMIT
PRINT 'Rows " + ($line_num - $tran_rows) + " to " + $line_num + " succeeded'
END TRY
BEGIN CATCH
ROLLBACK
PRINT 'Rows " + ($line_num - $tran_rows) + " to " + $line_num + " failed'
END CATCH
GO
BEGIN TRAN
BEGIN TRY"
        $line_content | Out-File -Append $new_file_path
        Write-Host "Processed to row $line_num"
        $line_content = ""
    }
    $line_num = $line_num + 1
}
# Flush the final (possibly partial) batch.
$line_content = $line_content + "
COMMIT
PRINT 'Rows " + ($line_num - $tran_rows) + " to " + $line_num + " succeeded'
END TRY
BEGIN CATCH
ROLLBACK
PRINT 'Rows " + ($line_num - $tran_rows) + " to " + $line_num + " failed'
END CATCH
GO"
$line_content | Out-File -Append $new_file_path
$sr1.Close()
```

Let's see the effect. The original post showed the SQL before and after the script's transformation (screenshots not reproduced here).
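As an illustrative sketch of that transformation (the table name and values below are hypothetical, not taken from the original screenshots), a fragment of the input script and its batched counterpart might look like this:

```sql
-- Before: one bare INSERT per line, no batching
INSERT INTO dbo.t1 VALUES (1, 'a');
INSERT INTO dbo.t1 VALUES (2, 'b');

-- After: every $tran_rows statements wrapped in one transaction and batch
SET NOCOUNT ON
GO
BEGIN TRAN
BEGIN TRY
INSERT INTO dbo.t1 VALUES (1, 'a');
INSERT INTO dbo.t1 VALUES (2, 'b');
-- ... 48 more INSERTs ...
COMMIT
PRINT 'Rows 0 to 50 succeeded'
END TRY
BEGIN CATCH
ROLLBACK
PRINT 'Rows 0 to 50 failed'
END CATCH
GO
BEGIN TRAN
BEGIN TRY
-- ... next batch ...
```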

This implementation has the following advantages:

1. "SET NOCOUNT ON" suppresses the rows-affected message for each statement, reducing the number of round trips to the CMD window;

2. Every 50 statements are split into one batch, reducing the database's parse and syntax-check overhead, and committed in a single transaction, reducing the number of log writes;

3. The result of each transaction is printed, making errors easy to locate (the code can be modified to output only failed transactions).
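Once the new file is generated, it can be executed with SQLCMD as usual. A minimal sketch (server name, database name, and file paths are placeholders):

```
sqlcmd -S localhost -d MyDatabase -i C:\insert_scripts_new.sql -o C:\insert_result.log
```

The `-o` log captures the PRINT output from each transaction, so failed batches can be found afterwards by searching the log.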

Execution Result:

====================================================

In tests on my personal computer, splitting at 100 statements per transaction imported roughly 500,000 to 600,000 rows per minute. Insertion efficiency varies with the number of rows per split, so the appropriate batch size should be determined by actual testing.

For large data volumes, I still recommend importing via CSV + BCP; INSERT + SQLCMD is simply far too slow!
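For reference, a minimal BCP sketch of that alternative (database, table, file path, server, and delimiter below are placeholders):

```
bcp MyDatabase.dbo.MyTable in C:\data.csv -S localhost -T -c -t"," -b 10000
```

`-T` uses a trusted (Windows) connection, `-c` imports character data, `-t` sets the field terminator, and `-b` commits every 10,000 rows, which serves the same log-pressure-reducing purpose as the transaction splitting above.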

====================================================

Nothing groundbreaking here, but I'm sharing it anyway; it has been a long time since my last blog post!

