I recently did a database analysis for a project that required importing massive amounts of data: up to 2 million rows had to be loaded into SQL Server. With ordinary INSERT statements the job would take hours. I first considered bcp, but it is command-line based, unfriendly to end users, and very unlikely to be adopted, so I finally settled on the BULK INSERT statement. BULK INSERT also handles very large imports, can be driven programmatically so the interface can be made very friendly, and it is extremely fast: importing 1 million rows takes less than 20 seconds, a speed few other approaches can match.
But there are several drawbacks to using this approach:
1. The target table must be locked exclusively while the data is loaded
2. It generates a large amount of transaction log
3. The file from which the data is read must follow strict format requirements
Compared with its speed, these drawbacks are acceptable, and if you are willing to sacrifice a little performance you can exercise finer control, even over how each batch of rows is inserted, as the sketch below shows.
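For reference, a minimal BULK INSERT sketch follows; the table name dbo.ImportedTrips, the file path, and the delimiters are hypothetical and would need to match your own schema and data file.

-- Minimal BULK INSERT sketch (table name, file path, and delimiters are hypothetical)
BULK INSERT dbo.ImportedTrips
FROM 'C:\data\trips.csv'
WITH
(
    FIELDTERMINATOR = ',',    -- column delimiter in the source file
    ROWTERMINATOR   = '\n',   -- row delimiter in the source file
    FIRSTROW        = 2,      -- skip a header row, if the file has one
    TABLOCK,                  -- take a table-level lock for better speed
    BATCHSIZE       = 100000  -- commit in batches to limit log growth per transaction
);

Options such as BATCHSIZE give you the batch-level control mentioned above, at some cost in overall speed.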
To deal with the large amount of log space consumed, we can switch the database to the BULK_LOGGED recovery model before the import, so that the bulk load is only minimally logged, and then restore the original recovery model once the import is complete.
The statement looks like this:
ALTER DATABASE Taxi
SET RECOVERY BULK_LOGGED;
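Putting it together, the import can be wrapped like this; the database name Taxi comes from the example above, and the original recovery model is assumed here to be FULL, so adjust the final statement to whatever your database actually uses.

-- Switch to the bulk-logged recovery model so the bulk load is minimally logged
ALTER DATABASE Taxi SET RECOVERY BULK_LOGGED;

-- ... run the BULK INSERT here ...

-- Restore the original recovery model (assumed to be FULL) once the import completes
ALTER DATABASE Taxi SET RECOVERY FULL;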