I had exported a database script file of nearly 900 MB; when I tried to run it back in SQL Server Management Studio, it failed with an "out of memory" error, which led me to the following article:
Problem Description:
When a client's server does not allow direct backups, deployment is often done by exporting a database script and then restoring the database from it.
However, when the exported script is large, executing it in Microsoft SQL Server Management Studio often fails with an "out of memory" prompt.
Workaround:
Use Microsoft's own sqlcmd tool to run the script. Take SQL Server 2008 R2 as an example:
Step one: press Win+R, type cmd, and press Enter to open a command prompt;
Step two: type cd C:\Program Files\Microsoft SQL Server\100\Tools\Binn (the exact directory depends on where you installed SQL Server);
Step three: type sqlcmd -S . -U sa -P 123 -d test -i data.sql
Parameter description: -S server address, -U user name, -P password, -d database name, -i script file path.
(It is recommended to copy the script file into this directory first, so you can write just the file name instead of the full path.) Note that sqlcmd switches are case-sensitive, and each switch is separated from its value by a space. In fact, if the script itself selects the database (for example with a USE statement), the -d switch can be left out.
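The three steps above can be collected into a single command-prompt session. This is a sketch using the article's example values: the local default instance (`.`), login `sa` with password `123`, database `test`, and script `data.sql` are placeholders you must replace with your own.

```shell
REM Step two: change to the directory that contains sqlcmd.exe
REM (the path below is for SQL Server 2008 R2; adjust it to your installation).
cd "C:\Program Files\Microsoft SQL Server\100\Tools\Binn"

REM Step three: run the script with sqlcmd.
REM Switches are case-sensitive: -S server, -U user, -P password,
REM -d database, -i input script file.
sqlcmd -S . -U sa -P 123 -d test -i data.sql
```

Because sqlcmd streams the file batch by batch instead of loading the whole script into an editor, it avoids the memory limit that SSMS runs into with very large scripts.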
More information: https://msdn.microsoft.com/zh-cn/library/ms162773(v=sql.120).aspx
Reposted from http://blog.csdn.net/a497785609/article/details/47262151
Original title: Workaround for the "out of memory" error when MSSQL executes a large script file