The following is an example. First create the table:

Create table [dbo].[course] (
    [Id] [int] NULL,
    [Name] [nvarchar](50) NULL,
    [CourseType] [nvarchar](50) NULL,
    [Course] [float] NULL
)

Import data: store the following rows as a text or SQL file:

2, Li Gang, Chinese, 89; 3, Li Gang, Mathematics, 79; 3, Li Gang, English, 69; 4, Li Gang, Chemistry, 89

Import statement: Bu
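For a delimited file like the one above, the import could be sketched with BULK INSERT; the file path, field terminator, and row terminator below are assumptions for illustration, not taken from the original article:

```sql
-- Sketch only: the path and terminators are assumed, not from the article.
BULK INSERT [dbo].[course]
FROM 'C:\data\course.txt'
WITH (
    FIELDTERMINATOR = ',',  -- fields separated by commas
    ROWTERMINATOR = ';'     -- records separated by semicolons
);
```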
In this article, I'll explain several approaches to inserting data in SQL Server.
Create a database and a test table whose primary key is a GUID, with no other indexes in the table, so that data inserts faster. A GUID key is bound to be faster than a self-growing (identity) column here, because the time taken to generate a GUID is certainly less than the time spent requerying t
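A minimal sketch of such a test table, with a GUID primary key generated by NEWID() and no other indexes; the table and column names are placeholders, not from the original:

```sql
-- Sketch: table and column names are hypothetical.
CREATE TABLE [dbo].[TestInsert] (
    [Id] [uniqueidentifier] NOT NULL PRIMARY KEY DEFAULT NEWID(),
    [Payload] [nvarchar](100) NULL
);
```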
static void Update(string connString, DataTable table)
{
    SqlConnection conn = new SqlConnection(connString);
    SqlCommand comm = conn.CreateCommand();
    comm.CommandTimeout = _commandTimeout;
    comm.CommandType = CommandType.Text;
    SqlDataAdapter adapter = new SqlDataAdapter(comm);
    SqlCommandBuilder commandBuilder = new SqlCommandBuilder(adapter);
    commandBuilder.ConflictOption = ConflictOption.OverwriteChanges;
    try
    {
        conn.Open();
        // Set the number of rows processed per batch update
        adapter.UpdateBatchSize =
Enable the feature:

-- Allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
GO
-- Update the currently configured value for advanced options.
RECONFIGURE
GO
-- Enable the feature.
EXEC sp_configure 'xp_cmdshell', 1
GO
-- Update the currently configured value for this feature.
RECONFIGURE
GO
EXEC master..xp_cmdshell 'net use \\192.168.0.126\ftpfiles 12345678 /user:server\administrator'

Tip: the command completes successfully. For example: master..xp_cmdshell 'ne
The code is as follows:

Delete from a where exists (Select 1 from a where a = 1)
The method above only suits simple bulk deletes of small data volumes; for deleting large amounts of data, refer to the following method.
The code is as follows:

Create PROCEDURE Batch_delete
    @TableName nvarchar(100),  -- table name
    @FieldName nvarchar(100),  -- name of the field to delete by
    @DelCharIn
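One common shape for large-volume deletes, which a procedure like the one above presumably generalizes, is to delete in fixed-size batches until no rows remain. A sketch, where the table name and WHERE predicate are placeholders:

```sql
-- Sketch: dbo.BigTable and the predicate are hypothetical placeholders.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.BigTable WHERE Flag = 1;
    IF @@ROWCOUNT = 0 BREAK;  -- stop once a batch deletes nothing
END
```

Keeping each batch small bounds the transaction log growth and lock footprint of any single statement.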
Label: MySQL bulk SQL insert performance optimizations. Posted by Kiki Titanium on December 7, 2012.
For systems with large data volumes, the database problem is not only query inefficiency but also long data-storage times. In a reporting system especially, the time spent importing data each day can run to several hours or more than ten hours. Therefo
Merged data + transaction + ordered data keeps performance good even at data volumes in the tens of millions: with ordered data, index positioning is more convenient and the disk need not be read and written frequently, so high performance can be maintained. Precautions: 1. SQL statements are limited in length; when merging data, a single SQL statement must not exceed the SQL len
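Using the sample rows from earlier on this page, the merged-data + transaction idea can be sketched as one multi-row INSERT wrapped in a single transaction:

```sql
-- Sketch: several VALUES rows merged into one INSERT, inside one transaction.
BEGIN TRANSACTION;
INSERT INTO [dbo].[course] ([Id], [Name], [CourseType], [Course])
VALUES (2, N'Li Gang', N'Chinese', 89),
       (3, N'Li Gang', N'Mathematics', 79),
       (3, N'Li Gang', N'English', 69),
       (4, N'Li Gang', N'Chemistry', 89);
COMMIT TRANSACTION;
```

Merging rows cuts per-statement parsing and network round-trips, and the explicit transaction avoids a separate commit per row.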
. From the test results, this optimization improves performance, but the improvement is not very obvious. Comprehensive performance testing: a test of insert-efficiency optimization using the three methods above is provided here. The results show that merging data + transactions improves performance very obviously at small data volumes; when the data volume is large (over 10 million rows), performance
-- Use table-valued parameters to bulk insert data into another data table
Use Df17datapro
-- Steps to create and use a table-valued parameter:
/*
1. Create a table type and define the table structure.
   For information about how to create a SQL Server type, see user-defined table types. For more information about how to define the t
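A compact sketch of those steps; the type, procedure, and column choices below are assumptions for illustration:

```sql
-- Sketch: type and procedure names are hypothetical.
-- 1. Create a table type defining the row structure.
CREATE TYPE dbo.CourseTableType AS TABLE (
    [Id] [int] NULL,
    [Name] [nvarchar](50) NULL
);
GO
-- 2. Create a procedure that takes the type as a READONLY parameter.
CREATE PROCEDURE dbo.usp_InsertCourses
    @rows dbo.CourseTableType READONLY
AS
    INSERT INTO dbo.course ([Id], [Name])
    SELECT [Id], [Name] FROM @rows;
GO
-- 3. Fill a variable of the table type and pass it in one call.
DECLARE @t dbo.CourseTableType;
INSERT INTO @t VALUES (5, N'Li Gang');
EXEC dbo.usp_InsertCourses @rows = @t;
```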
When the number of inserted rows exceeds 1,000, the three kinds of insertion differ greatly: the time spent concatenating SQL statements increases sharply, while transaction processing takes roughly half the time of single-row inserts. Now let's look at record counts below 1,000: we can see that when the record count is under 100, the efficiency of concatenating
resources. The index-positioning efficiency for inserted records decreases, and there will be frequent disk operations when the data volume is large.
        );
    }
}
// 3. Bulk execution of SQL or saving objects
batchExecuteSql(recordList);
return null;
}

public static int batchExecuteSql(ArrayList
    System.out.println("can then execute SQL statements or save objects");
    System.out.println("======== bulk Execute SQL