Colleagues in the field reported that the XML export tool they were using kept failing, frequently with database connection timeouts. Checking the database showed that it had grown past 100 GB.
However, the export stored procedure only returns about 1,000 rows per call, and the on-site manager said it filters by time period, so it should not have been a problem.
Running the stored procedure directly in SQL Server showed that it took 5 minutes to complete.
--Create a temporary table to hold the data
CREATE TABLE #temp (
    id    varchar(10),   -- column sizes were garbled in the original; varchar(10) assumed throughout
    name  varchar(10),
    zg_id varchar(10),
    ks_id varchar(10)
)

--Insert data by time period
INSERT INTO #temp (id, name)
SELECT id, name
FROM TB1 (nolock)
WHERE beginTime BETWEEN '20150606' AND '20150706'

--Associate the other tables with the existing data to fill in the remaining columns
UPDATE #temp
SET zg_id = b.id
FROM #temp a, tb_zg b (nolock)
WHERE a.zg_id = b.id

UPDATE #temp
SET ks_id = b.id
FROM #temp a, tb_ks b (nolock)
WHERE a.ks_id = b.id
Yet the procedure only returns a few thousand rows.
Careful analysis showed that the original stored procedure first inserted data from three tables into a temporary table and only then did its processing.
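A sketch of what the original, slow procedure presumably looked like. The table names are taken from the script above, but the exact original statements are an assumption:

```sql
-- Assumed original approach: copy each large table into the temp area
-- first, then filter and join the copies afterwards.
SELECT id, name, zg_id, ks_id INTO #t1 FROM TB1 (nolock)   -- ~7 million rows
SELECT id INTO #t2 FROM tb_zg (nolock)                     -- ~7 million rows
SELECT id INTO #t3 FROM tb_ks (nolock)                     -- ~7 million rows
-- ...processing on #t1/#t2/#t3 follows here
```

Copying roughly 21 million rows into tempdb before applying any filter is what turned a query over a few thousand rows into a 5-minute run.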
Querying further showed that each of the three tables has more than 7 million rows. The join columns are primary keys, and only the first table is filtered by the time period; querying the first table alone with that time-period filter takes less than a second.
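A quick way to confirm where the time goes is SQL Server's built-in timing statistics. The table and column names below are those used in the script above:

```sql
-- Report per-statement CPU and elapsed time in the Messages tab
SET STATISTICS TIME ON

-- The time-period filter on the first table alone runs sub-second
SELECT id, name
FROM TB1 (nolock)
WHERE beginTime BETWEEN '20150606' AND '20150706'

SET STATISTICS TIME OFF
```

Comparing the elapsed time of each statement in the procedure this way makes the expensive step obvious before rewriting anything.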
So I changed the procedure to insert only the first table's time-filtered rows into the temporary table, and then joined the other two tables against that small result set.
SQL Stored Procedure Optimization Experience