SQL Server database migration remedies

Source: Internet
Author: User

I. Purpose

Previously, in the blog post SQL Server database minimal downtime migration solution, we used the full backup + differential backup feature to complete a database transfer. That method is clearly impractical when we face more than 700 GB of data, so in this article I describe how we migrated a 700 GB database to a new server.

II. Analysis and Design Ideas

(1) Environment Description

Our database runs on SQL Server 2005, deployed on a Windows Server 2003 x86 operating system. A single table occupies most of the space in this database.

Given this situation, the database is under heavy pressure, so we planned to find another machine in the same cluster, transfer the data in this database to it, and optimize the database by tuning parameters on the new server.

(2) Data Analysis

When we take over a database, we should examine its relevant information and make a preliminary assessment once we understand its conditions and parameters. For example, we need to know how much space the tables in this 700 GB database occupy and how much space the indexes occupy (there is a SQL statement that shows this directly), and whether the large table is partitioned.
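As an illustration (not part of the original article), the space usage mentioned above can be checked with the built-in `sp_spaceused` procedure; the table name below is an example taken from the scripts later in this article:

```sql
-- Space used by the whole current database
EXEC sp_spaceused;

-- Space used by one table, split into data and index space
-- (table name is an example)
EXEC sp_spaceused N'dbo.ClassifyResult';
```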

Once you know these parameters, you can check the server hardware information: memory, hard disks, RAID policy, operating system, database version, memory pressure, CPU pressure, and so on. This information is an important factor in deciding whether to migrate to a new server.

If we decide to migrate, we also need to calculate how many records it is appropriate to move at a time, so that the production database can keep accepting data without being affected. The relocation Job's schedule interval should be made as large as practical. (If the previous Job run has not finished, the next run will not start even when its scheduled time arrives.)

(3) Design Ideas

III. Reference Scripts

The following lists some key SQL statements for your reference.

-- 1.1 Create the relocation control table

CREATE TABLE [dbo].[Temp_MoveManage](
    [Id] [int] NOT NULL,
    [IsDone] [bit] NOT NULL,
    [UpdateTime] [datetime] NULL,
 CONSTRAINT [PK_Temp_MoveManage] PRIMARY KEY CLUSTERED
(
    [Id] ASC
) WITH (IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

-- 1.2 Insert data: Id checkpoints spaced 50,000 apart
-- (the original insert omitted the nullable UpdateTime column without
-- a column list, which fails; a column list is added here)

DECLARE @i int
SET @i = 1
WHILE @i < 50000000
BEGIN
    INSERT INTO dbo.Temp_MoveManage ([Id], [IsDone], [UpdateTime])
    VALUES (@i, 0, NULL)
    SET @i = @i + 50000
END

-- 1.3 Test

SELECT * FROM Temp_MoveManage

-- 2. Linked server (omitted)
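The original article omits the linked server setup. As a hedged sketch only, a linked server to the new machine could be created with the standard `sp_addlinkedserver` / `sp_addlinkedsrvlogin` procedures; the server name and credentials below are placeholders, not values from the article:

```sql
-- Sketch only: server name and login are placeholders
EXEC sp_addlinkedserver
     @server = N'NEWSERVER',
     @srvproduct = N'SQL Server';

EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'NEWSERVER',
     @useself = N'False',
     @locallogin = NULL,
     @rmtuser = N'migration_user',
     @rmtpassword = N'********';
```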

-- 3. Stored procedure

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      <Viajar>
-- Create date: <2011.04.14>
-- Description: <transfer data>
-- =============================================
ALTER PROCEDURE [dbo].[sp_GetMoveData]
AS
BEGIN
    DECLARE @Id1 INT, @Id2 INT
    DECLARE @MaxId INT -- maximum Id in the source table
    SET @Id1 = 0
    SET @Id2 = 0
    SELECT TOP 1 @Id1 = Id FROM Temp_MoveManage WHERE IsDone = 0 ORDER BY Id
    SELECT TOP 1 @Id2 = Id FROM Temp_MoveManage WHERE IsDone = 0 AND Id > @Id1 ORDER BY Id
    SELECT @MaxId = MAX(Id) FROM [dbo].[ClassifyResult]

    IF (@Id1 != 0 AND @Id2 != 0 AND @MaxId >= @Id2)
    BEGIN
        DECLARE @SQL VARCHAR(MAX)
        SET @SQL = '
        SET IDENTITY_INSERT [ClassifyResult_T] ON
        INSERT INTO [dbo].[ClassifyResult_T] (
              [Id]
            , [ClassId]
            , [ArchiveId])
        SELECT
              [Id]
            , [ClassId]
            , [ArchiveId]
        FROM [dbo].[ClassifyResult]
        WHERE Id >= ' + CONVERT(VARCHAR(100), @Id1) + ' AND Id < ' + CONVERT(VARCHAR(100), @Id2) + '
        ORDER BY Id
        SET IDENTITY_INSERT [ClassifyResult_T] OFF'
        EXEC (@SQL)

        UPDATE Temp_MoveManage SET IsDone = 1 WHERE Id = @Id1
    END
END

-- 4. Job (omitted)
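The Job is also omitted in the original. A minimal sketch using SQL Server Agent's msdb procedures might look like the following; the job name, database name, and 5-minute interval are assumptions, not values from the article:

```sql
USE msdb;
GO
-- Sketch only: job name, database name, and interval are assumptions
EXEC sp_add_job @job_name = N'MoveClassifyResult';

EXEC sp_add_jobstep
     @job_name = N'MoveClassifyResult',
     @step_name = N'RunBatch',
     @subsystem = N'TSQL',
     @command = N'EXEC dbo.sp_GetMoveData',
     @database_name = N'YourDatabase';

EXEC sp_add_jobschedule
     @job_name = N'MoveClassifyResult',
     @name = N'Every5Minutes',
     @freq_type = 4,            -- daily
     @freq_interval = 1,
     @freq_subday_type = 4,     -- unit: minutes
     @freq_subday_interval = 5; -- every 5 minutes

EXEC sp_add_jobserver @job_name = N'MoveClassifyResult';
```

Note that SQL Server Agent will not start a new run of a job while the previous run is still executing, which is the behavior the analysis above relies on.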

IV. Defects

Defect 1: The last record of the source table can never be processed. Because the transfer condition uses Id < @Id2 rather than <=, the final record is never moved into the partitioned table.

Defect 2: If records in the source table keep growing, the new data is never transferred to the partitioned table. The original idea was to keep adding more Id segments to the relocation control table so that the latest data would also be processed.

Defect 3: The fix above is itself flawed. For example, suppose the current Id segments are 100 and 200. If for some period the new data only reaches Id > 100, the IsDone flags of both segment values may be updated to 1, causing a lot of data to be missed. This can also be solved: only when the Id of the new data exceeds 200 should the scripts for segments 100 and 200 be allowed to execute (MAX(Id) can be checked inside the stored procedure).

V. Notes

VI. Others

This is a suggestion from some friends. I have not tried it yet because of environment restrictions (for example, the database is in simple recovery mode). We look forward to hearing from anyone with a suitable environment who tries it.
