Summary of SQL Server Partition Tables: Maintenance and Management of Partition Tables
After creating a partition table, you must manage and maintain it. The main topics are:
1. Using the sliding window scenario to split partitions and move data through an intermediate (staging) table.
2. Managing structural changes to a partitioned table.
Of course, the two points above are the theory. At work I have two main requirements:
1. Quickly archive outdated partition data: that is, switch the data of inactive partitions out to a data archiving table.
2. Add new partitions to a partitioned table as data grows: that is, all data in the partitioned table is still active, and new partitions are needed to carry the new data.
These two requirements are reflected in the code below. The partitioned table follows the Product table from the previous article, but here it is called Products: because the table from the previous article is in use by the test system and I cannot move its data, I had to clone the table.
Create a table Products_Tmp (a staging table used for the data transfer) whose structure and primary key clustered index exactly match Products; this table must also reside on the same filegroup as the partition being removed (otherwise SWITCH PARTITION cannot be used for fast switching).
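To see which partition (and therefore which filegroup) a given key falls into, the $PARTITION function can be applied to the partition function from the previous article (a quick check of my own; the value 50000 is just an arbitrary example key):
-- Which partition number does an example key of 50000 map to?
SELECT $PARTITION.fn_Partition_Product_ID(50000) AS partition_number;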
Create an archive table Products_Archive on a new filegroup to receive the data from the staging table.
USE [master]
GO
ALTER DATABASE [TEST]
ADD FILEGROUP [FG_TEST_Products_Archive]
GO
ALTER DATABASE [TEST]
ADD FILE ( NAME = N'FG_TEST_Products_Archive_data_1',
FILENAME = N'D:\Data\FG_TEST_Products_Archive_data_1.ndf' ,
SIZE = 50MB ,
FILEGROWTH = 10% )
TO FILEGROUP [FG_TEST_Products_Archive]
GO
USE TEST
GO
CREATE TABLE [dbo].[Products_Archive](
[ID] [int] NOT NULL,
[PName] [nvarchar](100) NULL,
[AddDate] [datetime2](3) NULL
)ON [FG_TEST_Products_Archive];
GO
Next, we transfer the data. The work is encapsulated into a stored procedure for convenient calling. Originally I wanted to put the data import into it as well, but considering the large data volumes in a production environment, other import approaches may be preferable, so that part is left commented out.
CREATE PROCEDURE usp_TransferPartitionData_ForArchive
    @PartitonNumber INT   -- number of the partition to be transferred
AS
DECLARE @filegroup  NVARCHAR(100) = N'',   -- filegroup on which the partition to be migrated resides
        @SQL        NVARCHAR(4000) = N'',  -- dynamic statement that creates the intermediate staging table
        @rangeValue INT;                   -- upper boundary value of the partition to be migrated
SET @rangeValue = (SELECT CAST(prv.value AS INT)
                   FROM sys.partition_range_values prv
                   JOIN sys.partition_functions pf ON pf.function_id = prv.function_id
                   WHERE pf.name = N'fn_Partition_Product_ID'   -- restrict to this table's partition function
                     AND prv.boundary_id = @PartitonNumber);
-- The filegroup is looked up with the UDF fn_GetFileForPartition from my previous article.
SET @filegroup = (SELECT [FILEGROUP_NAME] FROM dbo.fn_GetFileForPartition(N'Sch_Product_ID', @rangeValue));
SET @SQL = N'CREATE TABLE [dbo].[Products_Tmp](
    [ID] [int] NOT NULL,
    [PName] [nvarchar](100) NULL,
    [AddDate] [datetime2](3) NULL,
    CONSTRAINT [PK_Products_Tmp] PRIMARY KEY CLUSTERED
    (
        [ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
) ON ' + QUOTENAME(@filegroup);
IF OBJECT_ID('dbo.Products_Tmp') IS NOT NULL
    RAISERROR('The database object Products_Tmp already exists!', 16, 1);
ELSE
BEGIN
    -- PRINT @SQL;
    EXEC (@SQL);
    -- Switch the data of the specified partition out to Products_Tmp
    ALTER TABLE dbo.Products
        SWITCH PARTITION @PartitonNumber
        TO dbo.Products_Tmp;
END
-- Because the data volume is small and the archive table is a heap, INSERT INTO is used directly.
-- If the data volume is large, BULK INSERT/BCP or SSIS may be used instead.
-- INSERT INTO [dbo].[Products_Archive]
--        ([ID], [PName], [AddDate])
-- SELECT [ID], [PName], [AddDate] FROM [dbo].[Products_Tmp];
-- DROP TABLE [dbo].[Products_Tmp];
GO
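The procedure above relies on the UDF fn_GetFileForPartition from my previous article, which returns the filegroup for a given partition scheme and key value. For readers who do not have it, a minimal sketch of such a helper follows (my own reconstruction, assuming a RANGE LEFT partition function over an int column; the original version may differ):
CREATE FUNCTION dbo.fn_GetFileForPartition
(
    @PartitionScheme SYSNAME,   -- partition scheme name, e.g. N'Sch_Product_ID'
    @Value           INT        -- a value of the partitioning column
)
RETURNS TABLE
AS
RETURN
(
    SELECT fg.name            AS [FILEGROUP_NAME],
           dds.destination_id AS [PARTITION_NUMBER]
    FROM sys.partition_schemes ps
    JOIN sys.destination_data_spaces dds
        ON dds.partition_scheme_id = ps.data_space_id
    JOIN sys.filegroups fg
        ON fg.data_space_id = dds.data_space_id
    WHERE ps.name = @PartitionScheme
      AND dds.destination_id =
          -- RANGE LEFT: partition number = 1 + number of boundaries strictly below @Value
          (SELECT 1 + COUNT(*)
           FROM sys.partition_range_values prv
           WHERE prv.function_id = ps.function_id
             AND CAST(prv.value AS INT) < @Value)
);
GO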
-- Migrate the data and then drop Products_Tmp.
EXEC usp_TransferPartitionData_ForArchive 1;  -- transfer the first partition
-- Because the data volume is small and the archive table is a heap, INSERT INTO is used directly.
-- If the data volume is large, BULK INSERT/BCP or SSIS may be used instead.
INSERT INTO [dbo].[Products_Archive]
       ([ID], [PName], [AddDate])
SELECT [ID], [PName], [AddDate] FROM [dbo].[Products_Tmp];
DROP TABLE [dbo].[Products_Tmp];
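At this point partition 1 of Products should be empty. A quick sanity check against the standard catalog views (this query is my own addition; the table name comes from this article):
-- Row count and filegroup per partition of dbo.Products
SELECT p.partition_number,
       p.rows,
       fg.name AS filegroup_name
FROM sys.partitions p
JOIN sys.indexes i
    ON i.object_id = p.object_id AND i.index_id = p.index_id
JOIN sys.destination_data_spaces dds
    ON dds.partition_scheme_id = i.data_space_id
   AND dds.destination_id = p.partition_number
JOIN sys.filegroups fg
    ON fg.data_space_id = dds.data_space_id
WHERE p.object_id = OBJECT_ID(N'dbo.Products')
  AND i.index_id IN (0, 1)   -- heap or clustered index only
ORDER BY p.partition_number;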
Now the old data has been moved out, but new data comes in every day and the last partition keeps growing, so a filegroup needs to be attached to the partition scheme to carry a new partition. Here, however, we will not create a new filegroup; instead we reuse the filegroup that was left "empty" by the data migration above.
First, since the partition is now empty, merge it out of the partition function. Then mark its filegroup as the next one to be used by the partition scheme, and split off a new partition. In fact, the following code could also be integrated into the stored procedure above.
DECLARE @rangeValue INT,                    -- boundary value of the partition to be removed
        @splitValue INT,                    -- boundary value of the new partition to be split off
        @filegroup  NVARCHAR(100) = N'',    -- filegroup of the partition to be merged
        @SQL        NVARCHAR(4000);         -- dynamic statement
-- Merge away the empty partition, i.e. remove its boundary value
SET @rangeValue = 80000;
ALTER PARTITION FUNCTION fn_Partition_Product_ID()
    MERGE RANGE (@rangeValue);
-- Modify the partition scheme: mark the freed filegroup as the next one to be used
SET @filegroup = (SELECT [FILEGROUP_NAME] FROM dbo.fn_GetFileForPartition(N'Sch_Product_ID', 80000));
SET @SQL = N'ALTER PARTITION SCHEME Sch_Product_ID
    NEXT USED ' + QUOTENAME(@filegroup) + N';';
-- PRINT @SQL;
EXEC (@SQL);
-- Split off a new partition to accept new data
SET @splitValue = 380000;
ALTER PARTITION FUNCTION fn_Partition_Product_ID()
    SPLIT RANGE (@splitValue);
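After the MERGE and SPLIT, the 80000 boundary should be gone and 380000 should appear as the newest boundary. A quick check of the remaining boundary values (my own addition; the function name comes from this article):
-- List the remaining boundary values of the partition function
SELECT prv.boundary_id, prv.value
FROM sys.partition_range_values prv
JOIN sys.partition_functions pf
    ON pf.function_id = prv.function_id
WHERE pf.name = N'fn_Partition_Product_ID'
ORDER BY prv.boundary_id;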