As a SQL Server database grows with use, it can become increasingly slow. This is often because no reasonable maintenance plan is in place; dealing with index fragmentation regularly is one of the essential maintenance tasks. For more information, see MSDN:
http://msdn.microsoft.com/zh-cn/library/ms189858.aspx
At work I ran into a table with 3.2 million records whose data occupied more than 800 MB. Fragmentation on all of its indexes was above 80%, in some cases even 100%, and the indexes took up 500 MB; after the indexes were rebuilt, the index space dropped to a little more than 200 MB. Below is a script you can test on SQL Server 2005.
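The MSDN article above recommends choosing between REORGANIZE and REBUILD according to the measured fragmentation. As a rough sketch of that idea (my own addition, not part of the original script), the following query surveys every index in the current database and suggests an action using the commonly cited 5% / 30% thresholds; run it in the database you want to inspect.

-- Sketch: list fragmentation for all indexes in the current database and
-- suggest an action based on the usual thresholds (reorganize at 5-30%,
-- rebuild above 30%). Assumes SQL Server 2005 or later.
Select object_name(s.object_id) as table_name,
       b.name as index_name,
       s.avg_fragmentation_in_percent,
       case
           when s.avg_fragmentation_in_percent > 30 then 'ALTER INDEX ... REBUILD'
           when s.avg_fragmentation_in_percent > 5  then 'ALTER INDEX ... REORGANIZE'
           else 'no action needed'
       end as suggested_action
from sys.dm_db_index_physical_stats(db_id(), null, null, null, 'LIMITED') as s
join sys.indexes as b
  on s.object_id = b.object_id and s.index_id = b.index_id
where b.index_id > 0  -- skip heaps
order by s.avg_fragmentation_in_percent desc
Go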
-- Drop database db_index_test
-- Create a test environment
Create Database db_index_test
Go
Use db_index_test
Go
Create Table tbtest (rownum int identity(100, 1), id varchar(36), date datetime)
Go
Create index index_id on tbtest (id)
Go
-- Insert test data, deleting part of it along the way to create fragmentation
Declare @i int
Set @i = 1
While @i < 10
Begin
    Insert into tbtest (id, date)
    Select newid(), getdate() from syscolumns
    Delete from tbtest where rownum % 2 = 0
    Set @i = @i + 1
End
Go
-- Check the index
Select avg_fragmentation_in_percent
from sys.dm_db_index_physical_stats(db_id(), object_id(N'tbtest'), null, null, null) as a
join sys.indexes as b
  on a.object_id = b.object_id and a.index_id = b.index_id
where b.name = 'index_id'
Go
-- Rebuild the index
Alter index index_id on tbtest rebuild
Go
-- Check the index again
Select avg_fragmentation_in_percent
from sys.dm_db_index_physical_stats(db_id(), object_id(N'tbtest'), null, null, null) as a
join sys.indexes as b
  on a.object_id = b.object_id and a.index_id = b.index_id
where b.name = 'index_id'
Go
-- Delete the test environment
Use master
Go
Drop database db_index_test
Go
You can also check fragmentation and rebuild indexes manually in SQL Server Management Studio.
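If you prefer T-SQL to the GUI, a rough equivalent of the "Rebuild All" / "Reorganize All" actions in Management Studio is shown below (a sketch only, reusing the tbtest table from the test script above):

-- Rebuild every index on the table in one statement (comparable to "Rebuild All" in SSMS)
Alter index all on tbtest rebuild
Go
-- Or, for a lighter alternative when fragmentation is moderate
Alter index all on tbtest reorganize
Go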