Problem
Database backups are among the most important day-to-day tasks of a database administrator. Backups are typically performed through maintenance plans, scheduled SQL Server Agent jobs, or third-party tools. Given their importance, it is necessary to periodically analyze the performance and efficiency of the backup process. So how do we get insight into the performance of a database backup process?
Solution
Let's look at what we can learn from a few different scripts. For the purposes of this article, I created a database called Backupreport, and took full, differential, and transaction log backups of it so that the reporting script has history to work with.
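As a point of reference, the test backups could have been taken with statements like the following. This is a sketch only: the backup file paths and backup names are my assumptions, not taken from the original article.

```sql
-- Full backup (file paths and names are illustrative assumptions)
BACKUP DATABASE Backupreport
TO DISK = N'C:\Backup\Backupreport_Full.bak'
WITH NAME = N'Backupreport Full Backup';

-- Differential backup
BACKUP DATABASE Backupreport
TO DISK = N'C:\Backup\Backupreport_Diff.bak'
WITH DIFFERENTIAL, NAME = N'Backupreport Differential Backup';

-- Transaction log backup (requires the FULL or BULK_LOGGED recovery model)
BACKUP LOG Backupreport
TO DISK = N'C:\Backup\Backupreport_Log.trn'
WITH NAME = N'Backupreport Log Backup';
```

Each of these operations writes a row to `msdb.dbo.backupset`, which is what the reporting script below reads.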
Script – Backup process statistics
SELECT s.database_name,
       m.physical_device_name,
       CAST(s.backup_size / 1000000 AS VARCHAR) + ' MB' AS bksize,
       CAST(DATEDIFF(second, s.backup_start_date, s.backup_finish_date) AS VARCHAR(4)) + ' Seconds' AS TimeTaken,
       s.backup_start_date,
       CASE s.[type]
            WHEN 'D' THEN 'Full'
            WHEN 'I' THEN 'Differential'
            WHEN 'L' THEN 'Transaction Log'
       END AS BackupType,
       s.server_name,
       s.recovery_model
FROM msdb.dbo.backupset s
INNER JOIN msdb.dbo.backupmediafamily m ON s.media_set_id = m.media_set_id
WHERE s.database_name = 'Backupreport'
ORDER BY database_name, backup_start_date, backup_finish_date;
Running the script against our example database returns one row for each backup taken.
Now we have statistics for all of the backup operations for a particular database. With this data, we can analyze trends over a specific period or compare the different backup types.
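As one possible next step, the same `msdb` history can be aggregated rather than listed row by row. The sketch below summarizes backup count, average size, and average duration per backup type; the 30-day window is an illustrative assumption, not part of the original article.

```sql
-- Summarize backup history per backup type for a chosen period
SELECT CASE s.[type]
            WHEN 'D' THEN 'Full'
            WHEN 'I' THEN 'Differential'
            WHEN 'L' THEN 'Transaction Log'
       END AS BackupType,
       COUNT(*) AS BackupCount,
       AVG(s.backup_size / 1000000) AS AvgSizeMB,
       AVG(DATEDIFF(second, s.backup_start_date, s.backup_finish_date)) AS AvgSeconds
FROM msdb.dbo.backupset s
WHERE s.database_name = 'Backupreport'
  AND s.backup_start_date >= DATEADD(day, -30, GETDATE())  -- last 30 days (assumption)
GROUP BY s.[type]
ORDER BY BackupType;
```

Comparing `AvgSeconds` and `AvgSizeMB` across periods is a simple way to spot backups that are growing or slowing down over time.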