Original: Database management -- PowerShell -- using PowerShell scripts to find the files that consume the most disk space
Translated from:
http://www.mssqltips.com/sqlservertip/2774/powershell-script-to-find-files-that-are-consuming-the-most-disk-space/
Note: CSDN's editor is quite poor and has mangled the formatting of my scripts, so please bear with it.
In routine backup work, you will sooner or later run into insufficient disk space. To prevent this, you can check disk usage regularly; since this is hard to do with pure SQL statements, you can use PowerShell to implement it instead. Here we use Get-ChildItem:
Syntax:
Get-ChildItem [[-Path]] [[-Filter]] [-Include] [-Exclude] [-Name] [-Recurse] [-Force] [<CommonParameters>]
First, open PowerShell. Note that the original article shows two ways to open it:
To get more information about Get-ChildItem, you can execute the following statements in PowerShell:
## For detailed information:
Get-Help Get-ChildItem -Detailed
## For technical information, type:
Get-Help Get-ChildItem -Full
First, let's look at some examples of Get-ChildItem.
The first example lists the files and folders in the current directory. Although PowerShell is case-insensitive, it is recommended to use consistent capitalization:
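The original shows this first example as a screenshot; the command it runs is simply Get-ChildItem with no arguments:

```powershell
# List the files and folders in the current directory
Get-ChildItem
```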
The second example sorts by name in descending order:
Get-ChildItem C:\Python27 | Sort-Object -Property Name -Descending
The results are as follows:
The third example uses the -Recurse parameter to list the contents of a folder and its subfolders:
Get-ChildItem C:\SP2 -Recurse
Get the results:
You can use the -Include/-Exclude parameters to match or exclude files by specific criteria. You can limit the number of rows in the output with Select-Object -First <number of rows> (top to bottom) or -Last <number of rows> (bottom to top).
Get-ChildItem E:\db\*.* -Include *.ldf,*.mdf | Select-Object Name,Length -Last 8
Get the following results:
You can use the Where-Object cmdlet to filter the results by specific criteria. The Where-Object condition is enclosed in curly braces {} and refers to the current object with the $_ prefix. PowerShell provides the following comparison operators:
- -lt Less than
- -le Less than or equal to
- -gt Greater than
- -ge Greater than or equal to
- -eq Equal to
- -ne Not equal to
- -like uses wildcards for pattern matching
Get-ChildItem E:\db\*.* -Include *.mdf | Where-Object {$_.Name -like "t*"}
Since the test database I created starts with T, I get the following results:
Back to the main topic.
You can use the following script to find large files. In it, you must define $path (the path to search), $size (the minimum file size to look for), $limit (the maximum number of rows to return), and $Extension (the file extension to match).
Here I have deviated slightly from the original and changed the path and file names to my local ones: find the top five files under E:\DB and its subdirectories that are larger than 1 MB and have the .mdf extension.
## Mention the path to search the files
$path = "E:\"
## Find the files greater than or equal to the below mentioned size
$size = 1MB
## Limit the number of rows
$limit = 5
## Find out the specific extension file
$Extension = "*.mdf"
## Script to find the files based on the above input
$largeSizefiles = Get-ChildItem -Path $path -Recurse -ErrorAction "SilentlyContinue" -Include $Extension |
    ? { $_.GetType().Name -eq "FileInfo" } |
    Where-Object { $_.Length -gt $size } |
    Sort-Object -Property Length -Descending |
    Select-Object Name,
        @{Name="SizeInMB"; Expression={$_.Length / 1MB}},
        @{Name="Path"; Expression={$_.Directory}} -First $limit
$largeSizefiles
Get the following results:
You can save the script as filename.ps1 and then run it in PowerShell by prefixing the name with ./, as follows:
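Assuming the script was saved as FindLargeFiles.ps1 (the name here is just an example) in the current directory, the invocation would look like:

```powershell
# Run the saved script from the current directory;
# the explicit .\ prefix is required by PowerShell.
.\FindLargeFiles.ps1
```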
Note that because script execution is disabled by default on Windows 7, the first run will produce a red error message; this can be resolved by following the steps to change the execution policy.
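One common way to clear that error (run PowerShell as Administrator; the specific policy you choose should match your environment's requirements) is:

```powershell
# Allow locally created scripts to run; scripts downloaded
# from the internet must still be signed.
Set-ExecutionPolicy RemoteSigned
```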
You can also use Export-Csv to export the results to a CSV file for viewing:
After executing the script, the LsfReport.csv file will appear on the C drive, and you can open it to view the results.
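A minimal sketch of that export, assuming $largeSizefiles holds the results from the script above and C:\ is writable:

```powershell
# Export the result set to CSV;
# -NoTypeInformation drops the "#TYPE ..." header line.
$largeSizefiles | Export-Csv -Path "C:\LsfReport.csv" -NoTypeInformation
```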
SQL Server 2008 Agent jobs include a step type that executes PowerShell scripts, so you can add some conditional logic to the statements above and take the appropriate action when file sizes approach or exceed a threshold:
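A sketch of such a check (the 100 MB threshold and the Write-Warning action are assumptions; replace them with your own handling, such as sending mail or failing the job step):

```powershell
# Hypothetical threshold check over the results from the script above
$threshold = 100MB
foreach ($file in $largeSizefiles) {
    # SizeInMB is the calculated property defined in the earlier script
    if (($file.SizeInMB * 1MB) -ge $threshold) {
        # Replace with your own handling: email alert, log entry, job failure...
        Write-Warning ("File {0} is {1:N2} MB, at or over the threshold" -f $file.Name, $file.SizeInMB)
    }
}
```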