Optimize Last
When you write a script, don't think about optimization from the start, because some of the code you carefully optimize may end up being discarded. Constantly thinking about optimization also lowers your productivity, because your scripting time is often more valuable than CPU time.
Using Filter Parameters
PowerShell can consume a lot of resources, because many cmdlets are designed to return large amounts of data. So whenever a cmdlet supports the -Filter, -Include, and -Exclude parameters, use them as much as possible.
First, if a command supports the -Filter parameter, an object-access API is usually hidden behind it. With -Filter, your code can execute very quickly, because the filtering is typically performed before the objects are even created. By contrast, -Include and -Exclude are applied after the objects have been created, just before they enter the pipeline, so they are less efficient than -Filter. Still, -Include and -Exclude keep unwanted objects out of the pipeline, which is also quite fast.
Sometimes several kinds of filtering should be combined. For example, suppose you want to find files by extension under drive D:. If you use *.htm as the -Filter condition, PowerShell applies the traditional file-system wildcard and returns only matching file objects. This is highly efficient, because the simple pattern is matched at the bottom by the Windows API itself. However, that API is very old and has a quirk: it effectively considers only the first three characters of the extension, so *.htm also matches files ending in .html. This is where -Filter and -Include make a powerful pair:

Dir D:\ -Filter '*.htm' -Include '*.html' -Recurse
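You can verify the performance difference yourself with Measure-Command. This is a sketch: the drive D:\ comes from the example above, and actual timings depend entirely on how many files the folder tree contains.

```powershell
# Fast: the wildcard is handed to the file-system API, so files that
# do not match are never turned into .NET objects at all.
$fast = Measure-Command {
    Get-ChildItem -Path D:\ -Filter '*.htm' -Recurse -ErrorAction SilentlyContinue
}

# Slower: every file object is created first; -Include then discards
# the non-matching ones before they enter the pipeline.
$slow = Measure-Command {
    Get-ChildItem -Path D:\ -Include '*.htm' -Recurse -ErrorAction SilentlyContinue
}

"-Filter: {0:N0} ms   -Include: {1:N0} ms" -f $fast.TotalMilliseconds, $slow.TotalMilliseconds
```

On a large folder tree the -Filter variant is usually noticeably faster, for exactly the reason described above.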
But keep one thing in mind: -Filter is very fast, but exactly how fast depends on the underlying API that -Filter calls. Let's look at an example:

Get-WmiObject -Class Win32_Product -Filter "Vendor LIKE '%microsoft%'"
This example queries all Microsoft products installed on the machine, and even though we use -Filter, it is still very slow. Why? Because here -Filter invokes the Windows Management Instrumentation (WMI) API: the condition is an expression in the WMI Query Language (WQL), and the filtering happens inside WMI itself.
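For contrast, the same result can be obtained with purely client-side filtering. In this sketch, WMI materializes every installed product and ships it across to PowerShell before Where-Object discards the non-matches, so it can be even slower than the WQL version:

```powershell
# Client-side filtering: every Win32_Product instance is created and
# returned by WMI; Where-Object then drops the ones whose Vendor
# property does not match the pattern.
Get-WmiObject -Class Win32_Product |
    Where-Object { $_.Vendor -like '*microsoft*' }
```

The general rule: push the filter as close to the data source as the API allows, but remember that the source itself sets the speed limit.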
Reduce the Resource Footprint
Performance optimization means reducing both time complexity and space complexity, but most of the time you cannot have both; you must choose one. For example, if you want to list all the files on drive D: and then do something with each file, you might use ForEach-Object to stream the file-system objects through the pipeline:

Get-ChildItem -Path D:\ -Recurse | ForEach-Object { Do-Something }
With this command, each file object gets extra wrapping as it crosses the pipeline boundary, so execution slows down noticeably; but it does not consume much memory, because only one object flows through the pipeline at a time.
Alternatively, you might use a foreach loop:

foreach ($file in (Get-ChildItem -Path D:\ -Recurse)) {
    Do-Something
}
This code executes very quickly because it avoids the pipeline boundary. However, it collects all the file objects before any of them are processed, so if the collection is very large, it can devour an enormous amount of system memory.
In short, the foreach loop is faster but consumes more memory than ForEach-Object. If you can be sure that the data you are dealing with is not very large, the foreach loop is certainly the right choice.
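The trade-off is easy to measure. In this sketch, the Windows directory stands in for any moderately sized folder; the timings and their ratio will vary from machine to machine:

```powershell
# Pipeline: objects stream through one at a time. Memory stays low,
# but every object pays the pipeline's per-object overhead.
$pipeline = Measure-Command {
    Get-ChildItem -Path $env:WINDIR | ForEach-Object { $_.Length }
}

# Loop: the whole collection is built in memory first, then iterated
# without any pipeline overhead.
$loop = Measure-Command {
    foreach ($file in (Get-ChildItem -Path $env:WINDIR)) { $file.Length }
}

"pipeline: {0:N0} ms   loop: {1:N0} ms" -f $pipeline.TotalMilliseconds, $loop.TotalMilliseconds
```

The loop typically wins on time while the pipeline wins on memory, which is exactly the choice described above.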
Use Start-Sleep to reduce CPU usage
PowerShell scripts that touch many objects usually run for a long time and are not kind to the processor. Now that single-core CPUs are a thing of the past, this may matter less on today's multi-core machines, but it can still force the system to spend a lot of time waiting. If your script consumes a lot of CPU cycles, or needs to wait for something to happen, you can use the Start-Sleep command to reduce processor usage. By default Start-Sleep waits in whole seconds, which is usually unbearable; instead, specify a millisecond-level pause. A resolution below 10 to 20 milliseconds no longer makes sense, so a pause of about 20 milliseconds is a sensible minimum.
In addition, you may not want your script to sleep in every iteration, only in some of them, to free up time for the CPU to do other things. The following script uses the modulo operator % to pause once every 10 rounds:
$i = 0
Get-ChildItem -Recurse |
    ForEach-Object {
        $i += 1
        if ($i % 10 -eq 0) {
            Start-Sleep -Milliseconds 20
        }
        Do-Something
    }
A simple optimization scheme
These tips can be combined into a single optimization scheme. First of all, don't rush to optimize until the script is finished. Then use -Filter to reduce the number of objects returned, and fine-tune the result with -Include and -Exclude; this reduces both elapsed time and resource usage. On top of that, if you have a lot of data to process, a foreach loop instead of ForEach-Object will make your script more efficient, although at the scale of hundreds of thousands of objects the memory it needs may introduce new performance problems. Finally, if you find your script's CPU usage staying high, see whether you can add Start-Sleep in some loops to give the processor a rest.