Storage has long been a major drag on efforts to reduce operating costs. Although the price of storage has fallen in recent years, enterprise data is growing far faster than storage costs are declining, so relieving the pressure that storage places on the enterprise remains a real test for IT staff.
Microsoft has introduced a surprising feature in Windows Server 2012 called Data Deduplication, which allows Windows Server 2012 to store more data in less physical space and achieve significantly higher storage efficiency than previous versions of the operating system.
To explain the principle of deduplication briefly: in operating systems prior to Windows Server 2012, data was stored purely at the file level, whereas with deduplication enabled the system stores data as "chunks". For example, suppose we store two files that share an identical part ABC and differ only in the parts MN and XY. Without deduplication we store the shared data twice; with deduplication enabled, only one copy of ABC needs to be stored, and that single copy is referenced by both files. This is, of course, purely an internal storage mechanism; when we access the files, they look exactly the same as they did before deduplication was enabled. Data deduplication in Windows Server 2012 is designed to run on the primary data volumes without any additional dedicated hardware, which means any server running Windows Server 2012 can use the feature.
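To make the chunk idea concrete, here is a minimal PowerShell sketch of hash-based deduplication. It is a conceptual illustration only: it uses fixed 4 KB chunks, SHA-256 hashes, and a hypothetical C:\Data folder, whereas the real Windows Server 2012 engine uses variable-size chunking and its own on-disk chunk store.

```powershell
# Conceptual sketch only: fixed 4 KB chunks and SHA-256 hashes to show
# the basic dedup idea. The real engine uses variable-size chunking.
$chunkSize  = 4KB
$chunkStore = @{}   # hash -> chunk bytes (each unique chunk stored once)
$fileMaps   = @{}   # file name -> ordered list of chunk hashes

$sha256 = [System.Security.Cryptography.SHA256]::Create()

foreach ($file in Get-ChildItem "C:\Data" -File) {    # example folder
    $bytes  = [System.IO.File]::ReadAllBytes($file.FullName)
    $hashes = New-Object System.Collections.Generic.List[string]

    for ($offset = 0; $offset -lt $bytes.Length; $offset += $chunkSize) {
        $len  = [Math]::Min($chunkSize, $bytes.Length - $offset)
        $hash = [BitConverter]::ToString($sha256.ComputeHash($bytes, $offset, $len))

        # Identical chunks in different files are stored exactly once
        if (-not $chunkStore.ContainsKey($hash)) {
            $chunkStore[$hash] = $bytes[$offset..($offset + $len - 1)]
        }
        $hashes.Add($hash)
    }
    $fileMaps[$file.Name] = $hashes
}

"Unique chunks stored: $($chunkStore.Count)"
```

If two files share the ABC portion from the example above, its chunks hash to the same values and land in the chunk store only once, while each file's map still lets it be reassembled in full.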
Deduplication is enabled per volume; for example, you can enable it on a single disk partition rather than for an entire disk. Reading, processing, and rewriting large amounts of data does consume a certain amount of server resources: according to Microsoft, running a deduplication job on a single volume requires 1 CPU core and 350 MB of free memory, and at that rate can process roughly 100 GB of data per hour, or about 2 TB per day. Data deduplication scales with additional CPU cores and available memory, allowing multiple volumes to be processed in parallel.
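For reference, enabling the feature takes only a few standard Deduplication cmdlets; a minimal sketch, assuming the data volume is E: (the drive letter is only an example):

```powershell
# Install the Data Deduplication role service
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on a single data volume (E: is an example)
Enable-DedupVolume -Volume "E:"

# Confirm the volume is enabled and view current space savings
Get-DedupVolume -Volume "E:"
```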
For example, if a server is equipped with 16 CPU cores and 16 GB of memory, the deduplication feature in its default background processing mode will use 25% of system memory, which in this case is 4 GB. Dividing that by 350 MB per volume, you can calculate that the server will process approximately 11 volumes at a time. If you add 8 GB of memory, the system will process about 17 volumes at a time. If you set the optimization job to run in throughput mode, the system will use up to 50% of system memory for the optimization job.
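If you want an optimization job to start immediately with the higher 50% memory ceiling rather than waiting for the background schedule, the Start-DedupJob cmdlet's -Memory parameter caps the percentage of physical memory the job may use; again assuming the example volume E::

```powershell
# Start an optimization job now, allowing it up to 50% of physical
# memory, mirroring the throughput-mode behavior described above
Start-DedupJob -Volume "E:" -Type Optimization -Memory 50

# Watch the running job's progress
Get-DedupJob
```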
Finally, we would like to mention that data deduplication requires the volume to be formatted as NTFS.
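Before enabling deduplication you can quickly confirm a volume's file system; a small check, again using the example drive letter E::

```powershell
# Deduplication in Windows Server 2012 requires NTFS
Get-Volume -DriveLetter E | Select-Object DriveLetter, FileSystem
```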
This article is from the IT Technology Sharing blog; please retain the source: http://mxyit.blog.51cto.com/4308871/1439922