Windows 8.1 Data deduplication-concept (i)


Function description

Data deduplication finds and removes duplicate data without affecting its fidelity or integrity. The goal is to split files into small variable-size chunks (32-128 KB), identify duplicate chunks, and keep a single copy of each chunk so that more data fits in less space. Each redundant copy of a chunk is replaced by a reference to that single copy. The chunks are compressed and organized into special container files in the System Volume Information folder.
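The idea above can be sketched in a few lines of Python. This is a minimal illustration, not Windows' actual implementation: it uses fixed-size chunks for simplicity (Windows finds variable-size 32-128 KB chunk boundaries with a rolling hash), and the chunk store layout and names here are hypothetical.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # stand-in for the 32-128 KB variable-size range

def split_into_chunks(data: bytes, size: int = CHUNK_SIZE):
    """Split a byte stream into fixed-size chunks (simplification)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def deduplicate(files: dict):
    """Keep one copy of each unique chunk; files become lists of chunk references."""
    chunk_store = {}   # chunk hash -> chunk bytes (single stored copy)
    file_map = {}      # file name  -> ordered list of chunk hashes
    for name, data in files.items():
        refs = []
        for chunk in split_into_chunks(data):
            digest = hashlib.sha256(chunk).hexdigest()
            chunk_store.setdefault(digest, chunk)  # store only the first copy
            refs.append(digest)
        file_map[name] = refs
    return chunk_store, file_map

def restore(name: str, chunk_store: dict, file_map: dict) -> bytes:
    """Rebuild a file from its chunk references (what a reparse point enables)."""
    return b"".join(chunk_store[d] for d in file_map[name])

# Two files sharing most of their content deduplicate well.
files = {
    "a.vhd": b"A" * CHUNK_SIZE * 3,
    "b.vhd": b"A" * CHUNK_SIZE * 2 + b"B" * CHUNK_SIZE,
}
store, refs = deduplicate(files)
raw = sum(len(d) for d in files.values())     # 6 chunks of raw data
stored = sum(len(c) for c in store.values())  # only 2 unique chunks stored
print(f"raw: {raw} bytes, stored: {stored} bytes")
```

The six raw chunks contain only two distinct chunk contents, so the store holds a third of the original bytes while both files remain fully restorable.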


After you enable deduplication for a volume and optimize the data, the volume contains the following:

    • Unoptimized files: for example, files that do not meet the selected file age policy settings, system state files, alternate data streams, encrypted files, files with extended attributes, files smaller than 32 KB, other reparse point files, or files currently in use by other applications.

    • Optimized files: files stored as reparse points that contain pointers to a map of the individual chunks in the chunk store needed to restore the file when it is requested.

    • Chunk store: the location where the optimized file data (the chunks) resides.

    • Additional free space: the optimized files and the chunk store together occupy much less space than the files did before optimization.

Practical application

To cope with the growth of enterprise data storage, administrators are consolidating servers and treating capacity scaling and data optimization as key goals. Data deduplication provides a practical way to achieve these goals:

  • Capacity optimization: data deduplication in Windows Server 2012 stores more data in less physical space. It achieves greater storage efficiency than single-instance storage (SIS) or NTFS compression. Deduplication uses sub-file, variable-size chunking together with compression; a general-purpose file server typically sees an overall optimization ratio of about 2:1, while virtualization data can reach up to 20:1.
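As a quick sanity check on what those ratios mean in reclaimed space:

```python
def savings(ratio: float) -> float:
    """Fraction of raw capacity reclaimed at a given optimization ratio."""
    return 1 - 1 / ratio

# A 2:1 ratio halves the space used; 20:1 reclaims 95% of it.
print(f"2:1  -> {savings(2):.0%} saved")
print(f"20:1 -> {savings(20):.0%} saved")
```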

  • Scalability and performance: in Windows Server 2012, deduplication is highly scalable, uses resources efficiently, and is designed not to interfere with other work. It can process roughly 20 MB of data per second and can run on multiple volumes simultaneously without impacting other workloads on the server. It keeps its impact on server workloads low by throttling its CPU and memory consumption, and if the server becomes too busy, deduplication can back off entirely. Administrators also gain flexibility: they can run deduplication jobs on demand, set schedules for when deduplication runs, and establish file selection policies.

  • Reliability and data integrity: data integrity is maintained when data deduplication is applied. Windows Server 2012 uses checksums, consistency checks, and identity validation to ensure integrity. In addition, deduplication keeps redundant copies of all metadata and of the most frequently referenced data, so that data remains recoverable if a copy becomes corrupted.

  • Bandwidth efficiency with BranchCache: through integration with BranchCache, the same optimization techniques are applied to data transferred over the WAN to a branch office. The result is faster file downloads and lower bandwidth consumption.

  • Optimization management with familiar tools: Windows Server 2012 builds the optimization features into Server Manager and Windows PowerShell. The default settings deliver savings immediately, and administrators can fine-tune them for greater savings. Windows PowerShell cmdlets make it easy to start an optimization job or schedule one to run in the future. You can also install the deduplication feature and enable deduplication on selected volumes with an Unattend.xml file that calls a Windows PowerShell script and can be used with Sysprep to deploy deduplication when the system first boots.
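As a sketch of what this looks like in practice, the PowerShell cmdlets involved are along these lines on a Windows Server 2012 machine (run from an elevated session; the volume letter E: is just an example):

```powershell
# Install the deduplication role service.
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable deduplication on a data volume.
Enable-DedupVolume -Volume "E:"

# Start an optimization job immediately instead of waiting for the schedule.
Start-DedupJob -Volume "E:" -Type Optimization

# Check the savings once the job has run.
Get-DedupStatus -Volume "E:"
```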

Environmental requirements

To take advantage of data deduplication in Windows Server 2012, the environment must meet the following requirements:

    • Server: a physical computer or virtual machine running Windows Server 2012, with at least one data volume;

    • Other (optional): a computer running Windows Server 2012 or Windows 8 that is connected to the server over the network.


The follow-up post, Windows 8.1 Data Deduplication - Planning and Deployment (ii), will continue to lift the veil on the Windows 8.1 deduplication feature. Let's wait and see.

Good night, everyone. A new week has begun; I wish you a happy mood every day, a happy life, and smooth work.

This article is from the "Heard" blog; please be sure to keep this source: http://wenzhongxiang.blog.51cto.com/6370734/1635887
