Just a few years ago, data deduplication was a stand-alone feature, offered as an alternative to enterprise storage systems for backup and archiving. It also found new uses in cloud gateways, filtering out redundant chunks of data before they entered the array or virtual tape library. Today it has become a pre-integrated capability of unified storage systems, and understanding how to use the technology effectively has become a requirement. IT managers should therefore re-examine their storage issues and put the following questions to the vendors who supply their storage.
1. What is the impact of data deduplication on backup performance?
High performance is critical for large enterprises whose data is multiplying. At the same time, enterprises must protect massive data sets within limited backup windows. Understanding deduplication technology, and especially the performance gaps between product categories, is a critical factor in choosing the most appropriate technology for a particular environment.
2. Does data de-duplication reduce recovery performance?
Find out how long it takes to recover a typical file, for example from last week's backup, which is the most common category of recovery request. Ask the vendor whether the technology keeps the most recent backup available for instant recovery and for fast transfer to tape.
3. How will capacity and performance scale as the environment grows?
Calculate how much data a single deduplication system can store, based on the expected deduplication ratios, retention policies, data types, and growth rates. Also understand the impact of overflow: when your data exceeds the system's capacity, you may have to add another backup system, which introduces new costs in management complexity and capital expenditure and can disrupt the existing environment.
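The capacity calculation above can be sketched as a simple model. All figures here (dedup ratio, weekly backup size, growth rate) are illustrative assumptions, not vendor specifications:

```python
# Hypothetical sketch: estimate how many months a dedup appliance's
# capacity lasts, given assumed dedup ratio and data growth rate.

def months_until_full(raw_capacity_tb, weekly_backup_tb,
                      dedup_ratio, monthly_growth=0.03):
    """Return the number of whole months before stored (post-dedup)
    data exceeds the appliance's raw capacity."""
    stored = 0.0
    backup = weekly_backup_tb
    months = 0
    while True:
        # ~4.33 weekly full backups per month, reduced by the dedup ratio
        stored += backup * 4.33 / dedup_ratio
        if stored > raw_capacity_tb:
            return months
        months += 1
        backup *= 1 + monthly_growth  # the protected data set keeps growing

# 100 TB appliance, 20 TB weekly fulls, assumed 10:1 dedup, 3% monthly growth
print(months_until_full(raw_capacity_tb=100, weekly_backup_tb=20,
                        dedup_ratio=10))
```

Running the same model with a higher assumed dedup ratio or lower growth rate shows directly how sensitive the "time until overflow" is to those estimates, which is why vendors should be asked for realistic ratios for your data types.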
4. How efficient is data deduplication for large databases?
Ensure that deduplication remains effective on sub-8KB data segments while maintaining performance. Large mission-critical databases such as Oracle, SAP, SQL Server, and DB2 typically change data in segments of 8KB or less. However, most deduplication products cannot deliver comparable results on data segments smaller than 16KB without significantly slowing the backup process.
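Why segment size matters can be shown with a toy fixed-block deduplicator. The synthetic "database file" below (8KB pages, a few pages changed between backups) is an illustrative assumption, not a real workload:

```python
import hashlib

def dedup_ratio(data, chunk_size):
    """Split data into fixed-size chunks; return total chunks divided
    by unique chunks (higher means better deduplication)."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    unique = {hashlib.sha256(c).digest() for c in chunks}
    return len(chunks) / len(unique)

# Simulate two backups of a database file of 8KB pages where only a
# few pages change in place between backup runs.
page = 8192
base = bytearray(b"\x00" * (page * 128))        # 1 MB "backup 1"
changed = bytearray(base)
for p in (5, 40, 90):                            # 3 pages modified in place
    changed[p * page:(p + 1) * page] = b"\x01" * page
two_backups = bytes(base) + bytes(changed)

# 8KB chunks align with the changed pages; 16KB chunks straddle them,
# so each small change "dirties" a larger chunk and dedup drops.
print(dedup_ratio(two_backups, 8192))
print(dedup_ratio(two_backups, 16384))
```

The 8KB chunking isolates each changed page, while 16KB chunks mix changed and unchanged pages, so fewer chunks match, which is the penalty the question above is probing.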
5. How efficient is data de-duplication in a progressive incremental backup environment?
Some deduplication products are less efficient at deduplicating TSM progressive incremental backups and backups from applications that segment data, such as NetWorker and HP Data Protector. Ask the vendor whether the deduplication technology can use metadata from the backup application to identify the data regions that contain duplicates, and whether it can perform byte-level comparison while maintaining high performance.
6. What are the realistic expectations for capacity reduction?
What large enterprises need is not the highest deduplication ratio, but a solution that moves data safely within the backup window while still deduplicating effectively. Parallel processing, deterministic transfer rates, and combined deduplication and replication are the key factors in an enterprise environment.
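The backup-window constraint above is simple arithmetic worth making explicit. A minimal sketch, assuming illustrative figures for data volume, window length, and per-stream ingest rate:

```python
# Hypothetical sketch: check whether an appliance's sustained ingest
# rate can move a nightly backup inside the backup window.

def fits_window(data_tb, window_hours, ingest_tb_per_hour, streams=1):
    """Return (required rate in TB/h, whether parallel streams keep up)."""
    required = data_tb / window_hours
    available = ingest_tb_per_hour * streams
    return required, available >= required

# 60 TB to protect, 8-hour window, 4 TB/h per stream, 2 parallel streams
req, ok = fits_window(data_tb=60, window_hours=8,
                      ingest_tb_per_hour=4, streams=2)
print(f"need {req:.1f} TB/h, window met: {ok}")
```

The point of the model: a vendor's peak dedup ratio is irrelevant if the sustained, deterministic ingest rate across parallel streams cannot meet the required TB/h, which is the question to put to them.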
7. Can administrators monitor backups, data de-duplication, replication, and recovery at the enterprise level?
A holistic view of the data protection environment lets backup administrators manage more data, fine-tune the backup environment to optimize utilization and efficiency, and accurately plan the enterprise's future performance and capacity requirements.
8. Can data de-duplication help large enterprises reduce replication bandwidth requirements?
Some deduplication technologies perform byte-level comparisons that let organizations replicate only unique data across wide area networks, reducing WAN bandwidth requirements and shortening the time until data is safely off site.
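The bandwidth saving can be quantified with a back-of-the-envelope model. The 20:1 reduction figure and the link speed below are illustrative assumptions, not measured results:

```python
# Hypothetical sketch: WAN replication time with and without
# dedup-aware replication, which sends only unique bytes.

def replication_hours(backup_gb, wan_mbps, reduction_ratio=1.0):
    """Hours to replicate a backup over a WAN link, after dedup
    reduces the bytes actually sent by reduction_ratio."""
    bytes_sent = backup_gb * 1e9 / reduction_ratio
    seconds = bytes_sent * 8 / (wan_mbps * 1e6)
    return seconds / 3600

# 5 TB backup over a 622 Mbit/s link, raw vs. assumed 20:1 reduction
print(round(replication_hours(5000, 622), 1))
print(round(replication_hours(5000, 622, reduction_ratio=20), 2))
```

Replication time scales inversely with the reduction ratio, so even a modest dedup-aware replication capability can turn an unusable multi-hour transfer into one that fits overnight.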
9. Can IT departments meet their own needs by fine-tuning data de-duplication technologies?
An enterprise data protection environment may contain data types with specific deduplication requirements. Choose a deduplication technology that lets you select which data sets to deduplicate, apply backup strategies by data type, and detect data types automatically, so that the most efficient deduplication method is applied to each.
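A per-data-type policy of this kind can be sketched as a simple lookup with a fallback for auto-detected unknown types. The type names and method names here are hypothetical illustrations, not a real product's API:

```python
# Hypothetical sketch: a policy table mapping data types to a
# deduplication method, with a fallback for unknown types.

POLICY = {
    "database": "variable-block",  # small, aligned changes dedup best here
    "file":     "fixed-block",
    "image":    "none",            # pre-compressed data dedups poorly
}

def dedup_method(data_type):
    """Pick the configured method, defaulting to fixed-block."""
    return POLICY.get(data_type, "fixed-block")

print(dedup_method("database"))
print(dedup_method("video"))  # unknown type falls back to the default
```

The design point is simply that the method is chosen per data set rather than globally, which is the tunability the question above asks vendors about.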
10. How much experience does the vendor have backing up large enterprise environments?
Enterprise-class data centers with large data volumes and complex policies need the support of data protection vendors with expertise in enterprise-level backup applications such as NetBackup, NetBackup OST, and Tivoli Storage Manager. Vendors should provide assessment and guidance on optimizing the overall backup infrastructure, including replication, backup, and deduplication technologies, in large environments. (Li/compiling)
(Responsible editor: Lu Guang)