Most organizations are drowning in data or, more accurately, in the storage required to hold it internally, much of which consists of redundant copies, and the flood shows no sign of abating.
In an era of cost-cutting and consolidation, we have to ask whether there is a better way to manage this excess. Unfortunately, companies have been fretting over the problem for decades, and so far the technologies deployed against the core issues of storage management, namely unbridled data growth, low resource utilization, and inefficient capacity planning, have achieved only limited success. The latest candidate is cloud storage. But will cloud storage prove any more effective than the approaches that came before it?
One of the main reasons behind the original move to SANs, replacing independent direct-attached storage with storage arrays and SAN infrastructure, was to increase efficiency by raising resource utilization. Although SAN systems also offer other benefits, such as improved availability and recoverability, utilization remains less than optimal in most cases.
The next big push was information lifecycle management. By tiering storage provisioning and placement according to the business value of the data, the theory went, organizations could cut costs by reducing their reliance on expensive high-end storage. In practice, many organizations ended up buying additional tiers of storage, while the cost savings were far from obvious, or at least fell well short of expectations.
The most recent efficiency technologies are thin provisioning, which is still expected to remain a niche technology because of current application and operating system constraints, and data deduplication, which is aimed primarily at making backup-to-disk cost-competitive with tape. Both thin provisioning and deduplication will help drive storage efficiency going forward.
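To make the deduplication idea concrete, here is a minimal sketch (not any vendor's actual implementation) that splits a byte stream into fixed-size chunks and stores each unique chunk only once, keyed by its hash; the chunk size and the in-memory chunk store are assumptions made purely for illustration.

```python
import hashlib
import io

CHUNK_SIZE = 4096  # assumed fixed chunk size for this example


def deduplicate(stream, store):
    """Split a byte stream into chunks, store each unique chunk once,
    and return the list of chunk hashes needed to reconstruct it."""
    recipe = []
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # new data: store the chunk itself
            store[digest] = chunk
        recipe.append(digest)        # duplicate data costs only a reference
    return recipe


def restore(recipe, store):
    """Rebuild the original stream from its chunk references."""
    return b"".join(store[digest] for digest in recipe)


# Two identical copies of the data are stored only once.
store = {}
data = b"same block of backup data " * 1000
recipe = deduplicate(io.BytesIO(data + data), store)
assert restore(recipe, store) == data + data
```

This is why deduplication pays off most for backup workloads: successive backups contain mostly identical chunks, so only the references grow, not the stored data.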
Nor can we ignore the fact that while organizations have struggled with storage efficiency, the hardware cost of storing data has continued to fall, and to fall rapidly. So why is the problem so hard to solve?
To a large extent, the answer is that most organizations still lack comprehensive storage and data management policies. The problem is exacerbated by a lack of metrics and reporting on data, storage capacity usage, and growth trends. Take data cleanup as an example: like diamonds, data, once created, is forever. It is typically stored, backed up, replicated, and possibly archived (all of which consume more storage), yet the likelihood that it will ever be cleaned up is very low.
Interestingly, the sorry state of storage management could represent a huge opportunity for cloud storage. The cloud can serve as a second, or more likely a third, storage tier. Based on aging and access policies, this data (primarily unstructured data) can be migrated to the cloud either manually or with automated data movers. Besides freeing up capacity and slowing the pace of hardware acquisition, the migrated data would no longer need to be backed up or replicated, eliminating the multiplier effect. Furthermore, if cloud service providers are truly service-oriented, they are likely to offer more comprehensive service-level agreements and reporting for this data than is available in-house.
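As a rough sketch of what such an aging policy might look like, the example below walks a directory tree and migrates files whose last access time exceeds a threshold. The threshold, the `upload_to_cloud` placeholder, and the stub-file behavior are all assumptions for illustration; a real data mover would add verification, auditing, and a recall path.

```python
import time
from pathlib import Path

AGE_THRESHOLD_DAYS = 180  # assumed policy: migrate files untouched for ~6 months


def upload_to_cloud(path: Path) -> None:
    """Hypothetical placeholder for the provider-specific upload call."""
    print(f"would upload {path} to the cloud tier")


def migrate_cold_files(root: Path) -> None:
    """Migrate files whose last access time is older than the threshold,
    leaving recently used data on primary storage."""
    cutoff = time.time() - AGE_THRESHOLD_DAYS * 86400
    for path in root.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            upload_to_cloud(path)
            # A real data mover might replace the file with a stub
            # or delete it once the upload has been verified.
```

Even a simple policy like this only works if the organization actually has the access and aging metrics to set the threshold sensibly, which loops back to the missing reporting discussed above.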
Obviously, moving data to the cloud is not a decision to be taken lightly. It raises a range of considerations, including security, availability, access, and control. Most importantly, keep in mind that while the cloud may offer attractive pricing for certain categories of data, cloud storage must be complemented by sound storage and data management strategies if it is actually to drive costs out of the system.