Combining Microsoft Azure and Hyper-V to build cloud storage

Source: Internet
Author: User
Keywords: Microsoft, Hyper-V, Windows Azure, Windows Server 2012 R2

Hierarchical Storage Management (HSM) was born in the mainframe era, when even a 300MB "high-capacity" disk cost tens of thousands of dollars. At that point, administrators had no choice but to look for a slower but cheaper storage repository built outside the disk array.

Yet even though many newer storage technologies have since emerged, HSM systems still play a role in certain environments today. For example, Australia's CSIRO (the Commonwealth Scientific and Industrial Research Organisation) still relies heavily on HSM schemes.

Storage hardware is now enjoying a golden age of its own: 200TB of consumer-grade disk capacity costs only about £20,000 (for the disks themselves).

If you had asked me six months ago whether a traditional HSM system was still a good choice, my answer would have been a clear "it depends on the circumstances." Today, my answer is different: "absolutely not."

While Microsoft's new on-premises flavour of Azure may one day allow users to create a managed cloud, compatible with Azure and Hyper-V, anywhere in the world, it is still in the envisioning phase.

For now, only Hyper-V and its Azure-like local cloud creation mechanism are actually available. Combine it with Windows Server 2012 R2 and you get a well-matched pair.

As a result, we can set up our own private cloud to compete with the Azure instances hosted by Microsoft. That turns the question "should we run our own private cloud solution?" into "why shouldn't we?"

Let us analyze this in detail.

Hot and cold

The first thing we need to do is set aside the baggage the term HSM carries. HSM had its moment of glory, but its life cycle is now in its twilight.

In place of HSM, I suggest a structurally similar but more current term: AST, or automated storage tiering.

In essence, this is just a way of describing the different classes of storage within the datacenter, and the concept should be familiar to any technician who has worked with virtualized data centers.

We can break a typical AST scheme down as follows:

Hot data: PCIe-based flash drives and Serial Attached SCSI solid-state drives (SSDs).

Cold data: traditional hard disk drives (HDDs).

As disk and SSD supply keeps rising and prices fall sharply, most of us are now starting to run multiple classes of storage side by side. AST manages data placement automatically, ensuring that hot data stays on the SSDs while cold data remains on the HDDs.
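The placement decision at the heart of AST can be sketched in a few lines. This is a minimal illustration, not any vendor's actual engine: the class name, field names, and the access-count threshold are all hypothetical.

```python
# Minimal sketch of an AST placement decision. All names and the
# HOT_THRESHOLD value are illustrative assumptions, not a real product's API.
from dataclasses import dataclass

@dataclass
class FileStats:
    path: str
    accesses_last_24h: int  # access count as tracked by the tiering engine

HOT_THRESHOLD = 10  # accesses/day above which data counts as "hot"

def choose_tier(stats: FileStats) -> str:
    """Return the tier a file should live on: SSD for hot data, HDD for cold."""
    return "ssd" if stats.accesses_last_24h >= HOT_THRESHOLD else "hdd"

files = [
    FileStats("/db/orders.ibd", accesses_last_24h=480),       # busy database file
    FileStats("/archive/2009-logs.tar", accesses_last_24h=0), # cold archive
]
for f in files:
    print(f.path, "->", choose_tier(f))
```

A real tiering engine works at the block or extent level rather than per file, and moves data in the background, but the hot/cold split it enforces is the same.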

That sounds good, but in practice it is less rosy. Tiered storage solutions are not easy to set up or manage. Some vendors do offer out-of-the-box products, but these usually come with vendor lock-in, and their real-world costs and additional restrictions are endless. Honestly, that is not an ideal solution.

Seen another way, the Storage Spaces mechanism in Windows Server 2012 R2, with its solid results and excellent performance, is a lifesaver worth every user's attention.

Microsoft has concentrated years of experience from its Azure cloud services into this feature, hoping to raise service levels and wrap them in a consistent set of interfaces and back ends. It works with ordinary JBOD enclosures (that is, "just a bunch of disks") and directly attached storage devices, and allows users to pool those resources.

The Storage Spaces mechanism also lets you create an automated storage tier with a three-step setup. Although the feature has not been rigorously tested (real-world results may fall short of expectations) and has obvious support limitations, you can build your own point-to-point virtual private network, connect remote managed devices to a Windows Server 2012 storage system, and expose the whole thing as a single iSCSI storage space.

If you have a local datacenter storage repository, then with the right links in place you can get the same performance as an existing local HSM.

Consider costs

To achieve the desired result, we first need to set up a test environment, complete a proof of concept, and then test the water with a single test case and server. For some Azure hosting mechanisms we will also need review and approval from the CIO.

At this stage we usually need to contact Microsoft's local representative to discuss the actual cost. If you are already a large Microsoft customer, you have a chance of attractive discounted prices; if not, I'm afraid you will be paying list price as published on the Azure website.

If your change-management processes are weak, you will need to make sure that none of these changes seriously disrupts the business, otherwise the business itself could be shaken to the point of collapse.

For example, we might need 200TB of storage, because our real goal is to migrate all the data currently held in the HSM array into the cloud.

To do this locally, we might pay up to £70,000 over the next three years for disks, chassis, servers, and the matching power and cabling.

According to the pay-as-you-go price list on the Azure website, the actual cost works out at roughly three times the cost of buying the hardware yourself.

Microsoft will offer a discount of up to 32% if you pay a full year's usage fee in advance. But even if that discount stretched to a third, hosting the data in Microsoft's cloud would still cost a great deal: more than £130,000 above the cost of running the local cloud.
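The prepay-discount arithmetic can be sketched directly. The inputs here (a £70,000 local build over three years, pay-as-you-go at roughly three times that, up to 32% off for prepayment) are the article's own rough estimates, not current Azure pricing, and the article's quoted premium of over £130,000 presumably reflects costs beyond raw storage capacity.

```python
# Rough cost sketch using the article's own estimates (assumptions, not
# real Azure pricing): local build vs. pay-as-you-go cloud over 3 years.
local_3yr = 70_000               # GBP: disks, chassis, servers, cabling
azure_payg_3yr = 3 * local_3yr   # article: roughly three times the DIY cost
prepay_discount = 0.32           # up to 32% off for paying a year in advance

azure_discounted_3yr = azure_payg_3yr * (1 - prepay_discount)
premium_over_local = azure_discounted_3yr - local_3yr

print(f"Azure pay-as-you-go over 3 years: GBP {azure_payg_3yr:,}")
print(f"With 32% prepay discount:         GBP {azure_discounted_3yr:,.0f}")
print(f"Premium over local build:         GBP {premium_over_local:,.0f}")
```

Even with the full discount applied, the cloud option remains a multiple of the local build cost, which is the article's point.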

Once the data has been handed over to the cloud, we face a new problem: the fallout of the data-sovereignty controversy.

Microsoft is working to build and launch locally hosted Azure solutions in various countries around the world, but these efforts are still in research and development, with no real products available. Microsoft has made no guarantees, so even once the groundwork is fully laid, we may not find it easy to migrate the service, or to get a locally hosted cloud at a reasonable cost.

You may think this is no big deal: we can, of course, encrypt the virtual hard disk and any data saved to it in future. But are we really sure the software and encryption we are about to rely on will never be compromised?

From a data-sovereignty point of view, the cloud option still offers the weakest assurances.

Core Attraction

To sum up, remotely hosted cloud products may simply cost too much at scale. So is there a form of use we can actually afford? And can we avoid the data-sovereignty problems that cloud services raise?

In fact, the answer is yes. When describing how Azure helps users manage data at the datacenter level, Microsoft highlights one significant advantage of remote hosting: an encrypted backup storage service.

Under the pay-as-you-go plan, 50TB of storage costs $3,333 per month (per Microsoft's official website; the amount varies by country and billing currency).

At first glance most business users would struggle with such costs, since total spending reaches $119,988 over a 36-month usage cycle. But given that Azure's backup service can offset part of that spending, and that a reliable backup solution can significantly reduce a company's losses from accidents, Microsoft's cloud solution does have a genuine core appeal.
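The 36-month figure follows directly from the quoted monthly rate:

```python
# Backup storage cost over a 36-month cycle, at the article's quoted rate.
monthly_50tb = 3333   # USD per month for 50TB, per the article
months = 36

total = monthly_50tb * months
print(f"36-month total for 50TB of Azure backup: ${total:,}")  # prints $119,988
```

Whether that total is acceptable depends on what a lost dataset would cost the business, which is the trade-off the article weighs.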

Start Small

If you realize that Microsoft has wired Azure into the backup and restore facilities of Windows Server 2012 R2, your confidence in this cloud scenario will grow further.

So even if we cannot keep all of our customers' data in Azure, we can at least entrust it with our own business data.

As a cloud service, Azure is an ideal choice for development and small-scale storage requirements, but at large business scale it is simply unaffordable.

Microsoft is working hard to extend Azure hosting to every region; once that happens, or once Azure becomes fully deployable on premises, do not miss the historic opportunity.

For now, treat Azure as an excellent and reliable online backup service, and keep an eye on how it develops.
