This blog provides an overview of data operations in cloud computing solutions.
Overview
Data is a crucial part of most solutions. In cloud computing, most existing best practices can be applied directly. However, cloud computing also has unique characteristics of its own. This blog will discuss the following two use cases:
- Publish your data stored in the cloud to the world
- Use your local data in a cloud project
General recommendations
These suggestions are generic and apply regardless of the use case.
Select a protocol
In the SOA world, the most important concept is the contract. In cloud computing, contracts are likewise the most important concept in communication. When a contract is adopted by many cloud computing solutions, we can call it a protocol.
Here we only discuss data communication. If you choose Microsoft's solution, we recommend the Open Data Protocol (OData). OData is built on international standards such as HTTP and AtomPub, and provides a cross-platform data communication solution. If your cloud program uses OData to publish data, any program in the world that supports the OData standard can consume your data. Similarly, your cloud applications can use OData to access your local data.
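For example, assuming a hypothetical service address and entity set (both placeholders, not part of any real service), OData exposes data through plain HTTP requests such as:

```
GET https://example.com/MyService.svc/Products
    (all entities in the Products entity set, returned as AtomPub by default)
GET https://example.com/MyService.svc/Products(1)
    (the single entity whose key is 1)
GET https://example.com/MyService.svc/Products?$filter=Price gt 10&$orderby=Name
    (server-side filtering and sorting through OData system query options)
```

Because these are ordinary HTTP requests, any platform with an HTTP client can consume the data.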
Many current Microsoft products already use OData, for example Windows Azure Table Storage, Dallas, SharePoint 2010, and SQL Server 2008 R2.
If you plan to use another protocol, it is necessary to carefully consider its scalability, how widely it is adopted, and so on.
Select a technology
Once the protocol has been selected, the next step is to select a technology to implement it.
If you choose Microsoft's solution, we recommend that you use WCF to handle communication between all programs. For data communication, WCF Data Services is the natural choice.
First, a WCF Data Service is a WCF service, so all of your existing WCF knowledge applies. Second, WCF Data Services implements the OData protocol, so you can focus on how your data is represented in your program, rather than on the actual AtomPub/JSON format transmitted over the network. In addition, WCF Data Services focuses on data transmission rather than data storage: your data can be stored anywhere, such as local databases, cloud databases, external web services, or XML files. No matter where the data comes from, you can publish and consume it in the same way.
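As a minimal sketch of what this looks like in code (the model class NorthwindEntities and the Products entity set are assumed names, not taken from the samples mentioned below), a WCF Data Service over an Entity Framework model can be as short as this:

```csharp
using System.Data.Services;
using System.Data.Services.Common;

// Hypothetical service exposing an assumed Entity Framework model
// class named NorthwindEntities as an OData feed.
public class ProductsService : DataService<NorthwindEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose only the Products entity set, read-only.
        config.SetEntitySetAccessRule("Products", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion =
            DataServiceProtocolVersion.V2;
    }
}
```

The service class only declares what to expose; the AtomPub/JSON serialization on the wire is handled entirely by the runtime.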
If you choose another technology, it is necessary to carefully consider how much effort is required to complete your solution with it, whether the technology allows the solution to expand in the future, and so on.
Next, let's take a look at how Microsoft products help you implement the two use cases above.
Publish your data stored in the cloud to the world
Many cloud computing solutions are not isolated and need to interact with the external world. When it comes to data, you probably immediately think of DaaS (Data as a Service).
Data in the cloud can be stored in many places, and the data itself is also very diverse. This article focuses on structured data (such as XML) and relational data (such as relational databases). Currently, Microsoft provides two products for storing data in the cloud:
- Windows Azure Table Storage: used to store structured data; uses a dynamic schema.
- SQL Azure: used to store relational data; uses a static schema.
The following table compares the advantages of static and dynamic schemas.

| | Static schema | Dynamic schema |
| --- | --- | --- |
| Advantages | The relational model has been used for decades, so most developers are already very familiar with it | Third-party programs can modify the data structure as needed, which suits services that allow the external world to update data |
Select the data storage that fits your specific scenario. Generally, if your service grants write access to the external world (that is, it allows external programs to update data), a dynamic schema is a good choice, because third-party programs may need to modify the data structure you provide. However, Windows Azure Table Storage currently has some limitations and does not implement all OData features. In addition, the relational model has been in use for decades, and your developers are probably very familiar with it, so if the cost of adopting a dynamic schema is too high for you, choose a static schema.
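To make the dynamic schema concrete, here is a hedged sketch of a Table Storage entity using the StorageClient library (the entity and its properties are invented for illustration):

```csharp
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical Table Storage entity. Only PartitionKey, RowKey and
// Timestamp are fixed by the store; entities in the same table can
// otherwise carry different properties, which is the dynamic schema.
public class ProductEntity : TableServiceEntity
{
    public ProductEntity() { }

    public ProductEntity(string category, string productId)
        : base(category, productId) { }

    public string Name { get; set; }
    public double Price { get; set; }
}
```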
No matter which schema you choose, OData and WCF Data Services can play a very important role.
As mentioned earlier, WCF Data Services can work with any data source. By default, it provides two data providers: the ADO.NET Entity Framework (EDM) and LINQ to SQL (L2S). If you use one of these two data sources, you usually only need to write a small amount of code to complete a project. If you choose SQL Azure to store your data, you can use either EDM or L2S as the data source.
If you use other data sources (such as Windows Azure Table Storage), you need to map your data model to a model that WCF Data Services understands. If your data is read-only, this process is very simple: you only need to write an ordinary class that represents your data structure. If you need full CRUD functionality, you must also implement the IUpdatable interface. This is called the "reflection provider" of WCF Data Services. In more advanced scenarios, you can also write a custom data service provider. For more information, see http://msdn.microsoft.com/en-us/library/dd672591(VS.100).aspx.
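As a hedged sketch of the read-only case (all names below are invented for illustration), the reflection provider needs nothing more than a plain class marked with its key and a context class whose IQueryable properties become entity sets:

```csharp
using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;

// A plain class describing the data structure; DataServiceKey tells
// the reflection provider which property is the entity key.
[DataServiceKey("Id")]
public class Order
{
    public int Id { get; set; }
    public string Customer { get; set; }
}

// Each public IQueryable<T> property becomes an OData entity set.
public class OrderContext
{
    private static readonly List<Order> orders = new List<Order>
    {
        new Order { Id = 1, Customer = "Contoso" }
    };

    public IQueryable<Order> Orders
    {
        get { return orders.AsQueryable(); }
    }
}

// The service exposes the context exactly as it would an EDM model.
public class OrderService : DataService<OrderContext>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Orders", EntitySetRights.AllRead);
    }
}
```

For full CRUD, OrderContext would additionally implement IUpdatable.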
Windows Azure Table Storage also uses the OData protocol, so you may be tempted to let your customers access your data source directly. In most cases, do not do this. You must do your best to protect the key of your storage account (think of it as your password). If you give that key to a trusted user so that he/she can access your table storage directly, and he/she abuses this permission, you will end up paying the bill for your storage account. We recommend that you encapsulate the data and business logic in a service; WCF Data Services is a good choice for this task.
You can download an example from All-In-One Code Framework (Azure).zip that demonstrates how to use WCF Data Services to publish data stored in Windows Azure Table Storage to the world. The example name is CSAzureTableStorageWCFDS/VBAzureTableStorageWCFDS. The example also provides a Silverlight client for testing the service.
Use your local data in a cloud project
Another common scenario is to use your local data in a cloud project. In most cases, the data is stored with a static schema in a relational database such as SQL Server, so you generally do not need to think about how to store it. In this scenario, you care more about connectivity and security.
Many companies sit behind firewalls and NAT, and it is difficult to find a machine that is reachable from the Internet and has a fixed IP address, so it is hard for a program in the cloud to connect directly to a local database. Permission control is also a problem: the cloud program is not in your company's LAN and is not in the same domain as the database, so integrated Windows authentication cannot be used, and federated authentication does not provide a good solution for databases.
To solve the first problem, Microsoft provides the Windows Azure platform AppFabric Service Bus. The Service Bus acts as a bridge between your local service and the cloud program. The local service is actually a client of the Service Bus, so even if the local server sits behind NAT, it can still communicate with the Service Bus. The Service Bus forwards messages sent by your cloud programs to your local service.
The Service Bus supports both TCP and HTTP. Most firewalls allow at least outbound connections through ports 80/443, which is also the minimum requirement of the Service Bus. This is how the Service Bus traverses NAT and firewalls.
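As a minimal sketch (the service namespace, issuer secret, and service contract below are all placeholders), a local service registers itself with the Service Bus roughly like this:

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Description;
using Microsoft.ServiceBus;

[ServiceContract]
public interface ILocalDataService
{
    [OperationContract]
    string GetData(int id);
}

public class LocalDataService : ILocalDataService
{
    public string GetData(int id) { return "data " + id; }
}

class Program
{
    static void Main()
    {
        ServiceHost host = new ServiceHost(typeof(LocalDataService));

        // "mynamespace" is a placeholder for your AppFabric service namespace.
        Uri address = ServiceBusEnvironment.CreateServiceUri(
            "sb", "mynamespace", "DataService");
        ServiceEndpoint endpoint = host.AddServiceEndpoint(
            typeof(ILocalDataService), new NetTcpRelayBinding(), address);

        // Authenticate to the Service Bus with the account's shared secret.
        TransportClientEndpointBehavior credential =
            new TransportClientEndpointBehavior();
        credential.CredentialType = TransportClientCredentialType.SharedSecret;
        credential.Credentials.SharedSecret.IssuerName = "owner";
        credential.Credentials.SharedSecret.IssuerSecret = "<issuer secret>";
        endpoint.Behaviors.Add(credential);

        // The host keeps an outbound connection to the relay open,
        // which is how it can be reached even behind NAT and firewalls.
        host.Open();
        Console.WriteLine("Listening on " + address);
        Console.ReadLine();
        host.Close();
    }
}
```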
Security is a complex topic that I will not discuss in detail in this article. However, it is worth noting that Windows Azure platform AppFabric Access Control is very helpful in many cases, and it is integrated with the Service Bus by default.
Of course, OData and WCF Data Services are also helpful in this scenario.
You can download an example from All-In-One Code Framework (AzureApps).zip. It demonstrates how to use the Service Bus and WCF Data Services to access local SQL Server data from the cloud. The project name is CSAzureServiceBusWCFDS/VBAzureServiceBusWCFDS. The project also provides an ASP.NET client for testing the service. You can easily convert this client into a Windows Azure web role and run real tests in the cloud.