Joseph Fultz
Chris Mabry
Download Sample Code
Over the past few months, a colleague and I have been working on a project that uses the Managed Extensibility Framework (MEF). In this article, we'll look at how MEF can make cloud deployments easier to manage and more flexible. MEF (and similar frameworks such as Unity) is a composition framework that frees developers from managing dependency resolution, object creation and instantiation. From time to time you may still find yourself writing a factory method or creating dependent objects inside a constructor or a required initialization method, but with a framework like MEF most of that work is no longer necessary.
By combining MEF with the StorageClient API in our deployment, we can deploy and make new classes available without recycling or redeploying our Web roles. We can also deploy updated versions of types to the cloud without redeploying the whole application, simply recycling the app instead. Note that while we use MEF here, you could achieve the same results with Unity, Castle Windsor, StructureMap or any other similar container using a similar construction; the main differences lie in syntax and type-registration semantics.
Design and Deployment
As the saying goes, to reap the harvest you have to do the work. In this case, some construction standards and some extra work around deployment are required. First, if you're used to working with dependency injection (DI) or composition containers, you'll likely be in the habit of separating implementation from interface in your code. We don't deviate from that goal here: all of our concrete class implementations have an inheritance chain that traces back to an interface type. That doesn't mean every class inherits directly from an interface; rather, a class will typically have an abstraction layer, following an interface → virtual → concrete pattern.
Figure 1 shows that not only does the primary class I'm interested in have such a chain, but one of its required properties is abstracted as well. With everything abstracted this way, you can easily replace pieces or add extra functionality in the form of a new library that exports the required contracts (in this case, interfaces). Beyond composition, being strict about abstraction in your class design also brings the benefit of better testing through mocked interfaces.
Figure 1 Relationship Diagram
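To make the chain concrete, here's a minimal sketch of the interface → virtual → concrete pattern. The member names and the rule shown are illustrative assumptions for this article, not code from the sample download:

// Illustrative only; these names are hypothetical
public interface ICustomer
{
  string Name { get; }
}
public interface IBusinessRule<T>
{
  bool Validate(T entity);
}
// The "virtual" middle layer of the chain; shared plumbing lives here
public abstract class CustomerRuleBase : IBusinessRule<ICustomer>
{
  public abstract bool Validate(ICustomer customer);
}
// The concrete layer; new libraries swap in at this level
public class CustomerRequiredFieldsRule : CustomerRuleBase
{
  public override bool Validate(ICustomer customer)
  {
    return customer != null && !string.IsNullOrEmpty(customer.Name);
  }
}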
The harder part of this requirement is the change it forces in the deployment model, because we want to generate our import and export catalog at run time and be able to refresh it without redeploying. We therefore have to deploy the binaries holding the concrete classes outside of the Web role deployment, which also means a little extra work for the application at startup. Figure 2 depicts the startup work that happens in Global.asax when it calls a helper class we've created named MEFContext.
Figure 2 Directory generated at startup
Runtime Composition
Because we'll be loading the catalog from files in storage, we need to get those files into our cloud storage container. Getting the files into the Windows Azure Storage location therefore has to be part of the deployment process. This is easily accomplished with the Windows Azure PowerShell cmdlets (wappowershell.codeplex.com) and a few post-build steps. For our purposes, we moved the binaries manually with the Azure Storage Explorer (azurestorageexplorer.codeplex.com).
We created a project that contains a common diagnostics class, a customer entity and a couple of rules libraries. All of the rules libraries must inherit from and export an interface of type IBusinessRule<T>, where T represents the entity the rules work against. Here are the export and import pieces of the class declaration for a rule:
[Export(typeof(IBusinessRule<ICustomer>))]
public class CustomerNameRule : IBusinessRule<ICustomer>
{
  [Import(typeof(IDiagnostics))]
  IDiagnostics _diagnostics;
  ...
}
You can see both the export and the IDiagnostics dependency that MEF will inject for us when we ask for a rule object. It's important to know what's being exported, because that becomes the contract by which you resolve the instances you want. The Microsoft .NET Framework 4.5 will bring some improvements to MEF that relax some of the current constraints around generics in the container. For example, today you can register and retrieve something like IBusinessRule<ICustomer>, but not something like IBusinessRule<T>. Sometimes, though, you want all instances of a given type regardless of the actual template type. Currently, the easiest way to accomplish that is to register with a string contract name that becomes a convention agreed on within your project or solution. For this sample, the declaration shown above works.
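As a sketch of that string-contract workaround (the contract name "BusinessRule" and the types are assumptions for illustration, not from the sample), you can export under an agreed-on name with a contract type of object, then pull every matching export back regardless of the closed generic type behind it:

using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// Export under the conventional name with typeof(object) so retrieval
// doesn't depend on the closed generic type
[Export("BusinessRule", typeof(object))]
public class CustomerPhoneRule : IBusinessRule<ICustomer>
{
  public bool Validate(ICustomer customer)
  {
    return true; // Real validation omitted
  }
}

public static class RuleLoader
{
  public static IEnumerable<object> GetAllRules(CompositionContainer container)
  {
    // Returns every export registered under the string contract;
    // callers cast or dispatch per entity type as needed
    return container.GetExportedValues<object>("BusinessRule");
  }
}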
We have two rules, one for phone number and one for name, plus a diagnostics library, all of which will be provided through the MEF container. The first thing we need to do is grab the libraries from Windows Azure Storage and put them into a local resource (a local directory) so we can load them with a DirectoryCatalog. To do that, we include a couple of function calls in the Application_Start of Global.asax:
// Store the local directory for later use (directory catalog)
MEFContext.CacheFolderPath =
  RoleEnvironment.GetLocalResource("ResourceCache").RootPath.ToLower();
MEFContext.InitializeContainer();
We simply grab the needed resource path, which is configured as part of the Web role, and then call the method that sets up the container. That initialization method calls UpdateFromStorage to fetch the files and BuildContainer to create the catalog and the MEF container.
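The helper itself isn't shown in full in the article; a plausible skeleton of MEFContext, inferred only from the calls the snippets make (the actual sample class may differ), looks like this:

using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

// Sketch of the MEFContext helper based on the calls shown
public static class MEFContext
{
  public static string CacheFolderPath;
  public static CompositionContainer MEFContainer;

  public static void InitializeContainer()
  {
    UpdateFromStorage();  // Pull binaries from blob storage (Figures 3, 4)
    BuildContainer();     // Build catalog + container from the local folder
  }

  static void UpdateFromStorage()
  {
    // See Figures 3 and 4
  }

  static void BuildContainer()
  {
    var catalog = new DirectoryCatalog(CacheFolderPath);
    MEFContainer = new CompositionContainer(catalog);
    MEFContainer.ComposeParts();
  }
}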
The UpdateFromStorage method looks in a predetermined container, iterates over the files it holds and downloads them to the local resource folder. The first half of this method is shown in Figure 3.
Figure 3 First Half of UpdateFromStorage
// Could also pull from config, etc.
string containerName = CONTAINER_NAME;
// Using development storage account
CloudStorageAccount storageAccount =
  CloudStorageAccount.DevelopmentStorageAccount;
// Create the blob client and use it to create the container object
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Note that here is where the container name is passed
// in order to get to the files we want
CloudBlobContainer blobContainer = new CloudBlobContainer(
  storageAccount.BlobEndpoint.ToString() +
  "/" + containerName,
  blobClient);
// Create the options needed to get the blob list
BlobRequestOptions options = new BlobRequestOptions();
options.AccessCondition = AccessCondition.None;
options.BlobListingDetails = BlobListingDetails.All;
options.UseFlatBlobListing = true;
options.Timeout = new TimeSpan(0, 1, 0);
// Get the list the loop in Figure 4 iterates over (this line is our
// assumed bridge between the two halves of the listing)
IEnumerable<IListBlobItem> blobs = blobContainer.ListBlobs(options);
In the first half we set up the storage client to get at what we need. For this solution we simply grab everything in the container; if you're pulling files from storage into a local resource, it may well be worth taking everything anyway. To be more selective about which files you fetch, you can assign an IfMatch condition to the options.AccessCondition property, which requires setting an ETag on the blobs at upload time. You could also optimize the rebuild of the MEF container on updates by storing the time of the last update and using an AccessCondition of IfModifiedSince.
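For example, here's a sketch of those conditions using the v1.x StorageClient API; GetLastUpdateTimeUtc and savedETag are hypothetical placeholders for values you'd persist yourself:

using System;
using Microsoft.WindowsAzure.StorageClient;

// Only list/download blobs changed since the last container rebuild
DateTime lastUpdateUtc = GetLastUpdateTimeUtc(); // hypothetical helper
BlobRequestOptions options = new BlobRequestOptions();
options.BlobListingDetails = BlobListingDetails.All;
options.UseFlatBlobListing = true;
options.Timeout = new TimeSpan(0, 1, 0);
options.AccessCondition = AccessCondition.IfModifiedSince(lastUpdateUtc);

// Or, to fetch one specific uploaded version, match the ETag captured
// at upload time:
// options.AccessCondition = AccessCondition.IfMatch(savedETag);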
Figure 4 shows the second half of UpdateFromStorage.
Figure 4 Second Half of UpdateFromStorage
// Iterate over the collection,
// grab the files and save them locally
foreach (IListBlobItem item in blobs)
{
  string fileAbsPath = item.Uri.AbsolutePath.ToLower();
  // Just want the file name ...
  fileAbsPath =
    fileAbsPath.Substring(fileAbsPath.LastIndexOf('/') + 1);
  try
  {
    Microsoft.WindowsAzure.StorageClient.CloudPageBlob pageblob =
      new CloudPageBlob(item.Uri.ToString());
    pageblob.DownloadToFile(MEFContext.CacheFolderPath + fileAbsPath,
      options);
  }
  catch (Exception)
  {
    // Ignore exceptions; if we can't write it's because
    // we've already got the file, so move on
  }
}
With the storage client ready, we simply iterate over the blob items and download them to the resource folder. Depending on your download conditions and goals, you could mirror the blob folder structure locally or generate a folder structure based on convention in this step; sometimes a folder structure is required to avoid name collisions. We keep it simple and grab all the files into a single location, because we know it's only two or three DLLs in this sample.
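If collisions were a concern, a hypothetical variation on the Figure 4 loop body (reusing its item, pageblob and options variables; not in the sample) could mirror the blob path locally:

using System.IO;

// Recreate the blob's folder structure under the local resource folder
// so identically named files from different paths don't collide
string relativePath = item.Uri.AbsolutePath
  .TrimStart('/')
  .Replace('/', Path.DirectorySeparatorChar);
string localPath = Path.Combine(MEFContext.CacheFolderPath, relativePath);
Directory.CreateDirectory(Path.GetDirectoryName(localPath));
pageblob.DownloadToFile(localPath, options);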
With the files in place, we just need to build the container. In MEF, a composition container is built from one or more catalogs. In this case we'll use a DirectoryCatalog, because it makes it easy to point the catalog at a directory and load whatever binaries are available. So the code to register the types and prepare the container is short:
// Store the container for later use (resolve type instances)
var catalog = new DirectoryCatalog(CacheFolderPath);
MEFContainer = new CompositionContainer(catalog);
MEFContainer.ComposeParts();
Now we run the site, and we should see a dump of the types available in the container, as shown in Figure 5.
Figure 5 Initial Exports
Here we're not dumping the entire container; rather, we specifically ask for the IDiagnostics interface and then for all exports of type IBusinessRule<ICustomer>. As you can see, there's only one rule at this point, because this dump was taken before we uploaded the new business-rules library to the storage container.
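A sketch of the test-page code behind that dump might look like the following; the output mechanism (Response.Write) is our assumption, not taken from the sample:

// Resolve the diagnostics dependency and every customer rule
// currently registered in the container
var diagnostics =
  MEFContext.MEFContainer.GetExportedValue<IDiagnostics>();
Response.Write(diagnostics.GetType().FullName + "<br/>");

var rules =
  MEFContext.MEFContainer.GetExportedValues<IBusinessRule<ICustomer>>();
foreach (var rule in rules)
{
  Response.Write(rule.GetType().FullName + "<br/>");
}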
We've placed NewRules.dll into the storage location, and now we need to get it loaded into the application. Ideally, you'd trigger the container rebuild with something like a file watcher on the storage container. Alternatively, you could accomplish the same thing with a quick poll using the IfModifiedSince AccessCondition. We chose a more manual process, though: clicking an Update Catalog button on our test app. Figure 8 shows the result.
Figure 8 Updated Rule Exports
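For the polling alternative, a sketch of a timer-based trigger (our code, not the sample's; it assumes UpdateFromStorage is exposed and returns true when new files were downloaded) could look like this:

using System;
using System.Threading;

// Hypothetical polling trigger: check storage every few minutes and
// rebuild the container only when something actually changed
public static class CatalogRefresher
{
  static Timer _timer;

  public static void Start()
  {
    _timer = new Timer(_ =>
    {
      // UpdateFromStorage would poll with IfModifiedSince internally
      if (MEFContext.UpdateFromStorage())
      {
        MEFContext.BuildContainer();
      }
    }, null, TimeSpan.Zero, TimeSpan.FromMinutes(5));
  }
}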
We simply repeated the steps used to create the catalog and initialize the container, and we now have a new rules library in play. Note that we didn't restart or redeploy the application, yet new code is running in the environment. The one loose end here is the need for some synchronization, because we can't have requests using the composition container while we're replacing the reference:
var catalog = new DirectoryCatalog(CacheFolderPath);
CompositionContainer newContainer =
  new CompositionContainer(catalog);
newContainer.ComposeParts();
lock(MEFContainer)
{
  MEFContainer = newContainer;
}
The main reason to build a second container and then swap the reference is to keep the time the lock is held to a minimum and get the container back into use immediately.
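Since other threads only ever read the reference, a variation on this (our sketch, not the article's code) is to publish the new container with an atomic swap instead of a lock:

using System.Threading;

// Build the replacement container off to the side ...
var catalog = new DirectoryCatalog(CacheFolderPath);
var newContainer = new CompositionContainer(catalog);
newContainer.ComposeParts();
// ... then publish it atomically; in-flight requests keep using the old
// container until the swap (assumes MEFContainer is a static field)
Interlocked.Exchange(ref MEFContext.MEFContainer, newContainer);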
To polish the codebase further, the next step would be to implement a custom catalog type, such as an AzureStorageCatalog, as shown in Figure 9. Unfortunately, the current object model doesn't provide a proper interface or an easily reusable base, so a little inheritance plus some encapsulation is probably the best option. Implementing a class like the AzureStorageCatalog listing would give you a nice model in which you instantiate the custom catalog and use it directly in a composition container.
Figure 9 AzureStorageCatalog
public class AzureStorageCatalog : ComposablePartCatalog
{
  private string _localCatalogDirectory = default(string);
  private DirectoryCatalog _directoryCatalog =
    default(DirectoryCatalog);
  public AzureStorageCatalog(string StorageSetting, string ContainerName)
    : base()
  {
    // Pull the files to the local directory
    _localCatalogDirectory =
      GetStorageCatalog(StorageSetting, ContainerName);
    // Load the exports using an encapsulated DirectoryCatalog
    _directoryCatalog = new DirectoryCatalog(_localCatalogDirectory);
  }
  // Return encapsulated parts
  public override IQueryable<ComposablePartDefinition> Parts
  {
    get { return _directoryCatalog.Parts; }
  }
  private string GetStorageCatalog(string StorageSetting,
    string ContainerName)
  {
    // Download the blobs as in Figures 3 and 4 and return the local
    // folder path (left as a stub in the original listing)
    throw new NotImplementedException();
  }
}
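Usage would then collapse to a couple of lines; the "DataConnectionString" setting name and "mefbinaries" container name here are placeholders:

// Hypothetical usage of the custom catalog
var catalog = new AzureStorageCatalog("DataConnectionString", "mefbinaries");
var container = new CompositionContainer(catalog);
container.ComposeParts();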
Updating Existing Functionality
Adding new functionality to our deployment is pretty easy, but updating existing functionality isn't quite as smooth. The process is better than a full redeployment, but it still takes a fair amount of manual effort: we have to move the files into storage, and the Web roles involved have to update their local resource folders. We also have to recycle the roles, because the AppDomain must be unloaded and reloaded to refresh the type definitions held in the container. Even if you load the composition container and the types into a secondary AppDomain and try to load from there, the AppDomain requesting the type will still load it from the previously loaded metadata. The only way around this that we could see was to pass entities into the secondary AppDomain and add some custom messaging, rather than bringing the exported types back into the primary AppDomain. That pattern seemed like a problem waiting to happen, and the double-AppDomain approach has issues of its own, so the simpler solution is to recycle the roles whenever new binaries become available.
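If a role detects new binaries itself, the Azure SDK's RoleEnvironment.RequestRecycle can ask the fabric to restart the instance. A minimal sketch follows; newBinariesDetected is a placeholder, and as noted next, coordinating recycles across instances is the real work:

using Microsoft.WindowsAzure.ServiceRuntime;

// Once updated binaries are detected in storage, recycle this instance
// so a fresh AppDomain picks up the new type definitions
if (newBinariesDetected)
{
  RoleEnvironment.RequestRecycle();
}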
There's some good news here related to Windows Azure update domains. Take a look at my February 2012 column, "Windows Azure Deployment Domains" (msdn.microsoft.com/magazine/hh781019), which walks through update domains and how to restart instances under various conditions. On the positive side, the site won't have to be completely redeployed and will keep servicing requests, although you could see two different behaviors during the refresh window. That's an acceptable risk, though, because the same condition exists during a rolling update of a full deployment.
You could make this happen as part of the deployment, but then the issue is coordination: something has to orchestrate the restart of the instances, so you'd have to elect a lead instance or build some kind of voting system. We felt it was easier to handle the task with a monitoring process and the Windows Azure cmdlets mentioned earlier, rather than writing that kind of intelligence into the Web roles themselves.
There are many reasons to use a framework such as MEF beyond the bit we've focused on in this article. What we want to highlight is that combining the inherent capabilities of Windows Azure with a composition/DI/Inversion of Control-style framework lets you build a dynamic cloud application that can easily respond to the last-minute changes that always seem to come up.
Joseph Fultz is a software architect at Hewlett-Packard Co., working as part of the HP.com Global IT group. Previously he was a software architect at Microsoft, helping Microsoft's top-tier enterprise and ISV customers define architectures and design solutions.
Chris Mabry is a development lead at Hewlett-Packard Co., currently leading a team delivering rich UI experiences built on service-backed client frameworks.