Application of the .NET cache framework CacheManager in the hybrid development framework (1) - Introduction and use of CacheManager



In many of the distributed projects we have developed (for example, those based on WCF services or Web API services), providing data involves database operations. Once client concurrency exceeds a certain level, the number of database requests grows explosively; if the database server cannot process these concurrent requests quickly enough, client response times increase, and in severe cases the database service or the application service may fail outright. Caching solutions were born for exactly this problem. By introducing a cache, time-consuming database I/O operations are turned into fast responses from in-memory data, or an entire page can be stored in the cache system. Cache frameworks have many implementations on various platforms, most of them built on the distributed caches Redis and Memcached. This series of articles describes the whole process of using the open-source cache framework CacheManager on the .NET platform to cache data. This article mainly introduces the use of CacheManager and some related tests.

1. Introduction to CacheManager

CacheManager is an open-source .NET caching abstraction layer written in C#. It is not a specific cache implementation itself; instead it supports multiple cache providers (such as Redis and Memcached) and adds many advanced features on top of them.
The main purpose of CacheManager is to make it easier for developers to handle even very complex caching scenarios. For example, CacheManager can set up a multi-layer cache, with an in-process cache in front of a distributed cache, in only a few lines of code.
CacheManager is more than an interface that unifies the programming model of different cache providers. It makes it easy to change the caching strategy within a project, and it also provides further features such as cache synchronization, concurrent updates, serialization, event handling, and performance counters, which developers can opt into as needed.

The source code and releases of CacheManager are available on GitHub.

Use NuGet to add the CacheManager package references to the project. CacheManager is split into several packages: CacheManager.Core is required, and additional packages support the different cache platforms.

The CacheManager cache framework supports WinForms, Web, and other kinds of application development, and supports a variety of popular cache implementations, such as MemoryCache, Redis, Memcached, Couchbase, and System.Web.Caching.

Looking at the framework as a whole, it is quite focused. Besides supporting multiple cache implementations, its core design is a multi-layer cache architecture with the in-process memory cache in front, supplemented by other distributed caches behind it, so that most lookups are fast in-memory hits. Internally it also has a messaging mechanism, so that even with a distributed cache the layers stay concurrent and synchronized in a timely way.

The Internet is full of articles about implementing and using one specific cache. CacheManager is a more abstract layer, a cache framework that provides more advanced features. On top of a unified programming model it also achieves very broad compatibility, which is why I took to it as soon as I came across it.

On GitHub there are a few other cache frameworks near the top of the list besides this one, but judging by the richness of its documentation and other aspects, this cache framework is well worth using.

In terms of configuration, the CacheManager cache framework supports configuration in code, in XML, and in JSON, which is very convenient.
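
As a rough sketch of the non-code option (my own illustration, not from the original article): assuming a cacheManager configuration section containing a cache named "myCache" has already been declared in app.config or web.config (both names are placeholders), the declarative configuration could be loaded roughly like this.

using CacheManager.Core;

public static class CacheConfigSample
{
    // Builds a cache manager from the declarative configuration in
    // app.config / web.config, instead of the code-based CacheFactory.Build
    // calls used elsewhere in this article. "myCache" is a placeholder name.
    public static ICacheManager<object> LoadFromConfigFile()
    {
        return CacheFactory.FromConfiguration<object>("myCache");
    }
}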

By default, the CacheManager cache framework serializes cached data in binary format. It also supports several custom serialization methods, such as JSON serialization based on Json.NET, or a serializer of your own.
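
For example, a minimal sketch (assuming the CacheManager.Serialization.Json package, which provides the WithJsonSerializer extension, has been added from NuGet; the cache and handle names are placeholders) might switch to JSON serialization like this.

using CacheManager.Core;

public static class JsonCacheSample
{
    public static ICacheManager<object> Build()
    {
        return CacheFactory.Build<object>("jsonSerializedCache", settings =>
        {
            settings
                .WithJsonSerializer()                        // use Json.NET instead of the default binary serializer
                .WithSystemRuntimeCacheHandle("handleName"); // in-process handle, as in the other samples
        });
    }
}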

The CacheManager cache framework can track the addition, removal, and update of cache records, and raises notifications for these operations.
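
As a hedged sketch of subscribing to those notifications (the handler bodies and messages are my own illustration, and the exact event-argument types may vary between CacheManager versions):

using System.Diagnostics;
using CacheManager.Core;

public static class CacheEventSample
{
    public static void AttachLogging(ICacheManager<object> cache)
    {
        // Write a debug line whenever an entry is added to or removed from the cache.
        cache.OnAdd += (sender, args) => Debug.WriteLine("Cache add: " + args.Key);
        cache.OnRemove += (sender, args) => Debug.WriteLine("Cache remove: " + args.Key);
    }
}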

The cached data in the CacheManager cache framework is strongly typed. It supports the common basic types such as int, string, and List, as well as arbitrary serializable objects and lists of objects.

The CacheManager cache framework supports multi-layer caching and provides an internal mechanism that keeps the cached data in all layers synchronized efficiently and promptly.

The CacheManager cache framework supports logging for various operations.

The CacheManager cache framework supports update locking and transaction handling in its distributed cache implementations, which keeps cache synchronization well behaved, and an internal mechanism handles version conflicts.
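
As a small sketch of that optimistic update behaviour (the key, the delta, and the retry count below are placeholders of my own, and the exact Update overloads may differ slightly between versions):

using CacheManager.Core;

public static class CacheUpdateSample
{
    public static void DecrementStock(ICacheManager<object> cache)
    {
        // Update re-reads the current value and retries automatically if another
        // client changed it in the meantime (a version conflict), up to 50 attempts here.
        cache.Update("stock", v => (int)v - 1, 50);
    }
}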

The CacheManager cache framework supports two kinds of cache expiration, absolute expiration and sliding expiration, which makes handling cache expiry much easier.
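
A minimal sketch of the two expiration modes, reusing the in-process handle from this article (the key names and time spans are placeholders of my own):

using System;
using CacheManager.Core;

public static class CacheExpirationSample
{
    public static void Run()
    {
        var cache = CacheFactory.Build<object>("expirationCache", settings =>
        {
            // Sliding expiration for the whole handle: items expire 10 minutes after the last access.
            settings.WithSystemRuntimeCacheHandle("handleName")
                .WithExpiration(ExpirationMode.Sliding, TimeSpan.FromMinutes(10));
        });

        // Expiration can also be set per item: this entry expires 5 minutes after being added,
        // no matter how often it is read (absolute expiration).
        cache.Add(new CacheItem<object>("reportKey", "cached report data",
            ExpirationMode.Absolute, TimeSpan.FromMinutes(5)));
    }
}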

....

These features basically cover the common requirements of caching, and the interface it exposes consists mainly of the usual operations such as Add, Put, Update, and Remove, so it is also very convenient to use.
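
A quick, hedged sketch of those common operations (the keys and values are placeholders of my own, not taken from the sample project later in this article):

using System.Diagnostics;
using CacheManager.Core;

public static class CacheBasicsSample
{
    public static void Run()
    {
        var cache = CacheFactory.Build<object>("usageSample", settings =>
        {
            settings.WithSystemRuntimeCacheHandle("handleName");
        });

        cache.Add("counter", 1);                   // Add: only succeeds if the key does not exist yet
        cache.Put("counter", 10);                  // Put: adds or overwrites unconditionally
        cache.Update("counter", v => (int)v + 1);  // Update: transforms the existing value in place
        Debug.WriteLine(cache.Get("counter"));     // Get: reads the cached value (11 here)
        cache.Remove("counter");                   // Remove: deletes the entry
    }
}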

 

2. Application of the CacheManager cache framework

From this brief overview of the CacheManager cache framework, we already have a rough idea of its features, but we still need to learn how to use it. First, we need to understand the role the cache framework plays in the overall application framework.

In general, a single-host application scenario hardly needs a cache framework like this: client concurrency is very low and there are only a few data requests, so performance is not a problem.

For distributed application systems, such as my hybrid development framework and Web development framework, the number of data requests grows with the number of users; for Internet-facing systems in particular, concurrency can spike to its peak at certain points in time. In such a distributed architecture, introducing a data cache reduces the number of concurrent I/O operations against the database and turns time-consuming requests into fast in-memory requests, which greatly reduces the risk of the system going down.

Taking a conventional Web API-based application framework as an example, the data cache layer sits directly below the Web API layer and above the business implementation layer.

In this data cache layer we introduce the CacheManager cache framework to implement distributed caching, so that our cached data can be handled on a Redis server and, at the same time, can be restored quickly and without loss when the system restarts.

Before using the CacheManager cache framework for real, we first run a usage test to see how convenient the framework is, and then apply it broadly in our data middle layer.

Create a project, open the NuGet package manager from the References node, search for the relevant CacheManager packages, and add them to the project references. This is the first step.

We create a customer object class to simulate data storage and display, as shown in the following code.

/// <summary>
/// Customer object class simulating data storage
/// </summary>
public class Customer
{
    private static Customer m_Customer = null;
    private static ICacheManager<object> manager = null;

    // Initial list values
    private static List<string> list = new List<string>() { "123", "456", "789" };

    /// <summary>
    /// Singleton instance of the Customer object
    /// </summary>
    public static Customer Instance
    {
        get
        {
            if (m_Customer == null)
            {
                m_Customer = new Customer();
            }
            if (manager == null)
            {
                manager = CacheFactory.Build("getStartedCache", settings =>
                {
                    settings.WithSystemRuntimeCacheHandle("handleName");
                });
            }
            return m_Customer;
        }
    }
}

This class first implements a singleton, initializing the cached Customer class object and the cache management object ICacheManager<object> manager, which is the main object we will use later to operate on the cached data.

We then write several functions for getting, inserting, and deleting data, and trigger a cache update whenever data is added or deleted, so that the next read returns the latest data.

/// <summary>
/// Obtain all customer information
/// </summary>
/// <returns></returns>
public List<string> GetAll()
{
    var value = manager.Get("GetAll") as List<string>;
    if (value == null)
    {
        value = list;
        // Initialize and add to the cache
        manager.Add("GetAll", value);
        Debug.WriteLine("Initialized and added the list to the cache");
    }
    else
    {
        Debug.WriteLine("Fetched from cache at: {0}", DateTime.Now);
    }
    return value;
}

/// <summary>
/// Insert a new record
/// </summary>
/// <param name="customer"></param>
/// <returns></returns>
public bool Insert(string customer)
{
    // Add the record to the underlying list first
    if (!list.Contains(customer))
    {
        list.Add(customer);
    }
    // Then reset the cache
    manager.Update("GetAll", v => list);
    return true;
}

/// <summary>
/// Delete the specified record
/// </summary>
/// <param name="customer"></param>
/// <returns></returns>
public bool Delete(string customer)
{
    if (list.Contains(customer))
    {
        list.Remove(customer);
    }
    manager.Update("GetAll", v => list);
    return true;
}

We write a WinForms program to test the cache so that we can understand the mechanism.

The test exercises GetAll, while insert and delete are mainly used to verify how the cache is updated. The code is as follows.

private void btnTestSimple_Click(object sender, EventArgs e)
{
    var list = Customer.Instance.GetAll();
    Debug.WriteLine("Client retrieved records: {0}", list != null ? list.Count : 0);
}

private void btnInsert_Click(object sender, EventArgs e)
{
    var name = "abc";
    Customer.Instance.Insert(name);
    Debug.WriteLine(string.Format("Inserted record: {0}", name));
}

private void btnDelete_Click(object sender, EventArgs e)
{
    var name = "abc";
    Customer.Instance.Delete(name);
    Debug.WriteLine(string.Format("Deleted record: {0}", name));
}

Tracing the debug output, we can see the following log information.

We can see that the cache is initialized the first time, when it does not yet exist, with three records. After a record is inserted and the data is fetched again, the number of cached records becomes four.

For reference, here is the code for inserting a record again; note that it also updates the cached data.

/// <summary>
/// Insert a new record
/// </summary>
/// <param name="customer"></param>
/// <returns></returns>
public bool Insert(string customer)
{
    // Add the record to the underlying list first
    if (!list.Contains(customer))
    {
        list.Add(customer);
    }
    // Then reset the cache
    manager.Update("GetAll", v => list);
    return true;
}

The cache initialization configuration we introduced earlier uses the in-memory cache by default and does not yet use a distributed cache. Its initialization code is as follows:

manager = CacheFactory.Build("getStartedCache", settings =>
{
    settings.WithSystemRuntimeCacheHandle("handleName");
});

Under normal circumstances, we still want to use a powerful distributed cache, for example Redis. For Redis installation and usage, please refer to my article "C#-based MongoDB database development and application (4) - Redis installation and use".

To introduce the distributed Redis cache implementation, we only need to make a few changes to the configuration code, as shown below.

manager = CacheFactory.Build("getStartedCache", settings =>
{
    settings.WithSystemRuntimeCacheHandle("handleName")
        .And
        .WithRedisConfiguration("redis", config =>
        {
            config.WithAllowAdmin()
                .WithDatabase(0)
                .WithEndpoint("localhost", 6379);
        })
        .WithMaxRetries(100)
        .WithRetryTimeout(50)
        .WithRedisBackplane("redis")
        .WithRedisCacheHandle("redis", true);
});

Other usage remains unchanged. At the same time, we add some test data to make it easier to inspect the corresponding cache entries.

/// <summary>
/// Add several values of different types for testing
/// </summary>
public void TestCache()
{
    manager.Put("string", "abcdefg");
    manager.Put("int", 2016);
    manager.Put("decimal", 2016.9M);
    manager.Put("date", DateTime.Now);
    manager.Put("object", new UserInfo { ID = "123", Name = "Test", Age = 35 });
}

private void btnTestSimple_Click(object sender, EventArgs e)
{
    var list = Customer.Instance.GetAll();
    Debug.WriteLine("Client retrieved records: {0}", list != null ? list.Count : 0);

    // Add some values of different types to the cache for testing
    Customer.Instance.TestCache();
}

In our test, everything behaves just as before, and the program's log output is normal.

However, since we have configured Redis as the cache, we can use the "Redis Desktop Manager" tool to view the corresponding cache data. Opening the tool, we can see the corresponding cache records.

We can see that all the cache keys we added can be viewed from the Redis client, because the cache is backed by the Redis implementation; likewise, if we configure another cache implementation, such as Memcached, we can view the entries in its corresponding management interface.

Working with this data, we can see that cached data can be held in multiple layers, the most efficient of which is the memory cache (which is also the primary cache), and the framework automatically handles data version conflicts with the various distributed caches.

One advantage of introducing a distributed cache such as Redis is that when the program restarts and the data cannot be found (is not hit) in the memory cache, the framework falls back to the distributed cache and loads the value from there, so even after a restart our previously cached data remains intact.

 

The above is my introduction to the cache framework based on my overall understanding of it and of the role it plays, together with the use of CacheManager in a few scenarios. Building on this simple case, we can gradually introduce it into a more practical Web API framework, so that the cache framework can show its truly powerful value; at the same time, we still need to explore the different caches at a deeper level, and I hope you will continue to follow this series.

 
