.NET Cache Management Framework: CacheManager


Caching is a widely used concept in computing. CPUs have L1 and L2 caches, browsers have caches, and software development uses distributed caches such as Memcached and Redis. Caches are everywhere because they can greatly improve the speed of both hardware and software. In project development, the slow spots are usually where I/O is frequent, and reading from a database is one of the most common performance costs. If frequently used data is cached in a faster medium, the next request can fetch it directly from the cache instead of hitting the database, which can greatly improve performance. This article discusses how to use the CacheManager framework to easily manage caches in .NET projects.

I. CacheManager introduction and advantages

CacheManager is an open source .NET cache management framework. It is not a specific cache implementation; rather, it sits on top of caches, letting developers configure and manage a variety of different caches and providing a unified cache interface as a middle tier for upper-level applications.

Here are some of the benefits of CacheManager:

    • Makes it easier for developers to handle and maintain caches, even in very complex caching scenarios.
    • CacheManager can manage multiple kinds of caches, including in-memory caching (MemoryCache), AppFabric, Redis, Couchbase, Windows Azure Cache, and more.
    • Provides additional features such as cache synchronization, concurrent updates, events, performance counters, and more.

II. Getting started with CacheManager

CacheManager is very easy to get started with. The following example uses the in-memory cache to quickly show how CacheManager is used.

First, create a console application in Visual Studio.

Use NuGet to add the CacheManager package references to your project. CacheManager consists of many packages: CacheManager.Core is required, and there are separate packages for the different cache platforms.

In this demo we use memory as the cache, so we only need CacheManager.Core and CacheManager.SystemRuntimeCaching.

Next, configure our cache in the Main method:

using System;
using CacheManager.Core;

namespace ConsoleApplication
{
    class Program
    {
        static void Main(string[] args)
        {
            var cache = CacheFactory.Build("getStartedCache", settings =>
            {
                settings.WithSystemRuntimeCacheHandle("handleName");
            });
        }
    }
}

The above code uses CacheFactory to create a cache instance named getStartedCache, which uses the System.Runtime.Caching handle, that is, the in-memory cache. A cache instance can be configured with multiple handles: we can use memory as the storage medium, we can use a distributed cache such as Redis, and both can be combined in a single cache instance. The configuration and use of such multi-level caching is described later.

Next, we add some code to test the cache:

static void Main(string[] args)
{
    var cache = CacheFactory.Build("getStartedCache", settings =>
    {
        settings.WithSystemRuntimeCacheHandle("handleName");
    });

    cache.Add("keyA", "valueA");
    cache.Put("keyB", 23);
    cache.Update("keyB", v => 42);

    Console.WriteLine("KeyA is " + cache.Get("keyA"));   // should be valueA
    Console.WriteLine("KeyB is " + cache.Get("keyB"));   // should be 42

    cache.Remove("keyA");
    Console.WriteLine("KeyA removed? " + (cache.Get("keyA") == null).ToString());

    Console.WriteLine("We are done...");
    Console.ReadKey();
}
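
Beyond these basic operations, a typical way to use such a cache is the cache-aside pattern from the introduction: look in the cache first and only query the database on a miss. The following is only a minimal sketch using the Get/Add calls shown above; LoadUserNameFromDatabase is a hypothetical stand-in for a real, expensive database query.

static string GetUserName(ICacheManager<object> cache, int userId)
{
    var key = "user:" + userId;

    var cached = cache.Get(key);
    if (cached != null)
    {
        return (string)cached;                      // cache hit: no database round trip
    }

    var name = LoadUserNameFromDatabase(userId);    // cache miss: load from the database
    cache.Add(key, name);                           // store it for subsequent requests
    return name;
}

// Hypothetical stand-in so the sketch is self-contained.
static string LoadUserNameFromDatabase(int userId) => "user-" + userId;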

III. CacheManager multi-level cache configuration

In practical development, we often need to use multi-level caches.

A common scenario is that you have a distributed cache server, such as Redis, shared by several applications so that they can reuse cached data whose creation is expensive.

Accessing the distributed cache is faster than querying the database, but it is still not as fast as memory, because using a distributed cache also requires serialization and network round trips.

In this case, a tiered cache is a good solution: the in-memory cache is combined with the distributed cache, and frequently used data is read directly from memory, which greatly improves the overall performance of the application.

Reading from the in-memory cache can be 100 times faster than reading from the distributed cache, or even more.

Configuring a multi-level cache is very easy with CacheManager:

1 var cache = cachefactory.build<int> ("Mycache", settings = 2 {3     settings 4         . Withsystemruntimecachehandle ("Inprocesscache")//Memory cache handle 5         . and 6         . Withredisconfiguration ("Redis", config =>//redis cache configuration 7         {8             config. Withallowadmin () 9                 . Withdatabase (0) Ten                 . Withendpoint ("localhost", 6379);         Withmaxretries (1000)//number of attempts         . Withretrytimeout (100)//try to time out         . Withredisbackplate ("Redis")//redis use the back Plate15         . Withrediscachehandle ("Redis", true);//redis cache Handle16});

In the above code, the memory cache and the Redis cache configuration sections are easy to understand. But what is the role of the backplate? Next, let's look at the backplate mechanism in CacheManager.

IV. Backplate: resolving synchronization issues in distributed caching

Large-scale software systems are often split into a number of independent sub-systems which, to save cost or to share data conveniently, often share the same distributed cache server. This makes data inconsistency possible when multi-level caches are used.

Suppose a data item in the cache is updated by system A. CacheManager updates that item in all cache handles configured for A, including the one in the distributed cache. However, the in-memory cache of system B still holds the old, un-updated value. When system B reads this record from its in-memory cache, the data in its memory cache and in the distributed cache are inconsistent.

To prevent this, CacheManager has a feature called CacheBackplate that attempts to synchronize the caches across the different systems.

In the multi-level cache configured above, we use Redis as the backplate source. This means that the data cached in Redis is treated as the authoritative copy that the other cache levels follow.

With Redis set up as the backplate, whenever the data in Redis is modified, CacheManager is triggered to update the in-memory caches in all systems so that they stay consistent with the data in Redis.

How is the synchronization done?

Each time a cache record is removed or updated, CacheManager sends a message to the backplate carrying the change information. All other systems receive these messages asynchronously and perform the corresponding updates and removals to keep their data consistent.
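
To see the synchronization in action, one can simulate two applications as two cache instances that share the same Redis server and backplate. This is only a minimal sketch under assumptions: a local Redis on localhost:6379, both instances built with the configuration style shown earlier, and an arbitrary short wait because the backplate messages arrive asynchronously; in a real deployment the two instances would run in separate processes.

// Hypothetical helper: each "application" builds an identically configured two-level cache.
ICacheManager<string> BuildAppCache() =>
    CacheFactory.Build<string>("sharedCache", settings => settings
        .WithSystemRuntimeCacheHandle("inProcessCache")
        .And
        .WithRedisConfiguration("redis", config =>
            config.WithAllowAdmin().WithDatabase(0).WithEndpoint("localhost", 6379))
        .WithRedisBackPlate("redis")
        .WithRedisCacheHandle("redis", true));

var appA = BuildAppCache();
var appB = BuildAppCache();

appA.Add("config:title", "v1");
Console.WriteLine(appB.Get("config:title"));   // "v1", read through Redis

appA.Put("config:title", "v2");                // A changes the entry
System.Threading.Thread.Sleep(500);            // illustrative wait for the asynchronous backplate message

Console.WriteLine(appB.Get("config:title"));   // B's stale in-memory copy has been invalidated, so this prints "v2"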

V. ExpirationMode and CacheUpdateMode

Wherever caching is used, cache expiration is an issue. CacheManager provides some simple settings for cache expiration:

public enum ExpirationMode
{
    None = 0,
    Sliding = 1,
    Absolute = 2,
}
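
The difference between the two non-trivial modes: Sliding renews the remaining lifetime every time the item is accessed, while Absolute removes the item a fixed time after it was written, no matter how often it is read. A minimal sketch of configuring each on the in-memory handle; the 10-minute durations are only illustrative:

// Sliding: the 10-minute window restarts on every read of an item.
var slidingCache = CacheFactory.Build("slidingCache", settings => settings
    .WithSystemRuntimeCacheHandle("handleName")
        .WithExpiration(ExpirationMode.Sliding, TimeSpan.FromMinutes(10)));

// Absolute: an item is evicted 10 minutes after it was added, even if it is read constantly.
var absoluteCache = CacheFactory.Build("absoluteCache", settings => settings
    .WithSystemRuntimeCacheHandle("handleName")
        .WithExpiration(ExpirationMode.Absolute, TimeSpan.FromMinutes(10)));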

CacheManager also provides different data update policies for multi-level caches:

public enum CacheUpdateMode
{
    None = 0,
    Full = 1,
    Up = 2,
}

With Sliding expiration and the Up update mode, we can set different expiration times for the different cache levels, so that frequently accessed data stays in the faster in-memory cache while less frequently accessed data stays in the distributed cache. When CacheManager cannot find an item in the in-memory cache, it looks for it in the distributed cache; once found, the item is written back into the in-memory cache according to the Up setting.

The specific configuration is as follows:

1 var cache = cachefactory.build<int> ("Mycache", settings = 2 {3     settings. Withupdatemode (cacheupdatemode.up) 4         . Withsystemruntimecachehandle ("Inprocesscache")//Memory cache handle 5         . Withexpiration (expirationmode.sliding, Timespan.fromseconds)) 6         . and 7         . Withredisconfiguration ("Redis", config =>//redis cache configuration 8         {9             config. Withallowadmin () Ten                 . Withdatabase (0)                 . Withendpoint ("localhost", 6379)         .         Withexpiration (expirationmode.sliding, TimeSpan. Fromhours  ())         . Withmaxretries (1000)//number of attempts         . Withretrytimeout (100)//try to time out         . Withredisbackplate ("Redis")//redis use the back Plate17         . Withrediscachehandle ("Redis", true);//redis cache Handle18 19});

VI. Cache usage analysis

When using a cache, hit and miss statistics can help us analyze and tune the cache settings so that the cache is used more effectively.

var cache = CacheFactory.Build("cacheName", settings => settings
    .WithSystemRuntimeCacheHandle("handleName")
        .EnableStatistics()
        .EnablePerformanceCounters());

After the statistics feature is enabled, we can track how the cache is used; the following code prints the statistics of each cache handle separately:

foreach (var handle in cache.CacheHandles)
{
    var stats = handle.Stats;
    Console.WriteLine(string.Format(
        "Items: {0}, Hits: {1}, Miss: {2}, Remove: {3}, ClearRegion: {4}, Clear: {5}, Adds: {6}, Puts: {7}, Gets: {8}",
        stats.GetStatistic(CacheStatsCounterType.Items),
        stats.GetStatistic(CacheStatsCounterType.Hits),
        stats.GetStatistic(CacheStatsCounterType.Misses),
        stats.GetStatistic(CacheStatsCounterType.RemoveCalls),
        stats.GetStatistic(CacheStatsCounterType.ClearRegionCalls),
        stats.GetStatistic(CacheStatsCounterType.ClearCalls),
        stats.GetStatistic(CacheStatsCounterType.AddCalls),
        stats.GetStatistic(CacheStatsCounterType.PutCalls),
        stats.GetStatistic(CacheStatsCounterType.GetCalls)));
}
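
A hit ratio (hits divided by hits plus misses) is easily derived from the same counters. A small sketch that reports it for each handle:

foreach (var handle in cache.CacheHandles)
{
    var stats = handle.Stats;
    var hits = stats.GetStatistic(CacheStatsCounterType.Hits);
    var misses = stats.GetStatistic(CacheStatsCounterType.Misses);
    var reads = hits + misses;

    // Guard against division by zero when the handle has not been read yet.
    var hitRatio = reads == 0 ? 0d : (double)hits / reads;
    Console.WriteLine("Hit ratio: {0:P1} ({1} hits out of {2} reads)", hitRatio, hits, reads);
}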

VII. Conclusion

Caching is a good thing and, used well, can greatly improve performance. Cache usage itself is a big topic; this article has only introduced CacheManager from the perspective of cache management.

Below are CacheManager-related resources and links:

Official homepage

http://cachemanager.net/

Source code

https://github.com/MichaCo/CacheManager

Official MVC sample project

https://github.com/MichaCo/CacheManager/tree/master/samples/CacheManager.Samples.Mvc

Recently I have been thinking about how cache usage differs between scenarios. For Internet-facing projects, data consistency requirements are often not very high, so cache management may focus on the cache hit ratio. For enterprise applications, there are fewer requests but higher consistency requirements, so the data update strategy of the cache may matter more.

What is the best caching design for such systems? If you are interested, you are welcome to share your thoughts.
