Anyone who has done program performance optimization, or who cares about program performance, has probably used various caching techniques. The cache I am talking about today is one specific cache: the one we access through HttpRuntime.Cache, not any other caching technique.

I briefly mentioned it in my earlier blog "My understanding of the ASP.NET core objects", and today I am writing a dedicated post for it, because it is that important. In this post I will introduce not only its common uses but also some of its advanced usage. At the end of my previous post "Various methods of reading and writing config files in .NET" I left you a question; in this post I will give the answer that I consider perfect.

The "deferred action" technique mentioned in this article (e.g., deferred merged writes to the database) is a summary of my own experience, and I hope you will like the idea.
The basic purpose of the cache
Speaking of the cache, we have to mention its main purpose: improving program performance.

ASP.NET is a dynamic page technology; pages on the web are almost all dynamic. "Dynamic" means that the content of a page shows different results for different users or as data is continuously updated. Since it is dynamic, where does the dynamic content come from? Most sites have their own data source: the program accesses the data source to obtain the data the page needs, processes it according to some business rules, and finally turns it into content suitable for display on the page.

Because this dynamic page technology usually has to fetch data from a data source and run it through some computation logic before it finally becomes HTML sent to the client, these computations obviously have a cost. The most direct impact of that cost is on the server's responsiveness, and it becomes more obvious as the data processing grows complex and the traffic grows large. On the other hand, some data does not change all the time; if we can cache the final results computed from infrequently changing data (including page output), we can significantly improve the program's performance. This is the most common and most important use of caching, and it is why caching usually comes first when performance optimization is discussed. The ASP.NET cache I am talking about today is one technology that implements this kind of caching; moreover, it has other features, some of which other caching techniques do not offer.
The definition of the Cache class
Before introducing its usage, let's look at the definition of the Cache class (note: I have omitted some insignificant members):
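The original listing is not reproduced here; for reference, the following is an abridged sketch of the public surface of System.Web.Caching.Cache, written from memory (see MSDN for the authoritative definition):

```csharp
public sealed class Cache : IEnumerable
{
    public static readonly DateTime NoAbsoluteExpiration;   // DateTime.MaxValue
    public static readonly TimeSpan NoSlidingExpiration;    // TimeSpan.Zero

    public object this[string key] { get; set; }
    public int Count { get; }

    public object Get(string key);
    public object Remove(string key);

    // Add: 7 parameters, returns the existing item if the key is already cached
    public object Add(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemPriority priority, CacheItemRemovedCallback onRemoveCallback);

    // Insert: several overloads, from the simplest to the full version
    public void Insert(string key, object value);
    public void Insert(string key, object value, CacheDependency dependencies);
    public void Insert(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration);
    public void Insert(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemPriority priority, CacheItemRemovedCallback onRemoveCallback);
    public void Insert(string key, object value, CacheDependency dependencies,
        DateTime absoluteExpiration, TimeSpan slidingExpiration,
        CacheItemUpdateCallback onUpdateCallback);
}
```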
To make the cache convenient to access, ASP.NET adds a static Cache property to the HttpRuntime class, so that we can use the cache anywhere. ASP.NET also provides two "shortcuts" to it: Page.Cache and HttpContext.Cache; through either of these we are also accessing HttpRuntime.Cache. Note that all three refer to the same object: Page.Cache accesses HttpContext.Cache, and HttpContext.Cache in turn accesses HttpRuntime.Cache directly.
Common cache usage
Generally, when we use the cache, there are only two operations: read and write.

To get a cache entry, we call the Cache.Get(key) method; to put an object into the cache, we call the Add or Insert method. However, Add and Insert take quite a few parameters; sometimes we just want to put something into the cache and accept all the defaults. In that case we can use the default indexer. Let's see how this indexer is implemented:
```csharp
public object this[string key] {
    get { return this.Get(key); }
    set { this.Insert(key, value); }
}
```
As you can see, reading through the indexer actually calls the Get method, while writing calls the simplest overload of the Insert method.

Note: the Add method can also put an object into the cache. It has 7 parameters, and Insert has an overload with a similar signature. Their functionality is similar: add the specified item to the System.Web.Caching.Cache object, with dependency, expiration, and priority policies, plus a delegate that can notify the application when the inserted item is removed from the Cache. There is, however, one small difference: when the entry to be added already exists in the cache, Insert overwrites the existing entry, while Add leaves the existing entry unmodified.

In other words, if you want a cache item, once cached, never to be modified, then calling Add can indeed prevent later modifications. Calling Insert, on the other hand, always overwrites any existing item (even one previously put in with Add).
From another point of view, Add behaves rather like a static readonly field, while Insert behaves like an ordinary static field.

Note: I only say "like"; in fact they are far more flexible than ordinary static members.
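Here is a small sketch of that difference; the key name and values are just examples:

```csharp
// Put "v1" in with Add (the 7-parameter version, all defaults):
HttpRuntime.Cache.Add("config", "v1", null,
    Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
    CacheItemPriority.Normal, null);

// A second Add with the same key does NOT modify the existing entry:
HttpRuntime.Cache.Add("config", "v2", null,
    Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
    CacheItemPriority.Normal, null);
// HttpRuntime.Cache["config"] is still "v1"

// Insert always overwrites:
HttpRuntime.Cache.Insert("config", "v3");
// HttpRuntime.Cache["config"] is now "v3"
```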
Because cache entries can be accessed at any time, they do indeed have a bit of the flavor of static members, but they offer more advanced features: cache expiration (absolute expiration, sliding expiration), cache dependencies (depending on files, depending on other cache entries), removal priority, and removal notification. I will describe these four categories of features in the following sections.
Features of the Cache class
The Cache class has a very rare advantage. In the words of MSDN:
This type is thread-safe.
Why is this a rare advantage? Because in .NET, the vast majority of classes guarantee only that static members are thread safe, and make no guarantee about instance members. That is also one of the basic principles of the .NET design guidelines.

For those types, MSDN usually states:

Any public static (Shared in Visual Basic) members of this type are thread safe. Any instance members are not guaranteed to be thread safe.
So this means we can read and write the cache anywhere, without worrying about synchronizing its data in a multithreaded environment. In multithreaded programming, the most complicated problem is data synchronization, and the cache has already solved it for us.

However, let me remind you: ASP.NET itself is a multithreaded programming model; all requests are handled by threads from the thread pool. To solve data synchronization problems in a multithreaded environment, we usually adopt locks, and naturally ASP.NET is no exception: to keep its data synchronized, the cache also uses locks internally.
At this point some people may wonder: since there is only one static cache instance, won't this lock hurt concurrency?

The answer is yes; a lock does affect concurrency to some degree, and there is no way around that.

However, when implementing the cache, ASP.NET creates multiple cache containers according to the number of CPUs, to reduce contention as much as possible. Here is the core of the cache-creation process:
Note: CacheInternal is an internal wrapper class, and many cache operations are actually carried out by it.

In the code above, the computation of numSingleCaches is important. If that code is not easy to follow, look at my sample code below:
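Since the sample itself is not shown here, below is my reconstruction of the idea: numSingleCaches is the CPU count rounded up to the nearest power of two. For cpuCount from 1 to 20, it prints exactly the sequence shown below.

```csharp
using System;

class NumSingleCachesDemo
{
    // my reconstruction: round cpuCount up to the nearest power of two
    static int NumSingleCaches(int cpuCount)
    {
        int n = 1;
        while (n < cpuCount)
            n <<= 1;
        return n;
    }

    static void Main()
    {
        string[] results = new string[20];
        for (int cpu = 1; cpu <= 20; cpu++)
            results[cpu - 1] = NumSingleCaches(cpu).ToString();
        Console.WriteLine(string.Join(",", results));
    }
}
```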
The program will output:
1,2,4,4,8,8,8,8,16,16,16,16,16,16,16,16,32,32,32,32
The CacheMultiple constructor looks like this:

Now you should understand: CacheSingle is the cache container actually used internally by ASP.NET, and on machines with multiple CPUs, multiple containers are created.

So when reading and writing, how does the cache locate the right container? Please look at the code:

Note: the hashCode parameter comes directly from calling key.GetHashCode(), and GetHashCode is defined by the Object class.
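The code itself is omitted above, so here is my rough reconstruction of the selection logic; the member names are my guesses, not the real internal names:

```csharp
internal CacheSingle GetCacheSingle(int hashCode)
{
    // make the hash code non-negative first
    if (hashCode < 0)
        hashCode = (hashCode == int.MinValue) ? 0 : -hashCode;

    // numSingleCaches is a power of two, so (numSingleCaches - 1)
    // works as a bit-mask that picks one of the containers
    int index = hashCode & (_numSingleCaches - 1);
    return _caches[index];
}
```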
So, from this point of view, although the ASP.NET cache exposes only one static member, HttpRuntime.Cache, it may contain more than one cache container internally, and this design reduces the impact of contention on concurrency to some extent.

No matter how it is designed, in a multithreaded environment that shares one container, conflicts are unavoidable. If you only want to cache some data and do not need the cache's many advanced features, you may consider not using the Cache at all. For example, you can create a static Dictionary or Hashtable instance; it can also do basic caching, but let me remind you: you then have to handle the data-synchronization problems of multithreaded access yourself.
Incidentally, Hashtable.Synchronized(new Hashtable()) also gives you a thread-safe collection; if you want something simple, consider it.
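For comparison, a minimal hand-rolled alternative might look like this; it is only a sketch and offers none of the Cache's advanced features:

```csharp
using System.Collections.Generic;

// A static dictionary guarded by a lock: basic caching only,
// with the thread synchronization handled by ourselves.
public static class SimpleCache
{
    private static readonly Dictionary<string, object> s_table
        = new Dictionary<string, object>();
    private static readonly object s_lock = new object();

    public static object Get(string key)
    {
        lock (s_lock) {
            object value;
            s_table.TryGetValue(key, out value);
            return value;   // null when not cached
        }
    }

    public static void Set(string key, object value)
    {
        lock (s_lock)
            s_table[key] = value;
    }
}
```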
Next, let's look at the advanced features of the cache, which Dictionary and Hashtable do not have.
Cache entry expiration time
ASP.NET supports two expiration policies for cache entries: absolute expiration and sliding expiration.

1. Absolute expiration. This is easy to understand: when putting an item into the cache, you specify a specific point in time; when that time arrives, the cached item is automatically removed from the cache.

2. Sliding expiration. For some cache entries, we may only want to keep them in the cache while users are accessing them, and remove them only after no one has accessed them for some period of time. This optimizes memory usage, because it guarantees that the cached content is "hot". Aren't the operating system's memory and disk caches designed exactly this way? The cache has this very useful feature ready for us too: just specify a sliding expiration time when putting the item into the cache.
These two options correspond to the DateTime absoluteExpiration and TimeSpan slidingExpiration parameters of the Add and Insert methods.

Note: although the two parameters appear in pairs, they cannot both be given "effective" values at the same time; only one of them may carry an effective value. For the one you are not using, pass the corresponding static readonly field defined on the Cache class (Cache.NoAbsoluteExpiration or Cache.NoSlidingExpiration).

These two parameters are fairly simple, so I will only add: if you use both of the NoXxxx options, the cache entry stays in the cache indefinitely (though it may still be removed, for example under memory pressure).
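A usage sketch of the two policies (the key names and data are just examples):

```csharp
// Absolute expiration: evict at a fixed time, 10 minutes from now.
HttpRuntime.Cache.Insert("news-list", data, null,
    DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration);

// Sliding expiration: evict after 10 minutes WITHOUT any access;
// every read of the entry resets the 10-minute window.
HttpRuntime.Cache.Insert("user-profile", data, null,
    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(10));
```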
Cache entry dependencies: depending on other cache entries
The ASP.NET cache has a powerful feature: cache dependencies. One cache entry can depend on another cache entry. The following sample code creates two cache entries and sets up a dependency between them. First, the page code:

The code-behind:

When you run the sample page, the result is as shown. After clicking the button "Set Key1 value", you can read the content of the cache entry (left); but after clicking the button "Set Key2 value", you can no longer get the content of the Key1 cache entry (right).

Looking at the result and analyzing the code, you can see that when creating the Key1 cache entry we used this cache dependency:
```csharp
CacheDependency dep = new CacheDependency(null, new string[] { "Key2" });
```
Therefore, when we update the Key2 cache entry, the Key1 entry becomes invalid (it is no longer in the cache).
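Condensed into a few lines, the whole demo amounts to this (the values are examples):

```csharp
// The depended-on entry must exist first:
HttpRuntime.Cache.Insert("Key2", "value2");

// Key1 depends on the cache entry "Key2" (first constructor
// parameter is the file list, which we are not using here):
CacheDependency dep = new CacheDependency(null, new string[] { "Key2" });
HttpRuntime.Cache.Insert("Key1", "value1", dep);

// Overwriting Key2 invalidates Key1:
HttpRuntime.Cache.Insert("Key2", "new value");
object key1 = HttpRuntime.Cache["Key1"];   // now null
```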
Don't underestimate this example. It is true that, looking only at these few lines of sample code, they may not seem to mean much. So let me give you a practical usage scenario to illustrate its usefulness.

The picture above is from a small tool I wrote. In the lower-left corner is a cached table, CacheTable, which is maintained by a class called Table1BLL. The CacheTable data comes from Table1, which is displayed by the Table1.aspx page. Meanwhile, the data of ReportA and ReportB is derived mainly from Table1. Because access to Table1 is almost entirely reads with very few writes, I cache the Table1 data. Moreover, ReportA and ReportB are drawn directly with GDI+ (generated by the report module, which can be regarded as a layer above Table1BLL); given that the two reports are viewed heavily and their data source is read-mostly, I cache the rendered output of these two reports as well.
In this scenario, imagine: if the Table1 data changes, how do we invalidate the cached results of the two reports?

Should Table1BLL notify the two report modules, or should Table1BLL delete the two reports' cache entries directly?

In fact, whichever we choose, as soon as something else needs to build on the CacheTable cache (perhaps some other new report), we are bound to modify Table1BLL again. That is definitely a failed design, a consequence of coupling between modules.

Fortunately, the ASP.NET cache supports cache dependencies. We only need Table1BLL to expose the key it uses to cache CacheTable (say the key is CacheTableKey); then any other cached result that wants to build on CacheTable simply sets a dependency on "CacheTableKey", achieving this effect: when CacheTable is updated, all dependent cached results are automatically purged. This completely solves the problem of cache data dependencies between modules.
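A sketch of this decoupling, with hypothetical names (CacheTableKey, RenderReportA and the rest are illustrations, not real code from the tool):

```csharp
// Table1BLL only exposes the key of its own cache entry:
public static class Table1BLL
{
    public const string CacheTableKey = "CacheTable";

    public static void RefreshCacheTable(System.Data.DataTable table)
    {
        // Overwriting this entry invalidates everything depending on it.
        HttpRuntime.Cache.Insert(CacheTableKey, table);
    }
}

// A report module caches its rendered output with a dependency on that
// key, without Table1BLL knowing anything about the report:
byte[] reportImage = RenderReportA();   // hypothetical GDI+ rendering
HttpRuntime.Cache.Insert("ReportA", reportImage,
    new CacheDependency(null, new string[] { Table1BLL.CacheTableKey }));
```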
Cache entry dependencies: file dependencies
At the end of my previous blog "Various methods of reading and writing config files in .NET", I left you a question:
I hope that after the user modifies the configuration file, the program can immediately run with the latest parameters without restarting the site.
Today I will answer that question and give all the implementation code.

First, one point: although the solution to that question is related to the cache's file dependencies, it also needs to be combined with the cache's removal notification to be solved perfectly. To organize the content conveniently, I will first implement a rough version using only the cache's file dependencies, and improve the implementation later in this article.
Let's look at the rough version first. Suppose my site has a configuration-parameter type like this:

I can configure it in an XML file like this:
```xml
<?xml version="1.0" encoding="utf-8"?>
<RunOptions xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <WebSiteUrl>http://www.cnblogs.com/fish-li</WebSiteUrl>
    <UserName>Fish Li</UserName>
</RunOptions>
```
There is also a page that displays the running parameters:
The following code implements this: after the XML file is modified, browsing the page immediately shows the latest parameter values:

Note: we are still using CacheDependency; we are just now passing the file name to depend on as the first parameter of its constructor.
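A minimal sketch of the rough version; the RunOptions type, file path, and cache key are assumptions for illustration:

```csharp
using System.IO;
using System.Web;
using System.Web.Caching;
using System.Xml.Serialization;

public static class RunOptionsLoader
{
    private const string CacheKey = "RunOptions";   // hypothetical key name

    public static RunOptions Get()
    {
        RunOptions options = HttpRuntime.Cache[CacheKey] as RunOptions;
        if (options == null) {
            string path = HttpContext.Current.Server.MapPath("~/RunOptions.xml");
            using (FileStream stream = File.OpenRead(path))
                options = (RunOptions)new XmlSerializer(typeof(RunOptions))
                                          .Deserialize(stream);

            // File dependency: when RunOptions.xml changes, the entry is
            // evicted, and the next request reloads the file.
            HttpRuntime.Cache.Insert(CacheKey, options, new CacheDependency(path));
        }
        return options;
    }
}
```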
Before ending the introduction to cache dependencies, two more points:

1. CacheDependency also supports "nesting": its constructor accepts other CacheDependency instances, which lets you build very complex tree-shaped dependencies.

2. The object a cache entry depends on can also be SQL Server; see SqlCacheDependency.
Cache item removal priority
There are many forms of caching; even a static variable can be called a cache, and a static collection is a cache container. I think many people use Dictionary, List, or Hashtable as cache containers; we can use them to hold all kinds of data and improve program performance. However, if we use such collections directly to cache data, the memory occupied by that data will never be reclaimed, even if the data is rarely used. As more and more data is cached, the memory consumed naturally keeps growing. So, when memory runs short, can we release the cache entries that are not frequently accessed?

This is indeed a very realistic question. Although caching makes a program run faster, our data is effectively unbounded and cannot all be cached; after all, memory is limited. We could address this with the policy mentioned earlier of removing entries that have not been accessed for some time. But at coding time we have no idea what kind of machine our program will run on, so we cannot make any assumptions about memory size. What we may really want is this: when the cache consumes too much memory, or memory runs short, the less important cache items are removed automatically. That is more meaningful.

For this requirement there are two solutions in the .NET Framework: one uses the WeakReference class, the other uses the Cache. Since we are doing ASP.NET, the Cache is certainly the more convenient choice. In some overloads of the Cache's Add and Insert methods you can specify a retention priority policy for the cache item, passed in via the CacheItemPriority parameter. CacheItemPriority is an enumeration containing the following values:
Note: when calling Add or Insert, if you do not specify the CacheItemPriority option, you end up with the priority represented by Normal. If you want to put some possibly less important data into the cache, you can specify a priority of Low or BelowNormal. If you want a cache entry never to be removed when memory runs low, use NotRemovable (it can still be removed by expiration or dependency changes).

Clearly, we can use this feature to control the impact of the cache on memory pressure. Such flexible control is hard to achieve with other caching schemes, such as a static collection plus WeakReference.
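A usage sketch (key names and data are examples):

```csharp
// A bulky but unimportant item gets a low priority, so it is
// among the first evicted under memory pressure:
HttpRuntime.Cache.Insert("big-report", reportData, null,
    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20),
    CacheItemPriority.Low, null);

// Critical data that must never be evicted for memory reasons
// (it can still go away through expiration or dependencies):
HttpRuntime.Cache.Insert("run-options", options, null,
    Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
    CacheItemPriority.NotRemovable, null);
```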
Cache item removal notification
The ASP.NET cache is not like static variables, whose cached content never goes away: its cache entries can be invalidated by certain conditions, and invalidated entries are removed from memory. Although some removal conditions cannot be controlled directly by our code, ASP.NET provides a way to notify our code when a cache entry is removed.

Note: the ASP.NET cache supports two kinds of notification: "before removal" and "after removal".

When calling the Add or Insert method, we can pass a delegate of type CacheItemRemovedCallback via the onRemoveCallback parameter, so that we are notified when the specified cache entry is removed. The delegate is defined as follows:
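For readers following along, the delegate and the CacheItemRemovedReason enumeration are, from memory (the comments summarize each reason):

```csharp
// key:    the key of the removed entry
// value:  the removed object itself
// reason: why the entry was removed
public delegate void CacheItemRemovedCallback(
    string key, object value, CacheItemRemovedReason reason);

public enum CacheItemRemovedReason
{
    Removed = 1,          // removed by Remove(), or overwritten by Insert()
    Expired = 2,          // absolute or sliding expiration time passed
    Underused = 3,        // evicted to free memory (see removal priority)
    DependencyChanged = 4 // a dependency (file, cache key, ...) changed
}
```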
The meaning of each of the delegate's parameters, and the removal reasons, are explained clearly in the comments, so I will not repeat them.

I think many people know that the cache's Add and Insert methods have this parameter, and know about this delegate, but what are they actually for? In the next two subsections I will give two examples to illustrate this powerful feature.
Typically, we get results from the cache in the following way:
```csharp
RunOptions options = HttpRuntime.Cache[RunOptionsCacheKey] as RunOptions;
if (options == null) {
    // not in the cache: load from the file
    // ..................................
    HttpRuntime.Cache.Insert(RunOptionsCacheKey, options, dep);
}
return options;
```
This is also an idiom: try the cache first; if the item is not there, load it from the data source and put it into the cache again.

Why might accessing the cache return null? There are only two reasons: 1. the item was never put into the cache; 2. the cache entry became invalid and was removed.

This is not a problem in itself, but what happens if loading the data from the data source takes a long time?

Obviously, it will hurt the first request that arrives after the entry disappears. Have you ever thought: if only the cache item could always stay in the cache, everything would be fine. True, in general, if you specify no expiration time, no cache dependency, and set the item as not removable when putting it into the cache, the object will stay in the cache forever; but the expiration time and cache dependency are useful too. How can we have both?
To solve this problem, Microsoft added the "before removal notification" feature in the .NET Framework 3.5 SP1 (and the corresponding service packs for 2.0 and 3.0). Note that only Insert supports this feature. Below are the delegate and the enumeration defining the update reason:
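Those definitions are, from memory:

```csharp
// The callback runs BEFORE the entry would be removed; the out parameters
// let you hand back a replacement object and fresh expiration/dependency
// settings, keeping the entry alive.
public delegate void CacheItemUpdateCallback(
    string key, CacheItemUpdateReason reason,
    out object expensiveObject,
    out CacheDependency dependency,
    out DateTime absoluteExpiration,
    out TimeSpan slidingExpiration);

public enum CacheItemUpdateReason
{
    Expired = 1,           // the expiration time passed
    DependencyChanged = 2  // a dependency changed
}
```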
Note: the CacheItemUpdateReason enumeration has only two values. For the reason, see MSDN's explanation:

Unlike the CacheItemRemovedReason enumeration, this enumeration does not contain Removed or Underused values. Updatable cache items are never removable, and will never be automatically removed by ASP.NET, even when memory needs to be freed.

To repeat: sometimes we really do need the cache's invalidation feature, yet the cache entry is removed once it expires. We can let a subsequent request reload the data from the data source when the cached data is missing, or even reload it into the cache inside a CacheItemRemovedCallback delegate; but while the data is being loaded, the cache does not contain the data we expect, and the longer the load takes, the more obvious this "vacancy" becomes, affecting other (subsequent) requests. To make sure the cached data we expect is always present in the cache, while still keeping an invalidation mechanism, we can use the "before removal notification" feature.
Implementing "deferred action" with the removal notification of cache entries
I have read some .NET books and seen people write about the cache; basically they either mention it only in passing or give an example that does not mean anything. Unfortunately, although this is such a powerful feature, I seldom see anyone use it.

Today I will give a practical example to show off the powerful features of the cache!
I have a page that lets users adjust (move up and down) the launch order of a project's branch records:

When a user adjusts the position of a record, the page pops up a dialog asking for the reason for the adjustment, and an email is sent to notify everyone concerned.

Due to the limitations of the interface, one operation (clicking the up or down column header) moves a record only one position, so moving a record across several rows requires multiple operations. For ease of use, and to avoid the nuisance of duplicate emails, the program needs to satisfy this requirement: the page asks for the reason only once even when a record is moved several times, no duplicate emails are sent, and the email contains the final position after all the moves.

This requirement is quite reasonable; after all, everyone wants operations to be simple.

So how do we implement it? There are two sides to it. First, on the page, we should make sure a record pops up the dialog only once. Since all interaction between the page and the server is done with Ajax (no page refresh), the state can be kept in JS variables, so this part is easy to implement on the page. Now look at the server: the server keeps no state. Of course, the page could pass its state to the server, but which operation is the last one? Obviously that cannot be known in advance, so the requirement has to be adjusted slightly: if the user performs no further operation on a record within 2 minutes, the most recent operation is considered the last one.

Based on the adjusted requirement, the program must record the user's most recent operation; after 2 minutes without another operation, the email is sent once, containing the reason entered the first time together with the final result of the moves.

How do we implement this? I immediately thought of the ASP.NET cache, because I know it well and knew it could help me implement this feature. Let me explain how it is implemented on the server side.
The overall idea of the implementation is:

1. For each operation, the client page sends three parameters to the server: the record's RowGuid, the direction of the move, and the reason.

2. After performing the reordering, the server Inserts the email information to be sent into the cache, supplying the slidingExpiration and onRemoveCallback parameters.

3. In the CacheItemRemovedCallback delegate, ignore CacheItemRemovedReason.Removed notifications; for any other reason, send the email.
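The three steps above can be sketched as follows; the PendingMail class, SendMail method, and key format are my assumptions for illustration:

```csharp
class PendingMail
{
    public string Reason;        // the reason entered the first time
    public string LatestResult;  // overwritten on every move
}

private static readonly TimeSpan Delay = TimeSpan.FromMinutes(2);

public static void QueueMoveMail(Guid rowGuid, string reason, string latestResult)
{
    string key = "MoveMail-" + rowGuid;

    PendingMail mail = HttpRuntime.Cache[key] as PendingMail
                       ?? new PendingMail { Reason = reason };
    mail.LatestResult = latestResult;   // keep only the final state

    // Re-inserting resets the 2-minute sliding window. Each Insert over an
    // existing entry raises a Removed notification, which the callback ignores.
    HttpRuntime.Cache.Insert(key, mail, null,
        Cache.NoAbsoluteExpiration, Delay,
        CacheItemPriority.NotRemovable, OnRemoved);
}

private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
{
    if (reason == CacheItemRemovedReason.Removed)
        return;   // overwritten by a newer Insert: not the last action yet

    SendMail((PendingMail)value);   // 2 minutes of silence: send once
}
```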
For ease of understanding, I have prepared an example for you. The whole example consists of three parts: a page, a JS file, and the server-side code. First, the page code:
The page appears as follows:
The JS code for processing the two buttons on the page is as follows:
Note: on the server side I used the service framework I presented in the blog "Writing my own service framework with ASP.NET". The entire server-side code is as follows (note the comments in the code):
To enable JavaScript to directly invoke methods in C#, the following configuration must be included in Web.config:
Well, that is all the sample code. If you are interested, you can download it at the end of this article and try out the "deferred action" functionality implemented with the cache for yourself.

In fact, this "deferred action" capability is very useful. For example, consider this scenario: some data records may need to be updated frequently; if every update writes to the database, it will certainly put pressure on the database. But because this data is not particularly important, we can use "deferred action" to merge the database writes, turning many writes into one or a few. I call this effect: deferred merged writes.

Here is my idea for deferred merged writes to the database: put the data records that need to be written into the cache, calling Insert and supplying the slidingExpiration and onRemoveCallback parameters; then, in the CacheItemRemovedCallback delegate, imitate my earlier example code, and many writes become one. There can be a problem, though: if the data is modified all the time, it will never be written to the database, and if the site restarts, the data may be lost. If you are concerned about that, then in the callback delegate, increment a counter whenever you encounter CacheItemRemovedReason.Removed, and write to the database once the counter reaches a certain number. For example: write to the database after every 10 occurrences of CacheItemRemovedReason.Removed, so that 10 database writes become one. Of course, for any other removal reason, the database write is always necessary. Note: never use this method for sensitive data such as monetary amounts.
Two more points:

1. When the CacheItemRemovedCallback delegate is invoked, the cache entry is no longer in the cache.

2. Inside the CacheItemRemovedCallback delegate, we can put the cache entry back into the cache.

Have you noticed that this design can form a cycle? Combined with the slidingExpiration parameter, it can achieve the effect of a timer.

About the cache's expiration time, let me remind you once more: when the time passed via the absoluteExpiration or slidingExpiration parameter takes effect, the cached object is not removed immediately; ASP.NET checks for expired cache items at a frequency of roughly once every 20 seconds.
Implementing "automatic configuration file reload" with the removal notification of cache entries

In the "file dependencies" subsection earlier in this article, an example demonstrated that the page can display the latest modifications when the configuration file is updated. In that example, for simplicity, I put the configuration parameters directly into the cache and fetched them from the cache on every use. If the configuration parameters are used in many places, this practice may also affect performance; after all, configuration parameters are not modified often, and if they could be obtained by directly reading a static variable, it would be faster. In general, we might do it like this:
However, one drawback of this approach is that it cannot automatically load the latest configuration after the file is updated.

To solve this problem, we can use the file dependency provided by the cache together with the removal notification. The previous example demonstrated the "after removal" notification; here I will demonstrate the "before removal" notification.

Note: in fact, this could still be done with the after-removal notification; it is just that I have not demonstrated the before-removal notification yet, although, to be honest, this use does not particularly show off its unique capability.
The following code demonstrates how to automatically reload the running parameters after the configuration file is modified (note the comments in the code):
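A sketch of what such code can look like, using the "before removal" notification; the names (RunOptions, ReadOptionsFromFile, the cache key) are assumptions for illustration:

```csharp
private static RunOptions s_options;   // fast access: just read a static field
private static string s_path;

public static RunOptions Current { get { return s_options; } }

public static void LoadRunOptions(string configPath)
{
    s_path = configPath;
    s_options = ReadOptionsFromFile(s_path);   // hypothetical deserialize helper

    // The cached value itself is unimportant; the entry exists only to
    // watch the file and call us back BEFORE it would be removed.
    HttpRuntime.Cache.Insert("RunOptions-FileMonitor", s_path,
        new CacheDependency(s_path),
        Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
        OnConfigFileChanged);
}

private static void OnConfigFileChanged(string key, CacheItemUpdateReason reason,
    out object expensiveObject, out CacheDependency dependency,
    out DateTime absoluteExpiration, out TimeSpan slidingExpiration)
{
    // Reload the parameters into the static field...
    s_options = ReadOptionsFromFile(s_path);

    // ...and keep the entry alive with a fresh dependency on the file,
    // so the next modification triggers this callback again.
    expensiveObject = s_path;
    dependency = new CacheDependency(s_path);
    absoluteExpiration = Cache.NoAbsoluteExpiration;
    slidingExpiration = Cache.NoSlidingExpiration;
}
```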
The changes are small: only the LoadRunOptions method was modified, but the effect is very cool.

Remember the question I left at the end of my previous blog "Various methods of reading and writing config files in .NET"? This example is my answer.
Choosing a file-monitoring technology

Speaking of file monitoring, I think some people will think of FileSystemWatcher. That is exactly why I am talking about the choice of a "file-monitoring technology".
Note: all the conclusions here are my personal views and are for reference only.

I used this component as far back as my WinForms development days, and it left a deep impression on me.

It has one poorly designed aspect: events are raised repeatedly. For example, saving a file once raises two events.

What, you don't believe it? Well, I have prepared a sample program too.

Note: the picture shows two events, but I performed only one save after modifying the file. The sample program is available at the end of this article, so you can try it yourself. For convenience, here is the relevant code:
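A minimal console sketch that reproduces the behavior; the watched path is an assumption. Save the file once in an editor and you will typically see two Changed lines logged:

```csharp
using System;
using System.IO;

class WatcherDemo
{
    static void Main()
    {
        // Watch a single file for write changes:
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\temp", "test.txt");
        watcher.NotifyFilter = NotifyFilters.LastWrite;
        watcher.Changed += delegate(object sender, FileSystemEventArgs e) {
            Console.WriteLine("{0:HH:mm:ss.fff}  {1}  {2}",
                DateTime.Now, e.ChangeType, e.FullPath);
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine(@"Modify C:\temp\test.txt, then press Enter to exit.");
        Console.ReadLine();
    }
}
```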
About the use of this class, I just want to say: it has many events, so be sure to filter them. Here is MSDN's description:

The Windows operating system notifies your component of file changes in a buffer created by the FileSystemWatcher. If there are many changes in a short time, the buffer can overflow. This causes the component to lose track of changes in the directory, and it will provide only blanket notification. Increasing the size of the buffer with the InternalBufferSize property is expensive, because it comes from non-paged memory that cannot be swapped out to disk, so keep the buffer as small as possible, yet large enough that no file change events are lost. To avoid a buffer overflow, use the NotifyFilter and IncludeSubdirectories properties so that you can filter out unwanted change notifications.

Fortunately, the ASP.NET cache does not use this component, so we do not have to worry about duplicate operations triggered by file dependencies. It relies directly on an API provided by webengine.dll, so I recommend preferring the file dependency provided by the cache in an ASP.NET application.
The coexistence of various caching schemes

The ASP.NET cache is one caching technology; however, we can also use other caching techniques in ASP.NET, and these different caches each have their own merits. Because the ASP.NET cache does not provide out-of-process access, it cannot replace distributed caching technologies such as memcached; on the other hand, because it requires no cross-process access, it is more efficient than a distributed cache. If we treat the ASP.NET cache as the "first-level cache" and a distributed cache as the "second-level cache", just like the CPU's caches, we can take advantage of both and achieve more complete functionality together with speed.
In fact, "cache" is not a precisely defined technology: a static variable is also a cache, and a static collection is a cache container. Compared with the ASP.NET cache, accessing a static variable is obviously faster, and if a static collection is not badly designed, its concurrency conflicts may be smaller than the ASP.NET cache's; because of this, static collections are also widely used. However, some advanced features of the ASP.NET cache, such as expiration times, cache dependencies (including file dependencies), and removal notifications, are not available with static collections. So use them together, each where appropriate, and your program can have the best performance and the most powerful features at the same time.
Reposted from: http://www.cnblogs.com/fish-li/archive/2011/12/27/2304063.html
About the ASP.NET Cache and its advanced usage