Android Cache:
Using a cache can greatly reduce the load of data exchange with the server and also provide a degree of offline browsing. Let me briefly list the situations where cache management applies:
1. Applications that rely on network services for their data.
2. The data does not need to be updated in real time; if even a 3-5 minute delay is acceptable, a caching mechanism can be used.
3. Cache expiration is acceptable (similar to NetEase's news client, which supports offline reading).
The benefits of this are:
1. Reduced load on the server.
2. Faster client response (data is read locally).
3. A degree of offline browsing (see NetEase's news application; personally I think its offline reading works very well).
First, the method of cache management
The principle of cache management is simple: use time to decide whether to read the cache or re-download; and when there is no network there is nothing to decide, just read the cache directly.
There are some details to handle, which will be elaborated later. Based on this principle, the two cache management methods I currently use most are: the database and the file (e.g. plain text).
Second, the database (SQLite) cache mode
With this method, after downloading a data file you save the file's related information, such as its URL, local path, download time, and expiration time, into the database; personally I recommend using the URL as the unique identifier. The next time the data is needed, query the database by URL; if an entry is found and has not yet expired, read the local file at the stored path, thereby achieving the caching effect. A minimal sketch of this approach follows.
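Below is a minimal sketch of such a cache table and lookup, assuming a table named cache_entry with url, path, save_time, and expire_ms columns; these names and the helper class are illustrative, not from the original article:

import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class CacheDbHelper extends SQLiteOpenHelper {

    public CacheDbHelper(Context context) {
        super(context, "cache.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // The URL is the unique identifier, as recommended above.
        db.execSQL("CREATE TABLE cache_entry ("
                + "url TEXT PRIMARY KEY, "
                + "path TEXT NOT NULL, "          // where the downloaded file was saved
                + "save_time INTEGER NOT NULL, "  // System.currentTimeMillis() at download time
                + "expire_ms INTEGER NOT NULL)"); // how long the entry stays valid
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS cache_entry");
        onCreate(db);
    }

    // Returns the local file path if the cached copy of this URL is still valid, otherwise null.
    public String findValidPath(String url) {
        SQLiteDatabase db = getReadableDatabase();
        Cursor c = db.query("cache_entry",
                new String[]{"path", "save_time", "expire_ms"},
                "url = ?", new String[]{url}, null, null, null);
        try {
            if (c.moveToFirst()) {
                long saveTime = c.getLong(1);
                long expireMs = c.getLong(2);
                if (System.currentTimeMillis() - saveTime < expireMs) {
                    return c.getString(0); // not expired: read the local file at this path
                }
            }
            return null; // missing or expired: the caller should download again
        } finally {
            c.close();
        }
    }
}

On a cache hit the caller reads the file at the returned path; on a miss it downloads the data, writes the file, and inserts or replaces the row for that URL.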
From the implementation you can see that this method stores the file's attributes flexibly, which offers a lot of extensibility and can support other features.
Operationally, you need to create a database, query it on every access, update it when an entry expires, and delete rows when cleaning the cache, which is a bit of trouble. Database operations can also easily lead to a series of performance issues, ANR problems, and null pointer errors, so be cautious when implementing them; concretely, though, it only amounts to adding a utility class or a few methods.
There is another problem: the cache database lives in the /data/data/<package>/databases/ directory and takes up internal storage; if cache entries accumulate, that space is easily wasted, so the cache needs to be cleaned up in a timely manner.
Of course, in the practical use I have seen so far I have not run into any problems with this method, presumably because the amount of cached data was relatively small.
In this article I do not favor the database approach, simply because it is troublesome to operate, especially having to write the table-creation statements yourself, as you can imagine. I will focus on the file caching method.
Third, the file cache mode
This method uses File.lastModified() to get the file's last modification time and compares it with the current time to decide whether the cache has expired, thereby achieving the caching effect.
Only this one property is available to the implementation, so it cannot provide support for other features. But the operation is simple: compare the times, and if the cache is valid, read the JSON data from the file. It rarely runs into other problems and its cost is low. A minimal sketch follows.
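Here is a minimal sketch of that check, assuming the cached file is named after the URL's hash; the one-hour lifetime is an illustrative value, not one from the article:

import java.io.File;

public class FileCache {

    private static final long DEFAULT_CACHE_MS = 60 * 60 * 1000L; // 1 hour, an assumed default

    // Returns the cached file for this URL if it exists and is still usable, otherwise null.
    public static File getValidCacheFile(File cacheDir, String url, boolean networkAvailable) {
        File cached = new File(cacheDir, String.valueOf(url.hashCode()));
        if (!cached.exists()) {
            return null;                       // never downloaded: must fetch from the network
        }
        if (!networkAvailable) {
            return cached;                     // offline: use whatever we have, expired or not
        }
        long age = System.currentTimeMillis() - cached.lastModified();
        return age < DEFAULT_CACHE_MS ? cached : null; // expired: re-download and overwrite
    }
}

Writing the freshly downloaded data to the same file automatically resets lastModified(), so no extra bookkeeping is needed.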
Fourth, two notes on the file caching method
1. Different types of files need different cache times.
Generally speaking, a file that never changes can be cached permanently, while a file that does change should be cached for the longest staleness you can tolerate. To put it plainly: an image file's content does not change, it usually stays on the SD card until it is cleaned up, so we can always read it from the cache; a configuration file's content is likely to be updated, so it needs an acceptable cache time.
2. The cache time standard differs across network environments.
With no network, we can only read the cache file so the application has something to show; expiration does not apply.
On Wi-Fi, the cache time can be set short: the connection is faster and the traffic costs nothing.
On 3G metered data, the cache time can be set a bit longer: saving traffic saves money and also gives a better user experience.
On GPRS, never mind updates; it is slow enough as it is, so make the cache time as long as possible.
Of course, a good application will not lock itself into one behavior; switching the caching policy for different networks is necessary, and the actual times should be set according to your own situation: how often the data updates, how important it is, and so on. A sketch of choosing a cache lifetime from the network type follows.
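Below is a sketch of that idea, assuming the ACCESS_NETWORK_STATE permission is declared; the concrete durations are illustrative assumptions, not values from the article:

import android.content.Context;
import android.net.ConnectivityManager;
import android.net.NetworkInfo;

public final class CachePolicy {

    // Requires the ACCESS_NETWORK_STATE permission in the manifest.
    public static long cacheDurationMs(Context context) {
        ConnectivityManager cm =
                (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
        NetworkInfo info = cm.getActiveNetworkInfo();
        if (info == null || !info.isConnected()) {
            return Long.MAX_VALUE;             // no network: always read the cache
        }
        switch (info.getType()) {
            case ConnectivityManager.TYPE_WIFI:
                return 5 * 60 * 1000L;         // Wi-Fi: fast and free, so refresh often
            case ConnectivityManager.TYPE_MOBILE:
            default:
                return 60 * 60 * 1000L;        // mobile data: keep the cache longer to save traffic
        }
    }
}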
Fifth, when to refresh
Developers want to read from the cache as much as possible, while users want the data refreshed in real time, and at the same time the faster the better and the less traffic consumed the better (to be honest I did not think this through during development; after all there are so many interfaces, and our company's product currently hits the network on almost every click, which I am slowly fixing, haha). It is a contradiction.
In fact I do not know exactly when to refresh either; here I offer two pieces of advice:
1. Use the longest period for which the data stays unchanged and for which staleness has no significant impact on the application.
For example, if your data is updated roughly every 4 hours, then setting the cache time to 1-2 hours is appropriate, i.e. roughly update interval / cache time = 2. (The updates here are changes made by site editors and other people, not by the user personally.) Over the course of a day users will still see the update, even if with some delay; how much delay is acceptable depends on the purpose of your product. If you consider yours an information application, shorten it, say to 2-4 hours; if the data is more important or more popular and users open it often, shorten it further, to 1-2 hours, and so on.
Of course, for interface data like this I think the cache time can be long, the longer the better. If you object that the amount of data behind it will change, I will tell you: this is just a navigation interface, and how many games you have has nothing to do with the user, even a billion of them; he only needs to be sure he can find the Talking Tom Cat he is looking for here. Otherwise you have lost yet another user.
2. Provide a refresh button.
Where necessary, or just to be safe, provide a refresh button in the relevant interface, or the currently popular pull-down-to-refresh gesture. With a cache in place, this gives a load that failed once another chance. After all, I don't mind an extra pair of chopsticks beside the bowl when I drink bone soup. A sketch of wiring this up follows.
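A minimal sketch of the pull-to-refresh variant using the support library's SwipeRefreshLayout; R.layout.main, R.id.swipe_container, and reloadFromNetwork() are hypothetical names for this example, not from the article:

import android.app.Activity;
import android.os.Bundle;
import android.support.v4.widget.SwipeRefreshLayout;

public class NewsListActivity extends Activity {

    private SwipeRefreshLayout swipeLayout;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        swipeLayout = (SwipeRefreshLayout) findViewById(R.id.swipe_container);
        swipeLayout.setOnRefreshListener(new SwipeRefreshLayout.OnRefreshListener() {
            @Override
            public void onRefresh() {
                reloadFromNetwork();              // bypass the cache on an explicit user request
                swipeLayout.setRefreshing(false); // hide the spinner once the request is issued
            }
        });
    }

    private void reloadFromNetwork() {
        // Download fresh data here, then overwrite the cached copy so its timestamp is reset.
    }
}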
All in all, put the user first; in the name of a better user experience the methods are endless. Keep looking for better ways.
(Reference code: HTTP://BLOG.CSDN.NET/LNB333666/ARTICLE/DETAILS/8460159)
Picture cache:
Overview
Loading a single bitmap into your UI is straightforward, but things get a lot more complicated if you need to load a large set of images at once. In most cases (such as with a ListView, GridView, or ViewPager), the total number of images on screen, combined with those that may soon scroll onto the screen, is essentially unlimited.
Components like these keep memory usage down by recycling their child views as they move off screen. The garbage collector also frees your loaded bitmaps, assuming you do not keep any long-lived references to them. That is all well and good, but to keep the UI fluid and fast-loading you should avoid re-processing an image every time it comes back on screen. A memory and disk cache usually solves this problem, allowing components to quickly reload processed images.
This lesson walks you through using a memory cache and a disk cache for bitmaps to improve the responsiveness and fluidity of your UI when loading multiple bitmaps.
Using a memory cache
A memory cache offers fast access to bitmaps at the cost of taking up valuable application memory. The LruCache class (also available in the support library, back to API level 4, i.e. Android 1.6 and above) is particularly well suited to the task of caching bitmaps: it keeps recently referenced objects in a strongly referenced LinkedHashMap and evicts the least recently used entries once the cache exceeds its designated size.
Note: In the past a popular memory cache implementation was a SoftReference or WeakReference bitmap cache, but this is no longer recommended. Starting with Android 2.3 (API level 9), the garbage collector collects soft/weak references more aggressively, which makes such a scheme fairly ineffective. In addition, prior to Android 3.0 (API level 11), a bitmap's backing data was stored in native memory and was not released in a predictable way, potentially causing the application to briefly exceed its memory limits and crash.
To choose a suitable size for an LruCache, a number of factors should be taken into account, for example:
How memory-intensive is the rest of your activity (or activities) and/or application?
How many images will be on screen at once? How many need to be ready to come on screen?
What are the screen size and density of the device? An extra-high-density (xhdpi) device such as the Galaxy Nexus needs a larger cache to hold the same number of images in memory as a Nexus S (hdpi).
What are the dimensions and configuration of the bitmaps, and therefore how much memory does each image take up?
How frequently are the images accessed? Are some accessed more often than others? If so, you may want to keep certain images in memory at all times, or even use multiple LruCache objects for different groups of bitmaps.
Can you balance quality against quantity? Sometimes it is more useful to store a larger number of lower-quality bitmaps and load a higher-quality version in a background task.
There is no cache size that suits all applications; it is up to you to analyze your app's memory usage and come up with a suitable solution. A cache that is too small causes extra overhead with no benefit, while one that is too large can cause java.lang.OutOfMemoryError exceptions or leave too little memory for the rest of your application to run in. A sketch of a bitmap memory cache based on LruCache follows.
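Here is a minimal sketch following the common LruCache pattern; using one eighth of the available VM memory and measuring entries in kilobytes are assumptions to tune after profiling, not rules from this article:

import android.graphics.Bitmap;
import android.support.v4.util.LruCache;

public class BitmapMemoryCache {

    private final LruCache<String, Bitmap> memoryCache;

    public BitmapMemoryCache() {
        // Maximum available VM memory in kilobytes; exceeding it throws an OutOfMemoryError.
        int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
        int cacheSize = maxMemory / 8; // use 1/8th of the available memory for this cache

        memoryCache = new LruCache<String, Bitmap>(cacheSize) {
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                // Measure entries in kilobytes rather than by number of items.
                return bitmap.getRowBytes() * bitmap.getHeight() / 1024;
            }
        };
    }

    public void addBitmap(String key, Bitmap bitmap) {
        if (key != null && bitmap != null && memoryCache.get(key) == null) {
            memoryCache.put(key, bitmap);
        }
    }

    public Bitmap getBitmap(String key) {
        return memoryCache.get(key); // null means the caller should load from disk or the network
    }
}

When an ImageView is about to display an image, check this cache first; only on a miss do you fall back to the disk cache or the network, and then put the decoded bitmap back into the cache.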