Android Cache Processing


Android Cache:

Using a cache greatly reduces the load of client–server data exchange and also enables a degree of offline browsing. Here is a brief list of scenarios where cache management applies:

1. Applications that rely on network data

2. Data that does not need real-time updates; if even a 3–5 minute delay is acceptable, a cache mechanism can be used.

3. An expiration time for cached data is acceptable (similar to NetEase's news reader, which supports offline reading)

This brings the following advantages:

1. Reduces the load on the server

2. Improves client response speed (data is read locally)

3. Supports offline browsing to some extent (see NetEase's news app; personally I think its offline reading is very good)

First, cache management methods

The principle of cache management is simple: use a timestamp to decide whether to read from the cache or download again; when there is no network there is nothing to discuss — just read from the cache.

Some details need special handling; these are explained later.

Based on this principle, the two cache management methods I commonly use today are: a database, and plain files.

Second, the database (SQLite) cache method

After a data file has been downloaded, write its metadata — URL, local path, download time, expiry time, and so on — into the database. I personally recommend using the URL as the unique identifier. The next time the file is needed, query the database by URL; if the record has not expired relative to the current time, read the local file from the stored path. This achieves the caching effect.
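A minimal sketch of this URL-keyed lookup logic, with an in-memory map standing in for the SQLite table (the class and field names are my own assumptions for illustration, not from the original article):

```java
import java.util.HashMap;
import java.util.Map;

// Metadata for one cached file, keyed by URL (the unique identifier).
class CacheEntry {
    final String localPath;
    final long downloadTimeMs;
    final long expiryTimeMs;
    CacheEntry(String localPath, long downloadTimeMs, long expiryTimeMs) {
        this.localPath = localPath;
        this.downloadTimeMs = downloadTimeMs;
        this.expiryTimeMs = expiryTimeMs;
    }
}

// Stand-in for the SQLite table: in a real app, put() would be an INSERT
// and lookup() a SELECT against a table whose primary key is the URL.
class CacheIndex {
    private final Map<String, CacheEntry> table = new HashMap<>();

    void put(String url, CacheEntry entry) {
        table.put(url, entry);
    }

    // Returns the local path if the cached copy is still valid, else null
    // (meaning the caller should download again and update the record).
    String lookup(String url, long nowMs) {
        CacheEntry e = table.get(url);
        if (e == null || nowMs >= e.expiryTimeMs) return null;
        return e.localPath;
    }
}
```

In a real implementation the same logic would live in a query against a SQLiteOpenHelper-managed table.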

As the implementation shows, this method can flexibly store file attributes, which makes it highly extensible and able to support other features.

The downside is that you need to create and maintain the database: every read requires a query, the records must be updated when the cache changes, and rows must be deleted when the cache is cleared, which is a bit cumbersome. Database operations also bring a series of performance, ANR, and null-pointer problems, so implement them carefully. In detail, though, it only takes an extra utility class or method.

Another issue is that the cache database is stored under the /data/data/<package>/databases/ folder, which consumes internal storage; if the cache accumulates, space is easily wasted, so clean the cache up in time.

Of course, this method is used in some current applications and I have not found problems with it — I expect I simply use it less.

In this article I do not favor the database approach, because the operations are troublesome — especially having to write your own table-creation statements, you know what I mean. I lean toward the file cache method.

Third, the file cache method

This method uses File.lastModified() to get the last modification time of a file, then compares it with the current time to decide whether it has expired, thereby achieving the caching effect.

Only this one property is available to the implementation, so it cannot provide technical support for other features.

The operation is simple: comparing timestamps is enough, and the data (JSON, for example) is stored directly in the file.

Handling it yourself does not easily introduce other problems, and the cost is low.
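The expiry check described above can be sketched in a few lines (the class name and max-age parameter are assumptions for illustration):

```java
import java.io.File;

public class FileCache {
    // Returns true if the cached file exists and was last modified within
    // maxAgeMs of the current time; otherwise the caller should re-download.
    public static boolean isCacheValid(File file, long maxAgeMs) {
        if (!file.exists()) return false;
        long age = System.currentTimeMillis() - file.lastModified();
        return age >= 0 && age < maxAgeMs;
    }
}
```

The caller reads the file if the check passes, and otherwise downloads fresh data and overwrites the file, which resets its last-modified time.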

Fourth, two notes on the file cache method

1. Different types of files need different cache times.

Generally speaking, a file whose content never changes can be cached permanently, while a file whose content changes gets the maximum tolerable cache time. To put it plainly: an image file's content is constant, so it usually stays on the SD card until it is cleaned up, and we can read it from the cache forever. A configuration file's content is likely to be updated, so set an acceptable cache time for it.

2. Different network environments call for different cache time standards.

With no network, we can only read the cache file — at least the application has something to display, and expiry is irrelevant.

On a Wi-Fi network, the cache time can be set short: the connection is fast, and the traffic costs no money.

On a 3G (cellular) connection, the cache time can be set a bit longer: saving traffic saves money, and the user experience is better.

On GPRS, don't even mention real-time updates — it is slow enough as it is; set the cache time as long as possible.

Of course, a good application will not hard-code a single behavior: it should switch between different caching strategies for different networks, and set the times according to its own situation — how often the data updates, how important the data is, and so on.
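On Android the current network type would come from ConnectivityManager; the policy itself boils down to a simple mapping, which can be sketched as follows (the concrete durations are my own assumptions, not recommendations from the article):

```java
public class CachePolicy {
    public enum Network { NONE, WIFI, MOBILE_3G, GPRS }

    // Cache time-to-live in milliseconds per network type: the slower or
    // costlier the network, the longer we keep serving from the cache.
    public static long cacheTtlMs(Network network) {
        switch (network) {
            case WIFI:      return 5 * 60 * 1000L;      // 5 minutes
            case MOBILE_3G: return 30 * 60 * 1000L;     // 30 minutes
            case GPRS:      return 2 * 60 * 60 * 1000L; // 2 hours
            case NONE:
            default:        return Long.MAX_VALUE;      // offline: always use cache
        }
    }
}
```

The returned TTL would then feed the file-age or database-expiry check used elsewhere in the app.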

Fifth, when to refresh

Developers want to read from the cache as much as possible, while users want real-time refreshes — and at the same time, the faster the response the better and the less traffic the better. (Honestly, I did not think much about this during development; after all, most of our company's product screens are simple one-click views, with some redundant features — improving slowly, haha.) It is a contradiction.

In fact, I don't know exactly when to refresh, but here I offer two pieces of advice:

1. Find the longest period during which the data is effectively unchanged, i.e. where a delay has no big impact on use.

For example, if your data updates roughly every 4 hours, then a cache time of 1–2 hours is appropriate — that is, update time / cache time ≈ 2 — though updates made by individual users, site editors, and others are another matter.

Users will always see the update eventually; even with a delay, it depends on the purpose of your product. If yours is a news or information application, go lower, say 2–4 hours; if the data is more important or more popular and users open the app often, go lower still, 1–2 hours; and so on.

Of course, for data like interface configuration, I think the cache time can be much longer — as long as possible.

Say you object that the download count shown behind an item will go stale. I'd answer: that is just an introductory screen. Whether the exact count is right has nothing to do with the user — even 1 billion means nothing to him; he just wants to be sure he can find Tom Cat here.

Otherwise you have lost another user.

2. Provide a refresh button.

A necessary, or at least safest, approach is to provide a refresh button on the relevant screens, or the currently popular pull-to-refresh pattern.

Wherever there is a cache, provide an opportunity to reload after a failure. After all, when you are drinking bone soup, you don't mind a few extra pairs of chopsticks beside the bowl.

All in all, the user comes first: for a better user experience, the methods are endless — keep looking for better ones.

(Reference code: http://blog.csdn.net/lnb333666/article/details/8460159)

Picture cache:

Overview

Loading a bitmap into your UI is straightforward, but things get much more complicated if you need to load a large number of them at once. In most cases (components such as ListView, GridView, or ViewPager), the total number of images on screen plus those about to scroll onto it is essentially unbounded.

Components like these recycle child views as they move off screen, which keeps memory usage down.

Provided you don't keep any long-lived references, the garbage collector will also free the bitmaps you have loaded. That is all well and good, but to keep the UI smooth and fast you need to avoid re-processing each image every time it scrolls back onto the screen. A memory and disk cache usually solves this, letting components reload and display processed images quickly.

This lesson walks you through using a memory and disk cache for bitmaps to improve the responsiveness and fluidity of your UI when loading multiple bitmaps.

Using a memory cache

At the cost of valuable application memory, a memory cache provides fast access to bitmaps.

The LruCache class (available in the support library back to API level 4, i.e. Android 1.6 and above) is well suited to the task of caching bitmaps: it keeps recently referenced objects in a strongly referenced LinkedHashMap and evicts the least recently used entries once the cache exceeds its designated size.

Note: a once-popular memory cache implementation held bitmaps via SoftReference or WeakReference, but this approach is now discouraged.

Starting with Android 2.3 (API level 9), the garbage collector is more aggressive about collecting soft/weak references, which makes that scheme fairly ineffective. In addition, before Android 3.0 (API level 11), a bitmap's backing pixel data was stored in native memory and released in unpredictable ways, which could momentarily exceed memory limits and crash the app.
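android.util.LruCache does this bookkeeping for you; its core behavior can be sketched in plain Java with an access-ordered LinkedHashMap (this sketch counts entries rather than bytes, purely for illustration — the real LruCache sizes entries via sizeOf()):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A minimal LRU cache: a LinkedHashMap in access order evicts the
// least-recently-used entry once maxEntries is exceeded.
public class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public SimpleLruCache(int maxEntries) {
        super(16, 0.75f, true); // true = access order, not insertion order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

Reading an entry with get() marks it as recently used, so frequently displayed bitmaps survive while stale ones are evicted first.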

When choosing a suitable size for the LruCache, there are a number of factors to consider, for example:

How memory-intensive are your other activities and/or the other applications on the device?

How many images are on screen at a time? How many need to be ready to come onto the screen?

What are the screen size and density of the device? An extra-high-density (XHDPI) device such as the Galaxy Nexus needs more cache space than a Nexus S (HDPI) to hold the same number of images.

What are the dimensions and configuration of the bitmaps, and hence how much memory does each image take?

How frequently are the images accessed? Are some accessed more often than others? If so, you may want to keep certain images permanently in memory, or even use multiple LruCache objects for different groups of bitmaps.

Can you balance quality against quantity? Sometimes it is more useful to store many low-quality bitmaps and load a higher-quality version in a background task.

There is no cache size specification that suits all apps; it comes down to analyzing your memory usage and choosing a suitable solution. A cache that is too small brings no benefit and just adds overhead; one that is too big can again cause java.lang.OutOfMemoryError, or leave too little memory for the rest of your application to run in.
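A common rule of thumb — an assumption for illustration, not a universal rule — is to devote a fixed fraction, say one eighth, of the available VM memory to the cache:

```java
public class CacheSizing {
    // Compute a cache budget in kilobytes as 1/8 of the VM's max memory.
    // On Android you would typically pass this to an LruCache whose
    // sizeOf() measures each bitmap in kilobytes.
    public static int cacheSizeKb() {
        long maxMemoryKb = Runtime.getRuntime().maxMemory() / 1024;
        return (int) (maxMemoryKb / 8);
    }
}
```

Whatever fraction you pick, validate it against the factors listed above with real memory profiling.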

(Reference website: http://my.oschina.net/ryanhoo/blog/88443)
