http://hukai.me/android-training-managing_your_app_memory/
Random-access memory (RAM) is a valuable resource in any software environment, and especially so on mobile operating systems, where physical memory is usually limited. Although Android's Dalvik virtual machine performs routine garbage collection, that does not mean you can ignore when and where your app allocates and releases memory.
For the GC to reclaim memory from your app in a timely manner, you need to avoid memory leaks (usually caused by holding object references in global members) and release reference objects at the appropriate time (in lifecycle callbacks, as described below). For most apps, Dalvik's GC automatically reclaims objects once they leave the scope of your app's active threads.
This article explains how Android manages app processes and memory allocation, and how to proactively reduce memory usage when developing Android apps. For Java's general resource-management mechanisms, refer to other books or online material. If you are looking for how to analyze the memory your app is already using, see Investigating Your RAM Usage.
Part 1: How Android Manages Memory
Android does not provide swap space, but it does use paging and memory-mapping (mmapping) to manage memory. This means that any memory you modify, whether by allocating new objects or touching mmapped pages, stays resident in RAM and cannot be paged out. So the only way to fully release memory is to release references to objects you may be holding, making them eligible for GC once nothing else references them. There is one exception: files mmapped in without modification, such as code, can be paged out of RAM if the system wants to use that memory elsewhere.
1) Shared memory
Android shares RAM pages across different processes in the following ways:
Every app process is forked from a process called Zygote. The Zygote process starts when the system boots and loads common framework code and resources. To start a new app process, the system forks Zygote and then loads and runs the app's code in the new process. This allows most of the RAM pages holding framework code and resources to be shared across all app processes.
Most static data is mmapped into a process. This not only lets the same data be shared between processes, but also lets it be paged out when needed. Examples of such static data include:
- Dalvik code (in a pre-linked .odex file for direct mapping)
- App resources (the resource table is designed as a structure that is easy to mmap, and the files in the APK can be optimized by aligning them)
- Traditional project elements, such as native code in .so files
- In many cases, Android uses explicitly allocated shared memory regions (such as ashmem or gralloc) for dynamic RAM that is shared between processes. For example, window surfaces use memory shared between the app and the screen compositor, and cursor buffers use memory shared between a content provider and its client.
For information on how to view the shared memory used by your app, see Investigating Your RAM Usage.
2) Allocating and Reclaiming Memory
Here are a few facts about how Android allocates and reclaims memory:
- The Dalvik heap of each process is confined to a single range of virtual memory. This defines the logical heap size, which can grow as needed, up to a limit defined by the system.
- The logical heap size is not the same as the amount of physical memory actually used. Android tracks a value called the Proportional Set Size (PSS), which apportions memory shared with other processes. (If 10 MB is shared among 20 processes, roughly 0.5 MB of it might be attributed to your process, depending on the weighting.)
- The Dalvik heap does not compact its logical size, which means Android does not defragment the heap to close up free space. Android can shrink the logical heap size only when unused space appears at the end of the heap. However, that does not mean the physical memory used by the heap cannot shrink: after garbage collection, Dalvik walks the heap, finds unused pages, and returns them to the kernel with madvise (a system call). So paired allocations and deallocations of large chunks allow physical memory to be reclaimed normally. Reclaiming fragmented memory, however, can be far less efficient, because the pages holding the fragments may still be in use elsewhere.
3) Limiting Your App's Memory
To maintain a multitasking environment, Android sets a hard heap size limit for every app. The exact limit varies with how much RAM the device has. If your app has reached the limit and tries to allocate more memory, it receives an OutOfMemoryError.
In some cases you may want to query the current device's heap limit in order to decide how large a cache to keep. You can query it with getMemoryClass(), which returns an integer: the heap size limit for your app in megabytes.
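As a minimal sketch of sizing a cache from that limit (the 1/8 fraction here is a common rule of thumb, not a requirement, and the class name is illustrative):

```java
import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;

public class CacheSetupActivity extends Activity {
    // Compute a cache budget from the per-app heap limit.
    private int computeCacheSizeBytes() {
        ActivityManager am =
                (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
        int heapLimitMb = am.getMemoryClass();      // e.g. 48, 64, 128 ...
        // Use roughly 1/8 of the per-app heap limit for an in-memory cache.
        return heapLimitMb * 1024 * 1024 / 8;
    }
}
```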
4) Switching applications
Android does not swap memory when the user switches between apps. Instead, Android keeps processes that do not host foreground components in an LRU cache. For example, when a user first starts an app, the system creates a process for it; when the user leaves, that process is not immediately destroyed. The system keeps the process cached, so if the user later returns to the app, the process is reused and the switch is fast.
A cached process, even while temporarily unused, still holds on to memory, and that affects overall system performance. As the system runs low on memory, it kills cached processes according to LRU order and other factors. To keep your process cached as long as possible, follow the advice in the sections below on when to release your references.
For how Android decides which non-foreground processes to kill, see Processes and Threads.
Part 2: How Your App Should Manage Memory
You should consider the limitations of RAM at every stage of development, including in the design phase before you start writing code. There are many ways to design and implement a feature, and they differ in efficiency even when they are only combinations and variations of the same techniques.
To make your app more efficient, follow the points below as you design and implement your code.
1) Use Services Sparingly
If your app needs a background service, keep it stopped except when it is triggered to perform a task. Also watch out for memory leaks caused by failing to stop the service once its work is done.
When you start a service, the system prefers to keep its process alive. This makes the process expensive, because the RAM occupied by the service cannot be given to other components or paged out. It reduces the number of processes the system can keep in the LRU cache, making app switching less efficient. When memory is tight it can even destabilize the system, leaving it unable to maintain a process for every running service.
The best way to limit a service's lifetime is to use an IntentService, which finishes itself as soon as it has handled the intent given to it. For more information, read Running in a Background Service.
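A minimal IntentService sketch (the class name and the work inside are illustrative): the service stops itself automatically once onHandleIntent() returns, so it does not pin its process in memory.

```java
import android.app.IntentService;
import android.content.Intent;

public class DownloadService extends IntentService {
    public DownloadService() {
        super("DownloadService");   // name used for the worker thread
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // Runs on a background thread; do the one-off work here.
        // When this method returns, the service stops itself.
    }
}
```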
Keeping a service around when it is no longer needed is one of the worst memory-management mistakes an Android app can make. So don't be greedy and keep a service alive. Not only does it increase the risk of your app performing badly for lack of RAM, but users will notice the permanently running background behavior and may uninstall your app.
2) Release Memory When Your UI Is Hidden
When the user switches to another app and your app's UI is no longer visible, release the memory resources used only by your UI. Releasing UI resources at this point significantly increases the system's ability to cache processes, which has a direct impact on the user experience.
To be notified when the user leaves your UI, implement the onTrimMemory() callback in your Activity classes. Use this method to listen for the TRIM_MEMORY_UI_HIDDEN level, which signals that your UI is now hidden and you should release resources used only by your UI.
Please note: your app receives the onTrimMemory() callback with TRIM_MEMORY_UI_HIDDEN only when all of its UI components become hidden. This is different from the onStop() callback, which runs whenever an Activity instance becomes hidden, for example when the user jumps from one activity in your app to another. You should still implement onStop() and release activity resources there, such as closing network connections and unregistering broadcast receivers. But do not release your UI resources until you receive onTrimMemory(TRIM_MEMORY_UI_HIDDEN); this ensures that when the user navigates back from another activity, your UI resources are still available and the activity can resume quickly.
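A sketch of this callback (releaseUiCaches() is a hypothetical helper; the >= comparison works because the cached-state levels are numerically above TRIM_MEMORY_UI_HIDDEN):

```java
import android.app.Activity;
import android.content.ComponentCallbacks2;

public class MainActivity extends Activity {
    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level >= ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN) {
            // The whole UI is hidden: drop UI-only resources,
            // e.g. cached bitmaps backing views.
            releaseUiCaches();   // hypothetical helper
        }
    }

    private void releaseUiCaches() { /* ... */ }
}
```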
3) Release Memory as It Becomes Tight
At any stage of your app's lifecycle, the onTrimMemory() callback can also tell you that memory on the device as a whole is getting tight. Decide which resources to release based on the level passed to onTrimMemory():
- TRIM_MEMORY_RUNNING_MODERATE: Your app is running and not in danger of being killed, but the device is running low on memory and the system is starting to kill processes in the LRU cache.
- TRIM_MEMORY_RUNNING_LOW: Your app is running and not in danger of being killed, but the device is running much lower on memory; release unused resources to improve system performance (which directly affects your app's performance as well).
- TRIM_MEMORY_RUNNING_CRITICAL: Your app is still running, but the system has already killed most of the processes in the LRU cache, so you should release all non-essential resources immediately. If the system cannot reclaim enough RAM, it will clear all processes from the LRU cache and begin killing processes it would normally prefer to keep, such as those hosting a running service.
Similarly, while your app process is cached, you may receive one of the following levels from onTrimMemory():
- TRIM_MEMORY_BACKGROUND: The system is running low on memory and your process is near the beginning of the LRU list. Although your process is not yet at high risk of being killed, the system may already be killing other cached processes. Release resources that are easy to recover, so your process stays cached and can resume quickly when the user returns to your app.
- TRIM_MEMORY_MODERATE: The system is running low on memory and your process is near the middle of the LRU list. If memory pressure increases further, your process may be killed.
- TRIM_MEMORY_COMPLETE: The system is running low on memory and your process is among the first on the LRU list to be killed. Release every resource that is not essential to restoring your app's state.
Because the onTrimMemory() callback was added in API level 14, on older versions you can use the onLowMemory() callback for backward compatibility; it is roughly equivalent to TRIM_MEMORY_COMPLETE.
Note: when the system begins killing processes in the LRU cache, although it mostly works in LRU order, it also considers how much memory each process consumes. The less memory your process holds while cached, the better its chances of staying alive.
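The levels above can be handled in one place, for example in a custom Application class; trimCaches() is a hypothetical helper, and how much to drop at each level is a policy choice, not a fixed rule:

```java
import android.app.Application;
import android.content.ComponentCallbacks2;

public class MyApplication extends Application {
    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        switch (level) {
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
                trimCaches(0.5f);   // drop about half of the caches
                break;
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
            case ComponentCallbacks2.TRIM_MEMORY_BACKGROUND:
            case ComponentCallbacks2.TRIM_MEMORY_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_COMPLETE:
                trimCaches(1.0f);   // drop everything non-essential
                break;
        }
    }

    private void trimCaches(float fraction) { /* hypothetical helper */ }
}
```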
4) Check How Much Memory You Should Use
As mentioned earlier, every Android device has a different amount of total and free RAM, so different devices offer apps different heap limits. You can get your app's available heap size by calling getMemoryClass(); if your app tries to allocate more than that, it receives an OutOfMemoryError.
In some special scenarios you can request a larger heap by setting the largeHeap="true" attribute on the application tag of the manifest. If you do, you can query the larger limit with getLargeMemoryClass().
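The manifest attribute looks like this (a fragment only; the label value is illustrative):

```xml
<application
    android:label="@string/app_name"
    android:largeHeap="true">
    <!-- activities, services, ... -->
</application>
```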
However, the large heap is intended for the small number of apps that legitimately consume large amounts of RAM (such as an editor for large photos). Don't request a large heap simply because you are running out of memory; use it only when you know exactly where all that memory goes and why it must be retained. Use the large heap as sparingly as possible: the extra memory use hurts the overall user experience, makes each GC run take longer, and degrades system performance during task switching.
In addition, the large heap is not guaranteed to be larger. On some tightly constrained devices, the large heap is the same size as the regular heap. So even if you request the large heap, you should check the actual limit with getMemoryClass().
5) Avoid Wasting Memory on Bitmaps
When you load a bitmap, keep only the data needed for the current screen's resolution; if the original image is larger than the screen requires, scale it down. Keep in mind that memory usage grows quadratically with bitmap size, because both the X and Y dimensions increase.
Note: on Android 2.3.x (API level 10) and below, a bitmap's pixel data is stored in native memory, which is hard to debug. Starting with Android 3.0 (API level 11), pixel data is allocated on your app's Dalvik heap, which improves garbage collection and makes debugging easier. So if your app uses bitmaps and you hit memory problems on an old device, switch to debugging on a device running Android 3.0 or higher.
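The downscaling described above can be sketched with BitmapFactory's two-pass decode; the helper name and the power-of-two loop are illustrative, not a fixed API pattern:

```java
import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public final class Bitmaps {
    // Decode a resource at (roughly) the requested size instead of full size.
    static Bitmap decodeScaled(Resources res, int resId,
                               int reqWidth, int reqHeight) {
        BitmapFactory.Options opts = new BitmapFactory.Options();
        opts.inJustDecodeBounds = true;            // first pass: dimensions only
        BitmapFactory.decodeResource(res, resId, opts);

        int sample = 1;
        while (opts.outWidth / (sample * 2) >= reqWidth
                && opts.outHeight / (sample * 2) >= reqHeight) {
            sample *= 2;                           // power-of-two subsampling
        }
        opts.inJustDecodeBounds = false;
        opts.inSampleSize = sample;                // second pass: reduced size
        return BitmapFactory.decodeResource(res, resId, opts);
    }
}
```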
6) Use Optimized Data Containers
Use the optimized container classes in the Android framework, such as SparseArray, SparseBooleanArray, and LongSparseArray. The generic HashMap implementation consumes more memory because it needs a separate entry object for every mapping. In addition, SparseArray is more efficient because it avoids autoboxing the keys (and sometimes the values).
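A small sketch of SparseArray in place of HashMap<Integer, String> (the values here are arbitrary examples):

```java
import android.util.SparseArray;

public class ContainerDemo {
    static String describe() {
        // int keys: no Integer autoboxing and no per-entry objects.
        SparseArray<String> names = new SparseArray<>();
        names.put(42, "answer");
        // get(key, defaultValue) avoids a null check for missing keys.
        return names.get(42) + "/" + names.get(7, "none");
    }
}
```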
7) Be Aware of Memory Overhead
Know the costs and overhead of the language and libraries you use, and keep them in mind from start to finish as you design your app. Things that seem innocuous on the surface often carry significant overhead. For example:
- Enums often consume more than twice the memory of static constants. You should avoid using enums on Android.
- Every class in Java (including anonymous inner classes) uses about 500 bytes of code.
- Every class instance has 12-16 bytes of RAM overhead.
- Adding a single entry to a HashMap requires allocating an additional entry object of about 32 bytes.
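The enum advice above can be illustrated with plain int constants, which avoid per-value objects, the values() array, and the extra class metadata an enum carries (the type and constant names here are made up):

```java
public final class Shape {
    public static final int CIRCLE = 0;
    public static final int SQUARE = 1;
    public static final int TRIANGLE = 2;

    private Shape() {}   // constants only; no instances
}
```

A field then holds a plain int, e.g. `int shape = Shape.SQUARE;`. On newer toolchains, the androidx @IntDef annotation can restore compile-time checking for such constants.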
8) Be Careful with Code Abstractions
Developers often use abstractions as "good programming practice," because abstractions improve the flexibility and maintainability of code. However, abstractions come at a significant cost: they generally require more code to execute, and that code must be mapped into memory. So if an abstraction does not deliver a significant benefit, avoid it.
9) Use Nano Protobufs for serialized data
Protocol buffers are a language-neutral, platform-neutral, extensible mechanism designed by Google for serializing structured data, similar to XML but smaller, faster, and simpler. If you use protobufs for your data, you should always use nano protobufs in client-side code. Regular protobufs generate extremely verbose code, which causes many problems for an app: higher RAM usage, a significantly larger APK, slower execution, and quickly hitting the DEX symbol limit.
For more details, refer to the "Nano version" section of the protobuf readme.
10) Avoid Dependency Injection Frameworks
Dependency injection frameworks such as Guice or RoboGuice can be attractive because they simplify the code you write.
Note: RoboGuice 2 changes the coding style through dependency injection, making Android development more pleasant. Do you often forget to check for null when calling getIntent().getExtras()? RoboGuice 2 does it for you. Do you find casting the return value of findViewById() to a TextView tedious? RoboGuice 2 helps with that too. RoboGuice removes guesswork from Android development, taking care of injecting your Views, Resources, System services, and other objects.
However, these frameworks perform a lot of initialization by scanning your code, which requires mapping substantial amounts of code into RAM, and those mapped pages tend to stay resident for a long time.
11) Use Third-Party Libraries with Caution
Much open-source library code was not written for mobile environments and can be inefficient on mobile devices. When you decide to use a third-party library, be prepared for a tedious porting and maintenance effort to adapt it for mobile.
Even libraries designed for Android can be risky, because every library does things differently. For example, one library might use nano protobufs while another uses micro protobufs, giving you two protobuf implementations in one app. Similar conflicts can arise between logging, image loading, caching, and other modules.
Also don't fall into the trap of importing an entire library for one or two features. If no library matches your needs well, consider implementing it yourself rather than pulling in a bloated solution.
12) Optimize Overall Performance
The official documentation lists many articles on optimizing overall app performance under Best Practices for Performance; this article is one of them. Some explain how to optimize your app's CPU usage, others how to optimize its memory usage.
You should also read Optimizing Your UI to optimize your layouts, and pay attention to the suggestions made by the lint tool.
13) Use ProGuard to Remove Unneeded Code
ProGuard shrinks, optimizes, and obfuscates code by removing unused code and renaming classes, fields, and methods. Using ProGuard makes your code more compact, reducing the RAM needed for mapped code.
14) Use zipalign on the Final APK
After you have written all the code and the build system has produced an APK, you need to align the APK with zipalign. An unaligned APK requires more RAM, because resources such as images cannot be mmapped from it.
Note: Google Play does not accept APKs that have not been zipaligned.
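The tool is invoked like this (the file names are illustrative; zipalign ships with the Android SDK build tools):

```shell
# Align the final APK on 4-byte boundaries.
zipalign -v 4 app-unaligned.apk app-release.apk

# Verify alignment of an existing APK.
zipalign -c -v 4 app-release.apk
```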
15) Analyze Your RAM usage
Once you reach a relatively stable version, analyze the memory your app uses across its whole lifecycle and optimize accordingly. For more details, see Investigating Your RAM Usage.
16) Use Multiple Processes
Where appropriate, a more advanced technique for managing memory is to split your app into components that run in different processes. This technique must be used with caution, and most apps should not use multiple processes: used improperly, it significantly increases memory usage rather than decreasing it. Consider it when your app runs substantial work in the background as well as in the foreground.
A typical example is a music player that plays for a long time in the background. If the whole app runs in one process, the foreground UI resources cannot be released while playback continues in the background. Such an app can be split into two processes: one for the UI and one for the background playback service.
You can run a component in a separate process by declaring the android:process attribute in the manifest file:
<service android:name=".PlaybackService" android:process=":background" />
For more details on using this technique, refer to the original article: http://developer.android.com/training/articles/memory.html
Article Learning from http://developer.android.com/training/articles/memory.html
Android Training: Managing Your App's Memory