All content is sourced from the official documentation: https://developer.android.com/training/articles/memory.html
The only way to completely release memory from your app is to release the object references you may be holding, making the memory available to the garbage collector.
How Android Manages Memory
1. Sharing Memory
To fit everything it needs in RAM, Android tries to share RAM pages across processes. It can do so in the following ways:
1) Each app process is forked from an existing process called Zygote.
2) Most static data is mmapped into a process.
3) In many places, Android shares the same dynamic RAM across processes using explicitly allocated shared memory regions (either with ashmem or gralloc).
2. Allocating and Reclaiming App Memory
Here are some facts about how Android allocates and then reclaims memory from your app:
1) The Dalvik heap for each process is constrained to a single virtual memory range.
2) The logical size of the heap is not the same as the amount of physical memory used by the heap.
3) The Dalvik heap does not compact the logical size of the heap, meaning that Android does not defragment the heap to close up space. Android can shrink the logical heap size only when there is unused space at the end of the heap.
3. Restricting App Memory
To maintain a functional multitasking environment, Android sets a hard limit on the heap size for each app. The exact heap size limit varies between devices based on how much RAM the device has available overall. If your app has reached the heap capacity and tries to allocate more memory, it will receive an OutOfMemoryError.
4. Switching Apps
Instead of using swap space when the user switches between apps, Android keeps processes that are not hosting a foreground ("user visible") app component in a least-recently-used (LRU) cache. For example, when the user first launches an app, a process is created for it; when the user leaves the app, that process does not quit. The system keeps the process cached, so if the user later returns to the app, the process is reused for faster app switching.
How Your App Should Manage Memory
You should apply the following techniques when designing and implementing your app to make it more memory efficient:
1. Use services sparingly
1) If your app needs a service to perform work in the background, do not keep it running unless it is actively performing a job. Also be careful never to leak your service by failing to stop it when its work is done.
2) When you start a service, the system prefers to keep the process for that service running. This makes the process very expensive because the RAM used by the service cannot be used by anything else or paged out.
3) The best way to limit the lifespan of your service is to use an IntentService, which finishes itself as soon as it is done handling the intent that started it.
2. Release memory when your user interface becomes hidden
1) When the user navigates to a different app and your UI is no longer visible, you should release any resources that are used only by your UI.
2) To be notified when the user exits your UI, implement the onTrimMemory() callback in your Activity classes. Use this method to listen for the TRIM_MEMORY_UI_HIDDEN level, which indicates your UI is now hidden from view and you should free resources that only your UI uses.
3) Although you should implement onStop() to release activity resources such as a network connection or to unregister broadcast receivers, you usually should not release your UI resources until you receive onTrimMemory(TRIM_MEMORY_UI_HIDDEN). This ensures that if the user navigates back from another activity in your app, your UI resources are still available to resume the activity quickly.
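A minimal sketch of this callback (the cache field is illustrative):

```java
import android.app.Activity;
import android.content.ComponentCallbacks2;

// Hypothetical activity that drops a UI-only cache when the app's
// entire UI becomes hidden, rather than in onStop().
public class MainActivity extends Activity {

    private Object uiImageCache; // illustrative UI-only resource

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level >= ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN) {
            // All of the app's UI is hidden; release UI-only resources.
            // Higher levels (BACKGROUND, MODERATE, COMPLETE) also imply
            // the UI is hidden, so >= covers them as well.
            uiImageCache = null;
        }
    }
}
```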
3. Release memory as memory becomes tight
During any stage of your app's lifecycle, the onTrimMemory() callback also tells you when the overall device memory is getting low. You should respond by further releasing resources based on the following memory levels delivered by onTrimMemory():
1) TRIM_MEMORY_RUNNING_MODERATE
2) TRIM_MEMORY_RUNNING_LOW
3) TRIM_MEMORY_RUNNING_CRITICAL
Also, when your app process is currently cached, you may receive one of the following levels from onTrimMemory():
1) TRIM_MEMORY_BACKGROUND
2) TRIM_MEMORY_MODERATE
3) TRIM_MEMORY_COMPLETE
When the system begins killing processes in the LRU cache, although it primarily works bottom-up, it does give some consideration to which processes are consuming more memory and would thus provide the system with more memory gain if killed.
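Both groups of levels can be handled in a single callback. This sketch assumes an android.util.LruCache field named bitmapCache (the field and the halving policy are illustrative):

```java
import android.app.Activity;
import android.content.ComponentCallbacks2;
import android.graphics.Bitmap;
import android.util.LruCache;

public class CacheTrimActivity extends Activity {

    private LruCache<String, Bitmap> bitmapCache; // illustrative cache

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        switch (level) {
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
                // Still running, but device memory is getting low:
                // shrink expendable caches.
                bitmapCache.trimToSize(bitmapCache.size() / 2);
                break;
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
            case ComponentCallbacks2.TRIM_MEMORY_BACKGROUND:
            case ComponentCallbacks2.TRIM_MEMORY_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_COMPLETE:
                // Memory is critical, or the process is cached and may
                // be killed soon: release everything rebuildable.
                bitmapCache.evictAll();
                break;
        }
    }
}
```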
4. Check how much memory you should use
As mentioned earlier, each Android-powered device has a different amount of RAM available to the system and thus provides a different heap limit for each app. You can call getMemoryClass() to get an estimate of your app's available heap in megabytes. If your app tries to allocate more memory than is available here, it will receive an OutOfMemoryError.
In very special situations, you can request a larger heap size by setting the largeHeap attribute to "true" in the manifest <application> tag. If you do, you can call getLargeMemoryClass() to get an estimate of the large heap size.
Additionally, the large heap size is not the same on all devices and, when running on devices that have limited RAM, the large heap size may be exactly the same as the regular heap size. In other words, even with largeHeap set, you may not actually get more memory.
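A sketch of querying both limits at runtime (class name and log tag are illustrative):

```java
import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.util.Log;

// Hypothetical activity that logs the per-app heap limits.
public class HeapCheckActivity extends Activity {

    private void logHeapLimits() {
        ActivityManager am =
                (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
        int heapMb = am.getMemoryClass();           // regular heap limit, in MB
        int largeHeapMb = am.getLargeMemoryClass(); // limit when largeHeap="true"
        Log.d("HeapCheck", "heap=" + heapMb + "MB, largeHeap=" + largeHeapMb + "MB");
    }
}
```

On low-RAM devices the two values may be identical, which is the distribution caveat described above.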
5. Avoid wasting memory with bitmaps
Handle bitmaps with care to avoid wasting memory. See the dedicated guide: http://developer.android.com/training/displaying-bitmaps/load-bitmap.html
6. Use optimized data containers
Take advantage of optimized containers in the Android framework, such as SparseArray, SparseBooleanArray, and LongSparseArray, in place of containers such as HashMap.
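For example, a SparseArray maps int keys to objects without autoboxing each key into an Integer and without allocating an entry object per mapping (the keys and values here are illustrative):

```java
import android.util.SparseArray;

// Instead of HashMap<Integer, String>, which boxes every key and
// allocates an entry object per mapping:
SparseArray<String> titles = new SparseArray<>();
titles.put(1, "Home");
titles.put(42, "Settings");

String t = titles.get(42);        // "Settings"
String d = titles.get(7, "N/A");  // default value when the key is absent
```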
7. Be aware of memory overhead
1) Enums often require more than twice as much memory as static constants. You should strictly avoid using enums on Android.
2) Every class in Java (including anonymous inner classes) uses about 500 bytes of code.
3) Every class instance has 12-16 bytes of RAM overhead.
4) Putting a single entry into a HashMap requires the allocation of an additional entry object that takes 32 bytes, which is why the optimized containers from the previous section are recommended.
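A minimal sketch of the first point (type and constant names are illustrative): plain static int constants compile down to primitive values with no per-value objects, whereas each enum value is a full object allocation.

```java
// Enum version: each value is an object, plus a values() backing
// array and the enum class's own metadata.
enum StateEnum { IDLE, RUNNING, DONE }

// Constant version: compile-time int values, no objects at all.
final class State {
    static final int IDLE = 0;
    static final int RUNNING = 1;
    static final int DONE = 2;

    private State() {} // no instances needed
}

public class ConstantsDemo {
    public static void main(String[] args) {
        int s = State.RUNNING;                         // stored as a primitive
        System.out.println(s == State.RUNNING);        // true
        System.out.println(StateEnum.values().length); // 3
    }
}
```

If compile-time type safety matters, the support/androidx @IntDef annotation can restore it without the enum's memory cost.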
8. Be careful with code abstractions
9. Use nano protobufs for serialized data
10. Avoid dependency injection frameworks
These frameworks tend to perform a lot of process initialization by scanning your code for annotations, which can require significant amounts of your code to be mapped into RAM even though you don't need it.
11. Be careful about using external libraries
12. Optimize overall performance
This topic has a dedicated section in the training documentation.
13. Use ProGuard to strip out any unneeded code
The ProGuard tool shrinks, optimizes, and obfuscates your code by removing unused code and renaming classes, fields, and methods with semantically obscure names. Using ProGuard can make your code more compact, requiring fewer RAM pages to be mapped.
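A sketch of a typical rules file (the file name and keep rules are illustrative; a real project needs keep rules matching its own reflection usage and manifest entry points):

```
# proguard-project.txt (illustrative)
# Enable shrinking and optimization passes.
-optimizationpasses 2

# Keep components referenced from the manifest so they survive shrinking.
-keep public class * extends android.app.Activity
-keep public class * extends android.app.Service
-keep public class * extends android.content.BroadcastReceiver
```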
14. Use zipalign on your final APK
If you do any post-processing of an APK generated by a build system (including signing it with your final production certificate), you must run zipalign on it to have it re-aligned. Failing to do so can cause your app to require significantly more RAM, because things like resources can no longer be mmapped from the APK.
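For example, after signing (file names are illustrative):

```shell
# Align the signed APK on 4-byte boundaries so resources can be mmapped.
zipalign -v 4 app-release-unaligned.apk app-release.apk

# Verify the alignment of an existing APK.
zipalign -c -v 4 app-release.apk
```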
15. Analyze your RAM usage
Once you achieve a relatively stable build, begin analyzing how much RAM your app is using throughout all stages of its lifecycle. For information on how to analyze your app, read Investigating Your RAM Usage (https://developer.android.com/tools/debugging/debugging-memory.html), which covers not only the usual MAT workflow but also some other tricks.
16. Use multiple processes
If it is appropriate for your app, an advanced technique that may help you manage your app's memory is dividing components of your app into multiple processes. This technique must be used carefully, and most apps should not run multiple processes, as it can easily increase rather than decrease your RAM footprint if done incorrectly. It is primarily useful for apps that can run significant work in the background as well as the foreground and can manage those operations separately.
You can specify a separate process for an app component by declaring the android:process attribute for that component in the manifest file. This works best when the split between foreground and background work is clear; used carelessly, it backfires.
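For example (the component and process names are illustrative):

```xml
<!-- In AndroidManifest.xml: run this service in its own process.
     A leading colon makes the process private to the app. -->
<service
    android:name=".SyncService"
    android:process=":sync" />
```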
The official documentation remains the most accurate first-hand source of information.
Copyright notice: this is the blogger's original article; please do not reproduce it without permission.