How the app should manage memory
You should pay attention to RAM consumption at every stage of software development, including the design phase, before any code is written. There are many techniques that help you design and write more efficient code, and you should apply them when designing and implementing your application to reduce its memory consumption.
Use services sparingly
If your app needs a service to perform background work, keep the service running only while it is actually doing that work, and make sure to stop it when the work is finished.
When you start a service, the system keeps the service's process alive for as long as the service is running. This makes the process very expensive, because the RAM it occupies cannot be used or reclaimed by other applications. It also reduces the number of processes the system can keep in the LRU cache, which makes switching between apps less efficient. When enough services are running in the background, memory pressure can even destabilize the whole system.
The simplest way to manage a service's lifetime is to use an IntentService, which shuts itself down automatically after it finishes handling the intent that started it; see the documentation on using foreground services for details.
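As an illustration, here is a minimal sketch of an IntentService; the class name UploadService and the work it performs are assumptions made for this example. The service stops itself automatically once onHandleIntent() returns and no more intents are queued:

```java
import android.app.IntentService;
import android.content.Intent;

// Hypothetical service that handles one piece of background work per intent.
// An IntentService processes intents on a worker thread and stops itself
// automatically when there is no more work pending.
public class UploadService extends IntentService {

    public UploadService() {
        super("UploadService"); // name used for the worker thread
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // Do the background work here. When this method returns and no other
        // intents are pending, the service shuts down on its own, so its
        // process no longer pins memory the system could otherwise reclaim.
    }
}
```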
Leaving a service running in the background when it is not needed is one of the worst memory-management mistakes an app can make. So don't be greedy and keep your service running in the background: it not only hurts your app's performance, it also makes it more likely that users will notice the behavior and uninstall the app.
Release memory resources when the user interface is hidden
When the user switches to another app and your UI is no longer visible, you should release the resources that are used only by your UI. Releasing UI resources at this point can significantly increase the number of processes the system can keep cached, which directly improves the user experience.
When all of the UI components of your app's process become hidden from the user, the system invokes the onTrimMemory() callback with the TRIM_MEMORY_UI_HIDDEN level. This differs from the onStop() callback, which is called whenever an Activity becomes hidden, even when the user merely navigates to another activity within your app, whereas onTrimMemory(TRIM_MEMORY_UI_HIDDEN) fires only when none of your UI is visible. So although you should still implement onStop() to release activity resources such as network connections or to unregister broadcast receivers, you usually should not release your UI resources until you receive onTrimMemory(TRIM_MEMORY_UI_HIDDEN). This ensures that if the user returns to your app with the back key, your UI resources are still available and the interface can be displayed quickly.
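A minimal sketch of this pattern in an Activity; the releaseUiResources() helper is a hypothetical placeholder for whatever caches your UI actually holds:

```java
import android.app.Activity;
import android.content.ComponentCallbacks2;

public class MainActivity extends Activity {

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        if (level >= ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN) {
            // All of the app's UI is hidden (or the process is cached):
            // drop bitmaps, view caches, and other resources that exist
            // only to render the UI.
            releaseUiResources();
        }
    }

    // Hypothetical helper; what to release depends on your app.
    private void releaseUiResources() {
        // e.g. clear in-memory image caches, null out large drawables, ...
    }
}
```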
Release some memory when memory is low
At any point in your app's lifecycle, the onTrimMemory() callback is also invoked when memory on the device as a whole becomes low. You should selectively release resources based on the level passed to onTrimMemory(), as in the sketch that follows the list of levels below.
TRIM_MEMORY_RUNNING_MODERATE
Your app is running and is not considered killable, but the device is beginning to run low on memory and the system is starting to kill processes in the LRU cache.
TRIM_MEMORY_RUNNING_LOW
Your app is running and is not considered killable, but the device is running much lower on memory, so you should release unused resources to improve system performance.
TRIM_MEMORY_RUNNING_CRITICAL
Your app is still running, but the system has already killed most of the processes in the LRU cache, so you should release all non-critical resources now. If the system cannot reclaim enough RAM, it will clear the entire LRU cache and begin killing processes it would prefer to keep alive, such as those hosting a running service.
Similarly, when your app's process is cached, you may receive any of the following onTrimMemory() levels:
TRIM_MEMORY_BACKGROUND
The system is running low on memory and your process is near the beginning of the LRU list. Although your process is not at high risk of being killed, the system has already begun killing processes in the LRU cache. You should release resources that are easy to recover so that your process stays in the list and can resume quickly when the user returns to the app.
TRIM_MEMORY_MODERATE
The system is running low on memory and your process is near the middle of the LRU list. If the system becomes further constrained for memory, your process may be killed.
TRIM_MEMORY_COMPLETE
The system is running low on memory and your process is one of the first that will be killed if the system does not reclaim memory now. You should release everything that is not critical to restoring your app's state.
Although the onTrimMemory() callback was only added in API level 14, you can use the onLowMemory() callback as a fallback on older versions; it is roughly equivalent to the TRIM_MEMORY_COMPLETE level of onTrimMemory().
Note: when the system starts killing processes in the LRU cache, although it works mostly from the bottom up, it also gives preference to killing processes that consume more memory, since that reclaims more RAM. Therefore, keeping your app's memory consumption as low as possible while it sits in the LRU list improves the chances that it stays cached and can be switched back to quickly.
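A minimal sketch of how an Application subclass might react to these levels; the trimCaches(), releaseUiResources(), and releaseEverything() helpers are hypothetical placeholders for whatever your app can actually release:

```java
import android.app.Application;
import android.content.ComponentCallbacks2;

public class MyApplication extends Application {

    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        switch (level) {
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
                // App is running but device memory is low:
                // trim caches that can be rebuilt cheaply.
                trimCaches();
                break;
            case ComponentCallbacks2.TRIM_MEMORY_BACKGROUND:
            case ComponentCallbacks2.TRIM_MEMORY_MODERATE:
                // Process is cached in the LRU list: release easily
                // recoverable resources so the process can stay cached.
                trimCaches();
                releaseUiResources();
                break;
            case ComponentCallbacks2.TRIM_MEMORY_COMPLETE:
                // Process is among the first to be killed: release everything
                // that is not needed to restore the app's state later.
                releaseEverything();
                break;
            default:
                break;
        }
    }

    @Override
    public void onLowMemory() {
        super.onLowMemory();
        // Pre-API 14 fallback, roughly equivalent to TRIM_MEMORY_COMPLETE.
        releaseEverything();
    }

    // Hypothetical helpers; what they release depends on your app.
    private void trimCaches() { /* shrink in-memory caches */ }
    private void releaseUiResources() { /* drop bitmaps, view caches, ... */ }
    private void releaseEverything() { /* release all non-essential memory */ }
}
```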
Determine how much memory you should use
As mentioned earlier, each Android device gives apps a heap limit whose size depends on the device's overall RAM. You can query the heap size available to your app with getMemoryClass(). If your app tries to allocate beyond this limit, it will get an OutOfMemoryError.
In very special situations, you can request a larger heap by setting the largeHeap attribute to true in the <application> tag of your manifest file. If you do so, you can call getLargeMemoryClass() to get an estimate of the larger heap size.
However, a larger heap should only be requested by applications that genuinely need more memory, such as a large image-editing app. Never request a large heap simply because you have run out of memory; do so only once you understand where your memory is being allocated and why it is being retained. Even when you are sure your app needs a large heap, avoid wasting it: using extra memory noticeably hurts the user experience, because garbage collection takes longer and the system becomes slower when switching apps or performing other operations.
In addition, the size of the large heap differs between devices, and on devices with limited RAM it may be exactly the same as the normal heap size. So even if you request the large heap, you should still check the actual limit with getMemoryClass() and try to stay below it.
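A minimal sketch of querying these limits, assuming the code runs with access to a Context; the larger limit only applies if android:largeHeap="true" is also declared on <application> in the manifest:

```java
import android.app.ActivityManager;
import android.content.Context;
import android.util.Log;

public final class HeapLimits {

    // Logs the per-app heap limits reported by the system, in megabytes.
    public static void logHeapLimits(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);

        // Normal per-app heap limit; allocating beyond it throws OutOfMemoryError.
        int memoryClassMb = am.getMemoryClass();

        // Limit when android:largeHeap="true" is set on <application>;
        // on low-RAM devices this can equal the normal limit.
        int largeMemoryClassMb = am.getLargeMemoryClass();

        Log.d("HeapLimits", "memoryClass=" + memoryClassMb
                + "MB, largeMemoryClass=" + largeMemoryClassMb + "MB");
    }
}
```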