[Repost] Memory management in Android: avoid using enum types

Source: Internet
Author: User
Tags compact

Original URL: http://android-performance.com/android/2014/02/17/android-manage-memory.html

This content is translated from: http://developer.android.com/training/articles/memory.html

Random access memory (RAM) is a valuable resource in any software development environment, but on a mobile operating system it is even more precious and more constrained. Although Android's Dalvik virtual machine performs routine garbage collection, that does not mean you can ignore when and where your app allocates and releases memory.

For the garbage collector to reclaim memory, you must avoid memory leaks (often caused by holding object references in global members) and release references at the appropriate time (discussed further below). For most apps, the Dalvik garbage collector takes care of the rest: the system reclaims an object's memory when the object leaves the scope of the active threads.

This article explains how Android manages and allocates memory, and how to proactively use less memory when developing an app. For more information on managing resources, refer to other books or online documentation. If you want to analyze the memory usage of an existing app, read Investigating Your RAM Usage.

How Android manages memory

Android does not provide swap space, but it does use paging and memory-mapping ("mmapping"; translator's note: mmapping is mainly used to speed up reads and writes of large files and to share memory between processes; see also "Memory-Mapped File Principles Explored"). This means that any memory your app touches, whether by allocating objects or by touching mapped pages, stays resident in RAM and cannot be paged out (translator's note: because Android has no swap space, in-memory data is never swapped to disk, so only physical RAM is available). The only way to fully release memory is therefore to release object references so the garbage collector can reclaim them. There is one caveat: files that are memory-mapped but unmodified, such as code, can be paged out of RAM if the system needs their pages elsewhere.

Shared memory

To meet the needs of memory operations, Android tries to share memory between processes, which is done in the following way:

    • Each app process is forked from a process called Zygote. The Zygote process is created when the system boots and preloads common framework code and resources (such as activity themes). To start a new app, the system forks the Zygote process and then loads and runs the app's code in the new process. This allows most of the RAM pages allocated for framework code and resources to be shared across all app processes.

    • Most static data is memory-mapped into a process. This allows the same data to be shared between processes and to be paged out when needed. Such static data includes Dalvik code (in a pre-linked .odex file, ready for direct mapping), app resources, and traditional project elements such as native code in .so files.

    • In many places, Android explicitly shares the same region of RAM across processes. For example, window surfaces use shared memory between the app and the screen compositor, and cursor buffers use shared memory between a content provider and its client app.

Because of this widespread use of shared memory, you should pay close attention to how much memory your app actually uses during development. To find out, refer to Investigating Your RAM Usage.

Allocate and reclaim memory for your app

Here are the key points about how Android allocates and reclaims memory for your app:

    • The Dalvik heap of each process is confined to a single virtual memory range. This defines the logical heap size, which grows on demand up to a per-app limit defined by the system.

    • The logical size of the heap is not the same as the amount of physical memory the heap uses. When inspecting your app's heap, Android computes a value called PSS (Proportional Set Size), which charges you for pages shared with other processes in proportion to how many apps share that RAM (translator's note: for example, if three processes share a library occupying 30 pages, each process is charged 10 pages toward its PSS for that library). The PSS total is what the system considers your physical memory footprint. For more on PSS, see Investigating Your RAM Usage.

    • Dalvik does not compact the logical size of the heap, meaning Android does not defragment the heap to close up space. Android can shrink the logical heap size only when there is unused space at the end of the heap. But this does not mean the physical memory used by the heap cannot be reclaimed: after garbage collection, Dalvik walks the heap, finds unused pages, and returns those pages to the kernel with madvise. So paired allocation and release of large chunks should recover all (or nearly all) of the physical memory involved. Reclaiming memory from small allocations is much less efficient, however, because a page used for a small allocation may still be shared with something that has not been freed.

Limit your app's memory

To maintain a functional multitasking environment, Android sets a hard limit on the heap size available to each app. The exact limit varies with the device's total RAM. If your app tries to exceed it, an OutOfMemoryError is thrown.

In some cases, you may want to query exactly how much heap space is available on the current device, for example to work out how much data is safe to keep in a memory cache. You can query this by calling getMemoryClass(), which returns an integer indicating the number of megabytes available to your app's heap. This is discussed further below, under "Check how much memory you should use".

App switching

Android does not use swap space when switching apps; instead it keeps processes whose components are not in the foreground (that is, not visible to the user) in an LRU (least recently used) cache. For example, when you first start an app the system creates a process for it, and that process does not exit when the user leaves the app; the system caches it. When the user later returns, the cached process is reused, making app switching fast.

If your app has a cached process that holds memory it does not currently need, it drags down the system's overall performance. So when system memory runs low, the system kills the least recently used processes in the cache first, while also weighing which processes are the most memory-hungry. To keep your process cached for as long as possible, follow the advice in the sections below on releasing references.

For more information on how Android caches processes and decides which ones to kill, see Processes and Threads.

How you should manage memory

You should consider memory constraints at every stage of development, including the design phase. There are many effective techniques you can apply; combined thoughtfully, they will make your app far more memory-efficient.

When designing and formally developing your app, applying the techniques below will enable you to use memory more efficiently.

Use services sparingly

If your app needs a service to perform background work, do not keep the service running unless it actually has work to do. Also be careful to stop the service when its task completes; failing to stop a service is an easy mistake to make.

When you start a service, the system prefers to keep its process alive. This makes service processes very expensive, because the RAM a service occupies is unavailable to other processes and cannot be paged out. It also reduces the number of processes the system can keep in the LRU cache, making app switching less efficient. When memory is tight it can even destabilize the system, because the system may be unable to maintain enough processes to host all the running services.

The best way to limit a service's lifespan is to use an IntentService, which starts when an Intent arrives and stops itself as soon as the work is done. For more information, read Running in a Background Service.
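As a concrete illustration of this pattern, here is a minimal sketch; the class name, intent extra, and the work performed are all hypothetical:

```java
import android.app.IntentService;
import android.content.Intent;

// Hypothetical example: an IntentService handles one Intent at a time on a
// worker thread and stops itself automatically once its queue is empty.
public class DownloadService extends IntentService {

    public DownloadService() {
        super("DownloadService"); // name used for the worker thread
    }

    @Override
    protected void onHandleIntent(Intent intent) {
        // Runs on a background thread. When this method returns and no more
        // intents are queued, the service stops itself: no lingering process.
        String url = intent.getStringExtra("url");
        // ... perform the download for `url` ...
    }
}
```

A caller would start it with `startService(new Intent(context, DownloadService.class))` and never needs to stop it explicitly.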

Leaving a service running when it is no longer needed is one of the worst memory-management mistakes an Android app can make. So don't keep your service alive indefinitely: it not only increases the risk of your app performing poorly, but users will eventually notice such behavior and uninstall your app.

Remember to free up memory when your user interface is hidden

When the user switches to another app, your UI is hidden, so remember to release resources that only your UI uses. Releasing UI resources at this point can significantly increase the number of processes the system can cache, which directly affects the quality of the user experience.

You can be notified when the user exits your UI by implementing the onTrimMemory() callback in your Activity. Use this method to listen for the TRIM_MEMORY_UI_HIDDEN level, which signals that your UI is now hidden and you should release resources used exclusively by it.

Note that onTrimMemory() delivers TRIM_MEMORY_UI_HIDDEN only when all UI components of your app's process become hidden. This differs from the onStop() callback, which runs whenever an activity instance becomes invisible, as happens every time you move between activities within your app. So while you should release some activity resources in onStop(), such as network connections or registered broadcast receivers, you usually should not release UI resources until you receive onTrimMemory(TRIM_MEMORY_UI_HIDDEN). This keeps activity switching within your app fast.

Free memory when memory is tight

As your application moves through its lifecycle, onTrimMemory() also notifies you as the device's overall available memory shrinks. You should release further resources when you receive the following levels:

    • TRIM_MEMORY_RUNNING_MODERATE
      Your app is running and not in danger of being killed, but the device is running low on memory and the system is preparing to kill processes in the LRU cache.

    • TRIM_MEMORY_RUNNING_LOW
      Your app is running and not in danger of being killed, but the device is much lower on memory, so you should release unused resources to improve system performance (which directly affects your app's performance as well).

    • TRIM_MEMORY_RUNNING_CRITICAL
      Your app is still running, but the system has already killed most of the processes in the LRU cache, so you should release all non-critical resources now. If the system cannot reclaim enough RAM, it will clear out the entire LRU cache and begin killing processes it normally prefers to keep, including those hosting running services.

If your app's process is currently cached, onTrimMemory() may instead deliver one of the following levels:

    • TRIM_MEMORY_BACKGROUND
      The system is low on memory and your process is near the beginning of the LRU list. Although it is not yet at high risk of being killed, the system has already begun killing processes in the LRU cache. You should release resources that are easy to recover, so your process stays in the list and resumes quickly when the user returns to your app.

    • TRIM_MEMORY_MODERATE
      The system is low on memory and your process is near the middle of the LRU list. If memory pressure grows, your process may be killed.

    • TRIM_MEMORY_COMPLETE
      The system is critically low on memory, and if it does not recover memory immediately, your process will be among the next to be killed. You should release every non-critical resource needed to resume your app's state.

Because TRIM_MEMORY_COMPLETE was added in API level 14, on lower versions you can use the onLowMemory() callback, which is roughly equivalent to the TRIM_MEMORY_COMPLETE event.

Note: as the system kills processes in the LRU cache, although it mostly works bottom-up, it also considers which processes consume the most memory, since killing them reclaims more RAM. So the less memory your process occupies while sitting in the LRU cache, the better its chances of staying cached and resuming quickly.
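The trim levels above can all be handled in a single onTrimMemory() override. The sketch below is illustrative: exactly what you release at each level depends on your app.

```java
import android.app.Activity;
import android.content.ComponentCallbacks2;

// Illustrative sketch: reacting to trim levels in an Activity.
public class MainActivity extends Activity {
    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        switch (level) {
            case ComponentCallbacks2.TRIM_MEMORY_UI_HIDDEN:
                // The UI is no longer visible: release UI-only resources
                // (cached bitmaps, view-related allocations).
                break;
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
            case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
                // Still running, but system memory is tight:
                // drop non-critical caches, the earlier the better.
                break;
            case ComponentCallbacks2.TRIM_MEMORY_BACKGROUND:
            case ComponentCallbacks2.TRIM_MEMORY_MODERATE:
            case ComponentCallbacks2.TRIM_MEMORY_COMPLETE:
                // Cached in the LRU list: release as much as possible so the
                // process stays cached and the app resumes quickly.
                break;
        }
    }
}
```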

Check how much memory you should use

As mentioned earlier, each Android device has a different amount of total RAM and therefore a different heap limit. getMemoryClass() gives you an estimate, in megabytes, of how much heap space your app has available. If you allocate beyond this limit, an OutOfMemoryError is thrown.
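One common way to use this value is to budget a fraction of it for an in-memory cache. The helper below is a sketch: the one-eighth fraction is a conventional starting point rather than a rule, and the class name is hypothetical.

```java
// Sketch: derive a cache budget from the value returned by
// ActivityManager.getMemoryClass() (megabytes). The 1/8 fraction is a
// conventional choice for e.g. an LruCache of bitmaps, not a requirement.
final class CacheBudget {
    private CacheBudget() {}

    /** Returns a cache size in kilobytes given the heap class in megabytes. */
    public static int cacheSizeKb(int memoryClassMb) {
        if (memoryClassMb <= 0) {
            throw new IllegalArgumentException("memoryClassMb must be positive");
        }
        return memoryClassMb * 1024 / 8;
    }
}
```

On a device reporting a 64MB memory class, this budgets an 8MB cache.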

In very special situations, you can set the largeHeap attribute in the manifest's <application> tag to request a larger heap, in which case getLargeMemoryClass() returns an estimate of the enlarged heap size.
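A minimal sketch of the manifest change; attribute values other than largeHeap are placeholders:

```xml
<!-- Hypothetical manifest fragment: requests a large heap (API 11+).
     Whether a larger heap is actually granted depends on the device. -->
<application
    android:label="@string/app_name"
    android:largeHeap="true">
    ...
</application>
```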

However, requesting a larger heap is intended only for apps that genuinely need more memory (for example, a photo-editing app). Never request a large heap simply because you have run out of memory and need a quick fix; use it only when you know exactly where all of your memory is being allocated and why it must be retained. Even when you are confident your app can justify the large heap, you should avoid it to whatever extent possible. Using the extra memory hurts overall system performance: task switching becomes slower, and garbage collection takes longer.

In addition, the large-heap size varies between devices and, on devices with tight memory, may be exactly the same as the regular heap size. So even when you use this feature, you should still call getMemoryClass() to check the regular heap size and try to stay under that limit.

Avoid wasting memory on bitmaps

When you load a bitmap, load it only at the resolution needed by the current display, scaling it down if the original is higher-resolution. Remember that the higher the bitmap's resolution, the more memory it takes, because both the x and y dimensions grow.
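The scaling decision can be computed as a pure function of the raw and requested dimensions, mirroring the power-of-two inSampleSize typically used with BitmapFactory.Options; the class name here is hypothetical and the decode calls themselves are omitted.

```java
// Sketch: choose a power-of-two inSampleSize so the decoded bitmap is at
// least as large as the requested display size, but no larger than needed.
// The raw dimensions would normally come from a decode pass with
// BitmapFactory.Options.inJustDecodeBounds = true.
final class BitmapScaling {
    private BitmapScaling() {}

    public static int calculateInSampleSize(int rawWidth, int rawHeight,
                                            int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (rawHeight > reqHeight || rawWidth > reqWidth) {
            final int halfHeight = rawHeight / 2;
            final int halfWidth = rawWidth / 2;
            // Keep doubling the sample size while the decoded image would
            // still be at least as large as the requested dimensions.
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }
}
```

For example, decoding a 2048x1536 image for a 512x384 view yields a sample size of 4, so the decoded bitmap holds one sixteenth of the pixels.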

Note: on Android 2.3.x (API level 10) and below, a bitmap object takes the same amount of heap space regardless of resolution, because the actual pixel data is stored separately in native memory. This makes bitmap allocations hard to debug, since most heap-analysis tools cannot see native memory. From Android 3.0 (API level 11) onward, bitmap pixel data is allocated on your app's Dalvik heap, which improves garbage collection and debuggability. So if debugging bitmap memory on old versions proves troublesome, switch to Android 3.0 or later to debug.

For more information on the use of Bitmap, please refer to managing Bitmap Memory.

Use optimized data containers

Take advantage of the optimized containers in the Android framework, such as SparseArray, SparseBooleanArray, and LongSparseArray. The generic HashMap implementation is memory-inefficient because each mapping needs a separate entry object. SparseArray is also more efficient because it avoids autoboxing the key (that is, promoting a primitive type to an object type, such as int to Integer). And don't be afraid to fall back on plain arrays where they fit.

Note: for more about SparseArray, see the earlier article "Replacing HashMap with SparseArray to improve performance".
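To see why these containers save memory, here is a deliberately miniature sketch of the SparseArray idea: sorted primitive-key and value arrays plus binary search, so int keys are never boxed into Integer objects and no per-entry object is allocated. The real android.util.SparseArray is more complete and optimized.

```java
import java.util.Arrays;

// Illustrative miniature of the SparseArray idea, not the real implementation.
class MiniSparseArray<E> {
    private int[] keys = new int[4];       // primitive keys: no boxing
    private Object[] values = new Object[4];
    private int size = 0;

    public void put(int key, E value) {
        int i = Arrays.binarySearch(keys, 0, size, key);
        if (i >= 0) {                      // key already present: overwrite
            values[i] = value;
            return;
        }
        i = ~i;                            // decode the insertion point
        if (size == keys.length) {         // grow both parallel arrays
            keys = Arrays.copyOf(keys, size * 2);
            values = Arrays.copyOf(values, size * 2);
        }
        System.arraycopy(keys, i, keys, i + 1, size - i);
        System.arraycopy(values, i, values, i + 1, size - i);
        keys[i] = key;
        values[i] = value;
        size++;
    }

    @SuppressWarnings("unchecked")
    public E get(int key) {
        int i = Arrays.binarySearch(keys, 0, size, key);
        return i >= 0 ? (E) values[i] : null;
    }

    public int size() { return size; }
}
```

The trade-off is O(log n) lookups and O(n) inserts instead of HashMap's amortized O(1), which is usually a fine deal for the hundreds of entries typical in app code.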

Be mindful of memory overhead

Be aware of the cost and overhead of the language and libraries you use, and keep that information in mind from start to finish when designing your app. Seemingly innocuous constructs often carry significant overhead. For example:

    • Enums often require more than twice as much memory as static constants. You should strictly avoid using enums on Android.
    • Every class in Java (including anonymous inner classes) uses about 500 bytes of code (translator's note: this likely includes the class object itself, which DDMS shows at roughly 300 bytes per class per app, plus other class-related structures such as the constant pool in the virtual machine).
    • Every class instance costs 12-16 bytes of RAM overhead (translator's note: this likely refers to the object header and reference bookkeeping).
    • Putting a single entry into a HashMap requires allocating an entry object that adds an additional 32 bytes (see the previous section, "Use optimized data containers").
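A sketch of the static-constant pattern that replaces an enum; the names are illustrative, and on Android you would often add an @IntDef annotation from the support library for compile-time checking (omitted here to stay dependency-free):

```java
// Sketch: replacing an enum with static int constants. Each enum constant is
// a full object (plus the hidden values() array); the ints below are plain
// fields compiled into the class, with no per-constant object allocation.
final class ConnectionState {
    private ConnectionState() {}

    public static final int STATE_DISCONNECTED = 0;
    public static final int STATE_CONNECTING   = 1;
    public static final int STATE_CONNECTED    = 2;

    public static String describe(int state) {
        switch (state) {
            case STATE_DISCONNECTED: return "disconnected";
            case STATE_CONNECTING:   return "connecting";
            case STATE_CONNECTED:    return "connected";
            default:
                throw new IllegalArgumentException("unknown state: " + state);
        }
    }
}
```

What you give up is the type safety an enum gives for free, which is why the @IntDef annotation is the usual companion to this pattern.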

Use abstractions carefully

Developers often reach for abstractions simply as good programming practice, because abstractions improve code flexibility and maintainability. But abstractions come at a cost: they generally require more code, more execution time, and more RAM to hold that code. So if an abstraction does not deliver a significant benefit, don't use it.

Serialize data with nano protobufs

Protocol buffers is a language-neutral, platform-neutral, extensible mechanism designed by Google for serializing structured data; it is smaller, faster, and simpler than XML. If you use protobufs in a client app, you should use the nano runtime. Regular protobufs generate extremely verbose code, which causes many problems on the client: increased RAM use, larger APK size, slower execution, and quicker exhaustion of the dex symbol limit.

For more about nano protobufs, see the "Nano version" section of the protobuf readme.

Avoid using the Dependency injection framework

Using a dependency injection framework such as Guice or RoboGuice can be appealing because it simplifies your code and provides an adaptive environment that is useful for testing and other configuration changes. However, these frameworks scan your code for annotations during initialization, which can map significant amounts of your code into RAM even though you don't need it. Those pages are mapped into clean memory, so Android can eventually drop them, but only after they have sat in memory for a long time.

Use third-party libraries with care

Much third-party library code is not written for mobile environments and can be inefficient on a mobile device. At the very least, when you decide to use a third-party library, assume you will need to port and optimize it for mobile. Profile its memory usage before committing to it.

Even libraries designed for Android can be risky. For example, one library may use nano protobufs while another uses micro protobufs, leaving you with two protobuf implementations in your app; the same can happen with logging, analytics, image loading, caching, and many other things you would not expect. ProGuard cannot rescue you here, because these dependencies hinge on low-level features the library requires. This becomes especially problematic when a library subclasses framework types such as Activity (which tend to pull in wide swaths of dependencies), or when it uses reflection (which is common, and means you must spend a lot of time manually tweaking ProGuard to keep it working), and so on.

Also, avoid pulling in a whole library for just one or two of its features; you don't want the memory cost and overhead of a large amount of code you never use. If no library precisely matches your needs, it may be best to implement the functionality yourself.

Overall optimization

A number of articles on optimizing your app are collected in Best Practices for Performance, and this is one of them. Some cover CPU optimization; others, like this one, cover memory use and layout.

You should also read Optimizing Your UI, which covers the layout debugging tools, and apply the suggestions from the lint tool to make your app more efficient.

Remove unused code with ProGuard

ProGuard shrinks, optimizes, and obfuscates your code by removing unused code and renaming classes, fields, and methods with semantically obscure names. Using ProGuard makes your code more compact, which also means it takes up less RAM.
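A hypothetical fragment of the kind of ProGuard configuration involved; the keep rules shown are common examples for classes the framework instantiates reflectively, and your project's rules will differ:

```
# Hypothetical proguard-rules.pro fragment: let ProGuard shrink and obfuscate,
# but keep entry points the Android framework creates by name.
-keep public class * extends android.app.Activity
-keep public class * extends android.app.Service
-keepclassmembers class * implements android.os.Parcelable {
    public static final android.os.Parcelable$Creator CREATOR;
}
```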

Use the zipalign tool on the final APK

After building your APK, including signing it with your production certificate, you must run the zipalign tool on it to re-align its contents. Failing to do this can make your app require more RAM, because resources and other files can no longer be memory-mapped directly from the APK.
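Assuming the standard zipalign invocation from the Android build tools, the step looks like this; the file names are placeholders, and 4 is the byte alignment:

```
# Align the signed APK on 4-byte boundaries so resources can be mmapped.
zipalign -v 4 app-unaligned.apk app-aligned.apk

# Verify an existing APK's alignment:
zipalign -c -v 4 app-aligned.apk
```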

Note: Google Play does not accept APKs that have not been processed with zipalign.

Analyze Memory usage

Once you have a reasonably stable build, begin analyzing how much memory your app uses across all stages of its lifecycle. For details, see Investigating Your RAM Usage.

Use multiple processes

If it suits your app, an advanced technique is to split it into multiple processes. But be very careful: most apps should not do this, because doing it wrong can easily increase RAM use rather than reduce it. It is mainly useful for apps that run significant work in the background as well as the foreground and can manage those pieces separately.

For example, a music player is well suited to multiple processes. If the whole app runs in one process, the allocations for its UI must be kept around while music plays, even when the user is in another app and cannot see the player. An app like this is best split into two processes: one for the UI and another for the background playback service.

You can specify a separate process for each app component by declaring the android:process attribute in the manifest file. For example, you can declare a new process named "background" (you can name it anything you like) so that your service runs outside the main process:

<service android:name=".PlaybackService"
         android:process=":background" />

The colon (":") prefix before the process name ensures that the process belongs only to your app.

Before creating a new process, you need to understand its memory implications; in particular, even an empty process with no business logic costs memory. As shown below, an empty process uses about 1.4MB.

adb shell dumpsys meminfo com.example.android.apis:empty

** MEMINFO in pid 10172 [com.example.android.apis:empty] **
                 Pss     Pss  Shared Private  Shared Private    Heap    Heap    Heap
               Total   Clean   Dirty   Dirty   Clean   Clean    Size   Alloc    Free
              ------  ------  ------  ------  ------  ------  ------  ------  ------
  Native Heap       0       0       0       0       0       0    1864    1800      63
  Dalvik Heap     764       0    5228     316       0       0    5584    5499      85
 Dalvik Other     619       0    3784     448       0       0
        Stack      28       0       8      28       0       0
    Other dev       4       0      12       0       0       4
     .so mmap     287       0    2840     212     972       0
    .apk mmap      54       0       0       0     136       0
    .dex mmap     250     148       0       0    3704     148
   Other mmap       8       0       8       8      68       0
      Unknown     403       0     600     380       0       0
        TOTAL    2417     148   12480    1392    4832     152    7448    7299     148

Note: to learn how to read this output, see Investigating Your RAM Usage. The key numbers here are the Private Dirty and Private Clean columns, which show that this process uses almost 1.4MB of non-pageable RAM, plus about 150K of RAM for mapped-in code.

Understanding the footprint of an empty process matters because it grows quickly once your business logic starts. For example, here is the memory use of a process that does nothing but start an activity displaying some text:

** MEMINFO in pid 10226 [com.example.android.helloactivity] **
                 Pss     Pss  Shared Private  Shared Private    Heap    Heap    Heap
               Total   Clean   Dirty   Dirty   Clean   Clean    Size   Alloc    Free
              ------  ------  ------  ------  ------  ------  ------  ------  ------
  Native Heap       0       0       0       0       0       0    3000    2951      49
  Dalvik Heap    1074       0    4928     776       0       0    5744    5658      86
 Dalvik Other     802       0    3612     664       0       0
        Stack      28       0       8      28       0       0
       Ashmem       6       0      16       0       0       0
    Other dev     108       0      24     104       0       4
     .so mmap    2166       0    2824    1828    3756       0
    .apk mmap      48       0       0       0     632       0
    .ttf mmap       3       0       0       0      24       0
    .dex mmap     292       4       0       0    5672       4
   Other mmap      10       0       8       8      68       0
      Unknown     632       0     412     624       0       0
        TOTAL    5169       4   11832    4032   10152       8    8744    8609     134

Now the process has grown almost threefold, to about 4MB, just by displaying some text in the UI. This leads to an important conclusion: if you split your app into multiple processes, only one process should be responsible for the UI. The other processes should avoid any UI, since UI allocations add up quickly (especially once you load bitmaps and other resources), and once the UI has been drawn it is hard to give that memory back.

Also, when running multiple processes, keep your code as lean as possible, because any common runtime overhead is duplicated in each process. For example, if you use enums (though, again, you should not), each process re-creates and re-initializes all that memory, and every abstraction, adapter, temporary, and other overhead is likewise duplicated.

Another issue with multiple processes is the dependencies between them. For example, if you run a content provider in the default process that also hosts your UI, then code in a background process that uses that content provider keeps your UI process resident in RAM as well. If your goal is a background process that can run independently of a heavyweight UI process, it must not depend on content providers or services running in the UI process.
