What are the areas of Android memory?


What are the areas of memory?

============

Memory (in C/C++) is divided into five large areas:

1. Stack: allocated and released automatically by the compiler; it stores function parameter values, local variables, and so on. It operates much like the stack data structure.
2. Heap: generally allocated and released by the programmer; if the programmer does not release it, it may be reclaimed by the OS when the program ends. Note that it is not the same as the heap data structure; its allocation works more like a linked list.
3. Global/static storage: global variables and static variables are stored together; initialized globals and statics sit in one area, and uninitialized globals and statics sit in an adjacent area. This storage is released by the system after the program finishes.
4. Literal constant storage: constant strings are placed here, in read-only memory (ROM); it is released by the system after the program ends.
5. Program code area: holds the binary code of function bodies.

Example program (written by a predecessor, very detailed):

    #include <stdlib.h>
    #include <string.h>

    int a = 0;               /* global initialized area */
    char *p1;                /* global uninitialized area */

    int main(void)
    {
        int b;                    /* stack */
        char s[] = "abc";         /* stack */
        char *p2;                 /* stack */
        char *p3 = "123456";      /* "123456\0" is in the constant area; p3 is on the stack */
        static int c = 0;         /* global (static) initialized area */
        p1 = (char *)malloc(10);
        p2 = (char *)malloc(20);  /* the 10- and 20-byte areas are allocated on the heap */
        strcpy(p1, "123456");     /* "123456\0" is in the constant area; the compiler may
                                     merge it with the "123456" that p3 points to */
        return 0;
    }
Three: the difference between heap and stack

From the definitions, the most obvious difference is the allocation policy: one is manual, the other automatic. An analogy: using the stack is like going to a restaurant for a meal. You just order (make the request), eat (use it), and leave when you are done; you do not have to bother with preparation such as washing and cutting the vegetables, or with cleanup such as washing the dishes and scrubbing the pots. The advantage is speed, but there is little freedom. Using the heap is like cooking your favorite dish yourself: more trouble, but it matches your own taste and gives you great freedom.

Differences between the stack and the heap: basic-type variables and object reference variables defined in a method body (local variables) are allocated in the method's stack memory. When a variable is defined in a method block, Java allocates memory for it on the stack; when the variable goes out of scope it becomes invalid, the memory allocated to it is freed, and that memory can be reused. Heap memory holds all objects created with new (including all member variables of the object) and all arrays. Memory allocated in the heap is managed automatically by the Java garbage collector. After creating an array or an object in the heap, you can also define a special variable on the stack whose value equals the starting address of that array or object in heap memory; this special variable is the reference variable mentioned above. We can use the reference variable to access the object or array in the heap.
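As a minimal, hypothetical Java sketch of this split (the class and variable names below are invented for illustration): primitive locals live on the stack, new objects live in the heap, and references on the stack point at them.

    // Hypothetical illustration: where locals, references, and objects live.
    public class StackHeapDemo {
        public static void main(String[] args) {
            int count = 3;                      // primitive local: value stored on the stack
            int[] numbers = new int[10];        // array object: allocated in the heap
            StringBuilder sb = new StringBuilder("abc"); // object in the heap
            // 'numbers' and 'sb' are reference variables on the stack that hold
            // the heap addresses of the array and the object.
            sb.append(count);
            System.out.println(sb + ", length=" + numbers.length);
        }   // when main() returns, its stack frame (count, numbers, sb) is freed;
            // the heap objects become unreachable and are eligible for GC
    }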

Android memory mechanism: learning about the Android heap and stack

1. Dalvik heap and stack

This covers only the Dalvik (Java) part of memory; besides the Dalvik part there is also native memory.

The following describes the data types listed above; only by knowing where the data we request lives can we control our own programs better.

2. Object instance data

This is where the object instance's properties, the types of those properties, the type tag of the object itself, and so on are held; it does not hold the instance's methods. The instance methods are data instructions and are kept in the stack; they are the class methods in the table above.

After the object instance is allocated in the heap, a 4-byte heap memory address is saved on the stack so that the object instance can be found. Because an instance in the heap is used through the stack, a this pointer needs to be passed in, especially when invoking the instance's methods.

3. Method Internal variables

The internal variables of a class method fall into two cases: variables of simple (primitive) types are saved on the stack; for object types, the address (reference) is saved on the stack while the value is saved in the heap.

4. Non-static methods and static methods

A non-static method has an implicit incoming parameter that is passed in by the Dalvik virtual machine: the address pointer of the object instance, held on the stack. Therefore a non-static method (the instruction code on the stack) can always find its own private data (the object's property values in the heap).

A non-static method must also receive this implicit argument, so before the call an object instance must have been created with new to obtain the address pointer on the stack; otherwise the Dalvik virtual machine cannot pass the implicit parameter to the non-static method.

A static method has no implicit parameter and therefore does not require a new object; it can be called as soon as the class file has been loaded into the JVM by the ClassLoader. So we can call a class's static methods directly through the class name. Of course, a static method then cannot access object properties in the heap.
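A short, hypothetical sketch (the Counter class is invented for illustration): a static method can be called through the class name alone, while a non-static method needs an instance whose address becomes the implicit this.

    // Hypothetical illustration: static vs. non-static method calls.
    public class Counter {
        private int value;                      // instance property, lives in the heap

        public void increment() {               // non-static: receives an implicit 'this'
            this.value++;                       // 'this' points at the instance in the heap
        }

        public static Counter create() {        // static: no 'this', no instance required
            return new Counter();
        }

        public static void main(String[] args) {
            Counter c = Counter.create();       // static method called via the class name
            c.increment();                      // instance method needs an object reference
        }
    }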

5. Static properties and Dynamic Properties

Static properties are saved on the stack, unlike dynamic (instance) properties, which are stored in the heap. Because everything is on the stack, and instructions and data on the stack are fixed-length, it is easy to compute offsets, so class methods (both static and non-static) can access the class's static properties. And because a static property is stored on the stack, it has a global character.
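A small, hypothetical sketch of that global character (the Widget class is invented for illustration): every instance sees the same static property, while each instance keeps its own copy of an instance property.

    // Hypothetical illustration: a static property is shared; instance properties are per-object.
    public class Widget {
        static int createdCount = 0;   // one copy for the whole class
        int id;                        // one copy per instance, stored with the object in the heap

        Widget() {
            createdCount++;
            this.id = createdCount;
        }

        public static void main(String[] args) {
            Widget a = new Widget();
            Widget b = new Widget();
            // Both instances observe the same static value.
            System.out.println(Widget.createdCount);  // 2
            System.out.println(a.id + ", " + b.id);   // 1, 2
        }
    }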

6. Summary

The Java heap is a run-time data area in which objects are allocated space. These objects are created through instructions such as new, newarray, anewarray, and multianewarray, and they do not need to be explicitly released by program code.

The heap is managed by garbage collection. Its advantage is that memory can be allocated dynamically and the lifetime does not have to be told to the compiler in advance; because memory is allocated dynamically at run time, the Java garbage collector automatically reclaims data that is no longer in use. The disadvantage is slower access, precisely because memory is allocated dynamically at run time.

The advantage of the stack is that access is faster than the heap, second only to registers, and stack data can be shared. The disadvantage is that the size and lifetime of data on the stack must be deterministic, which makes it inflexible. The stack mainly holds variables of the basic types (int, short, long, byte, float, double, boolean, char) and object handles (references).

Comparing the analysis above, Java handles the heap and the stack in roughly the same way C++ does. The difference is the memory reclamation mechanism, which frees programmers from having to call delete themselves. Just as in C++, memory requested with new is usually placed on the heap, and ordinary temporary variables are placed on the stack.

7. The default memory size allocated to an app

In Android, a program's memory is divided into two parts: native and Dalvik. Dalvik is the memory our normal Java code uses, that is, the heap and stack memory analyzed above.

The objects we create are allocated here, and the memory limit is that native + Dalvik together cannot exceed the maximum value.

An Android program's memory is generally limited to 16 MB, on some devices 24 MB (the early G1 device had only 16 MB). The default size depends on the settings of the customized system and can be found in init.c in the Linux initialization code. Interested readers can analyze the virtual machine startup code.

    gDvm.heapSizeStart = 2*1024*1024;    // initial heap size is 2 MB
    gDvm.heapSizeMax   = 16*1024*1024;   // maximum heap size is 16 MB
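These values are baked into the platform. As a hedged sketch of how to observe the resulting limits from inside an app at run time (the HeapLimits class below is a made-up illustration and assumes a valid Context):

    import android.app.ActivityManager;
    import android.content.Context;
    import android.util.Log;

    // Sketch: querying the per-app heap limits at run time.
    public final class HeapLimits {
        public static void log(Context context) {
            ActivityManager am =
                    (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
            // Per-app Dalvik heap limit, in MB, as configured on this device.
            int heapMb = am.getMemoryClass();
            // Maximum heap the current runtime will try to use, in bytes.
            long maxBytes = Runtime.getRuntime().maxMemory();
            Log.i("HeapLimits", "memoryClass=" + heapMb + " MB, maxMemory="
                    + (maxBytes / (1024 * 1024)) + " MB");
        }
    }

If the manifest declares android:largeHeap="true", ActivityManager.getLargeMemoryClass() reports the larger cap that applies instead.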
8. How the Android GC reclaims memory

An Android application's memory leak has little impact on other applications. To let Android apps run safely and quickly, each Android application uses its own Dalvik virtual machine instance, forked ("hatched") from the Zygote service process; in other words, each application runs in its own process.

Android assigns different memory usage caps to different types of processes. If a memory leak during execution causes the application process to use more memory than its cap, the system treats it as a memory leak and kills the process, so only that process is killed and other processes are not affected (unless the problem is in a system process such as system_process, which can cause the system to restart).

For application development, you need to understand how the system's GC (garbage collection) mechanism works. Android uses a graph traversal as the mechanism for deciding which memory to reclaim.

Java treats each reference relationship as a directed edge of the graph, pointing from the referrer to the referenced object. Thread objects can serve as the starting vertices of the directed graph, each forming a tree rooted at that starting vertex; objects reachable from a root vertex are valid objects, and the GC does not reclaim them. If an object (or a connected subgraph of objects) is unreachable from the root vertices (note that the graph is directed), then that object (or those objects) is considered no longer referenced and can be reclaimed by the GC.

So for an object we no longer need, we can set its reference to null; when the GC runs and traverses the graph, it finds that the object has no references and automatically reclaims the memory it occupies. We cannot release unwanted memory immediately the way C++ can, but we can proactively tell the system which memory can be reclaimed.
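As a minimal, hypothetical sketch (the BitmapHolder class and its cache field are invented for illustration), dropping the last reference is what makes the buffer collectable:

    // Sketch: dropping the last reference so the GC can reclaim the object.
    public class BitmapHolder {
        private byte[] cache = new byte[4 * 1024 * 1024];  // large buffer in the heap

        public void release() {
            cache = null;   // no edge in the reference graph points at the buffer now,
                            // so it becomes eligible for collection on the next GC pass
        }
    }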

9. Viewing an app's memory usage

Let's look at how to check our program's memory usage during development. We can view it with an adb command:

    adb shell dumpsys meminfo $package_name    # or: adb shell dumpsys meminfo $pid
    # $package_name: the app's package name
    # $pid: the app's process id, which can be looked up with the ps command
Dumping meminfo for a package shows a lot of information, but what we mainly care about is the usage of native and Dalvik. The Android kernel is based on Linux, and Linux, unlike Windows, is particularly keen to use free system memory to cache data or to share data between processes. Following the principle that memory should not sit idle, Linux tries to use system memory to speed up our applications. Of course, if an application needs a large amount of memory, the system can also release a certain amount of cached memory immediately; that is handled by the system's internal scheduling. So, strictly speaking, it is hard to compute the exact memory size of a process under Linux. And because of paging out to disk, if you add up all the memory mapped into a process, the total may even be larger than the physical memory actually used. The main fields are:

    • Dalvik: the memory used by the Dalvik heap.
    • Native: the memory used by the native heap; it should refer to memory allocated on the heap from C/C++.
    • Other: memory used besides Dalvik and Native. What exactly it covers is unclear; it includes at least non-heap memory allocated in C/C++, such as memory allocated on the stack. Puzzle!
    • Pss: the process's memory usage computed by charging each sharing process a proportional share of the shared memory. It is also described as counting a proportion of shared library memory, i.e. the inter-process sharing mentioned above.
    • Private Dirty: the size of memory that is not shared and cannot be paged out to disk. For example, small objects that Linux buffers to speed up allocation: even if your process ends, that memory is not released, it just goes back into the buffer.
    • Shared Dirty: by analogy with Private Dirty, it should mean the size of memory that is shared and cannot be paged out to disk. For example, small objects that Linux buffers to speed up allocation: even if all the processes sharing it end, that memory is not released, it just goes back into the buffer.
10. Obtaining memory information from within the program

The relevant information can be obtained through ActivityManager; here is some example code:

        // Assumes this method lives in an Activity (for getSystemService), with
        // ActivityManager and Log imported and a TAG constant defined.
        private void displayBriefMemory() {
            final ActivityManager activityManager =
                    (ActivityManager) getSystemService(ACTIVITY_SERVICE);
            ActivityManager.MemoryInfo info = new ActivityManager.MemoryInfo();
            activityManager.getMemoryInfo(info);
            Log.i(TAG, "System remaining memory: " + (info.availMem >> 10) + " K");
            Log.i(TAG, "Is the system running low on memory: " + info.lowMemory);
            Log.i(TAG, "The system is considered low on memory when remaining memory drops below " + info.threshold);
        }
In addition, more detailed information can be obtained through Debug.getMemoryInfo(Debug.MemoryInfo memoryInfo), as detailed as what we see in the adb shell.

=== Memory optimization tips

    • Use WeakReference where appropriate (see the sketch after this list)
    • Release resources in exception handling paths
    • Stop all animations and related threads when the interface is not visible
    • Use frame-by-frame animation sparingly; when you really need it, draw with a SurfaceView
    • An asynchronous task queue must be a bounded queue
    • When reading SQLite, try not to do it on the UI thread, even though doing so throws no exception
    • Use LeakCanary
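As a hedged sketch of the WeakReference tip above (the ThumbnailCache class is invented for illustration), a weak reference lets the GC reclaim cached data when nothing else strongly references it:

    import java.lang.ref.WeakReference;

    public class ThumbnailCache {
        // A weak reference does not keep the data alive on its own:
        // once no strong reference remains, the GC may reclaim it.
        private WeakReference<byte[]> thumbnail;

        public void put(byte[] data) {
            thumbnail = new WeakReference<>(data);
        }

        public byte[] get() {
            // May return null if the GC has already reclaimed the data.
            return thumbnail != null ? thumbnail.get() : null;
        }
    }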
======= The concept of the stack and memory areas in the JVM

The terms "heap" and "stack" exist both in data structures and in the JVM, and they mean completely different things in the two contexts.
In data structures there are both a heap and a stack. The heap is a complete binary tree whose elements are ordered: every parent node has a size relationship with its child nodes. If every parent is larger than its children it is a max-heap; if every parent is smaller than its children it is a min-heap. Building a heap is a sorting process, and querying a heap is very efficient. A stack is a first-in, last-out linear list.
The stack and heap areas of memory in the JVM are not the same as the stack and heap of data structures. The JVM architecture consists of several major subsystems and memory areas:

The class loading subsystem, which is responsible for loading classes from the file system into memory.

The GC subsystem, whose main job is garbage collection: it automatically reclaims the memory of objects that the running program no longer references, and it may also move objects that are still in use in order to reduce heap fragmentation.

The memory areas, which store bytecode, objects created while the program runs, parameters passed to methods, return values, local variables, and intermediate calculation results.

----

1. The stack and the heap are both places Java uses to store data in RAM. Unlike C++, Java manages the stack and heap automatically; programmers cannot directly set up the stack or the heap.

2. The advantage of the stack is that access is faster than the heap, second only to the registers inside the CPU. The disadvantage is that the size and lifetime of data on the stack must be deterministic, which makes it inflexible. In addition, stack data can be shared (see point 3). The advantage of the heap is that memory can be allocated dynamically and the lifetime does not have to be told to the compiler in advance; the Java garbage collector automatically collects data that is no longer in use. The disadvantage is slower access, because memory is allocated dynamically at run time.

======== Java program run-time data areas

To execute a Java program, the Java virtual machine divides the memory it manages into a number of different data areas (a virtual machine instance is started whenever a Java program is run).

The method area and the heap are shared by all threads; for example with a ThreadPoolExecutor, when multiple threads are created, the heap and the method area can be read by all of those threads.
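A small, hypothetical sketch of that sharing (the SharedHeapDemo class is invented for illustration): worker threads each have their own stack, but they all operate on the same object in the heap and the same class data in the method area.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.atomic.AtomicInteger;

    public class SharedHeapDemo {
        // One object in the heap; the static field belongs to the class data.
        static final AtomicInteger shared = new AtomicInteger();

        public static void main(String[] args) {
            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (int i = 0; i < 4; i++) {
                // Every worker thread gets its own stack, but they all
                // read and write the same 'shared' instance in the heap.
                pool.execute(() -> shared.incrementAndGet());
            }
            pool.shutdown();
        }
    }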

Program counter (程序计数器): anyone who has studied computer organization knows that the CPU's registers include a PC register that stores the address of the next instruction. The virtual machine does not use the CPU's program counter; instead it sets aside an area in memory to simulate one. A single program counter is not enough: when multiple threads switch execution, one counter cannot cope, so the virtual machine specification states that each thread has its own program counter. Note that the program counter in the Java virtual machine points to the address of the bytecode currently being executed, not the next one.

The virtual machine stack (虚拟机栈) is thread-private, and its life cycle is the same as the thread's. The virtual machine stack describes the memory model of Java method execution: each method execution creates a stack frame (which can be thought of as a snapshot recording some parameters on entry to the method; it is really the underlying data structure of the method at run time) used to store the local variable table, the operand stack, the dynamic link, the method exit, and other information. Each method, from invocation to completion, corresponds to one stack frame being pushed onto and popped off the virtual machine stack. We usually divide memory into heap memory and stack memory; the "stack memory" refers to the local variable table portion of the virtual machine stack. The local variable table holds the basic data types known at compile time, object references, and the return address of the bytecode to jump back to.
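A tiny illustrative sketch (not from the original article) of "one stack frame per invocation": unbounded recursion keeps pushing frames onto the current thread's virtual machine stack until it overflows.

    // Each recursive call pushes a new stack frame (locals + operand stack) onto
    // the current thread's virtual machine stack.
    public class StackDepthDemo {
        static int depth = 0;

        static void recurse() {
            depth++;
            recurse();
        }

        public static void main(String[] args) {
            try {
                recurse();
            } catch (StackOverflowError e) {
                System.out.println("Stack overflowed after about " + depth + " frames");
            }
        }
    }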

The native method area (本地方法区) plays a role similar to the virtual machine stack (虚拟机栈), but note that the virtual machine specification does not mandate what form the methods in the native method area take; a virtual machine is free to implement it, that is, they need not be bytecode. They can also be bytecode, in which case the virtual machine stack and the native method area can be merged; in fact the HotSpot virtual machine (HotSpot虚拟机) shipped with OpenJDK and the Sun JDK merges the virtual machine stack and the native method area into one.

The heap: this concept should be familiar to many people; for example, at the start of a C course the teacher will say that malloc allocates space in the heap, and it is the same here. This area holds object instances; almost all object instances are allocated memory here, and the virtual machine specification says that all object instances and arrays are allocated on the heap. With the development of JIT (just-in-time) compilation, however, allocation on the stack is sometimes possible (I do not fully understand the reasoning here). The heap is the main area managed by the Java garbage collector (it is often called the GC heap, which is not a garbage dump), and the garbage collector automatically destroys objects here.

The method area (方法区) is also shared by all threads; it stores data such as class information that has been loaded by the virtual machine, constants, static variables, and compiled code (class methods). It also contains the run-time constant pool, a part of the method area that holds the various literals and symbolic references generated at compile time (in practice, the wrapper types of the eight basic types and String data).
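A small, hypothetical sketch of the string literal pool in action: identical literals resolve to one pooled instance, while new String() always creates a separate heap object.

    // String literals are resolved through the constant pool, so identical
    // literals share one instance; 'new' always creates a fresh heap object.
    public class ConstantPoolDemo {
        public static void main(String[] args) {
            String a = "123456";              // from the constant pool
            String b = "123456";              // same pooled instance
            String c = new String("123456");  // new object in the heap

            System.out.println(a == b);           // true  (same reference)
            System.out.println(a == c);           // false (different objects)
            System.out.println(a.equals(c));      // true  (same contents)
            System.out.println(a == c.intern());  // true  (intern() returns the pooled instance)
        }
    }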

Creation of objects

In an object-oriented language we create objects through the new keyword all the time, so what does that process actually involve?

When the virtual machine encounters a new instruction, it first checks whether the class has already been loaded. Where does it check? In the method area, of course, since the method area stores the information of loaded classes. If the class has not been loaded, class loading is performed first.

After the class loading check, the virtual machine begins to allocate memory for the new object. The amount of memory the object requires is already determined once the class has been loaded, so the virtual machine only needs to carve out that much space in the heap. There are two ways to allocate it. First, if we assume the used memory is perfectly compact, we only need a pointer marking the boundary between used and free memory; each allocation just moves the pointer into the free space by the required distance. Second, if free and used memory are interleaved, which is actually the case, a list is needed to record how heap memory is used; this is also how the operating system manages memory.

We also have to consider a problem: in a multithreaded situation with only one pointer, how do we guarantee that while one thread is allocating memory the pointer is not modified by another thread allocating at the same time, overwriting the earlier allocation? One approach is to pre-allocate a small chunk of heap memory for each thread (a TLAB, thread-local allocation buffer), so that each thread allocates only within its own chunk.

Finally, the object has been allocated memory successfully. We know that given an object we can obtain its class through the getClass() method, and that the default comparison of two objects actually compares hash values tied to the objects; how is that implemented? In fact, after allocating the memory, the virtual machine makes the necessary settings on the object: information such as the object's class and the object's hash code is stored in the object header. That is why the allocated memory is never just the sum of the sizes of the properties.
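A brief illustrative sketch (not from the original article) of the header-backed information as it is visible from Java code:

    // The class pointer and identity hash kept in the object header back
    // getClass() and the default hashCode()/equality behaviour.
    public class HeaderDemo {
        public static void main(String[] args) {
            Object a = new Object();
            Object b = new Object();

            System.out.println(a.getClass());                // class java.lang.Object
            System.out.println(System.identityHashCode(a));  // identity hash kept for the object
            System.out.println(a.equals(b));                 // false: default equality is identity
            System.out.println(a.equals(a));                 // true
        }
    }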

Memory layout of the object

The layout of an object in the heap is divided into three regions: the object header, the instance data, and the alignment padding.

    • The object header consists of two parts. The first part stores the object's own run-time data, such as the GC flag bits, the number of minor GCs survived, the hash code, the lock state, and which thread holds the lock; it is called the Mark Word. The second part holds a pointer to the class data in the method area. On a 32-bit system the class pointer is 4 bytes and the Mark Word is 4 bytes; on a 64-bit system the Mark Word is 8 bytes.

    • The instance data holds the class's property information, including the parent class's property information. For an array, the instance part also includes the length of the array. The instance information is aligned to 4 bytes.

    • Alignment padding: the virtual machine requires that an object's start address be an integer multiple of 8 bytes; the padding itself has no special meaning.

Access positioning of objects

We know that a reference is one thing and an object instance is another. The reference is stored in the virtual machine stack with the data type reference, and the object instance is stored in the heap. So how does a reference point to an object instance?

There are two main access methods. The first uses a handle pool (句柄池): a portion of the Java heap (java堆) is set aside as the handle pool, and each handle contains a type pointer to the object's type information in the method area and an instance pointer to the instance's address in the heap.

The second is that the reference points directly at the object instance in the heap, and the object header of the instance holds the pointer to the object's type.

Both methods have advantages. With the first, when the object instance is moved during GC, only the instance pointer in the handle pool needs to change; the reference itself does not. The second gives faster access, saving the time cost of one extra pointer lookup. The HotSpot virtual machine (HotSpot虚拟机) currently uses the second approach.

=====

    • The memory in Java is divided into the following four sections:

      ① code area  ② stack area  ③ heap area  ④ static area

    • Stack area: automatically allocated and released by the compiler; stores function parameter values, local variables, and so on. The JVM automatically frees this memory after the method finishes executing.
    • Heap area: generally allocated by the programmer (with new) and released by the GC; stores newly allocated objects and arrays. The JVM inspects these objects from time to time and reclaims an object when no reference points to it.
    • Static area: stores global variables, static variables, and string constants; it is not released while the program runs.
    • Code area: stores the binary code of the program's methods; multiple objects share one code-space region.

==========
