Understanding Automatic Memory Management
When an object, string, or array is created, the memory required to store it is allocated from a central pool called the heap. When the item is no longer in use, the memory it once occupied can be reclaimed and used for something else. In the past, it was typically up to the programmer to allocate and release these blocks of heap memory explicitly with the appropriate function calls. Nowadays, runtime systems like Unity's Mono engine manage memory for you automatically. Automatic memory management requires less coding effort than explicit allocation and release, and it greatly reduces the potential for memory leakage (the situation where memory is allocated but never subsequently released).
Value and Reference Types
When a function is called, the values of its parameters are copied to an area of memory reserved for that specific call. Data types that occupy only a few bytes can be copied quickly and easily. However, objects, strings, and arrays are often much larger, and it would be very inefficient if data of this kind were copied on a regular basis. Fortunately, this is not necessary: the actual storage space for a large item is allocated from the heap, and a small "pointer" value is used to remember its location. From then on, only the pointer needs to be copied during parameter passing. As long as the runtime system can locate the item identified by the pointer, a single copy of the data can be used as often as necessary.
Types that are stored directly and copied during parameter passing are called value types. These include integers, floating-point numbers, booleans, and Unity's struct types (such as Color and Vector3). Types that are allocated on the heap and then accessed via a pointer are called reference types, since the value stored in the variable merely "refers to" the real data. Examples of reference types include objects, strings, and arrays.
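As a small illustrative sketch (the SetToZero function and the variable names below are invented for this example, not taken from the manual), the difference shows up when a function modifies its parameters:-
// An int parameter is copied, so changes inside the function are invisible
// to the caller; an array parameter only copies the reference, so changes
// affect the caller's data on the heap.
function SetToZero(number: int, values: float[]) {
    number = 0;          // modifies the local copy only
    values[0] = 0.0;     // modifies the shared array on the heap
}

function Start() {
    var count = 10;
    var samples = new float[3];
    samples[0] = 1.5;

    SetToZero(count, samples);

    // count is still 10 here, but samples[0] is now 0.0
    Debug.Log("count = " + count + ", samples[0] = " + samples[0]);
}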
Allocation and Garbage Collection
The memory manager keeps track of the areas of the heap that it knows to be unused. When a new block of memory is requested (say, when an object is instantiated), the manager chooses an unused area to allocate the block from and then removes that memory from the known unused space. Subsequent requests are handled the same way until there is no free area large enough to allocate the required block size. It is highly unlikely at this point that all the memory allocated from the heap is still in use: a reference item on the heap can only be accessed as long as there are still reference variables that can locate it. If all references to a memory block are gone (that is, the reference variables have been reassigned or they are local variables that are now out of scope), then the memory it occupies can safely be reallocated.
To determine which heap blocks are no longer in use, the memory manager searches through all the currently active reference variables and marks the blocks they refer to as "live". At the end of the search, any space between the live blocks is considered empty by the memory manager and can be used for subsequent allocations. For obvious reasons, the process of locating and freeing unused memory is known as garbage collection (or GC for short).
Optimization
Garbage collection is automatic and invisible to the programmer, but the collection process actually requires significant CPU time behind the scenes. When used correctly, automatic memory management generally equals or beats manual allocation in overall performance. However, it is important for the programmer to avoid mistakes that trigger the collector more often than necessary and introduce pauses in execution.
There are some infamous algorithms that can be GC nightmares even though they seem innocent at first glance. Repeated string concatenation is a classic example:-
function ConcatExample(intArray: int[]) {
    var line = intArray[0].ToString();
    for (i = 1; i < intArray.Length; i++) {
        line += ", " + intArray[i].ToString();
    }
    return line;
}
The key detail here is that the new pieces don't get added to the string in place, one by one. What actually happens is that on each pass through the loop the previous contents of the line variable become dead: a whole new string is allocated containing the original piece plus the new part at the end. Since the string gets longer as i increases, the amount of heap space being consumed also increases, so it is easy to use up hundreds of bytes of free heap space every time this function is called. If you need to concatenate many strings together, a much better option is the Mono library's System.Text.StringBuilder class, as sketched below.
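As a rough sketch of how the same function might look with a StringBuilder (the ConcatWithBuilder name is invented for this illustration, not code from the manual):-
import System.Text;

// The builder appends into an internal buffer instead of allocating a
// brand-new string on every pass through the loop, so only the final
// ToString() call produces a new string.
function ConcatWithBuilder(intArray: int[]) {
    var builder = new StringBuilder();
    builder.Append(intArray[0].ToString());
    for (var i = 1; i < intArray.Length; i++) {
        builder.Append(", ");
        builder.Append(intArray[i].ToString());
    }
    return builder.ToString();
}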
However, even repeated concatenation will not cause too much trouble unless it is called frequently, and in Unity "frequently" usually means the frame update. Something like:-
var scoreBoard: GUIText;
var score: int;
function Update() {
    var scoreText: String = "Score: " + score.ToString();
    scoreBoard.text = scoreText;
}
will allocate new strings each time Update is called and generate a constant trickle of new garbage. Most of that can be avoided by updating the text only when the score changes:-
var scoreBoard: GUIText;
var scoreText: String;
var score: int;
var oldScore: int;
function Update() {
    if (score != oldScore) {
        scoreText = "Score: " + score.ToString();
        scoreBoard.text = scoreText;
        oldScore = score;
    }
}
Another potential problem occurs when a function returns an array value:-
function RandomList(numElements: int) {
    var result = new float[numElements];
    for (i = 0; i < numElements; i++) {
        result[i] = Random.value;
    }
    return result;
}
A function like this is very elegant and convenient when you need to create a new array filled with values. However, if it is called repeatedly, fresh memory will be allocated each time. Since arrays can be very large, the free heap space can get used up rapidly, resulting in frequent garbage collections. One way to avoid this problem is to exploit the fact that an array is a reference type: an array passed into a function as a parameter can be modified within that function, and the results remain after the function returns. A function like the one above can often be replaced with something like:-
function RandomList(arrayToFill: float[]) {
    for (i = 0; i < arrayToFill.Length; i++) {
        arrayToFill[i] = Random.value;
    }
}
This simply replaces the existing contents of the array with new values. Although it requires the initial allocation of the array to be done in the calling code (which looks slightly inelegant), the function will not generate any new garbage when it is called.
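For illustration, the calling code might look something like the sketch below (the randomValues field and the array size are made up for this example):-
// The array is allocated once in Start and then refilled in place on each
// call, so RandomList creates no per-call garbage.
var randomValues: float[];

function Start() {
    randomValues = new float[100];   // the one-off allocation lives here
}

function Update() {
    RandomList(randomValues);        // reuses the same array every frame
}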
Requesting a Collection
As mentioned above, it is generally best to avoid allocations. However, given that they cannot be completely eliminated, there are two main strategies you can use to minimise their intrusion into gameplay:-
Small heaps with fast and frequent garbage collection
This strategy is often best for games that have long periods of gameplay where a smooth frame rate is the main concern. A game like this will typically allocate small blocks frequently, but those blocks are only in use briefly. The typical heap size when using this strategy on iOS is about 200 KB, and garbage collection takes about 5 ms on an iPhone 3G. If the heap grows to 1 MB, a collection takes about 7 ms. It can therefore be advantageous to request a garbage collection at a regular frame interval. This generally makes collections happen more often than strictly necessary, but they are processed quickly and with minimal effect on gameplay:-
if (Time.frameCount % 30 == 0)
{
    System.GC.Collect();
}
However, you should use this technique with caution and check the profiler statistics to make sure that it really is reducing collection time for your game.
Large heap with slow but infrequent garbage collection
This strategy works best for games where allocations (and therefore collections) are relatively infrequent and can be handled during pauses in gameplay. It helps for the heap to be as large as possible without being so large that your app gets killed by the OS due to low system memory. However, the Mono runtime avoids expanding the heap automatically whenever it can. You can expand the heap manually by preallocating some placeholder space during startup (that is, you instantiate a "useless" object purely for its effect on the memory manager):-
function Start() {
    var tmp = new System.Object[1024];
    // allocate in smaller blocks so they are not treated in the special way
    // reserved for large blocks
    for (var i : int = 0; i < 1024; i++)
        tmp[i] = new byte[1024];
    // release the reference
    tmp = null;
}
A sufficiently large heap should not become completely filled between the pauses in gameplay that can accommodate a collection. When such a pause occurs, you can request a collection explicitly:-
System.GC.Collect();
Again, you should use this strategy with care and pay attention to the profiler statistics, rather than simply assuming it is having the desired effect.
Reusable Object Pools
In many cases you can avoid generating garbage simply by reducing the number of objects that get created and destroyed. There are certain types of objects in games, such as projectiles, that may be encountered over and over again even though only a small number will ever be in play at once. In cases like this it is often possible to reuse objects, rather than destroying old ones and replacing them with new ones, as sketched below.
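As a rough sketch of the idea (the Bullet class and the Spawn/Recycle functions are invented for this illustration and are not part of the Unity API), a fixed pool can be filled once and its entries reused:-
// Every Bullet is allocated up front and marked free or in use, so firing
// and removing bullets creates no new garbage during gameplay.
class Bullet {
    var position: Vector3;
    var inUse: boolean = false;
}

var pool: Bullet[];

function Start() {
    pool = new Bullet[64];
    for (var i = 0; i < pool.Length; i++) {
        pool[i] = new Bullet();      // all allocation happens here, once
    }
}

function Spawn(startPosition: Vector3): Bullet {
    for (var i = 0; i < pool.Length; i++) {
        if (!pool[i].inUse) {
            pool[i].inUse = true;
            pool[i].position = startPosition;
            return pool[i];          // hand back an existing instance
        }
    }
    return null;                     // pool exhausted; the caller decides what to do
}

function Recycle(bullet: Bullet) {
    bullet.inUse = false;            // mark the slot free instead of destroying the object
}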
Further Information
Memory management is a subtle and complex subject that has received a great deal of academic attention. If you are interested in learning more about it, memorymanagement.org is an excellent resource, listing many publications and online articles. Further information about object pooling can be found on its Wikipedia page and also at Sourcemaking.com.
What is automatic memory management in C#?
Automatic memory management is one of the services that the Common Language Runtime provides during managed execution. The Common Language Runtime's garbage collector manages the allocation and release of memory for an application. This means developers do not have to write code to perform memory management tasks when they build managed applications. Automatic memory management eliminates common problems such as forgetting to free an object and causing a memory leak, or attempting to access memory belonging to an object that has already been freed.
How does automatic memory management work in Java?
1. Java memory management is about object allocation and release.
In Java, the programmer requests memory for each object with the keyword new (primitive types aside), and all objects are allocated on the heap.
The release of an object is decided and carried out by the GC.
So memory allocation is done by the program while memory release is handled by the GC. This division of labour simplifies the programmer's work, but it also adds to the JVM's workload, which is one of the reasons Java programs run relatively slowly.
How the GC frees space:
It monitors the state of every object, including its allocation, the references it holds, the references to it, and assignments, and it releases an object once that object is no longer referenced.
2. Memory Management Structure
Java uses a directed graph for memory management: at every moment of a program's execution there is a directed graph describing the JVM's memory allocation.
Objects are the vertices of this directed graph and reference relationships are its directed edges, each edge pointing from the referrer to the referenced object. In addition, every thread object can serve as a starting vertex (a root) of the graph. For example, most programs start from the main thread, so the graph is a tree rooted at the main thread's vertex. In this directed graph, any object reachable from a root vertex is a valid object and will not be reclaimed by the GC. If an object (or a connected subgraph of objects) is unreachable from every root vertex (note that the graph is directed), it is considered no longer referenced and can be reclaimed by the GC.
3. Advantages and Disadvantages of memory management using Directed Graphs
Managing memory with a directed graph lets Java handle the problem of reference cycles. For example, if three objects reference one another, the GC can still reclaim them as long as they are unreachable from the roots.
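As a small illustrative sketch (the Node class below is invented for this example), a pair of objects that reference each other still becomes collectible once no root can reach them:
// Two objects form a reference cycle, but once the local variables that
// reach them are cleared, the cycle is unreachable from any GC root and
// can be reclaimed despite the mutual references.
class Node {
    Node partner;
}

public class CycleExample {
    public static void main(String[] args) {
        Node a = new Node();
        Node b = new Node();
        a.partner = b;
        b.partner = a;   // a and b now reference each other

        a = null;
        b = null;        // no root variable reaches the cycle any more

        System.gc();     // only a hint; the JVM decides when to actually collect
    }
}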
The advantage of this approach is that memory management is very precise; the drawback is lower efficiency.
Another common memory management technique is reference counting. The COM model, for example, manages its components with reference counts. Compared with the directed-graph approach it is less precise (it has difficulty handling circular references), but it is more efficient to execute.
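For comparison, here is a hand-rolled sketch of the counting idea (the RefCounted class below is hypothetical, not the actual COM API):
// Every new holder of a reference calls addRef() and every holder that is
// done calls release(); the bookkeeping is cheap and immediate, but two
// objects holding references to each other never reach a count of zero,
// which is exactly the circular-reference weakness noted above.
class RefCounted {
    private int count = 0;

    void addRef() {
        count++;
    }

    void release() {
        count--;
        if (count == 0) {
            free();              // reclaim as soon as the last reference goes away
        }
    }

    private void free() {
        // resource cleanup would happen here
    }
}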
★Java memory leaks
Even though the GC reclaims memory automatically, memory leaks still exist in Java; they are simply a smaller problem than in C++.
1. Comparison with C++
In C++, all objects must be allocated and released by the programmer, so the programmer has to manage both the vertices and the edges of the graph. If an unreachable object exists, the memory allocated to it can never be reclaimed, which is a memory leak; useless objects that are still referenced likewise amount to leaked memory.
Java uses the GC to reclaim memory: the GC recovers the space occupied by unreachable objects. So the memory leaks to worry about in Java are mainly objects that are still referenced but no longer useful; in other words, only the edges need to be managed. A referenced-but-useless object is one the program still holds a reference to even though it will never use it again, so the memory it occupies is wasted.
As long as an object is referenced, it is considered "live" and will not be released.
2. Handling Java memory leaks
To deal with a Java memory leak, make sure the object really is no longer in use.
Typical practices:
Set the object's data members to null.
Remove the object from any collection that holds it.
Note: when local variables are no longer needed, you do not have to set them to null explicitly, because such references are cleared automatically once the method finishes executing.
Example:
List myList = new ArrayList();
for (int i = 1; i < 100; i++)
{
    Object o = new Object();
    myList.add(o);
    o = null;
}
// At this point none of the Object instances can be released, because the variable myList still references them.
If myList is no longer needed, set it to null so that all the objects it references are released; the GC can then reclaim the memory those objects occupy.
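Continuing the example above, the fix is a single assignment:
// Dropping the last reference makes the list and every Object it holds
// eligible for garbage collection.
myList = null;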
★GC operation
The program cannot precisely control when the GC runs or how it manages memory.