Today, while experimenting with OOM and the four types of Java references, the JVM reported an error while running: java.lang.OutOfMemoryError: GC overhead limit exceeded
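For reference, here is a minimal sketch that tends to trigger this error. The class name GcOverheadDemo is hypothetical, not the code I was actually running; the idea is simply to keep many small, still-reachable objects alive in a small heap so that each collection reclaims almost nothing.

import java.util.HashMap;
import java.util.Map;

// Hypothetical reproduction: run with a small heap, e.g.
//   java -Xmx32m GcOverheadDemo
// Every entry stays reachable, so once the heap is nearly full the
// collector runs back-to-back while freeing less than 2% each time,
// which tends to trip the GC overhead limit check.
public class GcOverheadDemo {
    public static void main(String[] args) {
        Map<Integer, String> retained = new HashMap<Integer, String>();
        int i = 0;
        while (true) {
            retained.put(i, String.valueOf(i));
            i++;
        }
    }
}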
This error is rarely encountered in everyday work; I ran into it by accident today, so I am making a note of it here. Oracle/Sun's official documentation explains:
The concurrent collector will throw an OutOfMemoryError if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown. This feature is designed to prevent applications from running for an extended period of time while making little or no progress because the heap is too small. If necessary, this feature can be disabled by adding the option -XX:-UseGCOverheadLimit to the command line.
I am using JDK 1.6.0_37 and JDK 1.7.0_60, and in both versions the JVM enables -XX:+UseGCOverheadLimit by default at startup. This is really a prediction by the JVM: if garbage collection is taking 98% of the time but reclaiming less than 2% of the heap, the JVM concludes that an OOM is imminent and ends the program early. Of course, we can use -XX:-UseGCOverheadLimit to turn this feature off.
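For example, disabling the check for the hypothetical demo class above would look like the line below; the program then runs until the heap is genuinely exhausted and, in my understanding, dies with java.lang.OutOfMemoryError: Java heap space instead:

java -Xmx32m -XX:-UseGCOverheadLimit GcOverheadDemo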
I don't quite understand why the JDK needs to provide such a parameter. When we hit this error, there are only two explanations: either the heap is too small, or there is a memory leak. What actually helps us at this point is a heap dump, which we can analyze to diagnose whether there is a problem in the code.
We know that if the following parameters are set when the JVM starts, the JVM will write out a heap dump when it crashes:
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=c:/
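Putting this together with the hypothetical demo class above, a launch line might look like the following; by default the dump file is named java_pid<pid>.hprof, and it can be opened with tools such as Eclipse MAT or jhat:

java -Xmx32m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=c:/ GcOverheadDemo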
Personally, I feel that if the "GC overhead limit exceeded" error really occurs, an actual OOM is not far away anyway, so having the JVM make a prediction and end the program early seems of little value.