java.lang.IllegalArgumentException: System memory 100663296 must be at least 4.718592E8. Please use a larger heap size.
When you develop a Spark project in Eclipse and try to run the program directly against Spark, you may encounter the error above. Obviously the JVM does not have enough memory to start the SparkContext. But how should it be set?
So I checked the startup script:
#!/bin/bash
/usr/local/spark-1.6.0/bin/spark-submit \
  --class cn.spark.study.Opt17_WordCount \
  --num-executors 3 \
  --driver-memory 100m \
  --executor-memory 100m \
  --executor-cores 3 \
  --master spark://yun01:7077 \
  /root/sparkstudy/java/spark-study-java-0.0.1-SNAPSHOT-jar-with-dependencies.jar
I tried increasing the driver memory to 400m, but still got essentially the same error:
Exception in thread "main" java.lang.IllegalArgumentException: System memory 402128896 must be at least 4.718592E8. Please use a larger heap size.
Turning it up further to 1g and running again, the job finished and produced results.
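For reference, the working command differs from the original script only in the driver-memory flag:

/usr/local/spark-1.6.0/bin/spark-submit \
  --class cn.spark.study.Opt17_WordCount \
  --num-executors 3 \
  --driver-memory 1g \
  --executor-memory 100m \
  --executor-cores 3 \
  --master spark://yun01:7077 \
  /root/sparkstudy/java/spark-study-java-0.0.1-SNAPSHOT-jar-with-dependencies.jar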
You can also set it in code:

val conf = new SparkConf().setAppName("word count")
conf.set("spark.testing.memory", "536870912") // a plain byte count; any value above 512 MB works
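Putting this together, here is a minimal self-contained sketch assuming Spark 1.6's Scala API; the object name and the 512 MB figure are illustrative, and the property must be set before the SparkContext is created:

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("word count")
      // set before creating the SparkContext; the value is a plain byte count
      .set("spark.testing.memory", "536870912") // 512 MB
    val sc = new SparkContext(conf)
    // ... job logic goes here ...
    sc.stop()
  }
}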
This check comes from Spark's source, in getMaxMemory of UnifiedMemoryManager (Spark 1.6):
/**
 * Return the total amount of memory shared between execution and storage, in bytes.
 */
private def getMaxMemory(conf: SparkConf): Long = {
  val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
  val reservedMemory = conf.getLong("spark.testing.reservedMemory",
    if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
  val minSystemMemory = reservedMemory * 1.5
  if (systemMemory < minSystemMemory) {
    throw new IllegalArgumentException(s"System memory $systemMemory must " +
      s"be at least $minSystemMemory. Please use a larger heap size.")
  }
  val usableMemory = systemMemory - reservedMemory
  val memoryFraction = conf.getDouble("spark.memory.fraction", 0.75)
  (usableMemory * memoryFraction).toLong
}
So the key line is val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory).
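This also explains the exact threshold 4.718592E8 in the error message: when spark.testing is not set, reservedMemory falls back to RESERVED_SYSTEM_MEMORY_BYTES, which is 300 MB in Spark 1.6, and the minimum system memory is 1.5 times that. A quick check of the arithmetic:

// RESERVED_SYSTEM_MEMORY_BYTES is 300 MB in Spark 1.6
val reserved = 300L * 1024 * 1024      // 314572800 bytes
val minSystemMemory = reserved * 1.5   // 4.718592E8 bytes, the threshold in the error
// with --driver-memory 100m, Runtime.getRuntime.maxMemory reports roughly
// 100663296 bytes (about 96 MB), well below the threshold, hence the exception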
The definition of conf.getLong() is:
getLong(key: String, defaultValue: Long): Long
Get a parameter as a Long, falling back to a default if not set.
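Note that getLong parses the stored string directly as a Long, so size suffixes are not understood here: a value like "1g" would throw a NumberFormatException instead of being read as one gigabyte. That is why spark.testing.memory has to be given as a plain byte count, as in the sketch below:

// a plain byte count parses fine (512 MB)
conf.set("spark.testing.memory", "536870912")
// a suffixed size would fail when getLong calls toLong on it
// conf.set("spark.testing.memory", "1g")   // NumberFormatException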
The same failure also shows up in the driver log as: ERROR spark.SparkContext: Error initializing SparkContext.