spark-submit reports an error when submitting a task: ERROR initializing SparkContext

16/03/04 00:21:09 WARN SparkContext: Using SPARK_MEM to set amount of memory to use per executor process is deprecated, please use spark.executor.memory instead.
16/03/04 00:21:09 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: ' at '
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2554)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:489)
    at com.bigdata.deal.scala.DomainLib$.main(DomainLib.scala:22)
    at com.bigdata.deal.scala.DomainLib.main(DomainLib.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
When configuring the SparkConf, be sure to set the Spark home and the master address; an unset or malformed master is what triggers the "Could not parse Master URL" exception above.
Note that you can also configure these in conf/spark-env.sh.
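For reference, a minimal sketch of what the SparkContext initialization (the stack trace points at DomainLib.scala:22) might look like with the master and Spark home set explicitly. The master URL and the Spark home path here are assumptions derived from the spark-env.sh shown below, not values from the original source:

import org.apache.spark.{SparkConf, SparkContext}

object DomainLib {
  def main(args: Array[String]): Unit = {
    // An unset or malformed master URL is what makes
    // createTaskScheduler throw "Could not parse Master URL".
    val conf = new SparkConf()
      .setAppName("DomainLib")
      .setMaster("spark://mini-cp1:7077")                   // assumed: standalone master on SPARK_MASTER_IP, default port
      .setSparkHome("/usr/local/spark-1.4.0-bin-hadoop2.6") // assumed: Spark installation directory
    val sc = new SparkContext(conf)
    try {
      // ... application logic ...
    } finally {
      sc.stop()
    }
  }
}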
[root@mini-cp1 spark-1.4.0-bin-hadoop2.6]# cat conf/spark-env.sh
#!/usr/bin/env bash
SPARK_MASTER_IP=mini-cp1
# The Java root directory path must be exported
export JAVA_HOME=/usr/local/jdk1.7.0_65
export HADOOP_HOME=/usr/local/hadoop-2.6.0
#export SCALA_HOME=/opt/scala
export SPARK_WORKER_MEMORY=3G
export HADOOP_CONF_DIR=/usr/local/hadoop-2.6.0/etc/hadoop
#SPARK_MEM=${SPARK_MEM:-1g}
export SPARK_MEM=3G
export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/local/hadoop-2.6.0/lib/native
export HADOOP_OPTS="-Djava.library.path=/usr/local/hadoop-2.6.0/lib"
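Alternatively, the master URL can be supplied at submit time with spark-submit's --master option (for example, --master spark://mini-cp1:7077, assuming the standalone master above); if the application does not hard-code a master with setMaster, this is the usual way to provide it.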