spark-shell on YARN error resolved: bin/spark-shell --master yarn-client fails, class ExecutorLauncher cannot be found


Article Source: http://www.dataguru.cn/thread-331456-1-1.html


Today I hit the following error when starting spark-shell in yarn-client mode:


[hadoop@localhost spark-1.0.1-bin-hadoop2]$ bin/spark-shell --master yarn-client
Spark assembly has been built with Hive, including Datanucleus jars on classpath
14/07/22 17:28:46 INFO spark.SecurityManager: Changing view acls to: hadoop
14/07/22 17:28:46 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop)
14/07/22 17:28:46 INFO spark.HttpServer: Starting HTTP Server
14/07/22 17:28:46 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/07/22 17:28:46 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:49827
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.0.1
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_55)
Type in expressions to have them evaluated.
Type :help for more information.
14/07/22 17:28:51 WARN spark.SparkConf:
SPARK_CLASSPATH was detected (set to '/home/hadoop/spark-1.0.1-bin-hadoop2/lib/*.jar').
This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with --driver-class-path to augment the driver classpath
 - spark.executor.extraClassPath to augment the executor classpath

14/07/22 17:28:51 WARN spark.SparkConf: Setting 'spark.executor.extraClassPath' to '/home/hadoop/spark-1.0.1-bin-hadoop2/lib/*.jar' as a work-around.
14/07/22 17:28:51 WARN spark.SparkConf: Setting 'spark.driver.extraClassPath' to '/home/hadoop/spark-1.0.1-bin-hadoop2/lib/*.jar' as a work-around.
14/07/22 17:28:51 INFO spark.SecurityManager: Changing view acls to: hadoop
14/07/22 17:28:51 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop)
14/07/22 17:28:51 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/07/22 17:28:51 INFO Remoting: Starting remoting
14/07/22 17:28:51 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@localhost:41257]
14/07/22 17:28:51 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@localhost:41257]
14/07/22 17:28:51 INFO spark.SparkEnv: Registering MapOutputTracker
14/07/22 17:28:51 INFO spark.SparkEnv: Registering BlockManagerMaster
14/07/22 17:28:51 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-local-20140722172851-5d58
14/07/22 17:28:51 INFO storage.MemoryStore: MemoryStore started with capacity 294.9 MB.
14/07/22 17:28:51 INFO network.ConnectionManager: Bound socket to port 36159 with id = ConnectionManagerId(localhost,36159)
14/07/22 17:28:51 INFO storage.BlockManagerMaster: Trying to register BlockManager
14/07/22 17:28:51 INFO storage.BlockManagerInfo: Registering block manager localhost:36159 with 294.9 MB RAM
14/07/22 17:28:51 INFO storage.BlockManagerMaster: Registered BlockManager
14/07/22 17:28:51 INFO spark.HttpServer: Starting HTTP Server
14/07/22 17:28:51 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/07/22 17:28:51 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:57197
14/07/22 17:28:51 INFO broadcast.HttpBroadcast: Broadcast server started at http://localhost:57197
14/07/22 17:28:51 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-9b5a359c-37cf-4530-85d6-fcdbc534bc84
14/07/22 17:28:51 INFO spark.HttpServer: Starting HTTP Server
14/07/22 17:28:51 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/07/22 17:28:51 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:34888
14/07/22 17:28:52 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/07/22 17:28:52 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
14/07/22 17:28:52 INFO ui.SparkUI: Started SparkUI at http://localhost:4040
14/07/22 17:28:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
--args is deprecated. Use --arg instead.
14/07/22 17:28:52 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/07/22 17:28:53 INFO yarn.Client: Got Cluster metric info from ApplicationsManager (ASM), number of NodeManagers: 1
14/07/22 17:28:53 INFO yarn.Client: Queue info ... queueName: default, queueCurrentCapacity: 0.0, queueMaxCapacity: 1.0, queueApplicationCount = 1, queueChildQueueCount = 0
14/07/22 17:28:53 INFO yarn.Client: Max mem capabililty of a single resource in this cluster 8192
14/07/22 17:28:53 INFO yarn.Client: Preparing Local resources
14/07/22 17:28:53 INFO yarn.Client: Uploading file:/home/hadoop/spark/assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar to hdfs://localhost:9000/user/hadoop/.sparkStaging/application_1406018656679_0002/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar
14/07/22 17:28:54 INFO yarn.Client: Setting up the launch environment
14/07/22 17:28:54 INFO yarn.Client: Setting up container launch context
14/07/22 17:28:54 INFO yarn.Client: Command for starting the Spark ApplicationMaster: List($JAVA_HOME/bin/java, -server, -Xmx512m, -Djava.io.tmpdir=$PWD/tmp, -Dspark.tachyonStore.folderName="spark-10325217-bdb0-4213-8ae8-329940b98b95", -Dspark.yarn.secondary.jars="", -Dspark.home="/home/hadoop/spark", -Dspark.repl.class.uri="http://localhost:49827", -Dspark.driver.host="localhost", -Dspark.app.name="Spark shell", -Dspark.jars="", -Dspark.fileserver.uri="http://localhost:34888", -Dspark.executor.extraClassPath="/home/hadoop/spark-1.0.1-bin-hadoop2/lib/*.jar", -Dspark.master="yarn-client", -Dspark.driver.port="41257", -Dspark.driver.extraClassPath="/home/hadoop/spark-1.0.1-bin-hadoop2/lib/*.jar", -Dspark.httpBroadcast.uri="http://localhost:57197", -Dlog4j.configuration=log4j-spark-container.properties, org.apache.spark.deploy.yarn.ExecutorLauncher, --class, notused, --jar, null, --args 'localhost:41257', --executor-memory, 1024, --executor-cores, 1, --num-executors, 2, 1>, <LOG_DIR>/stdout, 2>, <LOG_DIR>/stderr)
14/07/22 17:28:54 INFO yarn.Client: Submitting application to ASM
14/07/22 17:28:54 INFO impl.YarnClientImpl: Submitted application application_1406018656679_0002 to ResourceManager at /0.0.0.0:8032
14/07/22 17:28:54 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:
         appMasterRpcPort: 0
         appStartTime: 1406021334568
         yarnAppState: ACCEPTED
... (the same ACCEPTED report repeats once per second from 17:28:55 through 17:28:58) ...
14/07/22 17:28:59 INFO cluster.YarnClientSchedulerBackend: Application report from ASM:
         appMasterRpcPort: 0
         appStartTime: 1406021334568
         yarnAppState: FAILED

org.apache.spark.SparkException: Yarn application already ended, might be killed or not able to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApp(YarnClientSchedulerBackend.scala:105)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:82)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:136)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:318)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
        at $iwC$$iwC.<init>(<console>:8)
        at $iwC.<init>(<console>:14)
        at <init>(<console>:16)
        at .<init>(<console>:20)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
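The log already contains the clue. The shell itself is Spark 1.0.1 (started from spark-1.0.1-bin-hadoop2), but the jar uploaded to HDFS for the YARN containers is spark-assembly_2.10-0.9.1-hadoop2.2.0.jar, taken from the old build tree /home/hadoop/spark (note -Dspark.home="/home/hadoop/spark" in the ApplicationMaster command above). The ApplicationMaster container therefore starts from a stale 0.9.1 assembly, fails to launch org.apache.spark.deploy.yarn.ExecutorLauncher, and YARN reports yarnAppState: FAILED before any executor comes up. To confirm, read the container logs of the failed application; a minimal sketch, assuming YARN log aggregation is enabled and using the application id from the output above (if this diagnosis is right, the logs should show a java.lang.ClassNotFoundException for ExecutorLauncher):

# Fetch the container logs of the failed application from YARN
yarn logs -applicationId application_1406018656679_0002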

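The fix, then, is to stop mixing the two installations: make sure SPARK_HOME and the deprecated SPARK_CLASSPATH no longer point into the old 0.9.1 tree, so the YARN client uploads the 1.0.1 assembly. A minimal sketch, assuming the 1.0.1 distribution lives in /home/hadoop/spark-1.0.1-bin-hadoop2 (the exact assembly jar name may differ; check ls $SPARK_HOME/lib):

# Clear the settings that drag in the old build
unset SPARK_CLASSPATH                      # deprecated in Spark 1.0+, see the WARNs above
export SPARK_HOME=/home/hadoop/spark-1.0.1-bin-hadoop2

# Point the YARN client at the matching 1.0.1 assembly instead of the 0.9.1 jar
export SPARK_JAR=$SPARK_HOME/lib/spark-assembly-1.0.1-hadoop2.2.0.jar

cd "$SPARK_HOME"
bin/spark-shell --master yarn-client

Also check conf/spark-env.sh in both installations: if it exports SPARK_HOME or SPARK_CLASSPATH pointing at /home/hadoop/spark, the stale 0.9.1 assembly will keep being uploaded no matter where the shell is started from.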