Error message:
.....
14/11/23 06:04:10 ERROR TaskSetManager: Task 2.0:1 failed 4 times; aborting job
14/11/23 06:04:10 INFO DAGScheduler: Failed to run sortByKey at Main.scala:29
Exception in thread "main" org.apache.spark.SparkException: Job aborted: Task 2.0:1 failed 4 times (most recent failure: Exception failure: java.lang.ClassNotFoundException: youling.studio.Main$$anonfun$)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1020)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$abortStage$1.apply(DAGScheduler.scala:1018)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$abortStage(DAGScheduler.scala:1018)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$processEvent$10.apply(DAGScheduler.scala:604)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:604)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$start$1$$anon$2$$anonfun$receive$1.applyOrElse(DAGScheduler.scala:190)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
	at akka.actor.ActorCell.invoke(ActorCell.scala:456)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
	at akka.dispatch.Mailbox.run(Mailbox.scala:219)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
14/11/23 06:04:10 INFO TaskSetManager: Loss was due to java.lang.ClassNotFoundException: youling.studio.Main$$anonfun$ [duplicate 7]
....
youling.studio.Main is a class I wrote myself. The cause of the error is that the application jar was never shipped to the executors: the call to setJars was missing from the SparkConf, so the worker JVMs could not load the anonymous function classes generated from my code.

val conf = new SparkConf()
conf.setMaster("spark://single:8081")
  .setSparkHome("/cloud/spark-0.9.1-bin-hadoop2")
  .setAppName("word count")
  .setJars(jars)   // this line was missing; adding it fixes the error
  .set("spark.executor.memory", "200m")
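If you would rather not pass the jar path in by hand, Spark can locate the jar that contains a given class via SparkContext.jarOfClass. Treat this as a hedged sketch: in the Spark 0.9.x line used here the method returns a Seq[String], while later releases changed it to an Option[String], so the exact call depends on your version.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Alternative sketch for Spark 0.9.x: let Spark find the jar holding Main itself,
// instead of reading the jar list from args(0).
val conf = new SparkConf()
  .setMaster("spark://single:8081")
  .setAppName("word count")
  .setJars(SparkContext.jarOfClass(this.getClass)) // Seq[String] in 0.9.x
```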
Full code:
package youling.studio

import org.apache.spark.SparkContext._
import org.apache.spark.{SparkConf, SparkContext}

import scala.collection.mutable.ListBuffer

/**
 * Created by Administrator on 2014/11/23.
 */
object Main {
  def main(args: Array[String]) {
    if (args.length != 3) {
      println("cmd: java -jar *.jar jars input output")
      System.exit(0)
    }

    // args(0) is a comma-separated list of jars to ship to the executors
    val jars = ListBuffer[String]()
    args(0).split(',').foreach(jars += _)

    val conf = new SparkConf()
    conf.setMaster("spark://single:8081")
      .setSparkHome("/cloud/spark-0.9.1-bin-hadoop2")
      .setAppName("word count")
      .setJars(jars)
      .set("spark.executor.memory", "200m")

    val sc = new SparkContext(conf)
    val data = sc.textFile(args(1))
    data.cache

    println(data.count)

    // word count, then sort by descending frequency
    data.flatMap(_.split(" "))
      .map((_, 1))
      .reduceByKey(_ + _)
      .map(x => (x._2, x._1))
      .sortByKey(false)
      .map(x => (x._2, x._1))
      .saveAsTextFile(args(2))
  }
}
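The transformation chain can be sanity-checked on plain Scala collections, since the flatMap/map/reduce-style steps compute the same result as the RDD version; this sketch reproduces the word count and the sort by descending frequency without needing a cluster:

```scala
object WordCountSketch {
  // Same pipeline as the RDD version, on a plain List: split into words,
  // count occurrences, order by descending count, emit (word, count) pairs.
  def count(lines: List[String]): List[(String, Int)] =
    lines.flatMap(_.split(" "))
      .groupBy(identity)
      .toList
      .map { case (word, occs) => (occs.size, word) }
      .sortBy(-_._1)
      .map { case (n, w) => (w, n) }

  def main(args: Array[String]): Unit = {
    println(count(List("a b a", "b a"))) // prints List((a,3), (b,2))
  }
}
```

The real job would then be launched roughly as `java -jar wordcount.jar /path/to/app.jar hdfs://single:9000/in hdfs://single:9000/out` (paths here are hypothetical): args(0) is the jar list handed to setJars, args(1) the input, args(2) the output.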
Resolving the error where a Spark program run via java -jar cannot find a class you wrote