Environment: Hadoop 2.7.4, Spark 2.1.0
After configuring the Spark history server and the YARN timeline server, both started without errors. But when I submitted an application with:
./spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --num-executors 3 \
  --driver-memory 1g \
  --executor-cores 1 \
  /opt/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar 20
the following error was reported:
[root@node-2 bin]# ./spark-submit --class org.apache.spark.examples.SparkPi --master yarn --num-executors 3 --driver-memory 1g --executor-cores 1 /opt/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar 20
17/12/08 02:01:37 INFO SparkContext: Running Spark version 2.1.0
17/12/08 02:01:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/08 02:01:38 INFO SecurityManager: Changing view acls to: root
17/12/08 02:01:38 INFO SecurityManager: Changing modify acls to: root
17/12/08 02:01:38 INFO SecurityManager: Changing view acls groups to:
17/12/08 02:01:38 INFO SecurityManager: Changing modify acls groups to:
17/12/08 02:01:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/08 02:01:38 INFO Utils: Successfully started service 'sparkDriver' on port 40685.
17/12/08 02:01:39 INFO SparkEnv: Registering MapOutputTracker
17/12/08 02:01:39 INFO SparkEnv: Registering BlockManagerMaster
17/12/08 02:01:39 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/12/08 02:01:39 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/12/08 02:01:39 INFO DiskBlockManager: Created local directory at /tmp/spark/blockmgr-5041ec61-5aa3-48ec-86ec-692db6efcce3
17/12/08 02:01:39 INFO MemoryStore: MemoryStore started with capacity 413.9 MB
17/12/08 02:01:39 INFO SparkEnv: Registering OutputCommitCoordinator
17/12/08 02:01:39 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/12/08 02:01:39 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.82:4040
17/12/08 02:01:39 INFO SparkContext: Added JAR file:/opt/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar at spark://192.168.1.82:40685/jars/spark-examples_2.11-2.1.0.jar with timestamp 1512698499980
Exception in thread "main" java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
    at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:151)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
    at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
    at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... more
17/12/08 02:01:43 INFO DiskBlockManager: Shutdown hook called
17/12/08 02:01:43 INFO ShutdownHookManager: Shutdown hook called
17/12/08 02:01:43 INFO ShutdownHookManager: Deleting directory /tmp/spark/spark-c9cb41bd-2a3e-4519-92a8-dd3cf232458f
17/12/08 02:01:43 INFO ShutdownHookManager: Deleting directory /tmp/spark/spark-c9cb41bd-2a3e-4519-92a8-dd3cf232458f/userFiles-0d69f3fd-fd19-4b50-ac77-b52dcc686c78
This means the class com.sun.jersey.api.client.config.ClientConfig could not be found. Searching for the error suggested that the jersey-core jar package was missing.
I cd'ed into the SPARK_HOME/jars directory and checked which Jersey jars were actually present.
The jersey-core jar really was not there; the Jersey jars bundled with Spark were version 2.22.2. I then checked the official API docs and confirmed that Jersey 2.22.2 does not ship a jersey-core jar and has no com.sun.jersey.api.client.config.ClientConfig class. Going back to the Jersey 1.9 API, I found the missing class there: Jersey 2.x was repackaged under org.glassfish.jersey, and the old com.sun.jersey classes were dropped.
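Two workarounds are commonly reported for this Jersey 1.x / 2.x conflict between Spark 2.x and the Hadoop 2.x timeline client; this is a sketch, and the Hadoop installation path below is an assumption about this cluster's layout, not taken from the post:

```shell
# Workaround (a): disable the YARN timeline client for the submission, so
# YarnClientImpl never tries to create the Jersey-1.x-based TimelineClient.
# (spark.hadoop.* properties are passed through to the Hadoop configuration.)
./spark-submit --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --conf spark.hadoop.yarn.timeline-service.enabled=false \
  /opt/spark-2.1.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.1.0.jar 20

# Workaround (b): copy the Jersey 1.9 jars shipped with Hadoop into Spark's
# jars directory (the Hadoop path here is assumed; adjust to your install).
cp /opt/hadoop-2.7.4/share/hadoop/yarn/lib/jersey-core-1.9.jar \
   /opt/hadoop-2.7.4/share/hadoop/yarn/lib/jersey-client-1.9.jar \
   /opt/spark-2.1.0-bin-hadoop2.7/jars/
```

Workaround (a) is the less invasive of the two, since mixing Jersey 1.9 jars into a Jersey 2.x classpath can cause other conflicts.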