Spark version: 2.0.1
Recently, a Spark job written in Scala kept failing on submission, with the following exception:
17/05/05 18:39:23 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
    at LRPipeline$.main(LRPipeline.scala:17)
    at LRPipeline.main(LRPipeline.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
17/05/05 18:39:23 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15
After half a day of searching online, combined with my own testing, I finally resolved it. The resolution process is as follows.
My original Maven configuration was:
<properties>
    <scala.version>2.10.6</scala.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-reflect</artifactId>
        <version>${scala.version}</version>
    </dependency>
</dependencies>
First, you can confirm that the Scala version used by Spark 2.0.1 is 2.11.8, while the scala.version configured in my Maven POM above is 2.10.6, so I tried adjusting the Scala version.
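For reference, the adjusted property would look like the sketch below; only the scala.version value changes, to match the _2.11 suffix already used by the Spark artifacts, and the rest of the POM is assumed to stay the same:

<properties>
    <scala.version>2.11.8</scala.version>
</properties>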
After adjusting the version, I ran mvn package again and resubmitted the Spark task, but the problem persisted.
After another half day of tracing, I finally figured out that after changing the Scala version in Maven, I had forgotten to run mvn clean before packaging with mvn package again, so the change to the Maven configuration never took effect.
If you are unsure what difference running with or without the clean command makes, you can refer to the explanations other users have put together online.
So I ran mvn clean package to rebuild, resubmitted the task, and it ran successfully.
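For completeness, the rebuild and resubmission looked roughly like the sketch below; the jar path and application name are illustrative, and the main class is taken from the stack trace above:

# rebuild from a clean state so the new scala.version takes effect
mvn clean package
# resubmit to YARN (jar path is an example, adjust to your project)
spark-submit --master yarn --deploy-mode cluster --class LRPipeline target/lrpipeline-1.0.jar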
I'm putting this record here in the hope that it helps you step into fewer pits.
"Reference" http://stackoverflow.com/questions/40128956/ Getting-exception-java-lang-nosuchmethoderror-scala-reflect-api-javauniverse