The error log is as follows:
Caused by: java.lang.AbstractMethodError: SparkCore.JavaWordCount$2.call(Ljava/lang/Object;)Ljava/lang/Iterable;
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:129)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:129)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:189)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.saveAsTextFile(RDD.scala:1436)
    at org.apache.spark.api.java.JavaRDDLike$class.saveAsTextFile(JavaRDDLike.scala:507)
    at org.apache.spark.api.java.AbstractJavaRDDLike.saveAsTextFile(JavaRDDLike.scala:46)
    at SparkCore.JavaWordCount.main(JavaWordCount.java:68)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.AbstractMethodError: SparkCore.JavaWordCount$2.call(Ljava/lang/Object;)Ljava/lang/Iterable;
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:129)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:129)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at scala.collection.Iterator$$anon$14.hasNext(Iterator.scala:388)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:327)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:189)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:64)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Cause of the error:
The key line in the log is java.lang.AbstractMethodError: SparkCore.JavaWordCount$2.call(Ljava/lang/Object;)Ljava/lang/Iterable;. This points to a method-signature conflict: at runtime, the JVM tried to invoke call() with that exact descriptor, but the loaded class does not implement a method with that signature.
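A well-known instance of such a signature change is Spark's FlatMapFunction: in Spark 1.x its call() returns Iterable<T>, while in Spark 2.x it returns Iterator<T>, so a class compiled against one version cannot satisfy the other at runtime. The sketch below uses simplified stand-in interfaces (not the real Spark classes) purely to show the two shapes of the signature:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Simplified stand-ins for Spark's FlatMapFunction -- not the real Spark
// interfaces, only the shape of their call() signatures, for illustration.

interface FlatMapFunctionV1<T, R> {   // Spark 1.x shape: returns Iterable
    Iterable<R> call(T t);
}

interface FlatMapFunctionV2<T, R> {   // Spark 2.x shape: returns Iterator
    Iterator<R> call(T t);
}

public class SignatureChangeDemo {
    public static void main(String[] args) {
        // A word splitter written against the 1.x signature...
        FlatMapFunctionV1<String, String> oldStyle =
                line -> Arrays.asList(line.split(" "));

        // ...and the same logic written against the 2.x signature.
        FlatMapFunctionV2<String, String> newStyle =
                line -> Arrays.asList(line.split(" ")).iterator();

        // The JVM resolves call() by its exact descriptor, e.g.
        // call(Ljava/lang/Object;)Ljava/lang/Iterable; -- the descriptor in
        // the log above. Compiling against one shape and running against the
        // other is what surfaces as AbstractMethodError.
        System.out.println(((List<String>) oldStyle.call("hello spark world")).size()); // 3
        Iterator<String> it = newStyle.call("hello spark world");
        int n = 0;
        while (it.hasNext()) { it.next(); n++; }
        System.out.println(n); // 3
    }
}
```

Note that in the real API the change was made in place (same interface name, different return type), which is why the failure only appears at runtime rather than at compile time.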
The usual reason is that the Spark version the program was compiled against differs from the Spark version running on the cluster; for example, the cluster runs Spark 1.6.* while the program was built against 2.1.*. Between such versions the definitions of some interfaces changed, and those changed definitions produce exactly this kind of error.
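The fix is to align the build's Spark dependency with the cluster's version and mark it as provided, so the cluster's own jars are used at runtime instead of a bundled copy. A Maven sketch, assuming a 1.6.* cluster (the 1.6.2 version and the _2.10 Scala suffix are illustrative; use whatever your cluster's spark-submit --version reports):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <!-- must match the Spark version installed on the cluster -->
    <version>1.6.2</version>
    <!-- "provided": compile against this jar, but use the cluster's copy at runtime -->
    <scope>provided</scope>
</dependency>
```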
Spark run error: java.lang.AbstractMethodError