Problems and workarounds for running Spark demo in IntelliJ

Source: Internet
Author: User

At this stage I am mainly studying Scala. It is often recommended to learn Haskell before Scala, but I did not want to take that detour. The learning process has had plenty of difficulties all the same; Scala is widely acknowledged to have features more complex than C++'s. :-)

My main motivation for learning Scala is to study Spark. Although Spark applications can also be developed in Python or Java, Spark itself is a Scala project and is not yet a fully mature product, so knowing Scala may help me solve problems more efficiently when I run into them.

Today I went through some of the official Spark documentation and tried to run the SparkPi example in IntelliJ; the whole process ran into a few problems.
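For context, SparkPi estimates Pi by Monte Carlo sampling: it scatters random points in a square and counts how many fall inside the inscribed unit circle. The following is a trimmed sketch of the Spark 1.x example shipped with the distribution; details such as the slice count may differ from the exact version bundled with Spark 1.2.0.

```scala
// Sketch of the SparkPi example (Spark 1.x style).
// Estimates Pi: the fraction of random points in [-1,1]^2 that land
// inside the unit circle approaches Pi/4.
import scala.math.random
import org.apache.spark.{SparkConf, SparkContext}

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Pi")
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = 100000 * slices
    val count = spark.parallelize(1 to n, slices).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0   // 1 if the point is inside the circle
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
```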

The first problem appeared right after I had added the relevant packages. Clicking Run produced this error:

 Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)
     at SparkDemo.SimpleApp.main(SimpleApp.scala)
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
     at java.lang.reflect.Method.invoke(Method.java:606)
     at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
 Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

Workaround: in IntelliJ, open Run > Edit Configurations and enter "-Dspark.master=local" in the VM options field on the right. This instructs the program to run Spark locally in a single thread.
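Equivalently, the master can be set in code when the SparkConf is constructed, instead of via the VM option. A minimal sketch (the VM-option route is usually preferable, since hard-coding the master makes the same jar harder to submit to a real cluster):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// "local" runs Spark in-process on a single thread;
// "local[4]" would use four threads, "local[*]" all available cores.
val conf = new SparkConf()
  .setAppName("Spark Pi")
  .setMaster("local")
val sc = new SparkContext(conf)
```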

Run again; there is still an error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:159)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:464)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:452)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:191)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
    at scala.util.Success.flatMap(Try.scala:230)
    at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
    at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:584)
    at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:577)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:108)
    at akka.Akka$.delayedEndpoint$akka$Akka$1(Akka.scala:11)
    at akka.Akka$delayedInit$body.apply(Akka.scala:9)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:383)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.App$class.main(App.scala:76)
    at akka.Akka$.main(Akka.scala:9)
    at akka.Akka.main(Akka.scala)

As mentioned in a previous blog post, I had installed Scala 2.11.5 alongside Spark 1.2.0. There appear to be compatibility issues between these Spark and Scala versions; after switching Scala to 2.10.4 the problem was solved. The program output:
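The underlying cause is that the pre-built Spark 1.2.0 artifacts are compiled against Scala 2.10, and Scala does not preserve binary compatibility between 2.10 and 2.11, hence the NoSuchMethodError at runtime. In an sbt project the two versions can be kept in agreement like this (a sketch of a build.sbt fragment; the %% operator appends the Scala binary suffix, here _2.10, to the artifact name):

```scala
// build.sbt -- the Scala version must match the Spark artifact's
// Scala suffix (spark-core_2.10 for any Scala 2.10.x).
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
```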

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
....
15/07/27 19:50:23 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 1260 bytes)
15/07/27 19:50:23 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/07/27 19:50:23 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 727 bytes result sent to driver
15/07/27 19:50:23 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, PROCESS_LOCAL, 1260 bytes)
15/07/27 19:50:23 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
15/07/27 19:50:23 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in ms on localhost (1/2)
15/07/27 19:50:23 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 727 bytes result sent to driver
15/07/27 19:50:23 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in ms on localhost (2/2)
15/07/27 19:50:23 INFO DAGScheduler: Stage 0 (reduce at SparkPi.scala:20) finished in 0.146 s
15/07/27 19:50:23 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/07/27 19:50:23 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:20, took 0.380229 s
Pi is roughly 3.13726
15/07/27 19:50:23 INFO SparkUI: Stopped Spark web UI at http://211.66.87.51:4040
15/07/27 19:50:23 INFO DAGScheduler: Stopping DAGScheduler
15/07/27 19:50:24 INFO MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
15/07/27 19:50:24 INFO MemoryStore: MemoryStore cleared
15/07/27 19:50:24 INFO BlockManager: BlockManager stopped
15/07/27 19:50:24 INFO BlockManagerMaster: BlockManagerMaster stopped
15/07/27 19:50:24 INFO SparkContext: Successfully stopped SparkContext

