Compiling the Spark Source Code Directly in IntelliJ IDEA, and Solving the Problems That Come Up

Source: Internet
Author: User
The process of compiling the Spark source in IntelliJ IDEA: download and unzip the Spark source package, install the Scala plugin in IntelliJ IDEA, then use IntelliJ IDEA's Open Project feature to open the source folder.

After that, IDEA automatically downloads all of the dependencies; once the download is finished, run Make Project to compile.

It is recommended to use a recent version of IDEA, and to pre-install the SBT environment so that the files SBT requires are already downloaded.
For the installation and configuration of SBT, see: http://blog.csdn.net/tanglizhe1105/article/details/50528801
For creating an IntelliJ IDEA project with SBT, see: http://blog.csdn.net/tanglizhe1105/article/details/50528824

Spark Build Error 1
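As a sanity check before opening the project in IDEA, the source tree can first be built from the command line. This is a sketch assuming an unpacked Spark 1.6 source tree with its bundled build wrapper scripts (which download Maven or SBT as needed):

```shell
# From the root of the unpacked Spark 1.6 source tree.
cd spark-1.6.0

# Maven build, skipping tests (the path documented in Spark's "Building Spark" guide):
build/mvn -DskipTests clean package

# Or the SBT equivalent:
build/sbt package
```

If this command-line build succeeds, a later failure inside IDEA usually points at the IDE project setup rather than the source code itself.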

D:\spark-1.6.0\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\SparkAvroCallbackHandler.scala
Error: not found: type SparkFlumeProtocol
  val transactionTimeout: Int, val backoffInterval: Int) extends SparkFlumeProtocol with Logging {
                                                                 ^
Error: not found: type EventBatch
  override def getEventBatch(n: Int): EventBatch = {
                                      ^
...

This problem occurs because some of the source files required by flume-sink are not produced automatically, so they are unavailable at compile time. How to resolve:

Inside IntelliJ IDEA:
- Open View → Tool Windows → Maven Projects
- Right-click Spark Project External Flume Sink
- Click Generate Sources and Update Folders

IntelliJ IDEA will then automatically produce the flume-sink related sources.

Then rebuild with Make Project, and everything is OK.

This step generates source code from sparkflume.avdl.
Generate Sources and Update Folders resolves the "not found: type SparkFlumeProtocol" issue.
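The IDE action above corresponds to running Maven's generate-sources phase for that one module from the command line, which triggers the Avro plugin that turns sparkflume.avdl into compilable sources. A sketch, assuming the bundled build/mvn wrapper (a plain mvn on the PATH works too):

```shell
# From the Spark source root: run generate-sources for the flume-sink
# module only (-pl), also building the upstream modules it needs (-am).
build/mvn -pl external/flume-sink -am generate-sources
```

After this, refreshing the Maven project in IDEA should pick up the generated sources.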
Source: http://apache-spark-developers-list.1001551.n3.nabble.com/A-Spark-Compilation-Question-td8402.html

Spark Build Error 2

Here is another example of missing files: a "not found: type HiveShim" issue that appears while compiling the Spark source code on version 1.4.1/1.4.0. How to resolve:

In Project Settings → Modules, select "spark-hive-thriftserver_2.10", go to the "Sources" tab, select the "v0.13.1" node, and click "Sources" to mark the node as sources. Then select "spark-hive_2.10", go to the "Sources" tab, select the "v0.13.1" node, and click "Sources" to mark the node as sources.

Spark Build Error 3

The following error tends to occur repeatedly when compiling:

Error: scalac: while compiling: C:\Users\Administrator\IdeaProjects\spark-1.6.0\sql\core\src\main\scala\org\apache\spark\sql\util\QueryExecutionListener.scala
  during phase: jvm
  library version: version 2.10.5
  compiler version: version 2.10.5
  reconstructed args: -nobootcp -deprecation -classpath C:\Program Files\Java\jdk1.8.0_66\jre\lib\charsets.jar;
  C:\Program ... C:\Program Files\Java\jdk1.8.0_66\jre\lib\rt.jar;
  C:\Users\Administrator\IdeaProjects\spark-1.6.0\sql\core\target\scala-2.10\classes;
  C:\Users\Administrator\IdeaProjects\spark-1.6.0\core\target\scala-2.10\classes;
  C:\Users\Administrator\.m2\repository\org\apache\avro\avro-mapred\1.7.7\avro-mapred-1.7.7-hadoop2.jar;
  ......
  C:\Users\Administrator\.m2\repository\org\objenesis\objenesis\1.0\objenesis-1.0.jar;
  C:\Users\Administrator\.m2\repository\org\spark-project\spark\unused\1.0.0\unused-1.0.0.jar -feature -javabootclasspath ; -unchecked
  last tree to typer: Literal(Constant(org.apache.spark.sql.test.ExamplePoint))
    symbol: null
    symbol definition: null
    tpe: Class(classOf[org.apache.spark.sql.test.ExamplePoint])
    symbol owners:
    context owners: anonymous class withErrorHandling$1, package util
  == Enclosing template or block ==
  Template(// val <local $anonfun>: <notype>, tree.tpe=org.apache.spark.sql.util.withErrorHandling$1
    "scala.runtime.AbstractFunction1", "scala.Serializable" // parents
    ...
    ExecutionListenerManager$$anonfun$org$apache$spark$sql$util$ExecutionListenerManager$$withErrorHandling.super."<init>" // def <init>(): scala.runtime.AbstractFunction1 in class AbstractFunction1, tree.tpe=()scala.runtime.AbstractFunction1
    Nil)))
  == Expanded type of tree ==
  ConstantType(value = Constant(org.apache.spark.sql.test.ExamplePoint))
  uncaught exception during compilation: java.lang.AssertionError
How to resolve:

Run Build → Rebuild Project.
It's that simple.

Here is a collection of further Spark build errors:
https://www.mail-archive.com/search?l=user@spark.apache.org&q=subject:%22build+error%22&o=newest&f=1
