Using Scala + IntelliJ IDEA + SBT to Build a Development Environment
Tips
Frequently encountered problems when building the development environment:
1. Network problems causing SBT plugin downloads to fail. Workaround: find a better network environment, or download the jars in advance from the link I provided (link: http://pan.baidu.com/s/1qWFSTze password: LSZC). Download the .ivy2 archive, unzip it, and put it in your user directory.
2. Version mismatches, which lead to a variety of problems. Solution: build with the following versions: Scala (2.10.3), SBT (0.13), sbt-assembly (0.11.2), Spark (1.2.0).
3. If you still cannot build the environment by following this tutorial, I recommend watching the Spark development environment course I recorded on www.bigdatastudy.cn (free).
Install Scala
Scala download: http://www.scala-lang.org/download/2.10.3.html
All software used in this tutorial can also be downloaded from the network address (link: http://pan.baidu.com/s/1qWFSTze password: LSZC)
The default installation option configures the environment variables automatically.
If they are not configured automatically, set the environment variables yourself:
SCALA_HOME: C:\Program Files (x86)\scala\
Append the following to Path: %SCALA_HOME%\bin
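From a command prompt, the manual configuration above can be sketched as follows (a session-only config fragment; the install path is this tutorial's default and may differ on your machine — for a permanent setting, use the System Properties dialog instead):

```
rem Assumes the default install path from this tutorial; adjust if needed
set SCALA_HOME=C:\Program Files (x86)\scala
set PATH=%PATH%;%SCALA_HOME%\bin

rem Verify the installation from a new command prompt
scala -version
```

If the environment variables are set correctly, `scala -version` should report version 2.10.3.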
Downloading and installing IntelliJ IDEA
Download: https://www.jetbrains.com/idea/
The activation codes are as follows:
Key: tommy
Value: 49164-ypnvl-oxuzl-xiwm4-z9ohc-lf053
Key: itey
Value: 91758-t1cla-c64f3-t7x5r-a7ydo-crsn1
Common IntelliJ IDEA Settings
In IntelliJ/bin/idea64.exe.vmoptions (the 64-bit version; with plenty of physical memory, increasing these is recommended), raise the IDE's startup memory:
-Xms512m
-Xmx1024m
-XX:MaxPermSize=512m
Theme and colors:
Settings – IDE Settings – Appearance – Theme: Darcula
Then tick the Override Font option below and select Microsoft YaHei at size 14.
Editor font settings:
You can save Editor – Colors & Fonts – Fonts as a new scheme and modify the configuration in that new scheme.
Caret row background color:
Editor – Colors & Fonts – General – Caret row; choose a blue background that contrasts clearly with the default.
Specify a different JDK version for each project:
IDEA can use a different JDK version for each project, and the developer must configure the project's JDK manually. The configuration steps are as follows:
Click the File | Project Structure menu item to open the Project Structure dialog;
In the left list, select SDKs to go to the SDK configuration page;
If the middle SDK list is empty, click the "+" sign to create a JDK entry;
Select the JDK entry, and on its configuration page click the browse button next to the JDK home path, locate the JDK installation path, and save.
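A quick way to confirm which JDK a project actually runs under is to print it from code. A minimal sketch (the object name JdkCheck is an assumption, not part of the tutorial):

```scala
// Prints the version and home directory of the JDK this program runs under.
object JdkCheck {
  def main(args: Array[String]): Unit = {
    println("java.version = " + System.getProperty("java.version"))
    println("java.home    = " + System.getProperty("java.home"))
  }
}
```

Run it from IDEA after configuring the project SDK; the printed version should match the JDK entry you just created.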
Plugin Installation
File > Settings > Plugins, search for Scala and install it directly; you will be prompted to restart after installation. This plugin includes both Scala and SBT support,
so there is no need to download a separate SBT plugin.
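Once the plugin is installed, an sbt project for the steps below typically has the following layout (the project name SparkApp and the source file name are assumptions; build.sbt and project/assembly.sbt are the files described in the rest of this tutorial):

```
SparkApp/
├── build.sbt              (the build definition)
├── project/
│   └── assembly.sbt       (declares the sbt-assembly plugin)
└── src/
    └── main/
        └── scala/
            └── SparkApp.scala
```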
The build.sbt file
Built with SBT 0.13.
The contents of the build.sbt file are as follows:
// Import the functions for building a jar package (these come from the sbt-assembly plugin)
import AssemblyKeys._

name := "SparkApp"

version := "1.0"

scalaVersion := "2.10.3"

// Spark dependency
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.2.0" % "provided",
  "net.sf.jopt-simple" % "jopt-simple" % "4.3",
  "joda-time" % "joda-time" % "2.0"
)

// This statement pulls in the assembly plugin's settings
assemblySettings

// Configure the jar built by the assembly plugin
jarName in assembly := "my-project-assembly.jar"

// Exclude Scala from our assembly jar, because Spark already bundles Scala
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
Further configuration
To make the sbt-assembly plugin take effect, create a new file in the project/ directory that lists the dependency on this plugin.
Create project/assembly.sbt and add the following configuration:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
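With the build files above in place, a minimal Spark application matching this setup can be sketched as follows (the object name SparkApp and the `local[2]` master are assumptions for running inside the IDE; when submitting to a cluster, the master is normally passed to spark-submit instead). Note that because spark-core is marked "provided", it is excluded from the assembly jar but still available on the compile classpath:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// A minimal job: sum the doubled values of 1..5 on a local Spark instance.
object SparkApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkApp").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val sum = sc.parallelize(1 to 5).map(_ * 2).reduce(_ + _)
    println(s"sum = $sum") // 2 + 4 + 6 + 8 + 10 = 30
    sc.stop()
  }
}
```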
At this point, the environment has been built.
For Spark installation, refer to the first step of the Spark starter trilogy: Spark Installation.
For developing and running Spark programs, refer to the third step of the Spark starter trilogy: Developing and Running Spark Programs.
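With the environment built, the typical build-and-submit commands look like this (a sketch: the jar path follows from scalaVersion 2.10.3 and the jarName configured above, and spark-submit assumes a working Spark 1.2.0 installation on the PATH):

```
sbt assembly
spark-submit --class SparkApp target/scala-2.10/my-project-assembly.jar
```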
Copyright notice: this is an original article by the blog author; do not reproduce it without the author's permission.
Spark Starter Trilogy, Step 2: Building the Spark Development Environment