Environment: Spark 1.4.0, Hadoop 2.6.0
1. Install the JDK
2. Go to Spark's conf directory, copy spark-env.sh.template to spark-env.sh, open it, and append the following lines at the end:
export SCALA_HOME=/home/jiahong/scala-2.11.6
export JAVA_HOME=/home/jiahong/jdk1.8.0_45
export HADOOP_CONF_DIR=/home/jiahong/hadoop-2.6.0/etc/hadoop
3. Install the Scala plugin in Eclipse via Help -> Install New Software
Address: http://download.scala-ide.org/sdk/e38/scala29/stable/site
4. Use Scala to develop Spark programs (a minimal example program is sketched after the steps below)
4.1 In Eclipse, select "File" -> "New" -> "Other..." -> "Scala Wizard" -> "Scala Project" to create a Scala project and give it a name, e.g. SparkScala
4.2 Right-click the "SparkScala" project, select "Properties", and in the dialog that pops up choose "Java Build Path" -> "Libraries" -> "Add External JARs...", then add spark-assembly-1.4.0-hadoop2.6.0.jar from Spark's lib directory
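With the project set up, a small word-count program can verify that everything works. This is a minimal sketch: the object name WordCount, the input path /home/jiahong/input.txt, and the "local" master are placeholder choices for illustration, not part of the original steps.

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal word-count job to verify the Eclipse + Spark setup.
    object WordCount {
      def main(args: Array[String]): Unit = {
        // "local" runs the job inside Eclipse without a cluster.
        val conf = new SparkConf().setAppName("WordCount").setMaster("local")
        val sc = new SparkContext(conf)

        // Placeholder input path; point it at any text file on your machine.
        val lines = sc.textFile("/home/jiahong/input.txt")

        // Split lines into words, pair each word with 1, and sum the counts per word.
        val counts = lines.flatMap(_.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _)

        counts.collect().foreach(println)
        sc.stop()
      }
    }

Running it from Eclipse as a Scala application should print the word counts to the console.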
5. Spark is now configured in Eclipse