Configuring Spark on Eclipse

Source: Internet
Author: User

Environment: Spark 1.4.0, Hadoop 2.6.0

1. Install the JDK

2. Locate spark-env.sh.template in Spark's conf directory, copy it to spark-env.sh, open it, and append the following lines:

    export SCALA_HOME=/home/jiahong/scala-2.11.6
    export JAVA_HOME=/home/jiahong/jdk1.8.0_45
    export HADOOP_CONF_DIR=/home/jiahong/hadoop-2.6.0/etc/hadoop

3. Install the Scala plugin via Eclipse's "Help" –> "Install New Software..."

Address: http://download.scala-ide.org/sdk/e38/scala29/stable/site

4. Use Scala to develop Spark programs

4.1 In Eclipse, select "File" –> "New" –> "Other ..." –> "Scala Wizard" –> "Scala Project" to create a Scala project, and name it (e.g., "SparkScala").

4.2 Right-click the "SparkScala" project, select "Properties", and in the dialog choose "Java Build Path" –> "Libraries" –> "Add External JARs ...", then add spark-assembly-1.4.0-hadoop2.6.0.jar from Spark's lib directory.
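With the project set up, a small word-count program is a quick way to check that the build path and plugin work. This is a minimal sketch, not part of the original tutorial: the object name WordCount and the sample lines are hypothetical, and it assumes the spark-assembly-1.4.0-hadoop2.6.0.jar from step 4.2 is on the build path. setMaster("local[2]") runs Spark in-process, so no cluster is needed for this test.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical sample program for the SparkScala project.
object WordCount {
  def main(args: Array[String]): Unit = {
    // "local[2]": run Spark inside this JVM with two worker threads.
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // A tiny in-memory dataset; a real job would use sc.textFile(...).
    val lines = sc.parallelize(Seq("hello spark", "hello scala"))

    val counts = lines
      .flatMap(_.split("\\s+"))  // split each line into words
      .map(word => (word, 1))    // pair each word with a count of 1
      .reduceByKey(_ + _)        // sum the counts per word

    counts.collect().foreach(println)
    sc.stop()
  }
}
```

Run it as a Scala application from Eclipse (Run As –> Scala Application); each distinct word and its count should be printed to the console.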


