Building an Eclipse-Based Spark Application Development Environment on Windows

Tags: scala, ide

Original article; please credit the source when reposting: www.cnblogs.com/tovin/p/3822985.html

I. Software Download

Maven download and install: http://10.100.209.243/share/soft/apache-maven-3.2.1-bin.zip
JDK download and install:
http://10.100.209.243/share/soft/jdk-7u60-windows-i586.exe (32-bit)
http://10.100.209.243/share/soft/jdk-7u60-windows-x64.exe (64-bit)
Eclipse download and install:
http://www.eclipse.org/downloads/download.php?file=/technology/epp/downloads/release/kepler/SR2/eclipse-jee-kepler-SR2-win32.zip (32-bit)
http://www.eclipse.org/downloads/download.php?file=/technology/epp/downloads/release/kepler/SR2/eclipse-jee-kepler-SR2-win32-x86_64.zip (64-bit)
Scala download and install: http://www.scala-lang.org/files/archive/scala-2.10.4.msi

  

II. Environment Variable Configuration

JAVA_HOME=C:\Program Files\Java\jdk1.7.0_17
M2_HOME=D:\soft\apache-maven-3.2.1
SCALA_HOME=D:\soft\scala-2.10
PATH=%PATH%;%JAVA_HOME%\bin;%M2_HOME%\bin;%SCALA_HOME%\bin
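
To verify the configuration, open a new Command Prompt and check each tool's version; the exact output depends on the builds installed:

  java -version
  mvn -v
  scala -version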

  

III. Eclipse Configuration

1. Install the Eclipse Scala plugin
In Eclipse, select "Help" -> "Eclipse Marketplace", search for "Scala", and install the Scala IDE plugin

2. Configure the JDK in Eclipse

Go to Window -> Preferences and register the installed JDK (typically under Java -> Installed JREs)

3. Configure Maven in Eclipse
Point Eclipse at the Maven installation from section I, typically via Window -> Preferences -> Maven -> Installations

IV. Spark Application Development

1. Create a Maven project (in Eclipse: File -> New -> Other... -> Maven -> Maven Project)

2. Modify pom.xml to add the dependency packages

Note: If your application depends on packages other than Hadoop and Spark, you need to add those dependency packages to pom.xml in the same way, as in the sketch below.
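The original screenshot of the pom.xml additions is not preserved. As a minimal sketch for a Spark 1.x project of this vintage (the version numbers are assumptions; match them to your cluster), the dependency section would look something like this:

  <dependencies>
    <!-- Spark core for Scala 2.10; "provided" scope, since the cluster supplies it at runtime -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.0.0</version>
      <scope>provided</scope>
    </dependency>
    <!-- Hadoop client matching the cluster's Hadoop version -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.2.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>

Any additional third-party library your code uses gets its own <dependency> entry in the same way.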

3. Convert the project to a Java project (convenient for developing with the Spark Java API); a minimal code example follows below
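The screenshots for this step are missing, so here is an illustrative sketch rather than the original author's code: a minimal word count against the Spark 1.x Java API (the class name and the use of command-line arguments for paths are assumptions):

  import java.util.Arrays;

  import scala.Tuple2;

  import org.apache.spark.SparkConf;
  import org.apache.spark.api.java.JavaPairRDD;
  import org.apache.spark.api.java.JavaRDD;
  import org.apache.spark.api.java.JavaSparkContext;
  import org.apache.spark.api.java.function.FlatMapFunction;
  import org.apache.spark.api.java.function.Function2;
  import org.apache.spark.api.java.function.PairFunction;

  public class WordCount {
      public static void main(String[] args) {
          // The master is supplied by spark-submit, so it is not hard-coded here
          SparkConf conf = new SparkConf().setAppName("WordCount");
          JavaSparkContext sc = new JavaSparkContext(conf);

          // args[0] is the input path (e.g. a file on HDFS)
          JavaRDD<String> lines = sc.textFile(args[0]);

          // Split each line into words
          JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
              public Iterable<String> call(String line) {
                  return Arrays.asList(line.split(" "));
              }
          });

          // Map each word to (word, 1), then sum the counts per word
          JavaPairRDD<String, Integer> counts = words
              .mapToPair(new PairFunction<String, String, Integer>() {
                  public Tuple2<String, Integer> call(String word) {
                      return new Tuple2<String, Integer>(word, 1);
                  }
              })
              .reduceByKey(new Function2<Integer, Integer, Integer>() {
                  public Integer call(Integer a, Integer b) {
                      return a + b;
                  }
              });

          // args[1] is the output directory (must not already exist)
          counts.saveAsTextFile(args[1]);
          sc.stop();
      }
  }

After packaging (step 4), this is the class you would pass to spark-submit in step 5.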

4. Write the code, then compile and package
Right-click the project and run Maven clean followed by Maven install; the built jar package will then appear under the project's target directory (an equivalent command-line build is sketched below)
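Equivalently, the build can be run from a Command Prompt in the project directory (the project path here is hypothetical), assuming Maven is on the PATH as configured in section II:

  cd D:\workspace\yourproject
  mvn clean install

The jar is produced as target\<artifactId>-<version>.jar.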

5. Run the Spark application

Upload the compiled jar package to a client node of the Spark cluster, and run the program with the following command:
/usr/local/spark/bin/spark-submit --class yourMainClass --master yarn-cluster yourJarPath
yourMainClass: the fully qualified name of the class containing the main function
yourJarPath: the absolute path of the jar package
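
For example, with the hypothetical WordCount class sketched in step 3 (the jar and HDFS paths here are illustrative assumptions):

  /usr/local/spark/bin/spark-submit --class WordCount --master yarn-cluster /home/user/wordcount-1.0.jar hdfs:///input/words.txt hdfs:///output/counts

Arguments listed after the jar path are passed through to the application's main function.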

