Developing a Spark Standalone-mode application in MyEclipse using Java

I. Environment configuration

Although MyEclipse bundles a Maven plugin, creating a Maven project fails because the bundled version is too old.

Solution: download the latest version of Maven from the official site http://maven.apache.org/, unzip it, and register it in the environment variables.

Create a new environment variable M2_HOME pointing to the unzipped Maven directory.

Append Maven's bin directory to the PATH variable.

After the configuration is complete, run mvn -v at the Windows command prompt; if version information is printed, the configuration succeeded.

Once the configuration is verified, replace the bundled Maven plugin in MyEclipse with the newly installed Maven.

II. Spark application development

1. Create Maven Project

2. Write the Java source program

/* SimpleApp.java */
import org.apache.spark.api.java.*;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
    public static void main(String[] args) {
        String logFile = "file:///spark-bin-0.9.1/README.md";
        SparkConf conf = new SparkConf().setAppName("Spark Application in Java");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) {
                return s.contains("a");
            }
        }).count();

        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) {
                return s.contains("b");
            }
        }).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
    }
}
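The Spark job above simply counts the lines of the input file that contain "a" and "b". As a sanity check, the same counting logic can be sketched in plain Java without Spark; the class and method names here are illustrative, not part of the Spark API:

```java
import java.util.Arrays;
import java.util.List;

// Plain-Java illustration of what the Spark job computes.
public class LineCounts {
    // Mirrors logData.filter(s -> s.contains(token)).count() in the Spark program.
    public static long countLinesContaining(List<String> lines, String token) {
        long n = 0;
        for (String line : lines) {
            if (line.contains(token)) {
                n++;
            }
        }
        return n;
    }

    public static void main(String[] args) {
        // A small in-memory stand-in for the lines of README.md.
        List<String> lines = Arrays.asList("apple pie", "banana", "cherry");
        System.out.println("Lines with a: " + countLinesContaining(lines, "a")
                + ", lines with b: " + countLinesContaining(lines, "b"));
    }
}
```

Running the Spark version distributes exactly this filter-and-count over the cluster's partitions of the file.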

3. Modify pom.xml to add the dependency packages

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>cn.cas.siat.dolphin</groupId>
  <artifactId>Spark.SimpleApp</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>Spark.SimpleApp</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.0.2</version>
    </dependency>
  </dependencies>
</project>

4. Compile and package

Run maven clean and then maven install; the project's jar package is generated in the project's target directory.

5. Run the Spark app

Upload the compiled jar package to the Spark cluster client and run the program with the following command:

bin/spark-submit --class "SimpleApp" --master spark://... Spark.SimpleApp-0.0.1-SNAPSHOT.jar

6. Execution results

The results can be viewed in the Spark Web UI.
