I. Environment configuration
Although a Maven plugin is already integrated in MyEclipse, creating a Maven project fails because the bundled plugin version is too old.
Solution: download the latest Maven release from the official website (http://maven.apache.org/), unzip it, and register it in the environment variables.
Create a new environment variable M2_HOME pointing to the unzipped Maven directory.
Add Maven's bin directory to the Path variable.
After the configuration is complete, open a Windows command prompt and run mvn -v; if Maven prints its version information, the configuration succeeded.
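Since the whole problem here is a too-old plugin version, the check behind "is this version new enough?" can be sketched in plain Java. The class name `VersionCheck` and its `atLeast` helper are hypothetical illustrations, not part of Maven or MyEclipse; you would feed it the version string that `mvn -v` prints.

```java
public class VersionCheck {
    // Hypothetical helper: compares two dotted version strings numerically.
    // Returns true when `installed` is at least `required`.
    static boolean atLeast(String installed, String required) {
        String[] a = installed.split("\\.");
        String[] b = required.split("\\.");
        int n = Math.max(a.length, b.length);
        for (int i = 0; i < n; i++) {
            // Missing components count as 0, so "3.0" == "3.0.0".
            int x = i < a.length ? Integer.parseInt(a[i]) : 0;
            int y = i < b.length ? Integer.parseInt(b[i]) : 0;
            if (x != y) return x > y;
        }
        return true;
    }

    public static void main(String[] args) {
        // Demo values; in practice, use the version reported by `mvn -v`.
        System.out.println(atLeast("3.2.1", "3.0")); // prints true
        System.out.println(atLeast("2.2.1", "3.0")); // prints false
    }
}
```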
Once the configuration works, point MyEclipse at the new Maven installation in place of the bundled plugin.
II. Spark application development
1. Create Maven Project
2. Write the Java source program
/* SimpleApp.java */
import org.apache.spark.api.java.*;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;

public class SimpleApp {
    public static void main(String[] args) {
        String logFile = "file:///spark-bin-0.9.1/README.md";
        SparkConf conf = new SparkConf().setAppName("Spark Application in Java");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("a"); }
        }).count();

        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("b"); }
        }).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
    }
}
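For intuition, the same filter-and-count logic can be sketched with plain Java streams, with no Spark cluster needed. The in-memory list below is an invented stand-in for the README.md lines that Spark would load into the RDD; only the counting pattern matches the program above.

```java
import java.util.Arrays;
import java.util.List;

public class SimpleAppLocal {
    public static void main(String[] args) {
        // Invented demo data standing in for the RDD's contents.
        List<String> logData = Arrays.asList("apple spark", "banana", "cherry");

        // Same predicate-based counting as logData.filter(...).count() above.
        long numAs = logData.stream().filter(s -> s.contains("a")).count();
        long numBs = logData.stream().filter(s -> s.contains("b")).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
        // prints: Lines with a: 2, lines with b: 1
    }
}
```

The difference is where the work happens: the stream filters the list in one JVM, while Spark evaluates the same kind of predicate in parallel across the cluster's partitions.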
3. Modify pom.xml to add the dependency packages
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>cn.cas.siat.dolphin</groupId>
  <artifactId>Spark.SimpleApp</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>Spark.SimpleApp</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.0.2</version>
    </dependency>
  </dependencies>
</project>
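Spark 1.0.2 with spark-core_2.10 predates Java 8, so it can also help to pin the compiler level in the same pom.xml. The maven-compiler-plugin stanza below is an illustrative addition (not part of the original project file); adjust the source/target level to match your JDK.

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.1</version>
      <configuration>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
  </plugins>
</build>
```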
4. Compile the package:
Run maven clean and then maven install; this generates the project's jar package in the target directory under the project.
5. Run the Spark app
Upload the compiled jar package to the Spark cluster client and run the program with the following command:
" foo. APP" --master Spark://
6. Execution results
Web UI Results