Spark example

Source: Internet
Author: User


1. Set up the Spark development environment in Java (from http://www.cnblogs.com/eczhou/p/5216918.html)

1.1 JDK Installation

Download the JDK from Oracle; I installed JDK 1.7. After installation, create a new system environment variable JAVA_HOME with the value "C:\Program Files\Java\jdk1.7.0_79" (adjust this to your installation path).

Add C:\Program Files\Java\jdk1.7.0_79\bin and C:\Program Files\Java\jre7\bin to the system Path variable.

1.2 Spark environment variable configuration

Download a pre-built Spark package (spark-1.6.0-bin-hadoop2.6) from the Apache Spark download page.

Unzip the downloaded file; assume the decompression directory is D:\spark-1.6.0-bin-hadoop2.6. Add D:\spark-1.6.0-bin-hadoop2.6\bin to the system Path variable, and create a new SPARK_HOME variable with the value D:\spark-1.6.0-bin-hadoop2.6.

1.3 Hadoop toolkit installation

Spark is built on Hadoop and calls the relevant Hadoop libraries while running. If the Hadoop runtime environment is not configured, related error messages are printed; although they do not affect the run, the Hadoop libraries are configured here as well.

1.3.1 Download the Hadoop 2.6 compiled package.

1.3.2 Unzip the downloaded folder and add the library directory D:\hadoop-2.6.0\bin to the system Path variable; create a HADOOP_HOME variable with the value D:\hadoop-2.6.0.
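Taken together, the environment variables from sections 1.1 through 1.3 can also be set from an elevated Windows command prompt instead of the System Properties dialog; a sketch, assuming the installation paths used above:

```shell
:: Set JAVA_HOME, SPARK_HOME and HADOOP_HOME as system-wide variables (/M)
setx JAVA_HOME "C:\Program Files\Java\jdk1.7.0_79" /M
setx SPARK_HOME "D:\spark-1.6.0-bin-hadoop2.6" /M
setx HADOOP_HOME "D:\hadoop-2.6.0" /M
:: Append the bin directories to the system Path
setx Path "%Path%;%JAVA_HOME%\bin;%SPARK_HOME%\bin;%HADOOP_HOME%\bin" /M
```

Open a new command prompt afterwards so the updated variables take effect.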

1.4 Eclipse environment

Create a new Java project and add spark-assembly-1.6.0-hadoop2.6.0.jar from D:\spark-1.6.0-bin-hadoop2.6\lib to the project's build path.
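Outside Eclipse, the same classpath setup can be expressed on the command line; a sketch, assuming the jar path from above and a WordCount.java in the current directory:

```shell
:: Compile against the Spark assembly jar, then run the program locally
javac -cp "D:\spark-1.6.0-bin-hadoop2.6\lib\spark-assembly-1.6.0-hadoop2.6.0.jar" -d . WordCount.java
java -cp ".;D:\spark-1.6.0-bin-hadoop2.6\lib\spark-assembly-1.6.0-hadoop2.6.0.jar" cn.spark.study.WordCount
```

Note the Windows classpath separator is a semicolon.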

2. Write the Spark WordCount program in Java

package cn.spark.study;

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.api.java.function.VoidFunction;

import scala.Tuple2;

public class WordCount {
    public static void main(String[] args) {
        // Create a SparkConf object and configure the application
        SparkConf conf = new SparkConf().setAppName("WordCount").setMaster("local");
        // Create the context object using conf
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Create the initial RDD
        JavaRDD<String> lines = sc.textFile("D://spark.txt");

        // ---- Use Transformation operators to operate on the RDD ----
        // Split each line into words
        JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Iterable<String> call(String line) throws Exception {
                return Arrays.asList(line.split(" "));
            }
        });

        // Pair each word with the count 1
        JavaPairRDD<String, Integer> pairs = words.mapToPair(new PairFunction<String, String, Integer>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Tuple2<String, Integer> call(String word) throws Exception {
                return new Tuple2<String, Integer>(word, 1);
            }
        });

        // Sum the counts per word
        JavaPairRDD<String, Integer> wordCounts = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Integer call(Integer v1, Integer v2) throws Exception {
                return v1 + v2;
            }
        });

        // ---- Trigger the job with an action operator ----
        wordCounts.foreach(new VoidFunction<Tuple2<String, Integer>>() {
            private static final long serialVersionUID = 1L;
            @Override
            public void call(Tuple2<String, Integer> wordCount) throws Exception {
                System.out.println(wordCount._1 + " appeared " + wordCount._2 + " times");
            }
        });

        sc.close();
    }
}
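For readers without a Spark installation at hand, the flatMap → mapToPair → reduceByKey pipeline above can be mimicked in plain, JDK 1.7-compatible Java with a HashMap; a minimal local sketch (the class name LocalWordCount is chosen here for illustration):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class LocalWordCount {

    // Split each line into words (flatMap), pair each word with 1 (mapToPair),
    // and sum the counts per word (reduceByKey) -- all in local memory.
    static Map<String, Integer> count(List<String> lines) {
        Map<String, Integer> counts = new HashMap<String, Integer>();
        for (String line : lines) {
            for (String word : line.split(" ")) {
                Integer current = counts.get(word);
                counts.put(word, current == null ? 1 : current + 1);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = count(Arrays.asList("hello spark", "hello java"));
        for (Map.Entry<String, Integer> entry : counts.entrySet()) {
            System.out.println(entry.getKey() + " appeared " + entry.getValue() + " times");
        }
    }
}
```

Spark distributes exactly this shuffle-and-sum step across partitions, which is why reduceByKey takes an associative two-argument function.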

 
