Hadoop's HelloWorld: running the WordCount example


1. Write a Java program that counts words, named WordCount.java, with the following code:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: splits each input line into tokens and emits (word, 1) for every token.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer (also used as the combiner): sums the counts for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
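The mapper splits each input line into tokens and emits a (word, 1) pair for every token; the combiner and reducer then sum the counts per word. As an illustration, assuming a hypothetical input file containing the single line "Hello Hadoop Hello World", the job would write the following to its output file (each line is a word and its count, separated by a tab):

Hadoop  1
Hello   2
World   1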

2. Declare the Java environment variable:

export JAVA_HOME=/usr/java/default
export PATH=${JAVA_HOME}/bin:${PATH}
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
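These exports only last for the current shell session. To make them persistent, they can be appended to your shell profile; a minimal sketch, assuming a bash shell and ~/.bashrc as the profile file:

echo 'export JAVA_HOME=/usr/java/default' >> ~/.bashrc
echo 'export PATH=${JAVA_HOME}/bin:${PATH}' >> ~/.bashrc
echo 'export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar' >> ~/.bashrc
source ~/.bashrc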

Note: if the environment variables above are not declared, an error will be reported when you run the later steps.

3. Compile and create the jar package.

bin/hadoop com.sun.tools.javac.Main WordCount.java
jar cf wc.jar WordCount*.class
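If compilation succeeds, three class files are produced (one for the outer class and one for each nested class); you can confirm this before packaging:

ls WordCount*.class

This should list WordCount.class, WordCount$TokenizerMapper.class, and WordCount$IntSumReducer.class, all of which the jar cf command bundles into wc.jar.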

4. Run the wc.jar package built in step 3. Note that the output folder must not be created manually; it is created automatically when the job runs, and the job will fail if the output path already exists.
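Before submitting the job, the input directory must already exist in HDFS and contain the text files to analyze; a minimal preparation sketch follows, where the local file name file01 is only a placeholder for your own data:

bin/hadoop fs -mkdir -p /user/root/wordcount/input
bin/hadoop fs -put file01 /user/root/wordcount/input

With the input in place, submit the job: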

bin/hadoop jar wc.jar WordCount /user/root/wordcount/input /user/root/wordcount/output

When the job completes normally, two files, part-r-00000 and _SUCCESS, are generated under the output folder; the analysis results are stored in part-r-00000.
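Both files can be confirmed by listing the output directory:

bin/hadoop fs -ls /user/root/wordcount/output

To print the results themselves, run: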

bin/hadoop fs -cat /user/root/wordcount/output/part-r-00000

This displays the analysis results, one word and its count per line.

This completes the example.
