[Reproduced] Hadoop Sample Program WordCount Explained in Detail

Source: Internet
Author: User

I have recently been studying cloud computing and looking into the Hadoop framework. It took me a full day to get Hadoop running properly under Linux, after which I carefully worked through the official MapReduce demo program, WordCount. It makes a good primer.

WordCount itself is not difficult, but you are suddenly exposed to a lot of APIs, some of which look strange, and compared with traditional development, MapReduce really is a new programming concept. To save newcomers some detours, I have added comments to many of the APIs used in WordCount. Once you understand these methods, the program is very simple: it splits the input text into words, processes them first with map and then with reduce, sets some configuration in the main function, and then runs the job; that is the whole program. Enough talk, here is the code:
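Before reading the code, here is a small worked example of the data flow (the input line is made up for illustration; it is not from the original article). Given the single input line "hello world hello", map emits one (word, 1) pair per token, and reduce sums the values that share the same key:

    input line:     hello world hello
    map output:     (hello, 1), (world, 1), (hello, 1)
    reduce output:  (hello, 2), (world, 1)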

package com.felix;

import java.io.IOException;
import java.util.Iterator;
import java.util.StringTokenizer;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;
import org.apache.hadoop.mapred.TextInputFormat;
import org.apache.hadoop.mapred.TextOutputFormat;

/**
 * Description: WordCount explained by Felix
 * @author Hadoop Dev Group
 */
public class WordCount
{
    /**
     * MapReduceBase class: base class for Mapper and Reducer implementations
     * (its methods simply implement the interfaces without doing anything).
     * Mapper interface:
     * WritableComparable interface: classes that implement WritableComparable can be
     * compared with each other. All classes used as keys should implement this interface.
     * Reporter can be used to report the running progress of the whole application;
     * it is not used in this example.
     */
    public static class Map extends MapReduceBase implements
            Mapper<LongWritable, Text, Text, IntWritable>
    {
        /**
         * LongWritable, IntWritable and Text are classes implemented in Hadoop that wrap
         * Java data types and implement the WritableComparable interface.
         * They can be serialized to simplify data exchange in a distributed environment;
         * you can think of them as replacements for long, int and String.
         */
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        /**
         * The map method of the Mapper interface:
         * void map(K1 key, V1 value, OutputCollector<K2, V2> output, Reporter reporter)
         * maps a single input k/v pair to an intermediate k/v pair.
         * The output pair is not required to be of the same type as the input pair;
         * an input pair may be mapped to zero or more output pairs.
         * OutputCollector interface: collects the <k, v> pairs output by Mapper and Reducer.
         * The collect(k, v) method of OutputCollector adds a (k, v) pair to the output.
         */
        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException
        {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens())
            {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    public static class Reduce extends MapReduceBase implements
            Reducer<Text, IntWritable, Text, IntWritable>
    {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException
        {
            int sum = 0;
            while (values.hasNext())
            {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception
    {
        /**
         * JobConf: the map/reduce job configuration class; it describes the work to be
         * performed by a map-reduce job to the Hadoop framework.
         * Constructors: JobConf(), JobConf(Class exampleClass), JobConf(Configuration conf), etc.
         */
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("WordCount");                   // set a user-defined job name

        conf.setOutputKeyClass(Text.class);             // set the key class for the job's output data
        conf.setOutputValueClass(IntWritable.class);    // set the value class for the job's output data

        conf.setMapperClass(Map.class);                 // set the Mapper class for the job
        conf.setCombinerClass(Reduce.class);            // set the Combiner class for the job
        conf.setReducerClass(Reduce.class);             // set the Reducer class for the job

        conf.setInputFormat(TextInputFormat.class);     // set the InputFormat implementation class for the job
        conf.setOutputFormat(TextOutputFormat.class);   // set the OutputFormat implementation class for the job

        /**
         * InputFormat describes the input specification of a map-reduce job.
         * setInputPaths(): sets the array of paths as the input list of the map-reduce job.
         * setOutputPath(): sets the path as the output directory of the map-reduce job.
         */
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);                         // run the job
    }
}
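If you want to try the program yourself, a minimal way to compile and run it might look like the sketch below. This is only an illustration: it assumes a working Hadoop installation whose hadoop command is on the PATH and supports the hadoop classpath subcommand (otherwise point -classpath at your Hadoop jars directly), and the jar name, class output directory, and HDFS input/output paths are all made-up placeholders. The output directory must not exist before the job runs.

    # compile against the Hadoop jars and package the classes (names are placeholders)
    mkdir classes
    javac -classpath `hadoop classpath` -d classes WordCount.java
    jar cvf wordcount.jar -C classes .

    # run the job: args[0] is the input path, args[1] the output path (must not exist yet)
    hadoop jar wordcount.jar com.felix.WordCount /user/felix/input /user/felix/output

    # the word counts end up in files such as part-00000 under the output directory
    hadoop fs -cat /user/felix/output/part-00000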

(Article reproduced from: http://www.iteye.com/topic/606962)

