024_MapReduce: the Mapper base class and the Reducer base class

Source: Internet
Author: User

Content Outline

1) The Mapper base class in MapReduce: the parent class that custom mapper classes extend.

2) The Reducer base class in MapReduce: the parent class that custom reducer classes extend.

1. Mapper Class

API documentation

1) InputSplit: the input split; InputFormat: the input format.

2) Sorting and grouping of the mapper's output.

3) Partitioning the mapper's output according to the number of reducers (one partition per reducer).

4) Combining the mapper's output data locally with a combiner.
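Each of these four knobs maps to a setter on the Job class. The sketch below is a hypothetical driver fragment: the setter methods are the real org.apache.hadoop.mapreduce.Job API, while MyMapper, MyReducer, MyPartitioner, MyKeyComparator and MyGroupComparator are placeholder names for your own classes.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// Hypothetical driver fragment; placeholder class names, real Job setters.
Job job = Job.getInstance(new Configuration(), "example");
job.setMapperClass(MyMapper.class);                       // 1) one map task per InputSplit
job.setSortComparatorClass(MyKeyComparator.class);        // 2) sort order of mapper output keys
job.setGroupingComparatorClass(MyGroupComparator.class);  // 2) grouping of values per reduce() call
job.setPartitionerClass(MyPartitioner.class);             // 3) routes each key to a reducer
job.setNumReduceTasks(4);                                 // 3) partition count = number of reducers
job.setCombinerClass(MyReducer.class);                    // 4) local aggregation of mapper output
```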

    • The Mapper class description in the official Hadoop documentation:

Maps input key/value pairs to a set of intermediate key/value pairs.

Maps are the individual tasks which transform input records into intermediate records. The transformed intermediate records need not be of the same type as the input records. A given input pair may map to zero or many output pairs.

The Hadoop MapReduce framework spawns one map task for each InputSplit generated by the InputFormat for the job. Mapper implementations can access the Configuration for the job via JobContext.getConfiguration().

The framework first calls setup(org.apache.hadoop.mapreduce.Mapper.Context), followed by map(Object, Object, Context) for each key/value pair in the InputSplit. Finally, cleanup(Context) is called.

All intermediate values associated with a given output key are subsequently grouped by the framework and passed to a Reducer to determine the final output. Users can control the sorting and grouping by specifying two key RawComparator classes.

The Mapper outputs are partitioned per Reducer. Users can control which keys (and hence records) go to which Reducer by implementing a custom Partitioner.

Users can optionally specify a combiner, via Job.setCombinerClass(Class), to perform local aggregation of the intermediate outputs, which helps to cut down the amount of data transferred from the Mapper to the Reducer.

Applications can specify if and how the intermediate outputs are to be compressed, and which CompressionCodecs are to be used, via the Configuration.

If the job has zero reduces, the output of the Mapper is written directly to the OutputFormat without sorting by keys.

    • Structure of the Mapper class:
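The structure can be summarized by this abridged excerpt of org.apache.hadoop.mapreduce.Mapper from the Hadoop source (comments added; it compiles only against the Hadoop libraries):

```java
public class Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT> {

  public abstract class Context
      implements MapContext<KEYIN, VALUEIN, KEYOUT, VALUEOUT> {
  }

  // Called once at the beginning of the task.
  protected void setup(Context context) throws IOException, InterruptedException {
  }

  // Called once for each key/value pair in the input split.
  protected void map(KEYIN key, VALUEIN value, Context context)
      throws IOException, InterruptedException {
    context.write((KEYOUT) key, (VALUEOUT) value);  // identity map by default
  }

  // Called once at the end of the task.
  protected void cleanup(Context context) throws IOException, InterruptedException {
  }

  // Expert users may override run() for more control over task execution.
  public void run(Context context) throws IOException, InterruptedException {
    setup(context);
    try {
      while (context.nextKeyValue()) {
        map(context.getCurrentKey(), context.getCurrentValue(), context);
      }
    } finally {
      cleanup(context);
    }
  }
}
```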

  

    • The methods fall into two categories:

The first category: protected methods, which the user overrides according to actual needs.

1) setup(): called once, at the start of each task.

2) map(): called once for each key/value pair.

3) cleanup(): called once, at the end of each task.

The second category: the run() method.

The run() method is the entry point of the Mapper class; it calls the three methods setup(), map(), and cleanup().
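To make the call order concrete, here is a minimal self-contained sketch (plain Java, no Hadoop dependency; MiniMapper and MiniContext are hypothetical stand-ins) whose run() mirrors the real one and logs when each lifecycle method fires:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Self-contained simulation of the Mapper lifecycle (no Hadoop dependency).
public class LifecycleDemo {

  // Hypothetical stand-in for Mapper.Context: iterates records, keeps a log.
  static class MiniContext {
    private final Iterator<String> input;
    private String current;
    final List<String> log = new ArrayList<>();

    MiniContext(List<String> records) { input = records.iterator(); }

    boolean nextKeyValue() {
      if (!input.hasNext()) return false;
      current = input.next();
      return true;
    }

    String getCurrentValue() { return current; }
  }

  // Hypothetical stand-in for Mapper with the same three overridable methods.
  static class MiniMapper {
    protected void setup(MiniContext ctx)   { ctx.log.add("setup"); }
    protected void map(String value, MiniContext ctx) { ctx.log.add("map:" + value); }
    protected void cleanup(MiniContext ctx) { ctx.log.add("cleanup"); }

    // Mirrors the call order of org.apache.hadoop.mapreduce.Mapper.run():
    // setup once, map per record, cleanup once (in a finally block).
    public void run(MiniContext ctx) {
      setup(ctx);
      try {
        while (ctx.nextKeyValue()) {
          map(ctx.getCurrentValue(), ctx);
        }
      } finally {
        cleanup(ctx);
      }
    }
  }

  public static void main(String[] args) {
    MiniContext ctx = new MiniContext(Arrays.asList("a", "b"));
    new MiniMapper().run(ctx);
    System.out.println(ctx.log);  // prints [setup, map:a, map:b, cleanup]
  }
}
```

Running it shows exactly the sequence the text describes: setup once, then map per key/value pair, then cleanup.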
