Combiner programming
1. Each map task can generate a large amount of output; the combiner's job is to merge that output on the map side, reducing the amount of data transferred to the reducer.
2. The combiner is the most basic form of local key merging, similar to a local reduce. Without a combiner, every record travels to the reducers before any merging happens, which is relatively inefficient.
3. With a combiner, each map task first aggregates its output locally, which speeds up the job.
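The local aggregation described in the list above can be sketched without Hadoop at all. The following is a minimal, self-contained simulation (hypothetical words and counts; a plain Java map stands in for the MapReduce framework) showing how a combiner collapses repeated (word, 1) pairs emitted by one map task before anything crosses the network:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LocalAggregation {
    // Simulates a combiner: collapse repeated (word, 1) records emitted by one
    // map task into (word, partialCount) pairs before they are sent to a reducer.
    static Map<String, Integer> combine(List<String> mapOutputKeys) {
        Map<String, Integer> combined = new LinkedHashMap<>();
        for (String word : mapOutputKeys) {
            combined.merge(word, 1, Integer::sum);
        }
        return combined;
    }

    public static void main(String[] args) {
        // One map task emitted six (word, 1) records...
        List<String> mapOutput = List.of("hadoop", "hdfs", "hadoop", "hadoop", "hdfs", "yarn");
        Map<String, Integer> combined = combine(mapOutput);
        // ...but only three records would cross the network to the reducer.
        System.out.println(combined); // {hadoop=3, hdfs=2, yarn=1}
    }
}
```

Six records shrink to three; on real jobs with skewed keys the reduction in shuffle traffic can be far larger.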
The combiner phase is optional. A combiner is really a reduce operation, which is why the WordCount job can register its reducer class in this role: a combiner is a localized reduce that runs as a follow-up to the map operation, merging records with duplicate keys in the map output before it leaves the map task. For example, when counting word frequencies, the map phase emits ("hadoop", 1) each time the word appears; if "hadoop" occurs many times in the input, the map output file contains many redundant records. Merging records with the same key before the reduce computation makes the intermediate file smaller and improves transmission efficiency over the network, and in Hadoop clusters network bandwidth is often the computing bottleneck and among the most valuable resources.

Using a combiner is risky, however. The governing principle is that adding a combiner must not change the final input to, or result of, the reduce computation. Aggregations such as sum, maximum, and minimum can safely use a combiner, but computing an average through a combiner yields a wrong final reduce result.
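The average pitfall mentioned above is easy to see with concrete numbers. Below is a small arithmetic sketch (made-up values, no Hadoop dependency) in which each "combiner" averages its own split and the "reducer" then averages those averages, producing a different answer than averaging all values at once, whereas partial sums remain safe:

```java
import java.util.List;

public class AveragePitfall {
    // Arithmetic mean of a list of values.
    static double avg(List<Double> xs) {
        return xs.stream().mapToDouble(Double::doubleValue).average().orElse(0);
    }

    public static void main(String[] args) {
        // Two map tasks hold different numbers of values for the same key.
        List<Double> split1 = List.of(1.0, 2.0, 3.0);
        List<Double> split2 = List.of(4.0, 5.0);

        // Correct: the reduce sees all five raw values at once.
        double trueAvg = avg(List.of(1.0, 2.0, 3.0, 4.0, 5.0)); // 3.0

        // Wrong: each "combiner" averages its split, then reduce averages the averages.
        double combinedAvg = avg(List.of(avg(split1), avg(split2))); // (2.0 + 4.5) / 2 = 3.25

        // Sum, by contrast, is associative: summing partial sums gives the same total.
        double totalSum = (1.0 + 2.0 + 3.0) + (4.0 + 5.0); // 15.0 either way

        System.out.println(trueAvg + " vs " + combinedAvg + ", sum=" + totalSum);
    }
}
```

The combined average is skewed because the two splits contribute unequal numbers of values but are weighted equally; sum, max, and min have no such weighting problem.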
Code implementation:
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// A combiner's output key/value types must match the map output types, because
// its output is fed into reduce; hence IntWritable (not LongWritable) here.
public class WordCountCombiner extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int count = 0;
        // Iterate over all values for this key and accumulate them
        for (IntWritable value : values) {
            count += value.get();
        }
        // Emit the key together with its aggregated count
        context.write(key, new IntWritable(count));
    }
}
Add the following to the main method:
public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Create a Job object to assemble our WordCount program
    Job wcjob = Job.getInstance(conf);
    /* Add the combiner component */
    // wcjob.setCombinerClass(WordCountCombiner.class);
    wcjob.setCombinerClass(WordCountReducer.class);
}
The two setCombinerClass calls above mean the same thing: whatever class is passed to setCombinerClass runs as a local reduce inside each map task, and since WordCountCombiner extends the same Reducer base class and performs the same summation as WordCountReducer, either class can be used.
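The interchangeability works because the summing function is associative: applying it once per map task and again at reduce time gives the same total as applying it once over all the values. A quick Hadoop-free check (hypothetical counts; plain Java in place of the framework):

```java
import java.util.List;

public class CombinerEquivalence {
    // The same associative "reduce" logic, reused as both combiner and reducer.
    static long sum(List<Long> values) {
        long total = 0;
        for (long v : values) {
            total += v;
        }
        return total;
    }

    public static void main(String[] args) {
        List<Long> map1 = List.of(1L, 1L, 1L); // counts emitted by map task 1
        List<Long> map2 = List.of(1L, 1L);     // counts emitted by map task 2

        // Without a combiner: reduce sums all five raw values.
        long direct = sum(List.of(1L, 1L, 1L, 1L, 1L));

        // With a combiner: sum once per map task, then sum the partial sums.
        long combined = sum(List.of(sum(map1), sum(map2)));

        System.out.println(direct + " == " + combined); // 5 == 5
    }
}
```

Because the two paths always agree for an associative, commutative operation, the reducer class can safely double as the combiner for word count.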