Hadoop reading notes (viii) Packaging a MapReduce job into a jar: demo


Hadoop reading notes (i) Introduction to Hadoop: http://blog.csdn.net/caicongyang/article/details/39898629

Hadoop reading notes (ii) HDFS shell operations: http://blog.csdn.net/caicongyang/article/details/41253927

Hadoop reading notes (iii) Java API operations on HDFS: http://blog.csdn.net/caicongyang/article/details/41290955

Hadoop reading notes (iv) HDFS architecture: http://blog.csdn.net/caicongyang/article/details/41322649

Hadoop reading notes (v) MapReduce word count demo: http://blog.csdn.net/caicongyang/article/details/41453579

Hadoop reading notes (vi) MapReduce custom data type demo: http://blog.csdn.net/caicongyang/article/details/41490379

Hadoop reading notes (vii) MapReduce 0.x API usage demo: http://blog.csdn.net/caicongyang/article/details/41493325

1. Modifying the code

KpiApp.java

package cmd;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.mapreduce.lib.partition.HashPartitioner;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

/**
 * <p>Title: KpiApp.java</p>
 * <p>Description: traffic statistics (packaged as a jar and run from the
 * command line): extends Configured implements Tool</p>
 *
 * @author tom.cai
 * @created 2014-11-25 10:23:33 PM
 * @version V1.0
 */
public class KpiApp extends Configured implements Tool {

    public static void main(String[] args) throws Exception {
        ToolRunner.run(new KpiApp(), args);
    }

    @Override
    public int run(String[] arg0) throws Exception {
        String INPUT_PATH = arg0[0];
        String OUT_PATH = arg0[1];

        // Delete the output directory if it already exists
        FileSystem fileSystem = FileSystem.get(new URI(INPUT_PATH), new Configuration());
        Path outPath = new Path(OUT_PATH);
        if (fileSystem.exists(outPath)) {
            fileSystem.delete(outPath, true);
        }

        Job job = new Job(new Configuration(), KpiApp.class.getSimpleName());

        FileInputFormat.setInputPaths(job, INPUT_PATH);
        job.setInputFormatClass(TextInputFormat.class);

        job.setMapperClass(KpiMapper.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(KpiWite.class);

        job.setPartitionerClass(HashPartitioner.class);
        job.setNumReduceTasks(1);

        job.setReducerClass(KpiReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(KpiWite.class);

        FileOutputFormat.setOutputPath(job, new Path(OUT_PATH));
        job.setOutputFormatClass(TextOutputFormat.class);

        job.waitForCompletion(true);
        return 0;
    }

    static class KpiMapper extends Mapper<LongWritable, Text, Text, KpiWite> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] splited = value.toString().split("\t");
            String num = splited[1];
            KpiWite kpi = new KpiWite(splited[6], splited[7], splited[8], splited[9]);
            context.write(new Text(num), kpi);
        }
    }

    static class KpiReducer extends Reducer<Text, KpiWite, Text, KpiWite> {
        @Override
        protected void reduce(Text key, Iterable<KpiWite> value, Context context)
                throws IOException, InterruptedException {
            long upPackNum = 0L;
            long downPackNum = 0L;
            long upPayLoad = 0L;
            long downPayLoad = 0L;
            for (KpiWite kpi : value) {
                upPackNum += kpi.upPackNum;
                downPackNum += kpi.downPackNum;
                upPayLoad += kpi.upPayLoad;
                downPayLoad += kpi.downPayLoad;
            }
            context.write(key, new KpiWite(String.valueOf(upPackNum),
                    String.valueOf(downPackNum), String.valueOf(upPayLoad),
                    String.valueOf(downPayLoad)));
        }
    }
}

class KpiWite implements Writable {
    long upPackNum;
    long downPackNum;
    long upPayLoad;
    long downPayLoad;

    public KpiWite() {
    }

    public KpiWite(String upPackNum, String downPackNum, String upPayLoad, String downPayLoad) {
        this.upPackNum = Long.parseLong(upPackNum);
        this.downPackNum = Long.parseLong(downPackNum);
        this.upPayLoad = Long.parseLong(upPayLoad);
        this.downPayLoad = Long.parseLong(downPayLoad);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        this.upPackNum = in.readLong();
        this.downPackNum = in.readLong();
        this.upPayLoad = in.readLong();
        this.downPayLoad = in.readLong();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(upPackNum);
        out.writeLong(downPackNum);
        out.writeLong(upPayLoad);
        out.writeLong(downPayLoad);
    }
}
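The key point of the KpiWite class is that write() and readFields() must mirror each other exactly: Hadoop serializes the value between map and reduce, so the fields are read back in the same fixed order they were written. The sketch below demonstrates that round-trip with plain JDK streams only, so it runs without any Hadoop jars on the classpath; the class name KpiWiteRoundTrip and the sample numbers are made up for illustration.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Minimal stand-in for KpiWite's Writable contract: four longs streamed
// out in a fixed order, then read back in the same order.
public class KpiWiteRoundTrip {
    long upPackNum, downPackNum, upPayLoad, downPayLoad;

    void write(DataOutput out) throws IOException {
        out.writeLong(upPackNum);
        out.writeLong(downPackNum);
        out.writeLong(upPayLoad);
        out.writeLong(downPayLoad);
    }

    void readFields(DataInput in) throws IOException {
        upPackNum = in.readLong();
        downPackNum = in.readLong();
        upPayLoad = in.readLong();
        downPayLoad = in.readLong();
    }

    public static void main(String[] args) throws IOException {
        KpiWiteRoundTrip src = new KpiWiteRoundTrip();
        src.upPackNum = 3; src.downPackNum = 4; src.upPayLoad = 500; src.downPayLoad = 600;

        // Serialize to a byte buffer, the way Hadoop would between map and reduce
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        src.write(new DataOutputStream(buf));

        // Deserialize into a fresh instance and verify the fields survived
        KpiWiteRoundTrip dst = new KpiWiteRoundTrip();
        dst.readFields(new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));

        System.out.println(dst.upPackNum + " " + dst.downPackNum + " "
                + dst.upPayLoad + " " + dst.downPayLoad);
    }
}
```

If the read order ever diverged from the write order, the reducer would silently see scrambled field values, which is why the two methods are always kept symmetric.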

2. Packaging into a jar

Use Eclipse's Export feature to package the class above into Kpi.jar.

Upload the jar to the Linux machine and run it with the command: hadoop jar xxx.jar [input path] [output path]

For example: hadoop jar Kpi.jar hdfs://192.168.80.100:9000/wlan hdfs://192.168.80.100:9000/wlan_out
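Assuming a cluster reachable at 192.168.80.100 and a local log file named wlan.log (both illustrative, matching the example above), a full session might look like this:

```
# Upload the input data to HDFS (file name is illustrative)
hadoop fs -put wlan.log hdfs://192.168.80.100:9000/wlan

# Run the packaged job; arg0[0] is the input path, arg0[1] the output path
hadoop jar Kpi.jar hdfs://192.168.80.100:9000/wlan hdfs://192.168.80.100:9000/wlan_out

# Inspect the result (with one reduce task, TextOutputFormat writes part-r-00000)
hadoop fs -cat hdfs://192.168.80.100:9000/wlan_out/part-r-00000
```

Note that run() deletes the output directory if it already exists, so the job can be re-run with the same paths without a manual cleanup step.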

With these changes, the code from the previous article can now be run from the command line!


You are welcome to discuss and learn together!

Bookmark this if you find it useful!

Recording and sharing so that we can grow together! You are welcome to visit my other posts; my blog address: http://blog.csdn.net/caicongyang

