Installing the Hadoop Eclipse Plug-in on 64-bit Win7 and Writing and Running the WordCount Program


Environment:

Win7 64-bit

hadoop-2.6.0


Steps:


1. Download the hadoop-eclipse-plugin-2.6.0.jar package.

2. Put hadoop-eclipse-plugin-2.6.0.jar into the plugins directory under the Eclipse installation directory.

3. Open Eclipse; you should see a new DFS Locations entry on the left.


4. Extract hadoop-2.6.0 on the Win7 machine.

5. Download hadoop.dll, winutils.exe, and related files.

They must support Hadoop 2.6 (an older hadoop.dll will cause errors). After downloading, copy the files into Hadoop's bin directory; if a file already exists there, skip it and do not overwrite the original files in the bin directory.
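If the WordCount job later fails on Windows because winutils.exe cannot be found, a common workaround (not part of the original steps, shown here only as a sketch) is either to set the HADOOP_HOME environment variable to the extracted directory, or to set the hadoop.home.dir system property at the top of main() in the WordCount class from step 15; the path below is only an example:

    // Example path only; replace with your own hadoop-2.6.0 extraction directory.
    System.setProperty("hadoop.home.dir", "D:\\hadoop-2.6.0");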

6. In Window -> Preferences -> Hadoop Map/Reduce, set the installation directory to the directory where you extracted Hadoop.



7. Show the Map/Reduce perspective.

Select Window -> Open Perspective -> Other -> Map/Reduce.


8. Create an HDFS connection.

Right-click in the Map/Reduce Locations tab and select New.



9. A configuration dialog pops up.

The host and port fields should match the addresses and ports you configured in mapred-site.xml and core-site.xml, respectively, as sketched below.
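For reference, the DFS Master host and port normally correspond to the fs.defaultFS entry in core-site.xml, and the Map/Reduce Master fields should match whatever address is set in your mapred-site.xml. The snippet below is only a sketch with placeholder values, not the configuration from the original article:

<!-- core-site.xml (placeholder values) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.1.100:9000</value>
  </property>
</configuration>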



10. Check whether the connection is successful; if it is, you can see the files on HDFS.



11. Create a new Map/Reduce project.



12. In the next step, fill in the project name and confirm.

13. If the project automatically imports Hadoop jar packages, it is recommended to remove them all, because hadoop-2.6.0 contains many copies of the same jars under different directories, which would lead to duplicates.

14. Search the hadoop-2.6.0 directory for all jar packages and copy them into a new directory of your own, skipping duplicates, then import all of those jars into the project.

15. Write the WordCount program: create a new class with the following contents:

package hdwordcount;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

    // Mapper: splits each input line into tokens and emits (word, 1) for each token.
    public static class TokenizerMapper extends
            Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer: sums the counts for each word; it is also reused as the combiner
    // below to pre-aggregate counts on the map side.
    public static class IntSumReducer extends
            Reducer<Text, IntWritable, Text, IntWritable> {

        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values,
                Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    // Driver: parses the input/output paths from the command line and submits the job.
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args)
                .getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: wordcount <in> <out>");
            System.exit(2);
        }
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}


16. Right-click the class, choose Run As -> Run Configurations, and set the program arguments.

One argument is the input directory and the other is the output directory (the output directory must not already exist).
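For example, with a placeholder NameNode address and paths (not values from the original article), the two program arguments might look like:

hdfs://192.168.1.100:9000/user/hadoop/input hdfs://192.168.1.100:9000/user/hadoop/output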


17. Right-click the class and choose Run As -> Run on Hadoop. When the job finishes, view the contents of the files in the output directory.
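As a rough sketch of what to expect: if the (made-up) input file contained the line "hello hadoop hello world", the part-r-00000 file in the output directory would hold tab-separated word counts like:

hadoop	1
hello	2
world	1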

18. Done!

This article is from the "Phoenix Nirvana" blog; please be sure to keep this source: http://yntmdr.blog.51cto.com/3829621/1633528
