Use Eclipse to develop Hadoop on Windows



1. Configure the Hadoop plug-in

1. Install the plug-in

Copy hadoop-eclipse-plugin-1.1.2.jar into the eclipse/plugins directory and restart Eclipse.


2. Open the MapReduce View

Window -> Open Perspective -> Other..., then select the Map/Reduce perspective (the entry with the blue elephant icon).


3. Add a MapReduce Environment

At the bottom of Eclipse there will be a "Map/Reduce Locations" tab next to the Console. Right-click the blank area in that tab and select "New Hadoop location". Fill in the following fields (a sketch of the matching XML entries follows the list):

Location name (any name you like)

Map/Reduce Master (the host/IP and port of the JobTracker, taken from the mapred.job.tracker property in mapred-site.xml)

DFS Master (the host/IP and port of the NameNode, taken from the fs.default.name property in core-site.xml)
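
For reference, the two dialog fields mirror entries like the following in the cluster's configuration files. This is only a minimal sketch, assuming the hadoop001 host and the 9000/9001 ports used in the WordCount example later in this article:

In core-site.xml:

    <property>
        <name>fs.default.name</name>
        <value>hdfs://hadoop001:9000</value>
    </property>

In mapred-site.xml:

    <property>
        <name>mapred.job.tracker</name>
        <value>hadoop001:9001</value>
    </property>

With these values, DFS Master is hadoop001:9000 and Map/Reduce Master is hadoop001:9001.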



4. Use Eclipse to modify HDFS content

After the previous step, the configured HDFS location appears in the Project Explorer on the left. Right-click it to create and delete files and directories.

Note: changes are not shown immediately in Eclipse after each operation; you must refresh the view to see them.
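
The same create/delete operations can also be done from code through the HDFS FileSystem API, which is a convenient way to test the connection. A minimal sketch, assuming the hdfs://hadoop001:9000 address used elsewhere in this article; the class name and paths are illustrative only:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsOps {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.default.name", "hdfs://hadoop001:9000"); // assumed NameNode address
            FileSystem fs = FileSystem.get(conf);

            Path dir = new Path("/user/root/tmp_dir");       // illustrative path
            fs.mkdirs(dir);                                   // create a directory
            fs.create(new Path(dir, "empty.txt")).close();    // create an empty file
            fs.delete(dir, true);                             // delete it recursively
            fs.close();
        }
    }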



2. Develop Hadoop programs

1. WordCount.java

import java.io.File;
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        // Emit (word, 1) for every token in the input line.
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        // Sum the counts for each word.
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a temporary jar from the compiled classes so the job can be
        // submitted to the cluster directly from Eclipse.
        File jarFile = EJob.createTempJar("bin");
        EJob.addClasspath("/cloud/hadoop/conf");
        ClassLoader classLoader = EJob.getClassLoader();
        Thread.currentThread().setContextClassLoader(classLoader);

        // Set the Hadoop configuration parameters.
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://hadoop001:9000");
        conf.set("hadoop.job.user", "root");
        conf.set("mapred.job.tracker", "hadoop001:9001");

        Job job = new Job(conf, "word count");
        ((JobConf) job.getConfiguration()).setJar(jarFile.toString());
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        String input = "hdfs://hadoop001:9000/user/root/tmp_file_1";
        String output = "hdfs://hadoop001:9000/user/root/tmp_file_2";
        FileInputFormat.addInputPath(job, new Path(input));
        FileOutputFormat.setOutputPath(job, new Path(output));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
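
Run the class as an ordinary Java application from Eclipse; the temporary jar built by EJob (next section) is what actually gets submitted to the JobTracker. Note that the output path (tmp_file_2 here) must not already exist in HDFS, otherwise the job fails its output check before it starts.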

2. EJob.java

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;
import java.util.jar.Manifest;

// Helper that packages compiled classes into a temporary jar file.
public class EJob {

    private static List<URL> classPath = new ArrayList<URL>();

    // Pack everything under the given root directory into a temporary jar.
    public static File createTempJar(String root) throws IOException {
        if (!new File(root).exists()) {
            return null;
        }
        Manifest manifest = new Manifest();
        manifest.getMainAttributes().putValue("Manifest-Version", "1.0");
        final File jarFile = File.createTempFile("EJob-", ".jar",
                new File(System.getProperty("java.io.tmpdir")));
        // Remove the temporary jar when the JVM exits.
        Runtime.getRuntime().addShutdownHook(new Thread() {
            public void run() {
                jarFile.delete();
            }
        });
        JarOutputStream out = new JarOutputStream(new FileOutputStream(jarFile), manifest);
        createTempJarInner(out, new File(root), "");
        out.flush();
        out.close();
        return jarFile;
    }

    // Recursively add files below f to the jar, keeping their relative paths.
    private static void createTempJarInner(JarOutputStream out, File f, String base)
            throws IOException {
        if (f.isDirectory()) {
            File[] fl = f.listFiles();
            if (base.length() > 0) {
                base = base + "/";
            }
            for (int i = 0; i < fl.length; i++) {
                createTempJarInner(out, fl[i], base + fl[i].getName());
            }
        } else {
            out.putNextEntry(new JarEntry(base));
            FileInputStream in = new FileInputStream(f);
            byte[] buffer = new byte[1024];
            int n = in.read(buffer);
            while (n != -1) {
                out.write(buffer, 0, n);
                n = in.read(buffer);
            }
            in.close();
        }
    }

    public static ClassLoader getClassLoader() {
        ClassLoader parent = Thread.currentThread().getContextClassLoader();
        if (parent == null) {
            parent = EJob.class.getClassLoader();
        }
        if (parent == null) {
            parent = ClassLoader.getSystemClassLoader();
        }
        return new URLClassLoader(classPath.toArray(new URL[0]), parent);
    }

    // Add a directory or jar to the class path used by getClassLoader().
    public static void addClasspath(String component) {
        if ((component != null) && (component.length() > 0)) {
            try {
                File f = new File(component);
                if (f.exists()) {
                    URL key = f.getCanonicalFile().toURL();
                    if (!classPath.contains(key)) {
                        classPath.add(key);
                    }
                }
            } catch (IOException e) {
                // ignore bad class path entries, as in the original
            }
        }
    }
}
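
EJob works around the fact that a job launched straight from the IDE has no jar to ship to the cluster: without one, the TaskTrackers cannot load TokenizerMapper and IntSumReducer and the tasks fail with ClassNotFoundException. createTempJar packages everything under the bin directory (Eclipse's default output folder) into a throwaway jar that is deleted on exit, and WordCount.main points the job at it through setJar.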

