Compiling and setting up the Eclipse plugin for Hadoop 2.6.0


1. Compiling the Eclipse plugin for Hadoop 2.6.0


Download Source:
git clone https://github.com/winghc/hadoop2x-eclipse-plugin.git


Compile the source code:
cd src/contrib/eclipse-plugin
ant jar -Dversion=2.6.0 -Declipse.home=/opt/eclipse -Dhadoop.home=/opt/hadoop-2.6.0
Set eclipse.home and hadoop.home to your own Eclipse and Hadoop installation paths.


Running the build from the command line produces 8 warning messages, which can be safely ignored:
compile:
     [echo] contrib: eclipse-plugin
    [javac] /software/hadoop2x-eclipse-plugin/src/contrib/eclipse-plugin/build.xml:76: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling source files to /software/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/classes
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/Path.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate': class file for org.apache.hadoop.classification.InterfaceAudience not found
    [javac] /opt/hadoop-2.6.0/share/hadoop/hdfs/hadoop-hdfs-2.6.0.jar(org/apache/hadoop/hdfs/DistributedFileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FileSystem.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FSDataInputStream.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] /opt/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar(org/apache/hadoop/fs/FSDataOutputStream.class): warning: Cannot find annotation method 'value()' in type 'LimitedPrivate'
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 8 warnings


Build location:

      [jar] Building jar: /software/hadoop2x-eclipse-plugin/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.6.0.jar


2. Installing the plugin


The desktop user who opens Eclipse should be the Hadoop administrator, i.e. the user configured when Hadoop was installed; otherwise read/write permission errors will occur.


Copy the compiled jar into the Eclipse plugins directory and restart Eclipse.
Configure the Hadoop installation directory:
Window -> Preferences -> Hadoop Map/Reduce -> Hadoop installation directory
Configure the Map/Reduce view:
Window -> Open Perspective -> Other -> Map/Reduce, then click "OK"
Window -> Show View -> Other -> Map/Reduce Locations, then click "OK"
The console area now has an extra "Map/Reduce Locations" tab.
On the "Map/Reduce Locations" tab, click the elephant icon, or right-click the blank area and select "New Hadoop location...", then fill in the "New Hadoop location..." dialog as follows:


Note: The MR Master and DFS Master settings must match the cluster's configuration files, such as mapred-site.xml and core-site.xml.
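For reference, here is a minimal core-site.xml fragment that the DFS Master host and port in the dialog would need to match; the hostname and port here are hypothetical examples, not values from the original setup:

```xml
<!-- core-site.xml: the DFS Master in the plugin dialog must match fs.defaultFS -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
```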
Open Project Explorer to browse the HDFS file system.


3. Creating a new Map/Reduce project


File -> New -> Project -> Map/Reduce Project -> Next
Write the WordCount class:


package mytest;

import java.io.IOException;
import java.util.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.util.*;

public class WordCount {
    // Mapper: emit (word, 1) for every whitespace-separated token in the line.
    public static class Map extends MapReduceBase implements
            Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(LongWritable key, Text value,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            String line = value.toString();
            StringTokenizer tokenizer = new StringTokenizer(line);
            while (tokenizer.hasMoreTokens()) {
                word.set(tokenizer.nextToken());
                output.collect(word, one);
            }
        }
    }

    // Reducer: sum the counts for each word.
    public static class Reduce extends MapReduceBase implements
            Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();
            }
            output.collect(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        conf.setMapperClass(Map.class);
        conf.setReducerClass(Reduce.class);

        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);

        // args[0] is the input path in HDFS, args[1] the output path.
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}
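To see what the job computes without a cluster, here is a plain-Java sketch of the same tokenize-and-count logic; the class name WordCountLocal and the sample input are hypothetical, introduced only for illustration:

```java
import java.util.*;

public class WordCountLocal {
    // Count whitespace-separated tokens, mirroring the map/reduce steps above.
    public static Map<String, Integer> count(String text) {
        Map<String, Integer> counts = new HashMap<>();
        StringTokenizer tokenizer = new StringTokenizer(text);
        while (tokenizer.hasMoreTokens()) {
            counts.merge(tokenizer.nextToken(), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count("hello hadoop hello eclipse"));
    }
}
```

On the cluster, the mapper emits one (word, 1) pair per token and the reducer performs the summation; this sketch just collapses both steps into a single in-memory map.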


Configure the run-time parameters: right-click the project -> Run As -> Run Configurations


"in" is an HDFS folder (created by you) containing the files to be processed; "out" stores the output results.
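For example, assuming a hypothetical HDFS namenode at hdfs://localhost:9000 and working directories under /user/hadoop, the two program arguments (args[0] and args[1] in main) could be:

```
hdfs://localhost:9000/user/hadoop/in hdfs://localhost:9000/user/hadoop/out
```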
Run the program on the Hadoop cluster: right-click -> Run As -> Run on Hadoop; the final output will appear in the corresponding folder in HDFS. At this point, the hadoop-2.6.0 Eclipse plugin configuration on Linux is complete.


The first problem encountered during configuration:
Eclipse cannot write to the HDFS file system, which directly prevents programs written in Eclipse from running on Hadoop.
Open conf/hdfs-site.xml, find the dfs.permissions property, and change it to false (the default is true):
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
HDFS needs to be restarted after this change.
The simplest alternative is to log on to the desktop and start Eclipse as the Hadoop administrator user itself.


