Hadoop 1.0.4 Eclipse plug-in compilation


I. Required tools

eclipse-java-juno-SR1-win32.zip
jdk-6u37-linux-x64.bin
apache-ant-1.8.4-bin.zip

II. Ant installation

Unzip Ant to a suitable directory, e.g. F:\hadoop\ant. Add F:\hadoop\ant\bin to the PATH environment variable, then run echo %PATH% in a new console to check that the variable has taken effect (no restart is needed).

III. Compile hadoop-eclipse-plugin-1.0.4.jar

1. Unzip the Hadoop distribution to F:\hadoop\hadoop-1.0.4.

2. Copy build-contrib.xml from F:\hadoop\hadoop-1.0.4\src\contrib into the F:\hadoop\hadoop-1.0.4\src\contrib\eclipse-plugin directory.

3. In the copied build-contrib.xml, point hadoop.root at the Hadoop decompression directory:

<property name="hadoop.root" location="F:/hadoop/hadoop-1.0.4"/>

Below it, add the following two lines for the Eclipse installation root directory and the Hadoop version. Change the locations to match your machine, and note the direction of the slashes in location:

<property name="eclipse.home" location="E:/software/deploymenttool/eclipse"/>
<property name="version" value="1.0.4"/>

4. In build.xml, find <import file="../build-contrib.xml"/> and change it to <import file="build-contrib.xml"/>.

5. Create a lib directory under F:\hadoop\hadoop-1.0.4\src\contrib\eclipse-plugin and copy the following jar packages into it: commons-configuration-1.6.jar, commons-httpclient-3.0.1.jar, commons-lang-2.4.jar, jackson-core-asl-1.8.8.jar, jackson-mapper-asl-1.8.8.jar.

6. Create a build folder under F:\hadoop\hadoop-1.0.4 and copy hadoop-core-1.0.4.jar into it. Note that recent Hadoop releases have a long-standing plug-in problem on Windows: the 0700 permission-check bug, which produces errors such as:
ERROR security.UserGroupInformation: PriviledgedActionException as:admin cause:java.io.IOException: Failed to set permissions of path: \home\hadoop\hadoop-1.0.4\data\data\mapred\staging\admin1107758487\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \home\hadoop\hadoop-1.0.4\data\data\mapred\staging\admin1107758487\.staging to 0700
    at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:682)
    at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:655)

To solve this 0700 problem, the FileUtil class file inside hadoop-core-1.0.4.jar has to be replaced and the jar repackaged. A hadoop-core-1.0.4.jar patched in this way is provided below. First rename the original hadoop-core-1.0.4.jar to hadoop-core-1.0.4.jar_back, put the patched hadoop-core-1.0.4.jar under the hadoop directory, and then copy it into the build directory as well.

Downloads:
Eclipse plug-in with the 0700 fix: http://download.csdn.net/detail/weijonathan/4919830
Patched hadoop-core-1.0.4.jar: http://download.csdn.net/detail/weijonathan/4919824

7. Create the F:\hadoop\hadoop-1.0.4\build\ivy\lib\hadoop\common directory under the F:\hadoop\hadoop-1.0.4\build directory and copy commons-cli-1.2.jar into it.

8. Modify the build.xml file in F:\hadoop\hadoop-1.0.4\src\contrib\eclipse-plugin:

1) Inside the <target name="jar" depends="compile" unless="skip.contrib"> tag, after the existing line

<copy file="${hadoop.root}/build/ivy/lib/hadoop/common/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>

add the following lines:

<copy file="${root}/lib/commons-configuration-1.6.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${root}/lib/commons-httpclient-3.0.1.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${root}/lib/commons-lang-2.4.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${root}/lib/jackson-core-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
<copy file="${root}/lib/jackson-mapper-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
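For orientation, the jar target should look roughly like the sketch below after the edit. Only the five added <copy> lines come from this guide; the surrounding <mkdir>, hadoop-core copy, and <jar> lines are assumptions about what the stock 1.0.4 eclipse-plugin build.xml contains, so match them against your own file rather than pasting this verbatim.

```xml
<!-- Sketch of the jar target after the edit; lines other than the five
     added copies are assumed from the stock build file, not quoted from it. -->
<target name="jar" depends="compile" unless="skip.contrib">
  <mkdir dir="${build.dir}/lib"/>
  <copy file="${hadoop.root}/build/hadoop-core-${version}.jar"
        tofile="${build.dir}/lib/hadoop-core.jar" verbose="true"/>
  <copy file="${hadoop.root}/build/ivy/lib/hadoop/common/commons-cli-${commons-cli.version}.jar"
        todir="${build.dir}/lib" verbose="true"/>
  <!-- the five lines added in step 8.1) -->
  <copy file="${root}/lib/commons-configuration-1.6.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${root}/lib/commons-httpclient-3.0.1.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${root}/lib/commons-lang-2.4.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${root}/lib/jackson-core-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
  <copy file="${root}/lib/jackson-mapper-asl-1.8.8.jar" todir="${build.dir}/lib" verbose="true"/>
  <jar jarfile="${build.dir}/hadoop-${name}-${version}.jar"
       manifest="${root}/META-INF/MANIFEST.MF">
    <fileset dir="${build.dir}" includes="classes/ lib/"/>
    <fileset dir="${root}" includes="resources/ plugin.xml"/>
  </jar>
</target>
```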

2) In the <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib"> target, add the includeantruntime attribute to the <javac> tag. This prevents the "includeantruntime was not set" warning during compilation:

<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
  <echo message="contrib: ${name}"/>
  <javac
    encoding="${build.encoding}"
    srcdir="${src.dir}"
    includes="**/*.java"
    destdir="${build.classes}"
    debug="${javac.debug}"
    includeantruntime="on">
    <classpath refid="classpath"/>
  </javac>
</target>

3) Add the following to build.xml to prevent the compilation error "package org.apache.hadoop.fs does not exist":

<path id="hadoop-jars">
  <fileset dir="${hadoop.root}/">
    <include name="hadoop-*.jar"/>
  </fileset>
</path>

Then, inside <path id="classpath">, add: <path refid="hadoop-jars"/>
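Putting step 3) together, the two fragments relate as sketched below. The <pathelement> entry inside the classpath is an assumption about what the stock classpath already contains; only the <path refid="hadoop-jars"/> line is the actual addition:

```xml
<!-- hadoop-jars collects the hadoop-*.jar files from the unpacked distribution -->
<path id="hadoop-jars">
  <fileset dir="${hadoop.root}/">
    <include name="hadoop-*.jar"/>
  </fileset>
</path>

<!-- the existing classpath, with the new reference added
     (the other entries shown here are assumed, keep your own) -->
<path id="classpath">
  <pathelement location="${build.classes}"/>
  <path refid="hadoop-jars"/>
</path>
```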

4) Modify the F:\hadoop\hadoop-1.0.4\src\contrib\eclipse-plugin\META-INF\MANIFEST.MF file, setting Bundle-ClassPath to:

Bundle-ClassPath: classes/, lib/hadoop-core.jar, lib/commons-cli-1.2.jar, lib/commons-configuration-1.6.jar, lib/commons-httpclient-3.0.1.jar, lib/commons-lang-2.4.jar, lib/jackson-core-asl-1.8.8.jar, lib/jackson-mapper-asl-1.8.8.jar

(The jar file names must match the files copied into the lib directory in step 5.)

5) Enter the F:\hadoop\hadoop-1.0.4\src\contrib\eclipse-plugin directory and run the ant command. The compiled plug-in is placed in the F:\hadoop\hadoop-1.0.4\build\contrib\eclipse-plugin directory.

IV. Install the Eclipse Hadoop plug-in

Copy the compiled plug-in jar into the dropins directory of your Eclipse installation and restart Eclipse.

V. Configure the Eclipse plug-in

1. Open Eclipse and show the Map/Reduce Locations view.

2. Right-click in the view, choose New Hadoop location..., enter a Location name (any name will do), and set the Map/Reduce Master and DFS Master host and port to match your cluster.

User name: the user name under which Hadoop is started.
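The host and port entered for DFS Master and Map/Reduce Master should match what the cluster's own configuration files declare. A minimal sketch, assuming a single master node named master and the common Hadoop 1.x ports (both the host name and the port numbers here are example assumptions):

```xml
<!-- core-site.xml: DFS Master must match fs.default.name -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://master:9000</value>
</property>

<!-- mapred-site.xml: Map/Reduce Master must match mapred.job.tracker -->
<property>
  <name>mapred.job.tracker</name>
  <value>master:9001</value>
</property>
```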

3. Switch to the Advanced parameters tab.

Change hadoop.tmp.dir to the directory configured for your Hadoop cluster.

Change dfs.permissions.supergroup to hadoop.

Change dfs.replication to the value configured in your hdfs-site.xml file.
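These Advanced parameters mirror properties in the cluster's configuration files. A sketch of the corresponding entries follows; the values shown are placeholders, not recommendations, so copy the real values from your own cluster:

```xml
<!-- core-site.xml on the cluster: the plug-in's hadoop.tmp.dir must match this -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hadoop/tmp</value>
</property>

<!-- hdfs-site.xml on the cluster: copy this value into dfs.replication -->
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
```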

4. Restart Eclipse. You will find that the remaining parameters are loaded automatically from your cluster.
