Compiling the hadoop-eclipse-plugin for Hadoop 1.2.1


Why is compiling the Eclipse plug-in for Hadoop 1.x so cumbersome?

In my understanding, Ant was originally designed as a local build tool, and the web of dependencies involved in compiling the Hadoop plug-in goes beyond that goal. As a result, we have to adjust the configuration by hand before building with Ant: setting environment variables, setting the classpath, adding dependencies, writing and verifying the javac and jar configuration, and finally deploying the result.

Let's start.

The main steps are as follows:
  • Set Environment Variables
  • Set ant initial parameters
  • Adjust java compilation Parameters
  • Set java classpath
  • Add dependency
  • Modify META-INF files
  • Compile, package, deploy, and verify
Set the language environment
Before the specific operations, set the locale:

$ export LC_ALL=en

Set ant initial parameters
Modify build-contrib.xml files

$ cd hadoop-1.2.1/src/contrib
$ vi build-contrib.xml

Edit the hadoop.root property and set its value to the root directory of the extracted Hadoop distribution.

<property name="hadoop.root" location="/Users/kangfoo-mac/study/hadoop-1.2.1"/>

Add the Eclipse dependency, pointing eclipse.home at the Eclipse installation directory.

<property name="eclipse.home" location="/Users/kangfoo-mac/work/soft/eclipse-standard-kepler-SR1-macosx-cocoa"/>

Set the version number.

<property name="version" value="1.2.1"/>

Adjust java compilation settings
Enable javac.deprecation

$ cd hadoop-1.2.1/src/contrib
$ vi build-contrib.xml

Find

<property name="javac.deprecation" value="off"/>

and change it to

<property name="javac.deprecation" value="on"/>

For Ant 1.8 and later, the javac task also needs the includeantruntime="on" attribute:

<!-- ====================================================================== -->
<!-- Compile a Hadoop contrib's files                                       -->
<!-- ====================================================================== -->
<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
  <echo message="contrib: ${name}"/>
  <javac
    encoding="${build.encoding}"
    srcdir="${src.dir}"
    includes="**/*.java"
    destdir="${build.classes}"
    debug="${javac.debug}"
    deprecation="${javac.deprecation}"
    includeantruntime="on">
    <classpath refid="contrib-classpath"/>
  </javac>
</target>

Modify the classpath for compiling the hadoop plug-in

$ cd hadoop-1.2.1/src/contrib/eclipse-plugin
$ vi build.xml

Add a path with id hadoop-jars:

<path id="hadoop-jars">
  <fileset dir="${hadoop.root}/">
    <include name="hadoop-*.jar"/>
  </fileset>
</path>

Add hadoop-jars to classpath

<path id="classpath">
  <pathelement location="${build.classes}"/>
  <pathelement location="${hadoop.root}/build/classes"/>
  <path refid="eclipse-sdk-jars"/>
  <path refid="hadoop-jars"/>
</path>

Modify or add additional jar dependencies
Because hadoop itself has not been compiled from source here, the jars under ${HADOOP_HOME}/lib can be used directly. Note that each dependency's version suffix is stripped from the file name when it is copied. These changes also go in the hadoop-1.2.1/src/contrib/eclipse-plugin/build.xml file.

$ cd hadoop-1.2.1/src/contrib/eclipse-plugin
$ vi build.xml

Find the comment <!-- Override jar target to specify manifest -->. Inside the overridden jar target, modify or add the copy elements as follows:

<copy file="${hadoop.root}/hadoop-core-${version}.jar" tofile="${build.dir}/lib/hadoop-core.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-cli-${commons-cli.version}.jar" tofile="${build.dir}/lib/commons-cli.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-configuration-1.6.jar" tofile="${build.dir}/lib/commons-configuration.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-httpclient-3.0.1.jar" tofile="${build.dir}/lib/commons-httpclient.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/commons-lang-2.4.jar" tofile="${build.dir}/lib/commons-lang.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/jackson-core-asl-1.8.8.jar" tofile="${build.dir}/lib/jackson-core-asl.jar" verbose="true"/>
<copy file="${hadoop.root}/lib/jackson-mapper-asl-1.8.8.jar" tofile="${build.dir}/lib/jackson-mapper-asl.jar" verbose="true"/>
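For orientation, these copy elements live inside the overridden jar target. In the stock build.xml that target looks roughly like the sketch below; attribute details may differ slightly in your tree, so treat this as an assumption and check against your own file:

```xml
<!-- Sketch of the overridden jar target: the copy elements from above
     are placed before the <jar> element so the jars land in lib/ first. -->
<target name="jar" depends="compile" unless="skip.contrib">
  <mkdir dir="${build.dir}/lib"/>
  <!-- ... the <copy> elements shown above go here ... -->
  <jar jarfile="${build.dir}/hadoop-${name}-${version}.jar"
       manifest="${root}/META-INF/MANIFEST.MF">
    <fileset dir="${build.dir}" includes="classes/ lib/"/>
    <fileset dir="${root}" includes="resources/ plugin.xml"/>
  </jar>
</target>
```

Copying the dependencies into ${build.dir}/lib before the <jar> step is what lets the manifest's Bundle-ClassPath (edited below) refer to them by their unversioned names.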

Modify the MANIFEST.MF file

$ cd hadoop-1.2.1/src/contrib/eclipse-plugin/META-INF
$ vi MANIFEST.MF

Find the Bundle-ClassPath line and change it to:

Bundle-ClassPath: classes/, lib/commons-cli.jar, lib/commons-httpclient.jar, lib/hadoop-core.jar, lib/jackson-mapper-asl.jar, lib/commons-configuration.jar, lib/commons-lang.jar, lib/jackson-core-asl.jar

Make sure this value either stays on a single line or follows the line-continuation syntax of the OSGi bundle manifest format; keeping it on one line is the simplest way to avoid mistakes.
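If you do wrap the entry, note that a JAR/OSGi manifest continues a logical line by starting each following physical line with a single space (the leading space is stripped when the manifest is parsed), for example:

```text
Bundle-ClassPath: classes/,
 lib/commons-cli.jar,
 lib/commons-httpclient.jar,
 lib/hadoop-core.jar,
 lib/jackson-mapper-asl.jar,
 lib/commons-configuration.jar,
 lib/commons-lang.jar,
 lib/jackson-core-asl.jar
```

Each physical line must also stay within the 72-byte limit that the manifest format imposes.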

Create a target that packages the jar and deploys it directly to the eclipse/plugins directory.

$ cd hadoop-1.2.1/src/contrib/eclipse-plugin
$ vi build.xml

Add a deploy target that copies the compiled plug-in into the Eclipse plugins directory:

<target name="deploy" depends="jar" unless="skip.contrib">
  <copy file="${build.dir}/hadoop-${name}-${version}.jar" todir="${eclipse.home}/plugins" verbose="true"/>
</target>

Change the project's default target to deploy so that running ant builds and deploys in one step:

<project default="deploy" name="eclipse-plugin">

Compile, then start Eclipse to verify the plug-in

$ ant -f ./hadoop-1.2.1/src/contrib/eclipse-plugin/build.xml

Start Eclipse, create a Map/Reduce project, and configure a Hadoop location. To verify the plug-in against a fully distributed cluster, the location settings must match the cluster configuration, including the port configured in core-site.xml.

Related source files
hadoop-1.2.1/src/contrib/build-contrib.xml
hadoop-1.2.1/src/contrib/eclipse-plugin/build.xml
hadoop-1.2.1/src/contrib/eclipse-plugin/META-INF/MANIFEST.MF
hadoop-eclipse-plugin-1.2.1.jar

