Compiling Hadoop 2.5.1 on 64-bit Linux

Source: Internet
Author: User
Tags: openssl, hadoop, mapreduce, hadoop ecosystem

Apache Hadoop Ecosystem installation package: http://archive.apache.org/dist/

Software Installation directory: ~/app

jdk: jdk-7u45-linux-x64.rpm
hadoop: hadoop-2.5.1-src.tar.gz
maven: apache-maven-3.0.5-bin.zip
protobuf: protobuf-2.5.0.tar.gz

1. Download Hadoop

wget http://archive.apache.org/dist/hadoop/common/hadoop-2.5.1/hadoop-2.5.1-src.tar.gz
tar -zxvf hadoop-2.5.1-src.tar.gz

There is a BUILDING.txt file in the extracted Hadoop source root directory, which lists the environment requirements for compiling Hadoop:

* JDK 1.6+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
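Before working through the steps below, the requirements above can be spot-checked in one go. This is a small sketch (it only checks that the tools are on the PATH, not their versions):

```shell
#!/bin/sh
# Report which of the build prerequisites are already installed.
# A MISSING line means the corresponding installation step below still applies.
for tool in java mvn protoc cmake gcc make; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: MISSING - install it before building"
  fi
done
```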

2. Install the JDK

sudo yum install jdk-7u45-linux-x64.rpm

To view the JDK installation location:

which java
/usr/java/jdk1.7.0_45/bin/java

Add JDK to environment variable (~/.bash_profile):

export JAVA_HOME=/usr/java/jdk1.7.0_45
export PATH=.:$JAVA_HOME/bin:$PATH

Verify:

java -version
java version "1.7.0_45"
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

3. Install Maven

wget http://apache.fayea.com/apache-mirror/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.zip
unzip apache-maven-3.0.5-bin.zip

Add MAVEN to environment variable (~/.bash_profile):

export MAVEN_HOME=/home/hadoop/app/apache-maven-3.0.5
export PATH=.:$MAVEN_HOME/bin:$PATH

Verify:

mvn -version
Apache Maven 3.0.5 (r01de14724cdef164cd33c7c8c2fe155faf9602da; ...)
Maven home: /home/hadoop/app/apache-maven-3.0.5
Java version: 1.7.0_45, vendor: Oracle Corporation
Java home: /usr/java/jdk1.7.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-358.el6.x86_64", arch: "amd64", family: "unix"

4. Install Protobuf

The Protobuf official site no longer seems to offer the 2.5.0 installation package for download. To compile and install Protobuf from source, you first need gcc, gcc-c++, and make:

sudo yum install gcc
sudo yum install gcc-c++
sudo yum install make

tar -zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure --prefix=/usr/local/protoc
make
sudo make install

Add PROTOBUF to environment variable (~/.bash_profile):

export PATH=.:/usr/local/protoc/bin:$PATH
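None of the exports added to ~/.bash_profile (JDK, Maven, protoc) take effect in an already-open shell until the profile is re-read; a quick way to apply and check them:

```shell
# Reload ~/.bash_profile in the current shell (skip silently if it is absent)
# so the JAVA_HOME / MAVEN_HOME / protoc PATH entries above become visible.
[ -f ~/.bash_profile ] && . ~/.bash_profile
echo "JAVA_HOME=${JAVA_HOME:-not set}"
echo "PATH=$PATH"
```

Alternatively, log out and back in, or open a fresh terminal.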

Verify:

protoc --version
libprotoc 2.5.0
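If verification instead fails with an error like "protoc: error while loading shared libraries: libprotoc.so", the dynamic linker cannot find the freshly installed library. A hedged workaround for the current shell, assuming the /usr/local/protoc prefix used above (adding the lib path to /etc/ld.so.conf.d/ and running ldconfig is the permanent alternative):

```shell
# Point the dynamic linker at the protobuf install prefix for this shell.
# /usr/local/protoc is an assumption: it must match the --prefix given to ./configure.
export LD_LIBRARY_PATH=/usr/local/protoc/lib:${LD_LIBRARY_PATH:-}
echo "$LD_LIBRARY_PATH"
```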

5. Install other dependencies

sudo yum install cmake
sudo yum install openssl-devel
sudo yum install ncurses-devel

6. Compile the Hadoop source code

cd ~/app/hadoop-2.5.1-src
mvn package -DskipTests -Pdist,native
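The two flags do the heavy lifting here: -Pdist,native builds the distribution plus the native JNI libraries (the whole point of recompiling on 64-bit), and -DskipTests avoids hours of unit tests. BUILDING.txt also documents an optional -Dtar flag that additionally packages a tarball; a sketch of that fuller invocation:

```shell
# Full build invocation with the optional -Dtar flag, which additionally
# produces hadoop-dist/target/hadoop-2.5.1.tar.gz for easy distribution.
BUILD_CMD="mvn package -Pdist,native -DskipTests -Dtar"
echo "$BUILD_CMD"
# Run it from the source root (the first build is slow while Maven
# downloads dependencies):
#   cd ~/app/hadoop-2.5.1-src && $BUILD_CMD
```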
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [1.980s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.575s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [3.324s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.318s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.550s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [4.548s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [3.410s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [4.503s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.915s]
[INFO] ... (remaining modules omitted; all SUCCESS) ...
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.768s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.035s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [8.571s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Final Memory: 91M/324M
[INFO] ------------------------------------------------------------------------

After compilation, the build output is under hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1; this hadoop-2.5.1 folder can be deployed directly to set up a Hadoop environment.
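A quick way to confirm the native library really came out 64-bit (the usual reason for recompiling in the first place) is to inspect it with file; the path below assumes the source tree location from step 1:

```shell
# Check the architecture of the freshly built native Hadoop library.
DIST=~/app/hadoop-2.5.1-src/hadoop-dist/target/hadoop-2.5.1
if [ -d "$DIST" ]; then
  file "$DIST/lib/native/libhadoop.so.1.0.0"   # expect: ELF 64-bit ... x86-64
else
  echo "dist directory not found: $DIST"
fi
```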
