Compile Hadoop 2.6.0 on 64-bit CentOS
Hadoop does not provide pre-built 64-bit binaries; a 64-bit version can only be obtained by compiling the source code. Learning a technology starts with installation, so we begin by compiling Hadoop.
1. Operating system build environment
yum install cmake lzo-devel zlib-devel gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst
2. Install JDK
Download JDK 1.7. Note that only 1.7 works; other versions cause errors during compilation.
http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html
tar zxvf jdk-7u75-linux-x64.tar.gz -C /app
export JAVA_HOME=/app/jdk1.7.0_75
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
PATH=$PATH:$JAVA_HOME/bin
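Before moving on, it can help to confirm that the JDK on PATH really is 1.7, since the article notes that other versions break the build. The `check_jdk17` helper below is an illustrative sketch, not part of the original steps; in practice you would feed it the real banner from `java -version`.

```shell
# Hypothetical helper: verify that a "java -version" banner reports 1.7,
# since other JDK versions make the Hadoop 2.6.0 build fail.
check_jdk17() {
  case "$1" in
    *'"1.7.'*) echo ok ;;                 # banner contains "1.7.
    *)         echo "need JDK 1.7" ;;
  esac
}

# In practice: check_jdk17 "$(java -version 2>&1 | head -n 1)"
check_jdk17 'java version "1.7.0_75"'
```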
3. Install protobuf
Download protobuf-2.5.0. A newer version cannot be used; otherwise the Hadoop build fails.
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar xvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
ldconfig
protoc --version
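Because the build only passes with protobuf 2.5.0, it is worth failing fast if `protoc --version` reports anything else. The `require_protoc_250` helper below is a sketch for illustration, not part of the original steps.

```shell
# Hypothetical helper: accept only the exact string that
# "protoc --version" prints for protobuf 2.5.0.
require_protoc_250() {
  if [ "$1" = "libprotoc 2.5.0" ]; then
    echo ok
  else
    echo "wrong protoc: $1"
  fi
}

# In practice: require_protoc_250 "$(protoc --version)"
require_protoc_250 'libprotoc 2.5.0'
```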
4. Install ANT
wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar zxvf apache-ant-1.9.4-bin.tar.gz -C /app
vi /etc/profile
export ANT_HOME=/app/apache-ant-1.9.4
PATH=$PATH:$ANT_HOME/bin
5. Install Maven
wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.3.1/binaries/apache-maven-3.3.1-bin.tar.gz
tar zxvf apache-maven-3.3.1-bin.tar.gz -C /app
vi /etc/profile
export MAVEN_HOME=/app/apache-maven-3.3.1
export PATH=$PATH:$MAVEN_HOME/bin
Modify the configuration file:
vi /app/apache-maven-3.3.1/conf/settings.xml
Change the Maven repository mirror by adding the following inside <mirrors></mirrors>:
<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>nexusosc</name>
  <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Add a new profile inside <profiles></profiles>:
<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>
6. Install FindBugs (optional)
wget http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download
tar zxvf findbugs-3.0.1.tar.gz -C /app
vi /etc/profile
export FINDBUGS_HOME=/app/findbugs-3.0.1
PATH=$PATH:$FINDBUGS_HOME/bin
export PATH
Note:
The final PATH setting in /etc/profile should look like this:
PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin
export PATH
Run the following in the shell to make the environment variables take effect:
source /etc/profile
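Before starting the long Maven build, a quick pre-flight check that each tool installed in the previous steps actually resolves on PATH can save a failed run. This loop is a sketch; the tool list simply mirrors the installs above.

```shell
# Report, for each build prerequisite, whether it resolves on PATH.
for tool in java mvn ant protoc cmake gcc; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```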
7. Compile Hadoop 2.6.0
wget http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
tar zxvf hadoop-2.6.0-src.tar.gz
cd hadoop-2.6.0-src
mvn package -DskipTests -Pdist,native -Dtar
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [4.401 s]
[INFO] Apache Hadoop Project POM ...... SUCCESS [3.864 s]
[INFO] Apache Hadoop Annotations ...... SUCCESS [7.591 s]
[INFO] Apache Hadoop Assemblies...
[INFO] Apache Hadoop Project Dist POM ...... SUCCESS [3.585 s]
[INFO] Apache Hadoop Maven Plugins ...... SUCCESS [6.623 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [4.722 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [7.787 s]
[INFO] Apache Hadoop Auth Examples ...... SUCCESS [5.500 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [0:47 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [12.793 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [20.443 s]
[INFO] Apache Hadoop Common Project ...... SUCCESS [0.111 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [29.896 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ...... SUCCESS [11.100 s]
[INFO] Apache Hadoop HDFS-NFS...
[INFO] Apache Hadoop HDFS Project ...... SUCCESS [0.069 s]
[INFO] hadoop-yarn ........................................ SUCCESS [0.066 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [02:05 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [46.132 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [0.123 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [19.166 s]
[INFO] hadoop-yarn-server-nodemanager ...... SUCCESS [25.552 s]
[INFO] hadoop-yarn-server-web-proxy ...... SUCCESS [5.456 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [11.781 s]
[INFO] hadoop-yarn-server-resourcemanager ...... SUCCESS [30.557 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [9.765 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [14.016 s]
[INFO] hadoop-yarn-applications ...... SUCCESS [0.101 s]
[INFO] hadoop-yarn-applications-distributedshell ...... SUCCESS [4.116 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher... SUCCESS [2.993 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [0.093 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [9.036 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [6.557 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [0.267 s]
[INFO] hadoop-mapreduce-client-core ...... SUCCESS [36.775 s]
[INFO] hadoop-mapreduce-client-common ...... SUCCESS [28.049 s]
[INFO] hadoop-mapreduce-client-shuffle ...... SUCCESS [7.285 s]
[INFO] hadoop-mapreduce-client-app ...... SUCCESS [17.333 s]
[INFO] hadoop-mapreduce-client-hs ...... SUCCESS [15.283 s]
[INFO] hadoop-mapreduce-client-jobclient ...... SUCCESS [7.110 s]
[INFO] hadoop-mapreduce-client-hs-plugins ...... SUCCESS [3.843 s]
[INFO] Apache Hadoop MapReduce Examples ...... SUCCESS [12.559 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [6.331 s]
[INFO] Apache Hadoop MapReduce Streaming ...... SUCCESS [45.863 s]
[INFO] Apache Hadoop Distributed Copy ...... SUCCESS [46.304 s]
[INFO] Apache Hadoop Archives...
[INFO] Apache Hadoop Rumen ................................ SUCCESS [12.991 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [10.105 s]
[INFO] Apache Hadoop Data Join...
[INFO] Apache Hadoop Ant Tasks...
[INFO] Apache Hadoop Extras ............................... SUCCESS [5.298 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [10.290 s]
[INFO] Apache Hadoop OpenStack support ...... SUCCESS [9.220 s]
[INFO] Apache Hadoop Amazon Web Services support ...... SUCCESS [11:12 min]
[INFO] Apache Hadoop Client ............................... SUCCESS [10.714 s]
[INFO] Apache Hadoop Mini-Cluster...
[INFO] Apache Hadoop Scheduler Load Simulator ...... SUCCESS [7.664 s]
[INFO] Apache Hadoop Tools Dist...
[INFO] Apache Hadoop Tools ................................ SUCCESS [0.057 s]
[INFO] Apache Hadoop Distribution ...... SUCCESS [49.425 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32:26 min
[INFO] Finished at: 2015-03-19T19:56:40+08:00
[INFO] Final Memory: 99M/298M
[INFO] ------------------------------------------------------------------------
After a successful build, the packaged distribution is placed under hadoop-dist/target:
# ls
antrun                    dist-tar-stitching.sh  hadoop-2.6.0.tar.gz    hadoop-dist-2.6.0-javadoc.jar  maven-archiver
dist-layout-stitching.sh  hadoop-2.6.0           hadoop-dist-2.6.0.jar  javadoc-bundle-options         test-dir
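Since the whole point of compiling on 64-bit CentOS is to get 64-bit native libraries, it is worth checking the result with file(1). The `is_64bit` helper below merely classifies a file(1) description string and is illustrative; the library path in the comment assumes the output layout shown above.

```shell
# Hypothetical helper: classify the description printed by file(1)
# as 64-bit ELF or not.
is_64bit() {
  case "$1" in
    *'ELF 64-bit'*) echo yes ;;
    *)              echo no ;;
  esac
}

# In practice, against the freshly built native library:
#   is_64bit "$(file hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0)"
is_64bit 'libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64'
```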