Hadoop cmake maven protobuf
Problem description
When Hadoop is installed on 64-bit Linux, you may hit warnings that libhadoop.so.1.0.0 "might have disabled stack guard". The cause is that the bundled native library is 32-bit, so Hadoop has to be recompiled manually.
The Hadoop version is 2.2.0, and the operating system is Oracle Linux 6.3, 64-bit.
Example and solution process
The problem encountered:
[hadoop@hadoop01 input]$ hadoop dfs -put ./ in
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/10/24 04:08:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `in': No such file or directory
Check the local library file:
[hadoop@hadoop01 input]$ file /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 32-bit LSB shared object, Intel 80386, version 1 (SYSV), dynamically linked, not stripped
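For contrast, a correctly compiled 64-bit library would be reported by file roughly like this (the key parts being "64-bit" and "x86-64"):

/app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped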
So the cause is a 32-bit/64-bit mismatch: the native library shipped with Hadoop is 32-bit, while the operating system is 64-bit.
The same problem is discussed in these threads:
http://mail-archives.apache.org/mod_mbox/hadoop-user/201208.mbox/%3C19AD42E3F64F0F468A305399D0DF39D92EA4521578@winops07.win.compete.com%3E
http://www.mail-archive.com/common-issues@hadoop.apache.org/msg52576.html
The operating system is 64-bit but the shipped software is 32-bit. A tragedy... the newly installed cluster cannot be used as-is.
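Note that the execstack fix suggested in the warning only clears the executable-stack flag and silences the stack-guard message; it does nothing about the architecture mismatch. For reference, on this setup it would be:

# execstack -c /app/hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0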
Solution: recompile Hadoop
The fix is to recompile the Hadoop software from source:
Download the source code
If the build machine cannot access the Internet, you can download the source on another machine that can. Note, however, that the build itself also downloads dependencies, so it is best to do the whole compilation on an Internet-connected machine of the same platform (or a virtual machine) and copy the result back.
# svn checkout 'http://svn.apache.org/repos/asf/hadoop/common/tags/release-2.2.0'
Everything is downloaded into this tree:
[hadoop@hadoop01 hadoop]$ ls
BUILDING.txt hadoop-common-project hadoop-maven-plugins hadoop-tools
dev-support hadoop-dist hadoop-minicluster hadoop-yarn-project
hadoop-assemblies hadoop-hdfs-project hadoop-project pom.xml
hadoop-client hadoop-mapreduce-project hadoop-project-dist
Install the development environment
1. Necessary packages
[root@hadoop01 /]# yum install svn
[root@hadoop01 ~]# yum install autoconf automake libtool cmake
[root@hadoop01 ~]# yum install ncurses-devel
[root@hadoop01 ~]# yum install openssl-devel
[root@hadoop01 ~]# yum install gcc*
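A quick sanity check that the compilers and tools landed (exact version numbers will vary by system):

[root@hadoop01 ~]# gcc --version
[root@hadoop01 ~]# cmake --version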
2. Install maven
Download and decompress
http://maven.apache.org/download.cgi
[root@hadoop01 stable]# mv apache-maven-3.1.1 /usr/local/
Add /usr/local/apache-maven-3.1.1/bin to the PATH environment variable, for example as shown below.
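One way to do this, assuming a bash login shell (putting it in /etc/profile is a choice, not a requirement):

[root@hadoop01 ~]# echo 'export PATH=/usr/local/apache-maven-3.1.1/bin:$PATH' >> /etc/profile
[root@hadoop01 ~]# source /etc/profile
[root@hadoop01 ~]# mvn -version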
3. Install protobuf
If protobuf is not installed, the subsequent compilation fails partway through. The output looks like this:
[INFO] --- hadoop-maven-plugins:2.2.0:protoc (compile-protoc) @ hadoop-common ---
[WARNING] [protoc, --version] failed: java.io.IOException: Cannot run program "protoc": error=2, No such file or directory
[ERROR] stdout: []
........................
[INFO] Apache Hadoop Main ................................ SUCCESS [5.672 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [3.682 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [8.921 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [4.590 s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [9.172 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [10.123 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.170 s]
[INFO] Apache Hadoop Common .............................. FAILURE [1.224 s]
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
Protobuf installation process
Download: https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
https://code.google.com/p/protobuf/downloads/list
[root@hadoop01 protobuf-2.5.0]# pwd
/soft/protobuf-2.5.0
Execute the following commands in sequence:
./configure
make
make check
make install
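If protoc later fails with an error about libprotobuf.so not being found (a common issue when make install places the libraries under /usr/local/lib), refresh the linker cache; the conf file name below is arbitrary:

[root@hadoop01 protobuf-2.5.0]# echo /usr/local/lib > /etc/ld.so.conf.d/local.conf
[root@hadoop01 protobuf-2.5.0]# ldconfig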
[root@hadoop01 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
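With maven and protobuf in place, the rebuild itself is run from the root of the source tree. Going by BUILDING.txt in the checkout, a 64-bit native build is typically invoked like this (flags may vary with your needs):

[hadoop@hadoop01 hadoop]$ mvn package -Pdist,native -DskipTests -Dtar

When it finishes, the rebuilt 64-bit native libraries appear under hadoop-dist/target/, and the lib/native directory there can replace the 32-bit one in the installed cluster.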