Compile and install Hadoop 2.2.0 in Ubuntu

Source: Internet
Author: User
Tags: ssh access

I don't really know what MapReduce is yet. Today we helped some students set up Hadoop, version 2.2.0. When it ran, the warning "libhadoop.so.1.0.0 which might have disabled stack guard" appeared. A Google search showed that Hadoop 2.2.0 ships the libhadoop.so library as a 32-bit binary, while our machine is 64-bit; the fix is to recompile Hadoop on a 64-bit machine. The students have only just joined the ranks of Linux users, and even the Ubuntu on the machine was newly installed, so to compile Hadoop everything had to be configured from scratch.

Contents
Compiling environment
Java environment Configuration
Install dependency packages
Install and configure protobuf
Install and configure maven
Create new users and user groups
Compile hadoop 2.2.0
Install and configure hadoop 2.2.0
 

Compiling environment
OS: Ubuntu 12.04 64-bit

Hadoop version: 2.2.0

Java: JDK 1.7.0_45

Java environment Configuration
A freshly installed machine has nothing on it, so start with the JDK.

Download the JDK: http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html

Create a jvm folder under /usr/lib/ and extract the downloaded archive into the /usr/lib/jvm/ directory.
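The two steps above can be sketched as shell commands. To keep the sketch runnable without root or the real Oracle download, it stages a mock tarball in a scratch directory; with the real JDK archive you would use the `sudo` forms shown in the comments (the archive name jdk-7u45-linux-x64.tar.gz is an assumption, adjust it to the file you actually downloaded):

```shell
# Scratch setup so this sketch runs without root or the real JDK tarball:
cd "$(mktemp -d)"
PREFIX=lib                                       # stands in for /usr/lib
mkdir -p jdk1.7.0_45/bin && touch jdk1.7.0_45/bin/java
tar -czf jdk-7u45-linux-x64.tar.gz jdk1.7.0_45   # mock of the Oracle archive

# The actual steps from the text (with the real archive they would be:
#   sudo mkdir -p /usr/lib/jvm
#   sudo tar -xzf jdk-7u45-linux-x64.tar.gz -C /usr/lib/jvm):
mkdir -p "$PREFIX/jvm"
tar -xzf jdk-7u45-linux-x64.tar.gz -C "$PREFIX/jvm"
ls "$PREFIX/jvm"                                 # should list jdk1.7.0_45
```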

Edit ~/.bashrc and configure the environment variables:

export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_45
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH

Reload it to make it take effect:

$ source ~/.bashrc

Check whether the configuration succeeded:

$ java -version
java version "1.7.0_45"
Java(TM) SE Runtime Environment (build 1.7.0_45-b18)
Java HotSpot(TM) 64-Bit Server VM (build 24.45-b08, mixed mode)

Install dependency packages
These libraries and packages are all used during compilation. If any are missing, the build will fail, and tracking down the cause after an error occurs is very troublesome.

$ sudo apt-get install g++ autoconf automake libtool cmake zlib1g-dev pkg-config libssl-dev

Because ssh is needed, install an openssh client if the machine does not have one (Ubuntu 12.04 should come with it pre-installed):

$ sudo apt-get install openssh-client

$ sudo apt-get install openssh-server

protobuf is also used during compilation, and apparently the latest version, 2.5.0, is required, so if an earlier protobuf is installed, reinstall it.
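Before rebuilding protobuf, it may help to see what version, if any, is already on the machine; a small sketch of that check:

```shell
# Print the installed protobuf compiler version, or a note if it is absent.
# Hadoop 2.2.0's build expects protoc 2.5.0.
if command -v protoc >/dev/null 2>&1; then
  protoc --version          # e.g. "libprotoc 2.5.0"
else
  echo "protoc not installed"
fi
```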

Install and configure protobuf
Download the latest protobuf: https://code.google.com/p/protobuf/downloads/list

Decompress and run

$ ./configure --prefix=/usr
$ sudo make
$ sudo make check
$ sudo make install

Check the version:

$ protoc --version
libprotoc 2.5.0

Install and configure maven
On Ubuntu, install it with apt-get:

$ sudo apt-get install maven

Create new users and user groups
We created a new group called "hadoop" and a new user named "hduser" belonging to that group (following the naming used by most tutorials on the Internet).

$ sudo addgroup hadoop
$ sudo adduser --ingroup hadoop hduser

After the new user is created, the remaining operations are performed as that user:

$ su hduser

Build ssh trust
When Hadoop starts, it needs ssh access to localhost, so establish a trust relationship to save typing the password each time.

$ cd /home/hduser
$ ssh-keygen -t rsa -P ""
$ cat .ssh/id_rsa.pub >> .ssh/authorized_keys

Run the following to verify that a password-free connection to localhost works:

$ ssh localhost

The -P "" above means the rsa key has no passphrase. I think it is better to give it one and manage it with ssh-agent, so you won't have to type it every time:

$ ssh-add ~/.ssh/id_rsa  # I originally passed the public key here; the private key should be passed in. Thanks to twlkyao for the reminder.

Compile hadoop 2.2.0
Download hadoop 2.2.0: http://www.apache.org/dyn/closer.cgi/hadoop/common/

Extract it to the user directory /home/hduser/ and enter the hadoop-2.2.0-src directory.

Because maven, protobuf, and the Java environment are all installed, the compilation can be run directly:

$ mvn package -Pdist,native -DskipTests -Dtar

The build should complete normally without errors. For these parameters and other compilation options, see the BUILDING.txt file in the hadoop source directory.
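Since the whole exercise started from a 32-bit libhadoop.so, it is worth confirming the rebuilt library's word size with `file`. The library path below is assumed from the standard build layout; the executed line demonstrates the command on the system shell so the snippet can be tried anywhere:

```shell
# After a successful build, run (path assumed from the build layout):
#   file hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
# and on a 64-bit machine the output should mention "ELF 64-bit".
# The same command, demonstrated here on the system shell binary:
file -L "$(command -v sh)"
```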

Install and configure hadoop 2.2.0
The compiled files are in the hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0/ directory.

I will not write about how to configure it because there is a lot of content on the Internet.
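For completeness, here is a minimal sketch of what the first configuration step of a pseudo-distributed setup might look like; the port and the use of localhost are illustrative assumptions, not from this article. In hadoop-2.2.0/etc/hadoop/core-site.xml:

```xml
<!-- Minimal core-site.xml sketch for a pseudo-distributed setup.
     The hostname and port are illustrative assumptions. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```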

