Run Hadoop 2.2.0 on 64-bit Ubuntu [recompile Hadoop]

We have recently been learning how to set up Hadoop, so we download the latest release, Hadoop 2.2, from the official Apache website. At the moment the official binaries are built only for 32-bit Linux, and running them on our machine produces the warning "libhadoop.so.1.0.0 which might have disabled stack guard". A quick Google search shows that the libhadoop.so library shipped with Hadoop 2.2.0 is 32-bit while our machine is 64-bit. The solution is to recompile Hadoop on a 64-bit machine.
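You can confirm this quickly with the file utility; a minimal check, assuming the official binary release was extracted under /home/hduser/hadoop-2.2.0 (adjust the path to wherever you unpacked it):

$ file /home/hduser/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0   # a 32-bit build reports "ELF 32-bit LSB shared object"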

Compiling environment
OS: Ubuntu 12.04 64-bit

Hadoop version: 2.2.0

Java: JDK 1.7.0_45

Java environment Configuration
Refer to this article: Install JDK 1.7.0_45 in Ubuntu
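After installing the JDK, make sure JAVA_HOME is set for the user doing the build; a minimal sketch, assuming the JDK was unpacked to /usr/lib/jvm/jdk1.7.0_45 (adjust the path to your actual install location):

$ export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_45
$ export PATH=$JAVA_HOME/bin:$PATH
$ java -version   # should report java version "1.7.0_45"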

Install dependency packages
These libraries and packages are all used at some point during compilation. If any of them are missing, the build will fail, and tracking down the cause after an error occurs is very troublesome, so install them up front.

$ sudo apt-get install g++ autoconf automake libtool make cmake zlib1g-dev pkg-config libssl-dev

ssh is also needed, so if it is not already on the machine, install the OpenSSH client (Ubuntu 12.04 should have it pre-installed):

$ sudo apt-get install openssh-client

$ sudo apt-get install openssh-server

protobuf is also used during compilation, and the latest version, 2.5.0, appears to be required, so if an earlier version is installed, reinstall it.
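To see whether an older distribution package is already installed and remove it first, something like the following should work (protobuf-compiler and libprotobuf-dev are the usual Ubuntu package names; skip this if protoc is not installed at all):

$ protoc --version                                        # if this prints a version older than 2.5.0...
$ sudo apt-get remove protobuf-compiler libprotobuf-dev   # ...remove the old packages before building 2.5.0 from source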

Install and configure protobuf
Download the latest protobuf-2.5.0.tar.gz: https://code.google.com/p/protobuf/downloads/list
Decompress and run

$ ./configure --prefix=/usr
$ sudo make
$ sudo make check
$ sudo make install
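If protoc complains about a missing shared library after installation, refreshing the dynamic linker cache usually fixes it (a precautionary step, not always required):

$ sudo ldconfig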

Check the version:

$ protoc --version
libprotoc 2.5.0

Install and configure maven
On Ubuntu, install maven with apt-get:

$ sudo apt-get install maven
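To confirm that maven is installed and picks up the JDK configured above, check its version (the exact maven version printed depends on the Ubuntu package):

$ mvn -version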

Create a new user and user group
We create a new group named "hadoop" and a new user named "hduser" who belongs to the "hadoop" group (we are just following the naming convention used in tutorials all over the Internet).

$ sudo addgroup hadoop
$ sudo adduser --ingroup hadoop hduser

After the new user is created, all of the following operations are performed as that user:

$ su hduser

Establish ssh trust
When Hadoop starts, it needs ssh access to localhost, so we establish a trust relationship to avoid typing a password each time.

$ cd /home/hduser
$ ssh-keygen -t rsa -P ""
$ cat .ssh/id_rsa.pub >> .ssh/authorized_keys

Run the following to verify that a password-free connection to localhost works:

$ ssh localhost

The -P "" above means the rsa key has no passphrase; personally I think it is better to give it one and manage it with ssh-agent, so that you do not have to type it every time.
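ssh-add needs a running agent; if one is not already active in your session, start it first (a standard step, shown here for completeness):

$ eval "$(ssh-agent -s)"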

$ ssh-add ~/.ssh/id_rsa.pub   # mistake: the parameter here is the public key; the private key should be passed in (thanks to twlkyao for the reminder)
$ ssh-add ~/.ssh/id_rsa

Compile Hadoop 2.2.0
Download Hadoop 2.2.0 from http://www.apache.org/dyn/closer.cgi/hadoop/common/

Decompress it into the user's home directory /home/hduser/ and enter the hadoop-2.2.0-src directory.
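For example, assuming the source tarball is named hadoop-2.2.0-src.tar.gz and was downloaded to the home directory (adjust the filename to whatever your mirror provides):

$ tar -zxvf hadoop-2.2.0-src.tar.gz -C /home/hduser/
$ cd /home/hduser/hadoop-2.2.0-src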

Since the maven, protobuf, and Java environments are all in place, the build can be run directly.

$ mvn package -Pdist,native -DskipTests -Dtar

This should run normally without errors. For these parameters and other compilation options, see the BUILDING.txt file in the Hadoop source directory.
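When the build finishes, the native libraries end up under hadoop-dist/target; a quick way to confirm they are really 64-bit now (the path below matches the usual 2.2.0 build layout, adjust it if yours differs):

$ file hadoop-dist/target/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0   # should now report "ELF 64-bit LSB shared object"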

If the following error occurs during compilation:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:29.469s
[INFO] Finished at: Mon Nov 18 12:30:36 PST 2013
[INFO] Final Memory: 37MB/120MB
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-auth: Compilation failure:
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[84,13] cannot access org.mortbay.component.AbstractLifeCycle
[ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
[ERROR] server = new Server(0);
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[94,29] cannot access org.mortbay.component.LifeCycle
[ERROR] class file for org.mortbay.component.LifeCycle not found
[ERROR] server.getConnectors()[0].setHost(host);
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[96,10] cannot find symbol
[ERROR] symbol: method start()
[ERROR] location: class org.mortbay.jetty.Server
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[] cannot find symbol
[ERROR] symbol: method stop()
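This failure corresponds to the known HADOOP-10110 issue in the 2.2.0 source release. The commonly cited fix is to add a jetty-util dependency with test scope to hadoop-common-project/hadoop-auth/pom.xml, next to the existing org.mortbay.jetty dependency, and then rerun the mvn package command above. A sketch of the addition (verify it against your own source tree):

<dependency>
   <groupId>org.mortbay.jetty</groupId>
   <artifactId>jetty-util</artifactId>
   <scope>test</scope>
</dependency>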
