Use Hadoop 2.2.0 on Ubuntu 12.04

Source: Internet
Author: User
Tags: node, server

This article describes how to install Hadoop 2.2.0 on a single node.

First, prepare a virtual machine running Ubuntu 12.04.4.

Java environment:

root@hm1:~# mvn --version
Apache Maven 3.1.1 (0728685237757ffbf44136acec0402957f723d9a; 15:22:22+0000)
Maven home: /usr/apache-maven-3.1.1
Java version: 1.7.0_51, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-7-oracle/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.2.0-59-virtual", arch: "amd64", family: "unix"
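The steps above assume Oracle Java 7 and Maven are already present. If Java is missing, one common way to install it on Ubuntu 12.04 (an assumption on my part, not a step from the original article) is the webupd8team PPA:

$ # Assumption: Oracle Java 7 via the webupd8team PPA; not part of the original article
$ sudo add-apt-repository ppa:webupd8team/java
$ sudo apt-get update
$ sudo apt-get install oracle-java7-installer

After installation, java -version should report 1.7.0.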

Create a Hadoop user and group: group hadoop, user name hduser, password hduser.

root@hm1:~# addgroup hadoop
Adding group 'hadoop' (GID 1001) ...
Done.
root@hm1:~# adduser --ingroup hadoop hduser
Adding user 'hduser' ...
Adding new user 'hduser' (1001) with group 'hadoop' ...
Creating home directory '/home/hduser' ...
Copying files from '/etc/skel' ...
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
Changing the user information for hduser
Enter the new value, or press ENTER for the default
    Full Name []:
    Room Number []:
    Work Phone []:
    Home Phone []:
    Other []:
Is the information correct? [Y/n] y

Add hduser to the sudo group:

root@hm1:~# adduser hduser sudo
Adding user 'hduser' to group 'sudo' ...
Adding user hduser to group sudo
Done.
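A quick check (an addition of mine, not in the original article) confirms the membership took effect; the output should list both the hadoop and sudo groups:

$ # Assumption: simple verification step, not part of the original walkthrough
$ id hduser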

This is to prevent the following error when using sudo as hduser later:

hduser is not in the sudoers file. This incident will be reported.

You need to run the visudo command to edit the file /etc/sudoers and add a line:

# Uncomment to allow members of group sudo to not need a password
# %sudo ALL=NOPASSWD: ALL
hduser ALL=(ALL) ALL
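After saving the file, a syntax check helps avoid locking yourself out of sudo; this step is my addition, not from the original article:

$ # Assumption: extra check, not in the original; visudo -c validates /etc/sudoers syntax
$ sudo visudo -c
/etc/sudoers: parsed OK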

Log out of the root account and log in as hduser:

ssh hduser@192.168.1.70

To avoid prompts from the installation scripts, the following commands create an SSH key pair and authorize passwordless access to localhost.

hduser@hm1:~$ ssh-keygen -t rsa -P ''
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hduser/.ssh/id_rsa):
Created directory '/home/hduser/.ssh'.
Your identification has been saved in /home/hduser/.ssh/id_rsa.
Your public key has been saved in /home/hduser/.ssh/id_rsa.pub.
The key fingerprint is:
b8:b6:3d:c2:24:1f:7b:a3:00:88:72:86:76:5a:d8:c2 hduser@hm1
The key's randomart image is:
(randomart image omitted)
hduser@hm1:~$ cat ~/.ssh/id_rsa.pub > ~/.ssh/authorized_keys
hduser@hm1:~$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is fb:a8:6c:4c:51:57:b2:6d:36:b2:9c:62:94:30:40:a7.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
Welcome to Ubuntu 12.04.4 LTS (GNU/Linux 3.2.0-59-virtual x86_64)

 * Documentation:  https://help.ubuntu.com/
Last login: Fri Feb 21 07:59:05 2014 from 192.168.1.5

If you are not asked for a password, the settings above were successful.
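If ssh localhost still asks for a password, overly open permissions on ~/.ssh are a common cause; tightening them (an extra step, not in the original article) usually fixes it:

$ # Assumption: permission fix, not part of the original walkthrough
$ chmod 700 ~/.ssh
$ chmod 600 ~/.ssh/authorized_keys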

Now download Hadoop. Download URL: http://apache.mirrors.lucidnetworks.net/hadoop/common/

Run the following commands to download the archive, unpack it, and set the file ownership.

$ cd ~
$ wget http://www.trieuvan.com/apache/hadoop/common/hadoop-2.2.0/hadoop-2.2.0.tar.gz
$ sudo tar vxzf hadoop-2.2.0.tar.gz -C /usr/local
$ cd /usr/local
$ sudo mv hadoop-2.2.0 hadoop
$ sudo chown -R hduser:hadoop hadoop
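Before moving on to configuring the single node, the Hadoop and Java locations are usually exported in hduser's ~/.bashrc. A minimal sketch, assuming the install paths used above (the variable names and this step are my assumption, not taken from this page of the original article):

# Minimal sketch of ~/.bashrc additions; paths assume the steps above (assumption)
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin

After running source ~/.bashrc, the hadoop version command should report 2.2.0.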

 


