[Repost] Installing pseudo-distributed Hadoop 1.2.1 under CentOS

Source: Internet
Author: User

Original post: http://blog.csdn.net/yinan9/article/details/16805275

Environment: CentOS 5.10 (under virtual machine)

    [root@localhost hadoop]# lsb_release -a
    LSB Version:    :core-4.0-ia32:core-4.0-noarch:graphics-4.0-ia32:graphics-4.0-noarch:printing-4.0-ia32:printing-4.0-noarch
    Distributor ID: CentOS
    Description:    CentOS release 5.10 (Final)
    Release:        5.10
    Codename:       Final

Preparation

JDK Installation and Configuration

Download the JDK from the Oracle website. Here I downloaded jdk-6u45-linux-i586.bin and uploaded it to the virtual machine. As the root user, execute the following commands to create a folder, move the installation file there, and run the installer:

    mkdir /usr/java
    mv /home/auxu/desktop/jdk-6u45-linux-i586.bin /usr/java
    cd /usr/java
    ./jdk-6u45-linux-i586.bin

Configure environment variables

    vi /etc/profile

Add the following lines:

    export JAVA_HOME=/usr/java/jdk1.6.0_45
    export JRE_HOME=$JAVA_HOME/jre
    export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
    export PATH=$PATH:$JAVA_HOME/bin

After the save is complete, execute:

    source /etc/profile

Verify the Java configuration

    [root@localhost java]# java -version
    java version "1.6.0_45"
    Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
    Java HotSpot(TM) Client VM (build 20.45-b01, mixed mode, sharing)

You can also write a simple Java class to test the installation; there is not much to explain there.

Create the hadoop user and related application folders

Still as the root user, create a new user named hadoop:

    useradd hadoop
    passwd hadoop

Create the application folders used by the subsequent Hadoop configuration:

    mkdir /hadoop
    mkdir /hadoop/hdfs
    mkdir /hadoop/hdfs/data
    mkdir /hadoop/hdfs/name
    mkdir /hadoop/mapred
    mkdir /hadoop/mapred/local
    mkdir /hadoop/mapred/system
    mkdir /hadoop/tmp
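The eight mkdir calls above can also be collapsed into a single `mkdir -p` invocation, since it creates parent directories as needed and accepts multiple paths. A sketch (it writes under a throwaway prefix so it is safe to try; the article itself uses /hadoop):

```shell
# Sketch: recreate the article's /hadoop directory layout in one command.
# A temporary prefix is used here so the sketch is safe to run;
# substitute ROOT=/hadoop for the real installation.
ROOT=$(mktemp -d)/hadoop
mkdir -p "$ROOT/hdfs/data" "$ROOT/hdfs/name" \
         "$ROOT/mapred/local" "$ROOT/mapred/system" \
         "$ROOT/tmp"
find "$ROOT" -type d | sort
```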

Change the owner of these folders to the hadoop user:

    chown -R hadoop /hadoop

Set up the hadoop user so that it can SSH to localhost without a password:

    su - hadoop
    ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
    cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
    cd /home/hadoop/.ssh
    chmod 600 authorized_keys

Note the permissions here: make sure the .ssh directory is 700 and authorized_keys is 600.
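Those two permission values can be checked with `stat` before attempting to log in. A sketch against a throwaway directory; on the real machine you would point it at /home/hadoop/.ssh:

```shell
# Sketch: set and verify the permission bits sshd requires before it
# will honor authorized_keys. A throwaway directory stands in for
# /home/hadoop/.ssh here.
SSH_DIR=$(mktemp -d)/.ssh
mkdir -p "$SSH_DIR"
touch "$SSH_DIR/authorized_keys"
chmod 700 "$SSH_DIR"
chmod 600 "$SSH_DIR/authorized_keys"
stat -c '%a %n' "$SSH_DIR" "$SSH_DIR/authorized_keys"
```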

Verify:

    [hadoop@localhost .ssh]$ ssh localhost
    Last login: Sun Nov 17 22:11:55 2013

If you can connect to localhost over SSH without entering a password, the configuration is OK.

Install and configure Hadoop

Create a directory and install

Switch back to the root user and create the installation directory:

    mkdir /opt/hadoop

Move the installation file to the new directory, make sure it has execute permission, and then extract it:

    mv /home/auxu/desktop/hadoop-1.2.1.tar.gz /opt/hadoop
    cd /opt/hadoop
    tar -xzvf hadoop-1.2.1.tar.gz

Change the owner of the Hadoop installation directory to the hadoop user:

    chown -R hadoop /opt/hadoop

Switch to the hadoop user and modify the configuration files. The values below refer to the application folders created earlier; adjust them to your own situation.

    su - hadoop
    cd /opt/hadoop/hadoop-1.2.1/conf

core-site.xml

    <configuration>
        <property>
            <name>fs.default.name</name>
            <value>hdfs://localhost:9000</value>
        </property>
        <property>
            <name>hadoop.tmp.dir</name>
            <value>/hadoop/tmp</value>
        </property>
    </configuration>

hdfs-site.xml

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <property>
            <name>dfs.name.dir</name>
            <value>/hadoop/hdfs/name</value>
        </property>
        <property>
            <name>dfs.data.dir</name>
            <value>/hadoop/hdfs/data</value>
        </property>
    </configuration>

mapred-site.xml

    <configuration>
        <property>
            <name>mapred.job.tracker</name>
            <value>localhost:9001</value>
        </property>
    </configuration>

hadoop-env.sh

Configure JAVA_HOME and HADOOP_HOME_WARN_SUPPRESS.

PS: setting the HADOOP_HOME_WARN_SUPPRESS variable avoids warnings such as "Warning: $HADOOP_HOME is deprecated" in some cases.

    export JAVA_HOME=/usr/java/jdk1.6.0_45
    export HADOOP_HOME_WARN_SUPPRESS="TRUE"

    source hadoop-env.sh

Then update the /etc/profile file so that it finally looks like this:

    export JAVA_HOME=/usr/java/jdk1.6.0_45
    export JRE_HOME=$JAVA_HOME/jre
    export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
    export HADOOP_HOME=/opt/hadoop/hadoop-1.2.1
    export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Make the updated configuration file take effect:

    source /etc/profile

Test the Hadoop installation

    [hadoop@localhost conf]$ hadoop version
    Hadoop 1.2.1
    Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152
    Compiled by mattf on Mon Jul 22 15:23:09 PDT 2013
    From source with checksum 6923c86528809c4e7e6f493b6b413a9a

Start Hadoop

You need to format the NameNode before starting all of the services:

    hadoop namenode -format
    start-all.sh

View the processes

    [hadoop@localhost conf]$ jps
    6360 NameNode
    6481 DataNode
    6956 Jps
    6818 TaskTracker
    6610 SecondaryNameNode
    6698 JobTracker

If you can see all of these services, Hadoop has started successfully.

If there is any problem, check the corresponding logs under /opt/hadoop/hadoop-1.2.1/logs.

Finally, you can access the Hadoop services through the following links:

    http://localhost:50030/ for the JobTracker
    http://localhost:50070/ for the NameNode
    http://localhost:50060/ for the TaskTracker

(Screenshots: the Hadoop JobTracker, NameNode, and TaskTracker web interfaces.)

PS: A fully distributed installation is similar to the pseudo-distributed one; just pay attention to the following points:

1. Set up passwordless SSH login between the nodes in the cluster.

2. Specify concrete IP addresses (or hostnames) in the configuration files instead of localhost.

3. Configure the masters and slaves files, adding the relevant IP addresses (or hostnames).

The above configuration needs to be consistent across the nodes.
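As an illustration of point 3, on a small cluster the masters and slaves files might look like the following. The hostnames master01, slave01, and slave02 are hypothetical, and a temporary directory stands in for the real conf/ directory:

```shell
# Sketch: example masters/slaves files for a small fully distributed
# cluster. The hostnames are hypothetical; in practice edit the files
# under /opt/hadoop/hadoop-1.2.1/conf on every node.
CONF=$(mktemp -d)
printf '%s\n' master01         > "$CONF/masters"
printf '%s\n' slave01 slave02  > "$CONF/slaves"
cat "$CONF/masters" "$CONF/slaves"
```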

