The Learning Prelude to Hadoop: Installing and Configuring Hadoop on Linux

Source: Internet
Author: User
Tags: validation examples

Preface:

I currently do Android development at an information security company. The company's main products run inside virtual machines, and its roadmap is heading toward cloud computing. My feeling is that mobile internet + cloud computing + information security will be a promising direction, so I plunged in. Since I come from a Java background, Hadoop was the natural choice.


Author's system environment:

Linux: CentOS release 6.5 (Final)
JDK: java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
SSH: OpenSSH_5.3p1, OpenSSL 1.0.1e-fips 2013


Environment construction:

1. Installing the JDK

Installing the JDK is outside the scope of this post; you can find instructions yourself via Baidu or Google.


2. Configure SSH password-free login

The following steps assume your machine has network access.

(1) # yum install openssh-server openssh-clients    # install SSH (on CentOS the packages are named openssh-*, not "ssh")

(2) # mkdir -p /home/u/.ssh    # if this directory was not created automatically when SSH was installed, create it yourself

(3) # ssh-keygen -t dsa -P '' -f /home/u/.ssh/id_dsa

ssh-keygen generates the key pair

-t specifies the type of key to generate

dsa means DSA key authentication, i.e. the key type

-P supplies the passphrase (empty here, so login will not prompt for one)

-f specifies the file the generated key is written to

(4) # cat /home/u/.ssh/id_dsa.pub >> /home/u/.ssh/authorized_keys

# Append the public key to the authentication file; authorized_keys is the public-key file used for authentication

(5) # ssh -V

# Verify that SSH is installed; a correct installation prints the version string shown in the environment listing above.
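The steps above can be sketched as one script. This is a sketch only: it uses a scratch directory instead of /home/u so it can be run harmlessly, and it substitutes an RSA key because recent OpenSSH releases have dropped DSA key generation (with OpenSSH 5.3, as in the author's environment, `-t dsa` also works).

```shell
# Passwordless-SSH setup from steps (2)-(4), in a scratch directory so it
# can be run without touching /home/u.
DEMO=$(mktemp -d)
mkdir -p "$DEMO/.ssh"
ssh-keygen -t rsa -P '' -f "$DEMO/.ssh/id_rsa" -q        # empty passphrase, no prompt
cat "$DEMO/.ssh/id_rsa.pub" >> "$DEMO/.ssh/authorized_keys"
chmod 700 "$DEMO/.ssh"                                    # sshd rejects lax permissions
chmod 600 "$DEMO/.ssh/authorized_keys"
```

With the real /home/u/.ssh set up this way, `ssh localhost` should log in without prompting for a password.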



3. Install and run Hadoop

(1) Download hadoop-x.x.x

Extract it to a directory of your choice, such as /home/u
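The download-and-extract step can be sketched as follows. To keep the sketch runnable without network access, it builds a stand-in tarball locally; in practice you would fetch the real archive (Hadoop 1.2.1 tarballs live under archive.apache.org) and extract it to /home/u.

```shell
# Stand-in for: wget https://archive.apache.org/dist/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
WORK=$(mktemp -d)
mkdir -p "$WORK/hadoop-1.2.1/bin"
touch "$WORK/hadoop-1.2.1/bin/hadoop"
tar -czf "$WORK/hadoop.tar.gz" -C "$WORK" hadoop-1.2.1   # fake "downloaded" archive
# The extract step itself, targeting a scratch dir instead of /home/u:
mkdir -p "$WORK/home-u"
tar -xzf "$WORK/hadoop.tar.gz" -C "$WORK/home-u"
```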


(2) Modify the configuration files

# vim /home/u/hadoop-1.2.1/conf/core-site.xml

# vim /home/u/hadoop-1.2.1/conf/hdfs-site.xml

# vim /home/u/hadoop-1.2.1/conf/mapred-site.xml
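The post does not show the edits themselves; for a single-machine (pseudo-distributed) Hadoop 1.x setup the commonly used minimal values are sketched below. The localhost ports 9000/9001 are conventions, not requirements.

```xml
<!-- conf/core-site.xml: the default file system (NameNode address) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/hdfs-site.xml: one node, so keep a single replica -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

<!-- conf/mapred-site.xml: the JobTracker address -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```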

(3) # /home/u/hadoop-1.2.1/bin/hadoop namenode -format    # format the file system


(4) # /home/u/hadoop-1.2.1/bin/start-all.sh    # start all processes


(5) Verify that Hadoop is installed successfully

Verify in a browser by visiting the following URLs:

http://localhost:50030 (Web page for MapReduce)

http://localhost:50070 (HDFS Web page)

Validation examples:

Web page for MapReduce (screenshot)

Web page for HDFS (screenshot)


Problems encountered:

1. When starting Hadoop, it always says JAVA_HOME is not configured


When I executed bin/start-all.sh in the Hadoop folder as in the tutorial, it always reported that JAVA_HOME is not set.

But I had set JAVA_HOME in the shell, and checking the system's JAVA_HOME showed it was fine, as follows:

This puzzled me. I found a forum thread describing a situation somewhat like mine, and one sentence there made me realize my mistake: the JAVA_HOME in the error message has to be set in Hadoop's own configuration file.

Run vim /home/u/hadoop-1.2.1/conf/hadoop-env.sh (adjust the directory to your own path) and make the following change:
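The change is a single line in conf/hadoop-env.sh. The JDK path below is only an example for the CentOS OpenJDK 1.7 package layout; substitute your own JDK location (`readlink -f "$(which java)"` helps find it).

```shell
# Add (or uncomment and edit) this line in conf/hadoop-env.sh.
# Example path for the CentOS OpenJDK 1.7 package; adjust to your JDK.
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk.x86_64
```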



2. -bash: bin/hadoop: Permission denied

If you downloaded directly on Linux, there should be no problem. But if, like the author, you uploaded the files to Linux with WinSCP, one small thing needs changing, or this error is reported:


You can see the error occurs when we run the hadoop executable, so we just need to fix that file's permissions. Because other executables in bin/ will be needed later, I changed the permissions of all of them at once. That is a lazy shortcut, fine while studying and testing; from a security standpoint you should not do this.
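A minimal sketch of the fix, reproduced on a scratch file so it is safe to run; in practice you would run the chmod against the real /home/u/hadoop-1.2.1/bin directory.

```shell
# Reproduce the symptom: a script that arrived without its execute bit.
DEMO_BIN=$(mktemp -d)
printf '#!/bin/sh\necho ok\n' > "$DEMO_BIN/hadoop"
chmod 644 "$DEMO_BIN/hadoop"          # WinSCP uploads often land like this
# The fix (on the real tree: chmod +x /home/u/hadoop-1.2.1/bin/*):
chmod +x "$DEMO_BIN/hadoop"
"$DEMO_BIN/hadoop"                    # runs instead of "Permission denied"
```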


3. Hadoop Safemode: ON - HDFS unavailable

Even after finishing the earlier configuration, there may still be a problem: the HDFS web page cannot be accessed.

This problem is actually a leftover of the earlier misconfiguration. We discussed the file-execution permission issue above, and it is exactly that permission problem that interfered when we formatted HDFS. Stopping the previously started processes (bin/stop-all.sh) and then re-running the format fixes it. (If the NameNode is merely stuck in safe mode, bin/hadoop dfsadmin -safemode leave forces it out.)

Copyright notice: this is the blogger's original article; do not reproduce it without the blogger's permission. http://blog.csdn.net/lemon_tree12138

