"Hadoop 2.6" hadoop2.6 pseudo-distributed mode environment for building test use


First, download the installation package from the Apache official website (not covered in detail here). It is about 186 MB, which is quite big.
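For example, something like the following works (a minimal sketch; the archive mirror URL is an assumption, so adjust it to whichever mirror you actually use):

    # download the Hadoop 2.6.0 binary tarball (URL assumes the Apache archive mirror)
    wget https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0/hadoop-2.6.0.tar.gz
    # extract it and enter the unpacked directory
    tar -xzf hadoop-2.6.0.tar.gz
    cd hadoop-2.6.0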

After extracting, the directory structure looks like this:

    [root@localhost hadoop-2.6.0]# ll
    total 64
    drwxr-xr-x 2 20000 20000  4096 Nov 14 05:20 bin
    drwxr-xr-x 3 20000 20000  4096 Nov 14 05:20 etc
    drwxr-xr-x 2 20000 20000  4096 Nov 14 05:20 include
    drwxr-xr-x 2 root  root   4096 Jan 14 14:52 input
    drwxr-xr-x 3 20000 20000  4096 Nov 14 05:20 lib
    drwxr-xr-x 2 20000 20000  4096 Nov 14 05:20 libexec
    -rw-r--r-- 1 20000 20000 15429 Nov 14 05:20 LICENSE.txt
    drwxr-xr-x 2 root  root   4096 Jan 14 15:23 logs
    -rw-r--r-- 1 20000 20000   101 Nov 14 05:20 NOTICE.txt
    drwxr-xr-x 2 root  root   4096 Jan 14 14:53 output
    -rw-r--r-- 1 20000 20000  1366 Nov 14 05:20 README.txt
    drwxr-xr-x 2 20000 20000  4096 Nov 14 05:20 sbin
    drwxr-xr-x 4 20000 20000  4096 Nov 14 05:20 share
   
For a comparison of the YARN framework with the earlier MapReduce framework, see: http://www.ibm.com/developerworks/cn/opensource/os-cn-hadoop-yarn/
After decompressing, let's first run an example in standalone mode:

The following example copies the unpacked conf directory to use as input and then finds and displays every match of the given regular expression. Output is written to the given output directory.

    $ mkdir input
    $ cp etc/hadoop/*.xml input
    $ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar grep input output 'dfs[a-z.]+'
    $ cat output/*
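With the stock configuration files, the final cat usually prints just one matched string and its count, along these lines (the exact result depends on your config files):

    1       dfsadmin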
Now let's look at pseudo-distributed mode.

Two configuration files are involved, both under hadoop-2.6.0/etc/hadoop:

core-site.xml

    <configuration>
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://localhost:9000</value>
        </property>
    </configuration>

hdfs-site.xml

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
    </configuration>
With both configuration files in place, don't forget to also configure JAVA_HOME.

It goes in hadoop-env.sh (and in yarn-env.sh as well, if you use YARN).
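For example, a line like the following in etc/hadoop/hadoop-env.sh (the JDK path here is only an illustration; point it at your own installation):

    # etc/hadoop/hadoop-env.sh -- example path; replace with your actual JDK location
    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk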

Next, set up passwordless SSH login to localhost:

    $ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
    $ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
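You can verify that it works; the following should log in without prompting for a password:

    $ ssh localhost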

With all that in place, the remaining steps are:

1. Format the file system

    bin/hdfs namenode -format
2. Start the NameNode and DataNode

    sbin/start-dfs.sh
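To confirm the daemons came up, you can run the JDK's jps tool; the output should look something like this (the process IDs are just illustrative):

    $ jps
    12345 NameNode
    12346 DataNode
    12347 SecondaryNameNode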
After this step, we can open the Hadoop web UI and look at the individual components: http://localhost:50070



Hadoop 2.6 feels pretty cool!

Next, create directories in the file system:

    bin/hdfs dfs -mkdir /user
    bin/hdfs dfs -mkdir /user/chiwei
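You can also verify from the command line:

    bin/hdfs dfs -ls /user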
Going back to the web page, we can see that the directories we just created have already appeared.

    bin/hdfs dfs -put input /user/chiwei

This puts the contents of the input folder into the file system directory we just created.
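To double-check that the upload landed:

    bin/hdfs dfs -ls /user/chiwei/input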


    bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar grep /user/chiwei/input output 'dfs[a-z.]+'

The command above uses the examples jar to analyze the files we just uploaded.


The output has been generated.
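To inspect it, you can cat the output files straight from HDFS (the relative path output resolves under the running user's HDFS home directory):

    bin/hdfs dfs -cat output/*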


Finally, shut down the file system: the DataNode, NameNode, and Secondary NameNode.
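This is done with the bundled stop script:

    sbin/stop-dfs.sh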



"Hadoop 2.6" hadoop2.6 pseudo-distributed mode environment for building test use

Contact Us

The content source of this page is from Internet, which doesn't represent Alibaba Cloud's opinion; products and services mentioned on that page don't have any relationship with Alibaba Cloud. If the content of the page makes you feel confusing, please write us an email, we will handle the problem within 5 days after receiving your email.

If you find any instances of plagiarism from the community, please send an email to: info-contact@alibabacloud.com and provide relevant evidence. A staff member will contact you within 5 working days.

A Free Trial That Lets You Build Big!

Start building with 50+ products and up to 12 months usage for Elastic Compute Service

  • Sales Support

    1 on 1 presale consultation

  • After-Sales Support

    24/7 Technical Support 6 Free Tickets per Quarter Faster Response

  • Alibaba Cloud offers highly flexible support services tailored to meet your exact needs.