Getting Started with Hadoop: Installing on Windows, Testing Hadoop

The previous articles described how to compile Hadoop on Windows; this article describes how to install Hadoop and perform a simple verification that the installation is correct. The machine used for compiling is separate from the machine where Hadoop is installed.

The machine I compiled on runs Windows 7; the machine I installed on runs Windows R2.

First step: After compiling, the hadoop-2.2.0.tar.gz file is generated in the target directory. Extract the file to a directory, then copy the entire directory to the target machine. Choose as simple a path as possible, such as E:\HD.

Second step: Add HADOOP_HOME to the system environment variables with the value E:\HD, and add %HADOOP_HOME%\bin to PATH.
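
For example (a minimal sketch, assuming the E:\HD path above and an elevated command prompt), the variables can be set with setx; editing them through the System Properties dialog works just as well. Note that setx truncates values longer than 1024 characters.

rem set HADOOP_HOME and extend PATH machine-wide (run from an elevated prompt)
setx HADOOP_HOME "E:\HD" /M
setx PATH "%PATH%;E:\HD\bin" /M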

Third step: Configure Hadoop

1. Modify the core-site.xml file

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

2. Modify the hdfs-site.xml file

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/hdp/data/dfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/hdp/data/dfs/datanode</value>
  </property>
</configuration>

3. Modify the yarn-site.xml file

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
</configuration>

4. Modify the mapred-site.xml file

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

Fourth step: Prepare the Hadoop cluster

Execute:

cd /d %HADOOP_HOME%\bin
hdfs namenode -format

NOTE: If you see a message that JAVA_HOME is not set or is set incorrectly, install the JDK (preferably to a path without spaces) and then set the JAVA_HOME system variable.
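
For example (a sketch only; C:\Java\jdk1.7.0_45 is a hypothetical install path, so substitute your own), JAVA_HOME can be set from an elevated command prompt:

rem the JDK path below is an example; point it at your actual installation
setx JAVA_HOME "C:\Java\jdk1.7.0_45" /M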

Fifth step: Start HDFS

Execute:

cd /d %HADOOP_HOME%\sbin
start-dfs

Sixth step: Start YARN

Execute:

cd /d %HADOOP_HOME%\sbin
start-yarn
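
As a quick sanity check (a sketch, assuming the JDK's bin directory is on PATH), the jps tool from the JDK lists the running Java processes; after both scripts have started you should see entries such as NameNode, DataNode, ResourceManager and NodeManager:

jps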

Seventh step: Verify that the Hadoop cluster is working

Open http://localhost:8042 (the NodeManager web UI) and http://localhost:50070 (the NameNode web UI) in a browser.

Eighth step: Test the cluster

Execute:

hadoop dfs -mkdir /test

and then:

hadoop dfs -ls /
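
As a further check (a sketch only: somefile.txt is a placeholder for any local text file, and the jar path assumes the standard Hadoop 2.2.0 distribution layout under %HADOOP_HOME%), the bundled wordcount example can be run against the new HDFS directory:

rem somefile.txt is a placeholder; use any local text file
hadoop fs -mkdir /test/input
hadoop fs -put somefile.txt /test/input
hadoop jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-2.2.0.jar wordcount /test/input /test/output
hadoop fs -cat /test/output/part-r-00000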



