Single-machine pseudo-distributed deployment of Hadoop under Windows (2)


The following walks through the installation and configuration of Hadoop.

(1) Installing the JDK

I installed JDK 1.7.0_40, the Windows x64 version.

After downloading, run the installer directly.

My installation path is the default: C:\Program Files\Java\jdk1.7.0_40

When the installation is complete, set the environment variables:

JAVA_HOME should point to the JDK installation directory.

Path should include the JDK's bin directory.

After the setup is complete, enter the java command in CMD; if the usage text appears, the installation succeeded.
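If you prefer to work from the Cygwin shell, the same two variables can be set there for the current session. A minimal sketch, assuming the default install path from above (adjust it if yours differs):

```shell
# Point JAVA_HOME at the JDK and put its bin/ on PATH for this shell session.
# /cygdrive/c/... is how Cygwin exposes C:\ (path is an assumption - use yours).
export JAVA_HOME="/cygdrive/c/Program Files/Java/jdk1.7.0_40"
export PATH="$JAVA_HOME/bin:$PATH"
echo "$JAVA_HOME"
```

To make these permanent under Cygwin, append the two export lines to ~/.bashrc.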

(2) Installing Hadoop

Download Hadoop from http://hadoop.apache.org

The version I downloaded is hadoop-1.2.1.tar.gz.

After downloading, unpack the archive into the C:\cygwin64\home\ldm\ directory and rename the folder to hadoop.
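The unpack-and-rename step can be sketched as below. This is a self-contained demo: it builds a tiny stand-in tarball in a scratch directory so it can run anywhere; on the real machine you would run the same tar and mv commands against the downloaded hadoop-1.2.1.tar.gz inside /home/ldm.

```shell
work=$(mktemp -d)                     # scratch dir standing in for /home/ldm
cd "$work"
# Build a tiny stand-in archive so the sketch is runnable end to end:
mkdir hadoop-1.2.1 && echo demo > hadoop-1.2.1/README.txt
tar -czf hadoop-1.2.1.tar.gz hadoop-1.2.1 && rm -r hadoop-1.2.1
# The two commands you would actually run after downloading the release:
tar -xzf hadoop-1.2.1.tar.gz          # unpack the tarball
mv hadoop-1.2.1 hadoop                # rename so later paths read ~/hadoop
ls hadoop                             # -> README.txt (the unpacked contents)
```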

Configure conf/hadoop-env.sh:

Set the JDK path:

export JAVA_HOME=/usr/local/jdk

For this path to resolve, first execute the following command under Cygwin:

ln -s "C:\Program Files\Java\jdk1.7.0_40" /usr/local/jdk

This generates a symbolic link. If you instead put the Windows path (which contains spaces) directly into hadoop-env.sh, Hadoop's scripts may fail to recognize it.
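What the link buys you can be sketched in a scratch directory (the JDK path here is a stand-in; only the ln -s pattern matters):

```shell
scratch=$(mktemp -d)
# Stand-in for the real install dir, whose path contains spaces:
mkdir -p "$scratch/Program Files/Java/jdk1.7.0_40/bin"
# The link provides a space-free alias that shell scripts handle safely:
ln -s "$scratch/Program Files/Java/jdk1.7.0_40" "$scratch/jdk"
ls "$scratch/jdk"        # resolves through the link -> bin
readlink "$scratch/jdk"  # prints the space-containing target it points at
```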

Configure conf/core-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/hadoop</value>
  </property>
</configuration>

Configure conf/hdfs-site.xml:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

Configure conf/mapred-site.xml:

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

(3) Start Hadoop

Format the file system:

hadoop namenode -format

Start Hadoop:

Start/stop all daemons: start-all.sh / stop-all.sh

Start/stop HDFS only: start-dfs.sh / stop-dfs.sh

Start/stop MapReduce only: start-mapred.sh / stop-mapred.sh

(4) Access to Hadoop

Important ports for Hadoop:

JobTracker management interface: 50030

HDFS management interface: 50070

HDFS communication port: 9000

MapReduce communication port: 9001

Common access interfaces:

NameNode interface: http://localhost:50070

JobTracker management interface: http://localhost:50030

If Hadoop started successfully, both pages above will load in a browser.
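A quick way to check those ports from a Cygwin bash shell is the /dev/tcp pseudo-device built into bash, which needs no extra tools. This is only a reachability probe; a port reporting closed means the corresponding daemon is not listening:

```shell
# Probe each Hadoop port on localhost; redirecting to /dev/tcp/<host>/<port>
# makes bash attempt a TCP connection, failing if nothing is listening.
for port in 50070 50030 9000 9001; do
  if (exec 3<>"/dev/tcp/localhost/$port") 2>/dev/null; then
    echo "port $port: open"
  else
    echo "port $port: closed"
  fi
done
```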
