Complete configuration of the Eclipse-based Hadoop development environment on Windows (2)


Next, configure Hadoop.

1. Decompress the file

Open Cygwin and enter the following commands:

cd
explorer .

A Windows Explorer window will pop up showing the Cygwin user home directory. Copy the Hadoop archive into it and decompress it there. It is probably not strictly necessary to put it in the Cygwin user home directory, but I have not tried other locations.

2. Configure Hadoop

Open the decompressed folder. Under the hadoop-0.19.2/conf directory there is a file named hadoop-site.xml. Open it and insert the following properties between the existing <configuration> and </configuration> tags:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9100</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9101</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

Save.

3. Format the NameNode

Open a Cygwin window and go to the Hadoop directory (if you decompressed as described above, cd hadoop-0.19.2 is enough), then enter the following commands:

mkdir logs
bin/hadoop namenode -format

4. Install the Eclipse plug-in

In the decompressed folder, find hadoop-0.19.2-eclipse-plugin.jar under the hadoop-0.19.2/contrib/eclipse-plugin directory, copy that JAR file into Eclipse's plugins directory, and restart Eclipse. In Window > Open Perspective > Other, a Map/Reduce item should appear in the pop-up window, which indicates that the installation succeeded. If it does not appear, delete the configuration/org.eclipse.update folder under the Eclipse installation directory and restart Eclipse.
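The copy step can be sketched as a Cygwin shell snippet. The HADOOP_HOME and ECLIPSE_HOME paths below are assumptions; adjust them to wherever you decompressed Hadoop and installed Eclipse.

```shell
# Hypothetical paths -- adjust to where you decompressed Hadoop
# and where Eclipse is installed.
HADOOP_HOME="$HOME/hadoop-0.19.2"
ECLIPSE_HOME="/cygdrive/c/eclipse"
JAR="$HADOOP_HOME/contrib/eclipse-plugin/hadoop-0.19.2-eclipse-plugin.jar"

if [ -f "$JAR" ]; then
    # Copy the plug-in into Eclipse's plugins directory.
    cp "$JAR" "$ECLIPSE_HOME/plugins/" && echo "plug-in copied"
else
    echo "plug-in JAR not found: $JAR"
fi
```

Restarting Eclipse afterwards is still required for the plug-in to be picked up.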


5. Start the Hadoop cluster

Start five Cygwin windows.

In the first, start the namenode:
cd hadoop-0.19.2
bin/hadoop namenode

In the second, start the secondary namenode:
cd hadoop-0.19.2
bin/hadoop secondarynamenode

In the third, start the jobtracker:
cd hadoop-0.19.2
bin/hadoop jobtracker

In the fourth, start the datanode:
cd hadoop-0.19.2
bin/hadoop datanode

In the fifth, start the tasktracker:
cd hadoop-0.19.2
bin/hadoop tasktracker
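If you prefer not to juggle five windows, the same five daemons can also be started from a single Cygwin window. This is only a sketch, assuming the current directory is hadoop-0.19.2 and the logs directory from step 3 exists:

```shell
# Sketch: start all five Hadoop daemons in the background from one window.
# Assumes the current directory is hadoop-0.19.2 (so bin/hadoop exists).
STARTED=""
for daemon in namenode secondarynamenode jobtracker datanode tasktracker; do
    if [ -x bin/hadoop ]; then
        # Each daemon's console output goes to logs/<daemon>.out
        # instead of its own window.
        bin/hadoop "$daemon" > "logs/$daemon.out" 2>&1 &
        STARTED="$STARTED $daemon"
    fi
done
echo "started:$STARTED"
```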

6. Configure the environment in Eclipse

Start Eclipse, switch to the Map/Reduce perspective, and create a new location in the Map/Reduce Locations view with the following values (the ports must match hadoop-site.xml: 9101 for mapred.job.tracker and 9100 for fs.default.name):

* Location name -- localhost
* Map/Reduce Master
  o Host -- localhost
  o Port -- 9101
* DFS Master
  o Check "Use M/R Master host"
  o Port -- 9100
* User name -- default


7. Upload files to HDFS

Open a Cygwin window and execute:

cd hadoop-0.19.2
bin/hadoop fs -mkdir in
bin/hadoop fs -put *.txt in
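To double-check the upload from the command line, you can list the new directory. A sketch, assuming you are still inside hadoop-0.19.2:

```shell
# Sketch: list the uploaded files in HDFS.
# Assumes the current directory is hadoop-0.19.2.
if [ -x bin/hadoop ]; then
    OUT="$(bin/hadoop fs -ls in)"
else
    OUT="bin/hadoop not found here: cd into hadoop-0.19.2 first"
fi
echo "$OUT"
```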

The new directory should now be visible under the DFS location in Eclipse's Project Explorer. If it is not, reconnect the location.

Now that everything is ready, you can create a project in the next article.
