Learn Hadoop with Me, Step by Step (2): Installing the Hadoop Eclipse Plugin and Running the WordCount Program


This post uses Hadoop 0.20.2.

Installing hadoop-0.20.2-eclipse-plugin.jar
    1. Download the hadoop-0.20.2-eclipse-plugin.jar file and add it to Eclipse's plugin library. The simplest way is to locate the plugins directory under the Eclipse installation directory, copy the jar into it, and restart Eclipse.
    2. In Eclipse, click Window -> Show View -> Other, type "map" in the pop-up window, and confirm that the Map/Reduce views appear as shown below.

If they do, the plugin has been installed successfully.

Map/Reduce Configuration
    1. Configure the Hadoop installation directory

      Click Window -> Preferences in Eclipse, find Hadoop Map/Reduce in the pop-up window, and select the Hadoop installation directory (the installation files here do not need to be exactly the same as the Hadoop environment on the cluster).

    2. Configure Hadoop Map/Reduce Locations

In the Map/Reduce Locations view, click the label to add a new location.

A pop-up window appears; enter the values as prompted in the figure.

On the Advanced Parameters tab, enter the values shown in the following two screenshots.

Other settings

Verifying the Hadoop Map/Reduce Locations Configuration

In the Map/Reduce perspective's Project Explorer view, expand the Map/Reduce location you configured under DFS Locations. The configuration is fine if every node can be expanded.

Test the WordCount Program

Create the input directory in the HDFS file system:

hadoop fs -mkdir input
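Besides uploading through Eclipse's DFS Locations view, the input files can be put into HDFS from the command line. A sketch, where file1.txt and file2.txt are placeholder names for two local text files:

```shell
hadoop fs -put file1.txt input    # copy a local file into the HDFS input directory
hadoop fs -put file2.txt input
hadoop fs -ls input               # verify both files arrived
```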


Refresh DFS Locations in Eclipse and upload the files. Here I uploaded two files whose contents include some spaces (WordCount splits on whitespace to count words).
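The splitting rule can be previewed locally. The pipeline below, using made-up sample text, mimics WordCount's logic of splitting on whitespace and counting each distinct word:

```shell
# Split the text on whitespace (one word per line), then count
# duplicates -- the same logic WordCount's map and reduce steps implement.
echo "hello world hello hadoop" | tr -s ' ' '\n' | sort | uniq -c | sort -rn
```

Here "hello" is counted twice and the other words once, which is the kind of per-word tally the Hadoop job will write to its output file.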

Run WordCount

Running WordCount requires two command-line arguments: the first is the HDFS path of the folder whose words will be counted, and the second is the output path.

Note that the output directory should sit alongside the input directory, under the same parent. Double-clicking a file in the DFS Locations view shows its HDFS path; we want its parent directory, which here is hdfs://192.168.88.128:9000/user/root/input. For the output argument I used hdfs://192.168.88.128:9000/user/root/output.
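Concretely, those two paths go into the Arguments tab of the Eclipse run configuration. The same job can also be launched from the shell with the examples jar that ships with the release; the jar name below assumes the stock 0.20.2 distribution:

```shell
hadoop jar hadoop-0.20.2-examples.jar wordcount \
    hdfs://192.168.88.128:9000/user/root/input \
    hdfs://192.168.88.128:9000/user/root/output
```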

After the job finishes, refresh DFS Locations and you will see the output directory next to the input directory.

Execute the following command on the master machine:

hadoop fs -lsr /
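To inspect the actual counts, the result file can be printed directly. The part file name below is an assumption: depending on whether the job used the old or the new MapReduce API, it will be part-00000 or part-r-00000:

```shell
hadoop fs -cat /user/root/output/part-r-00000   # one "word<TAB>count" pair per line
```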

You will see the additional output directory with files under it; these files hold the word-count results.

It is getting late, so I will stop here. Tomorrow I will upload the plugin itself, along with a few Hadoop-related PDF documents.

Copyright notice: this is an original article by the blogger and may not be reproduced without permission.

