Debugging Hadoop from Eclipse on Windows: A Detailed Walkthrough

Source: Internet
Author: User
Tags: hadoop, fs

1) Download eclipse

Eclipse Standard 4.3.2 64-bit

2) Download the Eclipse plugin that matches your Hadoop version

My Hadoop version is 1.0.4, so I downloaded hadoop-eclipse-plugin-1.0.4.jar.


3) Installing the Hadoop plugin

Copy hadoop-eclipse-plugin-1.0.4.jar into Eclipse's plugins directory.

4) Restart Eclipse to check if the Hadoop plugin is successfully loaded

Click menu Window > Open Perspective > Other... to bring up the Open Perspective dialog. If Map/Reduce appears in the list, the plugin was installed successfully.

5) Set the Hadoop installation path

Select menu Window > Preferences to open the Preferences dialog.

Select Hadoop Map/Reduce and set the Hadoop install directory, i.e. the directory where the hadoop-1.0.4.tar.gz installation package was unpacked.

6) Configure the Hadoop location

Select menu Window > Show View > Other... to bring up the Show View dialog.

In the dialog, select Map/Reduce Locations under MapReduce Tools to open the Map/Reduce Locations view.

In the Map/Reduce Locations view, right-click and choose the New Hadoop location... menu to bring up the New Hadoop Location dialog.

In this dialog you need to configure a location name (for example, hadoop), plus the Map/Reduce Master and DFS Master. Their host and port values are the ones you configured in mapred-site.xml and core-site.xml, respectively. Set User name to the account that runs Hadoop, for example: hadoop.
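For reference, typical Hadoop 1.x pseudo-distributed settings look like the following. The host names and ports here are only illustrative defaults; use whatever values are actually in your own mapred-site.xml and core-site.xml:

```xml
<!-- core-site.xml: determines the DFS Master host/port -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml: determines the Map/Reduce Master host/port -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```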

Exit after the configuration is complete. Expand DFS Locations > hadoop: if the folders can be shown, the configuration is correct; if "Connection refused" is displayed, please recheck your configuration.

At this point the environment is basically set up. The following walks through the classic WordCount example.

7) Prepare test data

Create a new word.txt locally with the following content:

Java C++ python C
Java C++ JavaScript
HelloWorld Hadoop
MapReduce Java Hadoop hbase

Copy the local word.txt to HDFS with the copyFromLocal command as follows:

$ hadoop fs -copyFromLocal /usr/hadoop/word.txt word.txt

This example assumes that word.txt is stored in the /usr/hadoop/ directory.

8) Create a new Hadoop project

File > New > Other... > Map/Reduce Project

The project name can be anything, for example WordCount. Copy the WordCount example from the Hadoop installation directory, under src/examples/org/apache/hadoop/examples/, into the project you just created.
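Conceptually, the WordCount example has a map step that emits a (word, 1) pair for every token, and a reduce step that sums the counts per word. The following is a minimal, self-contained sketch of that logic in plain Java, using the word.txt data from above; the class and method names here are my own for illustration, not Hadoop's Mapper/Reducer API, which the copied example source uses:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class WordCountSketch {

    // "Map" step: split a line on whitespace and emit a (word, 1) pair per token.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.trim().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(Map.entry(token, 1));
            }
        }
        return pairs;
    }

    // "Reduce" step: sum the emitted counts for each distinct word.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // The contents of word.txt from the previous step.
        String[] lines = {
            "Java C++ python C",
            "Java C++ JavaScript",
            "HelloWorld Hadoop",
            "MapReduce Java Hadoop hbase"
        };
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines) {
            pairs.addAll(map(line));
        }
        // E.g. "Java" appears 3 times, "Hadoop" 2 times.
        System.out.println(reduce(pairs));
    }
}
```

In the real job, Hadoop performs the grouping between the map and reduce phases itself (the shuffle), so the example's Reducer only ever sees the values for one key at a time.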

9) Run the program

Right-click the WordCount project and select Run As > Run Configurations...; the Run Configurations dialog pops up.

In the left pane, right-click Java Application and choose New to create a configuration named WordCount. Then set the program arguments on the Arguments tab:



The first line is the input file; the second line is the output path.
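For example, assuming HDFS runs on localhost:9000 and the paths from the earlier steps (substitute your own host, port, and paths; the output directory must not already exist):

```
hdfs://localhost:9000/user/hadoop/word.txt
hdfs://localhost:9000/user/hadoop/output
```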

If running the program reports java.lang.OutOfMemoryError: Java heap space, configure the VM arguments:
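A common fix is simply raising the JVM heap limit on the Arguments tab's VM arguments field; 512 MB is an illustrative value, so size it to your data:

```
-Xmx512m
```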


When you're finished setting up, click Run.

10) Error 1


ERROR security.UserGroupInformation: Failed to set permissions of path: \tmp\hadoop-zhuming\mapred\staging\zhuming1380233490\.staging to 0700

Exception in thread "main": Failed to set permissions of path: \tmp\hadoop-zhuming\mapred\staging\zhuming1380233490\.staging to 0700


Solution: download hadoop-core-1.0.4-modified.jar and use it to replace the hadoop-core-1.0.4.jar file in the Hadoop installation directory.


11) Error 2

Phenomenon:

Permission denied: user=zhuming, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(


Because Eclipse uses the Hadoop plugin to submit jobs, it writes the job to the HDFS file system as zhuming (the current Windows user) by default, under the corresponding /user/xxx directory on HDFS (in my case /user/hadoop). Since the zhuming user has no write permission on the /user/hadoop directory, the exception occurs.


Solution: open up the permissions on the /user/hadoop directory with the following command: $ hadoop fs -chmod 777 /user/hadoop
