Windows Eclipse Debugging Hadoop Walkthrough


1) Download eclipse

http://www.eclipse.org/downloads/

Eclipse Standard 4.3.2 64-bit

2) Download the Eclipse plug-in matching your Hadoop version

My Hadoop is 1.0.4, so download hadoop-eclipse-plugin-1.0.4.jar.

Download Address: http://download.csdn.net/detail/m_star_jy_sy/7376169

3) Install the Hadoop plugin

Copy hadoop-eclipse-plugin-1.0.4.jar into Eclipse's plugins directory.

4) Restart Eclipse and check whether the Hadoop plug-in loaded successfully

Click menu: Window/Open Perspective/Other.... The Open Perspective dialog box pops up; if a Map/Reduce entry appears, the installation was successful, as shown in the following figure:

5) Set the Hadoop installation path

Select menu: Window/Preferences. The Preferences dialog box pops up, as shown in the following image:

Select Hadoop Map/Reduce and set Hadoop installation directory (the Hadoop installation path). This is the directory where the hadoop-1.0.4.tar.gz package was unpacked.

6) Configure Hadoop

Select menu: Window/Show View/Other.... The Show View dialog box pops up.

In the dialog box, select Map/Reduce Locations under MapReduce Tools to open the Map/Reduce Locations view, as shown in the following illustration:

In the Map/Reduce Locations view, right-click and choose the New Hadoop Location... menu item. The New Hadoop Location dialog box is displayed, as shown in the following illustration:

In the pop-up dialog, configure the Location name (for example, hadoop) along with the Map/Reduce Master and DFS Master. The hosts and ports are the addresses and ports you configured in mapred-site.xml and core-site.xml, respectively. Set User name to the account that runs Hadoop, for example: hadoop.
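For reference, those host/port pairs come from configuration entries like the following. The values below are illustrative placeholders (the IP matches the one used later in this walkthrough; your cluster's actual core-site.xml and mapred-site.xml are authoritative):

```xml
<!-- core-site.xml: DFS Master host and port -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.0.19:9000</value>
</property>

<!-- mapred-site.xml: Map/Reduce Master host and port -->
<property>
  <name>mapred.job.tracker</name>
  <value>192.168.0.19:9001</value>
</property>
```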

Exit after configuration is complete. Expand DFS Locations --> hadoop. If it can display the folders, the configuration is correct; if a "Connection refused" error is displayed, check your configuration.

So far, the environment is basically set up. The classic WordCount example is used below to test it.

7) Prepare test data

Create a new word.txt locally with the following contents:

Java C++ python

Java C++ JavaScript

Hello Hadoop

MapReduce Java Hadoop hbase

Copy the local word.txt to HDFS with the copyFromLocal command, as follows:

$ hadoop fs -copyFromLocal /usr/hadoop/word.txt word.txt

This example assumes that word.txt is stored in the /usr/hadoop/ directory.

8) New Hadoop project

File --> New --> Other --> Map/Reduce Project

The project name is arbitrary, for example WordCount. Copy src/examples/org/apache/hadoop/examples/WordCount.java from the Hadoop installation directory into the project you just created.
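WordCount.java itself is written against the Hadoop MapReduce API, but the computation it performs is easy to state: the map step splits each input line into words, and the reduce step sums the counts per word. The following plain-Java sketch (class and method names are illustrative, not the actual example source) shows that computation without any Hadoop dependencies:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Plain-Java sketch of what the WordCount job computes:
// split each line on whitespace (map) and sum per-word counts (reduce).
public class WordCountSketch {
    public static Map<String, Integer> countWords(List<String> lines) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String line : lines) {
            for (String word : line.trim().split("\\s+")) {
                if (!word.isEmpty()) {
                    counts.merge(word, 1, Integer::sum);
                }
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        // The same sample data as word.txt above.
        List<String> lines = Arrays.asList(
                "Java C++ python",
                "Java C++ JavaScript",
                "Hello Hadoop",
                "MapReduce Java Hadoop hbase");
        countWords(lines).forEach((w, c) -> System.out.println(w + "\t" + c));
    }
}
```

On the sample data this yields Java three times, C++ and Hadoop twice, and the remaining words once each; the real job produces the same pairs in its part-r-* output files.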

9) Run the program

Right-click the WordCount project and select Run As --> Run Configurations.... The Run Configurations dialog box pops up, as shown in the following illustration:

Right-click Java Application on the left and select New to create a configuration item named WordCount. Set the program arguments on the Arguments tab:

hdfs://192.168.0.19:9000/user/hadoop/word.txt

hdfs://192.168.0.19:9000/user/hadoop/out

The first line is the input file; the second line is the output directory.

If running reports java.lang.OutOfMemoryError: Java heap space, configure the VM arguments:

-Xms512m -Xmx1024m -XX:MaxPermSize=256m

When you are finished setting, click Run.

10) Error 1

Phenomenon:

ERROR security.UserGroupInformation: PriviledgedActionException as:zhuming cause:java.io.IOException: Failed to set permissions of path: /tmp/hadoop-zhuming/mapred/staging/zhuming1380233490/.staging to 0700

Exception in thread "main" java.io.IOException: Failed to set permissions of path: /tmp/hadoop-zhuming/mapred/staging/zhuming1380233490/.staging to 0700

Solution:

Download hadoop-core-1.0.4-modified.jar and replace the hadoop-core-1.0.4.jar file in the Hadoop installation directory.

Download Address: http://download.csdn.net/detail/m_star_jy_sy/7376283

11) Error 2

Phenomenon:

org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=zhuming, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)

Reason:

When Eclipse submits a job through the Hadoop plug-in, it writes to the HDFS file system as zhuming (the current Windows user) by default, under /user/xxx on HDFS (here, /user/hadoop). Because the zhuming user has no write access to the /user/hadoop directory, the exception occurs.

Workaround:

Open up the permissions of the /user/hadoop directory with the following command: $ hadoop fs -chmod 777 /user/hadoop
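A common alternative workaround (not from the original article, and only advisable in test environments) is to disable HDFS permission checking entirely. In Hadoop 1.x this is the dfs.permissions property; add the following to hdfs-site.xml on the NameNode and restart HDFS:

```xml
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```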

Original link: http://blog.csdn.net/m_star_jy_sy/article/details/26476907
