Building a Hadoop Eclipse Development Environment

Source: http://www.cnblogs.com/justinzhang/p/4261851.html

This document comes from my Evernote. When I was still at Baidu, I had a complete Hadoop development/debugging environment, but at that time I had grown tired of writing blog posts. It cost me a day's spare time to recover the setup from where I had stopped; I hope the posts will keep coming. I still cherish my time there: do the same thing at a different time in a different place and the things are still there, but the people are no longer the same. Enough talk; let's get started.

In Hadoop Cluster Setup, a Hadoop cluster was set up for development and testing. In this article, we will show how to use Eclipse as the development environment for writing and testing programs.

1.) Download the hadoop-eclipse-plugin-1.0.3.jar Eclipse plugin from http://download.csdn.net/detail/uestczhangchao/8409179. This article uses Eclipse Java EE IDE for Web Developers, Version: Luna Release (4.4.0) as the IDE. Place the downloaded hadoop-eclipse-plugin-1.0.3.jar file in Eclipse's plugins directory (for MyEclipse, use the D:\program_files\myeclipse\MyEclipse 10\dropins\svn\plugins directory).

2.) In Eclipse, open Window -> Preferences, choose Hadoop Map/Reduce, and set the Hadoop installation directory; here I copied it directly from /home/hadoop/hadoop-1.0.3 on Linux. Click the OK button:

3.) Create a new Map/Reduce project:

4.) After you create the Map/Reduce project, the following two things appear: the DFS Locations node and the newly created Java project, which automatically includes the Hadoop jar dependencies:

5.) The project created with the plug-in comes with a dedicated Map/Reduce perspective to match:

6.) In Map/Reduce Locations, select the Edit Hadoop location ... option and configure the Map/Reduce Master and DFS Master settings:
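The two host/port pairs entered in this dialog are the client-side equivalents of fs.default.name in the cluster's core-site.xml and mapred.job.tracker in mapred-site.xml. For reference, a minimal sketch of setting the same values programmatically; the master hostname and ports below are assumptions, so substitute the values from your own cluster:

import org.apache.hadoop.conf.Configuration;

public class ClusterConf {
    // Returns a Configuration pointing at the cluster, mirroring what the
    // plugin's Hadoop location dialog stores. Host/ports are placeholders.
    public static Configuration create() {
        Configuration conf = new Configuration();
        // DFS Master: must match fs.default.name in core-site.xml
        conf.set("fs.default.name", "hdfs://master:9000");
        // Map/Reduce Master: must match mapred.job.tracker in mapred-site.xml
        conf.set("mapred.job.tracker", "master:9001");
        return conf;
    }
}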

7.) Under Advanced parameters, set Hadoop's configuration options: set dfs.data.dir to the same value as in the Linux environment, and set all path-related parameters to the corresponding Linux paths:

8.) Once the Hadoop cluster connection is configured, you can see the files on the Hadoop cluster under DFS Locations, where they can be added and removed:
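What the DFS Locations view does through the GUI can also be done with the HDFS FileSystem API. A minimal sketch, with the connection string and all paths as placeholders rather than values from the original article:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DfsBrowse {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://master:9000"); // placeholder master
        FileSystem fs = FileSystem.get(conf);

        // List files, as the DFS Locations tree does.
        for (FileStatus st : fs.listStatus(new Path("/user/hadoop"))) {
            System.out.println(st.getPath());
        }

        // Add a file (upload), as the plugin's context menu does.
        fs.copyFromLocalFile(new Path("/tmp/local.txt"),
                             new Path("/user/hadoop/local.txt"));

        // Remove a file (non-recursive delete).
        fs.delete(new Path("/user/hadoop/local.txt"), false);

        fs.close();
    }
}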

9.) In the generated Java project, add a Map/Reduce program; here I added a WordCount program as a test:
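For completeness, here is the classic WordCount, lightly simplified from the version that ships with the Hadoop 1.0.x examples (org.apache.hadoop.examples.WordCount); the original article does not reproduce its source, so this is a reference sketch:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Mapper: emit (word, 1) for every token in each input line.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Reducer (also used as combiner): sum the counts for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values,
                           Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "word count"); // 1.0.x-era constructor
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}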

10.) In the Java project's Run Configurations, set WordCount's arguments: the first argument is the input file's HDFS path, and the second is the HDFS output path:
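For example, the Program arguments field could contain two paths like the following (hostname and paths are placeholders, not the article's actual values; note that the output directory must not already exist, or the job will fail):

hdfs://master:9000/user/hadoop/input hdfs://master:9000/user/hadoop/output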

11.) After setting up WordCount's run configuration, select Run As -> Run on Hadoop:

12.) In the console, you can see the log output from the WordCount run:

13.) Under DFS Locations, you can see the results WordCount produced in the result directory:

14.) To debug the WordCount program, set a breakpoint in WordCount.java and click the Debug button; you can then step through the program:

At this point, the Hadoop + Eclipse development environment is complete.

15.) Exceptions encountered while setting up the environment. During setup I ran into some thorny problems, such as an error saying the Windows user does not have permission. Handling this exception requires modifying FileUtil.java in the Hadoop source, where the permission check is performed, and then recompiling Hadoop:

15/01/30 10:08:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/01/30 10:08:17 ERROR security.UserGroupInformation: PriviledgedActionException as:zhangchao3 cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-zhangchao3\mapred\staging\zhangchao3502228304\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-zhangchao3\mapred\staging\zhangchao3502228304\.staging to 0700
    at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
    at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
    at org.apache.hadoop.examples.WordCount.main(WordCount.java:68)
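The method being patched is checkReturnValue at the top of the stack trace above. A sketch of the commonly circulated workaround, assuming the Hadoop 1.0.x source layout (src/core/org/apache/hadoop/fs/FileUtil.java):

// Patched org.apache.hadoop.fs.FileUtil#checkReturnValue for local
// debugging on Windows. The stock method throws the IOException shown
// above when the JDK's File#setReadable/setWritable/setExecutable calls
// fail; the workaround is simply not to treat that failure as fatal.
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission)
                                     throws IOException {
    // Original code, disabled for Windows development:
    // if (!rv) {
    //   throw new IOException("Failed to set permissions of path: " + p +
    //                         " to " +
    //                         String.format("%04o", permission.toShort()));
    // }
}

After making the edit, rebuild hadoop-core and replace the jar on the Windows client. Note that this disables a safety check, so it is only appropriate for local development, not for production clusters.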
