1) Download Eclipse
http://www.eclipse.org/downloads/
Eclipse Standard 4.3.2 64-bit
2) Download the Eclipse plugin for your Hadoop version
My Hadoop is 1.0.4, so download hadoop-eclipse-plugin-1.0.4.jar from:
http://download.csdn.net/detail/m_star_jy_sy/7376169
3) Install the Hadoop plugin
Copy hadoop-eclipse-plugin-1.0.4.jar into Eclipse's plugins directory.
4) Restart Eclipse and check whether the Hadoop plugin loaded successfully
Click menu Window/Open Perspective/Other.... The Open Perspective dialog box pops up; if Map/Reduce appears in the list, the installation was successful.
5) Set the Hadoop installation path
Select menu Window/Preferences; the Preferences dialog box appears.
Select Hadoop Map/Reduce and set Hadoop install directory to the Hadoop installation path, i.e., the directory where the hadoop-1.0.4.tar.gz package was unpacked.
6) Configure the Hadoop location
Select menu Window/Show View/Other...; the Show View dialog box pops up.
In the dialog box, select Map/Reduce Locations under MapReduce Tools to open the Map/Reduce Locations view.
In the Map/Reduce Locations view, right-click and choose New Hadoop location...; the New Hadoop Location dialog box pops up.
In the dialog you need to configure a Location name, such as hadoop, as well as the Map/Reduce Master and DFS Master. Their host and port are the addresses and ports you configured in mapred-site.xml and core-site.xml, respectively. Set User name to the account that runs Hadoop, for example: hadoop.
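For reference, the relevant settings typically look like the following (the host and ports here are assumptions matching the addresses used later in this article; check your own configuration files for the real values):

In core-site.xml (DFS Master):
<property>
  <name>fs.default.name</name>
  <value>hdfs://192.168.0.19:9000</value>
</property>

In mapred-site.xml (Map/Reduce Master):
<property>
  <name>mapred.job.tracker</name>
  <value>192.168.0.19:9001</value>
</property>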
Exit after the configuration is complete. Click DFS Locations --> hadoop. If the HDFS folders can be expanded and listed, the configuration is correct; if "Deny connection" is displayed, check your configuration.
At this point, the environment is basically set up. The following walks through the classic WordCount example.
7) Prepare test data
Create a new word.txt locally with the following content:
Java C++ python C Java C++ JavaScript HelloWorld Hadoop MapReduce Java Hadoop hbase
Copy the local word.txt to HDFS with the copyFromLocal command, as follows:
$ hadoop fs -copyFromLocal /usr/hadoop/word.txt word.txt
This example assumes that word.txt is stored in the /usr/hadoop/ directory.
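To double-check that the upload worked, you can list the HDFS user directory (the path below assumes the /user/hadoop home directory used throughout this article):
$ hadoop fs -ls /user/hadoop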
8) Create a new Hadoop project
File --> New --> Other --> Map/Reduce Project
The project name can be anything, such as WordCount. Copy src/examples/org/apache/hadoop/examples/WordCount.java from the Hadoop installation directory into the project you just created (the example is reproduced below for reference).
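The shipped example is essentially the classic WordCount. The following is a sketch of the Hadoop 1.x example (new mapreduce API), reproduced here for reference; prefer the file bundled with your own installation if they differ:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

  // Mapper: emits (word, 1) for every token in the input.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer (also used as combiner): sums the counts for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    if (otherArgs.length != 2) {
      System.err.println("Usage: wordcount <in> <out>");
      System.exit(2);
    }
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));   // input file
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1])); // output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}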
9) Run the program
Right-click the WordCount project and select Run As --> Run Configurations...; the Run Configurations dialog box pops up.
Right-click Java Application on the left and choose New to create a configuration named WordCount. Set Program arguments on the Arguments tab:
hdfs://192.168.0.19:9000/user/hadoop/word.txt
hdfs://192.168.0.19:9000/user/hadoop/out
The first line is the input file and the second line is the output directory. Note that the output directory must not already exist in HDFS, or the job will fail.
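For the word.txt above, a successful run should leave a part-r-00000 file under /user/hadoop/out with counts roughly like the following (the order follows MapReduce's byte-wise sorting of Text keys; this listing is an expectation, not captured output):
C 1
C++ 2
Hadoop 2
HelloWorld 1
Java 3
JavaScript 1
MapReduce 1
hbase 1
python 1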
If the run reports java.lang.OutOfMemoryError: Java heap space, configure the VM arguments parameter:
-Xms512m -Xmx1024m -XX:MaxPermSize=256m
When you're finished setting up, click Run.
10) Error 1
Phenomenon:
ERROR security.UserGroupInformation: PriviledgedActionException as:zhuming cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-zhuming\mapred\staging\zhuming1380233490\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-zhuming\mapred\staging\zhuming1380233490\.staging to 0700
Solution:
Download hadoop-core-1.0.4-modified.jar and use it to replace the hadoop-core-1.0.4.jar file in the Hadoop installation directory:
http://download.csdn.net/detail/m_star_jy_sy/7376283
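If you would rather not rely on a pre-patched jar, a commonly reported alternative (an assumption based on the usual community fix for Hadoop 1.x on Windows, not verified here) is to rebuild hadoop-core yourself after neutering the permission check in org.apache.hadoop.fs.FileUtil:

// In org/apache/hadoop/fs/FileUtil.java of the Hadoop 1.0.4 source tree:
private static void checkReturnValue(boolean rv, File p, FsPermission permission)
    throws IOException {
  // The original body threw java.io.IOException ("Failed to set permissions
  // of path ...") when rv was false. On Windows the chmod-style call fails
  // for the local staging directory, so this dev-only workaround skips the check.
}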
11) Error 2
Phenomenon:
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=zhuming, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
Reason:
Because Eclipse submits jobs through the Hadoop plugin, it writes to the HDFS file system as zhuming (the current Windows user) by default, which maps to /user/xxx on HDFS (here, /user/hadoop). Since the zhuming user has no write permission on the /user/hadoop directory, the exception is thrown.
Solution:
Open up the permissions on the /user/hadoop directory; the command is as follows:
$ hadoop fs -chmod 777 /user/hadoop
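Alternatively, for a development cluster only, HDFS permission checking can be disabled entirely by setting dfs.permissions to false in hdfs-site.xml and restarting HDFS (a dev-only sketch; do not do this on a shared cluster):
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>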