Goal: compile the Eclipse plugin for Apache Hadoop 2.2.0 in a Win7 x64 environment.
Environment: Win7 x64 Home Basic edition; eclipse-jee-kepler-sr1-win32-x86_64.zip; Apache Ant(TM) version 1.8.4, compiled in 2012; java version "1.7.0_45"
Reference article: http://kangfoo.u.qiniudn.com/article/2013/12/build-hadoop2x-eclipse-plugin/
Plugin source download: https://github.com/winghc/hadoop2x-eclipse-plugin
Prerequisites: the Ant, JDK, Eclipse, and Apache Hadoop 2.2.0 installation packages are ready; the plugin source package has been downloaded; and an Apache Hadoop 2.2.0 environment has already been deployed in a virtual machine.
Step 1: Copy the Apache Hadoop 2.2.0 installation package to any directory on Windows. The directory I use is D:\Development_ProgramFiles_2014\hadoop-2.2.0. Remember: the full path must not contain any spaces, or the Ant build will fail!
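For example, a minimal sketch using Windows commands, assuming the archive was already extracted to C:\temp\hadoop-2.2.0 (a hypothetical staging path):

rem copy the whole Hadoop tree into a space-free destination path
xcopy C:\temp\hadoop-2.2.0 D:\Development_ProgramFiles_2014\hadoop-2.2.0 /E /I

A destination such as C:\Program Files\hadoop-2.2.0 would break the build because of the space in "Program Files".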
Step 2 (this step can be omitted): Copy the configuration files under the Hadoop root's etc/hadoop/ directory from the Hadoop cluster to D:\Development_ProgramFiles_2014\hadoop-2.2.0\etc.
Step 3: Run cmd, go to the plugin source root directory, then execute: cd src\contrib\eclipse-plugin
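For example, assuming the plugin source was unzipped to D:\hadoop2x-eclipse-plugin-master (substitute your own path):

rem enter the plugin source root, then the eclipse-plugin build directory
cd /d D:\hadoop2x-eclipse-plugin-master
cd src\contrib\eclipse-plugin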
Step 4: Execute:
ant jar -Dversion=2.2.0 -Declipse.home=D:\Development_ProgramFiles_2014\eclipse2014 -Dhadoop.home=D:\Development_ProgramFiles_2014\hadoop-2.2.0
Note: to compile the plugin, the command must specify both the target Eclipse installation directory (eclipse.home) and the Hadoop installation directory (hadoop.home).
Remember: neither directory's full path may contain spaces.
Step 5: Then comes a long wait; the slowest targets are ivy-download and ivy-resolve-common. The generated plugin ends up at hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin\hadoop-eclipse-plugin-2.2.0.jar.
Step 6: Copy hadoop-eclipse-plugin-2.2.0.jar into Eclipse's plugins directory, then launch Eclipse.
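A minimal sketch of the copy, assuming the paths used above (run from the plugin source root; adjust both paths to your own layout):

rem copy the freshly built plugin into Eclipse's plugins folder
copy build\contrib\eclipse-plugin\hadoop-eclipse-plugin-2.2.0.jar D:\Development_ProgramFiles_2014\eclipse2014\plugins\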
Step 7: Window -> Preferences, then configure the Hadoop root directory:
Note: this path is configured only so that Eclipse can find the appropriate jar packages when you run an MR program.
Step 8: Open the MapReduce view:
Configure the location:
Note: the MR Master and DFS Master settings must be consistent with configuration files such as mapred-site.xml and core-site.xml.
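For example, if the cluster's core-site.xml contains the following (master:9000 is an assumed example, not from the original article), then DFS Master in the location dialog must use the same host and port:

<!-- core-site.xml on the cluster; hdfs://master:9000 is illustrative -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://master:9000</value>
</property>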
Step 9: Open Project Explorer to view the HDFS file system:
Step 10: Create a new MapReduce project.
Then create a MapReduce program:
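As a quick smoke test, a minimal WordCount for the Hadoop 2.x mapreduce API works well (an illustrative sketch, not code from the original article):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emit (word, 1) for every token in the input line
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sum the counts for each word
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input path
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output path (must not exist yet)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Run it with an HDFS input path and a not-yet-existing output path as the two program arguments.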
Running the MR program at this point reports that it cannot connect to the cluster; see the follow-up article:
"Gandalf" Win7 Environment Eclipse connection Hadoop2.2.0 http://blog.csdn.net/u010967382/article/details/25368313