Eclipse + Hadoop Debug Environment Setup

1. Packages to install
1.1 Hadoop source package (hadoop-2.5.2-src.tar.gz)
1.2 Hadoop 2.x Eclipse plugin (hadoop2x-eclipse-plugin-master.zip)
1.3 Hadoop Windows tools (hadoop-common-2.2.0-bin-master.zip)
1.4 Ant build tool (apache-ant-1.9.6.tar.gz)

2. Steps (assumes the JDK and Eclipse are already installed; the JDK must be version 1.6 or above)

2.1 Install Ant and Hadoop
Unzip apache-ant-1.9.6.tar.gz and the Hadoop bin package (Hadoop standalone configuration is skipped here). Set the environment variables:
    ANT_HOME = D:\apache\apache-ant-1.9.6
    HADOOP_HOME = D:\apache\apache-hadoop-2.5.2
Append %ANT_HOME%\bin;%HADOOP_HOME%\bin to PATH (note the semicolons). Verify in cmd:
    ant -version

2.2 Compile the plugin
Unzip hadoop2x-eclipse-plugin-master.zip, then in cmd:
    cd hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin
In hadoop2x-eclipse-plugin-master\ivy\libraries.properties, set hadoop.version=2.5.2 (matching your Hadoop version). If the version is 2.5.2, also comment out the following line in build.xml:
    <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}.jar" todir="${build.dir}/lib" verbose="true"/>
Then run:
    ant jar -Dversion=2.5.2 -Declipse.home=<the directory where Eclipse is installed> -Dhadoop.home=<the Hadoop directory>
(The compilation may not pass on the first try; note that you may need to connect through a VPN.) After a successful run, a jar package appears under the hadoop2x-eclipse-plugin-master\build directory. Copy it into Eclipse's plugins directory and restart Eclipse.

2.3 Configure the plugin
Open Window --> Preferences; you should now see the Hadoop Map/Reduce option. Click it and add the hadoop-2.5.2 directory.
Configure Map/Reduce Locations:
1) Click Window --> Show View --> MapReduce Tools, then click Map/Reduce Locations.
2) In the Map/Reduce Locations tab, click the elephant icon on the right to open the Hadoop Location configuration window. Enter any Location name, then configure Map/Reduce Master and DFS Master: the host and port must match the settings in hdfs-site.xml and core-site.xml.
To verify, DFS Locations should now display the file directories in Hadoop (a small Java connectivity check is sketched at the end of this post). If you cannot add a directory or file, check whether Hadoop's safe mode is on, and if it is, turn it off:
    hadoop dfsadmin -safemode leave

2.4 Copy the Windows tools
Unzip hadoop-common-2.2.0-bin-master.zip and copy all the files in its bin directory into the bin directory of the Hadoop directory, replacing some of the existing files.

2.5 User permissions
Change the Windows login user name to the same user as on the Hadoop virtual machine (the same name is enough; the password does not matter).

2.6 Source modification
Find the NativeIO class, comment out the "return access0(...)" line, and add "return true;" instead (see the sketch at the end of this post). Then either replace the .class file inside hadoop-common-2.5.2.jar in your Windows Hadoop installation, or copy this class into the project (keeping the same package structure), and run with the modified version.

2.7 Add a log
Under the project's src directory, add a log4j.properties file with the following content:
    log4j.rootLogger=DEBUG,stdout,R
    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=%5p - %m%n
    log4j.appender.R=org.apache.log4j.RollingFileAppender
    log4j.appender.R.File=mapreduce_test.log
    log4j.appender.R.MaxFileSize=1MB
    log4j.appender.R.MaxBackupIndex=1
    log4j.appender.R.layout=org.apache.log4j.PatternLayout
    log4j.appender.R.layout.ConversionPattern=%p %t %c - %m%n
    log4j.logger.com.codefutures=DEBUG
After the above steps, MapReduce programs can be tested from Windows without having to deploy a jar (a minimal WordCount smoke test is sketched at the end of this post).
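To sanity-check step 2.3 from the Windows side, you can list the HDFS root from a plain Java program and compare it with what DFS Locations shows. This is a minimal sketch, assuming a NameNode at hdfs://master:9000; the host, port, and the HdfsSmokeTest class name are placeholders, so substitute the fs.defaultFS value from your own core-site.xml.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsSmokeTest {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            // Must match the DFS Master host/port entered in the plugin and
            // the fs.defaultFS value in core-site.xml (placeholder below).
            conf.set("fs.defaultFS", "hdfs://master:9000");
            FileSystem fs = FileSystem.get(conf);
            // Listing "/" should print the same entries DFS Locations displays.
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
            fs.close();
        }
    }

If this listing works but the plugin view stays empty, the problem is in the plugin's Location settings rather than in the cluster.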
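For step 2.6, the class in question is org.apache.hadoop.io.nativeio.NativeIO from hadoop-2.5.2-src, and the line to change sits in its nested Windows class. A sketch of the edit, assuming you copy the full NativeIO.java into your project under the same package (this excerpt shows only the modified method, not the whole class):

    // Inside org.apache.hadoop.io.nativeio.NativeIO.Windows, copied from
    // hadoop-2.5.2-src into the project with the package structure intact.
    public static boolean access(String path, AccessRight desiredAccess)
            throws IOException {
        // Original check, commented out: the native access0 call fails on
        // Windows when no matching native library is available.
        // return access0(path, desiredAccess.accessRight());
        return true; // skip the permission check; for local debugging only
    }

Eclipse normally resolves the project's own classes before a referenced jar, which is why the copy-into-project route works without repacking hadoop-common-2.5.2.jar.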
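Once step 2.7 is done, everything needed to run a job straight from Eclipse is in place. As an end-to-end smoke test, here is a minimal sketch of the classic WordCount job; the hdfs://master:9000/input and hdfs://master:9000/output paths are placeholders for paths on your cluster, and the output directory must not exist yet.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Emits (word, 1) for every token in the input line.
        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            protected void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Sums the counts for each word; also reused as the combiner.
        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            protected void reduce(Text key, Iterable<IntWritable> values,
                    Context context) throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = Job.getInstance(conf, "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            // Placeholder HDFS paths -- point these at your own cluster.
            FileInputFormat.addInputPath(job, new Path("hdfs://master:9000/input"));
            FileOutputFormat.setOutputPath(job, new Path("hdfs://master:9000/output"));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Run it as a plain Java application in Eclipse; with the log4j.properties from step 2.7 on the classpath, DEBUG-level job progress should appear in the console, confirming that no jar deployment is needed.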