1. First install the Hadoop cluster; reference document: http://www.cnblogs.com/bornteam/p/6517960.html
2. With the cluster installed, download hadoop-eclipse-plugin-2.6.0.jar and place it in the Eclipse installation's plugins folder. Download link: https://github.com/winghc/hadoop2x-eclipse-plugin/blob/master/release/hadoop-eclipse-plugin-2.6.0.jar
3. Restart Eclipse; a Hadoop Map/Reduce preference page now appears.
4. In that preference page, fill in the local Hadoop decompression path. This Hadoop build is the Linux one; it only needs to be unzipped.
Download hadoop2.6.0-eclipse-plugin.zip from http://pan.baidu.com/s/1qWG7XxU, unzip it, and copy the winutils.exe found in the archive's directory for versions 2.4 and later into the hadoop2/bin directory.
5. Configure Map/Reduce Locations.
1) Click Window --> Show View --> MapReduce Tools, select Map/Reduce Locations, then right-click and create a new Hadoop location.
2) In the Map/Reduce Locations tab, click the icon on the right to open the Hadoop location configuration window. Enter any name for Location name. Configure Map/Reduce Master and DFS Master: the DFS Master Host and Port must match the fs.defaultFS setting in core-site.xml.
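For reference, this is roughly what the relevant fragment of the cluster's core-site.xml looks like; the host name `master` and port `9000` are placeholder assumptions, so substitute your cluster's actual values:

```xml
<configuration>
  <!-- The Eclipse DFS Master Host/Port must match this value.
       "master" and 9000 are placeholders for your cluster's NameNode. -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
</configuration>
```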
6. Create a system user hadoop (with a password) matching the one on the distributed cluster, and switch to the hadoop user to perform the following operations.
Configure the environment variables, then restart Eclipse.
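On Windows, the plugin and winutils.exe rely on HADOOP_HOME being set. A minimal sketch of the variables to configure (the install path C:\hadoop-2.6.0 is an assumption; use your actual unzip location):

```
set HADOOP_HOME=C:\hadoop-2.6.0
set PATH=%PATH%;%HADOOP_HOME%\bin
```

For a permanent setting, add the same two entries in the system environment-variable dialog instead of a console session.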
Verify success:
Create a new MapReduce project, select DFS Locations --> master, and upload a file to HDFS. If the upload succeeds, the configuration works.
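A common first job for the new MapReduce project is the classic WordCount. The map/reduce logic is sketched below as plain Java, without the Hadoop `Mapper`/`Reducer` API, so it can be read and run on its own; in a real job the tokenizing loop would live in `Mapper.map()`, emitting `(word, 1)` pairs via `context.write`, and the summing would happen in `Reducer.reduce()`:

```java
import java.util.HashMap;
import java.util.Map;

public class WordCountSketch {
    // Combines the map step (split a line into tokens, emit (word, 1))
    // with the reduce step (sum the counts per word) for illustration.
    public static Map<String, Integer> countWords(String line) {
        Map<String, Integer> counts = new HashMap<>();
        for (String token : line.trim().split("\\s+")) {
            if (!token.isEmpty()) {
                counts.merge(token, 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        // prints {eclipse=1, hello=2, hadoop=1} in some HashMap order
        System.out.println(countWords("hello hadoop hello eclipse"));
    }
}
```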
The next step is actual project development.