1. Implement your own map-reduce by modifying the instance template program:
In order for the sample program to run:
1) Install Eclipse
2) Install the Eclipse MapReduce plugin
The MapReduce plugin for Eclipse makes it easy to create Hadoop projects (the dependent MapReduce jar packages are imported automatically) and to open the MapReduce view, where you can browse the HDFS file system as intuitively as the local file system.
Detailed installation steps:
i) Find the installation path of Eclipse:
$ whereis eclipse
Clearly, our Eclipse is installed under /usr/lib/eclipse.
Copy the Eclipse MapReduce plugin to the /usr/lib/eclipse/plugins directory:
sudo cp -rf [plugin name] /usr/lib/eclipse/plugins
ii) Restart Eclipse
iii) Configure the MapReduce plugin in Eclipse:
Under Window → Preferences, set the installation path of the current Hadoop (Eclipse will automatically search for the jar packages under that path).
iv) Open the Eclipse MapReduce view:
Under Window → Show View → Other, select Map/Reduce.
v) Once the view is open, you can see the MapReduce locations panel at the bottom of Eclipse:
vi) Right-click in the panel to configure a Hadoop location:
Special note: these settings must match the Hadoop configuration files.
Map/Reduce Master corresponds to the setting in the ${hadoop_home}/etc/hadoop/mapred-site.xml file:
My configuration is:
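The original screenshot of my values is not reproduced here; as a hedged example, a typical pseudo-distributed entry looks like the following (the property name `mapred.job.tracker` is the Hadoop 1.x convention, and port 9001 is a common default, not a value taken from this post):

```xml
<!-- mapred-site.xml: typical pseudo-distributed example (assumed values) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```

The host and port entered for Map/Reduce Master in the Eclipse location dialog should match this value.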
DFS Master corresponds to the setting in ${hadoop_home}/etc/hadoop/core-site.xml:
My configuration is:
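Again, the original screenshot is not reproduced; a typical pseudo-distributed core-site.xml looks like this (property name `fs.defaultFS` in newer Hadoop versions, `fs.default.name` in older ones; port 9000 is a common default, assumed here):

```xml
<!-- core-site.xml: typical pseudo-distributed example (assumed values) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

The host and port entered for DFS Master in the Eclipse location dialog should match this value.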
Since I am running in pseudo-distributed mode, both the Map/Reduce Master and the DFS Master locations are localhost. After this configuration is filled into Eclipse, the result can be seen in the project bar:
We can now easily browse HDFS and upload or download files between it and the local file system.
At this point, the Eclipse development environment for Hadoop MapReduce has been built.
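The attached template program is typically a WordCount job. As a rough illustration of the map/reduce logic such a template implements, here is a minimal plain-Java sketch (the class and method names are hypothetical; the real template would use the Hadoop Mapper/Reducer API and run on the cluster):

```java
import java.util.Map;
import java.util.TreeMap;

// Hypothetical plain-Java sketch of WordCount's map/reduce logic,
// without the Hadoop API, to show what the template program computes.
public class WordCountSketch {
    public static Map<String, Integer> wordCount(String[] lines) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : lines) {
            // "map" phase: split each line into words, emitting (word, 1)
            for (String token : line.toLowerCase().split("\\s+")) {
                if (token.isEmpty()) continue;
                // "reduce" phase: sum the 1s emitted for each word
                counts.merge(token, 1, Integer::sum);
            }
        }
        return counts;
    }
}
```

In a real Hadoop job the map and reduce phases run on different nodes with a shuffle/sort step in between; this sketch only mirrors the per-word counting.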
Attachment:
1) Sample template program
2) MapReduce Eclipse plugin download
Baidu Cloud Disk: http://pan.baidu.com/s/1nt5jqyt
Source: http://blog.csdn.net/guoqingpei/article/details/45620123