1. Make sure the hadoop-eclipse-plugin-1.0.2.jar plug-in is installed in Eclipse.
2. Open the Map/Reduce perspective in Eclipse.
Right-click in the Map/Reduce Locations view and choose New Hadoop Location.
Note:
The Map/Reduce port is the one configured in mapred-site.xml in the Hadoop configuration:
<name>mapred.job.tracker</name>
The DFS Master port is the value of the following property in core-site.xml:
<name>fs.default.name</name>
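For reference, the two properties look roughly like this in the configuration files (the host names and port numbers below are examples only; use the values from your own cluster):

```xml
<!-- mapred-site.xml: host and port for the Map/Reduce Master -->
<property>
  <name>mapred.job.tracker</name>
  <!-- example value -->
  <value>localhost:9001</value>
</property>

<!-- core-site.xml: host and port for the DFS Master -->
<property>
  <name>fs.default.name</name>
  <!-- example value -->
  <value>hdfs://localhost:9000</value>
</property>
```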
After submitting, the new location is displayed in the panel on the left:
OK. Now you can browse the HDFS file information through Eclipse.
However, at this point you still cannot upload files through Eclipse, because one field in the Hadoop configuration file hdfs-site.xml has not been modified:
Modify the Hadoop configuration file conf/hdfs-site.xml on the server: locate the dfs.permissions property and change its value to false.
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
Restart Hadoop so the change takes effect.
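On Hadoop 1.0.2 the restart can be done with the stop-all.sh and start-all.sh scripts shipped in the distribution (the paths below assume you run them from the Hadoop installation directory on the server):

```shell
# run from the Hadoop 1.0.2 installation directory on the server
bin/stop-all.sh    # stop the NameNode, DataNodes, JobTracker and TaskTrackers
bin/start-all.sh   # start them again so the new dfs.permissions value is read
```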
One last problem: when you run a Hadoop program from Eclipse, you may still get an insufficient-permissions error. This is caused by a check inside one of Hadoop's own classes; the workaround is as follows:
# bin/hadoop dfsadmin -safemode leave
http://www.blogjava.net/yongboy/archive/2012/04/26/376486.html
That is, edit the checkReturnValue method in /hadoop-1.0.2/src/core/org/apache/hadoop/fs/FileUtil.java and comment out its body (somewhat crude, but on Windows this permission check is not needed).
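As a rough illustration of that patch (this is not the real FileUtil source: the actual method is private, takes an additional FsPermission parameter, and declares "throws IOException"), the commented-out check looks like this:

```java
import java.io.File;

// Simplified sketch of the workaround in org.apache.hadoop.fs.FileUtil
// (Hadoop 1.0.2). On Windows the local chmod call fails, so the original
// check would abort every job with an insufficient-permissions error.
public class FileUtilPatchSketch {
    static void checkReturnValue(boolean rv, File p) {
        // Original body, commented out by the workaround:
        // if (!rv) {
        //     throw new IOException("Failed to set permissions of path: " + p);
        // }
    }

    public static void main(String[] args) {
        // With the body commented out, a failed chmod no longer throws.
        checkReturnValue(false, new File("/tmp/example"));
        System.out.println("no exception thrown");
    }
}
```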