Connecting Eclipse to a cluster to view file information: port 9000 "connection refused" error
Cannot connect to the Map/Reduce location: hadoop1.0.3
Call to ubuntu/192.168.1.111:9000 failed on connection exception: java.net.ConnectException: Connection refused
1. Common solution: the configuration looked normal, but the connection still failed. After reconfiguring the Hadoop location in Eclipse and changing the host in both Map/Reduce Master and DFS Master from localhost to the IP address (192.168.1.111), the problem was resolved. Strange, since localhost is this machine, and the NameNode host (192.168.1.111) is also this machine.
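A quick way to tell whether the problem is the address rather than Hadoop itself is to probe the NameNode port from the client side. The sketch below is not from the original article; the address 192.168.1.111:9000 is the article's example setup, so substitute your own. If the probe succeeds for the IP but not for localhost (or vice versa), the NameNode is listening on only one interface, which matches the behaviour described above.

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, unresolvable host
        return False

# Compare both addresses the article mentions; if only one succeeds,
# the NameNode is bound to that interface only:
# can_connect("localhost", 9000)
# can_connect("192.168.1.111", 9000)
```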
2. The Eclipse Hadoop plugin version does not match the Hadoop version; it is best to use the Eclipse plugin from the contrib directory of the extracted Hadoop download.
3. hadoop-site.xml file configuration error:
<property>
<name>fs.default.name</name>
<value>hdfs://your-hostname:9000</value>
</property>
<property>
<name>mapred.job.tracker</name>
<value>your-hostname:9001</value>
</property>
Note that fs.default.name takes the NameNode's hostname (or IP), not a user name, and mapred.job.tracker is a plain host:port pair without the hdfs:// scheme.
4. Turn off HDFS permission checking in hdfs-site.xml:
<property>
<!-- whether to check permissions on files in DFS (usually false when testing) -->
<name>dfs.permissions</name>
<value>false</value>
</property>
5. Check whether the Eclipse plugin is pointed at a Hadoop installation directory: unzip the same Hadoop version that is installed under Linux into a folder on Windows, and specify that folder as the Hadoop installation location under Preferences > Map/Reduce.
6. Add the IP and hostname of every Hadoop node to the Windows hosts file.
7. Turn off the Linux firewall.
8. Change the hostnames in the three Hadoop configuration files to IP addresses, and change the masters and slaves files to IPs as well.
9. /etc/hosts error.
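Several of the steps above (6, 8, and 9) come down to hostname-to-IP resolution being inconsistent between the client and the cluster. A minimal sketch to check that a hostname resolves to the IP you expect; the hostname "ubuntu" and IP in the comment are the article's example values, not fixed names:

```python
import socket

def resolves_to(hostname, expected_ip):
    """Return True if hostname resolves to expected_ip via the OS resolver."""
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except OSError:  # name not found in the hosts file or DNS
        return False

# e.g. resolves_to("ubuntu", "192.168.1.111") should be True on every
# cluster node and on the Windows client if the hosts files are consistent.
```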
Summary of "connection refused" solutions when using Eclipse to connect to Hadoop under Ubuntu