Connecting to a Hadoop cluster from Eclipse: "Connection refused" on port 9000 when viewing file information
Cannot connect to the Map/Reduce location: hadoop1.0.3
Call to ubuntu/192.168.1.111:9000 failed on connection exception: java.net.ConnectException: Connection refused
1. The most common case: the configuration looks perfectly normal, yet the connection is still refused.
The fix was to reconfigure the Hadoop location in Eclipse, changing the Host field in both Map/Reduce Master and DFS Master from localhost to the IP address (192.168.1.111); after that the problem went away.
Strange: localhost should refer to this very machine, and the NameNode host (192.168.1.111) is this machine too. The difference is which interface the NameNode actually listens on, which the following check makes visible.
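Why localhost fails while the IP works can be confirmed from the NameNode side. A minimal sketch, assuming the NameNode runs on the Ubuntu machine (192.168.1.111) and the first command is run there:

    # Show which address the NameNode's RPC port is actually bound to.
    # If the local address column shows 192.168.1.111:9000 (rather than
    # 0.0.0.0:9000 or 127.0.0.1:9000), only connections addressed to that
    # IP will be accepted.
    sudo netstat -tlnp | grep 9000

    # From the Windows/Eclipse side, test that the port is reachable at all:
    telnet 192.168.1.111 9000

If netstat shows the port bound to 127.0.0.1 only, connections to the machine's real IP are refused, and vice versa.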
2. The Eclipse Hadoop plugin version is inconsistent with the Hadoop version. You should use the Eclipse plugin found in the contrib directory of the same Hadoop package you downloaded and installed.
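If the versions match, installing the plugin is just a matter of copying the jar into Eclipse's plugins directory. A hedged sketch, assuming the prebuilt jar ships under contrib/eclipse-plugin as described above (both paths are illustrative):

    # Copy the plugin that matches the installed Hadoop version (1.0.3 here)
    cp hadoop-1.0.3/contrib/eclipse-plugin/hadoop-eclipse-plugin-1.0.3.jar \
       /path/to/eclipse/plugins/
    # Restart Eclipse with -clean so it rescans its plugin directory
    eclipse -clean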
3. hadoop-site.xml configuration error. (In Hadoop 1.x, hadoop-site.xml has been split; fs.default.name belongs in core-site.xml and mapred.job.tracker in mapred-site.xml.)

    <property>
      <name>fs.default.name</name>
      <value>hdfs://your-hostname:9000</value>
    </property>
    <property>
      <name>mapred.job.tracker</name>
      <value>your-hostname:9001</value>
    </property>

Note that mapred.job.tracker takes a plain host:port pair, not an hdfs:// URI.
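After correcting the addresses, it is worth verifying them from the command line before retrying Eclipse. A minimal check, assuming the NameNode address configured above is hdfs://192.168.1.111:9000 and the command is run from $HADOOP_HOME:

    # List the HDFS root through the exact URI Eclipse will use;
    # a "Connection refused" here means the problem is not in Eclipse.
    bin/hadoop fs -ls hdfs://192.168.1.111:9000/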
4. Disable HDFS permission checking in hdfs-site.xml (usually set to false in test environments):

    <property>
      <!-- whether to check permissions on files in DFS (usually false for testing) -->
      <name>dfs.permissions</name>
      <value>false</value>
    </property>

5. Check whether the Eclipse plugin has been given a Hadoop installation directory: extract the same Hadoop version as the one installed under Linux into a folder on Windows, then point the plugin at that folder under Preferences > Map/Reduce > Hadoop installation location.
6. Add the IP address and hostname of every Hadoop node to the Windows hosts file (see the sketch after this list).
7. Turn off the Linux firewall (see the sketch after this list).
8. Change every hostname in the three Hadoop configuration files (core-site.xml, hdfs-site.xml, mapred-site.xml) to the IP address; change the masters and slaves files to IPs as well.
9. Fix errors in /etc/hosts (see the sketch after this list).
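Steps 6, 7, and 9 all come down to name resolution and reachability. A sketch of the corresponding checks, assuming the single cluster node is named ubuntu at 192.168.1.111:

    # 6. On Windows, append the node to C:\Windows\System32\drivers\etc\hosts:
    #    192.168.1.111  ubuntu

    # 7. On Ubuntu, disable the firewall (for testing only):
    sudo ufw disable

    # 9. On Ubuntu, make sure /etc/hosts maps the hostname to the real IP,
    #    not to the 127.0.1.1 loopback entry Ubuntu adds by default; otherwise
    #    the NameNode binds to the loopback interface and remote connections
    #    to port 9000 are refused.
    cat /etc/hosts
    # expected line: 192.168.1.111  ubuntu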
10. And one last cause, the most amusing and deceptive of all: perhaps your Hadoop was never started in the first place!!
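Before chasing any of the causes above, one command rules this last one out. A quick check with the JDK's jps tool, assuming Hadoop 1.0.3 is installed under $HADOOP_HOME on the Ubuntu machine:

    # jps lists running JVM processes; a healthy pseudo-distributed 1.0.3 node
    # shows NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker.
    jps

    # If nothing Hadoop-related is listed, start the daemons:
    $HADOOP_HOME/bin/start-all.sh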
A summary of how to resolve the "Connection refused" error when using Eclipse to connect to Hadoop under Ubuntu.