This problem looked strange at first. When starting Hadoop with a fresh native configuration, the first step is to format the NameNode, but running the format command produced the following exception:

    FATAL namenode.NameNode: Exception in namenode join
    java.lang.IllegalArgumentException: URI has an authority component

If nothing else, that mention of an "authority" made me add sudo in front of the format command without hesitation — which turned out to have no effect whatsoever. So what if I skipped the format step and just ran start-all.sh? The result looked almost magical: several processes actually came up. But a closer look revealed something odd — the NameNode process was missing (as shown), and the NameNode web UI on port 50070 could not be opened either. The NameNode log contained the same IllegalArgumentException, and the DataNode logs were full of "Retrying connect to server" messages. Well... all right... it looked like a lost cause. After a good deal of searching and experimenting, I finally found the reason. This exception is mainly related to two files: core-site.xml and hdfs-site.xml.
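Before digging into those two files, a note on what the message actually means: the "authority" of a URI is the host-like segment that sits between "//" and the next "/". I believe the Java-side check that produces this exact message is the java.io.File(URI) constructor, which rejects any file: URI whose authority is non-null. A minimal sketch of the distinction (Python's urlsplit here, purely as illustration — Hadoop itself does the equivalent parsing in Java):

```python
from urllib.parse import urlsplit

# A URI of the form "file:/path" has no authority component:
# the path begins immediately after the scheme.
no_authority = urlsplit("file:/home/hdusr/hadoop-2.2.0/tmp")

# A URI of the form "file://host/path" does have one: whatever sits
# between the "//" and the next "/" is parsed as the authority.
with_authority = urlsplit("file://localhost/home/hdusr")
```

So a local path that is meant to have no host at all should use the single-slash form.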
Anyone who has configured Hadoop is presumably familiar with these two files, and the property in them most directly related to this problem is hadoop.tmp.dir. In core-site.xml, taking my configuration as an example:

    <property>
      <name>hadoop.tmp.dir</name>
      <value>file:/home/hdusr/hadoop-2.2.0/tmp/</value>
    </property>

In hdfs-site.xml, the hadoop.tmp.dir value is then referenced as follows (this is also the default form of these properties in Hadoop 2.2):

    <property>
      <name>dfs.namenode.name.dir</name>
      <value>file://${hadoop.tmp.dir}/dfs/name</value>
    </property>
    <property>
      <name>dfs.datanode.data.dir</name>
      <value>file://${hadoop.tmp.dir}/dfs/data</value>
    </property>

This configuration looks perfectly fine, and the ${hadoop.tmp.dir} variable even lends it a classy touch of flexibility. But this is exactly where the problem lies: actual testing confirmed that if the properties in hdfs-site.xml are configured through this variable, a URI problem of an apparently permission-related flavor appears. My personal guess is that because Hadoop is installed under the hdusr home directory, something goes wrong while hdfs-site.xml is loaded and the variable is resolved during service startup (I am still a complete Linux rookie, so I will stop here rather than risk talking nonsense). In any case, in my setup the IllegalArgumentException can be resolved by modifying hdfs-site.xml as follows. The two properties become:

    <property>
      <name>dfs.namenode.name.dir</name>
      <value>file:/home/hdusr/hadoop-2.2.0/tmp/dfs/name</value>
    </property>
    <property>
      <name>dfs.datanode.data.dir</name>
      <value>file:/home/hdusr/hadoop-2.2.0/tmp/dfs/data</value>
    </property>

That is, both use the full absolute path. After modifying hdfs-site.xml this way, the NameNode finally formatted successfully, and the Hadoop services could all be started normally (see the process listing after startup). The above is just the immature opinion of a rookie; please forgive anything I got wrong, and I would be grateful for guidance from the experts.
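If my understanding is right, the variable form may fail for a simpler reason than permissions: hadoop.tmp.dir was itself given a "file:" prefix in core-site.xml, so "file://${hadoop.tmp.dir}/dfs/name" expands to "file://file:/home/...", and the leftover "file:" lands in the URI's authority slot — exactly the kind of URI the NameNode refuses. This is my own speculation, not anything confirmed by the Hadoop documentation; a quick check of the parsing (Python used only to illustrate, with the paths from my configuration):

```python
from urllib.parse import urlsplit

# Value of hadoop.tmp.dir from my core-site.xml (note the "file:" prefix).
hadoop_tmp_dir = "file:/home/hdusr/hadoop-2.2.0/tmp"

# What the hdfs-site.xml default "file://${hadoop.tmp.dir}/dfs/name"
# would expand to after variable substitution.
expanded = "file://" + hadoop_tmp_dir + "/dfs/name"
parts = urlsplit(expanded)

# The stray "file:" ends up parsed as the authority component.
bad_authority = parts.netloc

# The fixed configuration uses the absolute form with no "//",
# so the authority is empty and the URI is a plain local path.
fixed = urlsplit("file:/home/hdusr/hadoop-2.2.0/tmp/dfs/name")
```

On this reading, another fix would be to drop the "file:" prefix from hadoop.tmp.dir itself, but the absolute-path form above is the one I actually verified.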
Hadoop cannot start the NameNode process: java.lang.IllegalArgumentException