Hadoop HDFS format error: java.net.UnknownHostException: localhost.localdomain: localhost.localdomain

Source: Internet
Author: User

Exception description

When you format HDFS with the "hadoop namenode -format" command and the local host name cannot be resolved, the command fails with the following output:

    [shirdrn@localhost bin]$ hadoop namenode -format
    11/06/22 07:33:31 INFO namenode.NameNode: STARTUP_MSG:
    /************************************************************
    STARTUP_MSG: Starting NameNode
    STARTUP_MSG:   host = java.net.UnknownHostException: localhost.localdomain: localhost.localdomain
    STARTUP_MSG:   args = [-format]
    STARTUP_MSG:   version = 0.20.0
    STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.20 -r 763504; compiled by 'ndaley' on Thu Apr 9 05:18:40 UTC 2009
    ************************************************************/
    Re-format filesystem in /tmp/hadoop/hadoop-shirdrn/dfs/name ? (Y or N) Y
    11/06/22 07:33:36 INFO namenode.FSNamesystem: fsOwner=shirdrn,shirdrn
    11/06/22 07:33:36 INFO namenode.FSNamesystem: supergroup=supergroup
    11/06/22 07:33:36 INFO namenode.FSNamesystem: isPermissionEnabled=true
    11/06/22 07:33:36 INFO metrics.MetricsUtil: Unable to obtain hostName
    java.net.UnknownHostException: localhost.localdomain: localhost.localdomain
        at java.net.InetAddress.getLocalHost(InetAddress.java:1353)
        at org.apache.hadoop.metrics.MetricsUtil.getHostName(MetricsUtil.java:91)
        at org.apache.hadoop.metrics.MetricsUtil.createRecord(MetricsUtil.java:80)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.initialize(FSDirectory.java:73)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.<init>(FSDirectory.java:68)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:370)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:853)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:947)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:964)
    11/06/22 07:33:36 INFO common.Storage: Image file of size saved in 0 seconds.
    11/06/22 07:33:36 INFO common.Storage: Storage directory /tmp/hadoop/hadoop-shirdrn/dfs/name has been successfully formatted.
    11/06/22 07:33:36 INFO namenode.NameNode: SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at java.net.UnknownHostException: localhost.localdomain: localhost.localdomain
    ************************************************************/

Executing the hostname command shows:

    [root@localhost bin]# hostname
    localhost.localdomain

That is, when formatting HDFS, Hadoop obtains the host name localhost.localdomain from the hostname command, then looks for it in the /etc/hosts mapping and finds nothing. Here is my /etc/hosts:

    [root@localhost bin]# cat /etc/hosts
    # Do not remove the following line, or various programs
    # that require network functionality will fail.
    127.0.0.1 localhost localhost
    192.168.1.103 localhost localhost

In other words, localhost.localdomain cannot be mapped to any IP address, hence the error.
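The failed lookup can be reproduced outside Hadoop. The following is a minimal shell check (assuming a glibc-based Linux where getent is available) that performs the same resolution java.net.InetAddress.getLocalHost() attempts at NameNode startup:

```shell
# Resolve the machine's own host name, as the NameNode does at startup.
# If getent finds no mapping in /etc/hosts (or DNS), the same condition
# that triggers Hadoop's UnknownHostException is present.
HOST=$(hostname)
if getent hosts "$HOST" > /dev/null; then
    echo "resolvable: $HOST"
else
    echo "NOT resolvable: $HOST - add a mapping to /etc/hosts"
fi
```

On the broken configuration above, hostname prints localhost.localdomain, which has no entry in /etc/hosts, so the check reports it as not resolvable.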

At this point, look at the /etc/sysconfig/network file:

    NETWORKING=yes
    NETWORKING_IPV6=yes
    HOSTNAME=localhost.localdomain

As you can see, the hostname command returns the HOSTNAME value configured here.

Workaround

Change the HOSTNAME value in /etc/sysconfig/network to localhost (or whatever host name you choose), make sure that name maps to the correct IP address in /etc/hosts, and then restart the network service:

    [root@localhost bin]# /etc/rc.d/init.d/network restart
    Shutting down interface eth0:             [OK]
    Shutting down loopback interface:         [OK]
    Bringing up loopback interface:           [OK]
    Bringing up interface eth0:
    Determining IP information for eth0... done.
                                              [OK]
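The HOSTNAME edit itself can be scripted with sed. The sketch below runs against a scratch copy (/tmp/network.sample, a made-up path) instead of the real /etc/sysconfig/network, so it is safe to try anywhere; on the real system you would point sed at /etc/sysconfig/network and run "hostname localhost" to apply the new name immediately.

```shell
# Create a scratch copy with the broken setting (illustrative path,
# mirroring the real /etc/sysconfig/network shown above).
cat > /tmp/network.sample <<'EOF'
NETWORKING=yes
NETWORKING_IPV6=yes
HOSTNAME=localhost.localdomain
EOF

# Rewrite the HOSTNAME line in place.
sed -i 's/^HOSTNAME=.*/HOSTNAME=localhost/' /tmp/network.sample

grep '^HOSTNAME=' /tmp/network.sample   # prints: HOSTNAME=localhost
```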

Now the HDFS format command succeeds and the cluster starts normally.

Formatting:

    [shirdrn@localhost bin]$ hadoop namenode -format
    11/06/22 08:02:37 INFO namenode.NameNode: STARTUP_MSG:
    /************************************************************
    STARTUP_MSG: Starting NameNode
    STARTUP_MSG:   host = localhost/127.0.0.1
    STARTUP_MSG:   args = [-format]
    STARTUP_MSG:   version = 0.20.0
    STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.20 -r 763504; compiled by 'ndaley' on Thu Apr 9 05:18:40 UTC 2009
    ************************************************************/
    11/06/22 08:02:37 INFO namenode.FSNamesystem: fsOwner=shirdrn,shirdrn
    11/06/22 08:02:37 INFO namenode.FSNamesystem: supergroup=supergroup
    11/06/22 08:02:37 INFO namenode.FSNamesystem: isPermissionEnabled=true
    11/06/22 08:02:37 INFO common.Storage: Image file of size saved in 0 seconds.
    11/06/22 08:02:37 INFO common.Storage: Storage directory /tmp/hadoop/hadoop-shirdrn/dfs/name has been successfully formatted.
    11/06/22 08:02:37 INFO namenode.NameNode: SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.0.1
    ************************************************************/

Start:

    [shirdrn@localhost bin]$ start-all.sh
    starting namenode, logging to /home/shirdrn/eclipse/eclipse-3.5.2/hadoop/hadoop-0.20.0/logs/hadoop-shirdrn-namenode-localhost.out
    localhost: starting datanode, logging to /home/shirdrn/eclipse/eclipse-3.5.2/hadoop/hadoop-0.20.0/logs/hadoop-shirdrn-datanode-localhost.out
    localhost: starting secondarynamenode, logging to /home/shirdrn/eclipse/eclipse-3.5.2/hadoop/hadoop-0.20.0/logs/hadoop-shirdrn-secondarynamenode-localhost.out
    starting jobtracker, logging to /home/shirdrn/eclipse/eclipse-3.5.2/hadoop/hadoop-0.20.0/logs/hadoop-shirdrn-jobtracker-localhost.out
    localhost: starting tasktracker, logging to /home/shirdrn/eclipse/eclipse-3.5.2/hadoop/hadoop-0.20.0/logs/hadoop-shirdrn-tasktracker-localhost.out

Check the running processes:

    [shirdrn@localhost bin]$ jps
    8192 TaskTracker
    7905 DataNode
    7806 NameNode
    8065 JobTracker
    8002 SecondaryNameNode
    8234 Jps

