Host address error encountered while executing spark-shell

Source: Internet
Author: User

I downloaded Spark 1.4 and ran into the following error while executing spark-shell:

java.net.UnknownHostException: ukon-m-q0ep: ukon-m-q0ep: nodename nor servname provided, or not known
	at java.net.InetAddress.getLocalHost(InetAddress.java:1473)

This is not a Spark-specific error, but a common problem when running Java on a Mac: when the JVM asks for the local host's address, it cannot resolve the hostname to an IP address.
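The failing call in the stack trace is java.net.InetAddress.getLocalHost. A minimal sketch of that same lookup outside Spark (the class name LocalHostCheck is my own, not from the original post):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class LocalHostCheck {
    // Attempt the same lookup that fails in the stack trace above,
    // returning a description instead of crashing.
    static String resolveLocalHost() {
        try {
            InetAddress addr = InetAddress.getLocalHost();
            return addr.getHostName() + " -> " + addr.getHostAddress();
        } catch (UnknownHostException e) {
            return "unresolved: " + e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(resolveLocalHost());
    }
}
```

On a machine with the problem described here, this prints the "unresolved" branch with the same "nodename nor servname provided" message.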

ukon-m-q0ep is the hostname of my Mac, which can be queried with the hostname command in Terminal:

ukon-m-q0ep:~ ukon$ hostname
ukon-m-q0ep

However, the system cannot resolve this hostname to an IP address (honestly, I don't quite understand why it can't; you would think the system could be a little smarter about it). But it can't, which can be verified with the following command:

ukon-m-q0ep:~ ukon$ ping ukon-m-q0ep
^C

But if I add this hostname to the /etc/hosts file, mapped to 127.0.0.1, ping works right away:

ukon-m-q0ep:~ ukon$ ping ukon-m-q0ep
PING ukon-m-q0ep (127.0.0.1): 56 data bytes
64 bytes from 127.0.0.1: icmp_seq=0 ttl=64 time=0.056 ms
64 bytes from 127.0.0.1: icmp_seq=1 ttl=64 time=0.133 ms
64 bytes from 127.0.0.1: icmp_seq=2 ttl=64 time=0.121 ms
64 bytes from 127.0.0.1: icmp_seq=3 ttl=64 time=0.134 ms
^C

As you can see, the system really does require us to configure /etc/hosts manually before it can find the host's IP address.
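Concretely, the fix is one extra entry in /etc/hosts (shown here with my hostname; substitute the output of the hostname command on your own machine):

```
127.0.0.1   ukon-m-q0ep
```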

Run ./bin/spark-shell again and it works:

ukon-m-q0ep:spark-1.4.0-bin-hadoop2.6 ukon$ ./bin/spark-shell
2015-07-04 00:12:04.604 java[31755:1803488] Unable to load realm info from SCDynamicStore
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
SQL context available as sqlContext.

scala>

done!

I then successfully ran some Spark code that counts the number of lines in a local file and fetches the text of the first line:

scala> val textFile = sc.textFile("README.md")
textFile: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[1] at textFile at <console>:21

scala> textFile.count()
res0: Long = 98

scala> textFile.first()
res1: String = # Apache Spark

Very cool!
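For comparison, the same two operations on a local file can be sketched in plain Java, with Spark out of the picture entirely (the class FilePeek and its helper names are mine, not from the original post):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class FilePeek {
    // Count the lines of a local text file, like textFile(...).count().
    static long countLines(String path) throws IOException {
        return Files.readAllLines(Paths.get(path)).size();
    }

    // Return the first line of the file, like textFile(...).first().
    static String firstLine(String path) throws IOException {
        List<String> lines = Files.readAllLines(Paths.get(path));
        return lines.isEmpty() ? "" : lines.get(0);
    }

    public static void main(String[] args) throws IOException {
        String path = args.length > 0 ? args[0] : "README.md";
        System.out.println("count: " + countLines(path));
        System.out.println("first: " + firstLine(path));
    }
}
```

Of course, the point of Spark's RDD API is that the same count()/first() calls also work when the data is distributed across a cluster rather than sitting in one local file.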
