After downloading Spark 1.4, I hit the following error while running spark-shell:
java.net.UnknownHostException: ukon-m-q0ep: ukon-m-q0ep: nodename nor servname provided, or not known
    at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
This is not a Spark-specific error but a common Java problem on the Mac: the application cannot resolve the local hostname to an IP address.
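A minimal sketch of the lookup that fails (the object and variable names here are mine, for illustration): on a Mac whose hostname is in neither DNS nor /etc/hosts, InetAddress.getLocalHost throws exactly the UnknownHostException shown above.

import java.net.InetAddress

object LocalHostCheck {
  def main(args: Array[String]): Unit = {
    // Resolves this machine's hostname to an address; throws
    // java.net.UnknownHostException when no mapping exists.
    val addr = InetAddress.getLocalHost
    println(s"${addr.getHostName} -> ${addr.getHostAddress}")
  }
}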
Here ukon-m-q0ep is my Mac's hostname, which can be queried with the hostname command in the Mac terminal:
ukon-m-q0ep:~ ukon$ hostname
ukon-m-q0ep
But the system cannot resolve this hostname to an IP address (honestly, I don't quite understand why not; couldn't the system be a bit smarter? But it isn't). This can be verified with the following command:
ukon-m-q0ep:~ ukon$ ping ukon-m-q0ep
^C
But if I add this hostname to the /etc/hosts file, mapped to 127.0.0.1, I can ping it:
ukon-m-q0ep:~ ukon$ ping ukon-m-q0ep
PING ukon-m-q0ep (127.0.0.1): 56 data bytes
64 bytes from 127.0.0.1: icmp_seq=0 ttl=64 time=0.056 ms
64 bytes from 127.0.0.1: icmp_seq=1 ttl=64 time=0.133 ms
64 bytes from 127.0.0.1: icmp_seq=2 ttl=64 time=0.121 ms
64 bytes from 127.0.0.1: icmp_seq=3 ttl=64 time=0.134 ms
^C
As you can see, the system does require us to configure /etc/hosts manually so that the hostname resolves to an IP address.
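For reference, the entry I added to /etc/hosts looks like this (the hostname is mine; use whatever your own hostname command prints):

127.0.0.1   ukon-m-q0ep

One way to append it from the terminal (editing a system file requires sudo):

ukon-m-q0ep:~ ukon$ sudo sh -c 'echo "127.0.0.1 ukon-m-q0ep" >> /etc/hosts'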
Run ./bin/spark-shell again and it starts successfully:
ukon-m-q0ep:spark-1.4.0-bin-hadoop2.6 ukon$ ./bin/spark-shell
2015-07-04 00:12:04.604 java[31755:1803488] Unable to load realms info from SCDynamicStore
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
SQL context available as sqlContext.

scala>
done!
Then I successfully ran a bit of Spark code, counting the number of lines in a local file and looking at the text of the first line:
scala> val textFile = sc.textFile("README.md")
textFile: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[1] at textFile at <console>:21

scala> textFile.count()
res0: Long = 98

scala> textFile.first()
res1: String = # Apache Spark
Cool!
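For completeness, here is a minimal sketch of the same steps as a standalone Spark 1.4 application instead of the shell (the object name, app name, and master setting are mine, for illustration; spark-shell creates sc for you automatically):

import org.apache.spark.{SparkConf, SparkContext}

object ReadmeStats {
  def main(args: Array[String]): Unit = {
    // A standalone app must build its own SparkContext.
    val conf = new SparkConf().setAppName("ReadmeStats").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val textFile = sc.textFile("README.md")
    println(s"lines: ${textFile.count()}")  // 98 for the Spark 1.4 README
    println(s"first: ${textFile.first()}")  // "# Apache Spark"

    sc.stop()
  }
}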
Copyright notice: this is the blogger's original article and may not be reproduced without the blogger's permission.