I followed the Spark and Kafka tutorials step by step, but when running the KafkaWordCount example, the expected output never appeared. When it works, the output should look something like this (a minimal sketch of the driver being run follows the two output samples below):
......
-------------------------------------------
Time: 1488156500000 ms
-------------------------------------------
(4,5)
(8,12)
(6,14)
(0,19)
(2,11)
(7,20)
(5,10)
(9,9)
(3,9)
(1,11)
...
In practice, though, I only got empty batches:
......
-------------------------------------------
Time: 1488156500000 ms
-------------------------------------------

-------------------------------------------
Time: 1488156600000 ms
-------------------------------------------
......
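For context, this is roughly the kind of driver being run. It is a minimal sketch rather than the exact KafkaWordCount example shipped with Spark, assuming the receiver-based 0.8 connector (KafkaUtils.createStream) and placeholder values for the ZooKeeper address, consumer group, and topic:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaWordCountSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaWordCount")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Placeholder connection values -- replace with your own.
    val zkQuorum = "localhost:2181"
    val group    = "wordcount-group"
    val topics   = Map("test" -> 1)   // topic -> number of receiver threads

    // Receiver-based stream from the 0.8 connector; each record is (key, message).
    val lines  = KafkaUtils.createStream(ssc, zkQuorum, group, topics).map(_._2)
    val words  = lines.flatMap(_.split(" "))
    val counts = words.map((_, 1L)).reduceByKey(_ + _)
    counts.print()   // prints the per-batch blocks shown above

    ssc.start()
    ssc.awaitTermination()
  }
}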
Checking the run log in the Spark web UI at localhost:4040 revealed a NoSuchMethodError. Searching for it did not turn up much, but some people suggested it might be a version problem.
Narrowing it down further: Kafka works fine on its own, and Spark Streaming also works fine on its own; only the combination of the two fails. It finally turned out that the Kafka and Spark versions were incompatible.
Tested incompatible combinations: Kafka 0.8.0 / Kafka 0.10.2 + Spark 2.1.0 + Scala 2.11.8. A personally tested, compatible combination: Kafka 0.8.2.2 + Spark 2.1.0 + Scala 2.11.8.
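As an illustration, the compatible combination above could be pinned in an sbt build along these lines. The project layout is an assumption, but spark-streaming-kafka-0-8 is the Spark 2.1.0 connector artifact that matches Kafka 0.8.x brokers:

// build.sbt -- sketch of pinning the compatible versions
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming"           % "2.1.0" % "provided",
  // 0-8 connector: built against the Kafka 0.8.2.x client
  "org.apache.spark" %% "spark-streaming-kafka-0-8" % "2.1.0"
)

If you run the job with spark-submit instead of bundling the dependency, the same connector can be pulled in with --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.1.0.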