cd sparkapp/
ls
find .
/usr/local/sbt/sbt package
After the package is built, check the project directory:
[email protected]:~/sparkapp$ ls
project
simple.sbt src
target
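The simple.sbt shown in the listing above holds the build definition. A minimal sketch of what it presumably contains, with the project name, version, and Scala version inferred from the jar name simple-project_2.10-1.0.jar (the exact Scala patch level and Spark version are assumptions and should match your installed Spark):

```scala
// simple.sbt -- minimal build definition (versions are assumptions)
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"  // must match the Scala version Spark was built with

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"  // assumed Spark version
```

Note that sbt requires a blank line between each setting in older sbt versions, as shown here.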
The generated jar is placed under target/scala-2.10/.
We can then submit the generated jar to Spark with spark-submit:
/usr/local/spark/bin/spark-submit --class "SimpleApp" ~/sparkapp/target/scala-2.10/simple-project_2.10-1.0.jar
# Pipe the output through grep to show only the result line:
/usr/local/spark/bin/spark-submit --class "SimpleApp" ~/sparkapp/target/scala-2.10/simple-project_2.10-1.0.jar 2>&1 | grep "Lines with a:"
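The SimpleApp class submitted above presumably resembles the example from the Spark Quick Start guide (which this tutorial follows): it counts the lines of a text file that contain "a" and "b", which is why grep filters on "Lines with a:". A sketch under that assumption, for Spark 1.x with Scala 2.10 (the README.md path is an assumption tied to a /usr/local/spark installation):

```scala
/* SimpleApp.scala -- sketch of the classic Quick Start application */
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    // Assumed input file; any local text file works
    val logFile = "file:///usr/local/spark/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    // Count lines containing "a" and lines containing "b"
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    // This is the line that grep "Lines with a:" matches
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
    sc.stop()
  }
}
```

Because the application prints its result among Spark's many log lines, piping through grep as shown above is a convenient way to see just the answer.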
- For a more in-depth understanding of the Spark API, see the Spark Programming Guide;
- To learn more about Spark SQL, see the Spark SQL, DataFrames and Datasets Guide;
- To learn more about Spark Streaming, see the Spark Streaming Programming Guide;
- To run Spark programs in a cluster environment, see the Spark cluster deployment documentation on the official site.
Ref/reprint: http://www.powerxing.com/spark-quick-start-guide/
Spark Configuration (6) - Standalone Application