No need to hurry; let's go step by step and lay a good foundation first.
How the Spark shell works
First, let's locate the few files involved:
1. spark-shell
2. spark-submit
3. spark-class
4. SparkSubmit.scala
5. SparkILoop.scala
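Before diving into the Scala side, it helps to see how control gets from SparkSubmit.scala into SparkILoop.scala. The spark-shell script asks spark-submit to run the REPL's main class, org.apache.spark.repl.Main, which creates and drives a SparkILoop. The snippet below is only a simplified sketch of that entry point (names follow the Spark 1.x repl module, but it is not a verbatim copy of the source):

// Simplified sketch of org.apache.spark.repl.Main (Spark 1.x), not verbatim source.
object Main {
  // The running interpreter. The code generated by initializeSpark reaches it
  // as org.apache.spark.repl.Main.interp (see below).
  private var _interp: SparkILoop = _
  def interp = _interp
  def interp_=(i: SparkILoop) { _interp = i }

  def main(args: Array[String]) {
    interp = new SparkILoop   // create the interactive loop
    interp.process(args)      // start the REPL; this is what ends up calling initializeSpark()
  }
}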
Source code of initializeSpark (in SparkILoop.scala)
def initializeSpark() {
  intp.beQuietDuring {
    // Run a snippet inside the REPL that creates the SparkContext
    // and binds it to the variable sc.
    command("""
      @transient val sc = {
        val _sc = org.apache.spark.repl.Main.interp.createSparkContext()
        println("Spark context available as sc.")
        _sc
      }
      """)
    // Same pattern for the SQLContext, bound to sqlContext.
    command("""
      @transient val sqlContext = {
        val _sqlContext = org.apache.spark.repl.Main.interp.createSQLContext()
        println("SQL context available as sqlContext.")
        _sqlContext
      }
      """)
    // Pre-import the implicits and helpers that shell users expect.
    command("import org.apache.spark.SparkContext._")
    command("import sqlContext.implicits._")
    command("import sqlContext.sql")
    command("import org.apache.spark.sql.functions._")
  }
}
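The practical effect is that by the time the scala> prompt appears, sc and sqlContext are already bound and the common imports are in scope, so you can start working immediately. A typical first interaction might look like this (the res numbers and exact output depend on the session):

scala> // sc was created by initializeSpark, no setup needed
scala> sc.parallelize(1 to 100).filter(_ % 2 == 0).count()
res0: Long = 50

scala> sc.makeRDD(Seq("a", "b", "c")).collect()
res1: Array[String] = Array(a, b, c)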
Source code of createSparkContext (also in SparkILoop.scala)
// NOTE: must be public for visibility
@DeveloperApi
def createSparkContext(): SparkContext = {
  // Executor URI (used by Mesos deployments), if set in the environment.
  val execUri = System.getenv("SPARK_EXECUTOR_URI")
  // Jars added on the spark-shell command line.
  val jars = SparkILoop.getAddedJars
  val conf = new SparkConf()
    .setMaster(getMaster())
    .setAppName("Spark shell")
    .setJars(jars)
    .set("spark.repl.class.uri", intp.classServerUri)
  if (execUri != null) {
    conf.set("spark.executor.uri", execUri)
  }
  sparkContext = new SparkContext(conf)
  logInfo("Created spark context..")
  sparkContext
}
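Nothing here is REPL magic: apart from the spark.repl.class.uri setting, which points executors at the class server that serves classes compiled by the interpreter, the shell simply does what any standalone application does by hand. A minimal sketch for comparison (the master URL and app name below are placeholders, not what the shell uses):

import org.apache.spark.{SparkConf, SparkContext}

object StandaloneExample {
  def main(args: Array[String]): Unit = {
    // Same recipe as createSparkContext, minus the REPL-specific settings.
    val conf = new SparkConf()
      .setMaster("local[*]")             // placeholder; the shell resolves this via getMaster()
      .setAppName("Standalone example")  // the shell hard-codes "Spark shell"
    val sc = new SparkContext(conf)
    try {
      println(sc.parallelize(1 to 10).count())   // prints 10
    } finally {
      sc.stop()
    }
  }
}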
Summary
This completes the function-call-path analysis from spark-shell down to SparkContext (at the source level): spark-shell calls spark-submit, which calls spark-class to launch SparkSubmit.scala; that starts the REPL in SparkILoop.scala, whose initializeSpark() calls createSparkContext(), which finally constructs the SparkContext.