Application in Spark standalone mode
An Application is a user-submitted program in Spark, comparable to a job in Hadoop. `sc` is the SparkContext, created when the Spark cluster (or shell) initializes; transformation and action operators are invoked on the RDDs it creates, and RDD dependencies are either wide or narrow. By default, the Spark scheduler (DAGScheduler) runs jobs in FIFO mode.
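As a plain-Scala analogy (no Spark cluster required), the word-count pipeline used in the examples below can be sketched with ordinary collections. This is only an illustration of the operator chain, not Spark's actual execution: `groupBy` plus a sum stands in for the shuffle that `reduceByKey` performs across partitions.

```scala
object WordCountSketch {
  // Plain-collection analogue of the RDD pipeline:
  // textFile -> flatMap(split) -> map(w -> (w, 1)) -> reduceByKey(_ + _)
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))              // one record per word
      .map(word => (word, 1))             // pair each word with a count of 1
      .groupBy(_._1)                      // stand-in for the reduceByKey shuffle
      .map { case (w, pairs) => (w, pairs.map(_._2).sum) }  // sum counts per key

  def main(args: Array[String]): Unit = {
    println(wordCount(Seq("a b a", "b c"))) // Map(a -> 2, b -> 2, c -> 1)
  }
}
```

In the real RDD version, `flatMap`, `map`, and `reduceByKey` are transformations (lazy, narrow or wide dependencies), while `saveAsTextFile` is the action that triggers execution.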
Unsorted (default order) output to a disk file
scala> val r1 = sc.textFile("/root/rdd1.txt").flatMap(_.split(" ")).map(x => (x, 1)).reduceByKey(_ + _).saveAsTextFile("/root/rddout/nosort")
FileOutputCommitter: Saved output of task 'attempt_201507140546_0014_m_000000_14' to file:/root/rddout/nosort/_temporary/0/task_201507140546_0014_m_000000
Ascending (dictionary order) sorted output to a disk file
scala> val r1 = sc.textFile("/root/rdd1.txt").flatMap(_.split(" ")).map(x => (x, 1)).reduceByKey(_ + _).sortByKey(true).saveAsTextFile("/root/rddout/zsort")
FileOutputCommitter: Saved output of task 'attempt_201507140546_0017_m_000000_17' to file:/root/rddout/zsort/_temporary/0/task_201507140546_0017_m_000000
Descending (reverse dictionary order) sorted output to a disk file
scala> val r1 = sc.textFile("/root/rdd1.txt").flatMap(_.split(" ")).map(x => (x, 1)).reduceByKey(_ + _).sortByKey(false).saveAsTextFile("/root/rddout/fsort")
FileOutputCommitter: Saved output of task 'attempt_201507140547_0020_m_000000_20' to file:/root/rddout/fsort/_temporary/0/task_201507140547_0020_m_000000