Notes from Lesson 19 of Liaoliang's Spark IMF saga: Spark Sort. Homework: (1) implement secondary sort in Scala using a companion object's apply method; (2) read the RangePartitioner source on your own.
The code is as follows:
import org.apache.spark.{SparkConf, SparkContext}

/**
 * Created by Liaoliang on 2016/1/10.
 */
object SecondarySortApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()          // create a SparkConf object
    conf.setAppName("SecondarySortApp") // set the application name, shown in the monitoring UI while the program runs
    conf.setMaster("local")             // run locally; no Spark cluster installation needed
    // create the SparkContext by passing in the SparkConf instance,
    // which customizes Spark's runtime parameters and configuration
    val sc = new SparkContext(conf)
    val lines = sc.textFile("src/SCDFile") // read a local file
    // build (key, line) pairs; the key carries both sort fields
    val pairWithKey = lines.map(line =>
      (new SecondarySortKey(line.split(" ")(0).toInt, line.split(" ")(1).toInt), line))
    val sorted = pairWithKey.sortByKey(false)     // false = descending order
    val sortedResult = sorted.map(pair => pair._2) // drop the key, keep the original line
    sortedResult.collect.foreach(println)
    sc.stop()
  }
}

class SecondarySortKey(val first: Int, val second: Int)
    extends Ordered[SecondarySortKey] with Serializable {
  def compare(other: SecondarySortKey): Int = {
    if (this.first - other.first != 0) {
      this.first - other.first   // primary field decides the order
    } else {
      this.second - other.second // tie-break on the secondary field
    }
  }
}
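The homework asks for a version that builds the key through a companion object's apply factory instead of `new`. Below is a minimal, Spark-free sketch of that idea; the sample input lines and the helper name `secondarySort` are my own assumptions, not from the lesson, while the ordering logic matches the SecondarySortKey class above.

```scala
object SecondarySortApplyDemo {
  // Same two-field key as the lesson's SecondarySortKey.
  class SecondarySortKey(val first: Int, val second: Int)
      extends Ordered[SecondarySortKey] with Serializable {
    def compare(other: SecondarySortKey): Int =
      if (this.first != other.first) this.first - other.first
      else this.second - other.second
  }

  // Companion object with an apply factory:
  // SecondarySortKey(4, 3) now works without `new`.
  object SecondarySortKey {
    def apply(first: Int, second: Int): SecondarySortKey =
      new SecondarySortKey(first, second)
  }

  // Hypothetical helper: sorts lines of "first second" pairs in
  // descending order, mimicking pairWithKey.sortByKey(false) on a List.
  def secondarySort(lines: List[String]): List[String] =
    lines
      .map(l => (SecondarySortKey(l.split(" ")(0).toInt, l.split(" ")(1).toInt), l))
      .sortWith((a, b) => a._1 > b._1) // descending, like sortByKey(false)
      .map(_._2)

  def main(args: Array[String]): Unit =
    secondarySort(List("2 3", "4 1", "3 2", "4 3", "2 1")).foreach(println)
}
```

With the sample input above, the helper reproduces the descending result shown below; the companion object's apply is the standard Scala idiom for factory-style construction.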
The sorting result is:
4 3
4 1
3 2
2 3
2 1
For follow-up lessons, see Liaoliang's Sina Weibo, "DT Big Data Dream Factory": http://weibo.com/ilovepains
Liaoliang, known as "China's Spark first person"; public WeChat account: DT_Spark
Please credit the source when reposting.
Spark IMF saga, Lesson 19: Spark Sort summary.