Reposted from:
http://www.cnblogs.com/one--way/archive/2016/08/29/5818989.html
http://www.cnblogs.com/one--way/p/5814148.html
Prerequisites:
1. The Spark Standalone cluster deployment is complete.
2. IntelliJ IDEA can run Spark programs in local mode.
Source code:
import org.apache.spark.{SparkContext, SparkConf}
import scala.math._

/**
  * Created by Edward on 2016/8/27.
  */
object WordCount {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("WordCount")
      .setMaster("spark://node1:7077")
      .setJars(List("D:\\documents\\spark\\mydemo\\test\\out\\artifacts\\spark_sample_jar\\test.jar"))
    //val sc = new SparkContext(sparkConf)
    val spark = new SparkContext(sparkConf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}
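Despite its WordCount name, the example estimates Pi by Monte Carlo sampling: points are thrown uniformly into the square [-1, 1) x [-1, 1), and the fraction landing inside the unit circle approximates Pi/4. That sampling logic can be sanity-checked locally in plain Scala without a cluster; the sketch below (not part of the original post, with a hypothetical `PiLocal` object name) mirrors the map/reduce body of the Spark job:

```scala
import scala.math.random

object PiLocal {
  def main(args: Array[String]): Unit = {
    val n = 100000
    // Count how many of n random points in [-1, 1) x [-1, 1)
    // fall inside the unit circle x*x + y*y < 1.
    val count = (1 until n).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.sum
    // Area of circle / area of square = Pi/4, so scale by 4.
    val pi = 4.0 * count / n
    println("Pi is roughly " + pi)
  }
}
```

Running this repeatedly gives values near 3.14; the Spark version distributes the same per-point computation across the cluster with `parallelize` and combines the partial counts with `reduce(_ + _)`.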
The key point here is that the packaged jar is shipped to the cluster via the .setJars method.
Running Spark Standalone under IntelliJ IDEA in Windows