Code:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import scala.math.random

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkPi").setMaster("local")
    val sc = new SparkContext(conf)
    val num = 500000
    val numRdd = sc.parallelize(1 to num)
    // Map each element to 1 if a random point in [-1, 1] x [-1, 1]
    // lands inside the unit circle, else 0; then sum the hits with reduce.
    val count = numRdd.map { n =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println(4.0 * count / num)
    sc.stop()
  }
}
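The same Monte Carlo logic can be tried without a Spark installation. Below is a minimal standalone sketch in plain Scala (the object and method names are illustrative, not part of the Spark example above); it replaces the RDD with an ordinary range:

```scala
import scala.math.random

object PiEstimate {
  // Monte Carlo estimate of pi: the fraction of random points inside
  // the unit circle approximates pi/4 (ratio of circle to square area).
  def estimate(num: Int): Double = {
    val count = (1 to num).count { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      x * x + y * y < 1
    }
    4.0 * count / num
  }

  def main(args: Array[String]): Unit =
    println(estimate(500000))
}
```

With 500,000 samples the estimate typically lands within a few thousandths of pi; larger sample counts tighten the estimate at the usual Monte Carlo rate of 1/sqrt(num).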
Principle:
This example uses probability (the Monte Carlo method) to estimate pi.
Take a square with side length 1; its inscribed circle has radius 1/2. The probability that a uniformly random point (x, y) (0 <= x < 1, 0 <= y < 1) falls inside the circle is
p = area of circle / area of square = pi/4
A random point lies inside the circle when its distance to the center (1/2, 1/2) is less than 1/2:
sqrt((x - 1/2) * (x - 1/2) + (y - 1/2) * (y - 1/2)) < 1/2
Multiplying both sides by 2 and squaring reduces this to
(2x - 1) * (2x - 1) + (2y - 1) * (2y - 1) < 1
which is why the code samples coordinates in [-1, 1] and tests x*x + y*y < 1 directly.
The number of hits (count) divided by the total number of samples (num) approximates the probability p above, so
count / num ≈ pi/4
pi ≈ 4 * count / num
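The algebraic reduction above can be spot-checked numerically: both forms of the inequality should classify every sampled point the same way. A small sketch (object and method names are illustrative):

```scala
import scala.math.{random, sqrt}

object InequalityCheck {
  // Original form: distance from (x, y) to the center (1/2, 1/2) < 1/2.
  def insideOriginal(x: Double, y: Double): Boolean =
    sqrt((x - 0.5) * (x - 0.5) + (y - 0.5) * (y - 0.5)) < 0.5

  // Reduced form after multiplying by 2 and squaring.
  def insideReduced(x: Double, y: Double): Boolean =
    (2 * x - 1) * (2 * x - 1) + (2 * y - 1) * (2 * y - 1) < 1

  def main(args: Array[String]): Unit = {
    // Both predicates should agree on random points in the unit square.
    val agree = (1 to 100000).forall { _ =>
      val x = random
      val y = random
      insideOriginal(x, y) == insideReduced(x, y)
    }
    println(agree)
  }
}
```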
Spark example 2 -- SparkPi