Colleagues were called into a meeting by the boss ... it has been going on for two hours now, GOD.
Broadcast variable (broadcast): this kind of variable can only be modified on the driver side and cannot be modified on the executor side. Reading it with .value does not produce a shuffle, which is the optimization, but the data being broadcast needs to be small, e.g. a small lookup table, as in the sketch below.
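For example (a minimal sketch, not from the original post; the lookup Map and the names in it are made up for illustration), broadcasting a small lookup table lets you enrich an RDD inside a map instead of doing a join, so no shuffle is needed:

import org.apache.spark.{SparkConf, SparkContext}

object BroadcastLookupSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext(new SparkConf().setAppName("BroadcastLookupSketch").setMaster("local"))

    // a small lookup table: broadcast once per node instead of shuffled in a join
    val countryNames = Map(1 -> "CN", 2 -> "US", 3 -> "JP")
    val broadcastNames = sc.broadcast(countryNames)

    val userIdToCountryId = sc.parallelize(Seq(("userA", 1), ("userB", 3)))
    // read-only access on the executor side via .value; no join, so no shuffle
    userIdToCountryId
      .map { case (user, countryId) => (user, broadcastNames.value.getOrElse(countryId, "unknown")) }
      .foreach(println)

    sc.stop()
  }
}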
Accumulator (accumulator): tasks add to it on the executor side, and its value is read and displayed on the driver side.
(the code is already saved to my Youdao notes)
package com.ib.e3

import org.apache.spark.{SparkConf, SparkContext}

/**
 * Created by xxxxxxoooooo on 9/1/2016.
 */
object BroadcastAccumulators {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("BroadcastAccumulators").setMaster("local")
    val sc = new SparkContext(conf)

    // 1. Test for broadcast.
    // This variable can only be modified on the driver side and cannot be modified on the
    // executor side; as you can see below, broadcastNum has no transformation operators,
    // i.e. it cannot be modified, only read through .value.
    // Reading it does not produce a shuffle (that is the optimization), but the broadcast
    // data should be small.
    // Spark's broadcast variable is read-only, and there is only one copy per node,
    // not one copy per task. Its main benefit is reducing the network transfer of the
    // variable to each node, as well as the memory consumption on each node: when a
    // broadcast variable is used, each node keeps a single copy, and every task reads
    // the value through the broadcast variable's .value method.
    val data = Array(1, 3, 4, 6, 8, 9)
    val dataRdd = sc.parallelize(data)
    val num = 6
    val broadcastNum = sc.broadcast(num)
    // add num to each value in the array
    dataRdd.map(data => data + broadcastNum.value).foreach(x => println(x))

    // 2. Test for accumulators.
    // An accumulator is mainly used so that multiple nodes can share write operations
    // on one variable. It only provides an "add" operation, but that is enough to let
    // many tasks operate on the same variable in parallel.
    // A task can only add to the accumulator and cannot read its value;
    // only the driver program can read the accumulator's value.
    val xxx = sc.accumulator(0)
    dataRdd.foreach(x => xxx.add(x))
    println(xxx)
  }
}
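With this input the accumulator ends up at 1 + 3 + 4 + 6 + 8 + 9 = 31, and the map step prints each element plus 6 (the print order of foreach is not guaranteed).

Also worth noting: sc.accumulator(0) is the old RDD-era API and was deprecated in Spark 2.0. A rough sketch of the same program against the newer API (SparkSession plus sc.longAccumulator), as my own adaptation rather than part of the original code:

import org.apache.spark.sql.SparkSession

object BroadcastAccumulatorsV2 {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("BroadcastAccumulatorsV2").master("local").getOrCreate()
    val sc = spark.sparkContext

    val dataRdd = sc.parallelize(Array(1, 3, 4, 6, 8, 9))

    // broadcast works the same way: set on the driver, read-only .value on executors
    val broadcastNum = sc.broadcast(6)
    dataRdd.map(_ + broadcastNum.value).foreach(println)

    // longAccumulator replaces the deprecated sc.accumulator(0)
    val sum = sc.longAccumulator("sum")
    dataRdd.foreach(x => sum.add(x.toLong))
    // only the driver can read the accumulated value
    println(sum.value)

    spark.stop()
  }
}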