Notes from Liaoliang's Spark 3000 Disciples series, Lesson 4: Scala pattern matching and type parameters, summarized as follows:
Pattern matching:
def data(array: Array[String]): Unit = {
  array match {
    case Array(a, b, c) => println(a + b + c)
    case Array("Spark", _*) => // matches an array whose first element is "Spark"
    case _ => // ...
  }
}
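For illustration, here is how the cases above fire on some sample inputs (a minimal sketch; the element values are made up):

data(Array("Spark", "Scala", "Hadoop")) // three elements: the first case matches, prints SparkScalaHadoop
data(Array("Spark", "is", "fast", "!")) // four elements starting with "Spark": the second case matches
data(Array("Flink"))                    // falls through to the wildcard case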
The after-class assignment is:
Read the Spark source code for RDD, HadoopRDD, SparkContext, Master, and Worker, and analyze all of the pattern matching and type parameters used inside.
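As a pointer for that reading: Master and Worker receive deployment messages as case classes/objects and pattern match on them. The sketch below only imitates that style with made-up message fields; it is not Spark's actual code.

sealed trait MasterMessage
case class RegisterWorker(id: String, cores: Int, memoryMb: Int) extends MasterMessage
case class Heartbeat(workerId: String) extends MasterMessage
case object ElectedLeader extends MasterMessage

// Pattern matching on the message type, in the style of Master's receive method
def receive(msg: MasterMessage): Unit = msg match {
  case RegisterWorker(id, cores, mem) =>
    println(s"registering worker $id with $cores cores / $mem MB")
  case Heartbeat(workerId) =>
    println(s"heartbeat from $workerId")
  case ElectedLeader =>
    println("this master was elected leader")
}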
Here is a summary of my key points:
T <% Writable : ClassTag
The view bound T <% Writable means T can be implicitly converted to the Writable type.
ClassTag is a context bound: the compiler injects an implicit ClassTag[T] value from the context.
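A minimal sketch of how such a combined bound behaves. Writable and IntWritable here are hypothetical stand-ins so the snippet compiles without Hadoop on the classpath; also note that view bounds (<%) are deprecated in newer Scala versions in favor of an explicit implicit parameter of type T => Writable.

import scala.language.implicitConversions
import scala.reflect.ClassTag

object ViewBoundDemo extends App {
  // Hypothetical stand-ins for org.apache.hadoop.io.Writable / IntWritable
  trait Writable { def write(): String }
  case class IntWritable(value: Int) extends Writable { def write(): String = s"IntWritable($value)" }

  // Implicit view Int => Writable, which satisfies T <% Writable for T = Int
  implicit def intToWritable(i: Int): Writable = IntWritable(i)

  // T <% Writable : ClassTag combines a view bound and a context bound:
  //  - each T can be implicitly converted to Writable
  //  - an implicit ClassTag[T] is passed, so Array[T] can be created despite erasure
  def toWritableArray[T <% Writable : ClassTag](elems: T*): Array[T] = {
    elems.foreach(e => println((e: Writable).write())) // uses the implicit view
    Array[T](elems: _*)                                // uses the ClassTag
  }

  toWritableArray(1, 2, 3) // prints IntWritable(1) ... IntWritable(3) and returns an Array[Int]
}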
Regarding Manifest context bounds:
[T : Manifest] has evolved into [T : ClassTag]; with T : ClassTag, the needed type information is carried through to the runtime (ClassTag holds the runtime class, which is enough to instantiate things like Array[T]).
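For reference, the [T : ClassTag] context bound is just sugar for an extra implicit parameter, which is how that class information gets passed along (a small sketch):

import scala.reflect.ClassTag

// [T : ClassTag] desugars to an implicit ClassTag[T] parameter
def runtimeClassOf[T : ClassTag]: Class[_] = implicitly[ClassTag[T]].runtimeClass

// Equivalent desugared form
def runtimeClassOf2[T](implicit tag: ClassTag[T]): Class[_] = tag.runtimeClass

runtimeClassOf[String] // class java.lang.String
runtimeClassOf[Int]    // int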
Seq[Dependency[_]] is an existential type, equivalent to Seq[Dependency[T] forSome { type T }].
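A small sketch of what that existential type allows, using a made-up Dependency class rather than Spark's:

class Dependency[T](val name: String)

// Each element may wrap a Dependency of a different, unknown type parameter
val deps: Seq[Dependency[_]] = Seq(new Dependency[Int]("shuffle"), new Dependency[String]("narrow"))

deps.foreach(d => println(d.name))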
There is another important note (this is the mkArray example from the ClassTag documentation):
{{{
 * scala> def mkArray[T : ClassTag](elems: T*) = Array[T](elems: _*)
 * mkArray: [T](elems: T*)(implicit evidence$1: scala.reflect.ClassTag[T])Array[T]
 *
 * scala> mkArray(42, 13)
 * res0: Array[Int] = Array(42, 13)
 *
 * scala> mkArray("Japan", "Brazil", "Germany")
 * res1: Array[String] = Array(Japan, Brazil, Germany)
 * }}}
This shows ClassTag's implicit evidence mechanism: the context bound supplies an implicit ClassTag[T], which is what makes the generic Array[T] creation possible.
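For contrast, a sketch of what happens if the context bound is dropped (the exact error text may vary slightly between Scala versions):

scala> def mkArray[T](elems: T*) = Array[T](elems: _*)
<console>: error: No ClassTag available for T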