Course summary - stage breakdown:
Stage 1: every knowledge point is paired with concrete hands-on practice.
Stage 2: a large real-world project; mastering the relevant technology is what makes the high-value skills stick.
Become thoroughly familiar with Spark in a variety of environments - core principles and operating mechanisms - so that when something fails you know the reason and can fix it, and can do performance optimization well.
Master a large big data project end to end.
Complete the homework after each class to receive the project's full source code.
Stage 3: machine learning. With stages 1 and 2 firmly grasped, stage 3 can be mastered.
1.1
Scala runs on the JVM and can call any Java library.
Scala is a more purely object-oriented language than Java.
Scala supports both functional and object-oriented programming; the same functionality can take roughly 4/5 less code in Scala than in Java.
Spark's internals are very complex; you must understand the source code and the internal operating mechanisms.
Sina Weibo is the main place where the mentor and students communicate.
Installed VMware with Ubuntu Linux (took about 25 minutes)
Installed Scala 2.10.4
Installed JDK 1.8.0
1.1 Scala basics
Types convert automatically, e.g. in 1.5 * 2 the Int is widened to a Double
Press the Tab key in the REPL for method auto-completion
toChar, toInt: explicit type-conversion methods
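A quick sketch of the conversions above (values are illustrative):

```scala
// Mixed-type arithmetic widens automatically: the Int is promoted to Double.
val x = 1.5 * 2              // 3.0 (a Double)

// Explicit conversions use the toXxx methods.
val c = 97.toChar            // 'a'
val n = "42".toInt           // 42
```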
Variables are either mutable or immutable: a val-defined binding cannot be reassigned, while a var-defined one can
Once a variable's type is declared, it can only be assigned values of that type or a subtype
Declare several variables in one line: val age1, age2, age3 = 0
0.to(5) yields the range 0,1,2,3,4,5 -- everything is an object
1.+(1) evaluates to 2: + is just a method
i++ cannot be used; use i += 1 instead
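A small sketch of "everything is an object": operators are ordinary methods (parentheses around the literal avoid the old `1.` Double-literal ambiguity):

```scala
// + and to are methods on Int, so operator syntax is just sugar.
val two   = (1).+(1)          // same as 1 + 1
val range = 0.to(5)           // the range 0,1,2,3,4,5

// There is no i++ in Scala; use += instead.
var i = 0
i += 1
```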
import scala.math._ imports the Scala math library
min(20, 4)
val arr = Array(1, 2, 3, 4), equivalent to val arr = Array.apply(1, 2, 3, 4) -- defines an array
val result = if (age >= 18) "adult" else "child" -- if is an expression whose result can be assigned
The value of the last line of a code block is the return value of the whole block
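A sketch of both points: `if` is an expression, and a block evaluates to its last line:

```scala
val age = 20
val result = if (age >= 18) "adult" else "child"   // "adult"

val area = {
  val width  = 3
  val height = 4
  width * height    // last line -> value of the whole block
}
// area == 12
```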
Print with a newline: println("spark\n")
printf("%s world!\n", "Hello") -- format string with arguments substituted in
readInt, readLine: read input from the console
do..while loop (at 1:03 in the video)
for (i <- 0 to 10 if i % 2 == 0) println(i) -- a for loop with an if guard
:paste -- enter multi-line code in the REPL, e.g. when writing a function
Note the final return 0: f1 is declared to return Int, and the compiler's check cannot prove that the loop always returns, so a default return value is needed at the end
val n = 10
def f1: Int = {
  for (i <- 1 to 10) {
    if (i == n) return i
    println(i)
  }
  return 0
}
def f3(param1: String, param2: Int = 30) = param1 + param2 -- function with parameters, one of which has a default value
f3(param2 = 100, param1 = "Scala") -- with named arguments, parameter order does not matter
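The two calls above, sketched together (f3 concatenates the String and the Int):

```scala
def f3(param1: String, param2: Int = 30) = param1 + param2

val a = f3("Spark")                         // default param2 = 30 -> "Spark30"
val b = f3(param2 = 100, param1 = "Scala")  // named args in any order -> "Scala100"
```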
def sum(numbers: Int*) = { var result = 0; for (element <- numbers) result += element; result } -- variable-length parameter
sum(1 to 100) -- this call does not compile: a Range is not Int*
: _* expands a sequence into individual varargs
sum(1 to 100: _*) -- this way is correct
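Putting the varargs pieces together; `: _*` tells the compiler to expand the sequence into individual arguments:

```scala
// Variable-length parameter: numbers arrives as a sequence of Int.
def sum(numbers: Int*) = {
  var result = 0
  for (element <- numbers) result += element
  result
}

val small = sum(1, 2, 3)        // 6
// sum(1 to 100)                // does not compile: a Range is not Int*
val big   = sum(1 to 100: _*)   // 5050: the range is expanded into varargs
```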
lazy val content = fromFile("/root/testsf") -- reads a file; a lazy val's initializer does not run at definition. The resource is allocated on first use: nothing is checked or allocated when it is defined, only when it is actually used
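A sketch of the deferred evaluation, using a flag instead of file I/O so it runs anywhere:

```scala
var initialized = false
lazy val content = { initialized = true; "data" }   // initializer has not run yet

val before = initialized    // false: defining the lazy val allocated nothing
val value  = content        // first access runs the initializer
val after  = initialized    // true
```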
import scala.io.Source.fromFile
import java.io.FileNotFoundException
try {
  val content = fromFile("/root/testsf").mkString
} catch {
  case _: FileNotFoundException =>
} finally {
  println("")
}
val arr = new Array[Int](5) -- defines an array; val here means only the reference cannot be reassigned, the element values can still change
val arr1 = Array("Scala", "Spark") -- Scala type inference: the compiler infers Array[String] automatically
import scala.collection.mutable.ArrayBuffer
val arrBuffer = ArrayBuffer[Int]()
arrBuffer += (1, 2, 3, 4)          // append elements
arrBuffer ++= Array(1, 2, 3, 4)    // append every element of another collection
arrBuffer.insert(5, 100)           // insert 100 at index 5
arrBuffer.insert(5, 100, 101)      // insert several elements at index 5
arrBuffer.trimEnd(3)               // drop the last 3 elements
arrBuffer.remove(3)                // remove the element at index 3
arrBuffer.remove(3, 2)             // remove 2 elements starting at index 3
val arr2 = arrBuffer.toArray
arr2.toBuffer
val arr2 = Array(1, 1, 3, 4, 10, 11, 100)
for (i <- 0 until (arr2.length, 2)) println(arr2(i)) -- step through every second index
arr2.mkString(",") -- join into one string separated by ","
arr2.filter(_ % 3 == 0).map(i => i * i) -- keep multiples of 3, then square each
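The array operations above with their results; the filter/map line answers the "????": `_ % 3 == 0` keeps multiples of 3, then `map` squares each survivor:

```scala
val arr2 = Array(1, 1, 3, 4, 10, 11, 100)

// Every second index: 0, 2, 4, 6 -> prints 1, 3, 10, 100
for (i <- 0 until (arr2.length, 2)) println(arr2(i))

val joined  = arr2.mkString(",")                       // "1,1,3,4,10,11,100"
val squares = arr2.filter(_ % 3 == 0).map(i => i * i)  // only 3 is divisible by 3 -> Array(9)
```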
val persons = scala.collection.mutable.Map("Spark" -> 6, "Hadoop" -> 11)
persons("Hadoop")
persons += ("Flink" -> 5)
persons -= "Flink"
persons.contains("Spark") -- check whether a key exists
persons.getOrElse("Spark", 1000) -- return the value for the key, or the default if the key is absent
for ((key, value) <- persons) println(key + ":" + value)
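The Map operations above in one runnable sketch (`+=` and `-=` need a mutable Map):

```scala
import scala.collection.mutable

val persons = mutable.Map("Spark" -> 6, "Hadoop" -> 11)
val h = persons("Hadoop")                        // 11
persons += ("Flink" -> 5)                        // add an entry
persons -= "Flink"                               // remove by key
val hasSpark = persons.contains("Spark")         // true
val flink    = persons.getOrElse("Flink", 1000)  // 1000: key was removed, so default
for ((key, value) <- persons) println(key + ":" + value)
```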
Multi-valued Map (one key mapped to a collection of values)
Tuple
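A brief sketch of both: a tuple groups values of mixed types (fields `_1`, `_2`, ...), and a multi-valued map sends one key to a collection of values (sample keys and values are illustrative):

```scala
// Tuple: mixed types, accessed positionally.
val t = ("Spark", 6, true)
val name = t._1              // "Spark"

// Multi-valued map: one key -> several values.
val courses = Map("big data" -> List("Spark", "Hadoop"), "language" -> List("Scala"))
val bigData = courses("big data")   // List("Spark", "Hadoop")
```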
Scala is covered in 3 days -- enough to read 95% of the Spark source code
Tomorrow covers object-oriented content; the amount of material is 3 times today's, and it is important
Homework two: write a study note on this lesson, then @ me on Sina Weibo
Group:
462923555
437123764
418110145
Stage 1 takes 2-3 months; stage 2 takes 2-3 months
The relationship between Spark and Hadoop is through HDFS
Homework one: remove all negative numbers that come after the first negative number in an array
import scala.collection.mutable.ArrayBuffer
def deletePlural(arr1: ArrayBuffer[Int]): ArrayBuffer[Int] = {
  val arrBuffer = ArrayBuffer[Int]()
  var seenNegative = false
  for (i <- arr1) {
    if (i >= 0 || !seenNegative) {
      arrBuffer += i
      if (i < 0) seenNegative = true
    }
  }
  arrBuffer
}
val arr1 = ArrayBuffer(1, 1, 3, -1, 4, -2, -5, 10, 11, 100, -12)
val result = deletePlural(arr1)   // keeps the first negative (-1), drops later negatives
for (i <- result) {
  println(i)
}
1.1 Day 1, first lesson notes