First Lesson: Introduction to Scala and a Summary of Practical Notes

Source: Internet
Author: User
Tags: array, sort, spark, rdd

First lesson: Getting Started with Scala
1: The great value of Scala
2: Hands-on introduction to Scala basics
3: Hands-on introduction to Scala functions
4: Hands-on practice with arrays, maps, and tuples in Scala
5: A comprehensive case study and Spark source-code analysis

The relationship between Scala and Java:
One: Both run on the JVM, and Scala can call any Java code, just as Spark running on Hadoop can use everything Hadoop provides.
Two: You can think of Scala as an upgraded version of Java. In Scala everything is an object, so it is a purely object-oriented language; at the same time it supports functional programming, making Scala a combination of the object-oriented and functional styles.


Scala is the development language for big data for the following reasons:
One: Big data is fundamentally about computing over data. Scala offers both the object-oriented capabilities needed to organize large engineering projects and the functional capabilities needed to express computations over data.
Two: Spark, the de facto standard framework for big data computation, is itself developed in Scala, because Scala's functional style lets algorithms over data be implemented very concisely and elegantly.
Example: Kafka is message middleware. When external data flows into a big data center, we generally use Kafka as the adapter; when data flows out of the big data center (for example, when Spark hands computed results to HBase or MySQL), we also use Kafka in between. Many big data components are written in Scala, so if you want to be a top-notch big data developer, you must master Scala.

Result type conversion:
1.5 * 2 gives res1: Double = 3.0
res1.toInt // press Tab in the REPL for auto-completion
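As a minimal sketch, the REPL interaction above can be written as a script (the value names are illustrative):

```scala
// Numeric result types and explicit conversions
val res1 = 1.5 * 2       // res1: Double = 3.0
val asInt = res1.toInt   // truncates the Double to an Int: 3
val asLong = 10.toLong   // even a literal is an object with conversion methods
```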

Declaration of variables in Scala:
One: Declaring a mutable variable
var name = "Spark" // declares a mutable variable name with the value "Spark"; you can change its value, e.g. name = "Scala"

Two: Declaring an immutable variable
val result = 2 + 10 // declares an immutable variable result whose value is 12; its value cannot change, and if you try to reassign it, e.g. result = 13, the compiler reports an error, because result is immutable

Note: The data in a Spark RDD is immutable by default.
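A minimal sketch of both declaration forms; the failing reassignment is left commented out because it would not compile:

```scala
var name = "Spark"   // mutable: reassignment is allowed
name = "Scala"

val result = 2 + 10  // immutable: bound once to 12
// result = 13       // compile error: reassignment to val
```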


Three: Manually specifying the type of a variable
val age: Int = 0 // declares the immutable variable age with type Int and value 0
val name: String = null // declares the immutable variable name with type String and default value null
Note: If you specify a variable's type explicitly, subsequent assignments to the variable (if it is a var) can only be of that type or one of its subtypes.

Four: Declaring multiple variables on one line
val age1, age2, age3 = 0
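A short sketch combining explicit type annotations with the one-line multiple declaration (the variable names are illustrative):

```scala
val age: Int = 0
val nickname: String = null  // an explicit type is needed to assign null
val age1, age2, age3 = 0     // three variables, all initialized to 0

var label: String = "Spark"  // with var, later assignments must match the declared type
label = "Scala"              // ok: String
// label = 42                // compile error: Int is not a String
```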

Note: Scala converts automatically between basic types and object types, because in Scala the basic types are objects. For example, 10.toLong works; in Java the primitive type has no such method, but Scala does.

Scala has no ++ or -- operators.
For example, with var age = 10, writing age++ is not valid; instead you can write age += 1.

Taking the minimum of two values:
import scala.math._
min(20, 4)
The result is 4.
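A minimal sketch of the import and call (max is shown alongside min for comparison):

```scala
import scala.math._        // brings min, max, abs, etc. into scope

val smallest = min(20, 4)  // 4
val largest = max(20, 4)   // 20
```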

Building an array:
val array = Array(1, 2, 3, 4)
Internal implementation: val array = Array.apply(1, 2, 3, 4)
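A sketch showing that the two spellings build the same array:

```scala
val a1 = Array(1, 2, 3, 4)        // sugar for the call below
val a2 = Array.apply(1, 2, 3, 4)  // apply is the factory method on Array's companion object
```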

Five: if expressions
val age = 19
if (age >= 18) "adult" else "child"
Result: String = adult
Note: In Scala, an if expression has a result value; in Java it does not.

The value of the last line is the return value of the entire code block:
var buffered = 0
val result = if (age >= 18) {
  "adult"
  buffered = 10
  buffered
}
Result: result: AnyVal = 10
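A runnable sketch of the block above; note the inferred type is AnyVal because the if has no else branch, so the missing branch contributes Unit:

```scala
val age = 19
var buffered = 0

// the last expression in a block is the block's value;
// with no else branch, the type widens to AnyVal (the lub of Int and Unit)
val result = if (age >= 18) {
  "adult"        // evaluated and discarded
  buffered = 10
  buffered
}
```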

Six: Loops
0 to element

for (i <- 0 to element) println(i)

Description: <- is the generator; it extracts each element of the range and binds its value to i.

for (i <- 0 to element if i % 2 == 0) {
  println(i)
}
This prints the even numbers.
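A sketch of both loops, using an illustrative bound element = 6 and collecting results instead of printing:

```scala
val element = 6  // illustrative upper bound

val all = (0 to element).toList                            // 0,1,2,3,4,5,6
val evens = for (i <- 0 to element if i % 2 == 0) yield i  // 0,2,4,6
```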

How do we terminate a loop? Scala has no break statement, but it provides a Breaks object we can use.
One: Import the package: import scala.util.control.breaks._
Two: Usage
for (i <- 1 to 10) {
  if (i == 4)
    break
  println(i)
}
Results: 1 2 3
The REPL then shows scala.util.control.BreakControl
In fact, break is realized by throwing this control exception at that point.
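The idiomatic form wraps the loop in breakable; without it, the BreakControl exception propagates out of the loop uncaught, which is what the REPL output above shows. A sketch that collects the values into a list instead of printing:

```scala
import scala.util.control.Breaks._

var seen = List.empty[Int]
breakable {
  for (i <- 1 to 10) {
    if (i == 4) break()  // throws BreakControl, caught by the enclosing breakable
    seen = seen :+ i
  }
}
// seen is List(1, 2, 3)
```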

Seven: Function definitions
Default parameters:
def f3(param1: String, param2: Int = 30) = param1 + param2
f3("Spark")
Value: Spark30

Passing parameters out of order: use named parameters, as follows
f3(param2 = 100, param1 = "Scala")
Result: Scala100
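A runnable sketch of the default-parameter and named-parameter calls:

```scala
def f3(param1: String, param2: Int = 30) = param1 + param2

val withDefault = f3("Spark")                   // param2 defaults to 30: "Spark30"
val named = f3(param2 = 100, param1 = "Scala")  // named args, any order: "Scala100"
```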

Variable-length (varargs) functions:
def sum(numbers: Int*) = { var result = 0; for (element <- numbers) result += element; result }
sum(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
The value is 55.

sum(1 to 100: _*) gives 5050; the `: _*` annotation extracts each element of the sequence and passes it as a separate argument.
A function that has no return value is called a procedure.
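A runnable sketch of the varargs function and the `: _*` expansion:

```scala
def sum(numbers: Int*) = {
  var result = 0
  for (element <- numbers) result += element
  result
}

val small = sum(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)  // 55
val big = sum(1 to 100: _*)                     // expand the Range into varargs: 5050
```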


Eight: lazy: If a variable is declared lazy, its initialization only takes place the first time it is used. This is especially useful for expensive initializations such as opening a file, opening a database connection, or performing network operations.
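A minimal sketch using an illustrative `initialized` flag to observe when the initializer actually runs:

```scala
var initialized = false   // illustrative flag to observe evaluation
lazy val resource = {
  initialized = true      // side effect marks when initialization runs
  "opened"
}

val before = initialized  // false: nothing evaluated yet
val value = resource      // first access triggers the initializer
val after = initialized   // true
```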

Nine: Exceptions
try {
  // code that may throw
} catch {
  case e: Exception => // handle via pattern matching
} finally {
  println("finally")
}

Note: This differs slightly from exceptions in Java: the catch block is a sequence of case clauses matched against the thrown exception.
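A sketch using a hypothetical helper `safeDivide` to show pattern matching in catch:

```scala
// hypothetical helper showing try/catch/finally with pattern matching
def safeDivide(a: Int, b: Int): Int =
  try {
    a / b
  } catch {
    case _: ArithmeticException => 0  // matched when b == 0
  } finally {
    ()  // finally runs whether or not an exception was thrown
  }

val ok = safeDivide(10, 2)      // 5
val caught = safeDivide(10, 0)  // 0
```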


Ten: Collection operations
Create a fixed-length array: val arr = new Array[Int](5) // creates an array of size 5; officially, val means the arr reference itself cannot be changed, but the contents of the array can be modified
Access an element: arr(3) // accesses the 4th element
Modify an element's value: arr(2) = 8


Create a fixed-length array with initial values: val arr1 = Array("Scala", "Spark")
Internally this uses apply: val arr1 = Array.apply("Scala", "Spark") // apply is the factory method on the companion object


Create a mutable array (ArrayBuffer): you can add and remove elements
One: Import the package: import scala.collection.mutable.ArrayBuffer
Two: val arrBuffer = ArrayBuffer[Int]()
Append an element: arrBuffer += 10
arrBuffer += (11, 1, 3, 4, 5) // the mutable array now has 6 elements
arrBuffer ++= Array(1, 2, 3, 4) // appends the four elements 1, 2, 3, 4 after the existing elements

Remove elements:
arrBuffer.trimEnd(3) // truncates the last 3 elements of arrBuffer

Insert at a fixed position: arrBuffer.insert(5, 100) inserts 100 at index 5, and the elements after it shift backward
Delete at a fixed position: arrBuffer.remove(5) removes the element at index 5
arrBuffer.remove(5, 3) removes 3 elements starting at index 5, including index 5

For loop traversal of the array:
for (element <- arrBuffer) {
  println(element)
}
Array sum: arrBuffer.sum
Array maximum: arrBuffer.max
Array sort: scala.util.Sorting.quickSort(arr) // quickSort sorts an Array in place; to sort an ArrayBuffer, convert it with toArray first or use arrBuffer.sorted
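A runnable sketch tying the ArrayBuffer operations together (the inserted value is removed again so the totals are easy to check):

```scala
import scala.collection.mutable.ArrayBuffer

val arrBuffer = ArrayBuffer[Int]()
arrBuffer += 10
arrBuffer ++= Array(11, 1, 3, 4, 5)  // append several elements at once
arrBuffer.insert(2, 100)             // insert 100 at index 2; later elements shift back
arrBuffer.remove(2)                  // remove the element at index 2 again

val total = arrBuffer.sum            // 10 + 11 + 1 + 3 + 4 + 5 = 34
val biggest = arrBuffer.max          // 11

val sortedArr = arrBuffer.toArray    // quickSort works on an Array, not an ArrayBuffer
scala.util.Sorting.quickSort(sortedArr)
```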

Map
Define an immutable map: val persons = Map("Spark" -> 6, "Hadoop" -> 11)
Access the value for the key "Hadoop": persons("Hadoop")

Define a mutable map:
val persons = scala.collection.mutable.Map("Spark" -> 6, "Hadoop" -> 11)
Add an element:
persons += ("Flink" -> 5)
Remove an element:
persons -= "Flink"

A variant of the if () else () pattern on a map:
In Spark the common notation is persons.getOrElse("Spark", 1000) // if this map contains the key "Spark", its value is returned; if not, the value is 1000

For loop over the keys and values in persons:
for ((key, value) <- persons)
  println(key + ":" + value)
Output (iteration order of a hash map is not guaranteed): Hadoop:11
Spark:6

Create a sorted map (sorts by the map's keys):
val persons = scala.collection.immutable.SortedMap("Spark" -> 6, "Hadoop" -> 11)
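A runnable sketch of the mutable map operations, getOrElse, and the sorted map:

```scala
val persons = scala.collection.mutable.Map("Spark" -> 6, "Hadoop" -> 11)
persons += ("Flink" -> 5)
val flink = persons("Flink")  // 5
persons -= "Flink"

val present = persons.getOrElse("Spark", 1000)  // key exists: 6
val absent = persons.getOrElse("Flink", 1000)   // key removed: default 1000

val sorted = scala.collection.immutable.SortedMap("Spark" -> 6, "Hadoop" -> 11)
val firstKey = sorted.head._1  // keys iterate in sorted order: "Hadoop" first
```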

Tuple: a tuple can store values of many different types
val tuple = ("Spark", 6, 99.0)
Access: tuple._1 gives String = Spark
Note: Tuple access subscripts start from 1, unlike arrays.
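A minimal sketch of tuple creation and 1-based field access:

```scala
val tuple = ("Spark", 6, 99.0)  // a Tuple3 mixing String, Int, and Double

val first = tuple._1   // "Spark" -- tuple fields are numbered from 1
val second = tuple._2  // 6
val third = tuple._3   // 99.0
```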

Development environment:

Ubuntu 14.x

Scala 2.10.4

Java 1.8

