Scala code in one day (7)
To better control Spark, I have recently been studying Scala's language features, mainly by reading "Learn Scala Fast", and writing down some code that I think is useful.
Build a Scala environment on Linux and write a simple Scala program (code tutorial)
Installing the Scala environment on Linux is very simple, and in an Ubuntu environment it is even simpler: you can solve it directly with apt-get. I happen to use Ubuntu.
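As a hedged sketch of the "simple Scala program" the tutorial refers to (the file name and message are my own, not from the original post), a minimal program looks like this:

// HelloScala.scala -- illustrative only; the original post's program is not shown in the excerpt
object HelloScala {
  def main(args: Array[String]): Unit = {
    println("Hello, Scala on Linux!")
  }
}

Compile and run with: scalac HelloScala.scala, then scala HelloScala.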
Why write this blog? First, for Spark projects I highly recommend building with IntelliJ IDEA (Ultimate edition); if you have a taste for trying Scala IDE for Eclipse instead, play with it yourself when you have time, but it is best to follow the crowd. For Hadoop projects, Eclipse is strongly recommended. Second, a blog friend left me a message asking for this, so to help everyone more efficiently and with better quality, I have sorted it out and written it down.
package com.leegh.parameterization

/**
 * @author Guohui Li
 */
class Person
class Student extends Person

// C is covariant in T, Friend is contravariant in T
class C[+T](val args: T)

trait Friend[-T] {
  def makeFriend(somebody: T)
}

object Variance {
  def makeFriendWithYou(s: Student, f: Friend[Student]) { f.makeFriend(s) }

  def main(args: Array[String]): Unit = {
    // Covariance: a C[Student] can be assigned where a C[Person] is expected
    val value: C[Person] = new C[Student](new Student)
  }
}
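To make the contravariant side concrete as well, here is a small hedged sketch (the Friend[Person] value and the printed message are my own additions, not in the original code): because Friend is contravariant in T, a Friend[Person] can be passed where a Friend[Student] is expected.

object VarianceDemo {
  class Person
  class Student extends Person
  trait Friend[-T] { def makeFriend(somebody: T): Unit }

  def makeFriendWithYou(s: Student, f: Friend[Student]): Unit = f.makeFriend(s)

  def main(args: Array[String]): Unit = {
    val anyonesFriend: Friend[Person] = new Friend[Person] {
      def makeFriend(somebody: Person): Unit = println("made a friend with " + somebody)
    }
    // Friend[-T] means Friend[Person] <: Friend[Student], so this type-checks
    makeFriendWithYou(new Student, anyonesFriend)
  }
}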
Scala code in one day (9)
To better control Spark, I have recently been studying Scala's language features, mainly by reading "Learn Scala Fast", and writing down some code that I think is useful.
Scala code in one day (5)
To better control Spark, I have recently been studying Scala's language features, mainly by reading "Learn Scala Fast", and writing down some code that I think is useful.
package com.leegh.parameterization

/**
 * @author Guohui Li
 */
// T: Ordering is a context bound: an implicit Ordering[T] must be available
class Pair_Ordering[T: Ordering](val first: T, val second: T) {
  def bigger(implicit ordered: Ordering[T]) = {
    if (ordered.compare(first, second) > 0) first else second
  }
}

object Context_Bounds {
  def main(args: Array[String]): Unit = {
    val pair = new Pair_Ordering("Spark", "Hadoop")
    println(pair.bigger)

    val pairInt = new Pair_Ordering(3, 5)
    println(pairInt.bigger)
  }
}
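As a short follow-up sketch (my own illustration, not from the original post: the Book class, the byPages ordering and the Pair class here are assumptions), the same context-bound pattern works for user-defined types as long as an implicit Ordering is in scope:

object ContextBoundDemo {
  case class Book(title: String, pages: Int)

  // Supplies the Ordering[Book] that the context bound below requires
  implicit val byPages: Ordering[Book] = Ordering.by(_.pages)

  class Pair[T: Ordering](val first: T, val second: T) {
    def bigger: T = implicitly[Ordering[T]].max(first, second)
  }

  def main(args: Array[String]): Unit = {
    val pair = new Pair(Book("A", 120), Book("B", 300))
    println(pair.bigger)   // Book(B,300)
  }
}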
package com.leegh.implicits

/**
 * @author Guohui Li
 */
object Context_Implicits {
  implicit val default: String = "Java"
}

object Param {
  def print(content: String)(implicit language: String) {
    println(language + ":" + content)
  }
}

object Context_Parameters {
  def main(args: Array[String]): Unit = {
    // Second parameter list supplied explicitly
    Param.print("Spark")("Scala")

    // Second parameter list resolved from the imported implicit value
    import Context_Implicits._
    Param.print("Hadoop")
  }
}
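A small hedged variation (the object and method names below are my own, not from the original code): the call site picks up whichever implicit String is imported, so swapping the import swaps the output, while an explicit argument always wins.

object EnglishImplicits { implicit val default: String = "English" }

object ImplicitSwapDemo {
  def greet(name: String)(implicit language: String): Unit =
    println(language + ":" + name)

  def main(args: Array[String]): Unit = {
    greet("Spark")("Scala")    // explicit argument: prints Scala:Spark
    import EnglishImplicits._
    greet("Hadoop")            // resolved implicitly: prints English:Hadoop
  }
}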
package com.leegh.parameterization

/**
 * @author Guohui Li
 */
// Without this.type, breathe returns this but its type is inferred as Animal,
// and Animal has no eat method, so cat.breathe.eat would not compile:
//   class Animal { def breathe = this }
//   class Cat extends Animal { def eat = this }

// With this.type the return type is the singleton type of the receiver,
// so chained calls keep the concrete type:
class Animal { def breathe: this.type = this }
class Cat extends Animal { def eat: this.type = this }

object Singleton_Types {
  def main(args: Array[String]): Unit = {
    val cat = new Cat
    cat.breathe.eat
  }
}
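A common practical use of this.type (my own sketch, not from the original post: the Builder and SqlBuilder names are assumptions) is a fluent builder hierarchy, where chained calls on a subclass keep returning the subclass type:

class Builder {
  protected var parts: List[String] = Nil
  def add(part: String): this.type = { parts = part :: parts; this }
}

class SqlBuilder extends Builder {
  def where(cond: String): this.type = add("WHERE " + cond)
  def build(): String = parts.reverse.mkString(" ")
}

object FluentDemo {
  def main(args: Array[String]): Unit = {
    // add returns this.type, so the result is still a SqlBuilder and where/build stay available
    val sql = new SqlBuilder().add("SELECT *").add("FROM logs").where("level = 'ERROR'").build()
    println(sql)   // SELECT * FROM logs WHERE level = 'ERROR'
  }
}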
package com.leegh.parameterization

/**
 * @author Guohui Li
 */
// Upper bound (the bound itself was garbled in the source; T <: Comparable[T]
// matches the use of compareTo below): T must be a subtype of Comparable[T]
class Pair[T <: Comparable[T]](val first: T, val second: T) {
  def bigger = if (first.compareTo(second) > 0) first else second
}

// Lower bound: R must be a supertype of T
class Pair_Lower_Bound[T](val first: T, val second: T) {
  def replaceFirst[R >: T](newFirst: R) = new Pair_Lower_Bound[R](newFirst, second)
}

object Type_Variables_Bounds {
  def main(args: Array[String]): Unit = {
    val pair = new Pair("Spark", "Hadoop")
    println(pair.bigger)
  }
}
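A short hedged sketch of the lower bound in action (the Fruit/Apple/Orange classes are my own illustration, not in the original code): replaceFirst can widen the element type to a common supertype.

object LowerBoundDemo {
  class Fruit
  class Apple extends Fruit
  class Orange extends Fruit

  class Pair_Lower_Bound[T](val first: T, val second: T) {
    def replaceFirst[R >: T](newFirst: R) = new Pair_Lower_Bound[R](newFirst, second)
  }

  def main(args: Array[String]): Unit = {
    val apples = new Pair_Lower_Bound(new Apple, new Apple)
    // Replacing the first element with an Orange widens the result to Pair_Lower_Bound[Fruit]
    val fruit = apples.replaceFirst(new Orange)
    println(fruit.first.isInstanceOf[Fruit])   // true
  }
}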
package com.leegh.parameterization

/**
 * @author Guohui Li
 */
object Type_Contraints {
  def main(args: Array[String]): Unit = {
    // The right-hand side of the constraint was truncated in the source;
    // T <:< java.io.Serializable is assumed here and holds for the String below.
    def rocky[T](i: T)(implicit ev: T <:< java.io.Serializable) {
      println("Life is too short,you need spark!")
    }
    rocky("Spark")
  }
}
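As a companion sketch (entirely my own, not from the original post): a generalized type constraint such as =:= only compiles when the evidence exists, so a method can be restricted to certain element types.

object TypeConstraintDemo {
  class Container[T](value: T) {
    // getInt is only callable when T is exactly Int
    def getInt(implicit ev: T =:= Int): Int = ev(value)
  }

  def main(args: Array[String]): Unit = {
    println(new Container(42).getInt)   // compiles: evidence Int =:= Int exists
    // new Container("42").getInt       // would not compile: no String =:= Int evidence
  }
}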
In Scala 2.10.x, List.take is implemented with a ListBuffer; in 2.11.x it is not. The 2.11.x version builds the result directly out of :: cells:

override def take(n: Int): List[A] = if (isEmpty || n <= 0) Nil else {
  val h = new ::(head, Nil)
  var t = h
  var rest = tail
  var i = 1
  while ({ if (rest.isEmpty) return this; i < n }) {
    i += 1
    val nx = new ::(rest.head, Nil)
    t.tl = nx
    t = nx
    rest = rest.tail
  }
  h
}

final case class ::[B](override val head: B, private[scala] var tl: List[B]) extends List[B]

Declaring tl as a var is what allows ListBuffer (and other code inside the scala package) to build a list by mutating the tail of an already-constructed cell.
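For context, a small sketch of my own (not from the original post) of the usual way user code builds a List front-to-back with ListBuffer, which relies on that same mutable tl internally:

import scala.collection.mutable.ListBuffer

object ListBufferDemo {
  def main(args: Array[String]): Unit = {
    val buf = ListBuffer.empty[Int]
    for (i <- 1 to 5) buf += i        // appends in O(1) by mutating the last cell's tl
    val xs: List[Int] = buf.toList    // hands the built cells over without copying
    println(xs)                       // List(1, 2, 3, 4, 5)
  }
}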
'), (3, "three", ' 3 ')). UNZIP3//L1=list (All-in-one), L2=list ("One", "the" , "three"), L3=list (' 1 ', ' 2 ', ' 3 ')20.sliceSlice (From:int, until:int): List[a] Extracts list of elements from position from to position until (excluding this location)Val sliced = Nums.slice (2,4)//list (3,4)21.slidingSliding (Size:int, Step:int): Iterator[list[a]] Groups The list by a fixed size of sizes, stepping to step,step default to 1, and returning the result as an iteratorVal groupStep1 = nums.sliding (
Write a Scala multi-threading demo to prepare for later use.

Runnable/Callable difference: a Runnable has no return value; a Callable returns a value when the thread finishes executing.

Runnable example:

import java.util.concurrent.{Executors, ExecutorService}

object Test {
  def main(args: Array[String]) {
    // create a thread pool
    val threadPool: ExecutorService = Executors.newFixedThreadPool(5)
    try {
      // submit 5 threads (the loop body is truncated in the source;
      // printing the thread name is a typical stand-in)
      for (i <- 1 to 5) {
        threadPool.execute(new Runnable {
          override def run(): Unit = println(Thread.currentThread().getName + " : " + i)
        })
      }
    } finally {
      threadPool.shutdown()
    }
  }
}

The Callable example that followed (importing java.util.concurrent.{Callable, FutureTask, ...}) is truncated in the source.
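Since the Callable example is cut off, here is a minimal hedged sketch of what such a demo usually looks like (the pool size and the i * i task body are my own assumptions):

import java.util.concurrent.{Callable, Executors, FutureTask}

object CallableDemo {
  def main(args: Array[String]): Unit = {
    val threadPool = Executors.newFixedThreadPool(3)
    try {
      // A Callable returns a value; FutureTask lets us fetch it once the thread is done
      val tasks = (1 to 3).map { i =>
        val task = new FutureTask[Int](new Callable[Int] {
          override def call(): Int = i * i
        })
        threadPool.submit(task)
        task
      }
      tasks.foreach(t => println(t.get()))   // get() blocks until each result is ready: 1, 4, 9
    } finally {
      threadPool.shutdown()
    }
  }
}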
To learn about Spark, I started learning Scala. Come on!

A recursion exercise.

Code (why is x declared as Float, yet using 3.0, for example, returns an error?):

def xpown(x: Float, n: Int): Float = {
  if (n == 0) 1
  else if (n > 0) {
    if (n % 2 == 0) xpown(x, n / 2) * xpown(x, n / 2)
    else x * xpown(x, n - 1)
  }
  else 1 / xpown(x, -n)
}
println(xpown(10, -2))

Question: the parameter x is clearly declared as Float, but when I call the function with x = 3.0 it fails. Why?
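A likely answer (my note, not part of the original post): the literal 3.0 is a Double, and Scala does not implicitly narrow Double to Float, so the call does not type-check; a Float literal (3.0f) or a Double parameter fixes it.

object FloatLiteralDemo {
  def xpown(x: Float, n: Int): Float =
    if (n == 0) 1
    else if (n > 0) {
      if (n % 2 == 0) xpown(x, n / 2) * xpown(x, n / 2) else x * xpown(x, n - 1)
    }
    else 1 / xpown(x, -n)

  def main(args: Array[String]): Unit = {
    // println(xpown(3.0, 2))   // does not compile: 3.0 is a Double, no implicit Double -> Float narrowing
    println(xpown(3.0f, 2))     // 9.0 -- a Float literal works
    println(xpown(3, 2))        // 9.0 -- an Int argument widens to Float automatically
  }
}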
One day of Scala code (16)

To better harness Spark, I have recently been learning more about Scala's language features, focusing on "Learn Scala Fast", and writing down some code that I think is useful.

package examples

object Example16 {
  // Scala operating on XML (the body of the example is truncated in the source)
}
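Since the XML example itself is cut off, here is a minimal hedged sketch of the kind of XML handling Scala 2.10-era code does out of the box with XML literals and the \ / \\ selectors (the document contents below are my own):

object XmlDemo {
  def main(args: Array[String]): Unit = {
    // An XML literal builds a scala.xml.Elem at compile time
    val books =
      <books>
        <book title="Learn Scala Fast"/>
        <book title="Spark in Action"/>
      </books>

    // \ selects direct children; \ "@attr" reads an attribute; \\ would search the whole subtree
    val titles = (books \ "book").map(b => (b \ "@title").text).toList
    println(titles)   // List(Learn Scala Fast, Spark in Action)
  }
}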
9.2 The previous example demonstrated that higher-order functions can help reduce code duplication when you implement an API. Another important use of higher-order functions is to put them into the API itself, to make client code more concise. A good example is provided by the special-purpose looping methods of Scala's collection types.
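A small sketch of that idea (reconstructed from memory of the usual illustration, so treat the names as assumptions rather than the excerpt's own code): a higher-order method such as exists lets the client supply only the predicate instead of writing the loop by hand.

object HigherOrderDemo {
  // Hand-written loop: the caller manages iteration and a mutable flag
  def containsNeg(nums: List[Int]): Boolean = {
    var exists = false
    for (num <- nums)
      if (num < 0) exists = true
    exists
  }

  // Higher-order method in the collections API: only the predicate is supplied
  def containsNegConcise(nums: List[Int]): Boolean = nums.exists(_ < 0)

  def main(args: Array[String]): Unit = {
    println(containsNeg(List(1, 2, -3)))          // true
    println(containsNegConcise(List(1, 2, 3)))    // false
  }
}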
Debugging Java code on Linux, like debugging C/C++ code with GDB, requires a debugger: for the JVM that tool is jdb. Articles on using jdb with Java are easy to find online, but material on debugging Scala with it is relatively scarce. In fact, the general debugging process is the same; you just need to pay attention to a few details.
Modify the Scala code auto-completion shortcut key. I am used to Eclipse's code-completion shortcut Alt+/, but that key combination has no effect in IntelliJ IDEA, so let's change it: press Ctrl+Alt+S to open the Settings dialog and select Eclipse under Keymap.