Some introductory materials for Spark

Tags: scala-ide, scala, tutorial



A Scala Tutorial for Java programmers

http://docs.scala-lang.org/tutorials/scala-for-java-programmers.html


Learning Resources (Video tutorials, books, examples, etc.)

spark.apache.org/documentation.html

Getting Started Guide

spark.apache.org/docs/latest/quick-start.html

Programming Guide

spark.apache.org/docs/latest/programming-guide.html

The examples bundled with the official repository (excellent; many come in both a local version and a Spark version)

https://github.com/apache/spark/tree/master/examples/src/main/scala/org/apache/spark/examples

Running a Spark application (using the spark-submit command, which under the hood is a Java invocation)

spark.apache.org/docs/latest/submitting-applications.html

./spark-submit --class "SimpleApp" --master local[4] /home/linger/scala_target/simpleapp.jar
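The jar in the spark-submit call above would contain a class like the following. This is only a sketch modeled on the official quick-start guide, assuming the Spark 1.x API (SparkConf/SparkContext); the input path is a placeholder you would point at a real file.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal "SimpleApp" matching the spark-submit call above:
// counts lines containing "a" and "b" in a text file.
object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Placeholder path -- replace with a file that exists on your machine.
    val logData = sc.textFile("/path/to/README.md").cache()

    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println(s"Lines with a: $numAs, Lines with b: $numBs")

    sc.stop()
  }
}
```

The master is deliberately not hard-coded in setAppName/SparkConf here, so spark-submit's --master flag (local[4] above) decides where the job runs.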

The examples that ship with the official distribution can be run with run-example, which wraps the spark-submit command.

./run-example SparkPi

The run-example script contains the line EXAMPLE_MASTER=${MASTER:-"local[*]"}, so the default master is local[*].

Given that shell expansion, one way to change the master is to export MASTER=local (or another value) before calling the script.

Whether there are other ways to pass the master variable is unclear.
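The override described above can be seen directly in the shell. A minimal sketch, using the variable names quoted from the run-example script:

```shell
# The script falls back to local[*] only when MASTER is unset:
unset MASTER
EXAMPLE_MASTER=${MASTER:-"local[*]"}
echo "$EXAMPLE_MASTER"    # local[*] -- the default when MASTER is unset

# Exporting MASTER before invoking the script overrides the default:
export MASTER=local
EXAMPLE_MASTER=${MASTER:-"local[*]"}
echo "$EXAMPLE_MASTER"    # local
```

This is the standard POSIX ${var:-default} expansion: the default applies only when the variable is unset or empty, which is why exporting MASTER first is enough.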

Spark Development Environment Setup (good)

http://blog.csdn.net/wankunde/article/details/41843217

http://bit1129.iteye.com/blog/2172164

When setup was complete, Eclipse reported an error saying the project depends on two Scala libraries: the system-installed Scala (C:\Scala_, version 2.11.4) and the one inside spark-assembly-1.2.0-hadoop2.4.0.jar, whose version is 2.10.4. The two versions are inconsistent and cause a conflict.

Since our code runs inside Spark, and the Scala version Spark uses is 2.10.4, the development environment should also use Scala 2.10.4; therefore the 2.11.4 Scala library was removed from the Java Build Path.

After the deletion, Eclipse still reported an error, this time meaning that the Scala version the project depends on is older than the IDE's Scala version.

Right-click the Scala project, choose Scala from the context menu, then Set the Scala Installation in the submenu, and in the dialog that pops up select Fixed Scala Installation: 2.10.4 (bundled).

Clean the entire project; at this point the Scala IDE environment is fully configured.

An error that occurred when running under Scala Eclipse:

java.lang.ClassNotFoundException

Scala IDE

http://scala-ide.org/download/sdk.html


SBT is the build tool for Scala

www.scala-sbt.org/documentation.html

SBT installation is a bit of a pain; even after SBT itself is installed, it still has to download a lot of things.

http://www.zhihu.com/question/23245141

A handbook for getting SBT past the GFW

http://afoo.me/posts/2014-11-05-how-make-sbt-jump-over-GFW.html

Building an SBT project in an offline environment

http://shzhangji.com/blog/2014/11/07/sbt-offline/

Scala SBT fails to download files the first time it runs

http://mooc.guokr.com/note/5879/

http://segmentfault.com/blog/zhongl/1190000002474507

Speeding up SBT's download of dependent libraries

http://ju.outofmemory.cn/entry/74281

Cursing SBT a thousand times: adding a global mirror repository
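One common way to add a global mirror, the topic of the line above, is the sbt launcher's repositories file. A sketch, assuming sbt 0.13 or later; the mirror URL below is only an example, so substitute whichever mirror is fast from your network:

```
# ~/.sbt/repositories -- resolvers the sbt launcher tries, in order
[repositories]
  local
  my-mirror: https://maven.aliyun.com/repository/public
  maven-central
```

To make sbt ignore resolvers declared inside builds and use only this file, it can be launched with -Dsbt.override.build.repos=true.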

Spark and machine learning technology blogs

http://www.cnblogs.com/fxjwind/

http://blog.csdn.net/sunbow0

----------------------------------------------------------------------

A summary of Spark run/debug methods and learning resources

http://blog.csdn.net/melodyishere/article/details/32353929

Installing the Scala plugin in IntelliJ IDEA and creating a "Scala with SBT" project

http://8liang.cn/intellijidea-install-scala-plugin-create-with-sbt-project/

Building an IntelliJ IDEA development environment for Apache Spark

http://8liang.cn/intellij-idea-spark-development/

Remote debugging of Spark job code (source) from the IDE

http://www.iteblog.com/archives/1192

Building a Spark integrated development environment with Eclipse

http://datalab.int-yt.com/archives/505

Apache Spark learning: building a Spark integrated development environment with Eclipse

http://dongxicheng.org/framework-on-yarn/spark-eclipse-ide/


Some Spark configuration information

http://spark.apache.org/docs/latest/configuration.html


Author: linger

Original link: http://blog.csdn.net/lingerlanlan/article/details/46430915



