Build the Spark stand-alone development environment on Ubuntu 16.04 (JDK + Scala + Spark)


1. Preparation

This article explains how to set up a Spark 2.2.1 stand-alone development environment on Ubuntu 16.04. The process is divided into three parts: JDK installation, Scala installation, and Spark installation.

    1. JDK 1.8: jdk-8u171-linux-x64.tar.gz
    2. Scala 2.11.12: scala-2.11.12.tgz
    3. Spark 2.2.1: spark-2.2.1-bin-hadoop2.7.tgz

It is important to note that the Spark version and the Scala version need to match.

Note: starting with version 2.0, Spark is built with Scala 2.11 by default. Scala 2.10 users need to download the Spark source package and build it with Scala 2.10 support.
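This same version pairing applies to any project you later compile against Spark. As an illustrative sketch (the project name here is hypothetical), a minimal sbt build file for this setup could look like the following; the %% operator appends the Scala binary version (_2.11) to the artifact name, which is why scalaVersion and the Spark dependency must agree:

// build.sbt — minimal sketch; project name is hypothetical
name := "spark-sandbox"

scalaVersion := "2.11.12"

// %% resolves to spark-core_2.11, matching the scalaVersion above
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"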

2. JDK Installation

After downloading jdk-8u171-linux-x64.tar.gz, extract it to your designated directory.

$ sudo mkdir /usr/local/java
$ sudo tar -zxvf jdk-8u171-linux-x64.tar.gz -C /usr/local/java

Next, configure the environment variables. Open the profile file:

$ sudo gedit /etc/profile

Append the following at the end of the file; note that JAVA_HOME is the JDK installation path (this tarball extracts to a directory named jdk1.8.0_171):

export JAVA_HOME=/usr/local/java/jdk1.8.0_171
export PATH=${JAVA_HOME}/bin:$PATH

After saving and exiting, run the following command to make the modified environment variables take effect:

$ source /etc/profile

Check that Java was installed successfully:

$ java -version
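If the JDK is correctly on your PATH, the command prints a version banner along these lines (build identifiers elided, as they vary by release):

java version "1.8.0_171"
Java(TM) SE Runtime Environment (build ...)
Java HotSpot(TM) 64-Bit Server VM (build ..., mixed mode)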

3. Scala Installation

After downloading scala-2.11.12.tgz, extract it to your designated directory.

$ sudo mkdir /usr/local/scala
$ sudo tar -zxvf scala-2.11.12.tgz -C /usr/local/scala

Next, configure the environment variables. Open the profile file:

$ sudo gedit /etc/profile

Append the following at the end of the file; note that SCALA_HOME is the Scala installation path:

export SCALA_HOME=/usr/local/scala/scala-2.11.12
export PATH=${SCALA_HOME}/bin:$PATH

After saving and exiting, run the following command to make the modified environment variables take effect:

$ source /etc/profile

Check that Scala was installed successfully:

$ scala -version
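Beyond the version flag, you can launch the REPL and evaluate an expression to confirm the installation works end to end; a minimal smoke test (banner text abbreviated):

$ scala
Welcome to Scala 2.11.12 ...

scala> 1 + 1
res0: Int = 2

scala> :quit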

4. Spark Installation

After downloading spark-2.2.1-bin-hadoop2.7.tgz, extract it to your designated directory.

$ sudo mkdir /usr/local/spark
$ sudo tar -zxvf spark-2.2.1-bin-hadoop2.7.tgz -C /usr/local/spark

Next, configure the environment variables. Open the profile file:

$ sudo gedit /etc/profile

Append the following at the end of the file; note that SPARK_HOME is the Spark installation path:

export SPARK_HOME=/usr/local/spark/spark-2.2.1-bin-hadoop2.7
export PATH=${SPARK_HOME}/bin:$PATH

After saving and exiting, run the following command to make the modified environment variables take effect:

$ source /etc/profile

Check that Spark was installed successfully by launching the Spark shell:

$ spark-shell
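Once the shell starts, it provides a ready-made SparkContext bound to sc (and, in Spark 2.x, a SparkSession bound to spark). A quick sanity check is to sum the integers 1 through 100, which should evaluate to 5050:

scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050

scala> :quit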

