Install Spark Notes

Source: Internet
Author: User
Tags: spark notes

CentOS

Prepare three machines: hadoop-1, hadoop-2, and hadoop-3.

Install the JDK and Python, set host names, and configure SSH on each machine in advance.
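As a quick sanity check before continuing, a small shell loop can confirm the prerequisite tools are on the PATH (a minimal sketch; `java`, `python`, and `ssh` are the usual command names and may differ on your machines):

```shell
# Check that the prerequisite commands are available on this node.
# The command names below are the common defaults (an assumption).
for cmd in java python ssh; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: OK"
  else
    echo "$cmd: MISSING"
  fi
done
```

Run it on hadoop-1, hadoop-2, and hadoop-3; fix any MISSING line before installing Spark.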

Install Scala

Download the Scala RPM package

Under /home/${user}/soft/:

wget http://www.scala-lang.org/files/archive/scala-2.9.3.rpm (not used; the installation directory could not be found after installing)

rpm -ivh scala-2.9.3.rpm

Instead, pick a stable version from http://www.scala-lang.org/download/all.html and download it.

Unpack the downloaded Scala package with tar -zxvf.
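Since the environment variable below points SCALA_HOME at /home/${user}/soft/scala, it is convenient to unpack and then symlink the versioned directory to that name (a sketch; SCALA_TARBALL is a placeholder for whichever version you downloaded, not a name from the original notes):

```shell
cd /home/${USER}/soft
# SCALA_TARBALL stands for the tarball you downloaded (assumption), e.g. a scala-2.x.y.tgz
tar -zxvf "$SCALA_TARBALL"
# Strip the .tgz suffix to get the unpacked directory name, then link it as "scala"
ln -s "${SCALA_TARBALL%.tgz}" scala
```

The symlink lets SCALA_HOME stay stable across Scala upgrades.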

Add Scala environment variables

Append at the end of /etc/profile:

export SCALA_HOME=/home/${user}/soft/scala

export PATH=$PATH:$SCALA_HOME/bin

Make the configuration take effect immediately:

source /etc/profile

Verifying the Scala installation

scala -version

Then launch the REPL:

scala

scala> 9*9
res0: Int = 81

Install Spark

Get Spark Pack

wget http://mirror.bit.edu.cn/apache/spark/spark-1.5.2/spark-1.5.2.tgz (not used; this source package produced errors — the pre-compiled package is required)

Under /home/${user}/soft/:

wget http://mirror.bit.edu.cn/apache/spark/spark-1.5.2/spark-1.5.2-bin-hadoop2.6.tgz

Unpack it: tar -zxvf spark-1.5.2-bin-hadoop2.6.tgz

Add the Spark environment variables

Append at the end of /etc/profile:

export SPARK_HOME=/home/${user}/soft/spark-1.5.2-bin-hadoop2.6

export PATH=$PATH:$SPARK_HOME/bin

Make the configuration take effect immediately:

source /etc/profile
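Putting the Scala and Spark additions together, the tail of /etc/profile ends up roughly like this (a sketch that merges the two PATH appends into one line; the paths match the steps above):

```shell
# /etc/profile additions for Scala and Spark
export SCALA_HOME=/home/${user}/soft/scala
export SPARK_HOME=/home/${user}/soft/spark-1.5.2-bin-hadoop2.6
export PATH=$PATH:$SCALA_HOME/bin:$SPARK_HOME/bin
```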

Modifying the Spark Configuration

Modify the spark-env.sh configuration

Copy the template file: cp spark-env.sh.template spark-env.sh

Edit spark-env.sh and add:

#scala
export SCALA_HOME=/home/${user}/soft/scala

#jdk
export JAVA_HOME=/usr/java/jdk1.7.0_51

# master node IP
export SPARK_MASTER_IP=10.171.29.191

# working memory for each node
export SPARK_WORKER_MEMORY=512m

Configure the slaves file under conf

cp slaves.template slaves

Edit slaves

and add:

hadoop-1

hadoop-2

hadoop-3

Distribute to other machines

scp the configured Scala and Spark directories to the other machines, hadoop-2 and hadoop-3.
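The distribution step can be scripted. The sketch below only echoes the scp commands (DRY_RUN=echo); clear DRY_RUN to actually copy. The directory layout matches the earlier steps, and the host names are this cluster's:

```shell
# Distribute the configured Scala and Spark directories to the worker nodes.
# DRY_RUN=echo prints each command instead of running it; set DRY_RUN= to execute.
DRY_RUN=echo
SOFT_DIR=/home/${USER}/soft
for host in hadoop-2 hadoop-3; do
  $DRY_RUN scp -r "$SOFT_DIR/scala" "$host:$SOFT_DIR/"
  $DRY_RUN scp -r "$SOFT_DIR/spark-1.5.2-bin-hadoop2.6" "$host:$SOFT_DIR/"
done
```

Note that the /etc/profile additions also need to be applied on each worker.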

Start Spark

Run start-all.sh under the sbin directory.

Verifying the Spark environment

$ bin/run-example org.apache.spark.examples.SparkPi

View http://hadoop-1:8080/

Start spark-shell

View http://hadoop-1:4040/jobs/

