Upgrading a Spark cluster from 1.6.2 to 2.0.2

Reference Documents

Spark 2.0 distributed cluster environment setup: http://dblab.xmu.edu.cn/blog/1187-2/
N ways to kill a process under Linux: http://blog.csdn.net/andy572633/article/details/7211546

Stop the running cluster first with sbin/stop-all.sh.
Rename the original Spark directory to spark1.6.2

sudo mv /usr/local/spark /usr/local/spark1.6.2
Install Spark 2.0.2 on the master

Download the installation package spark-2.0.2-bin-without-hadoop.tgz
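If you still need to fetch the package, old releases remain on the Apache archive. The URL below is an assumption based on the standard archive layout, and ~/download matches the directory used in the tar command that follows:

wget -P ~/download https://archive.apache.org/dist/spark/spark-2.0.2/spark-2.0.2-bin-without-hadoop.tgz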

sudo tar -zxf ~/download/spark-2.0.2-bin-without-hadoop.tgz -C /usr/local/
cd /usr/local
sudo mv ./spark-2.0.2-bin-without-hadoop/ ./spark
sudo chown -R hadoop ./spark
vim ~/.bashrc

Add the following configuration to ~/.bashrc:

export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
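Then reload the file so the new variables take effect in the current shell:

source ~/.bashrc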
Configure the slaves file on the master

Copy slaves.template to slaves:

cd /usr/local/spark/
cp ./conf/slaves.template ./conf/slaves

Add the worker hostnames to conf/slaves, one per line.
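For instance, assuming the ten workers n01 through n10 that appear in the distribution step below, conf/slaves would contain:

n01
n02
n03
n04
n05
n06
n07
n08
n09
n10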
spark-env.sh

Copy spark-env.sh.template to spark-env.sh:

cp ./conf/spark-env.sh.template ./conf/spark-env.sh

Add the following content:

export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export SPARK_MASTER_IP=xxx   # replace xxx with the master's actual IP
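Before relying on SPARK_DIST_CLASSPATH, it is worth checking that the hadoop binary actually resolves a classpath at this path:

/usr/local/hadoop/bin/hadoop classpath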
Distribute to Slaves

Send the Spark folder to each node:

cd /usr/local/
tar -zcf ~/spark.master.tar.gz ./spark
cd ~
scp ./spark.master.tar.gz n01:/home/hadoop
scp ./spark.master.tar.gz n02:/home/hadoop
scp ./spark.master.tar.gz n03:/home/hadoop
scp ./spark.master.tar.gz n04:/home/hadoop
scp ./spark.master.tar.gz n05:/home/hadoop
scp ./spark.master.tar.gz n06:/home/hadoop
scp ./spark.master.tar.gz n07:/home/hadoop
scp ./spark.master.tar.gz n08:/home/hadoop
scp ./spark.master.tar.gz n09:/home/hadoop
scp ./spark.master.tar.gz n10:/home/hadoop
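The ten scp lines can also be collapsed into a loop; this is just a convenience sketch that assumes the n01..n10 naming:

for i in $(seq -w 1 10); do
    scp ./spark.master.tar.gz n$i:/home/hadoop   # seq -w zero-pads: 01, 02, ... 10
done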

On each of n01 through n10, execute:

sudo rm -rf /usr/local/spark/
sudo tar -zxf ~/spark.master.tar.gz -C /usr/local
sudo chown -R hadoop /usr/local/spark
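If the hadoop user has passwordless sudo on the workers, the same three steps can be driven from the master in one loop (a sketch under that assumption):

for i in $(seq -w 1 10); do
    # remove the old install, unpack the new one, and fix ownership on each worker
    ssh n$i "sudo rm -rf /usr/local/spark/ && sudo tar -zxf ~/spark.master.tar.gz -C /usr/local && sudo chown -R hadoop /usr/local/spark"
done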
Start the Spark cluster
cd /usr/local/spark/
sbin/start-master.sh
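start-master.sh only launches the master daemon. To also bring up the workers listed in conf/slaves, Spark's sbin provides start-slaves.sh (start-all.sh does both in one go):

sbin/start-slaves.sh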

Open http://m01:8080. In the upper-left corner of the web UI you should see that the version now reads 2.0.2.

If jps shows more than one Master process (the web UI then binds to 8081 or another port instead of 8080):
1. Stop the cluster first with sbin/stop-all.sh.
2. Use jps to find the PID of the redundant Master.
3. Kill that Master process.

ps -ef | grep Master
kill -s 9 master_pid   # substitute the PID found above

Here -s 9 specifies that signal 9 (SIGKILL) is sent to the process, forcibly terminating it immediately.
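If you would rather not pick the PID out of the ps output by hand, pgrep can match the standalone master's main class directly; then kill -9 the PID of the redundant instance it reports:

pgrep -f org.apache.spark.deploy.master.Master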
And all is quiet again.
