Spark cluster configuration

Configuring a Spark cluster on top of Hadoop HDFS.

#switch to the install user and unpack Spark under /u01
su - rdato
cd /u01
tar -zxvf spark-2.1.1-bin-hadoop2.7.tgz
mv spark-2.1.1-bin-hadoop2.7 spark
#copy the configuration templates
cp /u01/spark/conf/spark-env.sh.template /u01/spark/conf/spark-env.sh
cp /u01/spark/conf/slaves.template /u01/spark/conf/slaves
#config spark env
cat >> /u01/spark/conf/spark-env.sh << EOF
export JAVA_HOME=/usr/java/jdk1.8.0_131
export SCALA_HOME=/usr/share/scala
export SPARK_HOME=/u01/spark
export HADOOP_CONF_DIR=/u01/hadoop/etc/hadoop
EOF
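
These settings must be present on every node; HADOOP_CONF_DIR points Spark at the Hadoop client configuration so jobs can read hdfs:// paths. A minimal sketch for pushing the configured directory to the other nodes, assuming the same /u01 layout everywhere and that sparkgc2 and sparkgc3 are the remaining hosts from the slaves file below:

#copy the configured Spark directory to the other nodes
for host in sparkgc2 sparkgc3; do
  scp -r /u01/spark ${host}:/u01/
done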
#config slaves: vi /u01/spark/conf/slaves
sparkgc1
sparkgc2
sparkgc3
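
start-all.sh launches a Worker on each host listed in slaves by logging in over SSH, so the install user needs passwordless SSH from the master to every node. A minimal sketch, assuming no key pair exists yet for that user:

#generate a key and push it to every node (run once as the install user)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
for host in sparkgc1 sparkgc2 sparkgc3; do
  ssh-copy-id ${host}
done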

#start the Spark cluster (run on the master)
/u01/spark/sbin/start-all.sh
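
After start-all.sh, a Master JVM should be running on this node and a Worker on each host in slaves. A quick check with jps, which ships with the JDK configured above; Master and Worker are the process names Spark's standalone daemons report:

#verify the daemons
jps                 #on the master: expect a Master process
ssh sparkgc2 jps    #on a worker: expect a Worker process (jps must be on the non-interactive PATH)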

#Spark master web UI (default port 8080)
http://192.168.168.141:8080/
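
To confirm the cluster actually schedules work, submit the SparkPi example that ships in the distribution. A minimal sketch, assuming sparkgc1 (192.168.168.141) is the master and the standalone master listens on the default port 7077:

#smoke test with the bundled SparkPi example
/u01/spark/bin/spark-submit \
  --master spark://sparkgc1:7077 \
  --class org.apache.spark.examples.SparkPi \
  /u01/spark/examples/jars/spark-examples_2.11-2.1.1.jar 100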