Scala must be installed before Spark; its installation was covered in the Hadoop installation and configuration guide.
1. Download the Spark installation package from:
http://spark.apache.org/downloads.html
I chose spark-1.4.1-bin-hadoop2.6.tgz and put it in /root/software.
Extract it:
tar zxvf spark-1.4.1-bin-hadoop2.6.tgz
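If you prefer to fetch the package from the command line, here is a minimal sketch; the archive.apache.org path below is my assumption about where this release is mirrored:
cd /root/software
wget http://archive.apache.org/dist/spark/spark-1.4.1/spark-1.4.1-bin-hadoop2.6.tgz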
2. Configure the system environment variables
vim /etc/profile
Add:
export SPARK_HOME=/root/sherry/spark-1.4.1-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
Then reload the profile:
source /etc/profile
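To check that the variables took effect in the current shell, you can echo SPARK_HOME and ask Spark for its version (spark-submit supports a --version flag that prints the build banner):
echo $SPARK_HOME
spark-submit --version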
3. Configure the Spark configuration files
1) Edit spark-env.sh (the binary distribution ships only spark-env.sh.template, so copy it first):
cd spark-1.4.1-bin-hadoop2.6/
cd conf
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
Add the following content:
export SCALA_HOME=/root/sherry/scala-2.10.4
export JAVA_HOME=/usr/java/jdk1.7.0_45
export SPARK_WORKER_MEMORY=1G
export SPARK_MASTER_IP=10.118.46.22
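Other standard spark-env.sh variables can be set in the same place if needed; SPARK_WORKER_CORES, SPARK_MASTER_PORT, and SPARK_MASTER_WEBUI_PORT are all recognized by the standalone scripts, and the values below are only illustrative (7077 and 8080 are the defaults):
export SPARK_WORKER_CORES=2
export SPARK_MASTER_PORT=7077
export SPARK_MASTER_WEBUI_PORT=8080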
2) Configure the slaves file
I did not configure this myself; a sketch of what it would look like follows.
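For a multi-node cluster, conf/slaves lists one worker hostname per line; like spark-env.sh, it is created by copying the template. A minimal sketch, where worker1 and worker2 are hypothetical hostnames:
cp slaves.template slaves
vim slaves
Contents of conf/slaves (one worker per line):
worker1
worker2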
4. Start the cluster
./sbin/start-all.sh
Shut down:
./sbin/stop-all.sh
Check whether the installation succeeded:
jps
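If the standalone daemons are running on this machine, jps should list a Master and a Worker process (the PIDs below will differ):
12345 Master
12400 Worker
The master web UI should also be reachable at http://10.118.46.22:8080 (8080 is the default port). As a final smoke test, you can start spark-shell and run a trivial job; the sum below should come back as 5050.0:
spark-shell
scala> sc.parallelize(1 to 100).sum()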