Environment:
Ubuntu 12.04
Hadoop 2.2.x
Spark 0.9
Scala 2.9.0 (scala-2.9.0.final.tgz)
Steps
1. Download Scala
2. Unzip Scala, then edit /etc/profile and add, for example, the following
export SCALA_HOME=/home/software/scala-2.9.0.final
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:/home/software/eclipse:$ANT_HOME/bin:$SQOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
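To make the new variables take effect in the current shell and confirm that Scala is on the PATH, a quick check such as the following can be used (a small addition, not part of the original steps):
source /etc/profile
scala -version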
3. Download Spark
Version number: spark-0.9.0-incubating-bin-hadoop2.tgz
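Assuming the tarball has been downloaded to the current directory and /opt/spark is the target path used in the next step, unpacking could look like this (paths are examples, adjust as needed):
tar -xzf spark-0.9.0-incubating-bin-hadoop2.tgz
sudo mv spark-0.9.0-incubating-bin-hadoop2 /opt/spark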
4. Edit /etc/profile
export SPARK_HOME=/opt/spark
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:/home/software/eclipse:$ANT_HOME/bin:$SQOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
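As before, reload the profile and spot-check that SPARK_HOME resolves to the install directory (a sanity check added here, not in the original write-up):
source /etc/profile
echo $SPARK_HOME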
5. Enter the conf/ directory and make changes such as the following
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
export SCALA_HOME=/home/software/scala-2.9.0.final
export JAVA_HOME=/home/software/jdk1.7.0_55
export SPARK_MASTER_IP=172.16.2.104
export SPARK_WORKER_MEMORY=1000m
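SPARK_MASTER_IP should be an address of the machine that will run the master; on Ubuntu 12.04 one way to confirm it is (a hedged check, not from the original steps):
ifconfig | grep "inet addr"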
6. vim conf/slaves (see the SSH note after the host list)
localhost
datanode1
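start-all.sh logs into each host listed in conf/slaves over SSH, so passwordless SSH from the master and an identical Spark directory on every worker are assumed. One possible setup, using the datanode1 host name and /opt/spark path from the steps above:
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
ssh-copy-id datanode1
rsync -a /opt/spark/ datanode1:/opt/spark/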
7. Start/stop Spark
sbin/start-all.sh
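To shut the cluster down later, the matching stop script can be used, and jps is a quick way to check that the Master and Worker daemons are running (added here for completeness):
sbin/stop-all.sh
jps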
8. Browse the Master UI
http://robinson-ubuntu:8080
9. Run an example (local mode)
run-example org.apache.spark.examples.SparkPi local
10. Run an example against the cluster
run-example org.apache.spark.examples.SparkPi spark://172.16.2.104:7077
11. Run another example
run-example org.apache.spark.examples.SparkLR spark://172.16.2.104:7077
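For interactive testing against the same cluster, the Spark 0.9 shell can be pointed at the master through the MASTER environment variable (the IP and port are the ones configured above):
MASTER=spark://172.16.2.104:7077 bin/spark-shell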
References:
http://www.tuicool.com/articles/NB3imuY
http://blog.csdn.net/myboyliu2007/article/details/17174363