Install Spark 1.4.0 under Ubuntu

Built on top of Hadoop 2.6.0.
1. Download spark-1.4.0-bin-hadoop2.6.tgz from the official Spark website
2. Extract it into the folder of your choice: tar zxvf spark-1.4.0-bin-hadoop2.6.tgz
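Steps 1-2 can be sketched as shell commands. The download URL below is an assumption (check spark.apache.org for the official mirrors), and a locally built stand-in tarball with the same name is used so the extraction flags can be demonstrated without a network download:

```shell
# Step 1 (assumed archive mirror URL -- verify on spark.apache.org):
#   wget https://archive.apache.org/dist/spark/spark-1.4.0/spark-1.4.0-bin-hadoop2.6.tgz
# Build a stand-in tarball with the same name and layout for demonstration:
WORK=$(mktemp -d) && cd "$WORK"
mkdir -p spark-1.4.0-bin-hadoop2.6/bin
echo '#!/bin/sh' > spark-1.4.0-bin-hadoop2.6/bin/spark-shell
tar czf spark-1.4.0-bin-hadoop2.6.tgz spark-1.4.0-bin-hadoop2.6
rm -r spark-1.4.0-bin-hadoop2.6

# Step 2: z = decompress with gzip, x = extract, v = verbose, f = archive file
tar zxvf spark-1.4.0-bin-hadoop2.6.tgz
ls spark-1.4.0-bin-hadoop2.6/bin
```

Extraction recreates the `spark-1.4.0-bin-hadoop2.6` directory in the current folder, so run the command from wherever you want Spark installed.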
3. Configure the profile
sudo gedit /etc/profile
Add the path configuration at the bottom of the file, save and exit, then run source /etc/profile to make the environment take effect:
export SPARK_HOME=/home/jiahong/spark-1.4.0-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
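A minimal sketch of step 3, written against a temporary file standing in for /etc/profile so it can run anywhere; the SPARK_HOME path is the example install location from step 2 and should be adjusted to your own:

```shell
# Step 3 sketch: append the two export lines to a profile file, then source it.
# A temp file stands in for /etc/profile here; the path below is the example
# install location from step 2 -- adjust it to yours.
PROFILE=$(mktemp)
cat >> "$PROFILE" <<'EOF'
export SPARK_HOME=/home/jiahong/spark-1.4.0-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
EOF
. "$PROFILE"          # same effect as: source /etc/profile
echo "$SPARK_HOME"    # the variable is now visible in this shell
```

After sourcing, `spark-shell` and the other scripts in `$SPARK_HOME/bin` can be invoked from any directory.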
4. Start Spark
jiahong@jiahong-optiplex-7010:~$ cd spark-1.4.0-bin-hadoop2.6/
jiahong@jiahong-optiplex-7010:~/spark-1.4.0-bin-hadoop2.6$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/jiahong/spark-1.4.0-bin-hadoop2.6/sbin/../logs/spark-jiahong-org.apache.spark.deploy.master.Master-1-jiahong-optiplex-7010.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /home/jiahong/spark-1.4.0-bin-hadoop2.6/sbin/../logs/spark-jiahong-org.apache.spark.deploy.worker.Worker-1-jiahong-optiplex-7010.out
5. Access the Spark web interface
Open http://localhost:8080 in a browser to see the standalone master's status page, which lists the running workers.