Ubuntu Basic Environment Configuration
JDK download address: http://www.oracle.com/technetwork/java/javase/downloads/index.html
Scala download address: http://www.scala-lang.org/
Spark download address: http://spark.apache.org/downloads.html
python@ubuntu:~$ sudo gedit /etc/profile
Add the following at the end of the file:
# Set JDK environment variables
export JAVA_HOME=/opt/jdk1.8.0_45
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:$PATH
# Set Scala environment variables
export SCALA_HOME=/opt/scala-2.11.6
export PATH=${SCALA_HOME}/bin:$PATH
# Set Spark environment variables
export SPARK_HOME=/opt/spark-hadoop/
# Set PYTHONPATH so Python can find PySpark
export PYTHONPATH=/opt/spark-hadoop/python
Restart the computer to make the /etc/profile changes take effect permanently. To apply them only in the current session, open a command window and run `source /etc/profile`; the changes then take effect in that window.
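As a quick sanity check, the nested variable references in the profile above can be expanded in a throwaway shell session to confirm the paths resolve as intended (the paths are the tutorial's example install locations; substitute your own):

```shell
# Re-declare the JDK variables from /etc/profile and inspect the expansions.
export JAVA_HOME=/opt/jdk1.8.0_45
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib

echo "$JRE_HOME"    # prints /opt/jdk1.8.0_45/jre
echo "$CLASSPATH"   # prints .:/opt/jdk1.8.0_45/lib:/opt/jdk1.8.0_45/jre/lib
```

If the echoed paths do not match the directories where the archives were actually unpacked, the later `java`, `scala`, and `spark` commands will not be found.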
Test the installation results
Run `scala`; if no error messages appear during startup and the `scala>` prompt is shown (as above), Scala has started successfully.
Test that Spark is available.
Open a command-line window and run `python`; the Python version here is 2.7.6. Note that this build of Spark does not support Python 3.
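Before importing PySpark, it can help to confirm that the interpreter actually sees the PySpark source directory. A minimal sketch, assuming Spark was unpacked to /opt/spark-hadoop as in the PYTHONPATH setting above (adjust the path otherwise):

```python
# Check that the PySpark directory from PYTHONPATH is on the module search path.
import sys

spark_python = "/opt/spark-hadoop/python"  # value exported as PYTHONPATH above
if spark_python not in sys.path:
    # Same effect as the PYTHONPATH export, but scoped to this interpreter.
    sys.path.insert(0, spark_python)

print(spark_python in sys.path)  # prints True
```

If this prints True but `import pyspark` still fails, the likely cause is that Spark was unpacked to a different directory than the one exported in /etc/profile.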
Build the Spark development environment under Ubuntu