1. Installation environment
Hardware: virtual machine
Operating system: CentOS 6.4, 64-bit
IP: 10.51.121.10
Host name: datanode-4
Installation user: root
Hadoop: Hadoop 2.6; for the standalone Hadoop 2.6 installation, see http://www.cnblogs.com/zouzhongfan/p/4309405.html
Hive: Hive 0.13; for the Hive 0.13 installation, see http://www.cnblogs.com/zouzhongfan/p/4309432.html
2. Install Scala
1. Go to http://www.scala-lang.org/download/ and download the Scala version matching your Spark release; Spark 1.2 requires Scala 2.10. This article uses scala-2.10.4.tgz.
2. Unzip and install Scala
1) Run # tar -zxvf scala-2.10.4.tgz to unzip it to /root/spark/scala-2.10.4.
2) Add the following to ~/.bash_profile:
export SCALA_HOME=/root/spark/scala-2.10.4
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$PATH
3) Make the environment variables take effect: # source ~/.bash_profile
3. Verify the installation: enter the scala command at the command line to reach the Scala console.
# scala
Welcome to Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.6.0_45).
Type in expressions to have them evaluated.
Type :help for more information.

scala>
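Because Spark 1.2 is built against Scala 2.10, it can help to check the version in a script rather than by eye. A minimal sketch: the SAMPLE string below stands in for real `scala -version` output, so the check runs even on a machine where Scala is not yet installed (on a configured host, replace SAMPLE with `$(scala -version 2>&1)`):

```shell
# Parse major.minor out of `scala -version`-style output and compare it
# against the 2.10 series that Spark 1.2 expects.
# SAMPLE is a stand-in for the real command output.
SAMPLE='Scala code runner version 2.10.4 -- Copyright 2002-2013, LAMP/EPFL'
MAJOR_MINOR=$(echo "$SAMPLE" | sed -n 's/.*version \([0-9][0-9]*\.[0-9][0-9]*\).*/\1/p')
if [ "$MAJOR_MINOR" = "2.10" ]; then
  echo "Scala $MAJOR_MINOR matches the Spark 1.2 requirement"
else
  echo "Scala $MAJOR_MINOR may not work with Spark 1.2" >&2
fi
```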
3. Install Spark
1. Download the Spark 1.2 binary built for Hadoop 2.4 (spark-1.2.0-bin-hadoop2.4.tgz) from http://spark.apache.org/downloads.html and unzip it to /root/spark/spark-1.2.0-bin-hadoop2.4.
2. Add the following to ~/.bash_profile:
export SPARK_HOME=/root/spark/spark-1.2.0-bin-hadoop2.4
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HIVE_HOME/bin:$PATH
3. Make the environment variables take effect: # source ~/.bash_profile
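The PATH export above packs five bin directories into one line, so a missing `:` separator is easy to overlook. The sketch below builds the same PATH from explicit *_HOME variables and checks that the Spark bin directory made it in; the values match this article except HIVE_HOME, which is a hypothetical placeholder:

```shell
# Build the PATH from explicit *_HOME variables. Paths match this article,
# except HIVE_HOME, which is a hypothetical placeholder.
JAVA_HOME=/usr/lib/jdk1.6.0_45
HADOOP_HOME=/root/hadoop/hadoop-2.6.0
SCALA_HOME=/root/spark/scala-2.10.4
SPARK_HOME=/root/spark/spark-1.2.0-bin-hadoop2.4
HIVE_HOME=/root/hive/hive-0.13        # hypothetical; use your real Hive path
NEW_PATH="$JAVA_HOME/bin:$HADOOP_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin:$HIVE_HOME/bin:$PATH"
# Confirm the Spark bin directory is present as a colon-delimited entry:
case ":$NEW_PATH:" in
  *":$SPARK_HOME/bin:"*) echo "spark bin is on the new PATH" ;;
  *) echo "spark bin missing from the new PATH" >&2 ;;
esac
```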
4. Configure spark
1. Enter the Spark configuration directory: # cd $SPARK_HOME/conf
2. Create spark-env.sh from the template: # cp spark-env.sh.template spark-env.sh
3. Add the following to the spark-env.sh file:
export JAVA_HOME=/usr/lib/jdk1.6.0_45
export SCALA_HOME=/root/spark/scala-2.10.4
export HADOOP_CONF_DIR=/root/hadoop/hadoop-2.6.0/etc/hadoop
5. Start spark
1. Go to the Spark installation directory: # cd /root/spark/spark-1.2.0-bin-hadoop2.4
2. Run the # ./sbin/start-all.sh command.
3. Run the # jps command; Master and Worker processes should appear:
# jps
38907 RunJar
39030 RunJar
54679 NameNode
26587 Jps
54774 DataNode
9850 Worker
9664 Master
55214 NodeManager
55118 ResourceManager
54965 SecondaryNameNode
4. Open the Spark web interface: http://datanode-4:8080/
5. Run the # ./bin/spark-shell command to enter the Spark shell environment; the Spark UI for that session is then visible at http://datanode-4:4040.
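Beyond simply opening the shell, a tiny job confirms the cluster actually computes. The sketch below pipes a one-line Scala job into spark-shell non-interactively, summing the integers 1..100; it is guarded so it degrades gracefully on a machine where spark-shell is not on the PATH:

```shell
# Run a one-line Spark job non-interactively: sum the integers 1..100.
# Guarded so the script still succeeds on a machine without Spark installed.
if command -v spark-shell >/dev/null 2>&1; then
  echo 'println("sum = " + sc.parallelize(1 to 100).reduce(_ + _))' | spark-shell
else
  echo "spark-shell not on PATH; the job would print: sum = 5050"
fi
```

The expected value follows from the arithmetic-series formula 100 * 101 / 2 = 5050, so the driver output is easy to verify by hand.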
Standalone installation of Spark 1.2 based on Hadoop 2.6