1. Node Preparation

Three nodes are used, with the following IP-to-hostname mappings (to be added to /etc/hosts on every node):

192.168.137.129 spslave2
192.168.137.130 spmaster
192.168.137.131 spslave1

2. Modify the Hostname
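The two steps above can be sketched as follows. This is a minimal sketch assuming a systemd-based distribution where hostnamectl is available; run the matching set-hostname command on each node.

```shell
# Append the cluster host mappings to /etc/hosts (run on every node, as root)
cat >> /etc/hosts <<'EOF'
192.168.137.129 spslave2
192.168.137.130 spmaster
192.168.137.131 spslave1
EOF

# Set this node's hostname (example for the master; use spslave1/spslave2 on the workers)
hostnamectl set-hostname spmaster
```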
3. Configure Passwordless SSH Login

First change to the user's home directory (cd ~) and run ls -a to view the files; one of them is ".ssh", the directory that holds the keys. The keys we generate will be placed in this folder. Generate a key pair with:

ssh-keygen -t rsa -P ""

(this generates an RSA key pair with an empty passphrase). Press Enter at each of the three prompts. Change into the folder (cd .ssh; you can run ls to view its files) and append the generated public key id_rsa.pub to authorized_keys:

cat id_rsa.pub >> authorized_keys

Copy each node's public key into every other node's authorized_keys file, then ssh between the nodes to verify that they can connect to each other without a password.
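The key exchange above can be sketched with ssh-copy-id, which appends the local public key to the remote authorized_keys for you. This assumes the same user account exists on every node and that ssh-copy-id is installed; repeat the loop on each node.

```shell
# Generate the key pair non-interactively (no passphrase, default location)
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

# Push this node's public key into each peer's authorized_keys
for host in spmaster spslave1 spslave2; do
    ssh-copy-id "$host"
done

# Verify passwordless login works
ssh spslave1 hostname
```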
4. Install and Configure the JDK

Install JDK 1.7 on all nodes. After the installation is complete, set the environment variables:

export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera/
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

5. Install and Configure Scala
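To make the JDK variables survive a reboot, they can be persisted in a profile script. This is a sketch assuming the Cloudera JDK path used above and that /etc/profile is the chosen location (a file under /etc/profile.d/ works equally well).

```shell
# Persist the JDK environment variables system-wide (run as root on every node)
cat >> /etc/profile <<'EOF'
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera/
export PATH=$JAVA_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
EOF

# Reload the profile in the current shell and verify
source /etc/profile
java -version   # should report 1.7.0_67
```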
Install Scala 2.10.6 on all nodes. After the installation is complete, configure the environment variables:

export SCALA_HOME=/usr/scala-2.10.6/
export PATH=$PATH:$SCALA_HOME/bin

6. Install and Configure Spark

6.1. Download spark1.6.1
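A quick sanity check that the Scala installation and PATH entry work; the version string checked here is an assumption based on the 2.10.6 install above.

```shell
# scala -version prints to stderr, so redirect it before grepping
scala -version 2>&1 | grep 2.10.6

# Run a one-line expression to confirm the interpreter works
scala -e 'println(1 + 1)'   # prints 2
```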
6.2. Configure the Spark environment variables:

export SPARK_HOME=/usr/spark-1.6.0-bin-hadoop2.6
export PATH=$PATH:$SPARK_HOME/bin

6.3. Configure $SPARK_HOME/conf/slaves
First copy slaves.template, rename the copy to slaves, and edit its contents:
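The slaves file lists one worker hostname per line. A minimal sketch, assuming spslave1 and spslave2 are the worker nodes from the node preparation step:

```shell
# Create $SPARK_HOME/conf/slaves from the template, then set its contents
cp "$SPARK_HOME/conf/slaves.template" "$SPARK_HOME/conf/slaves"
cat > "$SPARK_HOME/conf/slaves" <<'EOF'
spslave1
spslave2
EOF
```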
6.4. Configure $SPARK_HOME/conf/spark-env.sh

Likewise, copy spark-env.sh.template, rename the copy to spark-env.sh, and append the following:
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera/
export SPARK_MASTER_IP=spmaster
export SPARK_WORKER_MEMORY=1G
export SCALA_HOME=/usr/scala-2.10.6/

7. Start Spark

Way one: start the master and the workers separately.

Start the master:
./sbin/start-master.sh

Start a worker, passing the master's Spark URL:
./sbin/start-slave.sh spark://spmaster:7077

Way two: start the whole cluster at once:
./sbin/start-all.sh
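After starting, the cluster can be verified as sketched below. This assumes the standalone master's web UI is on its default port 8080 and that the JDK's jps tool is on the PATH.

```shell
# On the master node: a Master JVM should be listed; on workers, a Worker JVM
jps

# The master's web UI (default port 8080) should show the registered workers
curl -s http://spmaster:8080/ | grep -i worker
```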