Linux commands involved in setting up the Spark Environment
Copy a file from one server to another: scp jdk-6u37-linux-x64.bin spark@10.126.45.56:/home/spark/opt
Compress and decompress files with tar. Compress: tar -zcvf java.tar.gz java/; decompress: tar -zxvf java.tar.gz (the -z flag is required for a gzip-compressed .tar.gz archive).
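The tar round trip above can be sketched end to end. This is a minimal demo using a throwaway java/ directory with a dummy file (readme.txt is a placeholder, not part of the real JDK):

```shell
# Create a small directory to archive (demo data only).
mkdir -p java
echo "demo" > java/readme.txt

# Compress: c=create, z=gzip, v=verbose, f=archive file name.
# Without -z the archive would be uncompressed despite the .gz name.
tar -zcvf java.tar.gz java/

# Simulate the directory arriving on another machine.
rm -rf java

# Decompress: x=extract.
tar -zxvf java.tar.gz
cat java/readme.txt                # prints: demo
```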
Configure the Java environment:
Download the JDK package jdk-6u37-linux-x64.bin, make it executable: chmod a+x jdk-6u37-linux-x64.bin, then install it: ./jdk-6u37-linux-x64.bin
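The permission step can be demonstrated in isolation. A minimal sketch using a dummy file (install.bin stands in for the real JDK installer):

```shell
# Create a placeholder installer file; it starts without execute permission.
touch install.bin

# a+x grants execute permission to all classes: user, group, and other.
chmod a+x install.bin

# The file can now be run as ./install.bin (here we only verify the bit).
[ -x install.bin ] && echo "executable"   # prints: executable
```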
Edit vi ~/.bash_profile and add:
# set java env
export JAVA_HOME=/home/spark/opt/java/jdk1.6.0_37
export CLASSPATH=.:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar
PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
export PATH
Save, then run source ~/.bash_profile to apply the changes.
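The reason source is needed: variables set in a sourced file land in the current shell, unlike running the file as a separate script. A sketch of the same profile content, written to a temp file (demo_profile.sh is a stand-in for ~/.bash_profile so the real one is left untouched):

```shell
# Write the article's profile snippet to a demo file.
cat > demo_profile.sh <<'EOF'
# set java env
export JAVA_HOME=/home/spark/opt/java/jdk1.6.0_37
export CLASSPATH=.:$JAVA_HOME/jre/lib:$JAVA_HOME/lib/tools.jar
PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
export PATH
EOF

# Sourcing runs the file in the CURRENT shell, so the exports persist here.
source demo_profile.sh
echo "$JAVA_HOME"                  # prints: /home/spark/opt/java/jdk1.6.0_37
```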
[spark@S1PA11 ~]$ ifconfig
-bash: ifconfig: command not found
If ifconfig is not found, edit the .bashrc file in the regular user's home directory and add:
export PATH=$PATH:/sbin
Then run source ~/.bashrc, and the ifconfig command will work.
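The fix works because the shell only searches directories listed in PATH. A minimal sketch of the same mechanism, using a demo directory and a hypothetical command name (demo_sbin/netstatus is invented for illustration; /sbin/ifconfig is the real case):

```shell
# Place a small executable in a directory that is NOT on the PATH yet.
mkdir -p demo_sbin
printf '#!/bin/sh\necho up\n' > demo_sbin/netstatus
chmod +x demo_sbin/netstatus

# Lookup fails while the directory is off the PATH.
command -v netstatus || echo "not found"

# Append the directory, same pattern as `export PATH=$PATH:/sbin`.
export PATH=$PATH:$PWD/demo_sbin
netstatus                          # now resolves; prints: up
```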