Spark SSH Configuration
Configure the Host Name
Edit /etc/hostname with vi and add S1PA11
Run # hostname
S1PA11 --- the modification succeeded
Open the /etc/hosts file and check the existing mappings:
127.0.0.1 localhost.localdomain localhost
::1 localhost6.localdomain6 localhost6
Add the following two lines (the IP address and hostname of each node):
10.58.44.47 S1PA11
10.126.45.56 S1PA222
ping S1PA222
PING S1PA222 (10.126.45.56) 56(84) bytes of data.
64 bytes from S1PA222 (10.126.45.56): icmp_seq=1 ttl=62 time=0.235 ms
64 bytes from S1PA222 (10.126.45.56): icmp_seq=2 ttl=62 time=0.216 ms
64 bytes from S1PA222 (10.126.45.56): icmp_seq=3 ttl=62 time=0.276 ms
ping S1PA11
PING S1PA11 (10.58.44.47) 56(84) bytes of data.
64 bytes from S1PA11 (10.58.44.47): icmp_seq=1 ttl=62 time=0.268 ms
64 bytes from S1PA11 (10.58.44.47): icmp_seq=2 ttl=62 time=0.273 ms
The two machines can now communicate with each other.
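The reachability check above can be repeated for every node with a small loop. This is a convenience sketch using the tutorial's example hostnames; `-W 2` caps the wait at two seconds per host:

```shell
# Ping each cluster host once and report reachability.
check_hosts() {
  local h
  for h in "$@"; do
    if ping -c 1 -W 2 "$h" >/dev/null 2>&1; then
      echo "$h reachable"
    else
      echo "$h unreachable"
    fi
  done
}
check_hosts S1PA11 S1PA222
```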
SSH Password-Free Authentication Configuration
First, configure the S1PA11 machine (this machine is the master)
Enter the .ssh directory: [spark@S1PA11 sbin]$ cd ~/.ssh/
Generate the key pair with ssh-keygen: ssh-keygen -t rsa, pressing Enter at each prompt.
This finally generates the pair (id_rsa, id_rsa.pub).
Generate the authorized_keys file: [spark@S1PA11 .ssh]$ cat id_rsa.pub > authorized_keys
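Non-interactively, the master-side steps above look roughly like this. A scratch directory stands in for ~/.ssh so the sketch can run anywhere, and `-N ''` gives an empty passphrase (the equivalent of pressing Enter at each prompt):

```shell
# Master-side key setup, sketched against a scratch directory instead of ~/.ssh.
SSH_DIR="$(mktemp -d)"
ssh-keygen -t rsa -N '' -q -f "$SSH_DIR/id_rsa"         # creates id_rsa and id_rsa.pub
cat "$SSH_DIR/id_rsa.pub" > "$SSH_DIR/authorized_keys"  # authorize the host's own key
chmod 700 "$SSH_DIR"
chmod 600 "$SSH_DIR/authorized_keys"
```

The chmod lines anticipate the permission step later in the tutorial; sshd refuses keys whose files are group- or world-writable.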
On the other machine, S1PA222 (the slave), also generate the public and private keys.
The steps are the same as on S1PA11:
Enter the .ssh directory: [spark@S1PA222 sbin]$ cd ~/.ssh/
Generate the key pair with ssh-keygen: ssh-keygen -t rsa, pressing Enter at each prompt.
This finally generates the pair (id_rsa, id_rsa.pub).
Copy the id_rsa.pub file from the S1PA222 machine to the S1PA11 machine: [spark@S1PA222 .ssh]$ scp id_rsa.pub spark@10.58.44.47:~/.ssh/id_rsa.pub_sl
Switch to the S1PA11 machine and merge it into authorized_keys: [spark@S1PA11 .ssh]$ cat id_rsa.pub_sl >> authorized_keys
Copy authorized_keys back to the S1PA222 machine (/home/spark/.ssh): [spark@S1PA11 .ssh]$ scp authorized_keys spark@10.126.45.56:~/.ssh/
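Note that the merge on S1PA11 must append (>>), not overwrite (>), or the master's own key would be lost. A local simulation of the merge step, with placeholder key strings standing in for the real public keys:

```shell
# Simulate the authorized_keys merge with placeholder keys
# (the real files live in ~/.ssh on S1PA11).
DIR="$(mktemp -d)"
echo "ssh-rsa AAAA...master spark@S1PA11" > "$DIR/authorized_keys"  # master's own key
echo "ssh-rsa AAAA...slave spark@S1PA222" > "$DIR/id_rsa.pub_sl"    # copied over via scp
cat "$DIR/id_rsa.pub_sl" >> "$DIR/authorized_keys"                  # append, not >
wc -l < "$DIR/authorized_keys"   # prints 2: both hosts' keys are now authorized
```

On systems that have it, `ssh-copy-id spark@10.58.44.47` performs the copy-and-append in one step.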
Now change the ~/.ssh directory permissions to 700 and the authorized_keys file permissions to 600 (or 644) on both machines:
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
OK. After completing the preceding operations, you can start the ssh verification.
From the S1PA11 machine, ssh to S1PA222:
[spark@S1PA11 .ssh]$ ssh S1PA222
Last login: Mon Jan 5 15:18:58 2015 from s1pa11
[spark@S1PA222 ~]$ exit
logout
Connection to S1PA222 closed.
[spark@S1PA11 .ssh]$ ssh S1PA222
Last login: Mon Jan 5 15:46:00 2015 from s1pa11
From the S1PA222 machine, ssh to S1PA11:
[spark@S1PA222 .ssh]$ ssh S1PA11
Last login: Mon Jan 5 15:46:43 2015 from s1pa222
[spark@S1PA11 ~]$ exit
Connection to S1PA11 closed.
SSH password-free authentication is complete.
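The manual login test above can also be automated. `BatchMode=yes` makes ssh fail instead of prompting for a password, so the result reflects key authentication only (the hostname is the tutorial's example):

```shell
# Report whether key-based (password-free) login to a host works.
verify_passwordless() {
  if ssh -o BatchMode=yes -o ConnectTimeout=5 "$1" true 2>/dev/null; then
    echo "passwordless ssh to $1: OK"
  else
    echo "passwordless ssh to $1: FAILED"
  fi
}
verify_passwordless S1PA222
```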
PS: Exception Handling
1. ssh localhost: publickey authorization failed
sudo vi /etc/ssh/sshd_config
RSAAuthentication yes
PubkeyAuthentication yes
AuthorizedKeysFile .ssh/authorized_keys
service sshd restart
Note: ssh supports both publickey and password authorization. publickey authorization is disabled by default and must be set to yes.
If the client has no .ssh/id_rsa, password authorization is used; if it does, publickey authorization is used.
If publickey authorization fails, ssh falls back to password authorization.
Do not set PasswordAuthentication no: that forbids password logon entirely, so only local logon would be possible!
2. vi /etc/selinux/config
SELINUX=disabled
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
Finally, restart the Linux machine and run ssh localhost.
3. ssh to an IP address or hostname reports: connection refused
Check whether the ssh server program is installed on the target host, whether the service is started, and whether port 22 is being listened on;
Check whether the user is allowed to log on;
Check whether iptables rules on either machine prohibit the ssh connection;
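A first-pass check for "connection refused" that needs no extra tools is bash's /dev/tcp redirection: if the TCP connect itself fails, sshd is not listening or a firewall is dropping the traffic, which narrows the search before looking at users or keys. A minimal sketch, using the tutorial's example hostname:

```shell
# Probe whether a host answers on TCP port 22 at all (bash-only, no nc needed).
probe_ssh_port() {
  if timeout 3 bash -c "exec 3<>/dev/tcp/$1/22" 2>/dev/null; then
    echo "port 22 open on $1"
  else
    echo "port 22 closed or filtered on $1"
  fi
}
probe_ssh_port S1PA222
```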