1. First, the two hosts:
Pangzhiminglinux 192.168.200.129
CentOS2 192.168.200.130
The idea is roughly this: you want the ordinary user on Pangzhiminglinux to log in, without a password, to the ordinary user on CentOS2 (one direction only). Both hosts get an ordinary user named hadoop with the same password. On Pangzhiminglinux, log in as hadoop, generate a key pair (leave the private key without a passphrase), and copy the public key into ~/.ssh/authorized_keys under hadoop's home directory (you create authorized_keys yourself); change the permissions of .ssh to 700 and of authorized_keys to 600, then switch to root and edit /etc/ssh/sshd_config (uncomment three lines there). Next, on CentOS2, create .ssh under the hadoop user's home directory, create authorized_keys under .ssh, and copy the public key of Pangzhiminglinux's hadoop user into it; set the permissions to 700 and 600 in the same way, switch to root, and edit /etc/ssh/sshd_config (uncomment the same three lines).
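As an aside: if the ssh-copy-id helper (shipped in openssh-clients) is installed, the copy-the-key-and-fix-permissions part can be collapsed into one command. This is only a sketch of that shortcut; everything below does it by hand:
#su - hadoop
#ssh-keygen -t rsa    #generate the key pair, press Enter at the passphrase prompts
#ssh-copy-id hadoop@192.168.200.130    #appends ~/.ssh/id_rsa.pub to the remote authorized_keys and asks for the remote password once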
2. Start here:
(1) On Pangzhiminglinux
#useradd hadoop    #create the ordinary user
#passwd hadoop
#su - hadoop    #switch to the ordinary user to generate the key pair
#ssh-keygen -t rsa
(Screenshot 1.png: the ssh-keygen run.)
At the circled prompts just press Enter; the key does not need a passphrase.
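If you would rather skip those prompts entirely, ssh-keygen can also be run non-interactively; a minimal sketch (-N "" gives an empty passphrase, -f names the output file):
#ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa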
#touch ~/.ssh/authorized_keys    #create authorized_keys under .ssh (all of these steps are done as the ordinary user, don't get that wrong)
#cat ~/.ssh/id_rsa.pub > ~/.ssh/authorized_keys    #put the public key into it
#chmod 600 ~/.ssh/authorized_keys    #change the permissions
#chmod 700 ~/.ssh    #change the permissions
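A quick sanity check is worthwhile here, because wrong permissions are the most common reason key login silently falls back to asking for a password:
#ls -ld ~/.ssh ~/.ssh/authorized_keys    #should show drwx------ for .ssh and -rw------- for authorized_keys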
Now drop down to root.
#su - root
#vim /etc/ssh/sshd_config    #here comes the key part
Modify the configuration file in these three places; the third line, highlighted in yellow, is where your public key is stored, as a path relative to your home directory.
(Screenshot 2.png: the three places in sshd_config.)
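On a stock CentOS sshd_config these three directives are usually present but commented out; once uncommented they typically read as follows (check your own file, the exact lines can differ between versions):
RSAAuthentication yes
PubkeyAuthentication yes
AuthorizedKeysFile      .ssh/authorized_keys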
#service sshd restart    #restart the service
(2) On CentOS2
#useradd hadoop
#passwd hadoop    #the user name and password must be exactly the same as on Pangzhiminglinux!
#su - hadoop    #all of the following is done as the ordinary user
#mkdir ~/.ssh    #create .ssh
#chmod 700 ~/.ssh    #the permissions must be changed
#vim ~/.ssh/authorized_keys    #create this file and paste in the public key of Pangzhiminglinux's hadoop user
#chmod 600 ~/.ssh/authorized_keys    #change the permissions
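Pasting the long public-key line by hand is easy to get wrong; an alternative (just a sketch, run from Pangzhiminglinux as hadoop, and it will ask for the CentOS2 password this one time) is to push the key over ssh and then fix the permissions as above:
#cat ~/.ssh/id_rsa.pub | ssh hadoop@192.168.200.130 "mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys"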
Then, as root:
#su - root
#vim /etc/ssh/sshd_config    #here comes the key part again
Uncomment the same three lines here as on Pangzhiminglinux (screenshot 2.png again).
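To double-check the edit (again just a sketch), grep for the three directives and then restart sshd, the same as on the first host:
#grep -E "RSAAuthentication|PubkeyAuthentication|AuthorizedKeysFile" /etc/ssh/sshd_config
#service sshd restart    #restart the service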
(3) With all of the above done, time to test
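The test itself is nothing more than this, run as hadoop on Pangzhiminglinux (the yes answer is only needed the first time, to accept the remote host's key fingerprint):
#ssh hadoop@192.168.200.130    #should drop you into CentOS2 with no password prompt
#exit    #log back out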
(Screenshot 4.png: the login test.)
See that? After typing yes, it logged straight in to CentOS2's ordinary user with no password; I logged out, then came back in again.
Want to know another magical thing this gives you?
The hadoop user on Pangzhiminglinux can now transfer things to CentOS2's hadoop user without a password. (Note that I mean transferring into hadoop's home directory on CentOS2; alternatively, create a directory on CentOS2, change its owner and group to hadoop, and no matter where that directory sits, Pangzhiminglinux's hadoop can transfer into it, as sketched below.)
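Preparing such a target directory on CentOS2 looks roughly like this (a sketch done as root; the name /testing is the one that shows up in the screenshots below):
#mkdir /testing
#chown hadoop:hadoop /testing    #hand the directory over to the hadoop user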
Below is the experiment:
From 192.168.200.129, push a file over to 192.168.200.130 without being asked for a password.
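The transfer might look like this, run as hadoop on Pangzhiminglinux (test.txt is a made-up file name; /testing is the directory prepared above; rsync rides over the same key, which is where the rsync in the title comes in, as long as rsync is installed on both hosts):
#scp test.txt hadoop@192.168.200.130:/testing/
#rsync -av test.txt hadoop@192.168.200.130:/testing/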
Then, looking at 192.168.200.130, we can see that what was just transferred has landed under /testing.
This article is from the "11165660" blog, please be sure to keep this source http://11175660.blog.51cto.com/11165660/1765688
Linux: configuring SSH password-less authentication and rsync