Hadoop Advanced
1. Configure passwordless SSH
(1) Modify the slaves file
Switch to the master machine; all steps in this section are performed on master.
Enter the /usr/hadoop/etc/hadoop directory, locate the slaves file, and modify it to:
slave1
slave2
slave3
(2) Distribute the public key
Enter the .ssh directory under the home directory.
Generate the public/private key pair:
ssh-keygen -t rsa
This generates two files: id_rsa and id_rsa.pub.
Send the public key to each machine:
ssh-copy-id master
ssh-copy-id slave1
ssh-copy-id slave2
ssh-copy-id slave3
An authorized_keys file will then be present in the .ssh directory on all four machines.
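The steps above can be sketched as one script. The hostnames master and slave1–slave3 are the ones used in this section; the -N "" and -f flags are standard ssh-keygen options assumed here to make the key generation non-interactive:

```shell
#!/usr/bin/env bash
# Generate an RSA key pair without a passphrase, but only if none exists yet.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Copy the public key to every node, including master itself, so that
# start-dfs.sh can later log in to each node without a password.
for host in master slave1 slave2 slave3; do
    ssh-copy-id "$host"
done
```

Run it once on master, entering each node's password when ssh-copy-id prompts; afterwards `ssh slave1` should log in without a password.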
2. Cluster startup and shutdown
(1) Start
start-dfs.sh
(2) Stop
stop-dfs.sh
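After running start-dfs.sh, a quick way to confirm the daemons came up is jps; the exact process list depends on each node's role (NameNode on master, DataNode on the slaves):

```shell
# On master: should list NameNode (and SecondaryNameNode, depending on config).
jps

# On each slave: jps should list DataNode.

# HDFS can also report a cluster-wide summary of live datanodes and capacity:
hdfs dfsadmin -report
```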
3. HDFS file and directory commands
| Command | Description | Example | Example description |
| --- | --- | --- | --- |
| hadoop fs -ls <directory> | List all files and subdirectories under a directory | hadoop fs -ls / | List the root directory |
| hadoop fs -put <local file> <directory> | Upload a local file to an HDFS directory | hadoop fs -put hello.txt / | Upload hello.txt to the HDFS root directory |
| hadoop fs -rm <file or directory> | Delete a file or directory | hadoop fs -rm /hello.txt | Delete hello.txt |
| hadoop fs -text <text file> | Print a text file | hadoop fs -text /hello.txt | Print the contents of hello.txt |
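The commands in the table combine into a short round trip; hello.txt is the sample file name used above, created here locally for the demonstration:

```shell
# Create a small local file and upload it to the HDFS root directory.
echo "hello hdfs" > hello.txt
hadoop fs -put hello.txt /

# List the root directory and print the uploaded file's contents.
hadoop fs -ls /
hadoop fs -text /hello.txt

# Clean up: delete the file from HDFS.
hadoop fs -rm /hello.txt
```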