I sorted this out by referring to a blog post I found online:
Start Hadoop: go to the HADOOP_HOME directory and run bin/start-all.sh.
Stop Hadoop: go to the HADOOP_HOME directory and run bin/stop-all.sh.
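To confirm the daemons actually came up, you can run jps (a JDK tool that lists running Java processes); on a typical single-node setup it should show processes such as NameNode, DataNode, and SecondaryNameNode.
jps   # lists running Java processes, including the Hadoop daemons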
1. List the contents of a specified directory
hadoop dfs -ls [file directory]
Eg: hadoop dfs -ls /user/wangkai.pt
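For a recursive listing, older Hadoop releases provide -lsr (newer ones use -ls -R instead); which form works depends on your version.
hadoop dfs -lsr /user/wangkai.pt   # recursively list everything under the directory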
2. View the contents of an existing file
hadoop dfs -cat [file_path]
Eg: hadoop dfs -cat /user/wangkai.pt/data.txt
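For large files it is safer not to cat the whole thing; piping to head, or using -tail (which prints roughly the last kilobyte), gives a quick look instead.
hadoop dfs -cat /user/wangkai.pt/data.txt | head   # first few lines only
hadoop dfs -tail /user/wangkai.pt/data.txt         # end of the file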
3. Upload a local file to Hadoop
hadoop fs -put [local path] [hadoop directory]
hadoop fs -put /home/t/file.txt /user/t
(file.txt is the file name)
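-copyFromLocal is essentially equivalent to -put for local sources, and a follow-up -ls is an easy way to verify the upload (paths reused from the example above):
hadoop fs -copyFromLocal /home/t/file.txt /user/t
hadoop fs -ls /user/t   # confirm the file arrived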
4. Upload a local folder to Hadoop
hadoop fs -put [local directory] [hadoop directory]
hadoop fs -put /home/t/dir_name /user/t
(dir_name is the folder name)
5. Download a file from Hadoop to a local directory
hadoop fs -get [hadoop file path] [local directory]
hadoop fs -get /user/t/ok.txt /home/t
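-copyToLocal does the same job in the download direction; a dot works as the destination if you want the current local directory:
hadoop fs -copyToLocal /user/t/ok.txt .   # download into the current directory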
6. Delete a specified file on Hadoop
hadoop fs -rm [file path]
hadoop fs -rm /user/t/ok.txt
7. Delete a specified folder (including subdirectories) on Hadoop
hadoop fs -rmr [directory path]
hadoop fs -rmr /user/t
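Because -rmr removes the directory and everything below it, it pays to review the target first (recursive listing form varies by version, as noted above):
hadoop fs -lsr /user/t    # review what is about to be deleted
hadoop fs -rmr /user/t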
8. Create a new directory under a specified Hadoop directory
hadoop fs -mkdir /user/t
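Note that whether -mkdir creates missing parent directories automatically depends on the Hadoop version; newer releases require an explicit -p flag for that.
hadoop fs -mkdir -p /user/t/a/b   # -p creates intermediate directories (newer versions)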
9. Create an empty file under a specified Hadoop directory
Run the touchz command:
hadoop fs -touchz /user/new.txt
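To confirm the zero-length file exists you can list it, or use -test, which exits with status 0 when the path exists:
hadoop fs -test -e /user/new.txt && echo exists   # prints 'exists' if the file is there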
10. Rename a file on Hadoop
Use the mv command:
hadoop fs -mv /user/test.txt /user/ok.txt (renames test.txt to ok.txt)
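-mv is a general move, not just a rename, so it can also relocate a file into another directory (the /user/archive directory below is illustrative and must already exist):
hadoop fs -mv /user/ok.txt /user/archive/ok.txt   # move the file into another directory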
11. Merge all files in a specified Hadoop directory into a single file and download it to a local directory.
hadoop dfs -getmerge /user /home/t
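The first argument is the HDFS source directory and the second is the local destination; the paths and file name below are illustrative:
hadoop dfs -getmerge /user/t /home/t/merged.txt   # concatenate everything under /user/t into one local file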
12. Kill a running Hadoop job
hadoop job -kill [job-id]
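To find the job ID in the first place, hadoop job -list shows the currently running jobs; the ID below is hypothetical:
hadoop job -list
hadoop job -kill job_201305311030_0001   # hypothetical job ID taken from the -list output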
Hadoop writes its startup logs to the logs folder under the HADOOP_HOME directory. If the NameNode or DataNode fails to start, check the log files there for details.
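For example, to inspect the NameNode log (the file name pattern includes the user and hostname, so the wildcard below is a guess at your layout):
cd $HADOOP_HOME/logs
tail -n 100 hadoop-*-namenode-*.log   # last 100 lines of the NameNode log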