This is the original blog post; when reproducing, please indicate the source: http://www.cnblogs.com/MrFee/p/4683953.html
1. appendToFile
Function: Appends the contents of one or more source files to the target file.
How to use: hadoop fs -appendToFile <source file 1> <source file 2> ... <target file>
hadoop fs -appendToFile /flume/web_output/part-r-00000 /flume/app_output/part-r-00000
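appendToFile can also read from standard input when the source is given as a dash; a minimal sketch, assuming the target file below already exists in HDFS:
echo "one more line" | hadoop fs -appendToFile - /flume/app_output/part-r-00000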
2. cat
Function: Outputs the contents of the file at the specified URI to stdout (prints to the console).
How to use: hadoop fs -cat URI [URI ...]
hadoop fs -cat /flume/web_output/part-r-00000
3. chgrp
Function: Changes the group a file belongs to; the -R option applies the change recursively through the directory structure. The user running the command must be the owner of the file or the superuser.
How to use: hadoop fs -chgrp [-R] GROUP URI [URI ...]
hadoop fs -chgrp -R hadoop /flume
4. chmod
Function: Changes the permissions of files; the -R option applies the change recursively through the directory structure. The user running the command must be the owner of the file or the superuser.
How to use: hadoop fs -chmod [-R] <MODE[,MODE] ... | OCTALMODE> URI [URI ...]
hadoop fs -chmod -R 777 /flume
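chmod also accepts symbolic modes, as in Unix. For example, to add group write permission recursively (a sketch; the path is illustrative):
hadoop fs -chmod -R g+w /flume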
5. chown
Function: Changes the owner of files; the -R option applies the change recursively through the directory structure. The user running the command must be the superuser.
How to use: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
hadoop fs -chown -R hadoop_mapreduce:hadoop /flume
6. copyFromLocal
Function: Similar to the put command, except that the source must be local; copies files from the Linux file system (or another local file system) to the HDFS file system.
How to use: hadoop fs -copyFromLocal <localsrc> URI
hadoop fs -copyFromLocal /usr/huawei/app /flume/app
7. copyToLocal
Function: Similar to the get command, except that the destination must be local; copies files from the HDFS file system to the Linux file system (or another local file system).
How to use: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
hadoop fs -copyToLocal -ignorecrc -crc /flume/app /usr/huawei/app
8. count
Function: Counts the directories, files, and bytes under the paths that match the specified pattern.
How to use: hadoop fs -count [-q] <paths>
hadoop fs -count hdfs://nn1.example.com/file1 hdfs://nn2.example.com/file2
hadoop fs -count -q hdfs://nn1.example.com/file1
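The output columns of count are DIR_COUNT, FILE_COUNT, CONTENT_SIZE and PATHNAME; with -q, quota columns are prepended. A sketch of typical output (the numbers are made up for illustration):
hadoop fs -count /flume/app
           3           12           104857600 /flume/app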
9. cp
Function: Copies files from the source path to the destination path. Multiple source paths are allowed, in which case the destination must be a directory.
How to use: hadoop fs -cp URI [URI ...] <dest>
hadoop fs -cp /user/app /user/flume/app
10. du
Function: Displays the size of the file when a single file is specified; when a directory is specified, displays the sizes of all files and subdirectories it contains.
How to use: hadoop fs -du URI [URI ...]
hadoop fs -du /flume/app /flume/web
Return value: returns 0 on success and -1 on failure.
11. dus
Function: Displays a summary of the file sizes.
How to use: hadoop fs -dus <args>
hadoop fs -dus /flume/app
12. expunge
Function: Empties the Trash. Refer to the HDFS design documentation for more information on the Trash feature.
How to use: hadoop fs -expunge
13. get
Function: Copies files to the local file system. The -ignorecrc option copies files even if the CRC check fails; the -crc option copies files together with their CRC information.
How to use: hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
hadoop fs -get /flume/app /usr/huawei/app
14. getfacl
Function: Displays the access control lists (ACLs) of files and directories. If a directory has a default ACL, getfacl also displays the default ACL.
How to use: hadoop fs -getfacl [-R] <path>
hadoop fs -getfacl -R /flume
15. getmerge
Function: Takes a source directory and a destination file as input and concatenates all files in the source directory into the local destination file. The optional addnl flag adds a newline at the end of each file.
How to use: hadoop fs -getmerge <src> <localdst> [addnl]
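For example, to concatenate a job's part files into one local file, adding a newline after each part (a sketch; paths are illustrative):
hadoop fs -getmerge /flume/web_output /tmp/web_output.txt addnl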
16. ls
Function: For a file, returns the file's information in the following format:
file name <number of replicas> size modification date modification time permissions user ID group ID
For a directory, returns a list of its immediate children, as in Unix. A directory is listed as:
directory name <dir> modification date modification time permissions user ID group ID
How to use: hadoop fs -ls <args>
hadoop fs -ls /user/hadoop
17. lsr
Function: The recursive version of the ls command. Similar to ls -R in Unix.
How to use: hadoop fs -lsr <args>
hadoop fs -lsr /flume
18. mkdir
Function: Takes the URIs given as arguments and creates those directories. With the -p option it behaves like Unix mkdir -p, creating any missing parent directories along the path.
How to use: hadoop fs -mkdir [-p] <paths>
hadoop fs -mkdir /a/b/c
19. moveFromLocal
Function: Similar to the put command, but moves (rather than copies) files from the local file system to HDFS.
How to use: hadoop fs -moveFromLocal <src> <dst>
hadoop fs -moveFromLocal /usr/local/* /user/flume/
20. moveToLocal
Function: Moves files from HDFS to the local file system.
How to use: hadoop fs -moveToLocal [-crc] <src> <dst>
hadoop fs -moveToLocal -crc /user/hadoop_hdfs/* /usr/local/
21. mv
Function: Moves files from the source path to the destination path. Multiple source paths are allowed, in which case the destination path must be a directory. Moving files across different file systems is not permitted.
How to use: hadoop fs -mv URI [URI ...] <dest>
hadoop fs -mv /user/flume /user/hadoop
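When several sources are given, the destination must be an existing directory (a sketch; paths are illustrative):
hadoop fs -mv /user/flume/a.log /user/flume/b.log /user/hadoop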
22. put
Function: Copies one or more source paths from the local file system to the destination file system. Also supports reading input from stdin and writing it to the destination file system.
How to use: hadoop fs -put <localsrc> ... <dst>
hadoop fs -put localfile hdfs://host:port/hadoop/hadoopfile
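To write from standard input, give a dash as the source (a sketch; the target path is illustrative):
echo "hello hdfs" | hadoop fs -put - /hadoop/hadoopfile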
23. rm
Function: Deletes the specified files. Refer to the rmr command for recursive deletion of directories.
How to use: hadoop fs -rm URI [URI ...]
hadoop fs -rm /flume
hadoop fs -rm /flume/app/logparser.jar
24. rmr
Function: The recursive version of delete. If the -skipTrash option is specified, the Trash (if enabled) is bypassed and the specified file(s) are deleted immediately.
How to use: hadoop fs -rmr [-skipTrash] URI [URI ...]
hadoop fs -rmr /flume
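For example, to delete a directory immediately instead of moving it to the Trash (a sketch; the path is illustrative):
hadoop fs -rmr -skipTrash /flume/tmp_output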
25. setrep
Function: Changes the replication factor of a file. The -R option recursively changes the replication factor of all files under a directory; the -w option makes the command wait for the replication to complete.
How to use: hadoop fs -setrep [-R] [-w] <numReplicas> <path>
hadoop fs -setrep -R -w 3 /user/flume
26. stat
Function: Returns statistics on the specified path.
How to use: hadoop fs -stat URI [URI ...]
hadoop fs -stat /flume
27. tail
Function: Outputs the last kilobyte of the file to stdout. The -f option is supported and behaves as in Unix, continuing to print as the file grows.
How to use: hadoop fs -tail [-f] URI
hadoop fs -tail /flume
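For example, to follow a file while another process is still appending to it (a sketch; the path is illustrative):
hadoop fs -tail -f /flume/app_output/part-r-00000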
28. test
Function: Checks whether a directory or file exists (a script sketch follows the options below).
How to use: hadoop fs -test -[ezd] URI
Options:
-e checks whether the file exists; returns 0 if it does.
-z checks whether the file is zero bytes long; returns 0 if it is.
-d checks whether the path is a directory; returns 0 if it is.
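A minimal sketch of how the exit code might be used from a shell script (the path is illustrative):
if hadoop fs -test -e /flume/app_output/part-r-00000; then
    echo "output exists"
else
    echo "output missing"
fi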
29. text
Function: Takes a source file and outputs it in text format. The allowed formats are zip and TextRecordInputStream.
How to use: hadoop fs -text <src>
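For example, to print a compressed or SequenceFile-format output as plain text (a sketch; the path is illustrative):
hadoop fs -text /flume/app_output/part-r-00000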
30. touchz
Function: Creates an empty file of zero bytes.
How to use: hadoop fs -touchz URI [URI ...]
hadoop fs -touchz pathname
A full translation of the Hadoop shell commands, to help beginners get started.