HDFS Basic Commands

Source: Internet
Author: User
Tags: file copy, mkdir, hdfs dfs

Common HDFS commands:
Note: the following commands are run from the bin directory of the Hadoop installation.
In the usage below, src is the source file path and dst is the destination path.
1. -help [cmd] Shows help for the given command

./hdfs dfs -help ls

2. -ls(r) Lists all files in the given directory; the r variant recurses into subdirectories

./hdfs dfs -ls /log/map
./hdfs dfs -lsr /log/   (recursive)

3. -du(s) Shows the size of every file in the directory, or the size of a single file when only one file is specified

./hdfs dfs -du /user/hadoop/dir1 /user/hadoop/file1 hdfs://host:port/user/hadoop/dir1

4. -count [-q] Counts the directories, files, and bytes under the given paths
5. -mv Moves files or directories to the target directory

./hdfs dfs -mv /user/hadoop/file1 /user/hadoop/file2

6. -cp Copies multiple files to the target directory

./hdfs dfs -cp /user/hadoop/file1 /user/hadoop/file2   (Copies the file from the source path to the destination path. This command allows multiple source paths, in which case the destination path must be a directory.)

7. -rm(r) Deletes a file (or, with r, a directory)

./hdfs dfs -rmr /log/map1   (recursive deletion)

8. -put Copies a local file to HDFS

./hdfs dfs -put test.txt /log/map/

9. -copyFromLocal Copies a local file to HDFS

./hdfs dfs -copyFromLocal /usr/data/text.txt /log/map1/   (copies the local text.txt to /log/map1/ on HDFS)

10. -moveFromLocal Moves a local file to HDFS

./hdfs dfs -moveFromLocal /usr/data/text.txt /log/map1/   (moves the local text.txt to /log/map1/ on HDFS)

11. -get [-ignoreCrc] Copies a file to the local filesystem, optionally skipping the CRC check

./hdfs dfs -get /log/map1/* .   (copies to the local current directory)
./hdfs dfs -get /log/map1/* /usr/data   (copies all files under /log/map1/ on HDFS to the local /usr/data/)

12. -getmerge [addnl] Merges all files in the source directory into a single file. It takes a source directory and a destination file as input and concatenates the files in the source directory into the local destination file. addnl is optional and specifies that a newline be added at the end of each file.

./hdfs dfs -getmerge /log/map1/* /usr/data   (merges all files under /log/map1/ on HDFS into the local file /usr/data)
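What getmerge produces is easy to preview without a cluster: it simply concatenates the source files into one destination file, optionally appending a newline after each. The sketch below imitates that behavior with plain shell on local files (the /tmp paths and file contents are invented for the demo; getmerge itself runs against HDFS):

```shell
# Local analogue of: hdfs dfs -getmerge [addnl] <src-dir> <dst-file>
mkdir -p /tmp/getmerge-demo
printf 'alpha' > /tmp/getmerge-demo/part-00000
printf 'beta'  > /tmp/getmerge-demo/part-00001

# Without addnl: the files are joined as-is, back to back
cat /tmp/getmerge-demo/part-* > /tmp/merged.txt

# With addnl: a newline is appended after each file
for f in /tmp/getmerge-demo/part-*; do cat "$f"; echo; done > /tmp/merged-addnl.txt
```

Without addnl the two fragments run together as "alphabeta"; with addnl each file ends on its own line, which matters when the part files do not end with a newline themselves.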

13. -cat Prints file contents to the terminal

./hdfs dfs -cat /log/map1/part-00000 | head   (reads the part-00000 file under /log/map1 on HDFS; head prints the first 10 lines)

./hdfs dfs -tail /log/map1/part-00000   (shows the last kilobyte of the file)
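Note that the head in the pipeline above is the ordinary Unix head running on the local machine, not an HDFS option, so its behavior can be checked on any local file (the /tmp file below is made up for the demo):

```shell
# head prints the first N lines (10 by default); Unix tail the last N.
# By contrast, hdfs dfs -tail prints the last kilobyte, not a line count.
seq 1 100 > /tmp/lines.txt
head -n 3 /tmp/lines.txt    # first three lines
tail -n 2 /tmp/lines.txt    # last two lines
```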

14. -text Prints file contents to the terminal, decoding the source file as text. The allowed formats are zip and TextRecordInputStream
15. -copyToLocal [-ignoreCrc] Copies files to the local filesystem
16. -moveToLocal Moves files to the local filesystem
17. -mkdir Creates a directory; with -p, missing parent directories are created as well

./hdfs dfs -mkdir -p /dir1/dir11/dir111

18. -touchz Creates an empty file

19. grep Filters lines containing a given string from files on HDFS (by piping -cat through the local grep)

./hdfs dfs -cat /log/testlog/* | grep <filter field>
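Since grep here is the ordinary Unix tool applied to whatever -cat streams out, the filtering itself can be tried on a local file (the log lines and /tmp path below are invented for the demo):

```shell
# Local analogue of: hdfs dfs -cat /log/testlog/* | grep <filter field>
printf 'INFO started\nERROR disk full\nINFO done\n' > /tmp/testlog.txt
cat /tmp/testlog.txt | grep ERROR    # keeps only lines containing ERROR
```

The same pattern works against HDFS unchanged: only the producer of the stream (cat vs. hdfs dfs -cat) differs.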
