Examples of shell operations for Hadoop HDFS


This article was posted on my blog

We know that HDFS is Hadoop's distributed file system, and since it is a file system it offers at least basic file and folder management, much like our Windows operating system: creating, modifying, deleting, moving, copying, changing permissions, and so on. Now let's look at how to do these things in Hadoop.
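As a quick preview, those everyday operations map onto hadoop fs subcommands roughly like this (just a sketch; the /demo paths are made up for illustration):

# create a folder
hadoop fs -mkdir /demo
# copy and move within HDFS
hadoop fs -cp /demo/a.txt /demo/b.txt
hadoop fs -mv /demo/b.txt /demo/c.txt
# change permissions
hadoop fs -chmod -R 755 /demo
# delete a single file, or a folder and everything in it
hadoop fs -rm /demo/c.txt
hadoop fs -rmr /demo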

First enter the hadoop fs command on its own, and you will see the following output:

Usage: java FsShell
           [-ls <path>]
           [-lsr <path>]
           [-du <path>]
           [-dus <path>]
           [-count[-q] <path>]
           [-mv <src> <dst>]
           [-cp <src> <dst>]
           [-rm [-skipTrash] <path>]
           [-rmr [-skipTrash] <path>]
           [-expunge]
           [-put <localsrc> ... <dst>]
           [-copyFromLocal <localsrc> ... <dst>]
           [-moveFromLocal <localsrc> ... <dst>]
           [-get [-ignoreCrc] [-crc] <src> <localdst>]
           [-getmerge <src> <localdst> [addnl]]
           [-cat <src>]
           [-text <src>]
           [-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
           [-moveToLocal [-crc] <src> <localdst>]
           [-mkdir <path>]
           [-setrep [-R] [-w] <rep> <path/file>]
           [-touchz <path>]
           [-test -[ezd] <path>]
           [-stat [format] <path>]
           [-tail [-f] <file>]
           [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
           [-chown [-R] [OWNER][:[GROUP]] PATH...]
           [-chgrp [-R] GROUP PATH...]
           [-help [cmd]]

This shows the commands supported by hadoop fs. For example (with the Hadoop services started), enter this in the terminal:

hadoop fs -ls /

You can also enter:

hadoop fs -ls hdfs://hadoop-master:9000/
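Both forms list the same root directory: a path with no scheme is resolved against the default file system set by fs.default.name (fs.defaultFS on newer releases) in core-site.xml, which here is assumed to be hdfs://hadoop-master:9000. A relative path, or no path at all, is resolved against your home directory /user/<currentuser>, as the -ls help below mentions. For example:

# no scheme: resolved against fs.default.name from core-site.xml
hadoop fs -ls /
# full URI: the NameNode address given explicitly
hadoop fs -ls hdfs://hadoop-master:9000/
# no path at all: lists /user/<currentuser>
hadoop fs -ls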

If you are not sure how a specific command is used, you can turn to the last entry in the list, [-help [cmd]]. For example, to view the help for the ls command, enter:

hadoop fs -help ls

The output then gives us an easier-to-read description:

-ls <path>:     List the contents that match the specified file pattern. If
                path is not specified, the contents of /user/<currentUser>
                will be listed. Directory entries are of the form
                        dirName (full path) <dir>
                and file entries are of the form
                        fileName (full path) <r n> size
                where n is the number of replicas specified for the file
                and size is the size of the file, in bytes.
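Note that the help talks about a "file pattern": -ls accepts shell-style globs, so several paths can be matched at once. A small sketch (the pattern is hypothetical):

# list every .txt file directly under /data; quote the pattern so the local
# shell does not expand it before Hadoop sees it
hadoop fs -ls '/data/*.txt'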

As for operations in the shell, the ones I use most often are creating folders, uploading files, deleting files and folders, changing permissions, and viewing file contents. Let me go through a few of them.

To create a folder, enter:

hadoop fs -mkdir /data

This creates a data folder in the root directory.

You can check it with the following command:

hadoop fs -ls /

Output:

Found 2 items
drwxr-xr-x   - hadoop supergroup          0 2014-12-15 19:00 /data
drwxr-xr-x   - hadoop supergroup          0 2014-12-10 22:26 /usr

This format looks a lot like ls output in a Linux shell. The leading d in drwxr-xr-x marks a directory, just as in Linux. The '-' in the second column is where the number of replicas would be shown; it is empty for a folder because a folder is only a logical structure in the namespace, and only files have a replica count. For details, refer to hadoop fs -help ls.

Now let's upload a file to /data. We use:

hadoop fs -put ./test.txt /data/

Then view it with the following command:

hadoop fs -ls /data

Output:

Found 1 items
-rw-r--r--   1 hadoop supergroup         33 2014-12-15 19:05 /data/test.txt

Here the leading '-' in -rw-r--r-- shows that this is a file, and the number of replicas is 1.
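If you want to check or change that replica count, the -stat and -setrep commands from the usage listing above can do it. A sketch (the target of 2 replicas is arbitrary):

# print the replication factor of the file (%r in the -stat format)
hadoop fs -stat %r /data/test.txt
# raise it to 2 and wait (-w) until the blocks are actually replicated
hadoop fs -setrep -w 2 /data/test.txt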

Well, let's take a look at what we've uploaded. Just as with a local file, we can use the command:

hadoop fs -text /data/test.txt

The output is not pasted here. This time we used a new command, -text, which is used to view the contents of a file; for details, refer to:

hadoop fs -help text
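Besides -text, the usage listing also has -cat, which prints the file bytes unchanged (while -text will also decode formats such as compressed files), and -get / -copyToLocal for downloading a copy. A sketch (the local destination name is made up):

# print the raw contents
hadoop fs -cat /data/test.txt
# download a copy to the local file system
hadoop fs -get /data/test.txt ./test_copy.txt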

Now let's change the permissions of the /data folder we just created. First, look at the current permissions:

Found 2 items
drwxr-xr-x   - hadoop supergroup          0 2014-12-15 19:05 /data
drwxr-xr-x   - hadoop supergroup          0 2014-12-10 22:26 /usr

Now let's execute the following command to change /data, and every file and folder under it, to 777:

hadoop fs -chmod -R 777 /data

Then view the result with the following command:

hadoop fs -lsr /

Results:

drwxrwxrwx   - hadoop supergroup          0 2014-12-15 19:05 /data
-rw-rw-rw-   1 hadoop supergroup         33 2014-12-15 19:05 /data/test.txt
drwxr-xr-x   - hadoop supergroup          0 2014-12-10 22:26 /usr
drwxr-xr-x   - hadoop supergroup          0 2014-12-10 22:56 /usr/local
drwxr-xr-x   - hadoop supergroup          0 2014-12-10 22:56 /usr/local/hadoop
drwxr-xr-x   - hadoop supergroup          0 2014-12-10 22:56 /usr/local/hadoop/tmp
drwxr-xr-x   - hadoop supergroup          0 2014-12-15 18:47 /usr/local/hadoop/tmp/mapred
drwx------   - hadoop supergroup          0 2014-12-15 18:47 /usr/local/hadoop/tmp/mapred/system
-rw-------   1 hadoop supergroup          4 2014-12-15 18:47 /usr/local/hadoop/tmp/mapred/system/jobtracker.info

This time the permissions on /data have changed, and even the file under the folder has changed! But notice that although the text file was set to 777, it still does not have the execute permission, and I am not sure why; on Linux it would show rwx. Let's try another command and see:

hadoop fs -chmod -R a+x /data

Then run again:

hadoop fs -lsr /

The full listing is not repeated here, but the execute permission on the text file still does not change; it is the same as before. It seems it really cannot be changed: HDFS has no notion of executable files, so the x bit is meaningless for a file, and chmod evidently does not apply it here:

-rw-rw-rw-   1 hadoop supergroup         33 2014-12-15 19:05 /data/test.txt
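While we are on permissions: the usage listing also includes -chown and -chgrp for changing ownership. A sketch (the owner and group names are hypothetical, and changing the owner normally requires the HDFS superuser):

# change owner and group recursively
hadoop fs -chown -R hadoop:hadoop /data
# change only the group
hadoop fs -chgrp -R supergroup /data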

The few operations above feel a lot like Linux, and they are simple; when debugging, you can just run a command and look at the result. Before wrapping up, let's delete the folder created above by executing:

hadoop fs -rmr /data

And, of course, there is also this command:

hadoop fs -rm /data/test.txt

I will not go into their differences in detail here; anyone familiar with the Linux commands should already know them!
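Still, as a quick sketch for reference (paths as above; -skipTrash comes from the usage listing and bypasses the trash when it is enabled):

# -rm removes a single file
hadoop fs -rm /data/test.txt
# -rmr removes a directory and everything under it
hadoop fs -rmr /data
# either one can bypass the trash
hadoop fs -rmr -skipTrash /data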

That's all for this time. Keep a record of every little bit!
