Hadoop Reading Notes (II): The Shell Operations of HDFS

Tags: hadoop, fs

Hadoop Reading Notes (I): Introduction to Hadoop: http://blog.csdn.net/caicongyang/article/details/39898629


1. Shell operations

1.1 A list of all HDFS shell commands can be obtained by running hadoop fs with no arguments:

[root@hadoop ~]# hadoop fs
Usage: java FsShell
           [-ls <path>]
           [-lsr <path>]
           [-du <path>]
           [-dus <path>]
           [-count[-q] <path>]
           [-mv <src> <dst>]
           [-cp <src> <dst>]
           [-rm [-skipTrash] <path>]
           [-rmr [-skipTrash] <path>]
           [-expunge]
           [-put <localsrc> ... <dst>]
           [-copyFromLocal <localsrc> ... <dst>]
           [-moveFromLocal <localsrc> ... <dst>]
           [-get [-ignoreCrc] [-crc] <src> <localdst>]
           [-getmerge <src> <localdst> [addnl]]
           [-cat <src>]
           [-text <src>]
           [-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
           [-moveToLocal [-crc] <src> <localdst>]
           [-mkdir <path>]
           [-setrep [-R] [-w] <rep> <path/file>]
           [-touchz <path>]
           [-test -[ezd] <path>]
           [-stat [format] <path>]
           [-tail [-f] <file>]
           [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
           [-chown [-R] [OWNER][:[GROUP]] PATH...]
           [-chgrp [-R] GROUP PATH...]
           [-help [cmd]]

Generic options supported are
-conf <configuration file>    specify an application configuration file
-D <property=value>           use value for given property
-fs <local|namenode:port>     specify a namenode
-jt <local|jobtracker:port>   specify a job tracker
-files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines

The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]
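
As a quick, hedged illustration of the syntax above (assuming the hdfs://hadoop:9000 NameNode used throughout these notes), a generic option is placed before the subcommand's own arguments:

[root@hadoop ~]# hadoop fs -fs hdfs://hadoop:9000 -ls /
[root@hadoop ~]# hadoop fs -D fs.default.name=hdfs://hadoop:9000 -ls /

Both forms point the shell at an explicit NameNode instead of the one configured in core-site.xml.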

1.2 Common operations

All HDFS operations start with hadoop fs followed by the corresponding subcommand.

1.2.1 List all files in the HDFS root directory

[root@hadoop ~]# hadoop fs -ls hdfs://hadoop:9000/

hdfs://hadoop:9000 is the default file system name configured in Hadoop's core-site.xml, so the command above can be abbreviated as:

[root@hadoop ~]# hadoop fs -ls /
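
For reference, the default file system is typically declared in conf/core-site.xml roughly like this (a sketch only; the exact value and file location depend on your installation):

<property>
  <name>fs.default.name</name>
  <value>hdfs://hadoop:9000</value>
</property>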

1.2.2 File upload: upload /usr/local/hadoop-1.1.2.tar.gz from the local Linux file system to the /download directory in HDFS

[root@hadoop ~]# hadoop fs -put /usr/local/hadoop-1.1.2.tar.gz /download
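
If the /download directory does not exist yet, it can be created first; -copyFromLocal is an equivalent alternative to -put for this step (a sketch assuming the same paths as above):

[root@hadoop ~]# hadoop fs -mkdir /download
[root@hadoop ~]# hadoop fs -copyFromLocal /usr/local/hadoop-1.1.2.tar.gz /download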

1.2.3 View the uploaded file: recursively list all files under /download

[root@hadoop ~]# hadoop fs -lsr /download
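
Assuming the archive ended up at /download/hadoop-1.1.2.tar.gz, its size can be checked with -du and it can be copied back to the local file system with -get (the local destination /tmp/ is just an example):

[root@hadoop ~]# hadoop fs -du /download
[root@hadoop ~]# hadoop fs -get /download/hadoop-1.1.2.tar.gz /tmp/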

1.3 HDFS shell command help

[root@hadoop ~]# hadoop fs -help chown

-chown [-R] [OWNER][:[GROUP]] PATH...
        Changes owner and group of a file.
        This is similar to the shell's chown with a few exceptions.

        -R      modifies the files recursively. This is the only option
                currently supported.

        If only owner or group is specified then only owner or
        group is modified.

        The owner and group names may only consist of digits, alphabet,
        and any of '-_.@/' i.e. [-_.@/a-zA-Z0-9]. The names are case
        sensitive.

        WARNING: Avoid using '.' to separate user name and group though
        Linux allows it. If user names have dots in them and you are
        using the local file system, you might see surprising results since
        the shell command 'chown' is used for local files.
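
As a short example of the command this help text describes (the hadoop:hadoop owner and the /download path are illustrative, not from the original article), ownership and permissions can be changed recursively:

[root@hadoop ~]# hadoop fs -chown -R hadoop:hadoop /download
[root@hadoop ~]# hadoop fs -chmod -R 755 /download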


Everyone is welcome to discuss and learn together!

Collected here for my own future reference.

Record and share, and let's grow together! You are welcome to visit my other posts on my blog: http://blog.csdn.net/caicongyang



