Hadoop Common Commands


HDFS basic command syntax:
hadoop fs -cmd <args>
cmd: the specific operation, largely the same as the corresponding Unix command
args: the arguments to the operation
HDFS resource URI format:
scheme://authority/path
scheme: protocol name, file or hdfs
authority: NameNode host name
path: the file or directory path
Example: hdfs://localhost:9000/user/chunk/test.txt
Assuming fs.default.name=hdfs://localhost:9000 is configured in core-site.xml, you can shorten this to just /user/chunk/test.txt.
The default working directory in HDFS is /user/$USER, where $USER is the current login user name.
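For example, with that fs.default.name setting the following two commands address the same file, once by full URI and once by the shortened path:

hadoop fs -cat hdfs://localhost:9000/user/chunk/test.txt
hadoop fs -cat /user/chunk/test.txt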
HDFS command examples:
hadoop fs -mkdir /user/trunk
hadoop fs -ls /user
hadoop fs -lsr /user (recursive listing)
hadoop fs -put test.txt /user/trunk
hadoop fs -put test.txt . (copy to the HDFS current directory, which must be created first)
hadoop fs -get /user/trunk/test.txt . (copy to the local current directory)
hadoop fs -cat /user/trunk/test.txt
hadoop fs -tail /user/trunk/test.txt (view the last kilobyte of the file)
hadoop fs -rm /user/trunk/test.txt
hadoop fs -help ls (show the help documentation for the ls command)


hadoop fs -cat '/user/hive/warehouse/ci_cuser_20141231141853691/_SUCCESS';
hadoop fs -cat '/user/hive/warehouse/ci_cuser_20141231141853691/*' > ci_cuser_20141231141853691.csv && echo $?
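The same pattern can be written as a check-then-export step; this is only a sketch, and the assumption that the job marks completion with an _SUCCESS file is carried over from the example above:

# export the table directory to CSV only if the _SUCCESS marker exists
dir=/user/hive/warehouse/ci_cuser_20141231141853691
if hadoop fs -test -e "$dir/_SUCCESS"; then
    hadoop fs -cat "$dir/*" > ci_cuser_20141231141853691.csv
fi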

~/.bash_profile: each user can put shell settings specific to their own account in this file. It is executed only once, when the user logs in. By default it sets some environment variables and then sources the user's .bashrc file.
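A minimal example of such a ~/.bash_profile; the HADOOP_HOME install path below is hypothetical:

# executed once at login: export environment variables, then source .bashrc
export HADOOP_HOME=/usr/local/hadoop   # hypothetical install path
export PATH=$PATH:$HADOOP_HOME/bin
if [ -f ~/.bashrc ]; then
    . ~/.bashrc
fi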


hadoop fs -cat "$1$2/*" > "$3.csv"
mv "$3.csv" /home/ocdc/coc
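Wrapped into a complete script this might look as follows; the script name and the meaning of the three arguments are assumptions read off the two lines above:

#!/bin/bash
# usage: ./export_table.sh <warehouse_path> <table_dir> <output_name>  (hypothetical name)
# concatenate the table's files into a local CSV, then move it to the target directory
hadoop fs -cat "$1$2/*" > "$3.csv"
mv "$3.csv" /home/ocdc/coc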

String command = "cd " + ciFtpInfo.getFtpPath() + " && " + hadoopPath + "hadoop fs -cat '/user/hive/warehouse/" + listName + "/*' > " + listTableName + ".csv;";
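With hypothetical values substituted in (the FTP path, Hadoop path, and table names below are made up for illustration), the string above evaluates to a shell command like:

cd /data/ftp && /opt/hadoop/bin/hadoop fs -cat '/user/hive/warehouse/my_list/*' > my_table.csv;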

/home/ocdc/spark-1.2.0-oc-bin-2.3.0-cdh5.1.3/bin/beeline -u jdbc:hive2://10.1.251.98:10000 -n ocdc -p asiainfo
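beeline can also run a statement non-interactively with -e; the query here is only an illustration:

/home/ocdc/spark-1.2.0-oc-bin-2.3.0-cdh5.1.3/bin/beeline -u jdbc:hive2://10.1.251.98:10000 -n ocdc -p asiainfo -e "SHOW TABLES;"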

tar zxvf filename.tar.gz (extract a gzipped tar archive)
cp filename1 filename2 (copy a file)

Common administrator commands:

hadoop job -list  # list running jobs

hadoop job -kill <job_id>  # kill a job

hadoop fsck /  # check HDFS block status and report corrupted blocks

hadoop fsck / -delete  # check HDFS block status and delete corrupted blocks

hadoop dfsadmin -report  # check HDFS status, including DataNode information

hadoop dfsadmin -safemode enter | leave  # enter or leave safe mode

hadoop distcp hdfs://a:8020/xxx hdfs://b:8020///  # copy in parallel between clusters
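For example, to check whether the NameNode is in safe mode before forcing it out (a small sketch using the same dfsadmin tool):

hadoop dfsadmin -safemode get    # print whether safe mode is ON or OFF
hadoop dfsadmin -safemode leave  # exit safe mode so writes are allowed again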
