Introduction to Hadoop Shell


This article takes Hadoop 2.7.3 as an example.

The bin directory contains the most basic cluster-management scripts, which let users perform various functions such as HDFS management, MapReduce job management, and so on.

As a primer, here is how the hadoop script in the bin directory is used (refer to the Hadoop Commands Reference on the official website):

    Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
      CLASSNAME            run the class named CLASSNAME
     or
      where COMMAND is one of:
      fs                   run a generic filesystem user client
      version              print the version
      jar <jar>            run a jar file
                           note: please use "yarn jar" to launch
                                 YARN applications, not this command.
      checknative [-a|-h]  check native hadoop and compression libraries availability
      distcp <srcurl> <desturl> copy file or directories recursively
      archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
      classpath            prints the class path needed to get the
                           Hadoop jar and the required libraries
      credential           interact with credential providers
      daemonlog            get/set the log level for each daemon
      trace                view and modify Hadoop tracing settings

    Most commands print help when invoked w/o parameters.
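A few typical invocations of the hadoop script might look like the following sketch; it assumes a running cluster with the bin directory on the PATH, and the jar path and class name are illustrative placeholders:

```shell
# List the HDFS root directory with the generic filesystem client
hadoop fs -ls /

# Print the Hadoop version
hadoop version

# Check availability of native Hadoop and compression libraries
hadoop checknative -a

# Run a jar file (./myapp.jar and its main class are hypothetical;
# for YARN applications, prefer "yarn jar" as the help text notes)
hadoop jar ./myapp.jar
```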

The hdfs script in the bin directory is used as follows (refer to the HDFS Commands Reference on the official website):

    Usage: hdfs [--config confdir] [--loglevel loglevel] COMMAND
           where COMMAND is one of:
      dfs                  run a filesystem command on the file systems supported in Hadoop.
      classpath            prints the classpath
      namenode -format     format the DFS filesystem
      secondarynamenode    run the DFS secondary namenode
      namenode             run the DFS namenode
      journalnode          run the DFS journalnode
      zkfc                 run the ZK Failover Controller daemon
      datanode             run a DFS datanode
      dfsadmin             run a DFS admin client
      haadmin              run a DFS HA admin client
      fsck                 run a DFS filesystem checking utility
      balancer             run a cluster balancing utility
      jmxget               get JMX exported values from NameNode or DataNode.
      mover                run a utility to move block replicas across storage types
      oiv                  apply the offline fsimage viewer to an fsimage
      oiv_legacy           apply the offline fsimage viewer to a legacy fsimage
      oev                  apply the offline edits viewer to an edits file
      fetchdt              fetch a delegation token from the NameNode
      getconf              get config values from configuration
      groups               get the groups which users belong to
      snapshotDiff         diff two snapshots of a directory or diff the
                           current directory contents with a snapshot
      lsSnapshottableDir   list all snapshottable dirs owned by the current user
                           Use -help to see options
      portmap              run a portmap service
      nfs3                 run an NFS version 3 gateway
      cacheadmin           configure the HDFS cache
      crypto               configure HDFS encryption zones
      storagepolicies      list/get/set block storage policies
      version              print the version

    Most commands print help when invoked w/o parameters.
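A minimal sketch of common hdfs subcommands, assuming a running HDFS cluster and appropriate permissions:

```shell
# Report basic cluster health and capacity as an admin client
hdfs dfsadmin -report

# Check the filesystem for missing or corrupt blocks, starting at the root
hdfs fsck /

# Read a single configuration value (here, the default filesystem URI)
hdfs getconf -confKey fs.defaultFS

# List snapshottable directories owned by the current user
hdfs lsSnapshottableDir
```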

The mapred script in the bin directory is used as follows (refer to the MapReduce Commands Reference on the official website):

    Usage: mapred [--config confdir] [--loglevel loglevel] COMMAND
           where COMMAND is one of:
      pipes                run a Pipes job
      job                  manipulate MapReduce jobs
      queue                get information regarding JobQueues
      classpath            prints the class path needed for running
                           mapreduce subcommands
      historyserver        run job history servers as a standalone daemon
      distcp <srcurl> <desturl> copy file or directories recursively
      archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
      hsadmin              job history server admin interface

    Most commands print help when invoked w/o parameters.
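For illustration, the job subcommand is the one most often used interactively; the job ID below is a hypothetical placeholder and assumes a cluster with running or completed jobs:

```shell
# List all MapReduce jobs known to the cluster
mapred job -list all

# Query the status of a specific job (job_1490000000000_0001 is illustrative)
mapred job -status job_1490000000000_0001

# Kill a running job (same illustrative job ID)
mapred job -kill job_1490000000000_0001
```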

The yarn script in the bin directory is used as follows (refer to the YARN Commands reference on the official website):

    Usage: yarn [--config confdir] [COMMAND | CLASSNAME]
      CLASSNAME            run the class named CLASSNAME
     or
      where COMMAND is one of:
      resourcemanager -format-state-store  deletes the RMStateStore
      resourcemanager      run the ResourceManager
      nodemanager          run a nodemanager on each slave
      timelineserver       run the timeline server
      rmadmin              admin tools
      sharedcachemanager   run the SharedCacheManager daemon
      scmadmin             SharedCacheManager admin tools
      version              print the version
      jar <jar>            run a jar file
      application          prints application(s) report/kill application
      applicationattempt   prints applicationattempt(s) report
      container            prints container(s) report
      node                 prints node report(s)
      queue                prints queue information
      logs                 dump container logs
      classpath            prints the class path needed to get the
                           Hadoop jar and the required libraries
      cluster              prints cluster information
      daemonlog            get/set the log level for each daemon

    Most commands print help when invoked w/o parameters.
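A short sketch of everyday yarn subcommands, assuming a running YARN cluster; the application ID is an illustrative placeholder:

```shell
# List applications currently known to the ResourceManager
yarn application -list

# Show cluster node reports
yarn node -list

# Dump the aggregated container logs of a finished application
# (application_1490000000000_0001 is a hypothetical ID)
yarn logs -applicationId application_1490000000000_0001
```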

The rcc script in the bin directory is used as follows:

    Usage: rcc --language [java|c++] ddl-files

In all of the above, --config sets the Hadoop configuration file directory; the default is ${HADOOP_HOME}/etc/hadoop. COMMAND is a specific subcommand: commonly used ones include fs, for managing the Hadoop filesystem, and jar, for submitting jobs. CLASSNAME runs the class named CLASSNAME on the Hadoop classpath.
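The --config option can be sketched as follows; the configuration directory and class name below are hypothetical:

```shell
# Run a filesystem command against a cluster whose configuration
# lives outside the default ${HADOOP_HOME}/etc/hadoop directory
# (/etc/hadoop-prod is an illustrative path)
hadoop --config /etc/hadoop-prod fs -ls /

# Run an arbitrary class by name instead of a built-in COMMAND
# (org.example.WordCount is a placeholder class on the classpath)
hadoop org.example.WordCount /input /output
```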

