hadoop fs: has the widest scope of the three and can operate on any file system.
hadoop dfs and hdfs dfs: can only operate on HDFS-related file systems (including operations that touch the local FS); the former has been deprecated, so the latter is generally used.
The following reference is from StackOverflow; it describes the three commands.
addnl is optional (the last argument of the getmerge command) and specifies that a newline character be appended at the end of each file.
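As a minimal sketch of getmerge with the addnl flag (the HDFS directory and local file name here are illustrative, and a running cluster is assumed):

```shell
# Merge every file under an HDFS directory into a single local file,
# appending a newline after each file's content (the optional addnl flag).
hadoop fs -getmerge /user/admin/logs merged.txt addnl
```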
ls
Usage: hadoop fs -ls <args>
If the argument is a file, the file's information is returned in the following format: file name
If it is a directory, it returns a list of its immediate children, as in Unix.
FS Shell
File system (FS) shell commands are invoked as bin/hadoop fs <args>. All FS shell commands take URI paths as arguments. The URI format is scheme://authority/path. For the HDFS file system the scheme is hdfs, and for the local file system the scheme is file. The scheme and authority parameters are optional; if not specified, the default scheme from the configuration is used. An HDFS file or directory such as /parent/child can be written as hdfs://namenode:namenodeport/parent/child, or more simply as /parent/child (assuming the default value in your configuration points to namenode:namenodeport).
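As a sketch of the two equivalent forms (the namenode host name and port 9000 are placeholder values, assuming the default file system in the configuration points at them):

```shell
# Fully qualified URI form
hadoop fs -ls hdfs://namenode:9000/parent/child
# Short form: falls back to the default scheme and authority from the configuration
hadoop fs -ls /parent/child
```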
hadoop fs <args>: fs relates to a generic file system that can point to any file system, such as the local FS, HDFS, etc. It can therefore be used when you are dealing with different file systems such as the local FS, HFTP FS, S3 FS, and others.
hadoop dfs <args>: dfs is very specific to HDFS and works only for operations relating to HDFS. It has been deprecated, and hdfs dfs should be used instead.
hdfs dfs <args>: the same as the second, i.e. specific to HDFS.
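The distinction above can be illustrated as follows (the paths are illustrative):

```shell
hadoop fs -ls /user/admin        # generic: works on any configured file system
hadoop fs -ls file:///tmp        # the same command against the local file system
hadoop dfs -ls /user/admin       # HDFS-specific; deprecated
hdfs dfs -ls /user/admin         # HDFS-specific; the recommended replacement
```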
Hadoop shell commands
Usage: bin/hadoop fs <args>
1. cat
Description: outputs the content of the file at the specified path to stdout.
Usage: hadoop fs -cat URI [URI …]
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
Running a job: hadoop jar <jarFile> [jobMainClass] [jobArgs]
Killing a running job:
hadoop job -kill job_20100531_37_0053
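In practice the job ID is usually found with hadoop job -list first; a sketch (the job ID is the one from the example above, and a running cluster is assumed):

```shell
# List running jobs to find the ID of the job to terminate
hadoop job -list
# Kill the job by its ID
hadoop job -kill job_20100531_37_0053
```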
More Hadoop commands
Running bin/hadoop with no arguments prints a description of more commands, for example:
namenode -format     format the DFS filesystem
secondarynamenode    run the DFS secondary namenode
namenode             run the DFS namenode
datanode             run a DFS datanode
This section introduces some commonly used Hadoop commands. Assume that the Hadoop installation directory HADOOP_HOME is /home/admin/hadoop.
Starting and stopping Hadoop
Start Hadoop:
1. Go to the HADOOP_HOME directory.
2. Execute sh bin/start-all.sh
Stop Hadoop:
1. Go to the HADOOP_HOME directory.
2. Execute sh bin/stop-all.sh
File operations
Hadoop uses HDFS, which is similar in function to the disk systems we normally use, and wildcard characters such as * are supported.
View the file list
View the files in the /user/admin/aaron directory in HDFS:
1. Go to the HADOOP_HOME directory.
2. Execute sh bin/hadoop fs -ls /user/admin/aaron
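The wildcard support mentioned above can be sketched as follows (the directory names are illustrative):

```shell
hadoop fs -ls /user/admin/aaron         # list a single directory
hadoop fs -ls /user/admin/*             # wildcard: list every child of /user/admin
hadoop fs -cat /user/admin/aaron/*.txt  # wildcards work with other commands too
```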