hadoop fs: has the broadest scope and can operate on any file system. hadoop dfs and hdfs dfs: operate only on HDFS-related file systems (including the local FS where relevant); hadoop dfs is deprecated, so hdfs dfs is typically used instead. The following reference is from StackOverflow: the three commands look the same but have subtle differences.
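As a quick, hedged illustration of the difference (the path /user/hadoop/file1 is only an example), all three forms below read the same HDFS file, but only hadoop fs also accepts non-HDFS URIs such as file:///:
hadoop fs -cat /user/hadoop/file1    # works with any configured file system (hdfs://, file:///, ...)
hadoop dfs -cat /user/hadoop/file1   # HDFS only; deprecated, newer releases print a deprecation notice
hdfs dfs -cat /user/hadoop/file1     # HDFS only; the recommended replacement for hadoop dfs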
After installing Hadoop in pseudo-distributed mode, running the relevant commands (for example bin/hdfs dfs -ls) may print "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable". This happens because the installed native libraries do not match the platform.
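To check whether the native libraries can actually be loaded on your platform, Hadoop 2.x and later ship a checknative subcommand (shown here as a sketch; older releases may not have it):
hadoop checknative -a    # reports which native libraries (zlib, snappy, ...) were found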
distcp parallel copying
Between Hadoop clusters running the same version:
hadoop distcp hdfs://namenode1/foo hdfs://namenode2/bar
Between clusters running different HDFS versions, executed on the destination (writing) side:
hadoop distcp hftp://namenode1:50070/foo hdfs://namenode2/bar
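distcp also takes a few commonly used options; the lines below are a sketch with placeholder paths, not output from a real run:
hadoop distcp -update hdfs://namenode1/foo hdfs://namenode2/bar      # copy only files that are missing or differ
hadoop distcp -overwrite hdfs://namenode1/foo hdfs://namenode2/bar   # always overwrite files at the destination
hadoop distcp -p -update hdfs://namenode1/foo hdfs://namenode2/bar   # additionally preserve permissions and other status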
Hadoop archives
Hadoop Shell commands
Use bin/hadoop fs
1. cat
Description: outputs the contents of the file at the specified path to stdout.
Usage: hadoop fs -cat URI [URI ...]
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/hadoop/
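Because -cat writes to stdout, it combines naturally with ordinary Unix pipes; for example, to peek at the first lines of a (hypothetical) large file:
hadoop fs -cat /user/hadoop/file1 | head -n 20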
Running a job: sh bin/hadoop jar /home/admin/hadoop/job.jar [jobMainClass] [jobArgs]
Killing a running job:
hadoop job -kill job_20100531_37_0053
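If the job id is not known, the same job subcommand can list and inspect jobs first (the id below is the placeholder used above):
hadoop job -list                            # list running jobs and their ids
hadoop job -status job_20100531_37_0053     # show the state and progress of one job
hadoop job -kill job_20100531_37_0053       # kill it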
More Hadoop commands
hadoop
Running hadoop with no arguments prints the descriptions of the available commands, for example:
namenode -format     format the DFS filesystem
secondarynamenode    run the DFS secondary namenode
namenode             run the DFS namenode
datanode             run a DFS datanode
Introduction to some common commands in Hadoop. Assume the Hadoop installation directory HADOOP_HOME is /home/admin/hadoop.
Start and stop Hadoop
Start Hadoop: 1. Go to the HADOOP_HOME directory. 2. Run sh bin/start-all.sh
Stop Hadoop: 1. Go to the HADOOP_HOME directory. 2. Run sh bin/stop-all.sh
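A minimal start/stop session under the /home/admin/hadoop layout assumed above (Hadoop 1.x style scripts) might look like:
cd /home/admin/hadoop
sh bin/start-all.sh    # starts the HDFS and MapReduce daemons
jps                    # verify that the daemons are running
sh bin/stop-all.sh     # stop them again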
Run a job: 1. Enter the HADOOP_HOME directory. 2. Execute sh bin/hadoop jar /home/admin/hadoop/job.jar [jobMainClass] [jobArgs]
Kill a running job. Suppose the job_id is job_201005310937_0053: 1. Enter the HADOOP_HOME directory. 2. Execute sh bin/hadoop job -kill job_201005310937_0053
More commands for Hadoop. The operations
This article provides a detailed analysis of some commonly used Hadoop commands. Assume the Hadoop installation directory HADOOP_HOME is /home/admin/hadoop.
Start and stop
Start Hadoop: 1. Go to the HADOOP_HOME directory. 2. Execute sh bin/start-all.sh
Stop Hadoop: 1. Go to the HADOOP_HOME directory. 2. Execute sh bin/stop-all.sh
Run a job: 1. Go to the HADOOP_HOME directory. 2. Execute sh bin/hadoop jar /home/admin/hadoop/job.jar [jobMainClass] [jobArgs]
Killing a running job. Assume that the job_id is job_20100531_37_0053: 1. Go to the HADOOP_HOME directory. 2. Execute sh bin/hadoop job -kill job_20100531_37_0053
More Hadoop commands. The Hadoop operation
A. Common Hadoop commands
1. The fs command for Hadoop
# List all hadoop fs commands
hadoop fs
# Upload a file (both put and copyFromLocal are upload commands)
hadoop fs -put jdk-7u55-linux-i586.tar.gz hdfs://hucc01:9000/jdk
hadoop fs -copyFromLocal jdk-7u55-linux-i586.tar.gz hdfs://hucc01:9000/jdk
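A few more everyday fs operations, reusing the same (illustrative) host, port, and file names:
hadoop fs -ls hdfs://hucc01:9000/            # list the root directory
hadoop fs -mkdir hdfs://hucc01:9000/test     # create a directory
hadoop fs -get hdfs://hucc01:9000/jdk ./jdk  # download a file (get and copyToLocal are equivalent)
hadoop fs -rm hdfs://hucc01:9000/jdk         # delete a file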
#Compact a single column family within a region:
#hbase> major_compact 'r1', 'c1'
#Compact a single column family within a table:
#hbase> major_compact 't1', 'c1'
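For comparison, the HBase shell can also compact without naming a column family; compact performs a minor compaction while major_compact rewrites all store files (table and region names are placeholders):
#Compact all regions in a table:
#hbase> compact 't1'
#Major compact a whole table:
#hbase> major_compact 't1'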
Configuration management and node restart
1) Modify the HDFS configuration. HDFS configuration location: /etc/hadoop/conf
# Synchronize the HDFS configuration
cat /home/hadoop/slaves | xargs -i -t scp /etc/hadoop/conf/hdfs-site.xml
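The scp command above is truncated in the source; a hedged sketch of a full synchronization followed by a DataNode restart (the remote destination path and the location of hadoop-daemon.sh are assumptions about the installation) could look like:
# push the updated hdfs-site.xml to every host listed in the slaves file
cat /home/hadoop/slaves | xargs -i -t scp /etc/hadoop/conf/hdfs-site.xml {}:/etc/hadoop/conf/
# then restart the DataNode on each affected host
hadoop-daemon.sh stop datanode
hadoop-daemon.sh start datanode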
Run a job: 1. Enter the HADOOP_HOME directory. 2. Perform sh bin/hadoop jar /home/admin/hadoop/job.jar [jobMainClass] [jobArgs]
Kill a running job. Suppose the job_id is job_201005310937_0053: 1. Enter the HADOOP_HOME directory. 2. Perform sh bin/hadoop job -kill job_201005310937_0053
More Hadoop commands. The operation commands
FS Shell
File system (FS) shell commands are invoked in the form bin/hadoop fs.
cat
Usage: hadoop fs -cat URI [URI ...]
Outputs the contents of the file at the specified path to stdout.
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/
To make it easier to refresh your memory, we summarize the Hadoop commands from today's experiment for later review. Note: the following commands are run in hadoop/bin.
1. hadoop fs -ls -> view all the directories under the given path.
2.
Commands related to the Hadoop processes:
1. View processes
View processes under Linux: $ ps; view Java processes under Linux: $ jps
2. View Hadoop-related commands
$ hadoop (usage such as: $ hadoop namenode -format)
3. Vi
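Putting steps 1 and 2 together, a quick health check of a running pseudo-distributed node (daemon names correspond to Hadoop 1.x) could be:
jps                      # should show NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
hadoop fs -ls /          # confirm the file system responds
hadoop dfsadmin -report  # summary of datanodes and capacity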