Cloud Computing, Distributed Big Data, Hadoop Hands-On, Part 8: Hadoop Graphic Training Course: Hadoop File System Operations

Source: Internet
Author: User
Tags: hadoop fs

This document describes how to operate the Hadoop file system through hands-on experiments.

Complete release directory of "Cloud Computing Distributed Big Data Hadoop Hands-On"

Cloud computing and distributed Big Data practical technology Hadoop exchange group: 312494188. Cloud computing practice materials are released in the group every day. Welcome to join us!

 

First, let's look at some common Hadoop file system operation commands:

The first common command: hadoop fs -ls

For example, run the following command to list the files and folders in the root directory of the file system:
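hadoop fs -ls /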

 

 

The second common command: hadoop fs -mkdir

For example, use the following command to create a subdirectory under the root directory of HDFS:
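Taking jialingege as the subdirectory name (the same folder that appears in the -get example below; any name will do):

hadoop fs -mkdir /jialingege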

 

 

The third common command: hadoop fs -get

For example, run the following command to copy the jialingege folder from the root directory of HDFS to a local directory:
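Using the current local working directory as the destination (any local path could be substituted):

hadoop fs -get /jialingege .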

 

 

The fourth common command: hadoop fs -put srcfile destfile

For example, run the following command to copy stop-all.sh from the current directory to the root directory of HDFS:
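hadoop fs -put stop-all.sh /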

 

 

For more Hadoop file system operation commands, see the file system shell commands page on the official website: http://hadoop.apache.org/docs/stable/file_system_shell.html

 

As you can see, many of these commands are the same as those of the Linux file system. For example, the cat command in Linux prints the content of a file to the screen, and it has the same meaning in the Hadoop file system. Note that the file system operation commands in Hadoop 1.1.2 have changed, but the official website has not been updated accordingly. For example, let's look at the cat command:

 

The command prefix shown on the documentation page is "dfs", but in Hadoop 1.1.2 it is "fs". We also saw this in the previous hands-on practice with the HDFS command-line tool.
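For instance, to print the stop-all.sh file that was copied to the HDFS root directory above, the Hadoop 1.1.2 form is:

hadoop fs -cat /stop-all.sh

whereas the documentation page writes it with the "dfs" prefix (hadoop dfs -cat).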

For operations on the Hadoop file system, follow the official documentation and conduct a small experiment step by step. I will not go into details here.

 

 

 

 

 
