1. Introduction to Hadoop File System Permissions
2. Command Overview
3. References
<1>. Introduction to Hadoop File System Permissions
The file permission model implemented by the Hadoop file system (HDFS) is very similar to the POSIX model: each file and directory is associated with an owner and a group. The following command lists all files in the /tmp/ directory together with their permissions:
xuqiang@ubuntu:~/hadoop/src/hadoop-0.21.0$ ./bin/hadoop fs -ls /tmp/
11/05/10 06:42:56 INFO security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
11/05/10 06:42:56 WARN conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
Found 2 items
drwxr-xr-x   - root    supergroup          0 /tmp/hadoop-root
-rw-r--r--   3 xuqiang supergroup        541 /tmp/hello.c
For a file, r grants permission to read the file and w permission to write it. For a directory, r grants permission to read the list of files under the directory, w permission to create or delete files and subdirectories within it, and x permission to access entries inside the directory. Unlike the POSIX model, HDFS has no sticky, setuid, or setgid bits.
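Because HDFS permission bits mirror POSIX, the effect of an octal mode can be previewed on a local Linux directory; the equivalent HDFS command (`hadoop fs -chmod`, from the usage list below) is shown in the comments. A minimal local sketch; the directory name is illustrative:

```shell
#!/bin/sh
# Demonstrate rwx semantics on a local directory. On the cluster the
# analogous command would be: hadoop fs -chmod 750 /some/dir
tmp=$(mktemp -d)
mkdir "$tmp/data"
chmod 750 "$tmp/data"            # owner rwx, group r-x, other ---
mode=$(stat -c '%a' "$tmp/data")   # octal form
perms=$(stat -c '%A' "$tmp/data")  # symbolic form, as ls/hadoop fs -ls print it
echo "$mode $perms"                # 750 drwxr-x---
rm -rf "$tmp"
```

The symbolic string printed here is exactly what `hadoop fs -ls` shows in its first column, minus the sticky/setuid/setgid variants that HDFS does not support.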
<2>. Command Overview
HDFS is designed to process massive data sets, which means storing very large files (terabyte scale). HDFS splits such a file into blocks and stores them on different DataNodes, but it presents a simple abstraction to the client: a single, complete file that just happens to be large.
The HDFS file processing commands are basically the same as their Linux counterparts. Typing ./bin/hadoop fs with no arguments prints the list of supported commands:
Usage: java FsShell
        [-ls <path>]
        [-lsr <path>]
        [-df [<path>]]
        [-du [-s] [-h] <path>]
        [-dus <path>]
        [-count [-q] <path>]
        [-mv <src> <dst>]
        [-cp <src> <dst>]
        [-rm [-skipTrash] <path>]
        [-rmr [-skipTrash] <path>]
        [-expunge]
        [-put <localsrc> ... <dst>]
        [-copyFromLocal <localsrc> ... <dst>]
        [-moveFromLocal <localsrc> ... <dst>]
        [-get [-ignoreCrc] [-crc] <src> <localdst>]
        [-getmerge <src> <localdst> [addnl]]
        [-cat <src>]
        [-text <src>]
        [-copyToLocal [-ignoreCrc] [-crc] <src> <localdst>]
        [-moveToLocal [-crc] <src> <localdst>]
        [-mkdir <path>]
        [-setrep [-R] [-w] <rep> <path/file>]
        [-touchz <path>]
        [-test -[ezd] <path>]
        [-stat [format] <path>]
        [-tail [-f] <file>]
        [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
        [-chown [-R] [OWNER][:[GROUP]] PATH...]
        [-chgrp [-R] GROUP PATH...]
        [-help [cmd]]
If you are familiar with Linux, most of these commands need no explanation; the commands for a few common scenarios are listed below:
1. Adding files and directories
The directory structure of files on HDFS is similar to that of Linux, with the root represented by /. The following command creates the newdir directory under the root:
./bin/hadoop fs -mkdir /newdir
Listing the root directory shows the new directory:
xuqiang@ubuntu:~/hadoop/src/hadoop-0.21.0$ ./bin/hadoop fs -ls /
11/06/01 18:04:11 INFO security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000
11/06/01 18:04:11 WARN conf.Configuration: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
Found 3 items
drwxr-xr-x   - xuqiang supergroup          0 /jobtracker
drwxr-xr-x   - xuqiang supergroup          0 /newdir
drwxr-xr-x   - xuqiang supergroup          0 /tmp
Now that the directory exists, we can upload a local file to HDFS:
xuqiang@ubuntu:~/hadoop/src/hadoop-0.21.0$ ./bin/hadoop fs -put ./readme.txt .
Note the meaning of the trailing ".": in HDFS, each logged-in user has a default working directory, /user/$LOGNAME (similar to the home directory in Linux), and "." refers to that default working directory.
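Since "." resolves to /user/$LOGNAME, a relative upload and an absolute listing of that directory address the same files. A sketch, assuming a running cluster with the `hadoop` launcher on PATH (the `command -v` guard makes it a no-op otherwise; `readme.txt` is illustrative):

```shell
#!/bin/sh
# Relative HDFS paths resolve against /user/<login name>.
HADOOP="${HADOOP:-hadoop}"          # override to point at your hadoop launcher
login=$(whoami)
homedir="/user/$login"              # the default working directory
if command -v "$HADOOP" >/dev/null 2>&1; then
  "$HADOOP" fs -put ./readme.txt .  # ends up as /user/$login/readme.txt
  "$HADOOP" fs -ls "$homedir"       # same file, addressed absolutely
fi
```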
2. Downloading a file
xuqiang@ubuntu:~/hadoop/src/hadoop-0.21.0$ ./bin/hadoop fs -get /user/xuqiang/readme.txt .
3. Deleting a file
xuqiang@ubuntu:~/hadoop/src/hadoop-0.21.0$ ./bin/hadoop fs -rm /user/xuqiang/readme.txt
4. Getting help on a command
xuqiang@ubuntu:~/hadoop/src/hadoop-0.21.0$ ./bin/hadoop fs -help ls
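The scenarios above chain naturally into one round trip. The script below is a sketch, assuming a running cluster; the function name `hdfs_roundtrip` and the file names are illustrative, and the `command -v` guard skips execution when no `hadoop` launcher is on PATH:

```shell
#!/bin/sh
# A typical HDFS round trip: create a directory, upload a file,
# list it, download it back, and delete the remote copy.
HADOOP="${HADOOP:-hadoop}"

hdfs_roundtrip() {
  "$HADOOP" fs -mkdir /newdir
  "$HADOOP" fs -put ./readme.txt /newdir/
  "$HADOOP" fs -ls /newdir
  "$HADOOP" fs -get /newdir/readme.txt ./readme.copy.txt
  "$HADOOP" fs -rm /newdir/readme.txt
}

if command -v "$HADOOP" >/dev/null 2>&1; then
  hdfs_roundtrip
fi
```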
<3>. References
http://hadoop.apache.org/common/docs/r0.18.3/hdfs_permissions_guide.html
Hadoop in Action