One: Basic Linux commands:
1. Check the IP address:
$ ifconfig
2. Clear the screen:
$ clear
3. Switch to the root user:
$ su
4. View the host static IP address:
$ more /etc/sysconfig/network-scripts/ifcfg-eth0
5. Hostname:
View the hostname: $ hostname
Change the hostname: $ hostname <new hostname>
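For illustration (the new name hadoop01 is made up; a name set this way is lost on reboot, and on CentOS-style systems a permanent change is made in /etc/sysconfig/network):
$ hostname               (print the current host name)
# hostname hadoop01      (as root: temporarily set the host name to hadoop01)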
6. Directories:
View the current directory: $ pwd
Enter a subdirectory of the current directory: $ cd (e.g. $ cd data)
List all files in the current directory: $ ls
Create a directory: $ mkdir (e.g. $ mkdir softwares)
Go up to the parent directory: $ cd ..
Go to the user's home directory: $ cd
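A short example session using the commands above (data and softwares are just sample names):
$ pwd               (show the current directory)
$ mkdir softwares   (create a directory named softwares)
$ cd softwares      (enter it)
$ cd ..             (go back to the parent directory)
$ cd                (jump to the user's home directory)
$ ls                (list the files there)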
7. Files:
View a file: $ more (page-by-page view), e.g. $ more /etc/sysconfig/network
List the details of all files in the current directory: $ ll
List all files, including hidden ones: $ ls -al (a file name starting with "." marks a hidden file)
Create a file: $ vi xxx.txt
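For example (notes.txt is just a made-up file name):
$ ls -al                          (hidden files such as .bashrc appear because their names start with ".")
$ more /etc/sysconfig/network     (page through the file: space for the next page, q to quit)
$ vi notes.txt                    (create and edit a new file named notes.txt)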
8. Edit a file:
$ vim (e.g. $ vim hello.txt)
(1) The file first opens in view (normal) mode.
(2) Press "i" to enter edit (insert) mode.
(3) To save and quit: press "Esc" first, then type ":wq".
(4) To quit without saving: press "Esc", then type ":q!".
9. Modify File Permissions:
$ chmod (for example: $ chmod u+w /etc/profile; $ chmod 777 /etc/profile gives everyone full rights to the file). The permission bits are r (readable), w (writable), x (executable).
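To make the numeric modes concrete, here is a small sketch (test.sh is a made-up file name); each digit covers one r/w/x triple for user, group and others:
$ chmod u+w test.sh    (add write permission for the owner only)
$ chmod 755 test.sh    (owner: rwx, group: r-x, others: r-x)
$ chmod 777 test.sh    (everyone: rwx, i.e. all rights)
$ ls -l test.sh        (check the result, e.g. -rwxrwxrwx)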
10. Rename the directory or file:
$ mv (e.g. $ mv software softwares)
11. Move the directory or file:
$ mv (e.g. $ mv software /home/hadoop, which moves the software directory into /home/hadoop)
12. Copy the directory or file:
$ cp (e.g. $ cp hello.txt world.txt)
13. Delete a directory or file:
Delete a file: $ rm
Delete an empty folder: $ rmdir
Delete a non-empty folder: $ rm -rf
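For example (the names are placeholders):
$ rm hello.txt        (delete a single file)
$ rmdir emptydir      (delete an empty directory)
$ rm -rf softwares    (delete a non-empty directory and everything inside it; use with care)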
14. Extract an archive:
$ tar zxvf text.tgz -C <target directory> (for example: $ tar zxvf /source/kernel.tgz -C /source/linux-2.6.9)
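As a quick sketch of both directions (the archive and directory names are made up):
$ tar zcvf text.tgz text/               (pack the text/ directory into a gzip-compressed archive)
$ tar zxvf text.tgz -C /home/hadoop     (extract it into /home/hadoop; the target directory must already exist)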
15. Firewall:
Check the firewall status: # service iptables status
Temporarily: (1) start: # service iptables start (2) stop: # service iptables stop
Permanently: (1) enable: # chkconfig iptables on (2) disable: # chkconfig iptables off (specific run levels can be given with --level)
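On a CentOS/RHEL system of that generation, a typical root session looks like this:
# service iptables status      (check whether the firewall is running)
# service iptables stop        (stop it for the current session only)
# chkconfig iptables off       (keep it disabled across reboots)
# chkconfig --list iptables    (confirm the off setting for each run level)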
Two: Commands used when working with Hadoop:
1. View processes:
View Linux processes: $ ps
View Java processes: $ jps
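As a rough illustration, on a single-node Hadoop 1.x machine with all daemons running, jps prints one line per Java process (the process IDs here are arbitrary):
$ jps
2481 NameNode
2602 DataNode
2725 SecondaryNameNode
2810 JobTracker
2934 TaskTracker
3050 Jps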
2. View Hadoop-related commands
$ hadoop (for example: $ hadoop namenode -format)
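For example, a typical first-run sequence on Hadoop 1.x, assuming Hadoop's bin directory is on the PATH:
$ hadoop namenode -format      (format HDFS; done only once, before the first start)
$ start-all.sh                 (start the NameNode, DataNode, JobTracker and TaskTracker daemons)
$ jps                          (verify that the daemons are running)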
3. View the status of the Hadoop daemons (NameNode, JobTracker, etc.) through their web interfaces:
50030 (http://192.168.119.180:50030): view the Map/Reduce (JobTracker) status
50070 (http://192.168.119.180:50070): view the status of the NameNode and the whole distributed file system, browse the files and logs of the distributed file system, etc.
50060 (http://192.168.119.180:50060): view the TaskTracker running status
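These pages can also be probed from the shell; for example, using the sample host address above:
$ curl -I http://192.168.119.180:50070     (an HTTP 200 response means the NameNode web UI is up)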
4. HDFS file operations (delete, copy, etc.; these work roughly like the corresponding Linux commands and belong to the hadoop fs command set; a short worked session follows the list below)
Note: HDFS has no working directory, only a default home directory, so there is no cd-style command; paths must be given in full.
[-ls <path>]: list contents, e.g. $ hadoop fs -ls /home: list the entries under the /home directory
[-lsr <path>]: list contents recursively, e.g. $ hadoop fs -lsr /home: list everything under the /home directory
[-mv <src> <dst>]: move a directory or file
[-cp <src> <dst>]: copy a directory or file
[-rm [-skipTrash] <path>]: delete a file
[-rmr [-skipTrash] <path>]: delete a directory and its contents
[-put <localsrc> ... <dst>]: upload local files to HDFS
[-get [-ignoreCrc] [-crc] <src> <localdst>]: download a file from HDFS to the local file system
[-cat <src>]: print the contents of a file
[-text <src>]: output a file in text form
[-mkdir <path>]: create a folder, e.g. $ hadoop fs -mkdir /home/hadoop/wc: create a wc directory under /home/hadoop
[-test -[ezd] <path>]: test whether a path exists (-e), is zero-length (-z), or is a directory (-d)
[-stat [format] <path>]: print statistics about a path
[-chmod [-R] <MODE[,MODE] ... | OCTALMODE> PATH ...]: change file permissions
[-chown [-R] [OWNER][:[GROUP]] PATH ...]: change file ownership
There are many more commands in the hadoop fs command set; run $ hadoop fs to list them.
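Putting several of these together, a short worked session might look like this (hello.txt and /home/hadoop/wc follow the examples above):
$ hadoop fs -mkdir /home/hadoop/wc                           (create a directory in HDFS)
$ hadoop fs -put hello.txt /home/hadoop/wc                   (upload a local file into it)
$ hadoop fs -ls /home/hadoop/wc                              (list it; note the full path, since HDFS has no working directory)
$ hadoop fs -cat /home/hadoop/wc/hello.txt                   (print the file's contents)
$ hadoop fs -get /home/hadoop/wc/hello.txt ./hello-copy.txt  (download it back to the local file system)
$ hadoop fs -rmr /home/hadoop/wc                             (remove the directory and everything in it)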