Linux Command Learning (1): the ls Command
The first thing you need to know is the command prompt:
[root@localhost ~]#
root: the current login user
localhost: the host name
~: the current directory (here, the home directory)
#: the superuser prompt (a normal user's prompt is $ instead)
1.1 Command Format
command [options] [parameters]
Note: some commands do not follow this format.
Multiple options can be written together (for example, ls -al is equivalent to ls -a -l).
A side note: people used to the Windows command-line window (cmd) sometimes type ls there and see the error "'ls' is not recognized as an internal or external command". That is because cmd has no ls; its dir command provides similar functionality.
Implementing the ls Command
On Linux, ls is probably the most commonly used command. Have you ever wondered how it is implemented? It is actually not difficult once you understand the relevant UNIX environment interfaces.
Objective: use the UNIX directory interfaces to list brief information about a directory, as ls does.
An Interesting Question About ls
When we run ls -l, the output begins with a line like "total 12". Many people pay no attention to this value, but have you ever wondered what it means? From the man page, total is the total disk space allocated to the listed files, measured in blocks: 1024-byte units by default for GNU ls (the unit can be changed with --block-size or the BLOCK_SIZE environment variable).
The Linux ls Command
Command format:
ls [OPTION]... [FILE]...
Function:
Lists information about the files in a directory (the current directory by default). The output is sorted alphabetically by default.
Parameters:
-a, --all
List all files in the directory, including hidden files (names beginning with .).
-A, --almost-all
Like -a, but do not list the implied . and .. entries.
--author
With -l, print the author of each file.
The Linux ls command displays the contents of the specified working directory (the files and subdirectories it contains). Syntax: ls [-alrtAFR] [name...]. Parameters: -a: show all files and directories (a name beginning with . is treated as hidden and is not listed otherwise); -l: in addition to the file name, show the file type, permissions, owner, size, and modification time (the long listing format).
CentOS File Attributes: the ls Command
Column 1: 11 characters
The first character indicates the file type:
1) l: a symbolic link
2) d: a directory
3) -: a regular file
4) b: a block device file (disk partitions are of this type)
5) c: a character device file, such as a keyboard, mouse, printer, or tty terminal
6) s: a socket file, used for inter-process communication
Linux C Programming Exercises: System Calls, File I/O, Memory Mapping, and Writing an ls Program
1.1 Linux system call exercises
1.2 Simulating the Linux ls program to display a tree-style directory
1.3 Memory mapping for simple data sharing
Learning Linux C development is not a matter of two or three days; you need to start from the basics and practice steadily.
Hadoop's shell scripts are really well written; I was not content to just run them, and I learned a lot from reading them.
2.1 hadoop-config.sh: this script is relatively simple. Essentially all the other scripts embed it via ". $bin/hadoop-config.sh", so it does not need an interpreter line (shebang) at the top; sourcing a script with "." this way is similar to copying its contents into the parent script.
From the installation directory, run "hadoop jar hadoop-0.17.1-examples.jar wordcount <input path> <output path>" to see the word count statistics. Both the input and output paths here refer to paths in HDFS, so first create an input path in HDFS by copying a directory from the local file system into it: hadoop dfs -copyFromLocal /home/wenchu/test-in. Here, /home/wenchu/test-in is the local directory.
archive tool:

hadoop archive -archiveName input.har -p /user/hadoop/input har

-archiveName specifies the file name of the archive and -p the parent directory; more than one directory or file can be put into the archive. Let's look at the har file we just created:

drwxr-xr-x   - hadoop supergroup          0 2013-06-20 12:38 /us
Compile WordCount directly.
2) Compiling the files
When compiling with javac we use two parameters: -classpath specifies the core packages needed to compile the class, and -d specifies where the generated class files are stored. The final WordCount.java means that the compilation target is the WordCount.java file in the current folder.
$ cd /usr/hadoop/workspace/MapReduce/WordCount
$ ja
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
Step 3: Modify the IP address
Step 4: Restart
Step 5: Set up passwordless SSH login
1. Generate a key
hadoop@datanode2:~$ ssh-keygen -t rsa -P ""
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa):
/home/hadoop/.ssh/id_rsa already exists.
Overwrite (y/n)? y
Your identification has been saved in /home/hadoop/.ssh/id_rsa.
authorized_keys file on the machine: append the contents of id_rsa.pub to the end of that file (if there is no authorized_keys file, simply copy id_rsa.pub to that name).
3) First, set up passwordless SSH login for the namenode.
Switch to the hadoop user (make sure this user can log in without a password, because the Hadoop installation we set up later will be owned by this user).
Hadoop Basics: Hadoop in Action (VI): Hadoop Management Tools: Cloudera Manager and CDH
We already learned about CDH in the last article; for the following study we will install CDH 5.8. CDH 5.8 is a relatively new Hadoop distribution, based on Hadoop 2.0 or later, and it already bundles a number of related components.
sudo -u hdfs dfs -mkdir /tmp
# create a 10 GB empty file, compute its MD5 value, and put it into HDFS
dd if=/dev/zero of=/data/test_10G_file bs=1G count=10
md5sum /data/test_10G_file
sudo -u hdfs dfs -put /data/test_10G_file /tmp
sudo -u hdfs dfs -ls /tmp
# now try stopping a datanode, then pull the test file back out and compute
# its MD5 again to see whether it still matches
sudo -u hdfs dfs -get /tmp/test_10G_file /tmp/
md5sum /tmp/test_10G_file
3. Enable the YARN cluster
In addition to HDFS
This article describes, through experiments, how to operate the Hadoop file system.
First, let's look