After installing Hadoop in pseudo-distributed mode, running an HDFS command (for example, bin/hdfs dfs -ls) may print the warning "WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable". This happens because the installed native libraries do not match the platform.
FS Shell
File system (FS) shell commands are invoked in the form bin/hadoop fs <args>.
cat
Usage: hadoop fs -cat URI [URI ...]
Writes the contents of the specified files to stdout.
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/
This article was originally posted on my blog. We know that HDFS is Hadoop's distributed file system, and since it is a file system it must at least be able to manage files and folders, just like our Windows operating system: create, modify, delete, move, copy, change permissions, and so on. Now let's look at how Hadoop performs these operations. First, enter the hadoop fs command.
http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/FileSystemShell.html#Overview
FS Shell: file system (FS) shell commands are invoked in the form bin/hadoop fs <args>. All of the FS shell commands take URI paths as arguments.
If the -skipTrash option is specified, the trash (if enabled) is bypassed and the specified file(s) are deleted immediately.
Usage: hadoop fs -rmr URI [URI ...]
Example: hadoop fs -rmr /flume
25. setrep: changes the replication factor of a file. The -R option recursively changes the replication factor of all files under a directory.
Usage: hadoop fs -setrep [-R] [-w] <rep> <path>
"Shell" literally means a shell: it wraps the Linux kernel and provides a human-machine interface through which a series of Linux commands can issue instructions to the operating system. The shell can also combine a series of Linux commands with its conditional statements.
I. Introduction to HDFS shell commands
We all know that HDFS is a distributed file system for data access. HDFS operations are the basic file-system operations: creating, modifying, and deleting files, changing permissions, and creating, deleting, and renaming folders. The HDFS commands are similar to the Linux commands for operating on files.
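To make the parallel concrete, the sketch below exercises the local Linux versions of these operations (the /tmp/demo path is just an example); each comment shows the analogous hadoop fs form, which requires a running cluster:

```shell
mkdir -p /tmp/demo            # hadoop fs -mkdir /tmp/demo
echo data > /tmp/demo/f1      # hadoop fs -put f1 /tmp/demo
ls /tmp/demo                  # hadoop fs -ls /tmp/demo
mv /tmp/demo/f1 /tmp/demo/f2  # hadoop fs -mv /tmp/demo/f1 /tmp/demo/f2
chmod 644 /tmp/demo/f2        # hadoop fs -chmod 644 /tmp/demo/f2
rm /tmp/demo/f2               # hadoop fs -rm /tmp/demo/f2
```

The one-to-one mapping is why the FS shell feels familiar: only the bin/hadoop fs prefix and the URI-style paths differ.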
Invoked without parameters, the yarn script in the bin directory prints a help message. Usage of the yarn script (from the website):
Usage: yarn [--config confdir] [COMMAND | CLASSNAME]
CLASSNAME runs the class named CLASSNAME, or COMMAND is one of:
resourcemanager -format-state-store: deletes the RMStateStore
resourcemanager: run the ResourceManager
nodemanager: run a NodeManager on each slave
timelineserver: run the timeline server
Common Linux shell commands
I. Basic commands
1. Shut down immediately and restart. Execute:
shutdown -r now   (or: reboot)
2. Shut down immediately. Execute:
shutdown -h now   (or: poweroff)
3. Shut down after waiting two minutes:
shutdown -h +2
Some common shell commands
Common system commands
passwd [username]: change a user's password
useradd [username]: add a user
su -: switch to the root user
ssh [hostname]: connect over SSH
cd [directory]: change to the directory
sudo chown [username] [file]: change the file's owner to grant the corresponding permissions
=> 'f2', METHOD => 'delete'}
(7) Count the number of rows:
The code is as follows:
hbase> count 't1'
hbase> count 't1', INTERVAL => 100000
hbase> count 't1', CACHE => 1000
hbase> count 't1', INTERVAL => 10, CACHE => 1000
count is typically time-consuming; for large tables, use the MapReduce-based row counter instead. Scanned rows are cached (CACHE, 10 rows by default), and the current count is displayed every INTERVAL rows (1000 by default).
(8) disable and enable operations
Many operations need the table to be disabled first.
Executing shell commands and shell scripts
First, save the shell commands as a .sh file.
Save the above code as test.sh and cd into the corresponding directory:
chmod +x ./test.sh   # grant the script execute permission
./test.sh            # execute the script
If the error /bin/bash
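A minimal end-to-end run of the steps above (the script body here is just a stand-in; /tmp/test.sh is an example path):

```shell
# Create a trivial script, make it executable, and run it.
cat > /tmp/test.sh <<'EOF'
#!/bin/bash
echo "hello from test.sh"
EOF
chmod +x /tmp/test.sh   # grant execute permission
/tmp/test.sh            # run it; prints: hello from test.sh
```

If the file was edited on Windows, the /bin/bash interpreter error mentioned above is usually caused by CRLF line endings, which can be stripped with a tool such as dos2unix.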
($results);
foreach ($results as $key => $value) {
    print "$key\t$value\n";
}
The purpose of this code is to count the number of times each word appears and print the result one word per line, in the form "hello 2" / "world 1".
4. Run it with Hadoop
Upload the sample text for the statistics.
The code is as follows:
hadoop fs -put *.txt /tmp/input
Run the PHP MapReduce program in Streaming mode.
The code is as follows:
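The Streaming invocation itself is cut off in the original. Because Streaming mappers and reducers just read stdin and write stdout, the same word-count flow can be tested locally with ordinary pipes before submitting to the cluster; the sketch below simulates it with standard tools (no Hadoop needed), and the mapper.php/reducer.php names are placeholders for your own scripts:

```shell
# Simulate the Streaming word-count pipeline locally:
# the "mapper" emits one word per line, sort plays the shuffle phase,
# and the "reducer" counts consecutive duplicates.
printf 'hello world\nhello hadoop\n' \
  | tr -s ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2 "\t" $1}'

# With real scripts the equivalent local test would be:
#   cat *.txt | php mapper.php | sort | php reducer.php
```

If the local pipeline produces the expected tab-separated word counts, the same scripts can be passed to the cluster via the Hadoop Streaming jar.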
Using shell commands to analyze statistics logs
When you need to collect user log information and analyze user behavior, the shell makes it easy to extract a lot of data and place it in Excel for statistics.
For example,
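As a concrete sketch of this kind of log analysis (the log format, field positions, and /tmp/access.log path are assumptions for illustration), the usual awk/sort/uniq pipeline tallies how often each value appears:

```shell
# Sample access log, one "IP DATE URL" record per line (format assumed).
cat > /tmp/access.log <<'EOF'
10.0.0.1 2015-10-01 /index
10.0.0.2 2015-10-01 /login
10.0.0.1 2015-10-02 /index
EOF

# Count requests per URL (field 3), most frequent first.
awk '{print $3}' /tmp/access.log | sort | uniq -c | sort -rn
```

Swapping `$3` for `$1` gives per-IP counts instead; the resulting "count value" table pastes straight into Excel.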
Shell script execution: simply save the Linux commands you normally use, in order, into a text file, then add execute permission (chmod +x <target file>); that file becomes a shell script.