Wang Jialin's in-depth, case-driven practice of cloud computing and distributed big data with Hadoop, July 6-7 in Shanghai
Wang Jialin, Lecture 4, Hadoop graphic-and-text training course: building a real, hands-on Hadoop distributed cluster environment. The specific steps to solve the problem are as follows:
Step 1: Check the Hadoop logs to see the cause of the error (see the sketch after these steps);
Step 2: Stop the cluster;
Step 3: Solve the problem based on the reasons indicated in the log. We need to clear the…
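For example, the first two steps might look like this (a sketch, assuming a standard Hadoop 1.x layout where logs live under $HADOOP_HOME/logs; the log file name pattern depends on your user and host names):
$ tail -n 50 logs/hadoop-*-namenode-*.log
$ bin/stop-all.sh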
Behaves like Unix mkdir -p: it creates parent directories at all levels along the path.
Example:
hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2
hadoop fs -mkdir hdfs://host1:port1/user/hadoop/dir hdfs://host2:port2/user/hadoop/dir
The console output should show that the NameNode, DataNode, Secondary NameNode, JobTracker, and TaskTracker have been started. After startup is complete, you can see the five new Java processes through ps -ef.
$ bin/start-all.sh
$ ps -ef
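If jps (shipped with the JDK) is available, it is a quicker check than ps -ef. A sketch of the expected listing for a single-node Hadoop 1.x setup (the PIDs here are made up, and the order varies):
$ jps
2769 NameNode
2891 DataNode
3010 SecondaryNameNode
3142 JobTracker
3267 TaskTracker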
(5) Run the WordCount application, as shown in code listing 6:
$ bin/hadoop dfs -put ./test-in input
# Copy the local ./test-in directory into HDFS as 'input'
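The rest of the listing would run the example and print the result. A sketch, assuming the Hadoop 1.2.1 examples jar (the exact jar name depends on your version):
$ bin/hadoop jar hadoop-examples-1.2.1.jar wordcount input output
$ bin/hadoop dfs -cat output/*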
Except that the source path is restricted to a local file, it is similar to the put command.
copyToLocal
Usage:
hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
Except that the target path is a local file, it is similar to the get command.
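For example (the paths here are hypothetical):
hadoop fs -copyToLocal /user/hadoop/file localfile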
cp
Usage:
hadoop fs -cp URI [URI …] <dest>
Copies files from the source path to the target path. This command allows multiple source paths, in which case the target path must be a directory.
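For example, mirroring the examples in the Apache docs:
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir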
Not much to say; straight to the dry goods! Guide: installing Hadoop under Windows. Do not underestimate installing and using big data components under Windows: anyone who has played with Dubbo and disconf knows that installing ZooKeeper under Windows is often non-trivial (see the disconf learning series, "the most detailed, latest stable disconf deployment, based on Windows 7/8/10").
When prompted for a password, just press Enter to skip it. At this point two files will be generated under /home/{username}/.ssh: id_rsa and id_rsa.pub; the former is the private key, the latter the public key. Now we append the public key to authorized_keys (authorized_keys stores the public keys of all clients allowed to log in over SSH as the current user):
~$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
You can now log in via SSH to confirm that you no longer need to enter a password when connecting.
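To verify, try an SSH login to the local machine; it should no longer prompt for a password (the very first connection may still ask you to confirm the host key):
~$ ssh localhost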
variables; the second is to read the parameters passed on the command line from $_SERVER['argv']. Here, input is read from standard input (stdin).
Its usage is as follows:
On the Linux console, enter ./wc_mapper.php;
wc_mapper.php runs and waits on the console for the user's keyboard input;
Type some text on the keyboard;
Press Ctrl + D to terminate the input; wc_mapper.php then starts to execute the real business logic and outputs the results.
So where is stdout? print itself already writes to stdout.
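Putting it together, a quick pipe test; note that the tab-separated word/count output shown here is only an assumption about how wc_mapper.php formats its pairs:
$ echo "hello world hello" | ./wc_mapper.php
hello	1
world	1
hello	1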
Exit code: returns 0 on success and -1 on error.
put
Usage: hadoop fs -put <localsrc> ... <dst>
Copies a single src, or multiple srcs, from the local file system to the destination file system. Also reads input from stdin and writes it to the destination file system.
hadoop fs -put localfile /user/hadoop/hadoopfile
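The stdin form mentioned above, mirroring the Apache docs example:
hadoop fs -put - hdfs://host:port/hadoop/hadoopfile
Here the file content is read from standard input.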
[Hadoop] How to install Hadoop
Hadoop is a distributed system infrastructure that lets users develop distributed programs without understanding the underlying distributed details.
The core of Hadoop is HDFS and MapReduce: HDFS provides distributed storage, and MapReduce provides distributed computation.
The user of the command must be the owner of the file or the superuser. For more information, see the HDFS Permissions User Guide.
chown
How to use: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI …]
Changes the owner of a file. Using -R makes the change recursively through the directory structure. The user of the command must be a superuser. For more information, see the HDFS Permissions User Guide.
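For example (the user and group names are hypothetical):
hadoop fs -chown -R hadoop:hadoop /user/hadoop/dir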
copyFromLocal
How to use: hadoop fs -copyFromLocal <localsrc> URI
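For example (the paths are hypothetical):
hadoop fs -copyFromLocal localfile /user/hadoop/hadoopfile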
FS Shell
cat
chgrp
chmod
chown
copyFromLocal
copyToLocal
cp
du
dus
expunge
get
getmerge
ls
lsr
mkdir
moveFromLocal
mv
put
rm
rmr
setrep
stat
tail
test
text
touchz
FS Shell
The file system (FS) shell commands should be invoked in the form bin/hadoop fs <args>.
Original address: http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_shell.html
Hadoop Study Notes 0004 -- Installing the Hadoop plugin for Eclipse. 1. Download hadoop-1.2.1.tar.gz and unzip it under Win7 to hadoop-1.2.1; 2. If hadoop-1.2.1 does not contain the hadoop-eclipse-plugin-1.2.1.jar package, download it from the internet…
hadoop fs -mv hdfs://host:port/file1 hdfs://host:port/file2 hdfs://host:port/file3 hdfs://host:port/dir1
Return value: returns 0 on success and -1 on failure.
put
How to use: hadoop fs -put <localsrc> ... <dst>
Copies single or multiple source paths from the local file system to the target file system. Reading input from standard input and writing it to the target file system is also supported.
Hadoop
software; you may also find it useful to install screen, to start sessions for work on the remote servers, and nmap, to check server ports in case something is not working in the cluster networking: sudo apt-get install screen nmap. Repeat this installation procedure, up to this point, on every node you have in the cluster. The following is necessary only on the first node. We start a screen session to work remotely without fear of losing work if disconnected: screen -S installing. After that you can…
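A sketch of those two steps on a Debian/Ubuntu node (the session name "installing" just matches the one used above):
$ sudo apt-get install screen nmap
$ screen -S installing
Detach with Ctrl+A then D, and reattach later with screen -r installing; nmap localhost gives a quick scan of the node's open ports.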
user of the command must be a superuser. For more information, see the HDFS Permissions User Guide.
5: copyFromLocal
How to use: hadoop fs -copyFromLocal <localsrc> URI
Except that the source path is restricted to a local file, it is similar to the put command.
6: copyToLocal
How to use: hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
Except that the target path is restricted to a local file, it is similar to the get command.
cp
How to use: hadoop fs -cp URI [URI …] <dest>
Copies files from the source path to the destination path. This command allows multiple source paths, in which case the destination path must be a directory.