chmod
chmod [options] mode file/dir
Modify permissions for a file or directory
chown
chown [options] user[.group] file/dir
Modify the owner of a file
chgrp
chgrp [-R] group file/dir
Modify the owning group of a file
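For example (the user and group names are illustrative):
$ chown tom report.txt          # make tom the owner of report.txt
$ chown -R tom.dev /srv/www     # set owner tom and group dev, recursively
$ chgrp -R dev /srv/www         # change only the owning group, recursively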
IV. Systems and Networks
Command
Meaning
passwd xxx
Change a user's password
df -ah
View disk space
ps -ef | grep
View processes
kill -9
Kill a process
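For example (the user name, filter pattern, and PID below are illustrative):
$ passwd tom            # change user tom's password (prompts for the new password twice)
$ df -ah                # show usage of all file systems in human-readable units
$ ps -ef | grep sshd    # list all processes and filter for sshd
$ kill -9 1234          # forcibly terminate the process with PID 1234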
Example 1: Export the database mydb to the e:\mysql\mydb.sql file.
c:\> mysqldump -h localhost -u root -p mydb > e:\mysql\mydb.sql
Then enter the password and wait for the export to finish; check the target file to confirm it succeeded.
Example 2: Export the table mytable from database mydb to the e:\mysql\mytable.sql file.
c:\> mysqldump -h localhost -u root -p mydb mytable > e:\mysql\mytable.sql
Example 3: Export the structure of database mydb to the e:\mysql\mydb_stru.sql file.
c:\> mysqldump -h localhost -u root -p mydb --add-drop-table > e:\mysql\mydb_stru.sql
Note: -h localhost can be omitted; it is generally needed only on a virtual host.
3) Export the data structure only. Format: mysqldump -u [database username] -p -d [database name] > [exported file name]
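To load such a dump back into a database, the companion command redirects the file into the mysql client (same illustrative paths as above):
c:\> mysql -h localhost -u root -p mydb < e:\mysql\mydb.sql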
We take RHEL 6.3 as an example to illustrate. A Linux command may be followed by options, and some options take option values. Options are preceded by a dash "-", and spaces separate the command, its options, and the option values. Some commands take no options but do take parameters. An option is a built-in feature of the command, while a parameter is user-supplied content that conforms to the command's format.
1.1.1. Command prompt
Right-click on the desktop and open a terminal to get a command prompt.
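For example (the directory is illustrative):
$ ls -l /home
Here ls is the command, -l is an option, and /home is a parameter.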
Format: mysqladmin -u [username] -p[old password] password [new password]
Example 1: Add the password ab12 to the root user. First, enter the mysql bin directory in DOS, and then type the following command:
mysqladmin -u root password ab12
Remarks: Because the root account has no password at first, the -p[old password] part can be omitted.
Example 2: Change the root password to djg345.
mysqladmin -u root -pab12 password djg345
1.3 Adding new users
Remarks: Unlike the commands above, the following are run inside the MySQL environment, so each command ends with a semicolon as its terminator.
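For illustration, a user is typically created with a GRANT statement of this kind inside the mysql client (the user name, host, and password here are hypothetical; this is the old GRANT ... IDENTIFIED BY form of MySQL 5.x):
mysql> grant select,insert,update,delete on mydb.* to test1@"%" identified by "abc";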
...the password is entered after execution. For example:
scp -r /home/space/music/ root@www.cumt.edu.cn:/home/root/others/
scp -r /home/space/music/ www.cumt.edu.cn:/home/root/others/
The above commands copy the local music directory to the remote others directory; after copying, it is available remotely as the /home/root/others/music/ directory.
Copy from remote to local
To copy from remote to local, simply exchange the order of the two path arguments of the local-to-remote command above. For example:
scp root@www.cumt.edu.cn:/home/root/ot...
... but with the same MRv1 computing framework as CDH3, in order to ensure that code developed earlier against the company's online environment can still run correctly.
(1) Hadoop (HDFS) build
Environment preparation
OS: CentOS 6.4 x86_64
Servers:
hadoop-master: 172.17.20.230, 10 GB memory - namenode
hadoop-secondarynamenode: 172.17.20.234, 10 GB memory - secondary (backup) namenode, jobtracker
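A common first step is to make these host names resolvable on every node, for example in /etc/hosts (a sketch using only the addresses listed above):
172.17.20.230  hadoop-master
172.17.20.234  hadoop-secondarynamenode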
Hadoop
An error is reported during the build: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (site) on project hadoop-hdfs: An Ant BuildException has occured: input file /usr/local/hadoop-2.6.0-stable/hadoop-2.6.0-src/hadoop-hdfs-project/hadoop-hdfs/target/findbugsXml.xml
1. Hadoop Java API
The main programming language for Hadoop is Java, so the Java API is the most basic external programming interface.
2. Hadoop Streaming
1. Overview
Hadoop Streaming is a toolkit designed to make it easier for non-Java users to write MapReduce programs: a programming tool provided by Hadoop that allows any executable or script to be used as the mapper and reducer.
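For instance, a streaming job can use ordinary shell tools as the mapper and reducer (the jar path and HDFS paths are illustrative):
$ hadoop jar $HADOOP_HOME/share/hadoop/tools/lib/hadoop-streaming-*.jar \
    -input /user/hadoop/input \
    -output /user/hadoop/output \
    -mapper /bin/cat \
    -reducer "/usr/bin/wc -l"
Here cat passes each record through unchanged as the map step, and wc -l counts lines as the reduce step.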
Basics (10)----Linux Network configuration detailed steps---Bridging mode and remote communication for two machines
If you are unfamiliar with virtual machine networking, you can also refer to:
Hadoop Foundation-------Virtual Machines (V)-----Three modes of network configuration for virtual machine Linux systems
PS: After everything was set up, I found that both the physical machine and the virtual machine could ping 192.168.x.1, and the physical machine could ping the virtual machine, so the configuration is complete and can be used later.
The result of executing the hadoop command is as follows:
[hadoop@localhost ~]$ hadoop
Usage: hadoop [--config confdir] [COMMAND | CLASSNAME]
  CLASSNAME            run the class named CLASSNAME
 or
  where COMMAND is one of:
  fs                   run a generic filesystem user client
  version              print the version
  jar <jar>            run a jar file
                       note: please use "yarn jar" to launch
                             YARN applications, not this command.
...script. hadoop-daemons.sh starts the Hadoop distributed programs on all machines by calling slaves.sh. slaves.sh runs a given set of commands on all machines (over passwordless SSH) for the upper-layer scripts to use. start-dfs.sh starts the namenode on the local machine, the datanodes on the machines listed in slaves, and the secondarynamenode on the master machine, by calling hadoop-daemon.sh and hadoop-daemons.sh.
Through the parameter datanode, the script resolves CLASS='org.apache.hadoop.hdfs.server.datanode.DataNode', sets the Java CLASSPATH environment variable and the maximum JVM heap size, and then executes the java command to run the DataNode class; the script eventually starts the process by executing a command of the following form:
exec "$JAVA" $JAVA_HEAP_MAX $HADOOP_OPTS -classpath "$CLASSPATH" $CLASS ...
This document describes how to operate a hadoop file system through experiments.
First, let's look at ...
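As a warm-up, a few basic HDFS shell operations of the kind these experiments use (paths and file names are illustrative):
$ hadoop fs -mkdir /user/hadoop/input
$ hadoop fs -put local.txt /user/hadoop/input/
$ hadoop fs -ls /user/hadoop/input
$ hadoop fs -cat /user/hadoop/input/local.txt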
As a matter of fact, you can easily set up the distributed framework runtime environment by referring to the Hadoop official documentation. However, it is worth writing a little more here and paying attention to some details that would otherwise take a long time to discover. Hadoop can run on a single machine, or it can be configured to run as a cluster. To run on a single machine, you only...
sync_h_script   # In fact, these two commands are aliases for my own salt commands; see /opt/hadoop_scripts/profile.d/hadoop.sh
III. Monitoring
A common solution is Ganglia plus Nagios: Ganglia collects a large number of metrics and presents them graphically, while Nagios triggers an alarm when a metric exceeds its threshold.
In fact, Hadoop itself has an interface for exposing these metrics (...).
Configuration:
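One way to wire this up, assuming the Ganglia 3.1 sink that ships with Hadoop's metrics2 system, is a hadoop-metrics2.properties along these lines (the address and port are illustrative):
*.sink.ganglia.class=org.apache.hadoop.metrics2.sink.ganglia.GangliaSink31
*.sink.ganglia.period=10
namenode.sink.ganglia.servers=239.2.11.71:8649
datanode.sink.ganglia.servers=239.2.11.71:8649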
After Hadoop is started, the namenode starts and stops the various daemons on each datanode through SSH (Secure Shell). This requires that no password be entered when commands are executed between nodes, so we need to configure SSH to use passwordless public-key authentication. Take the three machines in this article as an example: node1 is the master node, and it needs to be able to log in to the other two nodes over SSH without a password.
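A minimal sketch of that setup, assuming a hadoop user and the node names node1/node2/node3 (run on node1):
$ ssh-keygen -t rsa -P ""        # generate a key pair with an empty passphrase
$ ssh-copy-id hadoop@node2       # append the public key to node2's authorized_keys
$ ssh-copy-id hadoop@node3
$ ssh hadoop@node2 hostname      # should print node2 without prompting for a password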
Chapter 2: MapReduce Introduction
An ideal split size is usually the size of one HDFS block. Hadoop performs best when the node executing a map task is the node that also stores its input data (the data-locality optimization, which avoids transferring the data over the network).
MapReduce process summary: a row of data is read from the file and processed by the map function, which returns key-value pairs; the system then sorts the map output. If there are multiple reduce tasks, the map output is partitioned among them.
... -C bin/ specifies the object to be packaged, and the final "." means the file generated by packaging is saved in the current directory.
[... WordCount]$ jar -cvf WordCount.jar -C bin/ .
added manifest
adding: WordCount$TokenizerMapper.class (in = 1736) (out = 754) (deflated 56%)
adding: WordCount$IntSumReducer.class (in = 1739) (out = 74...
Special note: the last character of the package command is "."; it means the generated WordCount.jar is saved to the current folder, so take special care not to omit it when entering the command.
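Once packaged, the jar can be submitted with the hadoop jar command; a sketch of a typical invocation (the main class name and HDFS paths are assumptions, not taken from this article):
[... WordCount]$ hadoop jar WordCount.jar WordCount /user/hadoop/input /user/hadoop/output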