One: Preparatory work
1. Steps
1) Hadoop
-"Download Unzip"
-"Modify the configuration file
-"hadoop-env
Java_home
-"Core-site
Fs.defaultfs
Hadoop.tmp.dir
-"Hdfs-site
Dfs.replication
Permission
-"Mapred-site
Mapreduce.frame.work
Historyserver
-"Yarn-site
Mapreduce-"Shuffle
ResourceManager Address: 0.0.0.0
Log Aggregation
-"yarn-env
Java_home
-"Slaves
Datanode/nodemanager hostname
-"Formatting
Bin/hdfs NAMENODE-FORMATF
-"Start
2) Hive
-"Download Unzip"
-"Create Data Warehouse"
/user/hive/warehouse
-"Modify the configuration
-"hive-env
Hadoop_home
Hive_conf_dir
-"log4j
-"Log Directory"
-"Hive-site
-"Connect MySQL
-"Database Address"
-"Connection Drive"
-"User name"
-"Password"
-"Displays the current database
-"Show Table Header"
-"Put MySQL connection driver into Lib
-"Start
Two: Install Hadoop
1. Create a new directory cdh-5.3.6 and modify its permissions
2. Unzip Hadoop into it
3. Configure JAVA_HOME in the *-env.sh files
4. Configure core-site.xml
5. Configure hdfs-site.xml
6. Configure mapred-site.xml
7. Configure slaves
8. Configure yarn-site.xml
9. Format the NameNode
10. Start the daemons
Example snippets for these steps follow below.
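Steps 1-2 as a shell sketch. The install path /opt/cdh-5.3.6, the owner "hadoop", and the tarball name are assumptions (CDH 5.3.6 ships Hadoop as hadoop-2.5.0-cdh5.3.6):

    # Step 1: create the install directory and hand it to the (hypothetical) user "hadoop"
    sudo mkdir -p /opt/cdh-5.3.6
    sudo chown -R hadoop:hadoop /opt/cdh-5.3.6

    # Step 2: unzip the CDH Hadoop tarball into it (file name assumed)
    tar -zxf hadoop-2.5.0-cdh5.3.6.tar.gz -C /opt/cdh-5.3.6/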
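Step 3, roughly: point hadoop-env.sh, mapred-env.sh, and yarn-env.sh at the JDK. The JDK path below is only an example, not a required value:

    # etc/hadoop/hadoop-env.sh, mapred-env.sh, yarn-env.sh
    # JDK path is an assumption; use the JDK actually installed on the machine
    export JAVA_HOME=/usr/java/jdk1.7.0_67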
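Step 4, a minimal core-site.xml sketch; the hostname, the port 8020, and the tmp directory are assumptions for a single-node setup:

    <!-- etc/hadoop/core-site.xml -->
    <configuration>
        <!-- NameNode RPC address (hostname and port are assumptions) -->
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://hadoop-senior.example.com:8020</value>
        </property>
        <!-- working directory for HDFS metadata and other temporary data -->
        <property>
            <name>hadoop.tmp.dir</name>
            <value>/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6/data/tmp</value>
        </property>
    </configuration>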
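Step 5, a minimal hdfs-site.xml sketch for a one-node test cluster (one replica, permission checks relaxed), matching the replication and permission items in the outline:

    <!-- etc/hadoop/hdfs-site.xml -->
    <configuration>
        <!-- single node, so keep one block replica (assumption) -->
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <!-- relax HDFS permission checks for a test environment -->
        <property>
            <name>dfs.permissions.enabled</name>
            <value>false</value>
        </property>
    </configuration>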
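Step 6, a mapred-site.xml sketch that runs MapReduce on YARN and sets the JobHistory Server addresses; the hostname is an assumption, the ports are the usual defaults:

    <!-- etc/hadoop/mapred-site.xml (copied from mapred-site.xml.template) -->
    <configuration>
        <!-- run MapReduce jobs on YARN -->
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
        <!-- JobHistory Server RPC and web addresses (hostname is an assumption) -->
        <property>
            <name>mapreduce.jobhistory.address</name>
            <value>hadoop-senior.example.com:10020</value>
        </property>
        <property>
            <name>mapreduce.jobhistory.webapp.address</name>
            <value>hadoop-senior.example.com:19888</value>
        </property>
    </configuration>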
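Step 7, the slaves file simply lists the DataNode/NodeManager hostnames, one per line; the hostname here is an assumption:

    # etc/hadoop/slaves -- one DataNode/NodeManager hostname per line
    hadoop-senior.example.com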
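Step 8, a yarn-site.xml sketch covering the shuffle service, the ResourceManager host, and log aggregation from the outline; the hostname and the retention period are assumptions:

    <!-- etc/hadoop/yarn-site.xml -->
    <configuration>
        <!-- let NodeManagers run the MapReduce shuffle service -->
        <property>
            <name>yarn.nodemanager.aux-services</name>
            <value>mapreduce_shuffle</value>
        </property>
        <!-- ResourceManager host; the *-address properties bind to 0.0.0.0 by default -->
        <property>
            <name>yarn.resourcemanager.hostname</name>
            <value>hadoop-senior.example.com</value>
        </property>
        <!-- enable log aggregation and keep aggregated logs for a week (assumption) -->
        <property>
            <name>yarn.log-aggregation-enable</name>
            <value>true</value>
        </property>
        <property>
            <name>yarn.log-aggregation.retain-seconds</name>
            <value>604800</value>
        </property>
    </configuration>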
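Steps 9-10, formatting and starting, run from the Hadoop home directory; starting the daemons one at a time is just one option (start-dfs.sh and start-yarn.sh also work):

    # Step 9: format the NameNode (run once)
    bin/hdfs namenode -format

    # Step 10: start the daemons
    sbin/hadoop-daemon.sh start namenode
    sbin/hadoop-daemon.sh start datanode
    sbin/yarn-daemon.sh start resourcemanager
    sbin/yarn-daemon.sh start nodemanager
    sbin/mr-jobhistory-daemon.sh start historyserver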
Three: Install Hive
1. Unzip
The configuration follows the items outlined under Part One, 2) Hive; example snippets are below.
Building the CDH commercial version.
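For the Hive outline in Part One: create the warehouse directory in HDFS, then wire up hive-env.sh and the log4j file. The paths and the hive-0.13.1-cdh5.3.6 directory name (the Hive version bundled with CDH 5.3.6) are assumptions:

    # create the data warehouse directory in HDFS and make it group-writable
    bin/hdfs dfs -mkdir -p /user/hive/warehouse
    bin/hdfs dfs -chmod g+w /user/hive/warehouse

    # conf/hive-env.sh (paths are assumptions)
    export HADOOP_HOME=/opt/cdh-5.3.6/hadoop-2.5.0-cdh5.3.6
    export HIVE_CONF_DIR=/opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/conf

    # conf/hive-log4j.properties (renamed from the .template file): set the log directory
    hive.log.dir=/opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/logs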
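A hive-site.xml sketch for the MySQL metastore connection plus the two CLI display switches from the outline; host, database name, user, and password are placeholders:

    <!-- conf/hive-site.xml -->
    <configuration>
        <!-- MySQL metastore connection (host, database, user, password are placeholders) -->
        <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://hadoop-senior.example.com:3306/metastore?createDatabaseIfNotExist=true</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionUserName</name>
            <value>root</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionPassword</name>
            <value>123456</value>
        </property>
        <!-- show the current database name and column headers in the Hive CLI -->
        <property>
            <name>hive.cli.print.current.db</name>
            <value>true</value>
        </property>
        <property>
            <name>hive.cli.print.header</name>
            <value>true</value>
        </property>
    </configuration>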
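Finally, drop the MySQL JDBC driver into Hive's lib directory and start the CLI; the connector jar version below is an assumption:

    # put the MySQL connection driver into Hive's lib directory
    cp mysql-connector-java-5.1.27-bin.jar /opt/cdh-5.3.6/hive-0.13.1-cdh5.3.6/lib/

    # start the Hive CLI from the Hive home directory
    bin/hive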