====== one. Installation of the Hive database ======
<code>
1. First install the Hadoop environment described above.
2. Install MySQL to store the Hive metadata. By default the metadata is stored in Derby, which only supports a single connection and is suitable only for testing; a production environment uses MySQL.
3. Installation environment: CentOS 6.5, IP: 192.168.0.12
</code>
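Before installing Hive, it is a good idea to confirm that the Hadoop daemons set up earlier are actually running. A minimal check (assuming the single-node Hadoop deployment described above) looks like this:
<code>
su - hadoop
jps
# the output should include NameNode, DataNode, SecondaryNameNode,
# ResourceManager and NodeManager if Hadoop is running
</code>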
====== two. Install the MySQL database to store Hive metadata ======
<code>
yum install mysql-server
mysql -uroot -p
CREATE DATABASE hive;
UPDATE mysql.user SET Password=PASSWORD('root') WHERE User='root';
FLUSH PRIVILEGES;
</code>
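On CentOS 6.5 the mysqld service does not start automatically after installation, and many deployments prefer a dedicated metastore account instead of root. The sketch below covers both; the hive user name and password are only illustrative assumptions and are not used elsewhere in this article:
<code>
service mysqld start                 # MySQL must be running before connecting
chkconfig mysqld on                  # optional: start MySQL on boot
mysql -uroot -p
-- inside the MySQL shell: an optional dedicated account for the metastore
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive_password';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;
</code>
If such an account is used, the javax.jdo.option.ConnectionUserName and javax.jdo.option.ConnectionPassword values in section four must be changed to match it.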
====== three. Installing Hive ======
<code>
Hive requires a Java environment, which was already configured for Hadoop.
cd /data/hadoop
wget -c http://114.242.101.2:808/hive/apache-hive-2.3.2-bin.tar.gz
tar xf apache-hive-2.3.2-bin.tar.gz
mv apache-hive-2.3.2-bin hive
chown -R hadoop:hadoop hive
Set the Hive environment variable (the Hadoop variables are already set):
vim /etc/profile
#hive
export HIVE_HOME=/data/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH
source /etc/profile
</code>
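After reloading /etc/profile, a quick check confirms that the environment variables took effect (the expected paths below assume the directories used in this article):
<code>
which hive        # should print /data/hadoop/hive/bin/hive
hive --version    # should report version 2.3.2
</code>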
====== four. Modify the Hive configuration file ======
<code>
su - hadoop
cd /data/hadoop/hive/conf
mv hive-default.xml.template hive-site.xml
Clear the content between <configuration> and </configuration> in the file and add the following:
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://127.0.0.1:3306/hive?characterEncoding=UTF-8</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>root</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>root</value>
</property>
Copy the MySQL JDBC driver package to the lib directory of Hive:
cd /data/hadoop/hive/lib/
wget -c http://114.242.101.2:808/hive/mysql-connector-java-5.1.44-bin.jar
</code>
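Before moving on, it can save debugging time to verify that the driver jar is in place and that the connection settings in hive-site.xml are accepted by MySQL. A small sanity check, using the values configured above:
<code>
ls -l /data/hadoop/hive/lib/mysql-connector-java-5.1.44-bin.jar
# confirm the JDBC credentials from hive-site.xml work against MySQL
mysql -h127.0.0.1 -P3306 -uroot -proot -e "SHOW DATABASES LIKE 'hive';"
</code>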
====== five. Hive's default storage path on HDFS ======
<code>
The official documentation states:
Hive uses Hadoop, so:
you must have Hadoop in your path OR
export HADOOP_HOME=<hadoop-install-dir>
In addition, you must use the HDFS commands below to create
/tmp and /user/hive/warehouse (aka hive.metastore.warehouse.dir)
and set them chmod g+w before you can create a table in Hive.
su - hadoop
cd /data/hadoop/hadoop-2.7.4
./bin/hadoop fs -mkdir /tmp
./bin/hadoop fs -mkdir -p /user/hive/warehouse
./bin/hadoop fs -chmod g+w /tmp
./bin/hadoop fs -chmod g+w /user/hive/warehouse
</code>
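A quick listing confirms that the directories exist and carry group write permission; this is only a sanity check and assumes the same working directory as above:
<code>
./bin/hadoop fs -ls /            # /tmp should show drwxrwxr-x
./bin/hadoop fs -ls /user/hive   # /user/hive/warehouse should show drwxrwxr-x
</code>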
====== six. Run Hive ======
<code>
Output like the following indicates that Hive started successfully.
$ hive
which: no hbase in (/data/hadoop/hadoop-2.7.4/bin:/data/hadoop/hive/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.151-1.b12.el6_9.x86_64/bin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/hadoop/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/hadoop/hadoop-2.7.4/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/data/hadoop/hive/lib/hive-common-2.3.2.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
>
>
>
</code>
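The which: no hbase line and the SLF4J multiple bindings warning above are harmless. If the warning is bothersome, one commonly used workaround is to move Hive's own SLF4J binding out of the way so that only Hadoop's binding remains; the .bak suffix below is just an illustration:
<code>
cd /data/hadoop/hive/lib
mv log4j-slf4j-impl-2.6.2.jar log4j-slf4j-impl-2.6.2.jar.bak
</code>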
====== seven. Initialize the Hive database ======
<code>
su - hadoop
schematool -initSchema -dbType mysql
</code>
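In Hive 2.x the metastore schema must be initialized with schematool before tables can be created, so this step is required even though the CLI already starts in section six. Afterwards the schema version can be checked and a simple statement run as a smoke test (a sketch, assuming the MySQL settings from section four):
<code>
schematool -dbType mysql -info    # prints the metastore schema version
hive -e "SHOW DATABASES;"         # a fresh install lists only the default database
</code>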
====== eight. Run Hive operation commands ======
<code>
hive> CREATE TABLE pokes (foo INT, bar STRING);
hive> CREATE TABLE invites (foo INT, bar STRING) PARTITIONED BY (ds STRING);
hive> SHOW TABLES;
hive> SHOW TABLES '.*s';
hive> DESCRIBE invites;
hive> ALTER TABLE events RENAME TO 3koobecaf;
hive> ALTER TABLE pokes ADD COLUMNS (new_col INT);
hive> ALTER TABLE invites ADD COLUMNS (new_col2 INT COMMENT 'a comment');
hive> ALTER TABLE invites REPLACE COLUMNS (foo INT, bar STRING, baz INT COMMENT 'baz replaces new_col2');
hive> ALTER TABLE invites REPLACE COLUMNS (foo INT COMMENT 'only keep the first column');
hive> DROP TABLE pokes;
</code>
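To go one step beyond DDL, the sample data shipped with the Hive distribution can be loaded and queried. The sketch below assumes the pokes table still exists (i.e. the DROP TABLE statement above was not run) and uses the kv1.txt sample file bundled under the Hive examples directory:
<code>
hive> LOAD DATA LOCAL INPATH '/data/hadoop/hive/examples/files/kv1.txt' OVERWRITE INTO TABLE pokes;
hive> SELECT COUNT(*) FROM pokes;   -- runs a MapReduce job
hive> SELECT * FROM pokes LIMIT 10;
</code>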