Installing Hive 2.3.2 with Hadoop 2.7.2 on Linux


Hive is a data warehouse tool. ETL was once an indispensable part of data warehousing, and database vendors such as DB2, Oracle, and SQL Server offered a variety of data warehouse tools; the rise of the Internet has made those traditional tools look somewhat dated.


1. Hadoop Environment
Servers

Host Name   IP Address      JDK       User
master      10.116.33.109   1.8.0_65  root
slave1      10.27.185.72    1.8.0_65  root
slave2      10.25.203.67    1.8.0_65  root
2. Download Hive

http://hive.apache.org/downloads.html

apache-hive-2.3.2-bin.tar.gz


3. Installation
3.1 Decompression

cd /data/spark
tar -zxvf apache-hive-2.3.2-bin.tar.gz

3.2 Configuring environment variables

vim ~/.bash_profile

Add the following:

export HIVE_HOME=/data/spark/apache-hive-2.3.2-bin
export HIVE_CONF_DIR=$HIVE_HOME/conf
export HIVE_CLASSPATH=$HIVE_HOME/lib
export PATH=$HIVE_HOME/bin:$PATH

Run source ~/.bash_profile to make the environment variables take effect.

Complete configuration:

export HADOOP_HOME=/data/spark/hadoop-2.7.2
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_ROOT_LOGGER=INFO,console
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

export HBASE_HOME=/data/spark/hbase-2.0.0-alpha4
export PATH=$HBASE_HOME/bin:$PATH
export HBASE_CLASSPATH=/data/spark/hbase-2.0.0-alpha4/conf

export HIVE_HOME=/data/spark/apache-hive-2.3.2-bin
export HIVE_CONF_DIR=$HIVE_HOME/conf
export HIVE_CLASSPATH=$HIVE_HOME/lib
export PATH=$HIVE_HOME/bin:$PATH


3.3 Hive Configuration

Create hive-site.xml from the template:

cd /data/spark/apache-hive-2.3.2-bin/conf
cp hive-default.xml.template hive-site.xml
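The template is several thousand lines long, so a quick way to check what a given property is set to after copying is useful. A grep-based sketch (an XML-aware tool such as xmllint would be more robust; the property name here is one this guide configures below):

```shell
# Print a property's <name> line and the line after it (its <value>) from hive-site.xml
grep -A 1 '<name>hive.metastore.warehouse.dir</name>' hive-site.xml
```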

3.4 Creating the HDFS directories

hive-site.xml configures the Hive directories on HDFS:

<name>hive.metastore.warehouse.dir</name>

  <value>/user/hive/warehouse</value>

<name>hive.exec.scratchdir</name>

  <value>/tmp/hive</value>

Create the directories through Hadoop, grant permissions, and list them:

hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -chmod 777 /user/hive/warehouse

hadoop fs -mkdir -p /tmp/hive/
hadoop fs -chmod 777 /tmp/hive

hadoop fs -ls /user/hive/
hadoop fs -ls /tmp/


3.5 Create the Hive temporary directory and modify hive-site.xml
1. Change hive.downloaded.resources.dir to /data/hive/tmp and grant read/write permission: chmod -R 777 /data/hive/tmp
2. In hive.server2.logging.operation.log.location, replace ${system:user.name} with root
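Editing every occurrence by hand is error-prone; the substitution in step 2 can also be done with sed. A sketch, assuming the conf path from this guide's layout (keeping a backup first):

```shell
# Replace every ${system:user.name} in hive-site.xml with root, keeping a backup
CONF=/data/spark/apache-hive-2.3.2-bin/conf/hive-site.xml
cp "$CONF" "$CONF.bak"
sed -i 's#\${system:user\.name}#root#g' "$CONF"
grep -c 'system:user.name' "$CONF"   # 0 means every occurrence was replaced
```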

<property>
  <name>hive.exec.scratchdir</name>
  <value>/data/hive/tmp/</value>
  <description>Temporary file directory for Hive data on HDFS</description>
</property>

<property>
  <name>hive.querylog.location</name>
  <value>/data/hive/log</value>
  <description>Location of Hive runtime structured log files</description>
</property>

3.6 Configure the database, modify hive-site.xml

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.139.200:3306/hive?useSSL=false&amp;zeroDateTimeBehavior=convertToNull&amp;characterEncoding=UTF-8</value>
  <description>Connect to the hive database in MySQL via JDBC</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>JDBC MySQL driver</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>MySQL user name</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>MySQL user password</description>
</property>
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
  <description></description>
</property>
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-1.2.1.war</value>
  <description>Hive web interface</description>
</property>
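The JDBC settings above assume that MySQL already has a hive database and a hive user with password hive (both names are this guide's examples). A minimal sketch of creating them, run as the MySQL root user:

```sql
-- Create the metastore database and user assumed by hive-site.xml above
CREATE DATABASE hive DEFAULT CHARACTER SET utf8;
CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;
```

The MySQL JDBC driver jar (mysql-connector-java) also needs to be on Hive's classpath, e.g. copied into $HIVE_HOME/lib, so that the ConnectionDriverName above can load.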

3.7 hive-env.sh Configuration

cd /data/spark/apache-hive-2.3.2-bin/conf
cp hive-env.sh.template hive-env.sh
vim hive-env.sh

export HADOOP_HOME=/data/spark/hadoop-2.7.2
export HIVE_CONF_DIR=/data/spark/apache-hive-2.3.2-bin/conf
export HIVE_AUX_JARS_PATH=/data/spark/apache-hive-2.3.2-bin/lib

This avoids the error: "Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path".
4. Startup and Testing
4.1 MySQL database initialization

cd /data/spark/apache-hive-2.3.2-bin/bin
schematool -initSchema -dbType mysql

Review the output log to confirm the table structure was created correctly.

4.2 Start Hive
To start Hive's CLI service, run hive; to run it in the background: nohup hive > hive.log 2>&1 &
Start the HiveServer2 service in either of the following two ways:

nohup hiveserver2 > hiveserver2.log 2>&1 &
nohup hive --service hiveserver2 > hiveserver2.log 2>&1 &
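Once HiveServer2 is up, it can be checked with Beeline, the JDBC client shipped with Hive. A sketch, assuming HiveServer2's default port 10000 and the root user from this install:

```shell
# Connect to the local HiveServer2 and run a sanity query
beeline -u jdbc:hive2://localhost:10000 -n root -e "show databases;"
```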

5. Common Commands

-- View functions:
show functions;

-- View the details of the SUM function:
desc function sum;

-- Create a database:
create database db_hive_test;

-- Create a table:
use db_hive_test;
create table student(id int, name string) row format delimited fields terminated by '\t';

-- Manually specify a storage location:
create database hive02 location '/hive/hive02';

-- Add additional information (creation time and database notes):
create database hive03 comment 'It is my i-i-' with dbproperties('creator'='kafka', 'date'='2015-08-08');

-- View the detailed information of the database:
describe database hive03;

-- A more detailed view:
describe database extended hive03;

-- Only a database's dbproperties can be modified:
alter database hive03 set dbproperties('edited-by'='hanmeimei');
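Putting the commands above together, a quick smoke test of the new table (the database and table names follow this section's examples; run from the hive CLI):

```sql
use db_hive_test;
insert into table student values (1, 'tom');
select * from student;
```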

