Installing Hive on Hadoop and Calling Hive from Java


1. Installing Hive

Before installing Hive, make sure Hadoop is installed; if it is not, refer to "CentOS Install Hadoop Cluster" for installation;

   1.1. Download and unzip

Download Hive 2.1.1: http://mirror.bit.edu.cn/apache/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz;

Unzip the downloaded Hive package into /usr/local:

        tar -zxvf apache-hive-2.1.1-bin.tar.gz -C /usr/local/

Go to the /usr/local directory and rename the extracted directory to hive:
        mv apache-hive-2.1.1-bin/ hive

    1.2. Set the Hive environment variables

vi ~/.bashrc

Add at the end of the file:

export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin

Make the configuration take effect:

source ~/.bashrc

  2. Install MySQL

To install MySQL, refer to "Yum Install MySQL Database";

  3. Download MySQL driver package

Download the MySQL JDBC driver; the version used here is mysql-connector-java-5.1.35.jar;

Copy the jar into hive/lib:

cp mysql-connector-java-5.1.35.jar /usr/local/hive/lib

4. Modify Hive Configuration

   4.1. Modify the log configuration

cp hive/conf/hive-log4j2.properties.template hive/conf/hive-log4j2.properties

vim hive/conf/hive-log4j2.properties

Modify:

property.hive.log.dir = /usr/local/hive/logs/

   4.2. Modify hive-site.xml

cp hive/conf/hive-default.xml.template hive/conf/hive-site.xml

vim hive/conf/hive-site.xml

Delete all of the contents (any property not set here falls back to the defaults in hive-default.xml.template), then add the following configuration:

<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://H30:3306/hive?createDatabaseIfNotExist=true</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>root</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>root</value>
    </property>
</configuration>

Here H30 is the hostname of the machine where Hive is installed;

  5. Start Hive

Before starting Hive, make sure that Hadoop and MySQL are running.

Execute the schematool command to initialize the metastore:

      schematool -dbType mysql -initSchema

After initialization, execute the hive command to enter the Hive command-line interface:

      hive

If an error occurs when Hive connects to MySQL, the connection must be authorized in MySQL with GRANT;
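For example, assuming the root/root credentials configured in hive-site.xml above (adjust the user, host, and password to your environment), a typical authorization in the MySQL client looks like:

```sql
-- Allow the metastore user to connect from any host (MySQL 5.x syntax assumed)
GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY 'root';
FLUSH PRIVILEGES;
```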

 6. Calling Hive using the Java API

    6.1. Modify the hive-site.xml configuration

Add the following configuration inside the <configuration></configuration> element in hive/conf/hive-site.xml:

<property>
    <name>hive.server2.thrift.bind.host</name>
    <value>192.168.3.238</value>
    <description>Bind host on which to run the HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_BIND_HOST</description>
</property>
<property>
    <name>hive.server2.long.polling.timeout</name>
    <value>5000</value>
    <description>Time in milliseconds that HiveServer2 will wait before responding to asynchronous calls that use long polling</description>
</property>

6.2. Start the Metastore Service

hive --service metastore &

   6.3. Start the HiveServer2 Service

hive --service hiveserver2 &

Check whether HiveServer2 is listening with the following command:

        netstat -nl |grep 10000

Check hive/logs if errors occur;

   6.4. Write the Java client

First, list the jar packages that the program relies on:

The project code can be downloaded from HTTP://URL.CN/4EVRDBW
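As a minimal sketch of such a client (the class name HiveJdbcDemo and the hiveUrl helper are illustrative; the host 192.168.3.238 matches hive.server2.thrift.bind.host above, 10000 is the default HiveServer2 port, and the hive-jdbc 2.1.1 driver with its dependencies must be on the classpath), connecting to HiveServer2 over JDBC might look like:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcDemo {
    // Build a HiveServer2 JDBC URL of the form jdbc:hive2://host:port/db
    static String hiveUrl(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) throws Exception {
        String url = hiveUrl("192.168.3.238", 10000, "default");
        System.out.println(url);
        // The connection itself needs a live HiveServer2, so only attempt
        // it when an argument is passed on the command line.
        if (args.length > 0) {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(url, "root", "root");
                 Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("show tables")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}
```

The user name passed to getConnection should match the proxy user configured in core-site.xml (root in this article).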

If an error occurs while the project is running, the following can be added to the core-site.xml configuration of the Hadoop master node:

<property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
</property>

Here root is the MySQL database user name that the project uses to connect to Hive;

   

Common Hive Operations

1. Create table t_hive, with "," as the column delimiter
CREATE TABLE t_hive (a int, b int, c int) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

2. Load t_hive.txt into t_hive; the file uses "," as the column delimiter
LOAD DATA LOCAL INPATH '/home/cos/demo/t_hive.txt' OVERWRITE INTO TABLE t_hive;

3. View tables and data
SHOW TABLES;            -- view all tables in the database
SHOW TABLES '*t*';      -- find tables in the database with a regular expression
SELECT * FROM t_hive;   -- view the data in the t_hive table
DESC t_hive;            -- view the table structure of t_hive

4. Modify
ALTER TABLE t_hive ADD COLUMNS (new_col string);   -- add the new_col field
ALTER TABLE t_hive RENAME TO t_hadoop;             -- rename t_hive to t_hadoop

5. Delete
DROP TABLE t_hive;       -- drop the table
TRUNCATE TABLE t_hive;   -- delete all data from the t_hive table


