Spring Boot with Apache Hive

Source: Internet
Author: User

5.29.1. Maven
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-hadoop</artifactId>
    <version>2.5.0.RELEASE</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc -->
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>2.3.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.eclipse.jetty.aggregate</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.tomcat</groupId>
    <artifactId>tomcat-jdbc</artifactId>
    <version>8.5.20</version>
</dependency>
5.29.2. application.properties

Hive Data Source Configuration Items

hive.url=jdbc:hive2://172.16.0.10:10000/default
hive.driver-class-name=org.apache.hive.jdbc.HiveDriver
hive.username=hadoop
hive.password=

The user name must have HDFS write permission; the password may be left empty.
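To illustrate, the connection settings above can be exercised directly with a plain JDBC URL. This is a minimal sketch; buildHiveUrl is a hypothetical helper (not part of any library), and the host, port, and database are the values used in this section:

```java
// Sketch: assembling the HiveServer2 JDBC URL from its parts.
public class HiveUrlExample {
    public static String buildHiveUrl(String host, int port, String database) {
        return String.format("jdbc:hive2://%s:%d/%s", host, port, database);
    }

    public static void main(String[] args) {
        String url = buildHiveUrl("172.16.0.10", 10000, "default");
        System.out.println(url); // jdbc:hive2://172.16.0.10:10000/default
        // With the hive-jdbc driver on the classpath, the connection would be
        // opened like this -- the user must have HDFS write permission, and
        // the password may be an empty string:
        // Connection conn = DriverManager.getConnection(url, "hadoop", "");
    }
}
```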

If you use the YAML format, application.yml is configured as follows:

hive:
  url: jdbc:hive2://172.16.0.10:10000/default
  driver-class-name: org.apache.hive.jdbc.HiveDriver
  type: com.alibaba.druid.pool.DruidDataSource
  username: hive
  password: hive
5.29.3. Configuration
package cn.netkiller.config;

import org.apache.tomcat.jdbc.pool.DataSource;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class HiveConfig {
    private static final Logger logger = LoggerFactory.getLogger(HiveConfig.class);

    @Autowired
    private Environment env;

    @Bean(name = "hiveJdbcDataSource")
    @Qualifier("hiveJdbcDataSource")
    public DataSource dataSource() {
        DataSource dataSource = new DataSource();
        dataSource.setUrl(env.getProperty("hive.url"));
        dataSource.setDriverClassName(env.getProperty("hive.driver-class-name"));
        dataSource.setUsername(env.getProperty("hive.username"));
        dataSource.setPassword(env.getProperty("hive.password"));
        logger.debug("Hive DataSource");
        return dataSource;
    }

    @Bean(name = "hiveJdbcTemplate")
    public JdbcTemplate hiveJdbcTemplate(@Qualifier("hiveJdbcDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}

You can also use DruidDataSource:

package cn.netkiller.api.config;

import javax.sql.DataSource;

import com.alibaba.druid.pool.DruidDataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class HiveDataSource {

    @Autowired
    private Environment env;

    @Bean(name = "hiveJdbcDataSource")
    @Qualifier("hiveJdbcDataSource")
    public DataSource dataSource() {
        DruidDataSource dataSource = new DruidDataSource();
        dataSource.setUrl(env.getProperty("hive.url"));
        dataSource.setDriverClassName(env.getProperty("hive.driver-class-name"));
        dataSource.setUsername(env.getProperty("hive.username"));
        dataSource.setPassword(env.getProperty("hive.password"));
        return dataSource;
    }

    @Bean(name = "hiveJdbcTemplate")
    public JdbcTemplate hiveJdbcTemplate(@Qualifier("hiveJdbcDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
5.29.4. CRUD Operation Example

The insert and delete operations on a Hive database are no different from those on other databases.

package cn.netkiller.web;

import java.util.Iterator;
import java.util.List;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.servlet.ModelAndView;

@Controller
@RequestMapping("/hive")
public class HiveController {
    private static final Logger logger = LoggerFactory.getLogger(HiveController.class);

    @Autowired
    @Qualifier("hiveJdbcTemplate")
    private JdbcTemplate hiveJdbcTemplate;

    @RequestMapping("/create")
    public ModelAndView create() {
        StringBuffer sql = new StringBuffer("CREATE TABLE IF NOT EXISTS ");
        sql.append("hive_test ");
        sql.append("(KEY INT, VALUE STRING) ");
        sql.append("PARTITIONED BY (CTIME DATE) "); // partitioned storage
        sql.append("ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' "); // define the delimiters
        sql.append("STORED AS TEXTFILE"); // store as text
        logger.info(sql.toString());
        hiveJdbcTemplate.execute(sql.toString());
        return new ModelAndView("index");
    }

    @RequestMapping("/insert")
    public String insert() {
        hiveJdbcTemplate.execute("INSERT INTO hive_test (key, value) VALUES ('Neo', 'Chen')");
        return "Done";
    }

    @RequestMapping("/select")
    public String select() {
        String sql = "SELECT * FROM hive_test";
        List<Map<String, Object>> rows = hiveJdbcTemplate.queryForList(sql);
        Iterator<Map<String, Object>> it = rows.iterator();
        while (it.hasNext()) {
            Map<String, Object> row = it.next();
            System.out.println(String.format("%s\t%s", row.get("key"), row.get("value")));
        }
        return "Done";
    }

    @RequestMapping("/delete")
    public String delete() {
        StringBuffer sql = new StringBuffer("DROP TABLE IF EXISTS ");
        sql.append("hive_test");
        logger.info(sql.toString());
        hiveJdbcTemplate.execute(sql.toString());
        return "Done";
    }
}
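Because hive_test is partitioned by ctime, an insert can also target a specific partition by naming it in the statement. The following is a sketch only; partitionedInsert is a hypothetical helper that builds the Hive DML string, which would then be passed to hiveJdbcTemplate.execute():

```java
// Sketch: building an INSERT statement that writes into one partition of
// the partitioned table created above (PARTITIONED BY (CTIME DATE)).
public class HivePartitionInsert {
    public static String partitionedInsert(String table, String ctime, int key, String value) {
        return String.format(
            "INSERT INTO %s PARTITION (ctime='%s') VALUES (%d, '%s')",
            table, ctime, key, value);
    }

    public static void main(String[] args) {
        System.out.println(partitionedInsert("hive_test", "2017-09-01", 1, "neo"));
        // INSERT INTO hive_test PARTITION (ctime='2017-09-01') VALUES (1, 'neo')
    }
}
```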
