Hive JDBC Connection Method

Note: four points to watch when operating.
1. Before running the connection program, add all of the jar packages under Hive's lib directory to the classpath.
2. The program may take a while to run; this is normal, just wait patiently.
3. After you enter hive --service hiveserver, it is normal for the command window to appear stuck. If you want to run it as a background process, enter hive --service hiveserver & instead.
4. In a real application the metastore is placed on a dedicated machine, and storing the metadata in MySQL is the more common approach.


I. Environment

Hadoop 0.20.2, Hive 0.5.0, JDK 1.6

II. Purpose of Use

1. In general, we operate on Hive through the CLI, that is, the Linux console. In essence, however, each connection uses its own embedded metastore (a metastore_db created in the working directory), so the metadata seen by one session is not necessarily the same as another's. For such a model, I suggest it is only appropriate for testing and not suitable for product development and application.

2. Therefore, this article uses the JDBC way of connecting; of course there are other ways to connect, such as ODBC. A minimal connection sketch follows below.
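As a preview of what the JDBC route looks like (the full example is in section IV), here is a minimal sketch that only registers the driver shipped with Hive and opens a connection. The "ip" in the URL is a placeholder for the HiveServer host, and 10000 is the default port.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class HiveConnectSketch {
    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        // Driver class for the standalone (Thrift) HiveServer.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        // "ip" is a placeholder host name; 10000 is the default HiveServer port.
        Connection con = DriverManager.getConnection("jdbc:hive://ip:10000/default", "", "");
        System.out.println("Connected to Hive");
        con.close();
    }
}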

III. Connection Configuration

1. Modify hive-site.xml

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <!-- Embedded Derby: create=true means the database, named metastore_db, is created automatically -->
  <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
  <!-- Client-server Derby: hadoopor is the database name, 192.168.0.3 is the IP address of the Derby server, and 4567 is the server's port -->
  <!-- <value>jdbc:derby://192.168.0.3:4567/hadoopor;create=true</value> -->
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <!-- Embedded Derby -->
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  <!-- Client-server Derby -->
  <!-- <value>org.apache.derby.jdbc.ClientDriver</value> -->
  <description>Driver class name for a JDBC metastore</description>
</property>

2. For embedded Derby, the file derby.jar must be present in Hive's lib directory; client-server Derby additionally requires derbyclient.jar, which has to be downloaded separately.

3. After the configuration is complete, enter hive --service hiveserver to start the service. A quick way to check that the service is actually listening is sketched below.
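This is a minimal sketch, assuming the server runs on 192.168.0.3 (substitute your own host) and listens on the default HiveServer port 10000; it simply probes the port before the JDBC client is started.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class HiveServerCheck {
    public static void main(String[] args) {
        String host = "192.168.0.3"; // placeholder HiveServer host
        int port = 10000;            // default HiveServer (Thrift) port
        Socket socket = new Socket();
        try {
            // Fail after five seconds if nothing is listening on the port.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("HiveServer is reachable at " + host + ":" + port);
        } catch (IOException e) {
            System.out.println("Cannot reach HiveServer: " + e.getMessage());
        } finally {
            try {
                socket.close();
            } catch (IOException ignored) {
            }
        }
    }
}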

IV. Example Code

The following is the source code for connecting to Hive through JDBC.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveJdbcClient {

    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
            System.exit(1);
        }

        // "ip" is a placeholder for the HiveServer host; 10000 is the default port.
        Connection con = DriverManager.getConnection("jdbc:hive://ip:10000/default", "", "");
        Statement stmt = con.createStatement();
        String tableName = "http_test";
        stmt.executeQuery("drop table " + tableName);
        ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");

        // show tables
        String sql = "show tables";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }

        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }

        // load data into table
        // NOTE: filePath has to be local to the hive server
        // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields
        /**
         * String filePath = "/tmp/a.txt";
         * sql = "load data local inpath '" + filePath + "' into table " + tableName;
         * System.out.println("Running: " + sql);
         * res = stmt.executeQuery(sql);
         *
         * // select * query
         * sql = "select * from " + tableName;
         * System.out.println("Running: " + sql);
         * res = stmt.executeQuery(sql);
         * while (res.next()) {
         *     System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
         * }
         *
         * // regular hive query
         * sql = "select count(1) from " + tableName;
         * System.out.println("Running: " + sql);
         * res = stmt.executeQuery(sql);
         * while (res.next()) {
         *     System.out.println(res.getString(1));
         * }
         */
    }
}
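The example above never closes its Statement or Connection. As a usage sketch under the same assumptions (placeholder host, default port 10000, table http_test), the query can be wrapped with explicit cleanup in a finally block, since JDK 1.6 has no try-with-resources:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveJdbcQuery {
    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        // "ip" is a placeholder for the HiveServer host; 10000 is the default port.
        Connection con = DriverManager.getConnection("jdbc:hive://ip:10000/default", "", "");
        Statement stmt = null;
        try {
            stmt = con.createStatement();
            ResultSet res = stmt.executeQuery("select count(1) from http_test");
            while (res.next()) {
                System.out.println(res.getString(1));
            }
        } finally {
            // Release the JDBC resources even if the query fails.
            if (stmt != null) {
                stmt.close();
            }
            con.close();
        }
    }
}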

V. Summary

1. You can refer to this thread: http://bbs.hadoopor.com/thread-219-1-3.html

2. The wiki is more authoritative; see: http://wiki.apache.org/hadoop/Hive/HiveClient

3. Taobao also has relevant material; see: http://www.tbdata.org/archives/499

4. This document is fairly rough; if you have any questions, feel free to email: dajuezhao@gmail.com
