Importing Hive Table Data into Elasticsearch


1. Add elasticsearch-hadoop-hive-2.1.2.jar to Hive. For how to add a third-party JAR to Hive, see: http://blog.csdn.net/qianshangding0708/article/details/50381966

2. Create the Elasticsearch external table in Hive:

@Test
public void testEsTable() {
	try {
		hiveHelper.executeNonQuery("CREATE EXTERNAL TABLE es_user (id string, name string, age int, create_date string) "
				+ "STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler' "
				+ "TBLPROPERTIES ('es.resource' = 'es_hive/user_{create_date}', "
				+ "'es.index.auto.create' = 'true', "
				+ "'es.nodes' = '10.0.1.75:9200,10.0.1.76:9200,10.0.1.77:9200')");
	} catch (Exception e) {
		e.printStackTrace();
	}
}

For clarity, here is the SQL statement again:

CREATE EXTERNAL TABLE es_user (
	id string,
	name string,
	age int,
	create_date string
)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
	'es.resource' = 'es_hive/user_{create_date}',
	'es.index.auto.create' = 'true',
	'es.nodes' = '10.0.1.75:9200,10.0.1.76:9200,10.0.1.77:9200'
);

es.resource: the target index/type. The {create_date} pattern is resolved per row, so each row is written to a type named after its create_date value (e.g. user_2015-12-22).

es.index.auto.create: whether the index should be created automatically if it does not exist.

es.nodes: the addresses of the Elasticsearch cluster nodes.
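To illustrate how the {create_date} placeholder in es.resource directs each row to its own index/type, here is a minimal sketch. It only mimics the substitution behavior; it is not elasticsearch-hadoop's actual implementation, and the class name is made up for this example:

```java
// Illustrative sketch only: mimics how the {create_date} placeholder in
// es.resource is resolved per row. Not elasticsearch-hadoop's real code.
public class EsResourcePattern {

    // Replace a {field} placeholder in the resource pattern with the row's value.
    static String resolve(String pattern, String field, String value) {
        return pattern.replace("{" + field + "}", value);
    }

    public static void main(String[] args) {
        String resource = "es_hive/user_{create_date}";
        // A row whose create_date is 2015-12-22 lands in index es_hive,
        // type user_2015-12-22.
        System.out.println(resolve(resource, "create_date", "2015-12-22"));
    }
}
```

This is why rows loaded under different partition dates end up in different types without any extra configuration.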

3. Create the Hive table:

@Test
public void testHiveTable() {
	try {
		hiveHelper.executeNonQuery("CREATE TABLE IF NOT EXISTS hive_user (id string, name string, age int) "
				+ "PARTITIONED BY (create_date string) "
				+ "ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' "
				+ "LINES TERMINATED BY '\\n' STORED AS TEXTFILE");
	} catch (Exception e) {
		e.printStackTrace();
	}
}

SQL statement:

CREATE TABLE IF NOT EXISTS hive_user (
    id string,
    name string,
    age int
)
PARTITIONED BY (create_date string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;


4. Upload the data file and load it into the Hive table (hive_user):

Source data: kkk.txt

1,fish1,1
2,fish2,2
3,fish3,3
4,fish4,4
5,fish5,5
6,fish6,6
7,fish7,7
8,fish8,8
9,fish9,9
Upload kkk.txt to the /fish/hive/ directory on HDFS.
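Because hive_user was declared with FIELDS TERMINATED BY ',', Hive splits each line of kkk.txt on commas and maps the pieces to the table's columns in order. A minimal sketch of that mapping (illustrative only, not Hive's actual parser; the class name is made up):

```java
// Illustrative sketch: how one line of kkk.txt maps to the hive_user columns
// given ROW FORMAT DELIMITED FIELDS TERMINATED BY ','. Not Hive's real code.
public class DelimitedRow {

    // Split one text line into column values on the field delimiter.
    static String[] parse(String line) {
        return line.split(",", -1); // -1 keeps trailing empty fields
    }

    public static void main(String[] args) {
        String[] cols = parse("1,fish1,1");
        // cols[0] -> id, cols[1] -> name, cols[2] -> age
        System.out.println(cols[0] + "|" + cols[1] + "|" + cols[2]);
    }
}
```

The partition column create_date is not in the file at all; it is supplied by the PARTITION clause of the LOAD statement below.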


@Test
public void testLoadHiveTable() {
	try {
		hiveHelper.executeNonQuery("LOAD DATA INPATH '/fish/hive/kkk.txt' "
				+ "INTO TABLE hive_user PARTITION (create_date='2015-12-22')");
	} catch (Exception e) {
		e.printStackTrace();
	}
}

This loads the file into the 2015-12-22 partition of the Hive table.

Query the hive_user table:

hive> select * from hive_user;
OK
1	fish1	1	2015-12-22
2	fish2	2	2015-12-22
3	fish3	3	2015-12-22
4	fish4	4	2015-12-22
5	fish5	5	2015-12-22
6	fish6	6	2015-12-22
7	fish7	7	2015-12-22
8	fish8	8	2015-12-22
9	fish9	9	2015-12-22
Time taken: 0.041 seconds, Fetched: 9 row(s)

The data has been loaded into Hive.


5. Insert the data from the Hive table into Elasticsearch:

@Test
public void testInsertElasticsearch() {
	try {
		hiveHelper.executeNonQuery("INSERT OVERWRITE TABLE es_user "
				+ "SELECT s.id, s.name, s.age, s.create_date FROM hive_user s "
				+ "WHERE s.create_date='2015-12-22'");
	} catch (Exception e) {
		e.printStackTrace();
	}
}
Checking Elasticsearch shows that the data has been successfully written.
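To verify from outside Hive, you can query Elasticsearch directly over HTTP. A small sketch that builds the _search URL for one partition's data, following the es_hive/user_{create_date} layout and one of the node addresses defined in step 2 (the class name is made up; fetching the URL is left to curl or any HTTP client):

```java
// Sketch: build the _search URL for the index/type that the es.resource
// pattern es_hive/user_{create_date} produces for a given partition date.
// The node address comes from es.nodes in step 2; adjust for your cluster.
public class EsSearchUrl {

    static String searchUrl(String node, String date) {
        return "http://" + node + "/es_hive/user_" + date + "/_search";
    }

    public static void main(String[] args) {
        // e.g. curl this URL to list the documents written in step 5
        System.out.println(searchUrl("10.0.1.75:9200", "2015-12-22"));
    }
}
```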

For more details, see: https://www.elastic.co/guide/en/elasticsearch/hadoop/current/hive.html
