Upgrade the Hive server and Hive client


Article from: http://blog.csdn.net/lili72

Background: with an older version of Hive, bugs kept occurring that we could not resolve -- for example, partitions could not be found in tables. Under high concurrency, the following exceptions were common:

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hive.DELETEME1414791576856' doesn't exist

FAILED: SemanticException [Error 10006]: Partition not found '2014-10-26'

FAILED: SemanticException Line 1:99 Exception while processing 'ST_USER_INFO': Unable to fetch table ST_USER_INFO

FAILED: Error in metadata: MetaException (message: java.lang.RuntimeException: commitTransaction was called but openTransactionCalls = 0. This probably indicates that there are unbalanced calls to openTransaction/commitTransaction)

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

(Increasing the maximum number of connections from Hive to MySQL did not help: show variables like 'max_conn%'; set global max_connections = 2000;)

 

Hive upgrade

0. Download the latest hive-0.14.0

http://apache.fayea.com/hive/hive-0.14.0/

Upload it to /home/bigdata/

Decompress it: tar -xvf apache-hive-0.14.0-bin.tar.gz

Rename the directory: mv apache-hive-0.14.0-bin hive0140
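If the server can reach the mirror directly, the download and unpack steps can be scripted instead of uploading by hand; a minimal sketch, assuming direct internet access to the mirror named above:

```shell
# Fetch, unpack, and rename Hive 0.14.0 under /home/bigdata.
# Assumes the host can reach the mirror listed in this article.
cd /home/bigdata
wget http://apache.fayea.com/hive/hive-0.14.0/apache-hive-0.14.0-bin.tar.gz
tar -xvf apache-hive-0.14.0-bin.tar.gz
mv apache-hive-0.14.0-bin hive0140
```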

 

 

1. Stop all Hive operations first; shut down the HiveServer and all clients.

2. Back up the MySQL metastore database

 

mysqldump -h 192.168.119.129 -P 3306 -uroot -p123 hive > hive-20150120-0.9.0.sql
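The backup name can also be generated from the current date rather than typed by hand; a sketch using the same host, port, and credentials as above:

```shell
# Date-stamped backup of the metastore before upgrading.
# Host, port, and credentials are the ones used in this article.
BACKUP="hive-$(date +%Y%m%d)-0.9.0.sql"
mysqldump -h 192.168.119.129 -P 3306 -uroot -p123 hive > "$BACKUP"
echo "Metastore backed up to $BACKUP"
```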

 

3. Modify the environment variables

Update the $HIVE_HOME environment variable.

Switch to root and edit the profile: vi /etc/profile

# export HIVE_HOME=/home/bigdata/hive
export HIVE_HOME=/home/bigdata/hive0140

Then reload it: source /etc/profile

4. Execute the MySQL upgrade scripts

cd /home/bigdata/hive0140/scripts/metastore/upgrade/mysql

First, read the README file in this directory.

Because we are already on version 0.9.0, we only need to run the scripts from 0.9.0 up to 0.14.0:

mysql -h 192.168.119.129 -P 3306 -uroot -p123 hive < upgrade-0.9.0-to-0.10.0.mysql.sql

mysql -h 192.168.119.129 -P 3306 -uroot -p123 hive < upgrade-0.10.0-to-0.11.0.mysql.sql

mysql -h 192.168.119.129 -P 3306 -uroot -p123 hive < upgrade-0.11.0-to-0.12.0.mysql.sql

mysql -h 192.168.119.129 -P 3306 -uroot -p123 hive < upgrade-0.12.0-to-0.13.0.mysql.sql

mysql -h 192.168.119.129 -P 3306 -uroot -p123 hive < upgrade-0.13.0-to-0.14.0.mysql.sql
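The scripts must run in ascending version order, and a failure partway through leaves the schema in a mixed state, so it helps to stop at the first error. A small wrapper sketch (host, port, and credentials taken from this article; adjust to your environment):

```shell
#!/bin/sh
# Apply the metastore upgrade scripts in order; abort on the first failure.
# Host, port, and credentials are the ones used in this article.
set -e
cd /home/bigdata/hive0140/scripts/metastore/upgrade/mysql
for script in \
    upgrade-0.9.0-to-0.10.0.mysql.sql \
    upgrade-0.10.0-to-0.11.0.mysql.sql \
    upgrade-0.11.0-to-0.12.0.mysql.sql \
    upgrade-0.12.0-to-0.13.0.mysql.sql \
    upgrade-0.13.0-to-0.14.0.mysql.sql
do
    echo "Applying $script"
    mysql -h 192.168.119.129 -P 3306 -uroot -p123 hive < "$script"
done
```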

 

5. Copy the MySQL driver to the new lib directory.

Copy the MySQL JDBC connector jar from the previous Hive client's lib into the lib of the new hive0140:

cp mysql-connector-java-5.1.23-bin.jar ../hive0140/lib/

 

6. Copy the previously configured hive-site.xml, hive-env.sh, and hive-log4j.properties back into the conf directory of the new version.
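Assuming the old client lived at /home/bigdata/hive (the path of the old HIVE_HOME in this article; adjust if yours differs), the copy can be sketched as:

```shell
# Carry over the old configuration files to the new release's conf directory.
# The old install path /home/bigdata/hive is an assumption based on the
# old HIVE_HOME value in this article -- adjust to your environment.
OLD_CONF=/home/bigdata/hive/conf
NEW_CONF=/home/bigdata/hive0140/conf
cp "$OLD_CONF/hive-site.xml" \
   "$OLD_CONF/hive-env.sh" \
   "$OLD_CONF/hive-log4j.properties" \
   "$NEW_CONF/"
```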

 

7. The upgrade is complete. Restart the HiveServer and clients; Hive operations can resume.

 

8. Run Hive as the hadoop user.

 

Querying existing data worked fine, but creating a table produced the following error:

FAILED: Error in metadata: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.MStorageDescriptor@139491b" using statement "INSERT INTO `SDS` (`SD_ID`, `IS_COMPRESSED`, `OUTPUT_FORMAT`, `INPUT_FORMAT`, `NUM_BUCKETS`, `SERDE_ID`, `CD_ID`, `LOCATION`) VALUES (?,?,?,?,?,?,?,?)" failed: Field 'IS_STOREDASSUBDIRECTORIES' doesn't have a default value

NestedThrowables:

java.sql.SQLException: Field 'IS_STOREDASSUBDIRECTORIES' doesn't have a default value

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

 

The IS_STOREDASSUBDIRECTORIES column in the SDS table has no default value; give it a default of 0.

First check the existing values:

SELECT IS_STOREDASSUBDIRECTORIES FROM SDS LIMIT 10;

ALTER TABLE SDS ALTER COLUMN IS_STOREDASSUBDIRECTORIES SET DEFAULT 0;
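To confirm the change took effect, the column default can be read back from information_schema; a sketch against the same metastore database (assuming the metastore schema is named hive, as in this article):

```shell
# Verify the new default on SDS.IS_STOREDASSUBDIRECTORIES in the metastore DB.
# Host, port, and credentials are the ones used in this article.
mysql -h 192.168.119.129 -P 3306 -uroot -p123 -e \
  "SELECT COLUMN_NAME, COLUMN_DEFAULT, IS_NULLABLE
     FROM information_schema.COLUMNS
    WHERE TABLE_SCHEMA = 'hive'
      AND TABLE_NAME   = 'SDS'
      AND COLUMN_NAME  = 'IS_STOREDASSUBDIRECTORIES';"
```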

 

This resolved the insert problem.

 

After a trial run period, the bugs above no longer appeared.

 

