Host environment:
Ubuntu 13.10
Hadoop 1.2.1
Hive 0.12.0
Download, decompress, and transfer:
wget http://mirrors.hust.edu.cn/apache/hive/hive-0.12.0/hive-0.12.0.tar.gz
tar -xzvf hive-0.12.0.tar.gz
mv hive-0.12.0 /opt/
Configure system environment variables:
sudo vim /etc/profile
source /etc/profile
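The profile edit itself is not shown above; a minimal sketch of the lines to append to /etc/profile, assuming Hadoop and Hive are installed under /opt as in this walkthrough:

```shell
# Assumed install locations; adjust to your own layout.
export HADOOP_HOME=/opt/hadoop-1.2.1
export HIVE_HOME=/opt/hive-0.12.0
export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin
```

After sourcing the profile, `hive` and `hadoop` can be run from any directory.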
Modify the Hive configuration files
User-defined configuration file: hive-site.xml
Default configuration file: hive-default.xml
Settings in the user-defined file override the defaults. Hive also reads the Hadoop configuration, because Hive starts as a Hadoop client.
cd conf
cp hive-default.xml.template hive-site.xml
Main Hive configuration items:
hive.metastore.warehouse.dir specifies the HDFS directory where Hive stores table data
hive.exec.scratchdir specifies the directory for Hive's temporary data files
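In hive-site.xml these two items look like the following (the values shown are the commonly used defaults, not something this setup changes):

```xml
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>
<property>
  <name>hive.exec.scratchdir</name>
  <value>/tmp/hive-${user.name}</value>
</property>
```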
Database Connection Configuration:
Hive needs to store its metadata in an RDBMS; here MySQL is configured as the metastore backend.
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
...........................
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>111111</value>
  <description>password to use against metastore database</description>
</property>
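With `createDatabaseIfNotExist=true` in the ConnectionURL, Hive can create the `hive` database itself, but the connecting user still needs privileges on it. A sketch of the MySQL side, assuming the root/111111 credentials configured above:

```sql
-- Run in the mysql client as an admin user (MySQL 5.x syntax).
CREATE DATABASE IF NOT EXISTS hive;
GRANT ALL PRIVILEGES ON hive.* TO 'root'@'localhost' IDENTIFIED BY '111111';
FLUSH PRIVILEGES;
```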
Copy the MySQL JDBC driver to $HIVE_HOME/lib:
cp /home/dat/mysql-connector-java-5.1.24-bin.jar /opt/hive-0.12.0/lib/
Install MySQL and start it.
Check the MySQL service status:
sudo service mysql.server status
Start Hive
/opt/hive-0.12.0/bin$ hive
Error:
Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path
Reason: the Hadoop environment is not loaded in this shell. Easy to fix: source /opt/hadoop-1.2.1/conf/hadoop-env.sh
Continue the startup; it then reports the following error:
Caused by: org.xml.sax.SAXParseException; systemId: file:/opt/hive-0.12.0/conf/hive-site.xml; lineNumber: 2000; columnNumber: 16; The element type "value" must be terminated by the matching end-tag "</value>".
Cause: the XML file copied from the template is malformed. At line 2000 a <value> element is closed with a mismatched </auth> tag; change the closing tag to </value>.
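Schematically, the fix at that line looks like this (the surrounding property is whatever sits at line 2000 of your copy of the template):

```xml
<!-- broken: <value> closed by a mismatched tag -->
<value>auth</auth>
<!-- fixed -->
<value>auth</value>
```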
Start Hive again. OK!
dat@dat-HP:/opt/hive-0.12.0/bin$ hive
Logging initialized using configuration in jar:file:/opt/hive-0.12.0/lib/hive-common-0.12.0.jar!/hive-log4j.properties
Hive>