Install and configure hive
1. Download
wget http://mirror.mel.bkb.net.au/pub/apache//hive/stable/hive-0.8.1.tar.gz
tar zxf hive-0.8.1.tar.gz
Hive only needs to be installed on one node.
2. Set environment variables
vi .bash_profile
export JAVA_HOME=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre
export HADOOP_HOME=/home/hadoop/hadoop-1.0.0
export HIVE_HOME=/home/hadoop/hive-0.8.1
export HADOOP_CONF_DIR=$HOME/conf
export HIVE_CONF_DIR=$HOME/hive-conf
export CLASSPATH=$HIVE_HOME/lib:$JAVA_HOME/jre/lib:$HADOOP_HOME
export PATH=$HIVE_HOME/bin:$HADOOP_HOME/bin:$JAVA_HOME/bin:/sbin/:/bin:$PATH
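To make these settings take effect in the current shell and confirm that the hive binary is found, something like the following should work (paths are the ones assumed above):
source ~/.bash_profile
echo $HIVE_HOME
which hive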
3. Configure hive
cp -r hive-0.8.1/conf $HIVE_CONF_DIR/
cd $HIVE_CONF_DIR/
cp hive-default.xml.template hive-default.xml
cat hive-env.sh
export HADOOP_HEAPSIZE=512
export HIVE_CONF_DIR=/home/hadoop/hive-conf
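If hive-env.sh does not exist yet, it can be created from the template shipped in the Hive conf directory before adding the two lines above (template file name assumed from the 0.8.1 tarball):
cp hive-env.sh.template hive-env.sh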
4. Test
$ hive
hive> show tables;
OK
Time taken: 4.824 seconds
hive> create table hwz (id int, name string);
OK
Time taken: 0.566 seconds
hive> select * from hwz;
OK
Time taken: 0.361 seconds
$ hadoop dfs -lsr /user/hive
Warning: $HADOOP_HOME is deprecated.
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 12:36 /user/hive/warehouse
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 12:36 /user/hive/warehouse/hwz
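The "Warning: $HADOOP_HOME is deprecated." message is harmless on Hadoop 1.0.x. If it is distracting, it can be suppressed with one more variable in .bash_profile (variable name taken from Hadoop 1.0's launcher scripts):
export HADOOP_HOME_WARN_SUPPRESS=1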
5. Configure the metastore to use MySQL so that multiple users can access Hive at the same time.
A. Create a user and a database for Hive in MySQL
create database hive;
GRANT ALL ON hive.* TO hive@'%' IDENTIFIED BY 'hivepass';
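A quick way to confirm the grant works from the node where Hive runs (the host slave1 and the password are the values assumed in this walkthrough):
mysql -h slave1 -u hive -phivepass -e 'show databases;'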
B. Change the metastore to use MySQL
cat hive-site.xml
<configuration>
  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://slave1:3306/hive?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepass</value>
  </property>
</configuration>
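The ConnectionDriverName above requires the MySQL JDBC driver on Hive's classpath; a quick check that the jar is already in place (see Error 1 below if it is not):
ls $HIVE_HOME/lib/mysql-connector*.jar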
C. Check
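The load step below reads a local file demo.txt; judging from the query output further down it is just comma-separated id,name pairs, so an equivalent test file can be generated with something like (the line count here is arbitrary, not the original file's):
for i in $(seq 1 150); do echo "12,jack"; done > /home/hadoop/demo.txt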
$ hive
hive> use dw2;
OK
Time taken: 3.43 seconds
hive> create table hwz2 (id int, name string) row format delimited fields terminated by ',';
OK
Time taken: 2.519 seconds
hive> show tables;
OK
hwz2
Time taken: 0.419 seconds
hive> load data local inpath 'demo.txt' overwrite into table hwz2;
Copying data from file:/home/hadoop/demo.txt
Copying file:/home/hadoop/demo.txt
Loading data to table dw2.hwz2
Deleted hdfs://master:9000/user/hive/warehouse/dw2.db/hwz2
OK
Time taken: 0.557 seconds
hive> select * from hwz2;
OK
12 jack
12 jack
12 jack
12 jack
12 jack
12 jack
12 jack
12 jack
$ hadoop dfs -lsr /user/hive
Warning: $HADOOP_HOME is deprecated.
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 /user/hive/warehouse
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 /user/hive/warehouse/dw2.db
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 /user/hive/warehouse/dw2.db/hwz2
-rw-r--r--   2 hadoop supergroup       1201 /user/hive/warehouse/dw2.db/hwz2/demo.txt
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 12:36 /user/hive/warehouse/hwz
drwxr-xr-x   - hadoop supergroup          0 2012-03-22 /user/hive/warehouse/hwz2
-rw-r--r--   2 hadoop supergroup       1201 2012-03-22 /user/hive/warehouse/hwz2/demo.txt
$ hadoop dfs -cat /user/hive/warehouse/dw2.db/hwz2/demo.txt | head
Warning: $HADOOP_HOME is deprecated.
12,jack
12,jack
12,jack
12,jack
12,jack
12,jack
12,jack
12,jack
12,jack
12,jack
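A plain select * on a small table does not launch a MapReduce job, so as an extra sanity check it may be worth running an aggregate query, which does (suggested query only; output not shown here):
hive> select count(1) from hwz2;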
D. Verify the metastore tables created in MySQL
mysql> use hive;
Database changed
mysql> show tables;
+-----------------+
| Tables_in_hive  |
+-----------------+
| BUCKETING_COLS  |
| CDS             |
| COLUMNS_V2      |
| DATABASE_PARAMS |
| DBS             |
| PARTITION_KEYS  |
| SDS             |
| SD_PARAMS       |
| SEQUENCE_TABLE  |
| SERDES          |
| SERDE_PARAMS    |
| SORT_COLS       |
| TABLE_PARAMS    |
| TBLS            |
+-----------------+
14 rows in set (0.00 sec)
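With the metastore in MySQL, the Hive tables themselves are registered in TBLS; the column name below is assumed from the standard metastore schema:
mysql> select TBL_NAME from TBLS;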
6. Common Errors
Error 1:
-------------------------------------------------
hive> show tables;
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
Solution:
Hive does not ship with the MySQL JDBC driver, so install it yourself:
wget http://dev.mysql.com/get/Downloads/Connector-J/mysql-connector-java-5.1.18.tar.gz/from/http://mysql.mirror.kangaroot.net/
tar zxf mysql-connector-java-5.1.18.tar.gz
cd mysql-connector-java-5.1.18
cp mysql-connector*.jar $HIVE_HOME/lib
Error 2:
-------------------------------------------------
hive> show tables;
FAILED: Error in metadata: javax.jdo.JDOException: Couldn't obtain a new sequence (unique id): Cannot execute statement: impossible to write to binary log since BINLOG_FORMAT = STATEMENT and at least one table uses a storage engine limited to row-based logging. InnoDB is limited to row-logging when transaction isolation level is READ COMMITTED or READ UNCOMMITTED.
Solution:
Set binlog_format to 'MIXED' in MySQL.
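One way to do this at runtime, connected as a MySQL account with sufficient privileges (a permanent change would instead go into my.cnf as binlog_format=mixed):
mysql> SET GLOBAL binlog_format = 'MIXED';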