Hive integration with HBase
Configuration
Replace the hbase jar shipped with Hive with the jar from the actual HBase installation:
cd /opt/hive/lib/
ls hbase-0.94.2*
rm -rf hbase-0.92*
cp /opt/hbase/hbase-0.94.2* /opt/hive/lib/
Replace the zookeeper jar with the one from HBase's lib/ directory in the same way. Then add the following to hive-site.xml:
<property>
<name>hive.aux.jars.path</name>
<value>file:///opt/hive/lib/hive-hbase-handler-0.9.0.jar,file:///opt/hive/lib/hbase-0.94.2.jar,file:///opt/hive/lib/zookeeper-3.4.3.jar</value>
</property>
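To sanity-check the jar replacement, listing the Hive lib directory should now show only the 0.94.2 HBase jars plus the ZooKeeper jar copied from HBase (paths as in the layout above):

ls /opt/hive/lib | grep -E 'hbase|zookeeper'

The listing should include hbase-0.94.2.jar, hive-hbase-handler-0.9.0.jar and zookeeper-3.4.3.jar, matching the entries in hive.aux.jars.path.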
Run
cd /opt/hive/bin
./hive -hiveconf hbase.master=master:60000
The process is: start HBase first and create the table in Hive; once the table exists in Hive, add data to it from HBase.
============ start hbase, add data ==================
$ cd /opt/hbase/bin
$ ./start-hbase.sh
$ ./hbase shell
Add data in HBase:
hbase(main):004:0> put 'htest', '1', 'f:value', 'test'
hbase(main):005:0> scan 'htest'
================== start hive, create a table ============
cd /opt/hive/bin
./hive -hiveconf hbase.master=master:60000
hive> create table htest(key int, value string) stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' with serdeproperties ('hbase.columns.mapping' = ':key,f:value') tblproperties ('hbase.table.name' = 'htest');
hive> show tables;
hive> select * from htest;
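Because htest is backed by HBase through the storage handler, rows written on either side are visible on both. A minimal sketch of loading it from the Hive side, assuming a hypothetical native Hive table src(id int, name string) that already holds some data:

hive> insert overwrite table htest select id, name from src;
hive> select * from htest;

The same rows should then show up in the HBase shell:

hbase(main):006:0> scan 'htest'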
Install Pig
Decompress and install
tar -zxvf pig-0.10.0.tar.gz
mv pig-0.10.0 /opt/pig
chown -R hadoop:hadoop /opt/pig
Configuration
Because there is no xxx-env.sh file in pig/conf, add the following to the startup script in pig/bin:
export JAVA_HOME=/usr/Program/jdk1.6.0_13/
export PIG_INSTALL=/opt/pig
export HADOOP_INSTALL=/home/hadoop-ENV/hadoop-1.0.1/
export PATH=$PIG_INSTALL/bin:$HADOOP_INSTALL/bin:$PATH
export PIG_CLASSPATH=$HADOOP_INSTALL/conf
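With these variables set, Pig picks up the Hadoop configuration from PIG_CLASSPATH. A quick way to check the installation before involving the cluster is to run Pig in local mode (these are standard Pig command-line options, nothing specific to this setup):

pig -x local      run against the local filesystem only
pig               default mapreduce mode, using the cluster configured via $PIG_CLASSPATH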
Run
Start Hadoop first, then start Pig:
cd /opt/pig/bin
./pig
======== upload data to hadoop HDFS ========
hadoop fs -copyFromLocal /opt/data/test.txt /opt/data/test.txt    (upload the local file to HDFS)
hadoop fs -ls /opt/data/test.txt
hadoop fs -cat /opt/data/test.txt
========= display data in pig =========
grunt> A = load '/opt/data/test.txt' using PigStorage('#') as (id, name);
grunt> B = foreach A generate name;
grunt> store B into '/opt/data/dist.txt' using PigStorage('\t');
grunt> dump B;
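The '#' passed to PigStorage is the field delimiter, so test.txt should contain one record per line with the id and name separated by '#'. A hypothetical input file and the output the session above would produce from it:

/opt/data/test.txt (example contents):
1#tom
2#jerry

grunt> dump B;
(tom)
(jerry)

The store statement writes the same single name column into /opt/data/dist.txt on HDFS, one value per line.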
Pig Latin common commands:
load ... using PigStorage('') ... as ...;
foreach ... generate ...;
filter ... by ...;
dump ...;
store ... into ...;
group ... by ...;
$ hadoop fs -ls /user/hive/warehouse/my    (view the Hive data warehouse directory)
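Only load, foreach, store, and dump appear in the session above; a brief sketch of filter and group on the same (id, name) data, with the field types and predicate value purely illustrative:

grunt> A = load '/opt/data/test.txt' using PigStorage('#') as (id:int, name:chararray);
grunt> C = filter A by id > 1;       -- keep only rows whose id is greater than 1
grunt> D = group A by name;          -- one bag of rows per distinct name
grunt> dump C;
grunt> dump D;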