Alex's Novice Hadoop Tutorial: Lesson 8, Sqoop1 Importing into HBase and Hive


Continuing the series. Honestly, importing and exporting between MySQL and HDFS is not used much in real project development, but it is a good way to get started. Today I will cover how Sqoop works with HBase and Hive. I just realized that my tutorials are in a messy order: I never introduced installing Hive first. My apologies for that; I will make it up in a later lesson.

Data preparation

MySQL

Create the table employee in MySQL and insert some data:
CREATE TABLE `employee` (
  `id` int(11) NOT NULL,
  `name` varchar(20) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO employee (id, name) VALUES (1, 'Michael');
INSERT INTO employee (id, name) VALUES (2, 'Ted');
INSERT INTO employee (id, name) VALUES (3, 'Jack');
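A quick sanity check in the mysql client before importing (a sketch; the output assumes the three rows above are the only rows in the table):

mysql> SELECT * FROM employee;
+----+---------+
| id | name    |
+----+---------+
|  1 | Michael |
|  2 | Ted     |
|  3 | Jack    |
+----+---------+
3 rows in set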

HBase

hbase(main):006:0> create 'employee', 'info'
0 row(s) in 0.4440 seconds

=> Hbase::Table - employee
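If you want to confirm the column family was created, the shell's describe command prints the table schema (exact output varies by HBase version, so it is omitted here):

hbase(main):007:0> describe 'employee'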

Hive

Hive needs no data preparation: the --create-hive-table option used later will create the table automatically.

Import from MySQL to HBase
# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --hbase-table employee --column-family info --hbase-row-key id -m 1
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/12/01 17:36:25 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.0.1
14/12/01 17:36:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/12/01 17:36:25 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/12/01 17:36:25 INFO tool.CodeGenTool: Beginning code generation
14/12/01 17:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 17:36:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
14/12/01 17:36:26 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
... (intermediate logs omitted) ...
14/12/01 17:37:12 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 37.3924 seconds (0 bytes/sec)
14/12/01 17:37:12 INFO mapreduce.ImportJobBase: Retrieved 3 records.
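The WARN line above is worth acting on: --password leaves the password visible in shell history and process listings. As the log itself suggests, -P prompts for it interactively instead; a sketch of the same import with only that change:

# sqoop import -P --connect jdbc:mysql://localhost:3306/sqoop_test --username root --table employee --hbase-table employee --column-family info --hbase-row-key id -m 1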


Now check HBase.
hbase(main):001:0> scan 'employee'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
ROW                   COLUMN+CELL
 1                    column=info:name, timestamp=1417426628685, value=Michael
 2                    column=info:name, timestamp=1417426628685, value=Ted
 3                    column=info:name, timestamp=1417426628685, value=Jack
3 row(s) in 0.1630 seconds
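To inspect a single row instead of scanning the whole table, the shell's get command takes the table name and row key (output shape is approximate):

hbase(main):002:0> get 'employee', '1'
COLUMN                CELL
 info:name            timestamp=1417426628685, value=Michael
1 row(s)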

All 3 rows were imported successfully.
Import from MySQL to Hive
# sqoop import --connect jdbc:mysql://localhost:3306/sqoop_test --username root --password root --table employee --hive-import --hive-table hive_employee --create-hive-table
Warning: /usr/lib/sqoop/../hive-hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
... (intermediate logs omitted) ...
14/12/02 15:12:13 INFO hive.HiveImport: Loading data to table default.hive_employee
14/12/02 15:12:14 INFO hive.HiveImport: Table default.hive_employee stats: [num_partitions: 0, num_files: 4, num_rows: 0, total_size: 23, raw_data_size: 0]
14/12/02 15:12:14 INFO hive.HiveImport: OK
14/12/02 15:12:14 INFO hive.HiveImport: Time taken: 0.799 seconds
14/12/02 15:12:14 INFO hive.HiveImport: Hive import complete.
14/12/02 15:12:14 INFO hive.HiveImport: Export directory is empty, removing it.

Note for real environments: do not use localhost in the MySQL JDBC URL. The import job is distributed across different Hadoop machines, and every one of those machines must be able to reach MySQL through that JDBC URL; otherwise the job will lose data.
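For example, the Hive import above would become the following, where mysql-host.example.com is a placeholder for a hostname or IP address that every Hadoop node can resolve and connect to:

# sqoop import --connect jdbc:mysql://mysql-host.example.com:3306/sqoop_test --username root --password root --table employee --hive-import --hive-table hive_employee --create-hive-table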
Check Hive.
hive> SELECT * FROM hive_employee;
OK
1	Michael
2	Ted
3	Jack
Time taken: 0.179 seconds, Fetched: 3 row(s)
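Because --hive-import created a native table, the data files now sit under Hive's warehouse directory on HDFS. A quick way to see them, assuming the default warehouse location /user/hive/warehouse and the default database:

# hadoop fs -ls /user/hive/warehouse/hive_employee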

One more thing to note: currently, Sqoop can only import data from MySQL into Hive's native tables (that is, tables whose storage is HDFS-based); it cannot import into external tables (for example, Hive tables built on top of HBase).
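For reference, this is roughly what such an HBase-backed external table looks like, using Hive's standard HBaseStorageHandler (the table name hbase_employee is made up for this sketch, and the hive-hbase-handler jar must be on Hive's classpath):

CREATE EXTERNAL TABLE hbase_employee (id INT, name STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,info:name')
TBLPROPERTIES ('hbase.table.name' = 'employee');

A table like this cannot be the target of --hive-import; to get MySQL data into it, you would import into the underlying HBase table directly, as in the first half of this lesson.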
Class dismissed! Next time I will talk about exporting.
