Importing MySQL Data into Hive: An Example


The following is an example of importing data from MySQL into Hive with Sqoop.

--hive-import tells Sqoop to import into Hive, --create-hive-table tells it to create the Hive table, and --hive-table specifies the name of the Hive table.

[zhouhh@hadoop46 ~]$ sqoop import --connect jdbc:mysql://hadoop48/toplists --verbose -m 1 --username root --hive-overwrite --direct --table award --hive-import --create-hive-table --hive-table mysql_award --fields-terminated-by '\t' --lines-terminated-by '\n' --append
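
For readability, the same invocation is shown below broken across lines, with the role of each group of options noted in comments. This is only a restatement of the command above, using the host, database, and table names from this run:

# Source: the award table in the toplists database on hadoop48, read as root.
# --direct uses the mysqldump fast path; -m 1 runs a single map task.
# --hive-import / --create-hive-table / --hive-table create and load the Hive table mysql_award,
# and --hive-overwrite replaces any existing contents of that table.
# The two *-terminated-by options set the field and line delimiters of the generated files.
sqoop import \
  --connect jdbc:mysql://hadoop48/toplists \
  --username root \
  --table award \
  --direct \
  -m 1 \
  --hive-import \
  --create-hive-table \
  --hive-table mysql_award \
  --hive-overwrite \
  --fields-terminated-by '\t' \
  --lines-terminated-by '\n' \
  --append \
  --verbose

Sqoop's output from the run follows.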

12/07/20 16:02:23 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/20 16:02:23 INFO tool.CodeGenTool: Beginning code generation
12/07/20 16:02:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `award` AS t LIMIT 1
12/07/20 16:02:24 INFO orm.CompilationManager: HADOOP_HOME is /home/zhouhh/hadoop-1.0.0/libexec/..
Note: /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/20 16:02:25 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.java to /home/zhouhh/./award.java
12/07/20 16:02:25 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-zhouhh/compile/2fe3efbc94924ad6391b948ef8f8254f/award.jar
12/07/20 16:02:25 INFO manager.DirectMySQLManager: Beginning mysqldump fast path import
12/07/20 16:02:25 INFO mapreduce.ImportJobBase: Beginning import of award
12/07/20 16:02:27 INFO mapred.JobClient: Running job: job_201207191159_0322
12/07/20 16:02:28 INFO mapred.JobClient:  map 0% reduce 0%
12/07/20 16:02:41 INFO mapred.JobClient:  map 100% reduce 0%
12/07/20 16:02:46 INFO mapred.JobClient: Job complete: job_201207191159_0322
12/07/20 16:02:46 INFO mapred.JobClient: Counters: 18
12/07/20 16:02:46 INFO mapred.JobClient:   Job Counters
12/07/20 16:02:46 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=12849
12/07/20 16:02:46 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/07/20 16:02:46 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/07/20 16:02:46 INFO mapred.JobClient:     Launched map tasks=1
12/07/20 16:02:46 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/07/20 16:02:46 INFO mapred.JobClient:   File Output Format Counters
12/07/20 16:02:46 INFO mapred.JobClient:     Bytes Written=208
12/07/20 16:02:46 INFO mapred.JobClient:   FileSystemCounters
12/07/20 16:02:46 INFO mapred.JobClient:     HDFS_BYTES_READ=87
12/07/20 16:02:46 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=30543
12/07/20 16:02:46 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=208
12/07/20 16:02:46 INFO mapred.JobClient:   File Input Format Counters
12/07/20 16:02:46 INFO mapred.JobClient:     Bytes Read=0
12/07/20 16:02:46 INFO mapred.JobClient:   Map-Reduce Framework
12/07/20 16:02:46 INFO mapred.JobClient:     Map input records=1
12/07/20 16:02:46 INFO mapred.JobClient:     Physical memory (bytes) snapshot=78295040
12/07/20 16:02:46 INFO mapred.JobClient:     Spilled Records=0
12/07/20 16:02:46 INFO mapred.JobClient:     CPU time spent (ms)=440
12/07/20 16:02:46 INFO mapred.JobClient:     Total committed heap usage (bytes)=56623104
12/07/20 16:02:46 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=901132288
12/07/20 16:02:46 INFO mapred.JobClient:     Map output records=44
12/07/20 16:02:46 INFO mapred.JobClient:     SPLIT_RAW_BYTES=87
12/07/20 16:02:46 INFO mapreduce.ImportJobBase: Transferred 208 bytes in 20.349 seconds (10.2216 bytes/sec)
12/07/20 16:02:46 INFO mapreduce.ImportJobBase: Retrieved 44 records.
12/07/20 16:02:46 INFO util.AppendUtils: Creating missing output directory - award
12/07/20 16:02:46 INFO hive.HiveImport: Removing temporary files from import process: award/_logs
12/07/20 16:02:46 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/20 16:02:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `award` AS t LIMIT 1
12/07/20 16:02:48 INFO hive.HiveImport: WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
12/07/20 16:02:48 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/zhouhh/hive-0.8.1/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/20 16:02:48 INFO hive.HiveImport: Hive history file=/home/zhouhh/hive-0.8.1/logs/hive_job_log_zhouhh_201207201602_1448253330.txt
12/07/20 16:02:53 INFO hive.HiveImport: OK
12/07/20 16:02:53 INFO hive.HiveImport: Time taken: 4.322 seconds
12/07/20 16:02:53 INFO hive.HiveImport: Loading data to table default.mysql_award
12/07/20 16:02:53 INFO hive.HiveImport: Deleted hdfs://hadoop46:9200/user/hive/warehouse/mysql_award
12/07/20 16:02:53 INFO hive.HiveImport: OK
12/07/20 16:02:53 INFO hive.HiveImport: Time taken: 0.28 seconds
12/07/20 16:02:53 INFO hive.HiveImport: Hive import complete.
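
Before querying in Hive, the imported files can also be inspected directly in HDFS. This is an optional check that is not part of the original walkthrough; the warehouse path is the one shown in the log above, and the part-file names may differ depending on the Sqoop version and options:

# list the files written under the table's warehouse directory
hadoop fs -ls /user/hive/warehouse/mysql_award
# print the first few imported rows (file names may vary)
hadoop fs -cat '/user/hive/warehouse/mysql_award/part-*' | head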


Querying in Hive confirms that the data has been imported successfully:

hive> select * from mysql_award;
OK
2012-04-27 06:55:00:402713629 5947 433203828 2 4027102 402713629 1001 NULL 715878221 kill days a iOS
2012-04-27 06:55:00:406788559 778 433203930 19 4017780 406788559 1001 1 13835155880 Pro New New Dandan Android
Time taken: 0.368 seconds

Because the data is UTF-8 encoded, no garbled characters were encountered.
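
As a further check (not shown in the original output), the schema Sqoop generated and the row count can be inspected from the shell; the count should match the 44 records reported by the import:

# show the columns Sqoop created for mysql_award and count the imported rows
hive -e 'DESCRIBE mysql_award; SELECT COUNT(*) FROM mysql_award;'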
