Reprinted; please cite the source: http://blog.csdn.net/l1028386804/article/details/46517039
Sqoop is a tool for importing and exporting data:
(1) Import data from databases such as MySQL and Oracle into HDFS, Hive, or HBase.
(2) Export data from HDFS, Hive, or HBase into databases such as MySQL and Oracle.
1. Import data from MySQL into HDFS (the default target directory is /user/<username>)
sqoop import --connect jdbc:mysql://hadoop0:3306/hive --username root --password admin --table tbls --fields-terminated-by '\t' --null-string '**' -m 1 --append --hive-import
Incremental import (only rows whose check column exceeds the last imported value are fetched):
sqoop import --connect jdbc:mysql://hadoop0:3306/hive --username root --password admin --table tbls --fields-terminated-by '\t' --null-string '**' -m 1 --append --hive-import --check-column 'tbl_id' --incremental append --last-value 6
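After the import finishes, the rows land as tab-separated text files under the target directory. A quick sanity check with the standard Hadoop filesystem shell (the /user/root/tbls path below is an assumption based on the default target directory for user root):

```shell
# List the part files Sqoop wrote (one per map task; -m 1 produces a single file)
hadoop fs -ls /user/root/tbls
# Inspect the first rows; fields are tab-separated per --fields-terminated-by '\t'
hadoop fs -cat /user/root/tbls/part-m-00000 | head -n 5
```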
2. Export data from HDFS to MySQL
sqoop export --connect jdbc:mysql://hadoop0:3306/hive --username root --password admin --table ids --fields-terminated-by '\t' --export-dir '/ids'
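Note that sqoop export does not create the target table; it must already exist in MySQL with a schema matching the records under the export directory. A minimal sketch, assuming /ids holds single-column integer rows (the column name and type are assumptions, not from the original notes):

```shell
# Create the target table before running sqoop export (schema is an assumption)
mysql -h hadoop0 -u root -padmin hive -e "CREATE TABLE IF NOT EXISTS ids (id INT);"
```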
3. Save the import as a job, then run the job
sqoop job --create myjob -- import --connect jdbc:mysql://hadoop0:3306/hive --username root --password admin --table tbls --fields-terminated-by '\t' --null-string '**' -m 1 --append
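Once saved, the job is managed and executed with the standard sqoop job subcommands:

```shell
sqoop job --list          # show all saved jobs
sqoop job --show myjob    # print the saved job definition
sqoop job --exec myjob    # run it; for incremental jobs, --last-value is updated automatically
sqoop job --delete myjob  # remove the saved job
```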
4. Import and export transactions are committed per Mapper task: each map task commits its own data independently.
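Because each map task commits independently, a failed export can leave the MySQL table partially written. Sqoop's --staging-table option mitigates this: rows are written to a staging table first and moved to the target table in a single final transaction. A hedged sketch, where ids_stage is a hypothetical staging table that must already exist with the same schema as ids:

```shell
# Export via a staging table so the target only sees complete data
sqoop export --connect jdbc:mysql://hadoop0:3306/hive --username root --password admin \
  --table ids --staging-table ids_stage --clear-staging-table \
  --fields-terminated-by '\t' --export-dir '/ids'
```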
(Hadoop notes: Sqoop)