1. Importing MySQL table data into HDFS using Sqoop
1.1 First, prepare a test table in MySQL
mysql> desc user_info;
+-----------+-------------+------+-----+---------+-------+
| Field     | Type        | Null | Key | Default | Extra |
+-----------+-------------+------+-----+---------+-------+
| id        | int(11)     | YES  |     | NULL    |       |
| user_name | varchar(-)  | YES  |     | NULL    |       |
| age       | int(11)     | YES  |     | NULL    |       |
| address   | varchar(-)  | YES  |     | NULL    |       |
+-----------+-------------+------+-----+---------+-------+
4 rows in set (0.14 sec)

mysql> select * from user_info;
+------+-----------+------+--------------------+
| id   | user_name | age  | address            |
+------+-----------+------+--------------------+
|    1 | zhangsan  | -    | Shenzhen Nanshan   |
|    2 | lisi      | -    | Shenzhen Futian    |
|    3 | wangwu    | at   | Shenzhen Luohu     |
|    4 | cailiu    | -    | Shenzhen Guangming |
|    5 | zhuqi     | -    | Shenzhen Baoan     |
|    6 | houba     | -    | Shenzhen Xili      |
|    7 | laojiu    | -    | Shenzhen Yantian   |
+------+-----------+------+--------------------+
7 rows in set (0.00 sec)

mysql>
Next, export the first 100 rows of the user_info table, keeping only the three fields id, user_name, and age, and store the data in the HDFS directory /tmp/sqoop/user_info.
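A Sqoop command along these lines performs this import (the hostname mysql_host, database name test, and credentials are assumptions for illustration; limiting to the first 100 rows is sketched here with a --where filter on id, which presumes sequential ids starting at 1):

```shell
# Import id, user_name, age from MySQL into HDFS under /tmp/sqoop/user_info.
# Connection details (mysql_host, test, root) are placeholder assumptions.
sqoop import \
  --connect jdbc:mysql://mysql_host:3306/test \
  --username root \
  -P \
  --table user_info \
  --columns "id,user_name,age" \
  --where "id <= 100" \
  --target-dir /tmp/sqoop/user_info \
  --delete-target-dir \
  -m 1
```

With -m 1 a single map task writes one output file; `hadoop fs -cat /tmp/sqoop/user_info/part-m-*` should then show the records as comma-separated lines.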
2. Importing MySQL data into a Hive table using Sqoop
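A minimal sketch of a Hive import, assuming a MySQL database named test reachable at mysql_host and a target Hive table also named user_info (all of these names are illustrative):

```shell
# Import the MySQL table user_info directly into a Hive table.
# --hive-import creates/loads the Hive table; --hive-overwrite replaces
# any existing data. Connection details are placeholder assumptions.
sqoop import \
  --connect jdbc:mysql://mysql_host:3306/test \
  --username root \
  -P \
  --table user_info \
  --hive-import \
  --hive-overwrite \
  --hive-table user_info \
  -m 1
```

Sqoop first stages the data in HDFS and then loads it into the Hive warehouse, so no separate LOAD DATA step is needed.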