The previous article described installing Sqoop 1.4.6 on Hadoop and importing MySQL data; the following are simple commands for moving data between the two.
Display MySQL database information (a general test that Sqoop is installed correctly):
sqoop list-databases --connect jdbc:mysql://192.168.2.101:3306/ --username root --password root
Show all tables in the database:
sqoop list-tables --connect jdbc:mysql://192.168.2.101:3306/flowdb --username root --password root
Import from MySQL into HDFS
sqoop import --connect jdbc:mysql://ip:3306/sqoop --username root --password admin --table aa --fields-terminated-by '\t' --target-dir /user/hadoop/databases/ssa/fin_cashier_order -m 1

sqoop                                  ## the sqoop command
import                                 ## perform an import
--connect jdbc:mysql://ip:3306/sqoop   ## JDBC URL of the MySQL database
--username root                        ## username for connecting to MySQL
--password admin                       ## password for connecting to MySQL
--table aa                             ## name of the table to export from MySQL
--fields-terminated-by '\t'            ## field delimiter for rows in the output file
--target-dir /user/hadoop/databases/ssa/fin_cashier_order   ## target directory on HDFS
-m 1                                   ## use 1 map task for the copy
If --target-dir is not specified, the default is /user/{username}/{tablename} on HDFS.
If the command is executed repeatedly, it will fail because the target directory already exists; delete the directory manually first.
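For example, a minimal cleanup sketch (the path is the --target-dir used above; --delete-target-dir is the import option Sqoop provides for this):

## Remove the old output directory before re-running the import
hadoop fs -rm -r /user/hadoop/databases/ssa/fin_cashier_order

Alternatively, add --delete-target-dir to the sqoop import command so Sqoop removes the directory itself.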
After the command finishes, look at the /user/{user_name} directory on HDFS: there will be a folder named aa containing a file part-m-00000. The file's contents are the rows of table aa, with fields separated by tabs.
To view the file on HDFS:
hadoop fs -cat /user/jzyc/worktable/part-m-00000
Export from HDFS into MySQL
Now export the data written to HDFS in the previous step back into MySQL. We know the fields are tab-delimited, so first create a table named worktable_hdfs with two matching columns in database flowdb (a sketch follows), then execute the export command below.
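A minimal sketch of the table creation, with two hypothetical columns (id and name) standing in for whatever the HDFS data actually contains:

## Columns are placeholders; match them to the fields in the HDFS file
mysql -h 192.168.2.101 -u root -proot -e "CREATE TABLE flowdb.worktable_hdfs (id INT, name VARCHAR(255));"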
sqoop export --connect jdbc:mysql://192.168.2.101:3306/flowdb --username root --password root --table worktable_hdfs --export-dir /user/jzyc/worktable --input-fields-terminated-by '\t'
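To confirm the export landed, a quick hypothetical check against MySQL (adjust host and credentials):

## The row count should match the lines in the HDFS file
mysql -h 192.168.2.101 -u root -proot -e "SELECT COUNT(*) FROM flowdb.worktable_hdfs;"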
Importing a view from MySQL into HDFS with Sqoop
sqoop import --connect jdbc:mysql://192.168.2.101:3306/flowdb --username root --password root \
  --query 'SELECT * FROM view_worktable WHERE $CONDITIONS' \
  --target-dir /hive/warehouse/view_worktable --fields-terminated-by ',' -m 1

(The --target-dir path here is illustrative; --query imports require an explicit target directory.)
Import data from the relational database into a Hive table, using the --query option:
Import " SELECT id,classify,name,modeltype,createdate,createuserid,designjson,status from worktable where \ $CONDITIONS "","
Import data from the relational database into a Hive table, using the --columns and --where options:
Import " Id,classify,name,modeltype,createdate,createuserid " " classify = 1 " " , "
Note: --target-dir /hive/warehouse/worktable2 can be replaced with --hive-import --hive-table worktable2.
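A minimal sketch applying that substitution to the --columns/--where import above (assumes Hive is installed and reachable from Sqoop, and that worktable2 lands in Hive's default database):

## Same import as before, but loading into a Hive table instead of a raw HDFS directory
sqoop import --connect jdbc:mysql://192.168.2.101:3306/flowdb --username root --password root \
  --table worktable --columns "id,classify,name,modeltype,createdate,createuserid" \
  --where "classify = 1" --hive-import --hive-table worktable2 \
  --fields-terminated-by "," -m 1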
From the output above you can see that Sqoop still runs on Hadoop's MapReduce engine.
These are just a few simple examples that have been validated; more complex ones will follow.