Exporting Hive data to Oracle with Sqoop
Use Sqoop to export data from Hive to Oracle:
1. Create a table in Oracle based on the Hive table structure.
2. Run the following command:
sqoop export --table TABLE_NAME --connect jdbc:oracle:thin:@HOST_IP:PORT:DATABASE_NAME --username USERNAME --password PASSWORD --export-dir /user/hive/test/TABLE_NAME --columns ID,data_date,data_type,c1,c2,c3 --input-fields-terminated-by '\001' --input-lines-terminated-by '\n' --input-null-string '\\N' --input-null-non-string '\\N'
Where:
--table: name of the Oracle table to export into.
--connect: JDBC connection string of the target Oracle database, in the form jdbc:oracle:thin:@host:port:SID.
--username: Oracle database account.
--password: password for that account.
--export-dir: HDFS path of the Hive table's data files.
--columns: column names of the table (this parameter must be specified; otherwise an error is returned and the export fails).
--input-fields-terminated-by '\001': field delimiter (Hive's default).
--input-lines-terminated-by '\n': line delimiter.
--input-null-string '\\N' and --input-null-non-string '\\N': if the Hive table contains NULL fields, you must add these parameters; otherwise the export fails.
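For step 1, the Oracle table's columns must line up with the Hive table. As a rough sketch of that mapping, the snippet below generates an Oracle CREATE TABLE statement from a Hive-style column list; the column names match the --columns list in the command above, but the Hive types and the Hive-to-Oracle type mapping are illustrative assumptions, not taken from the original table.

```python
# Illustrative Hive -> Oracle type mapping (assumption: extend for your schema).
TYPE_MAP = {
    "bigint": "NUMBER(19)",
    "int": "NUMBER(10)",
    "string": "VARCHAR2(4000)",
    "double": "BINARY_DOUBLE",
    "timestamp": "TIMESTAMP",
}

def oracle_ddl(table, columns):
    """Build a CREATE TABLE statement from (name, hive_type) pairs."""
    cols = ",\n  ".join(
        f"{name.upper()} {TYPE_MAP[hive_type]}" for name, hive_type in columns
    )
    return f"CREATE TABLE {table} (\n  {cols}\n)"

# Hypothetical Hive schema matching the --columns list above.
hive_schema = [
    ("id", "bigint"),
    ("data_date", "string"),
    ("data_type", "string"),
    ("c1", "string"),
    ("c2", "string"),
    ("c3", "string"),
]
print(oracle_ddl("TABLE_NAME", hive_schema))
```

Run the emitted DDL in the Oracle database before starting the export.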
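The delimiter and null flags must match how Hive actually lays out its text-format data files: by default, fields are separated by the control character \001 (Ctrl-A) and NULL is stored as the literal token \N. A minimal sketch of what one such line looks like, and how the \N token maps back to a null (sample values are made up for illustration):

```python
FIELD_SEP = "\x01"    # matches --input-fields-terminated-by '\001'
NULL_TOKEN = "\\N"    # matches --input-null-string / --input-null-non-string '\\N'

# One row as it might appear in the export directory (here, c2 is NULL).
line = FIELD_SEP.join(["1", "2024-01-01", "A", "x", NULL_TOKEN, "z"])

def parse(line):
    # Split on \001 and turn the \N token back into a real null,
    # which is what the Sqoop flags above tell the export to do.
    return [None if f == NULL_TOKEN else f for f in line.rstrip("\n").split(FIELD_SEP)]

print(parse(line))  # ['1', '2024-01-01', 'A', 'x', None, 'z']
```

If the flags do not match the file layout, Sqoop inserts the literal text \N into Oracle or fails the export, which is why the null parameters are mandatory for tables containing NULLs.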