First, install sqoop
I am using sqoop1
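A minimal install sketch, assuming the Sqoop 1 binary tarball and an already-configured Hadoop environment (the version and paths here are assumptions; adjust to your setup):

tar -xzf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /usr/local
export SQOOP_HOME=/usr/local/sqoop-1.4.7.bin__hadoop-2.6.0
export PATH=$PATH:$SQOOP_HOME/bin
sqoop version    # verify the installation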
Next, we need the Oracle JDBC driver jar, ojdbc6.jar, which can be downloaded from:
http://www.oracle.com/technetwork/database/enterprise-edition/jdbc-112010-090769.html
Copy the decompressed jar to the lib directory under the Sqoop installation directory.
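For example, assuming SQOOP_HOME points at your Sqoop installation as in the sketch above:

cp ojdbc6.jar $SQOOP_HOME/lib/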
Finally, execute our export command.
sqoop export \
  --table FDC_JPLP \
  --connect jdbc:oracle:thin:@localhost:port:test1 \
  --username test \
  --password test \
  --export-dir /user/hive/warehouse/data_w.db/seq_fdc_jplp \
  --columns goal_ocityid,goal_issueid,compete_issueid,ncompete_rank \
  --input-fields-terminated-by '\001' \
  --input-lines-terminated-by '\n'
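Here --input-fields-terminated-by '\001' matches Hive's default field delimiter (Ctrl-A), and --export-dir points at the Hive table's directory under the warehouse; adjust both if your table uses a custom delimiter or storage location.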
Be sure to specify the --columns parameter; otherwise, an error will be reported that the columns cannot be found.
Usage: --columns <col,col,col...>
Check whether the data was exported successfully:
sqoop eval \
  --connect jdbc:oracle:thin:@localhost:port:test1 \
  --query "select * from FDC_JPLP" \
  --username fccsreport \
  --password fccsoracle10g_report
Of course, you can also verify the data with Oracle's own management tools.
Note: When importing and exporting data, make sure that the table fields on both sides have the same types; otherwise, data type conversion errors may occur.
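For example, a quick way to inspect the Oracle table's column types is with sqoop eval (a sketch reusing the connection details above):

sqoop eval \
  --connect jdbc:oracle:thin:@localhost:port:test1 \
  --username test \
  --password test \
  --query "select column_name, data_type from user_tab_columns where table_name = 'FDC_JPLP'"

On the Hive side, compare this against the output of describe data_w.seq_fdc_jplp (table name inferred from the export-dir above).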
Original article: Use sqoop to import hive/hdfs data to Oracle. Thanks to the original author for sharing.