Introduction to the specific scenario
There are several issues involved in migrating data from the existing Oracle database into Hive: 1. The entire Oracle database cannot be exported in one go, because we do not have the necessary permissions on the database server; 2. Because each data provider upgrades independently, the Oracle tables evolve by adding fields without ever deleting any, while the table structures created in Hive are new, which means the data exported from Oracle has to be field-mapped onto the Hive tables.
We decided to first import the data from the source Oracle database into another Oracle database, work out the mapping between each Oracle table field and the corresponding Hive table field, and then write the matching Sqoop statements so that the ETL process can import the data into Hive day by day.
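As a rough sketch of what such a Sqoop statement might look like (the JDBC URL, credentials, and the database, table, and column names below are illustrative placeholders, not the actual ones), a per-table import from the intermediate Oracle database into Hive could be written as:

sqoop import \
  --connect jdbc:oracle:thin:@//hostname:1521/tnsname \
  --username root --password password \
  --query "select ip, appkey, platform, portal, companyid from tablename where \$CONDITIONS" \
  --target-dir /tmp/sqoop/tablename \
  --hive-import --hive-table dbname.tablename \
  --fields-terminated-by ',' \
  -m 1

The --query form makes the field mapping explicit: only the columns that exist in the Hive table are selected, in the order the Hive table expects.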
For the Oracle-to-Oracle import, we use the sqluldr2 tool (written by a well-known Oracle expert) to export the data, and then use the sqlldr command to import it into the table in the other database.
The specific export command, which writes the data to a local directory, is:
sqluldr2.bin user=root/password@tnsname query="select /*+ parallel(8) */ * from tablename" head=no file=/data/oracle/tmp/file
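If the source table has picked up extra columns over time (the provider-upgrade problem described above), select * will no longer line up with the control file used for the load; in that case the query should list exactly the columns that appear in the control file, in the same order. A minimal sketch, assuming the five columns of the example control file shown further down:

sqluldr2.bin user=root/password@tnsname query="select /*+ parallel(8) */ ip, appkey, platform, portal, companyid from tablename" head=no file=/data/oracle/tmp/file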
The specific import command is:
sqlldr root/password@ip:1521/tnsname control=/data/oracle/shell/xxx.ctl data=/data/oracle/tmp/file log=/data/oracle/log/file.log errors=0 rows=100000 bindsize=1024000000 direct=y
The xxx.ctl control file specifies the columns and the field delimiter for the table being loaded, as follows:
load data
truncate into table TableName
fields terminated by ','  -- column delimiter of the data file
trailing nullcols
(
IP char(255),
APPKEY char(255),
PLATFORM char(255),
PORTAL char(255),
CompanyID char(255)
)
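Putting the export and the load together, a small shell wrapper for migrating one table might look like the following; the connection strings are the placeholder ones used above, and the error handling is deliberately minimal:

#!/bin/bash
# Migrate a single table from the source Oracle to the target Oracle.
datafile=/data/oracle/tmp/file

# Step 1: export from the source database into a local file with sqluldr2.
sqluldr2.bin user=root/password@tnsname \
  query="select /*+ parallel(8) */ * from tablename" \
  head=no file=${datafile}

# Step 2: load the file into the target database using the control file above.
sqlldr root/password@ip:1521/tnsname \
  control=/data/oracle/shell/xxx.ctl \
  data=${datafile} \
  log=/data/oracle/log/file.log \
  errors=0 rows=100000 bindsize=1024000000 direct=y

# sqlldr exits non-zero when rows are rejected or the load aborts.
if [ $? -ne 0 ]; then
  echo "sqlldr failed, check /data/oracle/log/file.log" >&2
  exit 1
fi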