Requirement: export the TBLS table from the hive MySQL database (the Hive metastore) to HDFS using Sqoop 2.
Start the Sqoop 2 client shell and point it at the server:

$SQOOP2_HOME/bin/sqoop.sh client

sqoop:000> set server --host hadoop000 --port 12000 --webapp sqoop
Server is set successfully
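Before creating the connection it helps to confirm that the client really reaches the server, and to look up the connector id used below. A hedged sketch, assuming the standard `show` commands of the Sqoop 1.99.x shell (the host names come from this walkthrough):

```shell
sqoop:000> show version --all     # prints client, server and protocol versions; fails if the server is unreachable
sqoop:000> show connector --all   # lists available connectors; the generic JDBC connector is id 1 here
```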
Create a connection:

sqoop:000> create connection --cid 1
Creating connection for connector with id 1
Please fill following values to create new connection object
Name: tbls_import_demo

Connection configuration
JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://hadoop000:3306/hive
Username: root
Password: ****
JDBC Connection Properties:
There are currently 0 values in map:
entry#

Security related configuration options
Max connections: 10
New connection was successfully created with validation status FINE and persistent id 10
Create a job:

sqoop:000> create job --xid 10 --type import
Creating job for connection with id 10
Please fill following values to create new job object
Name: tbls_import

Database configuration
Schema name: hive
Table name: TBLS
Table SQL statement:
Table column names:
Partition column name:
Nulls in partition column:
Boundary query:

Output configuration
Storage type:
  0 : HDFS
Choose: 0
Output format:
  0 : TEXT_FILE
  1 : SEQUENCE_FILE
Choose: 0
Compression format:
  0 : NONE
  1 : DEFAULT
  2 : DEFLATE
  3 : GZIP
  4 : BZIP2
  5 : LZO
  6 : LZ4
  7 : SNAPPY
Choose: 0
Output directory: hdfs://hadoop000:8020/sqoop2/tbls_import_demo

Throttling resources
Extractors:
Loaders:
New job was successfully created with validation status FINE and persistent id 6
Submit the job:

sqoop:000> start job --jid 6
Check the job's execution status:

sqoop:000> status job --jid 6
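The status command can be repeated until the submission finishes; a hedged sketch of the related shell commands (both exist in the Sqoop 1.99.x client):

```shell
sqoop:000> status job --jid 6   # reports the submission state, e.g. RUNNING or SUCCEEDED
sqoop:000> stop job --jid 6     # aborts the running submission if it needs to be cancelled
```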
After the job finishes successfully, list the files on HDFS:

hadoop fs -ls hdfs://hadoop000:8020/sqoop2/tbls_import_demo
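To check the imported rows themselves rather than just the file listing, the part files in the output directory can be printed; a sketch assuming Sqoop 2 wrote text part files there (the `part*` glob is an assumption about the file names):

```shell
# print the first few imported TBLS rows (text output format chosen above)
hadoop fs -cat hdfs://hadoop000:8020/sqoop2/tbls_import_demo/part* | head
```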
Sqoop 2: importing relational database data to HDFS