sqoop2-1.99.4 and sqoop2-1.99.3 operate slightly differently: the newer version uses "link" where the older version used "connection"; otherwise usage is much the same.
For setting up the sqoop2-1.99.4 environment, see: SQOOP2 Environment Construction
For usage under sqoop2-1.99.3, see: SQOOP2 Import relational database data to HDFS
Start the sqoop2-1.99.4 client and point it at the server (the transcript below assumes the server runs on hadoop000, port 12000):
$SQOOP2_HOME/bin/sqoop.sh client
set server --host hadoop000 --port 12000 --webapp sqoop
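Besides the interactive session, the 1.99.x client can also execute commands from a script file. A minimal sketch, assuming the server address from the transcript above; the file path and the job id 2 (created later in this post) are illustrative:

```shell
# Write a batch script for the sqoop2 client; host/port mirror the
# transcript above, and job id 2 is the one created later in this post.
cat > /tmp/import_tbls.sqoop <<'EOF'
set server --host hadoop000 --port 12000 --webapp sqoop
show link
start job --jid 2
EOF

# Run it non-interactively (requires a running sqoop2 server):
# $SQOOP2_HOME/bin/sqoop.sh client /tmp/import_tbls.sqoop
```

This is handy for cron-driven imports, since the same commands typed interactively below can be replayed from the file.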
View all connectors:
show connector --all
2 connector(s) to show:
Connector with id 1:
  Name: hdfs-connector
  Class: org.apache.sqoop.connector.hdfs.HdfsConnector
  Version: 1.99.4-cdh5.3.0
Connector with id 2:
  Name: generic-jdbc-connector
  Class: org.apache.sqoop.connector.jdbc.GenericJdbcConnector
  Version: 1.99.4-cdh5.3.0
Query all links:
show link
Delete the specified link:
delete link --lid x
Query All jobs:
show job
To delete the specified job:
delete job --jid 1
Create a link of the generic-jdbc-connector type:
create link --cid 2
Name: First Link
JDBC Driver Class: com.mysql.jdbc.Driver
JDBC Connection String: jdbc:mysql://hadoop000:3306/hive
Username: root
Password: ****
JDBC Connection Properties:
There are currently 0 values in the map:
entry# protocol=TCP
There are currently 1 values in the map:
protocol = TCP
entry#
New link was successfully created with validation status OK and persistent id 3
show link
+----+------------+-----------+---------+
| Id | Name       | Connector | Enabled |
+----+------------+-----------+---------+
| 3  | First Link | 2         | true    |
+----+------------+-----------+---------+
Create a link of the hdfs-connector type:
create link --cid 1
Name: Second Link
HDFS URI: hdfs://hadoop000:8020
New link was successfully created with validation status OK and persistent id 4
show link
+----+-------------+-----------+---------+
| Id | Name        | Connector | Enabled |
+----+-------------+-----------+---------+
| 3  | First Link  | 2         | true    |
| 4  | Second Link | 1         | true    |
+----+-------------+-----------+---------+
show link --all
2 link(s) to show:
link with id 3 and name First Link (Enabled: true, Created by null, Updated by null)
Using Connector id 2
  Link configuration
    JDBC Driver Class: com.mysql.jdbc.Driver
    JDBC Connection String: jdbc:mysql://hadoop000:3306/hive
    Username: root
    Password:
    JDBC Connection Properties: protocol=TCP
link with id 4 and name Second Link (Enabled: true, Created by null, Updated by null)
Using Connector id 1
  Link configuration
    HDFS URI: hdfs://hadoop000:8020
Create a job from the two link ids (from link 3, the JDBC source, to link 4, the HDFS target):
create job -f 3 -t 4
Creating job for links with from id 3 and to id 4
Please fill following values to create new job object
Name: sqoopy

From database configuration
  Schema name: hive
  Table name: tbls
  Table SQL statement:
  Table column names:
  Partition column name:
  Null value allowed for the partition column:
  Boundary query:

ToJob configuration
  Output format:
    0 : TEXT_FILE
    1 : SEQUENCE_FILE
  Choose: 0
  Compression format:
    0 : NONE
    1 : DEFAULT
    2 : DEFLATE
    3 : GZIP
    4 : BZIP2
    5 : LZO
    6 : LZ4
    7 : SNAPPY
    8 : CUSTOM
  Choose: 0
  Custom compression format:
  Output directory: hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4

Throttling resources
  Extractors:
  Loaders:
New job was successfully created with validation status OK and persistent id 2
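Before starting the job, it can help to confirm that the source table is reachable. A hedged sketch: the mysql client invocation below is an assumption (it mirrors the host, database, and user from the JDBC link above) and is only built and echoed, so the snippet is safe to run without a database; execute $CHECK_CMD yourself to perform the real check.

```shell
# Build a sanity-check command from the values used in the JDBC link above.
# It is only echoed here; run it manually against a live MySQL to verify
# the source table (hive.tbls) exists and has rows.
CHECK_CMD="mysql -h hadoop000 -P 3306 -u root -p -e 'SELECT COUNT(*) FROM hive.TBLS;'"
echo "$CHECK_CMD"
```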
Query all jobs:
show job
+----+--------+----------------+--------------+---------+
| Id | Name   | From Connector | To Connector | Enabled |
+----+--------+----------------+--------------+---------+
| 2  | sqoopy | 2              | 1            | true    |
+----+--------+----------------+--------------+---------+
Start the specified job (after it finishes, view the generated files on HDFS with hdfs dfs -ls hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4/):
start job --jid 2
View the execution status of the specified job:
status job --jid 2
Stop the specified job:
stop job --jid 2
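Once the job completes, the import can be verified from HDFS. A sketch under the assumption that an hdfs client is on the PATH (the output directory is the one chosen during job creation above); it degrades gracefully when no client is available:

```shell
# Output directory chosen during job creation above.
OUT=hdfs://hadoop000:8020/sqoop2/tbls_import_demo_sqoop1.99.4

# Only attempt the listing when an hdfs client is installed.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls "$OUT/"            # part files written by the loaders
  hdfs dfs -cat "$OUT/"'*' | head # first rows of the TEXT_FILE output
else
  echo "hdfs client not found; skipping verification"
fi
```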
SQOOP2 Import relational database data to HDFS (sqoop2-1.99.4 version)