- Install Sqoop: tar -xzvf sqoop-1.4.1-cdh4.1.0.tar.gz
- Add sqljdbc4.jar into /usr/lib/sqoop/lib
- Set the environment variables:
  export SQOOP_HOME=/usr/lib/sqoop
  export ANT_LIB=/home/OP1/jasonliao/apache-ant-1.9.0/lib
  export PATH=$PATH:/home/OP1/logging/tool/play-1.2.5:$JAVA_HOME/bin:$ANT_HOME/bin:$SQOOP_HOME/bin
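To sanity-check the setup (a minimal check, assuming a bash shell and that the lines above have been sourced):
  sqoop version                      # should print the Sqoop 1.4.1-cdh4.1.0 banner
  ls $SQOOP_HOME/lib/sqljdbc4.jar    # confirms the SQL Server JDBC driver is in place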
- Sqoop import directly into HBase:
  sqoop import --connect 'jdbc:sqlserver://192.168.83.50;username=uapp_system;password=12345wy_12345wy;database=mrtgtest' \
    --table=A --columns A,B,C --where "B='F'" --hbase-table test_sqoop --column-family cf --hbase-row-key A -m 1
  This column mapping is not flexible, and multiple column families cannot be specified in one import.
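To spot-check what landed in HBase, a quick scan from the HBase shell (table name taken from the command above):
  echo "scan 'test_sqoop', {LIMIT => 5}" | hbase shell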
- Sqoop import to HDFS:
  sqoop import --connect 'jdbc:sqlserver://192.168.83.50;username=uapp_system;password=12345wy_12345wy;database=mrtgtest' --table=A --columns A,B,C --where "B='F'" --target-dir /user/xgliao/output2 -m 1
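Sqoop writes the result to HDFS as comma-separated text; with -m 1 there is a single part file, which can be inspected before the next step (the part file name below follows the default mapper output naming):
  hadoop fs -ls /user/xgliao/output2
  hadoop fs -cat /user/xgliao/output2/part-m-00000 | head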
- importtsv to HFiles:
  hadoop jar /usr/lib/hbase/hbase-0.94.2-cdh4.2.0-security.jar importtsv -Dimporttsv.bulk.output=/user/xgliao/hfile/test -Dimporttsv.separator=, -Dimporttsv.timestamp=20130322 -Dimporttsv.columns=HBASE_ROW_KEY,cf:X,cf:Y test_sqoop /user/xgliao/output2
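Note that importtsv in bulk-output mode expects the target table to already exist (its region boundaries drive how the HFiles are partitioned); if it has not been created yet, something like:
  echo "create 'test_sqoop', 'cf'" | hbase shell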
- Load the HFiles into HBase:
  hadoop jar /usr/lib/hbase/hbase-0.94.2-cdh4.2.0-security.jar completebulkload /user/xgliao/hfile/test test_sqoop
  (Note that the generated HFiles and HBase's own files must be on the same HDFS.)
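After completebulkload finishes, a quick row count from the HBase shell confirms the data is visible:
  echo "count 'test_sqoop'" | hbase shell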
This method takes more steps, but it is flexible, and the import itself consumes almost no HBase resources because the HFiles are built offline and only handed over to HBase at load time.