Testing Sqoop connectivity with an Oracle database


Testing connection usage for an Oracle database

① Connect to the Oracle database and list all databases

[[email protected] sqoop]$ sqoop list-databases --connect jdbc:oracle:thin:@10.1.69.173:1521:orclbi --username huangq -P

or:

sqoop list-databases --connect jdbc:oracle:thin:@10.1.69.173:1521:orclbi --username huangq --password 123456

or, for MySQL:

sqoop list-databases --connect jdbc:mysql://172.19.17.119:3306/ --username hadoop --password hadoop

Warning: /home/hadoop/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: $HADOOP_HOME is deprecated.
14/08/17 11:59:24 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
Enter Password:
14/08/17 11:59:27 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
14/08/17 11:59:27 INFO manager.SqlManager: Using default fetchSize of 1000
14/08/17 11:59:51 INFO manager.OracleManager: Time zone has been set to GMT
Mrdrp
Mkfow_qh
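
If list-databases succeeds, two further sanity checks can reuse the same connection details: list-tables shows the tables the account can actually see, and eval runs an ad-hoc query against the source. A minimal sketch (the table ord_uv is the one imported in the next step):

# List the tables visible to this user
sqoop list-tables --connect jdbc:oracle:thin:@10.1.69.173:1521:orclbi --username huangq -P

# Run a quick row count against the source table
sqoop eval --connect jdbc:oracle:thin:@10.1.69.173:1521:orclbi --username huangq -P --query "SELECT COUNT(*) FROM ord_uv"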

② Import an Oracle table into HDFS

Note: By default Sqoop uses 4 map tasks, and each task writes the data it imports into a separate file; the 4 files end up in the same target directory. Here -m 1 means only one map task is used. A plain text import cannot store binary fields, and it cannot distinguish a null value from the string value "null". After executing the command below, a Java source file is generated for the table (here ord_uv.java), which you can confirm with ls. Code generation is a necessary part of the Sqoop import process: Sqoop uses the generated code to deserialize records read from the source database before writing them to HDFS.

[[email protected] ~]$ sqoop import --connect jdbc:oracle:thin:@10.1.69.173:1521:orclbi --username huangq --password 123456 --table ord_uv -m 1 --target-dir /user/sqoop/test --direct-split-size 67108864
Warning: /home/hadoop/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: $HADOOP_HOME is deprecated.
14/08/17 15:21:34 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
14/08/17 15:21:34 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/08/17 15:21:34 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
14/08/17 15:21:34 INFO manager.SqlManager: Using default fetchSize of 1000
14/08/17 15:21:34 INFO tool.CodeGenTool: Beginning code generation
14/08/17 15:21:46 INFO manager.OracleManager: Time zone has been set to GMT
14/08/17 15:21:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM ord_uv t WHERE 1=0
14/08/17 15:21:46 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop
Note: /tmp/sqoop-hadoop/compile/328657d577512bd2c61e07d66aaa9bb7/ord_uv.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/08/17 15:21:47 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/328657d577512bd2c61e07d66aaa9bb7/ord_uv.jar
14/08/17 15:21:47 INFO manager.OracleManager: Time zone has been set to GMT
14/08/17 15:21:47 INFO manager.OracleManager: Time zone has been set to GMT
14/08/17 15:21:47 INFO mapreduce.ImportJobBase: Beginning import of ord_uv
14/08/17 15:21:47 INFO manager.OracleManager: Time zone has been set to GMT
14/08/17 15:21:49 INFO db.DBInputFormat: Using read committed transaction isolation
14/08/17 15:21:49 INFO mapred.JobClient: Running job: job_201408151734_0027
14/08/17 15:21:50 INFO mapred.JobClient:  map 0% reduce 0%
14/08/17 15:22:12 INFO mapred.JobClient:  map 100% reduce 0%
14/08/17 15:22:17 INFO mapred.JobClient: Job complete: job_201408151734_0027
14/08/17 15:22:17 INFO mapred.JobClient: Counters: 18
14/08/17 15:22:17 INFO mapred.JobClient:   Job Counters
14/08/17 15:22:17 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15862
14/08/17 15:22:17 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
14/08/17 15:22:17 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
14/08/17 15:22:17 INFO mapred.JobClient:     Launched map tasks=1
14/08/17 15:22:17 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
14/08/17 15:22:17 INFO mapred.JobClient:   File Output Format Counters
14/08/17 15:22:17 INFO mapred.JobClient:     Bytes Written=1472
14/08/17 15:22:17 INFO mapred.JobClient:   FileSystemCounters
14/08/17 15:22:17 INFO mapred.JobClient:     HDFS_BYTES_READ=87
14/08/17 15:22:17 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=33755
14/08/17 15:22:17 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=1472
14/08/17 15:22:17 INFO mapred.JobClient:   File Input Format Counters
14/08/17 15:22:17 INFO mapred.JobClient:     Bytes Read=0
14/08/17 15:22:17 INFO mapred.JobClient:   Map-Reduce Framework
14/08/17 15:22:17 INFO mapred.JobClient:     Map input records=81
14/08/17 15:22:17 INFO mapred.JobClient:     Physical memory (bytes) snapshot=192405504
14/08/17 15:22:17 INFO mapred.JobClient:     Spilled Records=0
14/08/17 15:22:17 INFO mapred.JobClient:     CPU time spent (ms)=1540
14/08/17 15:22:17 INFO mapred.JobClient:     Total committed heap usage (bytes)=503775232
14/08/17 15:22:17 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=2699571200
14/08/17 15:22:17 INFO mapred.JobClient:     Map output records=81
14/08/17 15:22:17 INFO mapred.JobClient:     SPLIT_RAW_BYTES=87
14/08/17 15:22:17 INFO mapreduce.ImportJobBase: Transferred 1.4375 KB in 29.3443 seconds (50.1631 bytes/sec)
14/08/17 15:22:17 INFO mapreduce.ImportJobBase: Retrieved 81 records.
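
The result can be spot-checked directly from HDFS, and the note above about null handling points to a useful variant: Sqoop's --null-string and --null-non-string options control how SQL NULLs are written in a text import, so they can be told apart from the literal string "null". A minimal sketch, reusing the connection details above; the target directory /user/sqoop/test_nulls is a hypothetical name for this example:

# Inspect the single output file produced by -m 1 (typically part-m-00000)
hadoop fs -ls /user/sqoop/test
hadoop fs -cat /user/sqoop/test/part-m-00000 | head

# Re-run the import writing NULL columns as \N, a common convention
sqoop import --connect jdbc:oracle:thin:@10.1.69.173:1521:orclbi --username huangq -P \
  --table ord_uv -m 1 --target-dir /user/sqoop/test_nulls \
  --null-string '\\N' --null-non-string '\\N'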

③ Exporting data to Oracle and importing into HBase

Use sqoop export to push data from HDFS into a remote database table:

sqoop export --connect jdbc:oracle:thin:@192.168.**.**:**:** --username ** --password ** -m 1 --table VEHICLE --export-dir /user/root/vehicle

Importing data into HBase:

sqoop import --connect jdbc:oracle:thin:@192.168.**.**:**:** --username ** --password ** -m 1 --table VEHICLE --hbase-create-table --hbase-table vehicle --hbase-row-key ID --column-family vehicleinfo --split-by ID

For more content, see http://bbs.superwu.cn.
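
After the HBase import, the result can be verified from the HBase shell. A quick sketch, assuming the table name (vehicle) and column family (vehicleinfo) used in the command above; the row key value shown is hypothetical:

hbase shell
# List tables and confirm 'vehicle' was created
list
# Scan a few rows to check the vehicleinfo column family was populated
scan 'vehicle', {LIMIT => 5}
# Fetch a single row by its ID-based row key
get 'vehicle', 'some-row-id'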

