Using Sqoop 1.4.4 to import data from Oracle to Hive: errors logged and resolved


The following errors occurred while importing data with the command below:

sqoop import --hive-import --connect jdbc:oracle:thin:@192.168.29.16:1521/testdb --username NAME --password PASS --verbose -m 1 --table t_userinfo

Error 1: File does not exist: hdfs://opt/sqoop-1.4.4/lib/commons-io-1.4.jar

FileNotFoundException: File does not exist: hdfs://opt/sqoop-1.4.4/lib/commons-io-1.4.jar
... ...
at org.apache...

Cause Analysis:

Thanks to Daniel Koverman's answer: http://stackoverflow.com/questions/19375784/sqoop-jar-files-not-found

It is common for Hadoop services to look for jars in HDFS because all nodes in the cluster can access files in HDFS. This is important if the MapReduce job being kicked off by the Hadoop service, in this case Sqoop, has a dependency on those jars. Remember, the mappers run on a DataNode, not the NameNode, even though you are (probably) running the Sqoop command from the NameNode. Putting the jars on HDFS is not the only possible solution to this problem, but it is a sensible one. Now we can deal with the actual error. At least one, but probably all, of your mappers are unable to find a jar they need. That means that either the jar does not exist or the user trying to access it does not have the required permissions. First check if the file exists by running hadoop fs -ls /home/sqoopuser/sqoop-1.4.3-cdh4.4.0/sqoop-1.4.3-cdh4.4.0.jar as a user with superuser privileges on the cluster. If it does not exist, put it there with hadoop fs -put {jar location on /namenode/filesystem/sqoop-1.4.3-cdh4.4.0.jar} /home/sqoopuser/sqoop-1.4.3-cdh4.4.0/sqoop-1.4.3-cdh4.4.0.jar.
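Spelled out as runnable commands, the check-and-put sequence from that answer looks like the following (the CDH-style paths come from the answer's example, and /local/path/to/ is a placeholder; substitute the jar path from your own error message):

# Check whether the jar already exists at the expected HDFS path
# (run as a user with superuser privileges on the cluster)
hadoop fs -ls /home/sqoopuser/sqoop-1.4.3-cdh4.4.0/sqoop-1.4.3-cdh4.4.0.jar

# If it does not exist, put it there from the local filesystem
hadoop fs -put /local/path/to/sqoop-1.4.3-cdh4.4.0.jar /home/sqoopuser/sqoop-1.4.3-cdh4.4.0/sqoop-1.4.3-cdh4.4.0.jar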

Workaround:

Put the jar files mentioned in the error into the same location in the HDFS file system. If the corresponding directory does not exist in HDFS, create it first. In my case, the error was raised because hdfs://master:8020/ was missing the various jars in the /opt/sqoop-1.4.4/lib folder, so my approach was to put the entire local /opt/sqoop-1.4.4/lib folder into hdfs://master:8020/.

# View the file directory in the HDFS file system. This is a recursive query;
# if there are many files the -R parameter is not recommended, browse layer by layer instead.
hadoop fs -ls -R /

# Build the same directory structure
hadoop fs -mkdir /opt
hadoop fs -mkdir /opt/sqoop-1.4.4

# Copy the local /opt/sqoop-1.4.4/lib to the /opt/sqoop-1.4.4 directory in HDFS
hadoop fs -put /opt/sqoop-1.4.4/lib /opt/sqoop-1.4.4/

# Check the results to confirm that the copy was successful
hadoop fs -ls -R /opt/sqoop-1.4.4

Error 2: java.lang.ClassNotFoundException: Class U_basicinfo not found

For a table being imported into Hive, this error indicates that the corresponding .class and .jar files cannot be found.

java.lang.Exception: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class U_basicinfo not found
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class u_basicinfo not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1895)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getInputClass(DBConfiguration.java:394)
    ...

Cause analysis: unknown for now.

Solution:

Thanks to user236575's answer: http://stackoverflow.com/questions/21599785/sqoop-not-able-to-import-table/21626010#21626010

During an import, Sqoop by default generates a .java source file for the table and compiles it into .class and .jar files. The .java files are saved in the sqoop/bin directory, while the .class and .jar files are stored in a temporary subfolder under /tmp/sqoop-hduser/compile/ (the sqoop-hduser part varies with the user running Sqoop; the commands below use sqoop-root).
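To see where these files actually ended up, a quick check like the following can help (a sketch; the compile subfolder name is randomly generated, and the paths assume the sqoop-1.4.4 layout used elsewhere in this post):

# Generated .java sources sit next to the sqoop launcher
ls /opt/sqoop-1.4.4/bin/*.java

# Compiled .class and .jar files land in a randomly named temp subfolder
find /tmp/sqoop-root/compile -name "*.class" -o -name "*.jar"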

My solution was to find the .class and .jar files for the table being imported and copy them both to the sqoop/bin directory and to the /user/username/ directory in the HDFS file system (after later testing, just copying the .class and .jar files into the sqoop/bin directory is enough for the import to succeed).

# Copy to the sqoop/bin directory
cp /tmp/sqoop-root/compile/<temp folder containing the required class and jar files>/* /opt/sqoop-1.4.4/bin/

# Put into the /user/username/ folder in HDFS
hadoop fs -put /tmp/sqoop-root/compile/<temp folder containing the required class and jar files>/* /user/root/
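As an alternative to copying the files after the fact, Sqoop 1.4.x also has code-generation options (--outdir for the generated .java, --bindir for the compiled .class/.jar) that set these locations up front; a hypothetical variant of the import command from the top of this post:

sqoop import --hive-import --connect jdbc:oracle:thin:@192.168.29.16:1521/testdb --username NAME --password PASS --verbose -m 1 --table t_userinfo --outdir /opt/sqoop-1.4.4/bin --bindir /opt/sqoop-1.4.4/bin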

Error 3: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://user/root/...

Solution:

Once you have run an import command for a table, you may get this error when you run it again. To fix it, just delete the corresponding file or folder in HDFS.

hadoop fs -rm /user/username/*
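If what was left behind is a directory rather than a single file (the usual case for MapReduce output), the delete must be recursive; a sketch, with the output directory name assumed to match the imported table:

# -rm -r deletes a directory recursively (older Hadoop releases used -rmr)
hadoop fs -rm -r /user/root/t_userinfo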

