An error is reported during data migration between Hive and MySQL databases using Sqoop.

Source: Internet
Author: User
Tags: hadoop ecosystem, sqoop

Run: ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.10:3306/ekp_11 --table job_log --username root --password 123456 --hive-table job_log

This was meant to copy the table structure from the relational database into Hive. However, the following error message is displayed:
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
15/08/02 02:04:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/08/02 02:04:14 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
15/08/02 02:04:14 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
15/08/02 02:04:14 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_start_time had to be cast to a less precise type in Hive
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_end_time had to be cast to a less precise type in Hive
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /cloud/Hadoop-2.2.0/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/08/02 02:04:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/02 02:04:17 INFO hive.HiveImport: Loading uploaded data into Hive
15/08/02 02:04:17 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Cannot run program "hive": error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
    at java.lang.Runtime.exec(Runtime.java:617)
    at java.lang.Runtime.exec(Runtime.java:528)
    at org.apache.sqoop.util.Executor.exec(Executor.java:76)
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:382)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:335)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:239)
    at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.forkAndExec(Native Method)
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
    at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
    ... 13 more

Out of habit, I had assumed that Sqoop could intelligently find Hive on its own. It cannot: Sqoop must be told where Hive is installed.
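The stack trace makes the cause clear: Sqoop launches the external `hive` command through `ProcessBuilder`, and `error=2` is ENOENT, meaning the executable was not found. A quick way to confirm this before touching any configuration (a minimal sketch; the message texts are illustrative):

```shell
# Sqoop runs the `hive` CLI as an external process; error=2 (ENOENT)
# means that executable was not on the PATH of the Sqoop process.
if command -v hive >/dev/null 2>&1; then
  echo "hive found at: $(command -v hive)"
else
  echo "hive not on PATH; set HIVE_HOME in sqoop-env.sh"
fi
```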

Solution: configure your Hive environment for Sqoop.

The procedure is as follows:
1. Find the sqoop-env-template.sh file under the sqoop-1.4.4/conf directory and rename it to sqoop-env.sh;
2. Edit the sqoop-env.sh file and set HIVE_HOME to your Hive installation directory.

For example: export HIVE_HOME=/cloud/apache-hive-1.2.1-bin
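The two steps above can be sketched as follows. The Hive path is the one from this article; the conf directory here is a scratch location so the commands are runnable anywhere, and in practice you would run them inside your real Sqoop conf directory:

```shell
# Scratch conf directory standing in for <sqoop-install>/conf (assumption:
# /tmp/sqoop-conf-demo; point CONF_DIR at your real Sqoop conf dir instead).
CONF_DIR="${SQOOP_CONF_DIR:-/tmp/sqoop-conf-demo}"
mkdir -p "$CONF_DIR"
printf '# Sqoop environment template\n' > "$CONF_DIR/sqoop-env-template.sh"

# Step 1: copy the shipped template to sqoop-env.sh
cp "$CONF_DIR/sqoop-env-template.sh" "$CONF_DIR/sqoop-env.sh"

# Step 2: point HIVE_HOME at the Hive installation directory
echo 'export HIVE_HOME=/cloud/apache-hive-1.2.1-bin' >> "$CONF_DIR/sqoop-env.sh"

# Verify the setting took effect
grep HIVE_HOME "$CONF_DIR/sqoop-env.sh"
```

After this, rerun the create-hive-table command; Sqoop reads sqoop-env.sh at startup and can now locate the `hive` executable.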

Related reading:

Implement data import between Mysql, Oracle, and HDFS/Hbase through Sqoop

[Hadoop] Detailed description of Sqoop Installation Process

Use Sqoop to export data between MySQL and HDFS Systems

Hadoop Oozie learning notes Oozie does not support Sqoop Problem Solving

Hadoop ecosystem construction (hadoop hive hbase zookeeper oozie Sqoop)

Full history of Hadoop learning-use Sqoop to import MySQL Data to Hive
