Summary of problems encountered by Sqoop from Hive to MySQL


Hive version: hive-0.11.0
Sqoop version: sqoop-1.4.4.bin__hadoop-1.0.0
From Hive to MySQL
MySQL table:
mysql> desc cps_activation;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| day        | date        | NO   | MUL | NULL    |                |
| pkgname    | varchar(50) | YES  |     | NULL    |                |
| cid        | varchar(50) | YES  |     | NULL    |                |
| pid        | varchar(50) | YES  |     | NULL    |                |
| activation | int(11)     | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
6 rows in set (0.01 sec)

Hive table:

hive> desc active;
OK
id          int     None
day         string  None
pkgname     string  None
cid         string  None
pid         string  None
activation  int     None

Testing the connection succeeded:


[hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root --password admin

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:42:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:42:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
easyhadoop
mysql
test


[hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/test --username root --password admin

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:42:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:42:40 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
easyhadoop
mysql
test


[hadoop@hs11 ~]$ sqoop list-tables --connect jdbc:mysql://localhost:3306/test --username root --password admin

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:42:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:42:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
active

[hadoop@hs11 ~]$ sqoop create-hive-table --connect jdbc:mysql://localhost:3306/test --table active --username root --password admin --hive-table test

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:57:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:57:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
13/08/20 16:57:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
13/08/20 16:57:04 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-home
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-overwrite
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --create-hive-table
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-table
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-key
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-value
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --map-column-hive
13/08/20 16:57:04 WARN tool.BaseSqoopTool: without specifying parameter --hive-import. Please note that
13/08/20 16:57:04 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
13/08/20 16:57:04 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
13/08/20 16:57:04 WARN tool.BaseSqoopTool: from command line to remove this warning.
13/08/20 16:57:04 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
13/08/20 16:57:04 INFO tool.BaseSqoopTool:      hive-partition-value and --map-column-hive options are
13/08/20 16:57:04 INFO tool.BaseSqoopTool:      are also valid for HCatalog imports and exports
13/08/20 16:57:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
13/08/20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
13/08/20 16:57:05 WARN hive.TableDefWriter: Column day had to be cast to a less precise type in Hive
13/08/20 16:57:05 INFO hive.HiveImport: Loading uploaded data into Hive
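
These warnings appear because create-hive-table only generates the Hive table definition; the --hive-* options listed above only take effect together with --hive-import. A rough sketch of what a full MySQL-to-Hive import would look like, assuming the same connection details as above:

# Sketch only: import the MySQL table straight into Hive instead of just creating the table definition.
[hadoop@hs11 ~]$ sqoop import --connect jdbc:mysql://localhost:3306/test --table active --username root --password admin --hive-import --hive-table test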


1. Connection refused


[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://localhost/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:14:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:14:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:14:07 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:14:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:14:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.jar
13/08/21 09:14:08 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:14:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:14:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:14:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:14:09 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:14:10 INFO mapred.JobClient: Running job: job_201307251523_0059
13/08/21 09:14:11 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:14:20 INFO mapred.JobClient: Task Id : attempt_201307251523_0059_m_000000_0, Status : FAILED
java.io.IOException: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:

** BEGIN NESTED EXCEPTION **

java.net.ConnectException
MESSAGE: Connection refused

STACKTRACE:

java.net.ConnectException: Connection refused
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
        at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
        at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
        at java.net.Socket.connect(Socket.java:529)
        at java.net.Socket.connect(Socket.java:478)
        at java.net.Socket.<init>(Socket.java:375)
        at java.net.Socket.<init>(Socket.java:218)
        at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:256)
        at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:271)
        at com.mysql.jdbc.Connection.createNewIO(Connection.java:2771)
        at com.mysql.jdbc.Connection.<init>(Connection.java:1555)
        at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:285)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
        at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

** END NESTED EXCEPTION **

Last packet sent to the server was 1 ms ago.
        at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:

** BEGIN NESTED EXCEPTION **

java.net.ConnectException
MESSAGE: Connection refused

The cause was a MySQL user privileges issue:

mysql> SHOW GRANTS;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY PASSWORD '*4acfe3202a5ff5cf467898fc58aab1d615029441' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;
mysql> CREATE TABLE test (mkey varchar(), pkg varchar(), cid varchar(), pid varchar(), count int, primary key (mkey, pkg, cid, pid));
mysql> ALTER IGNORE TABLE cps_activation ADD UNIQUE index_day_pkgname_cid_pid (`day`, `pkgname`, `cid`, `pid`);
Query OK, 0 rows affected (0.03 sec)
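
The "Connection refused" above comes from the map tasks: they run on the task-tracker nodes and try to reach MySQL at localhost, where nothing is listening. Besides granting remote access as shown, it is worth checking from a worker node that MySQL is reachable over the network, and then putting the real host IP into the JDBC URL (as in the next attempt). A minimal check, assuming 10.10.20.11 is the MySQL host used below:

# Run from a task-tracker node to confirm the remote grant works before re-running the export.
mysql -h 10.10.20.11 -u root -padmin test -e "SELECT 1;"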


2. Table does not exist


===========


[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:16:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:16:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:16:26 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:16:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:16:28 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.jar
13/08/21 09:16:28 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:16:29 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:16:29 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:16:29 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:16:29 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:16:29 INFO mapred.JobClient: Running job: job_201307251523_0060
13/08/21 09:16:30 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:16:38 INFO mapred.JobClient: Task Id : attempt_201307251523_0060_m_000000_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs


        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at test.__loadFromFields(test.java:252)
        at test.parse(test.java:201)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more


Of course, to export data to MySQL the target table must already exist in the database, otherwise the export fails with an error.

This particular error is caused by Sqoop splitting the file into fields that do not match the columns of the MySQL table. You therefore have to tell Sqoop the field delimiter on the command line so it can parse the fields correctly. Hive's default field delimiter is '\001'.
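
To confirm which delimiter the files in the export directory actually use, a few raw bytes can be dumped with standard Hadoop and Unix tools; Hive's default separator shows up as \001 (octal 001) in the output. A small sketch, assuming the export directory used above:

# Dump the first bytes of the staged Hive files; field boundaries appear as 001 in the od -c output.
hadoop fs -cat /user/hive/warehouse/actmp/* | head -c 200 | od -c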


===========


3. Null value handling must be specified

The null placeholder used in the files was not passed to Sqoop, so the fields were parsed out of alignment.


[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\0001'

Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:21:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:21:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:21:07 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:21:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:21:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.jar
13/08/21 09:21:08 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:21:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:21:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:21:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:21:09 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:21:10 INFO mapred.JobClient: Running job: job_201307251523_0061
13/08/21 09:21:11 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:21:17 INFO mapred.JobClient:  map 25% reduce 0%
13/08/21 09:21:19 INFO mapred.JobClient:  map 50% reduce 0%
13/08/21 09:21:21 INFO mapred.JobClient: Task Id : attempt_201307251523_0061_m_000001_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs


        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.NumberFormatException: For input string: "665a5ffa-32c9-9463-1943-840a5feae193"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
        at java.lang.Integer.parseInt(Integer.java:458)
        at java.lang.Integer.valueOf(Integer.java:554)
        at test.__loadFromFields(test.java:264)
        at test.parse(test.java:201)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more


===========


4. Success


[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\0001' --input-null-string '\\N' --input-null-non-string '\\N'


Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:36:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:36:13 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:36:13 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:36:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:36:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.jar
13/08/21 09:36:14 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:36:15 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:36:15 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:36:15 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:36:15 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:36:16 INFO mapred.JobClient: Running job: job_201307251523_0064
13/08/21 09:36:17 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:36:23 INFO mapred.JobClient:  map 25% reduce 0%
13/08/21 09:36:25 INFO mapred.JobClient:  map 100% reduce 0%
13/08/21 09:36:27 INFO mapred.JobClient: Job complete: job_201307251523_0064
13/08/21 09:36:27 INFO mapred.JobClient: Counters: 18
13/08/21 09:36:27 INFO mapred.JobClient:   Job Counters
13/08/21 09:36:27 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=13151
13/08/21 09:36:27 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/08/21 09:36:27 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/08/21 09:36:27 INFO mapred.JobClient:     Rack-local map tasks=2
13/08/21 09:36:27 INFO mapred.JobClient:     Launched map tasks=4
13/08/21 09:36:27 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/08/21 09:36:27 INFO mapred.JobClient:   File Output Format Counters
13/08/21 09:36:27 INFO mapred.JobClient:     Bytes Written=0
13/08/21 09:36:27 INFO mapred.JobClient:   FileSystemCounters
13/08/21 09:36:27 INFO mapred.JobClient:     HDFS_BYTES_READ=1519
13/08/21 09:36:27 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=234149
13/08/21 09:36:27 INFO mapred.JobClient:   File Input Format Counters
13/08/21 09:36:27 INFO mapred.JobClient:     Bytes Read=0
13/08/21 09:36:27 INFO mapred.JobClient:   Map-Reduce Framework
13/08/21 09:36:27 INFO mapred.JobClient:     Map input records=6
13/08/21 09:36:27 INFO mapred.JobClient:     Physical memory (bytes) snapshot=663863296
13/08/21 09:36:27 INFO mapred.JobClient:     Spilled Records=0
13/08/21 09:36:27 INFO mapred.JobClient:     CPU time spent (ms)=3720
13/08/21 09:36:27 INFO mapred.JobClient:     Total committed heap usage (bytes)=2013790208
13/08/21 09:36:27 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=5583151104
13/08/21 09:36:27 INFO mapred.JobClient:     Map output records=6
13/08/21 09:36:27 INFO mapred.JobClient:     SPLIT_RAW_BYTES=571
13/08/21 09:36:27 INFO mapreduce.ExportJobBase: Transferred 1.4834 KB in 12.1574 seconds (124.9446 bytes/sec)
13/08/21 09:36:27 INFO mapreduce.ExportJobBase: Exported 6 records.
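
After a successful run, the six exported records can be double-checked on the MySQL side. A quick verification, assuming the same connection details as above:

# Count the rows that landed in the target table; this run should report 6.
mysql -h 10.10.20.11 -u root -padmin test -e "SELECT COUNT(*) FROM test;"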


———-


5. MySQL string column defined too short to hold the data


java.io.IOException: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:192)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2983)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
        at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
        at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:882)
        at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:233)
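
The fix is to enlarge the MySQL column so it can hold the longest value coming out of Hive. A rough sketch for the pid column from the error above; VARCHAR(100) is an arbitrary example length, pick one that fits your data:

# Widen the column that triggered the 'Data too long' truncation error.
mysql -h 10.10.20.11 -u root -padmin test -e "ALTER TABLE test MODIFY pid VARCHAR(100);"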


———————-


6. Date format issues


For a MySQL date column, the corresponding Hive string must be in yyyy-MM-dd format; I had used yyyyMMdd and got the error below (a possible fix is sketched after the stack trace).


13/08/21 17:42:44 INFO mapred.JobClient: Task Id : attempt_201307251523_0079_m_000000_1, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.IllegalArgumentException
        at java.sql.Date.valueOf(Date.java:138)
        at cps_activation.__loadFromFields(cps_activation.java:308)
        at cps_activation.parse(cps_activation.java:255)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
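
One way around this is to rewrite the day field in Hive before exporting, so the staged files already contain yyyy-MM-dd strings. A rough sketch using standard Hive date functions, assuming the active table and actmp staging directory from the examples above (the column order is an assumption):

# Re-stage the data with day converted from yyyyMMdd to yyyy-MM-dd, then export the directory as before.
hive -e "INSERT OVERWRITE DIRECTORY '/user/hive/warehouse/actmp'
SELECT id, from_unixtime(unix_timestamp(day, 'yyyyMMdd'), 'yyyy-MM-dd'), pkgname, cid, pid, activation
FROM active;"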
