Sqoop exporting Hive data to MySQL error: Caused by: java.lang.RuntimeException: Can't parse input data




A Sqoop export of data to a local MySQL database failed with an error. The command was as follows:


sqoop export --connect 'jdbc:mysql://202.193.60.117/dataweb?useUnicode=true&characterEncoding=utf-8' --username root --password-file /user/hadoop/.password --table user_info_copy --export-dir /user/hadoop/user_info --input-fields-terminated-by "@"


The error log is as follows:




Error: java.io.IOException: Can't export data, please check failed map task logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.RuntimeException: Can't parse input data: '2,hello,456,0'
    at user_info_copy.__loadFromFields(user_info_copy.java:335)
    at user_info_copy.parse(user_info_copy.java:268)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
Caused by: java.lang.NumberFormatException: For input string: "2,hello,456,0"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Integer.parseInt(Integer.java:580)
    at java.lang.Integer.valueOf(Integer.java:766)
    at user_info_copy.__loadFromFields(user_info_copy.java:317)
    ... 12 more
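When an export fails like this, a useful first check is to look at the raw records Sqoop is reading. A quick sketch using the paths from the command above; cat -A (from GNU coreutils) prints control characters visibly, so Hive's default field terminator \001 would show up as ^A:

hadoop fs -cat /user/hadoop/user_info/* | head -n 5 | cat -A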


This error comes down to the delimiter. The command tells Sqoop that fields are terminated by "@", but that character never appears in the record, so Sqoop treats the entire line "2,hello,456,0" as a single field and passes it to Integer.valueOf, which throws the NumberFormatException above. If you do not specify a delimiter when creating a Hive table, Hive uses its defaults:

'\001' terminates fields
'\002' terminates collection items
'\003' terminates map keys
'\n' terminates lines
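
To check which delimiters an existing table actually uses, you can inspect it from the Hive shell. A quick sketch; the table name user_info is an assumption based on the export directory above:

-- Shows SerDe and storage properties, including field.delim when one was set explicitly
DESCRIBE FORMATTED user_info;

-- Or print the full DDL, including any ROW FORMAT clause
SHOW CREATE TABLE user_info;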



So the fix is to make the two sides agree: either change the delimiter when creating the Hive table, or tell the Sqoop job which delimiter the data actually uses. In my case the table had been created locally through a graphical management tool, and changing the delimiter to "," solved the problem. If your data uses a different separator, adjust the command line accordingly, or recreate the table with an explicit delimiter using the clause below.
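
On the Sqoop side, the only change needed is the --input-fields-terminated-by argument. A minimal sketch, assuming the data in /user/hadoop/user_info is comma-delimited as the failing record suggests (everything else as in the original command):

sqoop export --connect 'jdbc:mysql://202.193.60.117/dataweb?useUnicode=true&characterEncoding=utf-8' --username root --password-file /user/hadoop/.password --table user_info_copy --export-dir /user/hadoop/user_info --input-fields-terminated-by ','

If the table instead kept Hive's default field terminator, pass --input-fields-terminated-by '\001'.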




ROW FORMAT DELIMITED FIELDS TERMINATED BY 'delimiter'
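
For completeness, a full table definition with explicit delimiters might look like the following. The column names and types are hypothetical, inferred from the failing record "2,hello,456,0"; substitute your real schema:

-- Hypothetical schema; columns inferred from the sample row "2,hello,456,0"
CREATE TABLE user_info (
  id     INT,
  name   STRING,
  code   INT,
  status INT
)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE;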





The above is the main content of this post, which I put together while working through the problem myself. I hope it points you in the right direction. If it helped, a vote of support would be appreciated; if not, my apologies, and please point out any mistakes. Follow the blog to be the first to get updates. Thank you!




