Error Stack:
INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: 1=1 AND 1=1
INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Working on split: 1=1 AND 1=1
INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Executing query: SELECT "EXTEND3","EXTEND2","EXTEND1","MEMO","Oper_date","Oper_code","File_content","File_name","Inpatient_no","ID" FROM His_sdzl."Mdt_file" tbl WHERE (1=1) AND (1=1)
INFO [Thread-?] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:3332)
    at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:137)
    at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:121)
    at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:514)
    at java.lang.StringBuffer.append(StringBuffer.java:352)
    at java.util.regex.Matcher.appendReplacement(Matcher.java:888)
    at java.util.regex.Matcher.replaceAll(Matcher.java:955)
    at java.lang.String.replaceAll(String.java:2223)
    at QueryResult.readFields(QueryResult.java:205)
    at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:244)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
    at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Reducing the --fetch-size parameter did not solve the problem, which suggests that a single row occupies a very large amount of memory. Sqoop generates a record class for the imported table, QueryResult.java; from line 244 referenced in the stack, the failing column can be identified as File_content, a binary column. Querying the source database confirmed it: the largest value in that column reaches about 180 MB.
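Given that diagnosis, two mitigations suggest themselves: give each map task a heap large enough to hold the biggest row, and/or keep oversized LOB values out of the in-memory record. A sketch of the relevant Sqoop/Hadoop options follows; the connection string, credentials, table name, heap sizes, and target directory are placeholders, not values taken from the original job:

```shell
# Sketch only: connection details and paths are placeholders.
# 1) Enlarge the map task container and its JVM heap (generic Hadoop
#    properties, passed through to the MapReduce job with -D).
# 2) --inline-lob-limit tells Sqoop to spill LOB values larger than the
#    given byte threshold into separate files under a _lobs/ directory
#    instead of materializing them inside the record in memory.
sqoop import \
  -Dmapreduce.map.memory.mb=4096 \
  -Dmapreduce.map.java.opts=-Xmx3276m \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username someuser -P \
  --table MDT_FILE \
  --fetch-size 1 \
  --inline-lob-limit 16777216 \
  --target-dir /data/mdt_file
```

With --fetch-size 1 only one row is buffered per JDBC round trip, so the map task heap needs to cover roughly one largest row plus the copies made while the record is serialized.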
PS: How do I query the size of a BLOB field with a standard SQL statement?
The table has a lot of BLOB fields. For an ordinary column (even back on Oracle 9i) LENGTH or LENGTHB would be the answer, but that does not work for a BLOB; DBMS_LOB.GETLENGTH() can be used instead.
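Concretely, a query along those lines might look as follows; the schema, table, and column names are taken from the log above, the connection string and username are placeholders, and running it through sqoop eval is merely convenient here (sqlplus works equally well):

```shell
# DBMS_LOB.GETLENGTH returns a BLOB's length in bytes
# (for a CLOB it counts characters instead).
sqoop eval \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username someuser -P \
  --query 'SELECT MAX(DBMS_LOB.GETLENGTH(file_content)) AS max_len FROM His_sdzl."Mdt_file"'
```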
Troubleshooting Sqoop error: Error running child: java.lang.OutOfMemoryError: Java heap space