Error Stack:
INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Executing query: SELECT "CTJX60","CTJX61","CTJX62","CTJX63","CTJX64","CTJX65","CTJX66","CTJX67","CTJX68","CTJX69","CTJX70","CTJX71","CTJX72","CTJX73","CTJX74","CTJX75","CTJX76","CTJX77","CTJX78","CTJX79","CTJX80","CTJX81","CTJX82","CTJX83","CTJX84","CTJX85","CTJX86","CTJX87","CTJX88","CTJX89","CTJX90","CTJX91","CTJX92","CTJX93","CTJX94","CTJX95","CTJX96","CTJX97","CTJX98","CTJX99","CTJX55","IID","iexam_iid","CTSBGLX","CTJX01","CTJX02","CTJX03","CTJX04","CTJX05","CTJX06","CTJX07","CTJX08","CTJX09","CTJX10","CTJX11","CTJX12","CTJX13","CTJX14","CTJX15","CTJX16","CTJX17","CTJX18","CTJX19","CTJX20","CTJX21","CTJX22","CTJX23","CTJX24","CTJX25","CTJX26","CTJX27","CTJX28","CTJX29","CTJX30","CTJX31","CTJX32","CTJX33","CTJX34","CTJX35","CTJX36","CTJX37","CTJX38","CTJX39","CTJX40","CTJX41","CTJX42","CTJX43","CTJX44","CTJX45","CTJX46","CTJX47","CTJX48","CTJX49","CTJX50","CTJX51","CTJX52","CTJX53","CTJX54","CTJX56","CTJX57","CTJX58","CTJX59" FROM Ris."ICNRIS_UIS_EXAM_TJX" Tbl WHERE ( 1=1 ) AND ( 1=1 )
INFO [Thread] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: Java heap space
	at oracle.jdbc.driver.OracleStatement.prepareAccessors(OracleStatement.java:870)
	at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1047)
	at oracle.jdbc.driver.T4CPreparedStatement.executeMaybeDescribe(T4CPreparedStatement.java:850)
	at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1134)
	at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3339)
	at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3384)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Solution: reduce the Sqoop --fetch-size parameter. While troubleshooting, I read through the Sqoop source code, noticed the fetchSize setting, and realized that lowering this parameter would keep the fetch batch small enough to fit in the mapper's heap.
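As a sketch of what this looks like on the command line (the connect string, credentials, target directory, and the value 100 below are placeholders for illustration, not the original job's values):

```shell
# Hypothetical Sqoop invocation; connection details and paths are placeholders.
# --fetch-size caps how many rows the Oracle JDBC driver buffers in the
# mapper's heap per round trip, so a smaller value trades extra round trips
# for a much smaller per-fetch memory footprint.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username ris \
  --password-file /user/hadoop/ris.pwd \
  --table ICNRIS_UIS_EXAM_TJX \
  --fetch-size 100 \
  --target-dir /data/ris/icnris_uis_exam_tjx \
  --num-mappers 4
```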
PS: Increasing the mapper memory (-D mapreduce.map.memory.mb=8192 -D yarn.app.mapreduce.am.resource.mb=6144) made no difference. The real problem is that when fetching data from the database, the source table's column count is not the issue; each column's values are very large, so every fetched row consumes a lot of heap.
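A rough back-of-envelope calculation makes this concrete (the column count, bytes-per-column, and fetch sizes below are illustrative assumptions, not measured values): the driver's per-fetch buffer grows with fetch size times row width, so wide rows blow the heap long before extra mapper memory helps.

```python
def jdbc_fetch_buffer_bytes(fetch_size: int, num_cols: int, bytes_per_col: int) -> int:
    """Approximate heap held for one fetch batch: the driver sizes its
    row buffers for every column of every prefetched row."""
    return fetch_size * num_cols * bytes_per_col

# Illustrative numbers: ~104 columns, ~20 KB per column value.
big = jdbc_fetch_buffer_bytes(fetch_size=1000, num_cols=104, bytes_per_col=20_000)
small = jdbc_fetch_buffer_bytes(fetch_size=100, num_cols=104, bytes_per_col=20_000)

print(big)    # 2080000000 — about 2 GB buffered for a 1000-row fetch
print(small)  # 208000000 — about 208 MB with a fetch size of 100
```

Under these assumptions a 1000-row fetch alone can exceed an 8 GB mapper's usable heap once object overhead is added, which matches the observed behavior: shrinking --fetch-size fixed the job while raising mapreduce.map.memory.mb did not.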
Resolving the Sqoop error: java.lang.OutOfMemoryError: Java heap space