When submitting a job to the Hadoop YARN platform from Windows, you will generally encounter the following error:
2014-05-28 17:32:19,761 WARN org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Exception from container-launch with container ID: container_1401177251807_0034_01_000001 and exit code: 1
org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash: line 0: fg: no job control
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
    at org.apache.hadoop.util.Shell.run(Shell.java:418)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
This error is tracked on the Hadoop MapReduce issue page (https://issues.apache.org/jira/browse/MAPREDUCE-5655); the affected versions are Hadoop 2.2 and Hadoop 2.3, and the issue is marked resolved (though it is not stated whether the fix shipped in Hadoop 2.4).
In the blog post http://blog.csdn.net/fansy1990/article/details/22896249, the author resolved the problem by following the solution from https://issues.apache.org/jira/browse/MAPREDUCE-5655. Here, a general solution to this problem is given.
1. First of all, this problem only occurs when a Windows Eclipse client submits a task to a Linux Hadoop cluster; if a Linux Eclipse client submits the task to a Linux Hadoop cluster, the problem does not occur. A very intuitive idea is to run the same task from both clients at the same time, then step through each in the debugger to find the difference. That would certainly work, but it is time-consuming (and installing Eclipse on Linux is a hassle);
2. Following the approach in 1, two differences can generally be found: one is in the Java command line, and the other is in the classpath. First, place the breakpoints:
(1) Breakpoint for the Java command:
YARNRunner.java, line 390 (CDH5.0 Hadoop 2.3 source):

// Setup the command to run the AM
List<String> vargs = new ArrayList<String>(8);
vargs.add(Environment.JAVA_HOME.$() + "/bin/java");
After the breakpoint is hit, run to line 445 and you can see that vargs looks like this (or inspect the vargsFinal variable):
[%JAVA_HOME%, -Dlog4j.configuration=container-log4j.properties, -Dyarn.app.container.log.dir=<LOG_DIR>, -Dyarn.app.container.log.filesize=0, -Dhadoop.root.logger=INFO,CLA, , -Xmx1024m, org.apache.hadoop.mapreduce.v2.app.MRAppMaster, 1><LOG_DIR>/stdout, 2><LOG_DIR>/stderr, null, null]
(2) Breakpoint for the classpath:
YARNRunner.java, line 466; inspect the value of environment, which looks like:
{CLASSPATH=%PWD%;$HADOOP_CONF_DIR;$HADOOP_COMMON_HOME/*;$HADOOP_COMMON_HOME/lib/*;$HADOOP_HDFS_HOME/*;$HADOOP_HDFS_HOME/lib/*;$HADOOP_MAPRED_HOME/*;$HADOOP_MAPRED_HOME/lib/*;$HADOOP_YARN_HOME/*;$HADOOP_YARN_HOME/lib/*;%HADOOP_MAPRED_HOME%\share\hadoop\mapreduce\*;%HADOOP_MAPRED_HOME%\share\hadoop\mapreduce\lib\*;job.jar/job.jar;job.jar/classes/;job.jar/lib/*;%PWD%/*}
3. Comparing the two values in 2 between Windows and Linux, there are mainly two differences:
(1) %VAR% versus $VAR variable references (and `;` versus `:` as the path separator);
(2) backslashes versus forward slashes (this one does not actually seem to make a difference);
4. Given the two differences above, if you directly change these two values to:
[$JAVA_HOME/bin/java -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA -Xmx1024m org.apache.hadoop.mapreduce.v2.app.MRAppMaster 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr]
and the classpath to:
{CLASSPATH=$PWD:$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*:$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*:job.jar/job.jar:job.jar/classes/:job.jar/lib/*:$PWD/*}
then it should be able to run;
5. How to change it?
(1) Create a new YARNRunner class in our own project, identical to the YARNRunner class in the source (same package path and same code content);
(2) At line 390 (this assumes the Hadoop cluster runs on Linux, the default case), replace

vargs.add(Environment.JAVA_HOME.$() + "/bin/java");

with

vargs.add("$JAVA_HOME/bin/java");
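The reason this substitution matters: when the submitting JVM runs on Windows, Environment.JAVA_HOME.$() emits the Windows-style %JAVA_HOME% reference, which bash on the Linux NodeManager never expands. A minimal standalone sketch of the two reference styles (the dollar helper below is illustrative, not the Hadoop API):

```java
// Illustrative sketch, not the Hadoop API: why %JAVA_HOME% fails on a Linux NodeManager.
public class VarExpansionDemo {
    // Mimics what Environment.$() does: emit a Windows- or Unix-style variable
    // reference depending on the platform of the *submitting* client.
    static String dollar(String var, boolean windowsClient) {
        return windowsClient ? "%" + var + "%" : "$" + var;
    }

    public static void main(String[] args) {
        // A Windows client bakes the Windows form into the container launch script:
        System.out.println(dollar("JAVA_HOME", true) + "/bin/java");   // %JAVA_HOME%/bin/java
        // but bash on the Linux NodeManager only expands the Unix form:
        System.out.println(dollar("JAVA_HOME", false) + "/bin/java");  // $JAVA_HOME/bin/java
    }
}
```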
(3) At line 466, add the call:

replaceEnvironment(environment);

and put the method itself at the end of the class:

private void replaceEnvironment(Map<String, String> environment) {
    String tmpClassPath = environment.get("CLASSPATH");
    tmpClassPath = tmpClassPath.replaceAll(";", ":");
    tmpClassPath = tmpClassPath.replaceAll("%PWD%", "\\$PWD");
    tmpClassPath = tmpClassPath.replaceAll("%HADOOP_MAPRED_HOME%", "\\$HADOOP_MAPRED_HOME");
    tmpClassPath = tmpClassPath.replaceAll("\\\\", "/");
    environment.put("CLASSPATH", tmpClassPath);
}
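To convince yourself the rewrite really produces the Linux-style classpath from step 4, the same string replacements can be exercised standalone on a shortened version of the Windows classpath from step 2 (the class name and sample entries below are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Standalone check of the CLASSPATH rewrite performed by replaceEnvironment.
// Class name and sample paths are illustrative; the string replacements are
// the same ones used in the patched YARNRunner.
public class ReplaceEnvironmentDemo {

    // The core rewrite: Windows-style classpath -> Linux-style classpath.
    static String rewrite(String cp) {
        cp = cp.replaceAll(";", ":");                                    // path separator
        cp = cp.replaceAll("%PWD%", "\\$PWD");                           // %VAR% -> $VAR
        cp = cp.replaceAll("%HADOOP_MAPRED_HOME%", "\\$HADOOP_MAPRED_HOME");
        cp = cp.replaceAll("\\\\", "/");                                 // backslash -> forward slash
        return cp;
    }

    // Same shape as the method added to YARNRunner.
    static void replaceEnvironment(Map<String, String> environment) {
        environment.put("CLASSPATH", rewrite(environment.get("CLASSPATH")));
    }

    public static void main(String[] args) {
        Map<String, String> env = new HashMap<String, String>();
        env.put("CLASSPATH",
            "%PWD%;$HADOOP_CONF_DIR;%HADOOP_MAPRED_HOME%\\share\\hadoop\\mapreduce\\*;job.jar/classes/;%PWD%/*");
        replaceEnvironment(env);
        System.out.println(env.get("CLASSPATH"));
        // -> $PWD:$HADOOP_CONF_DIR:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:job.jar/classes/:$PWD/*
    }
}
```

Note the double escaping in the replacement strings: `"\\$PWD"` is the literal replacement `$PWD` (the backslash stops replaceAll from treating `$` as a group reference), and `"\\\\"` is the regex for a single literal backslash.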
Once this is done, tasks submitted from Windows Eclipse to the Linux Hadoop cluster can run.
Finally, if at run time the Eclipse console does not print any logs, simply add a log4j.properties file under src (it can be copied from /etc/hadoop/conf/ on the Linux cluster).
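If no cluster copy is handy, a minimal log4j.properties that routes everything to the console looks like this (the INFO level and conversion pattern are a reasonable default of my choosing, not copied from any particular cluster):

```properties
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
```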