Following the documentation at http://www.micmiu.com/bigdata/hadoop/hadoop2x-eclipse-mapreduce-demo/installation, I configured Eclipse and ran the WordCount program, which failed with this error:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NullPointerException
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1011)
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
	at org.apache.hadoop.util.Shell.run(Shell.java:379)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
	at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
	at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
	at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
	at com.telchina.hadoop.WordCount.main(WordCount.java:84)
After some Googling, I found http://www.uniorder.com/2014/03/31/%E5%9C%A8%20Windows%20%E4%B8%8A%E4%BD%BF%E7%94%A8%20Eclipse%20%e8%bf%9c%e7%a8%8b%e8%b0%83%e8%af%95%20hadoop%202.2%20mapreduce%20%e7%a8%8b%e5%ba%8f/ (a post on remote-debugging Hadoop 2.2 MapReduce programs with Eclipse on Windows), which advises:
Download Hadoop 2.2 to your Windows machine (mine is Windows 7) and set the environment variables: HADOOP_HOME pointing to your Hadoop directory, and %HADOOP_HOME%\bin appended to PATH. That alone is not enough: you also need to download the Windows native binaries (linked as "poke here" in the original post) and overwrite all the files under hadoop/bin with them.
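A related workaround worth knowing (a sketch, not from the original post): instead of relying only on the HADOOP_HOME environment variable, you can set the hadoop.home.dir system property at the very top of main, before any Hadoop class is touched, since Hadoop's Shell class also consults this property when locating bin\winutils.exe. The class name and the directory path below are placeholder assumptions; point the path at your own Hadoop directory.

```java
// Minimal sketch: tell Hadoop where its home directory (containing
// bin\winutils.exe) lives, without an environment variable.
public class HadoopHomeSetup {
    public static void main(String[] args) {
        // Must run before any Hadoop class (e.g. Job, FileSystem) is loaded,
        // because Shell resolves the home directory in a static initializer.
        // "C:\\hadoop-2.2.0" is an assumed path; substitute your own.
        System.setProperty("hadoop.home.dir", "C:\\hadoop-2.2.0");

        // ...normal WordCount job setup would follow here...
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

This is convenient when debugging from Eclipse, because it avoids restarting the IDE after changing environment variables.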
Problem solved.
Author: Sdjnzqr
Source: http://www.cnblogs.com/sdjnzqr/
Copyright: This article is jointly owned by the author and Cnblogs (博客园).
Reprint: Reprints are welcome, but this notice must be retained and a link to the original article must be included; otherwise the author reserves the right to pursue legal liability.