java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries


While debugging a Spark job that reads from HBase in IntelliJ IDEA on Windows 7, against a cluster built on CentOS 6.6 + Hadoop 2.7 + HBase 0.98 + Spark 1.3.1, the program fails immediately with this error:

15/06/11 15:35:50 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:356)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:371)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:364)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:272)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:790)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:760)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:633)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2001)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2001)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
    at SparkFromHbase$.main(SparkFromHbase.scala:15)
    at SparkFromHbase.main(SparkFromHbase.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)

A look at the Hadoop source reveals the following:

  public static final String getQualifiedBinPath(String executable)
      throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin"
      + File.separator + executable;
    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
      throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
    }
    return exeFile.getCanonicalPath();
  }

  private static String HADOOP_HOME_DIR = checkHadoopHome();

  private static String checkHadoopHome() {
    // first check the Dflag hadoop.home.dir with JVM scope
    String home = System.getProperty("hadoop.home.dir");
    // fall back to the system/user-global env variable
    if (home == null) {
      home = System.getenv("HADOOP_HOME");
    }
    ...
  }

Clearly this is a HADOOP_HOME problem: when HADOOP_HOME_DIR is null, fullExeName becomes the literal null\bin\winutils.exe. The workaround is simple. You could configure the HADOOP_HOME environment variable, but if you don't want to restart the machine you can set it directly in the program:

System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");

Note: E:\\Program Files\\hadoop-2.7.0 is the path where I extracted Hadoop on my local machine.
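To see why this one-liner works, here is a minimal, self-contained sketch of the resolution order (the class name HadoopHomeDemo and the helper resolveHadoopHome are my own illustrative names; the logic mirrors checkHadoopHome() shown above, and the path is just the example from the note):

```java
// Minimal sketch of Hadoop's home-directory resolution (class and method
// names are hypothetical; the logic mirrors checkHadoopHome() above).
public class HadoopHomeDemo {

    // The JVM property wins; otherwise fall back to the HADOOP_HOME env variable.
    static String resolveHadoopHome() {
        String home = System.getProperty("hadoop.home.dir");
        if (home == null) {
            home = System.getenv("HADOOP_HOME");
        }
        return home; // may be null -> the "null\bin\winutils.exe" in the error
    }

    public static void main(String[] args) {
        // Set the property BEFORE the first Hadoop/Spark class is loaded,
        // i.e. at the very top of main(), before creating the SparkContext.
        System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0");
        System.out.println(resolveHadoopHome() + "\\bin\\winutils.exe");
    }
}
```

The key point is ordering: Shell's static initializer reads the property once, when the class first loads, so the setProperty call must come before anything that touches Hadoop classes.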

You may still get the same error after doing this, and you may be tempted to blame me for it. But first open your hadoop-x.x.x/bin directory and look: you will find there is no winutils.exe in it at all.

So let me tell you: you can download one from GitHub. Everyone knows the address, but I'll send it to you anyway.

Address: https://github.com/srccodes/hadoop-common-2.2.0-bin

Don't worry about its version, and don't be afraid: I use it with the newer hadoop-2.7.0 with no problems at all! After downloading, copy winutils.exe into your hadoop-x.x.x/bin directory.
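After copying the file, you can sanity-check the layout with a few lines that build the same path getQualifiedBinPath() constructs (WinutilsCheck and winutilsPath are my own illustrative names; substitute your own install path):

```java
import java.io.File;

// Quick sanity check that winutils.exe sits where getQualifiedBinPath()
// will look for it (class and method names here are illustrative).
public class WinutilsCheck {

    // Build the same <home>\bin\winutils.exe path Hadoop constructs.
    static File winutilsPath(String hadoopHome) {
        return new File(hadoopHome + File.separator + "bin"
            + File.separator + "winutils.exe");
    }

    public static void main(String[] args) {
        File exe = winutilsPath("E:\\Program Files\\hadoop-2.7.0"); // your path
        System.out.println(exe.exists() ? "found: " + exe : "missing: " + exe);
    }
}
```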

That should solve the problem. If it still isn't solved, then you are a rare one, brother, and you're welcome to add my QQ!

