Copying a local file to HDFS: an exception during local testing

Source: Internet
Author: User

The project needed to copy local files to HDFS. To keep things simple, I wrote a small Java program that calls Hadoop's FileSystem.copyFromLocalFile method. While running in local mode on Windows 7, I encountered the following exception:

    An exception or error caused a run to abort:
    java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Ljava/lang/String;JJJI)Ljava/io/FileDescriptor;
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileOutputStreamWithMode(NativeIO.java:559)
        at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:219)
        at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
        at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
        at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:295)
        at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
        at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:388)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:451)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:430)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:920)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:901)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:368)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:341)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
        at org.apache.hadoop.fs.LocalFileSystem.copyFromLocalFile(LocalFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1882)
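For context, the copy itself takes only a few lines. Here is a minimal sketch of the kind of call that triggered the exception; the paths and the fs.defaultFS value are hypothetical, not taken from the original program:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToHdfs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode address; adjust to your environment.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        // FileSystem is Closeable, so try-with-resources cleans up the connection.
        try (FileSystem fs = FileSystem.get(conf)) {
            // Example paths: copy a local file into HDFS.
            fs.copyFromLocalFile(new Path("C:/data/input.txt"),
                                 new Path("/user/test/input.txt"));
        }
    }
}
```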

Analyzing the exception stack, the error is thrown from the org.apache.hadoop.io.nativeio.NativeIO$Windows.createFileWithMode0 method, which is declared as follows:
    private static native FileDescriptor createFileWithMode0(String path,
        long desiredAccess, long shareMode, long creationDisposition, int mode)
        throws NativeIOException;

The declaration shows that createFileWithMode0 is a native method: its implementation lives in the native Hadoop library, not in Java, so the UnsatisfiedLinkError means the loaded native library does not provide it. So why was this method called at all? Continuing up the exception stack:

NativeIO$Windows is invoked from the constructor of org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream; the relevant code is as follows:

    private LocalFSFileOutputStream(Path f, boolean append,
        FsPermission permission) throws IOException {
      File file = pathToFile(f);
      if (permission == null) {
        this.fos = new FileOutputStream(file, append);
      } else {
        if (Shell.WINDOWS && NativeIO.isAvailable()) {
          this.fos = NativeIO.Windows.createFileOutputStreamWithMode(file,
              append, permission.toShort());
        } else {
          this.fos = new FileOutputStream(file, append);
          boolean success = false;
          try {
            setPermission(f, permission);
            success = true;
          } finally {
            if (!success) {
              IOUtils.cleanup(LOG, this.fos);
            }
          }
        }
      }
    }

From the call stack we know execution reached the NativeIO.Windows.createFileOutputStreamWithMode call above, so the condition Shell.WINDOWS && NativeIO.isAvailable() must have evaluated to true. The NativeIO.isAvailable method is:

    /**
     * Return true if the JNI-based native IO extensions are available.
     */
    public static boolean isAvailable() {
      return NativeCodeLoader.isNativeCodeLoaded() && nativeLoaded;
    }

isAvailable primarily relies on the NativeCodeLoader.isNativeCodeLoaded method:

    static {
      // Try to load native hadoop library and set fallback flag appropriately
      if (LOG.isDebugEnabled()) {
        LOG.debug("Trying to load the custom-built native-hadoop library...");
      }
      try {
        System.loadLibrary("hadoop");
        LOG.debug("Loaded the native-hadoop library");
        nativeCodeLoaded = true;
      } catch (Throwable t) {
        // Ignore failure to load
        if (LOG.isDebugEnabled()) {
          LOG.debug("Failed to load native-hadoop with error: " + t);
          LOG.debug("java.library.path=" +
              System.getProperty("java.library.path"));
        }
      }

      if (!nativeCodeLoaded) {
        LOG.warn("Unable to load native-hadoop library for your platform... " +
                 "using builtin-java classes where applicable");
      }
    }

    /**
     * Check if native-hadoop code is loaded for this platform.
     *
     * @return <code>true</code> if native-hadoop is loaded,
     *         else <code>false</code>
     */
    public static boolean isNativeCodeLoaded() {
      return nativeCodeLoaded;
    }

As you can see, isNativeCodeLoaded simply returns a flag. So where does the flag get set, and where does the problem occur?

The flag is set in the static initializer of the NativeCodeLoader class, which calls System.loadLibrary("hadoop"). Could this call be the cause? Debugging on colleagues' machines, System.loadLibrary("hadoop") throws, so execution falls into the catch block and nativeCodeLoaded stays false; on my machine the call succeeds and execution continues. What does System.loadLibrary do? Looking at the source, it searches the directories on java.library.path, which on Windows is built from the system and user PATH environment variables. The analysis showed the cause on my machine: a hadoop.dll file existed in C:\Windows\System32, and the %HADOOP_HOME%\bin directory was also on my PATH.
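You can probe the same load call yourself to see which branch your machine takes. This is a standalone sketch (the name "hadoop" resolves to hadoop.dll on Windows and libhadoop.so on Linux):

```java
public class NativeLoadProbe {
    /** Try to load a native library the same way NativeCodeLoader does. */
    public static boolean tryLoad(String name) {
        try {
            // Searches each directory on java.library.path for the library.
            System.loadLibrary(name);
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("hadoop native library loaded: " + tryLoad("hadoop"));
        System.out.println("java.library.path=" +
            System.getProperty("java.library.path"));
    }
}
```

If tryLoad("hadoop") prints true on a machine where you never installed Hadoop native binaries, a stray hadoop.dll is lurking somewhere on the library path.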

In short, because some directory on the PATH environment variable contained a hadoop.dll file, the JVM loaded it and NativeIO.isAvailable() returned true; Hadoop then tried to use the Windows native IO call createFileWithMode0, which that DLL (evidently from an incompatible Hadoop build) does not export, producing the UnsatisfiedLinkError. The fix is simple: check every directory on the system PATH and make sure none of them contains a hadoop.dll file.
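Rather than eyeballing every PATH entry by hand, a small helper can scan them. This is an illustrative sketch, not part of the original post; the class and method names are made up:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class HadoopDllScanner {
    /** Return every directory in a PATH-style string that contains the given file. */
    public static List<String> dirsContaining(String pathValue, String fileName) {
        List<String> hits = new ArrayList<>();
        // File.pathSeparator is ";" on Windows and ":" on Unix-like systems.
        for (String dir : pathValue.split(File.pathSeparator)) {
            if (!dir.isEmpty() && new File(dir, fileName).isFile()) {
                hits.add(dir);
            }
        }
        return hits;
    }

    public static void main(String[] args) {
        String path = System.getenv("PATH");
        // Prints every PATH directory that ships a hadoop.dll.
        System.out.println(dirsContaining(path == null ? "" : path, "hadoop.dll"));
    }
}
```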

Note that after removing a directory from the system PATH, you need to restart IntelliJ IDEA for the change to take effect, because the ClassLoader caches the library search path in its usr_paths and sys_paths fields when the JVM starts.

