【hadoop】—— Modifying Hadoop's FileUtil.java to solve the permission check problem
In the article "Hadoop Eclipse Development Environment Setup", step 15 mentions a permission-related exception, shown below:
15/01/30 10:08:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/01/30 10:08:17 ERROR security.UserGroupInformation: PriviledgedActionException as:zhangchao3 cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-zhangchao3\mapred\staging\zhangchao3502228304\.staging to 0700
Exception in thread "main" java.io.IOException: Failed to set permissions of path: \tmp\hadoop-zhangchao3\mapred\staging\zhangchao3502228304\.staging to 0700
    at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
    at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
    at org.apache.hadoop.examples.WordCount.main(WordCount.java:68)
From the message we can see that the exception is thrown at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689). Here is the source at that location, for a closer look:
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission) throws IOException {
  if (!rv) {
    throw new IOException("Failed to set permissions of path: " + p +
                          " to " +
                          String.format("%04o", permission.toShort()));
  }
}
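For context, the rv flag essentially comes from java.io.File's permission setters: when the native hadoop library cannot be loaded (note the NativeCodeLoader warning above), FileUtil falls back to the pure-Java API, and on Windows those calls often return false when asked to revoke permissions. The following is only an illustrative probe of that JDK behaviour, not Hadoop code; the class name PermissionProbe is made up:

import java.io.File;
import java.io.IOException;

public class PermissionProbe {
    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("perm-probe", ".tmp");
        // java.io.File permission setters report success/failure as a boolean.
        // On Windows, revoking read/write/execute for "others" is not fully
        // supported, so these calls can return false -- the same kind of false
        // that ends up as rv in checkReturnValue.
        boolean readable   = f.setReadable(false, false);
        boolean writable   = f.setWritable(false, false);
        boolean executable = f.setExecutable(false, false);
        System.out.printf("setReadable=%b setWritable=%b setExecutable=%b%n",
                readable, writable, executable);
        f.delete();
    }
}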
In the exception above, line FileUtil.java:689 is the throw new IOException("Failed to ... statement. I simply removed the if block from this method, leaving:
private static void checkReturnValue(boolean rv, File p,
                                     FsPermission permission) throws IOException {
  // Body intentionally left empty so the permission check is skipped.
}
This turns checkReturnValue into an empty method, so permissions are no longer checked. Rebuild by running ant jar in /home/hadoop/hadoop-1.0.3/; for problems you may hit during the build, see: Maven timeout when compiling Hadoop. Once the build finishes, the following files are produced:
Copy the resulting hadoop-core-1.0.4-SNAPSHOT.jar to Windows; opening it with a Java decompiler shows that checkReturnValue has indeed been modified:
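If you don't have a decompiler at hand, a quick sanity check is to confirm that the rebuilt jar at least contains the FileUtil class you just recompiled. This is only a rough substitute (it does not show that the method body changed); JarCheck and the jar path are illustrative assumptions:

import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class JarCheck {
    public static void main(String[] args) throws Exception {
        // Adjust the path to wherever you copied the rebuilt jar on Windows.
        try (JarFile jar = new JarFile("hadoop-core-1.0.4-SNAPSHOT.jar")) {
            JarEntry entry = jar.getJarEntry("org/apache/hadoop/fs/FileUtil.class");
            if (entry != null) {
                System.out.println("FileUtil.class found, size=" + entry.getSize() + " bytes");
            } else {
                System.out.println("FileUtil.class missing -- wrong jar?");
            }
        }
    }
}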
Rename hadoop-core-1.0.4-SNAPSHOT.jar to hadoop-core-1.0.3.jar and use it to overwrite the hadoop-core-1.0.3.jar in the Eclipse project:
Re-run "Run on hadoop" and the job now completes successfully. (For how to use Run on hadoop, see section 11 of "Hadoop Eclipse Development Environment Setup".)
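As an optional, quicker check than a full Run on hadoop, you can reproduce the failing call path from the stack trace (RawLocalFileSystem.mkdirs with a 0700 permission) directly. This is a minimal sketch assuming hadoop-core 1.0.x is on the classpath; StagingDirProbe and the directory path are made up for illustration:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RawLocalFileSystem;
import org.apache.hadoop.fs.permission.FsPermission;

public class StagingDirProbe {
    public static void main(String[] args) throws Exception {
        // Reproduces the call path from the stack trace:
        // RawLocalFileSystem.mkdirs -> FileUtil.setPermission -> checkReturnValue.
        RawLocalFileSystem fs = new RawLocalFileSystem();
        fs.initialize(URI.create("file:///"), new Configuration());
        Path staging = new Path("/tmp/hadoop-probe/mapred/staging/.staging");
        // With the stock 1.0.3 jar this is expected to throw on Windows
        // ("Failed to set permissions of path ... to 0700");
        // with the patched jar it should simply succeed.
        boolean ok = fs.mkdirs(staging, new FsPermission((short) 0700));
        System.out.println("mkdirs returned: " + ok);
    }
}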
The patched jar can be downloaded here: http://download.csdn.net/download/uestczhangchao/8420249