Description: When compiling a Hadoop program with Eclipse on Windows and running it on the Hadoop cluster, the following error occurs:
11/10/28 16:05:53 INFO mapred.JobClient: Running job: job_201110281103_0003
11/10/28 16:05:54 INFO mapred.JobClient: map 0% reduce 0%
11/10/28 16:06:05 INFO mapred.JobClient: Task Id: attempt_201110281103_0003_m_000002_0, Status: FAILED
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=drwho, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
Solution:
Modify the Hadoop configuration file conf/hdfs-site.xml on the server, locate the dfs.permissions configuration item, and change its value to false:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>
After the modification, it seems the Hadoop daemons need to be restarted for the change to take effect.
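For reference, a minimal restart sequence, assuming the standard scripts under $HADOOP_HOME/bin on the server (restarting only HDFS with stop-dfs.sh/start-dfs.sh should also be sufficient, since dfs.permissions is an HDFS setting):
$ cd $HADOOP_HOME
$ bin/stop-all.sh    # stop the NameNode/DataNode and JobTracker/TaskTracker daemons
$ bin/start-all.sh   # start them again so the new dfs.permissions value is loaded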
Development environment: Windows XP SP3, Eclipse 3.3, hadoop-0.20.2
Hadoop server deployment environment: Ubuntu 10.10, hadoop-0.20.2
Conclusion: I have only been working with Hadoop for a short time, so I do not know how this modification affects the security of the cluster.
// Supplement:
Later, I found an article via Google that explains the cause of the error. The original article is here:
http://hi.baidu.com/hontlong/blog/item/9ba50ddbd1e16270d0164ec4.html
When Eclipse submits jobs through the Hadoop plug-in, the jobs are written to the HDFS file system as the user drwho by default, under /user/XXX on HDFS; in my case the directory is /user/hadoop. The exception occurs because the drwho user has no write permission on that directory. The solution is to open up the hadoop directory with the following command: $ hadoop fs -chmod 777 /user/hadoop
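For reference, assuming the hadoop client script is on the server's PATH, the change can be applied and then verified as follows:
$ hadoop fs -chmod 777 /user/hadoop   # allow the drwho user to write into the directory
$ hadoop fs -ls /user                 # /user/hadoop should now be listed as drwxrwxrwx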