1: Submitting a Hadoop job from Eclipse fails with org.apache.hadoop.security.AccessControlException: Permission denied: user=DrWho, access=WRITE. Description:
11/10/28 16:05:53 INFO mapred.JobClient: Running job: job_201110281103_0003
11/10/28 16:05:54 INFO mapred.JobClient: map 0% reduce 0%
11/10/28 16:06:05 INFO mapred.JobClient: Task Id: attempt_201110281103_0003_m_000002_0, Status: FAILED
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=DrWho, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
Solution:
Modify the Hadoop configuration file conf/hdfs-site.xml on the server: find the dfs.permissions property and set its value to false.
<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>
After making this change, it seems the Hadoop daemons need to be restarted for it to take effect.
Development environment: Windows XP SP3, Eclipse 3.3, hadoop-0.20.2
Hadoop server deployment environment: Ubuntu 10.10, hadoop-0.20.2
Conclusion: I have only been working with Hadoop for a short time, so I do not know how this change affects the security of the cluster.
// Supplement:
Later I found an article through Google that explains the cause of the error. The original address is:
http://hi.baidu.com/hontlong/blog/item/9ba50ddbd1e16270d0164ec4.html
When the Eclipse Hadoop plug-in submits a job, it writes to the HDFS file system as the user DrWho by default, which maps to /user/XXX on HDFS; my account maps to /user/hadoop. The exception occurs because the DrWho user has no write permission on the hadoop directory. The solution is to open up the permissions on that directory, with a command like: $ hadoop fs -chmod 777 /user/hadoop
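The check that rejects the job is ordinary Unix-style mode checking on the inode. As an illustration only, here is a simplified Python sketch (not Hadoop's actual code; the user, owner, group, and mode values are taken from the log above):

```python
def may_write(user, user_groups, owner, group, mode):
    """Unix-style write check against a mode string like 'rwxr-xr-x'.

    HDFS-style rule: use the owner bits if the user owns the inode,
    else the group bits if the user belongs to the inode's group,
    else the 'other' bits; then test whether the 'w' bit is set.
    """
    if user == owner:
        bits = mode[0:3]   # owner bits, e.g. 'rwx'
    elif group in user_groups:
        bits = mode[3:6]   # group bits, e.g. 'r-x'
    else:
        bits = mode[6:9]   # other bits, e.g. 'r-x'
    return bits[1] == "w"

# The situation from the log: inode "hadoop":hadoop:supergroup:rwxr-xr-x
print(may_write("DrWho", [], "hadoop", "supergroup", "rwxr-xr-x"))    # → False
print(may_write("hadoop", [], "hadoop", "supergroup", "rwxr-xr-x"))   # → True
```

DrWho is neither the owner nor in supergroup, so it falls through to the "other" bits r-x, which lack w, hence the AccessControlException; chmod 777 sets the w bit in all three positions.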
2: cannot create directory /user/root/output01/_temporary. Name node is in safe mode
Solution: wait a while until the NameNode leaves safe mode. (If you are sure the cluster is healthy, you can also leave safe mode manually with: $ hadoop dfsadmin -safemode leave)