An error occurred while submitting a Hadoop program from Eclipse.


1: Submitting a Hadoop program from Eclipse fails with org.apache.hadoop.security.AccessControlException: Permission denied: user=drwho, access=WRITE. Details:

11/10/28 16:05:53 INFO mapred.JobClient: Running job: job_201110281103_0003
11/10/28 16:05:54 INFO mapred.JobClient:  map 0% reduce 0%
11/10/28 16:06:05 INFO mapred.JobClient: Task Id : attempt_201110281103_0003_m_000002_0, Status : FAILED
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=drwho, access=WRITE, inode="hadoop":hadoop:supergroup:rwxr-xr-x
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)

 

Solution:

Modify the Hadoop configuration file on the server, conf/hdfs-site.xml: find the dfs.permissions property and set its value to false.

<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>
    If "true", enable permission checking in HDFS.
    If "false", permission checking is turned off,
    but all other behavior is unchanged.
    Switching from one parameter value to the other does not change the mode,
    owner or group of files or directories.
  </description>
</property>

 

After the modification, it seems the Hadoop daemons need to be restarted for the change to take effect.
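For reference, a minimal sketch of the restart on the server, assuming a hadoop-0.20.2 installation managed with the bundled control scripts and run from the installation directory (paths and script choice may differ in your setup):

$ bin/stop-all.sh       # stop the NameNode, DataNodes, JobTracker and TaskTrackers
$ bin/start-all.sh      # start them again so the edited hdfs-site.xml is picked up

Restarting only HDFS (bin/stop-dfs.sh and bin/start-dfs.sh) should also be enough for this particular property, since dfs.permissions is read by the NameNode.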

 

Development environment: Windows XP SP3, Eclipse 3.3, hadoop-0.20.2

Hadoop server deployment environment: Ubuntu 10.10, hadoop-0.20.2

Conclusion: I have only just started using Hadoop, so I don't know how this modification affects the security of the cluster.

 

// Supplement:

Later I found an article through Google that explains the cause of the error. The original address is as follows:

http://hi.baidu.com/hontlong/blog/item/9ba50ddbd1e16270d0164ec4.html

 

When Eclipse submits a job through the Hadoop plug-in, the job is written to the HDFS file system as the user drwho by default, which corresponds to /user/xxx on HDFS; in my case the account directory is /user/hadoop. The exception occurs because the drwho user has no write permission on that hadoop directory. The solution is to open up the permissions of the directory, with a command such as: $ hadoop fs -chmod 777 /user/hadoop
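A quick sketch of checking and opening the permissions from the server side, assuming the HDFS super-user account is hadoop and the directory names follow the example above:

$ hadoop fs -ls /user                   # /user/hadoop shows owner hadoop:supergroup, mode rwxr-xr-x
$ hadoop fs -chmod 777 /user/hadoop     # let any user (including drwho) write to the directory
$ hadoop fs -ls /user                   # /user/hadoop should now show rwxrwxrwx

Note that chmod 777 opens the directory to every user; it avoids touching dfs.permissions, but it is still a trade-off against the security concern mentioned above.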

 

2: Cannot create directory /user/root/output01/_temporary. Name node is in safe mode.

Solution: Wait a while; the NameNode leaves safe mode on its own once it has finished checking in its blocks.
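If the wait seems long, the NameNode's safe-mode state can be checked (and, with care, left manually) using dfsadmin. A minimal sketch, assuming it is run on the server as the HDFS super-user:

$ hadoop dfsadmin -safemode get       # prints "Safe mode is ON" or "Safe mode is OFF"
$ hadoop dfsadmin -safemode leave     # force the NameNode out of safe mode (use with care)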
