Insufficient permissions when connecting Eclipse to a remote Hadoop cluster for development: a solution
Source: Internet
Author: User
When Eclipse is connected to a remote Hadoop cluster for development, the following error is reported:

Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=D, access=WRITE, inode="data":zxg:supergroup:rwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:207)

The cause is that the user name currently logged on to Windows (here, D) does not match any user on the Hadoop cluster.
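The inode string in the message explains the denial: "data":zxg:supergroup:rwxr-xr-x means the directory is owned by user zxg and group supergroup. The Windows user D matches neither, so HDFS applies the "other" bits (r-x), which do not include WRITE. A minimal sketch of that check in plain Java (no Hadoop dependency; the class and method names here are illustrative, not Hadoop's own):

```java
// Illustrative re-creation of the POSIX-style check HDFS performs.
// This is NOT Hadoop's FSPermissionChecker -- just the same logic in plain Java.
public class PermissionSketch {
    /** True if `user` may write, given owner, group membership, and a mode like "rwxr-xr-x". */
    static boolean canWrite(String user, String owner, boolean inGroup, String mode) {
        int offset;                                  // which 3-character slice of the mode applies
        if (user.equals(owner))      offset = 0;     // owner bits (rwx in the example)
        else if (inGroup)            offset = 3;     // group bits (r-x)
        else                         offset = 6;     // "other" bits (r-x)
        return mode.charAt(offset + 1) == 'w';       // 'w' is the middle bit of each triple
    }

    public static void main(String[] args) {
        String mode = "rwxr-xr-x";  // from the exception: inode "data":zxg:supergroup:rwxr-xr-x
        System.out.println(canWrite("zxg", "zxg", false, mode)); // true  -- owner may write
        System.out.println(canWrite("D",   "zxg", false, mode)); // false -- "other" has no 'w'
    }
}
```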
Solution (verified by the author under Hadoop 1.2.0 + JDK 1.7): manage the DFS system directory. The current practice is to disable permission checking on the Hadoop cluster: on the cluster master, edit the configuration under hadoop-1.2.0/conf (this property normally belongs in hdfs-site.xml in Hadoop 1.x) and add:

  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>

For an official release, you can instead create a user on the server with the same name as the Hadoop cluster user, without modifying the master's permission policy.
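If disabling permission checking cluster-wide is too blunt, another client-side workaround (an assumption on my part, not part of the author's verified steps: it relies on the Hadoop client honoring HADOOP_USER_NAME when Kerberos security is off, which applies to common non-secure deployments) is to tell the Eclipse-side client which user to act as before any FileSystem object is created:

```java
// Sketch: make the client identify itself as the HDFS user "zxg" instead of the
// Windows login name "D". Assumption: the cluster runs without Kerberos, in which
// case the Hadoop login module consults the HADOOP_USER_NAME env var / system property.
public class HadoopUserSketch {
    public static void main(String[] args) {
        // Must run before the first FileSystem.get(...) call caches the login user.
        System.setProperty("HADOOP_USER_NAME", "zxg");

        // From here on, e.g. FileSystem.get(new Configuration()) would act as "zxg":
        // FileSystem fs = FileSystem.get(new Configuration());
        System.out.println(System.getProperty("HADOOP_USER_NAME")); // prints "zxg"
    }
}
```

The same effect can be had without code changes by setting HADOOP_USER_NAME=zxg in the Eclipse run configuration's environment variables.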