1. HDFS reports insufficient permissions: org.apache.hadoop.security.AccessControlException: Permission denied: user=ouqiping, access=WRITE, inode="/user/administrator/datingrecommender/ratings.dat": root:supergroup:drwxr-xr-x
Workaround:
Modify the Hadoop configuration file conf/hdfs-site.xml on the server: locate the dfs.permissions property and change its value to false:
<property>
<name>dfs.permissions</name>
<value>false</value>
<description>
If "true", enable permission checking in HDFS.
If "false", permission checking is turned off,
but all other behavior is unchanged.
Switching from one parameter value to the other does
not change the mode, owner or group of files or directories.
</description>
</property>
2. The account name does not match: org.apache.hadoop.security.AccessControlException: Permission denied: user=ouqiping, access=EXECUTE, inode="/tmp/hadoop-yarn/staging/ouqiping/.staging/job_1458550621743_0001": root:supergroup:drwx------
There are six main ways to solve this problem:
(1) Change the Linux user that runs the Hadoop cluster to ouqiping.
(2) Change the permissions of the target HDFS directory: hadoop fs -chmod 777 /user/hadoop.
(3) Turn off HDFS permission checking by setting dfs.permissions to false (tested: not sufficient; it only allows upload and download).
(4) Change the Windows 7 system user name to hadoop.
(5) Add a HADOOP_USER_NAME environment variable on Windows 7 and set it to the corresponding Linux user name.
(6) Set HADOOP_USER_NAME temporarily in code during job submission so that it matches the Linux user.
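Option (6) can be sketched as follows. This is a minimal illustration, not a full job driver: the class name is hypothetical, and "ouqiping" is taken from the error message above. Hadoop's client (UserGroupInformation) reads the HADOOP_USER_NAME environment variable or system property when deciding which remote user to act as, so in a real driver this property must be set before any Hadoop Configuration or FileSystem object is created.

```java
// Sketch: temporarily setting HADOOP_USER_NAME in code before job
// submission, so the Hadoop client authenticates as the Linux-side
// user instead of the local Windows account.
public class HadoopUserName {

    // Set the system property that the Hadoop client consults and
    // return the value now in effect.
    public static String impersonate(String linuxUser) {
        System.setProperty("HADOOP_USER_NAME", linuxUser);
        return System.getProperty("HADOOP_USER_NAME");
    }

    public static void main(String[] args) {
        // "ouqiping" matches the user in the AccessControlException;
        // call this before new Configuration() / Job.getInstance().
        System.out.println(impersonate("ouqiping"));
    }
}
```

Note that this only changes the identity the client presents; the HDFS directories involved must still grant that user the required permissions.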
Issues when developing MapReduce programs with Eclipse