Hadoop remote client installation and configuration, and multi-user permissions configuration

Hadoop remote client installation and configuration

Client system: Ubuntu 12.04

Client user name: mjiang

Server user name: hadoop

Download the Hadoop installation package, making sure its version is consistent with the server's (or copy the installation package directly from the server).
Download the tar.gz installation package from http://mirror.bjtu.edu.cn/apache/hadoop/common/ and decompress it:

tar zxvf hadoop-x.x.x.tar.gz
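As a concrete sketch, assuming a hypothetical 1.0.4 release (substitute whatever version your server actually runs), the download-and-extract step looks like:

# version number is illustrative; it must match the server's Hadoop version
wget http://mirror.bjtu.edu.cn/apache/hadoop/common/hadoop-1.0.4/hadoop-1.0.4.tar.gz
tar zxvf hadoop-1.0.4.tar.gz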
Configuration

System configuration

Modify the ~/.bashrc file, adding:

export PATH=/path/to/hadoop/home/bin:$PATH

Hadoop configuration file modification
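After saving, reload the shell configuration and check that the hadoop command is found (a quick sanity check, assuming the directory added above contains the hadoop launcher script):

source ~/.bashrc
hadoop version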
The client only needs the cluster's namenode and jobtracker information, plus the local Java installation directory. Modify the following files in the conf directory:
hadoop-env.sh:

export JAVA_HOME=/home/mjiang/hadoop_work/jrockit-jdk1.6.0_29
core-site.xml:
<property>
<name>fs.default.name</name>
<value>hdfs://master:8020</value>
</property>
mapred-site.xml:
<property>
<name>mapred.job.tracker</name>
<value>master:8021</value>
</property>

The Hadoop client is now configured and can run basic commands such as:
hadoop fs -lsr /
However, because permissions have not yet been set up on the server side, commands that upload files to the HDFS system do not work yet.

Hadoop multi-user permissions configuration
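For example, before the server-side permissions described below are configured, an upload attempt such as the following (the local file name and HDFS path are illustrative) is rejected:

hadoop fs -put ./test.txt /user/mjiang/test.txt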

A similar error may occur when a remote client user runs a program on a Hadoop cluster:

Error message: org.apache.hadoop.security.AccessControlException: Permission denied: user=mjiang, access=EXECUTE, inode="job_201111031322_0003":hadoop:supergroup:rwx------

Cause: the local user wants to operate on the remote Hadoop system but does not have permission.

This is the result of permissions not having been configured. Solution: in a test environment, you can simply disable the Hadoop HDFS permission check: open conf/hdfs-site.xml and change the dfs.permissions property to false (the default is true).
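The corresponding hdfs-site.xml entry, in the same style as the snippets above:

<property>
<name>dfs.permissions</name>
<value>false</value>
</property>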
Permission configuration for company-level applications:

Only a simple permission configuration is covered here, enough for remote users to submit and run jobs normally; more professional, advanced permission configuration has not been researched further.
Solved this way, Hadoop's multi-user permissions configuration reduces to a simple HDFS file permission configuration problem:
On the server host, add the client user and the Hadoop default group name:

sudo /usr/sbin/groupadd supergroup
sudo /usr/sbin/useradd -e 2013-06-30 -g supergroup -n -r mjiang

Then modify the group permissions of the HDFS files in the Hadoop cluster so that all users belonging to the supergroup group have read and write permissions:
hadoop fs -chmod 773 /
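To confirm the setup from the client (as user mjiang, per the configuration above; the file name is illustrative), upload a file and list the root directory:

hadoop fs -put ./test.txt /test.txt
hadoop fs -ls /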
Reference article: http://jbm3072.iteye.com/blog/1048489
