Hadoop Remote Client Installation and Configuration


Client system: Ubuntu 12.04

Client user name: mjiang

Server user name: hadoop

Download the Hadoop installation package, making sure its version matches the server's (or copy the installation package directly from the server). The tar.gz package is available at http://mirror.bjtu.edu.cn/apache/hadoop/common/.
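For example, a concrete download (the 1.0.4 version number and mirror path here are illustrative; pick the release matching your server):
wget http://mirror.bjtu.edu.cn/apache/hadoop/common/hadoop-1.0.4/hadoop-1.0.4.tar.gz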
Unpack it:
tar zxvf hadoop-x.x.x.tar.gz

Configuration
System Configuration
Modify the ~/.bashrc file, adding the Hadoop bin directory to the PATH (adjust the path to your install location):
export PATH=/path/to/hadoop/home/bin:$PATH
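To apply the change and confirm the hadoop executable is on the PATH (hadoop version is a standard subcommand):
source ~/.bashrc
hadoop version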
Hadoop configuration file modification

The client only needs the cluster's namenode and jobtracker addresses and the Java installation directory; that is, modify the following files in the conf directory:
hadoop-env.sh:
export JAVA_HOME=/home/mjiang/hadoop_work/jrockit-jdk1.6.0_29
core-site.xml:
<property>
<name>fs.default.name</name>
<value>hdfs://master:8020</value>
</property>
mapred-site.xml:
<property>
<name>mapred.job.tracker</name>
<value>master:8021</value>
</property>

Now that the Hadoop client is configured, you can run basic commands such as:
hadoop fs -lsr /
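Other read-only commands can be tried the same way (the file path below is illustrative):
hadoop fs -ls /
hadoop fs -cat /user/hadoop/somefile.txt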
However, because permissions have not been set up on the server side, commands that write to HDFS, such as uploading files, cannot be run.

Hadoop multi-user permissions configuration

An error like the following may occur when a remote client user runs a job on the Hadoop cluster:

Error phenomenon: org.apache.hadoop.security.AccessControlException: Permission denied: user=mjiang, access=EXECUTE, inode="job_201111031322_0003": hadoop:supergroup:rwx-.

Cause: the local user is trying to operate on the Hadoop system remotely without the necessary permissions.

This is the result of missing permissions configuration. Solution: in a test environment, you can simply disable the HDFS user permission check. Open conf/hdfs-site.xml on the server, find the dfs.permissions property, and set it to false (the default is true).
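A minimal sketch of that change, in the same property format used above (note that the namenode typically has to be restarted for hdfs-site.xml changes to take effect):
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>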
Permissions configuration for company-level applications:

Only a simple permissions configuration is covered here, enough for remote users to submit and run jobs normally; more professional, advanced permission schemes are not covered, and I have not researched them in depth.
In resolving the permissions issue, we reduce Hadoop's multi-user permissions problem to an ordinary HDFS file permissions problem:
Add the client user and the Hadoop default group name on the server host:
sudo /usr/sbin/groupadd supergroup
sudo /usr/sbin/useradd -e 20130630 -g supergroup -n -r mjiang

Modify the group permissions of the HDFS files in the Hadoop cluster so that all users in the supergroup group have read and write access:
hadoop fs -chmod 773 /
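In a test cluster, the chmod can also be applied recursively so existing subdirectories get group access, and then verified (the -R flag and fs -ls are standard):
hadoop fs -chmod -R 773 /
hadoop fs -ls /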
Reference article:
http://jbm3072.iteye.com/blog/1048489
