[HDFS] What is Hadoop's rack awareness policy?

Source: Internet
Author: User

I have heard a little about Hadoop's rack awareness policy. Whether in the balancer, the JobTracker, or the replica placement policy, rack awareness is used. What exactly is rack awareness?

First, rack awareness simply means awareness of racks. Who is aware of them? The Hadoop system. More precisely, Hadoop builds a topology of server and rack locations inside the system and identifies the topological position of each node, and this underpins higher-level designs such as the replica placement policy and job locality.
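The best-known use of this topology is HDFS's default replica placement: the first replica goes on the writer's own node (when the writer is a DataNode), the second on a node in a different rack, and the third on a different node in the same rack as the second. The sketch below illustrates that selection logic; the class, method, and node names are illustrative, not HDFS's actual code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class ReplicaPlacement {
    // Sketch of HDFS's default 3-replica placement driven by rack awareness:
    // 1st replica on the writer's node, 2nd on a node in a different rack,
    // 3rd on a different node in the same rack as the 2nd. Assumes the
    // cluster map contains at least one node on another rack.
    public static List<String> place(String writer, Map<String, String> rackOf) {
        List<String> chosen = new ArrayList<>();
        chosen.add(writer);
        String writerRack = rackOf.get(writer);
        // 2nd replica: any node on a rack different from the writer's.
        for (String node : rackOf.keySet()) {
            if (!rackOf.get(node).equals(writerRack)) {
                chosen.add(node);
                break;
            }
        }
        // 3rd replica: a different node on the 2nd replica's rack.
        String secondRack = rackOf.get(chosen.get(1));
        for (String node : rackOf.keySet()) {
            if (!node.equals(chosen.get(1)) && rackOf.get(node).equals(secondRack)) {
                chosen.add(node);
                break;
            }
        }
        return chosen;
    }

    public static void main(String[] args) {
        // TreeMap keeps iteration order deterministic for the demo.
        Map<String, String> rackOf = new TreeMap<>();
        rackOf.put("a1", "/rack-a");
        rackOf.put("a2", "/rack-a");
        rackOf.put("b1", "/rack-b");
        rackOf.put("b2", "/rack-b");
        System.out.println(ReplicaPlacement.place("a1", rackOf)); // [a1, b1, b2]
    }
}
```

Placing the second and third replicas on one remote rack (rather than three racks) trades a little failure independence for less cross-rack write traffic.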

Can Hadoop automatically sense the network topology of a cluster or data center? Think about it: every company's data center layout and network structure are different, and so are the device types in use. Can Hadoop really sense all of this on its own? Obviously not! Hadoop needs the system administrator's help to obtain the network topology.
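In practice the administrator supplies this knowledge as a script or table (configured via `net.topology.script.file.name` in recent Hadoop versions) that maps a node address to a rack path, with unknown hosts falling back to a default rack. The sketch below mimics that lookup in plain Java; the class and method names are illustrative, not Hadoop's actual API.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the administrator-supplied mapping Hadoop relies on: a lookup
// from node address to rack path. Unknown hosts fall back to a default
// rack, as Hadoop does when the topology script returns nothing.
public class RackMapping {
    public static final String DEFAULT_RACK = "/default-rack";
    private final Map<String, String> table = new HashMap<>();

    public void put(String host, String rack) {
        table.put(host, rack);
    }

    // Resolve a host to its rack path, falling back to the default rack.
    public String resolve(String host) {
        return table.getOrDefault(host, DEFAULT_RACK);
    }

    public static void main(String[] args) {
        RackMapping m = new RackMapping();
        m.put("10.0.1.11", "/dc1/rack1");
        m.put("10.0.2.21", "/dc1/rack2");
        System.out.println(m.resolve("10.0.1.11")); // /dc1/rack1
        System.out.println(m.resolve("10.0.9.99")); // /default-rack
    }
}
```

Note that if the administrator configures nothing, every node ends up on the default rack and rack awareness effectively degenerates to a flat topology.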

But actual network topologies are ever-changing, so how should the administrator describe them to Hadoop? Hadoop therefore defines a standard topology structure, and the administrator's job is to map the actual network topology onto that standard structure as closely as possible.
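Hadoop's standard structure is a tree expressed as slash-separated paths such as /datacenter/rack/host, and the distance between two nodes is the number of hops up to their nearest common ancestor: 0 for the same node, 2 for different nodes in the same rack, 4 across racks in a two-level topology. A minimal sketch of that distance rule (mirroring the idea behind Hadoop's `NetworkTopology.getDistance`, though not its actual implementation):

```java
public class TopologyDistance {
    // Distance between two nodes in the topology tree: the sum of steps
    // from each node up to their nearest common ancestor. Paths are
    // slash-separated, e.g. "/d1/rack1/node1".
    public static int distance(String a, String b) {
        String[] pa = a.split("/");
        String[] pb = b.split("/");
        int i = 0;
        // Length of the common prefix = depth of the common ancestor.
        while (i < pa.length && i < pb.length && pa[i].equals(pb[i])) {
            i++;
        }
        return (pa.length - i) + (pb.length - i);
    }

    public static void main(String[] args) {
        System.out.println(distance("/d1/rack1/node1", "/d1/rack1/node1")); // 0: same node
        System.out.println(distance("/d1/rack1/node1", "/d1/rack1/node2")); // 2: same rack
        System.out.println(distance("/d1/rack1/node1", "/d1/rack2/node3")); // 4: different rack
    }
}
```

This is why replica placement and job scheduling can prefer "closer" nodes with simple integer comparisons instead of probing the real network.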

With these basic ideas in place, we can proceed. I have read through the DataNode code before. As we all know, a DataNode registers with the NameNode at startup to establish its subordinate relationship with the NameNode; you could call it reporting in to the boss. Following this route, we can trace how rack awareness works. DatanodeProtocol defines the registration method interface:

public DatanodeRegistration register(DatanodeRegistration registration
    ) throws IOException;

