Apache Hadoop, an implementation of MapReduce, has effectively become an industry standard and is widely adopted by organizations of all kinds. The Savanna project is designed to let users provision and manage Hadoop clusters on OpenStack. It is worth mentioning that Amazon has been providing Hadoop as a service through EMR (Elastic MapReduce) for several years.
Users provide Savanna with the information needed to build a cluster, such as the Hadoop version, cluster topology, and node hardware details. Once these parameters are supplied, Savanna sets up the cluster within a few minutes and can also scale it on demand (adding or removing worker nodes).
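As a sketch, the parameters described above (Hadoop version, topology, node hardware) could be assembled into a cluster-creation payload like the following. The field names and structure here are illustrative assumptions, not the exact Savanna API schema.

```python
# Hypothetical payload builder for a Savanna cluster-creation request.
# Field names ("cluster", "node_groups", "flavor_id", ...) are assumptions
# for illustration; consult the actual Savanna API for the real schema.

def build_cluster_request(name, hadoop_version, flavor, master_count, worker_count):
    """Assemble a cluster-creation payload from the user-supplied parameters."""
    return {
        "cluster": {
            "name": name,
            "hadoop_version": hadoop_version,  # e.g. "1.2.1"
            "node_groups": [                   # the cluster topology
                {"name": "master", "flavor_id": flavor, "count": master_count},
                {"name": "worker", "flavor_id": flavor, "count": worker_count},
            ],
        }
    }

req = build_cluster_request("demo", "1.2.1", "m1.medium", 1, 3)
print(req["cluster"]["node_groups"][1]["count"])  # prints 3 (worker nodes)
```

Scaling the cluster later would amount to sending an updated worker `count` for the relevant node group.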
Savanna targets the following use cases:

Quickly provisioning Hadoop clusters for development and QA.
Using otherwise idle compute capacity in a general-purpose OpenStack IaaS cloud.
Providing "Analytics as a Service" (similar to EMR in AWS) for dedicated or bursty analytic workloads.
The main features are as follows:
Managed as an OpenStack component through a REST API, with a UI available as part of the OpenStack dashboard.
Support for multiple Hadoop distributions: a pluggable system of Hadoop installation engines.
Integration with vendor-specific management tools, such as Apache Ambari or the Cloudera Management Console.
Predefined templates for Hadoop configuration, with the ability to modify configuration parameters.
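Since Savanna is driven through a REST API, a client would build authenticated HTTP requests against its endpoint. The sketch below constructs (but does not send) such a request; the host name, tenant placeholder, and token value are illustrative assumptions, and in a real deployment the endpoint would come from Keystone's service catalog.

```python
# Sketch of addressing Savanna's REST API. Host, tenant ID, and token
# below are placeholders, not values from a real deployment.
from urllib.request import Request

SAVANNA_URL = "http://controller:8386/v1.0/TENANT_ID"  # hypothetical endpoint
token = "KEYSTONE_TOKEN"                               # obtained from Keystone

# Build (but do not send) a request listing the tenant's Hadoop clusters.
req = Request(f"{SAVANNA_URL}/clusters", headers={"X-Auth-Token": token})
print(req.full_url)
```

The `X-Auth-Token` header carries the security token issued by Keystone, which is how Savanna, like other OpenStack services, scopes requests to the caller's permissions.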
Savanna REST API and UI demo video link: YouTube video
Detailed description
Savanna interacts mainly with the following OpenStack components:
Horizon: provides the GUI with access to all of Savanna's features.
Keystone: authenticates users and issues security tokens for communicating with OpenStack, limiting each user to their specific OpenStack permissions.
Nova: provisions virtual machines for the Hadoop cluster.
Glance: stores Hadoop virtual machine images, each containing an installed OS and Hadoop; pre-installing Hadoop in the image simplifies node deployment.
Swift: can be used as a storage backend for the data processed by Hadoop jobs.
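When Swift backs a Hadoop job, input and output data are typically addressed with swift:// URLs rather than HDFS paths. The helper below sketches that addressing scheme; the container and object names are made up, and the ".savanna" provider suffix is an assumption about how the Swift filesystem plugin distinguishes endpoints.

```python
# Sketch of swift:// data URLs for Hadoop jobs running under Savanna.
# Container, object, and the ".savanna" suffix are illustrative assumptions.

def swift_data_url(container, obj, provider="savanna"):
    """Build a swift:// URL in the form Hadoop's Swift filesystem
    support expects (provider suffix is a hypothetical example)."""
    return f"swift://{container}.{provider}/{obj}"

input_url = swift_data_url("demo-container", "input/logs.txt")
print(input_url)  # swift://demo-container.savanna/input/logs.txt
```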