Apache Ambari is a web-based tool that supports provisioning, managing, and monitoring Apache Hadoop clusters. Ambari currently supports most Hadoop components, including HDFS, MapReduce, Hive, Pig, HBase, ZooKeeper, Sqoop, and HCatalog, and provides centralized management of them.
Apache Ambari is a web-based open-source project that provisions, manages, and monitors the lifecycle of Hadoop clusters. It is also the management tool of choice for the Hortonworks Data Platform. Ambari supports management of the following services: Apache HBase, Apache HCatalog, Apache Hadoop HDFS, Apache Hive, and others.
Link: http://hortonworks.com/kb/get-started-setting-up-ambari/
Ambari is 100% open source and included in HDP, greatly simplifying installation and initial configuration of Hadoop clusters. In this article we'll run through some installation steps to get started with Ambari; most of the steps here are covered in the guide linked above.
From each slave, copy its authorized_keys file to the master under a temporary name: scp /root/.ssh/authorized_keys root@master.busymonkey:/root/.ssh/s1_keys (from slave 1) and scp /root/.ssh/authorized_keys root@master.busymonkey:/root/.ssh/s2_keys (from slave 2).

Then the master node's /root/.ssh/ directory will look like this:

Next, append the two copied key files to the master's authorized_keys file with the commands: cat s1_keys >> authorized_keys and cat s2_keys >> authorized_keys.

Finally, copy the master's merged authorized_keys file back over each slave's copy: scp /root/.ssh/authorized_keys root@slaver1.busymonkey:/root/.ssh/authorized_keys and scp /root/.ssh/authorized_keys root@slaver2.busymonkey:/root/.ssh/authorized_keys.
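The key exchange above can be sketched as a small script. The hostnames (master.busymonkey, slaver1/2.busymonkey) are the examples from this article, and the sketch only prints the commands so you can review them before running anything against real hosts:

```shell
#!/bin/sh
# Dry-run sketch of the key distribution described above: it only PRINTS
# the commands, so you can review them before executing on real hosts.
# Hostnames are the examples used in this article.
print_key_sync_cmds() {
  # Step 1: from each slave, copy its authorized_keys to the master
  # under a temporary name (s1_keys, s2_keys).
  for i in 1 2; do
    echo "scp /root/.ssh/authorized_keys root@master.busymonkey:/root/.ssh/s${i}_keys"
  done
  # Step 2: on the master, append both copied key files.
  echo "cat s1_keys >> authorized_keys"
  echo "cat s2_keys >> authorized_keys"
  # Step 3: push the merged file back over each slave's copy.
  for i in 1 2; do
    echo "scp /root/.ssh/authorized_keys root@slaver${i}.busymonkey:/root/.ssh/authorized_keys"
  done
}
print_key_sync_cmds
```

Once the merged file is in place on all three nodes, each node can SSH to the others as root without a password, which is what Ambari's agent bootstrap requires.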
1.1. RHEL/CentOS/Oracle Linux 6

On a server host that has Internet access, use a command line editor to perform the following steps:

Log in to your host as root.

Download the Ambari repository file to a directory on your installation host:

wget -nv http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.2.2.0/ambari.repo -O /etc/yum.repos.d/ambari.repo

Confirm that the repository is configured by checking the repo list: yum repolist
Installation Guide for Ambari and HDP
The big data platform involves many software products. If you just download the packages from the Hadoop project and configure the files by hand, the process is neither intuitive nor easy. Ambari provides a way to install and manage Hadoop clusters graphically.
4. Disable SELinux. Temporarily, with the command: setenforce 0. To disable it permanently, edit the config with vi /etc/selinux/config and set SELINUX=disabled.
5. Disable the firewall. Permanently: chkconfig iptables off. Stop it now: /etc/init.d/iptables stop. (You can also choose not to enable the firewall during setup.)
6. Disable PackageKit: vi /etc/yum/pluginconf.d/refresh-packagekit.conf and set enabled to 0.
7. Optional: configure a local repository (for hosts with no network connection, or when configuring a large cluster where you want to conserve bandwidth).
It is recommended to write a one-click installation script. Once the servers are prepared (passwordless SSH keys, firewall shut down, and time synchronized), you only need to run the script to install and deploy on each server.
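The one-click preparation script recommended above might look like the sketch below. It defaults to a dry run (printing each step) and only executes when APPLY=1, since the exact steps vary by environment; the NTP server used for time synchronization is an assumption.

```shell
#!/bin/sh
# Sketch of a node-preparation script covering steps 4-7 above.
# By default it only prints what it would do; set APPLY=1 to execute.
run() {
  if [ "${APPLY:-0}" = "1" ]; then
    "$@"
  else
    echo "would run: $*"
  fi
}
run setenforce 0                                    # disable SELinux now
run sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config  # ...and permanently
run chkconfig iptables off                          # firewall off at boot
run /etc/init.d/iptables stop                       # firewall off now
run sed -i 's/^enabled=.*/enabled=0/' /etc/yum/pluginconf.d/refresh-packagekit.conf
run ntpdate pool.ntp.org                            # one-shot time sync (NTP server is an assumption)
```

Run it once per node after passwordless SSH is in place, then proceed with the Ambari repository setup.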
Install Hadoop Cluster Monitoring Tool Ambari
Use Ambari to quickly deploy the Hadoop Big Data Environment
curl -X POST -H "X-Requested-By: ambari" http://nj01-hadoop-rd03.nj01.baidu.com:8080/api/v1/clusters/abacicluster/hosts/nj01-hadoop-rd03.nj01.baidu.com

where abacicluster is the cluster name and nj01-hadoop-rd03.nj01.baidu.com is the host being added.

Note: if a cluster has multiple hosts, the command is invoked once per host.
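The add-host call above follows Ambari's general REST URL scheme, sketched below. Server, cluster, and host names are the examples from this article; admin:admin in the commented call is Ambari's default credential (change it in production).

```shell
#!/bin/sh
# Sketch: build the Ambari REST URL for adding a host to a cluster.
# Server, cluster, and host names are the examples from this article.
AMBARI="http://nj01-hadoop-rd03.nj01.baidu.com:8080"
add_host_url() {  # $1 = cluster name, $2 = host name
  echo "${AMBARI}/api/v1/clusters/$1/hosts/$2"
}
# The actual call (run once per host when the cluster has several):
#   curl -u admin:admin -X POST -H "X-Requested-By: ambari" \
#        "$(add_host_url abacicluster nj01-hadoop-rd03.nj01.baidu.com)"
add_host_url abacicluster nj01-hadoop-rd03.nj01.baidu.com
```

The X-Requested-By header is required by Ambari for all modifying (POST/PUT/DELETE) API calls.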
Add services to the created cluster (for example, add HDFS).
Click Start to start the services; when the following page appears, the startup succeeded. Note: if you installed Ambari as a regular user such as hadoop and the database is MySQL, startup may fail. The solution: (1) in MySQL, create a user with create user 'admin'@'hadoop05' identified by 'admin'; and create the Ambari database; (2) log in to MySQL as the admin user and use the Ambari database.
Before installing Hive you need to install MySQL; see the following guide for the Hive installation steps: http://blog.csdn.net/wang_zhenwei/article/details/50563718. When using Ambari to install Hive, first install MySQL on the machine. The process is as follows:
# yum install mysql-connector-java
# mysql -u root -p
CREATE DATABASE hive CHARACTER SET utf8;
CREATE USER 'hive'@'%' IDENTIFIED BY 'Hive-123';
GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%';
FLUSH PRIVILEGES;
If you are installing Oozie, create the Oozie database and user by executing the following statements:
CREATE DATABASE oozie CHARACTER SET utf8;
CREATE USER 'oozie'@'%' IDENTIFIED BY 'Oozie-123';
GRANT ALL PRIVILEGES ON *.* TO 'oozie'@'%';
FLUSH PRIVILEGES;
To install the MySQL JDBC driver:
# yum install mysql-connector-java
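After installing the connector, Ambari must be told where the driver jar lives. A common way to do that (assuming the connector RPM installed the jar at its usual path, /usr/share/java/mysql-connector-java.jar; verify on your host first):

```shell
# Register the MySQL JDBC driver with Ambari.
# The jar path below is the usual RPM install location, not guaranteed.
ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
```

Without this step, services such as Hive and Oozie that use MySQL may fail to start because their agents cannot find the driver.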
Concept

Ambari Metrics is a functional component of Ambari responsible for monitoring cluster status. It has the following key concepts:
Terminology                      Description
Ambari Metrics System ("AMS")    The built-in metrics collection system for Ambari.
Metrics Collector                The standalone server that collects, aggregates, and serves the metrics reported by the Metrics Monitors and the Hadoop service sinks.
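The Metrics Collector exposes a timeline endpoint that can be queried over HTTP. A hedged sketch follows: 6188 is the collector's default port, the collector hostname is a placeholder, and the metric name and appId values are illustrative (appId HOST is used for host-level metrics).

```shell
#!/bin/sh
# Sketch: build a query URL for the AMS Metrics Collector timeline API.
# 6188 is the collector's default port; the hostname and the
# metricNames/appId values here are illustrative placeholders.
COLLECTOR="http://ams-host.example.com:6188"
metrics_url() {  # $1 = metricNames, $2 = appId
  echo "${COLLECTOR}/ws/v1/timeline/metrics?metricNames=$1&appId=$2"
}
# Example (requires a running collector):
#   curl "$(metrics_url cpu_user HOST)"
metrics_url cpu_user HOST
```

This is the same API the Ambari web UI graphs are built on, so it is a convenient hook for external monitoring.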
Recently, a project required looking at the Ambari REST API to obtain information about the cluster: the status of cluster nodes and the state of each service and component, which we then used for monitoring operations in our data service management platform. A summary of its use follows. Official REST API documentation: https://cwiki.apache.org/confluence/display/AMBARI/
Ambari is an open-source Hadoop management system from Hortonworks (the server side is written in Java, with Python agents); at the moment it is just about the only open-source Hadoop management system on the market. Ambari has its problems and is not always pleasant to use, but there is little alternative. Recently, our monitoring system kept warning that a certain URL was unreachable.
1. Download ambari-impala-service:

sudo git clone https://github.com/cas-bigdatalab/ambari-impala-service.git /var/lib/ambari-server/resources/stacks/HDP/2.4/services/IMPALA
2. Create a new impala.repo under /etc/yum.repos.d:

[cloudera-cdh5]
# Packages for Cloudera's distribution for Hadoop, Version 5, on RedHat or CentOS 7 x86_64
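The repo file above is truncated. A complete CDH 5 repo file typically follows the shape below; the baseurl and gpgkey reflect Cloudera's historical public archive layout and are assumptions to verify against your Cloudera mirror.

```ini
[cloudera-cdh5]
# Packages for Cloudera's distribution for Hadoop, Version 5, on RedHat or CentOS 7 x86_64
name=Cloudera's Distribution for Hadoop, Version 5
baseurl=https://archive.cloudera.com/cdh5/redhat/7/x86_64/cdh/5/
gpgkey=https://archive.cloudera.com/cdh5/redhat/7/x86_64/cdh/RPM-GPG-KEY-cloudera
gpgcheck=1
enabled=1
```

After saving the file, run yum clean all and yum repolist to confirm the repository resolves before letting Ambari install the Impala packages.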