Deploy Hadoop and HBase with Fabric


Fabric is a framework for automating command execution on multiple machines over SSH, in batches. With a pre-written fabfile describing the project, it can deploy and maintain the project automatically, and the whole operation is driven from your local directory, which is very convenient.
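For readers who have not used Fabric before, a fabfile is just an ordinary Python module whose functions become remotely executable tasks. Here is a minimal sketch, assuming the Fabric 1.x API used later in this post; the hosts and role name are placeholders:

    from fabric.api import env, roles, run

    # Hosts are grouped into named roles; a task decorated with @roles
    # is executed once per host in that role, over SSH.
    env.roledefs = {
        'web': ['deploy@web1.example.com', 'deploy@web2.example.com'],
    }

    @roles('web')
    def uptime():
        """Print the uptime of every host in the 'web' role."""
        run('uptime')

Running fab uptime from the directory containing the fabfile logs in to each host in the role in turn and executes the command.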

Fabric's design is probably inspired by Capistrano, an automated deployment framework built for Rails that has since been adopted by many non-Ruby projects as well, such as Hypertable.

This kind of tool is a natural fit for cluster management and deployment, so below is the fabfile.py I wrote for our company cluster. Guides for configuring a four-node Hadoop cluster are easy to find, so I will not go into that here.

from fabric.api import env, roles, run, cd, local, put

env.roledefs = {
    'tbbt': ['hadoop@spock', 'hadoop@rock', 'hadoop@paper', 'hadoop@lizard'],
    'tbbt_root': ['root@spock', 'root@rock', 'root@paper', 'root@lizard'],
    'master': ['hadoop@spock'],
    'master_root': ['root@spock'],
    'slaves': ['hadoop@rock', 'hadoop@paper', 'hadoop@lizard'],
}

source_dir = "."
install_dir = "hadoop_install_fabric"
hadoop_data = "/home/hadoop_data"
repository = "svn://***/hadoop_install"
hadoop_targz = "archives/hadoop-0.20.1.tar.gz"
hbase_targz = "archives/hbase-0.20.2.tar.gz"

@roles('tbbt')
def init():
    """Check out the install tree and unpack Hadoop and HBase on every node."""
    run('svn co %s %s' % (repository, install_dir))
    put(hadoop_targz, install_dir + '/' + hadoop_targz)
    put(hbase_targz, install_dir + '/' + hbase_targz)
    with cd(install_dir):
        run('tar xf %s -C hadoop' % hadoop_targz)
        run('tar xf %s -C hbase' % hbase_targz)
        # Drop the shipped conf directories and link in the version-controlled configs.
        run('rm -rf hadoop/conf hbase/conf')
        run('ln -sf configs/hadoop/conf')
        run('ln -sf configs/hbase/conf')

@roles('tbbt')
def deploy():
    """Commit local config changes, then svn update the working copy on every node."""
    local('svn commit -m "fabric auto commit."')
    with cd(install_dir):
        run('svn update')

@roles('master')
def start_namenode():
    """Start the HDFS namenode."""
    run('%s/hadoop/bin/hadoop-daemon.sh start namenode' % install_dir)

@roles('master')
def stop_namenode():
    """Stop the HDFS namenode."""
    run('%s/hadoop/bin/hadoop-daemon.sh stop namenode' % install_dir)

@roles('master')
def start_jobtracker():
    """Start the MapReduce jobtracker."""
    run('%s/hadoop/bin/hadoop-daemon.sh start jobtracker' % install_dir)

@roles('master')
def stop_jobtracker():
    """Stop the MapReduce jobtracker."""
    run('%s/hadoop/bin/hadoop-daemon.sh stop jobtracker' % install_dir)

@roles('slaves')
def start_datanodes():
    """Start an HDFS datanode on every slave."""
    run('%s/hadoop/bin/hadoop-daemon.sh start datanode' % install_dir)

@roles('slaves')
def stop_datanodes():
    """Stop the HDFS datanode on every slave."""
    run('%s/hadoop/bin/hadoop-daemon.sh stop datanode' % install_dir)

@roles('slaves')
def start_tasktrackers():
    """Start a MapReduce tasktracker on every slave."""
    run('%s/hadoop/bin/hadoop-daemon.sh start tasktracker' % install_dir)

@roles('slaves')
def stop_tasktrackers():
    """Stop the MapReduce tasktracker on every slave."""
    run('%s/hadoop/bin/hadoop-daemon.sh stop tasktracker' % install_dir)

@roles('master')
def start_hbasemaster():
    """Start the HBase master."""
    run('%s/hbase/bin/hbase-daemon.sh start master' % install_dir)

@roles('master')
def stop_hbasemaster():
    """Stop the HBase master."""
    run('%s/hbase/bin/hbase-daemon.sh stop master' % install_dir)

@roles('master')
def start_zookeeper():
    """Start the ZooKeeper instance managed by HBase."""
    run('%s/hbase/bin/hbase-daemon.sh start zookeeper' % install_dir)

@roles('master')
def stop_zookeeper():
    """Stop the ZooKeeper instance managed by HBase."""
    run('%s/hbase/bin/hbase-daemon.sh stop zookeeper' % install_dir)

@roles('slaves')
def start_hbaseregions():
    """Start an HBase regionserver on every slave."""
    run('%s/hbase/bin/hbase-daemon.sh start regionserver' % install_dir)

@roles('slaves')
def stop_hbaseregions():
    """Stop the HBase regionserver on every slave."""
    run('%s/hbase/bin/hbase-daemon.sh stop regionserver' % install_dir)

@roles('master')
def start_hbasethrift():
    """Start the HBase Thrift server."""
    run('%s/hbase/bin/hbase-daemon.sh start thrift' % install_dir)

@roles('master')
def stop_hbasethrift():
    """Stop the HBase Thrift server."""
    run('%s/hbase/bin/hbase-daemon.sh stop thrift' % install_dir)

@roles('master')
def start_dfs():
    """Start HDFS via the cluster start script."""
    run('%s/hadoop/bin/start-dfs.sh' % install_dir)

@roles('master')
def stop_dfs():
    """Stop HDFS via the cluster stop script."""
    run('%s/hadoop/bin/stop-dfs.sh' % install_dir)

@roles('master')
def start_mapred():
    """Start MapReduce via the cluster start script."""
    run('%s/hadoop/bin/start-mapred.sh' % install_dir)

@roles('master')
def stop_mapred():
    """Stop MapReduce via the cluster stop script."""
    run('%s/hadoop/bin/stop-mapred.sh' % install_dir)

@roles('master')
def start_hbase():
    """Start the whole HBase cluster."""
    run('%s/hbase/bin/start-hbase.sh' % install_dir)

@roles('master')
def stop_hbase():
    """Stop the whole HBase cluster."""
    run('%s/hbase/bin/stop-hbase.sh' % install_dir)

@roles('master')
def start_hadoop():
    """Start the whole Hadoop cluster."""
    run('%s/hadoop/bin/start-all.sh' % install_dir)

@roles('master')
def stop_hadoop():
    """Stop the whole Hadoop cluster."""
    run('%s/hadoop/bin/stop-all.sh' % install_dir)

@roles('master')
def start():
    """Start Hadoop, then HBase."""
    run('%s/hadoop/bin/start-all.sh' % install_dir)
    run('%s/hbase/bin/start-hbase.sh' % install_dir)

@roles('tbbt')
def stop():
    """Stop Hadoop and HBase."""
    run('%s/hadoop/bin/stop-all.sh' % install_dir)
    run('%s/hbase/bin/stop-hbase.sh' % install_dir)
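Assuming Fabric 1.x is installed and the code above is saved as fabfile.py in the current directory, each task can be invoked by name with the fab command-line tool, for example:

    fab init              # check out the install tree and unpack the tarballs on every node
    fab deploy            # commit config changes locally, then svn update on all nodes
    fab start_namenode    # start a single daemon on the master
    fab start_datanodes   # start the datanode on every slave
    fab start             # bring up Hadoop and then HBase
    fab stop              # shut everything down

Fabric resolves the target hosts from env.roledefs and the @roles decorator on each task, so no host list needs to be passed on the command line.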
