Summary of the ops tool Fabric

Source: Internet
Author: User
Tags: mkdir, ssh

Background:
Managing a Hadoop cluster requires bulk execution of commands: for example, modifying the hosts file, or syncing configuration and jar packages to every machine.

Why Use Fabric:
1. I first saw Fabric introduced on Instagram's engineering blog;
2. Previously, our crawler team used Fabric to manage 200+ Aliyun machines, so its reliability and stability have been validated;
3. Low learning cost and easy installation.

Fabric Introduction:
Official website of Fabric: http://www.fabfile.org/
The official website defines fabric as:
Fabric is a Python (2.5-2.7) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.

My understanding:
Fabric is a Python module (once again demonstrating the power of Python as a glue language). Under the hood it uses SSH connections, which lets you do bulk deployment or system administration across a cluster of target machines.

How to install Fabric:
$ pip install fabric
This also installs Fabric's dependency modules.
At the command line, if you can see the following output, Fabric is installed:
Usage: fab [options] <command>[:arg1,arg2=val2,host=foo,hosts='h1;h2',...] ...
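The bracketed part of the usage line is Fabric's task-argument syntax: after a colon, positional arguments and key=value pairs are separated by commas. A rough sketch of how such a spec splits apart (a simplified illustration of the syntax, not Fabric's actual parser):

```python
def parse_task_spec(spec):
    # Split 'task:arg1,arg2=val2,...' into (name, args, kwargs),
    # following the fab usage-line syntax (simplified illustration).
    name, _, argstr = spec.partition(':')
    args, kwargs = [], {}
    for part in filter(None, argstr.split(',')):
        if '=' in part:
            key, _, value = part.partition('=')
            kwargs[key] = value
        else:
            args.append(part)
    return name, args, kwargs

print(parse_task_spec('host_type:arg1,arg2=val2,host=foo'))
# ('host_type', ['arg1'], {'arg2': 'val2', 'host': 'foo'})
```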

Use Fabric:
1. Create a new folder as our development directory
$ cd ~
$ mkdir -p develop/python/fabricdev
$ cd develop/python/fabricdev

2. Create a new fabfile.py file, add the following code, and save:
# encoding=utf-8
from fabric.api import run

def host_type():
    run('uname -s')

Explain:
Fabric's fab command looks for a file named fabfile.py in the current directory.
Every function defined in fabfile.py becomes a command.
For example, the code above defines a host_type command, which gets the operating system name.
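Conceptually, fab imports fabfile.py and treats each top-level function as a task. A toy sketch of that discovery step (simplified; real Fabric also honors @task decorators and nested task modules, and fabfile below is a stand-in with made-up contents):

```python
import types

# Build a stand-in for an imported fabfile module (hypothetical contents).
fabfile = types.ModuleType('fabfile')
exec("""
def host_type():
    pass

def deploy():
    pass

VERSION = '1.0'  # not a function, so not a task
""", fabfile.__dict__)

def discover_tasks(module):
    # Simplified rule: every public top-level function becomes a command.
    return sorted(name for name, obj in vars(module).items()
                  if isinstance(obj, types.FunctionType)
                  and not name.startswith('_'))

print(discover_tasks(fabfile))  # ['deploy', 'host_type']
```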

3. Run Fabric on this machine to get the local host_type
$ fab -H localhost host_type
Explanation: -H localhost specifies that the task runs on the local machine.
Print output:
liuyufan@liumatomacbook-pro ~/github/python/fabric% fab -H localhost host_type
[liuyufan@localhost:22] Executing task 'host_type'
[liuyufan@localhost:22] run: uname -s
[liuyufan@localhost:22] out: Darwin
[liuyufan@localhost:22] out:

Done.
Disconnecting from localhost... done.

4. Get the host_type of the 4 machines in the Hadoop cluster
I built 4 virtual machines on my computer as a Hadoop cluster, and now I want to use Fabric to get the host_type of these four machines. I just need to configure the SSH IP, port, username, and password of these 4 machines.
Edit fabfile.py and save:

# encoding=utf-8
from fabric.api import run, env  # (NEW) env holds the running-environment information

env.hosts = [  # (NEW)
    'root@172.16.165.151:22',  # (NEW)
    'root@172.16.165.152:22',  # (NEW)
    'root@172.16.165.153:22',  # (NEW)
    'root@172.16.165.154:22',  # (NEW)
]  # (NEW)
env.password = 'patrick'  # (NEW)

def host_type():
    run('uname -s')
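The entries in env.hosts use Fabric's user@host:port host-string format. A quick sketch of how such a string decomposes (parse_host_string is my own illustrative helper, not part of Fabric's API):

```python
def parse_host_string(host_string):
    # Split a 'user@host:port' host string into its parts;
    # user is optional and the port defaults to 22 (SSH).
    user, _, rest = host_string.rpartition('@')
    host, _, port = rest.partition(':')
    return (user or None, host, int(port) if port else 22)

print(parse_host_string('root@172.16.165.151:22'))
# ('root', '172.16.165.151', 22)
```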

To execute a command:
$ fab host_type

Print output:
liuyufan@liumatomacbook-pro ~/github/python/fabric% fab host_type
[root@172.16.165.151:22] Executing task 'host_type'
[root@172.16.165.151:22] run: uname -s
[root@172.16.165.151:22] out: Linux
[root@172.16.165.151:22] out:

[root@172.16.165.152:22] Executing task 'host_type'
[root@172.16.165.152:22] run: uname -s
[root@172.16.165.152:22] out: Linux
[root@172.16.165.152:22] out:

[root@172.16.165.153:22] Executing task 'host_type'
[root@172.16.165.153:22] run: uname -s
[root@172.16.165.153:22] out: Linux
[root@172.16.165.153:22] out:

[root@172.16.165.154:22] Executing task 'host_type'
[root@172.16.165.154:22] run: uname -s
[root@172.16.165.154:22] out: Linux
[root@172.16.165.154:22] out:

Done.
Disconnecting from root@172.16.165.152... done.
Disconnecting from root@172.16.165.153... done.
Disconnecting from root@172.16.165.151... done.
Disconnecting from root@172.16.165.154... done.

As you can see, Fabric executes the host_type command sequentially, following the order of the machines in the env.hosts list. Fabric also supports parallel execution; the next step runs the same task in parallel.

5. Obtain the host_type of the 4 machines in parallel
Edit fabfile.py and save:
# encoding=utf-8
from fabric.api import run, env, parallel  # (NEW)

env.hosts = [
    'root@172.16.165.151:22',
    'root@172.16.165.152:22',
    'root@172.16.165.153:22',
    'root@172.16.165.154:22',
]
env.password = 'patrick'

@parallel  # (NEW)
def host_type():
    run('uname -s')

Explain:
Each task on each host starts in a new process, and by default a sliding-window algorithm keeps too many processes from starting at once. You can use @parallel(pool_size=5) to cap the process pool at a maximum of 5 concurrent processes.
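The pool_size behavior can be pictured with a plain worker pool. This sketch fakes the remote call locally (no SSH involved; the hosts and the 'Linux' result are made up) just to show the bounded-concurrency idea:

```python
from multiprocessing.dummy import Pool  # thread-backed Pool, same interface

def host_type(host):
    # Stand-in for run('uname -s') against one host (no real SSH here).
    return (host, 'Linux')

hosts = ['172.16.165.%d' % i for i in range(151, 155)]
pool = Pool(5)  # like @parallel(pool_size=5): at most 5 tasks in flight
results = dict(pool.map(host_type, hosts))
pool.close()
pool.join()
print(results)
```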

To execute a command:
$ fab host_type

Print output:
liuyufan@liumatomacbook-pro ~/github/python/fabric% fab host_type
[root@172.16.165.151:22] Executing task 'host_type'
[root@172.16.165.152:22] Executing task 'host_type'
[root@172.16.165.153:22] Executing task 'host_type'
[root@172.16.165.154:22] Executing task 'host_type'
[root@172.16.165.154:22] run: uname -s
[root@172.16.165.153:22] run: uname -s
[root@172.16.165.152:22] run: uname -s
[root@172.16.165.151:22] run: uname -s
[root@172.16.165.152:22] out: Linux
[root@172.16.165.152:22] out:

[root@172.16.165.154:22] out: Linux
[root@172.16.165.154:22] out:

[root@172.16.165.153:22] out: Linux
[root@172.16.165.153:22] out:

[root@172.16.165.151:22] out: Linux
[root@172.16.165.151:22] out:

Done.

With this, Fabric can run against the cluster.
Whether you need the parallel feature is a tradeoff you can make per task. Sequential execution has the advantage of failing fast: with the 4 machines above, if the 3rd machine's network happens to drop, Fabric will raise an error and exit as soon as it fails to connect to the 3rd machine. On the next run you only need to start from the 3rd machine, i.e. put machines 3 and 4 in the execution list.
With parallel execution, machines 1, 2, and 4 may succeed while the 3rd fails, and you have to dig through the output to see which one failed. Suppose there are 100 machines and it happens that machines 42, 57, and 69 fail: picking the failed machines out of the output of all 100 will be very troublesome.
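Rather than eyeballing the output, you can have the script itself collect the failures and print a retry list. A minimal sketch of that bookkeeping, with a fake connection step standing in for SSH (the hosts and the failing host are made up):

```python
hosts = ['172.16.165.%d' % i for i in range(151, 155)]

def run_on_host(host):
    # Fake remote call: pretend .153 is unreachable (hypothetical failure).
    if host.endswith('.153'):
        raise IOError('connect timeout')
    return 'Linux'

failed = []
for host in hosts:
    try:
        run_on_host(host)
    except IOError:
        failed.append(host)

# The failed list is exactly what you would feed into the next run.
print('failed hosts: %s' % failed)
```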

6. Other useful Fabric commands
# Imports used by the examples below:
# from fabric.api import run, cd, get, put, sudo, prefix
# from fabric.contrib.files import exists

# 1. Execute shell commands directly
# run("cat /var/crawl/client.xml | grep '...'")
# run('cmd 2')
# run('cmd 3')

# 2. Switch directories and execute
# with cd('/var/crawl'):
#     run('echo hi >> test.txt')

# 3. Check whether a file or directory exists
# if exists('/var/crawl/client.xml'):
#     print 'config file exists'
# else:
#     print 'config file not exist'

# 4. Download a file from a remote server
# get('/remote/path/to/file', '/local/path/')

# 5. Upload a file to a remote server
# put('/local/path/to/file', '/remote/path')

# 6. Nested run
# with prefix('cd ~/shark-0.9.1/bin/'):
#     with prefix('chmod +x *.sh'):
#         run('shark-shell.sh')

# 7. sudo
# sudo("mkdir /var/www/new_docroot", user="www-data")

# 8. Get the return value and use it in another command
# files = run('ls')
# run('ls -l ' + files)

These operations basically cover day-to-day ops requirements.
=== Finish ===
