Mydumper installation and troubleshooting summary



Mydumper is a lightweight third-party open-source tool for backing up MySQL databases. It performs logical backups, supports multiple threads, and is much faster than the stock mysqldump, along with many other outstanding features. This makes it an excellent choice for DBAs. This article describes how to install the tool and how to resolve common problems.


1. mydumper features (see the README for details)
* Parallelism (hence, speed) and performance (avoids expensive character set conversion routines, efficient code overall)
* Easier to manage output (separate files for tables, dump metadata, etc., easy to view/parse data)
* Consistency - maintains snapshot across all threads, provides accurate master and slave log positions, etc.
* Manageability - supports PCRE for specifying database and table inclusions and exclusions
It does not support schema-only dumping and leaves that to 'mysqldump --no-data'


2. Get mydumper and install it
Download: https://launchpad.net/mydumper
= How to build it? =
Run:
cmake .
make

One needs to install development versions of the required libraries (MySQL, GLib, ZLib, PCRE):
NOTE: you must use the corresponding mysql devel package.

* Ubuntu or Debian: apt-get install libglib2.0-dev libmysqlclient15-dev zlib1g-dev libpcre3-dev libssl-dev
* Fedora, RedHat and CentOS: yum install glib2-devel mysql-devel zlib-devel pcre-devel openssl-devel
* OpenSUSE: zypper install glib2-devel libmysqlclient-devel pcre-devel zlib-devel
* MacOSX: port install glib2 mysql5 pcre pkgconfig cmake
(You may want to run 'port select mysql mysql5' afterwards)

One has to make sure that pkg-config, mysql_config, and pcre-config are all in $PATH.

Binlog dump is disabled by default; to compile with it, add -DWITH_BINLOG=ON to the cmake options.
### Use -DWITH_BINLOG=ON when you need to compile in binlog dump support
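Before running cmake, the $PATH requirement above can be verified with a small shell check (a sketch; the three helper-tool names are the ones the README requires):

```shell
# Check that the build helpers mydumper's cmake run needs are in $PATH.
checked=0
missing=""
for tool in pkg-config mysql_config pcre-config; do
  checked=$((checked + 1))
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found at $(command -v "$tool")"
  else
    echo "$tool: NOT in PATH"
    missing="$missing $tool"
  fi
done
# Any missing helper means the matching -devel package is not installed
# (or its bin directory is not in PATH).
[ -z "$missing" ] || echo "Install the matching -devel packages before running cmake."
```

If mysql_config is missing but MySQL is installed to a custom prefix, adding that prefix's bin directory to PATH is usually enough.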


3. How snapshots work
= How does consistent snapshot work? =

This is all done following best MySQL practices and traditions:

* As a precaution, slow running queries on the server either abort the dump, or get killed
* Global write lock is acquired ("flush tables with read lock")
* Various metadata is read ("show slave status", "show master status")
* Other threads connect and establish snapshots ("start transaction with consistent snapshot")
** On pre-4.1.8 it creates a dummy InnoDB table, and reads from it.
* Once all worker threads announce the snapshot establishment, master executes "unlock tables" and starts queueing jobs.

This for now does not provide consistent snapshots for non-transactional engines; support for that is expected in 0.2 :)
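The steps above can be summarized as a statement order, shown here printed from a heredoc (a sketch of what the coordinator and worker connections issue, not a working backup script):

```shell
# Print the statement order mydumper's coordinator and workers follow.
plan=$(cat <<'SQL'
-- coordinator connection:
FLUSH TABLES WITH READ LOCK;
SHOW MASTER STATUS;  -- binlog file/position of this server
SHOW SLAVE STATUS;   -- replication coordinates if dumping a slave
-- each worker connection then runs:
START TRANSACTION WITH CONSISTENT SNAPSHOT;
-- once every worker has announced its snapshot, the coordinator runs:
UNLOCK TABLES;
SQL
)
echo "$plan"
```

The key property is that the global read lock is held only until every worker has opened its consistent snapshot, which keeps the write-blocked window short.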


4. How to filter Databases
= How to exclude (or include) databases? =

One can use the --regex functionality, for example to skip the mysql and test databases:

mydumper --regex '^(?!(mysql|test))'

Of course, regex functionality can be used to describe pretty much any list of tables.

= How to exclude MERGE or Federated tables =

Use the same --regex exclusion syntax. Again, engine-specific behaviors are targeted for 0.2.


5. Actual Installation Process
# ls mydump*
mydumper-0.6.2.tar.gz
# tar -xvf mydumper-0.6.2.tar.gz
# cd mydumper-0.6.2
[root@GZ-APP-BAK01 mydumper-0.6.2]# cmake .
[root@GZ-APP-BAK01 mydumper-0.6.2]# make && make install


6. Exceptions during installation
Error 1:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
PCRE_INCLUDE_DIR (ADVANCED)
Used as include directory in directory /root/mydumper-0.6.2
PCRE_PCRE_LIBRARY (ADVANCED)
Linked by target "mydumper" in directory /root/mydumper-0.6.2
Linked by target "myloader" in directory /root/mydumper-0.6.2
### Fix: install the pcre-devel package: yum install pcre-devel


Error 2:
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
MYSQL_INCLUDE_DIR (ADVANCED)
Used as include directory in directory /home/robin/mydumper-0.6.2
Used as include directory in directory /home/robin/mydumper-0.6.2
Used as include directory in directory /home/robin/mydumper-0.6.2
### The above error occurs because the MYSQL_INCLUDE_DIR directory cannot be found.
### If MySQL was compiled and installed to a non-default path, add the mysql installation path to /etc/profile or ~/.bash_profile in the home directory, then source the file for it to take effect.


Error 3:
[root@GZ-APP-BAK01 ~]# mydumper --help | more
mydumper: error while loading shared libraries: libmysqlclient.so.18: cannot open shared object file: No such file or directory
[root@GZ-APP-BAK01 ~]# mydumper
mydumper: error while loading shared libraries: libmysqlclient.so.18: cannot open shared object file: No such file or directory
### When the above error occurs, consider creating a soft link to the library:
# which libmysqlclient.so.18
/app/soft/mysql/lib/libmysqlclient.so.18
# ln -s /app/soft/mysql/lib/libmysqlclient.so.18 /usr/lib/libmysqlclient.so.18
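As an alternative to the soft link, the dynamic linker can be pointed at the MySQL lib directory per session with LD_LIBRARY_PATH (the /app/soft/mysql/lib path is the one from this article's example; system-wide, an ld.so.conf entry plus ldconfig achieves the same):

```shell
# Prepend the MySQL client library directory so the runtime loader can
# find libmysqlclient.so.18 without a symlink in /usr/lib.
export LD_LIBRARY_PATH="/app/soft/mysql/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```

This only affects the current shell and processes started from it, which makes it handy for testing before touching /usr/lib.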


Error 4:
# mydumper -uusr1 -ppwd -B blos -o /tmp/bak
Option parsing failed: Error parsing option -r, try --help
### An option and its value cannot be run together here; separate them with a space (e.g. -u usr1 -p pwd). At first this error left me completely baffled.

# mydumper --version
mydumper 0.6.2, built against MySQL 5.6.22


7. Get help
# mydumper --help | more
Usage:
  mydumper [OPTION...] multi-threaded MySQL dumping

Help Options:
  -?, --help                Show help options

Application Options:
  -B, --database            Database to dump
  -T, --tables-list         Comma delimited table list to dump (does not exclude regex option)
  -o, --outputdir           Directory to output files to
  -s, --statement-size      Attempted size of INSERT statement in bytes, default 1000000
  -r, --rows                Try to split tables into chunks of this many rows. This option turns off --chunk-filesize
  -F, --chunk-filesize      Split tables into chunks of this output file size. This value is in MB
  -c, --compress            Compress output files
  -e, --build-empty-files   Build dump files even if no data available from table
  -x, --regex               Regular expression for 'db.table' matching
  -i, --ignore-engines      Comma delimited list of storage engines to ignore
  -m, --no-schemas          Do not dump table schemas with the data
  -k, --no-locks            Do not execute the temporary shared read lock. WARNING: This will cause inconsistent backups
      --less-locking        Minimize locking time on InnoDB tables.
  -l, --long-query-guard    Set long query timer in seconds, default 60
  -K, --kill-long-queries   Kill long running queries (instead of aborting)
  -D, --daemon              Enable daemon mode
  -I, --snapshot-interval   Interval between each dump snapshot (in minutes), requires --daemon, default 60
  -L, --logfile             Log file name to use, by default stdout is used
      --tz-utc              SET TIME_ZONE='+00:00' at top of dump to allow dumping of TIMESTAMP data when a server has data in different time zones or data is being moved between servers with different time zones, defaults to on use --skip-tz-utc to disable.
      --skip-tz-utc
      --use-savepoints      Use savepoints to reduce metadata locking issues, needs SUPER privilege
      --success-on-1146     Not increment error count and Warning instead of Critical in case of table doesn't exist
      --lock-all-tables     Use LOCK TABLE for all, instead of FTWRL
  -h, --host                The host to connect to
  -u, --user                Username with privileges to run the dump
  -p, --password            User password
  -P, --port                TCP/IP port to connect to
  -S, --socket              UNIX domain socket file to use for connection
  -t, --threads             Number of threads to use, default 4
  -C, --compress-protocol   Use compression on the MySQL connection
  -V, --version             Show the program version and exit
  -v, --verbose             Verbosity of output, 0 = silent, 1 = errors, 2 = warnings, 3 = info, default 2
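Putting the options together, a typical invocation might look like the following (host, credentials, database name and output path are placeholders, not values from this article):

```shell
# Assemble a hypothetical backup command: 4 threads, compressed output,
# 64 MB chunks per table, one database dumped to /tmp/bak.
cmd="mydumper -h 127.0.0.1 -u backup -p secret -B mydb -t 4 -c -F 64 -o /tmp/bak"
echo "Example: $cmd"
```

Note that every short option is separated from its value by a space, which avoids the option-parsing failure shown in Error 4 above.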

