Backing Up a MySQL Database with Python

For work I need to back up the company's MySQL databases. I had only just started learning Python and had watched one simple series of tutorial videos, so I wrote this simple backup script. Honestly, I still have no real idea how or when to use Python classes, functions, built-in functions, private variables, global variables, and so on; this post is mainly a record for myself, and any Pythonista passing by is welcome to offer advice.

Some known issues:

1. The script requires every line of the MySQL configuration file to be in key=value format, with no stray comments; otherwise the ConfigParser module fails while parsing it. I have not looked into whether ConfigParser has a more tolerant parsing mode, and I did not have time to write error handling, so I solved it by cleaning up my.cnf until it satisfies ConfigParser's expectations (see the note after this list). The my.cnf I use is attached below.

2. The script makes heavy use of class-private member variables, because I have no idea when Python variables, class methods, and so on should be made private, or what difference it makes; all I know is that class-private members are not visible when the script is imported elsewhere or the class is inherited.

3. There is a fair amount of file handling and value passing. At the moment I only guarantee things work when values are passed in the correct format; there is no extra error handling.

4. There is a lot of string concatenation. This is my first ops script: it has to call system commands and pass many parameters, I do not know the subprocess module, and I do not know how other people write this kind of script, so I simply concatenated the command strings (a subprocess-based sketch follows this list).

5. Other unknown bugs, undiscovered logic errors, and so on.
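Two side notes on points 1 and 4 above. For point 1, Python 2.7's ConfigParser accepts an allow_no_value=True argument, which lets options that have no "=" (common in my.cnf) parse as None, so cleaning up the configuration file may not be strictly necessary. For point 4, the string concatenation plus os.popen() calls could be replaced with the subprocess module, which takes the command as a list of arguments and avoids shell quoting problems. The following is only a minimal sketch, not the script's actual method; the function name, paths and credentials are made-up placeholders:

# -*- coding: utf8 -*-
# Hypothetical sketch: invoking mysqldump through subprocess instead of
# string concatenation + os.popen(). All paths/credentials are placeholders.
import subprocess

def dump_database(dump_cmd, host, port, user, passwd, database, outfile):
    args = [
        dump_cmd,                    # e.g. /usr/local/mysql/bin/mysqldump
        '-h' + host,
        '-P' + str(port),
        '-u' + user,
        '--password=' + passwd,      # no shell involved, so no quoting needed
        '--single-transaction',
        '--master-data=2',
        '-B', database,
    ]
    with open(outfile, 'wb') as out:
        # check_call raises CalledProcessError if mysqldump exits non-zero
        subprocess.check_call(args, stdout=out)

# dump_database('/usr/local/mysql/bin/mysqldump', 'localhost', 3306,
#               'root', 'xxxxxxx', 'zabbix', '/backup/zabbix.sql')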

Environment:

- Server: Dell PowerEdge T110

- OS: CentOS 6.3_x86_64

- Python version: 2.7.3

- MySQL version: 5.5.28 Linux x86_64

MysqlBackupScript.py

#!/usr/bin/env python
# coding: utf8
# script MysqlBackupScript
# by Becareful
# version v1.0

"""This script provides automatic backups for MySQL (5.5.x) databases."""

import os
import sys
import datetime         # used to build the date part of the backup file names
import linecache        # used to read a specific line of a file
import ConfigParser     # used to parse the MySQL configuration file


class DatabaseArgs(object):
    """Holds the backup parameters and performs the full and incremental backups."""

    __MYSQL_BASE_DIR = r'/usr/local/mysql'            # MySQL installation directory
    __MYSQL_BIN_DIR = __MYSQL_BASE_DIR + '/bin'       # MySQL binary directory
    __MYSQL_CONFIG_FILE = r'/usr/local/mysql/my.cnf'  # MySQL configuration file

    __ONEDAY = datetime.timedelta(days=1)             # length of one day, used to compute the dates below
    __TODAY = datetime.date.today()                   # today's date, formatted as YYYY-MM-DD
    __YESTERDAY = __TODAY - __ONEDAY                  # yesterday's date
    __TOMORROW = __TODAY + __ONEDAY                   # tomorrow's date
    __WEEKDAY = __TODAY.strftime('%w')                # which day of the week today is

    __MYSQL_DUMP_ARGS = {                             # dictionary of mysqldump argument sets per storage engine
        'MYISAM': ' -v -E -e -R --triggers -F  -n --opt --master-data=2 --hex-blob -B ',
        'INNODB': ' -v -E -e -R --triggers -F --single-transaction -n --opt --master-data=2 --hex-blob -B '
    }

    __DUMP_COMMAND = __MYSQL_BIN_DIR + '/mysqldump'          # path of the mysqldump command, used to dump the data
    __FLUSH_LOG_COMMAND = __MYSQL_BIN_DIR + '/mysqladmin'    # path of the mysqladmin command, used to run flush-logs and rotate the daily binlog

    __BACKUP_DIR = r'/backup/'                        # directory where the backup files are stored

    __PROJECTNAME = 'example'                         # project name of the databases to back up; produces files named projectname-YYYY-MM-DD.sql, etc.
    __DATABASE_LIST = []                              # list of database names to back up; may contain several entries
    __HOST = 'localhost'
    __PORT = 3306
    __USERNAME = 'root'
    __PASSWORD = ''
    __LOGINARGS = ''                                  # login arguments when a password is required; built in __init__ below
    __LOGFILE = __BACKUP_DIR + '/backup.logs'

    def __init__(self, baseDir=__MYSQL_BASE_DIR, backDir=__BACKUP_DIR, engine='MYISAM', projectName=__PROJECTNAME,
                 dbList=__DATABASE_LIST, host=__HOST, port=__PORT, user=__USERNAME, passwd=__PASSWORD):
        """
        Parameters passed in when instantiating the object; if omitted, the class-private members above are used as defaults.
        :param baseDir:
        :param backDir:
        :param engine:
        :param projectName:
        :param dbList:
        :param host:
        :param port:
        :param user:
        :param passwd:
        """
        self.__MYSQL_BASE_DIR = baseDir
        self.__BACKUP_DIR = backDir
        self.__PROJECTNAME = projectName
        self.__DATABASE_LIST = dbList
        self.__HOST = host
        self.__PORT = port
        self.__USERNAME = user
        self.__PASSWORD = passwd
        self.__ENGINE = self.__MYSQL_DUMP_ARGS[engine]
        # Build the login arguments, e.g. "-hlocalhost -P3306 -uroot --password='xxxx'"
        self.__LOGINARGS = " -h" + self.__HOST + " -P" + str(
            self.__PORT) + " -u" + self.__USERNAME + " --password='" + self.__PASSWORD + "'"
        self.checkDatabaseArgs()   # run the sanity checks

    def __getconfig(self, cnf=__MYSQL_CONFIG_FILE, item=None):
        # Small helper that parses the MySQL configuration file: takes a key and returns the matching value from my.cnf.
        __mycnf = ConfigParser.ConfigParser()
        __mycnf.read(cnf)
        try:
            return __mycnf.get("mysqld", item)
        except BaseException, e:
            sys.stderr.write(str(e))
            sys.exit(1)

    def __getBinlogPath(self):
        # Get the absolute path of the binlog that needs the daily incremental backup:
        # the second-to-last line of MySQL's binlog .index file.
        __BINLOG_INDEX = self.__getconfig(item='log-bin') + '.index'

        if not os.path.isfile(__BINLOG_INDEX):
            sys.stderr.write('BINLOG INDEX FILE: [' + __BINLOG_INDEX + ' ] NOT FOUND!\n')
            sys.exit(1)
        else:
            try:
                __BINLOG_PATH = linecache.getline(__BINLOG_INDEX, len(open(__BINLOG_INDEX, 'r').readlines()) - 1)
                linecache.clearcache()
            except BaseException, e:
                sys.stderr.write(str(e))
                sys.exit(1)
            return __BINLOG_PATH.strip()

    def flushDatabaseBinlog(self):
        # Run mysqladmin flush-logs to rotate the binlog.
        return os.popen(self.__FLUSH_LOG_COMMAND + self.__LOGINARGS + ' flush-logs')

    def dumpDatabaseSQL(self):
        # Full backup of the listed databases via mysqldump.
        if not os.path.isfile(self.__BACKUP_DIR + '/' + self.__PROJECTNAME + '/' + str(
                self.__YESTERDAY) + '-' + self.__PROJECTNAME + '.sql'):
            return os.popen(self.__DUMP_COMMAND + self.__LOGINARGS + self.__ENGINE + ' '.join(
                self.__DATABASE_LIST) + ' >> ' + self.__BACKUP_DIR + '/' + self.__PROJECTNAME + '/' + str(
                self.__YESTERDAY) + '-' + self.__PROJECTNAME + '.sql')
        else:
            sys.stderr.write('Backup File [' + str(self.__YESTERDAY) + '-' + self.__PROJECTNAME + '.sql] already exists.\n')

    def dumpDatabaseBinlog(self):
        # Copy the binlog that needs backing up into the backup directory via shutil.copy2().
        if not os.path.isfile(self.__BACKUP_DIR + '/' + self.__PROJECTNAME + '/' + str(
                self.__YESTERDAY) + '-' + os.path.split(self.__getBinlogPath())[1]):
            from shutil import copy2
            try:
                copy2(self.__getBinlogPath(), self.__BACKUP_DIR + '/' + self.__PROJECTNAME + '/' + str(
                    self.__YESTERDAY) + '-' + os.path.split(self.__getBinlogPath())[1])
            except BaseException, e:
                sys.stderr.write(str(e))
        else:
            sys.stderr.write('Binlog File [' + str(self.__YESTERDAY) + '-' + os.path.split(self.__getBinlogPath())[1] + '] already exists\n')

    def checkDatabaseArgs(self):
        # Check a few preconditions before backing up.
        __rv = 0

        if not os.path.isdir(self.__MYSQL_BASE_DIR):  # check that the MySQL installation directory exists
            sys.stderr.write('MYSQL BASE DIR: [ ' + self.__MYSQL_BASE_DIR + ' ] NOT FOUND\n')
            __rv += 1

        if not os.path.isdir(self.__BACKUP_DIR):   # check that the backup directory exists; create it automatically if not
            sys.stderr.write('BACKUP DIR: [ ' + self.__BACKUP_DIR + '/' + self.__PROJECTNAME + ' ] NOT FOUND ,AUTO CREATED\n')
            os.makedirs(self.__BACKUP_DIR + '/' + self.__PROJECTNAME)

        if not os.path.isfile(self.__MYSQL_CONFIG_FILE):  # check that the MySQL configuration file exists
            sys.stderr.write('MYSQL CONFIG FILE: [' + self.__MYSQL_CONFIG_FILE + ' ] NOT FOUND\n')
            __rv += 1

        if not os.path.isfile(self.__DUMP_COMMAND):  # check that the mysqldump command used for the full backup exists
            sys.stderr.write('MYSQL DUMP COMMAND: [' + self.__DUMP_COMMAND + ' ] NOT FOUND\n')
            __rv += 1

        if not os.path.isfile(self.__FLUSH_LOG_COMMAND):  # check that the mysqladmin command used to flush the binlog exists
            sys.stderr.write('MYSQL FLUSH LOG COMMAND: [' + self.__FLUSH_LOG_COMMAND + ' ] NOT FOUND\n')
            __rv += 1

        if not self.__DATABASE_LIST:  # check that the list of databases to back up is not empty
            sys.stderr.write('Database List is None \n')
            __rv += 1

        if __rv:   # every failed check above adds 1 to __rv; exit as soon as __rv != 0
            sys.exit(1)


def crontab():
    # Pass the relevant parameters as a dictionary, instantiate the object and call the backup methods.
    zabbix = {
        'baseDir': '/usr/local/mysql/',
        'backDir': '/backup/',
        'projectName': 'Monitor',
        'dbList': ['zabbix'],
        'host': 'localhost',
        'port': 3306,
        'user': 'root',
        'passwd': 'xxxxxxx'
    }

    monitor = DatabaseArgs(**zabbix)
    monitor.dumpDatabaseSQL()
    monitor.dumpDatabaseBinlog()
    monitor.flushDatabaseBinlog()


if __name__ == '__main__':
    crontab()
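The crontab() entry point suggests the script is meant to be run by cron once a day. As an illustration only (the installation path and schedule below are assumptions, not from the original post), a crontab entry could look like this:

# hypothetical cron entry: run the backup shortly after midnight, appending output to the log file
10 0 * * * /usr/bin/python /root/scripts/MysqlBackupScript.py >> /backup/backup.logs 2>&1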

my.cnf

[client]
port                            = 3306
socket                          = /mysql/var/db.socket

[mysqld]
socket                          = /mysql/var/db.socket
datadir                         = /mysql/db/
skip-external-locking           = 1
skip-innodb                     = 0
key_buffer_size                 = 256M
max_allowed_packet              = 10M
table_open_cache                = 2048
sort_buffer_size                = 4M
read_buffer_size                = 4M
read_rnd_buffer_size            = 8M
myisam_sort_buffer_size         = 64M
myisam_max_sort_file_size       = 1G
myisam_repair_threads           = 1
myisam_recover                  = DEFAULT
thread_cache_size               = 32
query_cache_size                = 32M
query_cache_min_res_unit        = 2k
bulk_insert_buffer_size         = 64M
tmp_table_size                  = 128M
thread_stack                    = 192K
skip-name-resolve               = 1
max_connections                 = 65500
default-storage-engine          = myisam
federated                       = 0
server-id                       = 1
slave-skip-errors               = all
#log                            = /var/log/sql_query.log
slow-query-log                  = 1
slow-query-log-file             = /mysql/log/sql_query_slow.log
long-query-time                 = 5
log-queries-not-using-indexes   = 1
log-slow-admin-statements       = 1
log-bin                         = /mysql/var/log/binlog/bin-log
log-error                       = /mysql/var/log/mysql.err
master-info-file                = /mysql/var/log/master.info
relay-log                       = /mysql/var/log/relay-bin/relay-bin
relay-log-index                 = /mysql/var/log/relay-bin/relay-bin.index
relay-log-info-file             = /mysql/var/log/relay-bin/relay-bin.info
binlog_cache_size               = 8M
binlog_format                   = MIXED
max_binlog_cache_size           = 20M
max_binlog_size                 = 1G
binlog-ignore-db                = mysql
binlog-ignore-db                = performance_schema
binlog-ignore-db                = information_schema
replicate-ignore-db             = mysql
replicate-ignore-db             = performance_schema
replicate-ignore-db             = information_schema

innodb_data_home_dir            = /mysql/ibdata/
innodb_data_file_path           = ibdata:156M:autoextend
innodb_log_group_home_dir       = /mysql/ibdata/
log-slave-updates               = 0
back_log                        = 512
transaction_isolation           = READ-COMMITTED
max_heap_table_size             = 246M
interactive_timeout             = 120
wait_timeout                    = 120
innodb_additional_mem_pool_size = 16M
innodb_buffer_pool_size         = 512M
innodb_file_io_threads          = 4
innodb_thread_concurrency       = 8
innodb_flush_log_at_trx_commit  = 2
innodb_log_buffer_size          = 16M
innodb_log_file_size            = 128M
innodb_log_files_in_group       = 3
innodb_max_dirty_pages_pct      = 90
innodb_lock_wait_timeout        = 120
innodb_file_per_table           = 1
innodb_open_file                = 327500
open_files_limit                = 327500

[mysqldump]
quick                           = 1
max_allowed_packet              = 50M

[mysql]
auto-rehash                     = 1
socket                          = /mysql/var/db.socket
safe-updates                    = 0

[myisamchk]
key_buffer_size                 = 256M
sort_buffer_size                = 256M
read_buffer                     = 2M
write_buffer                    = 2M

[mysqlhotcopy]
interactive-timeout             = 100

The resulting backup directory structure looks like this:

[root@zabbix backup]# find ./
./
./Monitor
./Monitor/2013-03-16-bin-log.000008
./Monitor/2013-03-14-bin-log.000006
./Monitor/2013-03-16-Monitor.sql
./Monitor/2013-03-15-Monitor.sql
./Monitor/2013-03-15-bin-log.000007
./Monitor/2013-03-14-Monitor.sql
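Restoring from these files is not covered in the original post, but the standard MySQL approach would be to reload a full dump and then replay the subsequent binlog copies with mysqlbinlog. The commands below are only an illustration using the file names above:

# reload the full dump taken with -B (it contains the CREATE DATABASE / USE statements)
/usr/local/mysql/bin/mysql -uroot -p < ./Monitor/2013-03-14-Monitor.sql
# replay the incremental changes recorded in the copied binlog
/usr/local/mysql/bin/mysqlbinlog ./Monitor/2013-03-14-bin-log.000006 | /usr/local/mysql/bin/mysql -uroot -p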