MySQL SQL Execution and Slow Query Monitoring

Source: Internet
Author: User

[Preface]

MySQL can record the SQL statements executed by users, writing them to files and to tables.

MySQL also lets you define a threshold: any statement that takes longer than that threshold to execute is a slow query, and its details are recorded to files and tables according to the configuration.

[Background description]

The company wants to monitor which SQL statements are executed every day and which of them are slow queries, so that the slow ones can be optimized.

[Technical description]

You only need to understand how MySQL records the SQL statements it executes,

how it records slow queries,

and then write code to organize the results into a report. Here I use Python.

[The final result is as follows]


[Technical Details]

1. Modify my.cnf

# The overall effect: both table and file logging are enabled globally,
# but general_log writes only to the table, while slow_query_log writes
# to both the table and the log file.

general_log = 1          # enable the general query log
slow_query_log = 1       # enable logging of slow SQL

# log_output affects both general_log and slow_query_log.
log_output = TABLE,FILE  # write logs to tables and to files; writing to
                         # tables makes program statistics easier

# general_log_file is not configured here, so general_log only writes to the table.
# In MySQL 5.1.29 or later, set the following to record executed SQL statements to a file:
# general_log_file = /log/general.log
# Before 5.1.29:
# log = /var/lib/mysql/sql_row.log

long_query_time = 1      # statements running longer than 1 s are slow queries
slow_query_log_file = /log/slow.log
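As a sanity check after editing my.cnf and restarting, you can compare the server's runtime variables against the intended values. A minimal sketch, assuming the helper name and expected values are illustrative; `variables` would be a dict built from the rows of SHOW GLOBAL VARIABLES (the MySQLdb query itself is omitted here):

```python
# Hypothetical helper: verify that a server's runtime variables match the
# my.cnf settings above. `variables` maps variable names to their values,
# as reported by "SHOW GLOBAL VARIABLES".
EXPECTED = {
    'general_log': 'ON',
    'slow_query_log': 'ON',
    'log_output': 'TABLE,FILE',
    'long_query_time': '1.000000',   # MySQL reports this as a decimal string
}

def check_log_settings(variables):
    """Return (name, expected, actual) triples for every setting that differs."""
    mismatches = []
    for name, expected in sorted(EXPECTED.items()):
        actual = variables.get(name)
        if actual != expected:
            mismatches.append((name, expected, actual))
    return mismatches
```

Running this against each monitored host before relying on the report catches servers where the config change never took effect.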

2. Change the engine of the MySQL log tables (in the mysql database)

# By default general_log uses the CSV engine; switching it to MyISAM is much
# more efficient (the InnoDB engine cannot be used for the log tables).
set global general_log = off;
alter table general_log engine = MyISAM;
set global general_log = on;

# The same applies to slow_log (off/on are equivalent to 0/1).
set global slow_query_log = off;
alter table slow_log engine = MyISAM;
set global slow_query_log = on;

3. Because the MySQL log tables general_log and slow_log cannot be modified directly, you need to create new tables that are easy to delete from and modify (the log tables grow large, so data older than n days must be cleaned up regularly).

3.1 create a slow_log_dba table

CREATE TABLE `slow_log_dba` (
  `start_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  `user_host` mediumtext NOT NULL,
  `query_time` time NOT NULL,
  `lock_time` time NOT NULL,
  `rows_sent` int(11) NOT NULL,
  `rows_examined` int(11) NOT NULL,
  `db` varchar(512) NOT NULL,
  `last_insert_id` int(11) NOT NULL,
  `insert_id` int(11) NOT NULL,
  `server_id` int(10) unsigned NOT NULL,
  `sql_text` mediumtext NOT NULL
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COMMENT='slow log for dba';

3.2 Create a general_log_dba table

CREATE TABLE `general_log_dba` (
  `event_time` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  `user_host` mediumtext NOT NULL,
  `thread_id` int(11) NOT NULL,
  `server_id` int(10) unsigned NOT NULL,
  `command_type` varchar(64) NOT NULL,
  `argument` mediumtext NOT NULL,
  KEY `user_host` (`user_host`(200)),
  KEY `event_time` (`event_time`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8 COMMENT='general log for dba op';
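One pitfall with these shadow tables: the daily copy in step 4 uses `insert ... select *`, which only works while the `_dba` tables keep exactly the column layout of the built-in log tables (extra indexes are fine, extra columns are not). A tiny illustrative check, with a hypothetical helper name:

```python
# Column order of mysql.general_log, which general_log_dba must mirror
# for "insert general_log_dba select * from general_log" to succeed.
GENERAL_LOG_COLUMNS = ['event_time', 'user_host', 'thread_id',
                       'server_id', 'command_type', 'argument']

def columns_compatible(source_cols, shadow_cols):
    """insert ... select * needs the same columns in the same order."""
    return list(source_cols) == list(shadow_cols)
```

The column lists themselves would come from `SHOW COLUMNS FROM general_log` on each server; after a MySQL upgrade changes the log-table schema, this check flags the shadow table before the nightly copy starts failing.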

4. Because the program reads the general_log_dba and slow_log_dba tables, the data of general_log and slow_log must be copied into general_log_dba and slow_log_dba at regular intervals.

Since the report is generated once a day, this operation only needs to run once a day.

# The script keeps 10 days of data; every day it copies general_log and
# slow_log into general_log_dba and slow_log_dba.
# Run mysqllogtable.sh once a day from a scheduled task (cron).

#!/bin/sh
NDaysAgo=$(date -d '-10 days' "+%F %H:%M:%S")
/usr/local/mysql/bin/mysql -uXXXX -p'xxxxxx' -D'mysql' -e "insert general_log_dba select * from general_log; truncate general_log; delete from general_log_dba where event_time < \"$NDaysAgo\"; insert slow_log_dba select * from slow_log; truncate slow_log; delete from slow_log_dba where start_time < \"$NDaysAgo\""
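For reference, the rotation logic can be sketched in Python as well. This builds the cutoff timestamp and the list of statements the shell script sends to mysql; the function name is illustrative, and actually executing the statements would need a MySQLdb connection, which is omitted here:

```python
from datetime import datetime, timedelta

def rotation_statements(now, keep_days=10):
    """Return the SQL statements for one daily rotation run:
    copy the built-in log tables into the *_dba shadow tables,
    empty the originals, and drop shadow rows older than keep_days."""
    cutoff = (now - timedelta(days=keep_days)).strftime("%Y-%m-%d %H:%M:%S")
    return [
        "insert general_log_dba select * from general_log",
        "truncate general_log",
        "delete from general_log_dba where event_time < '%s'" % cutoff,
        "insert slow_log_dba select * from slow_log",
        "truncate slow_log",
        "delete from slow_log_dba where start_time < '%s'" % cutoff,
    ]
```

Keeping the copy, truncate, and cleanup in one batch matters: if the copy ran separately from the truncate, statements logged in between would be lost from the report.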

5. Python scripts that generate statistics on daily SQL operations and daily MySQL slow queries (parts of the script can be factored out as appropriate)

5.1 Script that counts MySQL's daily execution records

# -*- coding: utf-8 -*-
__author__ = 'river'
import re
import smtplib
from datetime import datetime, timedelta
from email.mime.text import MIMEText

import MySQLdb as mysql


def sendHtmlMail(mailcontent, myip):
    try:
        yestoday = (datetime.now() - timedelta(days=1)).strftime("%Y-%m-%d")
        sender = 'xxx@xxx.com'
        receiver = ['xxx@xxx.com']
        subject = myip + ' mysql operation report ' + yestoday
        smtpserver = 'smtp.exmail.xx.com'
        username = 'xxx@xxx.com'
        password = 'xxxxx'
        msg = MIMEText(mailcontent, 'html', 'utf-8')
        msg['Subject'] = subject
        msg['From'] = sender
        msg['To'] = 'xxx@xxxxxx.com'
        smtp = smtplib.SMTP()
        smtp.connect(smtpserver)
        smtp.login(username, password)
        smtp.sendmail(sender, receiver, msg.as_string())
        smtp.quit()
    except Exception, e:
        print e, 'send mail error'


if __name__ == '__main__':
    result = None
    htmlfile = 'mysqlLogMon.html'
    myiplist = ['192.168.10.10', '192.168.10.19']
    yestoday = (datetime.now() - timedelta(days=1)).strftime("%Y-%m-%d 00:00:00")
    today = datetime.now().strftime("%Y-%m-%d 00:00:00")
    for myip in myiplist:
        sql = ("select user_host, argument from general_log_dba "
               "where event_time >= '%s' and event_time <= '%s'" % (yestoday, today))
        try:
            dbcon = mysql.connect(host=myip, user='xxxxx', passwd='xxxxx',
                                  db='mysql', port=3306, charset='utf8')
            cur = dbcon.cursor()
            print "step 1," + myip + ',' + datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            cur.execute(sql)
            result = cur.fetchall()
            cur.close()
            dbcon.close()
        except Exception, e:
            print e, 'conn mysql error'
        print "step 2," + myip + ',' + datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        allhash = {}
        if result:
            for user_host, argument in result:
                # strip /* ... */ comments and NUL bytes, then lowercase
                argument_delcom = re.compile(r'(/\*(\s|.)*?\*/)').sub("", argument).strip().replace(u"\x00", '').lower()
                if re.compile(r'^access.*').match(argument_delcom) \
                        or re.compile(r'^.*@.* on .*').match(argument_delcom) \
                        or re.compile(r'^grant.*').match(argument_delcom):
                    tmpargument = argument_delcom.strip()
                else:
                    # keep only the leading verb (select, insert, ...)
                    tmpargument = argument_delcom.split(' ')[0].strip()
                if len(tmpargument) > 30:
                    # some statements look like u'select\n\t\tcount(m.enquirymainid)';
                    # use print repr(tmpargument) to inspect them
                    tmpargument = argument_delcom.split('\n')[0].strip()
                # if the whole statement was a comment, do not count it
                if not tmpargument or tmpargument.strip() == '':
                    continue
                if allhash.has_key(user_host):
                    allhash[user_host][tmpargument] = allhash[user_host].get(tmpargument, 0) + 1
                else:
                    allhash[user_host] = {tmpargument: 1}
            print "step 3," + myip + ',' + datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            # The original HTML template was lost in extraction; this is a minimal
            # header for the report table, whose visible columns were
            # User / Execute SQL / Number of executions.
            headhtml = '''<table border="1">
<tr><th>User</th><th>Execute SQL</th><th>Number of executions</th></tr>'''
            print "step 4," + myip + ',' + datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            with open(htmlfile, 'w') as htmlfileobj:
                htmlfileobj.write(headhtml)
                htmlfileobj.flush()
            print "step 5," + myip + ',' + datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            with open(htmlfile, 'a') as htmlfileobj:
                for hostkey in allhash.keys():
                    listtmp = sorted(allhash[hostkey].iteritems(),
                                     key=lambda labkey: labkey[1], reverse=True)
                    rowspan = len(allhash[hostkey])
                    tmpline = '<tr><td rowspan="%s">%s</td>' % (rowspan, hostkey.encode('utf-8'))
                    htmlfileobj.write(tmpline)
                    countn = 0
                    for runsql, count in listtmp:
                        if countn == 0:
                            tmpline = '<td>%s</td><td>%s</td></tr>' % (runsql.encode('utf-8'), count)
                        else:
                            tmpline = '<tr><td>%s</td><td>%s</td></tr>' % (runsql.encode('utf-8'), count)
                        countn += 1
                        htmlfileobj.write(tmpline)
                tmpline = '''</table>'''
                htmlfileobj.write(tmpline)
            with open(htmlfile, 'r') as htmlfileobj:
                mailcontent = htmlfileobj.read()
            sendHtmlMail(mailcontent, myip)
        else:
            print 'sql result is None, exit'
        print "step 6," + myip + ',' + datetime.now().strftime("%Y-%m-%d %H:%M:%S")
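The heart of the counting loop above is the statement normalization. Isolated as a standalone helper it is easier to test; this is a sketch, since the original code does this inline on each general_log row:

```python
import re

def normalize_statement(argument):
    """Reduce a logged statement to the key used for counting:
    strip /* ... */ comments and NUL bytes, lowercase, then keep the
    whole text for access/grant lines and only the leading verb
    (select, insert, ...) for everything else."""
    cleaned = re.sub(r'/\*(\s|.)*?\*/', '', argument).strip().replace(u"\x00", '').lower()
    # access/grant statements are kept whole so permission changes stand out
    if re.match(r'^access.*', cleaned) or re.match(r'^.*@.* on .*', cleaned) \
            or re.match(r'^grant.*', cleaned):
        return cleaned
    first = cleaned.split(' ')[0].strip()
    if len(first) > 30:
        # e.g. 'select\n\t\tcount(...)': no space after the verb,
        # so fall back to splitting on newlines
        first = cleaned.split('\n')[0].strip()
    return first
```

Counting by leading verb keeps the per-user report small; if you instead need per-statement counts, return `cleaned` unconditionally and expect a much larger HTML table.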
