Recently, a colleague observed that when pt-heartbeat is used to monitor master-slave replication delay and the master goes down, pt-heartbeat fails to connect but keeps retrying.
That is understandable; after all, from the user's point of view, you want pt-heartbeat to keep retrying until it can reconnect to the database. However, it turned out that the repeated retries cause memory to grow slowly.
Reproduce
Environment:
pt-heartbeat v2.2.19, MySQL Community Edition v5.6.31, Perl v5.10.1, RHEL 6.7, 500 MB of memory
To keep the database outage itself from affecting pt-heartbeat's memory usage, MySQL and pt-heartbeat run on separate hosts.
Run pt-heartbeat
# pt-heartbeat --update -h 192.168.244.10 -u monitor -p monitor123 -D test --create-table
Monitor pt-heartbeat's memory usage
Get PID
# ps -ef | grep pt-heartbeat
root      1505  1471  0 19:13 pts/0    00:00:08 perl /usr/local/bin/pt-heartbeat --update -h 192.168.244.10 -u monitor -p monitor123 -D test --create-table
root      1563  1545  2 19:50 pts/3    00:00:00 grep pt-heartbeat
View the memory usage of this process
# top -p 1505
After 0:15.00 of CPU time (the TIME+ column), %MEM stabilized at 3.3%.
Now stop the database
# service mysqld stop
The pt-heartbeat command that was just started keeps printing the following information
After the same amount of CPU time, %MEM had grown to 4.4%, an increase of one percentage point. With 500 MB of memory, that means the process consumed roughly 5 MB more. Not much by itself, but since the growth never stops, the phenomenon deserves attention.
Meanwhile, the pmap command showed that the RSS and Dirty values of the mapping at address 0000000001331000 were also growing, at a rate of about 4 KB/s.
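Memory growth like this is easiest to confirm by sampling the process's RSS over time. The helper below is a minimal sketch, not part of pt-heartbeat; the PID argument (1505 in this reproduction) and the 5-second interval are assumptions to adjust for your own host.

```shell
#!/bin/sh
# Minimal sketch: print a process's resident set size (RSS, in KB) at a
# fixed interval so a steady growth rate can be read off directly.

sample_rss() {
    # `ps -o rss= -p PID` prints only the RSS column, in KB, with no header
    ps -o rss= -p "$1" | tr -d ' '
}

if [ -n "$1" ]; then
    pid=$1
    prev=$(sample_rss "$pid")
    while kill -0 "$pid" 2>/dev/null; do   # loop until the process exits
        sleep 5
        cur=$(sample_rss "$pid")
        echo "rss=${cur}KB delta=$((cur - prev))KB"
        prev=$cur
    done
fi
```

Running it as `sh rss-watch.sh 1505` prints one line every 5 seconds; with the growth observed above (about 4 KB/s), the delta per sample should hover around 20 KB.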
Later, studying the pt-heartbeat source revealed a bug in the code:
my $tries = 2;
while ( !$dbh && $tries-- ) {
   PTDEBUG && _d($cxn_string, ' ', $user, ' ', $pass,
      join(', ', map { "$_=>$defaults->{$_}" } keys %$defaults ));
   $dbh = eval { DBI->connect($cxn_string, $user, $pass, $defaults) };
   if ( !$dbh && $EVAL_ERROR ) {
      if ( $EVAL_ERROR =~ m/locate DBD\/mysql/i ) {
         die "Cannot connect to MySQL because the Perl DBD::mysql module is "
            . "not installed or not found.  Run 'perl -MDBD::mysql' to see "
            . "the directories that Perl searches for DBD::mysql.  If "
            . "DBD::mysql is not installed, try:\n"
            . "  Debian/Ubuntu  apt-get install libdbd-mysql-perl\n"
            . "  RHEL/CentOS    yum install perl-DBD-MySQL\n"
            . "  OpenSolaris    pgk install pkg:/SUNWapu13dbd-mysql\n";
      }
      elsif ( $EVAL_ERROR =~ m/not a compiled character set|character set utf8/ ) {
         PTDEBUG && _d('Going to try again without utf8 support');
         delete $defaults->{mysql_enable_utf8};
      }
      if ( !$tries ) {
         die $EVAL_ERROR;
      }
   }
}
The code above is excerpted from the get_dbh function, which obtains a database connection. If the connection fails, it retries once more and then exits by throwing an exception via the die function.
However, by setting the following breakpoints, it turns out that when $tries reaches 0, the PTDEBUG && _d("$EVAL_ERROR") statement inside the if block executes, yet the die function does not throw the exception and exit the script:
PTDEBUG && _d($tries);
if ( !$tries ) {
   PTDEBUG && _d("$EVAL_ERROR");
   die $EVAL_ERROR;
}
Later, the final if block of the code above was modified as follows:
if ( !$tries ) {
   die "test: $EVAL_ERROR";
}
Test again
Start the database
# service mysqld start
Execute the pt-heartbeat command
# pt-heartbeat --update -h 192.168.244.10 -u monitor -p monitor123 -D test --create-table
Stop the database
# service mysqld stop
This time, the pt-heartbeat command that had just been started exits by throwing the exception.
The "test: " prefix is the marker string added for the test.
Conclusion
It is strange: a plain die $EVAL_ERROR does not throw the exception and exit the script, yet the modified die "test: $EVAL_ERROR" does.
This is clearly a bug, though it is unclear whether it is related to the Perl version.
It is also curious how the repeated failed connections lead to growing memory in the first place.
Finally, a bug was filed with Percona:
https://bugs.launchpad.net/percona-toolkit/+bug/1629164
That covers why pt-heartbeat's constant retries cause memory to grow slowly when the master is down, and how the problem was traced. I hope it helps; if you have any questions, leave a comment and I will reply as soon as possible.