Shell script + MySQL database combination for data security monitoring

Source: Internet
Author: User
Tags: unpack, mysql, backup

Shell scripting + MySQL for enterprise-level Session Border Controller (SBC) log parsing and alerting


dirsx30001.sh processes the log files and generates SQL files that can be imported into MySQL.

#!/bin/bash
## Working directory for the SBC logs
path=/home/sbclog/sx30001
## The captured log files carry no year field, so import the year from the
## system date to fill the DATETIME field in the database.
## (The year could also be taken from the tar file name.)
date_year=$(date +%Y)
#test ! -e $path/*.tar && echo "[ $(date) ] no tar file found" >> $path/dir.log && exit 0

## Traverse the log file tar packages; $a holds the full path and file name
for a in $(ls $path/*.tar 2>/dev/null)
do
  ## Show the tar file currently being unpacked
  echo $a
  tar -xf $a -C $path
  ## After unpacking, move the tar file to the backup directory
  mv $a $path/backup/
  ## Run the parser foundall, which generates bye.txt and invite.txt
  /data/sbc1/foundall $path/
  ## Remove the unpacked tree
  rm -rf $path/var
  ## Count the lines of each parsed file to see whether it is empty
  filebye=$(sed -n '$=' $path/bye.txt)
  fileinvite=$(sed -n '$=' $path/invite.txt)

  ## BYE records
  if [ ! -n "$filebye" ]; then
    echo "[ $(date) ] no BYE in $a" >> $path/dir.log
  else
    echo "use sbc;" > $a.bye.sql
    echo "insert into sbc1bye values" >> $a.bye.sql
    ## Turn every log line into a quoted SQL tuple, prefixing the year
    ## to the date field; \047 is a single quote
    awk -v y="$date_year" '{
      printf "(\047%s/%s %s\047", y, $1, $2
      for (i = 3; i <= NF; i++) printf ",\047%s\047", $i
      print "),"
    }' $path/bye.txt >> $a.bye.sql
    ## Replace the last character of the last line (a comma) with a semicolon
    sed -i '$s/.$/;/' $a.bye.sql
    #sed -i 's/\//-/g' $a.bye.sql
    ## Import into the database
    mysql -usbc_user -p123456 < $a.bye.sql
    ## Archive the SQL file
    mv $a.bye.sql $path/byesql/
  fi

  ## INVITE records, handled the same way
  if [ ! -n "$fileinvite" ]; then
    echo "[ $(date) ] no INVITE in $a" >> $path/dir.log
  else
    echo "use sbc;" > $a.invite.sql
    echo "insert into sbc1invite values" >> $a.invite.sql
    awk -v y="$date_year" '{
      printf "(\047%s/%s %s\047", y, $1, $2
      for (i = 3; i <= NF; i++) printf ",\047%s\047", $i
      print "),"
    }' $path/invite.txt >> $a.invite.sql
    ## Replace the last character of the last line with a semicolon
    sed -i '$s/.$/;/' $a.invite.sql
    #sed -i 's/\//-/g' $a.invite.sql
    ## Import into the database
    mysql -usbc_user -p123456 < $a.invite.sql
    ## Archive the SQL file
    mv $a.invite.sql $path/invitesql/
  fi
done
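The heart of the script is the awk step that turns each whitespace-separated log line into a quoted SQL tuple. A minimal standalone sketch, using a hypothetical BYE.txt line layout (month/day, time, from, to, type, server, caller, called, callid), since the real field format is not shown in the article:

```shell
# Hypothetical sample line in the assumed BYE.txt layout
printf '05/21 10:00:01 alice bob B sbc1 1001 2002 abc-123\n' > /tmp/bye_sample.txt

date_year=2024   # the real script derives this from the system date

# Emit one "('...','...')," tuple per line; \047 is a single quote, and
# the year is prefixed to the date to build a full DATETIME value
row=$(awk -v y="$date_year" '{
  printf "(\047%s/%s %s\047", y, $1, $2
  for (i = 3; i <= NF; i++) printf ",\047%s\047", $i
  print "),"
}' /tmp/bye_sample.txt)
echo "$row"
```

Appending these tuples after an `insert into sbc1bye values` header, then swapping the final comma for a semicolon, yields one multi-row INSERT statement per tar file.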


Database design: the hang-up (BYE) information and the call-setup (INVITE) information are separated into two tables and managed independently.

The sbc1bye table design uses varchar for most columns, which saves space for variable-length data, while the type column uses char to improve retrieval efficiency. (Originally the BYE and INVITE records were saved in the same table.)

CREATE TABLE `sbc1bye` (
  `time` DATETIME DEFAULT NULL,
  `from` varchar(255) DEFAULT NULL,
  `to` varchar(255) DEFAULT NULL,
  `type` char DEFAULT NULL,
  `server` varchar(255) DEFAULT NULL,
  `caller` varchar(255) DEFAULT NULL,
  `called` varchar(255) DEFAULT NULL,
  `callid` varchar(255) DEFAULT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1;

CREATE TABLE `sbc1invite` (
  `time` DATETIME DEFAULT NULL,
  `from` varchar(255) DEFAULT NULL,
  `to` varchar(255) DEFAULT NULL,
  `type` char DEFAULT NULL,
  `server` varchar(255) DEFAULT NULL,
  `caller` varchar(255) DEFAULT NULL,
  `called` varchar(255) DEFAULT NULL,
  `callid` varchar(255) DEFAULT NULL
) ENGINE=MyISAM DEFAULT CHARSET=latin1;


Duplicate messages are removed through a view, which also lets you add filter conditions flexibly.

-- With view constraints, this is more flexible
CREATE OR REPLACE VIEW sbc1bye_v AS
  SELECT DISTINCT `time` btime, `from` bfrom, `to` bto, `type` btype,
         `server` bserver, `caller` bcaller, `called` bcalled, `callid` bcallid
  FROM sbc1bye GROUP BY `callid`;

-- With view constraints, this is more flexible
CREATE OR REPLACE VIEW sbc1invite_v AS
  SELECT DISTINCT `time` btime, `from` bfrom, `to` bto, `type` btype,
         `server` bserver, `caller` bcaller, `called` bcalled, `callid` bcallid
  FROM sbc1invite GROUP BY `callid`;
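The `GROUP BY callid` in the view keeps one row per call ID. The same deduplication can be sanity-checked on the parsed text files before import; a sketch assuming the call ID is the ninth field, as in the hypothetical layout above:

```shell
# Three sample lines, two of them sharing the same call ID
printf 'a b c d e f g h id1\na b c d e f g h id1\na b c d e f g h id2\n' > /tmp/invite_sample.txt

# Keep only the first line seen for each call ID (field 9)
deduped=$(awk '!seen[$9]++' /tmp/invite_sample.txt)
echo "$deduped" | wc -l
```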


mailsx30001.sh checks whether the data in the database has reached the alarm threshold.

#!/bin/bash
## Import environment variables so values set in other sessions are visible
. /root/.bash_profile
## Array of notification mailboxes
## (Experience shows the 139 mobile mailbox responds very slowly; a 163 mailbox is recommended)
a=([email protected])
## Path where the mail body is saved
mailpath=/home/sbclog/sx30001
## Alarm threshold
value_limit=200
#mysql -usbc_user -p123456 -e "use sbc;select * from heji where heji>$value_limit;" > $mailpath/mail.log
## Start from an empty mail body file
> $mailpath/mail.log
#echo "$SCN_time_sx30002"
## Query the rows that exceed the threshold (could also be rewritten with HAVING)
mysql -usbc_user -p123456 -e "use sbc;select a.* from (select bcaller,bcalled,count(bcalled) heji from sbc1bye_v where btime>$SCN_time_sx30001 group by bcaller,bcalled) a where a.heji>$value_limit;" > $mailpath/mail.log
## Check whether the mail body file is empty
filemail=$(sed -n '$=' $mailpath/mail.log)
if [ ! -n "$filemail" ]; then
  #echo "no value over the limit, so no mail will be sent."
  ## Empty file: reset the cron entry in /var/spool/cron/root to check every minute
  sed -i 's/\*\/30 \* \* \* \* \/bin\/bash \/home\/sbclog\/mailsx30001.sh/\* \* \* \* \* \/bin\/bash \/home\/sbclog\/mailsx30001.sh/g' /var/spool/cron/root
else
  ## Process the mail body into an HTML table for easier reading
  sed 's/\t/<\/th><th>/g' $mailpath/mail.log | sed 's/^/<tr><th>/g' | sed 's/$/<\/td><\/tr>/g' | sed '1s/^/<table border="1">/g' | sed '$s/$/<\/table>/g' > $mailpath/mail.html
  #echo ${#a[@]}
  ## Non-empty file: set the cron entry to alarm again every 30 minutes until handled manually
  sed -i 's/\* \* \* \* \* \/bin\/bash \/home\/sbclog\/mailsx30001.sh/\*\/30 \* \* \* \* \/bin\/bash \/home\/sbclog\/mailsx30001.sh/g' /var/spool/cron/root
  ## Traverse the array and send the mail
  for mail in ${a[*]}
  do
    ## Send in HTML format so the message style can be designed
    mail -s "$(echo -e "sbc1 exception information\nContent-Type: text/html")" $mail < $mailpath/mail.html
  done
fi
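The five-stage sed pipeline wraps the tab-separated query result into an HTML table (note that it opens cells with `<th>` but closes them with `</td>`, a quirk kept from the original). Its behaviour on a one-line sample, assuming GNU sed for the `\t` escape:

```shell
# One tab-separated result row
printf 'caller\tcalled\theji\n' > /tmp/mail_sample.log

# Same pipeline as the alert script, applied to the sample file
html=$(sed 's/\t/<\/th><th>/g' /tmp/mail_sample.log \
  | sed 's/^/<tr><th>/' \
  | sed 's/$/<\/td><\/tr>/' \
  | sed '1s/^/<table border="1">/' \
  | sed '$s/$/<\/table>/')
echo "$html"
```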


When the operator confirms that the data is normal, execute the following (it restores the checkpoint to the current time, preventing the query time range from growing too large).

#export SCN_time_sx30002="'`date +%F` `date +%T`'"
#sed -i "s/export SCN_time_sx30002.*/export SCN_time_sx30002=\"'`date +%F` `date +%T`'\"/" /root/.bash_profile
#sed -i "s/date_sub(now(),interval 1 day)/'`date +%F` `date +%T`'/" /root/.bash_profile
## Pass the environment variable to every session: reset the restore point to the current time
sed -i "s/export SCN_time_sx30001.*/export SCN_time_sx30001=\"'`date +%F` `date +%T`'\"/" /root/.bash_profile
#echo "restore success. now: $SCN_time_sx30001"
#at now + 1 day -f /home/sbclog/resx30002.sh
#at now + 1 min -f /home/sbclog/resx30002.sh
## One-shot at job: checking one day back from the alarm is appropriate,
## so schedule the recovery script one hour later
at now + 1 hours -f /home/sbclog/resx30001.sh
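Both the restore step and the recovery step work by rewriting a single `export` line in /root/.bash_profile with sed. A dry run against a throwaway copy (file path and timestamp are illustrative), assuming GNU sed for in-place editing:

```shell
profile=/tmp/profile_test
echo 'export SCN_time_sx30001="old"' > "$profile"

# Replace the whole export line, embedding single quotes so the value
# expands into a quoted SQL datetime literal in the monitoring query
sed -i "s/export SCN_time_sx30001.*/export SCN_time_sx30001=\"'2024-05-21 10:00:00'\"/" "$profile"
result=$(cat "$profile")
echo "$result"
```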

When recovering, the following script (resx30001.sh) is executed.

#sed -i 's/nnn/ggg/g' /home/sbclog/122.txt
#sed -i "s/date_sub(now(),interval 1 day)/'`date +%F` `date +%T`'/" /root/.bash_profile
## Recovery point: roll the checkpoint back to one day before the alarm
sed -i "s/export SCN_time_sx30001.*/export SCN_time_sx30001=\"date_sub(now(),interval 1 day)\"/" /root/.bash_profile

