Shipping AWS S3 log files to ELK via a server


The goal: view the large volume of logs that S3 accumulates through ELK.

First of all, let's clarify the idea.


Start by synchronizing the logs down from S3 with the s3cmd command, then write them into a local file, and finally view them through ELK.
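In outline, the pipeline looks something like this (a minimal sketch; the bucket name and paths here are placeholders, the real script appears later in this post):

s3cmd sync s3://<bucket>/logs/ /server/s3dir/logs/     # 1. pull the logs down from S3
cat /server/s3dir/logs/*.log >> /server/s3dir/s3elk.log    # 2. aggregate them into one file
# 3. point ELK (e.g. a Logstash file input) at /server/s3dir/s3elk.log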


First, install the s3cmd tool

Installation and basic use of the s3cmd tool:

Reference documents

https://www.cnblogs.com/xd502djj/p/3604783.html
https://github.com/s3tools/s3cmd/releases


Download the s3cmd installation package from GitHub first:

mkdir /home/tools/ && cd /home/tools/
wget https://github.com/s3tools/s3cmd/releases/download/v2.0.1/s3cmd-2.0.1.tar.gz
tar xf s3cmd-2.0.1.tar.gz
mv s3cmd-2.0.1 /usr/local/
mv /usr/local/s3cmd-2.0.1 /usr/local/s3cmd
ln -s /usr/local/s3cmd/s3cmd /usr/bin/s3cmd
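If the symlink was created correctly, the tool should now run from anywhere; a quick sanity check (the exact version string may differ):

s3cmd --version    # expected output along the lines of: s3cmd version 2.0.1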

After the installation is complete, run s3cmd --configure to set the keys.

Essentially, you configure the access key and secret key, which generates the following configuration file:

cat /root/.s3cfg

[default]
access_key = AKIAI4Q3PTOQ5XXXXXXX    # the AWS S3 access key, must be set
access_token =
add_encoding_exts =
add_headers =
bucket_location = US
ca_certs_file =
cache_file =
check_ssl_certificate = True
check_ssl_hostname = True
cloudfront_host = cloudfront.amazonaws.com
default_mime_type = binary/octet-stream
delay_updates = False
delete_after = False
delete_after_fetch = False
delete_removed = False
dry_run = False
enable_multipart = True
encrypt = False
expiry_date =
expiry_days =
expiry_prefix =
follow_symlinks = False
force = False
get_continue = False
gpg_command = /usr/bin/gpg
gpg_decrypt = %(gpg_command)s -d --verbose --no-use-agent --batch --yes --passphrase-fd %(passphrase_fd)s -o %(output_file)s %(input_file)s
gpg_encrypt = %(gpg_command)s -c --verbose --no-use-agent --batch --yes --passphrase-fd %(passphrase_fd)s -o %(output_file)s %(input_file)s
gpg_passphrase = aviagames
guess_mime_type = True
host_base = s3.amazonaws.com
host_bucket = %(bucket)s.s3.amazonaws.com
human_readable_sizes = False
invalidate_default_index_on_cf = False
invalidate_default_index_root_on_cf = True
invalidate_on_cf = False
kms_key =
limit = -1
limitrate = 0
list_md5 = False
log_target_prefix =
long_listing = False
max_delete = -1
mime_type =
multipart_chunk_size_mb = 15
multipart_max_chunks = 10000
preserve_attrs = True
progress_meter = True
proxy_host =
proxy_port = 0
put_continue = False
recursive = False
recv_chunk = 65536
reduced_redundancy = False
requester_pays = False
restore_days = 1
restore_priority = Standard
secret_key = 0UONIJRN9QQHANXXXXXXCZXXXXXXXXXXXX    # the AWS S3 secret key, must be set
send_chunk = 65536
server_side_encryption = False
signature_v2 = False
signurl_use_https = False
simpledb_host = sdb.amazonaws.com
skip_existing = False
socket_timeout = 300
stats = False
stop_on_error = False
storage_class =
urlencoding_mode = normal
use_http_expect = False
use_https = False
use_mime_magic = True
verbosity = WARNING
website_endpoint = http://%(bucket)s.s3-website-%(location)s.amazonaws.com/
website_error =
website_index = index.html
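With the keys in place, a quick way to verify the configuration is to list the buckets and the log prefix that the script below uses:

s3cmd ls                          # lists all buckets this key pair can access
s3cmd ls s3://bigbearsdk/logs/    # lists the log objects synced later in this post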


Second, the sync script to run after the s3cmd installation is complete

#!/bin/bash
# Enter the S3 sync directory
mkdir /server/s3dir/logs/ -p && cd /server/s3dir/logs/

# Write the S3 log listing into the s3.log file (run every 5 minutes)
#while true
#do
    /usr/bin/s3cmd ls s3://bigbearsdk/logs/ > s3.log
    # Run the sync command to confirm the server's logs match S3
    /usr/bin/s3cmd sync --skip-existing s3://bigbearsdk/logs/ ./
#done

# Collect the current day's log file names, sorted, into date.log
grep $(date +%F) s3.log | sort -nk1,2 | awk -F'/' '{print $NF}' > date.log
# Escape characters that would otherwise break the shell expansion below
sed -i 's#\_#\\_#g' date.log
sed -i 's#<#\\<#g' date.log
sed -i 's#\ #\\ #g' date.log
sed -i 's#>#\\>#g' date.log

##[ -f elk.log ] &&
#{
#    cat elk.log >> elk_$(date +%F).log
#    echo > elk.log
#    find /home/tools/ -name elk*.log -mtime +7 | xargs rm -f
#}

# Append the contents of each uploaded log file to the aggregated S3 log
while read line
do
    echo "$line" | sed 's#(#\\(#g' | sed 's#)#\\)#g' | sed 's#\_#\\_#g' | sed 's#<#\\<#g' | sed 's#>#\\>#g' | sed 's#\ #\\ #g' > while.log
    head -1 while.log | xargs cat >> /server/s3dir/s3elk.log
done < date.log
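The commented-out while true loop suggests the script was meant to run continuously; an alternative is a cron entry that fires every 5 minutes, assuming the script above is saved as, say, /server/s3dir/s3sync.sh (that path is hypothetical):

*/5 * * * * /bin/bash /server/s3dir/s3sync.sh >> /var/log/s3sync_cron.log 2>&1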


At this point, the contents of the S3 logs are all in the s3elk.log file, which can then be monitored through ELK.
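For example, a minimal Logstash pipeline that tails s3elk.log and ships it to Elasticsearch might look like the sketch below (the index name and the localhost:9200 endpoint are assumptions, not from the original post):

input {
  file {
    path => "/server/s3dir/s3elk.log"
    start_position => "beginning"    # read the file from the top on first run
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]      # assumed local Elasticsearch endpoint
    index => "s3-logs-%{+YYYY.MM.dd}"
  }
}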

To be continued ...

Dellinger_blue
