Recording MongoDB Logs with Logstash

Source: Internet
Author: User
Tags: logstash

Environment: MongoDB 3.2.17, Logstash 6


A sample of the MongoDB log (file path: /root/mongodb.log):

2018-03-06T03:11:51.338+0800 I COMMAND  [conn1978967] command top_fba.$cmd command: createIndexes { createIndexes: "top_amazon_fba_inventory_data_2018-03-06", indexes: [ { key: { sellerid: 1, sku: 1, updatetime: 1 }, name: "sellerid_1_sku_1_updatetime_1" } ] } keyUpdates:0 writeConflicts:0 numYields:0 reslen:113 locks:{ Global: { acquireCount: { r: 3, w: 3 } }, Database: { acquireCount: { w: 2, W: 1 } }, Collection: { acquireCount: { w: 1 } }, Metadata: { acquireCount: { w: 2 } }, oplog: { acquireCount: { w: 2 } } } protocol:op_query 5751ms
2018-03-07T10:06:09.834+0800 I COMMAND  [conn2020085] command top.top_order_list command: aggregate { aggregate: "top_order_list", pipeline: [ { $match: { stock_id: { $ne: 75 }, date_day: { $gte: "2018-01-01", $lt: "2018-02-01" } } }, { $group: { _id: 1, order_id: { $addToSet: "$order_id" } } }, { $project: { _id: 1, order_id: { $size: "$order_id" } } } ] } keyUpdates:0 writeConflicts:0 numYields:13924 reslen:103 locks:{ Global: { acquireCount: { r: 27942 } }, Database: { acquireCount: { r: 13971 } }, Collection: { acquireCount: { r: 13971 } } } protocol:op_query 118899ms
2018-03-07T10:09:03.590+0800 I COMMAND  [conn6175621] getmore top.top_product_flat query: { catagory_id: { $in: [ 176, 170, 3447, 3448, 3449, 3450, 3451, 3452, 3453, 3454, 3455, 3456, 3457, 3458, 3459, 3460, 3461, 3462, 3463, 3464, 3783, 183, 3465, 3466, 3467, 3468, 3469, 3470, 3471, 3472, 3473, 3474, 3475, 3476, 3477, 184, 3446, 3479, 3480, 3481, 3482, 3483, 3484, 3485, 3486, 3487, 3488, 3489, 8186, 8187, 283, 3490, 3491, 3492, 3493, 3494, 3495, 3496, 3497, 3498, 3499, 3500, 3501, 3502, 3503, 3504, 284, 3505, 3506, 3507, 3509, 3510, 3511, 285, 3523, 3524, 3525, 3526, 3527, 3528, 286, 3512, 3513, 3514, 3515, 3516, 3522, 3569, 287, 3517, 3518, 8642, 288, 289, 3784, 3785, 3794 ] } } cursorid:590981628130 ntoreturn:0 cursorExhausted:1 keyUpdates:0 writeConflicts:0 numYields:533 nreturned:68330 reslen:2839556 locks:{ Global: { acquireCount: { r: 1068 }, acquireWaitCount: { r: 202 }, timeAcquiringMicros: { r: 130039 } }, Database: { acquireCount: { r: 534 } }, Collection: { acquireCount: { r: 534 } } } 530ms
2018-03-07T10:09:03.639+0800 I COMMAND  [conn6184021] query top.top_purchase_product_price_nagotiation query: { $query: { nagotiation_date: { $gt: "2018-01-26 14:32:21", $lt: "2018-02-25 14:32:21" }, product_id: 1239714 }, $orderby: { nagotiation_date: 1 } } planSummary: COLLSCAN ntoreturn:0 ntoskip:0 keysExamined:0 docsExamined:242611 hasSortStage:1 cursorExhausted:1 keyUpdates:0 writeConflicts:0 numYields:1895 nreturned:0 reslen:20 locks:{ Global: { acquireCount: { r: 3792 }, acquireWaitCount: { r: 85 }, timeAcquiringMicros: { r: 94774 } }, Database: { acquireCount: { r: 1896 } }, Collection: { acquireCount: { r: 1896 } } } 221ms
2018-03-07T10:22:01.340+0800 I ACCESS   [conn2020395] Unauthorized: not authorized on admin to execute command { replSetGetStatus: 1.0, forShell: 1.0 }
2018-03-07T10:22:01.344+0800 I NETWORK  [conn2020395] end connection 192.168.1.100:52188 (268 connections now open)
2018-03-07T10:19:45.897+0800 I NETWORK  [initandlisten] connection accepted from 192.168.1.100:51817 #2020374 (268 connections now open)
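Each entry above follows the same MongoDB 3.2 layout: an ISO8601 timestamp, a severity letter (I), a component (COMMAND, ACCESS, NETWORK), the connection context in brackets, and then the message body; entries for slow operations additionally end with the elapsed time in milliseconds. The grok stages in the configuration below key off exactly this shape. As a rough sketch of that first stage (the field names severity, component and context here are illustrative; the actual configuration below captures the same positions as mongo_action and sock_action):

filter {
    grok {
        # Sketch only: the generic shape of a MongoDB 3.2 log line.
        # timestamp | severity | component | [context] | rest of the message
        match => ["message", "%{TIMESTAMP_ISO8601:timestamp}\s+%{WORD:severity}\s+%{WORD:component}\s+\[%{WORD:context}\]\s+%{GREEDYDATA:body}"]
    }
}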


Logstash configuration (/root/logstash_mongodb.conf):

input {
    file {
        path => "/root/mongodb.log"
        type => "mongodblog"
        start_position => "beginning"
    }
}

filter {
    if [type] == "mongodblog" {
        grok {
            match => ["message", "%{TIMESTAMP_ISO8601:timestamp}\s+I\s+%{WORD:mongo_action}\s+\[%{WORD:sock_action}\]\s+%{GREEDYDATA:body}"]
            remove_field => [ "message" ]
        }
        if [body] =~ "ms$" {
            grok {
                match => ["body", "%{WORD:command_action}\s+%{WORD:dbname}\.\$?%{WORD:collname}\s+%{GREEDYDATA:command_content}\s+%{NUMBER:time_spend}ms"]
            }
        }
        date {
            match => [ "timestamp", "UNIX", "YYYY-MM-dd HH:mm:ss", "ISO8601" ]
            remove_field => [ "timestamp" ]
        }
        mutate {
            remove_field => [ "message" ]
        }
    }
}

output {
    elasticsearch {
        hosts => ["192.168.220.100:9200"]
        index => "mongodb-%{+YYYY.MM.dd}"
    }
}
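Before pointing the output at Elasticsearch, it can help to check the parsed fields locally. A minimal debugging sketch: temporarily swap (or supplement) the elasticsearch output with a stdout output using the rubydebug codec, which prints every parsed event, including mongo_action, sock_action, dbname, collname and time_spend, to the console.

output {
    # Debugging sketch: print each parsed event to the console so the
    # extracted fields can be inspected before indexing into Elasticsearch.
    stdout { codec => rubydebug }
}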

Finally, start the pipeline with logstash -f /root/logstash_mongodb.conf, and the parsed MongoDB log entries will be indexed into Elasticsearch.
