K8s and Logs: An Output Plugin for Fluent Bit in Golang


Writing an output plugin for Fluent Bit in Golang

Objective

At present, the community has several components for log collection and processing: Logstash in the older ELK stack, Fluentd in the CNCF, Filebeat in the EFK stack, and Flume, which is more common in big data. Fluent Bit is a high-performance log collection component written in C, and its overall architecture originates from Fluentd. The official comparison data are as follows:

               Fluentd                                  Fluent Bit
Scope          Containers / Servers                     Containers / Servers
Language       C & Ruby                                 C
Memory         ~40 MB                                   ~450 KB
Performance    High performance                         High performance
Dependencies   Built as a Ruby Gem, requires a          Zero dependencies, unless some
               certain number of gems.                  special plugin requires them.
Plugins        More than 650 plugins available          Around 35 plugins available
License        Apache License v2.0                      Apache License v2.0

As you can see from the data, Fluent Bit consumes far fewer resources, so a Fluent Bit + Fluentd combination is well suited to centralized log collection: Fluent Bit is mainly responsible for collecting, while Fluentd is responsible for processing and forwarding.

Extending the output plug-in

Fluent Bit itself is written in C, which makes it somewhat difficult to extend. Perhaps with this in mind, the project provides fluent-bit-go, which lets you write plugins in Go; currently only output plugins are supported.
fluent-bit-go actually uses cgo to wrap the C interface. The code is relatively simple; the key file is analyzed below.

package output

/*
#include <stdlib.h>
#include "flb_plugin.h"
#include "flb_output.h"
*/
import "C"

import (
	"fmt"
	"unsafe"
)

// Define constants matching Fluent Bit core
const FLB_ERROR = C.FLB_ERROR
const FLB_OK = C.FLB_OK
const FLB_RETRY = C.FLB_RETRY

const FLB_PROXY_OUTPUT_PLUGIN = C.FLB_PROXY_OUTPUT_PLUGIN
const FLB_PROXY_GOLANG = C.FLB_PROXY_GOLANG

// Local type to define a plugin definition
type FLBPlugin C.struct_flb_plugin_proxy
type FLBOutPlugin C.struct_flbgo_output_plugin

// When FLBPluginInit is triggered by Fluent Bit, a plugin context
// is passed and the next step is to invoke this FLBPluginRegister() function
// to fill the required information: type, proxy type, flags, name and
// description.
func FLBPluginRegister(ctx unsafe.Pointer, name string, desc string) int {
	p := (*FLBPlugin)(unsafe.Pointer(ctx))
	p._type = FLB_PROXY_OUTPUT_PLUGIN
	p.proxy = FLB_PROXY_GOLANG
	p.flags = 0
	p.name = C.CString(name)
	p.description = C.CString(desc)
	return 0
}

// Release resources allocated by the plugin initialization
func FLBPluginUnregister(ctx unsafe.Pointer) {
	p := (*FLBPlugin)(unsafe.Pointer(ctx))
	fmt.Printf("[flbgo] unregistering %v\n", p)
	C.free(unsafe.Pointer(p.name))
	C.free(unsafe.Pointer(p.description))
}

func FLBPluginConfigKey(ctx unsafe.Pointer, key string) string {
	_key := C.CString(key)
	return C.GoString(C.output_get_property(_key, unsafe.Pointer(ctx)))
}

It mainly defines the constants and functions needed to write a plugin, for example FLBPluginRegister, which registers the plugin, and FLBPluginConfigKey, which reads parameters from the configuration file.
PS
In practice, you call fluent-bit-go from your own Golang code, add the actual business logic, and finally compile everything into a c-shared .so dynamic link library.
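
Before the full Kafka example below, here is a minimal sketch of what such a plugin file can look like, using the same fluent-bit-go API shown above. The plugin name out_demo and the prefix parameter are hypothetical, purely for illustration:

package main

import (
	"C"
	"fmt"
	"unsafe"

	"github.com/fluent/fluent-bit-go/output"
)

//export FLBPluginRegister
func FLBPluginRegister(ctx unsafe.Pointer) int {
	// Register under the name referenced by the configuration / -e flag.
	return output.FLBPluginRegister(ctx, "out_demo", "Demo output plugin")
}

//export FLBPluginInit
func FLBPluginInit(ctx unsafe.Pointer) int {
	// Read a parameter from the plugin's configuration section.
	prefix := output.FLBPluginConfigKey(ctx, "prefix")
	fmt.Printf("[out_demo] init, prefix=%s\n", prefix)
	return output.FLB_OK
}

//export FLBPluginFlush
func FLBPluginFlush(data unsafe.Pointer, length C.int, tag *C.char) int {
	// A real plugin decodes the msgpack payload here; this sketch only
	// reports how many bytes arrived for the given tag.
	fmt.Printf("[out_demo] flush: %d bytes for tag %s\n", int(length), C.GoString(tag))
	return output.FLB_OK
}

//export FLBPluginExit
func FLBPluginExit() int {
	return output.FLB_OK
}

func main() {
}

Such a file is compiled with the same go build -buildmode=c-shared command shown later, and Fluent Bit loads the resulting .so with -e.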

Custom fluent-bit-kafka-output Plugin

In fact, Fluent Bit v0.13 already ships a Kafka output plugin, but it did not meet the needs of our actual project, so we had to customize one.
The code below is primarily a demo that shows how to write an output plugin.

Code Authoring and Analysis

Here is the code:

package main

import (
	"C"
	"fmt"
	"io"
	"log"
	"reflect"
	"strconv"
	"strings"
	"time"
	"unsafe"

	"github.com/Shopify/sarama"
	"github.com/fluent/fluent-bit-go/output"
	"github.com/ugorji/go/codec"
)

var (
	brokers    []string
	producer   sarama.SyncProducer
	timeout    = 0 * time.Minute
	topic      string
	module     string
	messageKey string
)

//export FLBPluginRegister
func FLBPluginRegister(ctx unsafe.Pointer) int {
	return output.FLBPluginRegister(ctx, "out_kafka", "Kafka Output plugin.")
}

//export FLBPluginInit
// ctx (context) pointer to fluentbit context (state/c code)
func FLBPluginInit(ctx unsafe.Pointer) int {
	if bs := output.FLBPluginConfigKey(ctx, "brokers"); bs != "" {
		brokers = strings.Split(bs, ",")
	} else {
		log.Printf("you must set brokers")
		return output.FLB_ERROR
	}
	if tp := output.FLBPluginConfigKey(ctx, "topics"); tp != "" {
		topic = tp
	} else {
		log.Printf("you must set topics")
		return output.FLB_ERROR
	}
	if mo := output.FLBPluginConfigKey(ctx, "module"); mo != "" {
		module = mo
	} else {
		log.Printf("you must set module")
		return output.FLB_ERROR
	}
	if key := output.FLBPluginConfigKey(ctx, "message_key"); key != "" {
		messageKey = key
	} else {
		log.Printf("you must set message_key")
		return output.FLB_ERROR
	}

	config := sarama.NewConfig()
	config.Producer.Return.Successes = true
	if requiredAcks := output.FLBPluginConfigKey(ctx, "required_acks"); requiredAcks != "" {
		if acks, err := strconv.Atoi(requiredAcks); err == nil {
			config.Producer.RequiredAcks = sarama.RequiredAcks(acks)
		}
	}
	if compressionCodec := output.FLBPluginConfigKey(ctx, "compression_codec"); compressionCodec != "" {
		if cc, err := strconv.Atoi(compressionCodec); err == nil {
			config.Producer.Compression = sarama.CompressionCodec(cc)
		}
	}
	if maxRetry := output.FLBPluginConfigKey(ctx, "max_retry"); maxRetry != "" {
		if mr, err := strconv.Atoi(maxRetry); err == nil {
			config.Producer.Retry.Max = mr
		}
	}
	if timeout == 0 {
		timeout = 5 * time.Minute
	}

	// If Kafka is not running on init, wait to connect
	deadline := time.Now().Add(timeout)
	for tries := 0; time.Now().Before(deadline); tries++ {
		var err error
		if producer == nil {
			producer, err = sarama.NewSyncProducer(brokers, config)
		}
		if err == nil {
			return output.FLB_OK
		}
		log.Printf("cannot connect to kafka: (%s) retrying...", err)
		// The retry interval was lost in the original formatting; 30s is an assumption.
		time.Sleep(time.Second * 30)
	}
	log.Printf("kafka failed to respond after %s", timeout)
	return output.FLB_ERROR
}

//export FLBPluginFlush
// FLBPluginFlush is called from fluent-bit when data needs to be sent.
func FLBPluginFlush(data unsafe.Pointer, length C.int, tag *C.char) int {
	var h codec.MsgpackHandle
	var b []byte
	var m interface{}
	var err error

	b = C.GoBytes(data, length)
	dec := codec.NewDecoderBytes(b, &h)

	// Iterate the original MessagePack array
	var msgs []*sarama.ProducerMessage
	for {
		// decode the msgpack data
		err = dec.Decode(&m)
		if err != nil {
			if err == io.EOF {
				break
			}
			log.Printf("failed to decode msgpack data: %v\n", err)
			return output.FLB_ERROR
		}

		// Get the slice and its two entries: timestamp and record map
		slice := reflect.ValueOf(m)
		record := slice.Index(1)

		// Convert the record to a real map and iterate it.
		// Flatten and UnderscoreStyle are helpers for flattening the nested
		// record map; their implementation is not included in the original snippet.
		mapData := record.Interface().(map[interface{}]interface{})
		flattenData, err := Flatten(mapData, "", UnderscoreStyle)
		if err != nil {
			break
		}

		message := ""
		host := ""
		for k, v := range flattenData {
			value := ""
			switch t := v.(type) {
			case string:
				value = t
			case []byte:
				value = string(t)
			default:
				value = fmt.Sprintf("%v", v)
			}
			if k == "pod_name" {
				host = value
			}
			if k == messageKey {
				message = value
			}
		}

		if message == "" || host == "" {
			break
		}

		msg := &sarama.ProducerMessage{
			Topic: topic,
			Key:   sarama.StringEncoder(fmt.Sprintf("host=%s|module=%s", host, module)),
			Value: sarama.ByteEncoder(message),
		}
		msgs = append(msgs, msg)
	}

	err = producer.SendMessages(msgs)
	if err != nil {
		log.Printf("FAILED to send kafka message: %s\n", err)
		return output.FLB_ERROR
	}
	return output.FLB_OK
}

//export FLBPluginExit
func FLBPluginExit() int {
	producer.Close()
	return output.FLB_OK
}

func main() {
}

    • FLBPluginRegister registers the plugin.
    • FLBPluginInit initializes the plugin.
    • FLBPluginFlush flushes data to the output.
    • FLBPluginExit runs whatever needs to happen when the plugin exits, such as closing connections.
    • FLBPluginConfigKey reads parameters from the configuration file.

PS
Besides FLBPluginConfigKey, you can also read settings from environment variables.
ctx acts as a context that carries data between Fluent Bit and the plugin.
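
As a hedged sketch of the environment-variable approach, a small helper like the following could sit alongside the plugin code; configOrEnv and the KAFKA_BROKERS variable name are hypothetical, not part of the original plugin:

package main

import (
	"os"
	"unsafe"

	"github.com/fluent/fluent-bit-go/output"
)

// configOrEnv prefers a key from the plugin's configuration section and
// falls back to an environment variable, which is convenient when the
// plugin runs inside a Kubernetes pod.
func configOrEnv(ctx unsafe.Pointer, key, envName string) string {
	if v := output.FLBPluginConfigKey(ctx, key); v != "" {
		return v
	}
	return os.Getenv(envName)
}

// Example use inside FLBPluginInit:
//   bs := configOrEnv(ctx, "brokers", "KAFKA_BROKERS")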

Compiling and executing

To compile:

go build -buildmode=c-shared -o out_kafka.so .

This generates out_kafka.so.

To execute:

/fluent-bit/bin/fluent-bit -c /fluent-bit/etc/fluent-bit.conf -e /fluent-bit/out_kafka.so
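
For reference, the [OUTPUT] section of fluent-bit.conf that wires up this plugin might look like the sketch below. The key names come from FLBPluginInit above; the broker addresses, topic, module, and message key values are illustrative assumptions only:

[OUTPUT]
    Name         out_kafka
    Match        *
    brokers      kafka-0:9092,kafka-1:9092
    topics       app-logs
    module       demo-module
    message_key  log
    # optional keys read by the plugin: required_acks, compression_codec, max_retry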

Summary

Following a similar structure, you can write your own custom output plugin.
