Log4j2 sending messages to Kafka


Title: Sending logs to Kafka with a custom Log4j2 appender

Tags: log4j2, kafka
To feed every project group's logs into the company's big data platform, without forcing the groups themselves to change anything, I did a survey and found that Log4j2 already has built-in support for sending logs to Kafka. Pleasantly surprised, I dug into the implementation source, only to discover that the default implementation is synchronous and blocking: if the Kafka service goes down, it blocks normal log printing in the application. For this reason I made some changes based on the original source.
Log4j2 Log Workflow

Log4j2 is a significant performance improvement over Log4j; the official documentation has described and benchmarked this clearly, so I will not repeat it here. To use it skillfully, it helps to understand its internal workflow. The official site provides a class diagram of Log4j2's architecture, which it describes as follows:

Applications using the Log4j 2 API will request a Logger with a specific name from the LogManager. The LogManager will locate the appropriate LoggerContext and then obtain the Logger from it. If the Logger must be created it will be associated with the LoggerConfig that contains either a) the same name as the Logger, b) the name of a parent package, or c) the root LoggerConfig. LoggerConfig objects are created from Logger declarations in the configuration. The LoggerConfig is associated with the Appenders that actually deliver the LogEvents.

The official site already explains the relationships between these classes, so I will not describe each one's role in detail. Today's focus is the Appender class, because it decides where the log output goes.

    • Appender
The ability to selectively enable or disable logging requests based on their logger is only part of the picture. Log4j allows logging requests to print to multiple destinations. In Log4j speak, an output destination is called an Appender. Currently, appenders exist for the console, files, remote socket servers, Apache Flume, JMS, remote UNIX Syslog daemons, and various database APIs. See the section on Appenders for more details on the various types available. More than one Appender can be attached to a Logger.
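The fan-out behavior described in this quote can be sketched in plain Java. The Appender interface below is a simplified stand-in for illustration, not the real Log4j2 API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the Appender idea: a logger fans each event out to every
// destination attached to it. These types are simplified stand-ins,
// not the actual Log4j2 classes.
public class MiniLogger {
    public interface Appender { void append(String event); }

    private final List<Appender> appenders = new ArrayList<>();

    public MiniLogger add(Appender a) { appenders.add(a); return this; }

    public void log(String message) {
        // More than one Appender may be attached to a Logger.
        for (Appender a : appenders) a.append(message);
    }
}
```

Attaching two appenders (say, one for the console and one for Kafka) means every logged event is delivered to both destinations.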
Core Configuration


These are the core classes Log4j2 uses to send logs to Kafka. The most important is KafkaAppender; the other classes handle the connection to the Kafka service.

    • KafkaAppender Core Configuration
@Plugin(name = "Kafka", category = "Core", elementType = "appender", printObject = true)
public final class KafkaAppender extends AbstractAppender {

    private static final long serialVersionUID = 1L;

    @PluginFactory
    public static KafkaAppender createAppender(
            @PluginElement("Layout") final Layout<? extends Serializable> layout,
            @PluginElement("Filter") final Filter filter,
            @Required(message = "No name provided for KafkaAppender") @PluginAttribute("name") final String name,
            @PluginAttribute(value = "ignoreExceptions", defaultBoolean = true) final boolean ignoreExceptions,
            @Required(message = "No topic provided for KafkaAppender") @PluginAttribute("topic") final String topic,
            @PluginElement("Properties") final Property[] properties) {
        final KafkaManager kafkaManager = new KafkaManager(name, topic, properties);
        return new KafkaAppender(name, layout, filter, ignoreExceptions, kafkaManager);
    }

    private final KafkaManager manager;

    private KafkaAppender(final String name, final Layout<? extends Serializable> layout,
            final Filter filter, final boolean ignoreExceptions, final KafkaManager manager) {
        super(name, filter, layout, ignoreExceptions);
        this.manager = manager;
    }

    @Override
    public void append(final LogEvent event) {
        if (event.getLoggerName().startsWith("org.apache.kafka")) {
            LOGGER.warn("Recursive logging from [{}] for appender [{}].", event.getLoggerName(), getName());
        } else {
            try {
                if (getLayout() != null) {
                    manager.send(getLayout().toByteArray(event));
                } else {
                    manager.send(event.getMessage().getFormattedMessage().getBytes(StandardCharsets.UTF_8));
                }
            } catch (final Exception e) {
                LOGGER.error("Unable to write to Kafka [{}] for appender [{}].", manager.getName(), getName(), e);
                throw new AppenderLoggingException("Unable to write to Kafka in appender: " + e.getMessage(), e);
            }
        }
    }

    @Override
    public void start() {
        super.start();
        manager.startup();
    }

    @Override
    public void stop() {
        super.stop();
        manager.release();
    }
}
    • Log4j2.xml Simple Configuration
<?xml version="1.0" encoding="UTF-8"?>
...
<Appenders>
  <Kafka name="Kafka" topic="log-test">
    <PatternLayout pattern="%date %message"/>
    <Property name="bootstrap.servers">localhost:9092</Property>
  </Kafka>
</Appenders>
<Loggers>
  <Root level="DEBUG">
    <AppenderRef ref="Kafka"/>
  </Root>
  <Logger name="org.apache.kafka" level="INFO"/> <!-- Avoid recursive logging -->
</Loggers>
The name attribute of @Plugin corresponds to the Kafka tag in the XML configuration file, and it can be customized. For example, to use a MyKafka tag, change the @Plugin name attribute to MyKafka and configure:
<MyKafka name="Kafka" topic="log-test">
Custom Configuration

Sometimes the attributes we need are not supported by KafkaAppender by default, so a certain amount of rewriting is required. The rewriting is fairly convenient: you only need to pick your values out of the properties passed to the constructor and leave the rest as the kafkaProps producer configuration. To meet my project's requirements, I defined two extra attributes, platform and servicename.
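A minimal sketch of that idea, assuming the appender factory receives its configuration as simple name/value pairs: custom keys such as platform and servicename (the two attributes mentioned above) are peeled off for the appender itself, and everything else is forwarded as the Kafka producer configuration. The class and method names here are illustrative, not Log4j2 API:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Properties;

// Sketch: split configured name/value pairs into
// (a) custom attributes consumed by the appender itself, and
// (b) the remaining pairs forwarded as the Kafka producer config.
public class AppenderConfigSplitter {
    // "platform" and "servicename" are the article's two custom attributes.
    private static final String[] CUSTOM_KEYS = {"platform", "servicename"};

    public static Properties toKafkaProps(Map<String, String> pairs, Map<String, String> customOut) {
        Properties kafkaProps = new Properties();
        for (Map.Entry<String, String> e : pairs.entrySet()) {
            if (isCustom(e.getKey())) {
                customOut.put(e.getKey(), e.getValue());   // kept by the appender
            } else {
                kafkaProps.setProperty(e.getKey(), e.getValue()); // passed to the producer
            }
        }
        return kafkaProps;
    }

    private static boolean isCustom(String key) {
        for (String k : CUSTOM_KEYS) {
            if (k.equalsIgnoreCase(key)) return true;
        }
        return false;
    }
}
```

In a real KafkaAppender rewrite, this split would happen inside the @PluginFactory method before the producer properties are handed to the manager.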

Reading the KafkaAppender source shows that it sends messages in a synchronous, blocking way. Testing confirmed that once the Kafka service goes down, the normal log output of the application is affected, and this is not what I wanted to see, so I modified it to a certain degree.
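One common way to decouple logging from a slow or dead broker is to hand each serialized event to a bounded in-memory queue and let a background thread perform the actual send. The sketch below shows only this pattern using the standard library; the Sender interface is a stand-in for a real KafkaProducer send call, and the names are illustrative rather than the author's actual code:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of the decoupling idea: append() only enqueues; a daemon thread
// performs the (potentially blocking) send. If the broker is down, the
// queue fills up and append() drops instead of blocking the caller.
public class NonBlockingDispatcher {
    // Stand-in for a real KafkaProducer#send call.
    public interface Sender { void send(byte[] payload) throws Exception; }

    private final BlockingQueue<byte[]> queue;
    private final AtomicInteger dropped = new AtomicInteger();

    public NonBlockingDispatcher(int capacity, Sender sender) {
        this.queue = new ArrayBlockingQueue<>(capacity);
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    byte[] payload = queue.take();
                    try {
                        sender.send(payload);
                    } catch (Exception ignored) {
                        // A real implementation would spool to a local file here.
                    }
                }
            } catch (InterruptedException e) {
                // Shutdown requested.
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    /** Never blocks the logging thread; returns false if the payload was dropped. */
    public boolean append(byte[] payload) {
        boolean accepted = queue.offer(payload);
        if (!accepted) dropped.incrementAndGet();
        return accepted;
    }

    public int droppedCount() { return dropped.get(); }
}
```

The key design choice is that the logging thread only ever calls offer(), which returns immediately, so a hung broker can no longer stall the application's own log printing.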

Behavior in each scenario:

    • Kafka service stays healthy

      This is the ideal case: messages are delivered to the Kafka broker as usual.
    • Kafka service goes down and recovers after some time

      When the Kafka service is down, all subsequent messages are placed in a ConcurrentLinkedQueue. At the same time, a background consumer continuously drains this queue and writes the messages to a local file. When the heartbeat detects that the Kafka broker is healthy again, the contents of the local file are read and sent to the broker. Note that at this point a large number of messages would be instantiated as ProducerRecord objects and heap memory usage would be very high, so I used a thread to throttle this a bit.
    • Kafka service stays down
      All messages are written to the local file.
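The fallback path in the second scenario can be sketched with the standard library alone: messages buffered in a ConcurrentLinkedQueue are drained to a local spool file while the broker is down, then replayed one line at a time after recovery, so that records are not all materialized in the heap at once. The file handling and names below are illustrative, not the author's actual implementation:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.function.Consumer;

// Sketch of the fallback path: buffer -> spool file -> line-by-line replay.
public class FallbackSpool {
    private final Queue<String> pending = new ConcurrentLinkedQueue<>();
    private final Path spoolFile;

    public FallbackSpool(Path spoolFile) { this.spoolFile = spoolFile; }

    /** Buffer a message while the broker is down. */
    public void buffer(String message) { pending.add(message); }

    /** Drain the in-memory queue to the spool file to keep heap usage bounded. */
    public void drainToFile() throws IOException {
        String line;
        while ((line = pending.poll()) != null) {
            Files.write(spoolFile, (line + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                    StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        }
    }

    /** Replay spooled messages one at a time once the heartbeat reports the broker healthy. */
    public void replay(Consumer<String> sender) throws IOException {
        if (!Files.exists(spoolFile)) return;
        try (BufferedReader r = Files.newBufferedReader(spoolFile, StandardCharsets.UTF_8)) {
            String l;
            while ((l = r.readLine()) != null) {
                sender.accept(l); // one record at a time, not a bulk load of ProducerRecord objects
            }
        }
        Files.delete(spoolFile);
    }
}
```

Streaming the file back through a BufferedReader instead of reading it all into memory is what keeps the heap pressure down during replay, which is the problem the article's throttling thread addresses.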

