"Flume" custom sink Kafka, and compile Package Jar,unapproval license Problem Resolution


First, create a new Java project and edit the POM file. The contents of the POM file (with the parent removed) are as follows:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.apache.flume.flume-ng-sinks</groupId>
  <artifactId>flume-ng-kafka-sink</artifactId>
  <name>Flume Kafka Sink</name>
  <version>1.0.0</version>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency>
      <groupId>org.apache.flume</groupId>
      <artifactId>flume-ng-sdk</artifactId>
      <version>1.5.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flume</groupId>
      <artifactId>flume-ng-core</artifactId>
      <version>1.5.2</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flume</groupId>
      <artifactId>flume-ng-configuration</artifactId>
      <version>1.5.2</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.6.1</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.10</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.10</artifactId>
      <version>0.8.1.1</version>
    </dependency>
  </dependencies>
</project>
Removing the parent POM (and with it the Apache RAT plugin) avoids the common "Unapproved license" errors that occur at compile time; see https://issues.apache.org/jira/browse/FLUME-1372
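
If you would rather keep the Flume parent POM, an alternative the post does not use (but which the Apache RAT plugin itself supports) is to skip the license check instead of removing it, either on the command line with mvn clean install -Drat.skip=true or with an explicit plugin configuration. A minimal sketch:

<!-- Sketch: keep the parent POM but skip the RAT license check. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.rat</groupId>
      <artifactId>apache-rat-plugin</artifactId>
      <configuration>
        <skip>true</skip>
      </configuration>
    </plugin>
  </plugins>
</build>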

The custom sink implementation needs to extend AbstractSink and implement the Configurable interface, overriding the following methods:

package com.cmcc.chiwei.kafka;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

import org.apache.flume.Channel;
import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.EventDeliveryException;
import org.apache.flume.Transaction;
import org.apache.flume.conf.Configurable;
import org.apache.flume.sink.AbstractSink;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.common.base.Throwables;

public class CmccKafkaSink extends AbstractSink implements Configurable {

    private static final Logger log = LoggerFactory.getLogger(CmccKafkaSink.class);

    public static final String KEY_HDR = "key";
    public static final String TOPIC_HDR = "topic";
    private static final String CHARSET = "UTF-8";

    private Properties kafkaProps;
    private Producer<String, byte[]> producer;
    private String topic;
    private int batchSize; // number of events per transaction, committed as a whole
    private List<KeyedMessage<String, byte[]>> messageList;

    @Override
    public Status process() throws EventDeliveryException {
        Status result = Status.READY;
        Channel channel = getChannel();
        Transaction transaction = null;
        Event event = null;
        String eventTopic = null;
        String eventKey = null;
        try {
            long processedEvents = 0;
            transaction = channel.getTransaction();
            transaction.begin(); // transaction begins
            messageList.clear();
            for (; processedEvents < batchSize; processedEvents++) {
                event = channel.take(); // take an event from the channel
                if (event == null) {
                    break;
                }
                // an event consists of headers (a Map<String, String>) and a body
                Map<String, String> headers = event.getHeaders();
                byte[] eventBody = event.getBody();
                // if the topic header is absent, fall back to the configured topic
                if ((eventTopic = headers.get(TOPIC_HDR)) == null) {
                    eventTopic = topic;
                }
                eventKey = headers.get(KEY_HDR);
                if (log.isDebugEnabled()) {
                    log.debug("{Event} " + eventTopic + " : " + eventKey + " : "
                            + new String(eventBody, CHARSET));
                    log.debug("Event #{}", processedEvents);
                }
                KeyedMessage<String, byte[]> data =
                        new KeyedMessage<String, byte[]>(eventTopic, eventKey, eventBody);
                messageList.add(data);
            }
            if (processedEvents > 0) {
                producer.send(messageList);
            }
            // batchSize events processed; commit the transaction in one go
            transaction.commit();
        } catch (Exception e) {
            String errorMsg = "Failed to publish events!";
            log.error(errorMsg, e);
            result = Status.BACKOFF;
            if (transaction != null) {
                try {
                    transaction.rollback();
                    log.debug("Transaction rollback success!");
                } catch (Exception ex) {
                    log.error(errorMsg, ex);
                    throw Throwables.propagate(ex);
                }
            }
            throw new EventDeliveryException(errorMsg, e);
        } finally {
            if (transaction != null) {
                transaction.close();
            }
        }
        return result;
    }

    @Override
    public synchronized void start() {
        ProducerConfig config = new ProducerConfig(kafkaProps);
        producer = new Producer<String, byte[]>(config);
        super.start();
    }

    @Override
    public synchronized void stop() {
        producer.close();
        super.stop();
    }

    @Override
    public void configure(Context context) {
        batchSize = context.getInteger(Constants.BATCH_SIZE, Constants.DEFAULT_BATCH_SIZE);
        messageList = new ArrayList<KeyedMessage<String, byte[]>>(batchSize);
        log.debug("Using batch size: {}", batchSize);
        topic = context.getString(Constants.TOPIC, Constants.DEFAULT_TOPIC);
        if (topic.equals(Constants.DEFAULT_TOPIC)) {
            log.warn("The property 'topic' is not set. Using the default topic name ["
                    + Constants.DEFAULT_TOPIC + "]");
        } else {
            log.info("Using the configured topic: [" + topic
                    + "]. This may be overridden by event headers.");
        }
        // kafkaProps must be built regardless of which topic is used
        kafkaProps = KafkaUtil.getKafkaConfig(context);
        if (log.isDebugEnabled()) {
            log.debug("Kafka producer properties: " + kafkaProps);
        }
    }
}
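
The sink above references two helper classes, Constants and KafkaUtil, which the original post does not include. A minimal sketch of what they might look like is below (each class in its own file in the same package); the property keys and defaults are assumptions, chosen to line up with the conf file shown later:

package com.cmcc.chiwei.kafka;

import java.util.Map;
import java.util.Properties;

import org.apache.flume.Context;

// Hypothetical reconstruction -- these helpers are not part of the original post.
final class Constants {
    static final String BATCH_SIZE = "custom.batch.size";      // assumed key
    static final int DEFAULT_BATCH_SIZE = 100;                  // assumed default
    static final String TOPIC = "custom.topic.name";            // matches the conf file below
    static final String DEFAULT_TOPIC = "default-flume-topic";  // assumed default
}

final class KafkaUtil {
    // Copy every property configured on the sink into a Properties object
    // that can be handed to the Kafka 0.8 producer.
    static Properties getKafkaConfig(Context context) {
        Properties props = new Properties();
        for (Map.Entry<String, String> entry : context.getParameters().entrySet()) {
            props.put(entry.getKey(), entry.getValue());
        }
        return props;
    }
}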
Then run mvn clean install to compile and package the jar, and copy the jar into the lib directory of the Flume installation. Next, edit the conf file.
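
For example, assuming the default Maven project layout and a FLUME_HOME environment variable pointing at the Flume installation, the build-and-deploy step might look like this (the jar name follows the artifactId and version in the POM above):

mvn clean install
cp target/flume-ng-kafka-sink-1.0.0.jar $FLUME_HOME/lib/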

Of course, the property keys in the conf file must be consistent with the attributes your custom sink reads: whatever key configure() reads is exactly the key you set in the config file.

For example:

producer.sinks.r.type = org.apache.flume.plugins.KafkaSink
producer.sinks.r.metadata.broker.list = 127.0.0.1:9092
producer.sinks.r.partition.key = 0
producer.sinks.r.partitioner.class = org.apache.flume.plugins.SinglePartition
producer.sinks.r.serializer.class = kafka.serializer.StringEncoder
producer.sinks.r.request.required.acks = 0
producer.sinks.r.max.message.size = 1000000
producer.sinks.r.producer.type = async
producer.sinks.r.custom.encoding = UTF-8
producer.sinks.r.custom.topic.name = testToptic
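
Note that producer.sinks.r.type must be the fully qualified class name of your sink; for the code above that would be com.cmcc.chiwei.kafka.CmccKafkaSink. Note also that the snippet above only configures the sink itself. A complete agent definition also needs a source and a channel; the following is a minimal sketch, where the netcat source and memory channel are illustrative stand-ins rather than part of the original post:

producer.sources = s
producer.channels = c
producer.sinks = r

# illustrative source: reads lines from a TCP socket
producer.sources.s.type = netcat
producer.sources.s.bind = 127.0.0.1
producer.sources.s.port = 44444
producer.sources.s.channels = c

# illustrative channel: in-memory buffering
producer.channels.c.type = memory
producer.channels.c.capacity = 1000

producer.sinks.r.channel = c

The agent can then be started with something like: flume-ng agent --conf conf --conf-file conf/producer.conf --name producer -Dflume.root.logger=INFO,console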
"Flume" custom sink Kafka, and compile Package Jar,unapproval license Problem Resolution

Contact Us

The content source of this page is from Internet, which doesn't represent Alibaba Cloud's opinion; products and services mentioned on that page don't have any relationship with Alibaba Cloud. If the content of the page makes you feel confusing, please write us an email, we will handle the problem within 5 days after receiving your email.

If you find any instances of plagiarism from the community, please send an email to: info-contact@alibabacloud.com and provide relevant evidence. A staff member will contact you within 5 working days.

A Free Trial That Lets You Build Big!

Start building with 50+ products and up to 12 months usage for Elastic Compute Service

  • Sales Support

    1 on 1 presale consultation

  • After-Sales Support

    24/7 Technical Support 6 Free Tickets per Quarter Faster Response

  • Alibaba Cloud offers highly flexible support services tailored to meet your exact needs.