Using Kafka in a Real Project


We recently used Kafka in a project; this post records how.

I will not introduce what Kafka does here; please look it up yourself.

Project Introduction

Briefly, the purpose of our project: it simulates an exchange and carries out trades in securities and similar instruments. During matchmaking a trade adds a delegate (order), updates a delegate, adds a transaction, and adds or updates a position, so the database is hit very frequently. Under that load a database operation may not finish in time, produce an error, throw an exception, and lose data. We also plan to use Kafka as a bus for data interchange later anyway, so at this stage we route the DB operations through Kafka directly; only a small change will be needed.


For Kafka installation and deployment, plus a demo, see: http://blog.csdn.net/u010343544/article/details/78308881


Project Example Introduction

General idea:

1. When a message that would normally trigger a database operation arrives, we do not perform the database operation; instead we send the message to Kafka (see the sketch after this list).

2. The Kafka consumer receives the message and performs the actual database operation, an insert or an update. If that fails, we currently just log the error as a record.
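
To make the swap concrete, here is a minimal sketch of step 1, written against the KafkaProduce helper and the property keys defined later in this post; buildEntrust() is a hypothetical stand-in for the matching engine's own code:

// Before: the matching engine wrote to the database directly, e.g.
//   aIDayEntrustDao.insertOne(domain);
// After: it publishes the row as a JSON message; the consumer does the insert.
DayEntrustDomain domain = buildEntrust();        // hypothetical factory method
KafkaProduce.sendMsg(
        ReadKafkaPropertiesUtil.getTopic(),      // topic "test"
        ReadKafkaPropertiesUtil.getKey989847(),  // key "989847" = add delegate
        JSON.toJSONString(domain));              // payload serialized with fastjson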


pom.xml: add the Kafka dependency

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.11.0.1</version>
</dependency>



Kafka configuration information. Load the configuration from kafka.properties:
##produce
bootstrap.servers=10.20.135.20:9092
producer.type=sync
request.required.acks=1
serializer.class=kafka.serializer.DefaultEncoder
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
bak.partitioner.class=kafka.producer.DefaultPartitioner
bak.key.serializer=org.apache.kafka.common.serialization.StringSerializer
bak.value.serializer=org.apache.kafka.common.serialization.StringSerializer

##consume
zookeeper.connect=10.20.135.20:2181
group.id=test-consumer-group
zookeeper.session.timeout.ms=4000
zookeeper.sync.time.ms=200
#enable.auto.commit=false
auto.commit.interval.ms=1000
auto.offset.reset=smallest
serializer.class=kafka.serializer.StringEncoder

# Kafka message configuration information
kafka.consumer.topic=test
kafka.consumer.key.989847=989847
kafka.consumer.key.989848=989848
kafka.consumer.key.989849=989849
kafka.consumer.key.989850=989850


Utility class that loads the configuration:

import java.io.File;
import java.io.FileInputStream;
import java.util.Properties;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Loads the kafka.properties configuration file.
 */
public class ReadKafkaPropertiesUtil {

    private static Logger logger = LoggerFactory.getLogger(ReadKafkaPropertiesUtil.class);

    private static Properties properties;

    // Read kafka.properties from the classpath root once, at class load time
    static {
        logger.debug("Read kafka.properties");
        properties = new Properties();
        String path = ReadKafkaPropertiesUtil.class.getResource("/").getFile().toString() + "kafka.properties";
        logger.debug("Read kafka.properties path:" + path);
        try {
            FileInputStream fis = new FileInputStream(new File(path));
            properties.load(fis);
        } catch (Exception e) {
            logger.error("Kafka produce init kafka properties" + e);
        }
    }

    /** Get the full Kafka configuration. */
    public static Properties getProperties() {
        return properties;
    }

    /** Get the Kafka topic. */
    public static String getTopic() {
        return properties.getProperty("kafka.consumer.topic");
    }

    /** Get kafka.consumer.key.989847 (add delegate). */
    public static String getKey989847() {
        return properties.getProperty("kafka.consumer.key.989847");
    }

    /** Get kafka.consumer.key.989848 (update delegate). */
    public static String getKey989848() {
        return properties.getProperty("kafka.consumer.key.989848");
    }

    /** Get kafka.consumer.key.989849 (add deal). */
    public static String getKey989849() {
        return properties.getProperty("kafka.consumer.key.989849");
    }

    /** Get kafka.consumer.key.989850 (add or update position). */
    public static String getKey989850() {
        return properties.getProperty("kafka.consumer.key.989850");
    }

    private ReadKafkaPropertiesUtil() {
    }
}
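
A quick usage sketch, assuming kafka.properties sits at the classpath root as the static initializer expects:

// Read the property set and the consumer topic once at startup
Properties props = ReadKafkaPropertiesUtil.getProperties();
String topic = ReadKafkaPropertiesUtil.getTopic();              // "test"
String addDelegateKey = ReadKafkaPropertiesUtil.getKey989847(); // "989847"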


Kafka producer. The producer sends a message to Kafka given a topic, a key, and a value. Code reference:

import java.util.Properties;

import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.alibaba.fastjson.JSON;
import com.hundsun.ftenant.common.exception.TengException;

/**
 * Kafka producer.
 */
public class KafkaProduce {

    private static Logger logger = LoggerFactory.getLogger(KafkaProduce.class);

    private static final String SEND_MESSAGE_FAILED_NUM = "12000002";
    private static final String SEND_MESSAGE_FAILED_MESSAGE = "send message to kafka error:";

    /**
     * Send a message.
     *
     * @param topic
     * @param key
     * @param value
     */
    public static void sendMsg(String topic, String key, String value) {
        Properties properties = ReadKafkaPropertiesUtil.getProperties();

        // Instantiate the producer
        KafkaProducer<String, String> kp = new KafkaProducer<String, String>(properties);

        // Wrap the message
        ProducerRecord<String, String> pr = new ProducerRecord<String, String>(topic, key, value);

        // Send the data, with a callback to surface failures
        kp.send(pr, new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception exception) {
                if (null != exception) {
                    logger.error("Kafka produce send message error" + exception);
                    logger.error("Kafka produce send message info: metadata:" + JSON.toJSONString(metadata));
                    throw new TengException(SEND_MESSAGE_FAILED_NUM,
                            SEND_MESSAGE_FAILED_MESSAGE + exception.getMessage());
                }
            }
        });

        // Close the producer
        kp.close();
    }
}
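
One caveat: sendMsg builds and closes a new KafkaProducer for every message, paying the connection setup cost on each call. KafkaProducer is thread-safe, so a common refinement (my suggestion, not part of the original project; KafkaProduceShared is a hypothetical name) is to share one long-lived instance:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaProduceShared {

    // One producer per JVM; KafkaProducer is safe to share across threads
    private static final KafkaProducer<String, String> PRODUCER =
            new KafkaProducer<String, String>(ReadKafkaPropertiesUtil.getProperties());

    public static void sendMsg(String topic, String key, String value) {
        PRODUCER.send(new ProducerRecord<String, String>(topic, key, value));
    }
}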


The Kafka consumer approach:

1. When the project starts, start the Kafka listener class KafkaConsumeListener.
2. The Kafka listener starts the Kafka thread class KafkaConsumerRunnable.
3. The thread class's run method starts the Kafka consumer and calls the IKafkaDataConsumer interface to process Kafka messages.
4. The concrete message processing class KafkaDataConsumer implements the IKafkaDataConsumer interface.
Code reference:

1. web.xml: configure the Kafka listener:

  <!-- The ContextLoaderListener must be loaded before the Kafka listener,
       because the Kafka listener uses the ServletContextEvent to look up the
       DAO classes it needs. Spring has not finished loading at that point, so
       annotations such as @Service are not yet applied, and some classes must
       be initialized manually through the ServletContextEvent. -->
  <listener>
    <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
  </listener>
  <!-- Kafka -->
  <listener>
    <listener-class>com.hundsun.cloudtrade.match.kafka.KafkaConsumeListener</listener-class>
  </listener>


2. The Kafka listener. Code reference:

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Consumer listener: starts the Kafka consumer thread when the web application starts.
 */
public class KafkaConsumeListener implements ServletContextListener {

    private static Logger logger = LoggerFactory.getLogger(KafkaConsumeListener.class);

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        logger.debug("Init kafka consume thread......");

        // Run the consumer on its own thread so context initialization is not blocked
        Thread t = new Thread(new KafkaConsumerRunnable(sce));
        t.start();

        logger.debug("Init kafka consume thread end");
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        // TODO Auto-generated method stub
    }
}


3. The Kafka thread class; we implement the Runnable interface. Code reference:

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import javax.servlet.ServletContextEvent;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.context.support.WebApplicationContextUtils;

import com.alibaba.fastjson.JSON;
import com.hundsun.cloudtrade.match.dao.IDayEntrustDao;
import com.hundsun.cloudtrade.match.dao.IDayHoldDao;
import com.hundsun.cloudtrade.match.dao.IDayTransactionDao;
import com.hundsun.ftenant.common.kafka.ReadKafkaPropertiesUtil;

import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;
import kafka.serializer.StringDecoder;
import kafka.utils.VerifiableProperties;

/**
 * Kafka consumer thread class.
 */
public class KafkaConsumerRunnable implements Runnable {

    private static Logger logger = LoggerFactory.getLogger(KafkaConsumerRunnable.class);

    // Delegates (orders)
    private final IDayEntrustDao aIDayEntrustDao;

    // Deals
    private final IDayTransactionDao aIDayTransactionDao;

    // Positions
    private final IDayHoldDao aIDayHoldDao;

    // Kafka message processing interface
    private final IKafkaDataConsumer kafkaDataConsumer;

    /**
     * Spring has not loaded yet, so manually look up the DAO beans that will be needed.
     *
     * @param sce
     */
    public KafkaConsumerRunnable(ServletContextEvent sce) {
        logger.debug("Kafka consumer init DAO class");
        aIDayHoldDao = WebApplicationContextUtils
                .getWebApplicationContext(sce.getServletContext()).getBean(IDayHoldDao.class);
        aIDayEntrustDao = WebApplicationContextUtils
                .getWebApplicationContext(sce.getServletContext()).getBean(IDayEntrustDao.class);
        aIDayTransactionDao = WebApplicationContextUtils
                .getWebApplicationContext(sce.getServletContext()).getBean(IDayTransactionDao.class);
        kafkaDataConsumer = new KafkaDataConsumer(aIDayHoldDao, aIDayEntrustDao, aIDayTransactionDao);
    }

    /**
     * Read Kafka messages.
     */
    @Override
    public void run() {
        // Kafka configuration properties
        Properties properties = ReadKafkaPropertiesUtil.getProperties();

        // Topic to consume
        String topic = ReadKafkaPropertiesUtil.getTopic();
        logger.info("Kafka consumer topic:" + topic);
        logger.info("Kafka consumer properties:" + JSON.toJSONString(properties));

        ConsumerConfig config = new ConsumerConfig(properties);

        // One stream (thread) for the topic
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put(topic, new Integer(1));

        StringDecoder keyDecoder = new StringDecoder(new VerifiableProperties());
        StringDecoder valueDecoder = new StringDecoder(new VerifiableProperties());

        ConsumerConnector consumer = kafka.consumer.Consumer.createJavaConsumerConnector(config);
        Map<String, List<KafkaStream<String, String>>> consumerMap =
                consumer.createMessageStreams(topicCountMap, keyDecoder, valueDecoder);

        KafkaStream<String, String> stream = consumerMap.get(topic).get(0);
        ConsumerIterator<String, String> it = stream.iterator();
        while (it.hasNext()) {
            // Fetch the next message from Kafka
            MessageAndMetadata<String, String> keyValue = it.next();
            logger.debug("Kafka get message, key:" + keyValue.key() + "; value:" + keyValue.message());
            // Process the Kafka data
            kafkaDataConsumer.dealKafkaMessage(keyValue.key(), keyValue.message());
        }
    }
}
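
As an aside, the loop above uses the old Scala high-level consumer (the kafka.consumer.* classes, driven by zookeeper.connect). The kafka_2.11 0.11.0.1 artifact in the pom also ships the new consumer API, which talks to the brokers directly. A rough sketch of the equivalent loop, offered under those assumptions rather than as the code the project ran:

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// bootstrap.servers replaces zookeeper.connect; deserializers replace decoders
Properties props = new Properties();
props.put("bootstrap.servers", "10.20.135.20:9092");
props.put("group.id", "test-consumer-group");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
consumer.subscribe(Collections.singletonList("test"));
while (true) {
    // Poll for up to one second, then hand each record to the same handler
    ConsumerRecords<String, String> records = consumer.poll(1000);
    for (ConsumerRecord<String, String> record : records) {
        kafkaDataConsumer.dealKafkaMessage(record.key(), record.value());
    }
}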




4. The Kafka message processing interface and implementation class. One point to note: the message handling method dealKafkaMessage wraps its work in try/catch and handles exceptions itself rather than throwing them. If an exception escapes, the thread dies; Kafka has received the message but will never consume it. In the logs you can see that Kafka knows there is a message to process, yet the offset pointer never advances. (I am not sure whether this is specific to using Runnable; a Runnable cannot throw a checked exception, whereas a Callable can.)
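
To illustrate the failure mode: if the body of the consuming loop is allowed to throw, run() exits, the thread dies, and the offset never advances. A minimal sketch of a per-message catch that keeps the loop alive (the original achieves the same effect by catching inside dealKafkaMessage):

while (it.hasNext()) {
    MessageAndMetadata<String, String> kv = it.next();
    try {
        // Keep all processing inside the try so nothing propagates out of run()
        kafkaDataConsumer.dealKafkaMessage(kv.key(), kv.message());
    } catch (Exception e) {
        // Log and move on to the next message; rethrowing would kill the thread
        logger.error("Kafka deal message failed, key:" + kv.key(), e);
    }
}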
Interface:

/**
 * Kafka message processing interface.
 */
public interface IKafkaDataConsumer {

    /**
     * Process a Kafka message.
     *
     * @param key
     * @param message
     */
    public void dealKafkaMessage(String key, String message);
}





Implementation class:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.alibaba.fastjson.JSON;
import com.hundsun.cloudtrade.match.dao.IDayEntrustDao;
import com.hundsun.cloudtrade.match.dao.IDayHoldDao;
import com.hundsun.cloudtrade.match.dao.IDayTransactionDao;
import com.hundsun.cloudtrade.match.domain.DayEntrustDomain;
import com.hundsun.cloudtrade.match.domain.DayHoldDomain;
import com.hundsun.cloudtrade.match.domain.DayTransactionDomain;
import com.hundsun.ftenant.common.kafka.ReadKafkaPropertiesUtil;

/**
 * Concrete Kafka message processing.
 */
public class KafkaDataConsumer implements IKafkaDataConsumer {

    private static Logger logger = LoggerFactory.getLogger(KafkaDataConsumer.class);

    // Delegates (orders)
    private final IDayEntrustDao aIDayEntrustDao;

    // Deals
    private final IDayTransactionDao aIDayTransactionDao;

    // Positions
    private final IDayHoldDao aIDayHoldDao;

    /**
     * @param aIDayHoldDao
     * @param aIDayEntrustDao
     * @param aIDayTransactionDao
     */
    public KafkaDataConsumer(IDayHoldDao aIDayHoldDao, IDayEntrustDao aIDayEntrustDao,
            IDayTransactionDao aIDayTransactionDao) {
        this.aIDayEntrustDao = aIDayEntrustDao;
        this.aIDayTransactionDao = aIDayTransactionDao;
        this.aIDayHoldDao = aIDayHoldDao;
    }

    /**
     * Process the data.
     *
     * @see com.hundsun.ftenant.common.kafka.IKafkaDataConsumer#dealKafkaMessage(java.lang.String, java.lang.String)
     */
    @Override
    public void dealKafkaMessage(String key, String value) {
        logger.debug("Kafka get message, key:" + key + "; value:" + value);

        // Whether the database operation succeeded
        int result = 0;
        try {
            if (ReadKafkaPropertiesUtil.getKey989847().equals(key)) {
                // Add a delegate
                logger.debug("Kafka 989847");
                DayEntrustDomain domain = JSON.parseObject(value, DayEntrustDomain.class);
                result = aIDayEntrustDao.insertOne(domain);
            } else if (ReadKafkaPropertiesUtil.getKey989848().equals(key)) {
                // Update a delegate
                logger.debug("Kafka 989848");
                DayEntrustDomain domain = JSON.parseObject(value, DayEntrustDomain.class);
                result = aIDayEntrustDao.updateOne(domain);
            } else if (ReadKafkaPropertiesUtil.getKey989849().equals(key)) {
                // Add a deal
                logger.debug("Kafka 989849");
                DayTransactionDomain domain = JSON.parseObject(value, DayTransactionDomain.class);
                result = aIDayTransactionDao.insertOne(domain);
            } else if (ReadKafkaPropertiesUtil.getKey989850().equals(key)) {
                // Add or update a position
                logger.debug("Kafka 989850");
                DayHoldDomain domain = JSON.parseObject(value, DayHoldDomain.class);
                result = aIDayHoldDao.addOrUpdateOne_addAmount(domain);
            }
        } catch (Exception e) {
            logger.error("Insert or update DB error. key:" + key + "; value:" + value);
            logger.error("Kafka deal data error" + e);
        }
        logger.debug("Kafka insert or update database result:" + result);
    }
}



That is the whole of it. Comments and discussion are welcome.





