Integrating Kafka with Spring Boot

Source: Internet
Author: User
Tags: serialization

This article describes how to integrate Kafka into a Spring Boot project to send and receive messages.

1. Resolve the Dependencies First

We won't cover the Spring Boot dependencies themselves; for Kafka, only the single spring-kafka integration package is needed:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>1.1.1.RELEASE</version>
</dependency>

First, here is the configuration file (application.properties):

#============== Kafka ===================
kafka.consumer.zookeeper.connect=10.93.21.21:2181
kafka.consumer.servers=10.93.21.21:9092
kafka.consumer.enable.auto.commit=true
kafka.consumer.session.timeout=6000
kafka.consumer.auto.commit.interval=100
kafka.consumer.auto.offset.reset=latest
kafka.consumer.topic=test
kafka.consumer.group.id=test
kafka.consumer.concurrency=10
kafka.producer.servers=10.93.21.21:9092
kafka.producer.retries=0
kafka.producer.batch.size=4096
kafka.producer.linger=1
kafka.producer.buffer.memory=40960

2. Configuration: Kafka Producer

1) Use @Configuration and @EnableKafka to declare the configuration class and enable the KafkaTemplate capability.

2) Inject the Kafka settings from the application.properties configuration file via @Value.

3) Create the beans with @Bean.

package com.kangaroo.sentinel.collect.configuration;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
@EnableKafka
public class KafkaProducerConfig {

    @Value("${kafka.producer.servers}")
    private String servers;
    @Value("${kafka.producer.retries}")
    private int retries;
    @Value("${kafka.producer.batch.size}")
    private int batchSize;
    @Value("${kafka.producer.linger}")
    private int linger;
    @Value("${kafka.producer.buffer.memory}")
    private int bufferMemory;

    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        props.put(ProducerConfig.RETRIES_CONFIG, retries);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, batchSize);
        props.put(ProducerConfig.LINGER_MS_CONFIG, linger);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, bufferMemory);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }

    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

To experiment with our producer, we write a controller that sends the message to topic=test with key=key:

package com.kangaroo.sentinel.collect.controller;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.*;

import com.kangaroo.sentinel.common.response.Response;
import com.kangaroo.sentinel.common.response.ResultCode;

@RestController
@RequestMapping("/kafka")
public class CollectController {

    protected final Logger logger = LoggerFactory.getLogger(this.getClass());

    @Autowired
    private KafkaTemplate kafkaTemplate;

    @RequestMapping(value = "/send", method = RequestMethod.GET)
    public Response sendKafka(HttpServletRequest request, HttpServletResponse response) {
        try {
            String message = request.getParameter("message");
            logger.info("kafka message = {}", message);
            kafkaTemplate.send("test", "key", message);
            logger.info("send kafka successfully.");
            return new Response(ResultCode.SUCCESS, "send kafka successful", null);
        } catch (Exception e) {
            logger.error("send kafka failed", e);
            return new Response(ResultCode.EXCEPTION, "send kafka failed", null);
        }
    }
}
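Once the application is running, the endpoint can be exercised from plain Java as well. The sketch below is a hypothetical client-side helper (the host and port are assumptions, not from the article); the point it illustrates is that because the message travels as a GET query parameter, it must be URL-encoded before being sent:

```java
import java.net.URLEncoder;

// Hypothetical helper for building the request URL to the /kafka/send
// endpoint shown above. The message is a query parameter, so characters
// like spaces must be URL-encoded.
public class SendUrlDemo {

    static String sendUrl(String host, String message) throws Exception {
        return "http://" + host + "/kafka/send?message="
                + URLEncoder.encode(message, "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        // Spaces become '+' under application/x-www-form-urlencoded rules.
        System.out.println(sendUrl("localhost:8080", "hello kafka"));
        // → http://localhost:8080/kafka/send?message=hello+kafka
    }
}
```

Opening the printed URL in a browser (or with any HTTP client) should log the message in the producer and, once the consumer below is in place, in the listener as well.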

3. Configuration: Kafka Consumer

1) Use @Configuration and @EnableKafka to declare the configuration class and enable the Kafka listener capability.

2) Inject the Kafka settings from the application.properties configuration file via @Value.

3) Create the beans with @Bean.

package com.kangaroo.sentinel.collect.configuration;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    @Value("${kafka.consumer.servers}")
    private String servers;
    @Value("${kafka.consumer.enable.auto.commit}")
    private boolean enableAutoCommit;
    @Value("${kafka.consumer.session.timeout}")
    private String sessionTimeout;
    @Value("${kafka.consumer.auto.commit.interval}")
    private String autoCommitInterval;
    @Value("${kafka.consumer.group.id}")
    private String groupId;
    @Value("${kafka.consumer.auto.offset.reset}")
    private String autoOffsetReset;
    @Value("${kafka.consumer.concurrency}")
    private int concurrency;

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setConcurrency(concurrency);
        factory.getContainerProperties().setPollTimeout(1500);
        return factory;
    }

    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    public Map<String, Object> consumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, enableAutoCommit);
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, autoCommitInterval);
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, sessionTimeout);
        propsMap.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        propsMap.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, autoOffsetReset);
        return propsMap;
    }

    @Bean
    public Listener listener() {
        return new Listener();
    }
}

new Listener() creates the bean that handles the data read from Kafka. A simple demo implementation of Listener, which just reads and logs the key and value of each message, is shown below.

The topics attribute of @KafkaListener specifies the Kafka topic name; it must match the topic the producer sends to, which is the one passed to KafkaTemplate.send() when the message is published.

package com.kangaroo.sentinel.collect.configuration;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;

public class Listener {

    protected final Logger logger = LoggerFactory.getLogger(this.getClass());

    @KafkaListener(topics = {"test"})
    public void listen(ConsumerRecord<?, ?> record) {
        logger.info("kafka key: " + record.key());
        logger.info("kafka value: " + record.value().toString());
    }
}

Tips

1) Installing and configuring Kafka itself is not covered here. When configuring Kafka, it is best to bind to a full network IP rather than localhost or 127.0.0.1.

2) It is best not to deploy Kafka with its bundled ZooKeeper, which may cause access problems.

3) In older Kafka versions, consumers connected through ZooKeeper, but here we point the consumer at the Kafka broker address; since the 0.9 consumer API, group coordination and offsets are handled by the brokers themselves, so ZooKeeper is no longer needed on the consumer side.

4) In the listener configuration, the GROUP_ID_CONFIG item specifies the name of the consumer group; if several listener objects belong to the same group, only one of them receives each message.
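Tip 4 can be illustrated without a broker. The sketch below is plain Java, not Kafka code: it mimics how the partitions of a topic are distributed among the consumers of one group, in the spirit of Kafka's round-robin assignor. Each partition is owned by exactly one consumer, which is why each message is delivered to only one listener in the group.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustration only: simulates partition assignment within one consumer
// group. Each partition goes to exactly one consumer, so a message
// (which lives in one partition) reaches only one listener per group.
public class GroupAssignmentDemo {

    // Round-robin assignment of partition numbers 0..partitions-1
    // to the consumers of a single group.
    static Map<String, List<Integer>> assign(List<String> consumers, int partitions) {
        Map<String, List<Integer>> assignment = new LinkedHashMap<>();
        for (String c : consumers) {
            assignment.put(c, new ArrayList<>());
        }
        for (int p = 0; p < partitions; p++) {
            String owner = consumers.get(p % consumers.size());
            assignment.get(owner).add(p);
        }
        return assignment;
    }

    public static void main(String[] args) {
        Map<String, List<Integer>> a =
                assign(Arrays.asList("listener-1", "listener-2"), 4);
        System.out.println(a);
        // → {listener-1=[0, 2], listener-2=[1, 3]}
    }
}
```

To have two listeners both receive every message, give them different group.id values instead of putting them in the same group.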

Reprint: https://www.cnblogs.com/kangoroo/p/7353330.html
