Spring Boot Kafka Integration (Producer and Consumer)


This article describes how to integrate Kafka into a Spring Boot project to send and receive messages.

1. Resolving the Dependencies

We won't go over the basic Spring Boot dependencies here; for Kafka, the only thing needed is the spring-kafka integration package:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
    <version>1.1.1.RELEASE</version>
</dependency>
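One caveat worth checking for your own setup (this is my note, not from the original project): each spring-kafka release line is built against a specific kafka-clients version, and the 1.1.x line used here targets the 0.10.x clients, so the broker should be a compatible version.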

First, here is the configuration file (application.properties):

#============== kafka ===================
kafka.consumer.zookeeper.connect=10.93.21.21:2181
kafka.consumer.servers=10.93.21.21:9092
kafka.consumer.enable.auto.commit=true
kafka.consumer.session.timeout=6000
kafka.consumer.auto.commit.interval=100
kafka.consumer.auto.offset.reset=latest
kafka.consumer.topic=test
kafka.consumer.group.id=test
kafka.consumer.concurrency=10

kafka.producer.servers=10.93.21.21:9092
kafka.producer.retries=0
kafka.producer.batch.size=4096
kafka.producer.linger=1
kafka.producer.buffer.memory=40960
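A side note on this file: kafka.consumer.zookeeper.connect is declared here but never read by the configuration classes below, since the consumer connects to the brokers directly (see the Tips section at the end); the same goes for kafka.consumer.topic, because the listener hardcodes the topic name.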

2. Configuration: Kafka Producer

1) Declare the configuration class and enable KafkaTemplate support via @Configuration and @EnableKafka.

2) Inject the Kafka settings from the application.properties configuration file via @Value.

3) Expose the required objects as beans with @Bean.

package com.kangaroo.sentinel.collect.configuration;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
@EnableKafka
public class KafkaProducerConfig {

    @Value("${kafka.producer.servers}")
    private String servers;
    @Value("${kafka.producer.retries}")
    private int retries;
    @Value("${kafka.producer.batch.size}")
    private int batchSize;
    @Value("${kafka.producer.linger}")
    private int linger;
    @Value("${kafka.producer.buffer.memory}")
    private int bufferMemory;

    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        props.put(ProducerConfig.RETRIES_CONFIG, retries);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, batchSize);
        props.put(ProducerConfig.LINGER_MS_CONFIG, linger);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, bufferMemory);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }

    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<String, String>(producerFactory());
    }
}

Let's try out the producer by writing a controller that sends the message parameter to topic test with key key:

package com.kangaroo.sentinel.collect.controller;

import com.kangaroo.sentinel.common.response.Response;
import com.kangaroo.sentinel.common.response.ResultCode;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.*;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@RestController
@RequestMapping("/kafka")
public class CollectController {
    protected final Logger logger = LoggerFactory.getLogger(this.getClass());

    @Autowired
    private KafkaTemplate kafkaTemplate;

    @RequestMapping(value = "/send", method = RequestMethod.GET)
    public Response sendKafka(HttpServletRequest request, HttpServletResponse response) {
        try {
            String message = request.getParameter("message");
            logger.info("kafka message = {}", message);
            kafkaTemplate.send("test", "key", message);
            logger.info("send kafka successfully.");
            return new Response(ResultCode.SUCCESS, "send kafka successfully", null);
        } catch (Exception e) {
            logger.error("send kafka failed", e);
            return new Response(ResultCode.EXCEPTION, "send kafka failed", null);
        }
    }
}
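With the application running (assuming Spring Boot's default port 8080), a request such as http://localhost:8080/kafka/send?message=hello should return success and write hello to the test topic.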

3. Configuration: Kafka Consumer

1) Declare the configuration class and enable the Kafka listener support via @Configuration and @EnableKafka.

2) Inject the Kafka settings from the application.properties configuration file via @Value.

3) Expose the required objects as beans with @Bean.

package com.kangaroo.sentinel.collect.configuration;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

import java.util.HashMap;
import java.util.Map;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    @Value("${kafka.consumer.servers}")
    private String servers;
    @Value("${kafka.consumer.enable.auto.commit}")
    private boolean enableAutoCommit;
    @Value("${kafka.consumer.session.timeout}")
    private String sessionTimeout;
    @Value("${kafka.consumer.auto.commit.interval}")
    private String autoCommitInterval;
    @Value("${kafka.consumer.group.id}")
    private String groupId;
    @Value("${kafka.consumer.auto.offset.reset}")
    private String autoOffsetReset;
    @Value("${kafka.consumer.concurrency}")
    private int concurrency;

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setConcurrency(concurrency);
        factory.getContainerProperties().setPollTimeout(1500);
        return factory;
    }

    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    public Map<String, Object> consumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, enableAutoCommit);
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, autoCommitInterval);
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, sessionTimeout);
        propsMap.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        propsMap.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, autoOffsetReset);
        return propsMap;
    }

    @Bean
    public Listener listener() {
        return new Listener();
    }
}
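One thing to keep in mind about factory.setConcurrency(concurrency): it creates that many consumers inside the listener container, and Kafka assigns at most one consumer per partition within a group, so with kafka.consumer.concurrency=10 the test topic needs at least 10 partitions for all of them to do useful work.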

new Listener() creates the bean that processes the data read from Kafka. A simple demo implementation of Listener is shown below; it just reads the key and message values and prints them.

The topics attribute of @KafkaListener specifies the Kafka topic name; it must match the topic the producer sends to, i.e. the one passed to kafkaTemplate.send().

package com.kangaroo.sentinel.collect.configuration;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;

public class Listener {
    protected final Logger logger = LoggerFactory.getLogger(this.getClass());

    @KafkaListener(topics = {"test"})
    public void listen(ConsumerRecord<?, ?> record) {
        logger.info("kafka key: " + record.key());
        logger.info("kafka value: " + record.value().toString());
    }
}

Tips

1) I have not described how to install and configure Kafka itself; the best practice is to bind Kafka to a concrete network IP rather than localhost or 127.0.0.1.
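For example, in the broker's server.properties you can bind and advertise the concrete IP. This is a minimal sketch using this article's example address; adjust it to your own environment:

listeners=PLAINTEXT://10.93.21.21:9092
advertised.listeners=PLAINTEXT://10.93.21.21:9092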

2) It is best not to deploy Kafka against the ZooKeeper instance bundled with the Kafka distribution; doing so can cause connectivity problems.

3) In theory the consumer should discover Kafka through ZooKeeper, but here we point it at the Kafka broker addresses directly. This works because the new consumer API (Kafka 0.9+) bootstraps from the brokers and no longer goes through ZooKeeper.

4) When configuring the listener, the GROUP_ID_CONFIG value specifies the consumer group name. If several listener objects share the same group, only one of them receives any given message, as sketched below.
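Here is a hypothetical sketch (not part of the original project) illustrating consumer-group semantics. It assumes a newer spring-kafka version (1.3+) where @KafkaListener supports the groupId attribute; the class, method names, and the group name another-group are made up for illustration:

package com.kangaroo.sentinel.collect.configuration;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;

public class GroupDemoListener {
    protected final Logger logger = LoggerFactory.getLogger(this.getClass());

    // Same group as the main Listener ("test"): each record on the topic is
    // delivered to only ONE member of the group, so this competes with it.
    @KafkaListener(topics = {"test"}, groupId = "test")
    public void listenSameGroup(ConsumerRecord<?, ?> record) {
        logger.info("same group got key: {}", record.key());
    }

    // Different group: receives its own copy of every record, independently
    // of what the "test" group consumes.
    @KafkaListener(topics = {"test"}, groupId = "another-group")
    public void listenOtherGroup(ConsumerRecord<?, ?> record) {
        logger.info("other group got key: {}", record.key());
    }
}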
