A recent project needed Kafka as its messaging middleware, so here is a brief record of the setup to make it easy to reuse in other projects later.
Introducing Dependencies
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
Configuration file
kafka.consumer.servers=127.0.0.1:9092
kafka.consumer.enable.auto.commit=true
kafka.consumer.session.timeout=6000
kafka.consumer.auto.commit.interval=100
kafka.consumer.auto.offset.reset=latest
kafka.consumer.group.id=kafka-test-group
kafka.consumer.concurrency=10
kafka.producer.servers=127.0.0.1:9092
kafka.producer.retries=1
kafka.producer.batch.size=4096
kafka.producer.linger=1
kafka.producer.buffer.memory=40960
Producer Configuration Class
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
@EnableKafka
public class KafkaProducerConfig {

    // Producer settings injected from the configuration file above
    @Value("${kafka.producer.servers}")
    private String servers;
    @Value("${kafka.producer.retries}")
    private int retries;
    @Value("${kafka.producer.batch.size}")
    private int batchSize;
    @Value("${kafka.producer.linger}")
    private int linger;
    @Value("${kafka.producer.buffer.memory}")
    private int bufferMemory;

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        props.put(ProducerConfig.RETRIES_CONFIG, retries);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, batchSize);
        props.put(ProducerConfig.LINGER_MS_CONFIG, linger);
        props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, bufferMemory);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }
}
Consumer Configuration Class
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    // Consumer settings injected from the configuration file above
    @Value("${kafka.consumer.servers}")
    private String servers;
    @Value("${kafka.consumer.enable.auto.commit}")
    private boolean enableAutoCommit;
    @Value("${kafka.consumer.session.timeout}")
    private String sessionTimeout;
    @Value("${kafka.consumer.auto.commit.interval}")
    private String autoCommitInterval;
    @Value("${kafka.consumer.group.id}")
    private String groupId;
    @Value("${kafka.consumer.auto.offset.reset}")
    private String autoOffsetReset;
    @Value("${kafka.consumer.concurrency}")
    private int concurrency;

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setConcurrency(concurrency);
        factory.getContainerProperties().setPollTimeout(1500);
        return factory;
    }

    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    public Map<String, Object> consumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>(8);
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, enableAutoCommit);
        propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, autoCommitInterval);
        propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, sessionTimeout);
        propsMap.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        propsMap.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, autoOffsetReset);
        return propsMap;
    }
}
Producer Class
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class KafkaProducer {

    private Logger logger = LoggerFactory.getLogger(getClass());

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String topic, String message) {
        logger.info("on message:{}", message);
        kafkaTemplate.send(topic, message);
    }
}
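To exercise the producer manually, something like the following REST endpoint can be used. This is only a sketch: the controller name, the mapping, and the presence of spring-boot-starter-web are assumptions and not part of the original setup.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical helper for manual testing: POST /kafka/send?message=hello
@RestController
public class KafkaSendController {

    @Autowired
    private KafkaProducer kafkaProducer;

    @PostMapping("/kafka/send")
    public String send(@RequestParam String message) {
        // Delegates to the KafkaProducer component above, publishing to the listener's topic
        kafkaProducer.sendMessage("test-topic", message);
        return "sent";
    }
}

With the consumer below running in the same application, a message sent through this endpoint should show up in the listener's log.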
Consumer Class
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class VideoCosConsumer {

    protected final Logger logger = LoggerFactory.getLogger(this.getClass());

    @KafkaListener(topics = {"test-topic"})
    public void consumerMessage(String message) {
        logger.info("on message:{}", message);
    }
}
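If the listener also needs record metadata (partition, offset, key), the method can accept the full ConsumerRecord instead of just the String payload. The class below is a hypothetical variant, not part of the original code; it would replace the listener above rather than run alongside it, since both would join the same consumer group on the same topic.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Hypothetical variant that logs record metadata as well as the payload
@Component
public class VideoCosRecordConsumer {

    private final Logger logger = LoggerFactory.getLogger(this.getClass());

    @KafkaListener(topics = {"test-topic"})
    public void consumerRecord(ConsumerRecord<String, String> record) {
        logger.info("topic:{} partition:{} offset:{} message:{}",
                record.topic(), record.partition(), record.offset(), record.value());
    }
}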
That is the whole process of integrating Kafka with Spring Cloud. Spring leaves us "code porters" with less and less to do these days: there is hardly any copy-and-paste work left, we simply assemble the required classes.
Spring Boot Integration Kafka