Kafka-2.11 Study Notes (iii): Accessing Kafka from the Java API

Source: Internet
Author: User
Tags: zookeeper

Welcome to Ruchunli's work notes. Learning is a faith that lets time test the strength of persistence.


Kafka is implemented in Scala, but it also provides a Java API.
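Before the producer example below, one point worth understanding is how a message finds its partition. The sample producer sends unkeyed messages, but when a key is supplied, the old client's default partitioner routes each key deterministically, roughly as hash(key) mod numPartitions. The class and method names in this standalone sketch are illustrative only, not part of Kafka:

```java
// Illustrative sketch of key-based partitioning (NOT actual Kafka source):
// the same key always maps to the same partition, so ordering per key is
// preserved within that partition. Unkeyed messages are instead assigned a
// partition by the producer itself.
public class PartitionSketch {
    // Hypothetical helper: choose a partition index for a key.
    static int partitionFor(String key, int numPartitions) {
        // Math.abs alone is unsafe for Integer.MIN_VALUE, so mask the sign bit.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("value_1", 3);
        int p2 = partitionFor("value_1", 3);
        System.out.println(p1 == p2);          // prints true: same key, same partition
        System.out.println(p1 >= 0 && p1 < 3); // prints true: always in range
    }
}
```

This is why two messages with the same key are consumed in order, while messages with different keys may interleave.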

Java-implemented message producers

package com.lucl.kafka.simple;

import java.util.Properties;

import org.apache.log4j.Logger;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

/**
 * <p>Copyright: Copyright (c) 2015</p>
 * <p>Date: 2015-11-17 21:42:50</p>
 * <p>Description: Java API for Kafka producer</p>
 *
 * @author luchunli
 * @version 1.0
 */
public class SimpleKafkaProducer {
    private static final Logger logger = Logger.getLogger(SimpleKafkaProducer.class);

    private void execMsgSend() {
        Properties props = new Properties();
        props.put("metadata.broker.list", "192.168.137.117:9092");
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        props.put("key.serializer.class", "kafka.serializer.StringEncoder");
        props.put("request.required.acks", "0");

        ProducerConfig config = new ProducerConfig(props);
        logger.info("Set config info (" + config + ") ok.");

        Producer<String, String> producer = new Producer<>(config);

        String topic = "mytopic";
        for (int i = 1; i <= 10; i++) {
            String value = "value_" + i;
            KeyedMessage<String, String> msg = new KeyedMessage<String, String>(topic, value);
            producer.send(msg);
        }
        logger.info("Send message over.");

        producer.close();
    }

    /**
     * @param args
     */
    public static void main(String[] args) {
        SimpleKafkaProducer simpleProducer = new SimpleKafkaProducer();
        simpleProducer.execMsgSend();
    }
}

At this point, a consumer started in console mode can see the data produced by the producer being consumed:

[[email protected] kafka0.8.2.1]$ bin/kafka-console-consumer.sh --zookeeper nnode:2181,dnode1:2181,dnode2:2181 --topic mytopic --from-beginning
hello World
this is my first message
value_1
value_2
value_3
value_4
value_5
value_6
value_7
value_8
value_9
value_10


Java-implemented message consumers

package com.lucl.kafka.simple;

import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.log4j.Logger;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;
import kafka.serializer.Decoder;
import kafka.serializer.StringDecoder;
import kafka.utils.VerifiableProperties;

/**
 * <p>Copyright: Copyright (c) 2015</p>
 * <p>Date: 2015-11-17 21:42:50</p>
 * <p>Description: Java API for Kafka consumer</p>
 *
 * @author luchunli
 * @version 1.0
 */
public class SimpleKafkaConsumer {
    private static final Logger logger = Logger.getLogger(SimpleKafkaConsumer.class);

    private void execMsgConsume() {
        Properties props = new Properties();
        props.put("zookeeper.connect", "nnode:2181,dnode1:2181,dnode2:2181");
        props.put("group.id", "group-1");
        props.put("serializer.class", "kafka.serializer.StringEncoder");

        ConsumerConfig config = new ConsumerConfig(props);
        ConsumerConnector consumer = Consumer.createJavaConsumerConnector(config);

        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        topicCountMap.put("mytopic", 1);

        Decoder<String> keyDecoder = new StringDecoder(new VerifiableProperties());
        Decoder<String> valueDecoder = new StringDecoder(new VerifiableProperties());

        Map<String, List<KafkaStream<String, String>>> createMessageStreams =
                consumer.createMessageStreams(topicCountMap, keyDecoder, valueDecoder);

        for (Iterator<String> it = createMessageStreams.keySet().iterator(); it.hasNext(); ) {
            String key = it.next();
            logger.info("The key of the createMessageStreams is " + key);
            List<KafkaStream<String, String>> values = createMessageStreams.get(key);
            for (KafkaStream<String, String> value : values) {
                ConsumerIterator<String, String> consumerIt = value.iterator();
                while (consumerIt.hasNext()) {
                    MessageAndMetadata<String, String> data = consumerIt.next();
                    logger.info("The message got by consumer is " + data.message());
                }
            }
        }
    }

    /**
     * @param args
     */
    public static void main(String[] args) {
        SimpleKafkaConsumer simpleConsumer = new SimpleKafkaConsumer();
        simpleConsumer.execMsgConsume();
    }
}
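In the consumer above, the call putting "mytopic" with a count of 1 into topicCountMap requests one KafkaStream for that topic; asking for more streams spreads the topic's partitions across them. As a simplified intuition only (the old high-level consumer actually defaults to a range assignor, and the names here are made up for illustration), distributing N partitions over K streams can be sketched round-robin:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified sketch: distribute partition ids over the requested number of
// streams, round-robin. This is NOT Kafka's actual assignment code; it only
// illustrates why requesting more streams than there are partitions leaves
// some streams permanently idle.
public class StreamAssignmentSketch {
    static List<List<Integer>> assign(int numPartitions, int numStreams) {
        List<List<Integer>> streams = new ArrayList<>();
        for (int s = 0; s < numStreams; s++) {
            streams.add(new ArrayList<Integer>());
        }
        for (int p = 0; p < numPartitions; p++) {
            streams.get(p % numStreams).add(p); // partition p -> stream p mod K
        }
        return streams;
    }

    public static void main(String[] args) {
        // 3 partitions over 2 streams: stream 0 gets [0, 2], stream 1 gets [1].
        System.out.println(assign(3, 2)); // prints [[0, 2], [1]]
    }
}
```

Since "mytopic" in these notes has a single partition, requesting more than one stream would gain nothing: only one stream would ever receive messages.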

Start the consumer program first, then start the producer program; the consumer output reads as follows:

23:37:30,411  INFO SimpleKafkaConsumer:55 - The key of the createMessageStreams is mytopic
23:37:30,433  INFO VerifiableProperties:68 - Verifying properties
23:37:30,433  INFO VerifiableProperties:68 - Property client.id is overridden to group-1
23:37:30,433  INFO VerifiableProperties:68 - Property metadata.broker.list is overridden to nnode:9092
23:37:30,433  INFO VerifiableProperties:68 - Property request.timeout.ms is overridden to 30000
23:37:30,451  INFO ClientUtils$:68 - Fetching metadata from broker id:117,host:nnode,port:9092 with correlation id 0 for 1 topic(s) Set(mytopic)
23:37:30,453  INFO SyncProducer:68 - Connected to nnode:9092 for producing
23:37:30,486  INFO SyncProducer:68 - Disconnecting from nnode:9092
23:37:30,528  INFO ConsumerFetcherThread:68 - [ConsumerFetcherThread-group-1_luchunlipc-1447947448911-f949268d-0-117], Starting
23:37:30,546  INFO ConsumerFetcherManager:68 - [ConsumerFetcherManager-1447947449115] Added fetcher for partitions ArrayBuffer([[mytopic,0], initOffset -1 to broker id:117,host:nnode,port:9092])
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_1
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_2
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_3
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_4
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_5
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_6
23:37:52,466  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_7
23:37:52,469  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_8
23:37:52,469  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_9
23:37:52,469  INFO SimpleKafkaConsumer:61 - The message got by consumer is value_10
23:39:11,351  INFO ClientCnxn:1096 - Client session timed out, have not heard from server in 4000ms for sessionid 0x3512026596f0001, closing socket connection and attempting reconnect
23:39:11,452  INFO ZkClient:449 - zookeeper state changed (Disconnected)


This article is from the "World of Stuffy Gourd" blog; please keep this source: http://luchunli.blog.51cto.com/2368057/1714857

