In the previous article, Kafka Development in Practice (II): Building the Cluster Environment, we set up a Kafka cluster. In this article we show, through code, how to publish and subscribe to messages.
1. Add Maven Dependency
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.9.0.1</version>
</dependency>
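The consumer demo below parses JSON with Gson, and the PropertyUtils helper closes streams with Commons IO, so those two artifacts are needed as well. A possible addition to the pom (the versions shown are assumptions; any recent release should work):

<!-- used by KafkaConsumerDemo (Gson) and PropertyUtils (Commons IO); versions are assumed -->
<dependency>
    <groupId>com.google.code.gson</groupId>
    <artifactId>gson</artifactId>
    <version>2.6.2</version>
</dependency>
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.4</version>
</dependency>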
The Kafka version used here is 0.9.0.1. The Kafka producer code is shown below.
2. KafkaProducer
package com.ricky.codelab.kafka;

import java.io.IOException;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import com.ricky.codelab.kafka.util.PropertyUtils;

public class KafkaProducerDemo {

    private int total = 1000000;

    public static void main(String[] args) {
        new KafkaProducerDemo().send();
    }

    public void send() {
        long start = System.currentTimeMillis();
        System.out.println("Kafka Producer send msg start, total msgs: " + total);
        // set up the producer
        Producer<String, String> producer = null;
        try {
            Properties props = PropertyUtils.load("producer_config.properties");
            producer = new KafkaProducer<>(props);
            for (int i = 0; i < total; i++) {
                producer.send(new ProducerRecord<String, String>("Hello", String.valueOf(i),
                        String.format("{\"type\":\"test\", \"t\":%d, \"k\":%d}", System.currentTimeMillis(), i)));
                // every so often also send marker messages to a different topic (interval of 1000 is an assumed value)
                if (i % 1000 == 0) {
                    producer.send(new ProducerRecord<String, String>("Test",
                            String.format("{\"type\":\"marker\", \"t\":%d, \"k\":%d}", System.currentTimeMillis(), i)));
                    producer.send(new ProducerRecord<String, String>("Hello",
                            String.format("{\"type\":\"marker\", \"t\":%d, \"k\":%d}", System.currentTimeMillis(), i)));
                    producer.flush();
                    System.out.println("Sent msg number " + i);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (producer != null) {
                producer.close();
            }
        }
        System.out.println("Kafka Producer send msg over, cost time: " + (System.currentTimeMillis() - start) + "ms");
    }
}
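A note on error handling: producer.send() is asynchronous and only returns a Future, so the loop above never learns whether an individual record was acknowledged. If that matters, a Callback can be passed as the second argument. A minimal sketch (key and msg are placeholder values, not part of the original demo; Callback and RecordMetadata come from org.apache.kafka.clients.producer):

// sketch: a send with an acknowledgement callback, usable inside send() above
String key = "42";                          // placeholder key
String msg = "{\"type\":\"test\"}";         // placeholder value
producer.send(new ProducerRecord<String, String>("Hello", key, msg),
        new Callback() {
            @Override
            public void onCompletion(RecordMetadata metadata, Exception e) {
                if (e != null) {
                    e.printStackTrace();    // the send failed after any retries were exhausted
                } else {
                    System.out.println("ack: " + metadata.topic() + "-"
                            + metadata.partition() + " @ offset " + metadata.offset());
                }
            }
        });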
KafkaProducer takes a Properties object at construction time, which carries the Kafka-related configuration. producer_config.properties looks like this:
bootstrap.servers=172.18.19.206:9092,172.18.19.207:9092,172.18.19.208:9092
acks=all
retries=0
batch.size=16384
linger.ms=1
buffer.memory=33554432
auto.commit.interval.ms=1000
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
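The same settings can also be assembled in code instead of loading a .properties file. A minimal sketch using the ProducerConfig constants (org.apache.kafka.clients.producer.ProducerConfig and org.apache.kafka.common.serialization.StringSerializer need to be imported):

// equivalent of producer_config.properties, built programmatically
// note: auto.commit.interval.ms from the file above is a consumer-side setting and is omitted here
Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "172.18.19.206:9092,172.18.19.207:9092,172.18.19.208:9092");
props.put(ProducerConfig.ACKS_CONFIG, "all");
props.put(ProducerConfig.RETRIES_CONFIG, 0);
props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
Producer<String, String> producer = new KafkaProducer<>(props);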
3. KafkaConsumer
package com.ricky.codelab.kafka;

import java.io.IOException;
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;
import com.ricky.codelab.kafka.util.PropertyUtils;

public class KafkaConsumerDemo {

    public static void main(String[] args) {
        new KafkaConsumerDemo().consume();
    }

    public void consume() {
        JsonParser jsonParser = new JsonParser();
        // set up the consumer
        KafkaConsumer<String, String> consumer = null;
        try {
            Properties props = PropertyUtils.load("consumer_config.properties");
            consumer = new KafkaConsumer<>(props);
            // subscribe to topics
            consumer.subscribe(Arrays.asList("Hello", "Test"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    // System.out.printf("offset, %d, key, %s, value, %s",
                    //         record.offset(), record.key(), record.value());
                    switch (record.topic()) {
                        case "Hello":
                            JsonObject jObj = (JsonObject) jsonParser.parse(record.value());
                            switch (jObj.get("type").getAsString()) {
                                case "test":
                                    // latency from the timestamp embedded in the message
                                    long latency = System.currentTimeMillis() - jObj.get("t").getAsLong();
                                    System.out.println(latency);
                                    break;
                                case "marker":
                                    break;
                                default:
                                    break;
                            }
                            break;
                        case "Test":
                            break;
                        default:
                            throw new IllegalStateException("Shouldn't be possible to get message on topic " + record.topic());
                    }
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (consumer != null) {
                consumer.close();
            }
        }
    }
}
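The poll loop above runs forever. If a clean shutdown is needed, the usual pattern with this client is to call consumer.wakeup() from another thread, which makes the blocked poll() throw a WakeupException. A sketch of how the loop could be wrapped (the shutdown-hook approach is an assumption, not part of the original demo):

// sketch: stop the poll loop cleanly when the JVM shuts down
final KafkaConsumer<String, String> c = consumer;
Runtime.getRuntime().addShutdownHook(new Thread() {
    @Override
    public void run() {
        c.wakeup(); // causes the blocked poll() to throw WakeupException
    }
});
try {
    while (true) {
        ConsumerRecords<String, String> records = c.poll(100);
        // ... process records as in the demo above ...
    }
} catch (org.apache.kafka.common.errors.WakeupException e) {
    // expected during shutdown; fall through to close()
} finally {
    c.close();
}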
KafkaConsumer likewise takes a java.util.Properties object at construction time to receive its Kafka-related configuration. consumer_config.properties looks like this:
bootstrap.servers=172.18.19.206:9092,172.18.19.207:9092,172.18.19.208:9092
group.id=test
enable.auto.commit=true
auto.commit.interval.ms=1000
session.timeout.ms=30000
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
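With enable.auto.commit=true the consumer commits offsets in the background every auto.commit.interval.ms, so a crash can skip messages that were polled but not yet processed. If offsets should only advance after processing, set enable.auto.commit=false and commit explicitly. A minimal sketch (process() is a hypothetical handler, not part of the demo):

// assumes enable.auto.commit=false in consumer_config.properties
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(100);
    for (ConsumerRecord<String, String> record : records) {
        process(record); // hypothetical per-record processing
    }
    consumer.commitSync(); // commit offsets only after the batch was processed
}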
Both the producer and consumer demos use the PropertyUtils class to turn a .properties file into a java.util.Properties object. Its code is as follows:
package com.ricky.codelab.kafka.util;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import org.apache.commons.io.IOUtils;

public class PropertyUtils {

    private PropertyUtils() {
    }

    // load a .properties file from the filesystem
    public static Properties load(File file) throws IOException {
        InputStream in = null;
        try {
            in = new FileInputStream(file);
            Properties props = new Properties();
            props.load(in);
            return props;
        } finally {
            IOUtils.closeQuietly(in);
        }
    }

    // load a .properties file from the classpath
    public static Properties load(String path) throws IOException {
        InputStream in = null;
        try {
            in = PropertyUtils.class.getClassLoader().getResourceAsStream(path);
            Properties props = new Properties();
            props.load(in);
            return props;
        } finally {
            IOUtils.closeQuietly(in);
        }
    }
}
4. Package and Run
Package KafkaProducerDemo.java and KafkaConsumerDemo.java into separate runnable CLI jars (KafkaProducer.jar and KafkaConsumer.jar), then start KafkaProducer.jar and KafkaConsumer.jar; you will see the output in the console.
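One way to produce such runnable jars is the maven-shade-plugin. A sketch of a possible configuration for the producer jar (the plugin version and single-main-class wiring are assumptions; adjust them to your build):

<!-- sketch: bundle dependencies and set the Main-Class for the producer jar -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.ricky.codelab.kafka.KafkaProducerDemo</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>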
Resources:
KafkaProducer Javadoc: http://kafka.apache.org/090/javadoc/index.html?org/apache/kafka/clients/producer/KafkaProducer.html
KafkaConsumer Javadoc: http://kafka.apache.org/090/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaConsumer.html