The Kafka version I am using is 0.7.2, and the JDK version is 1.6.0_20.
The official quickstart at http://kafka.apache.org/07/quickstart.html is not complete, so the code below is what I filled in myself; it compiles and runs.
Producer Code
import java.util.*;
import kafka.message.Message;
import kafka.producer.ProducerConfig;
import kafka.javaapi.producer.Producer;
import kafka.javaapi.producer.ProducerData;

public class ProducerSample {
    public static void main(String[] args) {
        ProducerSample ps = new ProducerSample();

        Properties props = new Properties();
        props.put("zk.connect", "127.0.0.1:2181");
        props.put("serializer.class", "kafka.serializer.StringEncoder");

        ProducerConfig config = new ProducerConfig(props);
        Producer<String, String> producer = new Producer<String, String>(config);
        ProducerData<String, String> data =
                new ProducerData<String, String>("test-topic", "test-message2");
        producer.send(data);
        producer.close();
    }
}

Consumer Code

import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.Message;
import kafka.message.MessageAndMetadata;

public class ConsumerSample {
    public static void main(String[] args) {
        // Specify some consumer properties
        Properties props = new Properties();
        props.put("zk.connect", "localhost:2181");
        props.put("zk.connectiontimeout.ms", "1000000");
        props.put("groupid", "test_group");

        // Create the connection to the cluster
        ConsumerConfig consumerConfig = new ConsumerConfig(props);
        ConsumerConnector consumerConnector = Consumer.createJavaConsumerConnector(consumerConfig);

        // Create 4 partitions of the stream for topic "test-topic", to allow 4 threads to consume
        HashMap<String, Integer> map = new HashMap<String, Integer>();
        map.put("test-topic", 4);
        Map<String, List<KafkaStream<Message>>> topicMessageStreams =
                consumerConnector.createMessageStreams(map);
        List<KafkaStream<Message>> streams = topicMessageStreams.get("test-topic");

        // Create a list of 4 threads to consume from each of the partitions
        ExecutorService executor = Executors.newFixedThreadPool(4);

        // Consume the messages in the threads
        for (final KafkaStream<Message> stream : streams) {
            executor.submit(new Runnable() {
                public void run() {
                    for (MessageAndMetadata msgAndMetadata : stream) {
                        // Process message (msgAndMetadata.message())
                        System.out.println("topic: " + msgAndMetadata.topic());
                        Message message = (Message) msgAndMetadata.message();
                        ByteBuffer buffer = message.payload();
                        byte[] bytes = new byte[message.payloadSize()];
                        buffer.get(bytes);
                        String tmp = new String(bytes);
                        System.out.println("message content: " + tmp);
                    }
                }
            });
        }
    }
}
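The consumer above never exits on its own: each worker thread blocks on its KafkaStream iterator. A minimal shutdown sketch is below; it is my own addition rather than part of the original example, and the ConsumerShutdown class name and the 5-second timeout are assumptions.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.TimeUnit;
import kafka.javaapi.consumer.ConsumerConnector;

public class ConsumerShutdown {
    // Stop the connector first so the stream iterators end and the consumer
    // threads fall out of their loops, then wait briefly for the pool to drain.
    public static void shutdown(ConsumerConnector connector, ExecutorService executor) {
        connector.shutdown();   // closes the fetchers and the ZooKeeper session
        executor.shutdown();    // stop accepting new tasks
        try {
            executor.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}

Calling ConsumerShutdown.shutdown(consumerConnector, executor) at the end of ConsumerSample.main (or from a JVM shutdown hook) lets the program terminate cleanly instead of hanging.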
After starting ZooKeeper and the Kafka server separately, run the Producer and Consumer code.
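Before running the samples it is worth confirming that both services are actually listening. The small check below is entirely my own addition; it assumes ZooKeeper on 127.0.0.1:2181, matching zk.connect in the code above, and the broker on port 9092, the default in config/server.properties.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    // Returns true if a TCP connection to host:port can be opened within one second.
    static boolean isListening(String host, int port) {
        Socket socket = new Socket();
        try {
            socket.connect(new InetSocketAddress(host, port), 1000);
            return true;
        } catch (IOException e) {
            return false;
        } finally {
            try { socket.close(); } catch (IOException ignored) { }
        }
    }

    public static void main(String[] args) {
        System.out.println("ZooKeeper reachable: " + isListening("127.0.0.1", 2181));
        System.out.println("Kafka broker reachable: " + isListening("127.0.0.1", 9092));
    }
}

If either line prints false, fix the server before looking for bugs in the producer or consumer code.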
Run ProducerSample:
Run ConsumerSample:
Examples of Kafka's Producer and Consumer