Kafka in Detail, Part Three: Developing Kafka Applications


I. Overview

As we know, a Kafka system has three main components: producers, consumers, and brokers.
Producers produce messages and push them to brokers; consumers pull messages from brokers and consume them.
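The push/pull flow above can be sketched with a toy in-memory "broker": the producer pushes messages into a queue, and the consumer pulls them out at its own pace. This is only a conceptual illustration; none of these class or method names come from Kafka, and a real broker persists messages to disk and serves many topics and partitions.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Conceptual sketch of Kafka's model: producer pushes, consumer pulls.
public class PushPullSketch {
    // The "broker": buffers messages between producer and consumer
    static BlockingQueue<String> broker = new LinkedBlockingQueue<String>();

    // Producer side: push a message to the broker
    static void produce(String msg) {
        broker.offer(msg);
    }

    // Consumer side: pull the next message from the broker (null if empty)
    static String consume() {
        return broker.poll();
    }

    public static void main(String[] args) {
        produce("send a message to broker");
        System.out.println(consume());
    }
}
```

The key point of the sketch is the decoupling: the producer never talks to the consumer directly, and the consumer decides when to pull.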

II. Developing a Producer Application

A producer generates messages and pushes them to a Kafka broker. Producers can be many kinds of application: web applications, server-side applications, proxy applications, logging systems, and more.

Producers have implementations in a variety of languages, such as Java, C, and Python. Let's look at the role of the producer in Kafka:

[Figure: the role of the producer in Kafka]

2.1 The Kafka Producer API

Kafka's producer-related API has three classes:

    • Producer: the most basic class, used to create and push messages.
    • KeyedMessage: defines the message object to send, i.e. which topic it goes to, the partition key, and the message payload.
    • ProducerConfig: configures the producer, e.g. the list of brokers to connect to, the partitioner class, the serializer class, the partition key, and so on.
2.2 Below we write the simplest possible producer: it generates one message and pushes it to the broker.
package bonree.producer;

import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

/*****************************************************************************
 * SimpleProducer.java created on 2014-7-8
 * Author: <a href=mailto:[email protected]>houda</a>
 * @Title: SimpleProducer.java
 * @Package bonree.producer
 * Description:
 * Version: 1.0
 *****************************************************************************/
public class SimpleProducer {
    private static Producer<Integer, String> producer;
    private final Properties props = new Properties();

    public SimpleProducer() {
        // Define the broker list to connect to
        props.put("metadata.broker.list", "192.168.4.31:9092");
        // Define the serializer class (Java objects must be serialized before transmission)
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        producer = new Producer<Integer, String>(new ProducerConfig(props));
    }

    public static void main(String[] args) {
        SimpleProducer sp = new SimpleProducer();
        // Define the topic
        String topic = "mytopic";
        // Define the message to send to the topic
        String messageStr = "send a message to broker";
        // Build the message object
        KeyedMessage<Integer, String> data = new KeyedMessage<Integer, String>(topic, messageStr);
        // Push the message to the broker
        producer.send(data);
        producer.close();
    }
}
III. Developing a Consumer Application

A consumer consumes the messages that producers generate, and it too can be many kinds of application: a real-time analytics system, a data warehouse, or a solution based on the publish/subscribe model. As with producers, consumers are available in multiple languages, such as Java, C, and Python.

Let's look at the role of the consumer in Kafka.

3.1 The Kafka Consumer API

The consumer side differs slightly from the producer side: Kafka provides two kinds of consumer APIs.

    • High-level consumer API: provides an abstraction over the low-level API and is simpler to use.
    • Simple consumer API: lets the application work against the low-level API directly, providing much more control, but it is also more complicated to use.
Since this is our first application, we use the high-level API. Its key feature is that after each message is consumed, it automatically advances the offset to the next message; offsets are described separately in a later section. Similar to the producer, there are three basic classes associated with the consumer:
    • KafkaStream: the stream of messages produced by producers, from which the consumer reads.
    • ConsumerConfig: defines the configuration for connecting to ZooKeeper (Kafka balances load through ZooKeeper; see the earlier articles for details), e.g. the ZooKeeper URL, the group ID, the ZooKeeper session timeout, and so on.
    • ConsumerConnector: responsible for connecting to ZooKeeper and related work.
3.2 Below we write the simplest possible consumer: it consumes messages from the broker.
package bonree.consumer;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

/*****************************************************************************
 * Created on 2014-7-8
 * Author: <a href=mailto:[email protected]>houda</a>
 * @Title: SimpleHLConsumer.java
 * @Package bonree.consumer
 * Description:
 * Version: 1.0
 *****************************************************************************/
public class SimpleHLConsumer {
    private final ConsumerConnector consumer;
    private final String topic;

    public SimpleHLConsumer(String zookeeper, String groupId, String topic) {
        Properties props = new Properties();
        // Define the ZooKeeper connection info
        props.put("zookeeper.connect", zookeeper);
        // Define the consumer's group ID (group IDs are introduced later)
        props.put("group.id", groupId);
        props.put("zookeeper.session.timeout.ms", "400");
        props.put("zookeeper.sync.time.ms", "200");
        props.put("auto.commit.interval.ms", "1000");
        consumer = Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
        this.topic = topic;
    }

    public void testConsumer() {
        Map<String, Integer> topicCount = new HashMap<String, Integer>();
        // Define the number of streams to subscribe to for the topic
        topicCount.put(topic, new Integer(1));
        // Returns a map from each topic to its message streams
        Map<String, List<KafkaStream<byte[], byte[]>>> consumerStreams = consumer.createMessageStreams(topicCount);
        // Take the message streams for the topic we need
        List<KafkaStream<byte[], byte[]>> streams = consumerStreams.get(topic);
        for (final KafkaStream stream : streams) {
            ConsumerIterator<byte[], byte[]> consumerIte = stream.iterator();
            while (consumerIte.hasNext())
                System.out.println("Message from Single Topic::" + new String(consumerIte.next().message()));
        }
        if (consumer != null)
            consumer.shutdown();
    }

    public static void main(String[] args) {
        String topic = "mytopic";
        SimpleHLConsumer simpleHLConsumer = new SimpleHLConsumer("192.168.4.32:2181", "testgroup", topic);
        simpleHLConsumer.testConsumer();
    }
}
IV. Viewing the Results

First start the server-side processes:
    • Start ZooKeeper: [email protected] kafka-0.8]# bin/zookeeper-server-start.sh config/zookeeper.properties
    • Start the Kafka broker: [email protected] kafka-0.8]# bin/kafka-server-start.sh config/server.properties
Then run the applications we wrote:
    • Run the main function of the SimpleHLConsumer class we just wrote; it waits for the producer to produce messages.
    • Run the main function of SimpleProducer to produce a message and push it to the broker.

Result: after running SimpleProducer, the message it produced, "send a message to broker", appears in the console of SimpleHLConsumer.


Copyright notice: This is an original blog article and may not be reproduced without the author's consent.

