Kafka ---- Kafka API (Java version)

Source: Internet
Author: User
Tags: kafka streams


Apache Kafka includes new Java clients that will eventually replace the existing Scala clients, which will remain for a while for compatibility. The new clients are available through separate jar packages with few dependencies, and the old Scala clients still exist.

I. Producer API

We encourage all new development to use the new Java producer. This client has been tested in production and is generally both faster and more fully featured than the previous Scala client. You can use it by adding a dependency on the client jar with the following Maven configuration:

        <dependency>
            <groupId>org.apache.kafka</groupId>
            <artifactId>kafka-clients</artifactId>
            <version>0.10.0.0</version>
        </dependency>
 
See the javadocs for examples showing how to use the producer.
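As an illustration, here is a minimal sketch of sending a few messages with the new producer. The broker address localhost:9092 and the topic name my-topic are assumptions made only for this example:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(props);
        for (int i = 0; i < 10; i++) {
            // "my-topic" is a hypothetical topic used only for illustration
            producer.send(new ProducerRecord<>("my-topic", Integer.toString(i), "message-" + i));
        }
        producer.close();
    }
}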
 
II. Consumer API
In the 0.9.0 release, a new Java consumer was added to replace the existing high-level ZooKeeper-based consumer and the low-level consumer APIs. This client is considered a beta version. To ensure a smooth upgrade, we will continue to maintain the 0.8 consumer clients, which still work against 0.9 Kafka clusters. In the following sections we introduce both the old 0.8 consumer APIs (the high-level ConsumerConnector and the low-level SimpleConsumer) and the new Java consumer API.
 
1. Old High Level Consumer API
class Consumer {
  /**
   * Create a ConsumerConnector.
   *
   * @param config at the minimum, needs to specify the group.id of the consumer
   *               and the zookeeper connection string zookeeper.connect
   */
  public static kafka.javaapi.consumer.ConsumerConnector createJavaConsumerConnector(ConsumerConfig config);
}
 
  /**
   * V: type of the message
   * K: type of the optional key associated with the message
   */
  public interface kafka.javaapi.consumer.ConsumerConnector {
    /**
     * Create a list of message streams of type T for each topic.
     *
     * @param topicCountMap a map of (topic, #streams) pairs
     * @param keyDecoder a decoder that converts the message key to type K
     * @param valueDecoder a decoder that converts the message to type V
     * @return a map of (topic, list of KafkaStream) pairs.
     *         The number of items in the list is #streams. Each stream supports
     *         an iterator over message/metadata pairs.
     */
    public <K,V> Map<String, List<KafkaStream<K,V>>>
      createMessageStreams(Map<String, Integer> topicCountMap, Decoder<K> keyDecoder, Decoder<V> valueDecoder);
 
    /**
     * Create a list of message streams of type T for each topic, using the default decoder.
     */
    public Map<String, List<KafkaStream<byte[], byte[]>>> createMessageStreams(Map<String, Integer> topicCountMap);

    /**
     * Create a list of message streams for topics matching a wildcard.
     *
     * @param topicFilter a TopicFilter that specifies which topics to
     *                    subscribe to (encapsulates a whitelist or a blacklist)
     * @param numStreams the number of message streams to return
     * @param keyDecoder a decoder that decodes the message key
     * @param valueDecoder a decoder that decodes the message itself
     * @return a list of KafkaStream. Each stream supports an
     *         iterator over its MessageAndMetadata elements.
     */
    public <K,V> List<KafkaStream<K,V>>
      createMessageStreamsByFilter(TopicFilter topicFilter, int numStreams, Decoder<K> keyDecoder, Decoder<V> valueDecoder);
 
    /**
     * Create a list of message streams for topics matching a wildcard, using the default decoder.
     */
    public List<KafkaStream<byte[], byte[]>> createMessageStreamsByFilter(TopicFilter topicFilter, int numStreams);

    /**
     * Create a list of message streams for topics matching a wildcard, using the default decoder, with one stream.
     */
    public List<KafkaStream<byte[], byte[]>> createMessageStreamsByFilter(TopicFilter topicFilter);

    /**
     * Commit the offsets of all topics/partitions connected by this connector.
     */
    public void commitOffsets();

    /**
     * Shut down the connector.
     */
    public void shutdown();
  }
 
     
You can use the example below to learn how to use the high level consumer API.
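A minimal sketch of that usage pattern follows. The ZooKeeper address localhost:2181, the group id test-group, and the topic my-topic are assumptions made only for illustration:

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class OldHighLevelConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181"); // assumed ZooKeeper address
        props.put("group.id", "test-group");              // hypothetical consumer group

        ConsumerConnector connector =
            Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // request one stream for the hypothetical topic "my-topic"
        Map<String, Integer> topicCountMap = new HashMap<>();
        topicCountMap.put("my-topic", 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
            connector.createMessageStreams(topicCountMap);

        // iterate over the single stream and print each message payload
        ConsumerIterator<byte[], byte[]> it = streams.get("my-topic").get(0).iterator();
        while (it.hasNext()) {
            System.out.println(new String(it.next().message()));
        }
        connector.shutdown();
    }
}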
2. Old Simple Consumer API
class kafka.javaapi.consumer.SimpleConsumer {
  /**
   * Fetch a set of messages from a topic.
   *
   * @param request specifies the topic name, topic partition, starting byte offset, maximum bytes to be fetched
   * @return a set of fetched messages
   */
  public FetchResponse fetch(kafka.javaapi.FetchRequest request);

  /**
   * Fetch metadata for a sequence of topics.
   *
   * @param request specifies the versionId, clientId, sequence of topics
   * @return metadata for each topic in the request
   */
  public kafka.javaapi.TopicMetadataResponse send(kafka.javaapi.TopicMetadataRequest request);

  /**
   * Get a list of valid offsets (up to maxSize) before the given time.
   *
   * @param request a [[kafka.javaapi.OffsetRequest]] object
   * @return a [[kafka.javaapi.OffsetResponse]] object
   */
  public kafka.javaapi.OffsetResponse getOffsetsBefore(OffsetRequest request);

  /**
   * Close the SimpleConsumer.
   */
  public void close();
}
 
For most applications, the high level consumer API is sufficient, but some applications need features that the high level consumer interface does not yet offer (for example, setting the initial offset when the consumer restarts). These applications can use the low level SimpleConsumer API instead. The logic is somewhat more complicated; you can learn it from the sketch that follows.
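The following sketch fetches messages directly from one partition. The broker host localhost:9092, the client id simple-client, the topic my-topic, and partition 0 are assumptions made only for illustration; a real application would first look up the partition leader and check the FetchResponse for errors:

import java.nio.ByteBuffer;

import kafka.api.FetchRequest;
import kafka.api.FetchRequestBuilder;
import kafka.javaapi.FetchResponse;
import kafka.javaapi.consumer.SimpleConsumer;
import kafka.message.MessageAndOffset;

public class OldSimpleConsumerExample {
    public static void main(String[] args) throws Exception {
        // host, port, socket timeout, buffer size, client id (all placeholder values)
        SimpleConsumer consumer =
            new SimpleConsumer("localhost", 9092, 100000, 64 * 1024, "simple-client");

        // fetch from the hypothetical topic "my-topic", partition 0, starting at offset 0
        FetchRequest request = new FetchRequestBuilder()
            .clientId("simple-client")
            .addFetch("my-topic", 0, 0L, 100000)
            .build();
        FetchResponse response = consumer.fetch(request);

        // print offset and payload of each fetched message
        for (MessageAndOffset messageAndOffset : response.messageSet("my-topic", 0)) {
            ByteBuffer payload = messageAndOffset.message().payload();
            byte[] bytes = new byte[payload.limit()];
            payload.get(bytes);
            System.out.println(messageAndOffset.offset() + ": " + new String(bytes, "UTF-8"));
        }
        consumer.close();
    }
}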
 
3. New Consumer API
 
The new consumer API is unified: the distinction between the high-level and low-level consumer APIs of version 0.8 no longer exists. You can use the new consumer API by adding a dependency on the client jar with the following Maven configuration:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.0.0</version>
</dependency>
Examples showing how to use the consumer are given in the javadocs.
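For example, a minimal sketch of the new consumer polling a single topic might look like this (the broker address localhost:9092, group id test-group, and topic my-topic are assumptions for illustration):

import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class NewConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "test-group");              // hypothetical consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("my-topic")); // hypothetical topic

        try {
            // poll in a loop and print each record as it arrives
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(100);
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset = %d, key = %s, value = %s%n",
                        record.offset(), record.key(), record.value());
                }
            }
        } finally {
            consumer.close();
        }
    }
}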
 
III. Streams API
In the 0.10.0 release, a new client library, Kafka Streams, was added to support stream processing applications. The Kafka Streams library is considered alpha quality, and its public APIs may change in the future. You can use Kafka Streams by adding a dependency on the streams jar with the following Maven configuration:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>0.10.0.0</version>
</dependency>
The javadocs show how to use this library (note that these classes are unstable and may be modified in future versions).
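As a rough illustration only, here is a sketch of a simple topology that upper-cases each record value. The application id, broker and ZooKeeper addresses, and the topic names input-topic and output-topic are assumptions made for this example:

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

public class StreamsExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-example");   // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(StreamsConfig.ZOOKEEPER_CONNECT_CONFIG, "localhost:2181"); // assumed ZooKeeper address
        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

        KStreamBuilder builder = new KStreamBuilder();
        // read from "input-topic", upper-case each value, and write to "output-topic"
        KStream<String, String> source = builder.stream("input-topic");
        source.mapValues(value -> value.toUpperCase()).to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder, props);
        streams.start();
    }
}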