The Kafka cluster (pseudo-distributed) is already deployed; what follows covers setting up the Java development environment.
I. Environment
1. Win10, Eclipse (Kepler)
2. A virtual machine running CentOS 6.5, IP 192.168.136.134
3. A pseudo-distributed ZooKeeper ensemble deployed on 134: 192.168.136.134:2181, 192.168.136.134:2182, 192.168.136.134:2183
4. A pseudo-distributed Kafka broker cluster deployed on 134: 192.168.136.134:9092, 192.168.136.134:9093, 192.168.136.134:9094,
with broker IDs 1, 2, and 3 respectively
II. Objectives
1. Build a project in Eclipse, import the dependent libraries, compile cleanly, and implement a working producer and consumer.
2. Debug in Eclipse: the local machine connects to the virtual machine's ZooKeeper ensemble and Kafka broker cluster.
3. Package the program so it can be deployed to virtual machine 134 and run there as a client connecting to the virtual machine's ZooKeeper ensemble and Kafka broker cluster.
III. Build process
The whole process involved some detours: much of the material on the web is tied to particular versions and tends to use Maven, but nowadays you can just as well develop with a plain Java project, so pick whichever you prefer.
In addition, the latest API is simpler to use and reportedly more efficient, so figuring this out was a painful bit of trial and error. Hopefully this article saves you some of those detours.
1. Creating the project with Maven (feasible, but I personally don't recommend it, because my workplace has no external network access)
---------- For installing and configuring Maven, see Maven Practice
With Maven already integrated into Eclipse, create a Maven project: open Eclipse, File -> New -> Other, and select Maven Project.
Then edit pom.xml and add the following dependency:
<dependencies>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.10</artifactId>
    <version>0.8.2.0</version>
  </dependency>
</dependencies>
When finished, save. Saving triggers Maven to download the dependencies from the network, and the progress view shows the project fetching them automatically.
Write the producer code:
package com.newland.appkafka;

import java.util.Properties;
import java.util.concurrent.TimeUnit;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;
import kafka.serializer.StringEncoder;

public class KafkaProducer extends Thread {

    private String topic;

    public KafkaProducer(String topic) {
        super();
        this.topic = topic;
    }

    @Override
    public void run() {
        Producer<Integer, String> producer = createProducer();
        int i = 0;
        while (true) {
            producer.send(new KeyedMessage<Integer, String>(topic, "message:" + i++));
            try {
                TimeUnit.SECONDS.sleep(1);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    private Producer<Integer, String> createProducer() {
        Properties properties = new Properties();
        // declare the ZooKeeper ensemble
        properties.put("zookeeper.connect", "192.168.136.134:2181,192.168.136.134:2182,192.168.136.134:2183");
        properties.put("serializer.class", StringEncoder.class.getName());
        // declare the Kafka brokers
        properties.put("metadata.broker.list", "192.168.136.134:9092,192.168.136.134:9093,192.168.136.134:9094");
        return new Producer<Integer, String>(new ProducerConfig(properties));
    }

    public static void main(String[] args) {
        // use a topic already created on the Kafka cluster
        new KafkaProducer("cwqsolotest").start();
    }
}
The consumer code is similar:
package com.newland.appkafka;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

/**
 * Receives the data, e.g.:
 * Received: message:10 ... Received: message:14
 * @author zm
 */
public class KafkaConsumer extends Thread {

    private String topic;

    public KafkaConsumer(String topic) {
        super();
        this.topic = topic;
    }

    @Override
    public void run() {
        ConsumerConnector consumer = createConsumer();
        Map<String, Integer> topicCountMap = new HashMap<String, Integer>();
        // request one stream for this topic
        topicCountMap.put(topic, 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> messageStreams = consumer.createMessageStreams(topicCountMap);
        // take this topic's single stream and read each message as it arrives
        KafkaStream<byte[], byte[]> stream = messageStreams.get(topic).get(0);
        ConsumerIterator<byte[], byte[]> iterator = stream.iterator();
        while (iterator.hasNext()) {
            String message = new String(iterator.next().message());
            System.out.println("Received: " + message);
        }
    }

    private ConsumerConnector createConsumer() {
        Properties properties = new Properties();
        // declare the ZooKeeper ensemble
        properties.put("zookeeper.connect", "192.168.136.134:2181,192.168.136.134:2182,192.168.136.134:2183");
        // use a distinct group name; consumers in the same group share the topic's data
        properties.put("group.id", "group5");
        return Consumer.createJavaConsumerConnector(new ConsumerConfig(properties));
    }

    public static void main(String[] args) {
        // use a topic already created on the Kafka cluster
        new KafkaConsumer("cwqsolotest").start();
    }
}
Then make sure there are no errors, right-click the class you want to test, and choose Run As -> Java Application.
Normally you can connect to Kafka: the producer pushes messages onto the message queue and the consumer receives them from it.
Received: message:11
Received: message:12
Received: message:13
When I ran the producer, however, things did not go smoothly: the Eclipse console printed a very disappointing error.
At first I suspected my own code, or a problem with one of the dependency jars. I then packaged the program into a jar and ran it on the virtual machine, where everything worked, which confirmed that the cause lay in the development machine and the virtual machine being two different hosts.
After some searching, the cause was roughly pinned down: the local machine and the Kafka server virtual machine are not the same host, and if the Kafka server's address is set to localhost, this problem occurs.
So modify Kafka's server1.properties, server2.properties, and server3.properties accordingly, as sketched below.
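The original post does not show the exact lines changed; as a rough sketch, assuming Kafka 0.8.x broker settings, the idea is to bind each broker to the VM's IP instead of localhost via host.name (and advertised.host.name for clients on other hosts), keeping each broker's own id and port:

# server1.properties (sketch; the other brokers use ids 2/3 and ports 9093/9094)
broker.id=1
port=9092
host.name=192.168.136.134
advertised.host.name=192.168.136.134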
That covers building the project with Maven.
2. Next, how to build it as a plain Java project.
Create an ordinary Java project as usual. Once it is set up, add the producer and consumer classes to the project and copy the jar packages from Kafka's own lib directory into the project's lib directory.
The jars in the red box (in the screenshot) are the ones that need to be copied from Kafka's libs directory. The blue-green jar is only needed when writing a producer; an application that only implements a consumer does not need it. After that, debugging went smoothly.
While working on this project I noticed that many places in the code were shown struck through, which means those APIs have been deprecated.
I asked a colleague, who said the latest API simplifies development: the blue jar no longer needs to be added, but the code has to be rewritten.
package com.newland.appkafka;

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerNewApi {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.136.134:9092,192.168.136.134:9093,192.168.136.134:9094");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<String, String>("cwqsolotest", "message:" + i));
        }
        producer.close();
    }
}
With the new API the imports are noticeably different, and the only jar that needs to be added is kafka-clients.
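For reference, if you use Maven instead of copying jars, the corresponding artifact is kafka-clients; a minimal sketch of the dependency, assuming the same 0.8.2.0 version used earlier (any release that ships the new producer API will do):

<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>0.8.2.0</version>
</dependency>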
Finally, open a consumer on the virtual machine to receive the messages (for example with the console consumer sketched below); the results are as follows:
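A minimal sketch of such a consumer, assuming the ZooKeeper-based console consumer script shipped with Kafka 0.8/0.9 and the topic name used above:

bin/kafka-console-consumer.sh --zookeeper 192.168.136.134:2181 --topic cwqsolotest --from-beginning

The messages pushed by the producer should then be printed one per line.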