A first look at Kafka: stand-alone deployment on CentOS, service startup, and Java client calls

Tags: message queue, zookeeper

As an excellent open-source message queue framework from Apache, Kafka has become the first choice of many Internet companies for log collection and processing. We may use it in a real project later, so let's get familiar with it first. After two evenings of effort, I finally got it working at a basic level.

Operating system: CentOS 6.5 (running in a virtual machine)

1. Download the Kafka installation package. First go to the official website and find the latest stable release:

wget http://mirrors.hust.edu.cn/apache/kafka/0.10.2.0/kafka_2.12-0.10.2.0.tgz

2. Copy the archive to the desired directory and unpack it; I used /usr/.

First cp the downloaded kafka_2.12-0.10.2.0.tgz into /usr/, then unpack it there with tar -xzvf.

3. Since ZooKeeper is already installed on this machine, I only need to modify the server.properties file accordingly.
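The exact edit is not shown above; presumably it points Kafka at the existing ZooKeeper instance, which here listens on port 3181 (the same address the console consumers use later in this article). A minimal sketch of the relevant line in config/server.properties, under that assumption:

# Point the broker at the already-running ZooKeeper
# (assumption: ZooKeeper listens on 3181, matching the consumer commands below)
zookeeper.connect=192.168.118.131:3181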

4. Start the service with bin/kafka-server-start.sh config/server.properties, and the first problem appears:

Exception in thread "main" java.lang.UnsupportedClassVersionError: kafka/Kafka : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:615)

The startup fails. As the message indicates, this latest Kafka release is built for JDK 1.8 (class file version 52.0), while the JDK currently installed on my machine is 1.6.

5. Replace the JDK. Download JDK 1.8 with wget, change JAVA_HOME in /etc/profile with vim, then run source /etc/profile. However, java -version still reports 1.6, and even a reboot does not help. The solution found online:

which java
/usr/bin/java
which javac
/usr/bin/javac

1. First delete the old java and javac under /usr/bin:

rm -rf /usr/bin/java
rm -rf /usr/bin/javac

2. Then link them to JDK 1.8:

ln -s $JAVA_HOME/bin/java /usr/bin/java
ln -s $JAVA_HOME/bin/javac /usr/bin/javac

6. Start the service again:

bin/kafka-server-start.sh config/server.properties

This time it starts without error. Start a producer for the topic test:

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

Start two consumers:

bin/kafka-console-consumer.sh --zookeeper 192.168.118.131:3181 --topic test --from-beginning
bin/kafka-console-consumer.sh --zookeeper 192.168.118.131:3181 --topic test --from-beginning

Both consumers receive the messages sent from the producer.

7. Try to receive messages with Java code. First create a Maven project and add the following dependencies to pom.xml:

<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.8.2.1</version>
</dependency>

<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<version>0.8.2.1</version>
</dependency>

Then run the build to finish the setup. The calling code is as follows:

package kafkatest;

import java.util.List;
import java.util.Properties;
import java.util.concurrent.TimeUnit;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.consumer.Whitelist;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;

/**
 * File Name: KafkaConsumer.java
 * Package Name: kafkatest
 * Date: 2017-3-25 10:00:56
 * Author: CAO.ZHI10
 */
public class KafkaConsumer {
    public static void main(String[] args) throws Exception {
        Properties properties = new Properties();
        properties.put("zookeeper.connect", "192.168.118.131:3181");
        properties.put("auto.commit.enable", "true");
        properties.put("auto.commit.interval.ms", "60000");
        properties.put("group.id", "test");

        ConsumerConfig consumerConfig = new ConsumerConfig(properties);

        ConsumerConnector javaConsumerConnector = Consumer.createJavaConsumerConnector(consumerConfig);

        // Topic filter: subscribe to every topic matching the whitelist
        Whitelist whitelist = new Whitelist("test");
        List<KafkaStream<byte[], byte[]>> partitions = javaConsumerConnector.createMessageStreamsByFilter(whitelist);

        if (partitions == null) {
            System.out.println("empty!");
            TimeUnit.SECONDS.sleep(1);
        }

        // Consume messages: iterate over each stream, blocking until new messages arrive
        for (KafkaStream<byte[], byte[]> partition : partitions) {
            ConsumerIterator<byte[], byte[]> iterator = partition.iterator();
            while (iterator.hasNext()) {
                MessageAndMetadata<byte[], byte[]> next = iterator.next();
                System.out.println("partition: " + next.partition());
                System.out.println("offset: " + next.offset());
                System.out.println("received message: " + new String(next.message(), "UTF-8"));
            }
        }
    }
}

Running the main method kept throwing a disconnect exception. A careful look at the startup log showed that the client was always trying to connect to localhost:9092. After searching for the problem, the common advice online is to change the host name setting in server.properties, but newer versions have dropped that entry; you now have to use listeners instead.
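For reference, a minimal sketch of the change in config/server.properties, assuming the broker should be reachable on the machine's real IP as used in the commands below:

# Bind the broker to the host's real IP instead of the default (localhost)
listeners=PLAINTEXT://192.168.118.131:9092
# Optional: what the broker advertises to clients; defaults to listeners when unset
#advertised.listeners=PLAINTEXT://192.168.118.131:9092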

8. Restart the service, the producer, and the consumer (note: they must be started with the IP address, otherwise an error occurs):

bin/kafka-console-producer.sh --broker-list 192.168.118.131:9092 --topic test
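Only the console producer is used here; for completeness, a minimal Java producer sketch based on the kafka-clients dependency added to the pom above (the class name and message text are my own; the broker address is the one used in the command above):

package kafkatest;

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaProducerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Same broker address as the console producer command above
        props.put("bootstrap.servers", "192.168.118.131:9092");
        // Serialize both key and value as plain strings
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<String, String>(props);
        // Send a few test messages to the "test" topic
        for (int i = 0; i < 3; i++) {
            producer.send(new ProducerRecord<String, String>("test", "hello kafka " + i));
        }
        producer.close();
    }
}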

9. Start the Java code. Messages typed into the producer's CRT window are received normally, but the output is garbled:

My first thought was to convert the characters between GBK and UTF-8 in code. Then I remembered that the server-side logs, when viewed in real time, were garbled in exactly the same way, so instead I changed the character encoding of the CRT session (to UTF-8).

The problem has been solved
