Simple test project. First, create a new Java project; its structure is as follows:
The code of the test class FlumeTest is as follows:
package com.demo.flume;

import org.apache.log4j.Logger;

public class FlumeTest {

    private static final Logger logger = Logger.getLogger(FlumeTest.class);

    public static void main(String[] args) throws InterruptedException {
        // Log one INFO message per second.
        for (int i = 0; i < 20; i++) {
            logger.info("Info [" + i + "]");
            Thread.sleep(1000);
        }
    }
}
The Kafka consumer code that listens for and receives the messages is as follows:
package com.demo.flume;

/**
 * Info:
 * User: zhaokai
 * Date: 2017/3/17
 * Version: 1.0
 * History: <p>If there is a change, please record it here.</p>
 */
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class Consumer {

    public static void main(String[] args) {
        System.out.println("Begin consumer");
        connectionKafka();
        System.out.println("Finish consumer");
    }

    @SuppressWarnings("resource")
    public static void connectionKafka() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.1.163:9092");
        props.put("group.id", "testConsumer");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("session.timeout.ms", "30000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("flumetest"));
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(100);
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("=================== offset = %d, key = %s, value = %s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
The log4j configuration file is configured as follows:
log4j.rootLogger=INFO,console

# For package com.demo.flume, logs will be sent to the Flume appender.
log4j.logger.com.demo.flume=INFO,flume
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=192.168.1.163
log4j.appender.flume.Port=4141
log4j.appender.flume.UnsafeMode=true
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %p [%c:%L] - %m%n

# Console appender
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d [%-5p] [%t] - [%l] %m%n
Note: Hostname is the IP of the server on which Flume is installed, and Port must match the listening port of the Flume source in the Flume configuration.
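For orientation, the Flume agent on that server would expose an Avro source on port 4141 (the target of the Log4jAppender) and forward events to the Kafka topic the consumer subscribes to. The following is only a minimal sketch: the agent and component names (a1, r1, k1, c1) are illustrative assumptions, and note that the built-in Kafka sink shown here (with its brokerList/topic properties) shipped with Flume 1.6, so Flume 1.5.0 as used in the pom would need a separate Kafka sink plugin:

```properties
# Sketch: Avro source (receives log4j events on 4141) -> memory channel -> Kafka sink.
# Agent and component names (a1, r1, k1, c1) are illustrative assumptions.
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Avro source matching log4j.appender.flume.Hostname / Port
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 4141

# Kafka sink writing to the topic the consumer subscribes to (Flume 1.6 syntax)
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.brokerList = 192.168.1.163:9092
a1.sinks.k1.topic = flumetest

# In-memory channel wiring the source to the sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

With such an agent running, every logger.info(...) call in FlumeTest should eventually appear as a record on the flumetest topic and be printed by the consumer.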
The pom.xml introduces the following dependencies:
<dependencies>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.10</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flume</groupId>
        <artifactId>flume-ng-core</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flume.flume-ng-clients</groupId>
        <artifactId>flume-ng-log4jappender</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.10.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.10</artifactId>
        <version>0.10.2.0</