Flume integrated with Kafka: Flume captures business logs and sends them to Kafka.

Installing and deploying Kafka

Kafka download: 1.0.0 is the latest release and the current stable version. You can verify your download by following these procedures and using these keys.

1.0.0
- Released November 1, 2017
- Source download: kafka-1.0.0-src.tgz (ASC, SHA512)
- Binary Downloads:
- Scala 2.11 - kafka_2.11-1.0.0.tgz (ASC, SHA512)
- Scala 2.12 - kafka_2.12-1.0.0.tgz (ASC, SHA512)
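As a quick sketch of the verification step (the exact file names depend on the artifact you downloaded; the commands below assume the Scala 2.11 binary and that the Kafka KEYS file has already been imported into GPG):

    # compute the SHA512 digest and compare it with the published SHA512 value
    sha512sum kafka_2.11-1.0.0.tgz
    # check the detached PGP signature shipped as the .asc file
    gpg --verify kafka_2.11-1.0.0.tgz.asc kafka_2.11-1.0.0.tgz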
We build for multiple versions of Scala. This matters only if you are using Scala and want a version built for the same Scala version; otherwise any version should work (2.11 is recommended).

1. Decompress:
   tar zxvf kafka_2.11-1.0.0.tgz
2. Move to the deployment directory:
   mv kafka_2.11-1.0.0 /usr/local/kafka2.11
3. Start ZooKeeper ...
4. Start Kafka:
   # nohup bin/kafka-server-start.sh config/server.properties &
5. Create a topic:
   # bin/kafka-topics.sh --create --zookeeper localhost:2181 --partitions 1 --replication-factor 1 --topic test
   Created topic "test".
6. List topics:
   # bin/kafka-topics.sh --list --zookeeper localhost:2181
   __consumer_offsets
   test
7. Test sending data:
   # bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
   Input: my test
8. Test consuming messages:
   # bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning

Installing and deploying Flume

Flume download: Apache Flume is distributed under the Apache License, version 2.0. The link in the Mirrors column should display a list of available mirrors with a default selection based on your inferred location. If you don't see that page, try a different browser. The checksum and signature files are links to the originals on the main distribution server.
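If a browser is not handy, the binary release can also be fetched directly with wget; the URL below points at the Apache release archive and is given only as an example - substitute the mirror link shown on the download page:

    wget https://archive.apache.org/dist/flume/1.8.0/apache-flume-1.8.0-bin.tar.gz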
- Apache Flume binary (tar.gz):
  - apache-flume-1.8.0-bin.tar.gz
  - apache-flume-1.8.0-bin.tar.gz.md5
  - apache-flume-1.8.0-bin.tar.gz.sha1
  - apache-flume-1.8.0-bin.tar.gz.asc
- Apache Flume source (tar.gz):
  - apache-flume-1.8.0-src.tar.gz
  - apache-flume-1.8.0-src.tar.gz.md5
  - apache-flume-1.8.0-src.tar.gz.sha1
  - apache-flume-1.8.0-src.tar.gz.asc
It is essential that you verify the integrity of the downloaded files using the PGP or MD5 signatures. Read Verifying Apache HTTP Server Releases for more information on why you should verify releases.

1. Download:
   wget apache-flume-1.8.0-bin.tar.gz
2. Unpack:
   tar zxvf apache-flume-1.8.0-bin.tar.gz
3. Move to the deployment directory:
   mv apache-flume-1.8.0-bin /usr/local/flume1.8
4. Preparation: install Java and set the Java and Flume environment variables by adding the following to /etc/profile:
   export JAVA_HOME=/usr/java/jdk1.8.0_65
   export FLUME_HOME=/usr/local/flume1.8
   export PATH=$PATH:$JAVA_HOME/bin:$FLUME_HOME/bin
   Then run source /etc/profile so the variables take effect.
5. Create the log collection file: /tmp/logs/kafka.log
6. Configure. Copy the configuration templates:
   # cp conf/flume-conf.properties.template conf/flume-conf.properties
   # cp conf/flume-env.properties.template conf/flume-env.properties
   Edit the configuration as follows:
   agent.sources = s1
   agent.channels = c1
   agent.sinks = k1
   agent.sources.s1.type = exec
   # log collection location
   agent.sources.s1.command = tail -f /tmp/logs/kafka.log
   agent.sources.s1.channels = c1
   agent.channels.c1.type = memory
   agent.channels.c1.capacity = 10000
   agent.channels.c1.transactionCapacity = 100
   agent.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
   # Kafka broker address
   agent.sinks.k1.brokerList = localhost:9092
   # Kafka topic
   agent.sinks.k1.topic = test
   agent.sinks.k1.serializer.class = kafka.serializer.StringEncoder
   agent.sinks.k1.channel = c1

Functional verification

7. Start the service:
   # bin/flume-ng agent --conf ./conf/ -f conf/kafka.properties -Dflume.root.logger=DEBUG,console -n agent
   The run log is written to the logs directory; alternatively, add the -Dflume.root.logger=INFO,console option to run in the foreground and print the log to the console, so you can inspect the running log and find the cause of any service failure.
8. Create a test log generator script, log_producer_test.sh:
   for ((i=0; i<=1000; i++)); do
     echo "kafka_flume_test-$i" >> /tmp/logs/kafka.log
   done
9. Generate logs and observe Kafka consuming them:
   ./log_producer_test.sh
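To confirm the whole pipeline end to end, a simple check (assuming the Flume agent from step 7 is running and the topic is test, as configured above) is to watch the topic in one terminal while the generator script appends lines in another:

    # terminal 1 (from the Kafka installation directory): consume everything written to the topic
    bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
    # terminal 2: append test lines to the collected log file
    ./log_producer_test.sh

New kafka_flume_test-N lines should appear in the consumer output as the script runs.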