1. Download the latest Flume from the official mirror:

wget http://124.205.69.169/files/A1540000011ED5DB/mirror.bit.edu.cn/apache/flume/1.6.0/apache-flume-1.6.0-bin.tar.gz

2. Extract the Flume installation package:

cd /export/software/
tar -zxvf apache-flume-1.6.0-bin.tar.gz -C /export/servers/
cd /export/servers/
ln -s apache-flume-1.6.0-bin flume

3. Create a Flume configuration file:

cd /export/servers/flume/conf/
mkdir myconf
vi myconf/exec.conf

Enter the following:
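The extract-and-symlink pattern in step 2 can be sketched end to end. The sketch below uses a dummy tarball in a temp directory instead of the real /export tree (the directory names mirror the tutorial, the dummy README is purely illustrative), so it runs anywhere:

```shell
# Minimal sketch of step 2, against a stand-in tarball in a temp dir.
tmp=$(mktemp -d)
cd "$tmp"
# Build a stand-in for apache-flume-1.6.0-bin.tar.gz
mkdir apache-flume-1.6.0-bin
echo "flume" > apache-flume-1.6.0-bin/README
tar -zcf apache-flume-1.6.0-bin.tar.gz apache-flume-1.6.0-bin
rm -r apache-flume-1.6.0-bin
# Extract into servers/ and point a stable "flume" symlink at the
# versioned directory, mirroring:
#   tar -zxvf ... -C /export/servers/ ; ln -s apache-flume-1.6.0-bin flume
mkdir servers
tar -zxf apache-flume-1.6.0-bin.tar.gz -C servers/
cd servers
ln -s apache-flume-1.6.0-bin flume
cat flume/README   # resolves through the symlink; prints "flume"
```

The symlink is the point of the exercise: later commands and configs can say /export/servers/flume and keep working after an upgrade, since only the link target changes.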
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = exec
a1.sources.r1.command = tail -f /export/data/flume_sources/click_log/1.log
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity = 100

a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.topic = myOrder        # note the topic name here
a1.sinks.k1.brokerList = kafka01:9092
a1.sinks.k1.requiredAcks = 1
a1.sinks.k1.batchSize = 20
a1.sinks.k1.channel = c1
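Flume property names are case-sensitive, so a stray uppercase A1 or C1 silently detaches a component from the agent. A minimal sketch of a grep-based sanity check (the temp file and the check itself are illustrative; the key names come from the config above, abbreviated here):

```shell
# Write the agent wiring to a temp file and verify that every component
# named in a1.sources / a1.channels / a1.sinks also has a .type line.
conf=$(mktemp)
cat > "$conf" <<'EOF'
a1.sources = r1
a1.channels = c1
a1.sinks = k1
a1.sources.r1.type = exec
a1.sources.r1.channels = c1
a1.channels.c1.type = memory
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.channel = c1
EOF
for key in a1.sources.r1.type a1.channels.c1.type a1.sinks.k1.type; do
    grep -q "^$key" "$conf" && echo "$key: ok"
done
```

If any component lacks a type (or the case does not match), the corresponding "ok" line is missing and the config should be fixed before starting the agent.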
Note: with the configuration complete, the Flume side of the pipeline is basically done. Next, prepare the target data file.

4. Prepare the target data directory:

mkdir -p /export/data/flume_sources/click_log

5. Create the target file and generate data with a script:

for ((i=0; i<=50000; i++)); do echo "message-$i" >> /export/data/flume_sources/click_log/1.log; done

Note: save the script as click_log_out.sh and make it executable as the root user:

chmod +x click_log_out.sh

6. Start all processes.

Start the ZooKeeper cluster on every node first.

Step 1: Start the Kafka cluster (mini1, mini2, mini3 ----- kafka1, kafka2, kafka3):

nohup kafka-server-start.sh /export/servers/kafka/config/server.properties &

Step 2: Create the topic and open a consumer:

kafka-topics.sh --create --zookeeper mini1:2181 --replication-factor 1 --partitions 4 --topic myOrder

Open a Kafka consumer window (----- consumer):

kafka-console-consumer.sh --zookeeper mini1:2181 --from-beginning --topic myOrder

Step 3: Run the data-generation script (mini1 ----- data source):

sh click_log_out.sh

Step 4: Start the Flume agent (mini1 ----- flume):

./bin/flume-ng agent -n a1 -c conf -f conf/myconf/exec.conf -Dflume.root.logger=INFO,console

Step 5: Watch the Kafka consumer window opened in Step 2 to see the messages arrive.
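The step-5 generator loop can be tried safely before pointing it at the real log: shrink the bound and write to a temp file (the real click_log_out.sh appends 50001 lines to /export/data/flume_sources/click_log/1.log). Note the interpolation form "message-$i": the shell has no `+` string-concatenation operator, so writing `"message-" + $i` would emit the literal `+` into every line.

```shell
# Same loop shape as click_log_out.sh, bounded at 5 and writing to a
# temp file so nothing touches the real Flume source directory.
log=$(mktemp)
for ((i=0; i<=5; i++)); do
    echo "message-$i" >> "$log"
done
wc -l < "$log"    # 6 lines: message-0 through message-5
```

Because the exec source runs tail -f against 1.log, every line the script appends becomes one Flume event and, via the KafkaSink, one message on the myOrder topic.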
Flow Analysis System---Flume