Flume + Kafka + HDFS Explained


Flume Framework Architecture

[Figure: Flume architecture (from "Lesson 23: Practical Cases, Flume and Kafka Installation")]

Single-node flume configuration

Starting Flume (flume-1.4.0)

bin/flume-ng agent --conf ./conf -f conf/flume-conf.properties -Dflume.root.logger=DEBUG,console -n agent

The -n flag gives the name of the agent as defined in the configuration file.

agent.sources = r1
agent.sinks = s1
agent.channels = c1
agent.sources.r1.channels = c1
agent.sinks.s1.channel = c1
# Describe/configure the source
agent.sources.r1.type = exec
agent.sources.r1.command = tail -f /home/flume/loginfo
# Use a channel which buffers events in memory
agent.channels.c1.type = memory
agent.channels.c1.capacity = 1000   # events
agent.channels.c1.transactionCapacity = 100
agent.sinks.s1.type = logger
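To smoke-test this agent, append a few lines to the tailed file; the logger sink should then print each one on the console as a Flume event. A minimal sketch (the file name here stands in for /home/flume/loginfo, and the messages are made up for illustration):

```python
# Illustrative only: generate a few log lines into the file that the
# exec source tails ("loginfo" stands in for /home/flume/loginfo).
from pathlib import Path

log = Path("loginfo")
lines = [f"user-{i} logged in" for i in range(1, 4)]
log.write_text("".join(line + "\n" for line in lines))
print(log.read_text(), end="")
```

With the agent running, tail -f picks up each appended line and the logger sink echoes it back, confirming the source-channel-sink pipeline works before pointing it at real log data.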


[Figure: from "Lesson 24: Log Processing with Flume, Kafka, HDFS, and Hive"]



[Figure: from "Lesson 23: Practical Cases, Flume and Kafka Installation"]


Flume Configuration for flume-1.4.0 + kafka-0.7.2 + HDFS


agent.sources = r1
agent.sinks = s_kafka s_hdfs
agent.channels = c_kafka c_hdfs
agent.sources.r1.channels = c_kafka c_hdfs
agent.sources.r1.type = exec
# The following command tails a log file
agent.sources.r1.command = tail -f /home/flume/loginfo
agent.channels.c_kafka.type = memory
agent.channels.c_hdfs.type = memory
# Custom Kafka sink implementation class
agent.sinks.s_kafka.type = com.sink.FirstKafkaSink
agent.sinks.s_kafka.channel = c_kafka
# Kafka needs to connect to ZooKeeper to look up broker metadata
agent.sinks.s_kafka.zkconnect = localhost:2181
agent.sinks.s_kafka.topic = test
agent.sinks.s_kafka.serializer.class = kafka.serializer.StringEncoder
agent.sinks.s_kafka.metadata.broker.list = localhost:9092   # must match server.properties
agent.sinks.s_kafka.custom.encoding = UTF-8
agent.sinks.s_hdfs.type = hdfs
agent.sinks.s_hdfs.channel = c_hdfs
# Default NameNode port is 8020; this cluster uses 9000
agent.sinks.s_hdfs.hdfs.path = hdfs://localhost:9000/root/source
agent.sinks.s_hdfs.hdfs.filePrefix = events-
agent.sinks.s_hdfs.hdfs.fileType = DataStream
agent.sinks.s_hdfs.hdfs.writeFormat = Text
agent.sinks.s_hdfs.hdfs.rollCount = 30   # roll the file once this many events are written
agent.sinks.s_hdfs.hdfs.rollSize = 0
agent.sinks.s_hdfs.hdfs.rollInterval = 0
agent.sinks.s_hdfs.hdfs.useLocalTimeStamp = true
agent.sinks.s_hdfs.hdfs.idleTimeout = 51
agent.sinks.s_hdfs.hdfs.threadsPoolSize = 2
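With rollSize and rollInterval both set to 0, only rollCount controls file rolling: the HDFS sink closes the current file and opens a new one every 30 events. The effect can be sketched with a toy model in Python (this is an illustration of the rolling semantics, not Flume code; the function name is made up):

```python
# Toy model of Flume's count-based file rolling (hdfs.rollCount = 30):
# every `roll_count` events, start writing to a new file.
def roll_files(events, roll_count=30, prefix="events-"):
    """Group events into per-file batches the way rollCount would."""
    files = {}
    for i, event in enumerate(events):
        name = f"{prefix}{i // roll_count}"   # "events-" matches hdfs.filePrefix
        files.setdefault(name, []).append(event)
    return files

files = roll_files([f"line {i}" for i in range(65)])
# 65 events with rollCount=30 -> three files of 30, 30, and 5 events
print({name: len(batch) for name, batch in sorted(files.items())})
# → {'events-0': 30, 'events-1': 30, 'events-2': 5}
```

Setting rollCount this low is only sensible for a demo; in production a small rollCount produces many tiny HDFS files, which burdens the NameNode.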


Roundup of Flume's built-in channels, sources, and sinks: http://www.iteblog.com/archives/948



This article is from the "Nothing qq:934033381" blog; please keep this source: http://tianxingzhe.blog.51cto.com/3390077/1700049

