Background: we use Kafka + Flume + Morphline + Solr to do real-time statistics.
Solr has had no data since December 23. Checking the logs showed that a colleague had added malformed tracking (buried-point) data, which produced a large number of errors.
It is inferred that the memory channel filled up, messages could not be processed in time, and newly arriving data was lost.
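For reference, the memory channel being replaced presumably looked roughly like the sketch below (the channel name and capacity values are assumptions, not taken from the original configuration); once capacity is reached, the channel rejects further puts, so incoming events back up and can be lost:

kafka2solr.channels = mem_channel
# Memory channel: fast, but buffered events are lost on restart and new events back up when it fills
kafka2solr.channels.mem_channel.type = memory
# Maximum number of events held in memory (illustrative value)
kafka2solr.channels.mem_channel.capacity = 10000
# Maximum number of events per source/sink transaction (illustrative value)
kafka2solr.channels.mem_channel.transactionCapacity = 1000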
Modify Flume to use the file channel:
kafka2solr.sources = source_from_kafka
kafka2solr.channels = file_channel
kafka2solr.sinks = solrSink

# For each one of the sources, the type is defined
kafka2solr.sources.source_from_kafka.type = org.apache.flume.source.kafka.KafkaSource
kafka2solr.sources.source_from_kafka.channels = file_channel
kafka2solr.sources.source_from_kafka.batchSize = -
kafka2solr.sources.source_from_kafka.useFlumeEventFormat = false
kafka2solr.sources.source_from_kafka.kafka.bootstrap.servers = kafkanode0:9092,kafkanode1:9092,kafkanode2:9092
kafka2solr.sources.source_from_kafka.kafka.topics = eventCount
kafka2solr.sources.source_from_kafka.kafka.consumer.group.id = flume_solr_caller
kafka2solr.sources.source_from_kafka.kafka.consumer.auto.offset.reset = latest

# File channel
kafka2solr.channels.file_channel.type = file
kafka2solr.channels.file_channel.checkpointDir = /var/log/flume-ng/checkpoint
kafka2solr.channels.file_channel.dataDirs = /var/log/flume-ng/data

kafka2solr.sinks.solrSink.type = org.apache.flume.sink.solr.morphline.MorphlineSolrSink
kafka2solr.sinks.solrSink.channel = file_channel
#kafka2solr.sinks.solrSink.batchSize = 1000
#kafka2solr.sinks.solrSink.batchDurationMillis =
kafka2solr.sinks.solrSink.morphlineFile = morphlines.conf
kafka2solr.sinks.solrSink.morphlineId = morphline1
kafka2solr.sinks.solrSink.isIgnoringRecoverableExceptions = true
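To apply the change, the agent can be restarted with the standard flume-ng command; the agent name must match the kafka2solr property prefix, while the directory and file names below are assumptions for illustration:

flume-ng agent --name kafka2solr --conf /etc/flume-ng/conf --conf-file /etc/flume-ng/conf/kafka2solr.conf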
This persists the buffered events to disk so they are not lost.
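If the channel is expected to absorb large backlogs, its size can also be tuned; a minimal sketch using Flume's file-channel properties (the values shown are the documented defaults, kept here only as illustrative assumptions):

# Maximum number of events the channel can hold on disk (default 1000000)
kafka2solr.channels.file_channel.capacity = 1000000
# Maximum number of events per transaction (default 10000)
kafka2solr.channels.file_channel.transactionCapacity = 10000
# Interval in milliseconds between checkpoints (default 30000)
kafka2solr.channels.file_channel.checkpointInterval = 30000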
Real-time event statistics project: optimizing Flume by replacing the memory channel with a file channel.