A previous article introduced using Flume to write data into HDFS and an ordinary directory; this article continues the series by showing how flume-ng inserts data into HBase 0.96.0.
First, modify the flume-node.conf file in the conf directory of the Flume installation on the node (see the previous article for the original configuration) as follows:
agent.sinks = k1
agent.sinks.k1.type = hbase
agent.sinks.k1.table = hello
agent.sinks.k1.columnFamily = cf
agent.sinks.k1.column = col1
agent.sinks.k1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer
agent.sinks.k1.channel = memoryChannel
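The sink assumes the target table already exists. The original walkthrough does not show the table creation step; below is a minimal sketch using the HBase 0.96 Java admin API, with the table and column family names taken from the configuration above (creating the table from the hbase shell works just as well):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateHelloTable {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // picks up hbase-site.xml from the classpath
        HBaseAdmin admin = new HBaseAdmin(conf);
        try {
            if (!admin.tableExists("hello")) {
                HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("hello"));
                desc.addFamily(new HColumnDescriptor("cf")); // column family used by the sink
                admin.createTable(desc);
            }
        } finally {
            admin.close();
        }
    }
}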
This time, however, it is not as easy to get a successful result as in the HDFS case, because of dependency version conflicts. You need to replace the protobuf jar under Flume's lib folder with the protobuf 2.5.0 jar shipped with Hadoop 2.2.0, and likewise replace the guava jar under Flume's lib folder with the guava jar from Hadoop 2.2.0, deleting the original jar files in each case. Restart Flume for the replacement to take effect.
SimpleHbaseEventSerializer in flume-ng provides only the simplest HBase insert functionality. If you have other requirements, you must write the serializer class yourself: define your own class under apache-flume-1.4.0-src/flume-ng-sinks/flume-ng-hbase-sink/src/main/java that implements Flume's HbaseEventSerializer interface. A simple example follows:
package test;

import java.util.LinkedList;
import java.util.List;
import java.util.Map;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.conf.ComponentConfiguration;
import org.apache.flume.sink.hbase.HbaseEventSerializer;
import org.apache.hadoop.hbase.client.Increment;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Row;
import org.apache.hadoop.hbase.util.Bytes;

import com.google.common.collect.Maps;

public class MyHBaseSerializer implements HbaseEventSerializer {

    private static final String[] COLUMNS = "column1,column2".split(",");
    private static final String[] PARAMS = "col1,col2".split(",");
    private byte[] columnFamily = "cf".getBytes();
    private byte[] content;

    @Override
    public void configure(Context context) {
    }

    @Override
    public void configure(ComponentConfiguration conf) {
    }

    @Override
    public void initialize(Event event, byte[] columnFamily) {
        // Called for each event: remember the body and the configured column family
        this.content = event.getBody();
        this.columnFamily = columnFamily;
    }

    @Override
    public List<Row> getActions() {
        // Split the event body into two halves
        String string = Bytes.toString(content);
        String value1 = string.substring(0, string.length() / 2);
        String value2 = string.substring(string.length() / 2, string.length());
        Map<String, String> map = Maps.newHashMap();
        map.put(PARAMS[0], value1);
        map.put(PARAMS[1], value2);
        List<Row> actions = new LinkedList<Row>();
        // Use the current timestamp in milliseconds as the row key
        String rowKey = String.valueOf(System.currentTimeMillis());
        Put put = new Put(Bytes.toBytes(rowKey));
        for (int i = 0; i < COLUMNS.length; i++) {
            String value = map.get(PARAMS[i]);
            if (value == null)
                value = "";
            put.add(columnFamily, Bytes.toBytes(COLUMNS[i]), Bytes.toBytes(value));
        }
        actions.add(put);
        return actions;
    }

    @Override
    public List<Increment> getIncrements() {
        // This serializer performs no counter increments
        return new LinkedList<Increment>();
    }

    @Override
    public void close() {
    }
}
This class splits each line of the file (the event body) into two halves and inserts them into the column1 and column2 columns respectively, using the current time as the rowKey. After that, recompile and repackage the flume-ng code and replace the corresponding jar file in the lib folder of the Flume installation. Then change the value of agent.sinks.k1.serializer in the configuration above to test.MyHBaseSerializer, where test is the package name.
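As a quick sanity check after restarting the agent (this step is not in the original walkthrough), a short scan of the hello table shows whether rows are arriving. A minimal sketch assuming the HBase 0.96 client API and an hbase-site.xml on the classpath:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class ScanHelloTable {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
        HTable table = new HTable(conf, "hello");
        try {
            ResultScanner scanner = table.getScanner(new Scan());
            for (Result r : scanner) {
                // Print the two columns written by MyHBaseSerializer
                System.out.println(Bytes.toString(r.getRow()) + " -> "
                        + Bytes.toString(r.getValue(Bytes.toBytes("cf"), Bytes.toBytes("column1"))) + " | "
                        + Bytes.toString(r.getValue(Bytes.toBytes("cf"), Bytes.toBytes("column2"))));
            }
            scanner.close();
        } finally {
            table.close();
        }
    }
}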