Combining many small files into a SequenceFile (or an Avro file) is a good way to feed them into Hadoop.
Many small files in HDFS consume extra NameNode memory, because the NameNode keeps in-memory metadata for every file and block.
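To see why this matters, a commonly cited rule of thumb is that each file, directory, or block object costs the NameNode roughly 150 bytes of heap (the exact figure varies by Hadoop version). The numbers below (10 KB files, 128 MB blocks, 10 packed files) are illustrative assumptions, not measurements:

```java
public class NameNodeMemoryEstimate {
    public static void main(String[] args) {
        final long BYTES_PER_OBJECT = 150L;        // rule-of-thumb estimate per namespace object
        final long BLOCK_SIZE = 128L * 1024 * 1024; // 128 MB HDFS block size (assumed)

        // Case 1: ten million 10 KB files, each its own file object plus one block object.
        long smallFiles = 10_000_000L;
        long smallFileObjects = smallFiles * 2;
        long smallFileHeap = smallFileObjects * BYTES_PER_OBJECT;

        // Case 2: the same ~100 GB of data packed into 10 large sequence files.
        long totalBytes = smallFiles * 10_240L;                       // 10 KB per file
        long blocks = (totalBytes + BLOCK_SIZE - 1) / BLOCK_SIZE;     // ceiling division
        long packedHeap = (10 + blocks) * BYTES_PER_OBJECT;

        System.out.println("small files:  " + smallFileHeap + " bytes of NameNode heap");
        System.out.println("packed files: " + packedHeap + " bytes of NameNode heap");
    }
}
```

With these assumptions the small files cost about 3 GB of NameNode heap, while the packed layout needs only a few hundred kilobytes.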
SequenceFileInputFormat reads files stored in a key-value format.
The key and value types can supply their own serialization and deserialization logic.
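In Hadoop, custom key and value types do this by implementing the Writable interface, whose contract is a pair of methods: write(DataOutput) and readFields(DataInput). As a sketch of that pattern without needing Hadoop on the classpath, here is the same contract expressed with plain java.io; the FileRecord class and its fields are made up for this illustration:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Hypothetical record type mirroring Hadoop's Writable contract
// (write(DataOutput) / readFields(DataInput)).
class FileRecord {
    String name;
    String content;

    void write(DataOutput out) throws IOException {
        out.writeUTF(name);
        out.writeUTF(content);
    }

    void readFields(DataInput in) throws IOException {
        name = in.readUTF();
        content = in.readUTF();
    }
}

public class WritablePatternDemo {
    public static void main(String[] args) throws IOException {
        FileRecord rec = new FileRecord();
        rec.name = "a.txt";
        rec.content = "hello";

        // Serialize to a byte buffer.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        rec.write(new DataOutputStream(bos));

        // Deserialize into a fresh object.
        FileRecord copy = new FileRecord();
        copy.readFields(new DataInputStream(new ByteArrayInputStream(bos.toByteArray())));

        System.out.println(copy.name + " -> " + copy.content); // a.txt -> hello
    }
}
```

A real Hadoop type would additionally implement org.apache.hadoop.io.Writable (or WritableComparable for keys), but the serialization logic is the same.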
The following code is for illustration only; in a real project you should adapt it to use resources more carefully and to fit your requirements.
The sample code is as follows:
package myexamples;

import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.commons.io.FileUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class LocalF2SeqFile {
    /*
     * A local folder holds many small txt files that we want to
     * process in MapReduce, so we pack them into one sequence file.
     * Key:   source file name
     * Value: file content
     */

    // Append every (file name, file content) pair to one sequence file on HDFS.
    static void write2SeqFile(FileSystem fs, Path hdfsPath, HashMap<Text, Text> hm) {
        SequenceFile.Writer writer = null;
        try {
            writer = SequenceFile.createWriter(fs, fs.getConf(), hdfsPath, Text.class, Text.class);
            for (Map.Entry<Text, Text> entry : hm.entrySet())
                writer.append(entry.getKey(), entry.getValue());
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (writer != null)
                    writer.close();
            } catch (IOException ioe) {
                // ignore close failure
            }
        }
    }

    // Read every file in the local directory into a name -> content map.
    static HashMap<Text, Text> collectFiles(String localPath) throws IOException {
        HashMap<Text, Text> hm = new HashMap<Text, Text>();
        File f = new File(localPath);
        if (!f.isDirectory())
            return hm;
        for (File file : f.listFiles())
            hm.put(new Text(file.getName()), new Text(FileUtils.readFileToString(file)));
        return hm;
    }

    // Dump the sequence file back to stdout to verify the round trip.
    static void readSeqFile(FileSystem fs, Path hdfsPath) throws IOException {
        SequenceFile.Reader reader = new SequenceFile.Reader(fs, hdfsPath, fs.getConf());
        Text key = new Text();
        Text value = new Text();
        while (reader.next(key, value)) {
            System.out.print(key + " ");
            System.out.println(value);
        }
        reader.close();
    }

    public static void main(String[] args) throws IOException {
        args = "/home/hadoop/test/sub".split(" "); // hard-coded local input dir for the demo
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://namenode:9000");
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.getUri());
        Path file = new Path("/user/hadoop/seqFiles/seqDemo.seq");
        if (fs.exists(file))
            fs.delete(file, false);
        HashMap<Text, Text> hm = collectFiles(args[0]);
        write2SeqFile(fs, file, hm);
        readSeqFile(fs, file);
        fs.close();
    }
}