Combine small files into a sequence file

Source: Internet
Author: User

Combining small files into sequence files or Avro files is a good way to feed data into Hadoop.

Many small files in Hadoop consume extra NameNode memory, because the NameNode holds the metadata for every file and block in memory.

SequenceFileInputFormat reads sequence files, a Hadoop file format that stores data as key/value pairs.

The key and value types can supply their own serialization and deserialization logic (in Hadoop, by implementing the Writable interface).
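To show what that serialization contract looks like, here is a minimal sketch of a record type that follows the same write/readFields pattern as Hadoop's Writable. It uses only JDK stream classes so it compiles without Hadoop jars; the class name `FileRecord` and its fields are illustrative, not part of any Hadoop API.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// A minimal value type following the Writable-style contract
// (write/readFields). JDK-only sketch; Hadoop's real interface
// is org.apache.hadoop.io.Writable.
public class FileRecord {
    private String name;    // source file name
    private String content; // file content

    public FileRecord() {} // Writables need a no-arg constructor

    public FileRecord(String name, String content) {
        this.name = name;
        this.content = content;
    }

    // Serialize the fields in a fixed order.
    public void write(DataOutput out) throws IOException {
        out.writeUTF(name);
        out.writeUTF(content);
    }

    // Deserialize the fields in the same order they were written.
    public void readFields(DataInput in) throws IOException {
        name = in.readUTF();
        content = in.readUTF();
    }

    public String getName() { return name; }
    public String getContent() { return content; }

    public static void main(String[] args) throws IOException {
        // Round-trip one record through a byte buffer.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        FileRecord original = new FileRecord("a.txt", "hello");
        original.write(new DataOutputStream(bytes));

        FileRecord restored = new FileRecord();
        restored.readFields(new DataInputStream(
                new ByteArrayInputStream(bytes.toByteArray())));
        System.out.println(restored.getName() + " " + restored.getContent());
    }
}
```

A Hadoop key type would additionally implement WritableComparable so keys can be sorted during the shuffle.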

The following code is for illustration only; when used in a real project, adjust it as appropriate to conserve resources and meet the project's needs.

The sample code is as follows:

package myexamples;

import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import org.apache.commons.io.FileUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class LocalF2SeqFile {
    /*
     * A local folder holds many small txt files that we need to
     * process in MapReduce, so we load them into one sequence file.
     * Key:   source file name
     * Value: file content
     */
    static void write2SeqFile(FileSystem fs, Path hdfsPath, HashMap<Text, Text> hm) {
        SequenceFile.Writer writer = null;
        try {
            writer = SequenceFile.createWriter(fs, fs.getConf(), hdfsPath,
                    Text.class, Text.class);
            for (Map.Entry<Text, Text> entry : hm.entrySet())
                writer.append(entry.getKey(), entry.getValue());
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (writer != null)
                    writer.close();
            } catch (IOException ioe) {
            }
        }
    }

    static HashMap<Text, Text> collectFiles(String localPath) throws IOException {
        HashMap<Text, Text> hm = new HashMap<Text, Text>();
        File f = new File(localPath);
        if (!f.isDirectory())
            return hm;
        for (File file : f.listFiles())
            hm.put(new Text(file.getName()),
                   new Text(FileUtils.readFileToString(file)));
        return hm;
    }

    static void readSeqFile(FileSystem fs, Path hdfsPath) throws IOException {
        SequenceFile.Reader reader = new SequenceFile.Reader(fs, hdfsPath, fs.getConf());
        Text key = new Text();
        Text value = new Text();
        while (reader.next(key, value)) {
            System.out.print(key + " ");
            System.out.println(value);
        }
        reader.close();
    }

    public static void main(String[] args) throws IOException {
        args = "/home/hadoop/test/sub".split(" ");
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://namenode:9000");
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.getUri());
        Path file = new Path("/user/hadoop/seqfiles/seqdemo.seq");
        if (fs.exists(file))
            fs.delete(file, false);
        HashMap<Text, Text> hm = collectFiles(args[0]);
        write2SeqFile(fs, file, hm);
        readSeqFile(fs, file);
        fs.close();
    }
}
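The pack-then-read-back round trip above can be exercised without a Hadoop cluster. Below is a JDK-only analogue: each small file's name and content are written as a key/value pair into one combined file, then read back. This is not the SequenceFile binary format, just the same idea with plain `DataOutputStream` framing; all paths and names are illustrative.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.LinkedHashMap;
import java.util.Map;

// JDK-only analogue of the Hadoop program: combine the small files in a
// directory into one file of (name, content) pairs, then read them back.
public class CombineSmallFilesDemo {

    // Write each file as a key/value pair: name first, then content.
    static void pack(Path dir, Path packed) throws IOException {
        try (DataOutputStream out = new DataOutputStream(Files.newOutputStream(packed));
             DirectoryStream<Path> files = Files.newDirectoryStream(dir)) {
            for (Path f : files) {
                out.writeUTF(f.getFileName().toString());                        // key
                out.writeUTF(new String(Files.readAllBytes(f), StandardCharsets.UTF_8)); // value
            }
        }
    }

    // Read the pairs back in the order they were written.
    static Map<String, String> unpack(Path packed) throws IOException {
        Map<String, String> pairs = new LinkedHashMap<>();
        try (DataInputStream in = new DataInputStream(Files.newInputStream(packed))) {
            while (in.available() > 0)
                pairs.put(in.readUTF(), in.readUTF());
        }
        return pairs;
    }

    public static void main(String[] args) throws IOException {
        // Create a temp directory with two small files.
        Path dir = Files.createTempDirectory("smallfiles");
        Files.write(dir.resolve("a.txt"), "alpha".getBytes(StandardCharsets.UTF_8));
        Files.write(dir.resolve("b.txt"), "beta".getBytes(StandardCharsets.UTF_8));

        Path packed = Files.createTempFile("combined", ".bin");
        pack(dir, packed);

        Map<String, String> pairs = unpack(packed);
        System.out.println(pairs.size() + " " + pairs.get("a.txt") + " " + pairs.get("b.txt"));
    }
}
```

Once the data is in a real sequence file, a MapReduce job can consume it by setting SequenceFileInputFormat as the job's input format, with each record arriving as one (file name, file content) pair.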
