The program is as follows:
package com.lcy.hadoop.examples;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.util.ReflectionUtils;

public class StreamCompressor {

    public static void main(String[] args) throws Exception {
        // The codec class name is passed on the command line,
        // e.g. org.apache.hadoop.io.compress.GzipCodec
        String codecClassname = args[0];
        Class<?> codecClass = Class.forName(codecClassname);
        Configuration conf = new Configuration();
        CompressionCodec codec =
            (CompressionCodec) ReflectionUtils.newInstance(codecClass, conf);

        // Wrap standard output in a compressing stream and copy
        // standard input through it in 4096-byte chunks.
        CompressionOutputStream out = codec.createOutputStream(System.out);
        IOUtils.copyBytes(System.in, out, 4096, false);
        out.finish();
    }
}
To run the program, enter the following command:
$ echo "lcyvino" | hadoop jar /usr/local/testjar/streamcompressor.jar com.lcy.hadoop.examples.StreamCompressor org.apache.hadoop.io.compress.GzipCodec | gunzip
Operation result: gunzip decompresses the gzip stream produced by the program, so the original input string "lcyvino" is printed back.
This example shows how to use the Hadoop compression API to compress data read from standard input and write it to standard output.
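The round trip in the command above works because GzipCodec emits the standard gzip format, which gunzip (and java.util.zip) can read. As a rough sketch of the same compress-then-decompress cycle without a Hadoop installation, the following plain-Java program (class name GzipRoundTrip is illustrative, not from the original post) compresses a string and decompresses it again:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    public static void main(String[] args) throws Exception {
        byte[] input = "lcyvino".getBytes(StandardCharsets.UTF_8);

        // Compression step: GZIPOutputStream plays the role of the
        // CompressionOutputStream returned by GzipCodec.
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream gzOut = new GZIPOutputStream(compressed)) {
            gzOut.write(input);
        }

        // Decompression step: this is what gunzip does at the end of the pipe.
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        try (GZIPInputStream gzIn =
                 new GZIPInputStream(new ByteArrayInputStream(compressed.toByteArray()))) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = gzIn.read(buf)) != -1) {
                restored.write(buf, 0, n);
            }
        }

        String result = new String(restored.toByteArray(), StandardCharsets.UTF_8);
        System.out.println(result); // prints: lcyvino
    }
}
```

The 4096-byte buffer mirrors the buffer size passed to IOUtils.copyBytes in the Hadoop program.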