When reading from HDFS directly in Java, you will use the FSDataInputStream class. The following code reads data from HDFS as a stream through FSDataInputStream:

import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileReadFromHdfs {
    public static void main(String[] args) {
        try {
            String dsf = "hdfs://hadoop1:9000/tmp/wordcount/kkk.txt";
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(URI.create(dsf), conf);
            FSDataInputStream hdfsInStream = fs.open(new Path(dsf));
            byte[] ioBuffer = new byte[1024];
            int readLen = hdfsInStream.read(ioBuffer);
            while (readLen != -1) {
                // Write the bytes just read to standard output
                System.out.write(ioBuffer, 0, readLen);
                readLen = hdfsInStream.read(ioBuffer);
            }
            hdfsInStream.close();
            fs.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
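The read-until-minus-one loop above is the standard java.io InputStream pattern, and FSDataInputStream inherits it from InputStream. As a minimal sketch of the same loop without a Hadoop cluster, the example below runs it against a local file (plain java.io, with the file name and contents chosen here purely for illustration) and uses try-with-resources so the stream is closed even if an exception is thrown:

```java
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class LocalStreamRead {
    public static void main(String[] args) throws IOException {
        // Hypothetical local file standing in for the HDFS path
        String path = "kkk.txt";
        try (FileOutputStream out = new FileOutputStream(path)) {
            out.write("hello hdfs\n".getBytes());
        }

        // Same loop as the HDFS example: refill the buffer until read() returns -1
        ByteArrayOutputStream copy = new ByteArrayOutputStream();
        try (FileInputStream in = new FileInputStream(path)) {
            byte[] ioBuffer = new byte[1024];
            int readLen = in.read(ioBuffer);
            while (readLen != -1) {
                copy.write(ioBuffer, 0, readLen);
                readLen = in.read(ioBuffer);
            }
        } // try-with-resources closes the stream automatically

        System.out.print(copy.toString());
    }
}
```

Because FSDataInputStream is an InputStream, this loop is interchangeable between local files and HDFS; only the way the stream is opened (fs.open versus new FileInputStream) differs.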