A single file on HDFS:
-bash-3.2$ hadoop fs -ls /user/pms/ouyangyewei/data/input/combineorder/repeat_rec_category
Found 1 items
-rw-r--r--   2 deploy supergroup        520 2014-08-14 17:03 /user/pms/ouyangyewei/data/input/combineorder/repeat_rec_category/repeatreccategory.txt
File contents:
-bash-3.2$ hadoop fs -cat /user/pms/ouyangyewei/data/input/combineorder/repeat_rec_category/repeatreccategory.txt | more
810496098554729719175320971895971902971922958261972047972050
Reading a single HDFS file from the Java API using the FileSystem class:
/**
 * Get the repeatedly recommended categories, comma-delimited
 * @param filePath
 * @return
 */
public String getRepeatRecCategoryStr(String filePath) {
    final String DELIMITER = "\t";
    final String INNER_DELIMITER = ",";

    String categoryFilterStrs = new String();

    BufferedReader br = null;
    try {
        FileSystem fs = FileSystem.get(new Configuration());
        FSDataInputStream inputStream = fs.open(new Path(filePath));
        br = new BufferedReader(new InputStreamReader(inputStream));

        String line = null;
        while (null != (line = br.readLine())) {
            String[] strs = line.split(DELIMITER);
            categoryFilterStrs += (strs[0] + INNER_DELIMITER);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        if (null != br) {
            try {
                br.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    return categoryFilterStrs;
}
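The only Hadoop-specific part of the method above is opening the `FSDataInputStream`; the parsing itself is plain Java. As a sketch (not from the original post; the class and method names here are hypothetical), the same line-parsing logic can be exercised against an in-memory reader, so it can be tried without a Hadoop cluster:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class CategoryParseDemo {
    static final String DELIMITER = "\t";
    static final String INNER_DELIMITER = ",";

    // Same logic as getRepeatRecCategoryStr: keep the first tab-separated
    // field of each line and join the fields with commas. Any Reader works;
    // on a cluster the BufferedReader would wrap fs.open(new Path(filePath)).
    static String parseCategories(BufferedReader br) throws IOException {
        StringBuilder sb = new StringBuilder();
        String line;
        while (null != (line = br.readLine())) {
            String[] strs = line.split(DELIMITER);
            sb.append(strs[0]).append(INNER_DELIMITER);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Sample lines in the file's assumed format: category ID, tab, extra fields.
        String sample = "971895\tfoo\n971902\tbar\n971922\tbaz\n";
        BufferedReader br = new BufferedReader(new StringReader(sample));
        System.out.println(parseCategories(br));  // 971895,971902,971922,
    }
}
```

Note that, like the original method, this leaves a trailing comma on the result; a `StringBuilder` is used instead of repeated `String` concatenation, which avoids building a new string on every loop iteration.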