When processing a small file, you can simply load its entire contents into memory and then work on them:
/* Read the entire contents of the file into memory */
try {
    List<String> list = FileUtils.readLines(new File(fileDerec), Charsets.UTF_8);
    System.out.println(list.get(0));
} catch (IOException e) {
    e.printStackTrace();
}
However, when working with large files (GB-scale), loading them entirely into memory can cause an out-of-memory error. Such files are therefore usually handled in one of the following ways:
/* Stream the file with java.io and java.util.Scanner */
FileInputStream fi = null;
Scanner sc = null;
try {
    fi = new FileInputStream(new File(fileDerec));
    sc = new Scanner(fi, "UTF-8");
    while (sc.hasNextLine()) {
        System.out.println(sc.nextLine());
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    if (fi != null) {
        try {
            fi.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    if (sc != null) {
        sc.close();
    }
}
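Since Java 7, the same line-by-line read can be written more compactly with try-with-resources, which closes the streams automatically. A minimal sketch; the class name and the path `data.txt` are placeholders, not from the original:

```java
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class BufferedReadDemo {
    // Read only the first line; BufferedReader keeps just a small
    // buffer in memory, never the whole file.
    static String firstLine(String path) throws IOException {
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(new FileInputStream(path), StandardCharsets.UTF_8))) {
            return br.readLine();
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical path; replace with the real file's location.
        System.out.println(firstLine("data.txt"));
    }
}
```

The try-with-resources form makes the verbose finally-block cleanup above unnecessary.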
You can also use a third-party library:
/* Stream processing with Apache commons-io's LineIterator */
LineIterator li = null;
try {
    li = FileUtils.lineIterator(new File(fileDerec), "UTF-8");
    while (li.hasNext()) {
        System.out.println(li.next());
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    LineIterator.closeQuietly(li);
}
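Since Java 8, the standard library offers a similar lazy iterator without any third-party dependency: java.nio.file.Files.lines returns a Stream<String> that reads lines on demand. A minimal sketch under that assumption; the class name and the sample file `data.txt` are illustrative, not from the original:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.stream.Stream;

public class LargeFileDemo {
    // Count lines lazily: Files.lines streams the file, so the whole
    // contents never sit in memory at once.
    static long countLines(Path path) throws IOException {
        try (Stream<String> lines = Files.lines(path, StandardCharsets.UTF_8)) {
            return lines.count();
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical sample file; replace with the real large file's path.
        Path path = Paths.get("data.txt");
        Files.write(path, Arrays.asList("line 1", "line 2", "line 3"), StandardCharsets.UTF_8);
        System.out.println(countLines(path)); // prints 3
    }
}
```

Because the stream is closed in try-with-resources, the underlying file handle is released even if processing throws.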