How to read large files efficiently with Java

Read in memory
The standard way to read the lines of a file is to read them all into memory. Both Guava and Apache Commons IO provide a one-liner for this:
Files.readLines(new File(path), Charsets.UTF_8);
FileUtils.readLines(new File(path));
The problem with this approach is that every line of the file is kept in memory, which quickly causes the program to throw an OutOfMemoryError once the file is large enough.
For example, reading a file of about 1 GB:
public void readFiles() throws IOException {
    String path = "...";
    Files.readLines(new File(path), Charsets.UTF_8);
}
This approach uses only a small amount of memory at first. However, once the whole file has been read into memory, we can see that it occupies a great deal (about 2 GB):
[main] INFO org.baeldung.java.CoreJavaIoUnitTest - Total Memory: 2666 Mb
[main] INFO org.baeldung.java.CoreJavaIoUnitTest - Free Memory: 490 Mb
This means the process consumes roughly 2.1 GB of memory. The reason is simple: every line of the file is now stored in memory, and keeping the entire contents there quickly exhausts the available memory, no matter how much is actually available.
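The figures in the log are the JVM's total and free memory after the read; the consumed amount is roughly their difference. As a minimal sketch of how such numbers can be captured (the class name and the plain println output here are illustrative assumptions, not the actual test code):

import java.io.File;
import java.io.IOException;

import com.google.common.base.Charsets;
import com.google.common.io.Files;

public class MemorySnapshot {

    public static void main(String[] args) throws IOException {
        String path = "...";  // placeholder: path to a large file

        Files.readLines(new File(path), Charsets.UTF_8);

        // ask the JVM for its current memory figures, in megabytes
        long totalMb = Runtime.getRuntime().totalMemory() / (1024 * 1024);
        long freeMb = Runtime.getRuntime().freeMemory() / (1024 * 1024);
        System.out.println("Total Memory: " + totalMb + " Mb");
        System.out.println("Free Memory: " + freeMb + " Mb");
    }
}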
Moreover, we usually don't need all of a file's lines in memory at once. Instead, we just need to iterate over them, process each line, and discard it once we're done. So that's exactly what we'll do: iterate through the lines instead of holding all of them in memory.
File stream
Now let's look at an alternative: we use java.util.Scanner to scan through the file's contents and read them continuously, one line at a time:
FileInputStream inputStream = null;
Scanner sc = null;
try {
    inputStream = new FileInputStream(path);
    sc = new Scanner(inputStream, "UTF-8");
    while (sc.hasNextLine()) {
        String line = sc.nextLine();
        // System.out.println(line);
    }
    // note that Scanner suppresses exceptions
    if (sc.ioException() != null) {
        throw sc.ioException();
    }
} finally {
    if (inputStream != null) {
        inputStream.close();
    }
    if (sc != null) {
        sc.close();
    }
}
This solution iterates through all the lines in the file, processing each one without keeping a reference to it, so no lines are stored in memory. We can see that it consumes only about 150 MB:
[main] INFO org.baeldung.java.CoreJavaIoUnitTest - Total Memory: 763 Mb
[main] INFO org.baeldung.java.CoreJavaIoUnitTest - Free Memory: 605 Mb
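For comparison, plain java.io achieves the same constant-memory, line-by-line loop without Scanner. A minimal sketch using BufferedReader, assuming Java 7+ so try-with-resources can handle the closing:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class BufferedRead {

    public static void main(String[] args) throws IOException {
        String path = "...";  // placeholder: path to a large file

        // try-with-resources closes the reader even if processing throws
        try (BufferedReader br = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = br.readLine()) != null) {
                // do something with line
            }
        }
    }
}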
Apache Commons IO stream
You can achieve the same thing with a Commons IO stream, using the custom LineIterator class the library provides:
"UTF-8");try { while (it.hasNext()) { String line = it.nextLine(); // do something with line finally { LineIterator.closeQuietly(it);}
It likewise consumes very little memory, about 150 MB:
[main] INFO o.b.java.CoreJavaIoIntegrationTest - Total Memory: 752 Mb
[main] INFO o.b.java.CoreJavaIoIntegrationTest - Free Memory: 564 Mb
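If Java 8 or later is available, the standard NIO API gives the same lazy, line-by-line streaming with less boilerplate. A minimal sketch using java.nio.file.Files.lines, shown here only as one more standard option (not part of the measurements above); the returned Stream must be closed, hence the try-with-resources:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class StreamRead {

    public static void main(String[] args) throws IOException {
        String path = "...";  // placeholder: path to a large file

        // Files.lines reads lazily; closing the stream closes the underlying file
        try (Stream<String> lines = Files.lines(Paths.get(path), StandardCharsets.UTF_8)) {
            lines.forEach(line -> {
                // do something with line
            });
        }
    }
}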