When dealing with large files, frequent read and write operations through an ordinary FileInputStream, FileOutputStream, or RandomAccessFile are slow, because every operation has to move data between the program and the file on disk. The following comparative experiment illustrates the difference.
package test;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Test {
    public static void main(String[] args) {
        // 1. Plain FileInputStream: one read call per byte.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = fis.read()) >= 0) {
                    sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 2. BufferedInputStream: reads go through an in-memory buffer.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            BufferedInputStream bis = new BufferedInputStream(fis);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = bis.read()) >= 0) {
                    sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 3. MappedByteBuffer: the whole file is mapped into memory.
        MappedByteBuffer buffer = null;
        try {
            buffer = new RandomAccessFile("/home/tobacco/test/res.txt", "rw")
                    .getChannel()
                    .map(FileChannel.MapMode.READ_WRITE, 0, 1253244);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            for (int i = 0; i < 1253244; i++) {
                n = 0x000000ff & buffer.get(i);
                sum += n;
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The test file is 1253244 bytes in size. Test results:
sum:220152087 time:1464
sum:220152087 time:72
sum:220152087 time:25
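As a side note, the benchmark above hardcodes the mapped length (1253244). A sketch that maps a whole file without hardcoding its size, by asking the channel for it (the file name here is arbitrary):

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MapWholeFile {
    // Map an entire file read-only and sum its bytes, using channel.size()
    // instead of a hardcoded length.
    static int sumBytes(String path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r")) {
            FileChannel ch = raf.getChannel();
            long size = ch.size();
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, size);
            int sum = 0;
            for (int i = 0; i < size; i++) {
                sum += 0x000000ff & buf.get(i);
            }
            return sum;
        }
    }

    public static void main(String[] args) throws IOException {
        // Write a small file with known contents, then sum it through the mapping.
        try (RandomAccessFile raf = new RandomAccessFile("probe.bin", "rw")) {
            raf.write(new byte[] {1, 2, 3, 4});
            raf.setLength(4); // truncate any leftover bytes from a previous run
        }
        System.out.println("sum:" + sumBytes("probe.bin")); // 1+2+3+4 = 10
    }
}
```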
The identical sums show that all three approaches read the data correctly. Next, remove the data-processing step (the summation) to measure pure read time:
package test;

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class Test {
    public static void main(String[] args) {
        // 1. Plain FileInputStream, summation removed.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = fis.read()) >= 0) {
                    // sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 2. BufferedInputStream, summation removed.
        try {
            FileInputStream fis = new FileInputStream("/home/tobacco/test/res.txt");
            BufferedInputStream bis = new BufferedInputStream(fis);
            int sum = 0;
            int n;
            long t1 = System.currentTimeMillis();
            try {
                while ((n = bis.read()) >= 0) {
                    // sum += n;
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }

        // 3. MappedByteBuffer, buffer access removed.
        MappedByteBuffer buffer = null;
        try {
            buffer = new RandomAccessFile("/home/tobacco/test/res.txt", "rw")
                    .getChannel()
                    .map(FileChannel.MapMode.READ_WRITE, 0, 1253244);
            int sum = 0;
            int n = 0;
            long t1 = System.currentTimeMillis();
            for (int i = 0; i < 1253244; i++) {
                // n = 0x000000ff & buffer.get(i);
                sum += n;
            }
            long t = System.currentTimeMillis() - t1;
            System.out.println("sum:" + sum + " time:" + t);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Test results:
sum:0 time:1458
sum:0 time:67
sum:0 time:8
This shows that once part or all of a file is mapped into memory, reads and writes become much faster.
The reason is that a memory-mapped file maps a region of the on-disk file to a contiguous area of memory. The program then treats that area like a byte array, reading and writing memory directly, and the operating system later writes the modified region back to the file. This eliminates the frequent intermediate transfers between the program and external storage that stream-based I/O performs, greatly reducing read time.
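The benchmark only reads through the mapping, but the same mechanism supports writes. A minimal sketch (the file name and the 16-byte length are arbitrary choices for this demo): writes go straight into the mapped region, and force() asks the OS to flush the region back to the file.

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MapWriteDemo {
    public static void main(String[] args) throws IOException {
        // Open (or create) a file and map its first 16 bytes read-write.
        try (RandomAccessFile raf = new RandomAccessFile("demo.bin", "rw")) {
            FileChannel channel = raf.getChannel();
            MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, 16);

            // Writes modify the mapped memory region directly.
            for (int i = 0; i < 16; i++) {
                buf.put(i, (byte) i);
            }
            // force() requests that the changes be written back to the file.
            buf.force();

            // Read back through the same mapping.
            int sum = 0;
            for (int i = 0; i < 16; i++) {
                sum += 0x000000ff & buf.get(i);
            }
            System.out.println("sum:" + sum); // 0+1+...+15 = 120
        }
    }
}
```

Note that map() extends the file to the mapped length if it is shorter, which is why this works even on a freshly created empty file.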
That concludes this walkthrough of processing large files in Java with a memory map; I hope it serves as a useful reference.