When reading a CSV file with millions of lines, loading the entire contents into memory at once consumes a large amount of memory and can eventually crash the application. Instead, we recommend reading the data in batches and saving each batch to the database. Below is a simple test method that can be adapted to specific needs. For bulk operations on large amounts of data (insert, update, delete, and so on), we recommend using JDBC directly.
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ReadCsv {

    public static void main(String[] args) {
        File csv = new File("D:\\test.csv"); // CSV file path
        try (BufferedReader br = new BufferedReader(new FileReader(csv))) {
            // Keep reading batches until getList signals that the file is exhausted
            while (getList(br)) {
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static boolean getList(BufferedReader br) {
        List<String[]> allString = new ArrayList<>();
        boolean status = false;
        String everyLine;
        try {
            int index = 0;
            while ((everyLine = br.readLine()) != null) { // read one line into the variable
                String[] strList = everyLine.split(",");
                System.out.println(everyLine);
                allString.add(strList);
                index++;
                if (index == 3) { // batch size 3 for this test; adjust as needed
                    status = true;
                    break;
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println("Rows in this batch: " + allString.size());
        // Read the values
        for (int i = 0; i < allString.size(); i++) {
            System.out.println(allString.get(i)[0]);
        }
        // Do the insert operation here, saving the batch to the database...
        return status;
    }
}
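To see the batching logic in isolation, here is a minimal, self-contained sketch; the class and method names (`BatchReadDemo`, `readBatch`) are illustrative, and an in-memory `StringReader` stands in for the real CSV file so the example runs without one. Each call returns at most one batch, and the caller loops until an empty batch signals end of file:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class BatchReadDemo {

    // Read up to batchSize CSV rows from the reader; an empty result means EOF.
    public static List<String[]> readBatch(BufferedReader br, int batchSize) throws IOException {
        List<String[]> batch = new ArrayList<>();
        String line;
        while (batch.size() < batchSize && (line = br.readLine()) != null) {
            batch.add(line.split(","));
        }
        return batch;
    }

    public static void main(String[] args) throws IOException {
        // Five rows stand in for the large CSV file
        String data = "a,1\nb,2\nc,3\nd,4\ne,5";
        BufferedReader br = new BufferedReader(new StringReader(data));
        List<String[]> batch;
        while (!(batch = readBatch(br, 3)).isEmpty()) {
            // A real application would save the batch to the database here,
            // e.g. with a batched JDBC PreparedStatement (addBatch/executeBatch)
            System.out.println("batch of " + batch.size());
        }
        // prints "batch of 3" then "batch of 2"
    }
}
```

Returning the batch itself (rather than a boolean status) also avoids the edge case where the last, short batch and end-of-file need to be distinguished.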