Loading a large file into MySQL
At first I wrote a Java program that read the file and inserted the rows into the database one at a time. Once it was running, the throughput turned out to be far too low: for this 4 GB text file with more than 40 million lines, I estimated the whole import would take about 30 days.
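For context, the row-by-row approach boils down to executing one INSERT per line, i.e. roughly 47 million statements like the following (table and column names are taken from the import command used later in this post; the literal values are placeholders):

```sql
-- One client-server round trip per line, repeated ~47 million times:
INSERT INTO detail_all (subobject, predicate, value)
VALUES ('XXXX', 'XXXXXXX', 'XXXXXXXXXXXXXXXXX');
```

Each statement pays network, parse, and commit overhead, which is why a bulk file import is orders of magnitude faster.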
Data format:
Each row -> XXXX|XXXXXXX|XXXXXXXXXXXXXXXXX
The data is strictly structured, so MySQL can import it directly from the file with:
LOAD DATA INFILE
The command used is as follows:
LOAD DATA INFILE '/home/lenovo/documentation/NLPCC2015/auxiliary-data/NLPCC-2015.Auxiliary.KB.Chinese'
IGNORE INTO TABLE detail_all
CHARACTER SET utf8
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(subobject, predicate, value);
Each line is delimited by \n; within a line the text is split on '|' into three pieces, which are stored into the three columns respectively.
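For reference, a minimal target table matching this import might look like the following; only the table name, column names, and character set come from the command above, while the column types are assumptions:

```sql
-- Hypothetical schema for detail_all; adjust types to the real data.
CREATE TABLE detail_all (
    subobject VARCHAR(255),
    predicate VARCHAR(255),
    value     TEXT
) CHARACTER SET utf8;
```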
47943428 rows imported successfully!
After the import, a trailing \r was still attached to the last field of every row (the file evidently uses Windows \r\n line endings). It can be stripped as follows:
UPDATE detail_all SET value = TRIM(TRAILING '\r' FROM value);
TRIM removes the given characters from the start or end of a string; here TRAILING restricts it to the end.
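Alternatively, since the trailing \r shows the file has Windows-style line endings, declaring them in the import avoids the clean-up UPDATE entirely; this is the same statement as before with only the line terminator changed:

```sql
LOAD DATA INFILE '/home/lenovo/documentation/NLPCC2015/auxiliary-data/NLPCC-2015.Auxiliary.KB.Chinese'
IGNORE INTO TABLE detail_all
CHARACTER SET utf8
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\r\n'   -- consume the \r together with the \n
(subobject, predicate, value);
```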
Running this UPDATE produced an error:
Error Code: 2013. Lost connection to MySQL server during query (600.746 sec)
A solution is described here: http://www.quora.com/How-can-I-solve-the-Error-Code-2013-Lost-connection-to-MySQL-server-during-query-600-135-sec-error-message
The fix is to increase the client's query timeout (in MySQL Workbench: Edit > Preferences > SQL Editor > "DBMS connection read timeout").
The following error occurs:
Error Code: 1205. Lock wait timeout exceeded; try restarting transaction
Here the rollback of the interrupted UPDATE was still in progress and still held its row locks; wait until the rollback finishes before retrying the statement.
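To check whether the rollback (or any other transaction holding the locks) is still running, the standard commands are:

```sql
-- List active InnoDB transactions, including those still rolling back:
SHOW ENGINE INNODB STATUS;
-- List connected threads and what they are currently executing:
SHOW PROCESSLIST;
-- How long a statement waits for a row lock before raising error 1205:
SHOW VARIABLES LIKE 'innodb_lock_wait_timeout';
```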