There is a TXT file that contains 100,000 records in the following format:
Column 1 column 2 column 3 column 4 column 5
A 00003131 0 0 adductive#1 adducting#1 adducent#1
A 00003356 0 0 nascent#1
A 00003553 0 0 emerging#2 emergent#2
A 00003700 0.25 0 dissilient#1
........................ and so on, for roughly 100,000 records ........
The requirement is to import these records into a database. The structure of the data table is:
word_id: auto increment
word: one TXT record such as "adductive#1 adducting#1 adducent#1" must be converted into 3 SQL records
value: column 3 minus column 4; if the result is 0, the record is skipped and not written to the table
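The rules above can be sketched as a small parsing function. This is an illustrative sketch, not the article's final script; the function name `parseLine` and the assumption that the five columns are space-separated (with column 5 holding all remaining words) are mine:

```php
<?php
// Illustrative: parse one TXT line into (word, senti_value) rows.
// Assumed column layout: type, id, score_a, score_b, words.
function parseLine(string $li): array {
    $arr = explode(" ", trim($li), 5);  // limit 5 so column 5 keeps all its words
    if (count($arr) < 5) {
        return [];                      // empty or malformed line
    }
    $senti_value = $arr[2] - $arr[3];   // value = column 3 - column 4
    if ($senti_value == 0) {
        return [];                      // a value of 0 means the record is skipped
    }
    $rows = [];
    foreach (explode(" ", $arr[4]) as $m) { // "adductive#1 adducting#1 ..." -> 3 rows
        $nn = explode("#", $m);
        $rows[] = [$nn[0], $senti_value];
    }
    return $rows;
}

print_r(parseLine("A 00003700 0.25 0 dissilient#1"));
```

A record whose value works out to 0, such as the `A 00003131 0 0 ...` sample above, yields no rows at all.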
The code is as follows:
<?php
ini_set('memory_limit', '-1'); // lift the memory limit before reading, or the script will error out
$file  = 'words.txt';          // TXT source file with the ~100,000 records
$lines = file_get_contents($file);
$line  = explode("\n", $lines);
$i   = 0;
$sql = "INSERT INTO words_sentiment (word, senti_type, senti_value, word_type) VALUES ";
foreach ($line as $key => $li) {
    $arr = explode(" ", trim($li), 5); // limit 5 so column 5 keeps all of its words
    if (count($arr) < 5) {
        continue; // skip empty or malformed lines
    }
    $senti_value = $arr[2] - $arr[3];
    if ($senti_value != 0) {
        if ($i >= 20000 && $i < 25000) { // import in batches to avoid failure
            // "adductive#1 adducting#1 adducent#1" becomes 3 SQL records
            $mm = explode(" ", $arr[4]);
            foreach ($mm as $m) {
                $nn   = explode("#", $m);
                $word = $nn[0];
                // Note: $word may contain a single quote (e.g. Jack's),
                // so enclose it in (escaped) double quotation marks instead
                $sql .= "(\"$word\", 1, $senti_value, 2),";
            }
        }
        $i++;
    }
}
echo $i;
$sql = substr($sql, 0, -1); // remove the trailing comma
echo $sql;
// Write this batch out and import ~5,000 rows at a time (about 40 seconds each);
// importing too many at once will exceed max_execution_time and fail
file_put_contents('20000-25000.txt', $sql);
?>
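The hard-coded window `$i >= 20000 && $i < 25000` in the script above has to be edited and re-run for every batch. A more general sketch chunks the rows and emits one INSERT statement per batch; the function name `buildBatchSql` and its signature are illustrative, not part of the original script:

```php
<?php
// Illustrative: split (word, value) rows into batches and build one
// multi-row INSERT statement per batch, instead of hard-coded index windows.
function buildBatchSql(array $rows, int $batchSize = 5000): array {
    $stmts = [];
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        $values = [];
        foreach ($chunk as [$word, $value]) {
            $word = addslashes($word); // guard against quotes such as Jack's
            $values[] = "(\"$word\", 1, $value, 2)";
        }
        $stmts[] = "INSERT INTO words_sentiment (word, senti_type, senti_value, word_type) VALUES "
                 . implode(",", $values) . ";";
    }
    return $stmts;
}

// With a batch size of 1, two rows yield two separate statements.
$stmts = buildBatchSql([["nascent", 0.25], ["emerging", -0.5]], 1);
echo count($stmts) . "\n";
```

Each element of the returned array can then be written to its own file (or executed directly), so no statement grows beyond the chosen batch size.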
1. When importing massive data, pay attention to PHP's limits; you can raise them temporarily in the script, otherwise you will get an error such as:
Allowed memory size of 33554432 bytes exhausted (tried to allocate bytes)
2. PHP reads and writes the TXT file with:
file_get_contents()
file_put_contents()
3. For a mass import, it is best to import in batches; the probability of failure is lower.
4. Before the massive import, test the script many times, for example on 100 records, and only then run it for real.
5. During the import, if PHP's memory_limit is not high enough, the program still will not run.
(It is recommended to raise memory_limit by modifying php.ini rather than with a temporary ini_set() statement.)
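For a one-off import like this, the relevant php.ini directives could look as follows; the values shown are illustrative, not recommendations for a production server:

```ini
; illustrative php.ini settings for a large one-off import
memory_limit = 512M        ; enough headroom to hold the whole TXT file in memory
max_execution_time = 300   ; give each batch time to finish
```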