When the member table is small, the whole job can be done in a single run:
$q = "SELECT * FROM member";
$r = $obj_db->simpleQuery($q);
while ($a = $obj_db->fetchRow($r, DB_FETCHMODE_ASSOC)) {
    $id   = $a['id'];
    $mccd = $a['cca'] + $a['ccb'];
    $query = "INSERT INTO mingxi (mid, mccd, mtime) VALUES ('$id', '$mccd', '$time')";
    $obj_db->simpleQuery($query);
}
However, once member reaches millions of rows, execution times out with:
Fatal error: Maximum execution time of seconds exceeded in D:
So how can this be processed in true batches: finish the first 1000 rows, then process rows 1001 to 2000, then 2001 to 3000, and so on until everything is done?
$q = "SELECT * FROM member";
$r = $obj_db->simpleQuery($q);
while ($a = $obj_db->fetchRow($r, DB_FETCHMODE_ASSOC)) {
    $i++;
    $ep = 1000; // the idea: process the first 1000, then the next 1000, until the end -- but how?
    $id   = $a['id'];
    $mccd = $a['cca'] + $a['ccb'];
    $query = "INSERT INTO mingxi (mid, mccd, mtime) VALUES ('$id', '$mccd', '$time')";
    $obj_db->simpleQuery($query);
}
Please help with detailed code, thank you! Since this runs hidden in the background, please don't suggest paging through it by hand.
Reply to discussion (solution)
You can do it in one statement: INSERT INTO mingxi (mid, mccd, mtime) SELECT id, cca + ccb, '$time' FROM member
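The single-statement approach above can be sketched as follows. This assumes $obj_db is the same PEAR-DB-style wrapper used in the question and that $time holds the timestamp to record; MySQL then copies and sums the columns server-side, so PHP never iterates the million rows at all.

```php
<?php
// A minimal sketch, assuming $obj_db is already connected (PEAR-DB-style
// wrapper from the question) and $time is the timestamp to store.
$time = time();

// One INSERT ... SELECT replaces the entire per-row loop: the database
// reads member, computes cca + ccb, and writes mingxi internally.
$sql = "INSERT INTO mingxi (mid, mccd, mtime)
        SELECT id, cca + ccb, '$time' FROM member";
$obj_db->simpleQuery($sql);
```

Because no rows ever travel to PHP, this usually finishes in a fraction of the time the loop needs and is immune to PHP's max_execution_time for all but the largest tables.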
I recently processed about 560,000 rows of data, with a read, a calculation, and an insert for each row.
I used Ajax to request a PHP page that processes a batch of not-yet-processed rows, writes a "processed" flag back to the database, and returns to JS; when the Ajax callback receives the response (which can also be shown on the page), it calls itself and issues the next request.
The 560,000 rows ran for roughly 100,000 seconds, and neither the page nor the program died.
It's slow, but it fits my needs (I have to analyze every row) and it never times out.
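The server side of that pattern can be sketched as below. The file name process.php, the `processed` flag column on member, and the batch size of 1000 are all assumptions for illustration, not part of the original post; $obj_db is the same wrapper as in the question.

```php
<?php
// process.php -- a minimal sketch of one batch in the Ajax-driven scheme.
// Assumptions: member has a `processed` flag column (0 = pending),
// batch size 1000, and $obj_db is the PEAR-DB-style wrapper from the question.
$batch = 1000;
$r = $obj_db->simpleQuery(
    "SELECT * FROM member WHERE processed = 0 LIMIT $batch");

$done = 0;
while ($a = $obj_db->fetchRow($r, DB_FETCHMODE_ASSOC)) {
    $id   = $a['id'];
    $mccd = $a['cca'] + $a['ccb'];
    $time = time();
    $obj_db->simpleQuery(
        "INSERT INTO mingxi (mid, mccd, mtime) VALUES ('$id', '$mccd', '$time')");
    // mark the row so the next request skips it
    $obj_db->simpleQuery("UPDATE member SET processed = 1 WHERE id = '$id'");
    $done++;
}

// report how many rows this batch handled; 0 means nothing is left
echo $done;
```

On the page, the Ajax success callback reads the returned count: if it is non-zero it issues the request again (each request is a fresh PHP process, so the 30-second limit never accumulates); if it is zero, the whole table has been processed and the chain stops.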
This method must not fire the requests asynchronously in parallel: a later batch's response may come back before an earlier one, unless you use a strict query-then-submit sequence. Otherwise you risk losing data.
Of course, if you don't mind leaving it running for a day or two, that's not much of a problem.
Why would a response come back first? I don't quite see it; can you give an example to illustrate?
$offs = isset($_GET['offs']) ? $_GET['offs'] : 0;
$sql = "SELECT * FROM member LIMIT $offs, 1000";
// do what you have to do
header("Location: $_SERVER[PHP_SELF]?offs=" . ($offs + 1000));
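Fleshed out, the self-redirecting snippet above could look like this. The stop condition (a batch returning fewer rows than the batch size) is an addition for illustration, not part of the original reply; $obj_db is again the wrapper from the question.

```php
<?php
// Self-redirecting batch sketch. Each request handles one LIMIT slice,
// then redirects to itself with the next offset; a short batch means
// the table is exhausted and the chain ends.
$batch = 1000;
$offs  = isset($_GET['offs']) ? (int)$_GET['offs'] : 0;

$r = $obj_db->simpleQuery("SELECT * FROM member LIMIT $offs, $batch");
$rows = 0;
while ($a = $obj_db->fetchRow($r, DB_FETCHMODE_ASSOC)) {
    // ... do what you have to do with $a ...
    $rows++;
}

if ($rows == $batch) {
    // a full batch: more rows may remain, restart at the next slice
    header('Location: ' . $_SERVER['PHP_SELF'] . '?offs=' . ($offs + $batch));
    exit;
}
echo 'All done';
```

Note that header() only works before any output is sent, so this script must not echo anything before the redirect; each redirect starts a brand-new PHP request with its own execution-time budget.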
Where does this code go? Doesn't it mean the page keeps refreshing itself? There is other code on the page that must run only once. Thank you.
I have the same question — is there a solution?