If you need to process thousands of Eloquent results, you can use the chunk method. The chunk method retrieves one "chunk" of models at a time and passes it to a given closure for processing. Using chunk can significantly reduce memory consumption when working with very large data sets:
Flight::chunk(100, function ($flights) {
    foreach ($flights as $flight) {
        //
    }
});
$all = Flight::chunk(50000, function ($flights) {
    foreach ($flights as $flight) {
        $GLOBALS['something'][] = $flight['id'];
    }
});
var_dump($GLOBALS['something']);
exit;
This code processes 100 records, and when that batch is done it continues with the next 100 records, and so on. In other words, each iteration operates on one chunk of data rather than on the entire table.
It is important to note that when using chunk with a filtered (where) condition, if the update inside the closure changes that same condition, some records will silently be skipped. Consider this code:
User::where('approved', 0)->chunk(100, function ($users) {
    foreach ($users as $user) {
        $user->update(['approved' => 1]);
    }
});
Running this code produces no error: the where clause filters for users whose approved column is 0, and the closure then sets approved to 1. But after the first chunk is updated, the next chunk is selected against the already-modified result set, while the page counter is still incremented by 1. Each query therefore paginates over a shrinking set of matching rows, and after execution only half of the matching records have actually been updated.
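To make the skipping concrete, here is a small Python simulation (not Laravel code; the function and row structure are invented for illustration) of offset-based chunking combined with an update that invalidates the filter:

```python
def chunk_update(rows, count):
    """Mimic offset-based chunking: re-run the filtered, paginated
    query each iteration, and set approved=1 on every row in the chunk."""
    page = 1
    while True:
        matching = [r for r in rows if r["approved"] == 0]   # WHERE approved = 0
        chunk = matching[(page - 1) * count : page * count]  # forPage(page, count)
        if not chunk:
            break
        for row in chunk:
            row["approved"] = 1  # the update that removes rows from the filter
        page += 1

rows = [{"id": i, "approved": 0} for i in range(1, 401)]  # 400 matching records
chunk_update(rows, 100)
updated = [r["id"] for r in rows if r["approved"] == 1]
print(len(updated))  # 200 -- only half the records were touched
print(updated[99], updated[100])  # 100, 201 -- ids 101-200 were skipped
```

The second chunk jumps straight from id 100 to id 201, because the first 100 updated rows dropped out of the filter and shifted every later page.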
To see why, let's look at the underlying implementation of chunk. Taking the code above as an example, suppose there are 400 matching records in total, processed in chunks of 100:
page = 1: at the start, page is 1, so records 1-100 are selected and processed (their approved value becomes 1);
page = 2: by now the first 100 records all have approved = 1, so they no longer match the filter and the query starts from what was record 101. But with page = 2 the query also skips the first 100 of the remaining rows, so it actually processes what were originally records 201-300;
page = 3: only 200 matching rows remain (the original 101-200 and 301-400), the offset of 200 returns nothing, and the loop ends with those rows never touched.
public function chunk($count, callable $callback)
{
    $results = $this->forPage($page = 1, $count)->get();

    while (count($results) > 0) {
        // On each chunk result set, we will pass them to the callback and
        // then let the developer take care of everything within the callback,
        // which allows us to keep the memory low for spinning through large
        // result sets for working.
        if (call_user_func($callback, $results) === false) {
            return false;
        }

        $page++;

        $results = $this->forPage($page, $count)->get();
    }

    return true;
}
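By contrast, paginating with an id cursor ("where id greater than the last seen id") instead of an offset is immune to this problem, because rows dropping out of the filter cannot shift later pages; Laravel ships this strategy as chunkById. A minimal Python sketch of the idea, using the same invented row structure as above:

```python
def chunk_by_id_update(rows, count):
    """Mimic id-cursor chunking: each query asks for matching rows
    whose id is greater than the last id seen, never using an offset."""
    last_id = 0
    while True:
        # WHERE approved = 0 AND id > last_id ORDER BY id LIMIT count
        chunk = [r for r in rows if r["approved"] == 0 and r["id"] > last_id][:count]
        if not chunk:
            break
        for row in chunk:
            row["approved"] = 1
        last_id = chunk[-1]["id"]  # advance the cursor, not a page number

rows = [{"id": i, "approved": 0} for i in range(1, 401)]
chunk_by_id_update(rows, 100)
print(sum(r["approved"] for r in rows))  # 400 -- every record updated
```

Because the cursor only moves forward past rows that were actually processed, all 400 records are updated regardless of what the closure does to the filter column.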