The Laravel queue service provides a unified API across a variety of different background queue backends. This article analyses the Laravel source code to explain why the same queued job can end up being executed repeatedly. The walkthrough is quite detailed and illustrated with sample code, so it should make a handy reference for anyone who needs it. Let's take a look below.
Objective
The Laravel queue service provides a unified API across a variety of different background queue backends. Queues allow you to defer time-consuming tasks, such as sending an email, which effectively shortens the response time of a request.
Discovering the problem
Laravel's Redis-backed queue is a very powerful feature, but a problem discovered recently is that the same job was being executed multiple times.
Here is the cause, stated up front:
In Laravel, if a queued job runs for more than 60 seconds, it is considered expired and is pushed back onto the queue, which results in the same job being executed repeatedly.
The job in question pushes content to users: based on the payload it loads the target users, iterates over them, and sends the content to each one through a backend HTTP interface. With, say, 10,000 users, and an interface that is not particularly fast, the execution time is bound to exceed 60 seconds, so the job is pushed back onto the queue. Worse, as long as the still-running job has not finished within the next 60 seconds, it is pushed back yet again, so the same job ends up being executed over and over. A rough sketch of such a job is shown below.
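For illustration only, here is a minimal sketch of that kind of job. The class name, constructor arguments and the push.client service are hypothetical, not the actual code behind this article:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class PushContentToUsers implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    protected $userIds;
    protected $content;

    public function __construct(array $userIds, $content)
    {
        $this->userIds = $userIds;
        $this->content = $content;
    }

    public function handle()
    {
        foreach ($this->userIds as $userId) {
            // One HTTP request per user against the backend push interface.
            // At ~10 ms per request, 10,000 users already need ~100 seconds,
            // well past the 60-second limit discussed below.
            app('push.client')->send($userId, $this->content);
        }
    }
}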
Let's dig into the Laravel source code to find the culprit.
Source file: vendor/laravel/framework/src/Illuminate/Queue/RedisQueue.php
/**
 * The expiration time of a job.
 *
 * @var int|null
 */
protected $expire = 60;
This $expire property defaults to 60 seconds: Laravel assumes that a reserved job should finish within 60 seconds of being picked up.
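In a stock installation the value actually comes from the queue connection settings in config/queue.php. A typical Laravel 5.x Redis connection looks roughly like the snippet below (newer releases renamed the option to retry_after, so treat the exact key as version-dependent):

// config/queue.php
'connections' => [

    'redis' => [
        'driver'     => 'redis',
        'connection' => 'default',
        'queue'      => 'default',
        // Seconds a reserved job may run before it is treated as expired
        // and pushed back onto the queue.
        'expire'     => 60,
    ],

],

With that in mind, let's look at the method that pops a job from the queue: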
public function pop($queue = null)
{
    $original = $queue ?: $this->default;

    $queue = $this->getQueue($queue);

    $this->migrateExpiredJobs($queue.':delayed', $queue);

    if (! is_null($this->expire)) {
        $this->migrateExpiredJobs($queue.':reserved', $queue);
    }

    list($job, $reserved) = $this->getConnection()->eval(
        LuaScripts::pop(), 2, $queue, $queue.':reserved', $this->getTime() + $this->expire
    );

    if ($reserved) {
        return new RedisJob($this->container, $this, $job, $reserved, $original);
    }
}
Popping a job involves several steps, because jobs that failed or timed out are kept in separate collections so they can be retried:
1. Jobs that were put into the delayed collection after failing are RPUSHed back onto the queue currently being processed.
2. Jobs that timed out are RPUSHed from the reserved collection back onto the queue currently being processed.
3. A job is then popped from the queue, and a copy of it is placed in the reserved sorted set.
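To make these keys concrete, here is a small sketch, assuming the default 'queues:' key prefix and a queue named default, of how the three collections could be inspected through Laravel's Redis facade:

use Illuminate\Support\Facades\Redis;

// Jobs waiting to be executed (a plain Redis list).
$pending = Redis::lrange('queues:default', 0, -1);

// Jobs currently reserved by a worker (a sorted set whose score is the
// timestamp at which each job is considered expired).
$reservedCount = Redis::zcard('queues:default:reserved');

// Jobs that were released or delayed and are waiting to be retried.
$delayedCount = Redis::zcard('queues:default:delayed');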
The Redis EVAL command is used to carry out these steps, with the help of several Lua scripts.
The script that pops a job from the queue to be executed:
local job = redis.call('lpop', KEYS[1])
local reserved = false

if(job ~= false) then
    reserved = cjson.decode(job)
    reserved['attempts'] = reserved['attempts'] + 1
    reserved = cjson.encode(reserved)
    redis.call('zadd', KEYS[2], ARGV[1], reserved)
end

return {job, reserved}
You can see that while Laravel pops a job from the queue, it also has Redis place a copy of that job into a sorted set, using an expiration timestamp (the current time plus 60 seconds) as the score.
A job is removed from this sorted set only once it has completed. The code that removes it is omitted here; instead, let's look at how Laravel handles jobs whose execution time exceeds 60 seconds.
That is handled by the following Lua script:
local val = redis.call('zrangebyscore', KEYS[1], '-inf', ARGV[1])

if(next(val) ~= nil) then
    redis.call('zremrangebyrank', KEYS[1], 0, #val - 1)

    for i = 1, #val, 100 do
        redis.call('rpush', KEYS[2], unpack(val, i, math.min(i+99, #val)))
    end
end

return true
Here ZRANGEBYSCORE finds the members whose score lies between negative infinity and the current timestamp, i.e. the jobs that were added to the set more than 60 seconds ago. ZREMRANGEBYRANK then removes those members from the set, and they are RPUSHed back onto the queue.
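As a worked example (a minimal sketch, not Laravel code, assuming expire = 60 and made-up Unix timestamps), this is the comparison the script effectively performs:

<?php

$expire = 60;

$reservedAt = 1000;                  // worker A reserves the job at t = 1000
$score      = $reservedAt + $expire; // stored in the reserved set with score 1060

$now = 1061;                         // worker B calls pop() at t = 1061

// ZRANGEBYSCORE key -inf <now> returns every member whose score <= now,
// so the job reserved by worker A is treated as expired and RPUSHed back
// onto the queue even though worker A is still busy executing it.
var_dump($score <= $now);            // bool(true)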
By this point the cause should be obvious.
If a job does not finish within 60 seconds, the next call to pop() will move it from the reserved set back onto the queue with RPUSH, and it will be executed again.
Summary
To sum up: because of the 60-second expiration, any job whose execution time exceeds 60 seconds is migrated from the reserved set back onto the queue and is therefore executed repeatedly.