How multiple Laravel queue worker processes fetch jobs from Redis at the same time
Preface
Recently I ran into a question at work: when multiple processes consume the same queue, will they read the same entry from Redis more than once, so that a task gets executed repeatedly? The following walks through the question with sample code.
Supervisor is used to run the Laravel queue workers. The Supervisor configuration is as follows:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
autostart=true
autorestart=true
numprocs=8
redirect_stderr=true
stdout_logfile=/var/www/xxx.cn/worker.log
Note: numprocs=8, which means eight worker processes are started, each running the command given in command.
The running worker processes look like this:
PS C:\Users\tanteng\website\laradock> docker-compose exec php-worker sh
/etc/supervisor/conf.d # ps -ef | grep php
    7 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
    8 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
    9 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
   10 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
   11 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
   12 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
   13 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
   14 root       0:00 php /var/www/xxx.cn/artisan queue:work --queue=sendfile --tries=3 --daemon
   44 root       0:00 grep php
Whether multiple Laravel worker processes read the same job repeatedly
In a Laravel controller method, dispatch a batch of jobs onto the queue in one request:
public function index(Request $request)
{
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
}
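For readability, the same batch of dispatches could just as well be written as a loop; a minimal equivalent sketch (the count of 18 matches the calls above):

// Dispatch the same job 18 times onto the "sendfile" queue, same as the calls above.
for ($i = 0; $i < 18; $i++) {
    $this->dispatch((new SendFile3())->onQueue('sendfile'));
}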
In the job's handle method, log that it was invoked and log the ID of the job being processed:
app/Jobs/SendFile3.php
public function handle()
{
    info('invoke SendFile3');
    dump('invoke handle');
    $rawbody = $this->job->getRawBody();
    $info = json_decode($rawbody, true);
    info('queue id:' . $info['id']);
}
Laravel uses a Redis list as the underlying data structure for the queue and assigns an ID to each job. The job payload looks like this:
{ "job": "Illuminate\\Queue\\CallQueuedHandler@call", "data": { "commandName": "App\\Jobs\\SendFile3", "command": "O:18:\"App\\Jobs\\SendFile3\":4:{s:6:\"\u0000*\u0000job\";N;s:10:\"connection\";N;s:5:\"queue\";s:8:\"sendfile\";s:5:\"delay\";N;}" }, "id": "hadBcy3IpNsnOofQQdHohsa451OkQs88", "attempts": 1}
Request this controller route (or trigger it from the command line), and you can see the queued jobs accumulate in Redis.
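To verify this, you can inspect the list directly. A minimal sketch using the Redis facade; it assumes Laravel's default key naming of queues:<queue name>, so the 'sendfile' queue lives under queues:sendfile:

use Illuminate\Support\Facades\Redis;

// Count the pending jobs and peek at their raw JSON payloads.
// Assumes the default key naming convention "queues:<queue name>".
$pending  = Redis::connection()->llen('queues:sendfile');
$payloads = Redis::connection()->lrange('queues:sendfile', 0, -1);
dump($pending, $payloads);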
Now start Supervisor so the workers process the queue tasks, and check the log:
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:JaClJzhDEvntzLCRIz6uRQkCVLbE8Y9C
[2017-12-23 19:01:01] local.INFO: queue id:ukHv0Li4P2VgPa55qU6yEOJM27Mo5YwJ
[2017-12-23 19:01:01] local.INFO: queue id:ObMpwDTmnaveBUkU7aan5abt3Agyt90l
[2017-12-23 19:01:01] local.INFO: queue id:fo2qZn2ftSdQtdnKOciMK7iJb4qlhRGE
[2017-12-23 19:01:01] local.INFO: queue id:uLjFMoOU7Wk7bOAd4zpHb3ccRMJHBtR6
[2017-12-23 19:01:01] local.INFO: queue id:87ULqPBObFmGr16nl5wxFVOi71zGCeRM
[2017-12-23 19:01:01] local.INFO: queue id:9UVl0muQLzBqlRI99rChGW2ElXwVEMIE
[2017-12-23 19:01:01] local.INFO: queue id:a0vgyZuz9HtmH7DGHEpXqesFTcQU3QAF
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:2cXuXxopPkgYiV4WO8gv9CJ6CwXeKtYL
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:9acTAYa8cxpJX6Q3Gb1sULokotP8reqZ
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:BPHQvBboChlv4gr2I0vyLVyw9bijtTYJ
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:Fm6tNajdxYKtdQbDMYDmwWJFLnNikRyg
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:nyAbcvSkBVPbaH3e2ItQkoLJlP1ficib
[2017-12-23 19:01:01] local.INFO: queue id:WBHsSVZtP43569UoPXxfLLJcvYmPW7cP
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:bliPnKcRSDApwVmKLNxEhaKelhm0RDEY
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:eOAoQucEIwRz9uZ64xm6IDKgiqj9Xc3W
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:lzise9EiqQqINrhALbmAI4qNg7qylpb2
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:WXYKvcfOhS1pPnwOwUTsenoMv5l5EUXe
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:XtH5JiwLgnrwWzI02Oyi70pihAOkuJUD
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:9ehmE5HImlpNubpY0xWN8UVrOzxeMqws
[2017-12-23 19:01:01] local.INFO: queue id:C1sK87cpZl47edLA0zhfo7PJ9MIEcoyx
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:2kwl51oH4lyyRrljCReGUCkNiJRDl7oe
[2017-12-23 19:01:01] local.INFO: queue id:ObRpoqrYTPYiyv2delMlOXu3sAPpWJlN
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:6qgu6W3TapLjSrt688yv9HRXvDDLxntz
[2017-12-23 19:01:01] local.INFO: queue id:wiTlERhwn7s9cQkfUF9lLlNADpXjKncI
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:ZSLW0VLFBDpL4wjTJzu3Yb3V45pNe807
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:qhZlXLGfGWRluIeNm7VbllmTJZYb2h5n
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:LUx1IByD3L2psNl9BZwHhk2knXyRPzW6
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:M2RESPjyo5hpAFxxL0EQbWwsUq4jpmWn
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:hUsGaiIAOO6ZfGQc5kGHGpsv5RpoRPYO
[2017-12-23 19:01:01] local.INFO: queue id:cEHJsOy6bLeZ4NbncPziaHqlarMeyyEF
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:w4bkFiJKMU5saqG2xKN3ZRL5BYXGATMk
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:0zBuwbxlrEhhxKfYBkVyTY4z35f154sI
[2017-12-23 19:01:01] local.INFO: queue id:mvoZvyDPvq4tcPjEy9G7PMtH3MwPkPik
[2017-12-23 19:01:01] local.INFO: invoke SendFile3
[2017-12-23 19:01:01] local.INFO: queue id:TLvF74eeidECWKtjZqWvW03UJTRPTL9r
[2017-12-23 19:01:01] local.INFO: queue id:me8wyPfgcz0nf9xvcXz0hf2xVxqa1FFS
The eight processes consume the queue concurrently, yet no job ID appears twice in the log. Let's take a look at how Laravel uses Redis to process the queue.
Analyzing how Laravel processes the queue
Laravel's method for pushing jobs onto the queue
public function pushRaw($payload, $queue = null, array $options = [])
{
    $this->getConnection()->rpush($this->getQueue($queue), $payload);

    return Arr::get(json_decode($payload, true), 'id');
}
The Redis RPUSH command is used to append the job payload to the tail of the list.
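As a reminder of the list semantics involved here, a tiny sketch: RPUSH appends at the tail and LPOP removes from the head, which gives FIFO ordering. The key demo:queue is just an illustrative name, not one Laravel uses.

use Illuminate\Support\Facades\Redis;

// RPUSH appends at the tail, LPOP removes from the head: first in, first out.
Redis::connection()->rpush('demo:queue', 'job-1');
Redis::connection()->rpush('demo:queue', 'job-2');
$oldest = Redis::connection()->lpop('demo:queue'); // "job-1"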
Laravel's method for popping jobs off the queue
public function pop($queue = null)
{
    $original = $queue ?: $this->default;

    $queue = $this->getQueue($queue);

    // Move delayed and expired reserved jobs back onto the main queue first.
    $this->migrateExpiredJobs($queue.':delayed', $queue);

    if (! is_null($this->expire)) {
        $this->migrateExpiredJobs($queue.':reserved', $queue);
    }

    // Pop a job and reserve it in a single atomic Lua script.
    list($job, $reserved) = $this->getConnection()->eval(
        LuaScripts::pop(), 2, $queue, $queue.':reserved', $this->getTime() + $this->expire
    );

    if ($reserved) {
        return new RedisJob($this->container, $this, $job, $reserved, $original);
    }
}
Here a Lua script is used to pop the job from the queue, as shown below:
public static function pop()
{
    return <<<'LUA'
local job = redis.call('lpop', KEYS[1])
local reserved = false
if(job ~= false) then
    reserved = cjson.decode(job)
    reserved['attempts'] = reserved['attempts'] + 1
    reserved = cjson.encode(reserved)
    redis.call('zadd', KEYS[2], ARGV[1], reserved)
end
return {job, reserved}
LUA;
}
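Redis executes an EVAL'd Lua script as a single atomic operation, and lpop removes the element it returns, so a payload that one worker has popped no longer exists in the list for any other worker. A minimal sketch of that property, reusing the queues:sendfile key shown earlier:

use Illuminate\Support\Facades\Redis;

// LPOP removes the element it returns, so successive pops, whether from this
// process or another, can never hand out the same payload twice.
$first  = Redis::connection()->lpop('queues:sendfile');
$second = Redis::connection()->lpop('queues:sendfile');
// $first !== $second whenever the list held at least two jobs.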
The conclusion: judging from both Laravel's implementation and the printed logs, even when multiple processes read the same queue, they never receive the same job.
Summary
That's all for this article. I hope it offers some reference value for your study or work. If you have any questions, feel free to leave a comment. Thank you for your support.