PHP foreach loop over 60,000 records: how to use multithreading?

Source: Internet
Author: User
Tags: php, foreach
Background: In the ThinkPHP framework, I use a foreach loop to generate a cache, which is then read by other controllers and methods.

The problem: on-demand generation is ruled out, because there is no way to know in advance which of the more than 60,000 records will be needed, so all of them have to be generated, and generating everything takes too long. Each record in the cache has 6 fields: 4 strings and 2 ints.

The current approach is to run a foreach loop and then use ThinkPHP's S method to write a file cache. The generated file cache is about 15 MB and takes 5-7 s to build, including the database query time.

foreach ($data as $v) {
    $arr[$v['id']] = $v; // re-key the rows by id
}
S('cache', $arr); // ThinkPHP's S() writes the file cache

1. How should I run this with multiple threads?
Or, failing that, how do I split the work across several threads?
[Solved] 2. When using the phpredis cache, how can key-value pairs be inserted in batches using the hash type?
That is, writing the array directly rather than serializing it first.

$cache = new Redis();
$cache->connect(/* ...... */); // connection parameters elided in the original
foreach ($data as $v) {
    $arr[$v['id']] = $v;
    // Note: this inner loop uses the field names as global string keys,
    // so each record overwrites the fields of the one before it.
    foreach ($v as $k => $val) {
        $cache->set($k, $val);
    }
}

It turns out that phpredis provides hMSet($key, $array), which sets multiple hash fields in a single call.
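A minimal sketch of that batch write, assuming one hash per record under a hypothetical item: key prefix, placeholder connection details, and a pipeline so the 60,000 commands are not each paying a network round trip:

$cache = new Redis();
$cache->connect('127.0.0.1', 6379); // host and port are placeholders

$pipe = $cache->multi(Redis::PIPELINE); // queue commands, send in one batch
foreach ($data as $v) {
    // One hash per record: hMSet stores all 6 fields as-is,
    // no serialization involved.
    $pipe->hMSet('item:' . $v['id'], $v);
}
$pipe->exec();

// Reading a record back returns the field => value array directly:
$row = $cache->hGetAll('item:12345');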

Reply content:

Use multiple processes instead. PHP's support for threads is poor, and threading is unusable on some systems.
Look up the pcntl_fork function and you will see how to work with multiple processes.
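As a rough sketch of the multi-process approach, assuming the pcntl extension (CLI only, not available on Windows), a worker count of 4, the same hypothetical item: key prefix as above, and placeholder connection details:

$workers = 4;
// Split the ~60,000 records into one chunk per worker.
$chunks = array_chunk($data, (int) ceil(count($data) / $workers));

$pids = [];
foreach ($chunks as $chunk) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die('fork failed');
    }
    if ($pid === 0) {
        // Child process: open its own Redis connection. A connection
        // inherited across fork() is not safe to share, so each worker
        // connects on its own.
        $cache = new Redis();
        $cache->connect('127.0.0.1', 6379);
        foreach ($chunk as $v) {
            $cache->hMSet('item:' . $v['id'], $v);
        }
        exit(0); // the child must exit here, or it would keep forking too
    }
    $pids[] = $pid; // parent: remember the child and continue forking
}

// Parent waits for all workers before reporting completion.
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}

The database query still runs once in the parent; only the cache writes are parallelized, so whether this actually beats the single pipelined loop is worth measuring.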
