Background: In the ThinkPHP framework I use a foreach loop to generate a cache, which is then read by other controllers and methods.
The problem: on-demand generation is ruled out, because it is not certain which of the 60,000+ rows will actually be used, so all of the content has to be generated up front. Generating it takes too long, which is why I am asking for advice. Each cached row has 6 fields: 4 strings and 2 ints.
Right now it runs as a plain foreach loop and then uses the S() method to write a file cache. The resulting cache file is roughly 15 MB and the whole run takes 5-7 seconds, including the database query time.
foreach ($data as $v) {
    $arr[$v['id']] = $v;
}
S('cache', $arr);
1. If a multi-threaded approach were used, how should it be run? Or explain the principle by which multi-threading could help here.
"Resolved" 2, using the Phpredis cache, using the hash type should be how to BULK INSERT key values
I mean writing the array in directly, without serializing it.
$cache = new Redis();
$cache->connect(......);
foreach ($data as $v) {
    $arr[$v['id']] = $v;
    foreach ($v as $k => $val) {
        $cache->set($k, $val);
    }
}
Found that phpredis provides hMSet($key, $array), which sets multiple hash fields in bulk.
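For reference, a minimal sketch of that bulk insert using hMSet inside a phpredis pipeline. The connection parameters, the item:<id> key naming, and the use of $data as the already-fetched rows are assumptions for illustration, not from the original post:

$cache = new Redis();
$cache->connect('127.0.0.1', 6379);   // assumed host/port

// Pipeline the commands so 60,000+ rows need only a few round trips.
$pipe = $cache->multi(Redis::PIPELINE);
foreach ($data as $v) {
    // One hash per row, keyed by id; hMSet writes all 6 fields at once
    // without serializing the whole array.
    $pipe->hMSet('item:' . $v['id'], $v);
}
$pipe->exec();

Each row then lives in its own hash and can be read back field by field with hGet or hGetAll, instead of unserializing one 15 MB blob.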
Reply content:
Better to use multiple processes instead; PHP's support for threads is not very good and cannot be used on some systems.
Look at the fork function (pcntl_fork in PHP) and you will see how to use multiple processes.
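A minimal multi-process sketch along those lines, assuming the pcntl extension is available (CLI only) and a hypothetical loadData($offset, $limit) helper that re-queries one slice of the table in each child; none of these names come from the answer:

$workers = 4;                          // assumed degree of parallelism
$chunk   = (int) ceil(60000 / $workers);

for ($i = 0; $i < $workers; $i++) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die('fork failed');
    }
    if ($pid === 0) {
        // Child process: open its own Redis connection and cache one slice.
        $cache = new Redis();
        $cache->connect('127.0.0.1', 6379);   // assumed host/port
        foreach (loadData($i * $chunk, $chunk) as $v) {
            $cache->hMSet('item:' . $v['id'], $v);
        }
        exit(0);                       // child must exit, or it keeps forking
    }
}

// Parent: wait for every child to finish before reporting success.
while (pcntl_wait($status) > 0) {
}

Each child must open its own database and Redis connections; sharing the parent's connections across forked processes will corrupt them.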