There are now more than 3,000 such records in the database, each with a region's Chinese name and a region's English name (example).
I now want to do the batch update with a Redis queue using LPUSH and RPOP; I have already written the SQL statements (example).
I wrote the code like this (example): I pushed the 3,000+ SQL statements into a list with LPUSH, and I want to pop one statement at a time with RPOP and execute it, one by one, but I don't know how to write that part, so I'm asking everyone how to do it. The SQL statements are already in the list, but when I view the list values in the CLI the Chinese appears garbled (example); I fixed that by starting the CLI with --raw. Please advise.
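(For reference, a minimal sketch of what such an RPOP consumer loop could look like, assuming the phpredis extension and PDO; the queue key sql_queue, connection details, and credentials are placeholders, not from the original post.)

<?php
// Pop one SQL statement at a time with RPOP and execute it
// until the queue is empty.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'pass');

while (($sql = $redis->rPop('sql_queue')) !== false) {
    // Each queue entry is a complete statement, so exec() is enough.
    $pdo->exec($sql);
}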
Thank you, everyone.
Reply content:
Thanks for the invite.
Actually, you don't even need Redis: just put all the SQL statements in a file and read and execute them one at a time, wouldn't that do?
function getSql() {
    // Read sql.txt line by line, yielding one statement at a time.
    $fp = fopen('sql.txt', 'r');
    while (!feof($fp)) {
        yield fgets($fp);
    }
    fclose($fp);
}

foreach (getSql() as $sql) {
    executeSql($sql);
}

function executeSql($sql) {
    // execute the statement
}
insert into base_region(region_name, en_name) values('中国','Zhongguo'), ('北京', 'Beijing'),...,('上海', 'Shanghai') on duplicate key update en_name = values(en_name);
One SQL statement gets it done at very low cost (a single database connection, with performance similar to a bulk INSERT of the data).
There are two points to note:
The VALUES (), (), (), ... part needs to be spliced together with a for loop (see the sketch after this list).
region_name needs a unique index; you can temporarily add a unique index on this field and then drop it once this SQL statement has finished executing.
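A minimal sketch of that splicing, assuming a PDO connection in $pdo; the $rows array, its sample entries, and the variable names are illustrative:

<?php
// Example data: region Chinese name => region English name.
$rows = [
    '中国' => 'Zhongguo',
    '北京' => 'Beijing',
    '上海' => 'Shanghai',
];

$values = [];
$params = [];
foreach ($rows as $regionName => $enName) {
    $values[] = '(?, ?)';      // one placeholder pair per record
    $params[] = $regionName;
    $params[] = $enName;
}

$sql = 'insert into base_region(region_name, en_name) values '
     . implode(', ', $values)
     . ' on duplicate key update en_name = values(en_name)';

$stmt = $pdo->prepare($sql);
$stmt->execute($params);

Using placeholders keeps the Chinese names safely escaped while still building the whole statement in one loop.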