Because it is definitely not feasible to import millions of rows into the database in one go, I import them in small batches (50 rows or so at a time) and let the page refresh itself to trigger the next batch until everything has been processed. The code is as follows:
<?php
set_time_limit(0);

// Connect to the database here

$s     = isset($_GET['s']) ? $_GET['s'] : 0;
$e     = isset($_GET['e']) ? $_GET['e'] : 50;
$count = 85000;

if ($s < $count) {
    // Processed rows are flagged isget = 1, so the isget = 0 filter already
    // excludes them; $s only tracks overall progress, no offset is needed in LIMIT.
    $sql   = "select * from bac_info where isget = 0 order by id desc limit $e";
    $query = mysql_query($sql);

    while ($rs = mysql_fetch_array($query)) {
        $id     = $rs['id'];
        $sms    = $rs['sms'];
        $typeid = $rs['typeid'];
        $isget  = $rs['isget'];

        $sql = "insert into bac_info_bak (id, sms, typeid, isget) values ('$id', '$sms', '$typeid', '$isget')";
        mysql_query($sql);
        echo $sql;
        // exit;

        $sqlu = "update bac_info set isget = 1 where id = " . $rs['id'];
        mysql_query($sqlu);
    }

    echo '<meta http-equiv="refresh" content="0;url=rand.php?s=' . ($s + 50) . '&e=50">Processing data, current position: ' . $s . '......';
} else {
    echo 'All data processed. <a href=rand.php>Now sort the data randomly.</a>';
}
?>
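Note that the mysql_* functions used above were removed in PHP 7. Below is a minimal sketch of the same batching-plus-refresh idea using PDO with prepared statements. It keeps the bac_info / bac_info_bak tables and the id, sms, typeid, isget columns from the original; the DSN, username, and password are hypothetical placeholders you would replace with your own.

<?php
set_time_limit(0);

// Hypothetical connection details - replace with your own.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$batchSize = 50;
$count     = 85000;
$done      = isset($_GET['s']) ? (int) $_GET['s'] : 0;

if ($done < $count) {
    // Fetch only unprocessed rows; processed rows are flagged, so no offset is needed.
    $select = $pdo->query("SELECT id, sms, typeid, isget FROM bac_info WHERE isget = 0 ORDER BY id DESC LIMIT $batchSize");

    $insert = $pdo->prepare('INSERT INTO bac_info_bak (id, sms, typeid, isget) VALUES (?, ?, ?, ?)');
    $update = $pdo->prepare('UPDATE bac_info SET isget = 1 WHERE id = ?');

    foreach ($select as $row) {
        // Copy the row into the backup table, then flag it as processed.
        $insert->execute([$row['id'], $row['sms'], $row['typeid'], $row['isget']]);
        $update->execute([$row['id']]);
    }

    $done += $batchSize;
    echo '<meta http-equiv="refresh" content="0;url=rand.php?s=' . $done . '">Processing data... ' . $done . ' rows done.';
} else {
    echo 'All data processed. <a href=rand.php>Now sort the data randomly.</a>';
}

Prepared statements also avoid the SQL-injection risk of interpolating $sms directly into the INSERT string, and wrapping each batch in a transaction would keep the copy and the flag update consistent if the script is interrupted mid-batch.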