Help! Garbled data inserted into MySQL:
The field's collation is utf8_unicode_ci (i.e. the utf8 character set), and the PHP file is also UTF-8 encoded.
The program reads data from JSON files, assembles it into SQL statements, and inserts it into the database. The JSON files are also UTF-8 encoded.
The amount of data is not large, about 2,685 rows. After the program runs, checking the database shows that from the second row onward the Chinese characters are garbled.
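For reference (this is not from the original post): garbled Chinese of this kind is often classic mojibake, UTF-8 bytes being interpreted under a single-byte charset such as latin1 somewhere between the client and the server. A minimal Python illustration of the effect:

```python
# Illustration only: how UTF-8 bytes read back as latin-1 become mojibake.
original = "中文数据"  # sample Chinese text (hypothetical data)

# Encode as UTF-8 (what the script sends), then decode as latin-1
# (what a misconfigured connection might assume).
garbled = original.encode("utf-8").decode("latin-1")
print(garbled)  # unreadable accented characters instead of Chinese

# The damage is reversible as long as no bytes were dropped:
restored = garbled.encode("latin-1").decode("utf-8")
print(restored == original)
```

If the stored rows can be recovered this way, the bytes themselves were fine and only the charset declared on the connection was wrong.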
I have been struggling with this problem for a day or two!
Reply to discussion (solution)
I tried splitting the JSON into several files and importing them separately. When I imported a large amount of data in CSV format, I got similar errors; after splitting the file, the import worked.
Thanks, it has been solved. It was not a character-set or file-size problem; it was caused by ezSQL. After replacing it with a plain mysql_query() call, everything worked!
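The poster's actual fix was dropping ezSQL for a direct mysql_query() call in PHP; the code itself is not shown in the thread. As a general pattern (an analogue, not the poster's code), parameterized statements avoid both manual escaping and batching pitfalls when inserting multibyte text. A sketch using Python's built-in sqlite3, since it needs no running server; the table and field names are made up for illustration:

```python
import json
import sqlite3

# Analogue of the poster's flow: parse UTF-8 JSON, then insert each
# record with a parameterized statement (hypothetical table/field names).
records = json.loads('[{"name": "中文数据"}, {"name": "第二行"}]')

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")
conn.executemany(
    "INSERT INTO items (name) VALUES (?)",
    [(r["name"],) for r in records],
)

# Read back: multibyte text survives intact, with no manual escaping.
rows = [row[0] for row in conn.execute("SELECT name FROM items")]
print(rows)
```

The same idea in modern PHP would be a prepared statement via mysqli or PDO, since the old mysql_* functions are removed in PHP 7.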
I hope this helps anyone who runs into the same problem!