On Escaped Characters in JSON
What is the benefit of escaping? If you have ever served a JSONP interface or loaded a .js file, the garbled output (mojibake) caused by mismatched file encodings should be familiar: when a file contains non-English characters and the encoding used by the caller differs from the file's own, the characters come out garbled. Many newcomers have struggled with this. If the characters are escaped to Unicode (\uXXXX) sequences, however, no garbling can occur regardless of whether the encodings match, because the output is pure ASCII. This is why PHP's json_encode escapes to Unicode by default, which I find quite considerate. Of course, if you want those characters output as-is, that is fine too: pass JSON_UNESCAPED_UNICODE as the second argument. Note that this constant is only available as of PHP 5.4.0.

So what does JSON.stringify escape? In json2.js, around line 351, you can find this regular expression:

```js
escapable = /[\\\"\x00-\x1f\x7f-\x9f\u00ad\u0600-\u0604\u070f\u17b4\u17b5\u200c-\u200f\u2028-\u202f\u2060-\u206f\ufeff\ufff0-\uffff]/g;
```

That is to say, json2.js only escapes the characters matched above. Let's test it:

```js
console.log(JSON.stringify("\x00\x0a"));
```

Running this shows that \x00 is escaped to \u0000, while \x0a is converted to the short escape \n; the mapping for such special characters sits in the meta table just below the regular expression in json2.js. However, if you test the character \ufeff, you will find that neither Firefox nor Chrome escapes it at all. It seems only json2.js escapes it for us.

Why doesn't native JSON.stringify escape as many characters? Did its authors not consider compatibility for us? Actually, I think the problem rarely arises: you would not use a static page to provide an interface to other sites; such output is mostly consumed internally, and even when it is submitted to the backend, all the code within one project shares the same encoding, so encoding compatibility is a non-issue internally. It is like going back to my hometown: would I talk with the locals in Mandarin or English?
Talking in dialect is smoother. That is just my personal take, of course; I do not know what the people who write JS engines were actually thinking. Now let's enumerate exactly which characters in the range \u0000-\uffff native JSON escapes:

```js
for (var i = 0, str = '', arr = []; i < 0xffff; i++) {
  str = JSON.stringify(String.fromCharCode(i));
  str.indexOf("\\") > -1 && arr.push(str);
}
console.log(arr.join(","));
```

On my Chrome 34, the result is:

```text
"\u0000","\u0001","\u0002","\u0003","\u0004","\u0005","\u0006","\u0007","\b","\t","\n","\u000b","\f","\r","\u000e","\u000f","\u0010","\u0011","\u0012","\u0013","\u0014","\u0015","\u0016","\u0017","\u0018","\u0019","\u001a","\u001b","\u001c","\u001d","\u001e","\u001f","\"","\\"
```
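Whichever side does the escaping, the two forms are equivalent to a parser: JSON.parse treats a \uXXXX escape and the raw character identically, which is also part of why the engines can afford to escape so little. A quick check (the specific characters here are just examples I picked):

```js
// A \uXXXX escape and the literal character parse to the same string.
console.log(JSON.parse('"\\u4f60\\u597d"') === "你好"); // true

// Native JSON.stringify escapes control characters...
console.log(JSON.stringify("\u0000")); // "\u0000"

// ...but, unlike json2.js, leaves characters such as U+FEFF (the BOM) raw.
console.log(JSON.stringify("\ufeff").length); // 3: quote, raw BOM, quote
```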
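And if you do want PHP-style escape-everything output from JavaScript, you can post-process the result of JSON.stringify yourself. A minimal sketch, assuming you only need the BMP range; the helper name jsonStringifyAscii is my own invention, not a standard API:

```js
// Post-process JSON.stringify output so every non-ASCII character
// becomes a \uXXXX escape, mimicking PHP's default json_encode behavior.
function jsonStringifyAscii(value) {
  return JSON.stringify(value).replace(/[\u007f-\uffff]/g, function (ch) {
    // Pad the code unit's hex form to four digits, e.g. 你 -> \u4f60
    return "\\u" + ("0000" + ch.charCodeAt(0).toString(16)).slice(-4);
  });
}

console.log(jsonStringifyAscii({ msg: "你好" }));
// → {"msg":"\u4f60\u597d"} — pure ASCII, immune to encoding mismatches
```

Characters outside the BMP are stored as surrogate pairs, so each half is escaped separately, which is still valid JSON.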