The scenario is this: a page makes an AJAX request and the server returns a fairly large JSON payload, generally around 10 MB. To save server traffic and bandwidth, the backend PHP needs to compress the JSON before sending it to the front end, where JavaScript decompresses and processes it.
Here is what I have tried so far:
1. Base64, but the compression ratio is poor, only about 20%.
2. PHP supports the gzdeflate and gzcompress functions, but I could not find a matching solution on the JavaScript side. The inflate.js implementations on GitHub did not work at all; the loops always ran so long that the browser crashed.
3. I found a JavaScript library that supports LZMA compression, and it works very well, but I could not find corresponding PHP functions.
So I am begging you experts: is there a compression method that both the front end and the back end can use for compression and decompression? Thank you!
Reply content:
10 MB is far too much; for list-type data, consider paging or scroll-based (lazy) loading.
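The paging idea above can be sketched as a pure function that slices the full result set per request; the field names and page size here are illustrative, not from the original post:

```javascript
// Server-side idea: return one slice of the full result set per request
// instead of sending all 10 MB at once. (page is 1-based.)
function getPage(allRecords, page, pageSize) {
  const start = (page - 1) * pageSize;
  return {
    page,
    pageSize,
    total: allRecords.length,          // lets the client render a pager
    items: allRecords.slice(start, start + pageSize)
  };
}

// Example: 1050 records in pages of 200.
const all = Array.from({ length: 1050 }, (_, i) => ({ id: i }));
console.log(getPage(all, 1, 200).items.length); // 200
console.log(getPage(all, 6, 200).items.length); // 50 (last, partial page)
```

The client then requests the next page as the user paginates or scrolls, so no single response is ever 10 MB.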
Have the server gzip the response; I would estimate a 10 MB file comes out at only around 4 MB.
JSON was intended to replace XML and simplify network interaction, so it is already quite lean. If your data is still large, it is worth adjusting the data structure, for example using arrays in place of the key/value object format.
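The restructuring suggested above, replacing the repeated object keys with positional arrays, can be sketched like this (field names are hypothetical; the reply gives no code):

```javascript
// Row format: every record repeats the key strings "id", "name", "score".
const rows = [];
for (let i = 0; i < 1000; i++) {
  rows.push({ id: i, name: 'user_' + i, score: i % 100 });
}

// Columnar format: the keys are sent once, the values as positional arrays.
const packed = {
  fields: ['id', 'name', 'score'],
  data: rows.map(r => [r.id, r.name, r.score])
};

// The client rebuilds the objects from the field list.
const unpacked = packed.data.map(values =>
  Object.fromEntries(packed.fields.map((f, i) => [f, values[i]]))
);

console.log('row JSON bytes:     ', JSON.stringify(rows).length);
console.log('columnar JSON bytes:', JSON.stringify(packed).length);
```

For long lists this removes one copy of every key name per record, which compounds with gzip since less redundant text is sent in the first place.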
There are alternative formats, but JSON serialization and deserialization are native to the browser; with a third-party format you have to ship a decoding library as well. Imagine how long it would take the client to decode a 10 MB data structure with one.