Test 1
1. Few elements, one large element, English content: a 3-element array, with each operation repeated 1,000 times.
The code is as follows:

<?php
$data = array('Hello', 'word');
$d = "Helloword";
$d = str_repeat($d, 10000);          // one 90,000-byte string
$data[] = $d;                        // the array now holds 3 elements (the original loop bound was lost in extraction; the reported sizes imply a single append)
var_dump($data);

$jsonen_starttime = getMicrotime();
for ($i = 0; $i < 1000; $i++) {      // 1,000 iterations, as described above
    $json = json_encode($data);
}
echo "JSON length: " . strlen($json) . "\n";
$jsonen_endtime = getMicrotime();
echo "json_encode time: " . ($jsonen_endtime - $jsonen_starttime) . "\n";

$jsonde_starttime = getMicrotime();
for ($i = 0; $i < 1000; $i++) {
    $unjson = json_decode($json, true);
}
$jsonde_endtime = getMicrotime();
echo "json_decode time: " . ($jsonde_endtime - $jsonde_starttime) . "\n";

$seri1_starttime = getMicrotime();
for ($i = 0; $i < 1000; $i++) {
    $serialize = serialize($data);
}
echo "serialize length: " . strlen($serialize) . "\n";
$seri1_endtime = getMicrotime();
echo "serialize time: " . ($seri1_endtime - $seri1_starttime) . "\n";

$seri2_starttime = getMicrotime();
for ($i = 0; $i < 1000; $i++) {
    $unserialize = unserialize($serialize);
}
$seri2_endtime = getMicrotime();
echo "unserialize time: " . ($seri2_endtime - $seri2_starttime) . "\n";

/**
 * Return the current Unix timestamp with microseconds as a float.
 * @return float
 */
function getMicrotime() {
    list($usec, $sec) = explode(" ", microtime());
    return ((float) $usec + (float) $sec);
}
Output
JSON length: 90019
json_encode time: 1.0974299907684
json_decode time: 1.6237480640411
serialize length: 90052
serialize time: 0.025779962539673
unserialize time: 0.029321193695068
You can see that with English content and few elements, the JSON output is slightly smaller than the serialized data, but JSON is far slower to process than serialize()/unserialize().
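Incidentally, the getMicrotime() helper in the script above is only needed on very old PHP: since PHP 5.0, microtime(true) returns the float directly. A minimal sketch of the simpler form:

$start = microtime(true);
json_encode($data);
$elapsed = microtime(true) - $start;   // elapsed seconds as a float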
2. Change the data to
The code is as follows:

$data = array('Hello', 'word');
$d = '你好';                  // two Chinese characters (the literal appears as "Hello" in the translated text; the reported sizes imply a 2-character multi-byte string)
$d = str_repeat($d, 10000);
$data[] = $d;                 // the rest of the script is unchanged
Output
JSON length: 120019
json_encode time: 0.83260488510132
json_decode time: 2.2054090499878
serialize length: 60052
serialize time: 0.01835298538208
unserialize time: 0.01848292350769
You can see that when handling Chinese content, the JSON output is larger and JSON processing is slower than serialize.
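The size gap comes from escaping: by default json_encode() turns every non-ASCII character into a six-byte \uXXXX sequence, while serialize() stores the raw UTF-8 bytes (three bytes per Chinese character). A minimal sketch illustrating this (the JSON_UNESCAPED_UNICODE flag, available since PHP 5.4, avoids the escaping but was not used in these tests):

$s = '你好';                                                   // 2 characters, 6 bytes of UTF-8
echo strlen(json_encode($s)) . "\n";                           // 14: "\u4f60\u597d"
echo strlen(serialize($s)) . "\n";                             // 13: s:6:"你好";
echo strlen(json_encode($s, JSON_UNESCAPED_UNICODE)) . "\n";   // 8: "你好" kept as UTF-8 (PHP >= 5.4)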
3. Change the data to
The code is as follows:

$data = array('Hello', 'word');
$d = '你好';                           // two Chinese characters, no str_repeat() this time
for ($i = 0; $i < 10000; $i++) {       // loop bound reconstructed from the reported output sizes
    $data[] = $d;
}
Output
JSON length: 150016
json_encode time: 2.1428198814392
json_decode time: 6.5845320224762
serialize length: 198939
serialize time: 2.8011980056763
unserialize time: 4.6967668533325
You can see that with many small Chinese elements, the JSON output is smaller than the serialize output; json_encode is faster than serialize, while json_decode is slower than unserialize.
4. Change the data to
The code is as follows:

$data = array('Hello', 'word');
$d = 'Hello';                          // a short English string
for ($i = 0; $i < 10000; $i++) {       // loop bound reconstructed from the reported output sizes
    $data[] = $d;
}
Output
JSON length: 80016
json_encode time: 1.6437809467316
json_decode time: 4.5136170387268
serialize length: 188939
serialize time: 2.909558057785
unserialize time: 4.4678349494934
With many small English elements, the JSON output is far smaller than the serialize output, json_encode is clearly faster, and decoding speed is about the same.
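The reason JSON pulls ahead in tests 3 and 4 is per-element overhead: serialize() records an index and a byte length for every element, while JSON only needs quotes and a comma. A small illustration:

echo json_encode(array('Hello', 'word')), "\n"; // ["Hello","word"]                        (16 bytes)
echo serialize(array('Hello', 'word')), "\n";   // a:2:{i:0;s:5:"Hello";i:1;s:4:"word";}   (37 bytes)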
Test 2
A 1,000,000-element array is used as the raw data and is serialized and unserialized with JSON, serialize, and igbinary.
The code is as follows:

<?php
ini_set('memory_limit', '512M');

$array = array_fill(0, 1000000, rand(1, 9999));   // 1,000,000 elements (all set to the same random integer)

$start = microtime(true);
$export = json_encode($array);
$end = microtime(true);
$duration = $end - $start;
print('JSON encode: ' . $duration . PHP_EOL);

$start = microtime(true);
$import = json_decode($export);
$end = microtime(true);
$duration = $end - $start;
print('JSON decode: ' . $duration . PHP_EOL);

$start = microtime(true);
$export = serialize($array);
$end = microtime(true);
$duration = $end - $start;
print('Serialize: ' . $duration . PHP_EOL);

$start = microtime(true);
$import = unserialize($export);
$end = microtime(true);
$duration = $end - $start;
print('Unserialize: ' . $duration . PHP_EOL);

$start = microtime(true);
$export = igbinary_serialize($array);             // requires the igbinary PECL extension
$end = microtime(true);
$duration = $end - $start;
print('igbinary serialize: ' . $duration . PHP_EOL);

$start = microtime(true);
$import = igbinary_unserialize($export);
$end = microtime(true);
$duration = $end - $start;
print('igbinary unserialize: ' . $duration . PHP_EOL);
?>
Test results
JSON encode: 0.084825992584229
JSON decode: 0.34976410865784
Serialize: 0.38241410255432
Unserialize: 7.7904229164124
igbinary serialize: 0.046916007995605
igbinary unserialize: 0.23396801948547
Judging from these results, the speed ranking is igbinary > JSON > serialize. In particular, PHP's native unserialize() falls far behind when reversing large structures.
Byte count comparison
JSON: 5000001
Serialize: 15888902
igbinary: 7868681
With no Chinese characters involved, JSON produces the smallest output, igbinary comes second, and serialize trails far behind.
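Note that igbinary_serialize()/igbinary_unserialize() come from the igbinary PECL extension, not core PHP, so code that relies on them usually guards the calls. A minimal sketch of such a guard (the pack_value/unpack_value wrapper names are just for illustration; data written in one format cannot be read back with the other, so the choice must stay consistent for any stored data):

// Prefer igbinary when the extension is loaded, otherwise fall back to native serialize().
function pack_value($value) {
    return extension_loaded('igbinary') ? igbinary_serialize($value) : serialize($value);
}

function unpack_value($blob) {
    return extension_loaded('igbinary') ? igbinary_unserialize($blob) : unserialize($blob);
}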
Conclusion
If the elements are uniformly small and contain only English text and digits, JSON is recommended: both size and speed are better than serialize.
If the content is only English text and digits but individual elements are large, serialize is recommended: it is much faster, and the output sizes are close.
For Chinese content with few, large elements, serialize is recommended: both size and speed beat JSON.
For Chinese content spread over many uniformly small elements, JSON is recommended.
If the data is for a cache where throughput matters, favor the faster format; if it is data to be stored, favor the smaller output. In the end it depends on the specific scenario.