While working with Python recently, I ran into the problem of serializing objects. Both the traditional JSON format and the newer serialization toolkit msgpack came up, so here is a short summary.
In plain terms: serialization transforms an object's information into a form that can be stored or transmitted; deserialization restores that stored form back into an object.
JSON needs little introduction: it is a lightweight data-interchange format widely used in web development. Serializing an object simply means producing output that conforms to the JSON specification, and there is plenty of information about it online.
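As a quick illustration of the serialize/deserialize round trip, here is a minimal sketch using the standard-library json module (the sample values are placeholders):

```python
import json

# Serialize: turn a Python object into a JSON-formatted string
# that can be stored in a file or sent over the network.
record = {'name': 'yzy', 'location': 'shenzhen'}
encoded = json.dumps(record)
print(type(encoded), encoded)

# Deserialize: restore the string back into an equivalent object.
decoded = json.loads(encoded)
print(decoded == record)
```

The round trip preserves the data: `decoded` compares equal to the original dictionary.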
Official website: http://www.json.org
Msgpack is the more interesting one. First, the official description:
MessagePack is an efficient binary serialization format. It lets you exchange data among multiple languages like JSON. But it's faster and smaller. Small integers are encoded into a single byte, and typical short strings require only one extra byte in addition to the strings themselves.
To sum up: it works just like JSON, but beats JSON where it counts: faster and smaller!
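Those size claims are easy to verify from Python with the msgpack package (a sketch; assumes the `msgpack` package is installed via pip):

```python
import json
import msgpack

# A small integer (0..127) packs into a single byte, a "positive fixint".
print(msgpack.packb(5))        # b'\x05'
print(len(msgpack.packb(5)))   # 1 byte total

# A short string (< 32 bytes) costs one type/length byte plus the string
# bytes themselves: 0xa3 below means "fixstr of length 3".
print(msgpack.packb('yzy'))       # b'\xa3yzy'
print(len(msgpack.packb('yzy')))  # 4 bytes

# JSON spends two extra characters on the quotes alone.
print(len(json.dumps('yzy')))  # 5
```

So for small scalars, the one-extra-byte claim from the official description holds exactly.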
Official website: http://msgpack.org/
My focus here is practical use from Python, comparing the two serialization formats. For more detail, see this blog post: http://www.heyues.com/messagepack/
OK, no matter how good others say it is, it is better to try it in your own code. Seeing is believing, so I wrote a simple test script:
Take a dictionary object, serialize and deserialize it 10,000 times with json and msgpack respectively, and observe the time taken and the size of the serialized result.
import json
import msgpack
import sys
import time

a = {'name': 'yzy', 'age': 26, 'gender': 'male', 'location': 'shenzhen'}

begin_json = time.perf_counter()  # the original used time.clock(), removed in Python 3.8
for i in range(10000):
    in_json = json.dumps(a)
    un_json = json.loads(in_json)
end_json = time.perf_counter()
print('json serialization time: %.05f seconds' % (end_json - begin_json))
print(type(in_json), 'content:', in_json, 'size:', sys.getsizeof(in_json))
print(type(un_json), 'content:', un_json, 'size:', sys.getsizeof(un_json))

begin_msg = time.perf_counter()
for i in range(10000):
    in_msg = msgpack.packb(a)
    un_msg = msgpack.unpackb(in_msg)
# msgpack also provides aliases for compatibility with simplejson/marshal/pickle:
# load = unpack, loads = unpackb, dump = pack, dumps = packb
# in_msg1 = msgpack.dumps(a)
# un_msg1 = msgpack.loads(in_msg)
end_msg = time.perf_counter()
print('msgpack serialization time: %.05f seconds' % (end_msg - begin_msg))
print(type(in_msg), 'content:', in_msg, 'size:', sys.getsizeof(in_msg))
print(type(un_msg), 'content:', 'size:', sys.getsizeof(un_msg))
Results:
It has to be said that msgpack does have a clear advantage in both size and speed.
In my own tests, it was at least three times faster.
json serialization time: 0.16115 seconds
<class 'str'> content: {"age": 26, "location": "shenzhen", "name": "yzy", "gender": "male"} size: 117
<class 'dict'> content: {'age': 26, 'location': 'shenzhen', 'name': 'yzy', 'gender': 'male'} size: 288
msgpack serialization time: 0.05043 seconds
<class 'bytes'> content: b'\x84\xa3age\x1a\xa8location\xa8shenzhen\xa4name\xa3yzy\xa6gender\xa4male' size: 78
<class 'dict'> content: size: 288
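One caveat when reading the sizes above: sys.getsizeof reports the memory footprint of the Python object, including interpreter overhead, not the number of bytes that would travel over the wire. len() on the encoded result is the fairer payload comparison. A sketch with the same test dictionary:

```python
import json
import sys

a = {'name': 'yzy', 'age': 26, 'gender': 'male', 'location': 'shenzhen'}

encoded = json.dumps(a)
print(len(encoded))            # payload length of the JSON string
print(sys.getsizeof(encoded))  # larger: adds the str object's own header

# The difference is pure per-object overhead that CPython attaches to
# every str, so it inflates both sides of a JSON-vs-msgpack comparison.
```

This is why the deserialized dicts both report 288 bytes: getsizeof is measuring the dict object, which is identical regardless of which format produced it.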
Seen this way, msgpack still has a lot of potential. Most existing systems speak JSON, but as development continues, msgpack should see use in more and more data transfer; Redis, for example, already supports msgpack.
A comparison of JSON and msgpack serialization in Python