Formatting dates when using JavaScriptSerializer for serialization in C#
I have recently been working on a data-exchange service interface built as a .NET Web API project. Every request needs to be monitored, that is, written to an operation log. The request data is an object, so I serialize it and write the resulting JSON to the log. At first glance this seems fine, but DateTime values come out as "\/Date(milliseconds)\/" rather than in the format we usually read.
My guess is that this is because JavaScript typically initializes a Date from the number of milliseconds since 1970/01/01, and JavaScriptSerializer likewise formats a DateTime as the number of milliseconds from 1970/01/01 00:00:00 GMT+0 to the given point in time. If you convert that value back directly, the result is 8 hours off (local time here is GMT+8) and the precision is only milliseconds, whereas the original DateTime has a precision of 10^-7 seconds (one tick).
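The epoch arithmetic above can be sketched with a minimal, self-contained example. The millisecond value 1481875200000 is an arbitrary illustration, not taken from the original article:

```csharp
using System;

class EpochDemo
{
    static void Main()
    {
        // A millisecond timestamp counts from the Unix epoch, 1970-01-01 00:00:00 UTC,
        // which is exactly what JavaScriptSerializer puts inside "\/Date(...)\/".
        DateTime epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        DateTime utc = epoch.AddMilliseconds(1481875200000L);
        Console.WriteLine(utc.ToString("yyyy-MM-dd HH:mm:ss")); // 2016-12-16 08:00:00 (UTC)
        // Calling ToLocalTime() on a GMT+8 machine would shift this by +8 hours.

        // One DateTime tick is 100 ns (10^-7 s), so a DateTime is 10,000 times
        // finer-grained than a millisecond timestamp; the sub-millisecond part is lost.
        Console.WriteLine(TimeSpan.TicksPerMillisecond); // 10000
    }
}
```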
This format is convenient in JavaScript, but it is not readable if we want to keep the information in a log, so a conversion is required. The following code uses a regular expression to convert the millisecond timestamp into a local-time string:
/// <summary>
/// Serialize request data.
/// </summary>
/// <param name="obj">request data</param>
/// <returns>JSON string with readable local times</returns>
public string LocalSerialize(object obj)
{
    var jser = new System.Web.Script.Serialization.JavaScriptSerializer();
    var json = jser.Serialize(obj);
    // Convert the "\/Date(milliseconds)\/" format into a format suitable for reading.
    // Note: the serializer escapes the slash, so the JSON text literally contains "\/".
    json = System.Text.RegularExpressions.Regex.Replace(json, @"\\/Date\((\d+)\)\\/", match =>
    {
        DateTime dt = new DateTime(1970, 1, 1);
        dt = dt.AddMilliseconds(long.Parse(match.Groups[1].Value));
        dt = dt.ToLocalTime();
        return dt.ToString(); // local time
    });
    return json;
}
You can then call this method wherever the request is logged; after the conversion, the times in the log appear in the familiar local format.
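To see the conversion step in isolation, here is a minimal sketch that applies the same regular expression to a hand-written JSON fragment. The sample string and field name are assumptions for illustration, and UTC formatting is used here so the output is deterministic (the method above uses ToLocalTime instead):

```csharp
using System;
using System.Text.RegularExpressions;

class ConvertDemo
{
    static void Main()
    {
        // A JSON fragment as JavaScriptSerializer would emit it: the slash is
        // escaped, so the text literally contains "\/Date(0)\/".
        string json = "{\"RequestTime\":\"\\/Date(0)\\/\"}";

        string converted = Regex.Replace(json, @"\\/Date\((\d+)\)\\/", m =>
        {
            DateTime dt = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)
                .AddMilliseconds(long.Parse(m.Groups[1].Value));
            // UTC instead of ToLocalTime(), purely so this demo prints the
            // same result on any machine.
            return dt.ToString("yyyy-MM-dd HH:mm:ss");
        });

        Console.WriteLine(converted); // {"RequestTime":"1970-01-01 00:00:00"}
    }
}
```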
The conversion is now complete. This article references http://www.cnblogs.com/basterdaidai/p/6212760.html