[Signature pitfall] decimal.GetBits()
The decimal type provides a GetBits() method, which returns the value's internal representation as an int[4]; that array can in turn be converted into a byte[16].
In C#, 0m and 0.00m yield different byte[] values, even though they compare equal: the scale (the number of decimal places) is encoded in the bits, so the two literals have different representations.
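A quick console sketch makes the difference visible; the fourth element of the GetBits() result carries the scale in bits 16-23:

```csharp
using System;

class GetBitsDemo
{
    static void Main()
    {
        // GetBits() returns four ints: the 96-bit magnitude (lo, mid, hi)
        // plus a flags word holding the sign bit and the scale (bits 16-23).
        int[] zero    = decimal.GetBits(0m);    // scale 0
        int[] zeroTwo = decimal.GetBits(0.00m); // scale 2

        Console.WriteLine(string.Join(", ", zero));    // 0, 0, 0, 0
        Console.WriteLine(string.Join(", ", zeroTwo)); // 0, 0, 0, 131072

        // Numerically equal, but the bytes differ -- so any signature
        // computed over these bytes differs too.
        Console.WriteLine(0m == 0.00m); // True
    }
}
```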
On the SQL side, a decimal column generated by Entity Framework defaults to two decimal places (decimal(18, 2)). So a value that is 0m in C# comes back as 0.00m after a round trip through the database.
Here is the problem: if the output of decimal.GetBits() is part of the content being signed, the signature no longer verifies after the data has been saved to the database, because the stored value has a different scale.
The fix is to normalize the decimal to two decimal places before signing and saving. Appending "+ 0.01m - 0.01m" to the assignment statement does the trick, for example: item.Credit = value + 0.01m - 0.01m;
Because decimal addition keeps the larger scale of its two operands, the result carries at least two decimal places, which matches what the database returns. That solves the problem.
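A minimal sketch of the normalization trick (the Normalize helper name is illustrative, not from the original code): adding and then subtracting 0.01m leaves the value unchanged but forces the scale up to at least 2, so the bits match the database round trip.

```csharp
using System;
using System.Globalization;

class NormalizeDemo
{
    // Force a decimal to carry at least two decimal places, matching
    // the decimal(18, 2) column the value will round-trip through.
    static decimal Normalize(decimal value) => value + 0.01m - 0.01m;

    static void Main()
    {
        decimal credit = Normalize(0m); // 0m becomes 0.00m (scale 2)

        Console.WriteLine(credit.ToString(CultureInfo.InvariantCulture)); // 0.00
        Console.WriteLine(decimal.GetBits(credit)[3]); // 131072, i.e. scale 2

        // GetBits() of the normalized value now matches what comes back
        // from the database, so the signature verifies.
    }
}
```

Note that this only pads values that have fewer than two decimal places; a value with more decimal places would still be rounded by the database, so inputs should already be limited to two places.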