Someone asked a JavaScript question:
var i = 0.07;
var r = i*100;
alert(r);
Why is the result 7.000000000000001?
After checking the documentation, we find that JavaScript has a single number type: there are no separate integer and float types, and every number is stored as a 64-bit (double-precision, binary64) floating-point value as defined by the IEEE 754-2008 standard: http://en.wikipedia.org/wiki/IEEE_754-2008
Because 0.07 has no exact representation in binary, the value actually stored is the nearest representable double, which is slightly larger than 0.07. Multiplying it by 100 therefore yields 7.000000000000001 instead of exactly 7.
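You can observe the inexact stored value directly in a browser console or Node; `toPrecision` prints more digits than the default shortest round-trip form (the trailing digits may differ slightly by engine):

```javascript
// All JavaScript numbers are IEEE 754 binary64 doubles.
// 0.07 is stored as the nearest representable double, which is
// slightly larger than 0.07; printing extra digits exposes this.
console.log((0.07).toPrecision(20)); // e.g. 0.070000000000000006661...
console.log(0.07 * 100);             // 7.000000000000001
```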
Similarly, 1/3 evaluates to 0.3333333333333333 — the infinite decimal expansion is cut off at the precision the format can represent.
So how can we correct this value?
You can use the following methods:
I. parseInt
var r4 = parseInt(i * 100);
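A quick sketch of the parseInt approach. One caveat worth knowing: parseInt truncates rather than rounds, so when the floating-point error lands slightly *below* the exact value, truncation gives the wrong answer:

```javascript
var i = 0.07;
// parseInt converts its argument to a string first, then parses up to
// the first non-digit character: "7.000000000000001" -> 7
var r4 = parseInt(i * 100);
console.log(r4); // 7

// Caveat: a product that comes out just below the exact value is
// truncated down, not repaired:
console.log(0.29 * 100);           // 28.999999999999996
console.log(parseInt(0.29 * 100)); // 28, not 29
```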
II. Math.round
var r2 = Math.round((i * 100) * 1000) / 1000;
Both approaches yield 7.
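The Math.round variant can be exercised the same way. Unlike parseInt, it rounds to the nearest integer, so it also corrects products that land slightly below the exact result:

```javascript
var i = 0.07;
// Scale up, round to the nearest integer, then scale back down;
// the factor 1000 here preserves three decimal places.
var r2 = Math.round((i * 100) * 1000) / 1000;
console.log(r2); // 7

// Math.round also repairs errors in the downward direction:
console.log(Math.round(0.29 * 100)); // 29
```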
Appendix: the full test code:
<html>
<head>
<title>test script</title>
<script language="JavaScript">
function init()
{
    var i = 0.07;
    var r = i * 100;
    var r2 = Math.round((i * 100) * 1000) / 1000;
    var r3 = eval(i * 100);
    var r4 = parseInt(i * 100);
    var r5 = parseFloat(i * 100 * 1.0000);
    var r6 = (1 / 3);
    alert(r);
    alert("Math.round = " + r2);
    alert("eval = " + r3);
    alert("parseInt = " + r4);
    alert("parseFloat = " + r5);
    alert("" + r6);
}
</script>
</head>
<body onload="init();">
</body>
</html>