Someone asked a JavaScript question:
The code is as follows:
var i = 0.07;
var r = i * 100;
alert(r);
Why is the result 7.000000000000001?
After checking the documentation, we know that JavaScript does not have separate integer and floating-point types: every number is stored as a 64-bit floating-point value in the format defined by the IEEE 754-2008 standard: http://en.wikipedia.org/wiki/IEEE_754-2008
Because 0.07 has no exact representation in binary floating point, the stored value is only a very close approximation, and multiplying that approximation by 100 produces 7.000000000000001 instead of exactly 7.
Similarly, the result of 1/3 comes out as 0.3333333333333333.
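A quick way to see this for yourself (this snippet is not part of the original question; it assumes a browser console or Node.js where console.log is available) is to print the values directly:
console.log(0.07 * 100);         // 7.000000000000001
console.log((0.07).toFixed(20)); // something like 0.07000000000000000666, showing 0.07 is only stored approximately
console.log(1 / 3);              // 0.3333333333333333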
So how can we correct this value?
You can use the following methods:
I. parseInt
var r4 = parseInt(i * 100);
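Keep in mind that parseInt simply truncates toward zero, so it only gives the right answer here because the floating-point error happens to land slightly above 7. A small illustrative check (not from the original post) of where truncation bites:
var ok = parseInt(0.07 * 100);   // 7, since 0.07 * 100 is slightly above 7
var bad = parseInt(0.58 * 100);  // 57, since 0.58 * 100 evaluates to 57.99999999999999
Math.round, shown next, avoids this problem.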
II. Math.round
var r2 = Math.round((i * 100) * 1000) / 1000;
Either approach yields the expected value of 7.
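If you need this correction in more than one place, the Math.round approach can be wrapped in a small helper. The function name roundTo below is a hypothetical choice for illustration and is not part of the original code:
function roundTo(value, decimals)
{
  var factor = Math.pow(10, decimals);
  return Math.round(value * factor) / factor;
}
var rounded = roundTo(0.07 * 100, 3); // 7
Rounding to three decimal places, as the original snippet does, is more than enough to absorb the tiny error introduced by the multiplication.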
Appendix: the full test code.
The code is as follows:
<html>
<head>
<title>test script</title>
<script language="JavaScript">
function init()
{
var i = 0.07;
var r = i * 100;
var r2 = Math.round((i * 100) * 1000) / 1000;
var r3 = eval(i * 100);
var r4 = parseInt(i * 100);
var r5 = parseFloat(i * 100 * 1.0000);
var r6 = (1 / 3);
alert(r);
alert("Math.round = " + r2);
alert("eval = " + r3);
alert("parseInt = " + r4);
alert("parseFloat = " + r5);
alert("" + r6);
}
</script>
</head>
<body onload="init();">
</body>
</html>