What does it mean when a value accumulated in a JavaScript timer turns into a 16-digit floating point number after adding a decimal?
See this:
var startNum = 1;
setInterval(function () {
    startNum += 0.1;
    console.log(startNum);
}, 500);
The output is as follows:
1.1
1.2000000000000002
1.3000000000000003
1.4000000000000004
1.5000000000000004
1.6000000000000005
Is this a pitfall? It's frustrating.
Reply content:
It's normal; this is caused by floating point precision, not by the timer. If you don't believe it, try 0.1 + 0.2: it does not equal 0.3.
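You can verify this in any JavaScript console; the error already appears in a single addition:

// 0.1 and 0.2 cannot be stored exactly in binary, so their sum
// comes out slightly above 0.3.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false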
http://segmentfault.com/q/1010000000137297
Also have a look at the floating-point standard: JavaScript stores every number as an IEEE 754 64-bit double, which cannot represent most decimal fractions exactly.
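To see what actually gets stored, you can ask the console for more digits than JavaScript normally prints; this is just a quick check, independent of the timer code:

// 0.1 is rounded to the nearest 64-bit double, which is slightly
// larger than one tenth.
console.log((0.1).toPrecision(21)); // 0.100000000000000005551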
When you need to do decimal arithmetic in a programming language, it is best to convert the values to integers first, do the math, and then scale back for display.
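As a rough sketch of that idea applied to the timer in the question (the variable name tenths is just illustrative), count in whole tenths and divide only when printing:

// Integers are represented exactly up to 2^53, so no rounding
// error accumulates across iterations.
var tenths = 10; // represents 1.0
setInterval(function () {
    tenths += 1;              // an exact integer step of "0.1"
    console.log(tenths / 10); // prints 1.1, 1.2, 1.3, ...
}, 500);

If you only care about how the value looks, startNum.toFixed(1) in the original code would also clean up the output, but the small rounding error would keep accumulating underneath.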