Why does adding a decimal to a number in a JavaScript timer produce strange results? (JavaScript numbers are 64-bit floating point.)
Take a look at this:
var startNum = 1;
setInterval(function () {
  startNum += 0.1;
  console.log(startNum);
}, 500);
It prints this:
1.1
1.2000000000000002
1.3000000000000003
1.4000000000000004
1.5000000000000004
1.6000000000000005
Is this a known gotcha? Frustrating.
Reply content:
This is perfectly normal; it's a floating-point precision issue. If you don't believe it, try 0.1 + 0.2 and see that it does not equal 0.3.
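You can verify this yourself in any browser console or Node.js; the accumulated error below matches the output in the question:

```javascript
// 0.1 has no exact binary representation, so each addition
// carries a tiny rounding error under IEEE 754 doubles.
console.log(0.1 + 0.2);           // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);   // false

// The same error accumulates in the timer loop from the question:
var n = 1;
for (var i = 0; i < 6; i++) {
  n += 0.1;
  console.log(n);                 // 1.1, 1.2000000000000002, ...
}
```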
http://segmentfault.com/q/1010000000137297
The relevant standard is IEEE 754 (the double-precision binary floating-point format that JavaScript numbers use).
In any programming language, it is best to convert decimals to integers before doing the arithmetic, then scale back for display.
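A minimal sketch of that integer-scaling approach applied to the timer from the question (the variable names are my own): count in tenths, which are exact integers, and divide only when printing.

```javascript
// Work in tenths (exact integers) instead of accumulating 0.1.
var startTenths = 10; // represents 1.0
setInterval(function () {
  startTenths += 1;            // exact integer addition, no drift
  console.log(startTenths / 10); // prints 1.1, 1.2, 1.3, ...
}, 500);
```

A single division rounds to the same double as the literal (12 / 10 === 1.2), so the displayed values stay clean; the error never accumulates because the running total is an integer.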