I had never come across this concept before running into this problem, so I am writing it down for the record.
In a project, some JavaScript code computed number * 0.2 (where number is an integer; at the time of my test it happened to be 259), and the result came out as 51.800000000000004.
At the time I was completely stumped and had no idea why, so I casually wrote System.out.println(259 * 0.2) in Java and got the same result.
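For reference, here is a minimal Java sketch of the test described above; it reproduces both the project result and the classic 0.1 + 0.2 case:

    public class FloatDemo {
        public static void main(String[] args) {
            int number = 259; // the value from my test
            // The int is promoted to double before the multiplication
            System.out.println(number * 0.2); // 51.800000000000004
            System.out.println(0.1 + 0.2);    // 0.30000000000000004
        }
    }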
That ruled out a data-type problem specific to one language, so I went online to look for an explanation from the experts.
----------------------------------------------------------------------
There is a well-known question on the Internet: why does computing 0.1 + 0.2 give the result 0.30000000000000004 in Java, JavaScript, and Python alike?
The explanation given by one expert (reproduced from: http://blog.csdn.net/zcczcw/article/details/7362473):
Computers store everything in binary 0s and 1s, and a decimal fraction may not convert to binary exactly, i.e. the conversion may never terminate. So the binary representation of such a number, and any arithmetic on it, carries an error. Converting decimal 0.1 and 0.2 to binary:

0.1 = 0.0001 1001 1001 1001 ... (1001 repeating forever)
0.2 = 0.0011 0011 0011 0011 ... (0011 repeating forever)

But the computer's hardware can only store a finite number of bits and cannot repeat a pattern indefinitely. A double-precision floating-point number occupies 64 bits in total, of which 53 bits are significant precision (a 52-bit stored fraction plus an implicit leading 1; the sign bit and exponent take the remaining bits). So what actually gets stored are the rounded values:

0.1       => 0.0001 1001 1001 1001 1001 1001 1001 1001 1001 1001 1001 1001 1001 1010
0.2       => 0.0011 0011 0011 0011 0011 0011 0011 0011 0011 0011 0011 0011 0011 010
0.1 + 0.2 => 0.0100 1100 1100 1100 1100 1100 1100 1100 1100 1100 1100 1100 1101 (the sum is rounded once more to 53 bits)

Converted back to decimal, the sum is 0.30000000000000004.
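You can confirm the explanation above directly in Java. Double.doubleToLongBits exposes the raw 64-bit IEEE 754 pattern, and constructing a BigDecimal from a double prints the exact decimal value that the 53-bit significand actually stores (a small sketch just to verify the numbers):

    import java.math.BigDecimal;

    public class BitsDemo {
        public static void main(String[] args) {
            // Raw bit patterns: 1 sign bit, 11 exponent bits, 52 fraction bits
            System.out.println(Long.toBinaryString(Double.doubleToLongBits(0.1)));
            System.out.println(Long.toBinaryString(Double.doubleToLongBits(0.2)));

            // Exact decimal value of each stored double
            System.out.println(new BigDecimal(0.1));
            // 0.1000000000000000055511151231257827021181583404541015625
            System.out.println(new BigDecimal(0.2));
            // 0.200000000000000011102230246251565404236316680908203125
            System.out.println(new BigDecimal(0.1 + 0.2));
            // 0.3000000000000000444089209850062616169452667236328125
        }
    }

The stored 0.1 and 0.2 are both slightly larger than the true values, and after the sum is rounded once more the combined error surfaces as the trailing ...04.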
This is the source of the precision error in fractional arithmetic in Java and JS.
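The original post stops at the explanation, but it is worth noting one common remedy in Java (a sketch of one option, not something from the post itself): do the arithmetic with java.math.BigDecimal constructed from strings, which represents decimal fractions exactly:

    import java.math.BigDecimal;

    public class ExactDemo {
        public static void main(String[] args) {
            // new BigDecimal(0.1) would inherit the binary rounding error;
            // new BigDecimal("0.1") is exact
            BigDecimal a = new BigDecimal("0.1");
            BigDecimal b = new BigDecimal("0.2");
            System.out.println(a.add(b)); // 0.3

            BigDecimal n = new BigDecimal("259");
            System.out.println(n.multiply(new BigDecimal("0.2"))); // 51.8
        }
    }

In JavaScript, which has no built-in decimal type, the usual workaround is to round the result (e.g. with toFixed) or to scale to integers before computing.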