Why can't JS handle decimal arithmetic correctly?

First look at the following program:

var sum = 0;
for (var i = 0; i < 10; i++) {
    sum += 0.1;
}
console.log(sum);

Will the above program output 1?

In "25 JavaScript questions you need to know", the 8th question only briefly touches on why JS cannot handle decimal arithmetic correctly. Today let's revisit that old question and analyze it in more depth.

But first of all, the inability to handle decimals precisely is not a design error of the JavaScript language itself; other high-level programming languages, such as C and Java, cannot handle decimal arithmetic exactly either:

#include <stdio.h>

int main(void) {
    float sum = 0;
    int i;
    for (i = 0; i < 100; i++) {
        sum += 0.1;
    }
    printf("%f\n", sum);  /* 10.000002 */
    return 0;
}
The representation of a number inside a computer

We all know that a program written in a high-level language must be interpreted or compiled into machine language that the CPU (central processing unit) can execute. The CPU does not work with decimal, octal, or hexadecimal numbers directly; the numbers we declare in a program are converted to binary before any operation is performed.

Why not convert to base-3 (ternary) instead?

The inside of a computer is made up of many ICs (integrated circuits), electronic components with many pins arranged along their sides or underneath. Every IC pin carries a DC voltage of either 0V or 5V; in other words, one pin can represent only two states. This characteristic of ICs is why data inside a computer can only be processed as binary numbers.

Since 1 bit (one pin) can represent only two states, counting in binary takes the form 0, 1, 10, 11, 100, and so on.

So in numeric operations, all operands are first converted into binary; for example, the decimal number 39 becomes the binary number 00100111.
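As a quick check in JS itself (this snippet is my own illustration, not from the original article), the built-in toString and parseInt can convert between decimal and binary notation:

var n = 39;
console.log(n.toString(2));          // "100111" (leading zeros are not printed)
console.log(parseInt('100111', 2));  // 39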

Binary representation of decimals

As mentioned above, data in a program is converted to binary, and decimals involved in an operation are converted to binary as well. For example, the decimal 11.1875 is converted to 1011.0011 (1011 is 11, and .0011 is 0.125 + 0.0625 = 0.1875).
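You can verify this in JS as well; toString(2) also prints the fractional part in binary for values that happen to be exactly representable (again, my own snippet, not from the original article):

var x = 11.1875;             // 11 + 0.125 + 0.0625
console.log(x.toString(2));  // "1011.0011"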

With 4 bits after the binary point, the representable values range from 0.0000 to 0.1111, built from only four place weights: 0.5, 0.25, 0.125, and 0.0625. The decimal value of a binary fraction is the sum of the weights of the bits after the point:

Binary   Decimal
0.0000   0
0.0001   0.0625
0.0010   0.125
0.0011   0.1875
0.0100   0.25
...
0.1000   0.5
0.1001   0.5625
0.1010   0.625
0.1011   0.6875
...
0.1111   0.9375
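To make the "sum the bit weights" idea concrete, here is a small sketch of my own (the function name binaryFractionToDecimal is made up for illustration); it computes the decimal value of a binary fraction string by adding the place weights 0.5, 0.25, 0.125, 0.0625, and so on:

// Sum the place weights of the bits after the binary point.
function binaryFractionToDecimal(str) {
    var digits = str.split('.')[1] || '';
    var value = 0;
    var weight = 0.5;                 // weight of the first bit after the point
    for (var i = 0; i < digits.length; i++) {
        if (digits.charAt(i) === '1') {
            value += weight;
        }
        weight /= 2;                  // each further bit weighs half as much
    }
    return value;
}

console.log(binaryFractionToDecimal('0.0011'));  // 0.1875
console.log(binaryFractionToDecimal('0.1011'));  // 0.6875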

As the table shows, the smallest non-zero value is 0.0625, so no decimal between 0 and 0.0625 can be expressed with only 4 bits after the binary point. Adding more bits after the point increases the number of representable decimal values, but no matter how many bits are added, 0.1 can never be reached exactly. In fact, 0.1 converted to binary is 0.000110011001100110011...; note that the pattern 0011 repeats forever:

console.log(0.2 + 0.1);  // 0.30000000000000004

// Binary representation of the operands:
// 0.1 -> 0.0001100110011001... (repeats forever)
// 0.2 -> 0.0011001100110011... (repeats forever)

JS's number type is not split into integer, single-precision, double-precision, and so on as in C/Java; every number is uniformly represented as a double-precision floating-point value. According to the IEEE 754 standard, a single-precision float uses 32 bits and a double-precision float uses 64 bits to represent a value, and a floating-point number is composed of a sign, a mantissa, an exponent, and a base, so not all of the bits are available for the fractional digits; the sign and exponent occupy some of them.
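The 64-bit layout (1 sign bit, 11 exponent bits, 52 mantissa bits) can be inspected from JS with typed arrays; the snippet below is my own sketch and assumes an environment that supports ArrayBuffer/DataView:

// Print the raw IEEE 754 double-precision bits of a number.
function doubleBits(x) {
    var buffer = new ArrayBuffer(8);
    new DataView(buffer).setFloat64(0, x);    // stored big-endian by default
    var bytes = new Uint8Array(buffer);
    var bits = '';
    for (var i = 0; i < bytes.length; i++) {
        bits += ('00000000' + bytes[i].toString(2)).slice(-8);
    }
    return bits;
}

var bits = doubleBits(0.1);
console.log(bits.slice(0, 1));    // 1 sign bit
console.log(bits.slice(1, 12));   // 11 exponent bits
console.log(bits.slice(12));      // 52 mantissa bits (0.1's repeating pattern, rounded off at the end)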

The mantissa of a double-precision float holds at most 52 bits, so when the two operands are added, the exact result 0.0100110011001100110011001100110011001100... must be rounded to fit within the available bits. When that rounded binary value is converted back to decimal, it becomes 0.30000000000000004.
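The effect is easy to reproduce, and the usual mitigations are to compare with a small tolerance or to scale to integers before doing arithmetic; the snippet below is my own sketch of those common approaches, not part of the original article:

console.log(0.1 + 0.2);           // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);   // false

// Compare with a small tolerance instead of exact equality.
console.log(Math.abs((0.1 + 0.2) - 0.3) < 1e-9);   // true

// Or scale to integers first (e.g. work in tenths), where the values are exact.
console.log((1 + 2) / 10);        // 0.3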

Summary

That JS, like other high-level programming languages, cannot handle decimal arithmetic exactly is not a design error in the language itself. Computers simply cannot represent every decimal fraction in binary, so decimal arithmetic often produces unexpected results.

Reference

"How the program is running."

Original source: http://www.ido321.com/1661.html
