Definition
Format: NUMBER(precision, scale)
The precision (p) is the number of significant digits in the value, counted from the first non-zero digit on the left; the decimal point and the minus sign do not count toward the significant digits. Its valid range is 1 to 38, and the default is 38.
The scale (s) specifies where the value is rounded. Its valid range is -84 to 127, and the default is 0. When the scale is greater than zero, it is the number of digits kept to the right of the decimal point; when it is less than zero, the value is rounded to the specified number of digits to the left of the decimal point. Whether the scale is positive or negative, the maximum length of the integer part of the number is (precision - scale). If the precision is less than the scale, only fractional values, with no integer part, can be stored.
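As a rough sketch of these definitions, the hypothetical table below (table and column names are illustrative only) declares columns with a positive scale, a negative scale, and a scale greater than the precision:

-- Hypothetical table illustrating precision/scale combinations
CREATE TABLE number_demo (
  amount     NUMBER(5,2),   -- at most 3 integer digits (5 - 2) and 2 decimal digits
  rounded_k  NUMBER(5,-3),  -- rounded to the nearest thousand; up to 8 integer digits (5 - (-3))
  tiny_frac  NUMBER(4,5)    -- precision < scale: only fractional values smaller than 0.1
);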
The precision (p) and scale (s) of NUMBER are subject to the following rules (worked examples follow the list):
1) Oracle raises an error when the integer part of a number is longer than p - s.
2) When the fractional part of a number is longer than s, Oracle rounds it to s digits after the decimal point.
3) When s (the scale) is negative, Oracle rounds the value to |s| digits to the left of the decimal point.
4) When s > p, p is the maximum number of digits the value may occupy, counted leftward from the s-th place after the decimal point; the value must therefore have at least s - p zeros immediately after the decimal point. A non-zero digit further to the left raises an error, and digits beyond the s-th place after the decimal point are rounded.
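The statements below sketch how each rule behaves against the hypothetical number_demo table declared above; the expected outcomes are noted in the comments.

-- Rule 1: integer part longer than p - s raises ORA-01438
INSERT INTO number_demo (amount) VALUES (1234.5);       -- error: 4 integer digits, only 3 allowed
-- Rule 2: fractional part longer than s is rounded
INSERT INTO number_demo (amount) VALUES (123.456);      -- stored as 123.46
-- Rule 3: negative scale rounds to the left of the decimal point
INSERT INTO number_demo (rounded_k) VALUES (123456);    -- stored as 123000
-- Rule 4: scale > precision; rounding happens at the 5th decimal place,
-- and the value may occupy only decimal places 2 through 5
INSERT INTO number_demo (tiny_frac) VALUES (0.012345);  -- stored as 0.01235
INSERT INTO number_demo (tiny_frac) VALUES (0.123);     -- error: non-zero digit in the 1st decimal place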