Oracle uses a standard, variable-length internal format to store numbers. This internal format can hold up to 38 digits of precision.
The NUMBER data type takes two qualifiers:
column_name NUMBER(precision, scale)
Precision represents the number of significant digits in the value. If precision is not specified, Oracle uses 38 as the default.
Scale represents the number of digits to the right of the decimal point and defaults to 0. If scale is set to a negative number, Oracle rounds the value to the specified digit position to the left of the decimal point.
That is what the documentation says.
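A minimal sketch of the documented behavior (table and column names are made up for illustration; the comments describe standard Oracle NUMBER semantics):

CREATE TABLE scale_demo (amount NUMBER(5, 2));   -- holds at most 999.99
INSERT INTO scale_demo VALUES (123.456);         -- stored as 123.46: scale rounds to 2 decimal digits
INSERT INTO scale_demo VALUES (1234.5);          -- fails with ORA-01438: 4 integer digits exceed precision - scale = 3

CREATE TABLE neg_scale_demo (amount NUMBER(5, -2));
INSERT INTO neg_scale_demo VALUES (12345);       -- stored as 12300: negative scale rounds to the hundreds digit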
The pitfall that caught me: I assumed precision referred to the number of digits to the left of the decimal point, when in fact it is the total count of significant digits (the integer digits plus the digits after the decimal point).
For example, 123.0 and other numbers of the form xxx.x have 4 significant digits, and would be defined in MySQL as DOUBLE(4,1).
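A sketch of the same point on the Oracle side (the table name is hypothetical):

CREATE TABLE sig_digits_demo (v NUMBER(4, 1));   -- 4 significant digits, 1 after the point: up to 999.9
INSERT INTO sig_digits_demo VALUES (123.0);      -- fits: 3 integer digits + 1 fractional digit = 4 significant digits
INSERT INTO sig_digits_demo VALUES (1234.0);     -- fails with ORA-01438: the integer part alone already needs 4 digits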
Precision and scale in the Hibernate configuration file
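In a Hibernate mapping file, the <column> element carries the same pair of attributes (precision and scale), and Hibernate's schema export turns them into NUMBER(precision, scale) on Oracle. A minimal sketch, assuming a hypothetical PRODUCT entity with a BigDecimal price property mapped as <column name="PRICE" precision="10" scale="2"/> — the generated DDL should look roughly like:

-- DDL Hibernate's schema export would emit on Oracle for the mapping above
-- (entity and column names are hypothetical)
CREATE TABLE PRODUCT (
    ID    NUMBER(19, 0) NOT NULL,  -- a long/bigint identifier maps to NUMBER(19,0) on Oracle
    PRICE NUMBER(10, 2),           -- precision and scale carried through from the mapping
    PRIMARY KEY (ID)
);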