Definition of Decimal

A data type that stores a signed, exact numeric value, defined by the number of digits appearing before and after the decimal point, with a maximum of 29 total digits. Not every 29-digit value can be represented; at the maximum precision, only part of the 29-digit range fits within the type's storage.

Computer Science
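The practical benefit of an exact decimal type is that base-10 fractions such as 0.1, which binary floating point cannot store exactly, round-trip without error. A minimal sketch using Python's `decimal` module (an arbitrary-precision decimal type whose default context of 28 significant digits is close to the 29-digit limit described above):

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so repeated
# addition accumulates a small error:
float_sum = 0.1 + 0.1 + 0.1
print(float_sum == 0.3)                # False

# Decimal stores the exact base-10 value, so the same sum is exact:
exact_sum = Decimal("0.1") + Decimal("0.1") + Decimal("0.1")
print(exact_sum == Decimal("0.3"))     # True
```

This is why decimal types are preferred over binary floats for currency and other quantities where base-10 exactness matters.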

Other definitions of Decimal

A number in a counting system that is based on units of 10. Most commonly used to refer to a real number expressed in decimal notation (i.e. containing a '.' between the integer part and the fractional part of the number).

Computer Science
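Decimal notation is positional: each digit is weighted by a power of 10 determined by its distance from the decimal point. A small sketch (the digit lists are an illustrative example, not from the original text) that expands 437.25 digit by digit:

```python
# 437.25 = 4*10^2 + 3*10^1 + 7*10^0 + 2*10^-1 + 5*10^-2
digits_int = [4, 3, 7]   # digits before the decimal point
digits_frac = [2, 5]     # digits after the decimal point

# Integer part: the leftmost digit carries the highest power of 10.
value = sum(d * 10 ** (len(digits_int) - 1 - i)
            for i, d in enumerate(digits_int))

# Fraction part: powers of 10 decrease below zero, one per position.
value += sum(d * 10 ** -(i + 1) for i, d in enumerate(digits_frac))

print(value)
```

The same positional rule generalises to any base; base 10 is simply the case where each position holds one of ten digit values.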