c# decimal vs float
// Precision is the main difference.
float flt = 1F/3;
double dbl = 1D/3;
decimal dcm = 1M/3;
Console.WriteLine("float: {0} double: {1} decimal: {2}", flt, dbl, dcm);
// OUTPUT:
// float: 0.3333333
// double: 0.333333333333333
// decimal: 0.3333333333333333333333333333
/* float   -  ~7   significant digits (32-bit)
   double  - 15-16 significant digits (64-bit)
   decimal - 28-29 significant digits (128-bit)

   decimal has much higher precision and, because it stores values in base 10,
   it is usually used in financial applications that require a high degree of
   accuracy (see the 0.1 example below). decimal is much slower (up to 20x in
   some tests) than double/float, since its arithmetic runs in software rather
   than on the FPU (rough micro-benchmark below).

   A decimal cannot be compared with, or mixed into an expression with, a
   float/double without an explicit cast, whereas float and double convert
   implicitly (see the cast example below). decimal also preserves trailing
   zeros, which float and double cannot encode (see the last example below). */
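// A minimal sketch (not from the original answer) of why decimal suits money:
// double stores values in base 2, so 0.1 has no exact representation and the
// error accumulates; decimal stores values in base 10, so 0.1 is exact.
double dblSum = 0;
decimal dcmSum = 0;
for (int i = 0; i < 10; i++)
{
    dblSum += 0.1;   // each addition carries a tiny binary rounding error
    dcmSum += 0.1m;  // exact at every step
}
Console.WriteLine(dblSum == 1.0);   // False (dblSum is 0.9999999999999999)
Console.WriteLine(dcmSum == 1.0m);  // True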
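// A rough micro-benchmark sketch for the speed claim - the exact ratio depends
// on your hardware and runtime, so treat the "20x" figure as indicative only:
const int N = 10_000_000;
double dAcc = 1.0;
var sw = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < N; i++) dAcc *= 1.0000001;
sw.Stop();
Console.WriteLine("double:  {0} ms (result {1})", sw.ElapsedMilliseconds, dAcc);

decimal mAcc = 1.0m;
sw.Restart();
for (int i = 0; i < N; i++) mAcc *= 1.0000001m;
sw.Stop();
Console.WriteLine("decimal: {0} ms (result {1})", sw.ElapsedMilliseconds, mAcc);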
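// A sketch of the casting point (the identifiers here are made up for illustration):
decimal price = 9.99m;
double rate = 0.2;
// if (price > rate) { }            // compile-time error: no implicit conversion
//                                  // between decimal and double
if (price > (decimal)rate)          // fine once one side is cast explicitly
    Console.WriteLine("price exceeds rate");

float f = 1.5f;
double d = 2.5;
Console.WriteLine(f < d);           // True - float widens to double implicitly, no cast needed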
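// A sketch of the trailing-zero point: a decimal remembers its scale, so 1.0m
// and 1.00m are equal in value but print differently; double cannot do this.
Console.WriteLine(1.0m);            // 1.0
Console.WriteLine(1.00m);           // 1.00
Console.WriteLine(1.0m == 1.00m);   // True - equal value, different scale
Console.WriteLine(1.00);            // 1    - double drops the trailing zeros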