Why does simple decimal arithmetic give strange results?
For example, 5 * 1.015 does not give exactly 5.075, and 0.06 + 0.01 does not give exactly 0.07 in JavaScript. ECMAScript numbers are represented in binary as IEEE-754 (IEC 559) doubles, with a resolution of 53 bits, giving an accuracy of 15-16 decimal digits. Integers up to just over 9e15 (2^53) are precise, but few decimal fractions are, because most of them (including 0.01, 0.06 and 1.015) have no finite binary representation and are stored as the nearest representable double. Given that, arithmetic is as exact as it can be, but no more: each operand is rounded on input and each result is rounded to the nearest double. Operations on integers are exact as long as the true result and all intermediate results are integers within that range.
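
A short sketch of what this looks like in practice, pasted into any ECMAScript engine (a browser console or Node.js); the values in the comments are what engines typically display, and the workarounds shown (rounding for display, or doing the arithmetic in whole units such as cents) are common approaches rather than the only ones:

    // The operands are stored as the nearest doubles, so the results
    // are the correctly rounded sums/products of those approximations.
    5 * 1.015;     // 5.074999999999999, not 5.075
    0.06 + 0.01;   // 0.06999999999999999, not 0.07

    // Workaround 1: round only for display.
    (5 * 1.015).toFixed(3);               // "5.075" (a string)
    Math.round(5 * 1.015 * 1000) / 1000;  // 5.075

    // Workaround 2: do the arithmetic in integer units (e.g. cents),
    // then scale once at the end.
    (6 + 1) / 100;                        // 0.07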