Have you ever performed a simple arithmetic operation like 0.1 + 0.2? You might have gotten something strange: 0.1 + 0.2 = 0.30000000000000004.
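This isn't specific to JSON: any language that uses IEEE 754 doubles behaves the same way. A quick check in Java, for instance:

```java
public class FloatSum {
    public static void main(String[] args) {
        // Neither 0.1 nor 0.2 is exactly representable in binary,
        // so each is rounded on input and the errors surface in the sum.
        System.out.println(0.1 + 0.2);        // 0.30000000000000004
        System.out.println(0.1 + 0.2 == 0.3); // false
    }
}
```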
And JSON itself doesn't restrict numbers to any range or precision, so when I deal with JSON values I feel the need to represent them as a BigDecimal or a similar arbitrary-precision type to ensure I am not losing information.
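As a minimal sketch of why that matters, in Java: constructing the BigDecimal from the number's raw text keeps every digit, while routing it through a double rounds the value before you ever see it.

```java
import java.math.BigDecimal;

public class ExactJsonNumber {
    public static void main(String[] args) {
        // Parse the number's raw text directly: no digits are lost.
        BigDecimal exact = new BigDecimal("0.1");

        // Route it through a double first: the value is rounded to the
        // nearest binary fraction before BigDecimal ever sees it.
        BigDecimal viaDouble = new BigDecimal(Double.parseDouble("0.1"));

        System.out.println(exact);     // 0.1
        System.out.println(viaDouble); // 0.1000000000000000055511151231257827021181583404541015625
    }
}
```

Parsers often expose this directly; Jackson's DeserializationFeature.USE_BIG_DECIMAL_FOR_FLOATS is one example.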
I hope you work in a field where worrying about your integers exceeding 9 quadrillion is justified.
It could be a crypto key, a randomly distributed 64-bit database row ID, or a memory offset in a stack dump of a 64-bit program.
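The 9 quadrillion figure is 2^53 (9,007,199,254,740,992), the last point at which a double can represent every integer exactly. A sketch of what happens past it, using a made-up 64-bit row ID (the constant here is hypothetical, chosen only to have nonzero low bits):

```java
public class SafeIntegerLimit {
    public static void main(String[] args) {
        // Up to 2^53, doubles represent every integer exactly;
        // beyond it, neighboring integers collapse onto one value.
        long maxSafe = 1L << 53;  // 9_007_199_254_740_992
        System.out.println((double) maxSafe == (double) (maxSafe + 1)); // true

        // A hypothetical randomly distributed 64-bit row ID: a parser
        // that stores JSON numbers as doubles rounds the low bits away.
        long rowId = 0x7E3779B97F4A7C15L;
        long roundTripped = (long) (double) rowId;
        System.out.println(rowId == roundTripped); // false: data lost
    }
}
```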