Have you performed simple arithmetic operations like 0.1 + 0.2? You might have gotten something strange: 0.1 + 0.2 = 0.30000000000000004.
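A quick sketch of what's going on, runnable in any JS console (the `nearlyEqual` helper is just one common workaround I'm making up here, not the only answer):

```js
// 0.1 and 0.2 have no exact binary representation, so their sum
// carries a tiny rounding error.
console.log(0.1 + 0.2);              // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);      // false

// One common workaround: compare within a tolerance instead of exactly.
const nearlyEqual = (a, b, eps = Number.EPSILON) => Math.abs(a - b) < eps;
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```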
JavaScript is truly a bizarre language. Never mind arbitrary-precision decimals; it does not even have an integer type.
I have to wonder why it ever makes the cut as a backend language.
Popularity and ease of use, I guess.
The JavaScript Number type is implemented as an IEEE 754 double, and as such any integer between -2^53 and 2^53 is represented without loss of precision. I can't say I've ever missed explicitly declaring a value as an integer in JS. It's dynamically typed anyways. There are the languages people complain about and the ones nobody uses.
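For what it's worth, the edges of that range are easy to poke at in a console; a minimal sketch (the constants are standard JS):

```js
// 2^53 - 1 is the largest integer doubles can represent with no gaps.
console.log(Number.MAX_SAFE_INTEGER);                  // 9007199254740991
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1);  // true

// Beyond that, adjacent integers start to collide.
console.log(9007199254740992 === 9007199254740993);    // true
console.log(Number.isSafeInteger(9007199254740993));   // false
```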
And then JSON doesn’t restrict numbers to any range or precision; at least when I deal with JSON values, I feel the need to represent them as a BigDecimal or similar arbitrary-precision type to ensure I am not losing information.
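A minimal sketch of the failure mode (the `id` field is made up; whether you keep such values as strings, BigInt, or a decimal library type is a design choice):

```js
// A 64-bit-ish ID that is valid JSON but does not fit in a double:
const raw = '{"id": 9007199254740993}';
console.log(JSON.parse(raw).id);          // 9007199254740992 -- silently off by one

// One workaround: keep the field as a string on the wire and
// convert explicitly on the consuming side.
const safe = '{"id": "9007199254740993"}';
console.log(BigInt(JSON.parse(safe).id)); // 9007199254740993n
```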
I hope you work in a field where worrying about your integers hitting larger values than 9 quadrillion is justified.
Could be a crypto key, or a randomly distributed 64-bit database row ID, or a memory offset in a stack dump of a 64-bit program.
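If you do need exact 64-bit integers inside JS itself, BigInt covers that case (a sketch; the values are arbitrary):

```js
// BigInt is arbitrary precision, so full 64-bit values survive intact.
const rowId = 9223372036854775807n;   // 2^63 - 1, max signed 64-bit
console.log(rowId + 1n);              // 9223372036854775808n
console.log(Number(rowId));           // 9223372036854775808 -- precision lost converting back to Number
```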