Have you ever performed a simple arithmetic operation like 0.1 + 0.2? You might have gotten a strange result: 0.1 + 0.2 = 0.30000000000000004.
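A quick JavaScript sketch reproduces the behavior; the comments show what Node.js or a browser console prints:

```javascript
// IEEE 754 double-precision arithmetic: neither 0.1 nor 0.2 has an
// exact binary representation, so the sum carries a rounding error.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// A common workaround is comparing within a tolerance instead:
console.log(Math.abs(0.1 + 0.2 - 0.3) < Number.EPSILON); // true
```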

  • @aubeynarf · 7 hours ago

    JavaScript is truly a bizarre language. We don't need to go as far as arbitrary-precision decimals: it doesn't even feature integers.

    I have to wonder why it ever makes the cut as a backend language.

    • luciole (he/him) · 14 hours ago

      The JavaScript Number type is implemented as an IEEE 754 double, and as such any integer between -2^53 and 2^53 is represented without loss of precision. I can't say I've ever missed explicitly declaring a value as an integer in JS; it's dynamically typed anyway. There are the languages people complain about and the ones nobody uses.
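      To illustrate where that guarantee ends, a minimal sketch:

      ```javascript
      // Integers are exact up to 2^53; Number.MAX_SAFE_INTEGER is 2^53 - 1.
      console.log(Number.MAX_SAFE_INTEGER);       // 9007199254740991
      console.log(2 ** 53 === 2 ** 53 + 1);       // true: adjacent integers collapse
      console.log(Number.isSafeInteger(2 ** 53)); // false
      ```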

      • @aubeynarf · 13 hours ago

        And then JSON doesn't restrict numbers to any range or precision, so when I deal with JSON values I feel the need to represent them as a BigDecimal or similar arbitrary-precision type to ensure I'm not losing information.
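        A sketch of the hazard, using a made-up payload carrying a 64-bit ID:

        ```javascript
        // JSON.parse maps every number to an IEEE 754 double, so values
        // beyond 2^53 are silently rounded.
        const payload = '{"id": 9223372036854775807}'; // max signed 64-bit value
        console.log(JSON.parse(payload).id);           // 9223372036854776000, wrong

        // One escape hatch: transmit big numerals as strings and parse
        // them with BigInt (arbitrary-precision integers, ES2020).
        console.log(BigInt("9223372036854775807"));    // 9223372036854775807n
        ```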

        • luciole (he/him) · 3 hours ago

          I hope you work in a field where worrying about your integers exceeding 9 quadrillion is justified.

          • @aubeynarf · 2 hours ago

            Could be a crypto key, or a randomly distributed 64-bit database row ID, or a memory offset in a stack dump of a 64-bit program.
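            For exactly those cases, a sketch of the BigInt escape hatch (the row-ID value below is made up for illustration):

            ```javascript
            // A randomly distributed 64-bit row ID routinely exceeds 2^53.
            const rowId = 0xDEADBEEFCAFEBABEn;  // BigInt literal
            console.log(rowId);                 // 16045690984503098046n

            // Mixing BigInt and Number throws instead of silently rounding.
            try {
              console.log(rowId + 1);           // TypeError
            } catch (e) {
              console.log(e.message);
            }
            console.log(rowId + 1n);            // 16045690984503098047n
            ```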