• El Barto@lemmy.world · 1 year ago

    Bro you can’t say that “the 6502 is too weird” without providing more context. Too weird by today’s standards, maybe?

    Because the 6502 was everywhere back in the day.

    It’s like saying that Sony Walkmans are too weird.

    • irdc@derp.foo · 1 year ago

      The 6502 was weird back in the day as well, just weird in a ubiquitous way. Its registers are too small, its stack is too small, its address space is too small. Argument passing often had to be done through the zero page, and since none of its registers can hold a respectable portion of its address space, it requires hacks to implement such obscurities as C-style pointers. No current ABI can trace its origin to the 6502 (not even ARM’s).
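
      For instance, here’s a rough sketch of what reading through a C-style pointer looks like on the 6502 (ca65-style syntax; “buffer” is just a hypothetical label for the example, not from any particular program). Since A, X and Y are all 8 bits wide, the 16-bit pointer has to live in two zero-page bytes and gets dereferenced with the (zp),Y indirect mode:

          ptr = $80              ; two zero-page bytes holding the pointer (lo, hi)

                  lda #<buffer   ; point ptr at buffer: low byte first...
                  sta ptr
                  lda #>buffer   ; ...then the high byte
                  sta ptr+1

                  ldy #0         ; *ptr: read the byte the pointer refers to
                  lda (ptr),y    ; A = *ptr

          buffer: .byte $42      ; hypothetical data for the example

      Every C-style pointer access ends up going through some variant of that dance.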

      Sure, back in the day the alternatives at the price point were worse, but that doesn’t make the 6502 good.

      If you want a good CPU design with a 16-bit address space, take a look at the PDP-11.

      • El Barto@lemmy.world · 1 year ago

        I never said that the 6502 was good. I was questioning you calling it weird as if it were uncommon, when it wasn’t: it was one of the most popular chips of its era.

        Many say that the x86 architecture is awful. But to say that it’s weird would be quite the stretch.

        • irdc@derp.foo · 1 year ago

          It’s weird in the sense that software development has moved in other directions. A tagged-architecture stack machine like the Burroughs Large Systems is weird as well, even though it was highly successful and very influential on later designs (e.g. Forth, Smalltalk).

          If we were still using bank switching and overlays, I’d say learning to write 6502 assembly is a great introduction to modern computers, but we aren’t, so it isn’t.
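
          (For context, “bank switching” meant swapping ROM and RAM in and out of the 6502’s 64 KB address space through memory-mapped control bits. On the C64, for instance, you’d write to the processor port at $01; a sketch, the details vary per machine:

              lda #$35   ; LORAM=1, HIRAM=0: bank out the BASIC and KERNAL ROMs
              sta $01    ; RAM now visible underneath, I/O still mapped at $D000

          Nothing about that carries over to a modern flat-address-space machine.)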