So, this isn’t exactly retrocomputing and it looks like it’s from four years ago, but I just found it and thought that many here may find it interesting.

This guy built a simple 8-bit CPU out of essentially discrete logic circuits (from what I’ve seen so far, nothing more complex than a 4-bit adder chip) and explained the entire process.

youtube.com/playlist?list=PLow…
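For a sense of scale, a 4-bit adder chip like the one mentioned is itself just a handful of gates chained together. A rough gate-level sketch (my own illustration of a ripple-carry adder, not code from the video series):

```python
# A 4-bit ripple-carry adder built from gate-level operations only,
# the kind of logic a discrete 4-bit adder chip packages into one part.

def full_adder(a, b, cin):
    """One-bit full adder from XOR/AND/OR gates."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def add4(x, y):
    """Add two 4-bit numbers bit by bit, least-significant bit first."""
    carry = 0
    result = 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result, carry

print(add4(0b0101, 0b0011))  # (8, 0): 5 + 3 = 8, no carry out
print(add4(0b1111, 0b0001))  # (0, 1): 15 + 1 wraps, carry out set
```

Chaining four of these (the carry "rippling" from each bit to the next) is exactly what the single adder chip does internally.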

  • screwlisp
    48 months ago

    @me @retrocomputing

    1. extremely cool
    2. The thing is, while rolling back experienced computer performance (degrowing computing) clearly has to happen,
      I think hand-wired 8-bit computing is an educational rather than a practical thing. (Obviously 8-bit AVR MCUs are a practical thing.) 32-bit physical Lisp machines on FPGAs! (Eventually.)
    • @[email protected]
      48 months ago

      Silicon foundries use a lot of water and raw materials and contaminate the ground. Full degrowth may involve abandoning semiconductor technologies and making computers out of simpler parts, such as electromagnetic relays. They’ll be a lot slower and simpler, but with the right knowledge, one can make them from raw materials without bootstrapping a complex technology chain.

      • @[email protected]
        18 months ago

        abandoning semiconductor technologies and making computers out of simpler parts

        I remember reading an article a while back about basically computing using cards which block or allow light to pass, acting as a series of logic gates. Another way to think of it is reinventing the punch card.
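        A toy way to see how light-blocking cards behave as gates (the card/detector model below is my own illustration, not from the article):

```python
# Toy model: a "card" is a row of hole positions (True = hole, light passes).
# Stacking two cards in one light path lets light through only where BOTH
# have holes (AND); merging two parallel light paths at a detector gives OR.

def stacked(card_a, card_b):
    """Light exits the stack only where both cards have holes -> AND."""
    return [a and b for a, b in zip(card_a, card_b)]

def merged_paths(card_a, card_b):
    """Light reaches the detector if either path is open -> OR."""
    return [a or b for a, b in zip(card_a, card_b)]

a = [True, True, False, False]
b = [True, False, True, False]
print(stacked(a, b))       # [True, False, False, False]
print(merged_paths(a, b))  # [True, True, True, False]
```

        (Inverting a signal takes more than a passive mask, which is part of why such schemes stay curiosities.)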

    • me (OP)
      38 months ago

      Oh yeah, I can’t see any practical use outside of educational, but for that purpose, it’s a fantastic resource.

      • screwlisp
        8 months ago

        @curtosis
        Unlike me, you should talk to @amszmidt. What do you think about the existing port/emulation of the 1985 LM-3 #MITCADR to #HDL as a starting point for a modern lisp computer (instead of the scheme things that happen sometimes)? (I’ve tried and failed to get involved because of excuses. ;_;).

        I forgot the LM discussion of Actually Using The Extra Bits Available.
        @me @retrocomputing