Pretty much the question. I heard about Usenet a while back but never managed to wrap my head around it.

  • Boozilla · 10 months ago

    Usenet is a decades-old distributed message-sharing system. It’s like an old-school message board. To access it, you need a newsreader; Mozilla Thunderbird is one example.

    I have not accessed newsgroups in several years, so I don’t know how active they are today. But it used to be the go-to source for “warez”, bootleg media, and porn. Oh yeah, and discussion threads on myriad topics. :)

    Slashdot, Digg, Reddit, Lemmy, 4chan, etc. are all spiritual descendants of Usenet.

    The software tech for Usenet is old, slow, and has a learning curve. You might find it frustrating to navigate and use. However, modern newsreaders probably hide some of the complexity.
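
    Under the hood, a newsreader is just a client speaking NNTP to a news server, and that is most of what modern readers hide. Here’s a rough sketch of the raw conversation in Python; the server and group names are placeholders (many real servers also require TLS on port 563 and an account), so read it as an illustration of the protocol rather than a ready-to-use client.

    ```python
    # Minimal sketch of what a newsreader does under the hood: speak NNTP
    # (RFC 3977) over a plain TCP socket. HOST and GROUP are placeholders;
    # many real servers require TLS (port 563) and a login.
    import socket

    HOST = "news.example.com"   # placeholder news server
    PORT = 119                  # standard NNTP port
    GROUP = "comp.lang.python"  # placeholder group name

    def send(sock, line):
        """NNTP commands are single CRLF-terminated text lines."""
        sock.sendall((line + "\r\n").encode("utf-8"))

    def read_line(reader):
        return reader.readline().decode("utf-8", "replace").rstrip("\r\n")

    def read_multiline(reader):
        """Multi-line responses end with a line containing only '.'."""
        lines = []
        while True:
            line = read_line(reader)
            if line == ".":
                return lines
            # The server "dot-stuffs" lines that begin with a dot.
            lines.append(line[1:] if line.startswith("..") else line)

    with socket.create_connection((HOST, PORT)) as sock:
        reader = sock.makefile("rb")
        print(read_line(reader))          # greeting, e.g. "200 server ready"

        send(sock, f"GROUP {GROUP}")
        # Reply format: 211 <article-count> <first> <last> <group>
        parts = read_line(reader).split()
        print(" ".join(parts))
        last = parts[3]                   # assumes the group actually exists

        send(sock, f"HEAD {last}")        # headers of the newest article
        if read_line(reader).startswith("221"):
            for header in read_multiline(reader):
                if header.startswith(("Subject:", "From:", "Date:")):
                    print(header)

        send(sock, "QUIT")
        print(read_line(reader))
    ```

    A graphical newsreader like Thunderbird is doing essentially this underneath, plus authentication, local caching, and the threaded display.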

    • Em Adespoton · 10 months ago

      Usenet has its own protocol (NNTP) where digests of messages get passed from server to server, eventually making it out to all (or most) servers that carry a particular group (under top-level hierarchies like alt.*, comp.*, sci.*, etc.). In essence, it’s a federated digital bulletin board of bulletin boards. Many servers don’t carry certain hierarchies, such as alt.binaries.*.
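
      To make the server-to-server part concrete, here’s a toy in-memory model of that flooding in Python. It is not the real NNTP machinery (actual peers exchange articles with commands like IHAVE, and the list of visited servers lives in the article’s Path: header); it just shows the idea: an article fans out to peers, skips servers it has already visited, and never lands on a server that doesn’t carry its hierarchy.

      ```python
      # Toy model of article flooding between news servers. The classes and
      # names are made up for illustration; only the ideas (dedup by
      # Message-ID, loop prevention via a visited-servers path, per-hierarchy
      # carry lists) mirror how real Usenet peering behaves.
      from dataclasses import dataclass, field

      @dataclass
      class Article:
          message_id: str
          group: str
          body: str
          path: list = field(default_factory=list)  # servers already visited

      class Server:
          def __init__(self, name, carries):
              self.name = name
              self.carries = carries   # hierarchies this server accepts
              self.store = {}          # message_id -> Article
              self.peers = []

          def wants(self, article):
              # Many servers skip hierarchies such as alt.binaries.*
              return any(article.group.startswith(h) for h in self.carries)

          def offer(self, article):
              """A peer offers us an article (roughly an IHAVE exchange)."""
              if article.message_id in self.store or not self.wants(article):
                  return
              self.store[article.message_id] = article
              self.flood(article)

          def flood(self, article):
              # Each relayed copy records this server in its path, so peers
              # can tell the article has already been here.
              relayed = Article(article.message_id, article.group, article.body,
                                path=article.path + [self.name])
              for peer in self.peers:
                  if peer.name not in relayed.path:   # loop prevention
                      peer.offer(relayed)

      # Three peered servers; only two of them carry the alt.* hierarchy.
      a = Server("a.example", ["comp.", "alt."])
      b = Server("b.example", ["comp."])
      c = Server("c.example", ["comp.", "alt."])
      a.peers, b.peers, c.peers = [b, c], [a, c], [a, b]

      a.offer(Article("<1@a.example>", "alt.folklore.computers", "hello"))
      print(sorted(s.name for s in (a, b, c) if "<1@a.example>" in s.store))
      # ['a.example', 'c.example'] -- b.example doesn't carry alt.*
      ```

      The Message-ID dedup is what keeps the flood from echoing forever even when the peering topology has cycles.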

      Usenet pre-dates the world wide web, and even pre-dates Gopher. It was designed such that a Usenet server could spend most of its time disconnected from the Internet and accumulate local posts that would then be federated in a digest when the server dialed up and connected to other servers.

      The main NNTP network eventually made its way to centralized, web-accessible services, and most places that used to provide an NNTP server (which was most ISPs in the 90s) shut theirs down and only provided gateway and email services.

      The protocol still exists though, and there’s still a small connected network.

      In reality, Lemmy is the spiritual descendant of Usenet.

      • @[email protected] · 10 months ago

        “It was designed such that a Usenet server could spend most of its time disconnected from the Internet and accumulate local posts that would then be federated in a digest when the server dialed up and connected to other servers.”

        …Would this have been local posts from an individual, or sometimes a group on a LAN or something? The way you describe it puts me in mind of recent stuff like Scuttlebutt, albeit that’s more clearly individual-focused.

        • @[email protected] · 10 months ago

          What’s Scuttlebutt?

          And no, not individual and not LAN. WAN. A Usenet server could easily serve hundreds of folks, if not thousands. It would collect all their posts and then aggregate upwards.

          • @[email protected] · 10 months ago

            This gives a brief overview of Scuttlebutt with a link to a more technical breakdown.

            That said, I remain confused by the other person’s description, as I’m not sure how it accumulates posts while “disconnected from the Internet”. I follow how it works when connected, but not how it would work as described in the disconnected case, unless it’s sort of how I asked.

            • Em Adespoton · 10 months ago

              The Internet is a network of networks. Nowadays, everyone tends to have an always-on connection to the entire thing, but back in the day, many of the networks spent a lot of time disconnected from each other. Usenet was designed to handle this mostly transparently: the local network had an aggregating server that would collect all the local activity and share it with other Usenet servers whenever it could reach them.

              Remember that even the local activity was people connecting with teletype terminals and dialing up over modems from remote systems. Long-distance trunking fees were a big deal, and Internet routing had to deal with the possibility that there was currently no route to the destination address.
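
              Here’s a rough sketch of that store-and-forward pattern in Python, with made-up class and method names (historically the batches were commonly shipped as files over UUCP rather than anything like this): posts pile up in a local spool while there is no route out, and the whole pile is flushed as one batch the moment a link to a peer comes up.

              ```python
              # Illustrative store-and-forward sketch; the class and method
              # names are invented, not part of any real news server.
              import time
              from collections import deque

              class DialupNewsServer:
                  def __init__(self, name):
                      self.name = name
                      self.spool = deque()     # articles waiting to leave this site

                  def accept_local_post(self, group, body):
                      """Local users can post any time, whether the link is up or not."""
                      self.spool.append({"group": group, "body": body,
                                         "posted_at": time.time()})

                  def try_exchange(self, link):
                      """Called whenever we manage to dial a peer: flush the spool."""
                      if not link.connected:
                          return 0             # no route right now; keep accumulating
                      batch = list(self.spool)
                      link.send_batch(batch)   # one batch instead of N tiny transfers
                      self.spool.clear()
                      return len(batch)

              class PeerLink:
                  """Stand-in for an expensive long-distance dial-up connection."""
                  def __init__(self):
                      self.connected = False
                      self.received = []

                  def send_batch(self, batch):
                      self.received.extend(batch)

              site = DialupNewsServer("smallsite")
              link = PeerLink()

              # Overnight: the link is down, but local users keep posting.
              site.accept_local_post("comp.misc", "anyone else up this late?")
              site.accept_local_post("rec.games", "high score thread")
              print(site.try_exchange(link))   # 0 -- nothing leaves while disconnected

              # Morning dial-up window: connect and flush everything in one go.
              link.connected = True
              print(site.try_exchange(link))   # 2 -- both posts go out in a single batch
              ```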