• @[email protected]
    69 · 2 days ago

    I mean, that’s literally how research works. You make small discoveries and use them to move forward.

        • @[email protected]
          26 · 2 days ago

          Must be the dumbest take on QC I’ve seen yet. You expect a lot of people to focus on how it’ll break crypto. There’s a great deal of nuance around that and people should probably shut up about it. But “dime stuck in the road is a stable datapoint” sounds like a late 19th century op-ed about how airplanes are impossible.

          • qaz
            4 · 2 days ago

            The internet is pointless, because you can transmit information by shouting. /s

      • Zement
        35 · 2 days ago

        Are you aware that the RAM in your computing devices loses information when you read a bit?

        Why don’t you switch from smartphone to abacus and dwell in the anti-science reality of medieval times?

        • @[email protected]
          8 · 2 days ago

          And that it loses data after merely a few milliseconds if left alone; to account for that, DDR5 reads and rewrites even untouched data every 32 ms.
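
          That read-and-rewrite scheme is easy to see in a toy model. This is a sketch, not real DDR5 behaviour; the retention time, decay model, and function names below are illustrative assumptions.

```python
# Toy model of a DRAM cell: a leaky capacitor that must be refreshed
# before its charge decays below the sense threshold.
# All timings here are illustrative, not real DDR5 parameters.

RETENTION_MS = 64         # assumption: bit becomes unreadable after ~64 ms idle
REFRESH_INTERVAL_MS = 32  # the 32 ms refresh period mentioned above

def simulate(total_ms, refresh_enabled):
    """Return True if the stored bit survives total_ms of simulated time."""
    since_refresh = 0
    for _ in range(total_ms):
        since_refresh += 1
        if since_refresh > RETENTION_MS:
            return False          # charge leaked away: bit lost
        if refresh_enabled and since_refresh >= REFRESH_INTERVAL_MS:
            since_refresh = 0     # read-and-rewrite restores full charge
    return True

print(simulate(1000, refresh_enabled=True))   # True: refresh keeps data alive
print(simulate(1000, refresh_enabled=False))  # False: data gone within ~64 ms
```

          With refresh enabled the bit never gets close to the retention limit; without it, the data is lost on the first pass.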

        • @[email protected]
          3 · 2 days ago

          You’re describing how ancient magnetic core memory works; that’s not how modern DRAM (Dynamic RAM) works. DRAM uses a constant, pulsing refresh cycle to recharge the tiny capacitor in each cell.

          And on top of that, SRAM (Static RAM) doesn’t even need the refresh circuitry; it just works and holds its data as long as it remains powered. It only takes 2 discrete transistors, 2 resistors, 2 buttons and 2 LEDs to demonstrate this on a simple breadboard.

          I’m taking a wild guess that you’ve never built any circuits yourself.
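
          The “holds its data as long as it remains powered” behaviour of that breadboard latch can be sketched in a toy software model. Purely illustrative, not a circuit simulation; the class and method names are assumptions.

```python
# Toy model of an SRAM cell: two cross-coupled inverters hold a bit for as
# long as power is applied, with no refresh needed.
# Illustration only, not a real circuit simulation.

class SRAMCell:
    def __init__(self):
        self.powered = True
        self.q = 0                     # the latched bit (one inverter's output)

    def write(self, bit):
        if self.powered:
            self.q = 1 if bit else 0   # force one side; feedback holds it

    def read(self):
        # Reading is non-destructive: the feedback loop regenerates the state.
        return self.q if self.powered else None

    def power_off(self):
        self.powered = False
        self.q = None                  # state vanishes when power is removed

cell = SRAMCell()
cell.write(1)
for _ in range(1000):                  # read as often as you like, no refresh
    assert cell.read() == 1
print(cell.read())                     # 1
cell.power_off()
print(cell.read())                     # None
```

          Unlike the DRAM case, there is no timer anywhere in this model: the bit simply persists until power is cut.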

          • Zement
            13 · 2 days ago (edited)

            I’m taking a wild guess that you completely ignored the subject of the thread to start an electronics engineering pissing contest?

            • @[email protected]
              1 · 2 days ago

              Do you really trust the results of any computing system, no matter how it’s designed, when it has pathetic memory integrity compared to ancient technology?

              • Zement
                10 · 2 days ago

                That is not a product. This is research.

          • @[email protected]
            6 · 2 days ago

            And you would have been there shitting on magnetic core memory when it came out. But without that we wouldn’t have the more advanced successors we have now.

              • @[email protected]
                6 · 2 days ago

                Doubt.

                Core memory loses information on read and DRAM is only good while power is applied. Your street dime will be readable practically forever and your abacus is stable until someone kicks it over.

                You’re not the arbiter of what technology is “good enough” to warrant spending money on.

                • @[email protected]
                  1 · 1 day ago

                  Core memory is also designed to compensate for that and almost instantly rewrites the data back to memory. That in itself might be a crude form of ‘error correction’, but it still lasts far longer than an hour.

                  Granted that quantum computers are a different beast of their own, how much digital data does a qubit actually store? And how does that stack up in a price-per-bit comparison?

                  If they already know quantum computers are more prone to memory errors, why not use reliable conventional RAM to store the intermediate data and let the quantum side of things be the ‘CPU’, or QPU if you like?

                  I dunno, it just makes absolutely no sense to me to utilize any sort of memory technology that, even with error correction, still manages to lose information faster than a jumping spider’s memory.