Not quite there yet … from left on the surface: 5G internet, WireGuard router, Pi-hole on a Zero W, and a 4x4 N95 HTPC, plus a 1080p projector. When a computer that size (actually smaller, since I don’t need a SATA bay) can outperform my tower, though …
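For anyone curious about the WireGuard-plus-Pi-hole half of that stack, a minimal client config might look like the sketch below. Everything here is an assumption for illustration: the keys and endpoint are placeholders, and 10.0.0.2 is a made-up address standing in for the Zero W running Pi-hole.

```ini
[Interface]
# Client keypair; generate with `wg genkey | tee privatekey | wg pubkey`
PrivateKey = <client-private-key>
Address = 10.0.0.3/24
# Point DNS at the Pi-hole on the Zero W so lookups are filtered over the tunnel
DNS = 10.0.0.2

[Peer]
PublicKey = <router-public-key>
# Route all traffic through the WireGuard router
AllowedIPs = 0.0.0.0/0
Endpoint = <router-hostname-or-ip>:51820
# Helps keep the tunnel alive behind 5G carrier NAT
PersistentKeepalive = 25
```

The `DNS =` line is the piece that ties the two boxes together; without it, clients would bypass the Pi-hole entirely.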

This photo of Meteor Lake shows 16GB of LPDDR5X on the package. AMD’s looking to kill the low-midrange GPU in the next couple of generations of APUs, with Intel attempting to reach parity. And all of this in a fraction of the power envelope of a midrange gaming rig.

Maybe it’s next-quarter-itis dominating the tech press, but these developments feel like they deserve a bit more attention given that all signs point to gaming 4x4 PCs with a wall wart in the next two years. This actually makes Intel’s exit from the NUC space somewhat surprising, but they’ve been shedding products pretty consistently and this may just be a part of that.

I’m in the situation of having a 5-year-old gaming rig that’s still going strong (caveat: I’m a factory/city-builder gamer, so an RX 6600 works fine for me at 4K60). I’m also moving into a stepvan in the next couple of weeks and am therefore suddenly very aware of power draw, so all of this may be more exciting to me than to the average bear, as I could see finally upgrading on account of a dead component in the next couple of years.
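To put the van power-draw concern in numbers, here’s a back-of-envelope runtime calculation. All figures are assumptions for illustration (a ~250 W tower under load vs. a ~45 W 4x4 box, a 1 kWh house battery, 85% inverter efficiency), not measurements from the actual rig.

```python
# Back-of-envelope: hours of gaming per battery charge in a van.
# All numbers below are illustrative assumptions, not measurements.

def runtime_hours(battery_wh: float, draw_w: float, inverter_eff: float = 0.85) -> float:
    """Usable runtime given battery capacity, system draw, and inverter losses."""
    return battery_wh * inverter_eff / draw_w

TOWER_W = 250.0    # assumed mid-tower + RX 6600 under gaming load
MINI_PC_W = 45.0   # assumed 4x4 SoC box under gaming load
BATTERY_WH = 1000.0  # assumed 1 kWh house battery

print(f"tower:   {runtime_hours(BATTERY_WH, TOWER_W):.1f} h")
print(f"mini PC: {runtime_hours(BATTERY_WH, MINI_PC_W):.1f} h")
```

Roughly a 5–6x difference in runtime, which is the whole appeal of a wall-wart gaming box off-grid.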

Yet there’s still that part of me from college that wants to keep abreast of the latest developments, and as I’ve watched now six desktop Intel generations hit benchmarks since I was the lucky winner of an 8086K, there’s been nothing that really draws a line in the sand and says “this will be the clear new minimum target.”

Intel starting over at 1st gen for Meteor Lake shows they see this finally changing. It honestly could have happened anywhere from introduction of E-cores to the seeming destination of Rentable Units, which have finally popped up outside of MLID. I’ve seen nothing about what AMD’s disaggregated endpoint looks like, even though I’m definitely looking to Strix Halo as where I may be able to ditch the ITX sandwich tower completely. Couple this with swapping out my TV for a native 1080p mini projector (a “maybe” suggestion that turned into having to try one at $40, and wow!), and I could be gaming in a van in fucking style with essentially zero dedicated hardware space in just a couple years!

Anyway, in situations like this, I’ve found that I may have inadequate sources, so I thought I’d see if anyone had suggestions.

  • @[email protected]
    link
    fedilink
    5
    edit-2
    1 year ago

    I did the TV -> projector swap last year: I got myself a 4K projector that sits above my bed and projects a massive 100" image on the opposite wall, and it’s awesome. I’ve got my PS5 and Switch hooked up to it, and I’m currently living the dream of being able to game and watch movies on a giant screen, all from the comfort of my bed. Some games really shine on a screen like that and you see them in a new light, like TotK, the Horizon series, Spider-Man, etc., and it’s 100% worth the switch, IMO.

    Now I also have a regular monitor: a nice low-latency QHD 16:10 monitor with HDR, hooked up to my PC, which also uses a 6600 XT btw. The main reason I use this setup is productivity, plus running some PC games that don’t have console equivalents, plus the colors look much nicer compared to my projector. Maybe if I bought a laser projector and had one of those special ALR screens I could get nicer colors, but all that is way beyond my budget.

    Although these days I’m not on my desktop as much as I used to be (I also have a Ryzen 6000 series laptop that I game on btw), I still like my desktop because of the flexibility and upgradability. I also explored the option of switching to a cloud-first setup and ditching my rig, back when I wanted to upgrade my PC and we had all those supply chain issues during Covid, but in the end, cloud gaming didn’t really work out for me. In fact, after exploring all the cloud options, I’ve been kind of put off by cloud computing in general; at least, the public clouds offered by the likes of Amazon and Microsoft. They’re just in it to squeeze you dry and take control away from you, and I don’t like that one bit. If I were to lean towards cloud anything, it’d be rolling my own, maybe using something like a Linode VM with a GPU, but the pricing doesn’t justify it if you’re looking at anything beyond casual usage. And that’s one of the things I like about a PC: I could have it running 24x7 if I wanted to and not worry about getting a $200 bill at the end of the month, like I got with Azure, because scummy Microsoft didn’t explain anywhere that you’d be paying for Bastion even if the VM was fully powered off…

    Anyways, back to the topic of CPUs: I don’t really think we’re at the cusp of any re-imagining; what we’ve been seeing is just gradual and natural improvement, maybe the PC taking inspiration from the mobile world. I haven’t seen anything revolutionary yet; it’s all been evolutionary. At most, I think we’ll see more ARM-like models: the integrated RAM you mentioned, more SoC/integrated solutions, maybe AI/ML cores being the new thing to look for in a CPU, maybe ARM itself making more inroads into the desktop and laptop space, since Apple has shown that you can use ARM for mainstream computing.

    On the revolutionary side, the things I’ve been reading about are stuff like quantum CPUs or DNA computers, but these are still very experimental, with very niche use cases. In the future I imagine we might have something like a hybrid semi-organic computer, with a literal brain that forms organic neural networks and evolves as per requirements; I think that would be truly revolutionary, but we’re not there yet, not even at the cusp of it. Everything else I’ve seen from the likes of Intel and AMD has just been evolutionary.

    • Pete Hahnloser (OP) · 8 · 1 year ago

      Short-throw 4K is certainly the stretch goal; my two 32" 4K60 HDR LGs are coming along in the meanwhile. Sometimes, you just can’t beat a shitton of pixels a couple of feet from your face.

      I’ve been going toward more control this year, having switched to KDE after one too many Win11 nags about OneDrive. It floors me that they really could have upped the telemetry without me jumping ship, but instead they toasted themselves off my desktop. I will not be participating in any sort of thin-client/VDI dystopia anytime soon for the reasons you enumerated, which is what makes the idea of that photo being my entire internet connection/VPN/pihole/gaming PC/display sometime soon so appealing even without the van situation.

      Yeah, there’s been a lot of progress on a lot of fronts, but SoCs coming to replace gaming towers that are essentially unchanged since the adoption of ATX as a form factor is to me bigger than the cores on a chip themselves. And the power envelope for a single-package CPU/GPU (RAM notwithstanding) with that level of performance would to me, as an enthusiast, obliterate everything up to the 80/800 GPU level. I’m sure people would still build towers because they like building towers, but I’m happy to let PSUs and power connectors sail quietly into the night.

      • @[email protected]
        link
        fedilink
        1
        edit-2
        1 year ago

        Just a heads up: with short throw, you have to be really sure to have a perfectly flat surface to project on. Even a 0.5 throw ratio means any kind of pull-down screen will be a nightmare to use; even tab-tensioned isn’t great (but acceptable).