• @[email protected]
    4
    7 hours ago

    For the people looking to upgrade: always check the used market in your area first. For now it’s pretty obvious the best move is to try to grab a 40 series card from the drones who simply must have the 50 series.

  • yeehaw
    11
    10 hours ago

    lol this reminds me of whatever that card was back in the 2000s or so, where you could literally draw a trace with a pencil to upgrade the lower-tier version to the higher one.

    • @[email protected]OP
      1
      13 minutes ago

      Yeah, those were the days when cost control simply meant using the same PCB but with some traces left out. There were also quite a few cards that used the exact same PCB, traces intact, where you could simply flash the next tier card’s BIOS and get a significant performance bump.

      Did a few of those mods myself back in the day, those were fun times.

  • @[email protected]
    4
    8 hours ago

    Wow, it looks like it is really a 5060 and not even a 5070. Nvidia definitely shit the bed on this one.

  • Majorllama
    21
    16 hours ago

    Just like I rode my 1080ti for a long time, it looks like I’ll be running my 3080 for a while lol.

    I hope that in a few years, when I’m actually ready to upgrade, the GPU market isn’t so dire… but all signs are pointing to no, unfortunately.

    • @[email protected]
      1
      3 hours ago

      Same here.

      Only upgraded when my 1080 died, so I snagged a 3080 for an OK price. Not buying a new card until this one dies. Nvidia can get bent.

      Maybe team red next time….

      • @[email protected]
        1
        5 hours ago

        My 1080ti was probably peak Nvidia for me. It was so good, and I was super happy with it. Truly felt like an upgrade.

      • @leftzero
        3
        7 hours ago

        980ti here, playing Cyberpunk 2077 at sufficiently high settings without issues (if you call ~30fps 1440p with no path tracing “without issues”, that is).

      • Majorllama
        1
        12 hours ago

        If you’re still playing at 1080p it’s still a capable card tbh.

        I gave mine to a friend who still had a 660 when I upgraded to the 3080 lol.

        • @[email protected]
          1
          10 hours ago

          Oh yeah, it still gets the job done.

          Keeping my eyes open for used ones to upgrade with now that the new series is out though. Gives me an excuse to get the 1080 in my server.

    • @[email protected]
      1
      9 hours ago

      1060 6GB gang here… I will probably get a 3060 or 4060 next time I upgrade, unless I ditch Nvidia (thinking of moving to Linux).

  • circuitfarmer
    57
    21 hours ago

    Vote with your wallets. DLSS and Ray Tracing aren’t worth it to support this garbage.

    • @[email protected]
      3
      edit-2
      13 hours ago

      Just don’t buy a new card unless you really need to.

      I bought a 3080 because that gen seemed like a pretty solid bump in performance for the price. The new Nvidia gen is underwhelming so just wait.

      Or if you really need one now buy a 30 or 40 series card.

      • JackbyDev
        1
        2 hours ago

        I got a 3070ti but my old one was something like a 780? I don’t remember exactly. Before that I had a Radeon 5970. That’s an upgrade roughly every 7 years I guess.

      • circuitfarmer
        6
        12 hours ago

        That’s fair. I’m on an all AMD system with no need to change, tbh. 7800XT.

    • @[email protected]OP
      14
      edit-2
      15 hours ago

      Wish more gamers would, but that ship has long since sailed unfortunately. I mean, look at what the majority of gamers tolerate now.

      • circuitfarmer
        5
        20 hours ago

        Yeah, unfortunately you’re probably right. Brand image is also still good, somehow.

    • @[email protected]
      10
      edit-2
      13 hours ago

      DLSS and RT are great… but this gen definitely sucks. Just get a 4080 (if they’re cheaper).

        • @[email protected]
          19
          edit-2
          18 hours ago

          Lol wut? No it’s not. That’s a ridiculous thing to say. Properly implemented RT is gorgeous and worlds ahead of rasterized lighting. Sure, some games have shit RT, but RT in general is not a money grab. That’s a dumb thing to say.

          • yeehaw
            3
            10 hours ago

            While RT made my 2070S cry in Cyberpunk, the difference was way less noticeable than the difference in FPS. I dunno, IMO when you watch some slowed-down video or take your time inspecting things, you can tell a bit. But when you’re driving a car and shooting foes you can’t even tell. IMO, not worth $2,000 to improve my ray tracing and still not even hit 144 FPS.

            • @[email protected]
              2
              edit-2
              10 hours ago

              Cyberpunk is night and day with and without ray tracing. And seeing it in motion is the best way to notice a lot of the effects. I’m not sure what to say. The difference is so obvious to me, especially after the path tracing updates.

              • yeehaw
                3
                8 hours ago

                Well 24fps sure as shit didn’t help me.

          • Nik282000
            13
            19 hours ago

            Game engines don’t have to simulate sound pressure bouncing off surfaces to get good audio. They don’t have to simulate all the atoms in objects to get good physics. There’s no reason to have to simulate photons to get good lighting. This is a way to lower engine dev costs and push that cost onto the consumer.

            • JackbyDev
              1
              2 hours ago

              I can see your point to an extent. Good style is more important than hyper-realism. But that doesn’t mean hyper-realistic with good style can’t be good.

            • @[email protected]
              8
              16 hours ago

              Game engines don’t have to simulate sound pressure bouncing off surfaces to get good audio.

              Sure, but imitating good audio takes a lot of work. Just look at Escape From Tarkov, which has replaced its audio component twice(?) in 5 years and the output is only getting worse. If they could have an audio component that simulates sound in a more realistic way with minimal performance hit compared to the current solutions, I think they’d absolutely use it instead of having to go over thousands of occlusion zones just to get to “good enough”.

              They don’t have to simulate all the atoms in objects to get good physics.

              If it meant it solved all physics interactions, I imagine developers would love it. During TotK development Nintendo spent over a year on physics alone. Imagine if all that could be solved simply by putting in some physics rules. It would be a huge saving in development time.

              There’s no reason to have to simulate photons to get good lighting.

              I might be misremembering, but I’m pretty sure raytracing can’t reenact the double-slit experiment because it’s not actually simulating photons. It is simulating light in a more realistic way, though, and it’s going to make lighting scenes much easier.

              This is a way to lower engine dev costs and push that cost onto the consumer.

              The only downside of raytracing is the performance cost. But that argument could’ve been used against 3D engines in the early 90s as well. Eventually the tech will mature and raytracing will become the norm. If you argued that raytracing is a money grab at this very moment, I’d agree. The tech isn’t quite there yet, but I imagine within the next decade it will be. However, you’re presenting raytracing as something useless, and that’s just disingenuous.

              • Nik282000
                5
                15 hours ago

                Ray tracing is conceptually lazy and computationally expensive. Fire off as many rays as you can in every direction from every light source; when a ray hits something, it gets lit up and fires off more rays of lower intensity and maybe a different colour.

                Sure, you can optimize things by having a maximum number of bounces or a maximum distance each ray can travel, but all that does is decrease the quality of your lighting. An abstracted model can be optimized like crazy, BUT it takes a lot of manpower (paid hours) and doesn’t directly translate to revenue for the publisher.
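
                (For illustration only, here’s a minimal sketch of that bounce loop in Python; `scene.intersect`, `spawn_bounce_ray`, `emission` and `reflectance` are hypothetical placeholders, not any real engine’s API.)

                ```python
                # Follow one ray through the scene; every hit spawns a weaker bounce ray,
                # capped by a maximum bounce count and a maximum travel distance.
                MAX_BOUNCES = 4
                MAX_DISTANCE = 100.0

                def trace(ray, scene, depth=0):
                    if depth >= MAX_BOUNCES:
                        return 0.0                                     # bounce budget spent
                    hit = scene.intersect(ray, max_dist=MAX_DISTANCE)  # placeholder helper
                    if hit is None:
                        return 0.0                                     # ray escaped the scene
                    emitted = hit.material.emission                    # non-zero only at light sources
                    bounced = trace(hit.spawn_bounce_ray(), scene, depth + 1)
                    return emitted + hit.material.reflectance * bounced  # each bounce is dimmer
                ```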

                The only downside of raytracing is the performance cost.

                The downside is the wallet cost. Spread over thousands of copies of a game, the development cost of a better conventional lighting system is negligible; requiring ray tracing hardware is an extra 500-1000 bucks that could otherwise be spent on games.

                • @[email protected]
                  2
                  5 hours ago

                  The downside is the wallet cost.

                  The wallet cost is tied to the performance cost. Once the tech matures, companies will start competing on pricing and “the wallet cost” will come down. The rest of what you’re saying is just you repeating yourself. And now I also have to repeat myself.

                  If you argued that raytracing is a money grab at this very moment, I’d agree. The tech isn’t quite there yet, but I imagine within the next decade it will be. However, you’re presenting raytracing as something useless, and that’s just disingenuous.

                  There’s no reason to argue over the now; I agree that right now raytracing really isn’t worth it. But if you’re going to keep arguing that raytracing will never be worth it, you’d better come up with better arguments.

            • @[email protected]
              13
              18 hours ago

              You’re either not arguing in good faith or grossly misunderstanding why RT results in more realistic lighting. I suggest you read up on RT, how it works, and what it is supposed to be simulating.

            • @[email protected]
              4
              17 hours ago

              You’re forgetting about the #1 reason why ray tracing is so good: it saves hard drive space because you don’t need to pre-bake lighting when it’s rendered in real time. You should be a fan for that reason alone. Games have gotten too large.

            • @[email protected]
              2
              18 hours ago

              It looks pretty when enabled, but can you get Cyberpunk’s level of ray tracing visuals without it? Honest question; I thought that’s what made the game gorgeous, but if it’s possible without it, I don’t know why some teams haven’t shown those results without it. Then again, I may have missed some games that do something near equivalent without that feature.

              • circuitfarmer
                6
                16 hours ago

                Cyberpunk with RT off still looks damn good.

                Personally, if it’s between RT off and giving my money to a company normalizing lying to me, I’ll stick with RT off.

  • @[email protected]
    30
    19 hours ago

    I’ve got the feeling that GPU development is plateauing: new flagships are consuming an immense amount of power and the sizes are humongous. I do give DLSS, local AI and similar technologies the benefit of the doubt, but it’s just not there yet. GPUs should be more efficient and improve in other ways.

    • @[email protected]
      29
      18 hours ago

      I’ve said for a while that AMD will eventually eclipse all of the competition, simply because their design methodology is so different from everyone else’s. Intel has historically relied on simply cramming more into the same space. But they’re reaching theoretical limits on how small their designs can be; they’re limited by things like atom size and the speed of light across the distance of the chip. AMD, on the other hand, has historically used the same dies for as long as possible and relied on improving efficiency to get gains instead. They were often a generation (or even two) behind Intel in terms of pure hardware power, but still managed to compete because they used their chips more efficiently. As AMD also begins to approach those theoretical limits, I think they’ll do a much better job of actually eking out more computing power.

      And the same goes for GPUs. With Nvidia recently resorting to the “just make it bigger and give it more power” design philosophy, they’re likely also reaching theoretical limits.

      • @[email protected]
        11
        14 hours ago

        AMD never used chips “more efficiently”. They hit gold with the Ryzen design, but everything before that, going back to the Athlon, was horrible and more useful as a room heater. And before the Athlon it was even worse. The K6/K6-2 were funny little buggers that extended the life of Socket 7, but they lacked a lot of features, and don’t get me started on their DX4/5 stuff, which frequently died in spectacular ways.

        Ryzen works because of chiplets and the stacked cache. Add some very clever stuff in the pipeline, which I don’t presume to understand, and the magic is complete. AMD is beating Intel at its own game: its ticks and tocks are way better and, most importantly, actually executed. That is something Intel hasn’t really been able to do for several years, and it only now seems to be returning.

        And let’s not forget the USB problems with Ryzen 2/3 and the memory compatibility woes of Ryzen’s past (and, some say, present). Ryzen is good, but it’s not “clean”.

        In GPU design AMD clearly does the same but executes worse than Nvidia. The 9070 can’t even match its own predecessor, the 7900 XTX is again a room heater and anything but efficient, and let’s not talk about what came before: the 6xxx series was good enough but troublesome for some, and the Radeon VII was a complete shitfest.

        Now, with the 9070, AMD once again, for the umpteenth time, promises that the generation after will fix all its woes. That that one will finally be able to compete with Nvidia.

        Trouble is, they’ve been saying that for over a decade.

        Intel is the one looking at GPU design differently. The only question is: will they continue, or axe the division now that Gelsinger is gone? That would be monumentally stupid, but if we can count on one thing, it’s the horrible shortsightedness of corporate America. Especially when Wall Street is involved. And with Intel, Wall Street is heavily involved. Vultures are circling.