• nthavoc@lemmy.today · 56 points · 7 days ago

    Folks, ask yourselves, what game is out there that REALLY needs a 5090? If you have the money to piss away, by all means, it’s your money. But let’s face it, games have plateaued and VR isn’t all that great.

    Nvidia’s market is not you anymore. It’s the massive corporations and research firms running useless AI projects or number crunching. They have more money than all gamers combined. Maybe it’s time to go outside; me included.

    • Blackmist@feddit.uk · 28 points · 6 days ago

      Oh, VR is pretty neat. It sure as shit don’t need no $3000 graphics card though.

    • nuko147@lemmy.world · 8 points · 6 days ago

      Only Cyberpunk 2077 with path tracing. It’s the only game I haven’t played yet, because I’m waiting until I can run it on ultra settings. But for that amount of money, I’d better wait until the real 2077 to see it happen.

      • Shanmugha@lemmy.world · 2 points · 5 days ago

        The studio has done a great job. You’ve most certainly heard it already, but I’m willing to say it again: the game is worth playing at whatever quality you can afford, short of stutter-level low fps. The story is so touching that it outplays the graphics completely (though I do share the desire to play it on ultra settings; I’ll do that one day myself).

    • BT_7274@lemmy.world · 9 points · 6 days ago (edited)

      Cyberpunk 2077 with the VR mod is the only one I can think of. Because it’s not natively built for VR, you have to render the world separately for each eye, which roughly halves the overall frame rate. And with 90 fps as the bare minimum for many people in VR, you really don’t have a choice but to use the 5090.

      Yeah it’s literally only one game/mod, but that would be my use case if I could afford it.
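
      (Rough frame-time arithmetic behind the 90 fps point above; the figures below are illustrative assumptions, not measurements from the mod or any specific headset.)

      ```python
      # Illustrative numbers only: why the 90 fps VR target is so demanding when a
      # flat-screen game has to be rendered once per eye.
      TARGET_FPS = 90                              # common comfort floor for VR
      frame_budget_ms = 1000 / TARGET_FPS          # ~11.1 ms to deliver one stereo frame

      EYES = 2                                     # the mod renders the full scene per eye
      per_eye_budget_ms = frame_budget_ms / EYES   # ~5.6 ms of GPU time per eye view

      equivalent_flat_fps = 1000 / per_eye_budget_ms   # ~180 fps worth of conventional rendering
      print(f"{frame_budget_ms:.1f} ms/frame, {per_eye_budget_ms:.1f} ms/eye, "
            f"~{equivalent_flat_fps:.0f} fps equivalent")
      ```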

    • infyrian@kbin.melroy.org · 3 points · 6 days ago

      I plan on getting at least a 4060 and sitting on that for years. I’m on a 2060 right now.

      My 2060 alone can run at least 85% of all the games in my libraries across platforms, but I want at least 95% or 100%.

  • Mr_Dr_Oink@lemmy.world · 34 points · 6 days ago

    Since when did gfx cards need to cost more than a used car?

    We are being scammed by Nvidia. They are selling stuff whose equivalent, 20 years ago, would have been some massive research prototype. And there would have been, like, 2 of them in an Nvidia bunker somewhere, powering Deep Thought whilst it calculated the meaning of life, the universe, and everything.

    3k for a gfx card. Man, my whole PC cost 500 quid and it runs all my games and PCVR just fine.

    Could it run better? Sure

    Does it need to? Not for 3 grand…

    Fuck me!..

    • Krompus@lemmy.world · 4 points · 5 days ago

      I haven’t bought a GPU since my beloved Vega 64 for $400 on Black Friday 2018, and the current prices are just horrifying. I’ll probably settle for midrange on my next build.

    • Krompus@lemmy.world · 5 points · 5 days ago

      AMD’s Windows drivers are a little rough, but the open source drivers on Linux are spectacular.

  • MHLoppy@fedia.io · 33 points · 7 days ago

    It covers the breadth of problems pretty well, but I feel compelled to point out that there are a few places where things are misrepresented in this post, e.g.:

    "Newegg selling the ASUS ROG Astral GeForce RTX 5090 for $3,359 (MSRP: $1,999)"

    "eBay Germany offering the same ASUS ROG Astral RTX 5090 for €3,349,95 (MSRP: €2,229)"

    The MSRP for a 5090 is $2k, but the MSRP for the 5090 Astral – a top-end card being used for overclocking world records – is $2.8k. I couldn’t quickly find the European MSRP but my money’s on it being more than 2.2k euro.

    "If you’re a creator, CUDA and NVENC are pretty much indispensable, or editing and exporting videos in Adobe Premiere or DaVinci Resolve will take you a lot longer[3]. Same for live streaming, as using NVENC in OBS offloads video rendering to the GPU for smooth frame rates while streaming high-quality video."

    NVENC isn’t much of a moat right now, as both Intel and AMD’s encoders are roughly comparable in quality these days (including in Intel’s iGPUs!). There are cases where NVENC might do something specific better (like 4:2:2 support for prosumer/professional use cases) or have better software support in a specific program, but for common use cases like streaming/recording gameplay the alternatives should be roughly equivalent for most users.
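
    (For anyone who wants to compare the encoders themselves rather than take anyone’s word for it: a minimal sketch, assuming an ffmpeg build with the three vendors’ H.264 hardware encoders enabled; "gameplay.mkv" is a placeholder input file.)

    ```python
    # Minimal sketch: encode the same recording with each vendor's hardware encoder
    # at the same bitrate, then compare the results by eye or with a quality metric.
    # Assumes ffmpeg was built with h264_nvenc (Nvidia), h264_qsv (Intel) and
    # h264_amf (AMD); "gameplay.mkv" is a placeholder input file.
    import subprocess

    for encoder in ("h264_nvenc", "h264_qsv", "h264_amf"):
        subprocess.run(
            ["ffmpeg", "-y",
             "-i", "gameplay.mkv",
             "-c:v", encoder,      # hardware encoder under test
             "-b:v", "8M",         # identical target bitrate for a fair comparison
             f"out_{encoder}.mp4"],
            check=True,
        )
    ```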

    "…as recently as May 2025 and I wasn’t surprised to find even RTX 40 series are still very much overpriced"

    Production apparently stopped on these for several months leading up to the 50-series launch; it seems unreasonable to harshly judge the pricing of a product that hasn’t had new stock for an extended period of time (of course, you can then judge either the decision to stop production or the still-elevated pricing of the 50 series).


    "DLSS is, and always was, snake oil"

    I personally find this take crazy given that DLSS2+ / FSR4+, when quality-biased, average visual quality comparable to native for most users in most situations. And that was with DLSS2 in 2023, not even DLSS3, let alone DLSS4 (which is markedly better on average). I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it, then what does it matter?

    Increasingly complex manufacturing nodes are getting expensive as all fuck. If it’s more cost-efficient to spend some of that die area on specialized cores that can do high-quality upscaling, instead of using all the die space for native rendering, then that’s fine by me. I don’t think dismissing DLSS (and its equivalents like FSR and XeSS) as “snake oil” is the right takeaway. If the options are (1) spend $X on a card that outputs 60 FPS natively or (2) spend $X on a card that outputs upscaled 80 FPS at quality good enough that I can’t tell it’s not native, then sign me the fuck up for option #2. People less fussy about static image quality and more invested in smoothness can be perfectly happy with 100 FPS and marginally worse image quality. Not everyone is as sweaty about static image quality as some of us in the enthusiast crowd are.

    There’s some fair points here about RT (though I find exclusively using path tracing for RT performance testing a little disingenuous given the performance gap), but if RT performance is the main complaint then why is the sub-heading “DLSS is, and always was, snake oil”?


    obligatory: disagreeing with some of the author’s points is not the same as saying “Nvidia is great”

    • JuxtaposedJaguar@lemmy.ml · 18 points · 7 days ago

      "I don’t really care how a frame is generated if it looks good enough (and doesn’t come with other notable downsides like latency). This almost feels like complaining about screen space reflections being “fake” reflections. Like yeah, it’s fake, but if the average player experience is consistently better with it than without it then what does it matter?"

      But it does come with increased latency. It also disrupts the artistic vision of games. With MFG you’re seeing more fake frames than real frames. It’s deceptive, and like snake oil in that Nvidia isn’t distinguishing between fake frames and real frames. I forget what the exact comparison is, but when they say “The RTX 5040 has the same performance as the RTX 4090” and that’s with 3 fake frames for every real frame, that’s incredibly deceptive.
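
      (To put rough numbers on the point about fake versus real frames: a small sketch of what an advertised 4x multi-frame-generation figure implies. The advertised frame rate here is an assumed example, not an Nvidia figure.)

      ```python
      # Illustrative only: what a 4x multi-frame-generation FPS number implies.
      advertised_fps = 240        # assumed number on the marketing slide
      generated_per_real = 3      # MFG 4x: 3 interpolated frames per rendered frame

      real_fps = advertised_fps / (1 + generated_per_real)   # 60 actually rendered frames/s
      render_interval_ms = 1000 / real_fps                   # input is still sampled every ~16.7 ms

      print(f"{advertised_fps} fps advertised -> {real_fps:.0f} fps actually rendered")
      print(f"the game still reacts on a ~{render_interval_ms:.1f} ms cadence")
      ```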

        • iopq@lemmy.world · 1 point · 4 days ago

          It does add latency: you need 1-2 ms to upscale the frame. However, if you are using a lower internal render resolution (rather than keeping the internal resolution the same and just upscaling the output), overall latency will be lower, because you get a higher frame rate.
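
          (A rough frame-time sketch of that trade-off; the per-resolution render times below are assumptions for illustration, not measurements.)

          ```python
          # Illustrative only: upscaling adds a fixed cost per frame, but dropping the
          # internal render resolution can save more time than that cost.
          native_4k_ms = 16.7         # assumed frame time rendering natively at 4K (~60 fps)
          internal_1440p_ms = 9.0     # assumed frame time rendering the same frame at 1440p
          upscale_ms = 1.5            # the 1-2 ms upscaling pass mentioned above

          upscaled_total_ms = internal_1440p_ms + upscale_ms
          print(f"native 4K:       {native_4k_ms:.1f} ms/frame (~{1000 / native_4k_ms:.0f} fps)")
          print(f"1440p + upscale: {upscaled_total_ms:.1f} ms/frame (~{1000 / upscaled_total_ms:.0f} fps)")
          # The upscaled path is faster overall, so end-to-end latency drops even though
          # the upscale step itself "adds" 1-2 ms.
          ```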

          • FreedomAdvocate@lemmy.net.au · 1 point · 4 days ago

            Yeah, so it doesn’t add latency. It takes like 1-2 ms IIRC in the pipeline, which, like you said, is less than / the same as / negligibly more than it would take to render at the native resolution.

              • FreedomAdvocate@lemmy.net.au · 1 point · 3 days ago

                So it has limits? Oh no…… At 1000 fps you can’t do many rendering effects at all. Luckily no one, and I do literally mean no one, plays games at 1000 fps.

                • iopq@lemmy.world · 1 point · 3 days ago

                  Yes, but that also means there’s no FPS advantage at all at 500 Hz when using DLSS, and people do play at 500 Hz.
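
                  (Same arithmetic at the extreme: once the native frame time is already close to the fixed upscaling cost, the gain disappears. Numbers are illustrative assumptions.)

                  ```python
                  # Illustrative only: at ~500 fps the whole frame budget is ~2 ms, so a ~1.5 ms
                  # upscaling pass eats more than the lower render resolution saves.
                  native_ms = 2.0        # ~500 fps native frame time (assumed)
                  lower_res_ms = 1.2     # assumed render time at a reduced internal resolution
                  upscale_ms = 1.5       # fixed upscaling cost carried over from above

                  print(f"native:   {1000 / native_ms:.0f} fps")
                  print(f"upscaled: {1000 / (lower_res_ms + upscale_ms):.0f} fps")  # ~370 fps: now slower
                  ```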

    • CheeseNoodle@lemmy.world · 17 points · 7 days ago

      I think DLSS (and FSR and so on) are great value propositions, but they become a problem when developers use them as a crutch. At the very least, your game should not need them at all to run on high-end hardware at max settings. They should then be options for people on lower-end hardware to either lower settings or combine higher settings with upscaling. When they become mandatory they stop being a value proposition, since the benefit stops being a benefit and starts just being necessary for baseline performance.

        • KokoSabreScruffy@lemmy.world · 3 points · 6 days ago

          Correct me if I’m wrong, but maybe they meant when publishers/devs list hardware requirements for their games and include DLSS in the calculations. IIRC AssCreed Shadows and MH Wilds did that.

    • poopkins@lemmy.world · 6 points · 6 days ago

      Thanks for providing insights and inviting a more nuanced discussion. I find it extremely frustrating that in communities like Lemmy it’s risky to write comments like this because people assume you’re “taking sides.”

      The entire point of the community should be to have discourse about a topic and go into depth, yet most comments and indeed entire threads are just “Nvidia bad!” with more words.

      Obligatory disclaimer that I, too, don’t necessarily side with Nvidia.

  • MiDaBa@lemmy.ml · 4 points · 5 days ago

    Nvidia is using the “it’s fake news” strategy now? My, how the mighty have fallen.

    I’ve said it many times, but publicly traded companies are destroying the world. The fact that they have to increase revenue every single year is not sustainable; it just leads to employees being underpaid, products that are built more cheaply, and invasive data collection to offset their previous poor decisions.

  • yeehaw@lemmy.ca · 21 points · 7 days ago

    Have a 2070 Super. Been thinking for a while now that my next card will be AMD. I hope they get back into high-end cards again :/

      • Victor@lemmy.world · 6 points · 7 days ago

        Concur.

        I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn’t matter. It plays everything I want at way more frames than I need (240 Hz monitor).

        E.g., Rocket League went from struggling to keep 240 fps at lowest settings, to 700+ at max settings. Pretty stark improvement.

        • FreedomAdvocate@lemmy.net.au · 1 point · 5 days ago

          "I went from a 2080 Super to the RX 9070 XT and it flies."

          You went from a 7-year-old GPU to a brand-new top-of-the-line one; what did you expect? That’s not a fair comparison lol. It’s got nothing to do with FSR4 vs DLSS4.

          • Victor@lemmy.world · 1 point · 5 days ago

            what did you expect?

            I expected as much. 👍

            The thing I was concurring with was simply that they said the 9070 was excellent.

            nothing to do with FSR4 vs DLSS4

            The 2080 Super supports DLSS. 🤷‍♂️

            I’m just posting an anecdote, bro. Chill.

            Also the 2080 Super was released in 2019, not 2018. 👍

  • ZeroOne@lemmy.world · 6 points · 6 days ago

    AMD & Intel Arc are king now. All that CUDA nonsense is just price-hiking justification.

  • Quibblekrust@thelemmy.club · 5 points · 5 days ago (edited)

    "Those 4% can make an RTX 5070 Ti perform at the levels of an RTX 4070 Ti Super, completely eradicating the reason you’d get an RTX 5070 Ti in the first place."

    You’d buy a 5070 Ti for a 4% increase in performance over the 4070 Ti Super you already had? Ok.

    • rdri@lemmy.world · 1 point · 5 days ago

      They probably mean the majority of people, not 4070 Ti owners. For them, buying that 4070 Ti would be a better choice already.

  • iopq@lemmy.world · 1 point · 4 days ago

    You don’t need NVENC; the AMD and Intel versions are very good. If you care about maximum quality, you would software-encode for the best compression.

  • 3aqn5k6ryk@lemmy.world · 7 points · 6 days ago

    My last Nvidia card was the GTX 980; I bought two of them. Then I heard about the 970 scandal. It didn’t directly affect me, but fuck Nvidia for pulling that shit. Haven’t bought anything from them since. Stopped playing games on PC afterwards; just occasionally on console and laptop iGPU.

  • 3dcadmin@lemmy.relayeasy.com · 3 points · 6 days ago

    After being on AMD for years, I recently went back to Nvidia for one reason: NVENC works way better for encoding livestreams and videos than AMD’s encoder.

  • kepix@lemmy.world · 1 point · 6 days ago

    “and the drivers, for which NVIDIA has always been praised, are currently falling apart”

    What? They’ve been shit since HL2.