• veni_vedi_veni@lemmy.world · 18 hours ago (edited)

    pretending their games run better on one brand over the other…

    That’s absolutely a thing. Plenty of benchmark channels show noticeable FPS differences between brands for some games, and depending on the game that can change the value proposition.

    • jollyrogue@lemmy.ml · 5 hours ago

      Famously, Nvidia’s drivers contain hacks for lots of games. When a game ships broken or badly optimized, Nvidia writes workarounds into the driver itself to make it run properly.
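
      Roughly the idea, as a made-up sketch (none of these names are real Nvidia driver code, just the concept of per-executable quirk profiles baked into the driver):

      ```cpp
      // Hypothetical sketch only: invented names, not actual Nvidia driver code.
      // The concept is a per-executable table of quirk flags the driver consults
      // when a game starts, so known-broken behavior gets patched around.
      #include <string>
      #include <unordered_map>

      struct Quirks {
          bool skip_redundant_state_changes; // game spams identical state calls
          bool clamp_anisotropic_filtering;  // engine requests invalid AF levels
          bool single_threaded_submit;       // game's multithreaded submit races
      };

      static const std::unordered_map<std::string, Quirks> kAppProfiles = {
          {"buggygame.exe", {true,  false, false}},
          {"oldengine.exe", {false, true,  true }},
      };

      Quirks profile_for(const std::string& exe) {
          auto it = kAppProfiles.find(exe);
          return it != kAppProfiles.end() ? it->second : Quirks{};
      }
      ```

      That per-game table is also why driver release notes so often call out specific titles.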

    • anomnom@sh.itjust.works · 14 hours ago

      CUDA support is what really pissed me off. I wanted to do some early machine learning (photogrammetry and computer vision stuff) 10-15 years ago, but the only way to run any of it was on Nvidia hardware.
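
      The lock-in shows up at the very first API call. Something like this minimal host-side check (plain CUDA runtime API, built with nvcc) only gets past the first call if an Nvidia card and driver are installed:

      ```cpp
      // Minimal illustration of the vendor lock-in: the CUDA runtime only
      // enumerates devices when an Nvidia GPU and its driver are present.
      #include <cstdio>
      #include <cuda_runtime.h>

      int main() {
          int count = 0;
          cudaError_t err = cudaGetDeviceCount(&count);
          if (err != cudaSuccess || count == 0) {
              // On AMD or Intel hardware this is as far as you get.
              std::printf("No CUDA device: %s\n", cudaGetErrorString(err));
              return 1;
          }
          std::printf("Found %d CUDA device(s)\n", count);
          return 0;
      }
      ```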

    • stressballs@lemmy.zip · 9 hours ago

      That has more to do with driver compatibility. “Engineered for NVIDIA” doesn’t mean your AMD card is at a disadvantage unless the game just came out. And when it comes to AAA titles you’ve either got the raw power or you don’t. If the drivers are up to date for the game in question, how the game is programmed isn’t really that big a concern.

      It’s just branding deals most of the time. They’re not using some secret proprietary tech.

    • BreakerSwitch@lemmy.world · 15 hours ago

      You’re right, but the further from release you are, the less relevant this becomes. Another win for patient gamers.

      • Buddahriffic@lemmy.world · 12 hours ago

        Over the lifetimes of the GPUs, many games that benchmarked better on Nvidia early on swapped places as the AMD drivers matured.