• NannerBanner@literature.cafe · 37 points · 2 days ago

    Lol, I feel bad for anyone new to the PC building community. At least those of us with 10+ year old computers at this point can play most of the indie games coming out. I AM still surprised by how intensive some games can be when they look like Minecraft downgrades.

    • lightnsfw@reddthat.com · 15 points · 2 days ago

      Kind of funny story: I launched Stardew Valley yesterday and my displays absolutely shit themselves even though my graphics card is pretty new. Turned out that Nvidia’s stupid app had changed the display settings to something weird. I had to manually flip it back to borderless and that fixed it, but at first I was like “how, out of everything I’ve played, is this the one having problems?”

      • 87Six@lemmy.zip · 3 points · 22 hours ago

        Thx, I’ll send this to my bro who still thinks AMD is the one that makes shit drivers

      • AmbientChaos@sh.itjust.works · 3 points · 2 days ago

        My wife started Stardew Valley the other day and we also had display issues trying to output 4K. We still had to max out the zoom, and even then the dialog boxes are cut off until you zoom out. Unlucky

        • NannerBanner@literature.cafe · 2 points · 1 day ago

          Probably not the issue, but if you’re outputting 4K, is it to a TV? I had an issue with a big screen having its own weird zoom settings (the old widescreen/cinema/whatever modes).

          • AmbientChaos@sh.itjust.works · 1 point · 15 hours ago

            We’re outputting to a projector, but it’s the UI that’s actually cutting things off, unfortunately. I’m sure there’s a fix out there somewhere for a game this popular, I just need to do some digging 😁

  • trashcroissant@lemmy.blahaj.zone · 84 points · 2 days ago

    Jesus, I had to do a double take because I thought the stick person had somehow trapped a little human inside a pod for their entertainment and I was so confused.

  • bigchungus@piefed.blahaj.zone · 33 points · 2 days ago

    You see, the big mistake in 2029 was the person installing Windows. Now they can see the horrible data center right outside of their house. As they say, out of sight, out of mind.

  • real_squids@sopuli.xyz · 35 points · 2 days ago

    You can pry my “double” slot GPU out of my cold dead hands (good luck trying to run away with it, it’s heavy as fuck and needs a supporting post)

    • autriyo@feddit.org · 17 points · 2 days ago

      Tbh, the whole card format feels very legacy, even for my Vega 56 “dual slot” card, and that thing “only” consumes ~230W.

      If ppl back then could’ve foreseen what obscenely power hungry parts would be shoehorned into the expansion card format, they probably would’ve chosen a different approach for GPUs specifically.

      • kibiz0r@midwest.social · 11 points · 2 days ago

        iGPUs should have been a better option, but they were hamstrung by PCI conventions and graphics APIs favoring discrete VRAM.

        (Just look at how x86 SoC consoles run circles around similar-spec PCs.)

        I’m hoping that ARM is a chance to reset.

        • autriyo@feddit.org · 3 points · 2 days ago

          I do like the modularity of discrete GPUs though.

          But a cooling setup similar to CPUs would’ve been better for airflow.

          • kibiz0r@midwest.social · 5 points · 2 days ago

            Modularity is nice — both for personal preference reasons and incentivizing-market-competition reasons — but it does come at a cost.

            The thing is: even in our modular world right now, you don’t really have many choices. Two CPU companies, three GPU companies (two of them being the same as the CPU companies)…

            We could someday have a world where PC hardware is technically less modular than it is today but consumers have more choices in the marketplace than they do today.

          • real_squids@sopuli.xyz · 1 point · 2 days ago (edited)

            Kinda hard to do when so many GPU vendors slap their memory and power circuitry all over the place. Even if the die is in the same spot, cooler manufacturers would need to test-fit a bajillion models, and on top of that they’d need insane R&D budgets to keep up with new variants, sometimes coming out years after the original GPU.

            • autriyo@feddit.org · 2 points · 1 day ago

              My concerns were more about the airflow path and less about actually interchangeable coolers, although those are a thing.

              Like, it just feels wrong to blast the air into a solid PCB… That’s kind of solved with flow-through designs, but a tower-style cooler would probably be less noisy.

      • real_squids@sopuli.xyz · 2 points · 2 days ago (edited)

        I was joking about it being 2.5 slots; tbf most modern cards should be triple slot. Mine is 300W and it’s pretty chunky to stay below 60°C. The best option for big cards is a horizontal mobo imo.

        edit: unpopular opinion, but I’d rather have a chunky card that stays cool as fuck than a slim one. That’s why I picked up the Nitro when I had a 6650 XT.

  • rose56@lemmy.zip · 10 points · 2 days ago

    I guess we just won’t do anything about that, even though we have the power in our own hands.

  • wraekscadu@vargar.org · 6 up / 2 down · 2 days ago

    Supply will catch up with demand. High PC component prices are a temporary thing.

    • Skullgrid@lemmy.world · 20 points · 2 days ago

      Supply will catch up with demand. High PC component prices are a temporary thing.

      We’ve been saying that about housing since 2008.

    • Wintry@lemmy.blahaj.zone · 5 points · 1 day ago

      Maria dear, I think this may be your one character flaw. AI is a huge force for evil. It’s a tool, and sure, tools can be useful, but the way it’s being wielded is to the detriment of humanity. It’s not a cute novelty anymore. It’s not being used appropriately at any scale.

      I know you are fond of some of the silly and fun use cases, and it probably still has limited ethical uses. But as it stands, the power of compute is being used for propaganda, repression of people, and genocide; to dumb down and pacify the population. And with the way hardware is going, the corporate fascists may kill personal computing.

      It’s not worth discussing the good it might be able to do when it’s actively harmful to humanity and only helpful to the controlling and ruling class. It’s really sweet and endearing that you are so optimistic, but sadly it’s an evil world, and the point of this technology is to serve the interests of bad people. Any good it can do will be for their profit and control, not for the lives, well-being, and freedom of all of us.