• anamethatisnt@sopuli.xyz · ↑2 · 5 hours ago

    My wife’s old gaming PC lasted from 2016 to 2025, with the only upgrade being a new, larger SATA SSD. I’ve never had a gaming PC last that long before, and the cost per year turned out really great.

    I really hope the consumer market gets back on track before I need to make any more large purchases.

  • ArmchairAce1944@discuss.online · ↑17 · 17 hours ago

    As an elder millennial, I grew up in an era where a few years meant the difference between bleeding edge and obsolete. That continued until the late 2010s, and then things seemed to seriously stagnate.

    • WalrusDragonOnABike [they/them]@reddthat.com · ↑2 · 15 hours ago

      I sorta feel like real-world performance* gains on the higher-end consumer desktop side dropped off significantly after the first generation that included DDR4 and PCIe 4, except for GPUs if you specifically used the RTX features. I rarely had issues with my i7-4790K setup, and that was a DDR3/PCIe 3 generation with limited NVMe support. I only replaced it last year because the mobo died.

      *Based entirely on my own use cases.

    • MBech@feddit.dk · ↑2 · 16 hours ago

      My GTX 1070 became obsolete after around 4 years. Not “can’t launch games” obsolete, but more “can’t keep a stable fps in newer games” obsolete. However, my 3070 is still going strong, with no real issues in anything after 5 years. I see absolutely no reason to upgrade it.

  • elbiter@lemmy.world · ↑9 · edited · 16 hours ago

    Maybe it’s time to value debloated software…

    Maybe a mouse driver doesn’t have to take 1 GB of RAM and send usage statistics to the manufacturer…

    • Brummbaer@pawb.social · ↑4 · 16 hours ago

      It’s the end of the Silicon Valley era. In a few years I guess new and faster stuff will come from China.

      • jacksilver@lemmy.world · ↑3 · 16 hours ago

        Unless there is a paradigm or materials breakthrough, I wouldn’t expect a major leap anytime soon.

        It’s kinda the same with TV quality and video game graphics. We’ve been squeezing a lot out of the current technology, but further enhancements will be incremental.

        • Spice Hoarder@lemmy.zip · ↑2 · 15 hours ago

          The paradigm shift will be going back to low-level programming. Quake-type innovations, like the fast inverse square root.
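
          For anyone curious, here’s a minimal sketch of that trick (the Quake III fast inverse square root); it uses memcpy instead of the original’s pointer cast so it’s well-defined C:

          #include <stdio.h>
          #include <string.h>
          #include <stdint.h>

          /* Approximates 1/sqrt(x): a bit-level first guess, then one
             Newton-Raphson refinement step. */
          static float q_rsqrt(float x)
          {
              float half = x * 0.5f;
              float y = x;
              uint32_t i;

              memcpy(&i, &y, sizeof i);      /* reinterpret float bits as int */
              i = 0x5f3759df - (i >> 1);     /* the famous "magic" constant   */
              memcpy(&y, &i, sizeof y);

              y = y * (1.5f - half * y * y); /* one Newton-Raphson iteration  */
              return y;
          }

          int main(void)
          {
              printf("approx 1/sqrt(4) = %f (exact: 0.5)\n", q_rsqrt(4.0f));
              return 0;
          }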

        • PhoenixDog@lemmy.world · ↑2 · 16 hours ago

          Yeah, I think we’re at the point of technology plateauing. The ways we interact with that tech may change, like VR vs. traditional screens, but we can only cram so many pixels onto a surface before it just starts looking marginally better for more money.

          I still watch a lot of YouTube and movies at no more than 1080p, because the bump in quality beyond that doesn’t improve the video enough for me to give a shit. I’ve seen 4K and 8K TVs in tech stores like Best Buy, and while they look awesome, it wouldn’t change how I consume the same media.

  • jballs@sh.itjust.works · ↑17 · 20 hours ago

    I remember back in the day you could buy a hard drive that was shown on the box as having something like 300 MB capacity, then when you opened the box you found out it actually had 500 MB, because the advances in manufacturing were outpacing their ability to print new boxes.

    Damn we had it good then.

    • T156@lemmy.world · ↑6 · edited · 15 hours ago

      Kind of unimaginable these days, since you’d expect it to be artificially limited to 300 MB, even if the hardware could support 500.

      Like how on some video cards, the 970(?), you could overclock it into being a 970 Ti if you were lucky. That was before they started lasering parts off the chips to stop people from doing that to lower-grade cards, when binning was just a matter of whether a chip passed muster for the higher-grade cards. They could sometimes be pushed to that level, just not reliably for all of them.

  • bearboiblake@pawb.social · ↑80 · 1 day ago

    20 years ago or so, I was at a computer parts store, pricing up parts to build a computer. I was on a limited budget and had already decided on a CPU, graphics card, and a motherboard.

    “Ah, crap, I forgot about RAM”, I said. “No problem”, the shopkeeper replied, “RAM is cheap”

    I don’t remember what the CPU I got was, but the GPU was an Nvidia GeForce 6600GT.

    • T156@lemmy.world · ↑4 · edited · 15 hours ago

      “Ah, crap, I forgot about RAM”, I said. “No problem”, the shopkeeper replied, “RAM is cheap”

      That was also the mindset in programming until very recently. Memory is cheap and plentiful, so outside of the most constrained niches you didn’t need to worry much about memory management.

      Strange to think that, with how things are going, we’re looping back around to resource-limited computing, not because of software bloat or anything, but because people won’t be able to afford to swap their computers out for something more powerful.

    • PhoenixDog@lemmy.world · ↑5 · edited · 16 hours ago

      I bought my current PC 13 years ago and haven’t changed a single part in it. At the time, it had a GTX 745, an i7-4790 @ 3.60 GHz, and 12 GB of DDR3 RAM, and I got it for less than $1,000. While I can’t play games like RDR2 or modern AAA titles on it, it still gets the job done for the games I enjoy playing in 2026.

  • Arghblarg@lemmy.ca · ↑35 · edited · 1 day ago

    I ‘panic bought’ (OK, not out of panic, but mild concern) a 22TB drive since the price seemed not too astronomical, and the local store had a few left. Just in case.

    Seems the supplies really are drying up. Fuck these AI companies. Doesn’t matter if they actually intended to wage a war on personal computation; their hoarding of the supply chain for years to come really is an assault on our ability to afford local self-hosted computing. I hope the bubble bursts, soon and hard.

    • GalacticSushi@lemmy.blahaj.zone · ↑7 · 16 hours ago

      Doesn’t matter if they actually intended to wage a war on personal computation

      I think they are intentionally waging war on personal hardware. They want everything cloud based so they have direct access and ownership of all user activity. Their wet dream is everyone’s computer being a streaming stick plugged into a monitor.

    • kameecoding@lemmy.world · ↑11 · 21 hours ago

      I have a video of me being giddy that I have an air fryer with two compartments that you can set up to finish at the same time. If one thing needs 15 minutes and the other only 10, the 10-minute side waits 5 minutes and starts when the 15-minute side counts down to 10, so my protein and the sides get finished at the same time.
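
      The scheduling logic is simple enough to sketch in C (a hypothetical version, with made-up names and the times from the example above):

      #include <stdio.h>

      /* Hypothetical sketch of the "finish together" scheduling: each
         compartment's start is delayed by (longest time - its own time),
         so everything finishes at the same moment. */
      struct basket { const char *item; int minutes; };

      int main(void)
      {
          struct basket b[] = { { "protein", 15 }, { "sides", 10 } };
          int n = sizeof b / sizeof b[0];
          int longest = 0;

          for (int i = 0; i < n; i++)
              if (b[i].minutes > longest)
                  longest = b[i].minutes;

          for (int i = 0; i < n; i++)
              printf("%s: wait %d min, then cook %d min\n",
                     b[i].item, longest - b[i].minutes, b[i].minutes);
          return 0;
      }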

      That’s awesome for me.

    • frunch@lemmy.world · ↑7 · 1 day ago

      Hell yeah, aging. It’s funny how such simple things can bring satisfaction, but if stuff like that makes you happy, I say you’re doing a-ok 🥂

      • osanna@thebrainbin.org · ↑7 · 24 hours ago

        I don’t have much (I’m long-term unemployed/disabled), so the few things I can afford really do get me excited.

  • RoidingOldMan@lemmy.world · ↑24 ↓2 · 1 day ago

    I know RAM and graphics cards are up, but computers are still getting cheaper. A $400 laptop today is slightly faster than a $400 laptop from 10 years ago, and due to inflation it’s also effectively cheaper: $400 in 2016 is very roughly $530 in today’s dollars.

    • Quetzalcutlass@lemmy.world · ↑37 ↓1 · edited · 1 day ago

      RAM, graphics cards, SSDs, and HDDs are all up, some several times over in price, and the disruption looks like it may continue for years. Combined with the death of Moore’s Law, a $400 computer at today’s prices might actually be worse than what you’d get for that money a decade ago.

      • FlordaMan@lemmy.world · ↑9 ↓1 · 1 day ago

        I find that hard to believe. Do you have an example of a laptop from 2016 that was sold for $400 that would perform better than a 2026 laptop costing $400?

    • paultimate14@lemmy.world · ↑2 · 17 hours ago

      The pricing shocks are slower to hit OEMs, but rest assured they will. A $400 laptop from July 2025 is going to cost $800 in July 2026.

    • Taldan@lemmy.world · ↑3 · 19 hours ago

      10 years ago, sure. But what about 2 years ago? Progress has slowed significantly, and core parts only very recently exploded in price.

    • rumba@lemmy.zip · ↑3 · 20 hours ago

      That’s only because Intel/AMD haven’t figured out how to cash in on the artificial BS yet :)

      Give ’em a week and they’ll tell us that all their processors are purchased through 2030.