AI is asbestos in the walls of our tech society, stuffed there by monopolists run amok. A serious fight against it must strike at its roots
Unfortunately, most of the GPUs aren’t usable by gamers. We aren’t talking about mining booms, where miners buy up gaming GPU stock and then sell it cheap when the bubble bursts.
We’re talking about companies buying huge GPUs that have no video outputs and run a different software stack from the one used for gaming, missing all kinds of features and game-specific patches.
Granted, many 4090s/5090s were also used, and those will be usable by gamers, but even with a significant price drop, only richer gamers will find them viable.
Somewhat similar story for memory - a lot of it is tied up in HBM, or as GDDR on enterprise graphics cards.
And their energy usage must be outrageous
True about those without video outputs. But on Linux, game-specific patches are in Mesa and not in the graphics drivers, for example.
Additionally, on Windows (Linux too?) one can use Moonlight / Sunshine to render on the GPU and stream to a secondary device (either directly, say to a Chromecast, or via the iGPU to your monitor). Latency is quite small in most circumstances, and it allows for some interesting tricks (e.g. server GPUs let you split the GPU into multiple “mini-GPUs”; with the right card, you could essentially host two or more entirely separate, concurrent instances of GTA V on one machine, via one physical GPU).
A bit hacky, but it works.
Source: I bought a Tesla P4 for $100 and stuck it in a 1L case.
GPU goes brrr
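Not something the comment above spells out, but one way that “mini-GPU” splitting surfaces to software on newer datacenter cards (MIG on Ampere and later; the P4 itself uses NVIDIA’s older vGPU scheme) is that each partition shows up as its own device you can pin a process to via CUDA_VISIBLE_DEVICES. A minimal Python sketch; the MIG UUIDs and the two launched commands are placeholders you’d replace with values from `nvidia-smi -L` and your actual workloads:

```python
import os
import subprocess

# Placeholder MIG device UUIDs; list the real ones with `nvidia-smi -L`
# after the card has been partitioned.
PARTITIONS = [
    "MIG-11111111-2222-3333-4444-555555555555",
    "MIG-66666666-7777-8888-9999-000000000000",
]

# Placeholder workloads -- in practice these could be two Sunshine-hosted
# game sessions or two inference servers.
COMMANDS = [
    ["python", "run_instance.py", "--name", "instance-a"],
    ["python", "run_instance.py", "--name", "instance-b"],
]

procs = []
for device, cmd in zip(PARTITIONS, COMMANDS):
    env = os.environ.copy()
    # Each child process only sees its own slice of the physical GPU.
    env["CUDA_VISIBLE_DEVICES"] = device
    procs.append(subprocess.Popen(cmd, env=env))

# Wait for both concurrent instances to finish.
for p in procs:
    p.wait()
```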
I would pick up an inference GPU to run models locally. I can see a benefit in that, especially if they’re going cheap.
I certainly wouldn’t let something like a cheap RTX PRO 6000 Blackwell Server with 96GB of VRAM go to waste. I’d put it to good use with Blender rendering, running models I actually care about, and maybe some Games on Whales.
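If a 96GB card like that did land cheap, “running models I actually care about” could be as simple as pointing an off-the-shelf loader at it. A minimal sketch using the Hugging Face transformers library (plus accelerate for automatic device placement); the model name is just an example, and it assumes you have the weights downloaded and enough VRAM for them:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example model id -- swap in whatever open-weights model you actually care about.
model_id = "mistralai/Mistral-7B-Instruct-v0.3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # halves memory vs fp32; plenty of headroom in 96GB
    device_map="auto",            # place the weights on the GPU automatically
)

prompt = "Explain why datacenter GPUs usually lack video outputs."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```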