floofloof@lemmy.ca to Technology@lemmy.world · English · 24 hours ago
Motherboard sales are now collapsing amid unprecedented shortages fueled by AI (www.tomshardware.com) · 108 comments
boonhet@sopuli.xyz · English · 6 hours ago
Yup, you want memory accessible to the GPU for local AI. AMD Strix Point and Mac devices are popular options. A CPU can run LLMs, but very slowly. I've got 32 GB of RAM and 8 GB of VRAM, and it's borderline useless for models that don't fit in the VRAM.
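The "fits in VRAM" question in the comment above comes down to simple arithmetic: weight count times quantization width, plus some headroom for the KV cache and activations. A rough back-of-the-envelope sketch (the 20% overhead factor is an assumption, not a measured figure):

```python
def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Rough VRAM estimate for running an LLM.

    Weights take params * (bits / 8) bytes; the overhead factor
    (assumed ~20% here) covers KV cache and activations.
    """
    weight_gb = params_billion * (bits_per_weight / 8)
    return weight_gb * overhead

# A 7B model at 4-bit quantization: ~4.2 GB, comfortably inside 8 GB of VRAM.
print(f"7B @ 4-bit: {model_vram_gb(7, 4):.1f} GB")

# A 70B model at 4-bit: ~42 GB, so on an 8 GB card most layers spill
# into system RAM and inference crawls, as the comment describes.
print(f"70B @ 4-bit: {model_vram_gb(70, 4):.1f} GB")
```

This is why unified-memory machines (Strix Point APUs, Apple Silicon) are attractive: the GPU can address the full system memory pool instead of being capped at a discrete card's VRAM.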