Sahwa@reddthat.com to Fuck AI@lemmy.world · 1 day ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine
www.tomshardware.com · 198 points · 26 comments
Karyoplasma@discuss.tchncs.de · 1 day ago · 32 points
I still have gift cards for Amazon. I have to use them or else Amazon gets free money. I usually buy stupid shit like wine gummies with them.

    That's fair and acceptable.

    surewhynotlem@lemmy.world · 1 day ago · 7 points
    Buy something with free returns. Return it. Repeat.

        UndergroundGoblin@lemmy.dbzer0.com · 1 day ago · 18 points
        From an ecological standpoint, though, it's also a total disaster.