Sahwa@reddthat.com to Fuck AI@lemmy.world · 22 hours ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine
www.tomshardware.com
yucandu@lemmy.world · 20 hours ago
point out the corporation that isn’t evil so that Amazon can buy them