Sahwa@reddthat.com to Fuck AI@lemmy.world · 20 hours ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine (www.tomshardware.com)
26 comments · 192 upvotes
WhiteTeeth@lemmy.today · 12 hours ago
The call for action is for folks who have options. If you don’t, do what you can when you can <3