Sahwa@reddthat.com to Fuck AI@lemmy.world · 22 hours ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine
www.tomshardware.com
atomicbocks@sh.itjust.works · 18 hours ago
People are not evil for not being able to change the system they live under.
cub Gucci@lemmy.today · 17 hours ago
Bro… it only works for people from rich countries. People from shitholes like Russia or Iran get what they deserve.