Sahwa@reddthat.com to Fuck AI@lemmy.world · 21 hours ago
Amazon's Rufus AI shopping assistant can be easily jailbroken and tricked into answering other questions — specific prompts break the chatbot's guidelines and reach underlying AI engine
www.tomshardware.com
TexasDrunk@lemmy.world · 12 hours ago

The Gemini one is the one I find interesting. A protocol for putting up pages that aren't indexed by Google? Well, now it's harder to get information about.

I don't think it's a conspiracy either. It's just interesting.