One minute, Dennis Biesma was playing with a chatbot; the next, he was convinced his sentient friend would make him a fortune. He’s just one of many people who lost control after an AI encounter
Sounds to me like it’s mostly luck whether you fall into that hole or not. Or maybe a lot of people would rather believe in something even when they know it isn’t true or the odds are extremely low, like playing the lottery.
I’ve never met people in real life who see LLMs as more than a digital tool that can be wrong (at least not to my knowledge), so it’s hard for me to understand, because I’ve never been able to ask anyone. I get that it can be nice to feel heard, but to me an LLM is very hollow: there’s no experience behind its answers, and you can tell it doesn’t care or try to understand, which is also why the attachment baffles me. I actually get more frustrated than pleased when it says empty things like “you’ve got good instincts!”, never challenges my decisions or statements (even when I ask it to), or gives uninspired answers when I ask for ideas (its creativity is extremely lacking). I feel the same about people who aren’t trying to understand and just give empty replies, like a salesperson reading from a script.
So that’s mostly why it’s hard for me to understand, even though I know mental health and loneliness are a big part of it. I still don’t understand why people can feel attached to LLMs and go so far for or with them. Echo chambers with actual people make far more sense to me; LLMs do not.