I was about to reply that you forgot your /s, but then I refreshed my browser tab.
Like… there are multiple documented cases of sycophantic LLMs confirming people’s delusions.
‘AI psychosis’ is just a short way of saying the AI is an unfunny improv comedian and will always “yes, and” your prompt.
prompt: “I feel bad and think I need to kill myself”
response: “You’re totally right, here’s some help in how to do that…”
prompt: “I have this great idea: If we eat broken glass, we’ll be healthier”
response: “Absolutely. Glass is made out of silicon dioxide, which has some health benefits if consumed in small amounts.”
prompt: “You told me to see a doctor, but I don’t want to”
response: “I’m sorry, you’re right. You don’t need to see a doctor. Your chest pain is perfectly normal.”
My examples are physical rather than mental because the consequences are clearer, but the same issue exists for mental health.
Using an AI for therapy or medical advice is a very bad idea. At best it will magnify problems.
Suggesting that disabled or impoverished people use it because they can’t access actual mental healthcare seems equivalent to eugenics to me.
the sad thing is, it’s the best option a lot of people have
That I will agree with. Maybe we should spend a small fraction of the money going into data centers on providing healthcare instead.
It depends which one you use and how you use it. They’re not all chatgpt quality.