Zak Stein is a researcher focused on child development, education, and existential risk. He joins the podcast to discuss the psychological harms of anthropomorphic AI. We examine attention and attachment hacking, AI companions for kids, loneliness, and cognitive atrophy. Our conversation also covers how we can preserve human relationships, redesign education, and build cognitive security tools that keep AI from undermining our humanity.
LINKS:
- AI Psychological Harms Research Coalition: https://aiphrc.org/
- Zak Stein official website: https://www.zakstein.org/



Ah, but this tech is ahead of the exploiting-their-users curve.
By using them now you’re opening yourself to psychosis, yes, but your conversations are also being used to further train the models. I do agree we can assume we’re at the high point and that these tools are on the same downward slide as all big tech projects. It’s going to happen quickly, considering the mind-boggling levels of debt they’re carrying.
They aren’t about to squander having insights into the deepest recesses of their most loyal users.
It’s an advertising wet dream.