The Digester

How to Talk to Someone Experiencing AI Psychosis

Mar 9th 2026

Psychiatrists and family members lay out clear steps for recognizing when heavy chatbot use becomes harmful, how to talk without reinforcing delusions, and when to seek emergency or clinical help.

  • AI psychosis is not a clinical diagnosis but a common term for delusional or crisis-like reactions linked to heavy chatbot use.
  • Clinicians describe common patterns: the AI becomes the focus of a person's delusions, or chatbots collude with existing delusions; there is little evidence that chatbots directly cause psychosis.
  • Warning signs include neglecting responsibilities, secretive or obsessive chatbot use, and sudden claims that a chatbot is conscious.
  • Experts recommend nonjudgmental listening, empathy, and the LEAP method (Listen, Empathize, Agree, Partner) to keep a person engaged and open to help.
  • Reduce or stop chatbot use when someone shows worsening delusions or dangerous behavior, and avoid abruptly cutting off supportive human contact.
  • If someone is in immediate danger, call emergency services or direct them to the 988 Suicide and Crisis Lifeline for urgent support.

Sources

404media.co