OLYMPIA, Wash. – The state’s AI taskforce gathered in Olympia on Friday to discuss the growing use of AI chatbots for emotional support and companionship. The conversation highlighted rising concerns about depression and loneliness.
Jodi Halpern of the School of Public Health at UC Berkeley pointed to a significant trend: citing July 2025 research by Common Sense Media, she said one-third of teens have an ongoing emotional relationship with a chatbot and would rather confide in it than in any person.
Halpern detailed how some users ask chatbots to simulate being therapists.
“Many users ask chatbots to simulate being a therapist. And there’s controversy over how much they then disclose that they’re not. But they simulate being therapists, including providing fake license numbers and Ph.D. pedigrees,” Halpern said. “A study that just came out three weeks ago from ChatGPT showed that 1.2 million users a week discuss suicidal intent and their concerns about suicidal intent with ChatGPT.”
The study Halpern referenced, on ChatGPT users discussing suicidal intent, is published on the OpenAI website.
While these chatbots might appear to be a resource, Halpern warned of their dangers: the bots can sometimes encourage harmful behavior and may foster addiction similar to social media’s impact on younger generations.
The taskforce discussed potential regulations to keep chatbots from causing harms similar to those attributed to social media, and members shared information with the overall goal of protecting children.
