We spoke with Dr. Andrew Rosen, PhD, ABPP, FAACP, co-founder of The Center for Treatment of Anxiety and Mood Disorders, about the rising trend of AI therapy in our latest Ask the Experts piece.
Q: What is AI therapy?
AI therapy refers to the use of chatbots or artificial intelligence platforms, like ChatGPT, to seek advice or guidance on personal or mental health issues. These tools are designed to simulate conversation and provide information, support, or coping strategies. While they are not a substitute for professional care, many people are curious about whether they can play a role in mental health support.
Q: As a licensed psychologist, do you think that this is beneficial?
There can be limited benefits to AI-based tools, particularly in providing basic education or resources about mental health topics. However, the risks must be taken seriously.
One cautionary example is the National Eating Disorders Association’s chatbot, Tessa, which was found to be giving unsafe weight loss advice to individuals seeking help for eating disorders. This highlights the limitations of AI in handling nuanced, sensitive, or high-risk topics. AI cannot replace the individualized care, ethical oversight, and clinical judgment that come from working with a licensed professional.
Q: Why are some patients turning to chatbots rather than licensed providers?
There are several reasons we’ve heard, including:
- Accessibility and convenience. Some people don’t know where to begin when looking for a therapist, or they may face long waitlists.
- Stigma. Many still feel judged for seeking therapy, so the perceived anonymity of AI may seem safer.
- Cost. Many people assume that therapy is expensive or that insurance will not cover these services.
- Frequency of support. Some individuals want immediate feedback or guidance between sessions and turn to chatbots.
Q: What are the risks of relying on AI for therapy?
The biggest risk is that general AI tools were not designed to deliver care, detect emergencies, or provide crisis-level intervention. For example, if someone is experiencing suicidal thoughts, a chatbot cannot provide the same life-saving response as a human clinician. Additionally, AI tools may give inaccurate, generic, or even harmful advice.
Q: What alternatives could people seek?
There are safer and more effective options than AI therapy:
- Community resources. Local nonprofits, clinics, and universities sometimes provide low-cost therapy if cost is a barrier.
- Support groups. Both in-person and online, peer support groups can be an excellent way to connect with others and gain encouragement.
Q: How can the mental health industry address this gap?
The popularity of chatbots shows that people are eager for support that is affordable, immediate, and stigma-free. As an industry, we can respond by:
- Expanding access. Offering more telehealth services and community programs.
- Reducing stigma. Normalizing therapy as a proactive step toward wellness.
- Educating the public. Helping people understand the differences between AI support and professional therapy, and why licensed care is essential for lasting recovery and safety.
Closing Thought
AI therapy tools can feel accessible and appealing, but they are not without risk. If you or someone you know is struggling, the safest path forward is to connect with a licensed mental health professional. AI may be a bridge, but human connection, compassion, and clinical expertise remain at the heart of true healing.
