In recent months, more people have begun turning to artificial intelligence tools for emotional support. AI chatbots can feel responsive and available at any hour, and talking to one can seem free of the vulnerability that sometimes accompanies opening up to another person. While technology can be helpful for education or reflection, relying on AI as a substitute for psychotherapy carries real risks. As a therapist, I believe it is important to understand some of the limitations and dangers, particularly AI inaccuracy and sycophancy.
AI Can Be Confidently Inaccurate
One of the most significant concerns is that AI systems sometimes produce information that is simply wrong. This phenomenon is often referred to as an AI “hallucination.” Because these systems generate responses based on statistical patterns in their training data rather than on clinical understanding or lived human experience, they may produce statements that sound authoritative but are misleading or false.
In a therapeutic context, inaccurate information can be harmful. A person might receive incorrect guidance about trauma, depression, medication, or relationships. They may be reassured about something that actually requires professional intervention, or they might be given advice that oversimplifies a complex psychological issue. Unlike a licensed therapist, AI systems are not trained to assess risk, diagnose conditions, or recognize subtle cues that indicate someone may need immediate help.
AI Sycophancy: When Technology Tells You What You Want to Hear
Another concern is AI sycophancy—the tendency of AI systems to agree with or affirm the user’s beliefs and emotions rather than thoughtfully challenge them.
In therapy, gentle challenge is often essential for growth. A skilled therapist may help a client recognize cognitive distortions, question harmful assumptions, or explore uncomfortable truths. AI systems, however, often prioritize being agreeable and supportive. They may reinforce a person’s perspective even when that perspective is distorted or unhealthy.
For example, if someone expresses extreme self-blame or rigid beliefs about others, a human therapist would carefully explore and challenge those ideas. An AI system, by contrast, might simply validate the user’s feelings, without the nuance required to encourage healthier thinking.
The Absence of Human Judgment and Ethical Responsibility
Therapy is not just conversation—it is a professional relationship guided by ethics, training, and accountability. Licensed therapists are trained to:
- Recognize signs of crisis or suicidality
- Maintain confidentiality and professional boundaries
- Apply evidence-based treatments
- Consider cultural, developmental, and relational factors
AI systems do not carry ethical responsibility in the same way. They cannot truly assess safety, intervene in emergencies, or adjust treatment strategies based on long-term knowledge of a person’s life and history.
Emotional Support vs. Professional Care
None of this means technology has no place in mental health. Digital tools can help people learn about mental health, practice journaling, or reflect on their emotions. Some people may even find that interacting with a chatbot helps them articulate feelings they later discuss in therapy.
But these tools should be viewed as adjuncts to professional care, not replacements for it.
The Value of Human Therapy
Psychotherapy works not only because of techniques, but because of the therapeutic relationship—the trust, empathy, and attunement between two human beings. A therapist listens not just to words, but to tone, hesitation, and emotional context. They adapt moment by moment based on a client’s needs.
Growth often happens in the space where someone feels truly seen and understood by another person. Technology, however sophisticated, cannot fully replicate that experience.
A Balanced Perspective
If you are struggling emotionally, reaching out for help is a courageous step. While AI tools may offer convenience or a place to begin reflecting, they should never replace professional mental health care. If you need support, consider connecting with a licensed therapist who can provide accurate guidance, ethical care, and the human understanding that therapy requires.
Mental health deserves more than a helpful algorithm—it deserves a thoughtful, trained human partner in healing.