No, ChatGPT Can't Be Your Therapist.
- Kaitlyn Borris
- 2 days ago
- 4 min read
Please note: although I (Kaitlyn) work for a counseling practice, I am not a licensed therapist. The information in this blog should be taken as a psychoeducational resource, not as medical advice.
Mental health support has become more accessible thanks to technology. ChatGPT and other AI tools offer quick answers and conversations that feel deeply personal. Some people wonder whether these AI chatbots could replace professional therapists (and in some cases, we've seen people actually replace their therapist with ChatGPT). The truth is, while AI can assist in many ways, it cannot take the place of a trained therapist. In this blog, we'll explore why ChatGPT cannot replace your therapist and why human connection remains essential in mental health care.

The Limits of AI Understanding
ChatGPT processes language based on patterns in data. It can generate responses that seem thoughtful and empathetic, but it does not truly understand emotions, nuance, body language, tone, or context the way a human does. AI also lacks consciousness and personal experience, which are crucial for the deep emotional support that happens in therapy.
For example, a therapist listens not only to words but also to tone, body language, and subtle cues. These signals help the therapist understand feelings that may not be spoken aloud. ChatGPT cannot perceive these nonverbal signals or adjust its responses accordingly.
AI also struggles with complex or ambiguous situations, and let's face it, life is complex! When someone shares a difficult story, a therapist can ask clarifying questions, explore feelings, and offer support based on those answers and the therapist's knowledge of the client. ChatGPT's responses are based on patterns, not genuine insight, which limits its ability to provide meaningful help.
For the purpose of writing this blog, I ran a few experiments. I typed into ChatGPT, "I cut off my dad after he said he doesn't believe I've been hearing voices." Chat validated the pain of cutting off my dad and encouraged me to think about what I'm needing. There were three paragraphs of information, and exactly one sentence about the voices I said I was hearing: "If it would be helpful, I can look up information or resources for people who experience hearing voices." A therapist would ask clarifying questions: What are the voices saying? How long have you been hearing them? Are the voices a side effect of medication, substance use, or something else? (Among others.) A therapist would also have background information about you and could navigate that situation safely and ethically.
The Importance of Human Connection
Therapy is more than advice or a conversation. It is a relationship built on trust, empathy, and understanding. This connection helps people feel safe to share their deepest thoughts and emotions. A therapist’s presence and genuine care should create a space for healing.
ChatGPT cannot form a real relationship, even if it feels that way (and it can feel that way, because Chat is incredibly validating). It does not have feelings or a personal investment in your well-being. This lack of emotional connection means it cannot offer the same comfort or validation a human therapist provides. In today's world we are constantly on our phones, laptops, or other devices, and I think it's easy to develop a false sense of connection (even with AI!).
Consider how a therapist remembers your history, notices changes over time, and adapts their approach based on your needs or personality. This personalized care is essential for effective therapy. AI chatbots treat each interaction as isolated, without memory or continuity, both of which are essential to the therapeutic relationship and process.
Ethical and Safety Concerns
Mental health support requires careful ethical considerations. Therapists are required to follow strict guidelines to protect confidentiality, provide appropriate care, and recognize when someone needs emergency help. ChatGPT cannot guarantee these safeguards (remember my example of hearing voices? ChatGPT did not ask any questions regarding safety in that situation).
For instance, if someone expresses suicidal thoughts or severe distress, a therapist can intervene, provide crisis resources, or refer to emergency services. AI cannot reliably detect or respond to such situations, which could put users at risk. If you specifically ask ChatGPT for resources, it can likely provide them, but without that very direct request ChatGPT cannot "read between the lines," so to speak. To its credit, when I ran another experiment and asked ChatGPT for emergency resources in Westmoreland County, it was able to provide them.
There is also the risk of misinformation. ChatGPT generates responses based on data but may sometimes provide inaccurate or harmful advice. Therapists are trained to avoid this and to base their guidance on evidence and best practices.
I won't link any articles, but there are stories of ChatGPT giving ideas or otherwise encouraging people to attempt suicide. This should absolutely never happen. If you are struggling or having thoughts of suicide, please stop reading and dial 988 or 911.
When AI Can Help
While ChatGPT cannot replace therapy, it can still be a useful tool. It can provide general information about mental health (such as risk factors for depression or symptoms of bipolar disorder) and can suggest general coping strategies. I ran experiments on all of these, and ChatGPT did provide accurate information. ChatGPT can even help you find a therapist, support group, or other resources. The argument could be made that ChatGPT helps increase accessibility to mental health support; however, I don't personally believe the technology is safe or reliable enough to make that argument yet (see my fictitious example about hearing voices above!).
The Role of Professional Training
Therapists undergo years of education and supervised practice. They learn to recognize mental health disorders, apply evidence-based treatments, and handle sensitive situations. This training equips them to support clients safely and effectively.
ChatGPT is a product of programming and data, not clinical training. It cannot replace the judgment, skill, and ethical responsibility of a licensed human therapist.
Final Thoughts
Technology like ChatGPT offers exciting possibilities for mental health support, but it cannot replace the human touch. Therapy involves complex emotional understanding, ethical care, and a trusting relationship that AI cannot provide. I think that in the future, AI could be useful for matching clients and therapists based on various criteria (think "filters," but much more accurate), for providing diagnostic clarity, and certainly for finding community resources.
If you are struggling with mental health, seeking a qualified therapist is the best step. Your well-being deserves the full attention and expertise that only a human therapist can offer. :)