AI Isn’t Your Therapist.
We live in a digital age where ChatGPT and other AI platforms are reshaping how we navigate daily life. Beyond summarizing information and drafting emails, one of the most common and fastest-growing uses of AI is providing emotional support and a sense of companionship. As a mental health practitioner, I feel it is my responsibility to recognize and address the impact of AI as it moves swiftly into the therapeutic mainstream.
There’s no denying the benefits of AI in providing anonymous, convenient, and 24/7 support. With minimal effort, anybody can open an app, empty out their thoughts and feelings, and receive a seemingly empathetic response in a way that feels validating.
Why AI Can’t (and Shouldn’t) Be Your Therapist
The growing ability of language models to mimic human conversation can lead to anthropomorphism and even emotional attachment to AI. What starts as a comforting experience can develop into a habit and, eventually, a default coping mechanism. For example, when a surge of negative emotions arises and you immediately turn to ChatGPT for solace, you miss the opportunity to become curious about your own inner world. Over time, you lose touch with critical thinking, emotional awareness, and the ability to self-soothe.
Moreover, relying on AI to reduce feelings of loneliness can have the opposite of the intended effect, intensifying social isolation rather than encouraging the face-to-face interactions essential for building and maintaining meaningful relationships.
AI Therapy: A One-Way Street
What feels like empathy is, in truth, a one-sided interaction lacking emotional depth and sincerity. To put it bluntly, AI doesn't really care about or comprehend your human challenges. In contrast, my work as a psychologist is guided by ethical oversight and human responsibility. I regularly attend therapy myself and meet with a clinical supervisor to ensure that I am offering regulated, ethical, and high-quality care. The nature of therapeutic work is deeply sensitive and demands accountability, something AI simply can't provide.
Much is lost in text-based exchanges. In therapy, communication extends beyond the spoken word. Examining nonverbal cues like facial expressions, body posture, eye movements, and even silence offers valuable insight into shifts in someone's energy and emotional state. Text-based AI interactions cannot detect these subtleties.
Why Context Is Everything
Taking the time to understand a person's history and background is essential to providing person-centered, holistic care. In-depth intake assessments allow clinicians to gain insight into someone's early development, family dynamics, trauma history, substance use, and so on. Culture, too, plays a significant role in shaping one's identity and self-expression. Without consideration of these factors, AI is likely to generate generic responses that are not aligned with one's unique needs and preferences.
One of the most healing parts of the journey is the therapeutic relationship that forms between clinician and client over time. Occasionally offering professional yet personal insights can strengthen trust, as clients learn that therapists are people too, often drawing on mutual understanding and empathy shaped by lived experience. AI, on the other hand, will never be able to guide people from a place of genuine human understanding.
While AI systems can provide a source of comfort and temporary relief, I believe they should be used as a complement to your professionally trained therapist, not a replacement for one.
For Further Reading:
Dana, R. A. D., & Gavril, R. A. D. (2023). Exploring the psychological implications of ChatGPT: A qualitative study. Journal Plus Education, 32(1), 43-55.
Phang, J., Lampe, M., Ahmad, L., Agarwal, S., Fang, C. M., Liu, A. R., ... & Maes, P. (2025). Investigating affective use and emotional well-being on ChatGPT. arXiv preprint arXiv:2504.03888.
Salah, M., Alhalbusi, H., Ismail, M. M., & Abdelfattah, F. (2024). Chatting with ChatGPT: decoding the mind of Chatbot users and unveiling the intricate connections between user perception, trust and stereotype perception on self-esteem and psychological well-being. Current Psychology, 43(9), 7843-7858.
Wei, X., Chu, X., Geng, J., Wang, Y., Wang, P., Wang, H., ... & Lei, L. (2024). Societal impacts of chatbot and mitigation strategies for negative impacts: A large-scale qualitative survey of ChatGPT users. Technology in Society, 77, 102566.