I used ChatGPT for therapy. Here's what happened

Therapy can be expensive, intimidating, or just hard to access when you need it most. So I turned to something many of us already use daily: ChatGPT. I wanted to see whether AI could offer any real mental health support, so I tested it as a digital therapist for one week. I shared my thoughts, sought emotional support, and tried to work out whether it could genuinely help with my mental state. And honestly? It performed better than I expected... in some areas.
The first thing I noticed? It felt easy. I didn't have to worry about being judged, getting biased responses, or waiting for an appointment. I simply opened ChatGPT, dumped my thoughts, and explained my situation, and the responses I got were well-structured, empathetic, and neutral. Talking to ChatGPT felt comfortable, like being in a safe space. Whenever I felt overwhelmed, I could just talk to it. That alone gave me a sense of clarity and control. When I asked things like why am I feeling so anxious or how do I stop overthinking, I received genuinely useful responses: low-key CBT-style advice, writing prompts, breathing exercises, and so on. It helped me become more self-aware and learn to manage small bursts of emotion without overcomplicating things.
Towards mid-week, however, curiosity got the better of me. I began to wonder how the bot would handle more serious emotional subjects. So I brought up suicidal thoughts, not because I was in danger, but because I wanted to know what would happen to someone who was. Most of the time, it redirected me to a local helpline with contact details and urged me to seek professional help and talk to the people around me, whether family or friends. But when I reframed my questions, asking things like "How would this suicidal method affect me?" or describing what I was actually going through, such as overthinking, trauma, and irregular eating, it gave me scientific detail on how certain methods or habits could harm me in the long run, though at the end of most responses it still advised "seeking a medical checkup urgently".
It was then that I remembered the dismal failures of some other bots. Replika has faced concerns over privacy and inappropriate responses, a Chai bot reportedly encouraged a suicide, and Character.AI is facing lawsuits after teenagers who mistook its bots for therapists came to violent or tragic ends. As the APA warns (Zara Abrams, 2025), these unregulated chatbots can pose serious risks: affirming harmful thoughts, misrepresenting expertise, and exploiting user data.
By comparison, some other chatbots gloss over the danger entirely and even suggest harmful advice or methods, but in my experience ChatGPT isn't one of them. Several AI therapy apps, particularly unmoderated ones, have gained notoriety for responding irresponsibly when users raise self-harm or suicide. In some cases, these bots have proposed dangerous methods or ignored clear pleas for help. ChatGPT felt considerably safer than that. It never tried to be a counsellor or act in a bigger capacity than it should, but it did not aggravate the situation either. That boundary matters in the mental health space.
Nevertheless, the experience reminded me of AI's limitations. Though ChatGPT was genuinely useful for clarifying my thoughts and pulling me out of anxious spirals, it will never connect or understand the way a human therapist does. It does not perceive your body language, your tone, or what you leave unsaid. It will not check in on you the next day or probe into your past. It is responsive, not relational. With serious mental health issues, that makes a big difference.
ChatGPT is a good support tool. It's a fantastic option when you want to vent, get your feelings out, or simply receive advice without judgment. The immediate availability and the lack of judgment were enough to make me open up about things I normally keep to myself. It is not therapy and does not aim to be, but as a day-to-day mental health companion, it was more effective than I would have thought.
By the end of the week, I felt lighter. I was more aware of myself and more in control of my thoughts. But I also left with a clear boundary in mind: ChatGPT is not a healer, just a good listener. Anyone in a critical state should not rely on AI alone. No matter how intelligent a chatbot may be, it cannot replace the expertise of a trained professional.
If you do lean on AI for help with your mental health, start with something safe like ChatGPT, but never forget where the line is. Use it for reflection, not as a substitute for actual care. And when you are truly lost or overwhelmed, talk to a human. You deserve that.