Life & Living
#Perspective

ChatGPT as therapist? Risks young Bangladeshis should know

Photo: Collected / Igor Omilaev / Unsplash

Nowadays, many youngsters, university students, and even adults are turning to AI chatbots such as ChatGPT, Gemini, and Character AI, chatting and confiding in them the same way they would with a well-wisher or even a therapist. Oblivious to their downsides, we pour our hearts out to them, typing about how we feel when things do not go our way or when we do not get the job we worked so hard for.

Many even use them for self-validation and motivation. They rant to these chatbots, hoping to feel heard and to receive the words they have longed to hear from someone: "You're working really hard," or "It sounds like you're going through a lot right now." Many others use them for making decisions, from what to wear to how to plan their study schedules.

The chatbot seems like just the kind of friend we need — non-judgmental, always available at a moment's notice, and never too demanding. But this growing attachment raises a grave concern. It reflects a generation becoming so lonely and isolated that they now turn to chatbots to say the things they once shared with their friends, siblings, teachers, or parents.

In an interview with The Daily Star, Dr Ashique Selim, a consultant psychiatrist based in the UK, pointed out, "To identify the real problem, we should look at what is prompting the young adults or school children to tell their concerns to these chatbots in the first place."

Referring to a case where a school-going child asked ChatGPT if he should tell his parents that he failed a school test, Dr Selim remarked, "What we should be asking here is, does the kid have no one to share this with — not even his teacher or someone he can trust?"

He points out a worrying gap between what we need (someone we can trust) and what we get (a lifeless chatbot). Many users have also complained that these chatbots tend to be their "yes-men," constantly flattering them, agreeing with almost everything they say, and inadvertently validating their beliefs, a phenomenon known as AI sycophancy.

Dr Selim highlights a critical issue, saying, "If someone with symptoms like grandiosity tells a therapist, for instance, that he believes he is a king, the therapist can use their expertise to gently guide him toward a healthier perspective. But when posed with the same statement, these chatbots are known to respond with something like, 'That's a very good question.' And that's where the real problem lies."

Instead of constructively pointing out problematic traits to the person, as an expert therapist or well-wisher would, these AI chatbots are known to over-empathise, even with unhealthy behaviour.

Photo: Collected / Growtika / Unsplash

Experts warn that these chatbots have neither the experience nor the capability to assess a person's non-verbal cues, and their inaccurate answers pose serious risks to the mental health of vulnerable users who frequently rely on AI for making important life decisions. However, Dr Selim emphasised that the chatbots themselves aren't entirely to blame. He mentioned that one can still use AI chatbots for routine tasks and queries, for instance, to learn how to get better sleep.

He further explained, "For anyone to use AI chatbots efficiently and safely, they need three things: knowledge, awareness, and the right mental state."

More often than not, problems arise when a person lacks one of these three things. When someone has unusual or emotionally heavy conversations with an AI chatbot, they should have the awareness that it is, in reality, just a bunch of code.

They should also have some knowledge of how these chatbots work: each is a Large Language Model (LLM) that uses previously input historical data to generate predictive responses; essentially, a lot of probability maths takes place behind the scenes.

In essence, it's not human, and what it says is not a human opinion but a statistical prediction of the most probable next set of words that fit the context and the prompt you have just entered. In a way, it is designed to give you a palatable answer.
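To make that "probability maths" a little more concrete, here is a deliberately simplified sketch in Python. The words and probabilities are invented purely for illustration; a real LLM scores many thousands of possible tokens using billions of learned parameters, but the basic idea of choosing a statistically likely continuation is the same.

```python
# Toy illustration (not a real model): an LLM repeatedly picks the next
# word based on probabilities learned from its training data.
# The vocabulary and numbers below are made up for demonstration only.

# Hypothetical probabilities for the word following "You're working really"
next_word_probs = {
    "hard": 0.72,      # most likely continuation given this context
    "late": 0.15,
    "slowly": 0.08,
    "miracles": 0.05,
}

# The model outputs the statistically most probable continuation,
# not an opinion it actually holds about you.
prediction = max(next_word_probs, key=next_word_probs.get)
print(f"Predicted next word: {prediction}")  # -> "hard"
```

In other words, when the chatbot tells you that you are "working really hard," it is completing a pattern, not forming a judgment.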

That being said, users also need to be in the right mental state: lucid enough to make judgment calls and not take the AI at its word. For instance, a school-going child or someone who is clinically depressed may not have the level of mental lucidity needed to use AI safely.

A study by MIT Media Lab suggests that users who view ChatGPT as a friend are more likely to experience negative emotional effects such as loneliness and emotional dependence.

Dr Selim also shared in this regard, "The important thing is to have a balance. After spending a certain amount of time chatting with an AI, there must be a point where one realises they should now spend some time with real people."

Beyond confiding in chatbots, youngsters and children today use AI frequently for homework or assignments. Instead of asking for a ready-made answer to a difficult maths problem or theoretical topic, consider asking the AI to explain the underlying concepts and guide you through the method so you can solve it on your own.

Sal Khan, an American educator, visionary, and founder of the educational platform Khan Academy, contends that AI will not replace human tutors but can complement them by acting as a powerful tool. However, AI will only serve as a powerful tool if we actually use it as one and do not become overly dependent on it. Dependency on any tool, AI included, ultimately proves counterproductive.

Lastly, while AI chatbots can mimic human judgment, they will never be an alternative to a talk with a friend, family member, teacher, or simply someone you can trust. When using AI chatbots for small queries alongside your conventional tutor or therapist, it is essential to keep in mind that they are merely that: tools!

