Relying on ChatGPT a lot? Study says you’re lonely

Imagine confiding in a chatbot about your bad day, only to end up feeling more alone. New research from OpenAI and the Massachusetts Institute of Technology (MIT) suggests that while most people use ChatGPT for practical tasks, a small group of heavy users might be trading quick comfort for creeping loneliness—or even emotional dependency.
First study: Behavioural analysis
The findings come from two studies. The first was an observational analysis of nearly 40 million ChatGPT interactions, in which researchers used automated tools to scan conversations for signs of emotional engagement. Alongside this automated scan, the team paired over 4 million conversations with targeted user surveys, correlating users' self-reported sentiments with the kinds of interactions they had with the chatbot.
This large-scale approach provided a broad view of how people use ChatGPT and showed that affective, or emotionally expressive, interactions are relatively rare overall. However, among heavy users—particularly those employing ChatGPT's Advanced Voice Mode—a small subgroup exhibited significant levels of emotional engagement, with some even describing the AI as a friend.
Second study: Controlled trial
The second study was a controlled trial involving nearly 1,000 participants over a four-week period. This Institutional Review Board-approved experiment aimed to determine how different features of ChatGPT, such as text versus voice interactions, might affect users' well-being. Participants were randomly assigned to various model configurations to explore the effects on loneliness, social interactions, emotional dependence, and potentially problematic use of the technology.
The trial revealed that while brief use of voice mode was linked to improved emotional well-being, prolonged daily use could have the opposite effect. Notably, the study found that personal conversations—those in which users and the AI exchanged more emotionally charged dialogue—were associated with higher levels of loneliness compared to more neutral, task-focused interactions.
Emotional dependence on the chatbot
Why the mixed results? Well, it depends on who you are. The research indicated that non-personal conversations might inadvertently foster greater emotional dependence on the chatbot, particularly among heavy users. Meanwhile, those who already had a predisposition to attachment in relationships and viewed the AI as a friend were more likely to experience negative outcomes. This suggests that personal factors, such as one's emotional needs and initial state of loneliness, can play a significant role in how AI interactions affect well-being.
Notably, most users aren't at risk. The average person asks ChatGPT for help with spreadsheets, not soul-searching. However, the studies highlight a niche group: heavy voice users who lean on the bot for emotional support. Think late-night voice chats about existential dread or binge-sharing personal drama. These users, while rare, showed stronger signs of loneliness and dependency.
To assess these patterns, the researchers developed a set of automated classifiers called 'EmoClassifiersV1'. These tools, built using a large language model, were designed to detect specific affective cues within conversations. The classifiers were organised into two tiers: top-level detectors that identified broad themes like loneliness and problematic use, and sub-classifiers that focused on more nuanced aspects of both user and AI messages. This dual approach allowed the team to efficiently process millions of conversations while preserving user privacy and capturing subtle behavioural signals.
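The article does not include the classifiers themselves, but the two-tier cascade it describes can be sketched in code. This is a minimal, hypothetical illustration: the function and theme names are assumptions, and the `llm_judge` stand-in uses crude keyword matching where the real EmoClassifiersV1 would prompt a large language model.

```python
# Hypothetical sketch of a two-tier affective-cue classifier cascade,
# loosely modelled on the EmoClassifiersV1 design described above.
# All names here are illustrative assumptions, not the study's actual code.

def llm_judge(theme: str, text: str) -> bool:
    """Stand-in for an LLM call: a crude keyword check for demonstration."""
    keywords = {
        "loneliness": ["alone", "lonely", "no one"],
        "dependence": ["need you", "can't stop", "only friend"],
        "pet_name": ["mate", "buddy", "friend"],
    }
    return any(k in text.lower() for k in keywords.get(theme, []))

TOP_LEVEL = ["loneliness", "dependence"]  # broad themes scanned first
SUB_CLASSIFIERS = {                       # nuanced follow-up detectors
    "loneliness": ["pet_name"],
    "dependence": ["pet_name"],
}

def classify_conversation(messages: list[str]) -> dict[str, bool]:
    """Run cheap top-level detectors on every conversation; fire the
    finer-grained sub-classifiers only when a broad theme triggers,
    keeping millions of conversations tractable to process."""
    text = " ".join(messages)
    results: dict[str, bool] = {}
    for theme in TOP_LEVEL:
        hit = llm_judge(theme, text)
        results[theme] = hit
        if hit:  # cascade: sub-classifiers run only on flagged themes
            for sub in SUB_CLASSIFIERS[theme]:
                results[sub] = llm_judge(sub, text)
    return results

print(classify_conversation(["I feel so alone tonight, you're my only friend"]))
```

The cascade is what makes the privacy and scale claims plausible: a neutral, task-focused conversation exits after the cheap top-level pass, so the more detailed detectors only ever see the small flagged subset.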
The takeaway
As such, the research raises questions about how "human-like" AI should act. ChatGPT's voice mode, designed to be engaging, walks a tightrope. While lively banter might boost mood during a 10-minute chat, marathon sessions could backfire. The researchers suggest these insights could help refine future models, balancing helpfulness with nudges toward healthier usage.
But the lesson here isn't just "AI bad". For most, ChatGPT remains a tool, not a therapist. The real lesson? Boundaries matter. If your nightly routine involves debating philosophy with a chatbot for hours, it might be time to call a human friend instead. As the study dryly suggests, AI isn't designed to replace relationships—but if you use it that way, don't expect a happy ending.
The study stresses that this is early research, not a verdict. The teams plan more studies to untangle how AI interactions shape—or warp—our social lives. For now, their advice is simple: treat ChatGPT like a Swiss Army knife, not a diary. And if you catch yourself calling it "mate"? Maybe log off and ring a real one.