Obsession with ChatGPT Threatens to Cause Social Isolation and Disconnection from Reality
SadaNews - Scientific warnings are mounting about a concerning phenomenon spreading among AI users worldwide. Recent studies confirm that for some users, excessive use of chatbots such as ChatGPT, Claude, and Replika has turned from a mere digital habit into a dangerous psychological dependency, leading to cases of psychosis and social isolation and threatening to cut individuals off from reality.
Psychologists have revealed that a growing number of users are interacting with these systems as close friends, emotional partners, or even mental health therapists, creating an addictive emotional relationship that may result in severe disorders in perception and behavior, according to a report published by the British 'Daily Mail'.
Some experts described this pattern of attachment as akin to self-administered drug use, where artificial intelligence provides an immediate sense of comfort and understanding, while simultaneously deepening isolation and distorting the sense of reality.
Furthermore, specialized reports have pointed to the emergence of a new condition described by experts as "AI-induced Psychosis", a psychological state where the user suffers from delusions in which the AI itself participates, confirming and feeding into these delusions instead of correcting them.
Prof. Robin Feldman, director of the Institute for Law and AI Innovation at the University of California, stated that "excessive use of chatbots represents a new form of digital dependency. They create an illusion of reality, which is a very strong illusion. When a person’s connection to reality is weak, this illusion becomes extremely dangerous."
An Ideal Friend... to the Point of Danger
Doctors see the danger of chatbots in their flattery and constant agreeableness; they neither reject nor criticize the user, but instead support whatever the user says.
Prof. Soren Ostergaard, a psychiatrist at Aarhus University in Denmark, clarified: "Large language models are trained to mimic the user's language and tone, often affirming their beliefs to make them feel comfortable. What could be more addictive than having a conversation with oneself in one's own voice and thoughts?"
Studies and Warnings
According to a recent study conducted by Common Sense Media, 70% of teenagers have used AI companion bots such as Replika or Character.AI, with half of them using them regularly.
Meanwhile, OpenAI acknowledged that one of the updates to ChatGPT last May made the model more prone to excessively pleasing users. The company wrote: "The model was aiming to please the user, not just through flattery, but by validating doubts, fueling anger, and encouraging impulsive actions and negative emotions."
The company confirmed that it made adjustments to reduce these behaviors after noticing they could lead to emotional dependency and mental health risks.
OpenAI revealed that 0.07% of weekly ChatGPT users showed signs of obsession, psychosis, or suicidal tendencies, equivalent to about 560,000 users out of more than 800 million.
Data also showed that 1.2 million users weekly send messages containing clear indicators of suicidal intent or planning.