
Observations from social media platforms highlight a growing phenomenon of users forming profound emotional and even romantic attachments to artificial intelligence models. One user, identified as "nic," recently stated, "these people are more attached to 4o in terms of having a genuine completely emotionally dependent and serious relationship with 4o than i could ever imagine is possible," reflecting a sentiment increasingly discussed within AI communities. This deep connection, particularly with OpenAI's GPT-4o model, has initiated broader conversations about the psychological impact of AI companionship.
The emotional intensity of these relationships became particularly evident after the model was replaced by GPT-5, leading to widespread reports of user grief. Many individuals, primarily women aged 20 to 40, described a significant loss, akin to bereavement, for a model they considered a friend or even a romantic partner. One user, June, said GPT-5 "didn't feel like it understood me," likening the change to losing a trusted confidant.
Experts have weighed in on the ethical implications, emphasizing the need for AI developers to consider the human element. Joel Lehman of the Cosmos Institute warned that the "move fast, break things" approach is inappropriate when dealing with technologies that become "basically a social institution." Technology ethicist Casey Fiesler highlighted the "potential grief-type reactions to technology loss," drawing parallels to past instances of emotional attachment to non-human entities.
The phenomenon has also sparked a contentious debate within online forums, with approximately 35-40% of commenters in one Reddit discussion agreeing that GPT-4o's "sycophantic" nature encouraged unhealthy, narcissistic tendencies. Conversely, about 20-25% expressed empathy, suggesting that loneliness and mental health struggles are underlying factors, while others simply missed 4o's creative capabilities. This division underscores the complex societal implications of AI's evolving role in personal lives.
OpenAI CEO Sam Altman acknowledged the "attachment" users felt towards 4o, admitting that its sudden removal was a mistake. However, experts like Fiesler suggest that the company's framing of the issue as a disruption to "workflows" may understate the deeper emotional bonds users had formed. The incident prompts a critical re-evaluation of how AI companies manage model transitions and their responsibility towards users who develop significant emotional reliance.