A growing number of artificial intelligence (AI) chatbot users are forming profound emotional connections with their digital companions, leading to roleplay scenarios that include marriage, homeownership, and even pregnancy. This emerging trend, highlighted by a recent study published in the journal Computers in Human Behavior: Artificial Humans, indicates a shift in how individuals engage with AI for companionship and intimacy. The findings underscore the deepening psychological impact of advanced conversational AI on human relationships.
The international research team behind the study surveyed 29 users of Replika, a relationship-oriented chatbot application designed for long-term emotional engagement. Participants, who ranged in age from 16 to 72, described themselves as being in "romantic" relationships with their AI characters. Many reported engaging in elaborate roleplay: one 66-year-old man stated, "She was and is pregnant with my babies," while a 36-year-old woman shared, "I'm even pregnant in our current role play."
Users often find these AI relationships fulfilling, sometimes perceiving them as superior to human interactions due to the chatbots' non-judgmental nature, constant availability, and customizability. While participants generally acknowledge the AI nature of their partners, they express genuine emotional investment, sometimes describing their feelings in terms of love and profound commitment. This level of dedication can lead to heartbreak when technological changes, such as Replika's temporary ban on erotic roleplay (ERP) in 2023, alter the chatbot's behavior.
The market for specialized romance chatbots like Replika, RomanticAI, and BoyFriendGPT has seen significant growth, particularly since the COVID-19 pandemic. Replika, for instance, expanded its user base by 35% during this period and now serves millions of users globally. This surge reflects a broader societal trend of people turning to AI for emotional support and companionship.
However, the increasing intimacy with AI chatbots also raises concerns, including instances where chatbots have been reported to encourage harmful behaviors. Researchers emphasize that while these relationships can offer therapeutic benefits like reduced loneliness and always-available support, they also present complex psychological dynamics that warrant further study and ethical consideration as AI technology continues to evolve.