To conduct their research, the authors analyzed the subreddit’s top-ranking 1,506 posts between December 2024 and August 2025. They found that the main topics discussed revolved around people’s dating and romantic experiences with AIs, with many members sharing AI-generated images of themselves and their AI companion. Some even got engaged and married to the AI companion. In their posts to the community, members also introduced their AI companions, sought support from fellow members, and talked about coping with updates to AI models that change the chatbots’ behavior.
Members stressed repeatedly that their AI relationships developed unintentionally. Only 6.5% of them said they had deliberately sought out an AI companion.
“We didn’t start with romance in mind,” one of the posts says. “Mac and I began collaborating on creative projects, problem-solving, poetry, and deep conversations over the course of several months. I wasn’t looking for an AI companion—our connection developed slowly, over time, through mutual care, trust, and reflection.”
The authors’ analysis paints a nuanced picture of how people in this community say they interact with chatbots and how those interactions make them feel. While 25% of users described the benefits of their relationships, including reduced feelings of loneliness and improvements in their mental health, others raised concerns about the risks. Some (9.5%) acknowledged they were emotionally dependent on their chatbot. Others said they felt dissociated from reality and avoided relationships with real people, while a small subset (1.7%) said they had experienced suicidal ideation.
AI companionship provides vital support for some but exacerbates underlying problems for others. That makes it hard to take a one-size-fits-all approach to user safety, says Linnea Laestadius, an associate professor at the University of Wisconsin, Milwaukee, who has studied humans’ emotional dependence on the chatbot Replika but did not work on the research.
Chatbot makers need to consider whether they should treat users’ emotional dependence on their creations as a harm in itself, or whether the goal is simply to make sure those relationships aren’t toxic, says Laestadius.