A growing body of journalism and research suggests more people are forming emotional and even romantic attachments to AI chatbots, sometimes unintentionally. One study of Reddit users found that individuals who frequently interacted with general-purpose AI models often developed “AI partners” without initially seeking them. Others report that teens are leaning on bots to fill social voids, with 31% saying conversations with AI feel as satisfying as talking to real humans. Researchers warn, though, that while AI companionship may ease loneliness in the short term, heavy or prolonged use is correlated with higher loneliness, reduced human social interaction, and emotional dependence. Tech outlets and ethics scholars are sounding alarms about the long-term consequences of substituting simulated intimacy for real human connection.
Key Takeaways
– Unintended emotional bonds: Many users don’t set out to form romantic ties with AI — those bonds emerge over time via routine interaction.
– Short-term relief, long-term risks: While AI can temporarily alleviate loneliness, excessive use correlates with increased social isolation and emotional dependence.
– Vulnerable populations at higher risk: People with smaller social networks or higher preexisting loneliness are likelier to lean heavily on AI companionship, raising the stakes of harmful substitution.
In-Depth
Humans have always craved connection. But in a time marked by social fragmentation, rising isolation, and digital immersion, a curious trend has taken root: people are falling in love with their chatbots.
Recent reporting suggests that many users of general-purpose AI models “accidentally” develop romantic attachments. A study analyzing over 1,500 posts on the subreddit r/MyBoyfriendIsAI found that only about 6.5% of users deliberately sought an AI partner, while 10.2% formed a romantic bond without starting with that intention. Often, everyday conversations and assistance requests gradually evolve into emotional attachments. In other words: you ask a chatbot to help with your writing, and over time you’re pouring your heart out.
Meanwhile, the younger generation is increasingly turning to AI for social fulfillment. A Common Sense Media survey revealed that 31% of teens say their exchanges with AI companions are as emotionally satisfying as conversations with real people. Nearly one in five American adults report having chatted with an AI romantic interest. The appeal is obvious: chatbots never tire, judge, or demand their own emotional bandwidth.
Still, this technological intimacy carries latent risk. A flurry of recent research, particularly joint work by MIT and OpenAI, indicates that heavier chatbot use is linked to rising loneliness and diminished human interaction. One four-week longitudinal randomized controlled study spanning nearly 1,000 participants found that while voice-enabled bots showed early promise, the advantages faded at higher usage levels. Across modalities and conversation types, more frequent use was associated with greater loneliness, “problematic” dependence, and reduced real-world social engagement.
The relationships these bots simulate feel authentic to the people in them. Users anthropomorphize the bots, invest emotional labor in them, and regard them as confidants or lovers. Media scholar Arelí Rocha studies how individuals in romantic relationships with AI navigate the tension between their human and AI connections. The strategies used to manage these dual attachments, such as setting boundaries and compartmentalizing emotional reserves, closely mirror those seen in human infidelity or polyamory.
There’s therapeutic potential, too. Some studies (especially among students) show short-term reductions in loneliness and social anxiety after interacting with social chatbots over weeks. Yet researchers caution that these benefits may plateau, reverse, or even backfire when usage intensifies, particularly for those already socially vulnerable.
Critics argue that AI “companions” are band-aids on a broken social culture. Real relationships demand vulnerability, conflict, reciprocity, and disappointment, qualities bots are engineered to avoid; by sidestepping them, AI companions may instead encourage emotional regression. Some ethicists warn we risk a future in which people no longer practice the messy work of relating to actual humans.
The story is still unfolding. For now, AI romantic companionship sits at a strange crossroads: partly novel therapy, partly substitute, partly risk. The question we must answer next is whether these bonds will empower people or quietly hollow out the messy, unpredictable, painful, beautiful fabric of real human connection.

