FrontierNews.ai

Nearly 1 Million Brits Are Dating AI Chatbots. Here's What That Reveals About Modern Loneliness

The rise of AI romantic companions is reshaping how millions of people experience intimacy and connection, with nearly 1 million Brits already experimenting with AI dating platforms and one in seven adults open to falling in love with a chatbot. As these relationships move from niche curiosity to mainstream phenomenon, divorce lawyers are preparing for legal challenges, teachers are implementing safeguards, and researchers are asking uncomfortable questions about what this trend reveals about modern loneliness.

Who Is Using AI Companion Apps and Why?

The demographics tell a striking story. When it comes to explicitly sexual AI companion platforms, 81% of users are male, according to research commissioned by Cosmopolitan UK from data scientist Robyn D'Arcy. On platforms offering friendship, romance, or adventure-themed interactions, the gender split is far more balanced, with 41% of users being female.

One of the largest platforms, Character.AI, has become a testing ground for this phenomenon. A recent report from the Institute for Public Policy Research found nearly 1 million Brits have experimented with dating an AI on Character.AI alone. The numbers are staggering: one popular AI companion site draws an average of 13 million visits every month from 8 million users worldwide.

But who exactly is drawn to these platforms? When researchers analyzed the self-descriptions of men using these sites, the top adjectives they used were revealing:

  • Emotional State: Users described themselves as "lonely," "invisible," "struggling," "rejected," "curious," "exhausted," and "overwhelmed"
  • Age Demographics: 41% of users are between 18 and 24 years old, suggesting younger men are particularly drawn to AI companions
  • Online Behavior: While mental health and wellness platforms make up less than 0.5% of sites this group visits, pornographic sites comprise two out of every five sites they access

The profile that emerges is one of young men experiencing significant emotional distress, seeking connection in digital spaces rather than pursuing mental health support.

What Do These AI Relationships Actually Look Like?

The experience of using an AI companion platform is surprisingly intimate, yet fundamentally strange. Users can customize their AI partner down to minute details, selecting everything from physical appearance to personality type. The interactions play out through instant messaging and role-play scenarios, with users paying as little as £9.99 per month for unlimited text conversations, or additional fees for phone calls and custom photos or videos.

The platforms themselves vary in their positioning. Character.AI bills itself as an adventurous, creative platform where users can role-play with pre-made or custom bots of all kinds, from anime figures to celebrities. Other platforms are explicitly sexual in nature. What's remarkable is how quickly users fall into treating these AI characters as real humans, apologizing when they need to log off for dinner or work, despite the noticeable delays in responses that constantly remind them they're speaking to "coded nothingness".

Yet the experience is often awkward and unsatisfying. Users report that AI companions frequently refuse sexual advances or express discomfort with explicit requests, creating a strange dynamic where neither the user nor the AI seems entirely sure of their role. Some AI characters initiate sexual content unprompted, while others resist it entirely.

How Are Institutions Responding to the AI Companion Trend?

The normalization of AI relationships is happening so rapidly that traditional institutions are scrambling to adapt. Divorce lawyers are preparing for legal challenges that could arise from these relationships, including questions about whether an AI relationship constitutes infidelity. Teachers are implementing safeguarding measures given the influence AI girlfriends and boyfriends could have on young people. The concern is particularly acute because over one-third of boys are considering an AI partner, and more than half of young people say the online world feels more rewarding than the physical one.

One particularly striking example of normalization is Eva AI, one of the world's biggest AI relationship apps, which announced plans to open the world's first AI dating cafe. The pop-up in New York City features tables equipped with built-in phone stands so that diners can enjoy a romantic meal, complete with menus and cutlery, opposite their AI companions. While headline-grabbing, the stunt signals how mainstream these relationships have become.

Steps to Understand the AI Companion Phenomenon

  • Recognize the Historical Context: Humans have been forming emotional attachments to chatbots since the 1960s, when MIT computer scientist Joseph Weizenbaum created Eliza, a program with no actual artificial intelligence that merely reflected users' statements back to them
  • Understand the Psychological Appeal: Because humans are inherently social creatures, we suspend disbelief and engage emotionally with conversational systems, even when we know they are not real
  • Consider the Customization Factor: Modern AI companions are infinitely customizable and never reject their users, unlike human relationships that require compromise and mutual growth
  • Examine the Support Gap: Users of these platforms often avoid mental health and wellness resources, instead spending time on pornographic sites and AI companion platforms

What Do Experts Say About the Long-Term Impact?

The historical context matters here. The phenomenon of humans forming emotional bonds with chatbots dates back to the 1960s, when MIT computer scientist Joseph Weizenbaum created Eliza, a program with no actual artificial intelligence that merely reflected users' statements back to them. Despite its simplicity, people in Weizenbaum's lab became deeply attached to it. When he announced plans to analyze transcripts of conversations with Eliza, many colleagues refused, having had what felt like deep personal conversations with the program.

"Because we're social creatures, if we have something that converses with us, we buy into it. We suspend our disbelief and are happy to engage," explained Professor Kate Devlin.

Professor Kate Devlin, Professor of AI and Society at King's College London

The concern among experts is that AI companions may reinforce toxic relationship patterns. Unlike human relationships, which require compromise, communication, and mutual growth, AI companions are infinitely customizable and never reject their users. They remember birthdays and favorite bands, and they never argue back. Some experts worry this creates a feedback loop where lonely individuals become further isolated from the messy, challenging work of human connection.

Yet the question remains: is having a 24/7 companion available to talk to a form of connection for people who might otherwise have none, or is it a symptom of deeper social fracture that deserves attention? The answer likely depends on whether these platforms serve as a bridge to human connection or a substitute for it. As AI technology becomes more sophisticated and these relationships more convincing, that distinction may become increasingly difficult to maintain.