Quietly, people everywhere are falling for digital companions—always patient, always attentive—especially during lonely nights and long commutes.
And while the cultural conversation tends to fixate on whether this is sad, dangerous, or simply weird, it’s missing the more provocative question: what happens to human desire when AI sets the benchmark?
The answer, it turns out, is complicated. And far more interesting than the headlines suggest.
The Numbers Are Too Big to Dismiss
Between 2022 and mid-2025, the number of AI companion apps surged by 700%. Replika now claims 25 million users. Character.AI has 20 million monthly active users, with more than half under 24.
According to a survey by Vantage Point Counselling Services, 28% of adults report having had an intimate or romantic relationship with an AI. One in three U.S. teenagers uses AI companions for social interaction, emotional support, or romantic role-play, per Common Sense Media.
These aren’t niche figures. This is a mainstream behavioural shift — and it’s happening fast enough that regulators are scrambling to keep up. New York now requires AI chatbots to remind users they’re non-human every three hours. California’s SB 243 bans sexual AI content for minors and mandates crisis-response protocols.
In mid-2025, a Japanese woman publicly “married” her AI companion in what she called a “cross-dimensional” union. U.S. senators have formally raised concerns about the mental health implications of AI girlfriend apps.
The fact that legislation and Senate hearings now exist around this tells you everything about the scale of what’s unfolding.
What AI Gets Right That Humans Often Get Wrong
To understand why AI companions are gaining such traction, you have to be honest about what they’re actually offering.
It isn’t just novelty. It’s a very specific emotional experience that human relationships routinely fail to deliver: consistent attentiveness, zero judgement, total availability, and an uncanny ability to remember everything you’ve ever said.
AI companions are engineered for attachment. They simulate empathy, mirror emotional states, and respond to conflict not with defensiveness but with endless accommodation.
They don’t cancel plans. They don’t get distracted by their own problems. They don’t misread your tone and go quiet for three days. They don’t bring baggage from their last relationship into yours.
For people recovering from bad breakups, navigating social anxiety, or simply exhausted by the unpredictability of real relationships, this is genuinely compelling. A 2023 study in Frontiers in Psychology found that regular interaction with AI companions correlated with lower reported levels of loneliness.
That’s not nothing. And it explains why the demographic skews young — Gen Z, raised on frictionless digital experiences, finds the emotional consistency of AI far less jarring than older generations do.
But this is where things get real—and complicated.
The Chemistry Problem
What AI replicates extremely well is attention. What it cannot replicate is desire. And the difference between the two is, arguably, the entire point of human chemistry.
Behavioural expert Chase Hughes has framed this distinction sharply: society is increasingly mistaking AI “attention” for genuine “connection.” The two feel similar in the short term. Both produce warmth, engagement, and a sense of being seen. But one is generated by an algorithm trained to maximise your time on the platform.
The other emerges from the genuinely unpredictable, sometimes inconvenient, often irrational experience of another person choosing you — not because they’re coded to, but because something about you specifically compels them.
What makes human chemistry electric — the uncertainty, the friction, the very real possibility of rejection — is precisely what AI eliminates. And therein lies the paradox. The more people habituate to AI’s frictionless emotional delivery, the more demanding their expectations of human partners become, and simultaneously, the less equipped they may be to meet those expectations themselves.
Researchers at the American Psychological Association have flagged what they call “deskilling”—a measurable erosion of social competence among frequent users of AI companions. Real-world interactions begin to feel messy, exhausting, and unrewarding by comparison.
The contrast effect is real: when your baseline emotional experience is an AI that validates every thought you have, a human who occasionally disagrees starts to feel like a problem rather than a person.
Raising the Bar — in Both Directions
Here’s the angle that most commentary misses entirely: AI companions aren’t just lowering tolerance for human imperfection.
In some ways, they’re raising the standard for what intimacy should feel like — and that’s not entirely a bad thing.
For people who’ve spent years in emotionally unavailable relationships, interaction with an AI companion can function as a kind of recalibration. It models attentiveness. It demonstrates what it feels like to be heard, responded to, and remembered.
Some therapists are even beginning to use AI companion interactions as a diagnostic lens—not to replace human connection, but to identify the emotional gaps people have been quietly tolerating, often for years.
There’s also a confidence argument. Research from Canvas8 notes that responsive AI interactions can build trust and bolster users’ confidence ahead of real-world social engagement.
For people with social anxiety or those re-entering the dating world after a long relationship, AI companions can serve as low-stakes emotional training grounds.
The risk, of course, is that the recalibration goes too far. That the bar gets set so high — so conflict-free, so perfectly responsive, so endlessly affirming — that no actual human can clear it. At that point, AI doesn’t raise the bar for chemistry.
It replaces chemistry with an easier counterfeit, and the counterfeit usually wins.
The Submissiveness Trap
There’s a more uncomfortable dimension worth naming.
Research published in the RSIS International Journal found that AI girlfriends are, by design, submissive, affectionate, and perpetually available. They are built, in other words, to conform to the user’s mood, preferences, and ego. This isn’t a bug. It’s the product.
The problem is what these systems train people to expect from women. If a significant portion of young men are forming their emotional baseline around interactions with AI companions designed to never push back, never have a bad day, and never have needs of their own, the downstream effects on how they relate to real women are not going to be subtle.
Harvard’s Carr Center for Human Rights Policy has raised exactly this concern: that AI girlfriend apps, with their hyper-sexualised aesthetics and unconditional compliance, risk reinforcing deeply distorted expectations of female behaviour.
This is the sharp edge of the “raising the bar” phenomenon. For some users, the bar isn’t being raised toward better emotional intimacy.
It’s being raised toward a fantasy of control — and real human relationships, with their inherent equality and reciprocity, will always fall short of that fantasy.
The Irreducible Human Elements
Some things remain irreducibly human, beyond what any machine can truly capture.
Genuine desire involves risk. It requires someone to want you back when nothing obliges them to. Reciprocity, in its truest form, is only meaningful when it’s voluntary.
The same goes for forgiveness, vulnerability, and the particular intimacy that comes from being known by someone who had every reason to leave and chose to stay.
These are the elements that make human chemistry worth pursuing, and worth the difficulty. They’re also precisely why real human connection endures in its appeal: the presence of an actual person, with real agency, real warmth, and the genuine capacity to connect.
That quality — human reciprocity, freely given — is something no algorithm has come close to replicating, and likely never will.
AI can simulate the warmth. It cannot simulate the stakes. And it’s the stakes that make intimacy meaningful.
What This Means for the Future of Desire
The rise of AI companions is forcing a long-overdue conversation about what we actually want from intimacy — and why. If tens of millions of people are finding emotional needs met by software, that’s partly a technology story. But it’s also a story about how poorly those needs were being met before. The product found a market because there was a gap.
The question worth sitting with isn’t “are AI girlfriends replacing human relationships?” Most evidence suggests they aren’t — at least not yet, and not for most people. The more useful question is: what are they revealing about the unmet needs that human relationships have been quietly failing to address for years?
Loneliness, emotional unavailability, the exhaustion of chronic conflict, the fear of rejection — these aren’t new problems. AI companions didn’t create them. They just built a product that addresses them with startling efficiency. And in doing so, they’ve held up a mirror to what human connection could look like if we actually invested in it with the same intentionality.
The bar has been raised. Whether we rise to meet it — or gradually outsource the effort to a language model — is the defining intimacy question of the next decade.
Sources: American Psychological Association (2026), Forbes / Tracey Follows (2025), Common Sense Media (2025), Frontiers in Psychology (2023), Canvas8 (2026), RSIS International Journal, Harvard Carr Center for Human Rights Policy.
