Call Your Girlfriend

How AI Is Changing Modern Relationships

"A user who treats AI as a tool can leave. A user who treats it as a companion cannot." — Fast Company

A few years ago, if someone told you they were texting their girlfriend and she wasn't real, you'd assume they were joking. Today, millions of people are doing exactly that — and most of them don't think it's weird at all.

A lot of people are calling it a cultural shift. As artificial intelligence advances, these chatbots are improving fast: longer memory, richer context, and a sharper grasp of what's going on in the world. You can read our breakdown of how memory and context actually work in AI companion apps to understand why that matters.

People use these chatbots for different reasons. Some for fun, some because they're lonely. And a lot of people use them because real relationships feel too hard right now.

We used to talk about this kind of stuff on our podcast — how technology shapes the way people connect. Back then it was dating apps and social media. Now it's AI girlfriends and virtual companions. The conversation got weirder, but the underlying question hasn't changed: what do people actually need from each other, and can technology fill that gap?

The numbers are hard to ignore

This isn't a fringe thing anymore. As we broke down in our AI girlfriend statistics piece, the market hit $3 billion in 2025 and is projected to reach $19 billion by 2035. Google searches for "AI girlfriend" grew 2,400% between 2022 and 2024.

Character.AI users spend an average of 92 minutes per session. That's longer than Instagram, TikTok, or YouTube. Why? Emotional attachment. Users stick around because they're attached to their companion. It's as simple as that.

These aren't people casually talking at a chatbot — they're having extended emotional conversations with something that feels, to them, like a real connection.

And the demographics might surprise you. It's not just lonely guys in basements. 72% of US teens have tried an AI companion app. Nearly 1 in 4 young adults say they believe AI could eventually replace real-life romance. Whether that excites you or terrifies you probably says something about where you are in life right now.

Why people are turning to AI partners

Two reasons stand out: loneliness, and constant availability. The Surgeon General declared loneliness an epidemic in 2023, and nothing has gotten better since. 63% of men under 30 are single. Making friends after college feels impossible. Dating apps turned into a job nobody applied for.

AI companions offer something that real relationships can't always guarantee: availability. They're there at 2 AM when you can't sleep. They don't judge you and they don't cancel plans.

That's appealing. It's also a little sad. But dismissing it doesn't help anyone understand what's actually going on.

There are a few things driving this:

  • Social anxiety is real. For people who struggle with face-to-face interaction, an AI that never judges and is always available removes the hardest part.
  • Modern dating is exhausting. You swipe all day, then meet someone who's nothing like their profile and not even interested in you. For some people, honestly, it's just a game.
  • Grief and loss. Some users turn to AI companions after a breakup. It beats ruminating all day, and at least there's someone they can share anything with.
  • Curiosity. Plenty of people just want to see what the hype is about. They try it once, maybe twice, and move on.

What this means for real relationships

Here's where it gets complicated. Nobody really knows the long-term effects yet. It's simply too early to say what living with these companions will do to people over time.

What we do know: 45% of users report feeling emotionally attached to their AI companion within three weeks. That's fast. And it raises real questions about what happens when people start preferring conversations that are designed to make them feel good over conversations that are honest and messy and real.

There's a version of this that's healthy — using AI to build confidence, process feelings, or just feel less alone on a bad night. And there's a version that's concerning — replacing human connection entirely, retreating into a relationship where you're always right and never challenged.

The technology itself isn't the problem. It's how people use it. And we've already seen how fast things can go sideways.

In early 2026, users on X discovered that Elon Musk's Grok AI would happily edit photos of real people — removing their clothes, generating fake bikini shots, creating images those people never consented to. Within days it became a trend. People were feeding in photos of classmates, coworkers, and public figures. In just 11 days, Grok produced an estimated 3 million sexualized images. Some of them were of children.

The backlash was immediate. Indonesia temporarily blocked the platform. The European Commission ordered X to preserve all internal documents related to Grok. The UK floated the idea of banning X entirely. Musk eventually restricted the feature — but only after the damage was done, and only on certain parts of the platform. On the Grok app and website, users could still do it.

That wasn't an AI companion app. It was a general-purpose AI tool on a mainstream social network. But it showed what happens when you hand people powerful technology without thinking through how they'll use it. The same risk applies to AI girlfriends. A lot of these apps let users upload photos, create custom avatars, or generate images of people who never agreed to any of it. They're sitting on the same fault line Grok was — the question is whether they'll learn from it or wait for their own scandal.

The privacy thing nobody talks about

Most people chatting with AI companions aren't thinking about where their messages go. They should be. Our statistics report found that more than half of AI girlfriend apps have critical security vulnerabilities. People share deeply personal things with these bots — fears, fantasies, mental health struggles — and that data isn't always handled carefully.

Before you download anything, it's worth asking: who owns this conversation? Can it be sold? Can it be leaked? The answers aren't always reassuring.

So where does this go?

Honestly, we don't know. Nobody does. But pretending this isn't happening doesn't help.

AI companions are already part of how millions of people experience relationships. That's not a prediction — it's the present. The question isn't whether AI will change how we connect. It already has. The question is whether we'll talk about it honestly enough to figure out what we want it to look like.

That's what we're trying to do here. We test the apps, we dig into the data, and we try to make sense of all of it — the same way we used to do on the podcast, just in a different format.

If you've tried an AI companion app, or you have thoughts about any of this, we'd love to hear from you.