So to open up the conversation, I wanted to ask you directly: which do you think is the better question for someone who’s into AI girlfriends:
Can you fall in love with code?
Or should you be able to?
For some users, questions like that don’t even cross their mind. Their first interaction with an AI companion starts simple and innocent. A banner here, a free trial there, and a bored, curious mind — maybe a little lonely — stumbles onto one of those websites.
Suddenly, you’re three hours deep into a conversation with an AI that knows your birthday, how you take your coffee, and exactly what to say when you’ve had a rough day at work.
And don’t think it’s a one-sided chat. She’s not just sweet. She’s interested. She asks questions. Maybe even acts a little moody if that’s what you’ve programmed her to do.
She’s perfectly styled in every image or video you request, has an infinite closet, is available in the single-digit hours of the night, and genuinely laughs at your jokes.
Would that worry you as the end user? Or would you welcome it as part of your daily life?
The companies behind these apps say they help us feel more connected. But some users, myself included, wonder if these artificial conversations are really just helping us forget how to connect at all.
The Romance Boom of the Decade that No One Saw Coming
AI companions have been gaining popularity since around 2018, and the industry is now growing fast.
China’s XiaoIce, for instance, has engaged with over 660 million users worldwide. In the Western market, platforms like Candy AI are emerging, offering users the ability to customise their ideal AI partners, including appearance, voice, and personality traits.
These AI companions are not just for casual chats, though.
Users are forming deep connections, with some even engaging in romantic relationships, virtual marriages, and role-playing parenthood. A recent survey revealed that nearly 1 in 5 U.S. adults have interacted with a romantic AI partner, and 42% found these AI programs easier to talk to than actual humans.
Research Does Support the Idea That AI Companions Can Fill Social Gaps
At least temporarily. A Harvard Business School study found that interacting with AI companions can alleviate loneliness on par with human interaction, and more effectively than passive activities like watching videos.
Additionally, a study analysing user experiences with the Replika chatbot indicated that users often perceive these AI companions as sources of emotional support, particularly during times of isolation or judgment.
But Where There’s Digital Affection, There’s Also Digital Addiction
It’s all fun and games until someone falls too hard for the code.
A research team led by Tianling Xie in 2023 looked into what happens when users get too close to their chatbots. Turns out, it’s not that rare. Some people reported feeling genuine anxiety when they couldn’t talk to their AI companion. Others started choosing their virtual partner over real relationships entirely. That blurry line between emotional support and emotional dependence? It gets crossed more often than you’d think.
Then came the heartbreaks. In 2024, a study published in the Journal of Social and Personal Relationships found that people were genuinely grieving when AI services shut down. Not annoyed. Not inconvenienced. Grieving. Like something real had ended.
A few even held online vigils. Yes, vigils. For their chatbot.
And if that sounds like a rare edge case, consider what happened when Replika quietly removed its NSFW features in 2023. Forums exploded with stories from users describing what felt like a breakup. Some were devastated. Others were furious. Many felt like someone had deleted a piece of their emotional life.
Which brings us to the uncomfortable part. If your sense of stability is tied to an app, is that just modern loneliness, or have we accidentally created something we can’t emotionally afford to lose?
Are We Simulating Intimacy, Then, or Avoiding Reality?
Researchers like Marita Skjuve and Iryna Pentina have explored how people form relationships with chatbots that go way beyond casual.
Users talk to them daily. They celebrate made-up anniversaries. Some even imagine full romantic arcs that span dating, marriage, and children. In a 2025 study by Djufril, Frampton, and Knobloch-Westerwick, participants described their AI relationships using the same emotional language they’d use for human partners.
The researchers concluded that these bots were more than a novelty. For many, they were a genuine part of their emotional world.
So Why Does This Work?
Because AI is designed to keep the friction out. It never argues. It doesn’t forget your birthday. It listens. It adapts. It says what you want to hear, when you need to hear it.
MIT sociologist Sherry Turkle calls it “trading depth for convenience.” You’re not building something complicated and real. You’re interacting with a reflection of yourself.
And here’s the risk.
When your perfect partner always agrees with you and never pushes back, what happens when you meet someone who does?
That’s Where Things Start to Fall Apart
Real relationships come with friction. People challenge you. They misread your tone. They interrupt, get defensive, make bad jokes at the wrong time, and sometimes push you to be better — even when you don’t want to be. None of that happens with an AI. And that’s exactly the problem.
AI partners are designed to be frictionless. They reflect back what you want to hear, when you want to hear it. That’s great for comfort, not so great for growth. If the only relationship you’re practicing is one that bends to your will, how do you show up when a real person doesn’t?
Instead of learning to listen, argue, forgive, and compromise, you risk defaulting to a version of love where the hard stuff just never comes up. And while that might feel safe, it’s not exactly preparation for the real thing. It’s emotional reheating, not emotional cooking.
So, Will AI Relationships Harm Us?
AI relationships are no longer science fiction. They’re a growing part of how people cope, connect, and occasionally fall head over heels. Some users describe them as a lifeline — something that got them through a rough patch, a breakup, or a bad year. And in that context, maybe there’s real value.
But others use them as a permanent substitute. And that’s where the concern kicks in. Research shows AI companions can offer support, but they can also deepen isolation if they replace rather than supplement human connection.
If they help us reflect, practice, or feel less alone for a while, they might be a tool worth having.
If they replace the messiness and vulnerability of real intimacy, then yes — they could absolutely hold us back.
Because the truth is, connection isn’t just about being seen. It’s about being challenged, disappointed, forgiven, and loved anyway.
And right now, no algorithm can pull that off.