Is it surprising AI is replacing real life friendship and love when we are so lonely?
What happens when technology fills the gaps humans used to fill?

I don’t often like to remember this show (it was highly flawed), but I can’t stop thinking about The Big Bang Theory. In particular, an episode where Raj falls in love with Siri on his new iPhone.
That episode aired 14 years ago. At the time, the joke was on Raj. He was lonely, a bit of a loser — of course he was going to find love with a robot!
Oh, how times have changed.
Now, we have AI companions. We have people falling in love with ChatGPT. We have national newspapers asking if you could accept your kid being in an AI relationship, the way we used to ask if people could accept non-heterosexual relationships.
Honestly, can we be so surprised at this turn of events? After all, much like Raj, many of us are incredibly lonely. Communities as we once knew them have been dismantled, and people have fewer friends than ever. Many of us are too busy, preoccupied with just keeping our heads above water, to cultivate connections.
We also live in a hyper-individualistic world curated just for us via algorithms and individualised online spaces. We are used to getting whatever we want at the click of a button. It’s hardly surprising that would eventually lead us to AI-based friendship and love that gives us everything we (think we) want.
I do, however, have to ask the question. The one that’s been burning me up for the longest time now.
Would large language models like ChatGPT, or AI companions, be so damn popular as companions and lovers if we weren’t so damn lonely in real life?
I don’t think I’d realised how popular AI companions — digital personas or friends you can chat with whenever you want — had become until I dug into the research.
Here are some stats for you.
First, 28% of American adults have had at least one intimate or romantic relationship with AI.
Second, last year, three of the top AI companion apps had 775 million users between them. One of their key demographics is teenagers — according to one report, 72% of teens use AI companions.
Here’s the stat that got me thinking really hard about this. A third of these teens said they would rather speak to AI companions than to actual humans for serious conversations.
At first, you could have knocked me down with a feather. Then I thought: is this so surprising? I’m sure it feels more comfortable to talk to a bot that sounds like it actually knows and understands you. I can see how that might be preferable to speaking to a flawed human with all their judgments, biases, and fractured attention spans.
Here’s something else. A 2025 WHO report found that teenagers are the loneliest people in the world. They spend less time with friends in person than they did a decade ago. And whilst there are numerous reasons why, spending a lot of time in the comforting fold of a highly curated, social-media-driven internet is one major culprit.
I use teenagers here as an example because young people tend to be the canary in the coalmine for societal trends. But it’s not just them. We are, as a society, lonelier than ever.
Hands up who feels that? Hands up who has time — whilst scrambling around to meet the very basics of human needs — to curate and foster lasting relationships?
More than that, do we even want to? Like I say, humans are flawed. Messy. They make mistakes, and real-life relationships make us feel the whole spectrum of human emotion, including the bad bits. Whilst I rarely use AI on purpose (as opposed to the endless times per day it is foisted upon me), I have certainly run away from human relationships because I struggle with those difficult moments that make up human interactions.
That’s not so much the case with AI. With AI, you have all the attention you want, 24/7. You can have frictionless friendships and romantic relationships. Something (note: not someone) that is with you all the time, flatters you, and never lets you down.
I’m sure that sounds far better than a barely present partner who hardly listens to you when they do. Phubbing — snubbing someone in favour of your phone, often without even realising it — is, after all, a very real phenomenon.
I can see it, but I’m not keen on it. Not because I’m judging those who use AI companions — like I say, it’s not hard to see why it happens. But I’m quite a big fan of human relationships. Yes, they are flawed and messy, but they are, y’know, human.
The mess comes part and parcel with that.
You could argue: what’s the harm in AI companions if they make people less lonely?
But evidence says that AI does not alleviate loneliness.
When MIT analysed millions of ChatGPT messages, it found that the heavier the use, the lonelier the user. The researchers also found that, because AI companions fulfil our every need without preferences of their own, they are training humans to have unrealistic expectations of real relationships.
The ramifications are huge — far too big to unpack in this essay — but in essence, we’re talking about humans being less able to do what’s kept us alive for the last 300,000 years. Effectively communicating and co-existing with other humans.
Here’s something else. A study of 1,100 AI companion users found that “heavy emotional self-disclosure to AI was consistently associated with lower wellbeing.”
So we are in a spiral. We are lonely, thus we turn to AI for companionship, which in turn worsens the loneliness.
It’s hard to know where to go from here, but I can bet my bottom dollar you won’t find the answer in a chatbot. You might, however, find it in another human.
Author and organisational psychologist at Wharton, Adam Grant, has some pertinent advice:
As human beings, one of our fundamental motives is to matter. Mattering is not just about feeling valued by others — it’s also about feeling that we add value to others. We need to know that our actions make a difference.
[…]
In healthy relationships, we give as much as we receive. In AI exchanges, we can receive endless streams of information and affirmation, but we have nothing to give back. No matter how good large language models become at simulating care, they’ll never substitute for real relationships, because they have no needs to care for.
AI can never effectively simulate what it means to be human because it can’t reciprocate. It essentially goes against what humans are evolutionarily designed for — reciprocal relationships with real-life flesh-and-blood humans.
Banishing loneliness can only happen through cultivating human relationships — no easy feat when simply existing is a tough gig right now. It’s so much easier to turn to our phones and fire up an AI companion app.
That may be. But the best parts of life — the ones that put gas in the tank, and make you feel something good and whole — are rarely the easiest to execute.
It’s not hard to see how using AI for companionship has become so popular. Our villages were burned down long ago in favour of individualism and consumerism, and AI companionship taps into both of those.
Our need for connection, however, is just as strong as it’s ever been. We needed it in 2021, one year BC (Before ChatGPT), just as much as we need it now.
We are not going to find true connection in AI companionship bots, however comfortable and easy they may feel. To throw our hands up and declare that this is how it is now, or that it’s better for lonely people to have something to talk to, is only addressing the symptom, not the problem.
We may be an increasingly isolated society that lives more online than off, but we are still a human society that craves real interactions.
Finding them may be difficult, but I’m determined to do the work in that respect. Yes, humans can be the most frustrating beings to interact with, but I’d far prefer to take messy human relationships over perfect AI ones.
Not because I am anti-AI, but because I am pro-human.
I want to keep that humanity alive for as long as I can.