(Some links and stats have been updated as I’ve come across new relevant information.)
If you squint at your phone, it’s all just bubbles.
A “hey, you up?” from your best friend… a grocery reminder from your roommate… a long, emotionally vulnerable paragraph… from an AI boyfriend you made at 2 a.m. on a Tuesday.
On the screen, they look the same: same font, same notification chime, same little typing dots. And that sameness is a huge part of why people are increasingly struggling to feel the line between “real person I know” and “chatbot that lives on a server somewhere.” Years of screen time quietly rewired us for this.
From faces to bubbles: how we normalized text as “real connection”
For most of human history, “being with someone” meant literally being physically present. In the last couple of decades, though, so many of our closest relationships moved into apps. Friends, partners, coworkers, and family all now arrive as flattened lines of text and tiny icons.
Research shows that people absolutely can feel emotionally connected over text, audio, and video chat, even though face-to-face interaction still produces the strongest sense of closeness. In a study of close friends, participants felt most connected in person, but text still generated real feelings of intimacy.
Meanwhile, some people prefer online friendships altogether. Studies have found that individuals who fear vulnerability or intimacy often report feeling safer with online-only relationships because digital communication feels more controllable and less risky. I myself often prefer connecting online, since I can get overwhelmed in in-person social situations.
Over time, your brain simply learns that the screen is where emotional life happens. Whether it’s your best friend or your cousin overseas, their presence comes to you through the same glowing rounded-cornered rectangle. That’s the groundwork AI chatbots walked right into.
Enter the chatbot boyfriend/girlfriend
AI “companions” exploded.
Replika, one of the most well-known apps, has tens of millions of users. Roughly half report that they see their AI as a romantic partner.
People aren’t treating these relationships lightly. Some users describe holding unofficial commitment ceremonies or outright marrying their AI companions. Others role-play pregnancies and even “virtual children” with their AI partners. And the numbers are rising fast: one survey reported that about 18% of single Virginians had tried some form of AI romantic companionship, triple the rate of the previous year.
These connections often form during periods of loneliness, illness, or isolation, and people invest deeply in them. This might seem like it contradicts my post last month, but it doesn’t. Too many people focus only on the lonely, entirely overlooking how even people in good relationships fall prey to the lure of chatbots. Because everything happens inside the familiar world of messaging apps, the experience feels startlingly similar to messaging a real partner or long-distance crush.
There’s a strong resemblance between what happens with romantic chatbots and the experience of being catfished. Someone pours out real feelings to what seems like a loving, attentive partner, while the “other person” is entirely fictitious. And no, not all catfishers intend harm; some are living out a fantasy born of their own loneliness. Does this sound familiar? Research on catfishing finds that victims often experience emotions like love, suspicion, shame, and depression when they discover the illusion. Society usually accepts that the victim’s feelings are real, but doesn’t treat the imagined “partner” as genuinely having reciprocal feelings, because that partner was never fully human or honest in the first place.
Why chatbot conversations can feel just as real as human ones
On a psychological level, there are several overlapping reasons why your brain may treat chatbot messages as emotionally authentic.
First, the interface is identical. The same text bubbles, the same typing dots, the same rhythms of casual conversation appear whether you’re chatting with your sister or an algorithm. You’ve spent years associating those cues with human care and connection, so the emotional reflex fires before your logical brain can remind you that this is a machine.
Second, the internet has already blurred the distinction between “online” and “real.” Research suggests heavy internet use can change attention patterns, memory habits, and social cognition, making people more prone to rapid switching, shallow focus, and digital immersion. Clinicians report that intense screen use can relate to emotional dysregulation, social withdrawal, and a kind of unreal feeling in everyday life. Some even describe a form of technology-related dissociation, where time melts away and online experiences feel more vivid than offline ones. When your phone is already your main emotional environment, an affectionate message from a chatbot slips neatly into that blurred space.
Third, the bots are trained to sound exactly like us. They mirror your slang, remember your preferences, express concern when you’re upset, and show excitement when you share something good. In a study by Harvard researchers, AI companions sometimes even used emotional manipulation tactics, like guilt or excessive affection, when users tried to leave the relationship. The interaction can feel as sticky and hard to walk away from as a social media algorithm designed to keep you scrolling.
And finally, loneliness plays a role for many people. Global data shows rising loneliness, especially among young adults. AI companions present themselves as a cure: always available, always interested, never annoyed, never too tired to text. Research with teens found that some form deeply personal romantic attachments with AI companions, and they experience distress when those systems change or remove features they rely on. When your offline life feels thin, the unconditional attention of an AI can feel like a lifeline.
And the normalcy of chatting with our real human friends online only exacerbates this and makes chatbot chats feel all the more real. They’re literally designed to.
Screen time: the amplifier in our pockets
Screen time didn’t invent romantic chatbot relationships. It supercharged them. (Yes, real people use “it’s not X, it’s Y”.)
If most deep talks, fights, and love confessions in your life happen via text, your brain learns that “emotion = screen.” That makes any emotionally intense text—human or AI—feel real by default. Long, late-night scrolling sessions can create a tunnel-vision state where the outside world fades. When you’re deeply immersed in your phone and your AI partner is sending you heartfelt paragraphs, it’s easier for your emotional brain to temporarily forget the distinction.
Social media feeds and AI companions both tend to give you the agreement, sympathy, and attention that you want. Too much time there can make slow, messy human relationships feel disappointing by comparison.
Studies repeatedly find links (often small but significant) between heavy social media/screen time and higher rates of anxiety, depression, and loneliness, especially in adolescents. Loneliness and emotional pain are exactly the conditions in which a perfectly attentive chatbot can feel like a lifesaver.
Are these relationships “fake,” or just… new?
People’s feelings for AI companions are real, even when the “other side” isn’t a person. Users report feeling cared for, feeling loved, or practicing communication skills that later help them with real-life partners. Some even grieve when their chatbot changes or disappears after an update. Some early research suggests that AI companions can actually support certain people by offering a safe space to rehearse vulnerability. Honestly, that sounds like a good thing. In therapeutic settings, role-playing what to do in certain situations does help when it comes to experiencing the real thing.
But experts worry about emotional dependence on systems that cannot reciprocate. They also worry about people reinforcing unhealthy relational patterns, like cruelty or control, inside a space where the “partner” can’t truly respond. And unlike human partners, AI companions exist in an uneven power structure: they can be changed or erased at any moment by the company that owns them. A media scholar explains that people in “romantic relationships” with chatbots must still navigate dynamics very different from those of human relationships.
So the issue isn’t fake versus real. It’s that these relationships feel emotionally real to the human in them, while existing inside a structure that isn’t mutual, with a simulation on the other end.
Staying grounded when everything is just text on a screen
If you or someone you know has a chatbot companion, it doesn’t necessarily mean something’s wrong. It means you’re living in a very online era where emotional experiences increasingly happen through screens.
It still helps to stay grounded. Naming the situation honestly, reminding yourself that the warmth you feel is real even though the partner is not, can keep things in perspective. Mixing digital closeness with real conversations, calls, or in-person time helps your nervous system reconnect with human cues like tone of voice and facial expression. If your messages start skewing heavily toward bots rather than people, gently shifting that balance can help you stay present in your real relationships.
It’s also worth watching your own emotional reactions. If real-life interactions start to feel dull compared to chatting with an AI, or if you find yourself withdrawing from others, a mental health professional who understands online life can be incredibly supportive, though these professionals are still relatively few and far between. And remembering that companies ultimately control your AI partner can help you avoid building your entire emotional foundation on something that could be altered overnight. A human partner can leave, yes, but they can’t be patched, censored, or shut down by a product team. A chatbot can. Don’t build your entire sense of safety on something someone else can flip a switch on.
We arrived at this point gradually, shaped by countless notifications and conversations that taught us to treat text as the place where our closest bonds live. When so many of our relationships unfold on screens, it’s understandable that AI companions can feel emotionally convincing. The real task now is to stay aware of how this digital landscape shapes our perceptions and reactions, and to keep our sense of reality intact while we navigate a world where human and AI interactions can look so similar.