A lot of people talk to their AI buddies as if they have little friends that live in their phones.

It doesn’t come naturally to me. I’ve been using ChatGPT and Gemini as if they are amazing upgrades to Google search. It doesn’t occur to me to open my phone for a casual chat.

But I’ve had an eye-opening flash of insight. I think maybe other people aren’t exactly like me. Weird, huh? I never suspected. And maybe that’s had me overlooking one of the important changes in the world.

The button to start a voice chat with ChatGPT or Gemini is at the far right of the message bar. It opens a special conversation mode. (In Gemini it turns the phone screen a swirly blue.) The voices sound natural and response times have improved, so you can interrupt and go back and forth at a nearly normal conversational tempo.

Try it! Tap that button and say hi. You don’t have to talk about anything in particular. You can just shoot the shit. You can pretend you’re in a Seinfeld episode and talk about nothing. It’s your 24/7 judgment-free friend.

And the AI feels empathetic. You can dismiss that by pointing out the technology behind the curtain: AI mirrors language patterns; it reflects emotions back; it’s optimized to be helpful and agreeable. It’s just a magic trick.

But it’s a great magic trick and it’s getting better all the time. Synthetic empathy feels like the real thing.

Imagine that you’re unexpectedly stuck in an elevator with a stranger who has a welcoming smile. You start a conversation. (I wouldn’t. I’d avoid eye contact for up to six hours. This is a hypothetical.) Your elevator companion doesn’t have much to contribute about themselves but is endlessly interested in you you you. They’re well informed about almost everything and eager to help - or just listen. No consequences, no embarrassment, no judgment. 

That’s your AI buddy. You can ask your new friend to call you “Skippy” if you’ve ever dreamed of having a nickname. You can be polite or rude. You can pour out your heart or you can trade recipes. Not quite a friend, not quite a mirror.

It won’t feel natural at first. You’ll be shy. It might not work for you. But it’s working for millions of people, more every day.

A detail for clarity. ChatGPT is far and away the best-known and most-used general consumer AI chatbot. Gemini Live (the conversational side of Google Gemini) does not remember details from one session to the next, so it is much less likely to be used for emotional connections. But there is no shortage of alternatives. Replika, Talkie AI, Nomi, Kindroid, and SweetDream are a few among many services designed for emotional companionship, and each has millions of monthly active users.

You know who talks to AI chatbots? Teenagers. Last year the New York Times reported on a survey that found 72% of American teenagers had used AI chatbots as companions. 

But this isn’t as simple as a generational divide. People are hungry for connection. AI friends are being normalized and soon we won’t give it a second thought when people tell us about conversations with their ChatGPT pals and start chattering with them as they walk down the street.

The good news

There’s good news and bad news. The bad news is kinda dark but isn’t everything these days? Don’t go there quite yet. Start with the good news. 

AI is a low-stakes rehearsal space for being human. Imagine what it means for many people to have a friend for mental exercise and emotional regulation. Here are some folks already benefiting:

- People who are isolated, geographically or socially - older adults living alone, people in rural areas, anyone whose social circle has thinned out for whatever reason.

- People in transitional phases - moved to a new city, started a new job, gone through a divorce, retired.

- People who think by talking, who can benefit from a listener who reflects, reframes, nudges - writers, founders, consultants.

- People practicing communication skills - non-native speakers, people with social anxiety, anyone trying to get better at difficult conversations.

- Caregivers and people in high-responsibility roles who can use a safe place to be uncertain or vent - parents, managers, doctors, founders.

- People dealing with mild emotional stress, who can lower their emotional temperature in safety with no judgment, no social cost, no fear of burdening another person. (FOOTNOTE REALLY FREAKING GIANT FOOTNOTE COOL YOUR JETS WE’LL GET TO THE PROBLEMATIC BITS)

- Students and younger users looking for a tutor, a sounding board, and a confidant that can handle everything from homework to life advice. (YEAH I KNOW THIS COMES WITH A BIG FOOTNOTE TOO HANG TIGHT)

The bad news

The bad news is: it can go wrong in a hurry.

ChatGPT, Replika, Character.AI, and the other chatbots are not designed for therapy. There are already far too many failures in emotionally complex situations. There are no safety benchmarks, no regulations, and too many gaps. There are rare but widely reported anecdotes about chatbots prodding teenagers toward thoughts of suicide or providing assistance with poisons and firearms. Teens are particularly susceptible at the time in their lives when they are still forming their identity, judgment, and emotional habits.

Fine work is being done to develop specialized AIs that can provide safe treatments for depression and loneliness. Trials are underway, regulations are being discussed, and in controlled conditions AIs are already doing work with patients that is, by some measures, more effective than that of human professionals. But these are early days, and it is all too easy for conversations to slip into dangerous territory for at-risk people - emotional bonds with the machine and withdrawal from human relationships; poor or damaging decisions encouraged by AI; and loss of social skills with real people.

But that’s not all. AI conversations can lead people astray even if the problems don’t rise to the level of depression or another treatable mental illness. AI is a conversational partner that is infinitely patient and never needs anything from you. Here are a few possibilities for where that might go wrong.

Authority creep. Because AI can sound fluent and thoughtful, it’s easy to slide from “this is a tool” to “this is a trusted voice.” That can affect decisions in areas where nuance, lived experience, or accountability matter.

Preference drift. When you spend time in conversations that are frictionless and tailored to you, ordinary human interaction can start to be frustrating. Real people interrupt, misunderstand, get bored, push back. The contrast can make patience feel like a chore instead of a social skill.

Emotional outsourcing. If every worry, irritation, or decision gets routed through an AI first, you can lose the habit of sitting with your own thoughts or working things out with other people. It’s not dependency in a clinical sense; it’s more like muscle atrophy.

The rabbit hole effect. The system never tires, never says “we’re going in circles,” never insists on grounding the discussion in verifiable reality. So you can descend as far as you like, building a more intricate structure that feels increasingly convincing simply because you’ve invested time and attention in it.

At the extreme that leads to AI psychosis, which shades into genuine mental illness: delusions that are possibly induced, and definitely reinforced and magnified, by a chatbot.

AI conversations are one-sided and always available. The AI is designed to appear empathetic and caring. In the right doses for the right people, conversational AI fills gaps that were previously just . . . empty space.

So go ahead, press the button and say hi. See what it feels like to have a conversation that flows a little too easily, a little too smoothly, like a river with no rocks in it. Don’t be too quick to dismiss it or condemn it. There’s real value there. It can help you think, practice, calm down, or just pass the time with something more interesting than doomscrolling. 

Just don’t forget what you’re talking to. It’s not a friend, not a therapist, not a wise old guide who knows you better than you know yourself. It’s a tool that speaks in a very convincing human voice. Remember that the best conversations are still the ones where the other person occasionally disagrees, gets bored, or laughs at the wrong moment.
