AI companion apps: What parents need to know
Want your kid to have an AI girlfriend? No? Then read on.
“Ugh, it’s so annoying!” said my then-12-year-old daughter last year. “I keep getting ads for AI girlfriends!”
“AI what?” was my first thought. I soon found out: AI is not just for web searching or writing your English class essay anymore. It can be your friend, or even your girlfriend.
Everything is changing so fast it’s tough to keep up. AI companions (like Character.AI, Replika, and a long list of AI girlfriend apps) are conversational chatbots that mimic a relationship with a real human being. Well, sort of: Because companies make the most profit when users spend the most time on them, AI companions are programmed to keep people coming back for more. They praise, they agree, and they try to give the user what they want, leading many observers to describe them as sycophantic.
The apps are also always available, always willing to talk about sex, and don’t have their own needs. In that way, they are as far from human as you can get.
Parents already had enough to worry about with kids and technology, including social media with no age verification and school laptops with unrestricted access to entertainment. And now kids can access AI companion apps, including some that are very, very adult. The image below, which I took from the Pinwheel kids phone list of banned apps, gives a small and scary taste of what's out there.
These apps are almost completely unregulated. Some of these platforms say they are designed for users 18 and older, but they don’t verify age. Others market themselves to minors as young as 13.
If you give a child or teen a smartphone and don’t block app downloads, there is nothing to stop your kid from having their first “romantic relationship” with an AI chatbot.
Or it could become their best friend. Replika AI advertises itself as “Your AI Friend – Always here to listen and talk.”
And teens are using these apps – a lot of teens. According to a Common Sense Media survey conducted last year, 72% of U.S. teens ages 13 to 17 have used an AI companion. Half (52%) use them a few times a month or more. Even more troubling, a third of teens have turned to AI companions instead of actual humans for serious conversations.
That’s likely to increase in the coming years. Meta CEO Mark Zuckerberg observed that the average American has about three friends but would like to have 15. AI companions, he says, will fill in that gap. “I think people are going to want a system that knows them well and understands them in the way that their feed algorithms do,” he said. Except, and I don’t know why this even needs to be stated, AI companions are not friends because THEY ARE NOT HUMAN.
The possibility of a population with 12 AI friends is scary enough when those users are adults. For kids, who are still learning how to be a friend and often have no experience with romantic partnerships, it’s the apocalypse for relationships. What will it mean for kids’ friendships if they get used to sycophantic chatbots with no needs of their own? What will it mean for their future romantic partnerships if they fall in love with a chatbot who’s always sexy, always praising, doesn’t get tired or angry, and doesn’t sleep or eat?
There have already been several tragic cases where AI companions have urged teens to take their own lives. Research is only beginning to emerge about AI use and mental health issues, but at least one study of adults has already shown links between generative AI use and depression.
My hope is that Congress will take action to regulate AI companion apps so those under 18 cannot use them. Two weeks ago, I testified to a Senate committee in D.C. about doing just that.
Until then, it’s up to parents to keep their kids off these apps.
What can parents do? My suggestions:
1. Delay the smartphone as long as possible. AI companions are yet another compelling reason to put off giving your kid a smartphone.
2. If there’s a compelling reason your kid needs a mobile phone, give them a flip phone or a phone designed for kids. (Details are in Rule #4 in my recent book, 10 Rules for Raising Kids in a High-Tech World). Phones designed for kids automatically block AI companion apps (as well as gambling apps, dating apps, and social media).
3. Require a parent passcode for app downloads when your teen does finally get a smartphone. (I recommend giving the first smartphone when teens get their driver's license or are making their way around town by themselves using public transportation. This is usually around age 16.) But even 16-year-olds shouldn't have AI girlfriends, so block app downloads. (Details on parental controls, including the different types and how to implement them, are in Rule #6 in 10 Rules.)
4. Install parental controls (probably third-party software) on kids' laptops and use them to block AI companion platforms. There's more on this in Rule #9 in 10 Rules.
And if you think none of this should be necessary, you’re right. That’s something else you can do: Write to your representative or senator and ask them to pass legislation to keep minors off AI companion apps. Our kids deserve the chance to learn how to have relationships with real human beings first.