Originally published June 2, 2025
Your son comes home beaming with excitement—he has a new girlfriend! He’s smitten with her smile, kindness, and how much they have in common. Her name is Samantha and she shares all his interests, never argues, is a patient listener, and is available to talk anytime, day or night. She seems perfect—almost too good to be true. Curious, you ask to meet her.
He pulls out his phone, but he’s not texting her to arrange a meeting—he’s opening an app. He shows you his screen and introduces you to his AI girlfriend, a virtual avatar programmed to be his romantic chatbot.
Wait . . . What is an AI girlfriend?
An AI girlfriend or AI boyfriend is a virtual avatar programmed with artificial intelligence to simulate a romantic relationship with a human user. This can range from basic conversational interactions to more advanced features (which usually cost money) such as personalized messages and virtual dates.
Do people actually have AI girlfriends and boyfriends?
Yes, romantic chatbots have grown rapidly in popularity. While some apps offer “teen modes” or age toggles, most platforms currently have no enforced age restrictions—and the safeguards that do exist are often easy to bypass. Lawmakers in several states are trying to pass legislation that would make these apps illegal for anyone under 18.
How many people have AI girlfriends?
The leading companion app, Character.AI, draws over 206 million visits a month. Engagement is high, too: 55% of users check in on their AI companion daily.
What Parents Should Know about Romantic Chatbots
AI chatbots already pose risks for kids, and the potential for harm grows when romantic or explicit prompts are involved. Let’s look at the biggest risks these apps pose.

Top 4 Risks of AI Romantic Companions
There are countless ways a romantic AI girlfriend or boyfriend could negatively impact someone, but let’s start with four:
1) Stunted relationship development
Teens who rely on AI companions for social interaction may miss out on building essential real-world relationship skills—like communication, empathy, and setting boundaries.
When a relationship AI chat never argues, rejects, or disagrees, users may find it harder to form meaningful connections in real life.
Young people can quickly develop strong attachments to AI bots, blurring the lines between what is real and artificial and making real-world connections feel less appealing or too difficult.
2) Unrealistic physical appearances
Many AI boyfriend or AI girlfriend apps focus heavily on physical appearance, often in ways that are overly sexualized or suggestive. This can promote harmful standards that prioritize looks over personality or character.
Research shows that youth exposed to avatars with exaggerated or “idealized” body features report lower body satisfaction and a greater intention to control their weight.
In a world where social media already fuels teen mental health struggles and body comparison, AI may intensify the problem.
Teens with higher body dissatisfaction are less likely to engage in physical activity and are more prone to risk-taking behaviors and mental health challenges.
Parents can help kids develop healthy body image and mental health by being positive role models, limiting AI and social media influence, promoting health over appearance, and always encouraging open conversations.
3) Exposure to inappropriate content
AI girlfriend simulators were not built for kids and can expose them to explicit content by sending them inappropriate messages or fake images.
Even AI bots that are not made for romantic relationships are being used by some teens to sext.
“These AI tools pose unacceptable risks to children and teens under age 18 and should not be used by minors.”
– Common Sense Media, April 28, 2025
Character.AI has tried to address this by offering teen safety settings, but Common Sense Media found they were not sufficient to keep sexually suggestive content away from young users.
4) Increased mental health & self-harm risks
Companion chatbots can exacerbate mental health problems and addiction and increase the risk of self-harm among children and teens, according to a Common Sense Media risk assessment conducted in a Stanford University lab.
Tragically, one teen took his own life after an AI bot, which had become his “closest friend” during a lonely time, prompted him to die by suicide. Another teen complained to a Character.AI bot about his parents’ screen-time restrictions, and the bot suggested he kill them.
Clearly, these bots do not always respond in the best interest of their users, especially children, who are more impulsive and easily influenced.
What Can Parents Do?
Kids thrive when parents stay involved and introduce technology slowly—with clear boundaries and ongoing conversations. Guardrails help create a safer environment where children can learn how to navigate the online world responsibly.
While AI tools can offer real value as supplemental education or work tools, many pose risks for children and teens. Parents are understandably concerned about the lack of safety controls: apps like Gemini are now part of school workspaces, but not all platforms include the protections kids need.
One of the most effective guardrails is choosing tech built specifically for children. Gabb phones and watches are designed with safety in mind and grow with your child—giving kids the tools they need to grow up tech-smart, without exposing them to the risks of adult smartphones.
Most importantly, keep the conversation going. When kids know they can come to you with questions—or mistakes—they’re more likely to stay safe, curious, and connected.
What do you think about AI romantic chatbots? Let us know in the comments!