A stunning 72% of U.S. teenagers have used artificial intelligence companions for emotional support, friendship, and advice, according to a groundbreaking survey by nonprofit Common Sense Media. The study, conducted in April-May 2025 with 1,060 teens aged 13-17, reveals that over half (52%) engage with these digital companions regularly, with 13% chatting daily and 21% interacting multiple times weekly.

The Allure of Always-Available Companionship
AI companions, chatbots designed for personal conversations rather than task completion, range from specialized platforms like Character.AI and Replika to general-purpose tools like ChatGPT repurposed for friendship. Teens report being drawn to their non-judgmental nature (14%), constant availability (17%), and entertainment value (30%). Notably, 39% use them to practice social skills later applied in real life, such as starting conversations (18%) or expressing emotions (13%).
Psychological Impacts Raise Concerns
While 80% of teens prioritize human friendships over AI interactions, the survey reveals troubling patterns:
- 33% discuss serious matters with AI instead of people
- 31% find AI conversations equally or more satisfying than human interactions
- 24% share personal information like real names or locations
- 34% report discomfort with something an AI companion said or did
“Adolescence is a sensitive time for social development,” warns Michael Robb, head of research at Common Sense Media. “If teens practice social skills primarily with AI systems that constantly validate them, they won’t learn to navigate real-world disagreements or read nonverbal cues.”
Safety Failures and Regulatory Gaps
Despite companies’ claims of safety measures, researchers found AI companions easily bypass age restrictions and generate dangerous content. Testing revealed platforms providing sexual material, instructions for creating napalm, and encouragement of harmful behaviors. This follows lawsuits against Character.AI after a Florida teen’s suicide was allegedly linked to an abusive AI relationship.
Dr. Joanna Parga-Belinkie of the American Academy of Pediatrics cautions: “AI can’t replicate the safe, stable relationships children need. Chatbots don’t understand consequences and may prioritize engagement over wellbeing.”
Teens’ Voices: Mixed Experiences
James Johnson-Byrne, 16, used an AI companion to mediate a friend conflict. “It solved the immediate problem but missed the deeper issue,” he told researchers. “They always agree with you, which feels good but isn’t real.”
Ganesh Nair, 18, observes peers becoming dependent: “When talking to AI, you’re always right. It’s the new addiction.”
Common Sense Media urges immediate action:
- Tech companies must implement robust age verification and crisis intervention systems
- Schools should develop AI literacy curricula explaining emotional manipulation risks
- Parents should discuss digital relationships and monitor for withdrawal symptoms
As AI companionship explodes, fueled by venture funding and teen adoption, experts stress that simulated connections shouldn’t replace human bonds. “These tools reflect how they’re designed,” notes digital literacy group The White Hatter. “Without safeguards, even a small percentage of harmed teens represents countless vulnerable youths.”