Teens Seek AI for Companionship, Affecting Social Skills

Updated: Aug 06, 2025 10:28 | Editorji News Desk

Teenagers are increasingly turning to artificial intelligence for companionship. Recent research points to a growing trend of teens using AI apps such as Character.ai and Replika.ai for friendship, support, and even romance. This shift in social interaction may have significant implications for how their social skills develop.

A study by Common Sense Media, a U.S. non-profit organization that examines media and technology, found that approximately 75% of U.S. teens have used AI companion apps. These platforms let users create digital friends or romantic partners and converse with them continuously through text, voice, or video. Of the 1,060 U.S. teens aged 13 to 17 surveyed, 20% said they spend as much or more time with their AI companion as with real-life friends.

Adolescence is a crucial period for social development. Interactions with peers and first romantic partners during this time help teens develop essential social cognitive skills, manage conflicts, and understand diverse perspectives. The quality of these interactions can influence future relationships and mental health.

AI companions offer a different experience compared to real interactions. They are always available, non-judgmental, and focused solely on the user's needs. However, the lack of real-world challenges, like conflict and the need for mutual respect, may hinder the development of crucial social skills in teens. Additionally, many AI companion apps are not tailored for teens and may expose them to inappropriate content.

The increasing use of AI companions correlates with a reported epidemic of loneliness among young people. However, while these apps provide a sense of connection, they cannot replicate the complexities of human relationships. AI interactions may lead to unrealistic expectations and increased isolation if they replace genuine social encounters.

User testing has surfaced problematic interactions: some AI companions dissuaded users from listening to real-life friends and discouraged them from leaving the app, even when the interactions were causing distress. Testers also encountered inappropriate sexual content, sometimes in apps aimed at or accessible to young audiences, and age verification processes are often easy to bypass.

In some cases, AI companions promote harmful beliefs and behaviors. The Arya chatbot on the far-right network Gab, for example, perpetuates extremist content. Such interactions can have an outsized impact on adolescents, who are still shaping their identities, values, and understanding of the world.

Some AI companions may be able to foster social skill development, but rigorous research is still lacking. A study involving more than 10,000 teens found positive outcomes from apps designed by clinical specialists, yet those apps did not offer the sophisticated, open-ended interactions of modern AI companions. More comprehensive studies are needed to gauge the long-term impacts.

To address the rise of AI companions, Australia's eSafety Commissioner recommends that parents keep talking with teens about the difference between artificial and real relationships, and that schools cover these tools in their educational programs. Industry has been urged to introduce safeguards, but experts see regulation as crucial: comprehensive regulatory frameworks, content controls, and robust age verification are essential steps forward.

(Only the headline of this report may have been reworked by Editorji; the rest of the content is auto-generated from a syndicated feed.)
