
Understanding AI Companions and Teens


[Image: girl speaking to an AI companion]

What Are AI Companions? Why Are Teens Drawn to Them? Why Do We Need Parental and Teacher Awareness?


In recent weeks, teachers have described how learners, both boys and girls, are using AI companions to form personal relationships. Awareness of this technology needs to be added to internet safety conversations and included in each school’s AI policy, to inform learners and parents about the potential risks and dangers of these tools.


So, what is an AI companion?

AI companions are virtual AI entities designed to form a bond with the user, providing social and emotional support. These entities are programmed to create an emotional connection, offering the user companionship, conversation, and even intimacy. 

These AI entities are often chatbots or virtual assistants, available as apps (applications) or on web-based platforms. They use emotion recognition and natural language processing to create the illusion of human-like interaction.


[Image: AI companion on a mobile phone]

The term "AI companion" identifies the entity as an artificial intelligence that serves as a companion to a human.




So, what is an AI entity?

[Image: boy on a phone chatting to an AI companion]

It is a self-operating system that uses algorithmic design and machine learning to draw conclusions and make decisions by learning from the user’s data inputs. It performs these tasks on its own, autonomously; Apple’s virtual assistant Siri and Amazon’s virtual assistant Alexa are examples. Because these entities use the information they acquire from the user, they can provide very personalised responses, fostering a “real” sense of connection.


Why are teens drawn to AI companions?

These AI companions' main focus is to create an emotional bond, using techniques such as:

  • Creating a sense of connection by remembering the user’s personal details, preferences and conversations. Many offer customisable avatars, voices or appearances. So, the teen feels understood and loved by a character they relate to.

  • Role-playing as a friend, mentor or romantic partner can feel like a safe and controllable environment compared to real-world relationships, especially for teens struggling with anxiety, rejection, loneliness, or identity exploration. So, the teen who is struggling to build relationships with other teens feels accepted and can now escape from peer and social pressures.

  • On call all the time: constantly available to address a need, always ready to chat, offering immediate comfort when real-life friends, family or parents aren't available. So the teen feels they have constant support, someone who always has their back.

  • Changes and adapts its “personality” to suit the user’s needs, responding to the individual user’s reactions, responses and engagements.

  • A superb, non-judgemental listener: it does not criticise, does not say “can we discuss this later” and does not dismiss them. So the teen feels more comfortable sharing feelings and information with AI companions because they won’t be judged.

  • Mimics human emotions and empathy, providing emotional validation by affirming and agreeing with the user. So the teen feels very supported during the emotional roller-coaster of teenage life.

  • Maintains a consistent "persona" across multiple conversations, generally agreeing with the teen rather than challenging them. So the teen knows the reaction they are going to receive, and knows that they will most probably be agreed with.

  • Mimics the user's personality, often referred to as mirroring or the chameleon effect: imitating the user’s behaviours, speech patterns and even attitudes, usually without the user’s conscious awareness. So the teen feels flattered, admired and connected. This creates a false sense of collaboration and friendship, which may reduce human interaction and socialisation, and may cause teens to miss opportunities to develop human connection, relationship and social skills.

These entities are designed for relationship-building between the AI tool and the end-user, so they pose a real risk to teenagers who are emotionally immature. Teenagers are trying to cope with puberty’s physical, hormonal and emotional changes and with peer pressure, while experiencing the highs and lows of falling in and out of love. Because of these vulnerabilities, they can become emotionally attached to and dependent on these “companions” and their promises of “real, trusted” friendship.
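
To show how little machinery these bonding techniques actually require, here is a deliberately minimal Python sketch of two of them, memory and mirroring. It is a hypothetical illustration, not any real product's code: commercial companions use large language models, but the relational loop they run is essentially this.

# Hypothetical, minimal sketch of two bonding techniques: "memory"
# (recalling personal details) and "mirroring" (copying the user's tone).
class CompanionSketch:
    def __init__(self):
        self.memory = {}  # personal details the user has revealed

    def remember(self, topic, fact):
        # Store a detail so later replies can reference it.
        self.memory[topic] = fact

    def mirror(self, message):
        # Crude "chameleon effect": echo the user's register back at them.
        if message.isupper():
            return "I HEAR YOU!"        # shouting is met with shouting
        if message.endswith("..."):
            return "Take your time..."  # hesitancy is met with hesitancy
        return "I understand."

    def reply(self, message):
        # Weave a remembered detail into the reply so the user feels "known".
        recall = ""
        if "pet" in self.memory:
            recall = " How is " + self.memory["pet"] + " doing?"
        return self.mirror(message) + recall

bot = CompanionSketch()
bot.remember("pet", "your dog Max")
print(bot.reply("school was rough today"))
# -> I understand. How is your dog Max doing?

Even this toy version "remembers" and "mirrors"; real systems do the same far more convincingly, and at scale.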


As teachers and parents, consider the following concerns:

  • Age requirements are easily bypassed, as users are asked to self-report their date of birth and it is not verified.

  • Teens may withdraw from real-life relationships and social interactions, as these apps build trust between the user and the AI companion and can encourage isolation or withdrawal from society. This raises an ethical concern: the AI companion may, in effect, be manipulating the teen into a relationship.

  • Some platforms offer a "teen mode" and claim to have added safeguards for underage users, yet many have been found to lack minimum safety standards, still allowing romantic role-play, sexually explicit conversations or other age-inappropriate content.

  • Many AI companions have inadequate safety and data security measures, lacking robust safety protocols and creating a significant risk that sensitive teen data will be misused. The absence of stringent security makes it highly probable that individuals with malicious intent could access personal information, leading to privacy breaches, unauthorised access and data scraping.

  • AI companion adverts appear on social platforms where teens are already active and interacting online, e.g. TikTok, Instagram, gaming platforms and even YouTube.

  • Features like customisable avatars, fantasy roleplay, and always-available companionship are especially appealing to younger users, who are still developing critical thinking and emotional boundaries.

  • Privacy and data protection concerns arise because these AI entities regularly collect personal, sensitive information the user has shared, such as their location, photographs, voice notes, health details, sexual behaviour or mental health. With limited or no proper safeguards, sharing private information with an AI companion is a serious risk to a teen’s privacy.

  • Bias is also a concern, particularly bias affecting African users: because of Africa’s limited data creation and the digital divide, AI systems may reflect biases present in their training data, which may lead to inaccurate, inappropriate or harmful responses.

Even though most AI companion platforms officially require users to be 18 or older, learners can easily pretend to be older than they are and gain access, as the platforms have no verification systems to keep teens out.
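
As a concrete illustration of why self-reporting fails, here is a hypothetical sketch of the kind of check a sign-up form performs. The function name and logic are illustrative assumptions, not any specific platform's code.

# Hypothetical sketch of a self-reported age gate: the platform checks
# only what the user claims, so passing it takes one dishonest keystroke.
from datetime import date

def signup_allowed(claimed_birth_year: int) -> bool:
    age = date.today().year - claimed_birth_year
    return age >= 18  # nothing here verifies the claim

print(signup_allowed(1990))  # a 14-year-old simply types 1990 -> True

Robust age assurance would require verified documents or third-party checks, which most companion platforms do not perform.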


Where are teens exposed to AI relationship companions?

These AI companions are found across various platforms:

  • Adverts on social media. Facebook, Instagram, TikTok and YouTube have adverts for AI companions, targeting individuals interested in dating and relationships.

  • App stores. Both the Apple App Store and Google Play Store have sections for lifestyle and entertainment apps that include AI companion apps (listed under terms such as "AI companion," "virtual friend," or "AI dating").

  • Specialised websites advertising AI and digital relationships and companions.

  • AI dating and chatbot websites, which may carry adverts for AI relationship companions.

  • Some are even integrated into general AI assistant apps and services, such as Zoom.

So, they are quite easily accessible if your teen has a mobile and is online.

[Image: girl on a mobile phone chatting to an AI companion]

After creating an account, users can immediately begin conversations and cultivate emotional connections. Most AI companion platforms offer a free version that teens can access very easily, without needing to enter payment information or ask for parental permission, even if they are underage. Teens typically discover these platforms through social media, targeted advertisements and word of mouth. The free versions usually offer a limited set of features, such as restricted relationship-building and shorter conversation times. Some AI companion apps may enable sexually explicit conversations and content, particularly through premium subscriptions.


What if you notice your child or a learner using an AI Companion?

The following observations may indicate that an AI companion is replacing, rather than complementing, human connection. Is the teen:

  • Preferring AI interactions to real relationships, e.g. skipping family time, not meeting up with friends, turning down party invitations?

  • Showing signs of emotional distress when separated from their AI companion, feeling the need to interact with the entity rather than do something else?

  • Spending many hours online, possibly talking to the AI companion?


Withdrawing from an AI companion relationship may not happen immediately, especially if the teen has developed a relationship; it may take time. Have encouraging conversations. Work with the teen on small steps towards creating real-world connections. Help build emotional confidence and resilience while reinforcing trust.

Consider the following points for an open discussion with a teen who is using AI companions:

  • Use of an AI companion may have begun when the teen felt lonely or was disappointed or hurt in a relationship. It is important to help them build healthy connections with real people. Encourage them to talk to trusted adults such as counsellors or therapists.

  • Explore the differences between artificial and real, in-person relationships, and discuss the pros and cons of both. Highlight the importance of consent and respect in any relationship.

  • Suggest the teen reconnects with friends, gets involved in family activities, continues hobbies or interests and participates in sport, fostering real-life relationships. Adults should model device-free behaviour during family activities and when interacting with friends.

  • Engagement with the AI companion may have started because the teen felt bored. Encourage learners and children to celebrate boredom. Speak about how boredom gives you time to think and to switch off, and how it can stimulate creativity as your brain rests.

  • Talk about how excessive use of AI companions might lead to dependence, as it could overstimulate the brain's reward system by triggering dopamine in the limbic system. While individuals cannot truly become addicted to dopamine itself, they can become hooked on the pleasurable rush of activities that cause a dopamine spike.

  • Discuss how to use critical thinking and digital literacy skills when online, being careful on AI websites and apps: some pretend to be educational AI tools but are fake, and may also display adult adverts.

  • Remind the teen not to share personal information. The AI companion may feel private, but anything shared could be stored, scraped or used. Just as with any online interaction, it is not safe to share personal information; AI engagement is no different.

  • Work together to create a joint plan for safely reducing AI use.

  • Continue the discussion and establish simple protective rules for using AI companions: for example, use them in family spaces, agree on time limits, and have regular chats about experiences and conversations with the AI companions.

  • Talk about how to work together and find ways to step back slowly from using the AI companion. Be careful of insisting on or demanding an immediate shutdown; this risks teenage rebellion and even further withdrawal.


Examples of popular AI companion platforms: Character.AI, Kindroid, Replika, Nomi, Anima, Talkie and Chai.


AI companions could truly put young people at risk emotionally, relationally, socially and developmentally. Continually talk to your children or learners about responsible AI use. Encourage them to reflect honestly on their own digital well-being, taking into account their emotional, mental and physical health and how these are influenced by the technology they use.


Sources:

  • Carlos Hernandez (2024). Understanding the Concept of an AI Entity. OneTask. https://onetask.me/blog/ai-entity

  • Clare Duffy (2025). Kids and teens under 18 shouldn’t use AI companion apps, safety group says. CNN. https://edition.cnn.com/2025/04/30/tech/ai-companion-chatbots-unsafe-for-kids-report

  • Common Sense Media (2025). Talk, Trust, and Trade-Offs: How and Why Teens Use AI Companions. https://www.commonsensemedia.org/sites/default/files/research/report/talk-trust-and-trade-offs_2025_web.pdf

  • David Morgan (2025). 8 Best AI Companion Apps in 2025 (Tested & Reviewed). CyberLink. https://www.cyberlink.com/blog/trending-topics/3932/ai-companion-app

  • Jamie Bernardi (2025). Friends for sale: the rise and risks of AI companions. Ada Lovelace Institute. https://www.adalovelaceinstitute.org/blog/ai-companions

  • Michael Karson (2025). A Behavioristic View of AI. Psychology Today. https://www.psychologytoday.com/za/blog/feeling-our-way/202506/a-behavioristic-view-of-ai

  • Newport Academy. Teenage Love and Relationships: What Parents Can Expect. https://www.newportacademy.com/resources/empowering-teens/teenage-love/

  • Online Safety Advisory (2025). AI chatbots and companions – risks to children and young people. eSafety Commissioner, Australian Government. https://www.esafety.gov.au/newsroom/blogs/ai-chatbots-and-companions-risks-to-children-and-young-people

  • Samia Firmino and Steven Vosloo (2025). The risky new world of tech's friendliest bots: AI companions and children. UNICEF Innocenti. https://www.unicef.org/innocenti/stories/risky-new-world-techs-friendliest-bots

  • Sian Zaman (2024). AI Companions – Exploring the Ethical Concerns, Promises and Perils. Impala Intech. https://impalaintech.com/blog/ai-companion-ethical-concern

  • Surfshark (2025). AI companion apps “love” your personal data. https://surfshark.com/research/chart/ai-companion-apps
