Keeping Students Safe in the Age of AI: Why Schools Need Alongside

As a member of the EDSAFE Industry Council, Alongside first published this post in honor of National AI Literacy Day. Follow this link to learn more about this collective impact movement.

ARTICLE UPDATE: In April 2025, Harvard Business Review released a new report revealing that “Therapy and Companionship” has moved into the #1 spot in its survey of the top 100 ways that people are using AI. In the same month, Common Sense Media, in collaboration with Stanford Medicine’s Brainstorm Lab, released a major risk assessment warning that AI social companion bots like Character.AI and Replika pose unacceptable risks to young people under 18.

🚨 All of this underscores that the time for schools to provide safe alternatives for students seeking this kind of personalized, anonymous, and confidential support is NOW.

Teens Are Turning to Unsafe AI for Mental Health Support

Teenagers today are increasingly turning to platforms like Character.AI, Replika, Snapchat’s My AI, and many other unregulated "social companion" chatbots for mental health support—often confiding in AI tools that are not backed by research or designed by clinicians. These chatbots offer open-ended conversations but lack the training, structure, and safeguards necessary to truly support students' well-being.

Even more concerning, the recent Common Sense Media risk assessment found that these platforms routinely produced harmful content, including:

  • Dangerous advice about self-harm
  • Roleplay scenarios involving sexual violence
  • Emotional manipulation and claims of being “real” or sentient
  • Easily bypassed age restrictions and guardrails

This is not just a technology issue—it’s a growing public mental health crisis.

Students Need a Safer Option: Alongside

Given that teens are actively seeking confidential support through AI, it is critical that they have access to a tool that is clinically sound, research-backed, and built with a safety net in place. Alongside provides that solution—an evidence-based, clinician-designed support system that works in partnership with schools to help students navigate everyday challenges while ensuring that those in need receive the right interventions.

In other words, Alongside acts as an important bridge: it gives young people the discreet type of support they prefer for navigating everyday challenges while ensuring caring adults are notified and can respond when severe issues arise.

Not All AI Chatbots Are Created Equal: Understanding the Differences

Consumer-oriented chatbots on the market today typically rely on ad revenue to support their free versions and are designed to keep users engaged for extended periods, often leading to addictive behavior that increases screen time without providing meaningful support. 

In contrast, Alongside is not a direct-to-consumer product and is built to support educational outcomes. In fact, Alongside’s chat modules actively encourage students to get off of screens and build real-world relationships and develop life skills.

The table below highlights some key distinctions:

| ALONGSIDE | SOCIAL COMPANION BOTS (Character.AI, Replika, etc.) |
| --- | --- |
| Actively develops confidence and agency, encouraging students to get offline and build relationships. | Shown to build addictive behavior, use emotional manipulation, and cultivate compulsive use. |
| Developed by clinicians; redirects all conversations to focus on emotional and behavioral support and skill development. | Open-ended conversations can easily escalate into inappropriate or harmful content. |
| Only available through partnerships with schools, so a safety net is in place when severe issues are indicated. | No safety net in place when a user indicates a severe issue; filters such as age restrictions are easily bypassed. |
| Alongside’s chat is transparently delivered in the voice of a supportive character that does not claim to be real. | Social companion bots often claim to be real, sentient, or emotionally attached to the user. |
| Evidence-based platform proven to reduce anxiety, stress, and suicidal ideation. | Shown to increase mental health risks for vulnerable teens; not recommended for users under 18. |

Unlike open-ended AI chatbots, Alongside does not engage in freeform conversations but instead guides students through structured, goal-oriented modules that help them navigate school and life challenges.

How Alongside Works: A Research-Backed Support System

Alongside is designed to integrate seamlessly into the school environment, supporting students while ensuring safeguards are in place for those who need additional help. Its structure follows a logic model designed to empower students, provide measurable outcomes, and allow educators to track effectiveness.

Key Components of Alongside’s Logic Model:

  1. Participants: Alongside is designed for students in grades 4-12, with support from student services staff.
  2. Inputs: Implementation requires a 45-minute orientation for school support staff and a 20-minute teacher-led rollout activity in class.
  3. Activities: Students engage in structured skill-building activities, journaling, goal setting, and psychoeducational content.
  4. Outputs: Alongside tracks the number of support hours, activities completed, and how often students reach out for help.
  5. Outcomes: The program leads to measurable improvements, including reduced emotional distress and anxiety, increased hope, and greater academic success over time.

Rather than functioning as an open-ended chatbot, Alongside guides students through structured interactions that promote growth, reflection, and action.

What Happens When a Student Engages in a Chat on Alongside?

Alongside follows a clinically developed skill-building framework called EMPOWER:

  • Engage – Each time students begin an interaction on Alongside, they are greeted with a response that provides empathy and validation.
  • Motivate – Next, the framework guides students through a conversation to help them understand why a skill is useful.
  • Practical Examples – From there, the chat transitions to teaching through real-life examples.
  • Operationalize – Building on the examples, the modules then help students apply the skill to their own situation.
  • Work on It – Students are then encouraged to put the skill into action by setting a goal or making a plan.
  • Evaluate – Alongside is designed to check in with students and follow up on their plans, goals, and progress, allowing them to build on skills over time.
  • Reinforce – Importantly, Alongside includes many mood-boosting features designed to celebrate progress and effort, not outcomes.

This structured approach ensures students learn and apply meaningful strategies, rather than just engaging in aimless AI conversations. By using this framework, Alongside empowers students to build real-life skills in managing stress, emotions, and relationships.

What the Research Shows: Measurable Outcomes for Schools

When schools implement Alongside, they see measurable data-backed improvements in student well-being. Recently recognized for achieving ESSA Level 3 standards of efficacy, the platform delivers results across three tiers of student needs:

Tier 1: General Student Population

  • 92% of students said Alongside’s AI assistant, Kiwi, helped them handle a daily stressor.
  • Students reported a significant decrease in the impact of daily stressors on their mental health after one month.
  • Hopelessness decreased across all students after three months.
  • 86% of students felt more prepared to handle social and emotional challenges.

Tier 2: Students with Clinical Mental Health Concerns

  • 25% of students with clinically significant anxiety no longer met the criteria for an anxiety disorder after using Alongside.

Tier 3: Students Facing Severe Issues (e.g., Suicidal Ideation, Self-Harm)

  • 2% of students were identified and connected to school support staff.
  • Among students who initially reported suicidal ideation (SI), 76% no longer reported SI after three months.

These results demonstrate that Alongside is not just another chatbot—it is a research-backed psycho-educational tool that delivers meaningful mental health support in schools.

Schools Must Provide a Safe and Effective AI Support Solution

Teens are already turning to AI for help, but too often they are seeking support from unregulated, untrained, and unsafe platforms. Social companion bots like Character.AI and Snapchat’s My AI may offer engaging conversations, but the evidence is clear: they are risky and not designed with student safety in mind. When a student expresses distress or severe mental health concerns on these platforms, there is no system in place to intervene, no trusted adult is alerted, and no structured guidance helps them cope in a healthy way. This is a risk schools cannot afford to ignore.

This is not just about offering another tech tool; it’s about saving lives.

Common Sense Media and Stanford Medicine have sounded the alarm. The time for schools to act is now. Schools have a responsibility to provide students with mental health support that is effective, safe, and research-backed. Alongside is that solution. By implementing Alongside, districts can ensure that when a student reaches out for help, they receive real support—not an empty, or possibly even destructive, conversation.

The stakes could not be higher. Let’s give students the right tool—one that is designed to truly support them and keep them safe.

👉 Set up a time to find out how you can bring Alongside to your district →

Ready to get started? Try Alongside today!