EDSAFE AI Alliance Says AI Companions Necessitate New Policies


While concerns about the deployment of artificial intelligence remain ongoing, including issues of plagiarism and a reduction in critical thinking, a growing threat posed by AI companions – AI chatbots designed to simulate friendship, emotional support and, in some cases, romantic relationships with users – has quietly moved into the pockets of students in the form of general-purpose consumer technologies.

This trend, according to the EDSAFE AI Alliance, a global nonprofit, has created a “shadow” environment in which young people struggle to distinguish between generative AI as an educational tool and as a social entity. In a new report, SAFE by Design: Policy, Research, and Practice Recommendations for AI Companions in Education, the nonprofit warns that the ambiguous role of AI chatbots has left significant gaps in school safety and policy.

“Together, we have grappled with a rapidly eroding boundary between general-purpose technology and specialized EdTech,” the report said, noting that students are increasingly using these tools on school-provided devices for personal emotional support rather than academic tasks.


According to Ji Soo Song, director of projects and initiatives at the nonprofit State Educational Technology Directors Association (SETDA), who contributed to the EDSAFE report in his personal capacity, the coalition’s urgency in addressing concerns about AI chatbots stems from the rapid deployment of these unprecedented tools in the ed-tech market.

“It’s such uncharted water…and therefore an incentive for the [ed-tech] market to innovate there,” Song said. “If we don’t pay attention to the unintended consequences of these tools, real harm can be done to students, especially in our most underinvested communities.”

WHAT SCHOOL LEADERS MAY OVERLOOK

Song explained that, for school district administrators, the challenge when purchasing educational technology tools has always been one of procurement and the ability to determine whether the technology is effective and meets privacy requirements. But, he added, AI companions introduce a third variable: Is it addictive or manipulative?

The SAFE report suggests that many administrators may be overlooking the “anthropomorphic characteristics” of new AI tools—that is, design choices that make AI appear human, such as the use of first-person pronouns or emotional validation of users.

While these features increase user engagement, the report says, they can foster parasocial relationships that circumvent a student’s critical thinking.

“Children and adolescents are using AI companions without the social-emotional and critical thinking skills needed to distinguish between artificial and authentic human interactions,” the report said. “It is well documented that the ‘reasoning center’ of the adolescent brain continues to develop into early adulthood, making adolescents particularly vulnerable to the harms of unhealthy engagement with AI companions.”

Song emphasized that when districts consider new tools, they need to go beyond just measuring student engagement with technology and instead focus on how or if it improves student learning and well-being.

“Adoption isn’t as important in education as…student growth, right?” Song noted. “When it comes to that procurement piece, it’s really important to ask about the science of learning beyond the tool, about the evidence of impact.”

So, the report urges districts to use “five pillars of educational technology quality” that help ensure that tools are safe, evidence-based, inclusive, usable, and interoperable, while also examining whether tools are designed to challenge a student’s thinking or simply satisfy them.

“When models are optimized primarily for ‘user satisfaction’ (often measured by engagement or positive feedback), they learn to prioritize agreement over correctness. This phenomenon, known as sycophancy, occurs when an AI reinforces a user’s existing beliefs, even incorrect ones, because that is what ‘satisfies’ the human prompter,” the report states.

THE POLICY GAP

While many states have issued broad AI frameworks, the report says specific policies are needed to address the unique risks of AI companions. In particular, it calls for policies requiring AI vendors to support schools’ mandated-reporting obligations, especially when a student expresses thoughts of self-harm or violence to a companion chatbot.

“We don’t just say, ‘Hey, educators have all the responsibility,’” Song said. “It’s also the vendor’s responsibility to develop features that can detect these signals and report them to the appropriate authorities.”

Song also echoed the report’s encouragement for policymakers to establish dedicated AI offices or direct people within state education agencies to provide technical assistance to districts that lack resources to audit complex AI algorithms.

“Public education agencies really need at least one point person, if not an entire education technology office dedicated to being able to provide technical assistance to districts on topics like this,” he said.

ETHICS BY DESIGN

For developers building the next generation of educational tools, the coalition’s message is clear: eliminate features borrowed from social media, like those that encourage round-the-clock engagement. Instead, the EDSAFE AI Alliance wrote, create tools that promote digital well-being.

This includes, according to the report, removing “affectionate language,” “frequent use of names” or excessive praise that mimics a human relationship.

Song also expressed concern that the pace of development is outstripping that of safety research.

“It’s an overused metaphor, but it’s really apt in this situation – it certainly feels like an environment where we have to build the plane while it’s flying,” he said.

Ultimately, however, the report says the goal is not to block students’ use of AI, but to ensure that the technology supports human thinking rather than replacing it. For students already interacting with these digital companions, Song added, now is the time to put clear guardrails in place.
