Children First Canada’s foremost concern is, and always has been, the safety and dignity of children and the protection of their rights.
Recent reports about Grok being used to generate child sexual abuse material (CSAM) underscore a broader and deeply troubling reality: many online platforms and AI-enabled tools are failing to protect
children. This is not an issue confined to a single company or platform. Instead, it reflects systemic weaknesses across the digital ecosystem, where the speed of innovation has far outpaced safeguards and accountability, putting children at considerable risk of exploitation.
At Children First Canada, we made the decision to exit X based on growing concerns about the platform’s direction, governance, and ability to meaningfully address risks to children and youth. That decision was taken before these most recent revelations, and it reflected our assessment that the platform is not aligned with our values or our responsibility to model safe digital practices.
At the same time, we recognize that no platform is without risk. Meta, Google, TikTok, encrypted messaging apps, and emerging AI tools all present serious and well-documented harms to children, from grooming and exploitation to algorithmic amplification of harmful content and AI-enabled abuse. Charitable organizations like ours, along with governments, public institutions, and elected officials, currently use a wide range of these tools to communicate with the public.
The harms associated with Grok and X are serious and unacceptable, but rather than shifting blame between platforms, this moment underscores the urgent need for consistent, system-wide accountability.
Children don’t need platform-by-platform crisis management; they need clear, enforceable rules that apply across the entire digital ecosystem. That’s why Children First Canada launched the Countdown for Kids campaign, calling for a strong, child-centred Online Safety Act with a clear duty of care, real oversight, transparency, and safeguards that stop harm before it happens. The countdown began on November 20, 2025, and gave the government 40 days to act. That deadline has passed. While important Criminal Code reforms were introduced, the broader online safety legislation that children need is still missing. As of January 5, 2026, 1,590 days have passed since the government pledged to act, and children are still waiting.
We are urging the federal government to:
- Take emerging AI-enabled risks to children seriously and act proactively, not reactively;
- Reassess its own digital practices through a child-safety lens;
- Move urgently to put in place laws that hold all platforms accountable, regardless of size, popularity, or political influence.
Children cannot opt out of the digital world that we, as adults, built around them. It is the responsibility of governments, technology companies, and civil society to make that world safe for them.
Children First Canada will continue to advocate for solutions that put children first, across all platforms, and we urge swift action to ensure that no digital space permits the abuse, exploitation, or harm of our kids.
Quotes attributable to Sara Austin, Founder and CEO, Children First Canada:
“This is not about one platform. It’s about a digital ecosystem that is failing children. When AI tools can be used to generate sexual abuse material of kids, it’s a flashing red warning sign that safeguards are not keeping pace with technology. Children First Canada stepped away from X because we had growing concerns about platform accountability, but the reality is that no major platform is immune. What children need now are clear, enforceable rules that apply consistently. An Online Safety Act for Canada’s kids is no longer optional; it’s urgent.”
“As a parent and as a child-rights advocate, I find it deeply disturbing that children can be harmed by AI and online platforms with so few guardrails in place, let alone repercussions when harm occurs. We exited X because it no longer aligned with our values or our responsibility to model safe digital practices, but this issue goes far beyond any single company. Children cannot opt out of the digital world that adults have built. It’s our job to make it safe.”
“Governments should not be forced into platform-by-platform crisis management every time a new risk emerges. The fact that AI tools can be misused to generate child sexual abuse material shows why Canada urgently needs a comprehensive Online Safety Act. Without a clear duty of care and independent oversight, platforms will continue to move faster than the protections children deserve.”