When I first started writing about online safety back in the days of AOL and dial-up, the biggest concerns were porn and “stranger danger.” Today, the threats and opportunities facing young people in the digital world are far more complex, and our response needs to be as well.
A new report, Frontiers in Digital Child Safety, provides a thoughtful and much-needed framework for reimagining what it means to protect kids online. Produced by a global working group of leading researchers, advocates and technologists from universities, nonprofits, intergovernmental organizations and businesses, the report challenges us to stop thinking of online safety as something that can be imposed from above and to start designing systems that put children’s rights, agency and well-being at the center. I was honored to be on that working group, representing ConnectSafely. The research was funded by a grant from Apple to the Technical University of Munich in collaboration with Harvard’s Berkman Klein Center and the University of Zurich.
Collaborative empowerment
Too often, safety features are designed around adults’ assumptions rather than children’s lived experiences, sometimes in the wake of one or more tragedies. The Frontiers report flips that script, urging companies and policymakers to treat child safety not as an afterthought or a set of restrictions but as a creative, proactive design challenge.
One key idea is “collaborative empowerment.” Instead of defaulting to parental control settings that lock kids out, the report argues that “fostering trust calls for a more nuanced, child-centered approach,” advocating for tools that help families have conversations and make shared decisions. That might mean configuring safety settings together or gradually giving teens more autonomy as they develop digital maturity. Not only does this foster trust, but it also reinforces and respects young people’s digital skills.
Having said this, there is certainly a place for parental management or “control” tools for young children and perhaps young teens. Parents should discuss the use of these tools with their children and have a plan to wean them off as they mature, as ConnectSafely advocates in its Family Guide to Parental Controls.
The report aligns with what I’ve heard from teens, including ConnectSafely’s Youth Advisory Council. They want adults to support them, not control them. They are far more likely to use a tool if they understand how it works and feel as though they had a voice in setting it up.
Fear-based safety
The report also challenges a long-standing safety tactic: the warning label. Whether it’s a pop-up about screen time or a parental block on a website, well-meaning interventions can sometimes backfire. One section discusses the “forbidden fruit” effect, arguing that “warnings that frame content as off-limits can pique a child’s curiosity, potentially leading to increased interest in the restricted material.”
Instead of heavy-handed restrictions, the report recommends positive reinforcement, nudges and emotionally intelligent design. Think of a friendly prompt that helps a teen pause before posting something hurtful, or a gentle reminder that suggests taking a break after a long scroll session. These are small tweaks with big potential impact, and they respect a young person’s autonomy.
Possibilities of AI
Of course, no conversation about child safety in 2025 would be complete without talking about AI. Generative AI brings exciting possibilities including customized learning, creative storytelling and powerful research tools. It also brings new risks, such as mistakes and hallucinations, fake content, inappropriate images or bots that simulate human connection a little too well. The report acknowledges that generative AI complicates everything and that safety systems need to adapt accordingly. But rather than proposing sweeping bans or one-size-fits-all solutions, it calls for context-aware, privacy-preserving tools that support kids without spying on them or limiting their potential.
Despite the risks, the report correctly points out that “AI can offer powerful tools to support children’s learning, health, and creativity when designed and deployed responsibly,” as well as detect and respond to harmful content and “support early identification of mental health challenges or learning needs.”
Peers and parents
The report acknowledges that “children might confide in friends first before approaching adults or using technology-based tools,” but also “want their parents/caregivers and educators to be well-informed about online safety so they can offer guidance and support.” The familiar phrase “it takes a village” rings especially true when it comes to supporting children and teens online.
We can support young people with anonymous reporting tools, stronger peer mentoring programs and features that make it easier for a teen to say, “I’m worried about my friend.” Adults still play an essential role, but tech can and should recognize the social dynamics of how teens actually navigate challenges.
A shift in mindset
The report calls for a shift in mindset, which leads to important strategies such as on-device, privacy-preserving interventions; support systems that encourage help-seeking from peers and trusted adults; and educational approaches paired with inclusive, child-friendly design. Above all, the report underscores that digital safety should be a shared responsibility, one that balances protection with agency and actively involves children in shaping the tools meant to support them.
The Frontiers in Digital Child Safety report reminds us that young people aren’t just passive users of technology. They are active participants with insights, needs and rights. If we truly want to protect and empower them, we must co-create the digital world with them, not just for them. That’s not just good design, it’s the right thing to do.
Larry Magid is a tech journalist and internet safety activist. Contact him at [email protected].