When a kindergartener walks into their classroom, the space is built for them. The chairs are small. The crayons are thick. The books are softbound and full of pictures. Slowly, over the years, their world expands—more words, more tools, more responsibility as their brains and bodies are ready for it.
But when it comes to technology in schools, something is off.
We’re handing our youngest students open access to the digital equivalent of an AP chemistry lab: volatile content, complex systems, and doors that lead to strangers, exploitation, and surveillance. And while we apply careful safety protocols to real-world materials, we don’t yet have equivalent safeguards for the devices and platforms our children use every day.
This is the reality Andy Liddell, principal at the EdTech Law Center, laid bare on a recent episode of the Screen Guardians Podcast. His law center represents families in cases where kids have been harmed by school-issued technology—through data mining, unsafe digital access, online exploitation, and broken systems that don’t know how (or when) to intervene.
This isn’t a fear-based narrative. It’s a truth-based one.
And it’s more urgent than most people realize.
A school computer is not automatically safe
We tend to equate “school” with safety. The lunchroom. The nurse’s office. Teachers who care. And so when a district sends home a Chromebook, we assume it’s a learning tool wrapped in digital safety protocols.
But the internet doesn’t work like that.
According to Liddell, many edtech platforms are designed with the same persuasive and profit-focused architecture that shapes the broader internet. That means surveillance, behavior tracking, and data extraction are baked into many of the “learning” tools our children are required to use.
And consent?
In most cases, it’s bypassed completely—often without parents even realizing it.

One device. Unlimited doors.
Some of the stories shared in this episode are heavy and hard to hear.
- A child exposed to sexual predators through Roblox and Discord—accessed via a school device.
- A fifth-grader whose compulsive pornography viewing was triggered by a single search for “Pokémon.”
- A student who used a district-issued laptop to email a trafficker, because she knew the laptop wouldn’t be monitored.
These aren’t just cautionary tales. They’re lived experiences from real families.
And they paint a picture of a system that, despite good intentions, is currently failing to shield children from the very real harms of the digital world.
The illusion of control: surveillance, monitoring, and missed signals
Some districts respond with surveillance software—tools like GoGuardian and Securly that flag problematic activities or searches. But here’s the issue, as Andy explains:
- These systems often produce more false flags than real ones, turning routine literature homework into grounds for criminal suspicion.
- They can violate students’ First and Fourth Amendment rights, particularly around press freedoms and privacy from unreasonable searches.
- Worst of all, they give administrators and families the illusion of safety… while still missing real, life-threatening risks.
Surveillance isn’t the same as safety. And just because something is monitored doesn’t mean it’s protected.
So what do we do?
If you’re reading this feeling overwhelmed, you’re not alone. There is grief here—grief over lost innocence, over misplaced trust, over systems that weren’t built to protect the deepest needs of a child’s brain, heart, or future.
But there is also hope. And it lives in awareness, accountability, and action.
From a legal standpoint, Andy encourages parents to:
- Start asking questions: What edtech platforms does your child’s school use? What data is collected? Who has access to it?
- Reject shame: If your child has been harmed or exposed to inappropriate content, it is not your fault. And it’s not theirs either.
- Seek support: If you’ve experienced a breach of safety, you can contact the EdTech Law Center for free legal guidance or support.
From a parenting perspective, boundaries matter—not out of fear, but out of the recognition that young minds need space to grow without being watched, tracked, or manipulated.
In our home, tech use is shared, not siloed.
Our children don’t have private iPads. When they do use devices, it’s together, with us—talking about what they’re watching and learning. We’ve disabled browsers and apps that open wide access to algorithmic content. We don’t do YouTube. We prioritize shows we’ve hand-selected, games without manipulative in-app purchases, and conversations we’re part of.
We’re not anti-tech. We’re pro-child.
And that means honoring both their curiosity and their need for safety.

Final thoughts: every child deserves digital dignity
Technology, for all its brilliance, should never steal a child’s right to be safe. Or private. Or protected. And it should never come at the cost of their emotional, cognitive, or physical well-being—especially in the places designed to help them grow.
As Andy said, “None of these kids are doing anything wrong. They’re just curious. If their environment is feeding them harmful things, that’s an environmental failure, not a child failure.”
Let’s take a hard, humble look at the systems we’ve adopted. Let’s listen to the stories of harm. And let’s build better systems.
Because childhood deserves more than default settings—and safety should never be optional.
