People & Culture
MISSING: The Future of Safeguarding Children Online
QUICK SUMMARY
Discover how emerging technologies, policies, and parental strategies are shaping the future of keeping children safe online.
There is no fictional monster in this story that an unlikely hero can easily expel, nor is it as simple as a well-produced documentary with gripping scenes and insufferable characters. The reality is tragic: harm of any kind inflicted on a child is an irreversible product of predatory behavior.
Children are among the most vulnerable members of our population. For parents and guardians, it can be deeply unsettling to know you cannot always protect your child from harm. For law enforcement and information security experts, keeping children safe online is a constant challenge, and rooting out the dangers that target children online can feel like a never-ending struggle.
Cases of child exploitation, cyberbullying, and child sex trafficking have risen far more sharply than most people realize. In 2025, the National Center for Missing & Exploited Children reported that financial sextortion cases involving minors rose from 13,842 to 23,593, and that reports of child sex trafficking climbed from 5,976 to a shocking 62,891.
Almost a year ago, Senior Advisor Heather C. Panton joined The Cyber Guild and hosted a talk with guests U.S. Secret Service Sr. Special Agent Wendy Cassidy and FBI Unit Chief Jessica L. Diggons. Child exploitation remains a serious problem, one made worse by threats that stay hidden online.
DARK TRAIL
Dismantling a network of predators starts with understanding the known threats. We first recognize that these situations, given the severity of the danger to a minor, can create intense tension within a family and even disrupt school environments.
So where does this dark trail start?
Remember that the emotional and physical harm a child may face online can be serious; it is important to seek help early.
Three terms (or “characterizations”) come up most often in discussions of child exploitation. And while digital engagement masks the monsters, here we do not shy away from labeling them for exactly what they are:
- Predator: an individual who targets children and teens online for sexual exploitation, using digital platforms to initiate harmful contact that may lead to abuse either online or offline.
- Sextortion: (broken into three categories: traditional, financial, and sadistic) when a predator threatens or blackmails a victim into providing sexually explicit content, then threatens to share that content publicly.
- Nudifying Services: online sites that let a predator upload an innocent, clothed image of a person, which the site then uses to generate a fabricated depiction of that person unclothed.
Put plainly, a predator exploits vulnerabilities to obtain explicit images or videos from a minor. These materials are then used to manipulate or extort the child further.
Crimes against children have become increasingly complex, as predators do not need physical contact to harm a child. They prey on a child’s emotional vulnerability through what’s known as grooming. This process of building trust with the intent to harm a minor starts with offering attention and support. During that time, the groomer collects personal information and gradually introduces sexual content to desensitize their victim.
How can a parent or guardian keep a close eye on every post, comment, private message, text, or piece of shared content from their child? The goal is not to check every online exchange, and we know that spotting “unusual behavior” can be hard.
What happens when the threat slips into a new guise?
Child sexual exploitation and abuse, in some of its darkest forms, has been shown to be instigated by predators who may be family friends, family members, familiar acquaintances of the child, or strangers. The material a predator collects does not always have to be a photo containing nudity. Investigators and law enforcement often find that cybercriminals targeting children gradually progress, once trust has grown, from asking the victim for revealing photos to demanding more explicit material over time.
Predators are using AI-generated images and deepfakes to manipulate children online, which makes it difficult for cybersecurity experts to trace the source. Some platforms are deploying behavior analysis tools to assist with detection, but the challenge remains.
Requests to Connect
Forming relationships among humans involves mutual sharing, emotional connection, and trust built over time. Without proper oversight from responsible parents and adequate online safety measures, companion chatbots can disrupt this developmental process for children. Children and teens risk developing an abnormal social dependency on an application that is constantly available and excessively affirming of certain behaviors, and chatbots can accelerate attachment to a non-human entity in unnatural ways. Decision-making, sound judgment, and emotional regulation can all become distorted.
Despite this, technological innovation does not have to come at the expense of user safety, especially for those in need of stewardship.
A team at Virginia Tech is researching how intelligent conversational agents can help teens build resilience against cyber predators. This is a case where AI can serve as a tool to help protect against cybergrooming: the objective is to enable agents to provide interactive coaching that helps teens detect potentially exploitative interactions and reduce their vulnerability. Underscoring the importance and value of this work, the project is designed with collaboration between researchers, parents, teens, and professors at its core.
Supporting research & innovation like the team at Virginia Tech, is a step in the right direction.
NEW AGE HEROES
Online manipulation and abuse of children is a grotesque act. Placing cybersecurity experts on the front lines has clear advantages: greater chances to detect threats and dismantle the tools abusers use online. These new-age heroes work alongside law enforcement to identify predators and victims, build child-safe algorithms, analyze seized devices, and work tirelessly, by all means possible, to protect children.
Striking the right balance between innovation and child safety is a priority. Where else does cybersecurity fit into all of this? Social cybersecurity has emerged as an increasingly familiar term that encompasses behavioral science, and protecting the most vulnerable online is just as much a social challenge as a technical one. Work in social cybersecurity focuses on protecting people, not just systems, from harmful online behaviors: an all-encompassing effort across social media, influence operations, and misinformation.
Cybersecurity experts with a specialized focus on child protection can position themselves in service of reinforcing legislation and minimizing unsafe interactions for children online. Let’s think in terms of secure by design. Instead of looking at this issue through the lens of hoping for positive psychological recovery or stopping once a perpetrator is detained, consider that security involves active participation on all fronts: parents, school administrators, local law enforcement, extracurricular activity coaches, supervisors, neighborhood families, and more. Securing an environment in which children, throughout all stages of their social development, can learn, thrive, and experience autonomy safely is a ‘full force’ effort.
From a legal standpoint, legislation is changing across states to help protect children from the risks posed by artificial intelligence. California, for instance, introduced AB 1064, otherwise known as the Leading Ethical AI Development (LEAD) for Kids Act, in 2025. The legislation targets what are known as companion chatbots, restricting their use entirely unless safety guardrails are integrated to prevent harmful behaviors such as promoting self-harm, substance abuse, and sexually explicit interactions.
Other efforts have looked more closely at closing gaps in data and privacy protection for children using certain applications. The data collected on minors through application usage also requires layers of protection.
According to Mayer Brown, one added layer of protection can look like:
“Updating the definition of “personal information” to reflect modern data practices by including biometric identifiers—unique biological traits that can be used to automatically or semi-automatically recognize individuals. These traits include fingerprints, handprints, retinal and iris patterns, DNA sequences, voiceprints, walking patterns, facial structures, and faceprints.”
Don’t dodge the conversation…
Blunt conversations about this issue can at times serve as the best preventative measure, one that places a direct barrier between a minor and a predator seeking to lure them into a dangerous relationship. The conversation we’re referring to allows a trusted adult to understand who has access to their child. In this dialogue, parents and guardians have the opportunity to learn more about the exact nature of the interactions their child is having with another person, and yes, to ask the classic “how did that make you feel?” question. We acknowledge that a sensitive conversation like that isn’t always immediately welcomed by a child, but that doesn’t mean these conversations shouldn’t be happening.
Without a doubt, balancing parental responsibilities, full-time jobs, and carved-out quality time with family can make it difficult to pay attention to who a child is interacting with, and not every scenario can be supervised. However, that doesn’t mean the problem should be left to a more specialized adult to solve. Showing up as an ethically grounded adult in society is the minimum of what’s needed to keep children safe. The future of combating this issue depends on the proactive measures taken by all.
Thank you to our Advocate, Thomson Reuters Special Services (TRSS) (LINK TO WEBSITE), whose support helps us continue critical conversations about protecting vulnerable populations online.
Resources and Help Lines:
Cyber Tipline – to report child sexual exploitation
Take it Down – to help remove online nude, partially exposed, or sexually explicit photos and videos of minors.
DHS Resource and Fact Sheet – to learn what actions the department is taking to combat child exploitation and protect victims.
Survivor Resources – for parents and kids; learn how to talk to young victims of CSEA and find other available resources.
Internet Crimes Against Children Task Force Program (ICAC)
US Department of Justice – Keeping Children Safe Online; resources for parents and more information about the Child Exploitation and Obscenity Section (CEOS). CEOS is a critical function in the enforcement of the nation’s laws protecting children from sexual exploitation and prohibiting the distribution of obscenity.