
For all their benefits, always-on, connected devices and networks have created channels for predators to exploit children and distribute explicit material. Nearly every week, the media reports on yet another tragic victim of child sexual exploitation (CSE) and abuse.

The sheer number of photos and videos seized and reviewed annually is staggering. In 2022, the CyberTipline of the National Center for Missing & Exploited Children (NCMEC) received 32,059,029 reports, and NCMEC analyzed 88.3 million images, videos and other files related to CSE – a number that continues to grow each year, challenging law enforcement agencies around the globe.

“When I started in forensics, the majority of devices we seized were desktops and laptops,” said Randy Kyburz, Certified Digital Forensic Examiner with the Seattle Police Internet Crimes Against Children Unit.

“When we did have cellphones to examine, they were largely ‘dumb.’ Years ago, we’d walk out of a crime scene with maybe one of each. Today, we often collect 30+ devices at a scene, with smartphones making up about 40 percent of the devices recovered.”

Traditional digital forensic workflows, combined with sentencing guidelines and the sheer volume of offenders, have created an epidemic in which child victims often go undetected and undiscovered, and the crimes committed against them are never investigated. This failure of detection enables continued access to, and abuse of, these children.

An urgent and global call of duty

Thanks to ubiquitous connectivity, offenders have virtually unlimited access to unsuspecting children and lurid content. The technology available today can enable even a single person to facilitate child exploitation on a much larger scale.

For example, Eric Marques leveraged the anonymity of the Dark Web to operate a hosting service on Tor that included 200 child exploitation websites that brought together hundreds of thousands of offenders from across the world and housed millions of images of child exploitation material, including images of infants and toddlers. The investigation that led to his arrest involved 70 law enforcement agents from over a dozen countries.

When millions of images of child sexual abuse material (CSAM) are seized by law enforcement, many of those photos or videos are destined to be left on devices, in the cloud or in evidence lockers. There is a dire need for a reliable way to extract, parse and identify known and unknown victims. Thus, it is crucial for law enforcement to adopt digital technologies that enable them to unlock, access, and analyze data quickly and defensibly.

Optimizing shared resources and workflows

The goal remains steadfast. Identify and save more exploited children – quickly.

Unique machine learning algorithms can help agencies accelerate time to evidence. The power of an intelligent investigative analytics solution lies not only in its ability to correlate and review actionable insights across all data sources, but also in helping investigators quickly find evidence when they may not know what they are looking for – e.g., what people are talking about, the languages they are using, the locations they have frequented.

Imagine having a solution that can deliver both critical extraction and analysis capabilities at the scene and more in-depth investigational analysis in the lab. An advanced AI-powered solution will enable the following:

Accelerate time to evidence with advanced machine learning

Analytics and CSAM image categorization automatically identify images and videos – obtained through a forensic process and suspected of containing CSE-related material – using neural-network-based machine learning algorithms.

Filter, categorize and export undiscovered media artifacts

Investigators can filter images based on categories such as face, nudity, suspected child exploitation, weapons and drugs so they only see images that match specific search criteria. New media artifacts can then be quickly tagged, categorized and fed into relevant databases.
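In its simplest form, this kind of category filtering is a set-intersection check over per-file labels. The sketch below illustrates the idea; the record layout and category names are illustrative stand-ins, not any vendor's actual schema.

```python
# Illustrative sketch: filtering seized media records by detected-category
# labels so investigators only see files matching their search criteria.
# Record shape and category names are hypothetical.

def filter_media(records, wanted):
    """Return only records whose detected categories intersect `wanted`."""
    wanted = set(wanted)
    return [r for r in records if set(r["categories"]) & wanted]

records = [
    {"file": "IMG_0001.jpg", "categories": ["face"]},
    {"file": "IMG_0002.jpg", "categories": ["weapons", "drugs"]},
    {"file": "IMG_0003.jpg", "categories": ["face", "nudity"]},
]

hits = filter_media(records, {"weapons", "nudity"})
print([r["file"] for r in hits])  # ['IMG_0002.jpg', 'IMG_0003.jpg']
```

Matching records can then be tagged and exported in the same pass, which is what feeds new artifacts into the relevant databases.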

Quickly identify and crossmatch victims with facial detection

Unique algorithms automatically detect faces within any picture or video available to the system, allowing investigators to immediately and accurately crossmatch individual faces and quickly surface additional pictures of the same victim.
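A common way such crossmatching works is to reduce each detected face to an embedding vector and compare vectors by cosine similarity. The sketch below assumes embeddings already exist; the vectors, IDs and the 0.9 threshold are made-up placeholders, since real systems derive embeddings from a trained neural network.

```python
import math

# Illustrative sketch: crossmatching faces by cosine similarity between
# embedding vectors. Vectors and threshold are hypothetical demo values.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def crossmatch(query, gallery, threshold=0.9):
    """Return IDs of gallery faces whose embedding is close to the query."""
    return [fid for fid, emb in gallery.items() if cosine(query, emb) >= threshold]

gallery = {
    "victim_A_photo1": [0.12, 0.80, 0.58],
    "unrelated_photo": [0.95, 0.10, 0.05],
}
query = [0.10, 0.82, 0.56]
print(crossmatch(query, gallery))  # ['victim_A_photo1']
```

Every new match of the same face expands the set of images linked to one victim, which is what makes identification faster as the case grows.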

Analyze conversations for potential luring or abuse

Natural language processing goes beyond regex and simple watch lists to uncover names, addresses, locations and more – in multiple languages – from artifacts such as emails, websites, text messages, and even images that contain text (via OCR).
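For context, the regex-and-watch-list baseline that NLP improves on looks roughly like the sketch below: fixed patterns catch structured identifiers such as email addresses and phone numbers, but miss names, paraphrase and other-language text. The patterns and sample message are illustrative.

```python
import re

# Minimal sketch of the regex baseline that NLP goes beyond. Fixed patterns
# extract structured identifiers; anything less rigid slips through.
# Patterns and sample text are illustrative, not a vendor's rule set.

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def extract(text):
    """Run every pattern over the text and collect matches by label."""
    return {label: pat.findall(text) for label, pat in PATTERNS.items()}

msg = "Reach me at john.doe@example.com or 555-867-5309 after school."
found = extract(msg)
print(found)
```

An NLP-based pipeline replaces the fixed patterns with language models that can recognize names and places it has never been given a rule for, in multiple languages.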

Leverage public domain cloud data to correlate evidence

Visualize and analyze publicly available data from supported social media and cloud-based sources in a unified format to track behavior, uncover common connections and correlate critical evidence that can help build a stronger case.
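At its core, "uncovering common connections" means intersecting identifiers across sources: a handle that appears in more than one account is a lead worth correlating. The sketch below shows the idea with made-up source names and handles.

```python
# Illustrative sketch: surfacing common connections by intersecting contact
# identifiers across data sources. Source names and handles are fictional.

def common_connections(sources):
    """Return identifiers that appear in more than one source."""
    ids = [set(s) for s in sources.values()]
    common = set()
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            common |= a & b
    return common

sources = {
    "social_app": {"@hawk99", "@redfox", "@blue_jay"},
    "chat_backup": {"@redfox", "@sparrow"},
    "cloud_drive_shares": {"@blue_jay", "@redfox"},
}
print(sorted(common_connections(sources)))  # ['@blue_jay', '@redfox']
```

Presenting all sources in one unified format is what makes this kind of intersection possible in the first place.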

Seamless integration with Project VIC, CAID and other hash databases

Known incriminating images are automatically identified by matching image hash values and then classified using pre-defined CSE severity categories. Previously unknown images that are discovered can also be categorized, tagged and exported seamlessly back to the Project VIC and CAID databases.
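The matching step itself is a hash-set lookup: hash the file, then check the digest against the database of known hashes and return its pre-assigned category. The sketch below uses SHA-256 and placeholder category labels; the actual hash formats and severity categories are defined by programs like Project VIC and CAID, not by this demo.

```python
import hashlib

# Illustrative sketch of hash-set matching against a known-image database.
# Hash values and the "category_1" label are placeholders; real hash sets
# and severity categories come from Project VIC / CAID standards.

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Seed a demo database mapping known-file hashes to severity categories.
KNOWN_HASHES = {sha256_of(b"known-image-bytes"): "category_1"}

def classify(data: bytes):
    """Return the pre-defined severity category if the file is known, else None."""
    return KNOWN_HASHES.get(sha256_of(data))

print(classify(b"known-image-bytes"))  # category_1
print(classify(b"never-seen-bytes"))   # None
```

Because a cryptographic hash identifies a file exactly, known material is flagged without anyone having to view it again, and newly categorized images can be exported back into the shared hash sets for other agencies to match against.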

A collective, collaborative fight to serve and protect the innocent

Preventing child exploitation takes collaboration, real-time information and an ongoing commitment to identify every victim quickly and get criminals – and the content they produce and share – off the streets. With more and more children using mobile devices – both phones and tablets – at an earlier age, the risks are only getting bigger.

Due to the amount of data being created, there’s also a huge migration toward mobile apps leveraging cloud storage. “Today, your evidence may not lie in the country you live in, so preserving evidence on phones and self-generated content is incredibly important,” said Arnold Guerin, a police officer and technology specialist with the Canadian Police Centre for Missing and Exploited Children, managed by the Royal Canadian Mounted Police.

“It’s an imperfect scenario that can lead to tragic circumstances. Finding new victims is a driving focus. Police and a growing list of partners have allowed us all to make significant progress.”

What all global agencies have in common – as well as the growing ecosystem of technology vendors – is the strength of a shared goal. To find and protect exploited children.

“I get asked all the time how I can do this job,” said Guerin. “It’s the mission that makes the motivation clear. I think I have the best job in the world – to find and rescue kids – because I have the power to do it.”

Richard Brown, Project VIC Coordinator for the United States and manager for the National Association to Protect Children (Protect.org) concurs.

“Project VIC’s goal is to break down the walls and the days of isolated proprietary data and create an environment where any tool can be picked up to work on case data produced by any other tool in the Project VIC ecosystem. This is our international message to industry in this crime set. Project VIC often spreads this word through training with other countries alongside the International Centre for Missing & Exploited Children (ICMEC).”

A common goal unites everyone committed to this mission. As ICMEC eloquently states: One child is one too many. Every single child deserves to grow up free from abduction, sexual abuse and exploitation. We are committed to building a safer world for our children by convening partners, advocating for improved protections and providing the necessary tools and training to those on the front lines.

More information:

Project VIC is part of a global strategy to develop and implement streamlined methods to investigate child sexual exploitation. A collaborative effort between the National Association to Protect Children, law enforcement and industry, the goal is to increase information sharing among law enforcement worldwide, while identifying more victims, more rapidly. Project VIC does this by improving – and standardizing – the technology resources available to law enforcement who review images of child sexual exploitation.

CAID, created in 2013 under UK Prime Minister David Cameron, uses the latest technology to transform how police forces deal with images of child sexual exploitation and abuse. It brings together all the images that the police and the National Crime Agency (NCA) encounter. Forces then access and use the images’ unique identifiers – called hashes – and associated metadata to improve how they investigate these crimes and protect children.