
The rise of online child sexual exploitation offenses, facilitated by rapid technological changes, is a pressing concern. Perpetrators, using digital devices and the vast expanse of the internet, engage in abusive acts against children and leave a trail of trauma and suffering in their wake.

The misuse of terminology further complicates this reality. The widespread use of “child porn” to describe these despicable acts is inaccurate and profoundly harmful. It fails to capture the gravity of the situation, glossing over the depth of the pain victims experience.

The Problems with the Term “Child Porn”

Contribution to Normalization of Child Sexual Assault

Labeling these heinous acts as “porn” normalizes the abuse, placing online child sexual abuse material (CSAM) into a category akin to mainstream, adult, consensual pornography, explains Matt Parker, co-founder of The Exodus Road. This material, however, is evidence of criminal acts against children, infants, and young people. With more children playing games online, predators and traffickers are using these platforms, posing as children to groom victims and obtain explicit images.

“And then they extort. That opportunity to get more images or to threaten [victims] so that there’s money in exchange is a massive problem,” he said.

Glen Pounder, a founding board member of Raven, underscores the need for change, stating that the EARN IT Act will introduce stricter rules with which online platforms must comply. The amended legislation aims to appropriately define this crime as child sexual exploitation.

“I cringe when I hear child pornography,” Pounder said, emphasizing that it fails to capture the true nature of these offenses.

Research has also revealed that perpetrators often use these materials to desensitize themselves before committing sexual offenses or to groom child victims. This disturbing pattern normalizes their actions in their own minds and makes it easier for offenders to coerce victims into sexual contact.

False Distinction and Overlap with Contact Sexual Offenses

The term “child porn” creates a misleading division between viewing images and directly abusing a child. In reality, there is a troubling overlap between those who view CSAM and those who commit contact sexual offenses.

Studies suggest that 30% to 80% of individuals who viewed online CSAM, and 76% of those arrested for possessing such materials, had sexually abused a child. Moreover, viewing this material fuels the demand for its production. For survivors, knowing that images of their abuse continue to circulate is a constant reminder of the trauma they experienced.

Parker also pointed out a disturbing trend: elevated suicide rates among children subjected to extortion and coerced into producing CSAM. “Of course, on camera, they look like they want to be there just like everybody does in a commercial red-light district. They don’t,” Parker said, highlighting a grim reality in which children are forced to live in basements and compelled to perform explicit acts on webcams. Calling this form of sexual exploitation “pornography” not only downplays the severity of the crimes committed but also adds an extra layer of complexity to the prosecution process.

The Impact of Terminology on Safeguarding Children

Using the phrase “child porn” minimizes the severity of the abuse. The incorrect terminology understates the crime, leading to inadequate support and protection for victims. Additionally, CSAM victims may struggle to report their experiences because they fear punishment or feel complicit in the creation of the materials.

Correcting this terminology is crucial as it reduces these barriers, helping survivors understand that what happened to them was abuse and not their fault.

The Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse, also known as The Luxembourg Guidelines, offer ways to navigate terms commonly used to address exploitation and sexual abuse of children.

Below are key examples of terms to avoid when referring to child sexual abuse, along with the recommended alternatives.

Terms to be avoided completely | Recommended terminology
Child porn | Child sexual abuse
Child sex tourism | Sexual exploitation of children in the context of travel and tourism
Child sex tourist | Traveling perpetrator of child sexual offenses
Child prostitution | Exploitation of children in/for prostitution
Child prostitute, child sex worker | Victim of sexual exploitation
Customer, client, john | Abuser, child sex offender
Webcam child sex tourism, webcam child sex abuse | Live online child sexual abuse
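As a rough illustration of how the terminology guidance above might be applied when drafting reports or public communications, the Python sketch below flags discouraged terms and suggests the recommended alternatives from the table. The term list mirrors the table; the function name flag_discouraged_terms and the sample draft sentence are purely hypothetical and not part of any official tool.

```python
import re

# Terms the Luxembourg Guidelines recommend avoiding, mapped to the
# preferred alternatives (mirrors the table above).
TERMS_TO_AVOID = {
    "child porn": "child sexual abuse",
    "child pornography": "child sexual abuse",
    "child sex tourism": "sexual exploitation of children in the context of travel and tourism",
    "child sex tourist": "traveling perpetrator of child sexual offenses",
    "child prostitution": "exploitation of children in/for prostitution",
    "child prostitute": "victim of sexual exploitation",
    "child sex worker": "victim of sexual exploitation",
    "webcam child sex tourism": "live online child sexual abuse",
    "webcam child sex abuse": "live online child sexual abuse",
}

def flag_discouraged_terms(text: str) -> list[tuple[str, str]]:
    """Return (discouraged term, recommended alternative) pairs found in a draft text."""
    findings = []
    for avoid, preferred in TERMS_TO_AVOID.items():
        # Whole-phrase, case-insensitive match.
        if re.search(r"\b" + re.escape(avoid) + r"\b", text, flags=re.IGNORECASE):
            findings.append((avoid, preferred))
    return findings

# Hypothetical example of reviewing a draft sentence.
draft = "The suspect possessed child pornography recovered from the device."
for found, preferred in flag_discouraged_terms(draft):
    print(f'Avoid "{found}"; use "{preferred}" instead.')
```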

Challenges Faced by Law Enforcement and Technology Companies

Law enforcement agencies worldwide grapple with the overwhelming volume of CSAM circulating online. Unfortunately, the limited resources allocated to this work hinder national and international collaboration and severely impede the global fight against this crime.

Cooperation from online and social media service providers also remains a challenge. Technology giants like Google and Microsoft have been at the forefront, supporting the development of tools to identify and remove this horrifying content from the internet.

However, Derrick Driscoll, Chief Operating Officer of the National Center for Missing & Exploited Children (NCMEC), noted that the industry lacks the capacity to detect and report child exploitation now that networks have adopted end-to-end encryption on their social platforms. “It’s going to have a huge negative impact on our ability to detect and report and ultimately, recover children from sexual exploitation.”

Consequently, law enforcement agencies face significant barriers in investigating perpetrators engaged in online dissemination of these materials.

Utilizing Digital Investigative Solutions to Detect and Filter CSAM

The advancement of digital investigative solutions has provided law enforcement with an opportunity to filter and detect online CSAM. For instance, digital forensics examiners apply their expertise to identify and catalog CSAM. They know this evidence is crucial to a child’s safety and the pursuit of justice.

This digital contraband is used in criminal prosecutions and, more importantly, to rescue children. These tools play a vital role in safeguarding children by identifying and tracking down perpetrators.
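At a high level, many detection tools rely on hash matching: computing a digest of each file on a seized device and comparing it against a vetted list of hashes of known abuse material distributed through official channels such as NCMEC. The Python sketch below illustrates only the basic cryptographic-hash variant of that idea; the paths, hash list, and function names are hypothetical, and production systems typically add perceptual hashing (such as Microsoft's PhotoDNA) alongside strict, access-controlled evidence-handling procedures.

```python
import hashlib
from pathlib import Path

def sha256_of_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_known_files(evidence_dir: Path, known_hashes: set[str]) -> list[Path]:
    """Return files whose digest appears in a vetted known-material hash list."""
    flagged = []
    for path in evidence_dir.rglob("*"):
        if path.is_file() and sha256_of_file(path) in known_hashes:
            flagged.append(path)
    return flagged

# Hypothetical usage: the hash list would come from an authorized source,
# never be assembled by the examiner, and any matches would be handled
# under the agency's evidence-handling procedures.
# known = set(Path("vetted_hash_list.txt").read_text().split())
# hits = flag_known_files(Path("/evidence/device_image"), known)
```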

Successful instances of digital investigative solutions aiding law enforcement demonstrate these technologies’ potential to protect vulnerable children.

It is important to acknowledge the horrors faced by these young victims and hold their abusers accountable. Utilizing accurate terminology forces society to confront the harsh reality of CSAM, fostering a collective responsibility to protect children.