What is CSAM?

Words by
Morgan Wilcock

JUN 21, 2024


The prevalence of child pornography, now called child sexual abuse material (CSAM), is on the rise. And its distribution grows by the day.

The total number of URLs confirmed as containing child pornography worldwide has more than quadrupled since 2019. 

The collection and distribution of child pornography is a devastating issue facing parents and children in the digital age. The growing need to fight against child pornography is one reason why organizations like the FBI, RAINN, and NCMEC have altered their terminology to refer to it as child sexual abuse material, or CSAM.

CSAM Meaning 

Child sexual abuse material (CSAM), previously dubbed child pornography, is defined under federal law as “any visual depiction of sexually explicit conduct involving a minor (a person less than 18 years old).” The change in terminology aims to describe more accurately what child pornography actually is: evidence of the exploitation and abuse of child victims.

Because children can’t legally consent to engaging in sexual activity, they also can’t consent to having images and videos of their abuse distributed across the internet. Unfortunately, for many exploited children, the journey to healing feels endless because imagery of their abuse remains in permanent circulation.

The terminology also extends to children and teens who are coerced into taking videos and photos of themselves, or who take them voluntarily, only to have that content distributed. Regardless of intent or consent, any distribution of compromising photos and videos of children is illegal and abusive.

Why the Change? 

The words we use are important. The vocabulary we choose can shift the perception of a situation from innocuous to suspicious.

For example, a defendant being declared “not guilty” is quite different from one being declared “innocent.” Likewise, “rebel” and “revolutionary” could’ve been used interchangeably to describe America’s Founding Fathers, depending on which side of the ocean you stood on. 


The term CSAM has long been used by law enforcement, but its use in colloquial language is an important step in combating child sexual exploitation. 

When we replace the word “pornography” with “abuse,” the phrase strikes a new chord. As more people begin referring to these abhorrent images of children as child sexual abuse material, both in person and in online circles on social media, the fight against child sexual exploitation can be further legitimized. 

How Are Predators Accessing CSAM?

Unfortunately, CSAM is not difficult to find, even on surface websites and social media platforms. In fact, Instagram is the most common surface website that predators use to find CSAM.

One reason that child sexual abuse material is becoming so accessible across the surface web is that many phones and messaging apps now come with end-to-end encryption built in, a security method that ensures data can be viewed only by its sender and intended recipient, not by the platform or anyone else in between.

End-to-end encryption means that CSAM can lurk everywhere, from dark web browsers like Tor to innocuous social media sites and gaming platforms. It ultimately gives predators a free invisibility cloak. 

The Internet Watch Foundation, which detects and removes CSAM from the internet, found that more than 90% of the CSAM it removed from websites was self-generated. Self-generated CSAM is often obtained by grooming children and then extorting them for sexual images and videos.

Fighting Against Child Exploitation

While child exploitation is a continuous problem, there are ways to help victims and prevent abuse.

Parents can educate their children on their rights so that they are more likely to report abuse if it happens. Kids need to know how to stay safe online and how to identify predators.

Parents should also make sure that their child’s online gaming chats are private so their child doesn’t come into contact with a predator. The conversation around online safety is never finished; it should continue as kids mature.

Victims of sexual exploitation can report their abuse to the National Center for Missing and Exploited Children online or over the phone at 1-800-843-5678.

A great way to prevent children from coming into contact with individuals who may extort them for inappropriate photos is to give them a device that doesn’t allow them to communicate with strangers on social media. Thousands of apps exist that help predators reach their victims, but Gabb devices only allow downloads of pre-approved, kid-safe, and locked-down apps. Gabb devices also let parents monitor text messages and contacts for dangerous content and step in when they sense danger.

Have you heard of the term “CSAM”? What do you think about the change in terminology? Let us know in the comments!
