
What is CSAM?

By Morgan Dye

The prevalence of child pornography, now called child sexual abuse material (CSAM), is on the rise, and its distribution grows by the day.

The total number of URLs confirmed as containing child pornography worldwide has more than quadrupled since 2019. 

The collection and distribution of child pornography is a devastating issue facing parents and children in the digital age. The growing need to fight against child pornography is one reason why organizations like the FBI, RAINN, and NCMEC have altered their terminology to refer to it as child sexual abuse material, or CSAM.

CSAM Meaning 

Child sexual abuse material (CSAM), previously dubbed child pornography, is defined under federal law as “any visual depiction of sexually explicit conduct involving a minor (a person less than 18 years old).” The change in terminology aims to more accurately describe what child pornography actually is: evidence of the exploitation and abuse of child victims.

Because children can’t legally consent to engaging in sexual activity, they also can’t consent to having images and videos of their abuse distributed across the internet. Unfortunately for many exploited children, their journey to healing feels endless because of the permanent distribution of their abuse.

The terminology extends to children and teens who are coerced or who voluntarily take videos and photos of themselves but whose content is distributed, regardless of intent or consent. Any distribution of compromising photos and videos of children is illegal and abusive. 

Why the Change? 

The words we use matter. Depending on the vocabulary we choose, the perception of the same situation can shift from innocuous to suspicious.

For example, a defendant being declared “not guilty” is quite different from one being declared “innocent.” Likewise, “rebel” and “revolutionary” could’ve been used interchangeably to describe America’s Founding Fathers, depending on which side of the ocean you stood on. 


The term CSAM has long been used by law enforcement, but its use in colloquial language is an important step in combating child sexual exploitation. 

When we replace the word “pornography” with “abuse,” the phrase strikes a new chord. As more people begin referring to these abhorrent images of children as child sexual abuse material, both in person and in online circles on social media, the fight against child sexual exploitation can be further legitimized. 

How are Predators Accessing CSAM?

Unfortunately, CSAM is not difficult to find, even on surface websites and social media platforms. In fact, Instagram is the most common surface website that predators use to find CSAM.

One reason child sexual abuse material is becoming so accessible across the surface web is that many phones and messaging apps come with built-in end-to-end encryption, a security method that ensures data can be viewed only by the sender and the intended recipient.

End-to-end encryption means that CSAM can lurk everywhere, from dark web browsers like Tor to innocuous social media sites and gaming platforms. It ultimately gives predators a free invisibility cloak. 

The Internet Watch Foundation, which detects and removes CSAM from the internet, found that more than 90% of the CSAM it removed from websites was self-generated. Self-generated CSAM is often obtained by grooming and then extorting children for sexual images and videos.

Fighting Against Child Exploitation

While child exploitation is a continuous problem, there are ways to help victims and prevent abuse.

Parents can educate their children on their rights so that their children are more likely to report if they have been abused. Kids need to know how to stay safe online and how to identify predators.

Parents should make sure their child’s online gaming chats are private or disabled so that predators cannot make contact. This conversation around online safety never ends; it should be ongoing as kids mature.

Victims of sexual exploitation can report their abuse to the National Center for Missing and Exploited Children online or over the phone at 1-800-843-5678.

A great way to prevent children from coming into contact with individuals who may extort them for inappropriate photos is to give them a device that doesn’t allow them to communicate with strangers on social media. Thousands of apps exist that help predators communicate with their victims, but Gabb devices only allow downloads of pre-approved, kid-safe, and locked-down apps. Gabb devices also allow parents to monitor text messages and contacts for dangerous content and step in when they sense danger.

Have you heard of the term “CSAM?” What do you think about the change in terminology? Let us know in the comments!


