Warning: Reader Discretion Advised
The following content may disturb readers as it discusses child sexual abuse material.
Child pornography is a topic no one wants to talk about, but it is one that desperately needs our attention, especially from parents. Now commonly referred to as child sexual abuse material (CSAM), this content is spreading fast: reports of child pornography more than doubled between 2018 and 2022.
One explanation is that the COVID-19 pandemic left kids online for longer stretches of unsupervised time, and that some of that time was, unfortunately, spent interacting with predators. But CSAM reports haven’t returned to their pre-pandemic levels, so the pandemic clearly isn’t the only explanation.
So what is happening?
The national conversation is now focusing more frequently, and more adamantly, on the role of “Big Tech” in the problem.
Social Media Companies are Slow to Act
Child sexual abuse material no longer hides on the dark web. It’s now out in the open and easily discoverable on the world’s most popular social media sites: Instagram, Twitter (X), Facebook, TikTok, YouTube, and more.
The reasons people create and consume CSAM in the first place are as complex as those behind any other psychological perversion. But the reason CSAM has exploded in recent years looks increasingly simple: social media companies are not taking action to stop it.
You’d be hard-pressed to get a social media executive to admit as much, but the fact is that allowing child pornography onto social media sites increases traffic, engagement, and, ultimately, profits.
We saw this recently at the January 31st congressional hearing with five Big Tech CEOs. At one point during the hearing, Tennessee Senator Marsha Blackburn claimed that Meta is trying to become the “premier sex trafficking platform.” Meta CEO Mark Zuckerberg responded with nothing more than “That’s ridiculous,” immediately shutting down a valid and serious line of questioning.
It seems Big Tech’s primary concern is stopping the accusations of hosting child porn, not the actual harm it causes. While companies are legally required to report child porn they find, they “have no obligation to actively search their platforms.”
Hany Farid, professor of computer science at the University of California, Berkeley, explains that platforms will only comply with content laws and remove suggestive material insofar as it serves their bottom line. So if companies are profiting from child sexual abuse material on their sites, they won’t act fast to get it removed.
Social media executives often argue that by not limiting the material posted on their sites, they are protecting the First Amendment rights of their users. This argument feels flimsy considering that they are legally protected when they remove any content from their sites, regardless of its legality.
Algorithms Promote Child Porn
All social media sites operate using fine-tuned algorithms that promote material you love and hide material you don’t. These algorithms track your every move, and how long you linger on each post, constantly suggesting material that will get you to stay online longer.
These algorithms often aren’t programmed to tell the difference between legal and illegal content, meaning that those who look for sexually explicit content of minors will be fed more and more of it by the mindless algorithm. Meta is aware of this vicious cycle, yet it has been slow to take action.
This should give us pause: social media sites argue that because they aren’t actively making sexually explicit, illegal material, they are mere bystanders to the crimes taking place on their platforms.
This argument comes up often in litigation: to escape liability for abusive content on their sites, social media companies hide behind Section 230 of the Communications Decency Act. That act states that providers of internet services cannot be held liable for content posted by their third-party users.
But Section 230 does not protect these companies from legal repercussions when the content posted on their sites is obscene, sexually exploits children, or promotes prostitution.
It is certainly within the rights of social media companies to actively remove sexually abusive content from their sites, and it is among their most important responsibilities.
What is Being Done?
The apparent indifference of social media companies can be incredibly disheartening. However, there are several active lawsuits against social media companies for their neglect of child safety. Because of the tireless efforts of concerned parents like you who are pushing for legislation, lawmakers are starting to catch on to the issues, and they’re taking action.
Victims of online sexual abuse can report the crime to the National Center for Missing & Exploited Children or call 1-800-843-5678. You can also report child sexual abuse material to Take It Down to prevent the circulation of compromising images.
There is Hope
Perhaps the most important thing to take away from this article is that, despite the horror of child porn circulating on the internet, there is hope for our children’s safety online. There is hope for those who have been victims and are now on the road to recovery.
Litigation against these companies is underway, and that means that the worst may soon be behind us.
And remember that, despite public perception, instances of child sexual extortion are not common. The media tends to report shocking news, but the reality is that kids can be safe when they’re online. They aren’t destined to stumble upon an abusive situation.
The rarity of online child sexual abuse does not mean that there is no risk to children online. Take steps to protect against online predators, watch out for signs of grooming, and consider a gradual introduction to technology using devices designed specifically to keep kids safe.
What do you think about the rise of child pornography? Did we miss anything? How do you protect your family against online predators? Let us know in the comments below!