
Social Media is Failing Children: 5 Big Tech CEOs Testify Before Congress

By Jackie Baucom

On January 31, 2024, the CEOs of Meta (Mark Zuckerberg), X (Linda Yaccarino), TikTok (Shou Zi Chew), Snap (Evan Spiegel), and Discord (Jason Citron) testified before a U.S. Senate committee about their efforts to combat child sexual exploitation and child sexual abuse material (CSAM) on their respective platforms.

The CEOs of X, Discord, and Snap refused to engage for weeks and provided testimony only under subpoena. The CEOs of Meta and TikTok willingly agreed to testify.

The hearing brought to light big tech’s efforts — and lack thereof — to protect our children. The virtual realm allows criminals and bullies to operate from the shadows, harming kids through bullying, intimidation, addiction, and sexual exploitation. That’s not to mention the cumulative effects of consuming seemingly harmless content for hours every day.

This issue is not a new one for politicians. During a February 2023 Senate Committee meeting, Michelle DeLaune, president and CEO of the National Center for Missing & Exploited Children, said the organization’s CyberTipline received over 3.2 million reports of child abuse the previous year.

Since that 2023 meeting, the Committee has reported multiple bipartisan bills to help stop the exploitation of children online:

  • KOSA: The Kids Online Safety Act requires platforms to implement sensible safeguards when creating and managing products or services used by minors.
  • The COPPA amendment: An amendment to the Children’s Online Privacy Protection Act of 1998, it enhances safeguards around the online collection, use, and sharing of minors’ personal data.
  • The STOP CSAM Act: Takes decisive measures against the spread of online child sexual abuse material, provides support to victims, and enhances accountability and transparency for online platforms.
  • The EARN IT Act: The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act establishes the National Commission on Online Child Sexual Exploitation Prevention and removes tech platforms’ immunity from civil and criminal liability under child sexual abuse material laws.
  • The SHIELD Act: The Stopping Harmful Image Exploitation and Limiting Distribution Act ensures that federal prosecutors have effective tools to tackle the non-consensual distribution of sexual imagery.
  • The Project Safe Childhood Act: This bill facilitates the investigation and prosecution of online child exploitation crimes through intergovernmental partnerships.
  • The REPORT Act: Establishes new measures to enhance the reporting of child exploitation crimes to the CyberTipline.

Everything a Parent Needs to Know About the January 2024 Hearing

Members of the Senate Judiciary Committee spent almost four hours questioning the CEOs. Mark Zuckerberg (Meta) and Shou Zi Chew (TikTok) faced the harshest interrogations.

The atmosphere was tense as lawmakers criticized the platforms for failing to protect children from predators and the harms of social media, and for failing to stop the spread of CSAM. Parents who blame social media for their children’s deaths sat behind the CEOs.

Some comments were brutal. Senator Lindsey Graham reprimanded Mark Zuckerberg, saying he has “blood on his hands,” as Graham shared a story of a child who was a victim of sexual exploitation. 

Here are some highlights from the hearing:

  • Zuckerberg (Meta) stood up and faced the gallery as he apologized for “the things that your families have suffered.”
  • One of the most dramatic moments occurred when Tennessee Senator Marsha Blackburn said Meta is trying to become the “premier sex trafficking platform.” A visibly angry Zuckerberg responded, “That’s ridiculous.”
  • Louisiana Senator John Kennedy said social media algorithms are killing the truth, as people keep being fed only one side of an issue.
  • Evan Spiegel (Snap) apologized to families whose kids died of overdoses after purchasing drugs on Snapchat.
  • Evan Spiegel (Snap) agreed to endorse the Cooper Davis Act, which requires platforms to report certain instances of illicit drug trafficking to the Drug Enforcement Administration.
  • South Carolina Senator Lindsey Graham called for the repeal of Section 230, the federal law that grants websites and social media platforms immunity for their content moderation decisions and from lawsuits arising from user-generated content.
  • Evan Spiegel (Snap) and Linda Yaccarino (X) were the only CEOs who agreed to support the Kids Online Safety Act (KOSA). Yaccarino also endorsed the STOP CSAM Act.
  • Yaccarino, in an effort to distance X from Twitter, said the 14-month-old company (glossing over its previous 16 years as Twitter) sends more reports to the national tipline than Twitter ever did.
  • Jason Citron said Discord is working with Thorn, a tech company, to build an automated tool designed to detect grooming and predatory conversations. X already partners with Thorn to detect CSAM.
  • Zuckerberg said Meta has 40,000 people working on safety and security, and spent $5 billion in 2023 on safety efforts.
  • Evan Spiegel said 20 million U.S. teens use Snapchat, but only 400,000 use its parental controls. He was the only one of the five CEOs ready to present data on the number of young people using his platform.
  • Chew (TikTok) pledged to spend $2 billion this year on safety efforts.

Overall, the hearing painted a clear picture of the current state of social media and youth safety. Parents and civic leaders are speaking out in the boldest possible terms against big tech, while big tech leaders offer only vague, noncommittal responses about fixing the problems.

Below we’ve provided more context on each of the five companies represented by their CEOs at the January 2024 hearing.

Meta


Meta is the parent company of Facebook, Instagram, and WhatsApp. 

Facebook is a social media platform that allows users to connect with friends, share content, and engage with communities. Facebook has proven to be unsafe for kids for a variety of reasons.

Instagram is also a social media platform, one that allows users to share photos and videos organized by hashtags and tagged locations. Safety issues for kids on Instagram range from child exploitation and pornography to scams and mental health concerns like the promotion of unhealthy body image.

WhatsApp is a messaging app that allows users to send text and voice messages, make voice and video calls, and share media with individuals or groups. WhatsApp’s dangers include access to inappropriate content and online predators.

Meta’s CSAM Policy

Meta’s official position is that it doesn’t allow content that sexually exploits or endangers children, including AI-generated imagery of minors. Non-sexual depictions are treated differently: images of child abuse shared by law enforcement for awareness, or child nudity in the context of famine and war posted by news agencies, are allowed.

Meta’s Efforts at Child Safety

Meta uses many tools and programs to detect and prevent the spread of CSAM, including PhotoDNA image hashing, the Safety API, its open-source PDQ and TMK+PDQF hashing algorithms (for images and video, respectively), Lantern, and Take It Down (an NCMEC tool created with Meta’s help).
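
For readers curious what “image hashing” actually means, here is a minimal sketch in Python. PhotoDNA itself is proprietary, so this example uses the open-source imagehash library to illustrate the general idea of perceptual hashing; the hash database and threshold below are placeholders, not Meta’s actual values.

```python
# A rough sketch of perceptual-hash matching, the general technique behind
# tools like PhotoDNA and PDQ. PhotoDNA itself is proprietary, so this uses
# the open-source imagehash library (pip install imagehash pillow).
from PIL import Image
import imagehash

# Hypothetical database of hashes of known abusive images. In practice such
# lists are maintained by organizations like NCMEC; this value is a placeholder.
KNOWN_HASHES = {imagehash.hex_to_hash("d1c1b1a191817161")}

# Hashes within this Hamming distance count as a match, so resizing,
# re-encoding, or light edits don't evade detection. Placeholder value.
MATCH_THRESHOLD = 5

def is_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash is near a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if is_known_image("upload.jpg"):
    # A real platform would quarantine the upload, suspend the account,
    # and file a CyberTipline report rather than just print a message.
    print("Match found: route to trust & safety review.")
```

The key property is that a perceptual hash barely changes when an image is resized or recompressed, so known material can be recognized even after modification, and hash lists can be shared between platforms without sharing the images themselves.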

Meta also hosts annual hackathons, where employees collaborate intensively over a few days to brainstorm, prototype, and develop new ideas and solutions for products, services, or internal processes.

Meta Lawsuit

In December 2023, New Mexico’s Attorney General filed a lawsuit against Meta and Mark Zuckerberg, alleging failure to protect children from online abuse and trafficking on Facebook and Instagram.

Meta claims to have taken action against violators, but court documents revealed the company knew of an estimated 100,000 minors experiencing sexual harassment on its platforms every day.

X


X (formerly known as Twitter) is a social media platform where users post and interact with short messages, called posts, sharing images, videos, thoughts, news, and opinions with a global audience in real time. One of the most prominent dangers on X is pornography and other graphic content.

X and CSAM

X’s policy outlines a zero-tolerance stance toward any material promoting or featuring child sexual exploitation, covering various media formats. In December 2023, X updated its policy to include automated monitoring and reporting of CSAM to the NCMEC CyberTipline, suspending accounts that violate the policy, and blocking searches for child sexual exploitation terms.

Efforts by X to Keep Kids Safe

X uses PhotoDNA and an automated system that reports to the NCMEC CyberTipline, and it blocks certain hashtags and keywords from search results.
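
To illustrate the search-blocking piece, here is a minimal sketch of how a blocklist filter can work. The blocked terms and all function names below are hypothetical placeholders, not X’s actual implementation.

```python
# A minimal sketch of search-term blocking: queries are normalized and
# checked against a maintained blocklist before any results are returned.
# BLOCKED_TERMS and all function names here are hypothetical placeholders.
BLOCKED_TERMS = {"exampleblockedterm", "anotherblockedterm"}

def normalize(query: str) -> str:
    """Lowercase and strip punctuation commonly used to dodge filters."""
    return "".join(ch for ch in query.lower() if ch.isalnum())

def is_blocked(query: str) -> bool:
    q = normalize(query)
    return any(term in q for term in BLOCKED_TERMS)

def search(query: str) -> list[str]:
    if is_blocked(query):
        # Instead of results, platforms typically show a deterrence message
        # with links to reporting and support resources.
        return []
    return run_search_backend(query)

def run_search_backend(query: str) -> list[str]:
    return []  # stand-in for the real search index
```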

X’s Well-documented Safety Concerns

In early 2022, the company explored allowing users to monetize adult content to compete with OnlyFans. However, concerns arose over Twitter’s inability to effectively detect CSAM and non-consensual nudity, leading to the abandonment of the plan.

In October 2022, Elon Musk, who had publicly stated that removing CSAM content was his number one priority, cut the team responsible for reporting CSAM by half.

A January 2023 investigation by NBC News found accounts that posted hundreds of messages selling CSAM. Some of these accounts remained up for months, while others deleted content quickly to avoid detection.

A similar investigation by The New York Times the following month found that CSAM was not only widely available on X but was actively promoted by its algorithm.

TikTok


TikTok is a social media platform where users create and share short-form videos, often featuring music, dances, and challenges, with a global audience. There are many reasons TikTok is not safe for kids, including graphic content and a long list of dangerous trends.

TikTok’s CSAM policy

TikTok officially has a zero-tolerance policy on CSAM, covering any visual, textual, or audible depiction of explicit or implied CSAM.

Efforts at Child Safety by TikTok

TikTok uses PhotoDNA, Google’s Content Safety API, and YouTube’s CSAI Match to combat child sexual abuse imagery.

TikTok Safety Concerns in the News

A 42-year-old Alabama man met a 14-year-old Texas girl on TikTok, leading to romantic exchanges and a meeting. He was later arrested on charges of sexual assault of a child. This was not a unique situation: an investigation by The Wall Street Journal found that TikTok’s algorithm facilitates inappropriate interactions and serves adults who watch videos of young kids more of the same content.

Snap


Snap’s app, commonly known as Snapchat, is a multimedia messaging platform where users send photos, videos, and text messages that disappear after being viewed. Snapchat also has various features for sharing and discovering content. Cyberbullying, inappropriate content, and location sharing are just a few of the child safety concerns it presents.

Snap’s CSAM Policy

Snap’s policy prohibits any promotion, distribution, or sharing of pornographic content. Depictions of nudity in non-sexual contexts are usually permitted, but this exception does not extend to images of minors. Any activity involving sexual exploitation or abuse of a minor is reported to authorities.

Snap’s Child Safety Efforts

Snap uses a variety of tools to counter CSAM, including PhotoDNA, Take It Down, and Google’s CSAI (Child Sexual Abuse Imagery) Match.

Snap’s History of Safety Concerns

In 2023, the National Center on Sexual Exploitation (NCOSE) included Snapchat on its annual “Dirty Dozen” list of tech entities facilitating and profiting from sexual abuse and exploitation. Snapchat was later moved to NCOSE’s “Watch List” after it introduced new safeguards aimed at protecting kids, including measures to limit interactions with strangers and restrict access to sexually explicit content.

Snapchat is the platform where parents most often report children sharing sexual images of themselves. It is also among the top apps (along with Facebook and Instagram) where kids receive requests for sexually explicit imagery and sextortion attempts.

Discord


Discord is a communication app that allows messaging and screen sharing. Users can join communities called “servers” and engage in discussions on topics ranging from movies to sports teams. Illicit content, minimal age verification, and anonymous chat rooms are just a few of the reasons Discord can be dangerous for kids.

Discord and CSAM

Discord’s official stance is a zero-tolerance policy on CSAM, including AI-generated content, as well as sexual conduct with minors, grooming, online enticement, and sextortion.

Discord’s Safety Efforts

Discord’s Safety team identifies and eliminates harmful content, taking measures such as banning users and collaborating with authorities. They employ Safety by Design practices, including risk assessments during product development to mitigate potential dangers, particularly for teens.

Discord combines proactive and reactive tools, including machine learning models and PhotoDNA image hashing. It also relies on community moderators to uphold its policies and report violations.
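
For the curious, here is a simplified sketch of what a proactive machine-learning filter can look like: a text classifier assigns each message a risk score, and anything above a threshold is routed to human moderators. The toy training data, model choice, and threshold below are illustrative assumptions, not Discord’s or Thorn’s actual system.

```python
# A simplified sketch of proactive message triage: a text classifier scores
# each message, and anything above a threshold goes to human moderators.
# The toy training data, model, and threshold are illustrative placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder labeled examples: 1 = needs human review, 0 = benign.
texts = [
    "placeholder risky message",
    "want to play another round tonight?",
    "placeholder risky phrasing",
    "good game, see you tomorrow",
]
labels = [1, 0, 1, 0]

# A real system would train on large, carefully labeled datasets.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

REVIEW_THRESHOLD = 0.8  # tuned against precision/recall goals in practice

def triage(message: str) -> str:
    """Return 'human_review' or 'allow' based on the classifier's score."""
    score = model.predict_proba([message])[0][1]
    # Automated tools only flag; trained moderators make the final call.
    return "human_review" if score >= REVIEW_THRESHOLD else "allow"
```

The design point worth noting is that the classifier never bans anyone on its own; it narrows millions of messages down to a queue a human safety team can actually review.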

Discord’s Record of Safety Concerns

A June 2023 NBC News investigation revealed Discord’s role in facilitating adult users’ grooming of children, trading CSAM, and engaging in sextortion. 

Since 2017, Discord communications have been linked to over 35 cases resulting in adult prosecutions for crimes like kidnapping, grooming, and sexual assault. An additional 165 cases, including four crime rings, involved adults transmitting or receiving CSAM or sextorting children via Discord.

Join the Conversation

Only time will tell how the hearing will influence safety concerns on popular social media apps, but public conversation is always a good start.

What do you think about all of this? Any questions we didn’t answer? Let us know in the comments below.

Let Us Come to You

Subscribe to the Gabb Now newsletter to get the top tech safety ideas, stories, and tips in a weekly 5-minute read.

Comments

  • wayne fields Feb 05, 2024 10:41 AM

    This post is very helpful for parents and grandparents as well!

  • Gabb Feb 08, 2024 10:21 AM

    We are so glad this helped!

