What is OpenAI Sora App and is it Safe for Kids?
Gabb Wireless
3 min read

What parents should know about Sora

By Natalie Issa

OpenAI has officially released Sora 2, the newest iteration of its AI video generation app. The concept is simple: users download the app (which is currently invite only) and record a video of themselves for their profile.

Users can choose who has access to their profile, similar to friend lists on other social media apps, and “anyone with access can type a few words describing what they want to see you doing and generate a clip in minutes,” columnist Geoffrey A. Fowler wrote for The Washington Post.

The videos can range from astonishingly realistic to downright goofy. In a blog post announcing Sora 2, OpenAI demonstrates the app’s capabilities with hyperrealistic clips of people playing volleyball, performing gymnastics, and more. And in a recent article for The New York Times, journalist Mike Isaac shares a clip of himself in a “‘Matrix’-style duel against Ronald McDonald, using cheeseburgers as weapons.”

While Sora might seem like an innocent pastime, some experts are already raising safety concerns. Here’s everything you should know about Sora, plus whether it’s safe for your kids.

What does the Sora app do?

Sora isn’t just an AI video generator — it’s also a social media app. Users can create AI-generated videos of themselves, but also of their friends. Like TikTok, the app has an algorithm that suggests videos and lets users follow and interact with friends.

According to The New York Times, “The powerful AI model that Sora is built on makes it simpler to produce clips, giving people an almost unlimited ability to generate as many AI videos as they want.”

Is Sora safe for kids? 

While kids might be drawn to Sora for innocent fun, a few experts have voiced their concerns. As Fowler wrote, “Sora pushes the idea of consent — or having say over how you appear online — into uncomfortable territory.”

According to The Washington Post, OpenAI has “guardrails against sexual, violent and otherwise harmful content.” But, as experts have pointed out, content doesn’t have to be sexual or violent to be potentially harmful. 

Fowler wrote, “You can’t make a video of someone undressed, but I’ve got videos of crowds spitting on me, someone slapping me, and me flipping off a baby.”

Even without explicit violence or nudity, AI videos could still depict illegal, inappropriate, or humiliating behavior — all without breaking OpenAI’s content rules. Experts have also expressed concerns that AI videos can be used to harass, scam, or bully people.

And while only your mutual friends can access your cameo, or likeness, to make an AI video of you, Fowler points out that “most people won’t keep track” of who they follow and who follows them back. So if your child has a falling out with a friend they’re mutuals with on Sora, that friend could potentially use their likeness to create a malicious video.

With so many experts warning parents against sharing photos of their children on social media in the age of AI, letting your child upload a cameo of themselves to Sora might be risky. They might only want to create innocent AI videos of themselves, but they ultimately have no control over how their friends will use their likeness.

And while Sora certainly isn’t perfect, some videos are pretty realistic — and could easily fool the most tech-savvy among us. 

So until OpenAI implements more parameters around safety and consent — if they ever do — it might be best to keep your kids off of Sora.

Have you heard about Sora yet? What are your main thoughts or concerns? We want to hear them in the comments.
