If you've ever browsed social media, chances are you've come across images or videos created by AI. Many people have been fooled by clips like the viral video of rabbits jumping on a trampoline. But Sora – a sister app to ChatGPT developed by OpenAI – is taking AI video to a new level, making the detection of fake content increasingly urgent.

AI video tools are making it more difficult than ever to identify genuine videos. (Source: CNET)
Launched in 2024 and recently upgraded to Sora 2, the app has a TikTok-like interface in which every video is AI-generated. Its "cameo" feature allows real people to be inserted into simulated footage, creating videos that are frighteningly realistic.
As a result, many experts fear that Sora will accelerate the spread of deepfakes, distorting information and blurring the line between what is real and what is fake. Celebrities are particularly vulnerable, which has led organizations such as SAG-AFTRA to call on OpenAI to strengthen its protections.
Identifying AI-generated content is a major challenge for tech companies, social media platforms, and users alike. But there are ways to recognize videos created using Sora.
Look for the Sora watermark

The Sora watermark (indicated by the blue arrow) identifies the tool used to create the video. (Source: CNET)
Every video created on the Sora iOS app has a watermark when downloaded – the white Sora logo (cloud icon) moves around the edges of the video, similar to TikTok's watermark.
A watermark is a visual way to identify AI-generated content; Google's Gemini "nano-banana" model, for example, also automatically adds a watermark to images. However, watermarks aren't always reliable. A static watermark can easily be cropped out, and even dynamic watermarks like Sora's can be removed with specialized applications.
When asked about this, OpenAI CEO Sam Altman argued that society needs to adapt to the reality that anyone can create fake videos. Before Sora, no tool this popular and accessible let people with no skill produce such videos. His perspective underscores the need for other verification methods.
Check metadata

Checking metadata – a crucial step in determining if a video was created by AI, like Sora. (Source: Canto)
You might think that checking metadata is too complicated, but it's actually quite simple and very effective.
Metadata is information automatically attached to content when it is created, such as the camera model, location, time of recording, and file name. Whether content is human-made or AI-generated, it carries metadata, and with AI-generated content, that metadata often includes information attributing its source.
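To see what this looks like in practice, general metadata can be dumped with a short script. The sketch below is a minimal example, assuming the widely used ExifTool utility is installed on your system; the file name and the tags printed are illustrative and will vary from file to file.

```python
import json
import subprocess

def dump_metadata(path: str) -> dict:
    """Read a file's metadata as JSON via ExifTool (assumed installed)."""
    result = subprocess.run(
        ["exiftool", "-json", path],
        capture_output=True, text=True, check=True,
    )
    # exiftool -json prints a JSON array with one object per input file
    return json.loads(result.stdout)[0]

if __name__ == "__main__":
    meta = dump_metadata("clip.mp4")  # placeholder file name
    # Which tags exist depends on the file; these are common examples
    for key in ("FileType", "CreateDate", "Make", "Model", "GPSPosition"):
        print(f"{key}: {meta.get(key, 'not present')}")
```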
OpenAI is a member of the Coalition for Content Provenance and Authenticity (C2PA), so Sora videos contain C2PA metadata. You can verify this using the Content Authenticity Initiative's verification tool:
How to check metadata:
- Visit verify.contentauthenticity.org.
- Upload the file you want to check.
- Press "Open".
- Review the information in the panel on the right.
If the video was created by AI, the summary will state that clearly. For Sora videos, you'll see the line "published by OpenAI" along with information confirming the video was AI-generated. Every Sora video carries this information, which can be used to verify its origin. The same C2PA manifest can also be read programmatically, as sketched below.
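This is a minimal sketch, assuming the Content Authenticity Initiative's open-source c2patool command-line tool is installed and on your PATH; by default it prints a file's manifest report as JSON. The file name is a placeholder, and the exact layout of the manifest fields varies by tool version and file.

```python
import json
import subprocess

def read_c2pa_manifest(path: str) -> dict | None:
    """Return the C2PA manifest embedded in a media file, or None.

    Assumes the open-source `c2patool` CLI is installed; running it on a
    file prints the embedded manifest store as JSON.
    """
    result = subprocess.run(["c2patool", path], capture_output=True, text=True)
    if result.returncode != 0:
        return None  # no manifest found, or the tool reported an error
    return json.loads(result.stdout)

if __name__ == "__main__":
    manifest = read_c2pa_manifest("sora_clip.mp4")  # placeholder name
    if manifest is None:
        print("No C2PA metadata found; origin can't be confirmed this way.")
    else:
        # Print the raw manifest; look for the claim generator / issuer info
        print(json.dumps(manifest, indent=2))
```

Keep in mind that a missing manifest doesn't prove a video is genuine: metadata can be stripped when a file is re-encoded or re-uploaded, which is why the watermark and metadata checks are best used together.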
Source: https://vtcnews.vn/cach-nhan-biet-video-that-hay-do-ai-tao-ar972891.html