If you’ve ever scrolled through social media, chances are you’ve come across an AI-generated image or video. Many people have been fooled by such content, like the viral video of rabbits jumping on a trampoline. But Sora, ChatGPT’s sister app from OpenAI, is taking AI video to the next level, making it more important than ever to be able to spot fake content.

AI video tools make it harder than ever to tell what's real. (Source: CNET)
Launched in 2024 and recently updated with Sora 2, the app has a TikTok-like interface in which every video is AI-generated. Its “cameo” feature lets real people be inserted into the simulated footage, producing videos so realistic they are unsettling.
As a result, many experts fear that Sora will spread deepfakes, creating confusion and blurring the lines between real and fake. Celebrities are particularly vulnerable, leading organizations like SAG-AFTRA to call on OpenAI to strengthen its protections.
Identifying AI content is a big challenge for tech companies, social networks, and users alike. But there are ways to recognize videos created with Sora.
Find the Sora watermark

The Sora watermark (green arrow) is one way to identify which tool created the video. (Source: CNET)
Every video created on the Sora iOS app gets a watermark when downloaded – a white Sora logo (cloud icon) that moves around the edges of the video, similar to TikTok's watermark.
This is an intuitive way to recognize AI-generated content; Google’s Gemini “nano banana” model, for example, also automatically watermarks its images. However, watermarks are not always reliable. A static watermark can easily be cropped out, and even a moving watermark like Sora’s can be removed with dedicated apps.
When asked about this, OpenAI CEO Sam Altman said that society will need to adapt to a world where anyone can create fake videos. Before Sora, no such tool was widely available, easy to access, and usable without any skill. His answer underscores the need for other verification methods.
Check metadata

Checking metadata – a key step in determining whether a video is generated by AI, like Sora. (Source: Canto)
You might think that checking metadata is too complicated, but it's actually quite simple and very effective.
Metadata is a set of information automatically attached to content when it is created, such as the camera type, location, time of filming, and file name. Both human-made and AI-generated content carry metadata, and for AI content the metadata often includes information that authenticates its origin.
OpenAI is a member of the Coalition for Content Provenance and Authenticity (C2PA), so Sora videos contain C2PA metadata. You can check it using the Content Authenticity Initiative's verification tool:
How to check metadata:
- Visit verify.contentauthenticity.org
- Upload the file to be checked
- Click “Open”
- See information in the table on the right.
If the video was generated by AI, the summary will say so. For a Sora video, you’ll see “published by OpenAI” and details confirming that the video was AI-generated. Every Sora video carries this information, so its origin can be verified.
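The web tool above is the easiest route, but you can also check for the presence of C2PA data yourself. In MP4 files, C2PA manifests are embedded in a top-level `uuid` box of the ISO Base Media File Format container. The sketch below is a rough illustration, not the official C2PA SDK: it scans a file's top-level boxes and flags any `uuid` box as a candidate manifest location. Finding one is only a hint that provenance data may be present, not proof of authenticity; real verification should use C2PA tooling or the verify site.

```python
import struct

def top_level_boxes(data: bytes):
    """Yield (box_type, size) for each top-level ISO BMFF box in `data`."""
    offset = 0
    while offset + 8 <= len(data):
        # Each box starts with a 4-byte big-endian size and a 4-byte type code.
        size, box_type = struct.unpack(">I4s", data[offset:offset + 8])
        if size == 1:  # size == 1 means a 64-bit extended size follows the header
            size = struct.unpack(">Q", data[offset + 8:offset + 16])[0]
        if size < 8:
            break  # malformed box; stop scanning
        yield box_type.decode("ascii", "replace"), size
        offset += size

def has_uuid_box(path: str) -> bool:
    """True if the file contains a top-level 'uuid' box, the container
    location where C2PA manifests are typically embedded in MP4 files."""
    with open(path, "rb") as f:
        data = f.read()
    return any(box_type == "uuid" for box_type, _ in top_level_boxes(data))
```

This only inspects the container structure; parsing the manifest itself (signatures, claims, assertions) is far more involved and best left to dedicated C2PA libraries.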
Source: https://vtcnews.vn/cach-nhan-biet-video-that-hay-do-ai-tao-ar972891.html





