How AI-generated videos are defying reality

Gone are the days when videos generated by artificial intelligence (AI) revealed obvious flaws: extra fingers, limbs in unlikely places, robotic expressions. With the launch of tools like Sora 2, that reality has changed radically. Now all it takes is a few simple prompts, a voice recording, and a facial scan to create hyper-realistic avatars capable of starring in videos that look authentic.

The quality of the videos has both surprised and unsettled viewers. Among the examples circulating on social media are recreations of figures such as Stephen Hawking behind the wheel of a Formula 1 car, Martin Luther King Jr. as a DJ, and even the character SpongeBob SquarePants in unexpected contexts. The reaction was immediate, with several organizations expressing concern.

According to Sam Altman, CEO of OpenAI, despite the company's efforts, it may not be possible to prevent every case of intellectual property abuse. The question of veracity and content manipulation is at the center of the debate.

“Sora 2 feels like an entry into a new world whose rules we still don’t fully understand. A world where it may become increasingly difficult to separate truth from fiction.”

Miloslav Lujka, a cybersecurity specialist, notes that the technological leap poses a huge challenge to regulation.

“Sora 2 is an absolute leap forward; I would say a decade. Before, you needed an extensive model to learn gestures. Now anyone can create videos with a quality that is almost indistinguishable from reality.”

The democratization of AI is seen as a creative opportunity but also a risk. Lukáš Benzl, president of the Czech Artificial Intelligence Association, warns of the risks when personal data is shared.

“Each of us can become a creator, even without being creative. But we must think carefully before sharing our identity. Others can use it to create content on our behalf.”

The ability to share avatars and spread videos across other social networks, such as Facebook and Instagram, worsens the problem of misinformation.

“When voice and image are combined, we believe what we see. And with Sora 2, what we see is practically reality.”

Experts say the solution involves critical education and international regulation.

“If governments invest in explaining the power of these tools and in teaching critical thinking from primary school onwards, they can prevent many mistakes.”

Lukáš Benzl adds that it is up to users to choose credible sources and pay for authentic content.
