Remember Air Head, the balloon-head video made with Sora?

It may not be ready to replace Hollywood just yet.

One of the most notable uses of Sora since it was announced in February is "Air Head," a short film by Canadian production studio Shy Kids about a man with a balloon for a head, who talks about his life and the problems that come from literally having an air head.

The audio is human-made, and while the video is clearly an edit of multiple clips, the implication is that those clips came straight out of Sora.

That does not appear to be the case: in an interview with FXGuide, Patrick Cederberg of the Shy Kids team revealed that they had to rely on traditional VFX techniques to fix consistency issues in the shots generated with OpenAI's tools.

In itself this is not really a problem, and it likely reflects how AI video will actually be used in the filmmaking process, but the lack of transparency does not reflect well on OpenAI.

OpenAI has been reluctant to provide broad access to its AI video generator. This is partly due to the cost and time required to generate a single clip, but also due to safety concerns about very high quality synthetic content circulating in an election year.

Shy Kids was one of about a dozen creative professionals who were offered the opportunity to try out Sora, which is accessible through a ChatGPT-like interface and has guardrails in place regarding copyright.

Cederberg, who handled post-production for Air Head, told FXGuide: "It is a very powerful tool and we are already dreaming up all the ways we can incorporate it into our existing processes. But as with any generative AI tool, I think control is still the most desirable and also the most elusive at this point."

Examples of this include unwanted faces appearing on balloons and yellow balloons rendering as red. For the most prominent scene, in which the character chases his head around a courtyard, the team had to rely on rotoscoping in Adobe After Effects.

In the raw Sora clip, the man still had a head and the balloon was red. To fix this, Cederberg painted out the head in After Effects and changed the color of the balloon, because Sora could not render it accurately.

According to Cederberg, this is not a fast process either. Regardless of a clip's length, a single generation can take up to 20 minutes to render, and longer still as demand on the servers increases.

"It's important to use the full 20 seconds, because it gives you more opportunity to slice and edit, and you have a better chance of getting something that looks good," he told FXGuide.

Sora is accessed through the ChatGPT interface, where prompts are refined by an OpenAI chatbot before being sent to the video generator. Cederberg noted that he often had to write long, highly descriptive prompts to keep outputs consistent, and even then consistency was not guaranteed.

To make this work, the team approached Air Head more like a documentary than a scripted short. Rather than writing a script and shooting to it, they generated a huge amount of material and built a story out of it, since they never knew exactly what shots Sora would give them.

Sora also seems to share a problem with existing AI video generators like Runway and Pika Labs: clips come out noticeably slower than real footage. Says Cederberg: "There was quite a bit of timing to adjust to make it not feel like a big slow-motion project."

The reality is that, for some time at least, AI-generated content will be used as one part of filmmakers' workflows, not as a replacement for filmmaking itself. Adobe's work integrating generative video into Premiere Pro is a good indication of where this could go.

Shy Kids' experience seems to sit at the extreme end of how AI video can be used. LA-based director Paul Trillo also used Sora, producing a promotional video for the next generation of TED Talks. He likewise noted that it takes many attempts to get the desired output: he had to generate hundreds of clips for the 12 that made the final cut.

Cederberg sees Sora as a supplemental VFX tool, an extension of the regular process rather than a replacement for it; Adobe, for its part, suggests using generative video to create B-roll or to extend existing clips.

Like the rise of digital VFX and other groundbreaking technologies before it, Sora and AI video could usher in a new generation of film, and perhaps a new golden age of cinema.
