Oscars Consider Requiring Films to Disclose AI Use
The use of AI in best picture contender “The Brutalist” recently grabbed headlines and ignited controversy, but it isn’t the only Oscar contender to use the advancing technology. High-profile films such as “A Complete Unknown,” “Dune: Part Two” and “Emilia Pérez” have also used AI in large or small ways, and the technology’s growing prevalence has led the Motion Picture Academy to actively explore changing its Oscar submission requirements so that films would have to disclose their use of AI, Variety has learned.
The Academy currently offers an optional disclosure form for AI use, but Governors and Branch executive committees are now investigating how AI is used in each branch with an eye toward making disclosure mandatory in the 2026 Oscars rules, which are expected to be published in April. The Academy’s SciTech Council is working on recommended language, Variety has also learned.
Development of visual effects tools and processes that take advantage of AI (including its subset machine learning, or ML) isn’t a new concept. But for a look at the state of the art, consider this year’s Visual Effects Society Awards emerging technology category, which is packed with such nominees, including Australia-based Rising Sun Pictures’ Revize machine learning toolset. According to the company’s website, Revize has been used for “a variety of digital ML augmentation, most notably face replacement, facial performance modification, deaging, body replacements and other likeness adaptations.”
The VES entry details its application in “Furiosa: A Mad Max Saga” and says that it was also used on “A Complete Unknown,” “Deadpool & Wolverine,” “Sonic the Hedgehog 3” and series “Apples Never Fall.”
Jennie Zeiher, president of Rising Sun, acknowledged that “A Complete Unknown,” the best picture-nominated Bob Dylan biopic, and “Deadpool & Wolverine” did utilize Revize but declined to offer additional details.
For “Furiosa,” Rising Sun used the process on an estimated 150 shots to steadily transition the character Furiosa from child (actor Alyla Browne) to adult, played by Anya Taylor-Joy. “We built controls that the artists could use … to essentially dial in the exact specific look and very quickly iterate,” Rising Sun’s machine learning 2D supervisor Robert Beveridge explains, adding, “It was a real fine balance of not introducing too many of [Taylor-Joy’s] sharp adult features when we had this younger actress playing her.”
The toolset’s first use, says Zeiher, was on Baz Luhrmann’s “Elvis,” to place Austin Butler into old Elvis footage in select shots. Zeiher notes that overall, the tools aim “to make [the VFX team] more efficient, to put the money on the screen.”
AI startup Metaphysic’s toolset is also nominated in the VES emerging tech category. It was used to age and de-age Tom Hanks and Robin Wright in Robert Zemeckis’ “Here,” and to bring the likeness of the late actor Richard Carter, who played the Bullet Farmer in 2015’s “Mad Max: Fury Road,” to actor Lee Perry, who played the role in “Furiosa.” Both movies were shortlisted in the VFX race.
Metaphysic tech was additionally used on VFX-nominated “Alien: Romulus” to help create the likeness of the late Ian Holm, who appeared in 1979’s “Alien.”
AI tools can also be found in widely used content creation software such as CopyCat, a feature in compositing system Nuke, which was used on “Dune: Part Two.” In that case, a machine learning model was used to identify and replicate the blue tone in the eyes of actors playing the Fremen, and in doing so saved “hundreds of hours” of work, according to the VES entry.
When “The Brutalist” was identified as having used AI in post, director Brady Corbet issued a statement, part of which explained that AI audio technology Respeecher “was used in Hungarian language dialogue editing only, specifically to refine certain vowels and letters for accuracy. No English language was changed.” He added of the film’s Oscar-nominated Adrien Brody and Felicity Jones, “Adrien and Felicity’s performances are completely their own.”
Respeecher is also identified in the “Emilia Pérez” end credits. Another AI-driven tool, AudioShake, contributed to isolating opera singer Maria Callas’ vocals in 1960s recordings, which were used in the mix for the Callas biopic “Maria.”
As a growing number of tools involve AI, the range of uses for such tools varies greatly. What can’t be denied is that it is becoming increasingly difficult to tell what has been touched by the tech, and how it was used. “There should always be truthfulness,” maintains one veteran VFX branch member who requested anonymity. “Awards decisions should be made knowing what the human artist did to achieve the results. And using new tools in innovative ways that pave the path forward for everyone else is a big contribution.”
“It’s important not to lose sight that this is about what supports the story,” the source adds, noting that where actors are concerned, “it’s never been possible to get a great digital performance that wasn’t based on a human actor … Honoring what all the crafts do together is what the season is about. I think any person in any craft will say it’s collaboration. That’s always going to be the case.”