On Actors and Deepfakes

How professional actors can win using technology, and what to guard against.

Russell S.A. Palmer
10 min read · Apr 11, 2022

Last month, a deepfake video of Ukrainian President Volodymyr Zelensky delivering a message of surrender was uploaded to a hacked Ukrainian website. It was the first real attempt at using a deepfake for terrorism and political gain. There will be many more to come.

Citizens were rightfully terrified and urged their governments to restrict the use of AI-generated “synthetic media”, though outright bans might not even be possible or effective [1]. In fact, the best way to fight back against higher-quality attempts will be actors taking control of powerful AI tools while defending their own likenesses.

Deepfake image of Ukrainian President Volodymyr Zelensky
Deepfakes can be used for bad; could they also be repurposed for good?

Is this new? Should we be alarmed? Or does society grow accustomed to fakes, just as it did with photoshopped images and fake news? Do we ban the printing press or the Adobe suite? In a few years, people will be able to make deepfakes quickly and easily on their phones. The technology can’t be completely controlled, but it can be monitored and put to good use. The term already carries negative connotations from misuse, which is why the work is lately being rebranded as “digital humans” and “virtual acting” performances.

Robin Williams’s 1993 character in Mrs. Doubtfire used a transformative disguise as its premise

Costumes, make-up, and wigs have been a part of performances since antiquity. The ancient Greeks didn’t allow women on stage, so men would dress up for the roles. Centuries later, filmmakers used visual effects (VFX) to create the impossible [2]. With the advent of digital film, Computer-Generated Imagery (CGI) expanded the realm of possibilities — prominently portraying dinosaurs in Jurassic Park — with a quality level that still holds up on screen after 30 years.

If audiences don’t need to strain their imagination, it’s easier to get lost in the story.

Deepfakes take CGI and imitation to the next level, powered by AI and “deep learning” (hence the portmanteau “deepfake”). Previously, only rare and talented creators at shops like ILM could produce this kind of CGI and “digitally-portrayed” acting, and only for expensive blockbusters from the biggest studios. On powerful new computers, these digital artists painstakingly modeled each frame. Directors and producers worked with them closely to find the right shots, a new workflow pioneered by technologists like Steve Jobs, Ed Catmull, and John Lasseter at Pixar.

AI now leverages video data to recreate people in scenes they never performed. Soon almost anyone will be able to produce hyper-realistic, camera-quality video of people saying and doing anything, and Big Tech and the authorities are already monitoring and removing some of it from the Internet. Longer content, however, tends to suffer from the “uncanny valley”: our brains recognize and recoil from fakes. Many films have run into this limitation, for example The Polar Express and Beowulf.
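
For the technically curious, here is roughly how the underlying trick works. Most open-source face-swap tools build on a simple autoencoder idea: one shared encoder learns a general representation of faces, a separate decoder is trained for each identity, and swapping decoders at inference time renders one person’s expressions with another person’s face. The PyTorch sketch below is purely illustrative; the layer sizes, the 64x64 face crops, and the variable names are our own assumptions for readability, not the code of any particular tool.

```python
# Minimal sketch of the classic face-swap autoencoder behind most "deepfake"
# tools: a shared encoder plus one decoder per identity. Sizes and training
# details here are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 128, 8, 8)
        return self.net(x)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# Training (omitted) reconstructs person A's faces with decoder_a and person
# B's faces with decoder_b, both through the shared encoder, using a pixel
# loss such as L1. The "swap": encode a frame of A, decode it as B.
frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for an aligned face crop
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

The shared encoder is the whole point: because both identities pass through the same representation of pose and expression, swapping the decoders transfers one person’s performance onto the other’s face.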

Will the state of the art improve for the benefit of Hollywood and society? Can actors and filmmakers really benefit from this technology? Can actors use their own likeness and data to help detect and flag fake videos? We think so, and hope AI can help bring positive impacts with this powerful new tool.

Single-image deepfakes using “few-shot learning” (article for Wired Magazine)
Living Portraits — Deepfakes from single images in the public domain (Egor Zakharov)

Recently, on the set of Rust, starring Alec Baldwin, an accident involving a prop gun caused the death of cinematographer Halyna Hutchins and injured director Joel Souza. Production was suspended [3]. Many were reminded of the untimely death of Brandon Lee on the set of The Crow.

Productions sometimes require action sequences and dangerous stunts that are critical to the plot. Unfortunately, the history of cinema is haunted by incidents in which brave stunt actors have lost their lives [4]. Deepfakes can help synthesize these scenes, reducing the need to perform risky maneuvers on camera. Just like the famous example below of an actor imitating Tom Cruise, one person’s face can be put on another [5] for up-close stunt re-enactments, and soon it won’t even require a body double. Actors can use their likeness to enact stunts without putting themselves in harm’s way.

Very realistic Tom Cruise Deepfake | AI Tom Cruise (YouTube)

Actors can record their lines as usual, then tweak and apply digital changes however they need. Anyone can play any character, and this is powerful for diversity and the democratization of the industry. It was not always possible for everyone to play anything, despite the desire and talent to do so. In theater, barriers were broken with the casting of the 2015 hit musical Hamilton. Soon gatekeepers won’t have any reason not to cast diverse portrayals, since likenesses can be blended in any way needed to improve believability. Many actors may prefer not to use AI for years to come, and that’s fine too.

Deepfakes can allow actors to play a variety of roles and scenes, digitally faking the ones they are not comfortable with, including both stunts and romantic scenes [6]. The movies will be better for it, showing the real actor’s likeness and mannerisms while still matching the artistic vision of the picture. Some actors have refused nude scenes, asking instead for the script to be edited or a body double to be used, or have even been forced to turn down roles they weren’t comfortable with. This doesn’t benefit anyone, and deepfakes can help.

Unfortunately, this technology also makes it possible to generate unsanctioned, illicit, synthetic X-rated content, especially of famous actors, who already have hundreds of hours of digital video publicly accessible to the world. This is something for everyone to watch out for, and it is already on the radar of authorities. Governments are enacting laws prohibiting the non-consensual sharing of others’ intimate images.

As with political deepfakes, however, the best way to detect and remove this content from the public Internet will be a form of AI automation that can flag it automatically across the Web. Unlike other dangerous and powerful technologies such as nuclear weapons, software is too portable for governments to fully monitor or keep out of the hands of bad actors. It’s inevitable that some parties will continue using it unrestricted, so we need the technological development and skills to fight back.
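
To make that “AI automation” concrete: a common research baseline is a frame-level classifier that scores extracted video frames as real or fake, fine-tuned from a pretrained image backbone, with humans reviewing whatever gets flagged. The sketch below uses PyTorch and torchvision (assuming a recent version with the weights API); the frames/real and frames/fake folder layout, the hyperparameters, and the review policy are illustrative assumptions, not a description of any platform’s actual moderation pipeline.

```python
# Minimal sketch of a frame-level deepfake detector: fine-tune a pretrained
# image backbone as a real-vs-fake binary classifier. Paths and settings are
# illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.models import resnet18, ResNet18_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"

# Expects extracted video frames laid out as frames/fake/*.jpg and frames/real/*.jpg
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("frames", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained backbone with a new 2-way head (alphabetical classes: 0 = fake, 1 = real)
model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):  # a few epochs is enough for this sketch
    for frames, labels in train_loader:
        frames, labels = frames.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(frames), labels)
        loss.backward()
        optimizer.step()

# Flagging policy: average the per-frame "fake" probability over a clip and
# send anything above a threshold to a human reviewer rather than auto-removing it.
```

Real detection systems layer more on top of a per-frame score, such as face detection and cropping, temporal models over whole clips, and provenance or watermark checks, because single frames are easy to fool.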

Going Back In Time and Around The World

Life happens. Filming begins, and imagine that during production an older actor passes away and can’t finish their work and on-screen legacy. Do you think their last wish would be to scrap the picture, or would they want production to finish so the world could see their final acting role?

Paul Walker died in a car crash before Furious 7 was finished, so his brothers honored his acting legacy by standing in as body doubles for the final un-shot scenes, with his face digitally recreated (especially effective in this case because of their physical likeness). Carrie Fisher passed away during production of a recent Star Wars film. CGI and deepfake-style techniques helped finish their final work. Similarly, as actors naturally age they become limited to certain roles, yet prominent actors like Robert De Niro, Will Smith, and Arnold Schwarzenegger have taken on previously unimaginable new roles through AI de-aging used to synthesize scenes for The Irishman, Gemini Man, the Terminator films, and more.

AI overcomes the dreaded “uncanny valley” by having the actors themselves perform as their own body doubles, saying their lines authentically while AI and CGI render them in a younger portrayal. Movie plots from Tron: Legacy to Gemini Man have been made possible by this incredible idea, featuring the present-day and de-aged versions of the same actor on screen at the same time, interacting with each other.

De-aging Robert De Niro (The Irishman — Martin Scorsese)

The hit show Euphoria recently received some criticism for its portrayal of teenagers and drug use. Actors are sometimes asked to enact fictional scenes of rape and violence, which can be challenging for impressionable young actors to film in person.

In some cases we both want and need to tell graphic and uncomfortable stories, and this deepfaking capability should be available to actors when they prefer it. Tools need to benefit the user, and we think AI can be a great tool to help new actors create great films. Not everyone gets job offers, especially new or struggling actors fighting to break into Hollywood, who might feel forced to accept roles with scenes they are not comfortable filming themselves. Virtual acting can help them land a wider range of roles and feel more comfortable on set while building a body of great artistic work.

Finally, we also have the option to show our films in other languages for audiences worldwide. Not all directors will want to change the original-language audio, and they will continue using subtitles; that’s fine and sometimes what’s best for the movie. Commercials can already be effectively localized, with an authentic actor representation [7], to any language in any country with no re-shoots (and even with different brand names). This is another great benefit for the media industry and will become a powerful revenue generator for studios and media companies around the world.

Actors, politicians, and others with public video content are subject to such use today

Not everyone can have a perfect deepfake made of them. There are lower-quality AR apps that paint your face onto others (e.g. Snap’s virtual masks, and filters on TikTok or Zoom). Deepfakes of Tom Cruise are possible today (without consent or legal recourse) because many hours of his video data are readily available to the public.

So are you safe from deepfakes? Mostly yes, for now, but public figures are going to be subject to the positives and negatives above. We hope to help aspiring actors and established stars alike profit from this game-changing new technology.

Read Part 2 for more actionable insights on how actors can benefit from and protect against deepfakes.

References

Inline images and video:

  • Top picture from the New York Post
  • Middle image and GIF from Wired Magazine and LIVESCIENCE (Samsung Research Moscow: Egor Zakharov, Aliaksandra Shysheya, Victor Lempitsky)
  • Video explaining how deepfakes were used in The Irishman, Netflix (YouTube)
  • “Very realistic Tom Cruise Deepfake | AI Tom Cruise”, Vecanoi (YouTube)
  • Bottom pictures (YouTube, Wired Magazine)

[1] Chapter 5 “Security and World Order” — The Age of AI And Our Human Future (Kissinger, Schmidt, Huttenlocher)

[2] Special Effects (SFX), Visual Effects (VFX), CGI, and more

MOVIES INSIDER — “7 Classic Movie Tricks That Led To Modern CGI” (INSIDER)
The witch transformation in “Sh! The Octopus”, an analog special effect in a 1937 movie using the Karl Struss technique
Stunts using a variety of prop and camera tricks (film pioneer Buster Keaton)

[3] “Rust” shooting incident involving Alec Baldwin and a prop gun

[4] 20 Movies and TV Shows Where Stunt Actors Died During Filming (Newsweek)

[5] Tom Cruise deepfake (Chris Umé — Metaphysic.ai)

[6] Actors suffering from graphic film scenes

More:

Actors refusing nudity and sex scenes

[7] Commercial for JustEat feat. Snoop Dogg using deepfake video and audio translation technology, powered by AI and Deep Learning:

“3 New Ways Artificial Intelligence Is Powering The Future Of Marketing”, NitroBots (nitrobots.ai)

Copyright © 2022 CYBERFILM.AI CORPORATION


Russell S.A. Palmer

CEO of CyberFilm AI in SF. From Toronto, Canada. AI PM for 15 years across Silicon Valley at Microsoft, Viv Labs, Samsung, and JPMorgan Chase.