Over the course of an acting career that spanned more than six decades, James Earl Jones’ voice became an indelible part of his work as a performer.
On screen, Jones, who died Monday at 93, brought to life a reclusive writer coaxed back into the spotlight in “Field of Dreams” and a haughty king of a fictional land in “Coming to America.” On stage, he won two Tony Awards for “The Great White Hope” and “Fences.” His work as a voice actor — the regal dignity of his portrayal of Mufasa in “The Lion King” and the menacing, deep timbre he lent to Darth Vader in “Star Wars” — helped cement his place as a legendary actor among generations of fans.
But in the wake of his death, an aspect of Jones’ career has come to the fore: consenting to the use of artificial intelligence to replicate his performance as Darth Vader after he stepped away from the role. Skywalker Sound and the Ukrainian company Respeecher used AI to recreate Jones’ villain for the 2022 show “Obi-Wan Kenobi” on Disney+. Mark Hamill’s voice was also “de-aged” using Respeecher for his appearance as Luke Skywalker in “The Mandalorian.”
Voice actors say they fear AI could reduce or eliminate job opportunities because the technology could be used to replicate one performance across any number of other projects without their consent — a concern that led video game performers with the Screen Actors Guild-American Federation of Television and Radio Artists to go on strike in late July.
To some, Jones’ decision to allow AI to replicate his voice raises questions about voice acting as an art, but it also potentially helps lay the groundwork for transparent AI agreements that fairly compensate an actor for a performance given with consent.
Zeke Alton, a voice actor and member of SAG-AFTRA’s interactive media agreement negotiating committee, said it’s “amazing” that Jones was involved in the process of replicating his voice.