“It was time to try a digital human,” says Ang Lee of the steep challenge of creating a young Will Smith, as ‘Gemini Man’ and Scorsese’s ‘The Irishman’ push new boundaries of VFX, budgets and, say some, ethics.
While directing Will Smith in Gemini Man, in which the 51-year-old actor stars as an assassin hunted by a clone of his younger self, Ang Lee made an unusual request of his star: He asked Smith to “act less.”
Lee needed Smith to go back to his less-polished acting roots from the early 1990s in order to capture the performance for his younger clone. But to make Smith look like his youthful self required a whole new level of trickery that saw Lee and his visual effects team create a fully digital CGI 23-year-old Will Smith.
The result: On Oct. 11, audiences will see a Fresh Prince-era Smith trade punches with his present-day self. A few weeks later, septuagenarian screen legends Robert De Niro and Al Pacino will perform together as younger men in Martin Scorsese’s gangster epic The Irishman. As visual effects technologies advance, filmmakers are rethinking the potential of digital humans, particularly as a tool for de-aging actors.
While crafting a believable synthetic human is among the most difficult feats of VFX wizardry, Hollywood saw the possibilities a decade ago when an elderly Brad Pitt aged backward into his youthful prime in David Fincher’s The Curious Case of Benjamin Button. The work won the VFX Oscar that year, but the challenge of aging an actor up or down remained so daunting that it was rarely attempted outside of limited and specific story needs.
In 2019, nostalgic audiences are seeing several stars appear as their younger selves thanks to a range of VFX techniques, including Samuel L. Jackson in Captain Marvel, Robert Downey Jr. in Avengers: Endgame and Linda Hamilton in the upcoming Terminator: Dark Fate, as she returns to the franchise after 28 years (2015’s Terminator: Genisys likewise featured a de-aged Arnold Schwarzenegger).
But to de-age by creating a synthetic human is still largely uncharted territory, and top VFX artists are using various techniques that present challenges and opportunities for directors, effects artists and even the actors themselves. Upon seeing his digital younger self for the first time in The Irishman, ILM VFX supervisor Pablo Helman says De Niro told him, “You just gave me 30 more years of my career.”
Scorsese knew he needed to wield the full capacity of de-aging magic in order to make The Irishman the way he wanted: that is, with his three leads — De Niro, 75, Joe Pesci, 76, and Al Pacino, 79 — playing their characters through the decades that the story spans. But motion-capture methods of creating an onscreen digital human couldn’t be used on the three veteran actors. “Marty said to me, ‘One thing I know for sure — Bob’s an actor’s actor, Pacino and Pesci as well. They’re not going to wear a helmet with two little cameras and markers all over their faces,’ ” says Helman.
This led to a bold initiative at ILM to develop its performance-capture capabilities so that actors would not have to wear markers on set. The resulting system, used on The Irishman (which Netflix made for $159 million), involves a three-camera rig consisting of a main camera and two witness cameras, along with companion software.
“We had taken the technology away from the actor and let the director and the actors do what they need to do,” Helman explains. He adds that stars such as De Niro and Pacino, in particular, like to act opposite each other and improvise. “That kind of interaction can’t be done in the moment when you have one actor acting against a tennis ball,” he contends. “We didn’t alter any performances. There were changes that were made to the appearance but not the choices they made in the bodies and also in the faces.” Each finished shot was then reviewed by Scorsese. “He would tell us if he felt the same way as he did when he selected the take, and if it would work for the movie.”
For Paramount’s Gemini Man, made for $138 million (plus rebates), Lee took digital human work into a whole new realm. The VFX supervisor, Bill Westenhofer, explains that as the younger and older Smith had to appear together in the same shots, other VFX techniques simply were not an option.
“I believed it was time to try a digital human,” Lee says. “You had to build the character, the detail and really study human details and the performance from our actor. I believe that’s what you have to do if that’s your lead character.”
VFX house Weta gathered images of Smith at a younger age and studied his facial anatomy, down to details such as the nasolabial folds. “If anything isn’t right, it falls apart,” says Guy Williams, Weta’s VFX supervisor. “We did a deep dive into how light interacts with skin and creating pigments under the layer of skin.”
For shots in which Smith appears with his young clone, Junior, the actor performed first as Henry, with a reference actor of similar physicality playing opposite him as Junior. Then Smith performed Junior’s role on a motion-capture stage opposite a reference actor playing Henry. In scenes in which Henry and Junior are not both in the frame, the team would photograph Smith wearing a facial-capture system and then perform digital face replacement on his body. Action sequences involved fully digital doubles based on stunt performances with face replacement.
Westenhofer says that while getting the eyes right is important to overcome the uncanny valley, every element of the face and body has to be spot-on. “We had in our favor that Will is pretty healthy and still moves pretty youthfully. Making sure the youthfulness came through in the body was a consideration throughout.”
Costs can vary. At the moment, a fully digital human generally starts with the creation of a movable model of the human, explains Darren Hendler, head of VFX house Digital Domain’s digital human group. He estimates that this could cost from $500,000 to $1 million to create. Then, he adds, producers could expect to pay anywhere from $30,000 to $100,000 per shot, depending on the individual requirements of the performance in the scene. VFX pros point out that costs will drop as computers get faster and techniques evolve.
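Hendler’s ballpark figures imply a simple budget arithmetic: a one-time model build plus a per-shot charge. As a rough illustration (the shot count below is hypothetical; the dollar ranges are the ones quoted in the article, not actual studio budgets), the spread between the low and high ends grows quickly with the number of shots:

```python
# Rough cost sketch for a fully digital human, using the ballpark
# figures Darren Hendler gives: a one-time movable model build of
# $500K-$1M, plus $30K-$100K per shot. All numbers are illustrative.

def digital_human_cost(num_shots: int, model_cost: int, per_shot_cost: int) -> int:
    """Total = one-time model build + per-shot performance work."""
    return model_cost + num_shots * per_shot_cost

# For a hypothetical character appearing in 300 shots:
low = digital_human_cost(300, 500_000, 30_000)     # cheapest model, cheapest shots
high = digital_human_cost(300, 1_000_000, 100_000) # priciest model, priciest shots

print(f"low end:  ${low:,}")   # low end:  $9,500,000
print(f"high end: ${high:,}")  # high end: $31,000,000
```

The fixed model cost matters less as the shot count grows; at scale, the per-shot rate dominates the bill, which is why falling per-shot costs are what VFX pros expect to make the technique more common.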
Because of the cost and complexity of creating a digital human, filmmakers often instead use so-called digital cosmetics, de-aging tasks performed on the actor’s actual image, such as removing wrinkles. This approach was used to de-age Downey and Jackson in Marvel’s Avengers: Endgame and Captain Marvel.
These capabilities raise important ethical questions: When is it appropriate to use an actor’s likeness, and what rights does an actor have to his or her own likeness? That conversation intensified when the estate of the late Robin Williams put restrictions on the use of his digital likeness, an unusual move.
Westenhofer believes these are discussions that will need to happen, including how likenesses are used in deepfakes. “For us to do this, it took a team of several hundred artists two years to pull off. We are not close to someone going in their garage and completely fooling someone,” he says.
And then there are questions about how digital humans could affect acting opportunities: actors who would otherwise be hired to portray younger versions of lead characters may lose those roles. Still, Westenhofer is optimistic that digital humans could lead to new stories Hollywood hasn’t yet considered. He says, “Our role is to show that all of these things are possible and allow incredibly talented people with these great imaginations and storytellers to come up with things that we haven’t thought of yet.”