Sci-fi sequel “Blade Runner 2049” harks back to the 1982 original in lots of subtle ways. But the most striking callback has to be the return of one of the original movie’s major characters, looking exactly the same despite the passage of 35 years, thanks to cutting-edge visual effects.
If you haven’t seen the new film, be warned: we’ll be discussing the actor behind the evocative cameo, though we won’t get into spoilers for the story itself.
The cameo appears in a pivotal scene in “2049” in which we come face-to-face with the character Rachael, looking as young as she did when we first met her in the 1982 film. The scene involves a digitally de-aged version of original “Blade Runner” star Sean Young. Following similar CG cameos from de-aged actors in other recent films, it’s the latest and most polished example of an actor being recreated in computer-generated form.
Speaking to me over email, Young acknowledged that the possibility of her image being conjured without her involvement was “a little nerve-wracking.” But she was glad to be included in the project, and enjoyed the chance to spend time in Hungary, the location for “2049” and her current TV show, “The Alienist.”
To create the digital Rachael, the filmmakers turned to MPC, the London-based effects company that digitally de-aged Arnold Schwarzenegger for “Terminator Genisys.”
MPC’s process began by collecting photographs, film scans and other images of Young, both from the original movie and from others of the era, including “Dune.” “It’s basically research, research and research,” explained Richard Clegg, the visual effects supervisor who oversaw MPC’s work on the scene.
But there’s only so much you can glean from old footage. “For example,” said Clegg, “you can change the upper eyelashes by a fraction of a millimeter and that will significantly affect the way the light bounces around inside the eyeballs. So precision is really, really important. You can’t just extract that from looking at footage from the 80s.”
That kind of hair-fine precision is important because we’re so attuned to reading faces, and eyes in particular. “When you’re conversing with someone, there’s so much that happens subconsciously,” said Clegg. “So what’s the psychology behind that? What is it that we’re noticing? Why does it feel off? We spent a lot of time just studying and analyzing those things. We’re trying to understand the physics of light on the face, and all the subtle nuances on the skin or the way muscles move.”
Humans are so closely tuned to reading these near-invisible cues in each other’s faces that we can spot an artificial face a mile off — an effect known as the “uncanny valley.”
“You can hit uncanny valley super quickly,” said Clegg. “If it’s not 100 percent right, it’s 100 percent wrong. It doesn’t inch toward getting better. You’ve either nailed it or everybody hates it.”
The many reference images and footage from the first film were used to construct a digital model of Rachael’s head, going into microscopic detail on the pores, fine facial hair and makeup. For extra reference, Young was brought in to be photographed with a special lighting rig. “It’s quite the experience,” she said. “You sit in the center of a globe of maybe 75 cameras, and the cameras are timed to go off in a spiral.”
Obviously in 35 years, anyone’s skin and face change, but one thing that doesn’t change as much is your skull. So the effects team started with Rachael’s skull and worked outward — which kind of echoes the plot of the film, funnily enough.
The digital model of Rachael was built using 3D computer animation software Maya, one of the industry’s key tools. Surface texture was added using painting program Mari. The 3D model was then turned into a two-dimensional image, or rendered, using RenderMan software originally developed by “Toy Story” studio Pixar. An application called Katana took care of the lighting, and compositing software Nuke combined the digital Rachael with the real footage.
To see how things were progressing, the team inserted the CGI head into scenes from the original film. It then challenged director Denis Villeneuve and visual effects supervisor John Nelson to spot which was the real Sean Young and which was the digital version. “John got it — but he had to think about it,” laughed Clegg.
Young joined Villeneuve on set for a day to watch stand-in actor Loren Peta shoot the scene with Harrison Ford, Jared Leto and Sylvia Hoeks. The dialogue was locked down so the effects team could work from the audio track without having too many variables. Even so, and despite the fact it was a single element of a relatively short scene, a team of artists working in London, Bangalore and Montreal spent 10 months getting it right.
MPC’s digital head was then added over Peta’s. The team also made a few other tweaks, like narrowing her shoulders to “Rachael-ify” her.
Clegg allowed that it’s a weird feeling to labor so hard over something designed not to be noticed. “When I watched the film in the theater, my heart was racing a little bit when I knew it was coming up, because I was interested in how people would react. That’s why I was relieved to hear there was essentially no reaction.”
More and more digital doubles and de-aged actors are appearing in films. Following the digital recreations of Peter Cushing and a young Carrie Fisher in “Rogue One,” we’re heading for flicks like Martin Scorsese‘s forthcoming mob movie “The Irishman,” which will digitally de-age Robert De Niro and Al Pacino. Will we soon see a live-action film cast entirely with photorealistic digital actors?
“We’re getting closer and closer all the time,” said Clegg, who believes the industry is growing in confidence. “But it’s still really time-consuming. One of the hardest things to do, in my mind, is still the animation side of it, to get a performance, because that’s the bit where our brains subconsciously are experts.”
For actors like Young, digital doubles present an interesting new quandary. “If [‘Blade Runner’ production company] Alcon wanted to create an entire show with me, or just use my image in any way whatsoever, the technology exists,” she said. “I must trust whoever has my image to behave respectfully. Do I have any rights? Or even if it was acknowledged that I had rights, could I enforce them?”
It’s interesting to compare the two “Blade Runner” films 35 years apart, especially as many of today’s effects experts at MPC and other companies were inspired in their youth by the original’s jaw-dropping effects. I wondered if Clegg was nostalgic for the old-school days of models, miniatures and mattes.
“There’s something nice about getting your hands on things,” he said. “But that is essentially what we’re doing. Hours of painting and sculpting and modeling went into Rachael’s head. It’s phenomenal.”