‘Avengers: Endgame’ VFX Supervisor Dan Deleeuw On Crafting Smart Hulk & De-Aging Michael Douglas For Marvel’s Latest

'Avengers: Endgame' VFX Supervisor Dan Deleeuw
Jordan Strauss/Invision/AP/Shutterstock

A two-time Oscar nominee who has been working on Marvel pics since 2013, visual effects supervisor Dan Deleeuw found some of the most complex blockbuster work of his 26-year career in Joe and Anthony Russo’s Avengers: Endgame.

Debuting in April, and going on to rack up nearly $3 billion at the worldwide box office, the star-studded superhero film opens on a universe thrown into chaos, following the disastrous events of Avengers: Infinity War. Facing off with Mad Titan Thanos for the final time, the remaining Avengers assemble to restore order to their world.

The culmination of a 23-film story arc that began with Iron Man in 2008, Endgame also represented, for Deleeuw, the pinnacle of VFX ambition in the MCU. Since the release of Iron Man, “I think virtually everything has changed. You’re at the point where you’re coming out of the uncanny valley, when you’re working on digital characters,” Deleeuw reflects. “Now, in the case of Smart Hulk and Thanos, you’re able to find that very minute detail that brings a character to life.”

'Avengers: Endgame'
Film Frame

Below, Deleeuw describes the process of crafting Smart Hulk—an uncannily handsome new iteration of Mark Ruffalo’s green giant—de-aging Michael Douglas, and putting together Endgame’s climactic battle sequence.

DEADLINE: Avengers: Endgame has been a momentous feature for Marvel, representing a major turning point in the MCU. What excited you about tackling the project?

DAN DELEEUW: This film was the culmination of all the other films, so from a storytelling standpoint, you had the ability to send off some of your favorite characters. Then, from my perspective, in terms of visual effects, we also got to use every superhero in the MCU, so you’re able to imagine those powers, and how those powers would interact, and they culminate in what was the final battle. Then, you were also able to invent new characters, with Smart Hulk, and really pursue the ultimate dream of comic book fans, and bring everything together in a fulfilling and fun way, as the series came to a close.

DEADLINE: How exactly was the Smart Hulk character brought to life? I don’t think I’ve ever seen an actor’s physicality captured so effectively, in the creation of a digital character.

DELEEUW: Because we knew it was going to be a blend between Mark Ruffalo and the Hulk we’ve seen in the other Avengers films, one of the first things we did was incorporate quite a bit of the proportions of Mark’s face into the digital sculpt, so he did have that big, Hulky feel, but you could see Mark in him. It was one of those funny things, when we saw the first turntable of the head: everybody just thought it was uncanny how handsome he was.

Mark Ruffalo in 'Avengers: Endgame'
Film Frame

The technology behind that is a system called Medusa, which Disney Research in Zurich developed, and what it allows you to do is capture, frame by frame, an incredible amount of detail of an actor’s face. You run the actor through different poses and different line readings, and that gives you the data you’ll pull from. Because with a character like that, you really want the actor on set. There are systems for having someone sit in a separate room and deliver the dialogue, but that extracts them from the performance. So, it’s a combination of technology and art at the same time.

What we were able to do is refine the technology, so that we could actually have Mark on the set, performing against the other actors. A big part of acting is listening, and responding to the actors you’re with. So, what that did is, it gave you a performance at a level you wouldn’t normally get if it were just something done on a motion capture stage.

So, what we did is, we used the high-resolution data from the Medusa scans to augment what we captured on set. On set, you have a helmet cam. Basically, Mark will wear a motion capture suit with the little balls on it, and he’ll have a helmet on that has a little camera pointed at his face, and then we’d dot his face. There are about 100 dots on his face, but that’s never enough detail, so the software will basically solve how those dots move. Then, as a guide, it’ll go back and use that really high-resolution data, so it can actually see how Mark’s skin stretches and folds, and how the corner of his mouth curls.

Then, there’s another piece of software that will take that information and apply it to your sculpt of Smart Hulk, so it’s able to get very, very small movements. So, in this method, you’re able to capture how the face moves, dependent on how the underlying muscles of Mark Ruffalo’s face are moving.

DEADLINE: This year, in visual effects, Martin Scorsese’s The Irishman has sparked a lot of conversation about de-aging actors through digital methods. You de-aged Michael Douglas, and aged up Chris Evans for Endgame, but I imagine you employed different methods than Scorsese’s team did.

DELEEUW: Lola [Visual Effects] is the company that we work with to do that, and the entire process is very dependent on the actor’s performance within the scene. So, you’re always starting with the actor’s performance and augmenting it, either by aging or de-aging them.

With Michael Douglas, you’ll have Michael run the scene, and then you’ll have a young double that will play the scene as close as they can to what he did. You take Michael’s face, kind of distort it, change the shape to where it matches close to what you want him to look like when he’s younger, and then you’re able to digitally graft the younger skin onto Michael Douglas’s face.

Basically, when you’re de-aging someone, you’re able to say, “Okay, do you want Romancing the Stone Michael Douglas, or China Syndrome Michael Douglas? Or do you want Streets of San Francisco Michael Douglas?” Then, the director’s like, “Oh, I want Streets of San Francisco Michael Douglas.” So, you go in, you’re able to look at all that reference, and you’re able to make judgments about what you need to do to his face to bring it back to that. But it always goes back to the performance. At no point is the performance ever replaced by a CG model or facsimile of what the actor’s doing.

'Avengers: Endgame'
Film Frame

It’s the same thing with Chris Evans, who was on the opposite end of it, when we made him older. We had makeup for his neck, because that’s a little bit harder to do with a digital technique. But Chris Evans was there, and then we had an old-age double that would perform the scene as close as he could. Basically then, you’re taking the wrinkles and the liver spots, and the detail of the older skin, and digitally remapping that texture onto Chris Evans’ performance.

DEADLINE: How did you manage the challenge posed by the massive action sequence toward the end of the film, in which characters from many Marvel worlds converge through portals for battle?

DELEEUW: It starts with a roadmap and pre-visualization. So, we’ll work with the folks at Marvel—the directors, the picture editor—and choreograph the entire ending of the movie in an animatic form. You work from the script, get a lot of previz artists, and just turn them loose. It’s a fun experience to work with all the different characters, and what’s interesting at Marvel is that you have a library of all the characters you can pull [from]. I describe it as going to Toys “R” Us, driving your shopping cart down the aisle, and dumping all the toys [in].

So, it was something that we knew was going to be big, but in our having fun with it, it turned out a lot bigger, I think, than any of us really expected before we got deeply into it. At the end of the day, it’s like, Oh, this is great, and now you have to figure out how to do it. That starts with a roadmap, and then from the previz, you’re able to break it down and understand, “Okay, this will have to be something that’s live-action photography,” or “This is something that will be CG.”

There’s a shot in the final battle [in which] there’s a big oner. It’s something that you see in other Avengers films—a big, continuous shot, with all the characters charging at each other—and it’s one of those things where you look at the original footage and it’s somewhat silly, because you have all your heroes charging at basically an open field of six stunt people in motion capture outfits. You’re like, “Oh, that’s not impressive.” [laughs] But then you go back in.

We worked with Weta [Digital] on this. They’ve got software that will simulate giant battles, and you can define behaviors for the different characters. So, if they’re attacked by someone with a laser gun, they’ll act a certain way; if they’re attacked with an ax, they’ll act a certain way. There’s a little bit of artificial intelligence involved in that, and by giving them different animation cycles, you get a very complex battle going on. Then, we’ll intermix that with live-action, because you always want to have actors in there that give you the anchor to reality, and make sure you keep the bar where it needs to be to look real.

DEADLINE: On Endgame, you also served as a second unit director, and you’re currently at work in this capacity on Marvel’s Eternals. What have you enjoyed about this new role?

DELEEUW: It’s been great because you’ve got the experience of working with Chloé Zhao, the director. She’s great, and she’s got a really great vision for the film. [I’m] working with the actors more, one-on-one, which is really great, and getting the opportunity to see all the great action sequences and set pieces for the film. It’s really great being in a position where you can help design the action sequences, working with the director, and having a little bit more control of that, before you get into the visual effects portion of the pipeline.
