The Art Of Craft: VFX Supervisor Richard Bluff On The Game-Changing Innovations Behind ‘The Mandalorian’

“I think Season 1 really only scratched the surface of what was possible. We keep talking with Jon [Favreau] and [production designer] Andrew Jones about what could be possible with this technology, and it’s very exciting. We certainly have not reached our limits, even after shooting Season 2.” Richard Bluff

To craft an epic space Western on a TV budget, The Mandalorian eschewed location shoots and instead built a digital exterior volume on a soundstage.

The Volume, made up of 2,200 LED screens in a 20x70ft circle, displayed environments built from plates and digital assets and rendered in Epic Games’ Unreal Engine 4.

The video game engine, married with high-resolution environment scans, had advanced far enough to render photorealistic worlds in real time.

 The nascent StageCraft technology relies on age-old principles of rear projection and camera perspective, given a cutting-edge, modern twist.

The LED panels were bright enough to provide ambient lighting for the scenes being shot; actors and props blended seamlessly into the environments, even through dynamic lighting changes.

Real-time rendering meant camera movement could be matched by parallax shifts in the rendered backgrounds, creating a convincing illusion of depth.

Scenes could be shot on alien landscapes with little need for green-screen chroma-key stages. Post-production compositing work was considerably reduced.

Almost 60% of the scenes shot for The Mandalorian took place in the Volume. VFX supervisor Richard Bluff considers StageCraft a game-changing technology.

 A recent tech demo for Unreal Engine 5 suggests even further improvements to come. Could long post-production VFX workflows become a thing of the past?


For more from our conversation with Bluff on the making of The Mandalorian, read on. 


DEADLINE: Supposedly, The Mandalorian is the fulfillment of a longtime dream on the part of George Lucas. That must have made the series incredibly exciting to put together.

RICHARD BLUFF: Yeah, absolutely. George put a lot of time and effort into trying to develop his own live-action TV show many, many years ago, and of course with the Disney buyout, all that changed. I was involved in that original project, and during that time, George knew that in order to bring Star Wars to the small screen, he needed the technology to advance, particularly with the environments. The scale and scope of them in a Star Wars movie are extensive, so that’s how I got really engaged, because I was the supervisor of the Environments Department, globally for ILM, at that point.

When Disney bought Lucasfilm, I’d mentioned to a few folks at Lucasfilm that if we ever did continue to pursue a live-action Star Wars TV show, I would be interested. For me, it was the challenge of bringing the scale and scope of Star Wars to the small screen that really drew me to the project.

Of course, a few years ago, Jon was contacted by Kathy [Kennedy], and The Mandalorian started, and it was at that point that Lynwen Brennan and Janet Lewin of Lucasfilm put me forward as a consultant on the project, on behalf of ILM. From that moment on, I was basically helping to guide the project, and proving to Jon that I was his guy.

DEADLINE: Reportedly, in pre-production, Favreau put together a think tank of VFX experts, to figure out how he could make the show he envisioned. What was discussed in these meetings?

BLUFF: The group really consisted of the keys on the show. So, we obviously had Greig Fraser, the eventual DP, and he was somebody that Lucasfilm highlighted as having extensive experience utilizing LED screens. Jon, himself, was very keen on utilizing game engine technology, wherever possible. He had a number of ideas as to how he would take advantage of it for the show, so he was very keen for [Epic Games CTO] Kim Libreri—a friend of ILM—to get involved.

It was myself, Andrew Jones, Doug Chiang, the lead designer on the show, and Rob Bredow, and it was through those early discussions—and an understanding that I had from talking with Jon, as to what he was keen to pursue, in terms of new technology—that we started throwing around the idea of utilizing LED screens. And, of course, displaying real-time game engine environments on those screens…

There was also an early conversation about, is there a world where we utilize the game engine for post-production, final pixel as well? So, there were many different avenues we were pursuing. But the game changer that Jon was looking for, and that we were keen to pursue, was photographing LED screens with actors in the foreground, and convincing the audience that those actors were in that immersive environment.

DEADLINE: Can you explain the role game engines play in the VFX process? How did Epic Games’ Unreal Engine 4 enter into the conversation as a key piece of technology that would make the series possible?

BLUFF: Effectively, in order to display visuals on the screen and have convincing parallax, the visuals needed to be perspective-corrected to the tracked camera. That was really important. Now, if all we were doing was displaying a mountain range 50 miles away, off in the distance…If we were stood on the salt flats, then you wouldn’t need to perspective-correct that background, or it wouldn’t be as critical. You could simply display a photograph through some sort of graphical interface on the LEDs—pretty straightforward. It’s been done many times before.

But of course, what we wanted to do was solve for every environment, which also included interiors and smaller spaces. And to be able to do that, you need to move the taking camera around, and have the perspective shift on the LED screen. The only way you can do that, and ensure that the tracking is as in sync as possible, is to compute the visuals on the screen in milliseconds, and generally speaking, the best way to do that is to use game engine rendering. Because generally speaking, that’s what they do. And they’ve been doing it for many, many years.

It’s not the only way to do it, because there are other visual effects renderers that can compute in milliseconds, but we felt that we needed to build a production tool that went way beyond simply displaying pixels. We had to also save the data, retrieve the data, and track the cameras. We needed to be able to create physical light in the environments, move things around. So, the more you delve into what we would want to achieve, on the day, on the stage, the more you’re going to be leveraging a game engine renderer.
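
The perspective correction Bluff describes is, at its core, the long-standing "off-axis projection" problem: given the tracked camera's position and the corners of a (locally flat) section of LED wall, compute a projection that renders the virtual world exactly as it should appear through that panel. The numpy sketch below follows Robert Kooima's generalized perspective projection; it is a minimal illustration of the principle, assuming a flat screen defined by three of its corners, and is not ILM's StageCraft code.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def frustum(l, r, b, t, n, f):
    """OpenGL-style asymmetric frustum (projection) matrix."""
    return np.array([
        [2*n/(r-l), 0.0,        (r+l)/(r-l),  0.0],
        [0.0,       2*n/(t-b),  (t+b)/(t-b),  0.0],
        [0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n)],
        [0.0,       0.0,       -1.0,          0.0],
    ])

def offaxis_projection(pa, pb, pc, pe, near=0.1, far=1000.0):
    """Projection * view matrix for a flat screen seen from a tracked camera.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners (world space)
    pe:         tracked camera/eye position (world space)
    """
    vr = normalize(pb - pa)            # screen right axis
    vu = normalize(pc - pa)            # screen up axis
    vn = normalize(np.cross(vr, vu))   # screen normal, toward the viewer

    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -va.dot(vn)                    # perpendicular eye-to-screen distance
    l = vr.dot(va) * near / d          # frustum extents on the near plane
    r = vr.dot(vb) * near / d
    b = vu.dot(va) * near / d
    t = vu.dot(vc) * near / d

    P = frustum(l, r, b, t, near, far)

    # Rotate the world into screen-aligned space, then move the eye to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -pe
    return P @ M @ T
```

In a real volume, something along these lines has to be recomputed every frame from the camera-tracking data, per panel or per curved-wall segment, which is why the renderer must produce each new frame within milliseconds.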

DEADLINE: The set-up that you created, dubbed “The Volume,” boasts an incredible range of applications. From your perspective, what were some of the aspects that made it such a game changer?

BLUFF: I’ll speak a little bit about what some of Jon’s motivations were. When you examine projects that Jon’s been involved with in the past—and in particular, The Lion King—everybody’s well aware it’s an animated movie. But Jon wanted to create a show that felt more like a documentary. In order to do that, he built a system that allowed practical filmmakers to use preexisting camera and grip equipment to manipulate the camera. That camera was translated into the game engine as the taking camera, which would film all the previz, and then would later on travel downstream to do the final renderings for the show.

It’s very clear that Jon has a respect, and admiration, and appreciation for cinematography, and for a camera operator, and everything that they bring to a show, and to a shot—and in particular, to visual effects. So, one of the things I’ve heard from Jon, loud and clear, was that he wanted to utilize the LED screen technology [to] provide the camera operator and the DP the visual that he’s supposed to be composing against. If the spaceship, for example, is deep in the frame, displayed on the LED screen, and it’s an important compositional element, then we need to see it, the actors need to see it, and everybody on set needs to see it. And if something else is off-camera right—it’s taking off, and you see a shadow moving, or whatever it may be—and that’s creating wind that needs to be visible on the actors, then special effects will simply look at what’s happening on the stage, and position a fan accordingly.

Very straightforward ideas…But when you’re shooting against a green screen, it’s incredibly difficult for everybody to be on the same page. It largely comes down to guesswork, and typically, you can see when that occurs in visual effects movies and TV shows, when the composition isn’t quite right, or you feel like the composition is skewed more towards the visual effects in the background than it is the characters in the foreground. So, this technology was going to help with that.

Jon is very collaborative, and during the prep phase, when he’s working very closely with myself and others, he talks at great length about the artwork that we’d try to mimic on the show. He rightly points out that often, people have to try and remember conversations from nine months earlier, to try and execute something in post. And when you’re talking about a show that has over 4,000 visual effects shots, it’s understandable that things would be forgotten.

So, if we can build the environments and a set extension—the visual effects post-production work, if you like—during those initial conversations, while the entire crew is on the show, the DP can contribute to the lighting. Doug Chiang can contribute towards the overall composition; the production designer [ensures] that the physical set is a close match to what the content of the screen is, or vice versa. Therefore, what you end up shooting on the day should be exactly what the director, or the showrunner, or the creators have in mind. And by the same token, you’re not then having to translate conversations nine months later, [in terms of] what was intended.

So, these are the things that really motivated Jon to develop this technology—and of course, to provide the actors with a reference point as to where they are, what their motivations are. Kathy Kennedy herself pointed out, when she arrived on set for the first week of the Volume shooting, that all of the dailies that were shot in The Volume had a complete environment in the background. So, she could understand exactly where the scenes were taking place, rather than trying to guess where, and on what ship, a given green-screen scene was supposed to be taking place.

DEADLINE: To back up for a moment, can you explain what a traditional scenario for VFX rendering would look like, in comparison to the real-time rendering system your team devised?

BLUFF: Let’s just say that we are in a space hangar. Typically, we have a set that’s only a partial build, and green screen. The DP is probably referencing artwork with the rest of the crew, and trying to mimic the lighting that he imagines the environment would contribute, if that artwork and that set extension were visible beyond the green screen—and so, it’s largely guesswork. But the very best DPs, of course, can do a phenomenal job at matching—or driving, more importantly—what that post-production setting would look like.

So, we would take those plates, and that environment would get built many months after shooting wraps. Visual effects would build, and then start to light, and start to match it with everything that appears on set. Generally speaking, you’re using RenderMan, or V-Ray, or Arnold, or whatever rendering tool is out there, and you can tune those renders to be very, very quick, to get very quick feedback to the artists. But when you really want to increase the resolution, or the complexity of the lighting that’s being generated, then you’re going to have to wait a few hours, or sometimes overnight. The office, of course, produces that work for the supervisor and the director.

Now, what we’re doing is utilizing a game engine that is processing in milliseconds. So, a lot of times, for our show, we decided to pre-bake all the lighting ahead of time. The work that the artist is doing is very similar, in the sense that they are building, modeling and lighting. Then, once they feel like they’ve got the lighting in a good place, we present it to Jon, and he approves it, or gives us additional notes. Once that lighting’s approved, we start baking it to go into the game engine. That really consists of this: every surface has a texture on it, and has the contribution of a light on it. Whether it’s a shadow or an illumination, that all gets baked down into textures that replace the underlying, unlit textures.

Geometry and textures are then taken into the game engine and rebuilt. The idea is that the game engine is processing the least amount of information possible to maintain 24 frames per second, while at the same time displaying the largest number of pixels and texture maps available. So, it’s a balancing act.
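
As a rough illustration of what "baking" means here: for each texel of a surface's lightmap, an offline pass sums the contribution of the scene's lights and stores the result in a texture, so that at runtime the engine only has to do a texture lookup instead of evaluating the lights. The sketch below is a toy, diffuse-only version of that idea; the array layout, Lambert-only shading, and lack of shadowing or bounce light are simplifying assumptions, not the show's actual pipeline.

```python
import numpy as np

def bake_lightmap(positions, normals, albedo, lights):
    """Bake diffuse point-light contributions into a lightmap texture.

    positions, normals: (H, W, 3) world-space position and normal per texel
    albedo:             (H, W, 3) base color texture
    lights:             list of dicts with 'pos' (3,) and 'color' (3,)
    Returns a (H, W, 3) texture with lighting pre-multiplied in, so the
    engine can display it with a plain texture lookup at 24 fps.
    """
    lit = np.zeros_like(albedo)
    for light in lights:
        to_light = light["pos"] - positions                      # (H, W, 3)
        dist = np.linalg.norm(to_light, axis=-1, keepdims=True)  # (H, W, 1)
        direction = to_light / np.maximum(dist, 1e-6)
        # Lambertian term: clamp back-facing texels to zero.
        lambert = np.clip((normals * direction).sum(axis=-1, keepdims=True), 0.0, None)
        falloff = 1.0 / np.maximum(dist ** 2, 1e-6)              # inverse-square falloff
        lit += albedo * light["color"] * lambert * falloff
    return lit
```

A production baker would also trace shadow rays and gather indirect, bounced light; that is exactly the expensive work that gets paid for once, offline, rather than every frame on the stage.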

Now, we did have live lighting in some of these scenes, but it was usually with the understanding that that light was important to be able to manipulate on the shoot day, based on what the DP was trying to achieve, and it couldn’t be something that was predictable prior to the shoot. So, we kept that light. Now, as technology advances, ray tracing is going to be in real time. So, moving forward, we’ve been able to take advantage of true, real-time ray tracing, which allows more complex lighting scenarios, and allows us to keep it live, so we can modify on the day. Now, it is going to be a balance between what is being processed, and likely, if we’re introducing location photography, a lot of that, again, will be faked. But those are the choices that the technology allows you to be flexible with.

DEADLINE: Is it true that you were able to control The Volume’s visuals with a mere iPad?

BLUFF: We were able to control all the LED screens on an iPad from day one of shooting. The idea is that the “Brain Bar” is effectively mission control for the LED screens, but what you do need is somebody operating the fine controls that are available to the DP in the stage. So, we’d provide one of our operators with an iPad, and he can walk around the LED stage, or stand right next to the DP, and with dials and switches on the iPad, he can change contrast ratios, he can change colors. He can create negative fill cards; he can create silks above the axis. There’s almost no limit to what the iPad control allows the operator to do, from a lighting standpoint.
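
The interview doesn't say what protocol or software the iPad app used, but conceptually it is a thin remote-control layer sending parameter changes over the network to the machines driving the wall. As a purely hypothetical sketch, the snippet below uses OSC (Open Sound Control) via the python-osc package; the host, port, addresses, and parameter names are invented for illustration and are not StageCraft's actual API.

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical render-control node for the LED wall (made-up address).
BRAIN_BAR_HOST = "10.0.0.42"
BRAIN_BAR_PORT = 8000

client = SimpleUDPClient(BRAIN_BAR_HOST, BRAIN_BAR_PORT)

# Dial-style adjustments a wall operator might expose to the DP
# (all message addresses below are illustrative only):
client.send_message("/wall/contrast", 1.15)       # overall contrast ratio
client.send_message("/wall/color_temp", 4300)     # white point in kelvin
client.send_message("/wall/negative_fill", [0.20, 0.00, 0.35, 1.00])  # card corners, normalized
client.send_message("/wall/silk/opacity", 0.6)    # soft diffusion overlay
```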

DEADLINE: What was the workflow like, in terms of gathering visual materials to use as reference for The Mandalorian’s environments, and getting to the point where you could then project these environments onto your LED screens?

BLUFF: I think the broader explanation is that, just like any visual effects shot that requires a set extension, there is a variety of different ways of how you could achieve that environment. And depending on what it’s supposed to look like, there are options that are far more successful than others.

So, for example, we did send a photographer to Hawaii to capture volcanic rock photographs, and also 3D scans. So, instead of trying to sculpt some volcanic rock in the computer, let’s just go capture what is already out there and real. The same applies to landscapes that could be photographed in the real world, but of course, we only need to send one or two people to capture that data. In addition to that, for some of the interiors, the production designer would select, with Jon and the director, real-world locations that would be the basis for digital set dressing, modifying them towards more of a Star Wars look. And then, of course, there are some environments that were completely built from scratch in computer graphics, like the Roost hangar at the beginning of Episode 106.

On top of that, we also built miniatures that we scanned and took into the game engine, because on occasion, that was the quickest and best way to achieve a photoreal environment. We needed to leverage as many different approaches as possible, considering the volume of work we were having to create in such an incredibly short time frame.

DEADLINE: Do you think this technology will become particularly useful in our socially-distanced era?

BLUFF: What I will say is that we built the technology because there was a need to bring visual effects and virtual production onto a practical set, and to allow all departments to work together more effectively, in this way. Since the advent of the pandemic, others have identified that methods that we’ve pioneered could have a significant impact on shooting, and I’ll let them speak to that.

DEADLINE: What do you think the big-picture implications of this technological breakthrough are for the entertainment industry at large? How widely do you see this technology being used, going forward?

BLUFF: Even before the pandemic, we were inundated with studios and filmmakers showing a keen interest in how this works, and Jon was very open to sharing with filmmakers how he was shooting The Mandalorian. So, from that point, [with] the conversations that I’ve been involved in, what I’ve found incredibly interesting is that different filmmakers have pitched their unique ways of using the technology, whether it’s a director, a DP or a production designer. I think that’s the most interesting part of this: everybody’s looking at more efficient ways of shooting, particularly TV, but of course, movies as well.

I can see it creeping into the vast majority of studio slates somewhere, just because it’s a more collaborative way of shooting. You know, we’re still shooting movies the same way [we have been] for the past 80 to 100 years. It’s changed very little, in many ways, and a lot of what keeps it the same is that actors and filmmakers are forced to use their imagination, and we don’t really fill in the blanks for many months afterwards, when post-production catches up with the filmmaking process.

This allows us to put a lot of the visual effects in front of the camera, with every other department, sooner—not everything, but a lot—and through that, I think you get a better show, and a better image. I think that’s really exciting, and everybody wants that, to better themselves, and have access to tools that could really advance their storytelling.

DEADLINE: This is a technology that seems to allow you to do a lot more with fewer resources, and less time—particularly if you’re looking for the scope of epic locations that are difficult or impossible to access.

BLUFF: Oh, absolutely. I think if you’re imagining a scenario where you’ve got a one- or two-day shoot on location somewhere, the logistics involved in that, for something that could be relatively straightforward, are huge. So, the idea of sending a very, very small team of visual effects photographers to collect the environment in the appropriate lighting scenario, and bringing it back to a stage, is absolutely huge.

DEADLINE: Do you think virtual sets will replace existing technologies like blue and green screen? Or will those continue to exist as other tools in the arsenal?

BLUFF: I think you’ll always need blue and green screen in certain locations, and certain stages on location, and in backlots, for example, because the use of LED screens outside currently is going to be limiting.

In regards to having blue and green screen up on an LED stage, we thought we would be doing a lot more of that, and in actual fact, we did very, very little of it, which is why, if you watched a rough cut of Season 1 episodes prior to post-production starting, it was astonishing how much of the frame already had the appropriate set extensions photographed. It’s something that I wasn’t used to seeing, and it certainly helped everybody zero in on what was important to tell in the story.

DEADLINE: Is there a sense of pride for you, having helped to develop a technology with the potential to change entertainment forever?

BLUFF: You know, it’s interesting. People asked during the production if I thought this would be a game changer, and in all honesty, I couldn’t see past the challenges that were in front of us at that point. And even when we got past the physical shooting, and the LED stage was a success, we still had 4,000 visual effects shots left. Then, even when we were finishing post-production on Season 1, I was starting to turn my attention to Season 2.

So, I haven’t really had much time to look back, and I think it’s only since we went public with how we did the show, almost 18 months after we started the journey, that I’ve been able to reflect. And absolutely. There’s a huge amount of pride that the whole team has.

But I think what’s important for folks to know is that it took more than a village. It took hundreds and hundreds of people equally contributing to helping solve this problem, and it relied on decades-old theory or technology. You know, ILM used Epic’s game engine on A.I., when Steven Spielberg was visualizing, on set, the Rouge City.

So, I don’t personally ever feel like I changed anything. I believe I was incredibly fortunate to be challenged by Jon Favreau and others to bring many, many ideas together, and to build a team of people that could execute. I believe it was this team of people that executed and achieved this success, and I think that if there are people that should be spoken about as changing the industry, it’s people that have been involved in this level of technology for many, many, many years prior to me.