Costing $15 million an episode, the live-action “Star Wars” series “The Mandalorian” from the Disney+ streaming service is expensive as hell – but, funnily enough, the final product actually looks like it cost even more.
The series’ incredibly rich cinematic look boasts some astonishing shot compositions and photography that is effectively indistinguishable from the films, and one of the big reasons for that is the production’s use of a space referred to as ‘The Volume’ during filming, now going by the official name of ‘Stagecraft’.
The technology mixes one of the oldest tricks in the book – rear projection – with cutting-edge hardware and software. Instead of projecting images onto a large screen, filming takes place on a soundstage surrounded by high-quality LED screens displaying a virtual landscape.
Said landscape is rendered in real time by the Unreal game engine, allowing for incredibly elaborate vistas of all kinds which technicians can adjust on the fly using ordinary computers and tablets. Want to move a massive boulder? Done. Change the time of day and weather? It can be as simple as one person dragging a slider.
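To give a sense of that “one finger on a slider” control, here is a purely hypothetical sketch of mapping a single 0–1 slider value to a sun position. The function name and angle scheme are invented for illustration; the actual production drives Unreal’s own sun and sky controls.

```python
import math

def sun_angles(slider):
    """Map a 0..1 'time of day' slider to sun elevation/azimuth in degrees.
    0 = sunrise, 0.5 = noon, 1 = sunset. (Illustrative only.)"""
    elevation = math.sin(slider * math.pi) * 90.0   # rises to 90 deg at noon
    azimuth = 90.0 + slider * 180.0                 # sweeps from east to west
    return elevation, azimuth

print(sun_angles(0.0))   # sunrise: sun on the horizon, due east
print(sun_angles(0.5))   # noon: sun directly overhead
```

One scalar in, a whole lighting state out – that is the shape of the control the technicians have, just pointed at a far richer scene model.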
More importantly, the parallax of the projection changes as the camera moves, so the environment on screen behaves the way an actual 3D environment would. The result is that much of the episode can be shot ‘in camera’. While there is no behind-the-scenes featurette from “The Mandalorian” as yet showing off how they do it, Unreal Engine itself has posted two videos – a short rundown and a more detailed examination – demonstrating a more basic version of the tech.
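The camera-dependent parallax comes from a well-known graphics technique: recomputing an off-axis (generalized) perspective projection for the screen every frame from the tracked camera position. This is a minimal sketch of that math, not the production’s actual code; the corner layout and function name are assumptions for illustration.

```python
import math

def off_axis_frustum(eye, pa, pb, pc, near=0.1):
    """Compute off-axis frustum bounds (l, r, b, t) at the near plane for a
    flat screen seen from `eye`. pa = lower-left, pb = lower-right,
    pc = upper-left screen corner, all in world coordinates. This is the
    standard 'generalized perspective projection' that makes a flat display
    act like a window onto a 3D scene as the viewer (or camera) moves."""
    def sub(u, v): return [u[i] - v[i] for i in range(3)]
    def dot(u, v): return sum(u[i] * v[i] for i in range(3))
    def norm(u):
        m = math.sqrt(dot(u, u))
        return [x / m for x in u]
    def cross(u, v):
        return [u[1] * v[2] - u[2] * v[1],
                u[2] * v[0] - u[0] * v[2],
                u[0] * v[1] - u[1] * v[0]]

    vr = norm(sub(pb, pa))        # screen right axis
    vu = norm(sub(pc, pa))        # screen up axis
    vn = norm(cross(vr, vu))      # screen normal, pointing toward the eye
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)              # eye-to-screen-plane distance
    l = dot(vr, va) * near / d
    r = dot(vr, vb) * near / d
    b = dot(vu, va) * near / d
    t = dot(vu, vc) * near / d
    return l, r, b, t

# A 4m x 2m LED wall centred on the origin in the x-y plane:
pa, pb, pc = [-2, -1, 0], [2, -1, 0], [-2, 1, 0]
print(off_axis_frustum([0.0, 0, 3], pa, pb, pc))  # centred camera: symmetric
print(off_axis_frustum([1.5, 0, 3], pa, pb, pc))  # camera moves right: skewed
```

Feed the resulting frustum into the renderer each frame and the image on the wall shifts exactly as a real window would, which is why the effect survives a moving camera.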