Are Game Engines the Future of Filmmaking?
Since the earliest days of cinema, filmmakers have captivated audiences with movie magic. Like a magician pulling a rabbit from a hat, these visual innovations trick viewers into experiencing fantasy as reality. Today, advancements in gaming technology are producing amazing new workflows that transform how we use visual effects (VFX) and computer-generated imagery (CGI). So, are game engines the future of filmmaking?
Before we answer that question, let’s take a step back.
As a filmmaker, it’s important to understand that the industry is a living, breathing thing, constantly changing as technology improves.
Now, this isn’t to say that every project will call for such changes. Just look at celluloid in today’s digital world: it’s still the medium of choice for period pieces and comparable creative decisions. Nevertheless, the future is invariably digital.
Still, it’s useful to gain some healthy perspective of the past in order to truly appreciate the gravity of the future. Let’s begin with a brief stroll down memory lane.
Past Historical Filmmaking Innovations
The journey to creating the ideal cinematic experience is marked by key milestones within the past 100 years.
These are just a few of the innumerable technical innovations that have reinvented how filmmakers shape the very nature of our perceptions, altering physics itself to make their storytelling larger than life.
Not impressed? We get it. By this point, who hasn’t heard about most of these filmmaking milestones? You’re here for the good stuff—you want to know what the future really has in store for us.
Combining Videogame Engines with Filmmaking
The evolution of videogames has been remarkable over the course of the past 30 years alone. Just like the film industry, the gaming industry has experienced its own advancements that have opened up a new realm of possibilities. This merger of film and game is known as the interactive arts.
In just the span from Doom in 1993 to Doom Eternal in 2020, not only has the style of gameplay evolved, but so have our expectations for virtual worlds. And with bigger and better game engine technologies, the gaming landscape is more interactive than ever. With the popularity of virtual reality (VR), the future has much in store for how we interact with our entertainment.
Just so you know…
A game engine is the core software component of a videogame and possesses the tools necessary to change and manipulate the landscape. Most game engine suites include built-in applications to ease development, like establishing unique physics, sound, graphics, and artificial intelligence (AI) functions.
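To make that definition concrete, here’s a minimal sketch of the core loop every game engine runs: read input, advance the simulation (physics, AI), and render a frame. The class and method names here are hypothetical, for illustration only; a real engine like Unreal splits these stages across many specialized subsystems.

```python
# A toy game engine loop: input -> simulate -> render, once per frame.
# All names here are illustrative, not from any real engine's API.

class Engine:
    def __init__(self):
        self.running = True
        self.world_time = 0.0  # seconds of simulated time

    def handle_input(self):
        """Poll player/controller input (stubbed out here)."""
        pass

    def update_simulation(self, dt):
        """Advance physics, AI, and sound by one timestep."""
        self.world_time += dt

    def render_frame(self):
        """Draw the current state of the world (stubbed out here)."""
        pass

    def run(self, frames=3, dt=1 / 60):
        # The classic loop, here capped at a fixed number of frames.
        for _ in range(frames):
            self.handle_input()
            self.update_simulation(dt)
            self.render_frame()

engine = Engine()
engine.run()
```

Every feature listed above, from physics to AI, is ultimately a module called somewhere inside this loop.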
Now, you’re probably thinking, ‘What does a game engine have to do with filmmaking?’ Honestly… it’s all about world-building.
Unreal Engine and the Disney+ series The Mandalorian
Everyone knows the live-action Star Wars series The Mandalorian. It streams on Disney Plus, the Disney-centric streaming service that accrued over 100 million subscribers in record time (and currently trails only Netflix and Amazon)!
But for a series on a streaming service, how are they able to achieve such amazing visual effects?
Working with green screens creates plenty of cost-related challenges, and compositing them takes long, meticulous stretches of time in post. Alternatively, building physical sets on this scale would cost millions of dollars, potentially diminishing the quality of the production in other areas.
Instead, showrunner Jon Favreau, DP Greig Fraser, and a team of geniuses at Industrial Light & Magic (ILM) and Lucasfilm did what they do best and helped conceive a whole new photo-realistic VFX system with the help of Epic Games and its Unreal Engine. The collaboration also included production technology partners like ARRI, Fuse, Lux Machina, NVIDIA, Profile Studios, and Golem Creations.
According to Fraser, “This is the beginning of something extraordinarily powerful.”
What is Unreal Engine?
Unreal Engine is the proprietary game engine developed by Epic Games, first launched alongside the 1998 first-person shooter of the same name, Unreal. Since its inception, Unreal Engine has expanded across the 3D gaming industry and is on the cusp of releasing its most up-to-date iteration, Unreal Engine 5 (UE5).
UE5 takes the interactive experience to the next level by leading the industry in awe-inspiring visual content, a node-based VFX system, a high-performance physics system, world-building, plus much more!
Did you know that you could also learn how to code with UE5? Explore the next generation of technology at UE’s The Pulse hub, which presents panels with seasoned developers along with the latest information and resources for hungry creatives.
The Future of World Building is with StageCraft
Move over, green screens, there’s some new tech in town—or at least it’s visible on the horizon. The new creation is StageCraft, a stunning invention that manifests CG environments in real time on a curved, 20-foot-high, 270-degree wrap-around wall made up of 1,326 individual LED screens, plus an LED ceiling.
Why is StageCraft important?
- Creates virtual sets that track with the camera’s motion around both objects and actors
- Renders in real time to the large StageCraft LED wall and atmospheric lighting systems
- Composites shots in real time
- Allows immediate editing of virtual sets
- Enables filming multiple scenes within a short time window
- Appears more natural than traditional chromakey effects
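The first point above—virtual sets that track with the camera’s motion—boils down to a re-projection problem. Here’s a toy Python sketch (with illustrative names and numbers, not ILM’s actual pipeline) of why a tracked camera matters: the wall must redraw each virtual point along the line from the camera’s current position to that point, so distant scenery shifts with correct parallax as the camera dollies.

```python
# Toy model: project a virtual point (beyond the physical wall) onto
# a flat LED wall so that it looks correct FROM THE TRACKED CAMERA.
# All coordinates are illustrative, in arbitrary units.

def project_to_wall(camera_x, camera_z, point_x, point_z, wall_z=10.0):
    """Return the x position on an LED wall at depth wall_z where a
    virtual point should be drawn for this camera position."""
    # Parameter t walks the ray from the camera toward the point
    # until it crosses the wall plane.
    t = (wall_z - camera_z) / (point_z - camera_z)
    return camera_x + t * (point_x - camera_x)

# A virtual mountain far beyond the physical wall:
mountain = (0.0, 100.0)

# As the camera dollies 2 units sideways, the mountain's on-wall
# position shifts only slightly relative to the camera -- the subtle
# parallax a real distant object would show.
near = project_to_wall(0.0, 0.0, *mountain)
moved = project_to_wall(2.0, 0.0, *mountain)
```

If the camera were untracked (as with a static painted backdrop), the background would not re-project at all, and any camera move would instantly betray the illusion.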
This massive LED setup is known as the Volume. (Pretty cool, huh?) What Favreau and his team have successfully done is merge practical sets with virtual backgrounds. All, of course, powered by Unreal Engine filmmaking tech. That way, as Jon Favreau has put it, they can benefit from “real-time, in-camera rendering.”
The Mandalorian and the Volume
The Mandalorian is the first project of its kind to harness the power of this new game engine technology and successfully apply it to the world of filmmaking. And it’s fitting that the planet-hopping Star Wars series would serve as the ideal project to debut the potential of this cutting-edge advancement: Lucasfilm and ILM have crossed this bridge before. Only now do they have the seemingly limitless resources of Disney backing them.
Jon Favreau helped advance the technology of tracking physical cameras in a virtual space on The Jungle Book and The Lion King. Since the Star Wars series called for extra-planetary exploration, which could get complicated from both a budgetary and a creative standpoint, he was keen on applying those same virtual innovations.
The latest StageCraft Volume tech also benefits actors by virtually transporting them to the location. Rather than working in green suits against a green screen, they can see the environment and therefore act and react to it authentically.
The Engineers Literally Move Mountains
The team of artists and engineers who operate the Volume is referred to as the Brain Bar. Like NASA’s Mission Control, the Brain Bar is stationed nearby to operate and manifest the virtual environments, alter the light, and change locations, among other crucial tasks. They can move the location from one part of a hangar to another, drop you into the vacuum of space, or place you in the middle of a tropical planet.
Richard Bluff, the visual effects supervisor for The Mandalorian, has remarked that the LED screens are the same kind you see in Times Square. The art department built small, transportable sets that could be wheeled in and out of the stage with relative ease, while the Brain Bar simply swapped out the background behind them.
Lighting and Texture
From the get-go, the engineers were clear that all lighting and textures had to feel absolutely authentic. During the testing phase, not all of their landscapes worked; environments built entirely from scratch tended to fall short. Photo-real imagery from real locations that evoked fictional planets like Tatooine, however, ended up working remarkably well.
As long as the lighting was consistent with the location, the audience would believe the actor was in that environment. The same went for the production workers, who felt as if they were on an interstellar set in a galaxy far, far away rather than a sound stage in Los Angeles. “It’s like we’ve put them inside a videogame,” says Kim Libreri, the chief technical officer of Epic Games.
The team was even able to light the Mandalorian’s armor better than they could against a green screen, and without meticulous work in post-production. The high-resolution VFX courtesy of Unreal Engine filmmaking technology made such a feat possible.
Lighting The Volume
Referring to the Volume while talking to Deadline, Greig Fraser said, “We have full control of the light, we’re not spending all that time trying to cut the sun, or trying to diffuse it, or trying to add negative fill. On the practical side, we’re able to move faster. But even more importantly, on an emotional side, we’re able to build the world that we’re wanting to in advance, knowing that we’re going to have an extended period of that particular controlled light.”
The engineers also mastered lighting in motion. In one scenario, they programmed the Volume to fully integrate its lighting setup, synchronizing it with the dollying camera as the Mandalorian walked at the center of frame. As the camera tracked backward, the environment moved along with the titular character. They achieved this feat without flickering or any other technical issues!
What Other Projects Have Used StageCraft Technology?
An LED wrap-around screen was also used on Rogue One, but that production was limited in a few key ways: it used pre-rendered environments, could only work from a single camera perspective, and ended up replacing the screen content in post-production. On The Mandalorian, by contrast, the imagery cast upon the Volume was often the same imagery used in the final product.
So, what other movies made with game engines are on the horizon?
It’s not a movie, but HBO’s prestige series Westworld also integrated Unreal Engine and StageCraft technology for its third season. In fact, Favreau himself shared his newfound technological approach with series showrunners Jonathan Nolan and Lisa Joy.
Other companies see the writing on the wall for videogame engines in filmmaking, too. The Spanish company Orca Studios is in the process of establishing multiple studios for virtual filming.
Epic Games is taking its new technology even further by investing in 45 movie and short film projects to be developed with Unreal Engine. It will begin with an animated feature film about Gilgamesh, in conjunction with animation studios Hook Up, DuermeVela, and FilmSharks.
The Future of Filmmaking Videogame Technology
The future is an exciting place full of elaborate innovations in technology that can generate the very essence of magic. And when merged with filmmaking, it creates a whole new way to experience entertainment.
With the emergence of Unreal Engine in filmmaking, the line between gaming and movies becomes increasingly blurred, unveiling a new, interactive way to engage with entertainment.
The Mandalorian’s success with Unreal Engine cinematography and the Volume has laid a significant blueprint that will be replicated by others. This will most certainly be another milestone in the advancement of film technology. The Mandalorian LED stage is only the beginning…