VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.

Winner of three prestigious Folio Awards for excellence in publishing.



May 04, 2017 | Web Exclusive

EPIC GAMES’ UNREAL ENGINE MAKING BIG VFX IMPACT

By IAN FAILES

At the Game Developers Conference (GDC) in San Francisco earlier this year, Epic Games, maker of Unreal Engine, offered several presentations that astonished many attendees. These developments are sure to impact visual effects workflows. In fact, they already have.

Among Epic’s GDC demos was a CG Chevrolet short called “The Human Race,” incorporating The Mill’s Blackbird proxy tracking car and the VFX studio’s Cyclops virtual production toolkit, along with real-time augmented reality (AR) built on Unreal’s real-time rendering tech (dubbed “Project Raven”). Also at the conference, Epic revealed final-frame renders of the droid K-2SO from Rogue One that made use of the Unreal game engine, and the company showed off its Unreal Engine 4 VR editor.

Watch a behind-the-scenes video for “The Human Race.”

VISUAL EFFECTS IN REAL-TIME, ALL THE TIME

If it’s not already obvious from that list of projects, Unreal Engine is finding significant use on live-action projects for rendering final-quality images in real-time – that is, in places where traditional post-production rendering and compositing might previously have been used.

That move into final VFX quality is the goal, or at least part of it, for Epic, which also sees increased use of its Unreal Engine toolset for visualizing sequences in pre-production and on set (again, in real-time). That doesn’t mean it’s a ‘one-button’ solution; significant advances in tracking, image-based lighting and rendering, and other virtual production tech were required to make “The Human Race” possible.

Virtual production? Isn’t that already happening in visual effects? Films like Avatar, Real Steel and The Jungle Book have leaned heavily on filming live-action actors on set or in motion-capture volumes, using simulcam technology to visualize vast environments behind them or CG characters interacting with the action. Indeed, Epic Games itself has been at the forefront of these efforts, as seen in its several Hellblade demos.

The Blackbird car

The Unreal Engine-rendered CG car

WHAT THESE DEMOS MEAN FOR VFX

But what’s the step up that “The Human Race” and “Project Raven” offer? The answer is two-fold: improved AR, and improved production workflows with near photo-real quality complete with realistic lighting, shadows and camera effects.

“We want AR to be VFX-grade quality AR, basically,” says Epic’s Marc Petit, General Manager, Unreal Engine Enterprise. “If you want the content to be believable, if you want to relate to the content, whether you are selling a car or telling a story, I think quality matters, and that’s what Unreal Engine stands for. And we had seen so much crap in AR that we had to put our foot down and say, this is our version of AR and it looks good.”

What exactly did Epic’s “Project Raven” AR demo entail? In “The Human Race,” the Blackbird proxy car stood in for a digital Camaro ZL1. The Blackbird can be configured to perform and drive like virtually any vehicle, and then in post-production – thanks to its tracking markers – VFX artists can work out the position and orientation of the CG car.

The color selector component of “The Human Race” demo, which allowed details of the Camaro to be changed on the fly.

During filming, the camera viewfinder of course only sees the Blackbird. “Project Raven” sought to use Unreal Engine to visualize the Camaro realistically over the Blackbird – in real-time. This has been possible with other simulcam techniques before, but the goal, again, was as much realism as possible. The Blackbird carried four RED cameras that captured a panoramic environment, providing a source of lighting information plus the various reflections and effects over the car. The idea was that the CG car could be composited in real-time and fit into the live-action background as if it were a final effects shot.
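At its heart, dropping the rendered car into the plate is a real-time version of the standard “over” composite: a premultiplied CG foreground blended onto the photographed background by its alpha coverage. A minimal sketch of that one operation (NumPy, with toy values; the actual Unreal pipeline is of course far more involved):

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Premultiplied 'over' composite: CG foreground onto a live-action plate.

    fg_rgb   -- (H, W, 3) premultiplied CG render (e.g. the digital car)
    fg_alpha -- (H, W, 1) coverage of the CG element
    bg_rgb   -- (H, W, 3) the photographed background plate
    """
    return fg_rgb + (1.0 - fg_alpha) * bg_rgb

# Toy 1x1 "images": a half-covered mid-grey CG pixel over a white plate.
fg = np.full((1, 1, 3), 0.25)   # premultiplied: 0.5 colour * 0.5 alpha
alpha = np.full((1, 1, 1), 0.5)
bg = np.ones((1, 1, 3))
print(over(fg, alpha, bg))      # 0.25 + 0.5 * 1.0 = 0.75 per channel
```

Doing this per pixel, per frame, with noise, flares and blooms layered on top, is what the engine has to finish inside its frame budget.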

That required a lot of data passing through Unreal Engine – about 1.5 gigabytes per second, between the photographed plates and the panoramic lighting environment generated via the RED cameras. In addition, the typical compositing touches of camera noise and subtle flares, blooms and other bits and pieces were also part of the mix. But this wasn’t post-production; it was all done in approximately 41 milliseconds per frame through the engine.
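Those two figures imply a tight per-frame budget. A back-of-the-envelope check (assuming the quoted 1.5 GB/s stream and 41 ms frame time, which works out to roughly 24 frames per second):

```python
# Back-of-the-envelope frame-budget math for the figures quoted above.
FRAME_TIME_MS = 41.0    # stated per-frame processing time
STREAM_GB_PER_S = 1.5   # stated data rate through the engine

fps = 1000.0 / FRAME_TIME_MS                 # ~24.4 frames per second
mb_per_frame = STREAM_GB_PER_S * 1024 / fps  # data handled per frame

print(f"{fps:.1f} fps, ~{mb_per_frame:.0f} MB ingested per frame")
# -> 24.4 fps, ~63 MB ingested per frame
```

In other words, the engine was ingesting on the order of 60 MB of plate and lighting data every frame while still hitting a cinema-rate refresh.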

This workflow allowed for the real-time visualization of the car. An extra component was the ‘high-end’ AR. Here, a virtual camera system was developed to show that even a consumer might be able to change the color of the car in real-time during filming, while also staging shots. That was made possible via collaboration with Google’s Tango platform, which has been developing AR and computer vision for smartphones for a few years. The result was the delivery of scenes with physically based car renders and a realistic final shot that matched the live-action background, even under varying lighting conditions.

A final still from “The Human Race,” which also involved a futuristic CG version of the vehicle.

REAL-TIME IN PRODUCTION

In addition to high-end AR, Epic’s other aim is to improve production workflows – that is, the time spent planning, shooting and posting scenes with CG assets. Epic’s goal is not only to be part of the final render where possible, but also to help filmmakers and creators make informed decisions on set. Fewer takes and fewer iterations save time and money.

“One good example [arose] from when we were shooting ‘The Human Race,’” notes Petit. “The way you frame the car, you need to be able to see the exact geometry of the car, and you need to get a good sense of the reflection and interaction of the environment to make your camera decision. So we were able to give the director in the pursuit car a good view and a sense of comfort that he could get what he wanted and we could support his creative vision by watching what Unreal Engine was showing him in real-time in the pursuit car.”

Petit suggests the toolset would be particularly useful on a project similar to, say, Real Steel, where CG fighting robots had to appear in dynamic live-action scenes among live-action actors. “I think the ‘Raven’ demo is a way of saying, if the simulcam virtual production workflows have the right color pipeline and compositing pipeline in the game engine, then we can actually do more projects like Real Steel, which was relatively inexpensive to shoot for an amazing result.”

Epic has already been involved in several demos combining performance and facial capture, real-time cinematography and rendering for Ninja Theory’s upcoming game, Hellblade: Senua’s Sacrifice.

“It’s about the value of cutting down iteration, the value of seeing stuff in the viewfinder when you’ve got your graphics on top of the live action and the kind of time you can save,” adds Petit, who also notes that the technology developed for Real Steel and Avatar was somewhat bespoke and isn’t necessarily widely available, whereas a solution from Unreal Engine would be.

FROM DEMO TO REAL

What’s next from Epic and the Unreal Engine in this virtual production/VFX cross-over? Asked whether we might see a digital human and real-time cinematography Hellblade demo that uses the “Project Raven” toolset, Petit remained somewhat coy.

“Digital humans are definitely next in line,” he offers. “We had this opportunity with the Blackbird, which was a fun capture device, but it would be a natural progression for us to do a very similar workflow presentation using a digital human.”

Ultimately, it’s clear that these worlds of gaming, VR, AR and VFX will continue to collide, and that’s something effects artists can continue to be part of.


