VFX Voice



December 13, 2018

Winter 2019 Issue

The Reality of Virtual Filmmaking

By IAN FAILES

Habib Zargarpour works with his virtual production tools to plan out a scene. (Image courtesy of Digital Monarch Media)

It’s easy to think about a film being made by a crew with actors, a set, some lights and a camera. Indeed, many films are still made this way. But as films continue to be imbued with more complex action, and invariably more complex visual effects, filmmakers are turning to new production techniques to imagine these scenes virtually even before they’ve been shot, and then to scout sets, interact live with CG assets and characters, and to shoot, revise and iterate virtual scenes on the fly.

‘Virtual production,’ as it has become known, can mean many things. On Ready Player One, for example, virtual production techniques were used by director Steven Spielberg to combine motion-captured characters, virtual cameras, ‘simul-cams,’ real-time rendering and virtual reality to help imagine the entirely synthetic world of the OASIS. Jon Favreau’s upcoming Lion King and James Cameron’s Avatar sequels are further examples in which virtual production is central to the filmmaking.

There are also many different types of virtual production, whether it’s in the motion-capture equipment used, the style of virtual cinematography, the real-time rendering engine of choice, or simply how different pieces of a system are ‘bolted’ together. VFX Voice asked several players in the virtual production space about their approaches to this growing area of filmmaking.

A still from Digital Monarch Media’s virtual production for the ‘Trash Mesa’ scene in Blade Runner 2049. (Image copyright © 2017 Alcon Entertainment, LLC)

 

DIGITAL STORYTELLERS

Digital Monarch Media (DMM), run by Wes Potter and Habib Zargarpour, is one of a growing number of specialist virtual production outfits, with contributions made to films including Ready Player One, Blade Runner 2049 and The Jungle Book. DMM provides a variety of hardware and software tools that enable directors to operate a virtual camera and make lighting and CG set changes. The virtual camera, or V-cam, is a customized tablet mapped to real camera lenses and operated with game controllers, running through the Unity game engine. Giving filmmakers a collection of tangible tools that they can interact with is a big part of DMM’s toolset.
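As a rough illustration of what mapping a real lens onto a virtual camera can involve, the sketch below converts a lens focal length and sensor width into the horizontal field of view a game-engine camera would need. The lens values and full-frame sensor size are assumptions for the example, not DMM’s actual lens tables.

```python
import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view a virtual camera needs to match a physical lens.

    Assumes a full-frame (36mm-wide) sensor by default; purely illustrative.
    """
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# A hypothetical lens kit a V-cam interface might expose as presets.
for focal in (24, 35, 50, 85):
    print(f"{focal}mm lens -> {horizontal_fov_degrees(focal):.1f} degrees horizontal FOV")
```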

“There are many things you have to cater for,” notes Zargarpour, who previously worked at Industrial Light & Magic, EA and Microsoft. “There are different camera rigs simulating dollies and jibs and those kind of things. And then you have people who like key framing. And then you have all the things relating to recording takes and being able to view and note those, plus a user interface for accessing the V-cam, lighting and moving set pieces around. There’s a lot to think about.”

For Ready Player One, DMM worked with Digital Domain to integrate real-time engines into the virtual production pipeline. Scenes with motion-captured actors were scouted and filmed virtually, and could then be re-worked if necessary with the virtual tools. Spielberg also utilized a range of VR headsets during motion capture, for scouting the sets and while shooting the practical sets.

On Blade Runner 2049, the arrival of a flying Spinner vehicle was proving difficult for director Denis Villeneuve to imagine. To help find the shot, the director used DMM’s Expozure V-cam tool to visualize the scene from the interior of the cockpit. “We attached the Spinner to the virtual camera and gave it to Denis,” explains Zargarpour. “He performed the Spinner, and then the camera inside the Spinner. Usually this would take a long time to iterate on.”
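What made that setup quick to iterate on is, conceptually, transform parenting: the Spinner’s recorded flight becomes the parent of the camera’s performance, so the two passes compose automatically frame by frame. The Python sketch below shows that composition with made-up transforms; it illustrates the idea only and is not DMM’s Expozure code.

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = (x, y, z)
    return m

# Pass 1 (hypothetical data): the Spinner is 'performed', giving a world-space
# transform for the vehicle on every frame of the take.
spinner_world = [translation(0.0, 50.0 - f * 0.4, 200.0 - f * 2.0) for f in range(100)]

# Pass 2: a camera move recorded relative to the Spinner's cockpit.
camera_local = [translation(0.2, 1.1, -0.5) for _ in range(100)]

# Because the camera is parented to the vehicle, its world transform is simply
# the per-frame composition of the two recorded passes.
camera_world = [vehicle @ cam for vehicle, cam in zip(spinner_world, camera_local)]
print(camera_world[10][:3, 3])  # camera position in world space at frame 10
```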

DMM’s tools have been taken to a new level for an upcoming film, where they were used at the previs stage to generate master scenes that could then be ‘filmed’ quickly with the virtual camera. Zargarpour says that two operators used Expozure to generate 300 previs shots for the film, including shots with realistic real-time water simulations, while over the same period six previs artists generated only 12 shots using more traditional means.

A Digital Monarch Media demo that incorporates real-time rendered ocean waves via an NVIDIA ocean plug-in. (Image courtesy of Digital Monarch Media)

A DEDICATED PLATFORM

Having worked on a number of films where virtual production was a key part of the filmmaking process, visual effects studio MPC decided to build its own virtual production platform called Genesis. “The turning point for MPC was The Jungle Book,” notes MPC Realtime Software Architect Francesco Giordana. “That’s when we realized how important virtual production is and we embarked on this new adventure. Tracking all the assets, the scenes, the cameras, the animations and all of the modifications is key. We couldn’t find any available third-party solution that could give us that out of the box.

“It also became apparent that standard VFX tools were not the way to go. They didn’t offer the speed or the flexibility that on-set workflows require, so we turned to game engines, not as the definitive solution, but rather as a piece of a much more elaborate framework.”

Genesis works with Unity, but MPC says it relies on the game engine mainly for graphics and user interaction. The studio has built a separate layer for handling network configurations instead of relying on typical multiplayer game patterns. A key aspect of Genesis has been to incorporate elements of typical live-action filmmaking into the virtual tools.

“The typical way of maintaining the look and feel of a physical camera in virtual production is to motion capture a Steadicam and have a cameraman operate it, but we went a lot further,” says Giordana. “We encoded a variety of traditional camera equipment like cranes, fluid heads, dolly tracks and more, and made sure they would feel to the camera crew exactly the same as the real thing.”

So far, MPC has employed Genesis on a couple of projects and used the full range of VR and more traditional camera devices. “We feel like the tool kit has now been properly battle-tested, and on one of these projects we shot over 2,500 takes in a little over five weeks,” states Giordana. “It’s amazing seeing how an entire crew can gradually familiarize with these new workflows, pick up speed and suddenly start producing over 100 takes a day!”
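One way to picture what ‘encoding’ traditional camera gear means is as a mapping from raw encoder readings on the hardware (dolly position along the track, pan and tilt on a fluid head) to a virtual camera pose the engine can consume. The sketch below assumes hypothetical encoder resolutions and a fixed head height; it illustrates the idea rather than MPC’s Genesis implementation.

```python
import numpy as np

def pan_tilt_matrix(pan_deg, tilt_deg):
    """Rotation of a fluid head: pan about the vertical (Y) axis, then tilt about X."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    pan = np.array([[np.cos(p), 0, np.sin(p)],
                    [0, 1, 0],
                    [-np.sin(p), 0, np.cos(p)]])
    tilt = np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t), np.cos(t)]])
    return pan @ tilt

def encoders_to_pose(dolly_ticks, pan_ticks, tilt_ticks,
                     ticks_per_metre=1000.0, ticks_per_degree=100.0):
    """Map raw encoder ticks (hypothetical resolutions) to a virtual camera pose."""
    position = np.array([dolly_ticks / ticks_per_metre, 1.5, 0.0])  # head 1.5m up
    rotation = pan_tilt_matrix(pan_ticks / ticks_per_degree, tilt_ticks / ticks_per_degree)
    return position, rotation

pos, rot = encoders_to_pose(dolly_ticks=2500, pan_ticks=4500, tilt_ticks=-500)
print(pos, rot.round(3))
```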

“We attached the Spinner [vehicle in Blade Runner 2049] to the virtual camera and gave it to [director] Denis [Villeneuve]. He performed the Spinner, and then the camera inside the Spinner. Usually this would take a long time to iterate on.”

—Habib Zargarpour, CCO/Co-founder, Digital Monarch Media

A scene is set up for filming with a motion-captured actor and the virtual camera as part of MPC’s Genesis platform. (Image courtesy of MPC)

MPC’s Genesis system is intended to be a multi-user platform, as demonstrated in this Unity Book of the Dead frame. (Image courtesy of MPC)

A still from ‘Reflections,’ a real-time ray-tracing demo in Unreal Engine 4, completed in conjunction with ILMxLAB and NVIDIA. (Image courtesy of Epic Games)

“We wanted to extend the [Unreal Engine] editor so that you can have multiple editor sessions connected together from different computers and from different users. Then we also wanted it to run as a VR application, where they might not even be in the same location.”

—Kim Libreri, CTO, Epic Games

At SIGGRAPH, ILMxLAB and Epic Games demonstrated the ‘Reflections’ piece during the Real-Time Live! event. (Image courtesy of SIGGRAPH)

 

VIRTUAL TOOLS AT YOUR DISPOSAL

Just as some outfits have incorporated the game engine Unity into their virtual production workflows, several others have used Epic Games’ Unreal Engine for real-time rendering aspects. That engine has several components that fit into a virtual production platform. For example, Unreal’s virtual camera plug-in relies on Apple’s ARKit on an iPad or iPhone to feed camera positions to the engine, turning the device into a camera portal into a virtual world. “Straightaway, you can take some CG content you’ve made in the engine and film it as if you’re a professional virtual cinematographer,” says Epic Games CTO Kim Libreri.
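The data flow in a setup like that is simple: the device tracks its own pose and streams it to the machine running the engine, which applies it to the virtual camera every frame. The sketch below is a minimal, hypothetical receiver for such a stream; the packet layout and port are invented for illustration and are not Unreal’s actual Virtual Camera protocol.

```python
import socket
import struct

# Hypothetical packet layout: 3 floats of position + 4 floats of rotation quaternion.
POSE_FORMAT = "<3f4f"
PORT = 9000  # arbitrary port chosen for this example

def apply_to_virtual_camera(position, rotation):
    # Placeholder: in a real setup this would set the engine camera's transform.
    print("camera pose:", position, rotation)

def listen_for_poses():
    """Receive device poses (e.g. streamed from a phone's AR tracking) and hand
    them to whatever drives the virtual camera."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    packet_size = struct.calcsize(POSE_FORMAT)
    while True:
        data, _ = sock.recvfrom(packet_size)
        px, py, pz, qx, qy, qz, qw = struct.unpack(POSE_FORMAT, data)
        apply_to_virtual_camera((px, py, pz), (qx, qy, qz, qw))

if __name__ == "__main__":
    listen_for_poses()
```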

ILMxLAB recently utilized an Unreal Engine workflow for their Star Wars-related ‘Reflections’ demo, which also took advantage of real-time ray tracing. Another Unreal Engine demo from Kite & Lightning showed how a head-mounted iPhone capturing a facial performance could drive the performance of a CG avatar – in this case, a baby – in real time with convincing results. Unreal Engine is also used by several studios to do live-action real-time compositing, yet another aspect of virtual production filmmaking.
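The heart of a demo like that is retargeting: per-frame facial blendshape weights captured on the phone are remapped onto the character rig’s morph targets before the engine renders the result. The snippet below sketches that remapping with invented blendshape and rig names; it is a conceptual illustration, not Kite & Lightning’s pipeline.

```python
# Hypothetical mapping from captured blendshape names to a character rig's morph targets.
CAPTURE_TO_RIG = {
    "jawOpen": "baby_jaw_open",
    "mouthSmileLeft": "baby_smile_L",
    "mouthSmileRight": "baby_smile_R",
    "browInnerUp": "baby_brow_raise",
}

def retarget(frame_weights, gain=1.2):
    """Map captured facial weights onto rig morph targets, with a simple gain so
    the stylized character reads a little more strongly than the actor."""
    rig_pose = {}
    for capture_name, weight in frame_weights.items():
        rig_name = CAPTURE_TO_RIG.get(capture_name)
        if rig_name is not None:
            rig_pose[rig_name] = min(1.0, max(0.0, weight * gain))
    return rig_pose

# One captured frame; unknown shapes are simply ignored.
print(retarget({"jawOpen": 0.62, "mouthSmileLeft": 0.35, "cheekPuff": 0.4}))
```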

Epic’s vision for the future of virtual production with Unreal Engine is the idea of virtual collaboration. “If you make a movie by yourself,” says Libreri, “you don’t really need a collaboration system, but that’s typically not the way it is. You usually have somebody working on lighting, somebody working on set layout, somebody doing the camera work, someone directing. We wanted to extend the editor so that you can have multiple editor sessions connected together from different computers and from different users. And then we also wanted it to run as a VR application, where they might not even be in the same location.”

Outside of more mainstream projects, Epic has even enabled virtual production to take place within its popular multiplayer game, Fortnite. Here, players can find one of the game’s hidden ‘greenscreens,’ capture themselves dancing, and then use the replay system to output the footage to be combined via a traditional editing tool with other footage. “It’s not just about pros,” says Libreri. “We’re enabling the YouTube/Twitch generation to make little movies and express themselves.”

A frame composed through the Unity game engine for Book of the Dead, which was made with MPC’s Genesis system. An important philosophy behind the Genesis approach is to replicate how live-action filming is achieved. (Image courtesy of MPC)

Previs for a ship at sea demo crafted using DMM’s Expozure virtual production tool. The virtual dolly is highlighted in the frame. (Image courtesy of Digital Monarch Media)

“The turning point for MPC was The Jungle Book. That’s when we realized how important virtual production is and we embarked on this new adventure. Tracking all the assets, the scenes, the cameras, the animations and all of the modifications is key. We couldn’t find any available third-party solution that could give us that out of the box… so we turned to game engines, not as the definitive solution, but rather as a piece of a much more elaborate framework.”

—Francesco Giordana, Realtime Software Architect, MPC

Kite & Lightning’s Cory Strassburger demonstrates his Bebylon project using an iPhone X, Xsens motion-capture suit, IKINEMA software and Unreal Engine. (Image courtesy of Kite & Lightning)

This Ncam Real Depth demo shows how a presenter against greenscreen can be captured and composited – in real time – behind a separate element in the footage via depth data.

Real Light is Ncam’s new approach to rendering virtual graphics in real time and integrating them into a real-world environment, as well as adding dynamic shadows and lighting changes.

Ncam’s sensor bar attached to a camera enables elements of the environment to be tracked in terms of position and depth for the purposes of augmented reality and real-time VFX.

ON-SET PRODUCTION

Virtual production systems tend to rely on camera-tracking technology that enables virtual graphics to be superimposed in real time onto live-action photography. Ncam has been offering that ability for several years, with experience on large-scale films such as Hugo, White House Down and Solo: A Star Wars Story. Among newer projects are Deadpool 2, Aquaman and The Nutcracker and the Four Realms. It also has a major footprint in real-time broadcast graphics.

The classic use of Ncam on a feature film or television show has been to visualize CG environments or set extensions, typically by replacing large amounts of greenscreen with a previs asset. Says Ncam CEO Nic Hatch, “This allows the filmmakers to compose and frame the shot through the eyepiece as if the CG set were physical. Lens choices can be made, and the zoom, lens distortion and focus are all interactive as per a live video feed. We’ve also seen animated assets being used, such as vehicles and aircraft. These types of shots can be really useful for trigger points and reaction and interaction with the actors.”

Ncam’s toolset is made up of camera-tracking tech called Ncam Reality, which is both a hardware and software solution. The hardware consists of a sensor bar with optical and mechanical sensors that attaches to the main camera, and a server where the software sits. Ncam’s software includes 3D point-cloud generation, which makes the camera tracking marker-free. The company also has a depth-sensing option called Real Depth, which can extract actors from greenscreens for interactive depth-based compositing.
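At its core, depth-based compositing is a per-pixel comparison: whichever source (live-action plate or CG element) is nearer the camera wins that pixel, so an actor can pass naturally in front of or behind virtual objects. The sketch below runs that comparison on tiny synthetic images; it is a conceptual illustration, not Ncam’s Real Depth implementation.

```python
import numpy as np

def depth_composite(live_rgb, live_depth, cg_rgb, cg_depth):
    """Per-pixel depth compositing: the nearer of the two sources wins.

    live_rgb / cg_rgb are (H, W, 3) images; live_depth / cg_depth are (H, W)
    distances from the camera. Purely illustrative.
    """
    live_wins = (live_depth <= cg_depth)[..., np.newaxis]  # (H, W, 1) mask
    return np.where(live_wins, live_rgb, cg_rgb)

# Tiny synthetic example: a bright 'presenter' plate 2m from camera, a dark CG
# wall at 3m, and a CG prop at 1m in the lower half that correctly occludes them.
h, w = 4, 4
live_rgb, cg_rgb = np.full((h, w, 3), 0.8), np.full((h, w, 3), 0.2)
live_depth = np.full((h, w), 2.0)
cg_depth = np.full((h, w), 3.0)
cg_depth[2:, :] = 1.0
print(depth_composite(live_rgb, live_depth, cg_rgb, cg_depth)[:, :, 0])
```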

In recent times, Ncam has been experimenting with LED screens rather than greenscreens, pursuing more augmented reality projects, and developing techniques that allow for automatic relighting of CG elements based on the real-world lighting in a scene (the product will be called Real Light).

“The system figures out where the real-world lights are in terms of direction, brightness and color, and recreates those in real time,” explains Hatch. “It also computes reflections, image-based lighting and shadows. We’re really excited by the possibilities Real Light offers, not just for real-time applications, but also in terms of data collection on set and location, and then bringing that data back to set to recreate the exact lighting conditions, to take out the guesswork.”
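A toy version of that first step – finding a single dominant light’s direction and color from a capture of the environment – can be written as a luminance-weighted average over a latitude-longitude environment map, as sketched below. This is purely illustrative and makes no claim about how Real Light actually works.

```python
import numpy as np

def dominant_light(env_map):
    """Estimate one dominant light direction and color from a latitude-longitude
    environment map of shape (H, W, 3). A toy luminance-weighted average."""
    h, w, _ = env_map.shape
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle per row
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi      # azimuth per column
    phi, theta = np.meshgrid(phi, theta)              # both (H, W)
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    # Weight by luminance and by sin(theta) to undo the lat-long pole stretching.
    luminance = env_map @ np.array([0.2126, 0.7152, 0.0722])
    weight = (luminance * np.sin(theta))[..., np.newaxis]
    direction = (dirs * weight).sum(axis=(0, 1))
    direction /= np.linalg.norm(direction) + 1e-9
    color = (env_map * weight).sum(axis=(0, 1)) / (weight.sum() + 1e-9)
    return direction, color

# Synthetic map: dim everywhere except a bright, warm patch high in the sky.
env = np.full((64, 128, 3), 0.05)
env[4:10, 60:70] = [8.0, 7.0, 5.0]
d, c = dominant_light(env)
print("direction:", d.round(2), "color:", c.round(2))
```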

A New Tool on the Virtual Block

A group of Netherlands-based filmmakers and visual effects artists have developed their own virtual production platform called DeepSpace. The toolset is a collaboration between Planet X Technologies and ScreenSpace Lab, and is aimed at helping filmmakers craft previs with virtual camera tools.

Essentially, their system lets a user hold a physical camera rig fitted with common camera controls. The user sees the scene inside VR goggles and can move around it to experiment with angles, lenses and camera movement. Later, the results can be adjusted further and used to produce a technical visualization of the location.

“We noticed that there is a reluctance with directors, DPs and other filmmakers to engage in creating quality previs for their projects,” says ScreenSpace Lab’s Marijn Eken of the decision to develop DeepSpace. “Communicating to the 3D and previs animators what kind of shots they want to see can be a frustrating and slow process from their perspective. We wanted to give back control to the filmmakers by letting them create the shots themselves in an intuitive way without having to know how to operate 3D software.”

DeepSpace is targeting the European market, focusing on productions that may have a complex scene that is difficult to grasp or visualize. “Our workflow is to invite production designers to review their own designs in VR and together with the director make decisions on what they actually need to build,” explains Eken.

“A stunt coordinator can give us input on how he would approach a stunt. And, finally, the director of photography, together with the director, can effectively have a ‘shooting day’ in our system and download the recorded clips to edit into a previs movie.”

A recent Dutch feature film used DeepSpace to figure out a shoot that needed to take place on a highway with precision drives and near-miss accidents. (Image courtesy of DeepSpace)

“We wanted to give back control to the filmmakers by letting them create the shots themselves in an intuitive way without having to know how to operate 3D software.”

—Marijn Eken, Founder/Developer, ScreenSpace Lab

CG car assets and a virtual set enabled the filmmakers to imagine the roadway sequence by testing out lenses, framing, pacing and action. (Image courtesy of DeepSpace)

A shot from the final sequence shows how the live action closely matches the virtual production-enabled previs, as created with DeepSpace. (Image courtesy of DeepSpace)
