VFX Voice




September 27, 2018 | Fall 2018 Issue

Painting the Afterlife in WHAT DREAMS MAY COME

By IAN FAILES

Robin Williams plays Chris in What Dreams May Come.

 

Perhaps the ultimate goal of a visual effects artist is to be involved with a project that is art and spectacle at the same time. Vincent Ward’s 1998 film What Dreams May Come certainly embodied both, particularly in its ‘painted world’ version of the afterlife, visited by Robin Williams’ character, Chris, after the death of his wife, Annie (Annabella Sciorra).

The visual effects for the painted world of What Dreams May Come would take the concept of optical flow – where every pixel is tracked in a moving image – to new levels in order to turn live-action scenes into paintings in motion. This work, and other painterly imagery in the film, would ultimately result in an Academy Award for Best Visual Effects (awarded to Nicholas Brooks, Joel Hynek, Kevin Mack and Stuart Robertson).

On the 20th anniversary of What Dreams May Come, VFX Voice sat down with one of its visual effects supervisors, Nicholas Brooks, to find out more about the R&D behind the optical flow-based painterly world and how it was executed.

Robin Williams during filming of a scene where his character, Chris, encounters his old dog, Katie. Optical flow was used to make the flowery setting a painted world.

VFX Voice: What were you doing right before you started on What Dreams May Come?

Nicholas Brooks: Well, it certainly was an interesting time. I was at Mass.lllusions at the time. We got the scripts for The Matrix and What Dreams May Come all at the same time. So those two films were linked, basically. The bullet time of The Matrix and the painterly world of What Dreams May Come were part of the same R&D effort.

VFX Voice: Can you talk about some of the initial conversations with the director and this R&D effort that followed for the painterly world?

Brooks: Vincent felt that heaven had to be something special. He didn’t want to rely on golden light and white sets. So he came up with the concept of making the character Annie a painter – and art would be the way that we’d know we were in heaven. Then Chris, Robin Williams’ character, walks into heaven and sees his wife’s heaven and they’re connected through the painting.

Director Vincent Ward (left) and Robin Williams on set.

Chris and Annie inside the painted world afterlife.

Vincent Ward directs Robin Williams. Note the orange ball tracking markers dotted among the flowers.

To do that, Vincent explored re-touching the live-action photography. He was looking at the Romantic painter Caspar David Friedrich’s work for inspiration, particularly the painting ‘Two Men Contemplating the Moon.’ Right at that time I’d been working with Kodak’s Cineon software, and they’d just released Cinespeed, which was the first optical flow re-timer. From that, we came up with this wacky idea of doing ‘machine vision’ tracking of entire plates. We did some tests initially with painterly filters, which looked good on still frames but became a flickering mess when applied to a series of images. The idea became, why don’t we try to use optical flow to drive a paint system?

 

Chris and Albert (Cuba Gooding Jr.) come across the Purple Tree, matching a tree newly painted by Chris’s wife, Annie. Digital Domain ‘grew’ the tree with L-systems techniques. The studio was one of several vendors on What Dreams May Come, along with principal vendor Mass.Illusions, and POP Film, CIS Hollywood, Radium, Illusion Arts, Mobility, Giant Killer Robots, Shadowcaster and Cinema Production Services. Overall Visual Effects Supervisor Ellen Somers oversaw the production.

 

VFX Voice: Can you explain what optical flow meant, in terms of the way you wanted to use it?

Brooks: Optical flow comes from machine vision. If you’re putting an autonomous robot on the surface of the moon or Mars, you give it a vision system, a minimum of two, possibly three, possibly five cameras that view the world from slightly different angles. In a way, it’s similar to the way that humans with a stereo pair of optics, i.e., eyes, see the world and are able to understand depth through convergence.

So if you’ve got a robot and you’ve got these cameras, then you need to write the perceptual brain, or the perceptual program, that works out the differences between these views and creates an idea of depth. Basically, it tries to match each frame to the other frame and give you a per-pixel shift. That’s called a vector field, or an optical-flow vector field.

In our world, where we’ve got one camera, you create a vector field that allows you to track pixels from frame one to frame two. If you’re panning to the right, for instance, you will get a set of vectors that show the features of that world as values for each of those pixels in terms of x and y shifts.
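To make the vector-field idea concrete, here is a minimal sketch using OpenCV’s Farneback estimator as a modern stand-in; the production tools were built in-house around Cineon-era software, and the frame filenames here are hypothetical.

```python
# Sketch: estimate a per-pixel optical-flow vector field between two frames.
# OpenCV's Farneback estimator is a modern stand-in for the in-house tools
# described in the interview; the filenames are hypothetical.
import cv2

frame1 = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
frame2 = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# flow[y, x] = (dx, dy): how far the pixel at (x, y) moves between frames.
flow = cv2.calcOpticalFlowFarneback(
    frame1, frame2, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# For a camera panning right, scene features shift left in image space,
# so the mean x component comes out negative.
print("mean shift (dx, dy):", flow[..., 0].mean(), flow[..., 1].mean())
```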

What we realized was that this meant we could generate a paint stroke for every pixel of the image, and we could transform that paint stroke to the next image and to the next image after that, and so on, because the camera is moving and we are basically generating a set of pixels. Then what we needed was a particle system that would take optical flow as transformation information. None of the existing tools had just the right kind of image processing, so we knew we had to build it ourselves.
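As an illustration of that idea, here is a minimal sketch of a particle step that takes an optical-flow field as its transformation information; the seeding density and data layout are assumptions, not the original system.

```python
# Sketch: advect paint-stroke particles with a per-pixel optical-flow field.
# Each stroke anchor samples the flow vector at its position and is carried
# to where that pixel lands in the next frame.
import numpy as np

def advect_strokes(positions, flow):
    """positions: (N, 2) float array of (x, y) stroke anchors.
    flow: (H, W, 2) array of per-pixel (dx, dy) shifts."""
    h, w = flow.shape[:2]
    xi = np.clip(positions[:, 0].astype(int), 0, w - 1)
    yi = np.clip(positions[:, 1].astype(int), 0, h - 1)
    return positions + flow[yi, xi]

# Seed one stroke per pixel on a coarse grid (density is an assumption).
h, w = 1080, 1920
ys, xs = np.mgrid[0:h:8, 0:w:8]
strokes = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
# strokes = advect_strokes(strokes, flow)   # one step per new frame
```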

 

The original plate for a scene of Chris and Katie exploring the afterlife.

The final shot was enabled with extensive development of tracking techniques, optical flow and a specialized particles tool to produce the painterly effects.

 

VFX Voice: How was this developed further for What Dreams May Come?

Brooks: The first thing we did was get the film studio to give us some money to test this, and we went out and shot some footage of a guy walking through a forest in South Carolina. We selected two shots from that, and we hired a programmer, Pierre Jasmin, who had written his own particle system. Pierre was one of the original programmers at Discreet Logic, and he went on to co-found RE:Vision Effects.

Pierre had written a particle system that would take an image and analyze the color per pixel. Then what we did was generate paint strokes. We physically painted a bunch of strokes with white, blue and red paint mixed into them. So you can imagine these slightly Monet-like strokes of different shapes and sizes in which you could see the three pigments. We scanned them all in, and we used the color of the pixels – the white, blue and red channels – to drive the movement. For example, say the photography was green grass; the system would look at that green pixel and go, okay, if the base color is green, do some variations based on that.
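A rough sketch of that color-driven variation might look like the following; the stroke library, re-tinting math and jitter amount are all hypothetical stand-ins for the scanned strokes and rules described above.

```python
# Sketch: pick a scanned stroke and vary it around the plate's base color.
# The library, re-tinting scheme and jitter amounts are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def stroke_for_pixel(base_rgb, stroke_library):
    """base_rgb: (3,) floats in [0, 1], sampled from the plate.
    stroke_library: list of (h, w, 4) uint8 RGBA scanned-stroke images."""
    stroke = stroke_library[rng.integers(len(stroke_library))].copy()
    # Keep the scanned stroke's light/dark variation but re-tint it
    # toward the plate color, with a little per-stroke variation.
    lum = stroke[..., :3].mean(axis=-1, keepdims=True) / 255.0
    tint = np.clip(base_rgb + rng.normal(0.0, 0.05, size=3), 0.0, 1.0)
    stroke[..., :3] = (tint * lum * 255).astype(np.uint8)
    return stroke
```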

In Pierre’s system we generated layers of particles, driven by optical flow, using these paint strokes. We had rules for orientation, depth, and all sorts of different variations. In essence, we applied a traditional painter’s algorithm, i.e., painting from the background to the foreground: first the sky, then the horizon, and so on. And we essentially segmented the image into different alpha channels, so when you look at the image, it would be maybe 10 different maps at different depths.
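A painter’s algorithm over those segmented layers reduces to back-to-front ‘over’ compositing, roughly like this minimal sketch; the layer images themselves would come from rendering the strokes of each depth segment.

```python
# Sketch: back-to-front ("painter's algorithm") compositing of stroke layers.
# Each layer is a float RGBA image of strokes for one depth segment
# (sky, horizon, midground, foreground, ...), ordered farthest first.
import numpy as np

def composite_layers(layers):
    """layers: list of (H, W, 4) float RGBA images, back to front."""
    out = np.zeros_like(layers[0])
    for layer in layers:                      # nearest layer painted last
        a = layer[..., 3:4]
        out[..., :3] = layer[..., :3] * a + out[..., :3] * (1.0 - a)
        out[..., 3:4] = a + out[..., 3:4] * (1.0 - a)
    return out
```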

What Dreams May Come was basically greenlit on the back of this test. Everybody realized it was possible that we could actually film in amazing locations and transform the footage into a moving painting without it looking kind of like kitsch or CG or over-processed.

Interestingly, we did this What Dreams May Come test – which was really successful – and right after that we did the test for bullet time using a similar technique, but without the particles. Bullet time was more about frame interpolation, where we set up all these multiple cameras and interpolated across the different camera views.
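Flow-based frame interpolation of the kind Brooks describes can be sketched as warping each source frame part-way along the flow and blending; this simplified version ignores occlusions and treats the flow field as smooth enough to sample at the destination pixel.

```python
# Sketch: an in-between frame from two views plus the flow between them.
# Occlusion handling and the multi-camera rig logic are omitted.
import cv2
import numpy as np

def interpolate(frame1, frame2, flow, t=0.5):
    """flow: (H, W, 2) per-pixel (dx, dy) from frame1 to frame2."""
    h, w = flow.shape[:2]
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    fx, fy = flow[..., 0], flow[..., 1]
    # Sample frame1 a fraction t back along the flow, frame2 the rest forward.
    a = cv2.remap(frame1, xs - t * fx, ys - t * fy, cv2.INTER_LINEAR)
    b = cv2.remap(frame2, xs + (1 - t) * fx, ys + (1 - t) * fy,
                  cv2.INTER_LINEAR)
    return (1 - t) * a.astype(np.float32) + t * b.astype(np.float32)
```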

VFX Voice: Once you were in production, what things were being done on set to help with the optical flow process later on?

Brooks: We’d also been developing the use of Lidar for visual effects production. We were Lidar scanning the landscapes and getting tracking information and spatial information for the environment, which helped us generate depth maps and 3D data. This seems trivial today, but at the time we were absolutely using Lidar in a way that it hadn’t been used before. At that point it was still used mostly for engineering. We were scanning trees – all sorts of stuff – and learning how to segment all that point-value information, surface it, and abbreviate it so that we could use it in our paintings.
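One piece of such a Lidar workflow can be sketched as projecting the scanned point cloud through the production camera to get a per-pixel depth map; the pinhole model and z-buffering below are standard, while the intrinsics and pose are hypothetical inputs.

```python
# Sketch: a depth map from a Lidar point cloud via a pinhole camera model.
# K (3x3 intrinsics), R (3x3 rotation) and t (3-vector translation) are
# assumed to come from camera tracking; surfacing/decimation is omitted.
import numpy as np

def depth_map(points_world, K, R, t, shape):
    h, w = shape
    cam = (R @ points_world.T + t[:, None]).T     # world -> camera space
    z = cam[:, 2]
    front = z > 0                                 # points in front of camera
    uv = (K @ cam[front].T).T
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    depth = np.full((h, w), np.inf)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    # Keep the nearest point landing on each pixel (a simple z-buffer).
    np.minimum.at(depth, (v[ok], u[ok]), z[front][ok])
    return depth
```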

With all this information we had, we were able to just keep tweaking the imagery, and we learned so much along the way. One of those little tricks we learned was, as we were moving paint along we could kind of accumulate it and sort of smear it. As the camera moved it would be self-smearing.
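That self-smearing accumulation can be sketched as warping the previous painted frame forward along the flow and blending fresh paint on top; the blend weight here is a made-up parameter, not a production value.

```python
# Sketch: "self-smearing" paint accumulation, assuming per-frame flow.
# The previous painted frame is carried along the camera motion, and a
# little fresh paint is blended in, so the paint accumulates and smears.
import cv2
import numpy as np

def smear_forward(prev_paint, flow, fresh_paint, keep=0.85):
    """prev_paint, fresh_paint: (H, W, 3) float32 images; keep is the
    (hypothetical) fraction of carried-over paint per frame."""
    h, w = flow.shape[:2]
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Backward warp: each output pixel looks up where its paint came from.
    carried = cv2.remap(prev_paint, xs - flow[..., 0], ys - flow[..., 1],
                        cv2.INTER_LINEAR)
    return keep * carried + (1.0 - keep) * fresh_paint
```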

