VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.

Winner of two prestigious 2018 Folio Awards for excellence in publishing.




April 02, 2019

Web Exclusive

In Motion with MILL MASCOT

By IAN FAILES

The user interface for Mill Mascot, which allows users to transfer their facial and hand performances onto a CG character in real time. (All images courtesy of The Mill)

We’re reaching a time when traditional animation, CG techniques, virtual production, real-time rendering and motion capture are all coming together in useful tools for artists in visual effects and animation. Indeed, we may already be there, as visual effects studio The Mill’s new Mill Mascot system shows.

Mill Mascot is a toolset that lets artists puppeteer CG characters through hand and facial gestures. Making that possible is a combination of real-time game-engine technology, animation tools and motion sensors. The idea is that directors, clients and artists can ‘jump in’ and get creatively involved in developing a character themselves – live.

VFX Voice asked The Mill about how the Mascot system was developed, how it works and what it has been used for so far.

Hand gestures in Mill Mascot translate to the character in real time.

RAMPING UP ON REAL-TIME ANIMATION PERFORMANCE

Mill Mascot began life at the studio as The Mill’s ‘real-time animation system,’ or RTAS, bringing together software and hardware tools aimed at enabling live animated production. The results were a number of in-house Mill demos and some ‘Vonster’ spots for client monster.com, which saw a hairy creature puppeteered to perform human-like antics.

“The rudimentary beginnings of Mill Mascot allowed characters to be controlled in the very traditional sense of puppeteering by only moving their mouths open and shut,” outlines Jeff Dates, Creative Director at The Mill New York. “We also had to be aware of character design in the early stages, as the system allowed for minimal movement of the body and extremities.”

Since that time, says Dates, Mill Mascot has evolved in its abilities in significant ways. “It now allows for a great deal more flexibility and creativity. Characters are far more dynamic in their look, and the system generates final imagery at a much higher fidelity. We also have much finer control over the movement, from the entire body and extremities down to the eyelids and nuanced facial expressions. This has also greatly been improved by the presence of facial tracking in addition to hand controls.”

Behind those abilities is a combination of Epic’s Unreal Engine, Leap Motion for hand tracking, and an iPhone X for facial tracking. However, The Mill continues to develop and iterate the system so that it will ultimately be software and hardware agnostic.
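As a rough illustration of how such sensor inputs might drive a rig – a hypothetical sketch, not The Mill’s actual code; the function names, value ranges, and control mappings here are all assumptions – tracked hand and face values can be normalized into the 0–1 weights that character rigs typically expect:

```python
# Hypothetical sketch of mapping tracked inputs onto puppet-rig controls.
# Names and ranges are illustrative assumptions, not The Mill's API.

def clamp01(x: float) -> float:
    """Keep a control weight in the 0-1 range rigs typically expect."""
    return max(0.0, min(1.0, x))

def jaw_open_from_pinch(pinch_mm: float, fully_open_mm: float = 80.0) -> float:
    """Turn a thumb-index pinch distance (as a hand tracker might report,
    in millimetres) into a 0-1 'jaw open' blendshape weight, so opening
    the hand opens the character's mouth."""
    return clamp01(pinch_mm / fully_open_mm)

def blink_from_face(eye_blink_coeff: float) -> float:
    """ARKit-style face tracking reports per-expression coefficients
    already in 0-1; pass the blink coefficient through, clamped for safety."""
    return clamp01(eye_blink_coeff)
```

With this mapping, a half-open hand (a 40mm pinch against an 80mm range) would set the jaw blendshape to 0.5 – the kind of direct, legible correspondence that makes the system feel like a physical puppet.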

The Mill New York Creative Director Jeff Dates.

“The [Mill Mascot] system behaves much like a live-action puppet. We’re able to introduce performance controls for facial emoting and hand dexterity for the character. The ability to generate useful animation is limited only by what the character is able to do. So in reality, we found that once the character is ‘active’ and being performed, it’s as alive as any puppet. So we had literally hours of animation from our recording session. Not to mention real candid ‘outtakes.’”

—Jeff Dates, Creative Director, The Mill New York

It’s all part of a push at The Mill to approach visual effects and animation work differently. The studio has the benefit of working on a range of creative projects, from commercials to shorts to motion graphics and animated pieces. In real-time work specifically, The Mill has been at the forefront, having partnered with Epic on a virtual production demo called ‘The Human Race.’ There, The Mill’s tracker-covered Blackbird car was used to deliver commercial-style shots of a Chevrolet vehicle: a virtual car was composited live over the Blackbird, and could be re-skinned in real time.

This and other real-time projects are designed to give directors and clients flexibility and choice. A traditional shooting, CG and animation pipeline requires time for each iteration; real time allows quick changes of direction and the chance to explore more options.

Importantly, The Mill isn’t aiming, just yet, for a full-body motion-capture solution with Mill Mascot. Clearly there are systems capable of doing that incredibly well. Instead, Dates makes a distinction between real-time performance animation and real-time motion capture, with Mill Mascot intended to be a tool that can drive a fun character performance, not necessarily one completely grounded in reality (even though that ability does exist).

The Mill’s Blackbird vehicle, as filmed during ‘The Human Race,’ a virtual production collaboration with Unreal Engine.

The ‘Vonster’ project was one of the first commercial uses for The Mill’s real-time animation performance system.

“At The Mill we use theatre-trained puppeteers and real-time animation artists to inject human essence and personality into our characters, but anyone can take the reins.”

—Jeff Dates, Creative Director, The Mill New York


MILL MASCOT IN ACTION

The ‘Vonster’ spots offered some of the first glimpses of what The Mill’s real-time animation system was capable of. Live-animation performances also appeared in ‘Seanna’s Ocean Buddies,’ a Sesame Studios animated short. Then, at a recent conference for HPE (Hewlett Packard Enterprise), The Mill devised a real-time interactive installation with Mascot.

“After creating the CG mascot for HPE’s ‘Tame The IT Monster’ TV campaign, they wanted to re-purpose the asset and use Mill Mascot in order to interact with their audience at the HPE Discover Conference in Las Vegas,” recounts Dates. “We created three activations using Mill Mascot in Vegas. First, the IT Monster performed live on stage, interacting with the HPE CEO during his Keynote presentation and ‘interrupting’ his speech. Because the character was being performed live by puppeteers using Mill Mascot backstage, the IT Monster could also interact with the audience, proving the content wasn’t pre-recorded.

“Additionally, we built two custom booths on the conference floor, one that allowed attendees to play with and interact with the IT Monster, and one that allowed them to puppeteer and animate him themselves. It was hugely rewarding to see people interacting with Mill Mascot on such a large scale and the client was extremely happy.”

Controlling the Vonster monster with hand gestures.

“Using a system like this really allows [clients] to be a part of the process. Not only can they see the action, but they can also direct and give feedback live, and in some cases perform the characters themselves. This adds a whole new level of accessibility to the art.”

—Jeff Dates, Creative Director, The Mill New York


HOW THE SYSTEM WORKS

Dates has been the driving force behind Mill Mascot, a project he says came out of the challenges of many projects he’d worked on thus far. “I’ve always been a storyteller. One of the frustrating things for writers is getting work produced, so I came up with the idea as a way to produce my own stories and sketches to try out new material. If you are familiar with the saying, ‘Fail fast, fail often,’ well, Mill Mascot allowed me to get ideas out of my system and onto the screen quickly. Animation became less precious, and experimentation became the currency.”

To get a character working in Mill Mascot, artists start by generating a high-quality CG character asset using traditional modeling, rigging and texturing tools they are already familiar with. Then they can import the character into Mill Mascot, connecting it to the gestural controls (the Leap Motion controller and iPhone X) and allowing it to be animated and rendered in real-time via Unreal Engine.
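The per-frame flow that pipeline implies – read the sensors, smooth the samples to tame jitter, apply the results to the rig – could be sketched as follows. This is a hypothetical illustration under stated assumptions (the class and function names are invented; real tracker SDKs and Unreal integration are stubbed out), with the low-pass filtering as the point of interest:

```python
# Hypothetical per-frame update for a gesture-driven puppet.
# Sensor reading and rig application are stubbed; the smoothing is the
# point: raw tracker samples jitter, so each control is low-pass filtered.

class ExponentialSmoother:
    """Exponential moving average: a higher alpha follows the performer
    more tightly, a lower alpha damps sensor noise."""
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.value = None

    def update(self, sample: float) -> float:
        if self.value is None:
            self.value = sample  # first frame: no history yet
        else:
            self.value = self.alpha * sample + (1 - self.alpha) * self.value
        return self.value

def drive_rig(raw_samples, alpha: float = 0.5):
    """Smooth a stream of raw 0-1 control samples, returning the values
    that would be sent to the rig each frame."""
    smoother = ExponentialSmoother(alpha)
    return [smoother.update(s) for s in raw_samples]
```

For example, a raw stream of `[0.0, 1.0, 1.0]` with `alpha=0.5` yields `[0.0, 0.5, 0.75]` – the control eases toward the performer’s pose instead of snapping, which is one plausible reason a live-driven character can read as organic rather than twitchy.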

“The system behaves much like a live-action puppet,” explains Dates. “We’re able to introduce performance controls for facial emoting and hand dexterity for the character. The ability to generate useful animation is limited only by what the character is able to do. So in reality, we found that once the character is ‘active’ and being performed, it’s as alive as any puppet. So we had literally hours of animation from our recording session. Not to mention real candid ‘outtakes.’”

Consideration of the gestural controls is important to then allow a performer to use hand and facial movement to drive the character. “Here at The Mill,” says Dates, “we use theatre-trained puppeteers and real-time animation artists to inject human essence and personality into our characters, but anyone can take the reins.”

So far, the kinds of characters created have appeared as final animation in advertisements and also in live audience situations. But Mill Mascot can also be used as a previs tool or for exploring performances, either by animators within the studio or directors and clients demonstrating what they would like to see.

Live view, as the creation of the Vonster monster comes together.

“This doesn’t make traditional animation redundant, it just provides a new way in which we can animate characters in a world that demands fast and interactive content all the time.”

—Jeff Dates, Creative Director, The Mill New York


WHAT IT MEANS FOR CREATIVES

So what does Mill Mascot offer for directors, clients and artists that is different from the usual way they bring CG characters to life? Dates suggests that often clients do not know or understand what goes on behind closed doors in the world of visual effects.

“However,” the creative director notes, “using a system like this really allows them to be a part of the process. Not only can they see the action, but they can also direct and give feedback live, and in some cases perform the characters themselves. This adds a whole new level of accessibility to the art.

“The key benefit for all parties is creative flexibility,” continues Dates. “With the ability to treat character animation more like a shoot, capturing content and performance live, directors and animators can make creative decisions on the fly, develop new content and ideas as they go, and even capture ‘bloopers.’”

Will it take away the jobs of animators? Not at all, cautions Dates. “This doesn’t make traditional animation redundant, it just provides a new way in which we can animate characters in a world that demands fast and interactive content all the time.”

Artists utilize traditional modeling tools to get their CG characters ready for Mill Mascot.

A Mill Mascot character inside Unreal Engine.

“The IT Monster performed live on stage, interacting with the HPE CEO during his Keynote presentation and ‘interrupting’ his speech. Because the character was being performed live by puppeteers using Mill Mascot backstage, the IT Monster could also interact with the audience, proving the content wasn’t pre-recorded.”

—Jeff Dates, Creative Director, The Mill New York

INTO THE FUTURE

Moving forward, Dates says he hopes Mill Mascot can be used to help generate performances for some famous brand mascots. “This system has huge potential in the episodic entertainment world, and I’d really love to see Mill Mascot used in this space. We managed to ‘shoot’ the two-minute animated film for Sesame Studios last year in just one day, so this could be of huge benefit to brands such as Nickelodeon and Cartoon Network.”

A background plate in Unreal Engine. Characters performed with Mill Mascot can simply be placed against any background element.

Another view of the character with tracking of a person’s facial performance via an iPhone X.
