October 2, 2017 | Fall 2017 Issue

IMMERSIVE EXHIBITS BRING HISTORY AND CULTURE TO NEW LIFE

By HELENE SEIFER

TOP: “theBlu,” a VR adventure from Wevr, takes museum patrons to the depths of the ocean for virtual underwater encounters. “theBlu” made its museum debut in early 2017 at the Dubai Aquarium & Underwater Zoo, followed by an extended stint at the Natural History Museum of Los Angeles, and more locations are planned. The Natural History Museum version of “theBlu” consisted of six 9’ x 9’ pods that held one person at a time for a personal immersive experience. “VRZOO” is another Wevr VR experience. (Photos courtesy of Design I/O)

A Babylonian princess created the world’s first museum in 530 BC in what is now Iraq. This repository of Mesopotamian artifacts was neatly arranged and each ancient treasure accompanied by a chiseled clay cylinder with a description in three languages. Although more than 2,500 years have passed since Princess Ennigaldi preserved the past for all to see, for much of history the basic organization of museums remained the same: display some cool stuff and label it. In this age of exploding technology, however, the creative approach has kicked into hyper-drive. Welcome to the new museum experience, where systems developed for film and gaming now reign.

“How many times will you dive with a whale?” asks Jake Rowell, Director of “theBlu,” a VR adventure from Wevr that brings us to the depths of the ocean for underwater encounters without getting wet. “Most people are not scuba certified!” “TheBlu,” which started as a home application, made its museum debut in early 2017 at the Dubai Aquarium & Underwater Zoo, followed by an extended stint at the Natural History Museum of Los Angeles, and more locations are planned. Museum patrons virtually encounter a giant blue whale, a jellyfish migration, and a whale fall where a deceased whale’s bones have become an undersea ecosystem. “It’s a celebration of the ocean and ocean life,” says Neville Spiteri, CEO and Co-founder of Wevr. “We worked with oceanographers and scientists to ensure a degree of accuracy.”

The Natural History Museum version of “theBlu” consisted of six 9’ x 9’ pods that held one person at a time for a personal immersive experience. “We go to great lengths to consider the audience,” continues Spiteri. “VR is a powerful medium and you can instill a high degree of empathy and can also scare people easily. We aim for the right level of awe. No cheap scares. No shark jumps out at you.” Rowell adds, “Exploration is the heart of what ‘theBlu’ is.” Meeting a full-sized whale, even virtually, is awesome. “The creature acknowledges you’re there and they’re there. The connection is very powerful.”

According to Spiteri, reactive sea anemones and bioluminescent creatures are possible because, “It’s a combination of software, custom development of code and algorithms, and creating a simulation engine that determines how the fish move and swim.” Explains Rowell, “‘TheBlu’ code base lives in Unity – the rendering engine. It’s flexible and customizable and can achieve many different looks. We set a series of ‘go-to’ points in the story. Some parts are scripted, but the ocean still needs to feel alive and random.” Art is also critical, and the work of Academy Award-winning visual effects artist Andy Jones was key.
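
Wevr has not published the code behind “theBlu,” but the blend Rowell describes – authored “go-to” points with randomness layered on top so the ocean never feels canned – can be sketched in a few lines. The class name, waypoint values and noise scale below are invented for illustration; the actual Unity implementation is Wevr’s own.

```python
import math
import random

class FishAgent:
    """Toy sketch of a scripted-but-alive swimmer: the fish steers toward
    authored 'go-to' points, while small random perturbations keep the
    motion from looking canned. (Illustrative only; not Wevr's code.)"""

    def __init__(self, waypoints, speed=1.0, wander=0.3):
        self.waypoints = waypoints      # authored story beats ("go-to" points)
        self.target_index = 0
        self.x, self.y = waypoints[0]
        self.speed = speed
        self.wander = wander            # how much randomness to layer on top

    def update(self, dt):
        tx, ty = self.waypoints[self.target_index]
        dx, dy = tx - self.x, ty - self.y
        dist = math.hypot(dx, dy)
        if dist < 0.5:                  # reached this beat, move on to the next
            self.target_index = (self.target_index + 1) % len(self.waypoints)
            return
        # Scripted steering toward the target, plus a little wander noise
        self.x += (dx / dist) * self.speed * dt + random.uniform(-1, 1) * self.wander * dt
        self.y += (dy / dist) * self.speed * dt + random.uniform(-1, 1) * self.wander * dt

# Example: a fish cycling through three authored points in the scene
fish = FishAgent(waypoints=[(0, 0), (10, 2), (5, 8)])
for _ in range(600):                    # roughly ten seconds at 60 fps
    fish.update(dt=1 / 60)
```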

The Museum of Science, Boston has taken a deep dive into tech in order to excite visitors and inspire life-long learning. “Particle Mirror” opened this year, and inculcates an interest in physics. Kids and adults alike gambol in a snowstorm, bounce giant colorful orbs and frolic through pixie dust – and nothing is real.

TOP and BOTTOM: “Particle Mirror” is now on display at the Museum of Science, Boston. Participants are led through nine revolving scenarios, interacting with a changing variety of dots and spots, all following the laws of physics. In the “snow” simulation, visitors collect the falling particles, toss them around, or sweep them off the floor as they accumulate. In the “blue sparks” simulation, a “trails” effect makes the “sparks” look like they’re swimming through swirling liquid. Particles generate different musical chords when pushed. (Photos courtesy of Karl Sims)

Created by Karl Sims, a digital media artist, computer scientist, and recipient of a MacArthur Fellowship, the wall-sized virtual mirror uses a Microsoft Xbox One Kinect depth-sensor camera to capture visitors’ motions and depth and project them into the AR environment. Sims, who previously founded software company GenArts, developed the systems for “Particle Mirror” using C, OpenGL and OpenCL, running on a Linux computer with an NVIDIA GTX 1080 graphics card. Participants are led through a series of nine revolving scenarios, interacting with a changing variety of dots and spots, all following the laws of physics and the properties programmed into each. Gravity causes “snow” to fall from the top of the screens – and, just like real snow, it collects on heads and shoulders, and can be scooped into snowballs. Music enhances the magic – when particles collide, different sounds are generated. “The bubbly effect makes bubbly sounds. They gurgle when pushed,” Sims explains. “The goal is to inspire kids to learn more.”
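
Sims’s installation is written in C with OpenGL and OpenCL, and its source is not public. As a rough illustration of the core idea – particles falling under gravity until the depth camera reports a body surface for them to rest on – a heavily simplified sketch might look like this (the working resolution, gravity constant and depth threshold are assumptions, not the exhibit’s values):

```python
import numpy as np

# A toy "snow" pass in the spirit of "Particle Mirror": particles fall under
# gravity and come to rest wherever the depth camera reports a body surface.
# (Simplified illustration only; the real installation is C/OpenGL/OpenCL.)

W, H = 320, 240                      # working resolution (assumed)
GRAVITY = 200.0                      # px / s^2, arbitrary for this sketch

def occupied(depth_frame, x, y, near=1.5):
    """True if the depth camera sees a visitor closer than `near` meters
    at pixel (x, y) -- i.e. somewhere snow can pile up."""
    xi, yi = int(np.clip(x, 0, W - 1)), int(np.clip(y, 0, H - 1))
    return depth_frame[yi, xi] < near

def step_snow(particles, depth_frame, dt):
    """particles: list of [x, y, vy]; move each flake downward unless it has
    landed on a head, a shoulder, or the floor."""
    for p in particles:
        p[2] += GRAVITY * dt                       # accelerate downward
        new_y = p[1] + p[2] * dt
        if new_y >= H - 1 or occupied(depth_frame, p[0], new_y):
            p[2] = 0.0                             # flake settles and rests
        else:
            p[1] = new_y

# Example frame: no visitor in view (all depths far), 1,000 flakes
depth = np.full((H, W), 4.0)                       # meters
snow = [[np.random.uniform(0, W), np.random.uniform(0, 10), 0.0] for _ in range(1000)]
step_snow(snow, depth, dt=1 / 60)
```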

The museum also houses Sims’s “Reaction-Diffusion Media Wall,” installed in 2016. He details that the exhibit simulates “two chemicals that make emergent dynamic patterns.” Visitors at a kiosk manipulate patterns projected on 24 hi-def screens. To create the effects, Sims used a piece of consumer-gaming hardware: a graphics processor with 2,000 processing cores on a Linux machine.
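
Sims does not specify which reaction-diffusion equations drive the wall, but the Gray-Scott model is a common choice for exactly this kind of two-chemical, emergent-pattern simulation. The CPU sketch below uses generic textbook parameters purely to show the mechanism; the installation itself runs on a GPU with roughly 2,000 cores.

```python
import numpy as np

def gray_scott_step(U, V, Du=0.16, Dv=0.08, f=0.035, k=0.065, dt=1.0):
    """One update of the Gray-Scott reaction-diffusion system, in which two
    'chemicals' U and V diffuse and react to form emergent patterns.
    (Generic textbook parameters; not the Media Wall's actual settings.)"""
    def laplacian(Z):
        return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
                np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)
    UVV = U * V * V
    U += (Du * laplacian(U) - UVV + f * (1 - U)) * dt
    V += (Dv * laplacian(V) + UVV - (f + k) * V) * dt
    return U, V

# Seed a 256x256 grid with a small square of chemical V and iterate
U = np.ones((256, 256)); V = np.zeros((256, 256))
U[118:138, 118:138], V[118:138, 118:138] = 0.5, 0.25
for _ in range(5000):
    U, V = gray_scott_step(U, V)
# V now holds blob- and stripe-like patterns reminiscent of biological forms
```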

The Museum of Science, Boston develops most things in-house with a dedicated nine- or 10-person team, including a 3D designer, a physical manipulative engineer, software developers and builders. According to VP of Exhibition Development Christine Reich, special effects excite and engage people in learning. “Neuroscience is now teaching us that emotions are the starting point for behavior. When people are in a heightened state, we can push them beyond their expectations. Triggering that emotional reaction leads to deeper learning. One way is through digital immersion.”

On Karl Sims’ “Reaction-Diffusion Media Wall” in the Museum of Science, Boston, two simulated chemicals, shown as white and dark blue, react and diffuse to generate biological-looking patterns and shapes, which are displayed on a high-resolution wall of 24 screens. A touch-screen kiosk in front of the display allows visitors to adjust parameters and create a wide range of different results. (Photo courtesy of Karl Sims)

Equally important is referencing cultural touchstones. As Reich notes, the Museum is always trying to determine “how we can leverage pop culture to teach STEM (Science, Technology, Engineering and Mathematics).” Last year, their “Star Wars®: Where Science Meets Imagination” experience concluded its eight-year, 20-venue tour. Their “The Science Behind Pixar” exhibition elucidates the engineering and computer science behind animation technology. Seen by over 800,000 people thus far, the exhibition now has two traveling versions, each with over 40 interactive components, including a simulation that lets visitors program the grass in a movie scene to determine whether blade density affects movement.

Understanding how actions affect our environment is the goal of the 2,500-square-foot interactive “Connected Worlds” at the New York Hall of Science. Visitors are surrounded by six interconnected ecosystems: Desert, Mountain Valley, Wetlands, Reservoir, Jungle and Grasslands. A 40-foot-high waterfall, rivers, indigenous plants and native creatures are projected onto the walls and floor, and environments thrive or die depending on what’s done to them. Infra-red cameras react when practical logs, wrapped in reflective material, are used to divert the path of a virtual stream to feed or starve flora and fauna. Gesture-reading Kinect cameras react to actions that the group of museum-goers take: whether interacting with animals or planting virtual seeds, every human action counts in how well the entire ecosystem works.

According to Geralyn Abinader, Creative Producer at NYSCI, “When you talk about systems thinking, it’s really hard for even adults to grasp the relationships in a living system. When an individual does an action, or there are aggregate behaviors that cause changes, sometimes the reactions are immediate, but sometimes they play out over the long term. Sometimes they are nearby, sometimes on the other side of the world.” Visitors can see that if they plant enough seeds and supply enough water, animals will appear. If food sources dry up, animals will migrate elsewhere.
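
The cause-and-effect loop Abinader describes boils down to simple resource thresholds playing out over time. A drastically reduced sketch of that feedback – with thresholds and rates invented for illustration, not taken from the exhibit – might read:

```python
def update_ecosystem(state, dt):
    """Tiny rule-of-thumb sketch of the feedback loop described above:
    enough seeds and water -> plants grow -> animals arrive; when food runs
    out, animals migrate away. All thresholds and rates are invented."""
    state["plants"] += 0.5 * dt if (state["seeds"] > 5 and state["water"] > 10) else -0.2 * dt
    state["plants"] = max(state["plants"], 0.0)
    if state["plants"] > 3:
        state["animals"] = min(state["animals"] + 0.1 * dt, 20)   # animals appear
    elif state["plants"] < 1:
        state["animals"] = max(state["animals"] - 0.3 * dt, 0)    # animals migrate away
    state["water"] = max(state["water"] - 0.05 * dt * state["plants"], 0)  # plants consume water
    return state

world = {"seeds": 8, "water": 30, "plants": 0.0, "animals": 0.0}
for _ in range(600):            # simulate ten minutes at one step per second
    world = update_ecosystem(world, dt=1.0)
```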

Although based on real scientific models about the ways water systems work, the projected worlds are fanciful, filled with imaginary critters. “We wanted children to understand one or two important concepts and not get bogged down with expectations about what they were seeing,” offers “Connected Worlds” developer/designer Theo Watson, who, with his wife Emily, founded Design I/O, the company that designed and programmed “Connected Worlds.” “The more straightforward the characters were, the more kids said they liked them, but they had very little to say about them. The weirder the characters, the more they had to say. That helped us because it captured their interest and attention.” The exhibit planning included scientific collaboration, as Watson explains. “Some of the people who advised the UN on climate change were involved.”

“We go to great lengths to consider the audience. VR is a powerful medium and you can instill a high degree of empathy and can also scare people easily. We aim for the right level of awe. No cheap scares. No shark jumps out at you. Exploration is the heart of what ‘theBlu’ is. The creature [a full-sized whale] acknowledges you’re there and they’re there. The connection is very powerful.”

—Neville Spiteri, CEO and Co-founder, Wevr, and Jake Rowell, Director of ‘theBlu’

The complex responsiveness of the exhibit was a technological challenge, as Watson attests. “Almost every aspect of this was on the verge of being impossible to do.” He developed the software with openFrameworks, built in C++. “Each of the six environments is a single projector connected to a single computer. The floor is another computer, but the content comes from seven projectors, so it feels like one continuous digital surface. All the environments talk to each other over a network. When a creature flies from the desert to the forest, the forest knows that and makes sure the bird shows up on the forest screen.” Design I/O used the Xbox Kinect camera, but the interaction and tracking aspects are customized. “Most of our installations are closed source. Sometimes we’ll solve really hard problems that would be helpful to millions of people, so we’ll open source that part of it.”
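
Because “Connected Worlds” is largely closed source, the hand-off Watson describes can only be approximated. The sketch below shows one plausible shape for it – each environment’s machine sending a small network message when a creature exits its screen – with the message fields, port and address book all invented for illustration (the real system is C++ and openFrameworks):

```python
import json
import socket

# Toy sketch of the hand-off Watson describes: when a creature leaves one
# environment's screen, that machine tells the neighboring machine to spawn
# it. (Illustrative only; "Connected Worlds" is C++/openFrameworks and mostly
# closed source. The port and message fields here are invented.)

PEERS = {"forest": ("192.168.0.12", 9000)}          # assumed address book

def send_creature(creature_id, species, dest_env, exit_edge):
    msg = json.dumps({
        "type": "creature_handoff",
        "id": creature_id,
        "species": species,
        "enter_from": exit_edge,                     # e.g. "left", "right"
    }).encode()
    host, port = PEERS[dest_env]
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (host, port))               # fire-and-forget UDP

# When a bird flies off the desert screen toward the forest:
# send_creature(creature_id=42, species="bird", dest_env="forest", exit_edge="left")
```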

LEFT and RIGHT: In the “Star Wars: Where Science Meets Imagination” exhibition at the Museum of Science, Boston, visitors can jump to light speed in a full-size replica of the cockpit of Episode IV’s Millennium Falcon in a multimedia ride through the universe. And, they can meet a puppet of Jedi Master Yoda up close in the swamps of Dagobah from Episode V: The Empire Strikes Back. (Photo credit: Dom Miguel Photography. Copyright © 2006 Museum of Science, Boston and Lucasfilm Ltd.)

An added attraction in “Connected Worlds” is the “Living Library,” a 2-foot by 4-foot physical book. Look through the book and find an interesting section, and a hi-res camera mounted above will recognize the page and prompt an interactive digital display.

“[The VR experience] is a combination of software, custom development of code and algorithms, and creating a simulation engine that determines how the fish move and swim. ‘TheBlu’ code base lives in Unity – the rendering engine. It’s flexible and customizable and can achieve many different looks. We set a series of ‘go-to’ points in the story. Some parts are scripted, but the ocean still needs to feel alive and random.”

—Neville Spiteri, CEO and Co-founder, Wevr, and Jake Rowell, Director of ‘theBlu’

The Franklin Institute in Philadelphia has embraced immersive technology as a means of engagement and learning, with VR and AR strategies. “We stay up on what the latest available technologies are,” explains Susan Poulton, the Institute’s Chief Digital Officer. “If the technology exists in the public, we want it here.” Under her guidance the museum has VR stations with HTC Vive and Oculus Rift headsets that transport visitors to the depths of the sea, into outer space, or inside the human body. The Institute also has four movable, 6-foot-tall, 10-person VR Carts outfitted with 10 iPod touches, Samsung headsets, and access to the museum’s full array of 360-degree photos and videos, including all 26 of NASA’s VR properties. Visitors can download the app on their phones and access the content at home, as well. To make sure that’s possible, the museum is giving away over 10,000 Google Cardboards. Additionally, in a further commitment to immersive tech, two flight simulators have been upgraded to recreate the Apollo 11 landing in a 4D ride using HTC Vive.

At the Museum of Science, Boston, a visitor changes the surface appearance of a virtual object in “The Science Behind Pixar” exhibition, which demonstrates some of the engineering and computer science behind animation technology. Seen by more than 800,000 people thus far, the exhibition now has two traveling versions, each with over 40 interactive components, including a simulation allowing visitors to program elements in a movie scene. (Photo copyright © Michael Malyszko)

The Franklin Institute of Philadelphia has embraced immersive technology as a means of engagement and learning, with VR and AR strategies. The museum has VR stations with HTC Vive and Oculus Rift headsets that transport visitors to the depths of the sea, into outer space, or inside the human body. The Institute also has four movable, 6-foot-tall, 10-person VR Carts outfitted with 10 iPod touches, Samsung headsets, and access to the museum’s full array of 360-degree photos and videos, including all 26 of NASA’s VR properties. The “Terracotta Warriors of the First Emperor” augmented reality experience was featured last September. (Photo courtesy of The Franklin Institute)

“The real goal,” according to Poulton, “is providing technology accessibility. To experience VR and all the things it can do. But the big issue is throughput. How many visitors can we put through in a day? The VR stations are limited to 24 people an hour, so that’s approximately 100 a day. It’s a challenge finding great content that works for the rhythm and flow of the museum.”

The other prong in their strategy is augmented reality experiences. For example, the “Terracotta Warriors” exhibit opened in September. Visitors can download an app to access AR on their phones. “Activate the AR and hold the phone over a warrior,” Poulton explains, “and it might show the original weaponry, created with CGI. Some will show the chemical decay process or the original coloring of the statues.”

“We have to meet expectations,” she says. “This younger generation – the phone is a third eye, not a device. They need and will expect to use it, and if we’re not figuring that out now, 10 years from now, even 5 years from now, we’re in trouble.” She continues, “We are learning how to react to audience needs and do things quicker and not get so hung up on the details. We need hyper-relevant contexts – not just a hurricane exhibit, but a hurricane that’s happening right now in Japan. Museums are going to radically change.”

Even art museums are jumping aboard the technology train. Design I/O was commissioned to create two pieces for an interactive video wall at The Cleveland Museum of Art. In the reveal mode, visitors’ movements effectively “paint” an item from the collection. Zoom turns each museum-goer into a living magnifying glass. As Watson details, “In reveal, you’re basically throwing paint strokes off your body. Zoom is the simplest idea. As you walk up to the video wall, you see an object from the collection. Your body position can zoom into it until it’s 10-20 times magnified.”
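
The “zoom” interaction amounts to mapping one measured quantity – how close a visitor stands to the wall – onto a magnification level. A minimal sketch of that mapping, with the distance range assumed and only the 10-20x ceiling taken from Watson’s description:

```python
def zoom_factor(distance_m, near=0.5, far=4.0, max_zoom=20.0):
    """Map a visitor's distance from the video wall to a magnification level:
    far away -> 1x, right up against the wall -> max_zoom (the article cites
    10-20x). The near/far distance range is an assumption for illustration."""
    t = (far - min(max(distance_m, near), far)) / (far - near)   # 0..1
    return 1.0 + t * (max_zoom - 1.0)

# A visitor standing 1 meter from the wall:
print(zoom_factor(1.0))   # ~17x
```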

In an effort to entice young adults to appreciate older art, The Montreal Museum of Fine Arts has projected an enchanted garden in their Romanticism Gallery. Leaves rustle and winds blow, creating an alluring room. They also developed a fascinating duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of a fashion icon. “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier” both feature manikins that talk with realistic articulation, but they’re not animatronic.

“Neuroscience is now teaching us that emotions are the starting point for behavior. When people are in a heightened state, we can push them beyond their expectations. Triggering that emotional reaction leads to deeper learning. One way is through digital immersion.”

—Christine Reich, VP of Exhibition Development, Museum of Science, Boston

Nathalie Bondil, Director General and Chief Curator of the museum, worked with UBU Theatre, an innovative company known for incorporating video masks into their productions. A selection of real people, including Gaultier’s top model and Gaultier himself, were tapped for the show. Bondil explains that first UBU artisans “made precise molds of their heads. Then they created a new head in plaster. After that, they recorded film of the same characters acting.” This part of the process was not easy. “Each person must combine the perfect features for the perfect 3D head. We cannot have someone with big gestures; the person can’t move their head left to right while talking, and must look straight in front of them.”

TOP LEFT to RIGHT: Understanding how actions affect the environment is the goal of the 2,500-square-foot interactive “Connected Worlds” at the New York Hall of Science. Visitors are surrounded by six interconnected ecosystems: Desert, Mountain Valley, Wetlands, Reservoir, Jungle and Grasslands. A 40-foot-high waterfall, rivers, indigenous plants and native creatures are projected onto the walls and floor, and environments thrive or die depending on what’s done to them. (Photos courtesy of Design I/O and copyright David Handschuh)

Then UBU devised a means to realistically project the talking head onto that person’s sculpted head mold. “Magic comes from the combination of the sculpture and the recording of the same person projected on the same head. A completely unique specific system had to be created to avoid bizarre effects. In Korea, they thought we hired people to be on stage.”

To entice young adults to appreciate older art, the museum has projected an enchanted garden in their Romanticism Gallery for “The Salons of the Belle Époque: Romanticism” exhibition in the Michal and Renata Hornstein Pavilion for Peace. Leaves rustle and winds blow, creating an alluring room. (Photo copyright © Marc Cramer)

The Montreal Museum of Fine Arts developed a duo of exhibitions combining old-school craftsmanship with a complex projection system to convey the magic of “The Fashion World of Jean Paul Gaultier: From the Sidewalk to the Catwalk,” whose five-year, 12-venue world tour ended last year, and the currently displayed “Love is Love: Wedding Bliss à la Jean Paul Gaultier.” Both feature manikins that talk with realistic articulation, but are not animatronic. (Photo copyright © MBAM/Denis Farley)

Stéphanie Jasmin, Artistic Co-director of UBU, adds, “There were tricks we developed. When we project an image, it’s a square. The image is flat. A face is not square – it’s 3D. And we needed to make invisible cuts [in the monologues] or there’d be a jump cut in the face!” The systems created were so precise that a team of technicians had to travel with the exhibition to set it up properly for each venue.

With their ability to engage and excite, special effects techniques in museum exhibitions are sure to expand. Wevr’s Rowell observes, “VR is a powerful form of storytelling. I’m a huge fan of film, television, video games, animation. What I get excited about with VR is that it’s a new way to communicate, to show people the story.”

