Audiences streaming the current season of “The Mandalorian” have seen the main characters fly through space and probe the murky mines of Mandalore thanks to the groundbreaking virtual-production technology invented for the show.


Virtual production not only generates richly detailed sets that are displayed on gigantic LED screens rather than blank green screens, but it also saves money by making production more efficient and accurate. Since “The Mandalorian” debuted in 2019, the technology has also been used in the movies “Dune” and “The Batman.”

Learning technology from ‘Mandalorian’

Now, students at Arizona State University have access to this cutting-edge technology to tell their own stories. At the ASU California Center in downtown Los Angeles, the Sidney Poitier New American Film School offers virtual-production technology built around extremely high-resolution LED wall and floor screens made by Planar Studios.

The Media and Immersive eXperience Center in downtown Mesa will offer the technology starting in the fall semester, according to Jake Pinholster, founding director of the MIX Center and executive dean in the Herberger Institute for Design and the Arts.

“In a few short years this has become one of the most explosive and transformative trends in the movie production industry because it has huge positive ramifications,” he said.

“It cuts down on post-production time. It makes it easier to pre-visualize and know what a shot will look like before you turn on the camera,” he said.

“Actors can see the environment and respond to it. You can shoot a dawn scene all day long.”

Industrial Light & Magic, the special-effects production company founded by George Lucas, released a video explaining how it created the technology for “The Mandalorian,” which lets the world-building be adjusted in real time and saved. The method streamlines work that was previously spread across the pre-production, production and post-production timelines.

The environments are created digitally and loaded onto the giant screens, where the actors can interact with what the audience will see. Previously, actors would work in front of a blank green screen and the digital effects would be added during post-production.

Because ASU is among the first film schools to offer the technology, Pinholster and Nonny de la Peña, founding director of ASU’s Narrative and Emerging Media program, are helping to set standards for teaching the method. They serve on a working group of the Society of Motion Picture and Television Engineers.

“To a certain degree, there is no industry standard in how to do this because it’s still very much an experimental process,” Pinholster said.

“We are one of the first universities training people in what will become the major production technique.”

Coursework in virtual production will be included in both the Narrative and Emerging Media program in Los Angeles and the undergraduate program at the MIX Center in Mesa.

De la Peña’s grad students in Los Angeles have been working with the Planar screens to do both fiction and nonfiction storytelling.

“We’re using new technologies in all kinds of ways,” she said.

“We have students walking around carrying iPads scanning the building, and how do you tell the story with what you’ve scanned?”

The scans are run through game-engine technology, and once they are uploaded to the giant LED screens, the effect is immersive. Her students are working on stories involving Shakespeare, drug abuse, water issues and baseball.

De la Peña sees virtual production as the future not only for movies but for narrative journalism.

“You can have a reporter on the scene without being at the scene,” she said.

“If we want to make sure we have students prepared for the future of storytelling, we need to teach them that now.”

A crucial part of embracing new technology is determining how to use it ethically. She and Mary Matheson, director and a professor of practice in the film school, teach a class called “Diversity and Ethics in Emerging Media.”

“The students are learning now about how (artificial intelligence) is trained, which is, if nothing else, sexist and racist,” de la Peña said.

One of de la Peña’s students, Cameron Kostopoulos, debuted “Body of Mine VR,” an immersive virtual-reality experience, at the South by Southwest festival March 12–14, where it won a jury prize. The experience places the viewer into another body for an exploration of gender dysphoria and trans identity.

Kostopoulos combined several technologies, including the Planar screens and VIVE hardware, to create “Body of Mine VR,” which pairs body, face and eye tracking with audio interviews.

Kostopoulos, a cisgender gay man, grew up in Texas.

“Being in the closet for basically my entire K–12 experience, looking back, I know how having certain spaces could have helped me,” he said.

“So because of that, I’m passionate about creating those spaces and those experiences for other queer youth who could benefit from them. And for cisgender people to learn about the trans experience and gain empathy.”

“Body of Mine VR” uses full-body motion capture and eye tracking, so at one point, the viewer looks into a mirror and sees themselves blink.

“I put all that together for a more intimate VR experience than what you would normally get with controllers,” said Kostopoulos, who is a writer, director and developer based in Los Angeles.

Combining all the new technology at the ASU California Center was a challenge.

“It’s basically supergluing a lot of cutting-edge stuff into our own makeshift tracking system,” he said.

“Because all of the pieces of tech exist in isolated pockets, there aren’t many experiences that combine everything to do a fully immersive embodiment of a body in VR,” he said.

“There are not a lot of tutorials I could follow and not a lot of people who have worked with it.

“But getting it to finally work was totally worth it and it ended up super cool.”