How Can VR Tell 360 Degrees of the Human Experience?

A prolific filmmaker shares his thoughts with XR Today on the rise of Immersive Screen cinematography


Published: October 4, 2021

Demond Cureton

Advancements in cinematography have given birth to a new medium of filmmaking – 360 immersive video – which allows directors and artists to tell stories from a completely new perspective.

XR Today spoke with the Emmy-nominated Mr Gary Yost, Veteran Filmmaker, Software Designer, and Co-Director of the WisdomVR Project, on his work with stereoscopic 360 video.

He is known for leading the Yost Group, which created Autodesk 3D Studio (3DS Max) in the 1990s. He is also an expert in stereoscopic and immersive filmmaking, and has used his skills to curate stories for his WisdomVR Project, bringing 360 cinematography to VR headsets globally.

XR Today: How do you feel about receiving an Emmy nomination for your critically-acclaimed film ‘Inside COVID-19’, and what does it represent to you, both as a veteran filmmaker and from the perspective of the growing 360-degree VR filmmaking industry?

Gary Yost: That’s a good question. It’s an honour to be nominated and validated for the work we’re doing, especially in the same category as Felix and Paul Studios, who are great innovators in the field.

Their work on the International Space Station (ISS) is groundbreaking and technically amazing, so to be honoured alongside them was even better than the nomination itself, and I don’t really expect to win.

The documentary filmmaking space for 360 VR has been fraught with challenges. For a couple of years, Google supported the Jump assembler for stitching, and there were the Jaunt and Yi Halo cameras that documentary filmmakers could borrow from Google.

They didn’t have to worry about stitching or technical issues, but could just send all the footage back to Google Cloud to receive this beautifully stitched stereoscopic 360 documentary.

But in May 2019, Google shut down that programme, which made it impossible for documentary filmmakers to really go further unless they wanted to become really hardcore stereoscopic 360 technologists to continue operating at a very high level of quality within the medium.

A lot of people just couldn’t do that. Fortunately, my background as a technologist inspired and allowed me to go deeper to see what I could do with the tools, and it’s a very small world right now.

There’s only really a handful of people doing very high-level 360 VR stereoscopic documentary work, and for that reason we’re trying to push it forward.

I guess the best compliment I’ve gotten about the film is from John Carmack, Chief Technology Officer of Oculus, who is really responsible for most of this technology. He said that ‘Inside COVID-19 was a really wonderful documentary that just happens to be in stereoscopic VR 360’.

That medium is maturing, which says it all. We’re helping to push it as well by just making a really great documentary that happens to be in stereoscopic 360.

XR Today: How did 3D VR filmmaking impact audiences compared with 2D films, and what message does the film communicate to people watching it?

Gary Yost: Our philosophy of documentary filmmaking is that the best way into a story is through personal experience. So, the pandemic came along in spring 2020, and we had just finished a series of nine WisdomVR experiences for the Oculus, which were just published on Oculus TV.

We were planning on getting to our next series, but it was obviously impossible to shoot with the elderly anymore due to the lockdown. I was later introduced to [Dr Josiah Child], who had been very sick with COVID-19 and nearly died, which happened while he was preparing and managing five emergency departments for the pandemic itself.

He seemed to embody this dualistic quality of the entire pandemic in one person, and we felt it was an opportunity to speak to the wisdom of being both a physician and a patient within the pandemic, all in one person. So that personal story embodied, in some ways, the entire nature of the pandemic.

Our vision and theme were really to tell this huge story about a global pandemic through the eyes of this one person, and fortunately, he had a philosophy degree prior to becoming a physician.

He was also very open about sharing his life and became a really powerful window into many of the feelings we all had, such as feeling unsafe as well as curious and wanting to know more.

One of the problems now is that we’ve forgotten the lessons of past pandemics, possibly because they weren’t documented as well as they needed to be. The 1918 flu pandemic didn’t teach us a lot of lessons we’ve used constructively in the current period, so we felt VR would be the perfect medium to give people in the future an opportunity to really feel the intensity of what we’ve been going through in a way you couldn’t get through a normal 2D film.

These days, 2D films feel very limiting to me because they’re highly manipulated by filmmakers, and with all these cuts, I never get the feeling that what I’m seeing is truly what happened.

The beauty of VR 360 filmmaking is that these long shots inside spaces with people make you feel the truth and authenticity of the experience in a way you don’t really feel in a highly-edited 2D film.

We wanted to give authenticity to our viewers. There are so many perceptions of the pandemic, and people have so many opinions about what’s true and what’s not, that we felt the veracity of 360 filmmaking, which makes you feel as if you’re really there, would cut through the tendency people have of looking at things too subjectively.

We believe that, with all the weird misinformation on COVID-19, giving people objective experiences with someone that was very personal would be the best way to capture the pandemic.

XR Today: You’ve chosen VR as the medium for recording humanity’s philosophical history. What would you say is the process you use to create these films and edit the footage? And are there any specific technologies that you use for the project?

Gary Yost: Firstly, on our approach, we typically make our WisdomVR pieces around 10 minutes long, and work with a subject that has something very valuable to impart to others, specifically those with headsets, who, in our case, tend to be younger people.

We find the one theme they want to impart. For example, one of our subjects, Ruben Margolin, is a kinetic sculptor, and with him, the theme is that everything in life can be viewed as a wave.

We work with the subject to develop concepts around the theme, supported by the location we’re filming in, and use that to drive the theme deeper into the subject and the viewer’s psyche.

So instead of being distracted by everything around you, when you’re wearing the headset, you can actually listen to the subject in one place, and when you turn around, you get more of the story while looking at the subject space.

We’re trying to create a story all around you, and that’s really different from watching a 2D film, because traditionally, when people watch TV or films, they’re very distracted. They’ve got phones and there’s a lot of stuff going on, but in the headset, you’re completely isolated from all of that. It’s really the only medium where we seriously get to monopolise someone’s audio and visual senses, and we find it a sacred space for people.

So we work very hard with the subject on every single shot. What we call a chapter or a location speaks to the theme, and you don’t really have to think much as a viewer. You just let it wash over you, and it really becomes a part of your experience. So it’s not like watching a piece, you experience a piece. That’s our general philosophy.

When we got to Inside COVID-19, it wasn’t a 10-minute piece but a 35-minute one, and we had to develop much more of a story around it. We still used those same techniques to create chapters where the viewer would feel immersed in each aspect of the story, which really made them feel as if they were there with the subject, Dr Josiah Child.

Regarding the technology, I have the camera we use right here – one of only 20 cameras of its kind in the world – a Z Cam V1 Pro. It has 10 cameras in a very tightly packed radial array, and because each camera has a 190-degree field of view, we can get tremendous amounts of overlap between each camera, allowing us to create a high-quality stereoscopic image even if objects or people are fairly close to the camera, within 18 inches. No other 3D 360 camera can do that, due to its tightly-packed array and stitching process, which is completely unautomated.
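For readers curious about why that overlap matters, here is a minimal back-of-the-envelope sketch in Python. The lens count, field of view, and 18-inch figure come from the description above; everything else is illustrative and not part of any production pipeline.

```python
# Back-of-the-envelope sketch: angular overlap in a 10-lens radial rig
# with 190-degree fisheye lenses, as described above. Illustrative only.

NUM_CAMERAS = 10        # lenses arranged in a horizontal ring
LENS_FOV_DEG = 190      # horizontal field of view of each lens

angular_spacing = 360 / NUM_CAMERAS                 # 36 degrees between optical axes
overlap_adjacent = LENS_FOV_DEG - angular_spacing   # coverage shared by neighbouring lenses

print(f"Spacing between adjacent lenses: {angular_spacing:.0f} degrees")
print(f"Overlap between adjacent lenses: {overlap_adjacent:.0f} degrees")
# Roughly 154 degrees of shared coverage gives the stitcher abundant parallax
# information, which is what lets subjects come within about 18 inches of the
# rig before the stereoscopic stitch starts to break down.
```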

Unlike on the Google Jump platform, the stitching process is fairly complex. We use a piece of software developed in Spain, called Mistika VR, and it provides a lot of flexibility in how we take the output of those 10 cameras and create a naturalistic stereoscopic field.

We’re very proud of our work, as we’re producing the most naturalistic stereo of any documentary VR 360 filmmaker out there. Part of that is due to the Z Cam, and part is all the control Mistika gives us in creating that imagery. There is an artefact of that tightly-packed camera array, though: even though you can get things very close, because the cameras are so close together there’s a hypo-stereo effect where things magnify as they get closer.

So we have to be very careful if we want a shot that feels natural and have to keep the subject about 42 inches away from the camera, but a corollary is that we can produce extreme closeups with it, which is impossible to do with any other VR 360 camera.
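To give a rough sense of the hypo-stereo effect he describes, a common rule of thumb (used here as a simplifying assumption, not Yost’s exact workflow) is that perceived scale in stereo roughly tracks the ratio of normal human interpupillary distance to the rig’s effective interaxial separation. The interaxial value below is a hypothetical placeholder, not a Z Cam V1 Pro specification.

```python
# Illustrative sketch of hypo-stereo "magnification": a small effective
# interaxial distance makes nearby objects read as larger than life.

HUMAN_IPD_MM = 63.0        # average human interpupillary distance
RIG_INTERAXIAL_MM = 30.0   # hypothetical effective interaxial of a tight lens ring

perceived_scale = HUMAN_IPD_MM / RIG_INTERAXIAL_MM  # crude rule-of-thumb ratio
print(f"Nearby objects read roughly {perceived_scale:.1f}x larger than life")
```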

There’s a great quote by the World War II photojournalist Robert Capa, who said, “If your images are not powerful enough, you’re not close enough.” As I’m also a photographer, I firmly believe that getting close and intimate with the subject is the best way to bring the viewer into the story.

So that’s our primary tool for camera capture, but we also use traditional tools like Adobe Premiere, Final Cut, After Effects, and other things people use for post-production.

XR Today: People are just learning how to use these technologies, namely with holographic content. There are endless possibilities for how to apply these cameras, whether in filmmaking, digital twins, or other areas where you have LIDAR scanners. I would like to see how those kinds of technologies are incorporated in the future.

Gary Yost: That’s a really interesting subject. Photogrammetry, 3D CAD, and volumetric capture… [it’s a] tremendously interesting field and it’s going to be a long time before we can do volumetric capture of an entire space [with a] subject moving around the room, compared to capturing a subject on a soundstage.

As documentary filmmakers, we need to capture people in their space, and although we’ve experimented a bit with volumetric capture, it’s all composited with other elements, and that veracity and authenticity is missing, because people are not being captured in their actual environment. I’m following that closely, and I’m involved in a group developing a sixth camera. We’ll see how that goes over the next few years.

XR Today: Regarding your work with the Yost Group, which led to the creation of Autodesk 3D Studio and later versions such as 3DS Max: these are some of the world’s largest brands for creating films and industrial content. Thinking back on your career, what are your thoughts on how software is instrumental in creating VR films, and how will it reshape the global film industry in the future?

Gary Yost: I did all that work in the 90s and, as I mentioned before, became a documentary filmmaker afterwards. I’ve been looking to use those tools I invented in my work, and it was only in this project, Inside COVID-19, that I had no choice, because I needed to tell the story of the pandemic.

A big part of that story is that of the virus itself, and a virus is, of course, invisible, right? It’s too small to see. One of the great things about 3D tools is that they can build things that are too small or too large to visualise. I saw that the US Centers for Disease Control (CDC) had created a 3D version of the virion in March, used by the New York Times… this classic 3D model with the spike proteins in red.

I read a little article about how they did that in 3DS Max, around the time I had started production on Inside COVID-19, which just lit a light bulb in my head, and I thought, “Well, if the CDC could do that, I could certainly bring the virus in as a character in this film, too.” So I contacted an old friend of mine, Andy Murdock, who had done a lot of cell biology animation for National Geographic over the years.

I said, “Hey, let’s tell the story of the virus and how it works. Let’s show how it goes through membrane fusion and viral replication in one of the epithelial cells, and how it gets multiplied inside the cell.”

We later embarked on two and a half months of research, and I was able to convince another old friend of mine, Phil Miller from the Chaos Group, who make V-Ray and the Chaos Cloud renderer, and they donated $30,000 worth of cloud rendering time, because we were making these incredibly complex renderings at 7K x 7K, which had never been done before.

Back in the day, we were just rendering HD, or 2K per frame. Even 4K is an 8-megapixel frame, but 7K x 7K is a 49-megapixel frame, so it was really daunting to even create these renderings, and we were doing seven to eight minutes of animation.
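As a quick sanity check on those frame sizes, here is a simple arithmetic sketch. The HD, 2K, and 4K dimensions are standard values; the 7K frame is taken as 7000 x 7000 pixels here purely to match the 49-megapixel figure quoted above, and the actual render dimensions may differ slightly.

```python
# Quick arithmetic behind the frame sizes mentioned above.

frames = {
    "HD (1920 x 1080)": (1920, 1080),
    "2K DCI (2048 x 1080)": (2048, 1080),
    "4K UHD (3840 x 2160)": (3840, 2160),
    "7K x 7K (7000 x 7000, assumed)": (7000, 7000),
}

for name, (width, height) in frames.items():
    megapixels = width * height / 1_000_000
    print(f"{name}: {megapixels:.1f} MP per frame")
# 4K comes out around 8 MP per frame, while a 7K x 7K frame is about 49 MP --
# roughly six times the pixel count of 4K for every frame across seven to
# eight minutes of animation.
```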

It was through a community I had been a part of for almost 35 years [where] we were able to bring this virus to life as a character in the film [and] give people a visceral feeling of what it’s like inside of ourselves, to see this thing come in, invade, and replicate while using our own body to amplify its deadly characteristics. It was full circle for me, where I had done all the work in 3D, but it became an essential part of my workflow process for this documentary.

So you can imagine what a great feeling that was for me to meet up with a lot of old friends to show the world something it has never seen before, and yes, I’ve become really fixated on a lot of cell biology issues.

We’ve also got plans to go further with a Unity-based interactive exploration of how our immune system works, and I’ve put together a team with some world-class cellular biology animators to provide an in-depth look at how vaccines, mutations, and our immune systems work, which could keep us busy for a while.

So it’s like the wisdom of our bodies, and since the WisdomVR Project is my invention, I can move it in any direction I want. At this point, cellular biology and the wisdom of our bodies is something that’s very important to me, and, I think, to the world.

XR Today: From what I remember of the different animations you used for the cells, it was so captivating and reminded me a little of Disney’s Fantasia – the depth of the virus moving around the human body and travelling as if it had its own personality. It was almost like humanising a character that we normally wouldn’t look at as having one, or how it becomes a part of the human body.

Gary Yost: It’s exactly what we tried to do, to make it relatable in a way, because most people just think of the virus as a completely abstract thing. So, we worked very hard to take it out of the abstract realm, give it personality and a sense it’s truly a real thing, because it is real, even though I can’t see it with my own eyes. It’s there, and these 3D tools are just so perfect to bring it to life.

All that personality really comes from Andy Murdock’s skill as an animator, and there’s this quality that viral particles have called pleomorphism, which is a non-rigid structural quality. We worked very hard studying the work of David Goodsell at the Scripps Institute for Computational Biology to understand the pleomorphic properties of the virus and how membrane fusion happens.

So these particles are very animated in the way they move around the world, and we were very excited to be able to capture some of that.
