Volumetrically Capturing Authentic Digital Actors, with Metastage's Christina Heller

Published: Sept. 6, 2019, 9:51 a.m.

Even with all the advancements in CG animation, it can't capture that distinctly lifelike essence that a real human exudes. But XR can capture that essence – volumetrically. Metastage CEO Christina Heller drops by to discuss the process of transcribing the aura of a person into XR space.

Alan: I have a really special guest today: Christina Heller, the CEO of Metastage. Metastage is an XR studio that puts real performances into AR and VR through volumetric capture. Metastage is the first US partner for the Microsoft Mixed Reality Capture software, and their soundstage is located in Culver City, California. Prior to Metastage, Christina co-founded and led VR Playhouse. So, between Metastage and VR Playhouse, she's helped produce over 80 immersive experiences. To learn more about Christina Heller and Metastage, you can visit metastage.com.

Welcome to the show, Christina.

Christina: Thank you so much for having me.

Alan: It's my absolute pleasure. We met, maybe three years ago? At VRTO?

Christina: Yes, that's correct.

Alan: Yeah, we got to try your incredible experiences, mostly in the field of 360 video. And you've kind of taken the leap to the next level of this stuff. So, talk to us about Metastage.

Christina: Sure. As you said, it's a company that specializes in volumetric capture. I think, in the future, you'll see other things, but at the moment, we specialize in volumetric capture. Specifically, using the Microsoft Mixed Reality Capture system, which is an incredibly sophisticated way of taking real people and authentic performances, and then bringing them into full AR and VR experiences, where you can move around these characters, and it's as if they are doing that action right in front of you.

Alan: Let's just go back a little bit. What is volumetric capture, for those who have no idea what volumetric capture is?

Christina: Sure. For a long time, if you wanted to put real people into AR/VR experiences, you had basically two ways of doing it. You could either animate it; so, you would try to create – using mo-cap and animation – the most lifelike creation of a human character possible. Think, like, video games; when you go play a video game and they've got a character playing a scene out with you. If you wanted to put real people into these XR experiences, that was the most common way to do it.

Then there was also volumetric capture, which, for a long time, just wasn't quite – I would say – at the technological sophistication that people wanted, to integrate it into projects. Volumetric capture – thanks to the Microsoft system, I think – is finally really ready to be used in a major way in all these projects. And basically what it does is, we use 106 video cameras, and we film a performance from every possible angle. So, we're getting a ton of data. We use 53 RGB cameras and 53 infrared cameras.
The infrared is what we use to calculate the depth and the volume of the person that's performing at the center of the stage. The RGB cameras are what's capturing all the texture and visual data.

Then, we put that through the Microsoft software, and on the other end of it you get a fully-3D asset that really maintains the integrity and fidelity of the performance that was captured on the stage. Unlike some of the animated assets – because this was kind of the challenge – the animated assets, they might get kind of there, but they had that uncanny valley thing going.

Alan: Yeah, those are creepy.

Christina: Yeah. And so if you're not familiar with the term "uncanny valley,"