Shaping the Digital World with Our Hands, with Clay AIR's Varag Gharibjanian

Published: April 28, 2020, 10 a.m.

We're used to navigating our computers with keyboards, mice, and maybe track pads — analog input. But those inputs work for desktop computers; they're clunky for XR interfaces. That's why we need gesture controls ASAP, according to today's guest, Clay AIR's Varag Gharibjanian.

Alan: Hey, everyone, Alan Smithson here. Today we're speaking with Varag Gharibjanian, the chief revenue officer at Clay AIR, a software company shaping the future of how we interact with the digital world using natural gesture recognition. We're going to find out how Clay will bring our real hands into the virtual world. Coming up next, on the XR for Business podcast.

Varag, welcome to the show, my friend.

Varag: Hey, Alan. Glad to be here.

Alan: It's my absolute pleasure to have you on the show. I know you guys are working on some cutting-edge stuff, so why don't I not ruin it and just let you tell us: what is Clay AIR?

Varag: So Clay is a software company; we specialize in hand tracking and gesture recognition, mostly in the AR and VR space. We're also tackling a couple of other industries, like automotive. And our third product category we call Clay Control, which covers all the devices that can use gesture interaction at a distance.

Alan: Are you doing this with single cameras, or with infrared cameras, or a combination of everything?

Varag: Yes, so Clay's-- we're hardware agnostic. So it'll work across all the types you just mentioned. It could be one camera, two cameras, or more.
And all different types: we'll work on RGB cameras that you'll find on everyday smartphones, to what you might find embedded in AR and VR devices, to monochrome ones and time-of-flight ones. We're pure software, and we've worked across a lot of those different camera types and have compatibility with most of them now, which gives us a lot of flexibility and is really useful.

Alan: So I'm going to be able to look at watches on my wrist in AR, right? Like, I'm going to be able to hold my hand up and see what the newest, latest, greatest watch is?

Varag: It's actually pretty cool that you say that, because that is one of the use cases that often comes inbound to us from companies -- it hasn't happened yet -- but those companies are definitely brainstorming around how you track the hands even with just a smartphone, like overlaying something.

Alan: We actually did it. We did a project just using Google's hand tracking library. We managed to make the watch sit on the wrist, but it was kind of glitchy. It would sit weird. And yeah, it was-- it was not great, but we made it work; it just wasn't sellable.

Varag: Yeah.

Alan: So this is really foundational software. And I know you guys are working with some of the larger manufacturers. You want to talk about that -- or can you talk about that -- and what that might look like?

Varag: Yeah, I can speak a little bit about that. So we feel -- like you said -- this is software that really needs to be optimized for the hardware it's running on. The deeper it is in the stack, the better performance you'll get, and the better synergies you'll get with all the other technologies running on these devices. So that's why, when I joined the company, I really made it the focus to get as deep into the stack as possible.
We looked at the market at that time, a couple of years ago, to see who was really central to defining the reference stack. What's going into most AR and VR devices? And to me, Qualcomm made the most sense. So we spent a lot of time working with them. As you know -- and some of our listeners might know -- they really do define a l