Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:07):
Hello, I'm Karen Quatromoni,
the Director of Public Relations for Object Management Group, OMG.
Welcome to our OMG Podcast series. At OMG,
we're known for driving industry standards and building tech communities.
Today we're focusing on the Augmented Reality for Enterprise
(00:29):
Alliance, AREA, which is an OMG program.
The AREA accelerates AR adoption by creating a comprehensive
ecosystem for enterprises, providers, and research institutions.
Today,
Boeing's Sam Neblett will host the podcast session.
(00:50):
So I know you have a huge amount of experience in the gaming sphere,
and that's with your podcast, that's with just personal time,
and I think the gaming sphere is leading the way
in informing enterprises on how they can use AR both effectively
and efficiently in a wide variety of settings.
(01:13):
You mentioned firearms training; guns are pretty popular in video games,
but I want to think about real-world examples of
thinking outside the box for industry-specific applications
and potential or interesting use cases that you've noticed.
So not necessarily just picking up a gun with a
(01:36):
controller,
I'm thinking anything new and groundbreaking that you've seen
or you feel could be on the horizon. Something that
might not have to be super complicated, but just an out-of-the-box solution.
So something like voice commands for hands-free machine control in
manufacturing, hand gestures
for sterile surgery in healthcare,
(02:00):
or overcoming issues that you've noticed in UI and UX, like learning
curves, with eye tracking in AR applications that you've seen.
Is there anything very interesting in a specific industry that the
enterprise might be interested in for AR?
Yeah,
I think quickly just to touch on one of the things you just mentioned there,
(02:21):
the eye tracking and hand gestures for UI navigation
I think is crucial.
Before these technologies,
if you were using a virtual reality controller or an AR controller,
a lot of times when you're holding one of these controllers in your hand,
there's a laser that comes off of the edge of it that you can point at objects
(02:43):
in the UI to make menu selections and things like that.
When you drop these controllers, boom, now it's like, oh, where's my laser?
In what way can I make my next menu selection?
And I think if you've used the Apple Vision Pro, that's exactly it.
I think the best current method for solving UI and
UX problems is glancing at something and making a pinch gesture.
(03:04):
You can cruise through menus so fast like that. If you've never used the Apple
Vision Pro, I highly recommend you go to an Apple Store and check one out.
It's a quick demo. And very quickly you'll start to understand,
I think, the direction that a lot of this stuff is going. Now granted,
the Apple Vision Pro is, I guess, technically a consumer device, but
(03:27):
I think that the intended uses of that device really
lead to where we're going in this conversation.
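To make the gaze-and-pinch pattern concrete, here is a minimal sketch in TypeScript against the WebXR Hand Input API. WebXR does not yet standardize eye tracking, so gazeTarget() is a hypothetical stand-in for whatever gaze or head-ray hit test an app provides; the pinch detection itself uses real joint poses.

```typescript
// "Look, then pinch" selection sketch using the WebXR Hand Input API.
// gazeTarget() is hypothetical: WebXR has no standard eye tracking yet,
// so it stands in for your own gaze/head-ray hit test against the UI.
declare function gazeTarget(
  frame: XRFrame,
  space: XRReferenceSpace,
): { select(): void } | null;

const PINCH_THRESHOLD_M = 0.02; // thumb tip within ~2 cm of index tip

function isPinching(hand: XRHand, frame: XRFrame, space: XRReferenceSpace): boolean {
  const thumb = frame.getJointPose(hand.get('thumb-tip')!, space);
  const index = frame.getJointPose(hand.get('index-finger-tip')!, space);
  if (!thumb || !index) return false; // joints can be briefly untracked
  const dx = thumb.transform.position.x - index.transform.position.x;
  const dy = thumb.transform.position.y - index.transform.position.y;
  const dz = thumb.transform.position.z - index.transform.position.z;
  return Math.hypot(dx, dy, dz) < PINCH_THRESHOLD_M;
}

function onXRFrame(time: number, frame: XRFrame, space: XRReferenceSpace): void {
  for (const source of frame.session.inputSources) {
    if (source.hand && isPinching(source.hand, frame, space)) {
      // Fire whatever menu item the user is currently looking at.
      gazeTarget(frame, space)?.select();
    }
  }
  frame.session.requestAnimationFrame((t, f) => onXRFrame(t, f, space));
}
```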
It's funny that you mentioned voice commands.
I haven't actually seen any compelling use cases for voice commands.
Neither have I. Yeah,
which I find interesting because it seems like some low-hanging fruit.
And in the gaming side of things,
(03:48):
I see people doing some really interesting stuff with AI and voice commands by
basically connecting a ChatGPT to the game that you're playing and
attaching individual personas to NPCs in the world. I
have a friend named Genis VR who makes a lot of cool stuff for VR and AR
and uses peripherals, haptics, and all that kind of stuff in her content.
(04:11):
And she will be playing a game like Skyrim VR and will walk up to somebody and
say, Hey, what do you think about dragons?
And the AI will connect to the NPC and create a brand new,
never-before-heard line of dialogue
using that character's voice, which is really, really cool.
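For readers curious how that NPC-persona trick is usually wired up, here is a hedged TypeScript sketch. The endpoint and request shape follow OpenAI's public chat-completions API, but the model name, the Npc type, and the usage example are illustrative assumptions, not any specific mod's code; real setups add speech-to-text in front and voice synthesis behind.

```typescript
// Sketch of giving each NPC its own persona via a chat-completion API,
// in the spirit of the Skyrim VR setup described above.
interface Npc {
  name: string;
  persona: string; // system prompt describing who this character is
}

async function npcReply(npc: Npc, playerLine: string): Promise<string> {
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'gpt-4o-mini', // assumption: any chat model works here
      messages: [
        { role: 'system', content: npc.persona },
        { role: 'user', content: playerLine },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // feed this into text-to-speech next
}

// Usage: a guard with a one-line persona answering the dragon question.
const guard: Npc = {
  name: 'City Guard',
  persona: 'You are a weary fantasy city guard. Stay in character; one or two sentences.',
};
npcReply(guard, 'Hey, what do you think about dragons?').then(console.log);
```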
But in terms of enterprise solutions and stuff,
(04:34):
I haven't seen anybody use anything that I found compelling,
which I find interesting. In some of our trainings,
we do have sections of the aircraft trainings where
the user is prompted to speak out loud, but it's all,
it's implied. You know what I mean?
It's like, now you say this, and you say it out loud, and then you hit okay and
(04:57):
move on to the next thing. But yeah,
I do find that interesting. In terms of some
outside-of-the-box use cases, like I said, I'm a haptics industry professional,
and my brain kind of goes to some of the haptics use cases there.
And it's funny because there's a couple of companies that do this,
but there are some electro-stimulation
(05:20):
haptic suits that exist out there.
One of 'em is made by a company called Teslasuit,
and another one, on the gaming side of things, is made by a company called
OWO.
And these haptics are not fun.
They are not comfortable.
It's a lot of friction to get into these devices because they are using electro
(05:41):
stimuli.
The nodes that attach to your skin can't have clothing in between you and
it. So to wear the Teslasuit,
which I put on at CES 2019, I think, so it's been a while now,
I had to basically get naked at a conference to put this thing on.
I had my underwear on still,
but otherwise my whole body was inside of this suit.
(06:02):
And the OWO technology is very,
very similar, except it's just a shirt and it is designed to be used in
games. However, the sensation of these,
have you used one of these, by the way, Sam?
I've used the OWO suit. Yeah.
The OWO. That thing is not comfortable, is it?
No. And it's not fun to
(06:23):
have all your users get in it, and then you have to worry about them
with cleanliness. We have motion tracking suits too,
so I have to take them home and wash them.
So for all of those reasons, I would say it's not really good.
It's not really worth it. They're expensive,
they're uncomfortable to get into and out of,
and the haptic sensation physically actually hurts. You're being shocked,
(06:44):
which by the way,
I learned electrocuted should only be used when referring to death because it's
like execution: execution, electrocution. So unless you died,
you didn't get electrocuted, you got shocked,
which is something that took me forever to learn.
But this thing shocks you, and some of the sensations are akin
to scratches and stuff like that. And I do see,
(07:07):
however,
a compelling use case for negative reinforcement in training, because,
let's say we are doing the oil rig training,
snipping the wrong wire or pulling the wrong lever could mean literal death for
everybody out there. Some of these things,
there's a reason why we're using immersive technologies to train some of this
(07:27):
stuff because it's very dangerous. It's scary to learn these things in medical
procedures. At Contact CI,
we've done some stuff with Cincinnati Children's Hospital, and we went in there
last week for a demo, and I'm looking at these little,
I dunno, fake cadaver things and stuff like that that they have.
And I'm realizing, wow,
(07:47):
they're actually performing operations on children in here, and
they need to practice these procedures because messing up a little
bit could mean something terrible happening to a child. So it's really,
really important that we learn these things and take these trainings seriously.
And if you've used the OWO vest or the Teslasuit vest,
(08:08):
it only takes one shock for you to realize you don't want that to happen again.
And now, when I did the game for OWO,
my head was on a swivel in a way that it would not have been if I didn't
have that negative reinforcement happening.
So while I don't recommend these devices for something like gaming, I mean,
I guess if you're a streamer and you want some shock value or something,
(08:29):
you could buy one and use it. But at the end of the day,
I don't see a practical use case other than negative reinforcement using haptics
in a training where it could be a life-or-death situation.
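Neither Teslasuit's nor OWO's actual SDK is shown here; as a purely hypothetical TypeScript sketch of that negative-reinforcement logic, the gating below fires a brief, intensity-capped pulse only on safety-critical mistakes.

```typescript
// Hypothetical wrapper around an electro-stimulation suit SDK. The real
// Teslasuit and OWO SDKs differ from this; the point is the training logic:
// only a wrong, safety-critical action triggers a brief, capped "shock" cue.
interface HapticSuit {
  pulse(opts: { zone: string; intensity: number; durationMs: number }): void;
}

const MAX_INTENSITY = 30; // keep well below discomfort; calibrate per user

function onTraineeAction(
  suit: HapticSuit,
  action: { id: string; safetyCritical: boolean; correct: boolean },
): void {
  // No shock for correct actions or harmless mistakes.
  if (action.correct || !action.safetyCritical) return;
  // Negative reinforcement for the mistakes that would be dangerous on a
  // real oil rig: a short pulse at the relevant body zone.
  suit.pulse({ zone: 'forearm-right', intensity: MAX_INTENSITY, durationMs: 150 });
  console.warn(`Unsafe action in training: ${action.id}`);
}
```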
That's fair. Okay. Yeah, that makes sense.
So we've talked about some benefits and challenges. You mentioned
challenges, I mean,
(08:50):
it hurts, or it's difficult to put on one of the EMS- or TENS-based
suits; and reducing cognitive load as a
benefit, and increased safety, because you can practice
surgery before actually going to do the real thing, like you're mentioning.
But what other challenges do you think you see with some of the newer
(09:11):
modalities? User acceptance:
are people familiar with controllers?
Are they going to get frustrated with using hand tracking or eye tracking? Now,
do you see any accuracy limitations with hand tracking or eye
tracking or the
Contact CI glove for haptics?
Is it not quite accurate enough yet, or is that something you're working on? Security concerns?
(09:35):
You mentioned that might be something that the EU has to worry about, collecting
PII for advertisement purposes or whatever.
Aging IT infrastructure: can our IT systems in the enterprise take this stuff?
Is it ready? Are these companies working with, say,
Microsoft Azure and AWS to really get them accepted in a
(09:56):
more formal way? Or is it still kind of the wild west, you think?
Yeah, I think in terms of infrastructure and stuff like that,
I think we're good to go.
I think most people have computers that can handle this type of technology.
I mean,
I guess you might have to acquire some hardware to pull some of this stuff off,
but it all works pretty well with existing technologies.
So I don't think we have too much to worry about there.
(10:21):
You touched on a lot of really good stuff there.
But I think in terms of some of the challenges with accepting some of these
things, I think user acceptance actually is a pretty big one.
People, they resist change. There's a lot of people,
especially on the DOD side of things,
there's some guys who have been in those roles for decades, literal decades,
(10:43):
and the way they've been doing things has worked.
They're not in a big hurry to make a huge change to the infrastructure of their
training that's going to take years to fully implement and all of that stuff,
especially when they're used to seeing the results that they expect.
But other than some of the things that you kind of touched on there,
(11:04):
I think that scalability is a really big one.
I think that
if I'm trying to sell somebody on using AR or VR for training,
I would probably lean into something like scalability.
Traditional trainers for the Air Force, for example,
are insanely expensive to build.
(11:25):
It's basically a fake cockpit using all of the real stuff that a cockpit's made
out of. It costs a lot of money to build.
They can only exist in one place at one time,
and only one person can occupy it at a time. And also, typically,
you'll probably need a second body there as an instructor
to explain what's happening.
But if you had an AR or a VR application that was designed to
(11:48):
train people in a virtual environment,
this is a program that you could slap on a hundred headsets and send to
everybody and say, all right, everybody, here's your homework.
Spend an hour or two in this virtual cockpit.
And then when everybody comes in the next day,
they all have so much more experience than they did before.
And you're able to scale this training and scale your ability to share this
information across your workforce, or across your whatever, in a way that
(12:12):
was previously inaccessible.
So I see a huge opportunity for scalability of training that
would require you being onsite,
having something really expensive or large, or something that would be incredibly
dangerous to participate in.
Okay. Yeah, that makes sense.
So moving on to future outlook,
(12:34):
you mentioned how the enterprise can expect
to use these different interaction modalities in the next five to 10
years.
Are there any particular companies that you expect to have a massive impact on
the AR space?
And then what are some high-level steps that companies can take to
(12:55):
prepare to integrate these new AR solutions? So that could be hand tracking:
start looking at gloves, start integrating them;
maybe, if they're using Unity or Unreal,
import the new toolkit, like, for Unity, for example, the XR Hands package,
and start supporting that in your input code.
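Since the host names Unity's XR Hands package, here is the same input-code idea sketched in WebXR/TypeScript rather than Unity C#; the optional-feature flag and the inputsourceschange event are from the real WebXR spec, while the routing comments are assumptions about your own code paths.

```typescript
// The same idea as Unity's XR Hands package, expressed in WebXR terms:
// request hand tracking as an optional feature so the app still runs on
// controller-only hardware, then branch the input code per source type.
async function startSession(): Promise<XRSession> {
  const session = await navigator.xr!.requestSession('immersive-ar', {
    optionalFeatures: ['hand-tracking'], // optional: don't fail on older HMDs
  });
  session.addEventListener('inputsourceschange', (event) => {
    for (const source of (event as XRInputSourcesChangeEvent).added) {
      if (source.hand) {
        // Hand tracking is live: route to the gesture code path.
        console.log('Hands tracked:', source.handedness);
      } else if (source.gamepad) {
        // Classic controller: route to the laser-pointer code path.
        console.log('Controller connected:', source.handedness);
      }
    }
  });
  return session;
}
```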
(13:16):
What are the companies to watch that you expect to have a massive impact on
the AR space in five to 10 years? And then what can companies
within the enterprise do to prepare for these advancements?
Cool. I'll start with the latter portion of this.
I think if you've made it this far in this conversation and you're considering
(13:37):
maybe getting into some of this stuff, now is a great time to start doing
some of the stuff that you mentioned.
Start taking a look at the different hand tracking solutions that exist.
Maybe get a couple of different HMDs and start playing around. Like I said,
go to the Apple Store, do a demo there.
Experience what it feels like to have eye tracking and hand tracking working in
conjunction with each other.
(13:58):
This is definitely the time, because the technology's moving very rapidly.
We're looking at, I mean, they could put out
an AR headset and a VR headset once a year that would set a new standard for the
technology. So things are moving really, really quick.
And I wouldn't expect to have something that meets all of your
(14:20):
expectations today. And I say expectations in quote marks because
I do feel like the general public has a somewhat inflated idea
of what to expect when they use a lot of these technologies.
We are raised on incredible science fiction films in media
and stuff like that that honestly show us a lot of tech
(14:42):
10, 20, 30, 40 years in advance.
One of my favorite movies ever is Total Recall, starring Arnold Schwarzenegger.
And if you watch that movie,
you will see so many technologies that did not exist when that movie came out
that are now commonplace in the real world.
And most people who would watch that movie today would just,
they wouldn't notice; it would just kind of go in one ear and out the other.
(15:04):
And they're like, okay, cool. Yeah, they're doing a video chat.
But I remember being younger and watching people do video chats in movies, being
blown away that that might actually be something that we can do. However,
a lot of people, when they think about AR, they think about VR.
They have this expectation that when they put an HMD on,
it's literally stepping into a portal and it's going to change everything.
(15:26):
And when they see a little bit of friction or a little bit of lag here,
or maybe the haptics don't line up perfectly,
they kind of just want to throw it out the window. I would say
don't have unrealistic expectations of the technology where it is today.
But I think it's safe to assume that all this technology is going to reach
(15:48):
utterly profound levels within our lifetimes,
the type of profound where you will be changed by 10-,
20-minute experiences putting on the headset.
So I definitely see all of that happening. In terms of companies to watch,
I am not exactly sure,
to be honest. There's definitely a few that have been making waves and have been
(16:11):
pushing the envelope. Snap is one of 'em.
Snapchat has been using AR filters.
They're basically the first company to actually get
wide adoption of AR technology, at least that I have seen.
I mean, so many people use Snapchat filters to send videos and stuff like that
to their friends. They're working on a lot of AR stuff.
(16:31):
So I think that's a company that's really worth looking at.
Niantic does a lot of really cool stuff.
They made Pokémon Go and similar AR games. Like you said,
gaming kind of paved the way for a lot of these things.
I think that's the case here as well. Also, Meta:
there are concerns with Facebook owning a company that does a lot of this stuff,
Facebook being a data collection company. There are some privacy concerns there.
(16:56):
But in terms of the R&D and the money that's being spent on pushing all this
stuff forward,
I don't think there's another company that does what Meta does. They are pushing
things so, so hard.
So I would pay attention to the products that Meta is putting out over the next
five to 10 years.
In terms of where I see a lot of this stuff going,
(17:17):
I think we're going to see this incredible blend of biometric data,
immersive technology,
and procedurally generated content that is going to create these
amazingly immersive experiences for people that are totally individualized and
completely custom to exactly what it is that you need to get out of that
experience. So for example,
(17:39):
let's say you have a training program that is using all these technologies.
It is reading my biometric data in real time, my eyes,
my facial expressions, all of that stuff.
It sees my pupils dilating and focusing and all of that,
and maybe there's some kind of BCI that can tell
my level of engagement.
(18:00):
So the procedurally generated content in real time can read my biometric
data and feed to me the content that's necessary to get the desired outcome out
of my biometrics. So maybe they're looking for a specific level of engagement,
or maybe they're trying to scare me or make me experience fear.
It can keep ramping up the fear levels until it gets to the point where it's
(18:20):
getting the response out of me that it's looking for.
So when you combine all these things,
this biometric data,
having it be an immersive experience where you really feel like you're connected
to what's happening, and it being procedural and custom to the user,
I think it's going to be insanely potent.
I think they'll be able to identify your weaknesses in a training relatively
(18:44):
quickly and start to address those weaknesses in real time.
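As a sketch of the closed loop Alex describes, assuming hypothetical sensor-fusion and content-generation APIs (no real BCI or biometric SDK is referenced), the control logic might look like this in TypeScript:

```typescript
// Closed-loop adaptive training sketch: hypothetical biometric sensors feed
// an engagement estimate, and the content generator ramps scene intensity
// until the response lands in a target band. Every function declared here
// is a stand-in, not a real SDK.
interface Biometrics {
  pupilDilation: number; // 0..1, from eye tracking
  heartRate: number;     // bpm
  engagement: number;    // 0..1, e.g. from a BCI headband
}

declare function readBiometrics(): Biometrics;           // hypothetical sensor fusion
declare function setSceneIntensity(level: number): void; // hypothetical generator hook

const TARGET = { min: 0.6, max: 0.8 }; // desired engagement band
let intensity = 0.3;

function adaptStep(): void {
  const bio = readBiometrics();
  if (bio.engagement < TARGET.min) {
    intensity = Math.min(1, intensity + 0.05); // ramp the scenario up
  } else if (bio.engagement > TARGET.max) {
    intensity = Math.max(0, intensity - 0.05); // back off before overload
  }
  setSceneIntensity(intensity);
}

setInterval(adaptStep, 1000); // re-evaluate once per second
```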
And that's super, super exciting. Of course, like I said earlier,
it comes with safety concerns.
We all basically are going to have to get rid of the idea of privacy and stuff
like that, I think, at some point. But like I said earlier,
I think the train's left the station.
I think trying to prevent these technologies from integrating with people
(19:08):
is trying to fight the tide. You know what I mean? This is nature.
We are naturally connecting with technology more and
more and more all the time.
As a baby, a computer boots up and you crawl towards it.
We are instinctually driven to connect with this stuff, and to me,
it's natural. So I say, just hang on.
(19:29):
And I'm throwing my hands up on the rollercoaster. I'm just like, all right,
let's go. Woo, because it's going to be fun and exciting, and I'm here for it.
Yeah, that's awesome. Yeah. Great. So is there
anything else that you would like to plug before we wrap up? I think we're good on
everything else.
Yeah. Yeah. I mean,
(19:49):
if you find this conversation and my perspective on some of this stuff
interesting, I would highly recommend you come and check out Between Realities,
the podcast that I do with my partner, Skeeva.
We do live episodes every Friday, as long as time permits
and we don't have work obligations and travels and things like that getting in
the way. But we always have a guest on our show.
(20:11):
So every week we have somebody either from the gaming space,
the enterprise space, the training space. We get developers, CEOs,
YouTubers, the whole gamut,
basically anybody who cares about this technology as much as we do,
regardless of what they're doing,
could be somebody that we would have on the show.
And we really do like to kind of peel away layers,
(20:31):
talk about the nitty-gritty of all this stuff,
focus on some of the individuals who are really making a difference in the space,
and giving a voice to some people who are like us,
who are just trying to get involved and come and be a part of it. Of course,
if you're interested in some of the haptic stuff that I mentioned,
definitely check out Contact CI.
We are doing a lot right now, both currently available
and behind the scenes, to
(20:54):
improve some of the haptic fidelity and stuff like that that you were mentioning
earlier. Which, by the way,
I will say fidelity of hand tracking and haptics has a long way to go,
but as I mentioned earlier, it exists in a state that is
enough to kind of bridge the gap from your hands to your brain, to allow you to
have a more immersive experience than you would if the haptics didn't exist.
(21:17):
A demo I like to do often is having a gloved hand and an ungloved hand,
and you reach out and interact with some buttons and switches with a gloved hand
and then reach out and do it with your bare hand. And the difference is night
and day. One of them feels real,
however you want to define real, and the other one doesn't.
So I really do think that there is a huge value for haptics right now,
(21:37):
even if it isn't entirely lifelike, and I do expect it to get to that point.
So yeah,
Between Realities and Contact CI are definitely my two primary things,
if you wanted to follow up with me or reach out. And also,
I'll say that I do a lot of traveling and I go to a lot of conferences and
events. So if you ever want to meet up, I'll be,
I don't know when this comes out,
(21:58):
but I'll be at AWE in June in Long Beach,
and I go to basically all of the AR, VR, and tech-focused conferences.
So feel free to reach out to me anytime.
Send me a DM on Twitter or LinkedIn and we can connect.
What's your Twitter handle?
For people? My Twitter handle is AlexVR, but there's some underscores in it.
It's like Alex_VR, but I think if you type AlexVR,
(22:22):
you'll probably get to me. Okay.
Awesome. Well,
thank you so much for your time and all of your input and expertise, Alex,
and for anyone else who might be interested, just like Alex said,
check out his podcast and Contact CI,
and we'll get your email listed wherever we can host this. But yeah,
(22:44):
thank you again and take care. It's been great having you.
Thanks, Sam. Looking forward to our next chat. Talk to you soon.