Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:07):
Hello, I'm Karen Quatromoni,
the Director of Public Relations for Object Management Group (OMG).
Welcome to our OMG podcast series. At OMG,
we're known for driving industry standards and building tech communities.
Today we're focusing on the Augmented Reality for Enterprise
(00:29):
Alliance (AREA), which is an OMG program.
The AREA accelerates AR adoption by creating a comprehensive
ecosystem for enterprises, providers, and research institutions.
Today,
Boeing's Sam Neblett will host the podcast session.
(00:50):
Hello everyone. My name is Sam Neblett.
I am the co-chair of the AREA Research Committee.
The AREA is the Augmented Reality for Enterprise Alliance,
and I am also a senior software developer for
Air VR Technologies and a technical lead engineer at Boeing.
Today we are going to be talking to Alex Meland
(01:14):
and getting his perspective on one of the trends that the AREA has
identified in its 2024 trends to watch in AR for
enterprise,
and that is new interaction modalities and how those affect
different companies within the AR space for
enterprise.
(01:36):
So we have today Alex Meland from Contact CI.
He has a podcast. Alex,
I'll hand it to you for a further introduction.
Thank you, Sam. Good to see you. Thanks for having me. Hey everybody.
Alex Meland. I also go by Alex VR in the space.
If you find me on Twitter or whatever,it'll be Alex__vr (two underscores).
(01:57):
I've been a virtual reality and augmented
reality enthusiast,
industry professional, and content creator for the past four years or so,
maybe five years, I guess. Now
I have been working in the haptics space for AR and VR for the
(02:18):
past few years. I started with a company called bHaptics.
They make wearable consumer haptic devices for gaming.
So if you're playing a game,
you get to wear this cool vest that makes you look like Batman, and when you get
shot from behind, actuators on your back fire.
So it's a very immersive haptic experience with what they do.
And then from bHaptics,
I went over to Contact CI, where I currently work in
business development as a
(02:41):
strategic partnerships manager. At Contact CI,
we make a multi-force, ergonomic haptic glove that is
designed and created for training simulations
and exercises, things for the enterprise or the Department of Defense,
the Air Force being one of our number one clients.
We ship gloves to the Air Force, and they train pilots in virtual cockpits in a
(03:05):
way that's scalable and easier to afford than some of the traditional methods.
As you mentioned, I also am a podcaster.
I have a virtual reality podcast called Between Realities.
We do live shows on YouTube,
but it exists pretty much on all of the podcast platforms.
And while we are gamers,
(03:27):
we are taking a philosophical approach to the conversations that we have on
Between Realities and really try to peel away the layers and discuss the nature
of this technology, these immersive mediums: virtual
reality, augmented reality,
mixed reality, AI,
how they all kind of intersect. And we like to do a lot of future tripping and
(03:47):
think about where we are going to be in five years, 10 years, 20 years,
because there are a lot of ethical and moral conversations that we
need to start having as we start to indulge in these more immersive and
intense experiences.
And we try to keep that conversation going on Between Realities.
(04:09):
But otherwise, yeah, I'm just very much ingrained in the VR and AR space.
I have an affiliation with UploadVR.
I've been a writer for VR Trend Magazine. Last year I was the host of
the seventh International VR Awards, which was really cool.
We did that in Rotterdam and got to be up on stage presenting awards to all of
the brightest and greatest in the VR and AR space.
(04:32):
So I've been very lucky to find myself in a position to kind of
have access to all of these cutting-edge emerging technologies,
get to maybe influence some decisions here or there, and report on it.
And yeah,
I basically just had my life changed by this technology and have just had my
foot on the gas ever since.
Awesome.
(04:52):
You sound like the perfect person to speak on different modalities that our
AREA members and others within the enterprise that utilize AR might
want to either adopt or further develop.
So, one of our first questions:
what are some of the most promising new trends that you are noticing
(05:12):
regarding input methodologies for augmented reality?
This could be things like voice control, hand gestures, eye tracking,
brain-computer interfaces. You mentioned AI.
So what are some new promising trends you're noticing?
So you definitely touched on the
combination where I see things heading, primarily with
(05:36):
eye tracking, brain-computer interfacing,
and maybe potentially some hand gesture stuff too.
I think eye tracking is probably the number one.
I think that that is the lowest-hanging fruit in terms of ways to
interact with virtual environments.
And as we blend that with stuff like AI,
(05:58):
we're going to see some legitimately insane stuff start to appear.
But eye tracking I think is a big one.
Anybody who has used the Apple Vision Pro, I think, is starting to see a little
bit of the direction that I expect a lot of this to go.
The Apple Vision Pro is something that shipped without controllers, and
controllers have been traditional for augmented and virtual reality. XR
(06:21):
is kind of the term that we use to encapsulate all of these
emerging head-mounted display technologies. And
actually, funny enough,
the VR Awards that I hosted last year are turning into the XR Awards for this
next upcoming one because the two are starting to blend.
You see these new VR headsets that are coming out with pass-through cameras, and
(06:44):
the ability to see the world around you and to layer virtual elements into that
is becoming a big, big thing. You know what I mean? So yeah,
I feel like a lot of this is going to converge over the next five to 10 years,
and we'll start to see devices that are trying to cover all of the bases there.
But if you've used an Apple Vision Pro,
you can start to see where a lot of this stuff is going, because while controllers
(07:07):
have been the traditional way of connecting with virtual environments
using these head-mounted displays,
they're not exactly the most intuitive. Me being a gamer for
my entire life, if you drop two controllers in my hands, I'm at home:
I've got dual thumbsticks, I've got buttons on both sides,
triggers on both sides, I know exactly what to do.
(07:27):
But there are people who are starting to adopt AR/VR technologies who have
never held a controller in their life, and not only that,
they don't see how pushing these buttons in their hands translates
into new skills.
Maybe they're using a headset to train on an oil rig or something like that.
So how does that translate?
(07:47):
It doesn't help build more authentic muscle memory.
So hand tracking combined with eye tracking is where I
see things existing over the next five to 10 years.
I think people are trying to drop these bulky controllers and interact with
their God-given hands.
Our hands are amazing. The sense of touch is incredible,
(08:09):
and it's the only sense that we're born with as babies
that is fully developed.
We are interfacing with everything as babies using touch,
which is, of course, why we at Contact CI do what we do.
The idea is to be able to bring that sense of touch with your hands and interact
naturally and comfortably in virtual environments in
a way that's compelling and
(08:33):
kind of bridges that gap for your brain from the virtual experience to
a real one. And of course, the method that we use to accomplish that is haptics,
right?
Haptic feedback being vibrations or force feedback actuators that
engage at the perfect moment to kind of trick your brain into having this
tangible,
tactile experience when you reach out and grab a virtual tool or something like
(08:57):
that.
That really is the missing element right now in terms of using hand tracking to
interact with these types of things. If you want to reach out and make a menu
selection with hand tracking without any kind of haptic interaction,
you're kind of just poking thin air. It's just all air.
There's nothing happening.
But when you reach out and there's a vibration on the tip of your finger,
(09:18):
when you hit that button, that signals to your brain: hey, confirmation. Yep,
I did the thing. That's what I was looking for.
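To make that concrete: the button-press confirmation he describes reduces to an edge-triggered check, fire one short pulse the frame the fingertip first crosses the button face, not on every frame it stays pressed. Here is a minimal Python sketch of that logic; the geometry, thresholds, and the send_pulse stand-in are all hypothetical illustrations, not Contact CI's implementation.

    from dataclasses import dataclass

    @dataclass
    class Button:
        center: tuple       # (x, y, z) of the button face, in meters
        radius: float       # pressable area around the center
        press_depth: float  # how far past the face still counts as a press
        pressed: bool = False

    def send_pulse(amplitude: float, duration_ms: int) -> None:
        # Stand-in for a real haptic driver call (a glove or controller SDK).
        print(f"pulse amplitude={amplitude} duration={duration_ms}ms")

    def update(button: Button, fingertip: tuple) -> None:
        # Edge-triggered press detection: pulse once on entry, not every frame.
        dx, dy, dz = (f - c for f, c in zip(fingertip, button.center))
        within_face = (dx * dx + dy * dy) ** 0.5 <= button.radius
        is_pressed = within_face and -button.press_depth <= dz <= 0.0
        if is_pressed and not button.pressed:
            send_pulse(amplitude=0.8, duration_ms=20)  # the "yep, I did the thing" cue
        button.pressed = is_pressed

    btn = Button(center=(0.0, 1.2, -0.4), radius=0.03, press_depth=0.01)
    update(btn, (0.0, 1.2, -0.405))  # fingertip just past the face -> one pulse
    update(btn, (0.0, 1.2, -0.405))  # still held -> no second pulse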
And it doesn't necessarily have to mimic a real one-to-one touch,
while of course we would love to see that. Right now,
a lot of the ways that things are being done are similar to the ways that haptics
are conveyed in your phone or in a controller of some kind.
(09:40):
If you're playing Call of Duty on your Xbox and you pull the trigger, it's
shaking in your hand. It doesn't exactly feel like a gun firing,
but it's enough to connect your brain to the experience to make it a much more
visceral and immersive thing. So hand tracking is huge,
eye tracking is huge. One aspect that eye tracking
(10:03):
can improve greatly is performance.
There's something that a lot of developers are starting to use called foveated
rendering,
which renders the image at full detail only where your gaze is
focused and kind of blurs everything else out,
and it can really increase the performance of a lot of the applications that are
running this stuff.
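As a rough illustration of what foveated rendering is doing, the renderer picks a cheaper shading rate for each screen tile the further that tile sits from the gaze point. The sketch below is a toy Python version of that mapping; the angle thresholds and rate tiers are made-up values, and real engines do this on the GPU through features like variable-rate shading.

    import math

    def shading_rate(tile_center, gaze, fovea_deg=5.0, mid_deg=15.0):
        # Samples-per-pixel fraction for a tile: full detail in the fovea,
        # coarser further out, where the eye can't resolve detail anyway.
        angle = math.degrees(math.atan2(math.dist(tile_center, gaze), 1.0))
        if angle <= fovea_deg:
            return 1.0      # full resolution right where the gaze is focused
        elif angle <= mid_deg:
            return 0.25     # one shade per 2x2 pixels in the near periphery
        return 0.0625       # one shade per 4x4 pixels in the far periphery

    gaze = (0.0, 0.0)  # normalized screen coordinates, viewer one unit away
    for tile in [(0.0, 0.02), (0.2, 0.1), (0.8, 0.6)]:
        print(tile, shading_rate(tile, gaze))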
(10:25):
However, with eye tracking, as compelling as it is, because it is compelling:
If you've used the Apple Vision Pro, youglance at something, pinch your finger,
and boom, you are cruising through menus so much faster than with point and click.
Oh, there he is, baby. He's got it. Sam's got the Apple Vision Pro. I love it.
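The gaze-plus-pinch loop described here decomposes into two small pieces: find the item the eye ray is pointing at, and commit the selection when thumb and index tips come together. A minimal Python sketch, with entirely hypothetical data and thresholds rather than anything from Apple's actual API:

    import math

    PINCH_THRESHOLD = 0.02  # thumb-to-index distance (meters) that counts as a pinch

    def gazed_item(gaze_origin, gaze_dir, items):
        # Return the item whose center lies closest to the (unit) gaze ray.
        def ray_distance(center):
            v = [c - o for c, o in zip(center, gaze_origin)]
            t = sum(vi * di for vi, di in zip(v, gaze_dir))  # projection onto the ray
            closest = [o + t * d for o, d in zip(gaze_origin, gaze_dir)]
            return math.dist(closest, center)
        return min(items, key=lambda it: ray_distance(it["center"]))

    def is_pinching(thumb_tip, index_tip):
        return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD

    menu = [{"name": "Open", "center": (-0.2, 0.0, -1.0)},
            {"name": "Save", "center": (0.0, 0.0, -1.0)},
            {"name": "Quit", "center": (0.2, 0.0, -1.0)}]
    gaze_origin, gaze_dir = (0.0, 0.0, 0.0), (0.0, 0.0, -1.0)  # looking straight ahead
    if is_pinching(thumb_tip=(0.10, -0.3, -0.3), index_tip=(0.11, -0.3, -0.3)):
        print("selected:", gazed_item(gaze_origin, gaze_dir, menu)["name"])  # Save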
But one thing that I think is worthmentioning is that with eye tracking
(10:48):
comes a risk of very potent
and personal data being up for grabs. Your eyes
convey an incredible amount of information, more than you realize at a conscious
level.
And you could just be cruising a store, like an app store in a headset,
(11:08):
and if it has eye tracking and they're collecting that data,
they could potentially figure out things about you that you don't even know,
based on the way that your eye flickers when it sees something. Like, ah, boom,
he likes that. We can use that data later.
So there are some security issues and things like that that need to be considered
when you're implementing eye tracking technologies, but in my opinion,
(11:30):
not only is it worth it, it's also inevitable.
So I think it's important that we take it seriously,
but at the same time understand that there are solutions to this, and the juice
is worth the squeeze.
Okay, awesome. Yeah, that's great. So eye tracking,
hand tracking, and haptics, like you mentioned.
So these new modalities you mentioned, comparing them to
(11:52):
controllers, traditional controllers, like, I don't have any in front of me,
game controllers essentially, or an Xbox controller.
Personally,
I have had mixed experiences with hand tracking
on different devices. I believe it's getting better, but do you
(12:12):
feel that it's better than traditional controllers for some applications
and worse for others?
It doesn't have to just be controllers;
it could be something like a touchscreen if you are using an iPad or an iPhone
for AR.
Do you feel that, across the board, hand tracking
and eye tracking are the way to go for everything, and we're talking enterprise use
(12:34):
cases,
or are there still some use cases for AR where you feel traditional controllers
might be the best bet for right now, possibly in five years?
Or is everything going towards hand tracking?
I do not think everything is going towards hand tracking.
I do think that there are a lot of use cases where that makes sense.
There are some use cases where hand tracking doesn't necessarily need haptics.
(12:58):
There are use cases where haptics are incredibly important to the hand tracking
experience,
and there are a lot of use cases that don't necessarily need hands.
The easiest one to identify would be something like firearms training.
This controller that I'm holding in my hand right now is actually a beautiful
peripheral. It really is. I love VR controllers.
(13:20):
If you've never held one,
it really is kind of like holding a traditional gaming controller, like an Xbox
controller, and breaking it in half, because you have triggers on both
sides. You've got buttons on both sides. It feels really, really good.
For something like firearms training,
this is fantastic. These controllers do track better than
(13:42):
hands in today's world in 2024.
A lot of the hand tracking solutions right now that are being implemented are
optical-based. They're camera-based, IR-sensor-based, and they work really,
really well for certain applications and not so well for others.
We use hand tracking to track our gloves at Contact CI,
(14:03):
and we've built the gloves to be ergonomic enough that they are registered
by these IR sensors as hands;
they're very form-fitted, and they don't have bulky
exoskeleton stuff hanging off of 'em and all of that.
A lot of the interactions depend on the hand tracking module that
we use. Currently, frankly,
(14:26):
our gloves are developed to work with all of the optical hand tracking solutions,
but my favorite one to use is the Leap controller,
the Ultraleap Controller 2.
And the current way that we implement it is by attaching the hand
tracking camera to the front of a headset,
and then when you stick your hands out, boom,
there they are. They're tracking your hands.
(14:48):
But there are some limitations to that. If I have my hands
stretched out in front of me and then I decide I want to fix my gaze on
something over my left shoulder,
the IR camera is now pointed away from my hands, and my hands are no longer
being tracked. So there are certain workarounds,
but ideally you would want that single camera that's mounted to your headset to
(15:10):
work properly. So for something that requires my head to be on a swivel,
where I'm looking around all over the place,
maybe that hand tracking solution isn't ideal.
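That field-of-view limitation is easy to picture as a simple cone test: the headset-mounted camera can only see hands that fall inside its viewing cone, so once the head turns away from the hands, tracking drops. A minimal Python sketch with made-up numbers (the FOV value is illustrative, not any specific camera's spec):

    import math

    CAMERA_FOV_DEG = 140.0  # wide-angle tracking camera; half-angle of 70 degrees

    def hand_in_view(head_forward, hand_offset, fov_deg=CAMERA_FOV_DEG):
        # True if the hand (position relative to the headset) lies within the
        # camera's cone, which points along the wearer's head direction.
        dot = sum(f * h for f, h in zip(head_forward, hand_offset))
        norm = math.hypot(*head_forward) * math.hypot(*hand_offset)
        angle = math.degrees(math.acos(dot / norm))
        return angle <= fov_deg / 2.0

    hands_ahead = (0.0, -0.3, -0.5)  # hands stretched out in front, slightly low
    print(hand_in_view((0.0, 0.0, -1.0), hands_ahead))  # facing forward -> True
    print(hand_in_view((-1.0, 0.0, 0.0), hands_ahead))  # gaze over the left shoulder -> False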
But if I am doing some kind of pilot training, like I said,
for the Air Force, for example, you're sitting in front of the cockpit,
all of your buttons and switches are right there in front of you.
You can easily reach out and grab everything.
(15:33):
What you're saying, are those physical buttons and switches, or are they all
virtual?
They're not physical. They're all virtual,
but they are placed in a way that's ergonomic for the user.
If you are having to reach great distances across something,
the cameras can't really track your hands as well.
So something that's tight and close, right here in front of me,
(15:54):
not requiring the user to reach too far or to look in too many different
directions.
The hand tracking solutions right now work extremely well for that kind of use
case. But I wouldn't use hand tracking for something like firearms training,
like I said, because this peripheral occupies my hand comfortably.
There's a trigger on the back of it,
just like there would be on a firearm, and it tracks really, really well,
(16:17):
regardless of which direction I'm looking.
So for some use cases it's awesome to use peripherals,
something like maybe, I don't know,
a fireman's fire hose or something like that. If they had one of those with a
tracker attached to it, where you can actually pull the lever,
I think that would work really, really well. Thing is,
(16:38):
it's kind of like this cost versus time versus
benefit kind of thing. Having something like a
haptic glove, for example,
would allow you to do a lot of different stuff that you wouldn't
necessarily need a physical peripheral to achieve.
So I think if it's something that translates easily into a controller, like a
(16:59):
firearm, by all means use the controller. For some use cases,
having an additional peripheral would really make sense.
If all you're doing is fire hose training,
then maybe get the peripheral and add a tracker to it.
But if you want to have a variety of things, then hand tracking,
eye tracking, haptics, that's a way to accomplish a lot with a single stone.
Okay.
(17:19):
That makes sense.