Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Blaise Delfino (00:19):
Thank you to our partners.
Sycle, built for the entire hearing care practice.
Redux, the best dryer, hands down.
CaptionCall by Sorenson, Life is Calling.
CareCredit, here today to help more people hear tomorrow.
Fader Plugs, the world's first custom adjustable earplug.
(00:42):
Welcome back to another episode of the Hearing Matters Podcast. I'm founder and host, Blaise Delfino, and, as a friendly reminder, this podcast is separate from my work at Starkey.
Welcome to the Hearing Matters Podcast. I'm your host, Blaise Delfino, and today we're doing something
(01:03):
a little different, but for a very good reason. I'm excited to share a recent episode from my friend and colleague, Dr. Dave Fabry, host of the Starkey SoundBites podcast. Now, if you're not already subscribed to his show, I highly recommend it. Dave brings incredible insight, expertise and real-world
(01:25):
context to the latest innovations in hearing technology. So why are we sharing this particular episode on Hearing Matters? Well, it's simple: it highlights where our industry is headed and why that matters for both providers and patients. In this episode, Dave and Holly Schissel dive into Starkey's
(01:46):
latest accessory releases, including the Table Microphone and the Remote Microphone Plus. Now, these tools are absolute game changers for patients who need to hear clearly in dynamic environments. That could be a noisy environment, a work meeting or even a family gathering.
(02:06):
The ability to wirelessly stream conversation directly to a patient's hearing aids is not just about convenience. It's about connection, confidence and, really, independence. What also personally has me fired up is the conversation around Auracast. This is more than just a buzzword.
(02:27):
It's really the future of wireless audio. Auracast will enable hearing aid users to tap into public broadcasts, from airports to theaters to conference rooms, straight through their devices. Think of it as Bluetooth on steroids, built for accessibility. It's inclusive technology, and it has the potential to
(02:51):
transform how our patients experience the world. So sit back, tune in, and let Dave and Holly walk you through some of the most exciting advancements happening right now in hearing healthcare. Here's the episode. Enjoy.
Dave Fabry (03:12):
Welcome to Starkey SoundBites.
I'm Dave Fabry, Starkey's Chief Hearing Health Officer and host of this program. Today we're going to talk about some exciting new advances in our hearing health technology, and there's no one better to talk about that than Holly Schissel, our VP of Product Marketing at Starkey.
So we've been long overdue to do this.
Holly Schissel (03:34):
Absolutely, happy to be here, Dave.
Dave Fabry:
Well, and it's great to have you here, and in perfect timing. We're just in the midst of a new launch, so let's dive right in, because I know people really like when we get right into the technical details of what's coming with this latest product launch.
Holly Schissel (03:54):
Yeah, so the latest product launch, Dave, is based on Edge AI, the launch that sort of took the industry by surprise last fall; typically there's a longer release cycle. When we introduced Edge AI with our new DNN technology, it really changed the game, gave those patients the extra edge.
(04:14):
We continue to innovate and run, right? Never a dull moment around Starkey.
Dave Fabry (04:21):
Yeah, and with Edge AI, let's focus a little bit on where we've been. A lot of times we start getting bored with our features and our story before the market even realizes it, and I think the market is still catching up. As you said, we kind of took people by surprise because, number one, going to Genesis, with the rechargeable
(04:43):
battery products, both custom and standard, we took range anxiety off the table for hearing aid users, giving them the confidence that, no matter what type of environments they were encountering, whether we're using our DNN technology or not, they had all-day battery life.
But then with that movement to Edge AI, with always-on DNN, we
(05:04):
had no compromise on that battery life. No compromise. In addition, we know that patients are concerned about speech understanding and sound quality, particularly in noisy environments, and we did some important benchtop testing with Edge AI leading into this latest launch. Can you talk a little bit about some of the performance levels
(05:26):
that we've been able to see in laboratory testing for speech understanding in noise?
Holly Schissel (05:34):
Yeah, sure. So with this product, integrating the DNN on that direct audio path, we've been able to see a 30% improvement in speech identification. Once we identify the speech, then we can really do something with it. So in our bench testing we've actually seen up to 13 dB improvement in SNR.
Dave Fabry (05:53):
That's amazing.
Holly Schissel (05:53):
Yeah, really, really big difference. Translate that into real-world experience: certainly there are varied conditions, but we're giving our patients that extra edge to perform better.
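For readers who want the arithmetic behind that 13 dB figure: a minimal sketch using the standard definition of SNR in decibels. The numbers below are generic illustrations of the unit, not Starkey's test conditions.

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels from linear powers."""
    return 10 * math.log10(signal_power / noise_power)

# 13 dB corresponds to a ~20x change in the signal-to-noise power ratio:
print(10 ** (13 / 10))        # ~19.95

# Example: speech equal to the noise floor (0 dB SNR) lifted to +13 dB.
print(snr_db(1.0, 1.0))       # 0.0 dB before
print(snr_db(19.95, 1.0))     # ~13.0 dB after
```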
Dave Fabry (06:03):
Yeah, and this has really been a sequential path, and it dated back really to the beginnings with Livio AI, but then certainly with Livio Edge AI, which we launched in 2020, where we first used that Edge Mode, a situational program that incorporated DNN. That enabled, whether it was a quiet talker,
(06:26):
distant talker, or communication in noisy backgrounds, optimizing to the environment that a patient is using their devices in, in ways that no one else on the market does.
I mean, a lot of people will tout how many millions of comparisons or listening environments they've monitored and adapted to, but the difference with Edge, and now Edge AI, is that that customization, that personalization,
(06:49):
is taking place by combining the most sophisticated DNN processing with listener intent. You can train on millions of environments, but the one I want to communicate in is unique to me and you right now, and that's where I want it to optimize. And so, with up to 13 dB of measured
(07:09):
improvements, that's a remarkable bar, and a high bar to set for the industry. So what else are we doing with this latest product that we can talk about?
Holly Schissel (07:21):
Sure. We've made some performance enhancements from a wind noise algorithm perspective: an 85% improvement in the speed at which we adapt to better conditions for the listener. So again, optimizing comfort, one of our favorite topics. For accessories, we're also adding to our portfolio the new table microphone as well as a remote microphone, so really
(07:43):
covering all the angles for our patients.
Dave Fabry (07:46):
Well, and that's a
very welcome addition. I mean, we know that with Edge AI we had the new radio, if you will, in addition to all of the other features that incorporate those speech-in-noise improvements. But that new radio, the LE Audio compatibility, required that we update the capabilities of the table mic, which is, for
(08:10):
many of my patients, their favorite accessory to go in combination with the hearing aid, because as great as our hearing aids are, sometimes hearing aids alone are not enough.
And even with the connectivity advantages provided by Auracast and its broadcast capabilities, and I hope we'll have a
(08:30):
little time to talk about that, because that's kind of a big deal for the future of ubiquitous connectivity.
But there still, I think, for many patients, will be situations where they go to dinner with a small group and they want the convenience of taking the table mic out and setting it on the countertop or on the table, or,
(08:53):
if they're with one guest, having them wear it around their neck, giving them the confidence of improvements beyond what hearing aids alone can do, even as impressive as what we're seeing.
Holly Schissel (09:04):
Absolutely, and
we've made some great performance improvements for our table mic. If we think of the table mic as a multi-array system, we're setting the default condition so that it's looking for the people sitting around the table, and the mics closest to the user are not amplified. The user can certainly select them and bring them back into the conversation, but it's not often you've got someone
(09:25):
right by your side at a table. So I think those are really nice. You know, we're always looking for those little tweaks that bring it up another level for the patient.
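The multi-array behavior Holly describes is a form of microphone-array beamforming. As a generic illustration only, not Starkey's actual algorithm, a minimal delay-and-sum beamformer over a hypothetical eight-mic ring might look like this:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def delay_and_sum(mic_signals, mic_positions, direction, fs):
    """Steer an array toward `direction` (unit 2-D vector, toward talker).

    mic_signals:   (n_mics, n_samples) time-aligned recordings
    mic_positions: (n_mics, 2) mic coordinates in meters
    fs:            sample rate in Hz
    """
    out = np.zeros(mic_signals.shape[1])
    for sig, pos in zip(mic_signals, mic_positions):
        # Mics farther along the look direction hear the talker earlier,
        # so delay those channels to line up the target's wavefront.
        delay_s = np.dot(pos, direction) / SPEED_OF_SOUND
        out += np.roll(sig, int(round(delay_s * fs)))
    return out / len(mic_signals)

# Eight mics on a 5 cm ring; a talker across the table sits at +y.
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
mics = 0.05 * np.stack([np.cos(angles), np.sin(angles)], axis=1)
look = np.array([0.0, 1.0])  # pass as `direction` to delay_and_sum
```

Summing the aligned channels reinforces the talker in the look direction while off-axis sound, including a person right beside the device, adds incoherently and is de-emphasized.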
Dave Fabry (09:35):
Yeah, really refine
the functionality, improve the user interface. As you said, it sort of begins with the assumption that you don't have someone sitting right next to you but rather out across the table. You can still, as before, for those people who've gotten used to the user interface, either have it automatically select from the eight beamforming microphones or
(09:56):
still select up to two of their favorites. If they're just with two other people, they can freeze those locations and use it as they have before, and otherwise the performance and the user interface will be very similar. Subtle differences, but I think, despite all the promise of LE
(10:16):
Audio, having the remote microphone and the table mic as something a patient can use on demand, in addition to the remarkable achievements with the devices themselves, is a welcome addition. Yeah, absolutely.
Holly Schissel (10:31):
So I think we've got that solid audio performance of the Edge AI products alone, but there's also situational awareness. Sometimes I say, you know, first fit the loss with the hearing aids, and then fit the lifestyle with the accessories.
Dave Fabry (10:44):
Yeah, and it is always sort of mystifying to me that more patients aren't aware of some of these accessories, and I think it really is not just for patients with severe to profound hearing loss. Increasingly, we're finding that people who have had noise damage over time, or simply aging of the ears, benefit from being
(11:06):
able to pick up those high-frequency cues in noisy environments that are often obscured in the cacophony of other voices in a challenging listening environment. So I'm really glad to have that back in the stable, and with improvements, as you mentioned.
Holly Schissel (11:21):
Yes, but one thing that I thought was interesting: we've watched through the years that accessories have this low attach rate. As you just said, people think hearing aids alone should do it. But the number of suggestions and requests from the field that we got for these products really gave me heart that people know the right place for them and the benefit they can provide patients. So we're really excited to get them out there.
Dave Fabry (11:42):
Yeah, so many of the patients, certainly those with more significant loss, have large signal-to-noise ratio deficits for speech beyond what their audiogram might suggest. But even some of those people who are in noise a lot of the time, and the professionals who are listening can see
(12:02):
those patients in data logging, are in speech-in-noise and communication environments that are challenging. Don't forget that the combination of an accessory like the table mic and their devices can provide people with that additional, almost, I say, third ear, or their superpower. And I think it's important to sort of address for a minute
(12:25):
that the pairing is done directly between the table mic or the remote mic and the devices, so that, heaven forbid, if a patient wanted to leave their smartphone at home, they could simply take the table mic and their devices, go out into a noisy conversation, and just turn the volume up and down
(12:45):
directly on the table mic, activate the function, turn it on and off, and when they're finished, turn it off, and then they're back on their hearing aids. So, as blasphemous as that seems for many of us who are tethered to our smartphones, some people don't always want that. They want to be present, and having that accessory gives them additional confidence and performance.
Holly Schissel (13:07):
My mom thinks it's rude, right, when she takes it out. She wants to keep it hidden during that conversation, so absolutely.
Dave Fabry (13:12):
And, speaking of that, one more user interface element was a welcome addition on the recent launch but still hasn't, I think, been given a lot of attention, or may just again be part of the story where we're wanting to focus on signal-to-noise ratio benefits. But once again we have an Apple Watch-compatible user app, and
(13:33):
talk a little bit about the functionality of that. What's a use case for how a person could use the Apple Watch to more discreetly, and less rudely, to your mother's point, make adjustments to their hearing instruments?
Holly Schissel (13:46):
I think it's the volume and program changes that they can make more simply, so it's discreet. If you've left your phone behind, it's a quick way to make an adjustment, and if something is quite loud, it's right there at your wrist. So I think we're always just looking at ways to make sure that the technology is seamless, that it integrates into the patient's life so that wherever
(14:09):
they are, they've got the solution for them.
Dave Fabry (14:12):
Specific to the accessories, how would someone decide between the table mic and the remote microphone?
Holly Schissel (14:20):
I think, again, it's kind of looking at those lifestyle situations. What's interesting with the table mic is that it can certainly be used as a table mic, as the name states, but it also can be worn. So if somebody finds that they're in multiple situations, that might be the best one for them. I would say the remote mic is probably better for those one-on-one conversations. So it's really going to be
(14:42):
dependent upon their lifestyle.
Dave Fabry (14:44):
Yeah, I think it's that really simplified user interface, to just be able to put it on when there's one other talker, whereas the table mic is really an all-around player. And, interestingly, some people may not be aware that with the table mic, as it goes from horizontal to vertical in position, it will automatically turn on that top microphone, the one up closest to the source, where the
(15:07):
person wearing the device would be speaking.
Holly Schissel (15:09):
Yeah, a simple user interface. Another nice improvement we've made on the remote microphone is that when it's not streaming, when you're not using it as a remote microphone, you can actually use the volume control buttons to control the volume on your hearing aids as well. So, again, it's simplification, getting that multi-purpose device, so you don't have to have a remote and
(15:30):
your phone and all of these devices. It's just simplifying, if you just want one clean accessory.
Dave Fabry (15:37):
That's really a nice addition, actually, because then you don't need a separate remote control. Now, that remote control is very nice, again, for those patients we've already had with Edge AI, and that continues with this latest launch. But that simple remote enables people who don't want to use a smartphone to not always have to use the onboard controls
(15:59):
for their device; they can even engage Edge Mode if that's assigned to one of those functions. And talk about the advances in terms of Edge Mode Automatic on the top tiers. Do you think this helps providers defend a higher-tier technology, given
(16:21):
that real departure from being an on-demand feature to one that continuously updates?
Holly Schissel (16:26):
Yeah, I think it's really important. Again, it's that seamless integration into the patient's life. You know, we recommend Edge Mode as that on-demand experience, and certainly the settings are maybe going to be a little bit more dramatic than our typical everyday program. So when they go into Edge Mode, sometimes they then forget, right? They leave that situation that they called upon Edge Mode for.
(16:49):
So if they walk out of the restaurant, if they get back in the car, it'll automatically update to that new condition and be an appropriate listening response. So I think it's just that seamless, right? The more user-friendly, the better we can optimize that hearing experience. It makes a big difference on that top tier.
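To make the on-demand versus automatic distinction concrete, here is a generic sketch of a classifier-driven update loop. This is an illustration of the concept only, not Starkey's implementation; every callable here is a hypothetical hook.

```python
import time

def automatic_mode(read_frame, classify, optimize, apply_settings, poll_s=1.0):
    """Re-optimize whenever the detected acoustic scene changes.

    On-demand mode would run `optimize` once, when the user asks.
    An automatic mode keeps watching instead, so a user who walks
    out of the restaurant and into the car is not left stuck with
    restaurant settings.
    """
    last_scene = None
    while True:
        scene = classify(read_frame())  # e.g. "restaurant", "car", "quiet"
        if scene != last_scene:
            apply_settings(optimize(scene))
            last_scene = scene
        time.sleep(poll_s)
```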
Dave Fabry (17:08):
Excellent. Yeah, and I see, again, that feature alone, the patient being able to select whether they want to emphasize speech or reduce noise, or that compromise, if you will, of best sound. Even during the
(17:32):
day, they may say, if they're in meetings or at a conference, that they want to enhance speech clarity, but then later, as the social events start occurring, they may want to reduce noise more. So, even within the same individual at different times throughout the day, they may select a different element of Edge Mode. Increasingly, I'm seeing that patients are moving away, at
(17:54):
least in my hands, from manual programs like restaurant, crowd, outdoor, and are instead using Personal plus Edge Mode to characterize and personalize to all of their environments. I don't know whether we've seen that in a larger sense.
Holly Schissel (18:14):
I know we saw it in one of our clinical studies when we first brought out that Edge Mode feature. So it's, again, that patient preference for the settings that it adapts to. Not only do they prefer that unique situation; it's also the ease with which they can come by it. They don't have to think about multiple programs or how they get to that one. It's just a simple process to get there.
Dave Fabry (18:37):
Yeah, for those patients who specifically want designated, clinician-fitted programs, we can still have up to four programs that can be configured. But I agree that we want to really enable that ease of use and ease of application for those patients who just want to set it and forget it, but also those who want to get under the hood and add manual programs on their own.
(18:59):
That hasn't changed a bit.
Holly Schissel (19:01):
Yeah, I think the product line, you know, gives patients an edge and it gives professionals an edge. The user interface and our Pro Fit fitting software are flexible enough that you can address every patient's needs.
Dave Fabry (19:14):
Well, so we've talked about the DNN processing and the way that we've incorporated it. We've talked about accessories, we've talked a little bit about enhancements in terms of Edge Mode, and we briefly opened with LE Audio. So talk to me as if I were your mother: why would she want to consider updating
(19:37):
to products that use LE Audio, versus what she has with the hearing aid as sort of a standalone device, if you will?
Holly Schissel (19:44):
Sure. There are a few reasons. One is Auracast technology; that's really the wireless tech of the future. We're waiting for the time when that starts. It's now integrated into concert halls; we've seen televisions, we've seen streaming services. But imagine a day when you can actually hear the airport
(20:07):
overhead announcements, because they can come through Auracast. So it'll start to be integrated into more of these public spaces. The other big improvement that we sometimes overlook is just the audio performance improvement. Really, fewer packets are dropped.
Dave Fabry (20:26):
What does that mean?
Holly Schissel (20:26):
What does that mean? I was just about to, yeah. So when the stream speeds up or you drop a packet, on the hearing aid it sounds like a drop in the signal: crackling, just audio artifacts.
So the new LE Audio will kind of address those things. You've got this really clear streaming experience, and there's
(20:47):
better connectivity if somebody's walking around while they're streaming, et cetera.
Dave Fabry (20:52):
More range.
Holly Schissel (20:53):
More range.
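To make the dropped-packet point concrete, here's a toy sketch of why a lost frame is audible and how a receiver can paper over it. This is a generic packet-loss-concealment illustration, not the LC3 codec's or Starkey's actual scheme.

```python
import random

FRAME = 160  # samples per frame (10 ms at 16 kHz)

def receive(frames, loss_rate=0.1):
    """Conceal lost frames by repeating the last good one, faded.

    Without concealment, a lost frame plays as silence: an audible
    click or dropout. Repeating the previous frame at reduced gain
    is the simplest classic concealment strategy.
    """
    out, last_good = [], [0.0] * FRAME
    for frame in frames:
        if random.random() < loss_rate:           # packet lost in transit
            frame = [0.5 * s for s in last_good]  # repeat, fade out
        else:
            last_good = frame
        out.extend(frame)
    return out
```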
Dave Fabry (20:54):
Yes, yeah. So, you know, we talked about sort of the future of this. You mentioned, I think, one of the great use cases, which is indeed the airport. I mean, we both spend a fair bit of time in airports, and in our Minneapolis airport some of the gates are relatively close together, and when there are amateurs flying and you're not careful to listen to which gate the announcement is
(21:18):
coming from, well, they've got three gates all boarding at roughly the same time. People are getting up and feeling sheepish: did you announce for me? Once the infrastructure develops around this, we'll really be able to use, let's say, a QR code or some sort of way to signify that I want to link up to gate 13 announcements versus gate 14 announcements, and
(21:41):
really prevent that embarrassment of getting up at the wrong time.
For me personally, that's a big one, and just a convenience factor, absolutely. Give me some other use cases where you might see benefits for this in a public setting.
Holly Schissel (21:52):
I think that it can go to multiple people in the same space. As we think about telecoils, certainly an important part of access for patients, that will start to be integrated with Auracast. Like you said, it can be a simple QR code that allows you to be connected. The other reference people talk about is, you know, the gym. You're there often, Dave.
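As a sketch of how a QR code could carry everything needed to join a broadcast: Bluetooth's Public Broadcast Profile defines a broadcast audio URI for this purpose. The payload format and field names below (BN for broadcast name, BI for broadcast ID, BC for broadcast code) are modeled on that spec but should be treated as an assumption; the parsing is purely illustrative.

```python
import base64

# Hypothetical QR payload in a broadcast-audio-URI style.
payload = "BLUETOOTH:UUID:184F;BN:R2F0ZSAxMw==;BI:A1B2C3;BC:c2VjcmV0;;"

def parse_broadcast_uri(uri: str) -> dict:
    """Split a broadcast-audio-style URI into its key/value fields."""
    scheme, _, body = uri.partition(":")
    assert scheme == "BLUETOOTH", "not a broadcast audio URI"
    fields = {}
    for part in body.rstrip(";").split(";"):
        key, _, value = part.partition(":")
        fields[key] = value
    for key in ("BN", "BC"):      # name and code are base64 here
        if key in fields:
            fields[key] = base64.b64decode(fields[key]).decode()
    return fields

info = parse_broadcast_uri(payload)
print(info["BN"], info["BI"])  # "Gate 13" and the broadcast ID
```

A phone app scanning such a code would then hand the broadcast ID and code to the hearing aids so they can synchronize to that specific stream, gate 13 rather than gate 14.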
Dave Fabry (22:15):
You too, you've got your tennis shoes on, so we could go right now after this. But yeah, we've got three screens in back of us here, and they might all have different video, but the audio isn't typically played, because you could never sort out the audio from all of those different screens. But you'd be able to link to one. A sports bar, same way. In Minnesota, I can guarantee you that if they begrudgingly have a Packers
(22:38):
game on, it never has audio associated with it. It should be outside, and it'll be outside, but now at least I could link to that television and get the audio. But you know, I think you alluded to some of the things that are a little nearer term. I mean, we're already seeing the potential for this. You mentioned telecoils. Starkey has been one of the manufacturing partners to
(23:02):
hearing aid users still supporting the use of the telecoil in custom and standard devices. Until and unless there is no reason to have those systems be compatible, we're going to continue to support those people who've had loop systems installed. I'm going to make a statement, and I'm going to ask you to correct me if I'm overstating, but I
(23:24):
think very few people are using telecoils for the purpose of telephones anymore.
Holly Schissel:
I would agree.
Dave Fabry:
And it instead is more for assistive technologies in places of worship, in adult learning centers, you know, places where they've had that; maybe at home they've put in a loop system and they had that installation.
(23:45):
Is Auracast designed to eventually improve that situation without requiring an installation? Aside from the fact that in a small community or a place of worship where they already have the loop installed, they may want to save
(24:06):
their budget until the appropriate time, when it's worn out or the technology is much better, to replace it. Do you see that this will duplicate some of the types of group environments that a telecoil would serve?
Holly Schissel (24:20):
Yeah, duplicate is probably a good term. There's certainly going to be this period of overlap. I think it's just most important that patients have access to these types of life-changing features. So I do think, you know, sometime in the future that's what we'll start to see.
Dave Fabry (24:35):
Yeah, yeah, what do you think? Well, I agree. You know, as I said, we're going to support, for as long as it makes sense, form factors and custom and standard devices that incorporate the telecoil, because we think it is and remains a very important way for a lot of people to communicate. But I think the potential is for improved sound quality, not having to sit in that defined area. And then I think the thing
(24:58):
that's different about Auracast is that, in addition to applications for hearing aid users, it also will apply to people who are using earbuds and other means of wireless communication that also incorporate LE Audio. So this is a much bigger potential benefit to people in
(25:20):
those sports bar, airport, and group listening situations, and museums where you could be getting an audio tour linked to various locations. And I think for that reason we need to accelerate our timeline, and we've done that with this technology. With our LE Audio, we're designed to be the most open in our space, to
(25:41):
really be compatible with all LE Audio transmitters, or as many of them as possible out there in the field, so that we give the end user confidence that their technology will work if there is an LE Audio broadcast transmitted somewhere in common spaces.
Holly Schissel (25:59):
Yep, that confidence.
Dave Fabry (26:00):
We even have it in our WFA auditorium. So those customers of ours who will be coming to Eden Prairie, if they're fitted and they have a smartphone that is compatible with it, and that's again the infrastructure that has to develop around smartphones, that ability within the app or within the phone to detect the presence of LE Audio and then pair and join a broadcast.
(26:23):
Absolutely. So the issue, even closer term on this, is TVs. Only half-kiddingly, I say I was the remote control growing up in my family. My dad would say change to channel five, and I'd get up and go turn it. Right, but you know, we kept the TV for 10 or 15 years in those
(26:43):
days. People are replacing television sets. Okay, Boomer. People are replacing television sets much more routinely and frequently now. It's likely the next television that you buy will have the potential, at least, to have Auracast compatibility for the home.
(27:08):
Now, that wouldn't be the same broadcast, but the Auracast capability would enable, let's say, multiple family members to have the same stream from a TV set without necessarily requiring a separate TV streamer. Yeah, absolutely.
So in the meantime, we do have a new and improved TV streamer with LE Audio capability, which we've had since Edge AI and
(27:29):
continuing. But I think that's perhaps one of our accessories that in the long run will be on the endangered species list, in the sense that, you know, unless you're traveling and you want to couple it to a TV set that you don't know is Auracast capable. But at some point, is it a reasonable thing to believe that
(27:53):
people will be able to expect that they can stream directly from a television set to their hearing aids, or through the phone to the hearing aids?
Holly Schissel (27:57):
I hope so. I mean, absolutely, that's one of the promises of this. Yeah, great for patients.
Dave Fabry (28:01):
Then, moving even closer term, we talked about iPhones and Android phones. There is a QR code, like you
(28:25):
referenced.
Holly Schissel (28:27):
Many of the professionals we work with have that QR code in their offices, and if not, we can certainly get them one. It's also on Starkey.com, where our patients go for a lot of great information about hearing aids. It's listed there, as well as on our Starkey Pro site for professionals. So we try to have it in many places, just to make sure that people can find what they need.
Dave Fabry (28:47):
Another area, another use case that many of my patients have been asking for, many are younger and still working or actively using their computer in addition to their phone, is that now they want compatibility with their computer, whether they're a Mac user or a PC user. Talk a little bit about that.
(29:07):
Is that possible?
Holly Schissel (29:09):
Absolutely. That's right where we're going. We're trying to make those integrated earpieces, that hearing aid that aids many different parts of their life. That information is also there on the site. It gets to be pretty complex, right? Which wireless technology are they using? So our goal is to always simplify that information. The QR code is probably the easiest way, because it can
(29:32):
determine what type of cell phone is reading it and serve that patient the right compatibility information. We try to keep it all straight for them.
Dave Fabry (29:41):
Yeah, it does get complicated, and I know many people feel like they have to become an Apple Genius or a Geek Squad member, not to exclude anyone else, just to really understand that compatibility. But you can go to Starkey.com and find the chart of what's compatible. The goal here is to enable that sort of seamless
(30:09):
transition, like when I'm on a Teams call on my phone and then I transfer over to my computer. To just have that audio transfer over without intervention is the objective. Right now, perhaps the reduction in pain that's most welcome is that, let's say, because I'm an iPhone user and a Mac user, I go through Accessibility, then Hearing Devices, and pair
(30:31):
that initial time, and I don't have to go through that process again. As long as I've done it once with my iPad, once with my phone, once with my Mac, then as I go between those devices, I may need to simply toggle off Bluetooth on the device that I'm connected to. As long as I've previously paired with the computer or with
(30:52):
the pad, it will pick up that connection and route the audio there pretty seamlessly, the majority of the time, for myself and for the patients that are using this technology.
Holly Schissel (31:02):
Yeah, absolutely.
Dave Fabry (31:03):
We've been really excited about that. That connectivity, I think, is increasingly becoming table stakes, with the expectation by hearing aid users of not only connecting to their phone but connecting to other peripherals, which is why we've built this into this technology moving forward. So we both know what's on the roadmap, but do you want to tease a
(31:25):
little bit about anything coming in the future? What else do you envision, without giving away any confidences or giving our competitors any lead time on this? Where do you see technology going? As somebody in the driver's seat, you're in the catbird seat, able to see where the competitive market is and where we are.
(31:45):
I think increasingly we have to broaden who our competitors are, to consider a broader array of competitors than in the past. But what do you see as being the biggest opportunity in the next three to five years?
Holly Schissel (31:59):
I think it's always keeping our eyes on what the patient needs, looking at them, and that drives what we do. So I think we'll continue to see those performance enhancements for patients as we move forward.
Dave Fabry (32:12):
Yeah, I think, you know, the health and wellness features. I mean, again, I'm biased, but modesty aside, I think in 2018, when we incorporated inertial measurement units to start enabling physical activity and social engagement tracking, fall detection, and now fall risk
(32:33):
assessment, we really, I think, have led in the area of helping raise awareness of the fact that hearing loss doesn't occur in a vacuum. It often accompanies other issues that people are dealing with in aging, or at any point throughout their life, and I really, you know, I think that focus isn't going to change.
(32:54):
I think using more and more of the intelligent assistant functionality, DNN and OpenAI, using some ChatGPT kind of functionality, has been something that we've continued to enable and lead with moving forward in comparison to the
(33:16):
rest of the industry. Anything you want to talk about or tease at all?
Holly Schissel (33:20):
If you keep
pushing me.
Dave Fabry (33:21):
I know, yeah, it's my job, no?
Holly Schissel (33:24):
You know, I think that's kind of a cool thing about Starkey. If we think about surprising the industry last fall when we launched Edge AI, I think there are more surprises to come.
Dave Fabry (33:35):
Okay, well, I did my best, for those who want to get a glimpse of the future. I think one of the things that I'm really pleased with, whether my hands are full or whether people have manual dexterity issues, is that functionality to not only query about how to clean my hearing aids. I think the self-check feature is something that
(33:59):
hid in plain sight for a number of years, and now, I think, after Edge AI, it's the second most widely used function beyond volume control and program change. And really, I see more and more clinicians starting to enable and empower their patients to use that self-check feature to
(34:21):
determine when they need to replace a wax guard and when they need to come in for a follow-up appointment. I'm really pleased to see those types of features that are always patient-driven, and I know that you really stay on top of that in the market too.
Holly Schissel (34:35):
We try.
Dave Fabry (34:36):
Well, I appreciate the time today, and I think that's a good little teaser, but you, appropriately, are preserving Fort Knox for the future for us. There you go. But, Holly, I thank you for sitting and talking a little bit today about where we've been and where we're going, and I hope this is helpful to you, a glimpse into the latest
(34:57):
technology from Starkey. Stay tuned, because the technology is changing very rapidly, and I guarantee you that Holly is one of the people behind the curtain making these improvements that are always patient-focused, and also with the professional in mind, to take our technology, with the expertise of the professional, to optimize results for the patient.
(35:18):
So, thank you. Thank you.