
August 8, 2024 17 mins

Christine Perey, AREA Founder, chats with Jameson Spivack, Senior Policy Analyst at the Future of Privacy Forum, about the privacy challenges in enterprise AR.

Privacy concerns and sensitivity to the security risks of cameras and other sensors in the workplace continue to be obstacles to large-scale AR deployments. Advances in AI and computer vision could address concerns over how data is collected and handled.

Privacy is the #4 trend in Christine's Top 12 blog: "Top 2024 Enterprise AR Trends to Watch." Read it here: https://thearea.org/top-2024-enterprise-ar-trends-to-watch/

 

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:04):
Hello, I'm Karen Quatromoni, the Director of Public Relations for Object Management Group, OMG. Welcome to our OMG Podcast series. At OMG, we're known for driving industry standards and building tech communities. Today we're focusing on the Augmented Reality for Enterprise

(00:26):
Alliance (AREA), which is an OMG program. The AREA accelerates AR adoption by creating a comprehensive ecosystem for enterprises, providers, and research institutions. This Q&A session will be led by Christine Perey from Perey Research and Consulting.

(00:47):
I'm very happy to be here today. Thank you for joining me. In today's fireside chat with Jameson Spivack, we're going to be talking about privacy. Jameson, please introduce yourself, tell us what you do and why you're the expert.
Yeah, absolutely. Thank you, Christine. So my name is Jameson Spivack.

(01:07):
I'm a senior policy analyst with the Future of Privacy Forum, where I lead the organization's work on immersive technologies, which we define broadly to include extended reality, so obviously augmented reality as well as virtual and mixed reality, but also technologies like neurotechnologies, or virtual-world and

(01:29):
gaming platforms, and other similar technologies within that ecosystem. And so I focus a lot on the privacy implications of these emerging technologies, and spend a lot of time thinking about: what are the new privacy risks that might arise with these technologies? How does existing or proposed legislation

(01:52):
apply, or fail to apply, to these technologies? And are there any gaps? And because immersive technologies intersect with so many other critical aspects of the tech ecosystem and of the privacy ecosystem, I also spend a lot of time thinking about biometrics and kids'

(02:13):
privacy and advertising and mobility, and all the ways that immersive technologies are being integrated and implemented in these other spaces as well.
Indeed, indeed. That's a really good explanation. And this isn't the first rodeo for the Future of Privacy Forum. There was a report issued, I think two or three years ago,

(02:34):
maybe before your time, but I'm sure you're familiar with what I'm talking about. The XRSI and some others were involved in developing that. How do you get your message out, besides that white paper from a few years ago? What are you doing, talks or new papers? What do you do?

(02:55):
Yeah, yeah. All of the above. So you're right, that was before my time at the Future of Privacy Forum, unfortunately, but we're continuing to put out reports. In December, we put out a risk framework that is geared towards organizations that are handling body-related data in the

(03:16):
context of immersive technologies. And it serves as a framework for them to assess: first, to understand how they are using body-related data; what their legal obligations are; what the privacy risks are, based on how they are handling this data;

(03:38):
and then, what are some of the best practices they can implement in order to minimize the risks that they've identified. And it's geared towards organizations that are operating in the immersive technology space, but it's really applicable to anyone that handles body-related data. And increasingly we're seeing that technology relies on data

(03:59):
about our bodies and our behavior, which can be...
And speech, our speech, recognizing a person's voice. And we're getting better and better at that all the time because of the speech interfaces with these LLMs and with other tools that we have. So yes, speech is one of those body parts, I think.

(04:22):
Exactly. Exactly. Just the amount of data about our bodies is increasing, and the aggregation of all of these different data types together is allowing for really exciting use cases, particularly in health and productivity and things like that. But with it comes risk, because these data types are particularly

(04:43):
sensitive. So that's one thing that we've been working on. Also doing a lot of talks.
We can include a link to that. Is it public? We can include a link to that in our show notes. Okay, thank you.
Absolutely. Absolutely. On top of that, doing talks, I was just at State of the Net on Monday this week,

(05:07):
which is a big tech policy conference in DC, where I gave a talk on neurotechnologies, what some of the privacy implications are, and how policymakers in the US and around the world are thinking about regulating neurological data, or neural data.

(05:28):
So data from our nervous system.
Patterns, patterns about what we're thinking.
Right? Right. Exactly. And we speak a lot with stakeholders in industry, in civil society, in academia, and really what the Future of Privacy Forum tries to do is bring together a lot of

(05:51):
the top thinking on these issues from the different sectors, and then try to help companies think about some of the best practices that they can implement in order to protect people's privacy.
That's great. That's great. Yeah. So

(06:13):
this is a multi-year, even multi-decade exercise, because the target's always moving, right? So there's a lot of things that have been done to protect privacy, but they're applied after the fact, on material that's been collected or gathered in the past.

(06:35):
And I'm thinking, for example, of Google Maps and Street View, how personally identifiable information is scrubbed or obfuscated. And one of the trends I was thinking about is that we could use AI not only to recognize faces, that part

(06:56):
you don't need AI for, that's just called computer vision, but to use advances in computer vision to erase personally identifiable information in anything that the camera on the glasses might capture. So think of a person who's, maybe a service technician is out in the field,

(07:20):
they're getting so much value from their glasses that they don't want to take 'em off. But on the other hand, they're taking pictures, they're recording video of the customer's site where they are, and maybe they're repairing a piece of equipment that's within a closed, sensitive site where there's intellectual property, or where

(07:43):
other employees are moving around. So there's so much information that could be inadvertently exposed. Can we use AI to make immediate improvements over what's been done so

(08:05):
far?
Yeah, I think you're absolutely right that we could use AI to, for example, blur the faces of bystanders who might appear in a user's field of vision, but who are not able to, or won't, consent to having their face be captured. And this is absolutely something that I've seen.

(08:27):
So we focus more on the consumer use of these technologies rather than on industrial or workplace uses, but I think that it's absolutely comparable. If someone is using an XR headset out in public and they are capturing people's faces, those people are not going to be able to be notified that their

(08:50):
data is collected, and they absolutely would not be able to provide consent.
Consent. They absolutely won't be able to.
So theoretically, this is a good privacy-preserving practice: automatically blur their faces. But we need to be really careful about how this interacts with existing laws on processing biometric data. So to give you an example of something that we've seen in the US

(09:13):
is that there have actually been some issues with face blurring under the Biometric Information Privacy Act in Illinois, which is the premier biometric privacy law in the US, because in order to engage in face blurring, a device will actually need to first sense that a face is present and then

(09:35):
pixelate the image. And even that amounts to processing data about the person's face, which under certain biometric privacy laws, like Illinois's BIPA, the Biometric Information Privacy Act, requires consent. But the bystander obviously is not able to give consent even to the blurring of their face.
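The mechanism Jameson describes, first detecting that a face is present and then pixelating it, can be sketched in a few lines. This is only an illustrative example, not any vendor's actual implementation: the face detector is assumed to be external, and the function name, bounding-box format, and block size are hypothetical. It does show why even privacy-protective blurring involves processing face data, since a detector must locate the face before anything can be obscured.

```python
import numpy as np

def pixelate_region(frame, box, block=8):
    """Pixelate one detected region of a video frame in-place.

    frame: H x W x C image array; box: (x, y, w, h) bounding box,
    assumed to come from an external face detector. Each block x block
    cell inside the box is replaced by its mean color, so the face is
    unrecoverable while the rest of the frame is untouched.
    """
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w].astype(float)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            cell = roi[by:by + block, bx:bx + block]
            cell[...] = cell.mean(axis=(0, 1))  # average the cell's pixels
    frame[y:y + h, x:x + w] = roi.astype(frame.dtype)
    return frame
```

Whether this runs on-device or the frame ever leaves the headset is exactly the design choice with legal weight: under a law like BIPA, the detection step alone may count as processing biometric data, regardless of what happens to the pixels afterwards.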

(09:58):
So there have been some issues with this. But I think the case law is starting to clarify that those specific use cases wouldn't be a violation of the law. The lesson here is that regulations really need to be careful, and they need to be very clear, so that they don't discourage or make

(10:20):
unlawful certain practices, like face blurring, that are actually protecting people's privacy. So theoretically, yes, it's possible. I am not sure how common it actually is, but if it's not common, it might be because companies are worried about the legal implications of

(10:42):
actually erasing somebody's face.
And it's not just faces. See, I think that we can also extend this and say, okay, so we're not going to do it on faces, but there's lots of other sensitive information in a business context or a business environment. As I was saying,

(11:03):
the type of machines around the machine that I'm repairing, or the location of a central office in a telecoms environment, or perhaps even, how do I say it, defense in a military context,

(11:27):
you'd want that information blocked and never recorded. Okay. So let me ask, where would companies

(11:47)

who want to offer better features on their platforms or in their services, and who also want to protect privacy, start on that journey? Do they rely on their customers to take the lead?

(12:11):
So if I'm selling to an oil and gas company, or I'm selling to somebody that has a factory, do I have to rely on my customer to do all that? Or should I, as the provider of technologies, begin? Where do I begin?

(12:31):
Yeah, that's a really big and good question. I think it is going to vary by company, depending on what their goals and objectives are. I can speak more about the consumer sector than B2B,

(12:52):
but I know that certain companies will prioritize privacy, or security, or user control over data, but that might limit some of their features, such as interoperability. So if a company wants to have

(13:16):
a headset where data is stored on-device and is not sent back to their server, that is a privacy-protecting practice, because the person's data is only on their device; no one else has access to it. And that's great for privacy, but it prevents certain features, like co-presence, or building out maps together in a space with another

(13:39):
person, because that would require the data to go back to the server or to a third party. And so that comes down to what the company is prioritizing. And I think that some of that will just come down to where they sit in the market. You can imagine there are certain companies for which privacy is part of their brand;

(14:01):
their business model doesn't necessarily rely on the collection of personal data. And so that might be a route that they'll take. Whereas other companies that are more about personalization will collect that data and they will offer...
Network operators. Network operators need to authenticate specific users to get them on

(14:24):
their networks, to give them different levels, or tiers, of service, depending on what their agreements are, their service level agreements and so forth. So there are definitely differences in business models.
That's a very insightful way to answer that question.

(14:46):
You've been around this block, I can tell. So tell me, are there committees in Washington, DC, or planned committees, that are looking at this intersection? Are you trying to get policymakers to

(15:07):
pay attention, or to ignore it? What's the best strategy there?
Yeah, so there's a trade association called the XR Association, based in Washington, that's been very active on Capitol Hill, working with policymakers. They have education days for the lawmakers and their

(15:30):
staffers, where they're teaching them about the technology and how it works. And I think that it's so early for the technology that a lot of people don't even really know what it is or how it works, and so you have to start with just the education. And that's what they've been doing: they have demos where the lawmakers can

(15:51):
actually come and try out a VR headset or an AR headset for themselves. So they're doing a lot of work on the Hill. They've been able to get, I think it was November, at least last year or the year before, designated as

(16:11):
XR Month or something like that. So they've had a few wins in that sense. But I think that it's just so early that it's not a priority for a lot of lawmakers. Another thing that...
Yes, there are other priorities right now. I can definitely concede that point.

(16:36):
And so I think that there's been a narrative that AI means that immersive technologies are dead: the metaverse is out, no one cares about VR or AR anymore, because it's all about AI. And I think that that's just a completely false dichotomy, and that actually AI makes immersive technologies more likely to

(16:56):
take off, because it makes the technology better; it lowers the barrier to entry so that people can more easily create virtual content for themselves. And so I think that it actually makes it more likely that these technologies will take off. But a lot of people have trouble. They see it as kind of zero sum. They...
Can't put the two in the same box. Yes, I understand. Okay.

(17:21):
It's all right. Good. Good, good, good. This is very helpful. So, fantastic, Jameson, thank you for your time and for sharing your insights and what you do with us. We'll look forward to keeping in touch and learning more about where this all goes, this journey.

(17:43):
Yeah, absolutely. Thankyou so much for having me.
Thank you. Bye-Bye.