
December 25, 2025 • 39 mins

Today, we're sharing an episode of a show that explores the problems that new technology is creating and how we navigate living in the future. It's called Kill Switch, and it's hosted by Pulitzer Prize-winning journalist Dexter Thomas. The episode you're about to hear is about the latest in wearable tech—stuff like smart glasses, pendants, watches, and rings. After the implosion of Google Glass back in 2013, which faced backlash and ridicule, we're now readily embracing wearables. What's behind the new fervor for wearables today, and have we moved on from the privacy and surveillance questions that plagued Google Glass?

Dexter talks to Victoria Song, a senior reviewer at The Verge whose job it is to test out each new iteration of this technology, about the state of wearables today, why companies are obsessed with getting AI into them, and how they’ve already changed how we talk to each other, and ourselves, IRL. Find more episodes of Kill Switch wherever you get podcasts.

Got something you’re curious about? Hit them up killswitch@kaleidoscope.nyc, or @killswitchpod, or @dexdigi on IG or Bluesky.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin.

Speaker 2 (00:18):
Hey, it's Jacob. Today we are going to play for
you an episode of a podcast called Kill Switch. The
show is hosted by a Pulitzer Prize winning journalist named
Dexter Thomas, and it's about technology, the problems that new
technology can create, and how we navigate living in this
technological world. The episode you're about to hear is about

(00:39):
wearable tech, basically how we got from Google Glass more
than a decade ago to where we are now. I
hope you'll like it.

Speaker 3 (00:57):
There's this TikTok that somebody posted where she was getting waxed.

Speaker 4 (01:03):
I had a Brazilian wax done about three weeks ago
and it's been haunting me ever since. The girl that
was giving me the wax, she was wearing Meta glasses,
and she's like, they're not charged, they're not on, like,
I promise.

Speaker 3 (01:24):
Wearable technology is having a moment. There's been this wave
building for a while where the stuff that seemed like
science fiction, like having a computer on your wrist, is
completely mainstream now, and now the next step of smart
glasses is starting to get there too. The best example
of this is Meta's Ray-Bans. These glasses blend in

(01:45):
so well that people are starting to wear them as
their normal, everyday prescription glasses with the added benefit of
being able to, among other things, record audio and video.
And this brings in a lot of questions about privacy.
These aren't new questions. We've had well over a decade
to think about this, but it's got to be uncomfortable

(02:05):
to have this societal quandary about surveillance thrown at you
when you're not wearing any pants.

Speaker 4 (02:13):
I could not stop thinking, like could this girl be
filming me right now?

Speaker 3 (02:17):
Like? Could she be filming?

Speaker 1 (02:22):
Meta will tell you that they have an LED indicator
light that goes on when you are recording. But someone's
gonna figure out a way to hack that.

Speaker 3 (02:31):
Victoria Song is a senior reviewer at The Verge.

Speaker 1 (02:34):
Someone's going to figure out a sticker to buy so
that you can put it over the case. Like what
is going to be the social etiquette going forward for
these glasses? Are they going to be banned from certain professions?

Speaker 3 (02:45):
Victoria's been testing and reporting on wearables for over a decade,
so she's seen the technology evolve from fitness trackers that
count your steps to rings that track your heart rate, to
smart watches, pendants, and glasses. But this time feels a
little different.

Speaker 1 (03:01):
For whatever reason, the big tech powers that be have
convened to decide that wearables are it. Wearables are how
we're going to get AI to really take off. Like
we're gonna have AI leave the computer and be on
your body, be on you as a person, and you're gonna
take AI with you in every aspect of your life.

(03:22):
Is that the future that we're heading towards? Is that
the future we want to head towards. These are the
conversations that we're going to have to start having.

Speaker 3 (03:34):
From Kaleidoscope and iHeart Podcasts.

Speaker 4 (03:41):
Dig.

Speaker 3 (03:42):
This is Kill Switch. I'm Dexter Thomas.

Speaker 1 (04:24):
Goodbye.

Speaker 3 (04:29):
This isn't the first time we've run into questions about
privacy and smart glasses. Most people first started thinking about
this a little over a decade ago, in twenty thirteen,
with the release of Google Glass. When I first started
thinking about wearables as maybe something that I might be
interested in, it was Google Glass.

Speaker 1 (04:49):
Oh yes, Google Glass.

Speaker 3 (04:50):
I see your face, So we obviously got to talk
about Google Glass. When you first heard about Google Glass,
what did you think about it?

Speaker 1 (04:58):
Ah, it was sort of straight out of science fiction.
There was this marketing sizzle reel showing what Google Glass
was supposed to be.

Speaker 5 (05:08):
You ready? Right there. Okay, Glass, take a picture.

Speaker 1 (05:14):
Okay, Glass. And you know, it's
an augmented reality device that shows you notifications of stuff
that you need, based on the world around you, in
real time. It's a total game changer if you really think

(05:37):
about it: could this actually work? A lot of people probably
were thinking of things like Iron Man and Tony Stark's
smart glasses. There's just a lot of science fiction imagery
that exists with heads up displays, even though this technology
actually dates all the way back to World War Two
with fighter pilots and heads up displays in the windshields.

(05:58):
If you were to really trace the genesis of it,
it's a really old concept in technology. But I think
Google Glass was the first that kind of plucked it
from science fiction into the realm of like, hey, we
have a product, and it's like, I think, what was it,
like fifteen hundred dollars somewhere in that ballpark, and you
can buy it if you have the money.
And people did wear the Google Glass Explorer Edition out

(06:21):
into the world and it was like a really surreal
thing that happened.

Speaker 3 (06:27):
Google Glass was not subtle. It barely looked like glasses.
It was this bar that went across your forehead, that
sat on the bridge of your nose, and instead of lenses,
there was this metal square thing and this inch long
transparent camera that stuck out over your right eye.

Speaker 1 (06:43):
It reminded me of the Dragon Ball Z scouters that
Vegeta wore, the power level scouter. Yeah, it reminded me kind of
like that, because there weren't actually glasses for people
to see through.

Speaker 5 (06:59):
What does a scouter say about his power level?

Speaker 1 (07:02):
It's over nine thousand!

Speaker 3 (07:05):
What? There's no way that could be right. Yo, I
hadn't thought about that. It looks like the Dragon Ball Z
scouter, the... oh my god.

Speaker 1 (07:17):
Yeah.

Speaker 3 (07:19):
You know, Google Glass really leaned into the science fiction
aesthetic of it, right. It wasn't supposed to blend in.
You were supposed to stand out when you wore it.

Speaker 1 (07:29):
Yeah, it was very distinctive. When you saw someone wearing it,
you'd see it and you'd immediately clock it.

Speaker 3 (07:36):
People would immediately clock it, and in a lot of
cases they'd make fun of it, not just because of
surveillance or because they didn't want to be recorded. Sometimes
the criticism was just more surface level and basic, like
why would you want to wear that thing on your
face and show everybody that you're a weirdo who records everything.
Back when it came out, The Daily Show had a

(07:57):
whole segment that was making fun of it. Our nation
has long been haunted by discrimination, and while we've made
great strides over the years overcoming the challenges, there are
still those that suffer from the barbs of injustice. I
was denied admission and service at various establishments. I was
mugged in the Mission district. I was asked to leave
a coffee shop, and the reason why was because we

(08:19):
don't do that. Here, Hold on a second, What the
hell have you all got on your faces? Google glass?

Speaker 4 (08:26):
Google?

Speaker 1 (08:26):
What? Google Glass?

Speaker 3 (08:28):
Google?

Speaker 1 (08:29):
And that's what this is all about.

Speaker 2 (08:31):
Yes, yes, it seems even in this day and age,
you can still be treated.

Speaker 3 (08:37):
Differently just because of how you look wearing a fifteen
hundred dollars face computer.

Speaker 1 (08:43):
So you know, that's why it caused the controversies that
it did. I think the most famous example was just
a woman wearing the Google Glass while she was out
in San Francisco, and someone ripping it off her face on
video. Okay, it's on video now.

Speaker 3 (09:06):
Sarah Slocum was a social media consultant, and she was
wearing Google Glass at a bar in San Francisco.
She was showing some friends how Google Glass works, and
some other people at the bar saw it and got
mad at her and started yelling at her because they
didn't want to be recorded, so she started recording a
video of them yelling at her, which maybe is kind

(09:26):
of ironic, but maybe also a preview of what the
future was going to look like. So it's kind of
hard to tell because everyone's yelling, but there's a moment
in the footage where you can hear someone say, get
your Google glasses out of here. They start yelling back
and forth, and she also flips them off and again
this is all on the video that she's recording, and

(09:47):
then someone grabs the glasses and rips them off of
her face. Sarah Slocum was featured in the local news
about this, and she also told her story in a
Daily Show segment about what happened.

Speaker 5 (09:57):
I was at a bar and people started verbally accosting me.
They started getting physical immediately when I started recording. They
ripped them off my face basically, and they ran outside.
It was a hate crime. The only thing is
that they're going to be wearing these things, probably

Speaker 1 (10:13):
In a year.

Speaker 3 (10:15):
I'm gonna reserve my comments about calling this a hate crime,
and instead I'm just going to move on to the
facts and say that Sarah Slocum's prediction was incorrect. Pretty
much nobody was wearing these things

Speaker 2 (10:27):
In a year.

Speaker 1 (10:28):
When you looked at it, you couldn't help but wonder, oh,
was this person recording me? And I mean there was
a recording light and all of that stuff. But it
really kind of forced the whole sci fi aspect of
it into current everyday life in a way that I
don't think society was really prepared for at that point
in time.

Speaker 3 (10:46):
So I also remember when Google Glass was announced and
actually thinking, oh, you know what, this would actually be
kind of cool. My thought was I could record concerts
or something like that. That might be kind of cool,
because using your phone wasn't really a great experience either.
You're in people's way. So, I go to artist shows

(11:08):
who I know, and sometimes they ask me to film their sets,
but it's kind of weird being a guy up on
the stage with the camera. But I remember going from
being kind of interested in the Google Glass to realizing, oh,
I think maybe people hate this, and then it was gone.
That life cycle of it was so fast. Did it

(11:29):
feel like that for you?

Speaker 1 (11:31):
Oh, it was absolutely really fast. And there's a lot of
factors that go into that. One, the price was prohibitive
relative to the things that it could actually do. Like, the
video that they showed was truly sci-fi revolutionary; it couldn't
actually deliver on that, right? So it was much more
limited in what it could actually do. And then it

(11:51):
was so ostentatious when you wore it that there's no
way of being discreet, right? So you're basically becoming an
ambassador of future technology for the average person. Do you
want to have people come up to you on the
street and be like, oh, what is it that you're wearing?
Are you filming me, are you recording me? Nah, you're
out here just to have a nice time. And you know,
in twenty thirteen, we were not in the TikTok era.

(12:14):
We were not in an era where everyone is used
to people just whipping out their phones and filming life.
The definition of a public space and a private space
was different ten years ago than it is now. I
think Google Glass was like a really unfortunate example of
a good idea, or at least like a substantive idea

(12:34):
and really poor timing.

Speaker 3 (12:36):
And of course there was the phrase that went along
with anybody who had it, whether or not they were
doing anything wrong. Glasshole.

Speaker 1 (12:43):
Yeah, glasshole. I kind of feel for Google because they're
never going to live that down.

Speaker 3 (12:50):
I don't know if I've ever talked to anybody who
said they feel bad for Google. This is a new one.

Speaker 1 (12:54):
It's one of those things where it's like, I see
the vision, you know, pun intended. I see you really
wanted to go with that. I don't think it was
necessarily a bad idea, but you kind of made some
choices that were not in step with reality, and now
you have to live with the fact that you coined
the term glasshole, like, indelibly, forever, in perpetuity.

Speaker 3 (13:20):
Google Glass was a massive failure for Google. They did
try to salvage it by selling it to companies where maybe
there would be some industrial uses for it. The idea
was that this whole thing about privacy and people feeling
uncomfortable wouldn't really matter if it's a worker in a
factory using Google Glass to check inventory or something. But
that didn't really catch on either, and Google basically gave

(13:41):
up on trying to push smart glasses. It looked like
as a society, we just didn't want it. But now
all of a sudden, there's been a resurgence in wearables
and specifically in smart glasses, and all those same questions
are coming back up. So maybe saying that society didn't
want it is the wrong framing. Maybe it's just that

(14:03):
we didn't want it yet. So, are we ready for
smart glasses now? That's after the break. Where are we
right now with wearables, and specifically with smart glasses?

Speaker 1 (14:24):
Right now, I would say we are entering the age
of AI hardware. There are these wearable pendants that are
always on you at any given point time, listening to
every single conversation you have, acting as your quote unquote
second memory. That's one vein of wearable AI hardware. The
others are things like headphones and smart watches, so existing

(14:46):
popular forms and then kind of coming out of left
field again, we have smart glasses, specifically the Ray-Ban
Meta smart glasses, emerging as a kind of third track
of AI hardware. You know, I wrote about the Meta
smart glasses, and I had their comms people reach out
to me and be like, actually, could you call it

(15:07):
AI glasses? That's how closely tied together, really, their
thesis of AI and glasses is. Like, their CTO,
I believe Andrew Bosworth, has basically gone on the record saying, like,
this is how we get AI to really take off,
and it's smart glasses. This is the ideal hardware form

(15:29):
factor for AI going forward will be in these smart glasses.

Speaker 3 (15:34):
This is kind of interesting because if you look on
some corners of social media, you'll see a lot of
people get mad about AI features being added to products
that really would be fine without it. But tech companies
are still obviously trying to put AI into everything anyway,
and in wearables they seem to be betting on us
accepting it.

Speaker 1 (15:55):
People have had ChatGPT, Claude, all of these AI
assistants on their computers for a while, and a lot
of us have come to the same conclusion that AI
can be incredibly dumb, but if it's on you, and
from all the conversations that I've been having with AI
startups and with big tech investing in the future of AI,
if it's on you, that's a different proposition.

Speaker 3 (16:18):
And with the Meta Ray-Bans specifically, the AI is
sort of a bonus feature. It's a way for Meta
to get people accustomed to AI without explicitly having to
force it on them.

Speaker 1 (16:28):
These are functional devices where you don't have to use
the AI. You never have to say Meta AI. You
don't ever have to use that wake word, but it's there.
If you're curious, you could use it. And because there
are other reasons why you would use these glasses besides AI,
it becomes a very easy entryway, a door for Meta

(16:49):
to open and say, hey, you're using it for these purposes,
wouldn't it be really great if you wanted to use it,
sort of like how you use Siri, but it actually can
do more? And why don't you experiment with these actually
very useful use cases. And that's why these particular glasses
have actually caught on like wildfire in the blind and

(17:10):
low vision community because the AI features genuinely have created
a game changing, life changing technology for them. Like I've
spoken with a bunch of blind and low vision users
who tell me that the Meta glasses enable them to
live more independent lives. They have this live AI mode

(17:30):
that allows you to just ask the AI what's going
on in my surroundings. It can read a menu for you,
and for a sighted person that might
be like, oh my god, why do I need an
AI to read a menu for me, unless it's in
a different language and it's translating it. But for a person who
is low vision or blind, just asking an AI, hey,

(17:52):
you know, my kitchen is really messy. Where is this
particular appliance and having the AI be able to tell
you to save you some emotional labor, to save you
some of the toll of having to ask a living
person to see your very messy kitchen, it provides an
actual life-changing service for that particular community.

Speaker 3 (18:14):
Victoria is not kidding. I know someone who's blind and
they let me use their smart glasses once and it
was genuinely amazing. So we were walking around on this
college campus and the glasses had these little speakers in
them and it would read signs to me from across
the quad. It could tell me if there was a
lamp pole or a bicycle in my path. Basically, just

(18:35):
a whole bunch of things that would help somebody to
lead an independent life. So if someone ever tells me
that AI is useless, I've got to stop them right there, because,
even just from my very limited peek into what this
can do for people, it can be life changing. But
I don't think that reason is why smart glasses are
taking off now in a way that Google Glass didn't.

(18:57):
It's not just the technology, it's the timing. We're in an
age now where people are filming each other all the time,
whether it's making TikToks on their phone or surveilling their
neighbors with their Ring camera. We're used to seeing cameras
outside in a way that we just weren't in twenty thirteen.
So it sort of makes sense that even though these

(19:18):
things are conceptually very similar to Google Glass, the Meta
Ray-Bans are being more widely adopted now.

Speaker 1 (19:26):
I've seen them in the wild quite a bunch lately,
and it's from non-techies. Like, it makes sense to
me if I go to CES and there's a bunch
of people wearing them. Makes sense if I'm at a
press conference and there's a lot of influencers around. But
I was just out walking, I was seeing a friend
that I hadn't seen in ten years, and I looked
at them and I was like, are those Meta Ray-Ban
smart glasses? And they're like, oh yeah, I love them.

(19:47):
I love to take concert footage with them.

Speaker 4 (19:49):
Yo.

Speaker 1 (19:50):
And you know, I was at a concert in June.
I turned to my side. There's a girl, she has
Meta Ray-Bans, and I find her TikTok later that
day from that concert with the footage taken from these glasses.
So to your point, that example has become real. It
is a thing that the technology has finally caught up,
and it's a screenless form factor, and it's so much

(20:10):
more affordable. It's around like two hundred to three hundred dollars.
And here's the other thing that I think is kind
of a factor in it is that it lets you
use your phone less. And phones are almost twenty years
old at this point, and we're, culturally speaking, kind of fatigued.
There's so many products out there about helping you focus more,

(20:33):
putting down your phone, locking it, graying the screen so
that you're not like glued to this device anymore. And
so the proposition now with wearables is that it enables
you to do that. You can still stay connected, you
can still get your notifications and be reachable, but you
can triage it and you don't have to look at
your phone quite as much. The reason why I knew

(20:55):
that these would not just fall away and be completely
Google Glass two point zero was my spouse, who has
become a Luddite. They absolutely abhor my job and the
wearable technology that I test, and they were like, I'm
gonna cop me a pair of those.

Speaker 3 (21:13):
Really? The Ray-Bans?

Speaker 1 (21:14):
They got their own pair. Yeah, they got their own
pair of Ray-Bans. That is their main pair of
glasses that they wear every day. They love cars, so
they go walking on the street and they can go, oh, hey Meta,
what model car is that and like in that instant
not have to pull out their phone and look all
that stuff up. That was compelling enough for them, and
the price point was good enough, and the look of

(21:35):
the glasses was not dorky. It was just a complete
like whoa moment for me.

Speaker 3 (21:42):
On top of the timing and the price, the look
of the Meta Ray-Bans is probably another really big
factor of why these are working in a way that
Google Glass never did, because you don't have to consciously
sign up to be, as Victoria called it, an ambassador
for the future, when you wear these things. And in retrospect,
that's probably another place where Google Glass kind of messed up.

(22:05):
They were relying on early adopters to be ambassadors and
hoping that they would get evangelists that would make everyone
else want to jump on board. Instead, they got Sarah Slocum,
the young lady who got her glasses ripped off of
her face. But you don't need a single ambassador. If
the product just looks like something that's already out there,

(22:25):
everyone is already an ambassador. You're just joining the club.

Speaker 1 (22:29):
That's one thing that Meta hit right on the nose.
They partnered with EssilorLuxottica, which is the biggest
eyewear brand in the world; it has a bunch of different
brands under an umbrella, including Oakley, which they just released
a pair of smart glasses with under the Oakley branding,
and Ray-Ban, which is iconic worldwide. So to be able
to have those fashionable brands and to say, here you go,
to have those fashionable brands and to say here you go,

(22:52):
you're gonna look, if not stylish, normal in them, it
was a huge thing for smart glasses to go truly mainstream.
They have to be good looking because humans are vain creatures.
We are constantly obsessed with what's going to make us
look good. And I don't care if you have the
coolest piece of tech on the planet, that's the most convenient.

(23:14):
You're putting it on your face, on your eyes, all right.

Speaker 3 (23:17):
So some of y'all have listened this far and thought, Yo,
this sounds terrible. Anybody who wears these things is a
complete weirdo. But some of you are listening and thinking,
these actually sound kind of cool. You might be thinking
about getting a pair, or maybe you already have a pair.
So here we are again. We got to realize that
even though it felt like we societally rejected Google Glass

(23:40):
ten years ago, maybe all we did was just kick
the can down the road. We just put off making
a decision. Maybe when Sarah Slocum said that we'd all
be wearing these, she wasn't wrong. She was just a
little too early, Which brings us back to the TikToker
getting a Brazilian wax, the question of how much privacy

(24:00):
do we really want?

Speaker 1 (24:03):
We got a listener question. It was a spicy question
from a Vergecast listener, where they were like, is
it okay for me to wear it while I have
intimate relations with my wife? And I was like, Oh,
that's that is genuinely a question that we have to
grapple with if these are to become a mainstream piece
of technology. Is like, when is it okay to wear

(24:26):
these devices? What conversations are you supposed to have when
you're wearing these devices? Is it just on vibes? Are
we gonna... are we treating them like smartphone cameras? Right?

Speaker 3 (24:35):
Right?

Speaker 1 (24:36):
My contention is that with a smartphone, you know when someone's
recording, because they hold it a specific way. There's just
kind of like a body language that goes with holding
a camera and recording. But because of the form factor
being so discreet, which is a benefit in many ways
for content creators who are trying to do first person

(24:56):
point of view. But at the same time, is someone
just adjusting their glasses in a specific way going to
start looking like you're recording something? Even if you don't
have smart glasses. You know, the Meta Ray-Bans do
have an LED indicator light, which tells you when
someone is recording. In very bright outdoor lighting, I would
say most people would not notice it, and that is

(25:18):
a thing that I test for every single iteration that
comes out.

Speaker 3 (25:23):
Living in twenty twenty five means you know that you
could be recorded at any time, but these glasses add
another layer. When someone pulls out a phone and starts recording,
at least you have a visual indication that it's happening.
But what happens when you don't know you're being recorded?
That's after the break. So if you're outside while you're

(25:49):
listening to this, maybe you're shopping, maybe you're
going for a run, taking a walk in the park.
I want you to enjoy this moment. Just take it
in. Seriously, because it might not be that much longer that
you can do this privately, without having another person looking
you up. What I mean by that is that the
privacy issue of the Meta Ray-Bans goes beyond just

(26:10):
being recorded without your consent. Last year, a couple of
Harvard students were able to combine the recording and the
AI functionality of the Meta glasses to dox people in real
time as they walked by. We built glasses that let
you identify anybody on the street. Cambridge Community Foundation. Oh hi, ma'am,

(26:31):
but are you... that's... Seem? Yes? Oh okay, I
think I met you through, like, the Cambridge Community Foundation,
right? Yeah, yeah, yeah. It's great to meet you though. I'm...

Speaker 1 (26:40):
Terrifying, just absolutely terrifying. And they are college students
who were able to put that together. They just were
able to jerry-rig something. And ironically, several months later,
they are now coming out with their own smart glasses product.
So it's just kind of, it's a whole reversal
of just, you know, on the one hand, they were

(27:00):
raising awareness, like, oh my god, this could be abused,
but also, we have a product now. Cool.

Speaker 3 (27:05):
To be fair, this new smart glasses product that they're
selling is not the facial recognition lookup that
they demoed on campus. Their pitch is that their new
glasses go further than Meta's glasses. Meta's glasses just turn
on and record when you tell them to. Their product
will always be on. One of the founders told the
magazine Futurism that quote, we aim to literally record everything

(27:27):
in your life, and we think that will unlock just
way more power to the AI to help you on
a hyper-personal level, end quote. And their glasses won't
have an indicator light that tells you it's recording, because again,
it's always recording. This could bring up some legal issues,
which by the way, are issues that other recording wearables

(27:48):
are probably also going to run into at some point.

Speaker 1 (27:51):
Some people live in two party consent recording states, so
you have companies making tech that could be illegal in
some respects.

Speaker 3 (28:00):
Yeah, in California specifically, both parties have to consent to
being recorded.

Speaker 1 (28:06):
So I live in New Jersey and work in New York,
which are both one party consent states. So technically I
can walk out into public spaces with the thing on
and it's recording and I don't require anybody else's consent.
But if I'm going to California, is it okay for
me to wear an always-on recording device while I'm
on public transit? It suddenly becomes a very strange, murky

(28:30):
gray area. If you look at Meta's privacy policy for
the smart glasses, what they say is the best practices
for using these devices out in the world. And they
can basically say, hey, we published this, we've told people,
don't be a jerk, we're good, right? And when people
inevitably are jerks using their technology, their defense is that they say, well,

(28:52):
we never intended them to use it that way. But
you can think about AirTags in that respect. Apple
came out and were like, we made this incredibly convenient
device for you, and a small bunch of bad apples
are going to use it to stalk people and use
it in ways that we absolutely didn't intend. But in
the fine print, we're going to say that that's illegal.

(29:14):
We don't condone that. Legally, we're scot-free.

Speaker 3 (29:18):
So if you're not familiar with these, Apple AirTags are
these small tracking devices that you can attach to your
keys or your wallet to keep track of them. Products
like this existed before Apple's version, but Apple just made
them more convenient, which made them more mainstream, and after
the product became more mainstream, the obvious bad things you
can do with this technology also became more mainstream. People

(29:39):
started sneaking AirTags into people's purses or attaching them
to their cars so they could track them and stalk them.
So Apple did make a notification system that would alert
you if an unknown tracking device is following you, but
you would only get that notification if you also had
an Apple iPhone. Months later, they did put something out
for Android, but even then you had to know how

(30:01):
to use it. And as cases of people being stalked
kept hitting the news, they'd keep making modifications, like making
the AirTag beep more if it's away from its owner
for long enough. But of course people started working on
how to disable that. There was another solution for this:
stop with the software updates and just cancel the product,
pull it off the shelves, but Apple didn't do that

(30:22):
because AirTags are really popular. People really like being
able to find their lost stuff. In the case of
the Apple AirTags and now for the Meta Ray-Bans,
the trick seems to be to find enough consumers for
whom the product is indispensable, people who think that the
benefits outweigh the risks.

Speaker 1 (30:42):
I think this is where the crux of all
of this lies: AirTags are so
convenient that most people, the vast majority of people, will
be like, yeah, I'm good with that, because AirTags
are so convenient, and I'm not the bad apple using
it in that way. It's like, for what use case
will it be so convenient that you are

(31:02):
willing to overlook the dystopian nightmare that comes along with it.

Speaker 3 (31:07):
I mean, I'm feeling like, even if the vast majority
of people with the Meta Ray-Bans or any of
the smart glasses use them in very responsible ways, just
the fact that it's out there is going to alter
how we can walk around in society. Period.

Speaker 1 (31:23):
No, you're absolutely right. Like, in testing these devices, I
don't speak to myself as much as I used to
because I wore one of these devices into a bathroom.
I commented on my bowel movement and it recorded it,
and then it generated a to-do for me to
have Lactaid again. And I was like, this is the
rudest thing that's ever happened to me. But also, holy crap,

(31:45):
this is our dystopian future because when the AI is
in your glasses, when the AI is in a pendant that sits
around your neck, when it's in your smart watch, when
it's on you twenty four to seven, and you just
have an unfiltered thought that you speak aloud to yourself,
well suddenly you're not the only one listening to your
own thoughts. It's an AI that's listening to your thoughts

(32:05):
and it's going, ooh, they mentioned that, maybe this is
a thing that I will write a

Speaker 3 (32:09):
to-do list for. And that sounds like a convenient application.
And there's that word again, right, convenient. You say something
out loud and your smart glasses remember it for you,
but it's not remembering it for you. It's also remembering
it for the company and for the advertisers. You blurt
out something unconsciously about your head hurting, or you're around
somebody else as they're talking about having a headache, and

(32:31):
next thing you know, you're getting a bunch of targeted
ads for a very specific brand of headache medicine that
is paid to be at the top of the list
for the demographic of twenty five to thirty five year
old women who like cold brew coffee, live in urban environments,
and like techno music.

Speaker 1 (32:46):
If we want to get super dystopian about it, we
live in the engagement economy, right. The engagement economy requires
the constant feed of data and personalization and all that
sort of stuff. What better AI training tool do you
have than a wearable? I sit up at night and
I think about it, and I was like, legitimately, we

(33:06):
started out just tracking our steps. Now it's your heart rate.
They're working on finding ways to tell you if your
blood pressure is high or low, if your blood glucose
is high or low. So they're looking at how they can
feed a crap ton more data so that we can know
more about you.

Speaker 3 (33:22):
Give you more ads, Give.

Speaker 1 (33:23):
You more ads, personalized experiences like this is just my
I don't know if you've seen the meme of Charlie
from It's Always Sunny with the red string on the board.
This is my conspiracy theory, yes, of where wearables are going.

Speaker 3 (33:36):
I don't think that's a conspiracy theory. I think it's
a very reasonable thing to assume: the more data that's
collected about you, the more it can be used to
show you ads that you will not scroll past. Yeah,
you've got kind of a preview of what society
looks like, because you're around the tech-type people

(33:57):
all the time. What does a society look like when
we know that there's a good chance that somebody around
you is recording everything they're seeing?

Speaker 1 (34:10):
I think it's a much more self conscious society. I
have become someone who when I'm out and about, I
am scanning the glasses that people wear to see if
there's cameras on them. I had a kind of a
unique upbringing. There's a question of whether my dad was
a North Korean spy or not, and whether we were under
surveillance at any given point in time, and so I

(34:30):
grew up always thinking my life is kind of public.
I have to perform as if I'm always being watched.
So I kind of grew up with that my whole life.
But it's a heavy thing to grow up with, and
I think, you know, a lot of people are privileged
and blessed to not grow up performing in that way,

(34:52):
as if there's an invisible movie-set camera on you
at all times. But I think that is just going
to be a reality that everyone starts to do. You
start to become a lot more conscious of your actions
in public. You start to become conscious of the spaces
in your house that are truly private. You know. I
say to people all the time that the only truly
private place you have in this world is the inside

(35:15):
of your head. Like, that's kind of dystopian when you
say that. But living the life that I do, testing
the products that I do, having the upbringing that I had,
I unfortunately think I am well equipped to tell people
that this is what's coming. I think we're all going
to have to live our lives as if we're many
celebrities out in public at a given point in time,

(35:37):
and that the paparazzo could come for you. And there's
degrees of that. Not all of us are Timothée Chalamet
living out here having to wear caps and disguises. But
to a degree, I do think we're all going to
be living very public.

Speaker 3 (35:51):
Lives, whether or not we want to, whether or not we.

Speaker 1 (35:54):
Want to, I think we are all in some way
going to be living as public figures. So I think
as a society going forward, we really have to think
about what are truly private spaces, and like, what truly
private spaces we want to protect, because it's a human
need to have that privacy, and I don't think that
it should be given up for whatever convenience. Like, however,

(36:18):
tempting the convenience is, sometimes the inconvenience is necessary.

Speaker 3 (36:23):
It seems like this is another one of those conversations
that we're having as a society, where the outcome seems predetermined,
which is to say, these are eventually going to be
adopted by everyone. It's just a matter of time. This
isn't a decision that you as an individual get to make. Look,

(36:44):
you don't have to like these glasses, but they're going
to hit mainstream adoption. Everybody's gonna be wearing them, and
so you can choose not to buy them if you
don't want to, but you cannot choose to live in
a world that doesn't have them, and it feels like
that is where we're at right now. Do you think
we're there?

Speaker 1 (37:02):
I think you're spot on, because I think the sales
prove it. Google believes that the zeitgeist is strong enough
for them to be like, hey, guys, smart glasses, let's
get back in on that and rebrand ourselves as the
people with the most experience in the space. So at
the end of last year, they said, put me back
in, coach. At the end of last year,

(37:24):
they legitimately launched Android XR, which is going to be
their platform for smart glasses and headsets.

Speaker 3 (37:30):
Why would you say now is the right moment to
launch XR.

Speaker 2 (37:33):
I think now is the perfect time to work on
XR because you have a convergence of all these technologies.

Speaker 3 (37:39):
We've been in this space since Google Glass, and we
have not stopped.

Speaker 5 (37:43):
We can take those experiences, which already work great, and
find new ways to be helpful for people.

Speaker 1 (37:50):
Once Google is able to overcome their PTSD over
Google Glass to be like, we want back in on
the thing that people made the most fun of us for.
I think it is inevitable. I think you're spot on
about that.

Speaker 3 (38:06):
Thank you so much for listening to kill Switch. I
hope you dug it. If you want to connect with us,
we're on Instagram at kill switch pod, or you can
email us at killswitch at kaleidoscope dot NYC. And
you know, while you got your phone in your hand,
whatever, before you put it back in your pocket, maybe
wherever you're listening to this podcast, leave us a review.

(38:26):
It helps other people find the show, which in turn
helps us keep doing our thing. Kill Switch is hosted
by me, Dexter Thomas. It's produced by Shina Ozaki, Darlak Potts,
and Kate Osborne. The theme song is by me and
Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope,
our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate

(38:47):
Osborne. From iHeart, our executive producers are Katrina Norvell and
Nikki Ettore.

Speaker 1 (38:53):
Oh.

Speaker 3 (38:53):
One more thing. So there's that clip that we played
of the Dragon Ball scouter exploding, and maybe you were
wondering why it exploded, or if there's any scientific basis
for why a scouter would explode when the power levels
are too high. Okay, maybe you weren't wondering that, but
I was. And it turns out that the official Dragon
Ball site published an article that kind of explains it,

(39:14):
featuring an interview with a professor in the engineering department
at Miyazaki University who talks about using AI headsets to
measure the weight of pigs. I promise you I'm not
making this up. I'll leave that in the show notes. Anyway,
We'll catch you on the next one.

Speaker 4 (39:42):
Goodbye.