
September 17, 2025 39 mins

Wearable tech is having a moment – after the implosion of Google Glass back in 2013, which faced backlash and ridicule, we’re now readily embracing wearables from rings to AI pendants to new smart glasses in the form of Meta Ray-Bans. What’s behind the new fervor for wearables today, and have we moved on from the privacy and surveillance questions that plagued Google Glass? Dexter talks to Victoria Song, a senior reviewer at The Verge whose job it is to test out each new iteration of this technology, about the state of wearables today, why companies are obsessed with getting AI into them, and how they’ve already changed how we talk to each other, and ourselves, IRL.

Got something you’re curious about? Hit us up at killswitch@kaleidoscope.nyc, or @killswitchpod, or @dexdigi on IG or Bluesky.

Read + Watch: 

Victoria’s smart glasses review on The Verge: https://www.theverge.com/2025/1/10/24340208/ces-2025-smart-glasses-rokid-halliday-xreal-vuzix-nuance-audio

“How the low-vision community embraced AI smart glasses” from The Vergecast: https://www.youtube.com/watch?v=pgu0a9QK75E

The Daily Show segment, “Glass Half Empty”: https://www.youtube.com/watch?v=ClvI9fZaz6M

“Why Do Scouters Explode?”: https://en.dragon-ball-official.com/news/01_1666.html

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
There's this TikTok that somebody posted where she was getting waxed.
I had a Brazilian wax done about three weeks ago,
and it's been haunting me ever since.

Speaker 2 (00:22):
The girl that was giving.

Speaker 1 (00:25):
Me the wax, she was wearing Meta glasses and she's like,
they're not charged, they're not on, like I promise. Wearable
technology is having a moment. There's been this wave building
for a while where the stuff that seemed like science fiction,
like having a computer on your wrist, is completely mainstream now,

(00:48):
and now the next step of smart glasses is starting
to get there too. The best example of this is
Meta's ray bands. These glasses blend in so well that
people are starting to wear them as their normal, everyday
prescription glasses, with the added benefit of being able to,
among other things, record audio and video. And this brings

(01:09):
in a lot of questions about privacy. These aren't new questions.
We've had well over a decade to think about this,
but it's got to be uncomfortable to have this societal
quandary about surveillance thrown at you when you're not wearing
any pants. I could not stop thinking like, could this
girl be filming me right now? Like could she be filming?

Speaker 3 (01:34):
Meta will tell you that they have an LED indicator
light that goes on when you are recording. But someone's
going to figure out a way to hack that.

Speaker 1 (01:43):
Victoria Song is a senior reviewer at The Verge.

Speaker 3 (01:46):
Someone's going to figure out a sticker to buy so
that you can put it over the case. Like what
is going to be the social etiquette going forward for
these glasses? Are they going to be banned from certain professions?

Speaker 1 (01:58):
Victoria's been testing and reporting on wearables for over
a decade, so she's seen the technology evolve from fitness
trackers that count your steps to rings that track your
heart rate, to smartwatches, pendants, and glasses. But this
time feels a little different.

Speaker 3 (02:14):
For whatever reason, the big tech powers that be have
convened to decide.

Speaker 4 (02:18):
That wearables are it?

Speaker 3 (02:20):
Wearables are how we're going to get AI to really
take off, Like we're gonna have AI leave the computer
and be on your body, be on you as a person,
and you're going to take AI with you in every
aspect of your life. Is that the future that we're
heading towards? Is that the future we want to head towards?
These are the conversations that we're going to have to
start having.

Speaker 1 (02:50):
Kaleidoscope and iHeart Podcasts. This is Kill Switch.

Speaker 5 (02:57):
I'm Dexter Thomas, I'm staring, I'm good bye.

Speaker 1 (03:41):
This isn't the first time we've run into questions about
privacy and smart glasses. Most people first started thinking about
this a little over a decade ago, in twenty thirteen,
with the release of Google Glass. When I first started
thinking about wearables as maybe something that I might be
interested in, it was Google Glass.

Speaker 4 (04:01):
Oh yes, Google Glass.

Speaker 1 (04:03):
I see your face. So we obviously got to talk
about Google Glass. When you first heard about Google Glass,
what did you think about it?

Speaker 3 (04:11):
Ah? It was sort of straight out of science fiction.
There was this marketing sizzle reel showing what Google Glass
was supposed to be.

Speaker 6 (04:20):
You're riding right there. OK Glass, take a picture.

Speaker 3 (04:26):
OK Glass. And you know, it's an augmented reality device
that shows you notifications of stuff that you need based
on the world around you in real time. It's a total

(04:48):
game changer, if you really think about it. Could this actually work?
A lot of people probably were thinking of things like
Iron Man and Tony Stark's smart glasses. There's just a
lot of science fiction imagery that exists with heads up displays,
even though this technology actually dates all the way back
to World War Two with fighter pilots and heads up

(05:09):
displays in the windshields. If you were to really trace
the genesis of it, it's a really old concept in technology.
But I think Google Glass was the first that kind
of plucked it from science fiction into the realm of like, hey,
we have a product and it's like, I think, what
was it, like fifteen hundred dollars somewhere in that ballpark,
and you can buy it if you have the money,

(05:30):
and people did wear the Google Glass Explorer Edition out
into the world, and it was like a really surreal
thing that happened.

Speaker 1 (05:39):
Google Glass was not subtle. It barely looked like glasses.
It was this bar that went across your forehead and
sat on the bridge of your nose, and instead of lenses,
there was this metal square thing and this inch long
transparent camera that stuck out over your right eye.

Speaker 3 (05:56):
It reminded me of the Dragon Ball Z scouters that
Vegeta uses, the power level scouters. Yeah, it reminded me kind
of like that because there wasn't actually like glasses for
people to see through. What does the scouter say about
his power level? It's over nine.

Speaker 1 (06:17):
Nine thousand.

Speaker 6 (06:19):
There's no way.

Speaker 1 (06:20):
That can't be right. Yo, I hadn't thought about that.
It looks like the Dragon Ball Z, but the scouter, that,
oh my god.

Speaker 5 (06:30):
Yeah, you know.

Speaker 1 (06:32):
Google Glass really leaned into the science fiction aesthetic of it, right?
It wasn't supposed to blend in. You were supposed to
stand out when you wore it.

Speaker 4 (06:42):
Yeah, it was.

Speaker 3 (06:42):
Very distinctive when you saw someone wearing it. You'd see
it and you'd immediately clock it.

Speaker 1 (06:48):
People would immediately clock it, and in a lot of
cases they'd make fun of it, not just because of
surveillance, because they didn't want to be recorded. Sometimes the
criticism was just more surface level and basic, like why
would you want to wear that thing on your face
and show everybody that you're a weirdo who records everything.
Back when it came out, the Daily Show had a

(07:09):
whole segment that was making fun of it. Our nation
has long been haunted by discrimination, and while we've made
great strides over the years overcoming the challenges, there are
still those that suffer from the barbs of injustice. I
was denied admission and service at various establishments. I was
mugged in the Mission District.

Speaker 2 (07:27):
I was asked to leave a coffee shop and the
reason why was because we don't do that.

Speaker 5 (07:32):
Here.

Speaker 1 (07:33):
Hold on a second. What the... have you all got
on your faces?

Speaker 6 (07:37):
Google Glass? Google? What?

Speaker 1 (07:39):
Google Glass? Google? And that's what this is all about.

Speaker 2 (07:43):
Yes, yes, it seems even in this day and age,
you can still be treated differently just because of how
you look wearing a fifteen hundred dollars face computer.

Speaker 3 (07:55):
So you know, that's why it caused the controversies that
it did. I think the most famous example was just
a woman wearing Google Glass while she was out
in San Francisco and someone ripping it off her face.

Speaker 2 (08:12):
It's on video now.

Speaker 1 (08:18):
Sarah Slocum was a social media consultant and she was
wearing Google Glass at a bar in San Francisco.
She was showing some friends how Google Glass works, and
some other people at the bar saw it and got
mad at her and started yelling at her because they
didn't want to be recorded, So she started recording a
video of them yelling at her, which maybe is kind

(08:39):
of ironic, but maybe also a preview of what the
future was going to look like. So it's kind of
hard to tell because everyone's yelling, but there's a moment
in the footage where you can hear someone say, get
your Google glasses out of here. They start yelling back
and forth, and she also flips them off and again
this is all on the video that she's recording, and

(08:59):
then someone grabs the glasses and rips them off of
her face. Sarah Slocum was featured in the local news
about this, and she also told her story in a
Daily Show segment about what happened.

Speaker 6 (09:10):
I was at a bar and people started verbally accosting me.
They started getting physical immediately when I started recording. They
ripped them off my face basically, and they ran outside.
It was a hate crime. The only thing is that
they're going to be wearing these things probably in a year.

Speaker 1 (09:27):
I'm going to reserve my comments about calling this a
hate crime, and instead I'm just going to move on
and say that Sarah Slocum's prediction was incorrect.
Pretty much nobody was wearing these things in a year.

Speaker 3 (09:40):
When you looked at it, you couldn't help but wonder,
oh was this person recording me? And I mean, there
was a recording light and all of that stuff. But
it really kind.

Speaker 4 (09:48):
Of forced the whole sci fi.

Speaker 3 (09:50):
Aspect of it into current everyday life in a way
that I don't think society was really prepared for at
that point in time.

Speaker 1 (09:59):
So I also remember when Google Glass was announced and
actually thinking, oh, you know what, this would actually be
kind of cool. My thought was I could record concerts
or something like that. That might be kind of cool,
because using your phone wasn't really a great experience either.
You're in people's way. So, I go to shows by artists

(10:20):
who I know. Sometimes they ask me to film their sets,
but it's kind of weird being a guy up on
the stage with the camera. But I remember going from
being kind of interested in the Google Glass to realizing, oh,
I think maybe people hate this, and then it was gone.
That life cycle of it was so fast. Did it

(10:42):
feel like that for you?

Speaker 3 (10:43):
Oh, it was absolutely really fast. And there's a lot of
factors that go into that. One, the price was prohibitive
relative to the things that it could actually do. Like, the
video that they showed was truly sci-fi revolutionary, but it
couldn't actually deliver on that, right? So it was much more
limited in what it could actually do. And then it

(11:04):
was so ostentatious when you wore it that there's no
way of being discreet, right? So you're basically becoming an
ambassador of future technology for the average person. Do you
want to have people come up to you on the
street and be like, Oh, what is it that you're wearing?
Are you filming me or you recording me? Nah, you're
out here just trying to have a nice time. And
you know, in twenty thirteen, we were not in the

(11:25):
TikTok era. We were not in an era where everyone
is used to people just whipping out their phones and
filming life. The definition of a public space and a
private space was different ten years ago than it is now.
I think Google Glass was like a really unfortunate example
of a good idea, or at least like a substantive

(11:46):
idea and really poor timing.

Speaker 1 (11:49):
And of course there was the phrase that went along
with anybody who had it, whether or not they were
doing anything wrong. Glasshole.

Speaker 3 (11:55):
Yeah, glasshole. I kind of feel for Google because they're.

Speaker 4 (11:59):
Never going to live that down.

Speaker 1 (12:02):
I don't know if I've ever talked to anybody who
said they feel bad for Google. This is a new one.

Speaker 3 (12:07):
It's one of those things where it's like, I see
the vision, you know, pun intended. I see you really
wanted to go with that. I don't think it was
necessarily a bad idea, but you kind of made some
choices that were not in step with reality, and now
you have to live with the fact that you coined
the term glasshole, like, indelibly, forever, in perpetuity.

Speaker 1 (12:32):
Google Glass was a massive failure for Google. They did
try to salvage it by selling to companies where maybe
there would be some industrial uses for it. The idea
was that this whole thing about privacy and people feeling
uncomfortable wouldn't really matter if it's a worker in a
factory using Google Glass to check inventory or something. But
that didn't really catch on either, and Google basically gave

(12:54):
up on trying to push smart glasses. It looked like
as a society, we just didn't want it. But now
all of a sudden, there's been a resurgence in wearables
and specifically in smart glasses, and all those same questions
are coming back up. So maybe saying that society didn't
want it is the wrong framing. Maybe it's just that

(13:16):
we didn't want it yet. So, are we ready for
smart glasses now? That's after the break. Where are we
right now with wearables, and specifically with smart glasses?

Speaker 4 (13:36):
Right now, I.

Speaker 3 (13:37):
Would say we are entering the age of AI hardware.
There are these wearable pendants that are always on you
at any given point in time, listening to every single conversation
you have, acting as your quote unquote second memory. That's
one vein of wearable AI hardware. The other is things
like headphones and smartwatches, existing popular form factors, and

(14:00):
then kind of coming out of left field again, we
have smart glasses, specifically the Ray-Ban Meta smart glasses
emerging as a kind of third track of AI hardware.
You know, I wrote about the Meta smart glasses and
I had their comms people reach out to me and
be like, actually, could you call it AI glasses because

(14:22):
this is how closely tied together their thesis of
AI and glasses really is. Like, their CTO, I believe Andrew
Bosworth, has basically gone on the record saying, like, this
is how we get AI to really take off, and
it's smart glasses. The ideal hardware form factor

(14:43):
for AI going forward will be these smart glasses.

Speaker 1 (14:47):
This is kind of interesting because if you look on
some corners of social media, you'll see a lot of
people get mad about AI features being added to products
that really would be fine without it. But tech companies
are still obviously trying to put AI into everything anyway,
and in wearables they seem to be betting on us
accepting it.

Speaker 3 (15:07):
People have had ChatGPT, Claude, all of these AI
assistants on their computers for a while, and a lot
of us have come to the same conclusion that AI
can be incredibly dumb. But if it's on you, and
from all the conversations that I've been having with AI
startups and with big tech investing in the future of AI,
if it's on you, that's a different proposition.

Speaker 1 (15:30):
And with the Meta Ray-Bans specifically, the AI is
sort of a bonus feature. It's a way for Meta
to get people accustomed to AI without explicitly having to
force it on them.

Speaker 3 (15:40):
These are functional devices where you don't have to use
the AI. You never have to say Meta AI, you
don't ever have to use that wake word, but it's there.
If you're curious, you could use it. And because there
are other reasons why you would use these glasses besides AI,
it becomes a very easy entryway, a door for

(16:01):
Meta to open and say, hey, you're using it for
these purposes, wouldn't it be really great if you wanted
to use it sort of like how you use Siri,
but it actually can do more? And why don't you experiment
with these actually very useful use cases. And that's why
these particular glasses have actually caught on like wildfire in

(16:22):
the blind and low vision community because the AI features
genuinely have created a game changing, life changing technology for them.
Like I've spoken with a bunch of blind and low
vision users who tell me that the Meta glasses enable
them to live more independent lives. They have this live

(16:42):
AI mode that allows you to just ask the AI what's.

Speaker 4 (16:46):
Going on in my surroundings. It can read a.

Speaker 3 (16:48):
Menu for you, and for a sighted person, that might
be like, oh my god, why do I need an
AI to read a menu for me, unless it's in
a different language and translating it. But for a person
who is low vision or blind, just
asking an AI, hey, you know my kitchen is really messy.
Where is this particular appliance? And having the AI be

(17:11):
able to tell you, to save you some emotional labor,
to save you some of the toll of having to
ask a living person to see your.

Speaker 4 (17:19):
Very messy kitchen.

Speaker 3 (17:20):
It provides an actual life changing service for that particular community.

Speaker 1 (17:26):
Victoria is not kidding. I know someone who's blind and
they let me use their smart glasses once and it
was genuinely amazing. So we were walking around on this
college campus and the glasses had these little speakers in
them and it would read signs to me from across
the quad. It could tell me if there was a
lamp pole or a bicycle in my path. Basically just

(17:47):
a whole bunch of things that would help somebody to
lead an independent life. So if someone ever tells me
that AI is useless, I got to stop them right there,
because even just from my very limited peek into what
this can do for people, it can be life changing.
But I don't think that reason is why smart glasses
are taking off now in a way that Google Glass didn't.

(18:10):
It's not just the technology, it's the timing. We're in an
age now where people are filming each other all the time,
whether it's making TikToks on their phone or surveilling their
neighbors with the Ring camera. We're used to seeing cameras
outside in a way that we just weren't in twenty thirteen.
So it sort of makes sense that even though these

(18:30):
things are conceptually very similar to Google Glass, the Meta
Ray-Bans are being more widely adopted now.

Speaker 3 (18:38):
I've seen them in the wild quite a bunch lately,
and it's from non-techies. Like, it makes sense to
me if I go to CES and there's a bunch
of people wearing them. Makes sense if I'm at a
press conference and there's a lot of influencers around. But
I was just out. I was seeing a friend
that I hadn't seen in ten years, and I looked
at them and I was like, are those Meta Ray-Ban
smart glasses?

Speaker 4 (18:57):
And they're like, oh yeah, I love them.

Speaker 3 (18:59):
I love to take concert footage with them, yo. And
you know, I was at a concert in June. I
turned to my side. There's a girl, she has Meta
Ray-Bans, and I find her TikTok later that day
from that concert with the footage taken from these glasses.
So to your point, that example has become real. It
is a thing that the technology has finally caught up,

(19:21):
and it's a discreet form factor and it's so much
more affordable. It's around like two hundred to three hundred dollars.
And here's the other thing that I think is kind
of a factor in it: it lets you
use your phone less. And phones are almost twenty years
old at this point, and we're culturally speaking, kind of fatigued.

(19:41):
There's so many products out there about helping you focus more,
putting down your phone, locking it, graying the screen so
that you're not like glued to this device anymore. And
so the proposition now with wearables is that it enables
you to do that. You can still stay connected and
still get your notifications and be reachable, but you can

(20:03):
triage it and you don't have to look at your
phone quite as much. The reason why I knew that
these would not just fall away and be completely Google
Glass two point zero was that my spouse has
become a Luddite. They absolutely abhor my job and the wearable
technology that I test, and they were like, I'm gonna cop.

Speaker 4 (20:24):
Me a pair of those.

Speaker 1 (20:25):
Really, the Ray-Bans?

Speaker 3 (20:26):
They got their own pair. Yeah, they got their own
pair of Ray-Bans. That is their main pair of
glasses that they wear every day. They love cars, so
they go walk on the street and they can go, oh, hey Meta,
what model car is that, and like in that instant
not have to pull out their phone and look all
that stuff up. That was compelling enough for them, and
the price point was good enough, and the look of

(20:47):
the glasses was not dorky. It was just a complete,
like whoa moment for me.

Speaker 1 (20:55):
On top of the timing and the price, the look
of the Meta Ray-Bans is probably another really big
factor of why these are working in a way that
Google Glass never did, because you don't have to consciously
sign up to be, as Victoria called it, an ambassador
for the future when you wear these things, and in retrospect,
that's probably another place where Google Glass kind of messed up.

(21:17):
They were relying on early adopters to be ambassadors and
hoping that they would get evangelists that would make everyone
else want to jump on board. Instead, they got Sarah Slocum,
the young lady who got her glasses ripped off of
her face. But you don't need a single ambassador if
the product just looks like something that's already out there,

(21:37):
everyone is already an ambassador. You're just joining the club.

Speaker 3 (21:42):
That's one thing that Meta hit right on the nose.
They partnered with EssilorLuxottica, which is the biggest
eyewear brand in the world, has a bunch of different
brands under an umbrella, including Oakley, which they just released
a pair of smart glasses with under the Oakley branding,
and Ray-Ban, which is known worldwide. So to be able to

(22:02):
have those fashionable.

Speaker 4 (22:03):
Brands and to say here you.

Speaker 3 (22:05):
Go, you're gonna look, if not stylish, normal in them,
it was a huge thing for smart glasses to go
truly mainstream. They have to be good looking because humans
are vain creatures. We are constantly obsessed with what's gonna
make us look good, and I don't care if you
have the coolest piece of tech on the planet, that's

(22:25):
the most convenient. You're putting it on your face, on
your eyes, all right.

Speaker 1 (22:30):
So some of y'all have listened this far and thought, Yo,
this sounds terrible. Anybody who wears these things is a
complete weirdo. But some of you are listening and you're thinking,
these actually sound kind of cool. You might be thinking
about getting a pair, or maybe you already have a pair.
So here we are again. We got to realize that
even though it felt like we societally rejected Google Glass

(22:52):
ten years ago, maybe all we did was just kick
the can down the road. We just put off making
a decision. Maybe when Sarah Slocum said that we'd all
be wearing these, she wasn't wrong. She was just a
little too early. Which brings us back to the TikToker
getting a Brazilian wax, and the question of how much privacy

(23:13):
do we really want?

Speaker 3 (23:15):
We got a listener question. It was a spicy question
from a Vergecast listener where they were like, is
it okay for me to wear it while I have
intimate relations with my wife? And I was like, oh,
that's that is genuinely a question that we have to
grapple with if these are to become a mainstream piece
of technology. Is when is it okay to wear these devices?

(23:39):
What conversations are you supposed to have when you're wearing
these devices? Is it just on vibes? Are we going
to treat them like smartphone cameras? Right?

Speaker 5 (23:48):
Right?

Speaker 3 (23:48):
My contention is that with a smartphone, you know when someone's
recording because they hold it a specific way. There's just
kind of like a body language that goes with holding
a camera and recording, but because of the form factor
being so discreet, which is a benefit in many ways,
like for content creators who are trying to do first
person point of view. But at the same time, is

(24:12):
someone just adjusting their glasses in a specific way going
to start looking like you're recording something? Even if you
don't have smart glasses. You know, the Meta Ray-Bans
do have an indicator LED light, which tells you
when someone is recording. In very bright outdoor lighting, I
is a thing that I test for every single iteration

(24:33):
that comes out.

Speaker 1 (24:36):
Living in twenty twenty five means you know that you
could be recorded at any time, but these glasses add
another layer. When someone pulls out a phone and starts recording,
at least you have a visual indication that it's happening.
But what happens when you don't know you're being recorded?
That's after the break. So if you're outside while you're

(25:01):
listening to this, maybe you're out shopping, maybe you're
going for a run, taking a walk in the park.
I want you to enjoy this moment. Just take it
in, seriously, because it might not be that much longer that
you can do this privately without having another person looking
you up. What I mean by that is that the
privacy issue of the Meta Ray-Bans goes beyond just

(25:23):
being recorded without your consent. Last year, a couple of
Harvard students were able to combine the recording and the
AI functionality of the Meta glasses to dox people in real
time as they walked by. We built glasses that let
you identify anybody on the street.

Speaker 5 (25:38):
Cambridge Community Foundation.

Speaker 1 (25:42):
Oh hi, ma'am, are you... Sam? Yes.
Oh okay, I think I met you through, like, the
Cambridge Community Foundation, right? Yeah, yeah, it's great to meet you.

Speaker 5 (25:52):
Yeah, I'm Ken.

Speaker 3 (25:53):
Terrifying, just absolutely terrifying. And they are college students who
were able to put that together. They just were able
to jerry-rig it. And ironically, several months later, they are
now coming out with their own smart glasses product. So
it's just kind of, it's a whole reversal of,
just, you know, on the one hand, they were raising
awareness, like, oh my god, this could be misused, but

(26:15):
also, we have a product now. Cool.

Speaker 1 (26:17):
Now, to be fair, this new smart glasses product that
they're selling is not the facial recognition lookup that
they demoed on campus. Their pitch is that their new
glasses go further than Meta's glasses. Meta's glasses just turn
on and record when you tell them to. Their product
will always be on. One of the founders told the
magazine Futurism that quote, we aim to literally record everything

(26:39):
in your life, and we think that will unlock just
way more power to the AI to help you on
a hyper-personal level. End quote. And their glasses won't
have an indicator light that tells you it's recording, because again,
it's always recording. This could bring up some legal issues, which,
by the way, are issues that other recording wearables are

(27:01):
probably also going to run into at some point.

Speaker 3 (27:04):
Some people live in two party consent recording states, so
you have companies making tech that could be illegal in
some respects.

Speaker 1 (27:13):
Yeah, in California specifically, both parties have to consent to
being recorded.

Speaker 3 (27:19):
So I live in New Jersey and work in New York,
which are both one party consent states. So technically I
can walk out into public spaces with the thing on
and it's recording and I don't require anybody else's consent.
But if I'm going to California, is it okay for
me to wear an always-on recording device while I'm
on public transit? It suddenly becomes a very strange, murky

(27:42):
gray area. If you look at Meta's privacy policy for
the smart glasses, what they say is the best practices
for using these devices out in the world, and they
can basically say, hey, we published this, we've told people,
don't be a jerk, we're good, right? When people inevitably
are jerks using their technology, their defense is that, well,

(28:04):
we never intended them to use it that way. But
you can think about AirTags in that respect. Apple
came out and were like, we made this an incredibly
convenient device for you, and a small bunch of bad
apples are going to use it to stalk people and
use them in ways that we absolutely didn't intend. But
in the fine print, we're going to say that that's illegal.

(28:26):
We don't condone that. Legally, we're scot-free.

Speaker 1 (28:30):
So if you're not familiar with these, Apple's AirTags are
these small tracking devices that you can attach to your
keys or your wallet to keep track of them. Products
like this existed before Apple's version, but Apple just made
them more convenient, which made them more mainstream. And after
the product became more mainstream, the obvious bad things you
can do with this technology also became more mainstream. People

(28:52):
started sneaking AirTags into people's purses or attaching them
to their cars so they could track them and stalk them.
So Apple did make a notification system that would alert
you if an unknown tracking device is following you, but
you would only get that notification if you also had
an Apple iPhone. Months later, they did put something out
for Android, but even then you had to know how

(29:14):
to use it, and as cases of people being stalked
kept hitting the news, they'd keep making modifications, like making
the AirTag beep more if it's away from its
owner for long enough, but of course people started working
on how to disable that. There was another solution for this:
stop with the software updates and just cancel the product,
pull it off the shelves, but Apple didn't do that

(29:35):
because air tags are really popular. People really like being
able to find their lost stuff. In the case of
the Apple AirTags and now for the Meta Ray-Bans,
the trick seems to be to find enough consumers for
whom the product is indispensable, people who think that the
benefits outweigh the risks.

Speaker 3 (29:54):
I think this is where the crux of
all of this really lies: AirTags are
so convenient that most people, the vast majority of people,
will be like, yeah, I'm good with that, because AirTags
are so convenient, and I'm not the bad apple
using it in that way. It's like, for what app,
for what use case, will it be so convenient that you

(30:14):
are willing to overlook the dystopian nightmare that comes along
with it.

Speaker 1 (30:19):
I mean, I'm feeling like, even if the vast majority
of people with the Meta Ray-Bans or any of
the smart glasses use them in very responsible ways, just
the fact that it's out there is going to alter
how we can walk around in society, period.

Speaker 3 (30:36):
No, you're absolutely right. Like, in testing these devices, I
don't speak to myself as much as I used to
because I wore one of these devices into a bathroom.
I commented on my bowel movement and it recorded it,
and then it generated a to-do for me to
have Lactaid again, and I was like, this is the
rudest thing that's ever happened to me. But also, holy crap,

(30:57):
this is our dystopian future. Because when the AI is
in your glasses, when the AI is in a pendant that sits
around your neck, when it's in your smartwatch, when
it's on you twenty-four seven, and you just
have an unfiltered thought that you speak aloud to yourself. Well,
suddenly you're not the only one listening to your own thoughts.
It's an AI that's listening to your thoughts and it's

(31:18):
going, ooh, I mentioned that, maybe this is a thing
that I will generate a to.

Speaker 1 (31:22):
Do list for, and that sounds like a convenient application.
And there's that word again, right, convenient. You say something
out loud and your smart glasses remember it for you,
but it's not just remembering it for you, it's also remembering
it for the company and for the advertisers. You blurt
out something unconsciously about your head hurting, or you're around
somebody else as they're talking about having a headache, and

(31:44):
next thing you know, you're getting a bunch of targeted
ads for a very specific brand of headache medicine that
has paid to be at the top of the list
for the demographic of twenty five to thirty five year
old women who like cold brew coffee, live in urban environments,
and like techno music.

Speaker 3 (31:58):
If we want to get super dystopian about it,
we live in the engagement economy, right. The engagement economy requires
the constant feed of data and personalization and all that
sort of stuff. What better AI training tool do you
have than a wearable? I sit up at night and
I think about it, and I was like, legitimately, we

(32:18):
started out just tracking our steps, and now it's your
heart rate. They're working on finding ways to tell you
if your blood pressure is high or low, if your
blood glucose is high or low. So they're looking at
how they can gather more data so that we can.
we can.

Speaker 4 (32:33):
Know more about you, give you more ads, give you.

Speaker 3 (32:36):
More ads, personalized experiences. Like, this is just my, I
don't know if you've seen the meme of Charlie from
It's Always Sunny with the red string on the board.

Speaker 4 (32:44):
This is my conspiracy theory, yes, of where wearables are going.

Speaker 1 (32:48):
I don't think that that's a conspiracy theory. I think
it's a very reasonable thing to assume that the
more data is collected about you, the more it can be used
to show you ads that you will not scroll past. Yeah,
you've got kind of a preview of this, of what
society looks like, because you're around the tech-type people

(33:10):
all the time. What does a society look like when
we know that there's a good chance that somebody around
you is recording, is recording everything they're seeing and hearing?

Speaker 4 (33:22):
I think it's a much more self-conscious society.

Speaker 3 (33:25):
I have become someone who when I'm out and about,
I am scanning the glasses that people wear to see
if there's.

Speaker 4 (33:32):
Cameras on them.

Speaker 3 (33:34):
I had a kind of a unique upbringing. There's a
question of whether my dad was a North Korean spy
or not, and whether we were under surveillance at any
given point in time, and so I grew up always
thinking my life is kind of public. I have to
perform as if I'm always being watched. So I kind
of grew up with that my whole life. But it's

(33:54):
a heavy thing to grow up with, and I think,
you know, a lot of.

Speaker 4 (34:00):
People are privileged and blessed to.

Speaker 3 (34:02):
Not grow up performing in that way, as if there's
an invisible movie-set camera on you at all times.
But I think that is just going to be a
reality that everyone starts to do. You start to become
a lot more conscious of your actions in public. You
start to become conscious of the spaces in your house
that are truly private.

Speaker 4 (34:22):
You know.

Speaker 3 (34:22):
I say to people all the time that the only
truly private place you have in this world is the
inside of your head. Like that's kind of dystopian when
you say that, But living the life that I do,
testing the products that I do, having the upbringing that
I had, I unfortunately think I am well equipped to
tell people that this is what's coming. I think we're

(34:45):
all going to have to live our lives as if
we're mini celebrities out in public at any given point
in time, and that the paparazzi could come for you.
And there's degrees of that. Not all of us are
Timothée Chalamet living out here having to wear caps and disguises,
but to a degree, I do think we're all going
to be living very.

Speaker 1 (35:03):
Public lives, whether or not we want to.

Speaker 4 (35:06):
Whether or not we.

Speaker 3 (35:07):
Want to, I think we are all in some way
going to be living as public figures. So I think
as a society going forward, we really have to think
about what are truly private spaces and what truly private
spaces we want to protect, because it's a human need
to need that privacy, and I don't think that it
should be given up for whatever convenience. Like, however tempting

(35:31):
the convenience is, sometimes the inconvenience is necessary.

Speaker 1 (35:35):
It seems like this is another one of those conversations
that we're having as a society, where the outcome seems predetermined,
which is to say, these are eventually going to be
adopted by everyone. It's just a matter of time. This
isn't a decision that you, as an individual get to make. Look,

(35:57):
you don't have to like these glasses, but they're going
to hit mainstream adoption. Everybody's gonna be wearing them, and
so you can choose not to buy them if you
don't want to, but you cannot choose to live in
a world that doesn't have them. And it feels like
that is where we're at right now. Do you think
we're there?

Speaker 4 (36:14):
I think you're spot on, because I think the sales prove it.

Speaker 3 (36:17):
Google believes that the zeitgeist is strong enough for them
to be like, hey, guys, smart glasses. Let's get back
in on that and rebrand ourselves as the people with
the most experience in the space.

Speaker 4 (36:31):
So at the end of last year, they were like, hey,
put me back in, coach.

Speaker 3 (36:34):
Like, at the end of last year, they legitimately
launched Android XR, which is going to be their platform
for smart glasses and headsets.

Speaker 1 (36:42):
Why would you say now is the right moment to
launch XR? I think now is the perfect time to
work on XR because you have a convergence of all
these technologies. We've been in this space since Google Glass
and we have not stopped.

Speaker 6 (36:56):
We can take those experiences, which already work great and
find new ways to be helpful for people.

Speaker 3 (37:02):
Once Google is able to overcome their PTSD trauma over
Google Glass, to be like, we want back in on
the thing that people made the most fun of us for.

Speaker 4 (37:11):
I think it is inevitable. I think you're spot on
about that.

Speaker 1 (37:18):
Thank you so much for listening to Kill Switch. I
hope you dug this one. If you want to connect
with us, we're on Instagram at kill switch pod, or
you can email us at kill switch at Kaleidoscope dot NYC.
And you know, while you got your phone in your
hand whatever, before you put it back in your pocket,
maybe wherever you're listening to this podcast, leave us a review.

(37:39):
It helps other people find the show, which in turn
helps us keep doing our thing. Kill Switch is hosted
by me, Dexter Thomas. It's produced by Shena Ozaki, Darluk
Potts and Kate Osborne. Our theme song is by me
and Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope,
our executive producers are Oz Woloshyn, Mangesh Hattikudur and Kate

(38:00):
Osborne. From iHeart, our executive producers are Katrina Norvell and
Nikki Ettore.

Speaker 6 (38:05):
Oh.

Speaker 1 (38:06):
One more thing. So there's that clip that we played
of the Dragon Ball scouter exploding, and maybe you were
wondering why it exploded or if there's any scientific basis
for why a scouter would explode when the power levels
are too high. Okay, maybe you weren't wondering that, but
I was. And it turns out that the official Dragon
Ball site published an article that kind of explains it,

(38:26):
featuring an interview with a professor in the engineering department
at Miyazaki University who talks about using AI headsets to
measure the weight of pigs. I promise you I'm not
making this up. I'll leave that in the show notes. Anyway,
we'll catch you on the next one.

Speaker 6 (38:54):
Would bind
