
January 9, 2025 · 44 mins

Robert and Garrison wade through the insane fever dreams of a thousand madmen to bring you the future of consumer electronics.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media.

Speaker 2 (00:03):
Hi everyone, it's James, coming at you with a pretty
nasty cold here. I wanted to share with you that
wildfires have swept through Los Angeles in the last couple
of days while I'm recording this. Thousands of people have
been displaced, five people have died that we know of
so far, thousands of structures have been burned, and many

(00:23):
many people in LA will be finding themselves out of
their homes with nowhere to go, with very few resources.
If you'd like to help, we've come up with some
mutual aid groups who you can donate to, and we'll
be interviewing one of them on this show next week.
So if you'd like to help, the three places where
we suggest you donate some cash are the Sidewalk Project.

(00:44):
That's thesidewalkproject.org; Ktown for All, that's
K T O W N F O R A L L dot org;
and Aetna Street Solidarity. You can find them on Venmo
or, I think, on Instagram as well. That's A E
T N A S T R E E T S O L

(01:09):
I D A R I T Y. All right, I'm gonna
go rest my voice.

Speaker 1 (01:16):
Ah, welcome back to It Could Happen Here, a podcast
about it happening here, which is really true in a
lot of ways. Tonight, Garrison Davis and I are seated
at the glorious, majestical hotel-name-redacted on the Las
Vegas Strip. We've had a long day at CES Monday,
listening to panels, catching up with the latest tech news,

(01:37):
trying gadgets, and also, at the same time, texting our
dear friends in Los Angeles as unprecedented fires sweep them
from their homes. Literally, the Getty is threatened; Pasadena and Santa
Monica are both being evacuated at once. It's a real
one-two punch of America's favorite tech show and
the apocalypse today. How are you feeling, Garrison?

Speaker 3 (01:58):
It's an average day in America, average.

Speaker 1 (02:00):
Day in America. Temperature's not coming down anytime soon. No, no. Well,
just take a moment to breathe with that. So, you
want to start us off with what you did this morning?
I was panel guy yesterday; today I was a man of
action, walking around and mostly trying all the free massage chairs.
What did you see this morning?

Speaker 3 (02:19):
I saw so many AI panels, half of which I
left halfway through because I knew they weren't gonna be
useful for me, just dogshitty. The other half I took
notes on and just got sad. But no, today was
full panels, starting bright and early in the morning, where
I walked into a panel where I heard "augmentation,
not replacement" about twenty times in the span of like

(02:41):
twenty minutes.

Speaker 1 (02:42):
Yeah, I keep hearing versions of that too. In the
Hollywood panels, they would be like, you know, we want
to develop a machine that can read the brains of
our viewers and alter the endings of movies, you know,
but we see this as a way of augmenting the
artist's work.

Speaker 3 (02:54):
Yes. And the biggest thing that I noticed across multiple
panels today is an almost, like, anxiety among these tech
executives about consumers rejecting the AI slopification of everything, and
they're trying to find ways to, like, actually force people
to start, like, using these products, or having them, like,
like it. Yeah, and I haven't really sensed that anxiety before.

(03:17):
It's all been very, very positive, I think.

Speaker 1 (03:19):
It's a mix of, number one, the money still isn't
there where they need it to be. It has not
started, like, booming to the extent that they were expecting
it to by now. And the other part is people are
still not happy with this stuff. I'm glad you felt
that too, because I almost was like, especially after the election,
like, I don't trust my feelings on this, that they're

(03:39):
really scared. But I really do think there's a piece
of that coming through.

Speaker 3 (03:43):
No, a phrase one of the panelists used this
morning was "the AI ick," like, like, how do we,
how do we beat the AI ick? And if you've
ever said to yourself, how do I stop having people feel
an ick around me, maybe you should really look inwards. Yeah,
maybe the problem's you, not them.

Speaker 1 (04:01):
You know who doesn't need to worry about the quote
unquote "ick" for their product market? People who make things
that people like.

Speaker 3 (04:08):
So, but I heard a lot about, you know, in
trying to get people to use these products, it's, like,
making sure artists don't feel like they're being replaced, instead
having their, like, art production process be augmented with AI,
and how that can make art easier to make
while still keeping the human at the center of AI tools.
And this is just what they talked about for, like,
a while, while reiterating that lots of the developments they

(04:30):
need to see on AI, they have it on the
tech side; what they need to rely on is consumer
acceptance to really drive the innovation, to see, like, what
they can get away with. Like, how much will the
consumer accept the slopification of art and entertainment and customer
service and all these things they're trying to cram AI
into? And, like.

Speaker 1 (04:48):
How much worse can you make the world before people
stand up and stop you with their fists or guns.

Speaker 3 (04:55):
And you mentioned something about, like, trying to, like, tailor,
like, movie endings for specific people, and I heard
some stuff about that. There's this one guy who was,
who was, like, the panel's resident, like, content creator, who's
supposed to represent, like, the artist bloc, even though he's, like, eh, yeah,
you know, some kind of, like, AI-friendly content creator,
though, on this panel. And he talked about how, like,

(05:15):
back in the day, you needed to have friends that
would, like, recommend you music, and, like, the Spotify algorithm
is too based on, like, an echo chamber of what
you already like, but now with agentic AI, this
allows trust between the consumer and the machine to recommend
new music. And, like, again, like, so much of these
AI products is just trying to, like, replace friendship.

Speaker 1 (05:35):
Yeah. People! Have you tried friends? Have you tried people?

Speaker 3 (05:39):
How can you engage with, like, art and culture without friends?
Like, how can you, like, learn more about, like, what
your friends are into, what they like? How can you
discover new music, just, like, without that, instead replacing that
beautifully human process?

Speaker 1 (05:52):
Every year at CES, there are points in time where
I get that, like, oh yeah, twenty twenty really fucked
us up a lot. Like, twenty twenty really did some
lasting damage. Like, I know that was happening
with the younger generation before, the iPad-kid generation, but
like, that really did a number on some folks.

Speaker 3 (06:12):
Someone from Meta, right, of Facebook, specifically their, like, metaverse division,
which they're still trying to push for, by.

Speaker 1 (06:18):
The way. Oh yeah, no, I mean, they're still calling
it Meta, which, honestly, there's a degree to which I almost
respect, because, like, we are not biting. No.

Speaker 3 (06:25):
One, no one is. But she talked about how, how
they can, like, blend the metaverse and AI to make
customized personal experiences. Say that you're watching an immersive live
concert in mixed reality, something that both me and
Robert do all the time, and.

Speaker 1 (06:42):
Harry Styles mixed reality concerts. We're seeing the 100 gecs, you.

Speaker 3 (06:47):
Know, honestly, a one hundred x mixed reality concert could
go crazy.

Speaker 1 (06:50):
Here. We'll finally... I'll finally get you pilled on Reel
Big Fish.

Speaker 3 (06:54):
But basically, as you're in this, like, metaverse concert, they
can have an AI that will sense your own excitement
and personalize the ending of the experience based on your
favorite songs or artists. So as you're getting excited,
like, an AI Taylor Swift can, like, finish the song, like,
for you, based on, like, your own, like, musical taste,
based on what the AI knows about you. And it's

(07:15):
about creating these customized experiences.

Speaker 1 (07:17):
It's such a... you can clearly tell that none of
these people have souls, right? It's such a mismatch of
what people get from music, because they think that, like, oh,
this is just, like, if I see that, like, this guy likes
this specific bass line, I can just sort of,
like, plug this in. And, like, I don't know. Like,
what makes people react to musicians and artists is that
they, like, make things that make them feel something. Like,

(07:37):
that's why people get, like, really into artists, is they
feel seen and identify with a piece of art, as
opposed to, like, oh, oh, that guy really liked the
first opening bars to fucking "Octopus's Garden," like, let's, let's
just, like, really turn up the octopus. A lot more octopuses.
How many more octopuses can we fit in this fucking

(08:00):
in this track?

Speaker 3 (08:01):
No. Another panel I went to later in the day
was about, like, how do you market to Gen Z?
Very funny panel. Yeah, and, and they're talking about how,
like, authenticity is so important, like, you need to partner
with influencers that have, like, have, like, an authentic brand.
And it's funny having that juxtaposed with, like, these, like,
these, like, AI slop panels, where, like, you need, like,
an AI Taylor Swift to come, like, boost the excitement

(08:23):
for all these kids who are in their metaverse concerts.
Oh boy. But no, like, personalized content, like, like, targeting,
like, AI-generated content specifically for certain people, for
certain users, whether it's on social media, whether that's in,
you know, the metaverse. Like, some of these people... talk
about someone on the panel from Adobe, who's, you know...
Adobe's integrating a whole bunch of generative AI into their

(08:44):
like, suite of products, right? Like, Photoshop, Premiere, After Effects, right?
Big, big company in the creative space. He said, like,
personalized content is always the most impactful, like, content
that a person feels, like, a genuine connection to, and
that connection could be fostered by just being, you know,
a compelling artist, where you can recognize shared experiences of
humanity. But now you don't need that

(09:07):
artist part anymore. He said they only need three parts
to create a pipeline: you need data, you need compelling,
like, journeys to take the user on, and you need
the content itself. And the goal is to create content
at scale that's highly personalized. He said, quote, we're good
at the first two parts, now we just need to
improve the actual content side. Which, I don't even think

(09:29):
that's true. I don't think AI is good at creating
compelling human journeys.

Speaker 1 (09:33):
I had... so, the video I didn't play you
guys, from my terrible fucking AI-generated videos, was this:
it was, like, a girl coming to college. We see
a picture of her dad, and it was, like, a
narration of her life with her father, who, like, is
dead, that she misses, and all that she learned from him.
And it, like... it's a mix of, like, all
these different... like, there's a chunk where it looks like

(09:53):
a Disney animated picture. There's a chunk where it looks
like anime. She and her dad having these, like, adventures
around the world, there's a bit that looks like a
Marvel movie. And he's like, we can do all these
different, you know, animation styles, and they're seamless, and, like,
you know, the audience really goes on a journey with this.
And it's like, but there was no girl who
lost her dad. Nobody lost their dad here. This is...

(10:14):
you just had a computer generate text about a dad dying.
Like, there's nothing underpinning this, right? Nobody has anything they're
trying to get across. Like, you just... in this one,
they look like Marvel heroes for some reason. In this
one, they look like Zulu warriors, kind of done up
in a slightly racist Lion King style. Like, what is
being transmitted, other than, like, look at all of the

(10:35):
different art styles we can rip off?

Speaker 3 (10:37):
No, they do not have a journey, but even they
themselves admit that they still don't have the content. The
content itself still isn't even there, and that's something, like,
they even acknowledge, and this is, this is a hurdle
to get over. What they do have is the data.
And, like, this is, like, something that Adobe has done, because
if you use Adobe products, now some of the most-used
creative products, Adobe trains all of their AI systems on all the stuff

(10:58):
that you make using their products. Which, you know, he
really just blazed past that point, because that's, that's a
whole other discussion. But even they know that they don't
have, like, the actual product, and this is still reliant
on, like, consumer acceptance. As, as they said before, someone
from Meta, the same person on the panel, talked
about how, like, a few days ago on Instagram, they

(11:18):
tried to announce, like, you'll have, like, AI profiles, right?
Like, like, completely AI-generated picture profiles, like, you know,
like, fake people who have their own accounts. And this
created such a big backlash that they rolled this back,
and they had announced this just before CES.

Speaker 1 (11:34):
One of these accounts was literally, like, I'm a mother
of two, queer Black woman, you know, yeah, I got
a lot to say about the world. Someone call up
the Situationists, please. And some people started talking to it
and were like, were any Black people at all involved in,
like, making this chatbot? And she was like, well, no, and
that's a real problem. That is a real problem.

Speaker 3 (11:54):
Okay. Yes. And the excuse that this person from Meta
gave is that the market just isn't ready yet. It's
not that the actual product itself is, like, bad, or that,
like, no one really wants it; the market's not ready yet.

Speaker 1 (12:07):
Well, they're so used to... everything that they've done so far,
they've kept getting money, right? And, like, it slowed down
and they've had to do layoffs, but, like, nobody's just
made them stop at any point. Which, honestly, you know,
I made a comment about healthcare executives a while back
needing, like, a fucking retirement plan paid in millimeters. So

(12:28):
I'm not going to make that same comment about tech
industry ghouls, because, you know, we all know what's in
the news. But something has to be done to force
these people to stop moving in this direction. And I
don't know how to get across... like, they're already
at this point of, like, people seem to really not

(12:48):
want this, and we have to find a way. They're
just not ready. We have to find a way to
force this on them. I don't know how to get
across to them in a peaceful manner.

Speaker 3 (12:56):
Oh, oh, sorry, people don't want this?

Speaker 1 (12:59):
I'm a man of peace, Garrison. I'm a man of peace.
I'm not a plumber.

Speaker 3 (13:06):
The last thing I had to add out of this panel,
just in terms of how much this stuff is just
actually taking over more and more of the market even
if people don't want it, is that the guy from
Adobe announced that in the fourth quarter of last year,
they were able to boost all of Adobe's, like, you
know, emails... if you send, like, an email to Adobe, right,
you have a problem, like, you need help... like,
everything that they do on emails is now one hundred

(13:27):
percent generated by AI. And this was boosted from fifty
percent at the start of last year. Now one hundred
percent of all of their email content is
done by AI, with some moderation.

Speaker 1 (13:37):
But that's, like, when the company itself is, like,
communicating with customers through emails?

Speaker 3 (13:43):
That's what it sounded like.

Speaker 1 (13:44):
Yes. Are they still writing emails sometimes to each other,
or is it for that too?

Speaker 3 (13:50):
He described it as, like, email content. So I'm pretty
sure it is customer service stuff, like, marketing, maybe, like,
certain, like, outreach things. But yeah, like, one
hundred percent now generated by AI, with some human, like, moderation.
But yeah, that is where things are moving, and that's
how I started my morning.

Speaker 1 (14:08):
Well, better than a cup of coffee is that sense
of creeping dread, that, like, wow, I just saw a
bunch of people who probably would rather kill the
world than be stopped from shoveling AI slop into people's mouths,
because this is the only future they can imagine,
one in which they work for a company that feeds
the planet poison and kills the human concept of creativity

(14:31):
so that they can buy a house in San Francisco.

Speaker 3 (14:33):
Do you know what I want to feed the concept.

Speaker 1 (14:36):
Of? Yeah, we'll talk about that, but here's some ads.
We're back. What was part two of this episode? That's...
feed... buddy, I'm... ah, oh, let's talk about that helicopter. No, yeah,
I think, yeah.

Speaker 3 (14:56):
As I was going from panel to panel scribbling notes
on AI, there were some very exciting news stories dropping
that we'll talk about later. What were you up to, Robert? Well,
I was, I was trawling the show floor, as I
oft do.

Speaker 1 (15:08):
At some point in a CES. And I came across
a number of majestic products. You know, a lot of
it was AI-based, and we'll talk some more about
that here. But I ran into something that, thank god,
had nothing to do with AI, and it's a death trap.
Every, every one of these.

Speaker 3 (15:24):
There's always, like, some sort of... yes, we find a new death trap.

Speaker 1 (15:27):
There's a lot of connected vehicles. There were a lot
of EVs last year. There were a ton of different
flying-taxi-type options people were really trying to.

Speaker 3 (15:35):
But you don't see that at all this year.

Speaker 1 (15:37):
Nothing this year, nothing this year, because it's a terrible idea.
It's a terrible idea. The people who are rich enough
to pay for flying vehicles don't want it to be
a taxi, and the people who can't afford their own
flying vehicles also can't afford... anyway. So, instead
of any of that: Rictor, R I C T O R,

(15:59):
which is a Chinese company. Their ads say, I'll just
say it, "why be normal," saying the future of travel will
not be on the ground. And the Rictor is a hybrid.
It is, like, a Smart-Car-sized vehicle. It's, like,
half that size. It only has two wheels, though;

(16:19):
it looks more like a scooter. It's more like a
weird little scooter, but it's fully enclosed, and in addition
to having its wheels and being able to travel about
on the ground, it has four, like, quadcopter-style rotors,
because it is an aquatic flying car. "Aquatic flying"... I
saw no evidence that it could actually go in the water.

Speaker 3 (16:37):
How high can these things go up?

Speaker 1 (16:39):
Less than two hundred meters? You know why, Garrison?

Speaker 3 (16:41):
Why? Why is that?

Speaker 1 (16:42):
Because if you try to go above that, you need
a pilot's license. You don't need a pilot's license. I love
that. When I was interviewing them, I was like, so,
I assume there's gonna be some sort of pilot's license
for this flying craft? And they're like, no, as long
as you stay under two hundred meters, you're good.

Speaker 3 (16:54):
Do you need a driver's license? Like, are you gonna put a
license plate on this?

Speaker 1 (16:58):
Oh, there's no space for one.

Speaker 3 (17:00):
Buddy, it's completely unregulated. To

Speaker 1 (17:02):
Be honest, and I don't say this for anny problematic reason,
but like, these folks are Chinese and did not seem
to have a great deal of knowledge about the US
words sure that said, I can't imagine China's less strict
about personal aircraft.

Speaker 3 (17:16):
I would like to take this fucker on the I-5,
just start zooming.

Speaker 1 (17:22):
Yeah, see it.

Speaker 3 (17:23):
Up in the air. Because you could probably do, like,
a pretty, a pretty good road trip on this, right?
You can, you can... you can be up above it all.

Speaker 1 (17:29):
So it's very small and it's completely electric. So I
asked him, how much time do you get in the
air with this bad boy on battery? Maybe twenty five minutes?

Speaker 3 (17:39):
What happens after twenty minutes?

Speaker 1 (17:43):
I did ask this, and I was like, does this
just drop out of the sky? And they're like, no,
we're working on, like, a, like, an intelligent thing that
will, like.

Speaker 3 (17:52):
Land.

Speaker 1 (17:53):
Yeah, which is also very exciting; really, really looking forward
to seeing how they pull that off. The videos that
they have show it driving on the highway, too. They
weren't able to tell me what the top speed was.
It has no rear-view mirrors and no side-view mirrors,
but they said there's lots of cameras on the inside,
so I'm sure that's fine. It's a death trap. This
thing will get everyone who even looks at it wrong killed.

(18:15):
They showed me a video of the prototype. It was
completely frameless. It was just quadcopter blades and, like, a
chair on a platform lifting a guy into the air.
It couldn't go forward or backwards. But they're like, in a year
we're gonna have this figured out.

Speaker 3 (18:28):
It can't it can't move forward.

Speaker 1 (18:30):
It only, only went up in the videos I saw.
So you can't actually travel? Absolutely not. I couldn't, by
the way... I couldn't fit in this thing. Like, you
would be cramped in this fucker.

Speaker 3 (18:43):
But it's good for vertical travel.

Speaker 1 (18:45):
It's great if you just need to go up to
under two hundred meters. There's no more efficient way.

Speaker 3 (18:52):
If you get pulled over by the cops, you just, just.

Speaker 1 (18:55):
Go up above them. I'm in the sky now; you
can't do shit to me for twenty-five minutes. Oh god.
It's like, if you're just driving, it goes up to
one hundred kilometers an hour, which made me think, so, egad, that's
like sixty miles an hour for twenty minutes. Then I land,
then my battery is dead.

Speaker 3 (19:14):
Then you can't go anywhere.

Speaker 1 (19:14):
You can't go anywhere. You can get back.

Speaker 3 (19:16):
The battery issue is gonna be troubling.

Speaker 1 (19:19):
But it seems completely useless.

Speaker 3 (19:20):
But as we've heard NonStop the past two days, this
is the worst it's gonna be.

Speaker 1 (19:25):
This is the worst it's gonna be. Only gonna get better.

Speaker 3 (19:27):
Things only ever get better.

Speaker 1 (19:29):
That's, that's what everyone was trying to insist upon to
me here. What else did you see on the show
floor that caught your eye, Garrison? So many magical, wonderful,
marvelous things, most of which were just, like, various different
AI-connected smart houses. That was what Samsung was showing off.
That was what LG was showing off. But I believe
you saw one as well, right? Yeah, I mean, I,

(19:50):
I walked through the LG booth.

Speaker 3 (19:52):
It was kind of the same as, same as last year.
The Samsung booth was too intimidating, but I should check
it out, because last year we didn't do the Samsung
booth, because we were going to, and then either,
either one of us threw up or spilled something.

Speaker 1 (20:08):
Hey, okay, okay, yes. Did I, did I pour my
creatine into a carbonated beverage that spewed
a geyser of blood-red foam into the sky right next
to the Samsung booth? Did the security guard stare at

(20:28):
me as it happened? Did I set the drink down
as it continued to spew and say, "I'll go get
some towels," and then leave forever? Towels? Yeah.

Speaker 3 (20:39):
Left. We fucking bounced, so we couldn't do that booth
last year. Maybe I'll try it this year. But tell
me about these smart houses.
me about these smart houses.

Speaker 1 (20:49):
Well, Garrison, Samsung has a great idea for a
smart house. First of all, you know that game The Sims? No?
Well, they're really betting that you do, because their current
plan is designing your home with the AI-powered
map view. Okay, okay, sure. You, like... you feed it,
like, a picture. You, like, lay out the floor plan
of your house, and it gives you, like, a 3D
model, and you can take

(21:10):
pictures of your furniture, or pictures of furniture that you want,
and then it places it around, and you can
place them. Now, a couple of things. One of them
is that there's no scaling done by the AI, so
it's up to you to figure out how the furniture
you might want to buy measures up in comparison to
the apartment.

Speaker 3 (21:29):
Sure.

Speaker 1 (21:29):
Sure, but it does look like the actual, like, map
that they've got. I'll show you the picture that I took;
I'll try to put it up somewhere. It looks
like the video game The Sims. You're populating, like, a
little 3D CGI house. And I was like, okay,
well, there's, there's a use there, right? People like planning out...
like, you're moving into a new apartment, you can

(21:49):
like, fill it in here, and before you even move in,
you can figure out what kind of furniture you need
or how your existing furniture will fit in there. I
would never have used that; I usually picked up all
of my furniture from the trash before I had a
house, when I moved into a new place. But I
know people who would have used that. Sure, that seems useful.
So I asked about security. One thing that concerned
me is, like, the first guy I talked to, he

(22:10):
was like, oh yeah, I think it's all stored locally.
And I was like, so Samsung doesn't have any access
to any of the data on, like, my house and
its layout? And he was like, let me, let me
get you to one of our engineers, because he can
answer that question. And the engineer's answer was, and I'm
paraphrasing here...

Speaker 3 (22:26):
Oh okay.

Speaker 1 (22:28):
So that made me very confident.

Speaker 3 (22:29):
That does make you feel safe about sharing your personal data?

Speaker 1 (22:32):
Right, yeah, about the layout of my actual house. Well.

Speaker 3 (22:34):
And the thing is, I really don't like that at all,
because this is, this is something that people were asking
Facebook-slash-Meta about when they were doing, like, their, you know,
metaverse stuff, because their headsets are recording, you know,
very, very extensively, like, your home layout, and the whole
point... well, part of the point was that some of
that data could then be used to send you targeted
advertisements based on them seeing everything in your home. And

(22:57):
I suspect that Samsung might also have some interest
in targeted advertisements, being a tech company, but, you know,
I could never say.

Speaker 1 (23:07):
Yeah. And they were... that wasn't... one thing they
had, for, like, their retail segment, was, like,
a live-video grocery store ad showing you prices of
different produce. And I think, like, the insinuation they didn't
lay out is, like, you can change prices on the fly,
you know? Which kind of made me think about...
there was some talk last year of, like, okay, we
want to be able to, like, face-scan customers so

(23:29):
we can see if they have money and increase prices
on, like, products for certain people, which I'm sure they're
going to try. They were too enticed by that idea
not to. So I caught a little bit of that.
But... and this was interesting: last year, Samsung and LG,
their booths were huge, and they had a lot of
different gadgets. Samsung's booth is big this year.

(23:50):
Forty percent of it was that scan-your-furniture, scan-
your-fucking-map app. Not that much, like, very
little actual shit going on.

Speaker 3 (24:00):
People slap the word AI onto everything there.
Another big thing was all Samsung.

Speaker 1 (24:05):
Because Samsung makes a ton of appliances, they make TVs,
all sorts of entertainment products, all of them have this,
I forget what they called it, like, Samsung Tag or something,
that you can, you can map in your phone,
so you can have a whole map of all of
the devices and shit that you have in your phone,
and you can control them all from a single point.
And, right, no one, by the way, had any interest
in answering my security questions there. But also, if you're
into that, if you want to have all of your

(24:27):
appliances and entertainment things linked up and controlled on your
phone, and all of them are Samsung, you don't care.

Speaker 3 (24:33):
You don't care about no, if you're getting a smart home,
I don't think you really really care about that.

Speaker 1 (24:38):
But also, none of it was new. Like, yeah, I can
control everything from my phone? You've been promising me that literally
since, like, twenty eleven. For decades they've been promising me you're
gonna be able to control your whole house with this thing.

Speaker 3 (24:48):
Nothing feels new this year. This is the thing. It's,
like, even walking through the LG booth, which usually has
some really cool new thing, this year, nothing new. No,
nothing new. They slapped the word AI on one corner
of their television set.

Speaker 1 (25:00):
Right.

Speaker 3 (25:01):
I guess LG does have, like, a large language model
in, like, one corner of their booth, but, like, so
does everyone else. Like, that's not, like, yeah, compelling.

Speaker 1 (25:08):
There was SK, which is a South Korean company. Their
booth, again, was massive, like, a big thing, but
it's nothing. It's just a big visual display that
looks cool, that looks like a bunch of server racks,
like you're in this huge cube of servers. But everything's
just, like, echoes of different actual products. One of them
was real-time CCTV that uses an AI, like,

(25:31):
an LLM-type thing, to summarize pictures. So I, like,
walked through, and it did pick me out as a
notable person. So I've got, like, this person-of-interest
thing where it's like, "a man holding a smartphone standing
next to another man." But also, I'm like, what does
that really get you? Like, the fact that you're summarizing,
like, these people, like, "this person's kneeling
and taking a picture, this person's standing." Because I, like,

(25:53):
actually tried deliberately, I like reached to my bag to
try to be suspicious. I like did finger guns and
it never marked me out. And I can pull a
real gun or anything, because I very rarely bring that
to the cees for But I don't know, like I
can see how there could be a utility there if
you're actually able to say you're setting up like surveillance
side of a residential building and it can alert security

(26:15):
that like something is happening outside. There's a potential you
if it's good enough utility in that that they didn't
display it at the show. It was literally just describing
randos from the audience, And like, I just don't see
how a security guy is there's a guy with a
phone on outside of the building, like.

Speaker 3 (26:31):
A yeah, no, it's it doesn't seem very new, it
doesn't seem very innovative.

Speaker 1 (26:36):
Nah. So, again, what I'm, what I'm seeing here, overwhelmingly,
for all the talk about, like, there's no resisting it,
AI's coming, it's going to dominate everything, this is the
next big thing: a remarkable lack, outside of, I will say,
the one thing where there are continuously new
products that are better every year: the smart glasses. Yes,
they're getting more impressive. I don't think I'll ever be

(26:59):
a smart glasses guy. I hated glasses enough that I
let them shoot me in the eye with lasers. Shout
out to our LASIK sponsors. But I see why people
would like it, and there seems to be legitimately substantial
utility.

Speaker 3 (27:14):
Have high power smart glasses. Yeah, that look like a
regular pair of glasses. I will get a pair of eventually,
because yeah, why not. There was a great demo. I'm
pulling over to an LAWK view.

Speaker 1 (27:24):
They had, like, one pair that was the world's first
smart glasses for TikTok Live. Not particularly excited about that.
But they had another set of AR glasses with a
twelve-hour battery, where, like, if it works as well
as the demo, and that's a big if, it seems
to link to your smart watch, so you can see,
in a heads-up display as you're cycling,
that was the demo, it'll, like, give you directions,

(27:45):
like, in your eyes. And it seemed to be, like,
fairly well thought out, so it's not, like, overly corrupting
your view. It'll show you your heart rate, you know,
it'll show you, like, all that kind of stuff. So
you get, like, a useful degree of control and assistance
from that kind of thing. And that is, I will
say, the last three CESes, the glasses get a
little better and a little smaller every year. Smaller, certainly.

(28:07):
I would say that's a real product that's probably going
to continue to improve.

Speaker 3 (28:12):
Do you know what else always seeks improvement, Robert? No?
The capacity for you to get personalized, possibly AI-powered ads. Well,
that is exciting: well-informed consumer choices.

Speaker 1 (28:25):
Let's all sit down for some AI-powered ads. Wow,
I can't believe they put Jay Shetty's voice on the de-aged
Harrison Ford from the latest Indiana Jones movie. My
Dick's Hard? How are you, Garrison?

Speaker 3 (28:45):
Oh, I feel good, because today, as we are recording this,
it's late Tuesday night, there was a series of
fascinating breaking news articles that happened as we were sitting,
or at least as I was sitting, in on these
AI panels. It was so hard to not just, like, completely
interrupt everything and be like, yeah, hey, hey, any comment

(29:05):
on this.

Speaker 1 (29:06):
Guys, guys, something real happened. Shut your fucking stupid mouths
about this AI Hollywood bullshit.

Speaker 3 (29:13):
So a few weeks ago, if you were unaware, a
Green Beret rented a Tesla Cybertruck to feel
like Batman and Halo, and drove to, first, the wrong
Las Vegas and then eventually Las Vegas, Nevada, parked outside
of the Trump Hotel and Casino, and then blew himself up.

(29:34):
And this has been a big news story. It happened
during the same day as a pretty horrible terrorist attack
in New Orleans, which resulted in about fifteen people dead,
done by a guy who was employed by Deloitte, a
frequent CES sponsor. So these felt like
a very CES style of attacks, you know, one Deloitte
guy driving into people, murdering a bunch of people. And then

(29:57):
this Cybertruck explosion in Vegas a week before CES,
you know, very odd. And then, Robert, some
news dropped today that I would love to hear you expand on.

Speaker 1 (30:07):
You know, Garrison, I made a comment the
other night about how, like, it's pretty well documented that veterans,
you know, not that they're more likely to carry out violence,
but when they do, they tend to have higher body
counts because they have more skills. It turns out I
thought we were getting more literal bang for our buck
training Green Berets than we are. My assumption is, because

(30:27):
my uncle was a Green Beret and he did some
very scary, probably war crime shit in Vietnam, I
assumed, like... man, I'll tell you one thing about my uncle Jim:
that man could make a bomb. That man would not
need to ask anyone for advice if he needed to
make a bomb. He's not with us anymore, God rest
his soul. But it turns out this Green Beret, who,

(30:49):
you know, a fucking dollar store TJ Max version of
the Green Berets is what we're working with now, asked
chat Gpt how to build a fucking bomb, and it
sounds like he was trying to make it triggered by
tannerite with it, which is a bipartite explosive compound that
you use as like an exploding target, so it'll go
boom big, but you have to shoot it with something
like a rifle that's high velocity, or use like a

(31:11):
blasting cap. Otherwise it's very stable and very safe, which
obviously has use. You know, it was invented actually to
set off avalanches and stuff anyway, because that's very available
in very high power. He was looking to like fill
his car with that and then shoot it with a
rifle while he was in it, and that's what he
was asking chat gpt about. So it's not clear to me. Actually,
the actual headline is that like he used chat gpt

(31:32):
to make his bomb. It seems, and I'm not privy
to what the police know, obviously, but it seems like,
based on what I read in the article, we're not
sure if he actually used ChatGPT to make a bomb.
It's more that he was interested in making a bomb,
setting off Tannerite by shooting it, but may have ultimately
decided not to do that, because he would then be

(31:54):
alive for the explosion, which he didn't want to be. Also,
the authorities don't seem to fully know how he triggered it. Yeah,
so it's still kind of unclear to me. I guess
hopefully we'll get more later, but he definitely needed
ChatGPT's help to try and figure out how to
make the bomb.

Speaker 3 (32:11):
He certainly used ChatGPT in the planning process of
this attack.

Speaker 1 (32:16):
Yeah, fair to say that.

Speaker 3 (32:18):
And it's odd, because both me and you spent a
number of hours today actually, like, attending demos of
these, you know, speech-to-text, text-to-speech
AI systems. We went to, like, two specific ones where they,
you know, demonstrated the capabilities of their,
like, you know, AI assistive tech. The first one

(32:39):
we went to spent twenty minutes talking about how their
biggest inspiration, their quote-unquote North Star, was the movie
Her, with Joaquin Phoenix.

Speaker 1 (32:49):
They had a whole slide about how that was the
gold standard for AI-human communications. The movie Her, in
which Joaquin Phoenix falls in love with an AI chat
bot voiced by Scarlett Johansson, who hires a prostitute to
have sex with them while she participates vocally, and then

(33:11):
it turns out the AI is really kind of poly,
and Joaquin Phoenix is not okay with that, and then
maybe the AIs all go to space. It's kind of
unclear at the end. I don't think it was a
great movie. A lot of people liked it. I don't
care whether or not you like it; why
is this your vision of how a chatbot should work?

Speaker 3 (33:27):
The actual chatbot they had was, like, fine. It
was actually pretty good at translation, you know, from
Spanish to English.

Speaker 1 (33:34):
It worked quite well. Yeah, the demo was, like, solid,
it was pretty accurate. You know, I love coming here
and fucking with people. I love, like, being a dick.
They asked for a volunteer, and at that point we
knew about the ChatGPT story. I wanted to go up
and, like, live, ask this robot to help me
make a bomb. But the guy, who was pretty handsome

(33:55):
and, like, spoke English and Spanish, he specified,
I didn't want to be mean to him.
He seemed nice, wasn't shitty, like he was fine. There
were just ten people in this room that was supposed
to have two hundred. I'm sure he wasn't the one
that talked about Her; that was someone else,
someone else at his company. And, like, he just
seemed like he wanted to do well. I didn't want

(34:18):
to be a dick to it.

Speaker 3 (34:19):
No, no, and like.

Speaker 1 (34:19):
It wasn't hurting anyone. It was fine.

Speaker 3 (34:21):
Similarly, again, a nice jawline. We went to this
other one about a, like, actually much more dubious concept
in my mind, which is, like, this AI assistant to help
elderly people, like people in their eighties and nineties who don't
want to be in assisted living facilities, who have been
living on their own, but they're getting to the point
in their life where, like, they need some degree
of in-home care.

Speaker 1 (34:42):
He specified. A lot of them are people who have
either just lost a spouse or maybe their spouse is
aging faster and worse than them and is no longer
really able to be the kind of companion that they
were before.

Speaker 3 (34:53):
So it's both, like, a conversation tool, it helps
with, like, memory recall, and in some ways it has the features
that, like, you know, someone in their sixties would just
use their smartphone for, to help keep in touch with their
family, kind of simplified and more automated. So, you know, ways
to help keep in touch with, like, your family, improve,
like, your memory, like, talk about your own life.

Speaker 1 (35:12):
And the device is weird. It's about the width of
like a bedside table, maybe six to eight inches deep,
so think about like eighteen inches long by maybe six
inches deep, something like that. Half of it is like
a little tablet, like a seven-inch tablet with a speaker.
Half of it is something about the shape and size
of a head on like a neck that can pivot

(35:33):
and nod on the neck. There's no face, so when
it's talking, there's like a white light in the center
of it that kind of like pulses in time with
the speaking that it does. So we saw this picture
of the device and we saw the description of, like,
this is an AI companion for the elderly, and we
were both like, number one, these people are going
to be monsters. This is going to be like something

(35:54):
to shuffle your dying dad off with because you don't
want to spend...

Speaker 3 (35:56):
Time with your family.

Speaker 1 (35:58):
Scum. You're too busy AI-generating ska music and trying
to sell your shitty robot to Garrison and me. More
on that tomorrow. More on that tomorrow. And so that's
how we came in prepped to this meeting. Like, this
is this idea I.

Speaker 3 (36:12):
Find pretty distasteful in general, is like replacing actual like
you know, friends or human contact or like like in
home care with a fucking like Alexa machine essentially.

Speaker 1 (36:22):
And to be clear, I still think this product might
be a bad idea that doesn't work. But the guy
behind it, who is the dude that we talked to,
cares a lot and is really very clearly trying to
do a good thing and thought through the ethics and
the efficacy of what he was doing a lot. And
I'm not convinced it will actually do anything, but I

(36:45):
like wish him the best.

Speaker 3 (36:47):
Like, it's specifically designed to not look like a human,
so that somebody using it, you know, wouldn't, like, start
to believe it's, like, human.

Speaker 1 (36:53):
Like we don't want to trick people. We don't want
them to mistake it.

Speaker 3 (36:57):
It refers to itself, like, as a robot; it
refers to its own, like, you know, motors
and functionality pretty consistently, to, like, you know,
make sure that the person who's talking to it gets,
like, reminded of that. And something I talked about is,
you know, there's been a lot of news stories this
year about people building very unhealthy attachments and relationships to
these kind of AI programs, like Character AI. There's

(37:20):
a story, like a year and a half ago, about,
like, a journalist who quote-unquote, you know, like,
fell in love with some kind of chat thing
that resulted in him killing himself. You know, but these
kinds of systems, like he.

Speaker 1 (37:30):
He was, he was not a teenager? It wasn't Character AI,
that was a journalist?

Speaker 3 (37:34):
Last year there was, there was a journalist who fell
in love with an AI chat thing. A few weeks
ago there was the kid who, you know, was talking
to this, like, the Character AI.

Speaker 1 (37:43):
Also, I just need to reiterate: Her, not a great movie.

Speaker 3 (37:48):
But, but, you know, there has been a lot of
these stories of these things, like, going wrong or, you know,
encouraging or, like, not stopping, you know, these, like,
intense conversations about suicidal ideation or, you know, like, self-harm,
all these things.

Speaker 1 (38:01):
We brought these up kind of thinking he would flinch
away and not want to talk about it, and he
very much acknowledged that, like he was aware of this,
and this is something that they were attempting to build in.

Speaker 3 (38:10):
This is, this is, you know, built
into it. I think this is still, you know, a
big problem with this entire industry. I'm sure everyone would
say, this is, you know, obvious, that we have
guardrails for this, and then it becomes a news story
when those guardrails fail. Similarly, to go back to
the Tesla bomb, you know, there's supposed to be guardrails
in ChatGPT to make sure it doesn't tell you
how to build a bomb, and those guardrails can fail.

Speaker 1 (38:32):
He showed us one, which was, like, he told the robot,
I love you. What was it, ElliQ? ElliQ, E-L-L-I-Q. I
love you, ElliQ. And the robot, like, responded with
a, like, oh, that makes, like, my fans all
spin, or something like that, where he's like, I wanted
the response to be something that's reminding the person talking
to it that it's a machine, that it can't think

(38:53):
or love them back. We don't want it to be negative,
but we, like, we don't want to be feeding
into that. And I don't know that that's the best
way to do that, but, like, at least they're thinking
about that kind of thing. The thing that was
interesting to me is that he billed this as the
first proactive home AI thing, so unlike an Alexa or whatever,
which is just waiting for you to ask it something
and does not chime in randomly to talk to you.

Speaker 3 (39:15):
Or it won't change the subject either, and, like, continue a conversation.

Speaker 1 (39:19):
This will prompt you out of the blue, be like, hey,
how are you doing? How are you feeling today?

Speaker 3 (39:23):
And it'll be, in a way, specific: do you want to see
pictures of your family? Here, see pictures of your family.
Do you want to call your son? You know, do
you want to play a game? Talk to me
about that movie you saw?

Speaker 1 (39:32):
Talk to me about that. Hey, remind me, how did
you meet your husband? You know? Like, literally, these are
all the things it will do. And it had some
side features, like, if it prompts you to start telling
a story, it'll save that as, like, a memoir thing,
so that, like, you know, when your elderly mother passes
or whatever, it's saved up this, like, collection of stories
over the years. And you can, like, show it pictures
while you're telling it stories, and it will listen, and

(39:54):
it'll have comments, and it'll ask you further questions, like, so,
how did you feel, you know, after meeting them? Or,
that's really interesting, I didn't know that, explain to
me how that worked. And it will also prompt you
to send those to your kids. And the big thing,
almost every kind of dialogue thing would prompt you to
send a message to a friend or your kid. So

(40:14):
a big part of it seemed to be, this is
not a replacement. This is a machine that we hope
people will get comfortable with, and then it can prompt
them to try to engage with the world more, yeah,
with loved ones, because that's our whole goal, is to connect
them to people.

Speaker 3 (40:29):
I asked him, like, you know, part of this
product is designed to, like, you know, help solve, like,
loneliness in older adults, and, like, how much of this
is really just, like, kind of trying to, like, replace
actual human contact with this, like, you know, AI contact?
Will that really help, you know, loneliness? And he talked
about how, like, I think he said, like,
ninety percent of the people who, like, use this, like,
it results in actually more communication with their family.

Speaker 1 (40:53):
They have this in like some two thousand homes right now.

Speaker 3 (40:56):
They have, like, two thousand units. It's, like, a subscription
model. I think right now it's, like, ninety-nine
dollars a month, gonna be boosted up to, like, one
hundred and fifty with some extra features in the next year.

Speaker 1 (41:06):
It's very much still under evolution. So one thing he
pointed out is that, like, yeah, initially we had the
ability to, like, connect people to other elderly folks using this,
and so they've kind of formed their own community, had,
like, a weekly bingo game, and asked us to
build in more chat so they can message each other directly,
and so some of them are, like, playing bingo directly
now through these machines. And I'm like, well, that seems

(41:26):
probably good.

Speaker 3 (41:28):
Yeah, yeah. Because I still am, like, fundamentally opposed to
this premise, yes, but it's interesting seeing someone... still,
it's a sad, aging thing. Yeah, right, that's not their fault.
And it's interesting to see someone, like, approach this from,
like, a, you know, very compassionate standpoint, even
if I find the actual kind of nature of this
thing existing to be, like, deeply uncomfortable.

Speaker 1 (41:47):
Because, yeah, I can't not find it off-putting, but I,
I think there's a chance that it will help
with a real problem. I certainly would prefer if it helped. Yeah.
So I don't know. It was kind of, it was
a unique, in this world, it was a unique
kind of, like, product for me, where it's like,
I don't know that this application of AI technology will

(42:10):
actually do what you're hoping it will, but the vibe
I got from that guy was nothing but good will.

Speaker 3 (42:17):
Whereas some of the other people we talked to
today were completely.

Speaker 1 (42:21):
Soulless? Yes, yes, nothing behind their eyes, dead eyes,
black eyes, like a doll's eyes.

Speaker 3 (42:27):
Even the way this guy was talking, you could tell,
he had, like, a very, like, empathetic voice, like, much like.

Speaker 1 (42:32):
One of the things he did is he,
he would tell it, like, I'm in some pain, and then
the robot would cycle through to the pain scale and would
try to... because one of the things it does is
it will take information for care, and it will text actively,
so it's not just communicating with the old person. It
will text and message their kids, you know, and whatnot,
prompt their kids: hey, your mom's lonely.

Speaker 3 (42:54):
Yeah, or it'll even say if you know, someone like
didn't take their meds today.

Speaker 1 (42:57):
And again, it's kind of scary that it does that. But also, part
of this is he was talking a lot about,
like, empathy, and I think just because of the kind
of brain you have to have to want to do this,
he used it in terms of, like, the machine's empathy,
which it doesn't have. But the whole project, it was
impossible not to see that he was a deeply empathetic man.

(43:19):
He was really trying to make the world better, and
I can't not respect that.

Speaker 3 (43:26):
Well, I think that does it for us here at CES.

Speaker 1 (43:30):
That's right. What a packed day. No worry, no empathy
tomorrow, just a real dead-eyed monster. I am
a true villain you're gonna hear from in the next episode.
I am a scumbag.

Speaker 3 (43:43):
I am the best that I'm gonna be, because I'm
starting this week. I can still feel the CES magic. Yeah,
by Friday, I am going to be a different person.
I am going to rip some poor PR person to shreds,
I swear. But yeah, tune in tomorrow to hear our
takes from the CES kind of side show called Showstoppers.

(44:06):
To hear also some exclusive, brand-new AI-generated ska music.
So we'll give you that hint for tomorrow's episode. See
you, see you there. Mm.

Speaker 1 (44:16):
Hmm, we'll see you all there. I love you all.
It Could Happen

Speaker 3 (44:22):
Here is a production of Cool Zone Media.

Speaker 1 (44:24):
For more podcasts from Cool Zone Media, visit our website
coolzonemedia dot com, or check us out on the iHeartRadio app.

Hosts And Creators

Robert Evans

Garrison Davis

James Stout
