Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Media.
Speaker 2 (00:05):
Welcome back to It Could Happen Here, a podcast about it,
the Consumer Electronics Show, happening here to everyone. And of
course it is in fact happening to everyone, because over
the course of the day, all of our subjects here,
all of our experts here, have watched different kinds of
dudes explain the different kinds of jobs they want to
replace with a chatbot that was trained on Reddit. So
(00:27):
I'm gonna go around a circle and introduce our guests today.
First off, we've got the great Ed Ongweso Jr. Ed,
thank you for being here, Thanks for having me on.
We've got Garrison Davis, who's also great, but I'm not
gonna say it at the same time because I don't
want Ed's compliment to feel like less. But you're contractually
obligated to not mind.
Speaker 3 (00:48):
Yes, thank you boss.
Speaker 2 (00:52):
Great to be here, as always, very natural, very natural.
Zi, hi, hi, hello, hello, oh hello, hello, thank you.
This is your first CES as well, and that's your first
time being a journalist. Also true. How do you, how
do you feel doing the job that Alex Garland has
just reminded us in the movie Civil War is a
(01:13):
fundamentally noble, perfect endeavor only practiced by heroes.
Speaker 4 (01:16):
I love wearing a dress, shirt and tie and just
getting very drunk.
Speaker 2 (01:21):
You were very surprised when I gave you your gun,
but you can't be a journalist without one. Yeah, yeah,
I playing with it without the safety. Remember, And last,
but certainly not least, In fact, may be better than
some people in the room. Again, I'm not going to
say who. You can wonder that for yourself, feel David Roth.
Speaker 5 (01:43):
Thanks, I agree that I'm pretty good. How much better
I am than how many people in this room?
Speaker 2 (01:48):
I'm not, I'm not even really... it's not something I
like talking about. Yeah, exactly, because we haven't gotten those
numbers back from OpenAI.
Speaker 5 (01:55):
Yeah, it would be irresponsible to speculate at the end.
Speaker 2 (01:58):
Yeah, you got away for three. So what I want
to do here? I think this is kind of our
roll up. We've spent our last day on the floor.
I want to go around and I'll start first. You
guys have a second to get your thoughts together. What
comes to mind immediately, like, this is the
thing that I had the most positive reaction to, and
this is the thing that I had the most negative
reaction to. I think is a solid way for us
(02:18):
to start out, and I think my most negative reaction,
obviously was the Amy artificial child best friend toy, which
was deeply upsetting and uncomfortable, and I hated both that,
like, I could tell, from an industrial design standpoint, pretty
good design, like it looked like something like, oh, kids'll
think that's cute, and from a this is our intent
for this product standpoint, it felt like a replacement for
(02:41):
the love of adults in the life of a small child,
which I thought was like evil in a profound way.
And I guess the best thing that I saw. I'm
not perfectly competent at this point to like analyze how
well it worked, but from the demo I saw, I
was very impressed with Naqi. Naqi basically reads facial micro
motions in order to let people control a computer, not exclusively,
(03:03):
but especially if they're quadriplegic or whatever. Like I thought
that was really interesting, and it's the kind of thing
because honestly, I might loop that in with There was
an AI-assisted, like, cane for people who are blind.
There was another device that let you control a
computer through, like, facial movements in your mouth. It was
like a retainer. All the stuff that's like, oh, these
are, like, really, people care a lot about helping somebody
(03:25):
regain the ability to utilize technology to let them reconnect
to the world. That's like the opposite of replacing a
child's parents with a toy ed You're in the hot
seat next, you know.
Speaker 6 (03:36):
The thing I loved the most was obviously the Global
Pavilion for connecting Web3 businesses across crypto chains. Oh yeah,
DeFi, fintech, CBDCs, which are central bank digital currencies,
and legal ADVARC. You know, this made my heart flutter,
(03:58):
because you know what, even when you think they're down,
crypto finds a way to squirm into your life.
Speaker 3 (04:08):
It really is the zombie of the tech world. Yeah,
because it's dead and yet it's undead. It's constantly trying
to crawl back somehow.
Speaker 2 (04:16):
The fact that it's dead makes it more dangerous.
Speaker 3 (04:19):
No, exactly, it's specifically a zombie. I will try
to figure out what's the vampire, but specifically crypto is
the zombie.
Speaker 2 (04:27):
Yeah. When I first read the line, "That is not dead
which can eternal lie, and with strange aeons even death
may die."
Speaker 6 (04:34):
Yeah, it makes me think of, you know, the 28 Years
Later trailer where they use that poem.
Speaker 2 (04:45):
Three sex.
Speaker 6 (04:47):
You know, like, I'm just guessing where the price of
bitcoin is gonna go. I think we're at the beginning
of a golden age, not for us, but for the grifters.
Oh God, next week, when our dear golden boy,
or orange boy, gets elected, er, inaugurated? Has
he already won the election?
Speaker 2 (05:06):
Did well?
Speaker 3 (05:08):
Well, that's that's that's debatable. I think there was some
very curious irregularities in multiple states right.
Speaker 2 (05:15):
It's straightforward from here to be encouraging the BlueAnon
people that, you know, now, let him, let him have it,
have it. You're right, he wasn't shot at all. That
was all an AI trick. Yeah, that AR-15
would blow your whole head off that way.
Speaker 1 (05:32):
You know.
Speaker 6 (05:32):
The thing I actually did like the most, similar to you,
I really did like the assistive tech. I mean the
stuff that is for people who are disabled, not able-bodied,
who either experience things like cognitive decline or, you know,
the neurodegenerative things, or are paralyzed. Like, this is actual stuff
that we need a lot more investment in, development, and,
(05:52):
I assume, maybe the scaled-up production of it, and figuring
out ways that it can be offered to people in
a variety or a spectrum of use cases. Right.
And the stuff that I did not like.
Speaker 4 (06:04):
Hmm.
Speaker 6 (06:06):
You know, I didn't really care for a lot of
the luxury surveillance stuff, you know, the fake CGMs that,
you know. I'll never forget this woman telling someone right
next to me it's, it was a medical device,
and when I asked, she looks at my tag and goes, no,
(06:26):
it's not a medical device.
Speaker 2 (06:31):
We had a beautiful moment where it was this like
it was like a set of smart goggles, which there
were a lot of that had like night vision, but
also it had like threat assessments. So the specific thing
they bragged is like it can help a police officer
identify if somebody has a gun. Right, And this was
right after we had gone to an AI security camera
that I had flipped off with both hands and it
(06:53):
had identified as a man giving the thumbs up. And
I don't feel great about it identifying the gun.
Speaker 6 (07:00):
Luxury surveillance for health, luxury surveillance for air recognition. Also
like they had it in litter boxes and shit, don't
need that, really don't fucking need that. Why does the
litter box need to be connected to the internet, and
why does it need a camera in it?
Speaker 2 (07:13):
You know, that does make me think of a better world
where we have exactly as much money and focus on AI,
but it's all integrated into cat-focused products, like
fifty billion dollars being poured into cat tech.
Speaker 5 (07:25):
Translate whatever your cat is saying into French perfectly.
Speaker 2 (07:29):
Your cat can make deals with the Chinese, and by the way,
we've hooked him up to venture capital. We have an
open line to SoftBank.
Speaker 6 (07:37):
Seriously, you know that I would love this translation.
Let's help my cat make some deals. Let's help me
figure out why or how he learned to open my door.
You know, things like this. But what do we get?
AI bullshit.
Speaker 2 (07:50):
I want to see a guy dressed as Steve
Jobs be like, ladies and gentlemen, we have finally done it.
We have gotten across the concept of death to a cat.
They understand their mortality. Actually, like that Common ad where he's
talking about AI. But he's like, we.
Speaker 5 (08:08):
Taught proofs to a dog, got turtle back on what
a dog would think.
Speaker 2 (08:15):
About. It was just a dog sitting at a table
smoking a cigarette. The future. Mm hmm, Garrison, you're.
Speaker 3 (08:26):
Up. Best of CES, I think, was definitely the VLC
media booth at Eureka Park, where they had, like, they had
big, big traffic cones on their heads, wearing them
like wizard hats, with huge cloaks.
Speaker 2 (08:40):
They were dressed as wizards. They were dressed as wizards.
Speaker 3 (08:43):
And we walked up to them and they said.
Speaker 2 (08:45):
Let's, let's start with VLC, folks. If you, if you don't
know this, this was especially relevant to those of us
who pirated a lot. It's a media app that allows
you to basically play any kind of, like, any video
file, yes, or audio file, and now it will automatically
give you subtitles using local AI. That's not, like,
reaching out to the cloud or anything to do it.
Speaker 3 (09:03):
Because putting subtitles on pirated media can sometimes be
really hard. So they said, we have something that analyzes
the audio that's being spoken in whatever media you're watching,
and we will.
Speaker 2 (09:12):
Put subtitles up for you.
Speaker 3 (09:13):
We walked up and we're like, so what do you
have here? Like, we are not selling anything, We have
nothing to sell.
Speaker 2 (09:20):
And in this beautiful, they're French. So it was in
this, like, yeah, yeah, wonderful accent. I'm not going to
be able to convey the degree of, like, I don't give a fuck
about anything else at this stupid goddamn show that they
gave off. They exuded it.
Speaker 3 (09:33):
And they're they're they're by far the coolest because of
something Robert you said to them.
Speaker 2 (09:37):
I walked up and I was like, VLC's a very
popular app. They just crossed six billion downloads. I've been
using them for almost as long as you've been alive.
And I walked up and I was like, I've been
using your product for fifteen years in order to pirate media.
And they said, very nonchalantly, keep.
Speaker 3 (09:52):
Going, keep it, keep going, keep doing that, keep doing that.
Speaker 6 (09:58):
I'm obsessed.
Speaker 4 (09:59):
Yeah, that's amazing.
Speaker 5 (10:00):
I'm also sad about this, too, because, like, it's a
good app. I have also used it. I saw the
guy in the hat and I was like, Oh, it's
the VLC from you know, from on your desktop. And
then I was like, that's that's stupid. I don't need
to talk to the man. He's wearing a hat and
a cape. And I'm glad that you followed through. As
a journalist. Pushed aside your instinct to be like, do
not approach stranger in a cape.
Speaker 2 (10:19):
Garrison does not have that instinct. Sorry, no, no. To
the contrary, I feel a magnetic attraction. This is
why I keep an AirTag on them. Great way
to get abducted.
Speaker 3 (10:33):
I think similarly, obviously, all of the AI stuff for kids,
all of like the AI slop is like obviously bad.
We've talked about that a lot already. The other thing
that's like kind of like the worst is so much
what you said ed like a level of surveillance tech.
I tried out multiple AI systems that are supposed to
like detect and predict behavior based on facial expressions or gesture,
(10:55):
and this is really tricky. There was one at Eureka Park.
It's a Korean company that's powered, I believe, by Samsung with
money, and also they have access to, like, their training data.
They're called Visomatic. And specifically why this exists: it is
a camera that you can put on, put on
a computer. It will, it will detect where your face
(11:15):
is pointing and where your eyes are paying attention to.
And the reason why this exists is for online test taking,
so people don't, like, look at their phone to,
like, cheat. So it tracks where your
eyes are moving, and if your eyes look down too much,
it's gonna flag it as someone possibly cheating. So this
was obviously, like, introduced after the pandemic. There's a lot,
a lot of online test taking. Samsung uses this tech
(11:36):
themselves for any kind of like online exams that they
as a company will put on, you know, whether it's
like for people, students, employees. But they also had like
other features where you could switch it. I assume it's
doing all the same work, it just is placed differently
on the monitor, and instead it can, you know, do
like object detection, you know, what you're wearing, and the
general, like, behavior analysis if you seem like you're behaving suspiciously,
(11:57):
which is something that we tried at the SK booth,
which is also a South Korean company, for their own, like,
surveillance detection. But I asked Visomatic, like, what
kind of use cases do you see for this beyond
test taking? You know, like yeah, general surveillance, Like yeah,
we want to learn how to like predict or like
analyze potentially suspicious human behavior. As we were walking by
(12:19):
the SK version, one quite funny thing is, as
I walked by it, it first identified me as
a blonde woman holding a cup. It then changed and
said blonde person, which I think is pretty it's pretty.
Speaker 2 (12:32):
Deep, very progressive, doing the opposite of a Facebook.
Speaker 3 (12:35):
Yeah, it could sense the pronouns. It's like hmm, maybe
maybe not maybe not a woman, maybe not a blonde person.
But but yes, that was you know, something that was
like quite well done. Specifically the visomatic stuff, like very functional.
It could tell when I was looking at the screen,
when I was looking at my phone. It could tell
(12:56):
from like various various angles, like what what I was holding,
what I was looking at, where my attention was being directed.
Speaker 1 (13:02):
Like it was.
Speaker 3 (13:02):
It was very well done. It was very accurate. But
you know, possibly scary.
Speaker 2 (13:07):
Well, speaking of possibly scary, the sponsors of this podcast
don't know who they are. Could be the Washington State
Highway Patrol again, in which case, thank you boys for
your noble service on our nation's roads. I'm not saying
that because I got pulled over the other week and
I'm really trying to fight a case right now. I
would never do that anyway. Thanks guys, and we're back.
(13:42):
Is it my turn? It is your turn?
Speaker 3 (13:44):
Okay, So I'm going to introduce our special white woman
correspondents to give us some exciting breaking news in the
white woman tech development world.
Speaker 4 (13:54):
Okay, so the first one is positive. Sure, context: I'm
a trans woman.
Speaker 7 (13:59):
And one of the booths that was pretty interesting, it
was this group. Were they French, Gare, do you remember?
Speaker 3 (14:06):
I, I, you know, they're, they're, they're European. They're called
E-L-I, Eli Health. It's Eli.
Speaker 4 (14:14):
Anyways. This is an at-home hormone tester, so it
is saliva-based. It's like a little disposable package. Currently
they only advertise cortisol and progesterone, but they have plans
for estradiol and other hormones as well, testosterone as well.
Sorry, and yeah. So you swab your mouth in the
(14:38):
morning or evening and then you wait, what was it like,
fifteen minutes, twenty minutes, and then you scan this little
like QR thing on the device and your phone calculates
what your levels are. And this has very interesting implications
for like the DIY hormone market or use case. I
(15:02):
started DIYing and did my own, like, blood tests, but
a lot of like trans kids don't have access to that.
So this is, it's a good idea if it's
actually effective. Like, we don't have hands-on yet, we
haven't tested it yet, but I would love to do
a comparison of, like, testing my own levels and then
trying this. Very interesting, very intriguing.
Speaker 3 (15:24):
Yeah, we will certainly as soon as possible test this
compared to the regular like mail in blood tests, which
is like the current way to do it, but that
requires shipping your blood to a laboratory and that's maybe
not always the best or even like convenient. So being
able to test this just at home without shipping any
of your DNA to some random laboratory would be really
(15:44):
really cool.
Speaker 4 (15:45):
Right. There's no insurance involved. This is completely, supposedly, closed source.
Speaker 2 (15:52):
From what y'all were telling me earlier today, when you
explained this to me, it sounded kind of like the
people making this have an understanding of the dangers inherent,
particularly to the trans community, and why they might want
to use this, and a focus on privacy for that reason.
I didn't.
Speaker 4 (16:07):
Press them on that because I don't know, I follow.
Speaker 2 (16:12):
You know, wide variety of Yeah.
Speaker 3 (16:14):
No, we tried to extract as much intel as possible
about kind of what their future plans are, but, but
not specifically, like, on that level.
Speaker 2 (16:21):
But privacy, like, they seemed like they had a reasonably good.
Speaker 3 (16:25):
Understanding, of course, because it's, because it's, because it is
your own, like, DNA and hormones, you know. Like, I,
I do not know if this company is even thinking about
trans people, if it is trans-friendly, but it could
be used by trans people regardless.
Speaker 2 (16:36):
Yeah, much like a Glock. Exactly, exactly.
Speaker 4 (16:41):
The potential is great.
Speaker 7 (16:43):
And then probably my least favorite goofs. I have to
call out some other white women, my SoCal boho
white women. It's EVject, and that's spelled E V
J E C T, okay?
Speaker 4 (16:59):
And what this is?
Speaker 7 (17:00):
Oh god, yes, this is a special plug for the
charging port of your EV. So the idea is a
nefarious party sees you and your fancy EV and approaches
you and you need a quick getaway. Their words, their words,
their words, by the way. Like, they see your fancy
(17:22):
car and you're targeted. So this device will, like, eject, so
you can just drive away from the charger.
Speaker 2 (17:33):
Leaving broken pieces of plastic in both? Yes, the charger.
Speaker 7 (17:38):
Like, this is not single use? Yes. So yeah, all
those, all those people targeting SoCal white women.
Speaker 5 (17:49):
This is, finally someone is serving the community of people
that think that if you find a zip tie on
your car door handle, MS-13.
Speaker 3 (17:57):
That was the first thing I said as soon as,
soon as we walked away. I was like, this product
wins the Cool Zone Media Award for the most white-woman-specific,
right? Like, if you see a slice of cheese
on your windshield.
Speaker 2 (18:09):
You're already being targeted, drive away.
Speaker 3 (18:11):
This is, this is that exact demographic of people who
think they're going to get trafficked in, like, your local,
like, Olive Garden parking lot.
Speaker 8 (18:19):
Gang stalked Americans in the Tesla charging station in Brentwood, California,
where the average income is like in the eight figures.
Speaker 2 (18:30):
I gotta say, though, you're being very unfair to them.
It was so nice of them to put down the
phone where they were doomscrolling TikTok, to look at
all of the different reasons their kids are going to
be abducted, to talk about this.
Speaker 5 (18:39):
Product, you know, they're like, some finally someone's gonna do
something about it, create a disposable piece of plastic.
Speaker 6 (18:46):
You know that guy who's always sitting down in the
gym and the coffee shop and the gas station? This
is how he doesn't get you.
Speaker 4 (18:54):
I'd be careful.
Speaker 2 (18:55):
I feel like I could have upsold them, and like,
what if we put some explosives in this, you know,
really, like, keep them off of the car, like a flashbang
inside of it.
Speaker 6 (19:05):
Called the police station and took a picture of him.
Speaker 2 (19:08):
And someone scared a lady driving a Vibe. That's one
of the electric cars.
Speaker 3 (19:13):
Right. It's like a crocodile tail. As, as it ejects,
it whips around, immobilizing anyone in the vicinity.
Speaker 2 (19:22):
We're calling it the Iguana, and it does so with
enough force to break a grown man's leg. Yes, okay, David Roth.
Speaker 5 (19:31):
So there's a lot of, I guess I gathered, less
than in years past, that this was at one point
like basically a car show. There is not a lot
of transit stuff this time around. I didn't get to
see very much of it, but I did have. I
guess this is both my best and my worst experience,
the most powerful transit experience of my life.
Speaker 2 (19:48):
So I live in New York City.
Speaker 5 (19:49):
I take the subway pretty much everywhere I go, and
you know, it has its ups and downs. For the
most part, it's good. It moves like twelve hundred people
through a tunnel at thirty-odd miles an hour, and
for the most part, everybody leaves everybody else alone or, you know,
watches videos on their phone and stuff. But I
knew that there had to be a better way, and
at the Las Vegas Convention Center I got to experience it.
(20:11):
You're familiar with Elon Musk, serial entrepreneur. Yeah, so he invented
something called the Hyperloop, which is a car that
goes through a tunnel that's the exact same size as
the car, at eleven miles an hour, and it takes,
someone has to drive it, and also someone has
to help you get into the car, but you can
fit up to three additional people into the car, so
that ratio is, everyone, I know. Yes, right. So yeah, you
(20:34):
got two people moving three people two hundred yards at
the speed of like a brisk walk.
Speaker 2 (20:42):
Now, David, this kind of technology wasn't possible just a
few decades ago, right? Exactly. I mean, this was the sort
of thing.
Speaker 5 (20:46):
Though there had been tunnels, they were mostly used by animals,
voles, miners, yes, right, and thought, yeah, and that was
mostly for pirates in at least one movie I saw recently. Yes,
but no one had thought about it as a transit
sort of thing. It was more of, like, a
place where you would go if you needed to get
copper, of course. But in this case, so this
is, like, where it's good to have, and this, I
(21:08):
guess every CES is like this. This was my first,
to be reminded that there are visionaries out there who
are like, what if you put car through hole? What
if instead of a thing that moves multiple people at once,
you had a thing that took exactly the same number
of people to move that number of people slightly more
than... Yes. Yeah, so that was cool. I mean it's
(21:31):
just like fun to see like where this stuff is going.
And I really wonder if we're not going to start
seeing things like cars on the streets of American cities,
you know, like, it could be. Okay, David, I mean,
most of the, obviously, this is something we'd gone over,
there was like three or four good things, you all
said them. I thought the accessibility tech stuff was the
(21:51):
the stuff that made me feel good about what was
going on here, and there was a great deal of
stuff that made me feel like pretty bad about what
was going on. Yeah, up to and including, like,
the surveillance stuff, beyond the, you know, like, advanced
Samsung-powered snitch tech so that anybody, whatever, your boss,
can tell if you're really looking at the
Speaker 2 (22:10):
Zoom that you're on.
Speaker 5 (22:11):
Don't really love that personally. But for me, a
lot of the smart home stuff is a real drag,
like, just in the sense that
Speaker 2 (22:18):
It clearly, first of all, beyond being.
Speaker 5 (22:20):
Like sort of unnecessary, there's a level of just willingly
giving over your agency over the small moments that make
you know human life human life, and just being like
I would really love it if just like an artificial
intelligence could pick my pants out for the day. I'll
simply stand here waiting for that to happen. Yeah, just
fucking grim actually, like didn't really care for it. I
(22:43):
feel like you gotta like, what are you using that
time to do?
Speaker 2 (22:47):
Yeah? What are you getting? What are you optimizing out of
yourself by not having, like, pieces of, like, the thing
that a human being does, which is like pick your clothes, right? Yeah,
I wonder how you feel about this, because you and
I have been going to CES for, I guess,
a broadly similar number of years. Like, I've never been
to CES.
Speaker 5 (23:04):
Oh really, this is your... no, I'm a fucking sports
writer, man. Like, I'm out here because Ed got
me a folding head.
Speaker 2 (23:10):
I have, like, the dead eyes of the veterans. Oh yeah, well
I'm very tired. Yeah.
Speaker 5 (23:14):
This is the thing with like I think, as far
as I can tell, it seems like it's a loop
where you more or less like you start out it's
too much, you get big-eyed right away, and then
you just sort of feel zombified. But then we
have talked to people over the last few days that
are like, you know, I remember, like, fourteen CESes
ago, that was pretty good, like, and they're also tired
and also deranged by this point.
Speaker 2 (23:35):
Yeah, the first time someone showed me a tablet computer,
I was like, oh man, science has given me everything
I want, like and I guess it's I don't know.
Do you remember like when the last one was that
you felt like even sort of that stirring? Yeah, uh, twenty,
like, eleven or twelve, when they did, I got to
(23:55):
see inductive charging of a car for the first time,
and it like was so big. The Las Vegas Convention
Center is like as the size of a city, and
seeing like the lights in that whole convention center dim
as they were doing it was very inefficient because yeah,
but that was just like that was like, oh wow,
this is kind of like amazing that this is even possible,
(24:18):
but yeah, not really since, not really since. That's why
I'm really glad that there's lights in the Hyperloop tunnel. Yeah,
otherwise it'd be... unless something goes wrong. It started to seem
kind of grim otherwise. Well, the smart home stuff
is interesting because that that has been as long as
I've been going to these they've been trying to sell
people on smart homes, and I don't think I've ever
(24:39):
gotten a good idea of what a smart home is
that I think a person would want. I can think
of two things a person would want right. One of
them is, it would be nice if like I didn't
have to think about playing music. I could just like
tell my house to play the music I wanted and
it would play the music and I could hear it
everywhere and I didn't have to fuss with a bunch
of shit. And the second is, when I'm coming
(25:00):
home from vacation and my house is cold, it would
be nice to turn on the heat or like an
hour before I get home. And one of those things
you'd use every day. And one of those things is
not really viable to base a business off of.
But like they keep trying to find new ways to
stick computers in my house. And I don't know, does
anyone else have anything they want out of a fucking
smart home? No?
Speaker 5 (25:20):
I mean, I like, it's not an accident that my
apartment is basically going to be in the year two
thousand and five forever. Like, I mean, it's it's expensive
to do all this stuff. This is the bit that
with so many of these demos, like just you start
to notice how incredibly grandiose the residences in which all
of this stuff is being sort of like postulated as
being useful is. It's like the, like, Lexus December to
(25:42):
Remember sales event type.
Speaker 2 (25:44):
Energy. Just a big, fuck, what lives do you live? Yeah?
Speaker 5 (25:46):
This also we've talked about this on Ed's show that
like there's a lot of stuff here that feels like
like the first fifteen minutes of a George Romero movie,
like just getting you set for eventually there's going to
be a lot of like, you know, disembowelings and hideous
shambling zombies. Yeah, and smart home, not a bad horror
movie concept. I don't think it's a great consumer concept.
Speaker 2 (26:06):
Yeah, speaking of great consumer concepts, the ads for this podcast,
all right, we're back and I want to close this
out by asking everybody a question, which is, how do
you feel about where tech is going?
Speaker 6 (26:29):
I think we're going to hell.
Speaker 2 (26:30):
I think we are getting there rapidly.
Speaker 6 (26:33):
Very fast into the sweet abyss. I'm worried about the
fact that so much of the tech is oriented around surveillance,
around precursor forms of prepping, around very soft forms of
like perfection and optimization that rhyme with eugenics. I'm like,
(26:55):
I don't like the direction that a lot of this
stuff is going. But also there I don't know what
to do about it because so much of it is
driven by private interest, right, It's like venture capitalists, well
capitalized individuals and firms that they're connected to decide what
we get to get pushed and these corporations, you know.
Speaker 2 (27:16):
Yeah, the nature of like you can you can really
tell that a lot of like the health products are
very optimized for like rich tech executives. Like there's a
lot of sleep products that all relied on you being
willing to, like, bathe yourself in speakers playing binaural beats
while you slept, and, like, different devices to measure you,
like, do an ECG. And it's like, I don't know.
My aunt's not going to do that.
Speaker 6 (27:38):
Oh yeah, you know, like I was, you know, I was,
I talked with my partner about this. They have type
one diabetes, they have a CGM, They use it constantly,
and we've been talking about and thinking about writing
about how there's been a crop of devices that are
like trying to push this idea that you need
to have close monitoring of it to preempt whether you
are going to be prediabetic, or to optimize what you're
(27:58):
eating throughout the day, but that you know, when you
actually dig into what they're doing, it's like part of
this track of rhetoric where it's like whoa, you know,
if your sugar slightly goes out, it's because you're being
a bad person. It's because you're eating the way that
you shouldn't. It's because there's a moral failing or character
failing there that this tech can help purify you of
and you can be your best self, which is really
just like not large.
Speaker 3 (28:20):
You know, And.
Speaker 6 (28:23):
I feel that sort of rhetoric lurking behind a lot
of the biometric surveillance stuff, even though there are applications
that are not that.
Speaker 2 (28:30):
Yeah, you know, it's kind of focused on like the sin,
the health sins that you're committing. We spent a decent
amount of this week hanging out with a Catholic priest,
and I do feel like several tech companies were the
ones trying to sell us indulgences.
Speaker 3 (28:42):
Right, yeah, yeah. All right, Gare. There's small improvements for
consumer tech, right? This is a very consumer-based, or
it's supposed to be a consumer-based tech show. There's
products like the Shokz headphones, which every year get a
little bit better. I tried their bone-conducting headphones last year,
which are very good.
Speaker 2 (29:03):
They work underwater, if you're deaf in an ear you can hear music
the way you used to be able to.
Speaker 3 (29:08):
Yeah, very cool stuff. This year they have what they
call air conduction. I don't quite know how it works,
but it does work. You can't hear it if you're
standing like two, three feet away. There's no sound bleed,
but I hear music in the middle of my head
despite not having to put an earbud actually, like, in
my ear. They're super useful. They work great, really good
sound quality, durable.
Speaker 2 (29:28):
I'm on year two of the same pair that I
run with every single day, like sweat rain. Great products.
It's like small improvements.
Speaker 3 (29:36):
Right, It's not necessarily like revolutionizing hearing, but it's it's
very, very small improvements. Whereas the other kind of big,
big trend, which isn't necessarily, like, wholly consumer-based, it's
kind of what these larger companies are trying to move towards,
is I feel like they're trying to replace friendship with
(29:58):
this form of, like, technology and, like, AI-enabled technology.
You used to have friends to get recommended new music.
You used to have like friends to like tell you
about new stuff that they're interested in.
Speaker 2 (30:08):
No longer.
Speaker 3 (30:09):
Now now you have an AI agent that.
Speaker 2 (30:10):
Can do that for you.
Speaker 3 (30:11):
You, you don't need, you don't need friends to help
kind of talk about, you know, you had, you had
a rough breakup. Instead, you can have a short-term replacement.
Using AI, you can have a friend replacement or a
girlfriend replacement. It's all these things that are trying to
like replace the core concept of friendship, even as, like,
even as, like, a baby, even as a toddler.
(30:33):
Your first friend doesn't need to be people you meet outside.
It can be this little hovering robot you have in
the living room that can also organize your fridge, tell
you what to do, tell you what... it'll roll around your
house in the middle of the night with cameras and
that could be your first friend. It's replacing the core
concept of friendship. It's this move towards complete optimization of
(30:54):
every aspect of human life, to be as smooth as possible,
that completely ignores like what it means to be human.
Speaker 2 (30:59):
It's the fascinating difference between that elder care robot,
ElliQ, which was clearly made by a man with a tremendous
tremendous amount of empathy trying to design a device to
help people, and what I usually see with AI, which
is trying to design a device to remove the need
for human empathy. Like, I went to, there was
a vet app called Laika that's, like, ChatGPT for veterinarians,
and they were like, yeah, you know what, most of
(31:20):
it we focused initially on like technical questions, so like
if I have these symptoms, what can that mean? But
what vets kept asking us is, like, we would really
like advice on how to talk to people that their
pets are going to die. And I was like,
are vets not getting that out of vet school? Because
that's, like, that's a big part of being a vet.
Like, do they need ChatGPT for this?
Speaker 3 (31:40):
I saw this other company that was like it was
designed to help you get over the loss of your pet,
where you could, you could pump tons of photos of
your pet into this AI generator, and it will
generate new images and this is proven to help you
move on from loss. Which is literally a Nathan Fielder bit
(32:00):
from like seven years ago, seven years ago, and like, no,
you should talk with your friends about that. That's why
you are a human. That's how you can move on
from loss. You have to make new connections. Poorly AI
generated images of your cat aren't going to help you
move on. Like, what? Why? Anyway, replacing friendship is the
(32:23):
thing that I see a lot of the tech world
wanting to do, maybe because they don't really understand real
human relationships that aren't like innately transactional. I'm not sure,
but that is like a huge trend that I've seen
multiple multiple people mention.
Speaker 7 (32:36):
All right, Zi, So I've worked in this industry for
like three years now, and this is my first big convention,
and I would say this has just affirmed pretty much
all of my disillusionment with the tech world, and most
of it's just nonsense. And maybe the positive people are
(32:56):
onto some stuff.
Speaker 2 (32:59):
Well, you say that, but I really do think Veradox
is gonna revolutionize the way in which mysterious fogs
kill large numbers of... Maybe, but don't name it something
so sinister. Yeah.
Speaker 5 (33:12):
Yeah, if you were to be like, this is the
thing that keeps your apples fresh for a long time,
that would be great.
Speaker 2 (33:16):
I would just, don't call it Veradox. Yes, but
call it Apple Fresh.
Speaker 3 (33:20):
By the way, you should listen to, you should listen
to Better Offline to hear context for Veradox, which we
discussed in the last episode of our daily CES coverage
over there with the wonderful Ed Zitron. But essentially, Veradox
is this mist that gets sprayed on produce, which allegedly
helps it stay shelf-stable for a few more days. Exactly.
Speaker 2 (33:41):
So maybe that shelf-stable mist will also translate to
waking up the dead.
Speaker 5 (33:45):
Possibly. But you don't know what it's gonna do. You
don't know what it's not going to do, right? Right.
Speaker 2 (33:50):
Well, as journalists, that's our job, to ask these questions.
Speaker 3 (33:54):
And we discussed that way more in depth on Better Offline.
Speaker 5 (33:57):
Yeah, we do discuss whether the ability to bring leaf
lettuce back to life does have any repercussions in
a Pet Sematary sort of way for your possibly dead
loved ones.
Speaker 2 (34:08):
David, it's me sorry. A lot of good points.
Speaker 5 (34:11):
I mean, I think Gare and Ed both made the point
about the sort of sociopathic like thread of a lot
of this just sort of like an inability to understand
not just what people might want from a technology, I think,
which is to feel not I mean, there probably are
people I imagine that. It's like if you were the guy,
the dude that's like trying to age himself backwards. You know,
(34:31):
he's like Bryan Johnson. Bryan Johnson, we love Bryan. Yeah,
but he like I feel like he would have been
walking through this clapping his hands with delight.
Speaker 2 (34:44):
Day king his son's blood yeah, time for Yeah, it's
a it's I drink his son's blood pretty Ray's And
it's not bad the.
Speaker 5 (34:52):
High quality plasma. But it felt like it was,
that there was a lot of this sort of,
like, an optimization unto, like, transcending being human at all.
And I don't think, I mean, again, there probably are
people that want that. They certainly have money. I don't
imagine that's what I think most people would like. I mean,
you don't expect technology to make you feel more human.
(35:13):
But something I've been thinking.
Speaker 2 (35:15):
About a lot.
Speaker 5 (35:16):
We're talking about this a lot on a better offline,
but there's a passivity that a lot of this sort
of seems to be forcing onto people where you're just like,
sort of things are happening to you that make your
life more efficient and convenient. And I don't think that
I want that. I mean, I'm older than and poorer
than the you know, market that I think they're aiming
(35:37):
for with this, but I'm certainly old enough to remember,
as you said, like finding music, like that's a thing
that yeah, you know, your friends tell you about it.
And in my case, I mean again just being in
my middle age, you like go to a store and
you flip through shit, like there's a distinction between finding
something and being given something or being fed to something
like you're a foie gras goose, and it's just getting
(35:58):
sort of piped into your brain and life and being.
And I think it's an important distinction. I think that
little bit of agency of having some sense of doing
the things that you want to do, like I would
imagine that well, I don't have to imagine technology that
helps you do that as opposed to doing it for you.
I think that I don't want stuff that makes me
(36:19):
feel less human. I don't want stuff that makes me
feel more like I'm in a fucking matrix pod. And
I think that a lot of the stuff that was
out there seemed targeted towards the matrix pod dwelling community.
Speaker 1 (36:32):
Yeah.
Speaker 2 (36:33):
I think that's that's about the best line we could
go out on. Like, that's, that's, yeah, nailed it. Thanks.
I thought I crushed that one. Yeah you did. You did,
great job, Dave.
Speaker 3 (36:42):
Where can people find your work, Dave?
Speaker 2 (36:44):
Defector dot com. Why'd you let me do that without crushing
my... No, no, no, no, that's okay.
Speaker 3 (36:48):
Let me.
Speaker 2 (36:48):
That's a load bearing piece of content. Defector dot com
is the website, and at.
Speaker 6 (36:53):
Big Black Jacobin on Twitter and Bluesky.
This Machine Kills is my podcast. Tech Bubble dot com
is the newsletter.
Speaker 2 (37:02):
Hell yeah, do you want to tell people how to
find you?
Speaker 4 (37:06):
Zi at New Old Woman on Twitter with okay zeros
zeros for.
Speaker 2 (37:11):
Neo zeros All right, everybody, Well, uh, that's going to
do it for us here at it could happen here
and our week at CES. You know, just try to hug,
hug your loved ones until the Veradox sweeps through
all of their homes and neighborhoods. Yeah, oh no, it's in
the room. It's in the room.
Speaker 1 (37:31):
It Could Happen Here is a production of Cool Zone Media.
For more podcasts from cool Zone Media, visit our website
Coolzonemedia dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you listen to podcasts. You can
now find sources for It Could Happen Here, listed directly
in episode descriptions. Thanks for listening.