
January 9, 2026 132 mins

Welcome to Better Offline’s coverage of the 2026 Consumer Electronics Show - a standup radio station in the Palazzo Hotel with an attached open bar where reporters, experts and various other characters bring you the stories from the floor. 

In Thursday’s second episode, Ed is joined by actress and standup comedian Chloe Radcliffe, Garrison Davis of It Could Happen Here, writer Westin Lee, Robert Evans of Behind The Bastards, Ed Ongweso of the Tech Bubble Newsletter, and tech writer Rob Pegoraro to talk about how this CES feels like something is missing, building things for actual people, how everybody is talking about “incremental” improvements in AI, OpenAI’s similarities to Theranos, and the problematic nature of mass-automation.

EXCLUSIVE CES SALE! Get a *permanent* $10 off an annual subscription to my newsletter through January 13, 2026: 
https://edzitronswheresyouredatghostio.outpost.pub/public/promo-subscription/cue848p5sc

Ed Ongweso Jr.: https://bsky.app/profile/bigblackjacobin.bsky.social

Rob Pegoraro: https://bsky.app/profile/robpegoraro.com

Westin Lee: https://bsky.app/profile/westinlee.bsky.social

The Tech Bubble Newsletter: https://thetechbubble.substack.com/ 

Chloe Radcliffe: https://www.instagram.com/chloebadcliffe/?hl=en

https://punchup.live/chloeradcliffe

Donate in Sean-Paul’s honor: https://www.perc-epilepsy.org/

Here’s the robot we talked about: https://bsky.app/profile/iwriteok.bsky.social/post/3mbwvlat7dc23

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

Email Me: ez@betteroffline.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Media.

Speaker 2 (00:03):
Damn, they got me feeling like white Michael Clayton. I'm
Ed Zitron, and this is Better Offline's coverage of the
Consumer Electronics Show. We are, as we have been all
week, here in the beautiful Palazzo Hotel in beautiful Las Vegas, Nevada,

(00:27):
bringing you yet another episode covering CES with the crazy
assortment of guests from the tech industry. We've got an
open bar, tacos and places to sit down for the
members of the media, whether they join us on the
microphone or not. This is our second episode of the day.
It's Thursday, and we've got two more on Friday and
an epilogue on Saturday. Then they're going to roll me
into one of the aqueducts here and flush me into
the sewer. But before they do that, it's time to

(00:48):
introduce the latest contestants on the CES Challenge. Joining me
as ever is stand-up comedian and star of Is
This Thing On? Chloe Radcliffe. No, no, not any more.
No, not unfortunately, no.

Speaker 1 (01:02):
Very fortunately, I'm still here. You can't get rid of...

Speaker 2 (01:05):
Me, and we will not be. And of course, It Could Happen
Here's Garrison Davis. Hello, good morning. And tech
writer Rob Pegoraro joining us once again.

Speaker 3 (01:13):
It's good to be back.

Speaker 2 (01:14):
It sure is good to have you. Lovely to see you.

Speaker 3 (01:17):
So yeah, what have you been up to?

Speaker 2 (01:19):
What have you seen today?

Speaker 3 (01:20):
I've been at LVCC all day. What did you see at LVCC?

Speaker 4 (01:24):
Oh my goodness, I just realized I said good morning
because the CES time distortion has started to kick in. Oh.

Speaker 2 (01:29):
Absolutely, well you were in like Tokyo a week ago,
so your time is.

Speaker 4 (01:32):
Well, the best... best is an interesting word. The most
interesting thing I saw today is I happened to arrive
at the LG booth right as their Home Assistant Robot
demo started for the first time.

Speaker 3 (01:47):
Cloyd Lloyd cloydge.

Speaker 4 (01:50):
So I got to see that in action, right,
for their first demo.

Speaker 2 (01:56):
Walk me through the action part of this action, Like,
how was the action?

Speaker 1 (02:01):
Maybe it's more proper to say I got to
see Cloyd in... oh, it was.

Speaker 3 (02:06):
It was action.

Speaker 4 (02:07):
It was breathtaking, because it's going to change so many
people's lives. How so? So they gave us three different scenarios.
One is you have this clumsy dad who always forgets
his keys, and the robot is able to find the keys,
then put them somewhere else, which is an interesting solution

(02:30):
to the problem of losing your keys, because of how it was
framed. So this clumsy father of two
left his keys on the couch. Then the
robot walks over to the couch and grabs them from
the couch and moves them to a different location in
the house, which seems like a machine that just
makes you lose your keys.

Speaker 2 (02:48):
The jester AI: I am here to add chaos to
your life.

Speaker 4 (02:51):
So it was framed as finding your lost keys, but it's
just the robot moving keys around.

Speaker 2 (02:54):
Right? What speed did it do this at, would
you say? Uh, just like real slow. It's very slow.

Speaker 4 (03:04):
The second segment was like a rich single guy who
doesn't have anyone living with him, who has this robot
as a quote-unquote roommate, and the roommate mostly does laundry.
And I watched this robot spend, I kid you not,
ninety seconds putting a single shirt in the washing machine

(03:27):
and then also spend, I would say, about two minutes
attempting to fold a towel. How did it do? Again,
I used the word attempted. Yeah, it got about
half the towel folded after two minutes and then was
directed towards the next part of the demo, so it
could not finish the folding-a-single-towel task. I'm
sure if you give it five minutes, it could fold

(03:48):
a towel.

Speaker 1 (03:49):
And that's actually a high amount of confidence based on
everything you described.

Speaker 2 (03:54):
Yeah, we trained it entirely on stoner data.

Speaker 3 (03:56):
Yeah, it moves like someone who's had way too much weed.

Speaker 1 (04:00):
Uh.

Speaker 4 (04:01):
And then by the time the third demo started,
I had to go.

Speaker 2 (04:06):
I had enough.

Speaker 4 (04:07):
But it was like a middle-aged woman whose
robot was like trying to get her to work
out more.

Speaker 2 (04:15):
I saw that one and I mentioned earlier it was
like here you go and it's got like one pound
weights and she goes, thanks Cloyd. If you didn't give
me this, I wouldn't even move.

Speaker 4 (04:23):
The most interesting part of the demo was...
and like, I got to see the robot, you know,
try to do a lot of these things.

Speaker 2 (04:30):
That wasn't just on the screen.

Speaker 4 (04:33):
But it was... the presenter kept reiterating that Claude knows
what you want before you even have to say anything.

Speaker 3 (04:41):
Oh, that's not creepy at all.

Speaker 4 (04:42):
And that was the thing, because the Cloyd
gets to observe you and learn your patterns and learn
your habits, what you like, and you can also input
this data.

Speaker 2 (04:49):
And it was.

Speaker 4 (04:51):
During the towel-folding segment he even made a joke,
because it was taking too long.
It was like, ah yes, Claude was able to fold
the towel exactly... exactly.

Speaker 3 (04:59):
Like, I'm like, is this Cloyd the robot or
Claude the AI?

Speaker 2 (05:02):
Yeah, Cloyd Kloyd Kloyd.

Speaker 3 (05:04):
Does Cloyd run Claude?

Speaker 2 (05:06):
I mean, probably not. I mean, it's slow and expensive,
so I guess. And also, just from my understanding, this
thing is fucking huge. It's big, giant, it's big. It's
offensively... it looks like it's, like, me-sized.
Yeah. Yeah, but it's also super wide, like a big scooter, like, yeah,
the kind of scooter you see.

Speaker 5 (05:24):
It's like a sort of stormtrooper turned, uh, butler,
I guess. Yeah, but it learns what you need, other than,
like, a little rapidity.

Speaker 2 (05:34):
Yeah, like how to fold it faster than seven minutes, or correctly?

Speaker 1 (05:39):
Yeah?

Speaker 3 (05:40):
Yeah, And.

Speaker 4 (05:42):
The presenter made a joke, because Cloyd
wasn't able to fully fold the towel. He
made a joke about how that's how he actually prefers
his towels being folded. Because this was like
an ongoing statement he kept making.
It was like, you know, Cloyd knows how long I
want my croissants cooked, so he can bake my croissants

(06:02):
exactly how long I want them. He also, uh... Cloyd prepared a snack during the demo,
and the snack was taking milk out of the fridge
and putting the milk carton down on the counter, because
I do not know if the robot fingers were able to.

Speaker 3 (06:17):
Unscrew the milk. Ah, my snack of milk.

Speaker 4 (06:20):
Like, the milk carton, actually. So it was just taking
the milk and putting it on the counter. But for
the wife and father of two... for the couple
and their kid demo, the robot was like, you know,
if you're too busy, the robot can make breakfast
for your family in the morning. Well, the robot can

(06:40):
put a tray of croissant dough, pre-prepared,
in the oven, so you have

Speaker 1 (06:45):
To put that.

Speaker 4 (06:45):
You have to put the croissants on that? Unclear. No,
the robot can, as long as the dough magically exists.
Sure, Cloyd can walk towards the oven, the oven
will automatically open with the new LG stuff, and then
put the tray in.

Speaker 1 (07:00):
But, as the... you still have to line up the
croissants, and you also have to unroll the dough. It can do
the easiest part. I'd rather have drone delivery of croissants from a
really good place.

Speaker 2 (07:12):
The part we saw was moving the tray into the oven.

Speaker 4 (07:15):
How the croissant dough got on the
tray, complete mystery. How the milk got from
the carton into the glass, also, we'll never know.

Speaker 1 (07:25):
My question is, did the nice actor who
they hired to lie... well, acting is lying, exactly, and
he is good at his job... did he seem, uh, like,
self-aware at all? A little bit, like he seemed
a little embarrassed. He did, he...

Speaker 2 (07:45):
Did a good job.

Speaker 6 (07:47):
These are also Vegas people, so just think about
the paycheck. When the robot spent ninety
seconds putting a shirt in the
washing machine...

Speaker 1 (07:58):
Uh.

Speaker 2 (07:59):
It was framed as Cloyd being extremely thorough.

Speaker 7 (08:02):
Hm.

Speaker 2 (08:02):
Look, look, look how thorough. Yeah, it's framed as, it knows my specifications.
I like shit done wrong and slow. Okay, I like
my stuff done poorly. It's like, if I needed someone
to do something slowly and poorly, as I said, I
would do it myself.

Speaker 4 (08:15):
The odd thing is, like, there was so much
robotics in the LVCC this year, like, North Hall full
of robotics, and it's robotics that I've...
you've seen at CES before, but it's always been,
they've always been kind of janky, and you haven't seen,
like, big companies like LG really mess with it because
it's all kind of janky. And I was just kind
of surprised that LG is actually, like, dipping their toes

(08:36):
into this, because it's not necessarily new technology,
but it's at the same level of jank as
it was, like, a year and a half ago. But
for some reason they decided to, like, go for it.

Speaker 3 (08:46):
They thought nothing else.

Speaker 2 (08:47):
I genuinely... it's just headline grabs. I fucking
read a Wall Street Journal story today. I won't even
name the reporter, because he should know better. But it's like, yeah,
this is the year of the robots, and, one of
the things, I swear to God, most of the robots
they mentioned didn't exist. It was like Cloyd. I think
they mentioned Cloyd. I feel like it's irresponsible for anyone
to report on Cloyd as if it's real. Like, it's

(09:09):
real in the sense that it's physically in the...

Speaker 3 (09:12):
Las Vegas Convention Center Central Hall, LG booth.

Speaker 2 (09:16):
But it's like, this is not something that can be
purchased or is sold.

Speaker 3 (09:20):
Or even, like, that it does anything.

Speaker 2 (09:22):
The very simple thing. I mean, I feel like
I'm going insane.

Speaker 1 (09:26):
I made this point last time we were talking about
Cloyd, on some other episode, I can't
remember, several years ago. Yeah, truly. But so I
know that this is a repeat point for anybody, for
the real heads out there. But the keys-finding thing.
For the robot to be able to find the keys,
that means that the keys must have some kind of

(09:48):
AirTag chip on them. Like, that's how the robot...

Speaker 2 (09:51):
Maybe. It's possible, in

Speaker 1 (09:54):
Which case you go, how about you just put an
AirTag on your keys.

Speaker 4 (09:58):
Sure, I think Cloyd might be able to find the
keys via just visual recognition. Like, Cloyd was sorting
laundry by color. Cloyd was washing certain items but not washing others.

Speaker 1 (10:07):
It just feels like if if I can't visually see
my keys.

Speaker 2 (10:10):
Well, you know you're you're a busy dad. You don't
have time.

Speaker 1 (10:14):
I am a busy dad.

Speaker 2 (10:14):
You don't have time. That dad and wife are busy
with AI... but the busy productivity woman, no woman is, because
at CES twenty twenty six, she was given
pink battery packs.

Speaker 1 (10:27):
I was, I was milling around a little
power bank thing, and he pointed to the pastel
colors and was like, this is it. It looks like
a foundation, it looks like a makeup. And I was like, ah, yes,
that's for girls. And I pointed to a black one
and I was like, that make me go no bad,
but this make me go yes. And he was like,

(10:48):
could not see it. He just kept telling me,
power banks for girls.

Speaker 3 (10:52):
It's woke. Whoa because it's truly good, truly dead.

Speaker 2 (10:55):
Yeah, no, but woke too. And CES twenty twenty seven,
it's gonna be fucking... they're going
to have the gender-forcing machine.

Speaker 4 (11:02):
I mean, yeah, they already have, like, the
accessibility stage this year. Woke two is going to
be bigger.

Speaker 2 (11:08):
Yeah, it's gonna be like the caliphate section and the
gender section, the forced transgender section. That's gonna be fucking great.

Speaker 3 (11:15):
Sure, we're all changing genders next year. It's happening, folks.

Speaker 1 (11:19):
But that means you get a power bank that's whatever color.

Speaker 2 (11:22):
You can get a pink power bank. That sounds fucking great.
We're all changing genders next year. No. So, what else
did you see?

Speaker 8 (11:29):
Yeah?

Speaker 3 (11:29):
You see?

Speaker 4 (11:29):
I mean, honestly, what was more interesting to me
at Central Hall was what I didn't see.

Speaker 2 (11:34):
What was that?

Speaker 4 (11:35):
I didn't see Nikon, I didn't see Canon,
I didn't see Samsung.

Speaker 3 (11:39):
They were not there.

Speaker 4 (11:40):
They don't have booths this year, and, holy, the Panasonic
booth is very bare. Not really any product. There's, like,
enterprise products, there's, like, you know, energy
efficiency products, like software, there's chips, but there's
not many physical products. There's no, like, Panasonic cameras.
And the Sony booth isn't a Sony booth, it just has

(12:01):
cars, in partnership with, uh... They've...

Speaker 3 (12:03):
Been weird in the last several years, like.

Speaker 5 (12:04):
This. But Samsung, but Sony... because that exhibit is somewhere in
the wind.

Speaker 9 (12:10):
Yeah.

Speaker 2 (12:10):
But usually they have like the Samsung megalopolis, like a
giant booth.

Speaker 4 (12:14):
Yeah, not there, and Sony's exhibit is very small
this year, it's just cars. And, well, no Nikon, which
has had huge booths at CES the past,
like, three, four years.

Speaker 2 (12:26):
Yes, I get this weird feeling about this show that
something is up, something is really weird. This thing doesn't
feel right, because I saw, like, a chunk of a
casino with just nothing there and no tables open. I
think it was even at the Palazzo, which is just fucking strange.
But something feels missing. Doesn't it feel like there are
less televisions than last year?

Speaker 3 (12:44):
There is less television.

Speaker 2 (12:45):
Yeah, it's like a television... you could stride around,
like, looking at all the various giant TVs that all
look and sound the same, and that was fun. This
is smaller, weirder. And I mentioned this in another episode.
Usually they keep the country-based stuff in the
basement, in Eureka Park. Yeah, it was top... it's
topside.

Speaker 3 (13:04):
It's okay.

Speaker 4 (13:05):
There was some top side stuff last year.

Speaker 3 (13:09):
But it was much smaller. But there's way, way...

Speaker 2 (13:13):
bigger topside pavilions. Very strange. Above Eureka this year.
This is a bad sign.

Speaker 3 (13:19):
There's a lot of shifts.

Speaker 2 (13:21):
Know this.

Speaker 4 (13:22):
This CES definitely feels very different. And I
don't want to be just, like, cynical for the sake
of being cynical. I want to have a fun time
at CES. I actually... I'm serious.

Speaker 2 (13:33):
I would love that. I would love to see some
shit that makes me go, cool. I would love that.

Speaker 4 (13:37):
And yeah, I don't know, like, I tried on some new,
new-to-me auto-translating earbuds today. I know,
I know I've done a lot of auto-translation stuff
so far, but, uh, these ones were really good. These
ones were fast.

Speaker 2 (13:50):
Do you know who they were by? Let me,
let me look through my notes. It's... no, I must say,
though, I'm not... They were in North Hall. I'm
trying to find them. Yeah, I don't think any of
us are trying to be reflexively...

Speaker 3 (14:04):
Why are you looking up, Rob?

Speaker 2 (14:05):
What are you saying? Let's bring you in here.

Speaker 5 (14:07):
So, yeah, I've now sort of completed my inspection.
I'm flying home this evening. And yeah, so West
Hall is the usual weird exhibit of things with wheels and
sometimes tank treads. It's all the mobility stuff, which was
never part of CES at first.

Speaker 3 (14:22):
Sure, it came here when, God help me, nineteen ninety eight.

Speaker 5 (14:27):
Yeah. Yeah, there's lots of interesting stuff being done with,
you know, portable solar power, like, if we had balcony
solar legalized in the US, that's where you'd see it,
in the North Hall. Central Hall, you know, there's
still some exhibits with TV vendors, companies like Hisense
and TCL were sort of devoting more space to that,
and Samsung has sort of moved off-site. And yeah, Panasonic,

(14:50):
half of their stuff was like, here's what we do
for data centers, which, speaking as a Virginian,
they're not that popular.

Speaker 2 (14:57):
People do not like these. Yes, I imagine you've seen
quite a few of those recently.

Speaker 5 (15:01):
Every time I fly out of Dulles Airport, you can't miss them. And
today I finally got over to the Venetian to look
at the sort of smart home exhibits upstairs. Downstairs is
Eureka Park, which is all the real sort of cats
and dogs, random international, just...

Speaker 2 (15:14):
The random, the straight dogs more correct.

Speaker 3 (15:17):
That's where you have the weirdest stuff.

Speaker 2 (15:19):
Definitely. I think they need to lean into that with
Eureka Park. They need to make it dirtier. They need
to have, like, a dive bar in the corner, a
few stray dogs, like the It's Always Sunny in Philadelphia
episode when they go to Atlantic City and there's just,
like, dogs walking around. I think, like, they need to
make it weird and gross, because I think that's the
only thing that will make it interesting.

Speaker 1 (15:38):
I had the unfortunate order of going to Eureka Park first.
I went the first day that I got here. I
just had an hour before the show floor closed, and
so all I could do was be in the Venetian,
and Ed had been like, oh, the downstairs is
the weirdest, and so I, like, wandered around the upstairs
of the Venetian a little bit, and then I was like,
I gotta hit the downstairs, whatever. Basically that was my intro,

(16:01):
and I was like, this stuff fucking rocks. Yeah, I
mean, it's insane, it's batshit, it's ninety-nine percent
unnecessary, yes, but it's so funny, and I was just
having a ball, having the time of my life. Even
the upstairs at the Venetian has been great. And then
the last two days at the convention center,
I've been like, the fuck is this? Is this useful

(16:22):
shit that you think I'm going to actually incorporate into
my life? Get the fuck away from me. I want the
weird shit that I can make fun of.
Speaker 2 (16:29):
Yeah, I'm sorry for that. I wish the show was
more dopey, but it's kind of... it's sad. I
like the novelty stuff, or I want, like, big
and stupid. I hear there's a giant tractor I somehow missed.
That's very amazing. What did the tractor do? Or
was it just a...

Speaker 3 (16:46):
Giant John Deere combine, or was there another one?

Speaker 2 (16:48):
No, the combine, Yeah, it could be fun if there
were two.

Speaker 5 (16:51):
Big green machine. They actually imported some corn from some
farm in Iowa to place it in front of it
to show you what it actually does.

Speaker 2 (16:57):
Interesting. Yeah, yeah, I wonder why. Like, I feel like
if you have a farm, you probably know about tractors.

Speaker 1 (17:05):
Yeah, but I don't know that the people here looking
at that tractor are people who.

Speaker 2 (17:09):
Have farms, right. But my point is, then why did
you put the tractor there?

Speaker 1 (17:14):
It sounds like.

Speaker 3 (17:15):
Because you can.

Speaker 2 (17:15):
It's a giant green machine. If the reason was
just, like, we can? Fine, I love that. That's great.
But yeah, so these earbuds... Timekettle. Wow,
Timekettle. You could have given me seven million
years, I wouldn't have come up with that name.

Speaker 4 (17:36):
Yeah, they were very fast, like, nearly lag-free conversational translation.

Speaker 2 (17:41):
That's awesome. Do you know how much they cost, or, like,
did they tell you? Because so many, so
many of these places just don't. It's like, when can
I get it? Oh, no, no, no, we don't
do that here. We don't sell things or put prices
on things. We are simply showing you
something so we can raise money and then run away.

Speaker 4 (18:01):
Oh yeah, they definitely, they definitely have been selling them
the past few months.

Speaker 2 (18:08):
And I will I will also go through my notes.

Speaker 1 (18:10):
No, it's okay. Rob, can I ask... solar, balcony solar?

Speaker 3 (18:14):
Yeah, I was.

Speaker 2 (18:15):
Actually this was in the back of my mind too.

Speaker 1 (18:17):
Is that where you have, you know,
tiny little baby solar panels on your porch and you
can sell the electricity back to the electric... the...

Speaker 5 (18:25):
The idea is, you don't. So this is a thing that's
very popular in Germany. I was at IFA in September
in Berlin, and if you walk around the city, you'll
see solar panels on balconies next to flags and flowers
and everything, all

Speaker 2 (18:34):
Over the place.

Speaker 5 (18:35):
And the whole idea is it's very cheap. It's not
enough to like power your whole flat, but it will.

Speaker 2 (18:42):
It's enough.

Speaker 5 (18:43):
You put two, maybe three solar panels outside, you plug
them into an outlet, and that's it.

Speaker 3 (18:48):
And they found some way to legalize it, and
it flows back.

Speaker 5 (18:52):
So the whole term "outlet" is apparently wrong, because it
is bidirectional, an inflow.

Speaker 1 (18:57):
Yes... oh wait, you mean what I think
of as an outlet, that thing?

Speaker 3 (19:03):
We've been lied to our entire lives. They're also inlets.

Speaker 2 (19:06):
And Phil just stepped in, which is to say, they're
lying to you about those plugs that have two spikes.

Speaker 3 (19:11):
Very big on the election.

Speaker 2 (19:12):
You can ask Phil. You agree with this? Yeah,
the ones with the two plugs on the end, you
plug them in, they're legal, right?

Speaker 3 (19:18):
Male to male? Yeah, male to male is illegal in most... in

Speaker 2 (19:21):
most states, male to male is illegal due to woke
electricity people who don't want you to... they're trying to,
in Texas they are. Actually, it's a state flag. It's
a fifty-cal rifle just wrapped in male-to-male. Yes.
So, they're four hundred dollars, four hundred dollars for the

(19:42):
the... I can see.

Speaker 4 (19:44):
I can... do they need to be Wi-Fi connected? No,
I mean, I think... no, they're Bluetooth to the phone,
so, like, your phone will need... but I feel
like they're running the compute through the phone.

Speaker 1 (19:55):
But I'm asking, does the phone need a Wi-Fi connection?
Does it need...

Speaker 2 (19:58):
It probably needs cell.

Speaker 1 (19:59):
It's not an on-device...

Speaker 2 (20:00):
LLM? No, it is... it is not
on-device. I will say, with that use case... I
will say, with that one, if it's immediate, then the
situation you'd be using that in is, like, international business. So
I can kind of get it. Yeah, it was fine,
but I'm kind of getting worried about, like, LLM translation, though, sure,

(20:20):
because it's like, how would I possibly know? It's like,
it gets things wrong in the language I understand, imagine
the one I don't. Yeah. But at the same time,
I like this tech, so, you know, who knows if
that will... exactly.

Speaker 3 (20:33):
I want that. I want that to work.

Speaker 2 (20:34):
But it's like... because the Pebble watch, the
Pebble ring we were talking about earlier, that's on-device
somehow, and that's cool. It's just, like... but the
thing is, on-device is very difficult. I
can understand why people are doing it. It's just, what
happens if this becomes more expensive, which it most certainly will?
But Rob, you've been going to CES since nineteen ninety eight,

(20:56):
as we've...

Speaker 5 (20:56):
just established. I think a lot about the life choices
that put me...

Speaker 1 (21:00):
And what do you think about the show?

Speaker 3 (21:02):
How do you know?

Speaker 2 (21:03):
Seriously, how do you feel about this CES? Like, how
does it compare to even last year, or the year before?

Speaker 3 (21:08):
So honestly I haven't.

Speaker 1 (21:08):
Said how does it compare to nineteen ninety eight?

Speaker 5 (21:10):
Well, in nineteen ninety eight, this was a show about
TVs and stereos, and it was the dawn of digital TV.

Speaker 2 (21:17):
What they took from UST.

Speaker 5 (21:18):
People were very excited about the prospect of being able
to spend, like, four thousand dollars on some thirty-inch
CRT. It would be four hundred pounds, but it would
display, yeah, ten-eighty-i resolution.

Speaker 3 (21:30):
Whoa, whoa yeah.

Speaker 5 (21:32):
And along the way it's become this show about
all these different parts of technology. So there's all these,
you know, electric cars, solar panels, AI household robots. So,
I mean, like, I complain a lot about the logistics,
because it is such a pain in the ass to get around. But yesterday,
in the span of, like, four hours walking around LVCC North,
I talked to a company developing, building a test fusion

(21:54):
power plant in Massachusetts. What else... let me take a
look at my notes. A bizarre sequence of events.

Speaker 1 (22:02):
Let's... wait, sorry, say that again. As in a new nuclear
power plant?

Speaker 3 (22:07):
A nuclear fusion power plant.

Speaker 2 (22:09):
Wow, oh this is non existent.

Speaker 3 (22:12):
It's it's been twenty years away my entire life, you know.

Speaker 2 (22:15):
That's my favorite thing about the nuclear stuff. It's
like, nuclear power is real, but it's, like, fission. Yep.
We are just... we are mere decades away from this,
twenty, like...

Speaker 1 (22:24):
Just fusion fusion.

Speaker 2 (22:25):
Yeah, that's the whole fission-fusion thing. I don't want
to get Phil on it. He actually knows what he's
talking about. We don't do that on this show.

Speaker 1 (22:31):
The only reason I know is because last night Phil
spent an hour explaining that.

Speaker 2 (22:36):
And I was suffering a podcast concussion, so I was... which...

Speaker 5 (22:41):
I know. The... more so, other random conversations had during
this little tour of just LVCC North: a guy with a battery
company in Singapore where they're building much

Speaker 3 (22:51):
of the battery out of paper. Oh yeah, yeah, I
saw this.

Speaker 5 (22:54):
Yeah, so there, so that's really neat.

Speaker 3 (22:59):
Yeah.

Speaker 5 (22:59):
I tried on an exoskeleton today. We've heard from
a few people... which one? Which one is it? Hypershell X.

Speaker 2 (23:06):
I was wearing Hypershell today as well.

Speaker 5 (23:08):
Yeah, and so it's sort of like, you know, when
you get on an e-bike and it's almost
like the hand of God is pushing you down the street.
Same theory, except it's your legs. And does
it feel... honestly, you've used it.

Speaker 3 (23:20):
It feels good?

Speaker 5 (23:20):
I mean, anything to make my feet hurt less at
the show, So maybe I should roll up wearing that
thing next time.

Speaker 4 (23:27):
I got really into the settings. I'm telling you, they
have an app and they have three different modes, and
I'm not sure if it's...

Speaker 2 (23:34):
Robert's already talked about Hypershell. He talked about it
a little bit, but he hadn't, like, really used it.

Speaker 4 (23:38):
We, uh... me and someone else have been trading
off wearing it all day. And so there's this, like,
eco mode, which is like the regular mode that
you can operate it in, and you can change, like, the
percentage of how much power is getting directed to it.

Speaker 2 (23:50):
That changes how much it assists you.

Speaker 4 (23:51):
There's hyper mode, which, if you have hyper mode on
all the way, you are, like, bounding.

Speaker 3 (23:56):
It is it is, it is, it is.

Speaker 2 (23:57):
It is really pushing forward.

Speaker 3 (23:59):
It is fun.

Speaker 4 (24:00):
It's uncomfortable if you were walking slowly, you have to
be like walking fast. But if you're walking fast with it,
it's good. It's like really good.

Speaker 2 (24:07):
I grew up in London. I'm a fast walker.
It's great.

Speaker 4 (24:11):
And you can change the intensity of it.
And you can change, like, the torque from
one leg to another, to make it
different if...

Speaker 2 (24:19):
That if you have like an injury in one leg
or something. Yeah, that's really good.

Speaker 4 (24:22):
And you can change, like, how much input delay
there is, how much it's gonna, like... like, how fast
the acceleration is for when it starts moving. Something
that I was, uh, using to, like, mess with.

Speaker 8 (24:34):
Uh.

Speaker 4 (24:34):
The person who I was walking around with today... they
have, in their experimental features, they have something
called fitness mode, which does the opposite: the machine
resists, restricts movement. And this was really fun, because
I could have them... I could see them, like, running
around on hyper mode, and via my phone I can

(24:56):
change it to fitness mode to completely stop them moving.

Speaker 2 (25:01):
Like The Wrong Trousers with Wallace and Gromit.

Speaker 7 (25:04):
So I can just completely, like, force them to
a standstill, and it works great, and I could
do it from, like,
thirty feet away on the

Speaker 2 (25:16):
App uh And it was fantastic.

Speaker 3 (25:19):
It was really fun. I've I've been looking for more bits.

Speaker 2 (25:23):
That's really... I've been doing this for, like, eight hours.
I've been so annoying with this. I love that they've
got Wrong Trousers mode. That's so good. And honestly, I'm
not a cynic. I love these things.

Speaker 3 (25:35):
Fuck, it was fun. I fucking love these.

Speaker 2 (25:37):
I know so many people with mobility issues, and myself,
I have a fucked-up ankle. I love this. I
love that this feels like future tech.

Speaker 4 (25:44):
And it's small, it doesn't... like, I've
tried on many an exoskeleton at
the CESes I've been to. Uh, this one was
very compact but still functional.

Speaker 5 (25:56):
It wraps around your waist and your legs. Is it really
an exoskeleton? And I'm like, what do you...

Speaker 2 (26:02):
Call it? Mobility suit.

Speaker 4 (26:03):
I don't know, but like it's basically just like an
extra set of hips.

Speaker 2 (26:07):
Kind of right.

Speaker 4 (26:08):
You could see some of them, hips that help,
and the bit that helps push your knee

Speaker 3 (26:13):
Up well, the artificially assisted hips.

Speaker 2 (26:15):
So they've got the hips. That... finally, we found hips
that lie. Yeah, it's... I love that. I love that
they've got these, especially with, like, so much useless shit
here, a way to, like, help people walk feels almost
like the platonic ideal of CES. Like, other than a
giant television, a television so large you can't see all
of it. They go like, yeah, I saw that too.

(26:38):
They just got a big-ass TV. Hell yeah,
do you know what the price was? It wasn't transparent?
Yeah, fuck it.

Speaker 3 (26:45):
No, LG had the transparent TV.

Speaker 2 (26:48):
Here's the thing. They're the useful TVs. But every year
I think they should try and make a bigger one.
Like, just, like, every year, it goes, two
hundred inches now? Now we've got one where we have
to buy part of the wall. Like, yeah, just escalate constantly.
And I saw on the very first day a wireless
TV situation where they were like, yeah, it's for trade shows.
You can get four of them and clip them into

(27:09):
one thing. I'm like, oh, you could be a sports freak,
and they just fucking looked at me. They just looked.
I'm like, what? I mean, like, when I was getting
my place here in Vegas, I went to look at
a house where the guy had just four CRTs, all
just playing different NFL games. And I... that
house I didn't actually get in the end. I
was just enamored with it, and honestly it was all

(27:29):
I could think about. Like, you could just pop up
on them those sports TV things, Back to the Future Two,
you get fired by a guy from Japan, be great.
I'm not Marty McFly's dad in the future anyways. Yeah,
it's... I like the exo. Hypershell's good. I need
them to reach out to me. They reached out to you,
they reached out to Robert. They don't like me there,

(27:52):
people like Hypershell don't like me very much.

Speaker 1 (27:54):
Yeah, they're afraid that you're gonna get on your podcast
and deal with them positively.

Speaker 2 (27:59):
Yeah, you know, actually, based on the last twelve hours,
like, can I blame them? I'd be like, I don't know.

Speaker 5 (28:06):
I don't think it's got AI in any way, so
you're probably... that should be safe.

Speaker 2 (28:10):
Yeah, but you do have stats.

Speaker 4 (28:13):
You can see how many miles you've walked, how many
steps you've walked. And the battery was great. Like, we've
had this thing on since, like, eight a.m. I got
back to my hotel to drop it off about an
hour ago, and we still had
fifty percent of the battery left.

Speaker 2 (28:27):
That's really... better than my phone's doing, and...

Speaker 4 (28:30):
You know, it is better than the
phone. And, like, we're walking in a convention center, like,
that's a lot of walking.

Speaker 1 (28:36):
The fitness mode, who are there? Is it the same
ten thousand steps?

Speaker 2 (28:42):
That's not bad.

Speaker 1 (28:43):
Yeah, it's funny because so much of our podcasting has
been looking at products that are unnecessary and pointing out
the unnecessary parts of them and rightfully laughing at those.
And this is a product that clearly can help
a ton of people and is wonderful technology and very worthwhile.
And so I don't mean to apply the same logic.
But is the person who's buying Hypershell going to use

(29:07):
the fitness mode? Those feel to me like two different uses.

Speaker 3 (29:10):
It feels like something that's probably not difficult.

Speaker 2 (29:12):
It's an ad.

Speaker 4 (29:13):
It's an extra add-on, that's the thing. It is
an experimental feature. It is not... they are not... I
do not think they're selling them with fitness mode enabled
right now.

Speaker 2 (29:20):
Yeah, that is.

Speaker 4 (29:21):
It is an experimental feature that you have to, like, click
agree and okay to. All right.

Speaker 2 (29:26):
We're going to rotate, of course. Now, this following ad
is not for Hypershell. It's for something else that
you should buy without reservation. Get the credit cards out,
please, everyone. If it's a podcast, you must listen to it. Superstition, fear,

(29:50):
and jealousy. Welcome back to Better Offline, the literal best
podcast ever made. I'm Rob Zombie. Slamming things in the
back of my Dragula is stand-up comedian and star
of Is This Thing On? Chloe Radcliffe.

Speaker 1 (30:00):
Where is she all?

Speaker 3 (30:00):
Killer?

Speaker 1 (30:01):
This?

Speaker 2 (30:01):
She is. It Could Happen

Speaker 3 (30:03):
Here's Garrison Davis.

Speaker 2 (30:04):
Good evening. And tech writer Rob Pegoraro. I
like the fitness mode. I heard... the moment you said that,
I'm like, I want to fucking go for it. Like, I've
really been getting into more cardio stuff.
I'm like, I kind of want to... I want
to see how this fucks me up. It's really cool.
So, did you try the fitness mode yourself?
How does it exert the pressure? Like, is

(30:26):
it... is walking more difficult? Yeah, it's very difficult.

Speaker 4 (30:29):
You can't... you have to really strain to, like,
step more than, like, maybe half a foot.

Speaker 1 (30:35):
Is it just that it sort of like binds up
the mechanism.

Speaker 2 (30:38):
I think so.

Speaker 4 (30:39):
I think it's, like... it's like
locking the motor, or applying resistance, so you
have to... it takes way more effort to use.

Speaker 2 (30:47):
I mean, you've already got, like, weighted pants and stuff.
I know this sounds like I'm doing a... you do, you actually
do have these things, you have, like, sweatsuits, and...
well, that's sort

Speaker 1 (30:54):
of where I... that's where I wind up coming to,
is, like, this feels...

Speaker 4 (30:58):
You can change the... you have,
like, a slider, like, you can
change the settings for...

Speaker 1 (31:04):
I totally follow. I totally follow. And if
we were not so positive on the rest of the product,
I just am like... I'm thinking of Conover, about
Conover saying this is a solved problem. You know,
where I'm like, we have weighted vests, or we have...
My thing is with fitness: you know how people bought,
like, NordicTracks? Do you remember those, the

(31:26):
the indoor cross-country ski machines, and there was
like a year where everybody's like, everybody's got to get
a NordicTrack. Yeah, and then a bunch of
people just have a NordicTrack sitting
in the basement that they don't need.

Speaker 2 (31:38):
I think the difference with this is, not only is
it an experimental feature, it's not trying to be a one-
size-fits-all thing, because you can use it while hiking.

Speaker 4 (31:46):
Yeah. So if you want to have
more of a workout while hiking... and there's settings for
going uphill, downhill, and it changes the way that, like,
how the motors function.

Speaker 2 (31:55):
So it does have a lot of... it's not being sold
on this. Pardon, it's not being sold on this idea.
This is an experiment that they even...

Speaker 3 (32:02):
Told me about the fitness mode.

Speaker 5 (32:04):
I tried ecomode and I.

Speaker 3 (32:08):
Have enough trouble getting around already.

Speaker 4 (32:11):
You don't need to do this. Walking, race walking, walking uphill, downhill,
upstairs, downstairs, gravel, cycling, running, mountain, and sand dunes are
the adaptive motions.

Speaker 2 (32:22):
Like I can't set a fitness I would love to
because my ankle gets sucked up. If a war more
than a mile, I would love this. Yeah, diaper shell,
reach fuck hyper shell, reach out to me, like, I
swear to God, we're never going to know that you
call them diaper I'm going to I'm not going to
call them diper shell again. It'll be fine. That was
the last time you're like, you can't take it off
when you're.

Speaker 3 (32:44):
Diaper?

Speaker 1 (32:45):
Yeah?

Speaker 2 (32:46):
Oh god. Yeah, so, okay, Rob, back to you. Anything else? Well, actually,
was there anything you really loved?

Speaker 5 (32:52):
During this... anything... Like, the story I'm most fascinated by:
there's a company called Donut Lab, that does not actually
make... it's not a donut-making robot. They
introduced a solid-state battery for electric vehicles.

Speaker 2 (33:04):
So we had a very brief conversation about this, and
then someone emailed me and said they want more. So
guess fucking what, one listener, it's your Lucky.

Speaker 3 (33:12):
Day reader service.

Speaker 5 (33:14):
So it is kind of a mystery how they've done it,
because this has strictly been the holy grail. Supposedly, if
you can get the battery chemistry and everything in the
physics to work, you have a much lighter battery that's cheaper,
that charges faster, does far more cycles. And they say
they've done all these things, but they're being very cagey
about them.

Speaker 3 (33:30):
And they also say.

Speaker 5 (33:31):
that the minerals are globally, widely available. So it's like, what's
the catch? Is one of the other ingredients

Speaker 3 (33:40):
Kittens, what's going on here? And so we'll have to see.

Speaker 5 (33:43):
Like, they're showing off a motorcycle which includes the battery.
Did you use it? Was... no, they just had it
parked there. It's a thirty-thousand-dollars-and-up electric motorcycle.
Looks great. But if they've cracked the code for that...
and there's lots of other people working on solid-state
batteries. And given that EVs are already good, like,
you know, you have this thing, you can charge it

(34:05):
at home, you know, at a fast charger it's enough
time for me to take a nap, which I like doing.
And, right, so it addresses the other concerns people have.

Speaker 1 (34:14):
Is it big enough to power a vehicle that's bigger
than a motorcycle?

Speaker 5 (34:17):
Yeah, yeah, yeah, definitely though, like they showed off, there's
some company that makes like an electric car chassis with
the sort of skateboard thing where the batteries are right
underneath the center.

Speaker 3 (34:25):
That's how most EVs are designed. So yeah, their whole aim
is to have this in a lot of different cars.
Like, they say they'll have news soon about car partnerships,
so, big deal.

Speaker 2 (34:37):
But if this is true, it's like a revolution in
car battery tech.

Speaker 5 (34:42):
Yeah, and not just that. I mean, like, drones, trains
for that matter. Because we don't know how to build
transportation infrastructure affordably in America, lots of railroad operators would
rather put the battery on the train than put the
wires above the tracks.

Speaker 2 (34:56):
I just want to be clear, if this is true,
I will be the most excited about this. If
Donut Lab ends up being fake, I'm going to
spend multiple episodes destroying them, so dirty. No, no, no,
you won't, because it's an exciting and cool thing. And
if this is real, it's fucking amazing. If it's not,
I will learn everything about this company and publish it
on this fucking... No, they genuinely are...

Speaker 1 (35:17):
But it also sounds like somebody is going to crack
this well.

Speaker 5 (35:20):
Yeah, like, there's enough money being
directed at solving the problems.

Speaker 1 (35:25):
Yeah, and what's it going to do? I mean,
is the oil industry going to have an
even bigger paroxysm than we've...

Speaker 2 (35:36):
The thing is, with the oil industry, they're... well, yeah,
but they're also well invested in this stuff too, and
they will find ways to make money off of this.
They will not like it, but it's kind of hard
to deny electricity, but nevertheless, they don't love it. The problem...

Speaker 5 (35:51):
Yeah, I mean, the whole trend has been going
on for years and years, and it's just, right now
in the US we have this president who has
some strange notions of energy and affordability, and I think
his head is stuck in nineteen eighty six, which is
why he thinks it's a good idea to, you know,
hijack Venezuela as if it's like sticking up a gas station.

Speaker 2 (36:11):
Yeah, it's... I love living in... I want to live...
you know what I would love? I'd love to live
in precedented times. I'd love to live in boring times.
I've had too many interesting events happen. I'd like
to go back to boredom, but sadly they're not going
to allow me. I don't know, it's frustrating because

(36:32):
I've been asking people, genuinely, seriously, bring me stuff you're
excited about, and everyone keeps coming back with, like, you know...
it was kind of... Honestly, the exoskeletons are the
one unilaterally exciting thing, other than big tractor. And again, it
must be clear: big thing, big huge thing.

Speaker 1 (36:47):
That, and huge TV.

Speaker 2 (36:49):
I know, I fucking love it. I'm I'm an ape.

Speaker 4 (36:51):
One of the best CES moments I've ever had
is when, maybe two years ago, we saw
the USPS's first... one of those electric
duck vehicles, and I was like, this is rad. This
thing looks great. And, like, two years later I saw
them rolling around the city.

Speaker 2 (37:09):
That fucking rocks. I love that.

Speaker 5 (37:11):
So I Oshkosh, the company that makes it, they had
a media day at their facility in Wisconsin, flew me
out there.

Speaker 3 (37:17):
I got to drive it. WHOA, tell me more about that.

Speaker 5 (37:20):
So, I mean, first of all, it looks the
way it does because you want something where people can
stand up in the back, right? Which
you can't do right now in these nineteen-eighties-vintage trucks
the Postal Service is rolling around. It's got air conditioning and
cup holders, all the modern conveniences, and yeah, Oshkosh took
a long time to sort of get production ramped up. It
seems like they're doing that now, and yeah, it was

(37:43):
neat. They also showed off, like, a battery electric
garbage truck, which I got to drive.

Speaker 2 (37:47):
Nice.

Speaker 3 (37:48):
It was the quietest garbage truck I've ever driven. Also
the only one.

Speaker 2 (37:51):
But whatever, I mean, I'm really I just love the
idea of driving a truck around. Yeah, so garbage truck.

Speaker 1 (37:58):
Here's the... okay, here's a question that I
thought of in an earlier episode and didn't wind up
asking. Oshkosh, the company that makes those trucks, is a
huge defense contractor, yep, makes a ton of military vehicles.

Speaker 3 (38:11):
They do.

Speaker 1 (38:11):
Indeed. Uh, I was reading on X, the everything app,
that, uh, ICE and CBP have been seen
using smart glasses to, like, record protesters and stuff. How
much of the tech that we see at CES winds

(38:36):
up getting filtered into what I would consider nefarious action?

Speaker 3 (38:42):
Probably a lot.

Speaker 5 (38:42):
I mean, the fact that ICE is apparently using
some sort of face recognition system against some database that
I didn't know the government actually had, to see if
someone is an American citizen, which, yeah, I have a
lot of problems with. Well, where do we start? Maybe, like,
perhaps they could not murder Americans? Yeah, that would be fun,

(39:04):
and then try to cover it up.

Speaker 2 (39:06):
Oh god. They just shot two more people, yeah, CBP. Yeah, yeah, yeah,
I'm gonna be... I'm gonna say something, but be honest,
people DM me and ask why don't I
say anything. Because I'm a fucking immigrant and
I'm terrified of saying anything about it at all. And
if you're ever wondering if I have opinions on this,
I do, but I'm terrified to say anything. It's not
that I don't have thoughts or feelings about

(39:29):
someone being shot to fucking death, man, which is disgraceful
and disgusting. But I have to, like, I am very
guarded about it. It's not that I don't care, it's
that, as an immigrant with a child in America, I
die here. So yeah, very fun fucking week for everyone.

Speaker 4 (39:44):
Yeah, but the thing you're talking about, we see a
lot of this with, like, the drone tech. And, like,
one of the first CESes I went to, and you
still see this now in the show, there was a panel on, like, you know,
Advancing Drone Technology, and it was, like, half put on
by Walmart and half put on by this police
department in Southern California who launched, like, the first

(40:06):
like, big drone program, which has now been copied around
the country. And this was, like, a panel with
both of them talking about how Walmart's doing drone deliveries
and how the police are sending these drones
to, like, surveil neighborhoods and go to crime scenes before officers
and, like, secure the area, to, like, you know, see what's going on. And yeah,

(40:28):
you see this a lot with the drone stuff here
at CES. There's, across from the Creator Stage,
which is in Central Hall... it's where, if
you're a content creator, it's all things content creator and creator economy.
Ed, you would love the Creator Stage. I'm going. I
stood next to a panel where they talked about the

(40:50):
changing habits of digital Gen Z life.

Speaker 2 (40:55):
A presentation created by people over the age of forty.

Speaker 1 (40:58):
All of them, where, yes, I say, what habits are changing?

Speaker 2 (41:02):
Yeah, yeah, again.

Speaker 4 (41:03):
I stood next to this panel because I just heard
too many, like, things. Sometimes, in panels, you know,
like, five minutes later, you're like, this is not going to

Speaker 2 (41:10):
be... You just start hearing the Charlie Brown parents. Yeah. And...

Speaker 4 (41:13):
And then... and then you have to... But it
turns out they use TikTok to make purchasing decisions.

Speaker 1 (41:19):
Hell is that?

Speaker 2 (41:20):
But, well, that's so skibidi of them.

Speaker 3 (41:23):
Did I say that?

Speaker 1 (41:24):
Right? That rocks? But across from the creator moment. I've
been podcasting for approximately fourteen hours over the last three days.
That's a tough moment.

Speaker 2 (41:36):
Across from the creator stage.

Speaker 4 (41:37):
There's a Chinese drone company that has these... they're called
drone-in-a-box systems, where basically you put it
in the back of, like, a pickup truck and you
can launch an autonomous drone and it'll
pilot around and then go back into the box.
This already exists... we have... police departments in the States already have this. But

(41:58):
this was, like, a company that was
making a lot of these, and I can pull up the...

Speaker 2 (42:07):
Consumer drones here, though, I feel like last year there
were, like, various... there were more fun things. There was,
like, a baseball thing, there was...

Speaker 3 (42:15):
A drone soccer thing last year. Well, one thing I saw...

Speaker 4 (42:19):
GDU was the company with aerial surveillance in public safety
where they're.

Speaker 2 (42:24):
Too many... like, I haven't seen many consumer drone companies.
I don't know how many

Speaker 5 (42:28):
of them are now sort of banned by this order
that has come out, where they're targeting DJI, which, without them,
like, where are Americans going to buy drones?

Speaker 2 (42:40):
Yeah, they're targeting the source.

Speaker 3 (42:42):
Haven't thought this through entirely. It's...

Speaker 2 (42:47):
Just... that whole thing, the whole... all of that shit's
so bizarre to me. It's just like, oh, what if
we fill America full of Chinese products
that could be used as... oh wait, we did that.
We've done that for ten straight years. We've just... Amazon
has, with various sellers putting in random orders, shipped security cameras
into every home on the cheap. Oh no, how

(43:09):
would we stop this? Well, it doesn't start with the stores.
It starts with China. The sneaky Chinese, who are, I
don't know, selling to America constantly. We don't stop that.

Speaker 1 (43:18):
You've heard it here, folks.

Speaker 2 (43:19):
No, it's just very confusing. I don't really understand.
I don't think anyone really understands what we're doing against
China at this point. It's just like, we might let
them buy stuff. Actually, that was an interesting story. So
today, Nvidia, with China, with these GPUs, they're saying
they have to be paid completely upfront, whether or not
they can actually sell them. They're saying that, regardless of

(43:40):
how the sanctions work out, if you buy them, you
must pay completely fucking upfront, whether or not
you can. You can't change your order, you can't get
a refund, you can't cancel, you can't reallocate. Fuck yeah,
Nvidia doing great. That's the kind of shit you
do when you're, like, not doing well financially.

Speaker 5 (43:57):
Yeah, weird, Like why even it's just because you must
think you have exceedingly captive customers.

Speaker 2 (44:05):
No, yeah, it's like the kind of shit you do
when you think that you can't die and.

Speaker 1 (44:10):
But you also are a little worried about.

Speaker 2 (44:12):
Yes, exactly, you're, like, trying to pretend you're strong. And
I say this in a town full of people that
act like they can't die: this is the person who
gets washed out first. Like, this is the person who...
they have all the money until they get a seven
on the craps table. Yeah, little craps. Little craps for everyone.
Anyone play craps, you listening? Email me, ez at better
offline dot com, with your craps tips. If you send
me toilet-related stuff I will be really upset. It's a

(44:35):
different kind of craps. Anyway, moving on. What else, Rob?
What else did you see?

Speaker 5 (44:40):
Oh, let's see. So I've been trying to follow some
of the political discussion. It's like, I'm a Washingtonian and
I fly two thousand miles to hang out with people
from DC.

Speaker 2 (44:48):
Yeah, exactly.

Speaker 5 (44:49):
And so there's a really interesting panel about tariffs Tuesday morning
where everyone was saying, this is such a dumb idea.
Like, this... it's not helping, they're a tax paid by Americans,
not the Chinese. They're not helping people build factories. The
CEO of this kitchen robot company, Suvie, was saying,
you know, we've created a lot of jobs, not in

(45:10):
the United States; they had a factory built in
Mexico and one in Vietnam. Like, none of this stuff works,
none of it makes any sense.
Speaker 2 (45:17):
Well, stupid question. Why would the tariffs make them put
jobs in Mexico? And I say this genuinely not.

Speaker 5 (45:24):
So her answer was basically, first of all, a
lot of the components to the stuff she makes are
not made in the US and will not be made
in the US any time

Speaker 2 (45:33):
Soon, and.

Speaker 5 (45:36):
therefore you need to import those things. And so why
eat the tariffs on the stuff you need to bring
to the US when you can just have them shipped
to Mexico and pay a lesser tariff

Speaker 3 (45:47):
Once that way, and so.

Speaker 2 (45:49):
It doesn't change the incentives at all.

Speaker 5 (45:51):
Yeah. Like, we haven't suddenly magically made it reasonable for
people to, you know, do all this factory work
that was, you know, profitable thirty years ago.
Speaker 2 (46:01):
Yeah, we're not going to pay people like was it NAFTA.
It's probably what caused the problem, not like yeah, not
like the tariffs people were.

Speaker 5 (46:08):
paying. NAFTA... now it's USMCA, which now Trump thinks
is terrible, and it's just not, you know, it's not
policymaking by grown-ups. And so the rest of the
panel, I'm guessing, everyone kind of agreed, and yeah, yeah,
I mean, it was... I don't imagine... Well, the funny thing is,
of course, a lot of people in the electronics industry
seem to think that, oh, it's great that we're pushing

(46:30):
ahead on AI, we're getting rid of the regulatory shackles.
But the problem is, the person who's doing that is
also the one who's imposing the tariffs.

Speaker 2 (46:39):
It's the same guy. So you don't have regulatory shackles.
You live in Virginia, you know that. Yeah, it's the
data center capital of the world. All you hear in Virginia
is the screaming of the computer, because the computer is
in pain, the GPU is hurting the computer.

Speaker 5 (46:54):
We have a new governor getting sworn in in less than two weeks, Abigail Spanberger, and I think we will be seeing some things done about data centers that we were not going to see otherwise. She's going to save the computer from the GPUs.

Speaker 1 (47:06):
Do you have a suspicion of what those things will be?

Speaker 5 (47:08):
Well, I mean, a lot of it is around... she said we need to make sure that, like, large utility users pay their fair share, unlike the previous governor, Glenn Youngkin, who, I think most people would say, is the most mediocre governor of Virginia in the twenty-first century. So yeah,

(47:30):
like, Youngkin vetoed really small-bore legislation saying, you know, we need to have some standards for environmental review of data centers.

Speaker 3 (47:37):
Because these things are all over the place.

Speaker 5 (47:39):
If you take off from Dulles Airport, look to your left and right, you'll see these huge boxes with stuff on the roof and the parking lots around them. Nobody works in them, and they do provide a lot of property tax revenue for counties. So, like, it's a huge chunk of the revenue of, like, Loudoun County, Prince William County, the western exurbs of DC. But there's no benefit to people nearby. They don't

(48:01):
generate jobs. You don't get faster internet because you have a data center one street away.

Speaker 1 (48:06):
In fact, you get more expensive electricity.

Speaker 2 (48:08):
Yeah, and here in America, we don't need those taxes. And once a month...

Speaker 5 (48:11):
They've got to turn on the backup generators to make sure they work. So that's a lot of noise
and pollution. Everyone hates them. They're really unpopular. And the
whole idea that we need to do this because guys
like Mark Zuckerberg and Sam Altman think that we're going
to get super intelligence or AGI, so we have to
sort of suck it up.

Speaker 3 (48:29):
We need those GPUs to...

Speaker 2 (48:31):
This is a true story: Jeff Horwitz at Reuters reported that a man with dementia was lured to New York by a Kendall Jenner-branded chatbot, and then he fell over and died.

Speaker 3 (48:43):
Yes, true story.

Speaker 2 (48:43):
Yeah. Yeah, and for that, Zuckerberg should be in fucking prison, and everyone who worked on it should be in there with him. I'm deadly fucking serious. Everyone who worked on that, every single fucking person. Fucking put Jenner in there as well, her fucking face was on it. Sorry, I have some slightly aggressive views on this. You do have views. No, no, it's just, I think it's just exploitative. But no, it's interesting hearing the data center conversation here as well, because we have some Core

(49:05):
Weave bullshit. Actually, go and have a look at them. Oh yes, yes, of all the companies to be here in Vegas, CoreWeave. But did you see any other panels that were of interest? So I don't just talk about Colin.

Speaker 3 (49:18):
So there's a bunch asking.

Speaker 5 (49:19):
I'm gonna have to sort of catch up when I get back home. Like every year, I walk miles and miles through these exhibits, think I've covered it all, come home, read other people's CES recaps, see all the stuff I missed, and

Speaker 3 (49:31):
go, like, totally, did I even go to the same show? Yeah.

Speaker 2 (49:35):
Now, I've been here for days, and yesterday when I walked through stuff, nothing was affixing to my brain. It was all just this slop. My brain's going: won't exist, won't exist, GPT

Speaker 3 (49:48):
Wrapper, won't exist.

Speaker 2 (49:49):
money laundering, photonics panel, solar panel. And it was strange, because usually there is, like, a good amount — and I should also be clear, cynicism aside, there is a use for this. There are people who go hit booths who are like, oh, I'm building something, and what have you. But it feels like it's either that or, like, the slopification of CES.

Speaker 4 (50:11):
Yeah, it reminds me of when 3D printers just started, yes, like, getting popular, and then you see 3D-printed slop everywhere. And this is interesting, the way that, you know, if you go to, like, a Comic-Con or a Renaissance faire, things that used to be, you know, handmade, cool, novelty goods — now it's just all these

(50:33):
3D-printed dragons, because we can use these devices to just print out this, like, you know, plastic bullshit. And there's a similar thing happening with these GPT wrappers, where, I mean, they are kind of new products, but it's all building some physical mechanism around

(50:53):
the same, like, the same software service. It's the

Speaker 2 (50:56):
natural endpoint of CES. It's just, like, half of this shit's never going to exist anyway, so they're just like, ah, fuck it, it's connected to the magic API thing, and then: product, please invest.

Speaker 3 (51:07):
I genuinely there is.

Speaker 2 (51:10):
Partly because of where I am in my life. I
can't do this this year, but maybe next. I think
I'm just if that they're not going to be here
next year, But if there are still LM companies next year,
all I'm going to do is walk up to them
and do the Radcliffe as we call it. It's like, what,
what's the point of this? Why do I care?

Speaker 1 (51:24):
And who gives a shit about that?

Speaker 2 (51:26):
Yeah? Who cares? Who cares?

Speaker 1 (51:27):
Who cares?

Speaker 2 (51:28):
Who cares? Because honestly, I don't think ninety percent of this show can answer that question. Last year I saw my favorite booth, which was the Korean baseball one, where it was like a golf-tee thing but for baseball, and the guys were like, no, you need to step up and hit the ball. I'm like, I cannot do it, I will be shit. And then when I finally did, they were like, oh right, you

(51:48):
can't hit it. I'm like, I fucking told you. But when I asked them, I was like, so it's for people that can hit baseballs? And they're like, yeah. And, like, professional baseball players, who have their own, like, mansions and such. I'm like, that makes sense: for the people who play baseball and want to hit the ball at home. And also, it's just fun to go to a batting cage. Totally. This year, I went looking for stuff like that. I really wanted, like,

(52:10):
a baseball cage, something like that. I wanted something fun. There's a golfing simulator.

Speaker 3 (52:17):
Okay, okay, what happened.

Speaker 2 (52:19):
Let's just golf. It's pretty much.

Speaker 4 (52:21):
I think it's at the — you said it, Hisense? I want to say Hisense. I think it's at a Hisense booth.

Speaker 2 (52:28):
That's nice. There is something — it's like a projected...

Speaker 5 (52:31):
Tennis setup in, uh, the Venetian Expo. Tennis, out in the middle of it.

Speaker 3 (52:36):
Yeah, you know, the idea is it's, like, sort of a tennis coach. It will, like, you know, serve...

Speaker 2 (52:42):
I'll just try that tomorrow. Yeah, I'm actually curious about the tennis.

Speaker 3 (52:44):
See that's fun.

Speaker 2 (52:45):
Yeah, we like that.

Speaker 3 (52:47):
That's that's good. We like tennis.

Speaker 2 (52:50):
We love tennis.

Speaker 1 (52:52):
I had a little bit of human connection.

Speaker 2 (52:54):
That's great. Tell me, please God.

Speaker 1 (52:56):
I had called a cab — so I'm staying at Harrah's, and I had called a cab from Harrah's to the convention center — and, you know, everybody just goes down to the little Uber waiting zone, and there were a handful of people, and I just kept watching car after car drive up, one person get in,

(53:16):
the rest of us are waiting there, and it just felt insane. Yeah, and I think it would have felt insane even if I didn't live in New York. But living in New York and, you know, feeling so passionate about public transit, I'm like, this really is insulting. And my guy pulled up, and I just turned — there were two women who were next to me — and I was like, do you want to get

(53:37):
in the car with me? We don't have to talk.
And one of them gratefully accepted, and she was like,
my uber kept canceling, I kept not being able to
get a car, Thank you so much. And the other
woman didn't just looked up, didn't say anything, looked back
on her phone. I copy that.

Speaker 2 (53:51):
Now there's the New Yorker.

Speaker 1 (53:52):
Yeah, yeah, yeah, truly, and she I'm not one hundred
percent sure that she understood what.

Speaker 2 (53:59):
you were saying. That's probably more likely. But this one —

Speaker 1 (54:02):
And I got in the car, and we wound up riding the whole way — I mean, you know, talking the whole way on the ride — and I asked what
she did. I saw that her badge said Apple. And
she was saying that she's a buyer for stores and
she said, I, you know, I didn't used to work
in tech. And I said, what did you used to
do and she said, well, I used to be in
supply chain management and I went, oh that's cool. I
used to be in supply chain management and she said, oh, yeah,

(54:24):
I used to work at Target. And I was like, oh,
I used to work at Target. She said, oh, I
was a business analyst and I was like, oh, I
was a business analyst at Target headquarters in Minneapolis, were you?
And yeah, yeah, yeah yeah. For two years, For two years,
all soda — Coke, Pepsi, Dr Pepper, all subsidiary brands — that passed through all the eighteen hundred Target stores in the country passed through me, which is insane, that they let me.

Speaker 3 (54:47):
A nice twenty one year old queen of soda.

Speaker 1 (54:50):
Queen of soda. I used to say pop until I worked... yeah, yeah — pop, pahp. "Oh yeah, I'll have some pahp." And, you know what's even wilder? She worked on the snacks desk in the snacks department, and I trained on the cookie-cracker desk — that's good — in the snacks department. It just was like

(55:12):
this very weird small world.

Speaker 2 (55:13):
Did you learn anything about Apple? Anything?

Speaker 1 (55:16):
Absolutely nothing.

Speaker 2 (55:16):
No.

Speaker 1 (55:17):
I was trying — I was trying to get in a little bit more, and I just was like, this is not... it's not... We're really... You could have

Speaker 3 (55:24):
Done a bit.

Speaker 2 (55:24):
You could have been like, well, fruit, just like your
mouth slayer.

Speaker 1 (55:30):
That's gold.

Speaker 2 (55:31):
I know — the thing, the thing about having... Because of the area of West London I grew up in, I can't sound... I mean, I was about to say, I can sound fucking stupid, really obvious. "Sahf, we would go." Yeah, he's learning, learning — didn't see that one coming. "Well, the fruit, you can just have in your mouth."

Speaker 1 (55:53):
I'll find I'll find her.

Speaker 2 (55:54):
Yeah, "this is my simpleton friend" — that's the line. Anyway — oh my god, the idea of sharing my Uber with someone anywhere, let alone Las Vegas.

Speaker 1 (56:05):
What was more interesting was the driver. The driver wound up talking — he worked for over thirty years at a restaurant, at some restaurant, and just got laid off: no severance, lost all of his benefits. Now he's doing Uber, and he was like, there are no jobs. So

(56:27):
many people are looking for jobs and everybody's out of work. And he said, I just got hired back at the exact same restaurant that I worked at for over thirty years, but I got hired back at the absolute lowest entry level and only for the holiday season, and so I don't have any vacation. He said, I can't even bump somebody working in the coffee shop. It's such an insult. And it was the

(56:51):
most interesting thing he said. He said, I think people in tech think that they are safe from it. Yeah. Because she also — she was very sweet, but she was like, you know, they should repurpose the workers.

Speaker 2 (57:04):
And I was, oh, I want to fucking I know.

Speaker 1 (57:07):
And I was like, the whole point, literally the point
of laying off these people is to not have the worker,
Like the point is to not repurpose the worker.

Speaker 2 (57:15):
And it seems like they are repurposing the worker — as the same worker they had, but then they just pay them less.

Speaker 1 (57:20):
Yeah, it's a little confusing. I mean, I think both are happening. And he wound up cutting in, and he said, I think people in tech think that they're safe, but they are not safe. So many tech jobs are going to fold in the coming years. And he was saying, like, AI is going to take all the jobs. And, you know, who knows.

Speaker 2 (57:39):
AI is going to take all the jobs because it's going to destroy all the tech companies.

Speaker 1 (57:43):
Yeah, yeah, yeah. Like, I think there are going to be a lot of tech jobs lost, and he was like, when tech people experience it — it will take until then for them to realize what other people experience.

Speaker 2 (57:55):
And it's kind of what you were saying earlier with Matt — kind of an offhanded comment, like, oh, there's someone who grew up with the scarcity mindset of, oh, I can't use this thing too much because it will literally run out, I brought money up myself — it's the sense that something can run out. And it is weird, these tech people, especially the AI people, this fucking smugness, and

(58:15):
that comment is probably the most cutting.

Speaker 3 (58:19):
Yeah.

Speaker 2 (58:19):
No, it's also probably the most intelligent thought I've heard
about the tech industry all week, other than on this podcast,
because it's like tech people really do think that they're
insulated from this, Like tech media, they know that they
can be laid off.

Speaker 5 (58:31):
They're well aware. Like, yeah, if you work at Meta you've seen so many people fired. Microsoft, Google.

Speaker 2 (58:39):
But you think you're safe still.

Speaker 1 (58:40):
I mean, it really felt like the, you know, "first they came for the trade unionists, but I was not a trade unionist, so I didn't speak out."

Speaker 2 (58:46):
It kind of is. It's like this condescension towards actual labor. Like, I always say, I can complain about my job, but I send little emails, I talk into the microphone, and I drink my little drinks. I'm very fucking lucky. And I have a good amount of listeners who do, like, manual labor jobs and work in restaurants, and so it's like, you fucking rock, these people. But it's like, if

(59:07):
you don't speak to those people, you are going to pay the price.

Speaker 4 (59:10):
The way, the way that labor is completely cut out of a convention like this. And, you know, people talk about innovators and, like, ideas and creatives, and how, like, ideas and creative innovation is what drives everything. And it drives it a little bit, it does, but you still have to

(59:31):
do the physical work. You still have to actually labor, and often you have to have labor to create the conditions that allow people to "innovate," quote unquote. But that perspective — no one talks about the actual labor part of all of these things. Even in this supply chain stuff, even with the focus on manufacturing this year, the labor gets completely forgotten.

Speaker 1 (59:52):
It made me wonder what the broad socioeconomic background trends are for most people who work in tech. It sort of made me feel like — I've got to imagine that some huge swath of people who work in tech... and I don't know, this is me really, really reaching. No, this is

(01:00:12):
not even necessarily an educated guess, this is just a gut vibe. But it feels like a lot of people, some preponderance of people who work in tech, didn't grow up in depressed areas in the middle of the country, didn't grow up under the poverty line, didn't grow up, you know, with any of that. I think probably a lot

(01:00:35):
of people in tech grew up with some relative amount of privilege, even if they're not white, even if they're, you know, South Asian — but perhaps they come from a family where there is disposable income, whatever. And it made me think: I grew up without a ton of money in the Midwest, and I feel like if I

(01:00:55):
was working in an industry that was potentially hollowing out cities, or potentially impacting places with data centers, whatever, I think I would always have in the back of my head the people who I grew up around, the people who I came from.

Speaker 2 (01:01:10):
I am very lucky in that I didn't grow up with money, but my parents worked their fucking asses off. Like, my mom and my dad worked their asses off — my mom, with me and three other children, so imagine that, and I was probably not the most normal child, I definitely wasn't. I grew up around the corner from a prison called Wormwood Scrubs; I didn't grow up in a good area. My mom put food on the table so

(01:01:31):
we never had to ask. And I feel like the people who are like, "we need to build the biggest, most huge data center," never had a fucking... they never missed a Christmas. They never had a Christmas where they went without. And to be clear, my mom worked her ass off and my dad worked his ass off, but it's like, those moments — there was one, and I think my parents felt worse about it than I did. I'm sorry — if my dad's listening, they do

(01:01:51):
not feel bad. Dad, you worked your ass off, I love you. But it's like, those moments are what teach you to actually have fucking humility and love for others, because there is more than just stuff. But the idea of being like, "we can repurpose this workforce" really fills me full of angry bile, because this whole show feels at times like a monument against that, where it's just

(01:02:12):
like, we don't solve real problems because you don't fucking experience them. Michael Intrator of CoreWeave doesn't know a fucking thing about that person driving Uber. Fuck Michael Intrator — pardon me, I want to put a beep in there. But yeah, it just sickens me, because it's like — we're gonna have a guy from the Las Vegas Sun tomorrow, fantastic guy, for the labor angle tomorrow, because it's something

(01:02:36):
really missing from here, yeah. And also, I'm gonna be honest: hearing a guy driving Uber here say that restaurants are having trouble, laying people off, is actually genuinely scary for the Vegas economy specifically, because — I've read about this — they usually print money here.

Speaker 3 (01:02:56):
I hear international tourism is down. Can't figure out why. Something happened.

Speaker 2 (01:03:00):
Luke Winkie — Winkie? I don't remember how to say it, sorry, Luke — wrote a piece for Slate, like, Vegas is down, and when that's bad, it's bad. We're gonna rotate now from this somewhat grim point, but I want to end it by saying something which is important, which is: I have more fucking respect for anyone who works in a McDonald's or a diner

(01:03:21):
or the shittiest chip shop in London, or in any city in the world, than I will ever have for anyone who works at fucking Google. And if you disagree with me, go and work a fucking day in a fucking restaurant. I've never done it — I'm lucky enough to never have to, but my whole fucking family has. And if you don't respect service workers, you're not welcome on this fucking show.

(01:03:52):
As ZZ Top once said: how, how, how, how. We're back at Better Offline's CES twenty twenty six coverage, and we've stuffed our face with tech and our... face with guests. That was not a well-written one.

Speaker 3 (01:04:04):
Joining me is stand up comedian Chloe Radcliffe.

Speaker 1 (01:04:07):
You won't let me leave.

Speaker 3 (01:04:08):
I won't let you leave.

Speaker 2 (01:04:10):
That's right. You've got two more episodes and then you
can go.

Speaker 9 (01:04:13):
Won't let me leave the Cees story.

Speaker 2 (01:04:15):
Robert Evans of Behind the Bastards — I'm also not allowed to leave. He's not. And then we've got Westin Lee, the writer. Now, I must say, Westin Lee is an inspiration. Westin is one of the only people in the world who cares about annualized revenue, and you and I — we are friends because we sat and lined up all the Anthropic reporting around their annualized revenues.

(01:04:38):
We put them in a line, and we worked out how much they've made.

Speaker 10 (01:04:41):
You know, it's amazing what you learn about somebody from sharing a spreadsheet with them.

Speaker 2 (01:04:45):
No, it's I brought you here straight up to just
like thank you in front of hundreds of thousands of people.

Speaker 1 (01:04:50):
Was that before the first time you had sex? Wait... Yeah. Ah, oh, we're getting too — we're getting too dark.

Speaker 2 (01:04:56):
Nineteen. Okay, it was five minutes before my twentieth. Wow. I had my first kiss when I was, like — Halloween two thousand and five, I don't know how old I was... no, I was nineteen at the time. I know because I had sex, like, five minutes before. Wow, like a real...

Speaker 1 (01:05:11):
Like, kiss, then sex? No — sex, then...

Speaker 2 (01:05:13):
Oh no, no, sorry — it was months later, like, it was in two thousand and six when I had sex for the first time. Yeah, see, I don't give a shit. Like, anyway — Westin, welcome to the show.

Speaker 10 (01:05:23):
Hey, happy to be here. Long-time, first-time. Long-time, first-time. And, sorry — true OpenAI hater — sixteen.

Speaker 3 (01:05:31):
Wow.

Speaker 2 (01:05:32):
Nice, it was comfortably.

Speaker 1 (01:05:35):
Yeah, that is — that is so much cooler than I would have expected from a man who said the sentence, "It's amazing what you learn about somebody from sharing a spreadsheet."

Speaker 3 (01:05:46):
Westin. Westin's the dog.

Speaker 10 (01:05:49):
Yeah, oh yeah, that's been a long time since.

Speaker 1 (01:05:51):
Then only one.

Speaker 2 (01:05:54):
No, no, it's great. I, too — and I'm lying — have never had sex. No, okay, so, putting my lack of sex aside: we have had to delete Robert's answers due to issues of national security. So, Westin, what are you doing here at CES? What are you up to?

Speaker 7 (01:06:09):
So?

Speaker 10 (01:06:10):
I'm a game writer and I'm a copywriter, and when I copywrite, I work in certain niches that are CES-adjacent, like gear that content creators use to do podcasts and videos and stuff, and film production equipment. And I've narrowly avoided working CES in my lifetime, like, three or four different times. It never happened, and when I had the opportunity,

(01:06:33):
I was like, well, I should come scope it out. And I wanted to see all the amazing gadgets on the trade show floor.

Speaker 2 (01:06:40):
Yeah, and this was a bad year for that.

Speaker 10 (01:06:43):
Sadly. So... do you guys really live like this? Like, every January, you just come

Speaker 2 (01:06:49):
here. And yeah, it's a fucking drag. They have liquor on the show floor sometimes.

Speaker 1 (01:06:56):
Is that for real?

Speaker 2 (01:06:57):
Some of the booths have bars, you know. And did you say Comdex?

Speaker 3 (01:07:01):
Oh my god, it's like because that's what used to be.

Speaker 2 (01:07:04):
Right, Yeah, that's I.

Speaker 9 (01:07:06):
Could see the Mobile World Congress. Wrong. I was there
the year Angry Birds came out and the Rodeo guys
rented out like a whole beachside cabana, and every Rovio
employee I saw at the party they were hosting was
like blacked out to the new or past the point
of but that makes a whole company.

Speaker 2 (01:07:23):
It was amazing. No, but that makes perfect sense, because for a while they were like, Rovio is going public, Angry Birds is the next Marvel — well, I guess it would be Disney at the time.

Speaker 9 (01:07:32):
They were a bunch of, like, twenty-two-year-old nerds who suddenly had, like, a three-hundred-million-dollar company, exactly.

Speaker 2 (01:07:37):
They went insane. They were, like, German or something, like Swedish — like they were a Nordic nation, maybe the Swedes, right? Yeah.

Speaker 9 (01:07:44):
No, I mean because I met a couple of Americans,
but I don't know. It may just have been that
their team had some I don't Yeah.

Speaker 2 (01:07:51):
Peter, a friend of the show — that man could drink me to death, I don't know. But, like, if it's Swedes, I'd be surprised if you could even see the alcohol.

Speaker 9 (01:07:59):
But most of what I remember from that party is
that there was a box of Cuban cigars that they
just like had on a table and at a certain point,
I was like, none of these kids smoke.

Speaker 2 (01:08:07):
You just took them, the whole box? Yeah, you looted the Angry Birds party.

Speaker 3 (01:08:14):
It did.

Speaker 9 (01:08:14):
Like, two and a half hours had gone by and no one's touched these fucking — I think they were Romeo y Julietas — and I was like, fuck it, I'm taking them.

Speaker 10 (01:08:21):
So this is my first CES. Yes. So what you're saying is, in two thousand and nine they had apps — like, it's just a bunch of phones in a booth and you can play them?

Speaker 9 (01:08:28):
Apps were huge, because it was a new idea. Well, you had your waves — you had, like, the first smartphones, and to an extent the smartphone and app waves hit at the same time, where it was like, every other booth is a new smartphone. Or, like, Google — Google had this massive booth the year after Android came

(01:08:49):
out. Yeah, they had a slide one year. It was, like, three acres, because Android was only a year or two old. Oh, it used to be so much fun. It was way better back then.

Speaker 2 (01:09:00):
That's kind of the thing that I think is missing as well. Like, putting aside the fact that there's no useful stuff,

Speaker 3 (01:09:05):
It doesn't even feel.

Speaker 2 (01:09:06):
like, lighthearted. Like, there used to be something dorky. Like, E3 in its heyday used to be kind of silly, and also very sexist in a way that was reprehensible and remains that way — now E3 is dead. There used to be a dorkiness to it, like, "fuck it, Google wants a fucking slide." Like, the big tractor is the only bit of whimsy in this place.

Speaker 9 (01:09:25):
One year I went to E3, they had, like, two hundred extras all dressed as North Korean soldiers, like, from Homefront.

Speaker 2 (01:09:32):
I think it was Homefront, written by the Red Dawn guy.
Red Dawn guy.

Speaker 9 (01:09:37):
Yeah, written by Milius. Two hundred fucking North Korean soldiers marching around the place. It was nuts.

Speaker 2 (01:09:47):
A little North Korean whimsy. But this place is bereft of whimsy. It's like a sexless goon cave. It's just doing stuff for the sake of doing it, with no level of, like, self-effacement.

Speaker 1 (01:10:00):
Totally, there was no sense of humor.

Speaker 3 (01:10:02):
It's silly, and you should be silly about this stuff.

Speaker 9 (01:10:05):
It's not silly. Our pin that has a chatbot on it has to succeed against the four hundred other pins that have chatbots on them, and the glasses that have chatbots on them, and the chatbots that are actually three different chatbots if you think about it.

Speaker 2 (01:10:17):
But no, I think the general lack of whimsy here is just very sad, because the whole thing about tech is it should be kind of fun — especially for something like this, where it's ostensibly for consumers — but it's like, we're trying some shit.

Speaker 7 (01:10:31):
Ow.

Speaker 2 (01:10:31):
It's like a concept car. Like, I saw a concept car for an autonomous vehicle, which is a bit everyone does every year. Like, last year TCL — they make TVs — had a concept car, and this year I saw one too. It was also half-assed.

Speaker 1 (01:10:43):
I will say, what you're describing is tech that doesn't fully work but is, like, taking a big swing, with everybody knowing it doesn't fully work. Yeah, we all sort of forgive it. That actually describes a lot of what we've talked about: the laundry robot that doesn't fold, the find-your-keys thing that doesn't find your keys. But there

(01:11:05):
is no — there seems to be no feeling of experimentation, there's no, like...

Speaker 2 (01:11:11):
They're not big swings.

Speaker 9 (01:11:12):
I think the problem is — something like a laundry-folding bot, at least, that remains a high bar for a robot to reach, and so every attempt at it, at least you're trying to do something that we don't really have down. But by volume, the vast majority of, like, new shit I saw was: you know this product that already exists? It's got a chatbot. Now

(01:11:32):
you can talk to it. It can transcribe your conversations. I saw, like, a million glasses and rings and pins, a bunch of different individual products that are all: it has a chatbot, it records your conversations, it has translation. Right, yeah.

Speaker 10 (01:11:49):
The coolest thing I saw today, it sounds like, doesn't even normally exist at CES, which is an install — like an interactive install, like out of Comic-Con — for a movie.

Speaker 3 (01:11:58):
What.

Speaker 10 (01:11:59):
Oh yeah, Rebecca Ferguson movie. We're talking about this scene.

Speaker 2 (01:12:01):
But I saw the booth.

Speaker 10 (01:12:03):
Mercy. And it totally fits this, because it's dour as hell, because it's a dystopian-future sci-fi where an AI is judge, jury, and executioner for people on trial and stuff, and Rebecca Ferguson...

Speaker 2 (01:12:14):
So it's, like, replaced the criminal justice system.

Speaker 10 (01:12:17):
The video that they had — so the booth is blacked out, like it's a defense booth, like you're at a cop show or a military show. And so I just see this video playing on the side, and it takes a second for me to register that it's Rebecca Ferguson in the video and it's not just an ad for a real company. So yeah, Mercy, coming out January twenty-third.

Speaker 9 (01:12:35):
It's very fun, because — you said it's the one with Chris Pratt. When I went up to the booth, I asked, like, what is it, and she said, well, look inside, and then I'll tell you. And I looked inside, and it's, like, a guy in a chair — well, it looks like he's handcuffed in the chair — where it's like, yeah, clearly this is a "robots are judging us" kind of movie, or we have an AI judge. So I walk out and I'm like, I don't know, is it like a game? And she said, no, it's a movie trailer.

Speaker 2 (01:12:58):
And I was like oh.

Speaker 9 (01:12:59):
She's like, haven't you heard of the movie? And I was like, no. You haven't heard of Mercy? No. And she said, well, it's got Chris Pratt in it. And I was like, well, okay. That didn't... that...

Speaker 1 (01:13:07):
That did not win me over. You're not selling it. Yeah, I think Rebecca Ferguson cancels out Chris Pratt, but all of her draw is also canceled out by Chris Pratt.

Speaker 2 (01:13:19):
So it's just a flat line.

Speaker 1 (01:13:20):
Yeah.

Speaker 2 (01:13:21):
I am so neutral on it. I'm fucking done with things like "what if AI takes over everything." Make a movie where it's like, everyone bet on the wrong horse. Like, make something where the dystopia is societal failure based on stupid choices, yeah — ones that are not just political or "what if the AI is so powerful."

Speaker 1 (01:13:38):
Ed, are you saying one of your listeners should fund a movie that I write about this?

Speaker 2 (01:13:43):
Yeah? Fuck it, Yeah I can be in that.

Speaker 1 (01:13:46):
Great.

Speaker 9 (01:13:46):
The idea I've had, that I was thinking about since we were doing our episodes on nuclear war, is: okay, you don't have Skynet gain control of the nukes, but instead you have the military adopt, like, an AI program that links into a bunch of its listening stations to analyze incoming signals intelligence data, right? And it errantly says that there's an attack imminent, and so

(01:14:09):
the humans decide that they're going to start by firing all of the missiles. And it's the job of our protagonists to convince all of these generals and politicians — who are losing their minds over the conclusions of this AI — that, like, we are not in danger and you do not need to fire all of the nukes.

Speaker 3 (01:14:25):
If you made.

Speaker 2 (01:14:27):
an Armando Iannucci, Death of Stalin sort of thing, that would be really good. Because Mission: Impossible, the last one — if you watched this fucking movie, great; if you didn't, it's so long and so bad, and, I don't know, it's like, the Entity controls all of the nukes, and it's like, who gives a fuck. I watched that movie the whole time just like, I hope they all fucking die. Like, I'm just like — well, all these people,

(01:14:49):
except Angela Bassett, she's all right.

Speaker 1 (01:14:51):
Speaking of Rebecca Ferguson — movies where somebody has to convince somebody not to fire preemptively — there is the movie on Netflix that came out a couple of months ago that is pretty much exactly that. But it's not exploring AI at all. It's that they pick up a signal that there is something incoming and they can't figure out what it is.

(01:15:12):
And that I mean, oh yeah, watch that movie.

Speaker 2 (01:15:15):
Yeah, yeah, yeah, that movie. Yes, yeah. And Rebecca Ferguson — I actually didn't remember who she was, I looked her up — she was actually in Mission: Impossible – The Final Reckoning. And my

Speaker 1 (01:15:23):
Fucking boyfriend has a big fucking crush on her, So
I hate her.

Speaker 2 (01:15:27):
Stewart, I kind of like her. Stewart, come on the show — you're very funny.

Speaker 10 (01:15:32):
What if in your nuke movie? The nukes also had chatbots.

Speaker 9 (01:15:36):
Well, yeah, I mean, you've got to — because the nuke's got to be able to tell you, like, how... it needs to have a personality.

Speaker 10 (01:15:40):
It's like a Doctor Strangelove thing, where, like, every Peter Sellers character is a chatbot.

Speaker 2 (01:15:45):
Yeah, each agent is like: you got this, you got this, it's gonna be okay.

Speaker 9 (01:15:50):
It needs to be able to text, like, the bombardier who drops it: hey, I just want to let you know that I think you did a really good job.

Speaker 1 (01:15:57):
Fly today. Fly.

Speaker 9 (01:16:00):
Yeah, people are going to be angry for this, but
you did the right thing.

Speaker 2 (01:16:04):
It's going to be dangerous, but it's also gonna help... I do think that we're approaching an Iannucci-verse far more than anything else at this point, where it's just very silly and bad, but not in, like, the nuclear-holocaust way. It's just going to be, like, everyone shits their pants at once, economically. But yeah, that's also bad.

Speaker 9 (01:16:24):
I mean, you maybe see some, uh, looming signs of that here, with just the number of things that they're, like, pulling back on. Like, I'm still seeing a lot of the same claims made about how self-driving is going to take off. Like that car you were talking about, Tensor, the world's first robocar, that the PR lady was absolutely certain is

(01:16:47):
definitely coming out in the next year or more, and it's... it's Level

Speaker 2 (01:16:51):
Four self-driving. And I was like, okay, what is Level Four?

Speaker 9 (01:16:54):
Well, she said, like a Waymo. And I said, well, Waymos are only allowed to drive in a really small set of areas. So what about this? Because you're kind of advertising that you can self-drive it wherever you want. Or is it just that you can self-drive it if you're in downtown San Francisco? And she's like, well, whatever we do, it's going to be completely compliant with the

Speaker 2 (01:17:13):
Law and safe.

Speaker 9 (01:17:14):
And I was like, that's simply not an answer about
where your car can go.

Speaker 2 (01:17:21):
What are you driving on today?

Speaker 3 (01:17:26):
On the law?

Speaker 2 (01:17:28):
Was this a booth person?

Speaker 9 (01:17:30):
This was a booth person. Now, to be fair, so I'm accurate: the car, you can also drive it — it has a steering wheel that flips out. But I was asking about, like, the self-driving stuff, because they show it driving in a lot of areas.

Speaker 2 (01:17:42):
I also feel like self-driving, at this point, is not a solved problem, even remotely. But it's like, we've seen it. I'm not sure what else you can show here unless it's driving you around.

Speaker 9 (01:17:52):
It keeps getting better, it's becoming functional in other

Speaker 2 (01:17:54):
Areas, right, but how would you show that in the
convention floor?

Speaker 9 (01:17:58):
They have a bunch of methods, because a lot of the booths, they'll show you: here's what the camera sees. And, like, the big thing this year was demonstrating here's how much better the sensors are at cutting through fog and cutting

Speaker 2 (01:18:09):
Right, Which is cool, I give them that stuff.

Speaker 9 (01:18:12):
Right, but yeah, it's it's not the same as there
being a bunch of new products. A lot of times
you're looking at like a disembodied piece of an automobile
and it's like here's.

Speaker 2 (01:18:21):
And a bunch of flashing thingies, and it's like, look, look at all the things we can see. Look, a cat.

Speaker 9 (01:18:26):
Yeah, and I don't doubt that it's important, But I
guess what I'm kind of gathering is that like we're
continuing to slowly figure out this technology and maybe one
day it'll get there. Before you know, society collapses.

Speaker 2 (01:18:38):
Yeah, it's like the CES trough of disillusionment. Yeah, the whole show, everyone's just kind of like, yeah, well, we're still working it out, give me money.

Speaker 1 (01:18:47):
And what I want to reiterate is, when we were talking about the whimsy — the kind of stuff that doesn't work and isn't gonna be ready for a long time was acceptable when there was some kind of self-awareness: yeah, it's not gonna be ready for a long time. When a company is trying to pass it off,

(01:19:09):
is hoping that you don't ask the questions of "when will this be available," and "how much money is it," and "can it do this" — when they seem to be hoping that you just buy the ruse, that's when it's frustrating.

Speaker 2 (01:19:18):
And that was very much usually reserved for, like, Kickstarter and Indiegogo shit, where it's like, okay, you're kind of backing something vestigial, and you usually wouldn't back the one that couldn't show you something, because those ones will disappear with your money. But now, like, Nvidia is running things like a Kickstarter, because you can't get refunds or change things, which is so cool.

Speaker 10 (01:19:38):
You apparently don't have to pay for anything now either.

Speaker 2 (01:19:40):
We love it, folks. But it's so strange as well, because, I mean, sure, you've always got hucksters at CES, but there would be a level of mad scientist to it — like the LG CLOi robot.

Speaker 9 (01:19:52):
Ah, CLOi. Oh, CLOi. Oh, CLOi. CLOi is your new best friend. He's a robot that can do your dishes very slowly, and also can't really do them, not, like, well. And he can fold some kinds of very simple laundry, like towels — again, very slowly.

Speaker 2 (01:20:09):
And yeah, like you were saying, it couldn't fold them properly. It didn't look good at it.

Speaker 9 (01:20:15):
No, and if you're kind of old and fading, it
can remind you to eat fruit.

Speaker 2 (01:20:19):
Yeah, its name is CLOi, but "CLOi" is, like, not a natural thing to say. But the thing is, something like CLOi, they used to be like, this is just a concept. Like, Samsung a few years ago just had a giant, like, circle thing you could sit in, and it had giant screens, and they were very much like, it's the Samsung crazy experience. It's just some shit to show. Yes, and everyone went in and went, like,

(01:20:39):
cool, Samsung's fucking around with screens, because Samsung makes screens. But this was like, yeah, it's the year of robots. Yeah, it's the year of robots in your home. Well, okay, not your home, or a home anywhere, but this home we built here, which it kind of doesn't work in. But what if it did? But it doesn't. Please don't. Can you buy it? When can you buy it? It doesn't exist.

(01:21:00):
How much is it?

Speaker 9 (01:21:01):
We watched the video of this robot with the woman whose hair is, like, studiously gray, even though she doesn't look like she's out of her late forties, but they clearly want you to think of her as old for the sections where it looks like it's a robot for helping an old person. But then here's some kids playing with it, so you know that it's good with kids, because they'd love it if families bought it too. But we don't really know what

(01:21:21):
people want out of a robot that looks like a cross between the robot from — fucking, what's

Speaker 2 (01:21:27):
that eighties movie? Not WALL-E — the one

Speaker 9 (01:21:29):
before that — wait, which decade is it? Silent Running? No, no, no. Fuck, the nerdy...

Speaker 2 (01:21:33):
Batteries Not Included? No. I will say, email me if you remember the one. I'm just imagining my

Speaker 9 (01:21:41):
dearly departed... It's not a humanoid robot, the one I'm thinking of.

Speaker 2 (01:21:43):
Yes — Marion, my dearly departed grandmother, would have fucking hated these things. Also, old people, their brains still fucking work. They will feel the condescension of the world's slowest machine doing something.

Speaker 3 (01:21:56):
But on top of it, if they were just.

Speaker 2 (01:21:58):
like, here's some shit we're trying now — like, we've found pistons and, like, sensors, we're trying these new sets — I would be, fucking, like, fuck me up. Like, fine, show me the new things you're working on. But it's framed as real, and there are journalists writing "this is the CES of robots," and you're being fucking played, every single fucking one of you.

Speaker 9 (01:22:16):
Well, I mean, there's industrial robots, which, like — it looks like some of them are getting better. I'm no expert, but I've seen, like, big robot arms that are folding paper with a lot more dexterity than I'm used to seeing, and stuff like that. And even, like, there was a humanoid robot with hands that could just do that. So I'll believe that there are some, like, industrial applications. But so

(01:22:39):
much of the show is set on selling the idea of a home robot, and none of those look like they can do anything useful, and they'll almost certainly be expensive. And I don't know why you think people want one.

Speaker 2 (01:22:49):
I just feel like — the CTA, who runs this show — how is there not, like, a fraudulent element to this, when most of the show appears to be deception? Because, like, I've been to a lot of these; I've been coming here since twenty eleven. I've seen a lot of weird shit that was kind of vestigial, and like, okay, we're working on this, maybe it will happen, maybe it won't — but not most of it. And it

(01:23:13):
feels like the ultimate endpoint of just never fucking truly holding their feet to the fire and never asking. And I keep going back to this, Chloe. I will actually say this just on behalf of me and the listeners: you keep saying these derisive things about yourself that are incorrect, like, "oh, I'm a layperson."

Speaker 3 (01:23:27):
Oh I'm stupid, You're not stupid.

Speaker 2 (01:23:28):
You're asking the questions that the journalists should ask, like, who cares? And it's like, well, who cares? But also: is this real? Will you ever make it? Does this do the thing it's supposed to do? The fact that the CTA themselves don't have this level of scrutiny is disgraceful.

Speaker 9 (01:23:44):
Well, yeah, I think they made the decision a long time ago that they're in the business of, number one, providing real estate to these companies. Yeah, they're working for the companies selling this schlock, right? Like, that's their job. And you would think — and I think maybe in the past there was a broad understanding — that part of their job was to ensure that the tech industry remained healthy and didn't become what it's turned into.

(01:24:08):
But it's been a while since anybody in the business
world felt like they had responsibility for anything.

Speaker 2 (01:24:13):
But even then, this feels like almost the beginning of the punishment, because what happens when all of these companies with the LLM wrappers die? Who's going to pay for these fucking booths? Because if there were other companies that would pay for them, they would have paid. I saw a fucking chunk of the Venetian — or the Palazzo — casino closed

(01:24:35):
off today because there weren't enough gamblers. Oh, gamblers aren't here during a business week? What is going on? We can't con people from other countries into gambling anymore?

Speaker 1 (01:24:45):
I did wonder if.

Speaker 9 (01:24:47):
The people from other countries are visiting as much.

Speaker 2 (01:24:49):
Yeah, well, that's probably because there isn't a good enough catch... I'm joking, joking. Chloe, what were you gonna say?

Speaker 1 (01:24:55):
I did wonder if this population is, uh, like, a gambling crowd?

Speaker 2 (01:25:00):
Yes — this population, yes. It used to be, at CES, people would come, get rip-shit drunk, and they'd sit or stand at our beautiful craps tables. They'd be going all the hard ways, they'd be doing big six, big eight, all the dumbest bets. These motherfuckers put so much money on midnight, which is twelve on the craps table. That's how you know you have a healthy Vegas: when you've

(01:25:21):
got some dumb fucks at the craps tables. Now we've got empty craps tables.

Speaker 1 (01:25:25):
This is a bad sign me making really smart bets.

Speaker 3 (01:25:28):
Yeah, what, you've got a system.

Speaker 2 (01:25:30):
That's why I play. I haven't gambled in years, and I live here, which is crazy. But no, I'm genuinely serious: something's off. Something is genuinely off.

Speaker 10 (01:25:41):
So, I don't know — sorry. Well, I don't have the perspective with CES. But I gotta say, I don't know how many of the talks and panels you guys went to, but I spent a good amount of time at them.

Speaker 2 (01:25:49):
You see.

Speaker 10 (01:25:50):
Oh, I went to the Aria, and the main through-line I thought was a little odd, because there's definitely execs doing the lines, playing the hits: you know, "use AI now or your job is in jeopardy," and, you know, "I know of companies where one hundred percent of the code is written by AI," right? Here's the... And I've heard that

(01:26:11):
stuff for years, so it's a little bit like, this is the same, we're still doing this. But there are these little glimmers of things that make me go, huh. Like, the word "incremental" showed up in, like, three different talks, where it's just like, what does the next year mean to you? And it's like: incremental. You need to be going to your customers that are using these tools, you know, in advertising, and asking

(01:26:32):
for incremental gains. And I'm like — not revolutionary gains, not little barn burners going from here to here. There was a talk with one of the guys who runs this spinoff company from Telus, which is the Canadian telecommunications company, right, I think. And he's like, yeah, we saw one hundred, two hundred and fifty million in savings, you know, using AI tools. And

(01:26:54):
I'm like, all right, what's Telus's revenue? And that's, like, you know, point zero four...

Speaker 2 (01:26:59):
Eight? Sure you did. Sure you fucking did. Anything else, I'm going to strap you to one of the fucking chairs from Saw, I'm going to fucking handcuff you to a radiator, and I'm going to have Jigsaw extract from you how you saved that money, because I bet you didn't save a fucking penny with large language models, you lying sack of shit. Let's get specific —

(01:27:23):
get real incremental with a bone saw.

Speaker 9 (01:27:24):
And the big shift that you're seeing here, I think: previously, this thing was about consumer electronics, and there's always been, you know, software and apps and stuff, but it's been very focused on things that people will use, right, and that's what journalists are here to report. It was like, here is all these companies' vision of what people want to buy and what they want to
(01:27:46):
use and what will delight them and what will be
useful to them, and a lot more of the emphasis
since all this AI stuff has come to centralize it
on is here is a product that will allow you
to hurt a customer. Here is a product that will
allow you to extract from people without priding really or
while that you're sneaking into his service.

Speaker 2 (01:28:04):
There was a company.

Speaker 9 (01:28:05):
I went to their booth and got a walkthrough — SoundHound AI. Oh, fucking SoundHound. Yeah. And their whole thing is, we make agents. We make agents for companies. And one example they gave of an agent was basically an AI doing the job of taking orders for Burger King, right, like taking down orders, and it also controlled the menu. And so the first thing that

(01:28:25):
they showed — the guy demoing it showed someone being like, tell me what burgers are good, and while there was a menu up in front of them, instead of looking at the menu, asking it. And so it suggested a burger.

Speaker 2 (01:28:35):
And I was like, I.

Speaker 9 (01:28:36):
Don't think people will use it that way often. That's
a weird thing for him to show. But then I
understood why, because he was talking to this group of
like three or four people, and he was talking to
them about how and obviously, you know, Burger King feels like, Okay,
we have this much stock that needs to get moved
right now for whatever reason, because it's like close to expiring.
We can make sure to like really push that to customers.

(01:28:56):
Or if we want to sell more, you know, more Whoppers with cheese, we can push the Whopper, we can recommend that to people. We can even alter the menu if things are out, or if we really want to push certain items. And he said, like, for example, if Coke wants to really sell Vanilla Coke, we can start pushing large Vanilla Cokes; you can push specific sizes. And I was wondering why they were talking about this,

(01:29:17):
because, again, people don't use it that way. And then I realized: the gaggle of four people that the rep from SoundHound was talking to — I looked at their badges, and they all worked for Coca-Cola. And so this was a pitch to them: this is the future of ordering at fast food restaurants, and if you work with us, we'll push Coca-Cola on customers.

Speaker 10 (01:29:37):
A lot of the marketing in this bubble — like, you know, as a writer, I've spent years watching creatives in Hollywood and games and stuff just feel like they're on the back foot because they're being threatened by

Speaker 2 (01:29:48):
This marketing, yeah.

Speaker 10 (01:29:48):
But the marketing is not even aimed at them. When somebody says "you have to learn this or else somebody is going to take your job," that's not actually aimed at that person. It's aimed at the investors. Yes, right. And that's some of the most significant bullying, you know.

Speaker 2 (01:30:01):
But the funny thing is, SoundHound was basically a Shazam competitor. But what's really good about that — Adam brought this up yesterday — is there's a famous tweet where it's like, every Gemini commercial is "what should I eat for dinner?" and it goes "sandwich" and they go "holy shit." And it's literally that. It's literally "what burger do I get?" "Burger." Holy fucking shit.

(01:30:22):
That will be one hundred and fifty thousand dollars a month.

Speaker 9 (01:30:25):
And they showed their car AI program, where you can build your own agents in your car, and he gave it these capabilities, and the demo that he ran was: oh, my car's making this weird noise. And he described the noise, and the chatbot in the car first said, that sounds like it's this kind of problem, the average cost to repair is seven hundred dollars, do
(01:30:46):
you want me to find a place for you to
get your car repaired? Aliens, Yes, he said, I'll call
the dealership.

Speaker 2 (01:30:51):
I've booked.

Speaker 9 (01:30:52):
I've booked a spot at the dealership. While you're there, do you want me to schedule a test drive in a new Nissan whatever-car? Yes, my wife loves that car. And it was all — like, first off, nobody, unless, I guess — I'm sure there are, like, rich people who are like, oh, a problem? I'll take it to the dealership and pay whatever it costs. Most people are like, okay, well,

(01:31:14):
I've got a guy who didn't rip me off too bad before, I'm gonna see if he can diagnose it. And, like — straight to the dealer? Seven hundred dollars? Okay, no thanks. And, like, yeah, I'll take a test drive, I'm always in the market for a brand-new car. It's just — it's all predatory for the person driving the car. Because everything that got the engineer excited, the

(01:31:35):
only stuff he wanted to talk about, was: and here's another place where I'll try to sell you something.

Speaker 2 (01:31:38):
That was it for the AI. The thing is, I think the tech industry has become guys selling stuff to guys that will not use it. It's just companies selling things to companies.

Speaker 1 (01:31:49):
Numbers.

Speaker 2 (01:31:50):
You don't need numbers. You don't need real numbers to raise money. Yeah.

Speaker 9 (01:31:54):
Well, and that — yeah, because one of two things will be true. Either all of these AI agents being integrated into these products will drive sales significantly, and that will be the new thing advertisers fight over — and I'm not in love with that future; that said, it'd probably work better than banner ads —

Speaker 2 (01:32:08):
I don't think it'll work, and I don't think it's good.

Speaker 9 (01:32:11):
It'll work for a while, at least until they get
data on whether or.

Speaker 2 (01:32:14):
Not audit drives open. Aye tried to integrate shopping things.
We're just going to get onto that. But we're coming
to the rotation points. We're going to start rotating owls.
We're going to move on to the final section, and
I must say this upcoming AT, I don't I.

Speaker 9 (01:32:29):
Don't know what we're doing under your under your.

Speaker 2 (01:32:33):
I appreciate that — it's now a flawless transition. The upcoming ad is probably for a podcast or some product or some shit, but I need you to click it, or listen to it, whatever the fuck you do. I've never heard the ads. I don't know what they are. If you complain about the ads — please, help me, Diane. I'm once

(01:33:04):
again in front of a microphone inside the Palazzo Hotel in Las Vegas, Nevada, and everybody's staring at me because I keep saying strange things into the microphone. I'm on my eighteenth Diet Coke and feeling more alive than I've ever been. Joining me is actress and stand-up comedian Chloe Radcliffe for, sadly, our final episode with her.

Speaker 1 (01:33:22):
Wow, this is it. We've come so far.

Speaker 2 (01:33:25):
Honestly, the listeners fucking love you and I need to
bring you back, so they're going to be pissed.

Speaker 1 (01:33:30):
I'll come back anytime you are.

Speaker 2 (01:33:32):
And to your left is another person who I need: Rob Pegoraro, the tech reporter who has been around for a while. Nineteen...

Speaker 3 (01:33:39):
Ninety-eight was when you started. It's been a minute. Yes, genuinely.

Speaker 2 (01:33:43):
Multiple people have emailed about him, both being, like, "we love you," and then one slightly horny person who was like, "Rob Pegoraro ASMR." And then to your left is Robert Evans of Behind the Bastards, and who

Speaker 9 (01:33:57):
I have heard two of our names in the room
right now.

Speaker 2 (01:34:00):
So many horny things — I, like... no, I mean, quite literally, I'm so sorry. But — and then, okay, we'll just move on — Robert Evans of Behind the Bastards, my

Speaker 1 (01:34:10):
Bid, but me, anybody's anything horny about me? I got
an olive in my mouth.

Speaker 2 (01:34:16):
You're going to get it. Yeah. I actually have just
describing this. Chloe just has an entire like one big
SPI and I said to.

Speaker 3 (01:34:23):
Phil, that's pretty good.

Speaker 9 (01:34:24):
It's just there's a Greek man sitting in a room
that's nothing but jars of olives, going like, all.

Speaker 2 (01:34:29):
Right, finally, finally podcasts do this. That's yeah, that was terrible.
I have not done any Greek.

Speaker 1 (01:34:38):
Do you want to try that one again?

Speaker 2 (01:34:39):
I know my brain is like I don't know what Greek? No,
I can't, I truly don't.

Speaker 1 (01:34:44):
Know, Like is Greek an accent that white people can.

Speaker 2 (01:34:47):
Do. White people do Greek all the time. There are white Greeks. Okay, well, let's move on from that one.

Speaker 8 (01:34:54):
A fun bit, I, I... one of my favorite bits, one of my first bits where I got Europeans really mad. I read this article that was arguing that, okay, like, Greeks got inducted into whiteness at the end of World War Two, and I shared it, and I got a bunch of Germans who really hated me sharing this article. This was around the time they were trying to make them pay their debts back.

Speaker 2 (01:35:17):
Yeah. Yeah, that is Edward Ongweso Junior, the writer of the Tech Bubble newsletter and classic staple of the Better Offline experience. We were just talking about why people do CES, and I know...

Speaker 1 (01:35:28):
I thought you said why white people do racist...

Speaker 2 (01:35:30):
Accents white people do.

Speaker 1 (01:35:33):
They're very fun.

Speaker 2 (01:35:36):
That's a joke. She is doing a bit. We are doing banks with Bandmaxing. No, we were just talking about why people do CES. And I said earlier, it's like, okay, people do these shows because it's an excuse to get people together. That's completely true. I also think that there is something kind of magical about CES, because you get, like, people just happen to be here, and if you take it for what it is, which is a

(01:35:57):
convention center full of bullshit, you can see people. Like, everyone's in one room. I'm really happy to see, like, everyone. Everyone is in different places. Some of them, like Chloe, you're in New York, so I see a decent amount of you. Rob, you're in DC, and Ed, I see you. But like, people get busy, and Robert's in Portland. It's like, it's nice to get people together. And yeah, the more you suffer, the funnier it is to talk about it. And also,

(01:36:18):
I don't know, this podcast fucking rocks, and it's kind
of fun to do something weird and different every year.
I wish there was more on the floor to be
excited about, though, Ed, what have you been up to today?
We haven't heard much from you.

Speaker 8 (01:36:33):
Well, funnily enough, I found that there's this AI voice synthesizer that lets you do any accent that you wanted to do. And I was on the floor doing some of the worst... it wasn't good, and it was really...

Speaker 2 (01:36:48):
Was it good at anything?

Speaker 3 (01:36:51):
Yeah, No, I wish you know.

Speaker 1 (01:36:54):
Because you're like, I'm not the one, Yeah, Robot, I'm.

Speaker 3 (01:37:01):
No to be clear, this is not real.

Speaker 1 (01:37:03):
This is just me, really. Yeah, yeah, this is... man, could you imagine, but...

Speaker 2 (01:37:09):
That sounds like prostes. There would just be a line out the door of guys, just being like, guys with podcasts, guys with racist podcasts. Whoa, finally, it's gonna take our jobs. Like, yeah, woke is not back, yeah, because... what, that's right now? Woke CES is gonna

(01:37:30):
be fucking great. We were talking about the gender reassignment section.
It's gonna be fucking great. It's gonna be illegal to
be white for one part of the convention, so that
it's gonna be fucking great.

Speaker 3 (01:37:41):
I cannot wait. Gotta open every day with the land acknowledgment.

Speaker 2 (01:37:43):
Yeah, exactly. Oh no, a brand acknowledgment, but just for, like, where brands like...

Speaker 9 (01:37:54):
Yeah, in ages past, BlackBerry or IBM would operate this booth, but...

Speaker 2 (01:38:01):
At one time Theranos would have been here, but they were... I do like that Elizabeth Holmes is now doing, like, epic, epic posting, where it's just, like, posting from jail, and being like, illegally, illegally canceled for doing fraud.

Speaker 8 (01:38:19):
Actually you know my My greatest fear about going to
jail was not being able to tweet. But you know
this is this is yeah, Now I don't have to
be afraid of it.

Speaker 2 (01:38:28):
Yeah, you are a person where I post, that's just
me at home, right exactly exactly.

Speaker 9 (01:38:33):
You can launch your own version of Theranos, which is just people mailing you blood and you being like Lucia.

Speaker 8 (01:38:38):
I'm gonna, I'm gonna be the first, uh, person in prison to have a fundraising round ongoing. Okay, they're gonna be clamoring for the... I don't know what it'll... maybe it'll be for... I'd do like a Potemkin AI thing that's just, like, other prisoners and they're pretending to...

Speaker 1 (01:38:55):
Be the Martin Luther King's Letter from Birmingham Jail redux, it's like asking for VC, but...

Speaker 8 (01:39:02):
It's just just beware of the credits and tells you
that you can't have a VC.

Speaker 2 (01:39:08):
Fund in prison. Chapter two begins as a chat. But I need to read you this Elizabeth Holmes tweet from twelve thirty-one, twenty-five. So, New Year's Eve: prisons are meant to rehabilitate prisoners and break crime cycles. Instead, comma, they shatter families and make better criminals. It won't break

(01:39:30):
me or mine, but watching it happen daily is heartbreaking. Honestly, this fucking rocks. I'm just saying, if you're gonna post from prison, just be like, I'm not fucking guilty of the fraud I obviously did. Fuck you.

Speaker 3 (01:39:43):
Yeah, fuck yeah, let's just go for it.

Speaker 9 (01:39:44):
Like, I interpreted that differently. I thought she was saying prison is meant to, like, it like breaks people and makes them into better criminals. That's not gonna happen to me. I'm gonna stay a shitty criminal.

Speaker 2 (01:40:01):
If you only knew: my real story is so much more interesting than the one you have been told. Coming soon in twenty twenty-six, and that is seven ellipses, nothing hidden, nothing left out. The time for the truth has come. No one can hide from the light. Woman, you did actual fraud. I'm ready for it. I want to know. Elizabeth Holmes, you are invited on Better Offline.

Speaker 1 (01:40:22):
Question? Did are there other people who? Are there men
who have committed fraud as harmful as the fraud that
Elizabeth Holmes committed? I'm not denying.

Speaker 9 (01:40:34):
Yeah, well, I mean, there's Madoff, the Enron guys, but they...

Speaker 2 (01:40:38):
Didn't get culturally crucified in the same way.

Speaker 1 (01:40:40):
Right, they have been imprisoned.

Speaker 2 (01:40:43):
They have been imprisoned.

Speaker 9 (01:40:44):
Shkreli is the closest to getting that level of, like, notoriety. But it also worked out for him a lot better than her. Yeah, yeah, yeah, he built a little bit of a weird brand on it.

Speaker 1 (01:40:55):
Yeah. I think he's a DJ now, right?

Speaker 9 (01:40:58):
Really? That journalist slept with him too, which was weird. DJ allegations.

Speaker 2 (01:41:04):
Yeah, there's a Bloomberg reporter who ruined her life for Shkreli. Yeah, he's had a weird post-prison run.

Speaker 1 (01:41:11):
Are there are there people who have committed fraud at
a at a harmful level, at a like meaningfully harmful level,
who have not been who have not gone to prison?

Speaker 2 (01:41:22):
I mean from DC?

Speaker 3 (01:41:25):
Where do I start?

Speaker 1 (01:41:26):
Yeah, well, I mean specifically in startups.

Speaker 2 (01:41:32):
Clinicle, not Clinkle.

Speaker 9 (01:41:34):
This same way where they are literally starting a company on completely false pretenses and raising VC.

Speaker 2 (01:41:39):
I actually disagree. Clinkle. Clinkle was this one where you could... Clinkle raised maybe a little bit less, but they were just as prominent. They were also on the cover of Forbes. They were also, for some reason... Clinkle is sound-based payment things, obviously fraudulent. I mean, sound-based. Sounds...

Speaker 1 (01:41:59):
Like you have money in your pocket, then that means
we've paid you.

Speaker 2 (01:42:04):
He was a very damp-looking white boy, so I know. It was a company where you could transfer money with, like, hypersonic... It was bullshit. But that's the thing, like, that, that fuckwit didn't go to prison. None of the Web3 people who never did anything, not fucking Doodles NFT, which ended up backing off NFTs, not Alexis Ohanian, who is a fucking asshole who should

(01:42:26):
be crucified for his role in the racism and sexism of Reddit, let alone the amount of Web3 bullshit he pushed up a fucking hill. Putting that aside, like, nothing happened to all the Web3 people who did a bunch of criminal shit, like all of the blatant money laundering. And the question is, fucking... what about CZ from Binance?

(01:42:46):
And he got pardoned. Adam Neumann already got fucking three hundred million dollars from, from Marc Andreessen.

Speaker 9 (01:42:55):
Because there's, there's two differences, and we can debate which of them is more relevant in her getting penalized much more than a lot of other people who did similar things. But like, one is she's a woman, but the other is it was a health company. People, like... it was also cancer. It's actual fraud.

Speaker 2 (01:43:13):
It was actual fraud. It was like serious medical fraud. He also didn't do anything close. WeWork was just a terrible business.

Speaker 1 (01:43:22):
And that's why I'm asking, like, who harmed people to
the same degree that wasn't well.

Speaker 2 (01:43:26):
Actually, the question with Theranos is... I actually, like, I know, like, people had actually died. It was very bad.

Speaker 9 (01:43:31):
Just being clear, what about fucking mortgage-backed securities.

Speaker 2 (01:43:33):
Yeah, man, and that hurt a lot of people. FTX did a lot more damage, specifically in... Yeah, yeah, oh no, everyone involved with the great financial crisis, everyone who bet on this. And I'm gonna be fucking honest, I'm going to say this: every single company that invested in data centers that is going to lead to the financial collapse that follows is worse than Theranos. Every single person who pushed OpenAI up a hill is worse than

(01:43:55):
Theranos for what's coming. Yeah, yeah, I'm fucking going for it. I'm crazy, but yeah. And they will crucify Sam Altman if what I think is happening over there is happening, where they're misleading investors, which is my personal belief and not that of the iHeartRadio corporation. If that is true, Sam Altman should go to jail. If it's true, which

(01:44:16):
we do not know. If it is, he probably won't, because, like, if he was a white woman he absolutely would. But would he have been able to raise the money? Would they be doing this?

Speaker 3 (01:44:24):
It's but Elizabeth.

Speaker 2 (01:44:26):
Holmes was like an actual criminal. Like, we can't, we're not gonna whitewash this. She was an actual criminal who, like, talked with the deep Steve Jobs voice, which I think is a funny thing.

Speaker 1 (01:44:35):
The voice thing, Okay, I will say.

Speaker 3 (01:44:39):
This... the turtleneck, the whole, the whole bit.

Speaker 1 (01:44:42):
I... that is where I think sexism is coloring the entire interpretation, because truly, when I speak in a lower voice, I get treated differently. And when I speak up here, like that. You listen to women who speak in baby voices, and we don't say it's fucked up that women speak in baby voices, and you run into these women all the fucking time, and that's not their natural voice, that's not

(01:45:04):
the voice that, if they were just left to grow up in the woods, they would be speaking with. That's a performed voice, that's an identity that they have.

Speaker 2 (01:45:10):
Sorry, we've got no I won a way thing about this.

Speaker 1 (01:45:13):
But like, how women are perceived in society, our voices are a huge part of it, as they are with men. Think of gay men who sound gayer, like, they get treated differently, right? Like, all of that is part of our performance of power in society, or our performance of how power and gender are together in society. And so then I do think, and like,

(01:45:36):
her voice sounds insane. It sounds insane, but I also
am like that one I fully fun drive.

Speaker 2 (01:45:45):
Can you tell me more about... have you had, like, as a woman, have you had to lower your voice? And in what scenarios, if you have?

Speaker 1 (01:45:51):
I have a very low voice, right? I just, like, speak in a very low voice naturally, right? So it's more that I think the inverse is a much more common experience, where women do, like, fucking baby voices, and that is, that is like a relinquishing of power,

(01:46:12):
and that is a submissive stance. And for me, it's like, yeah,
if I talk, if I speak to somebody on the
street or in public, if I'm speaking to a stranger,
and I'm like, yo, if I talk at that level,
like people people turn expecting a man.

Speaker 2 (01:46:26):
Right, and to treat... And then this is an actual 30 Rock joke, with that "I'm a very sexy baby." Yeah, the manipulation through... No, it's just, I mean, I'm a fucking white guy, like, what do I fucking know. But it's like, the reason I ask is, like, is this a commonplace thing in the workplace for women?

Speaker 1 (01:46:43):
I'm like, I don't think it's commonplace. I don't think the voice thing... I don't think speaking in a low voice. Or, like, a higher voice... you hear higher voices all the time. Like, if you start listening for it, there's so many. Do you hear it? It's like, yeah, weird phony baby voices.

Speaker 2 (01:46:59):
I don't.

Speaker 1 (01:46:59):
I... the idea that a woman, like, artificially lowers her voice like Elizabeth Holmes did, that's... that is very uncommon, totally. And her voice sounds insane. Like, I don't dismiss that she sounds batshit. But it's also like, what we...

Speaker 2 (01:47:15):
It's the.

Speaker 1 (01:47:18):
Thing that drives me nuts is, like, that we, it, we will, uh, we will find fault in however a woman presents, right? And that when she tries to then model after a male presentation in some way, then we're like, well, that's wrong.

Speaker 9 (01:47:35):
But it really was. What really frustrates me is you get a lot of people who will point out, laughingly, she was clearly trying to imitate Steve Jobs in the way that she dressed, with the turtleneck, and point out, like, the style of her presentations. And what doesn't get brought up nearly as often, in part because people don't like to think of Jobs or other, like, heroic figures, like tech founder figures,

Speaker 2 (01:47:58):
This way, is one of the other ways she aped him.

Speaker 9 (01:48:00):
Jobs was very famous, not so much at the big public announcement level, but, like, when he was talking to investors, when he was putting together ideas, for vastly overpromising, right? And part of what he did was be incredibly anal retentive about, like, when a product was actually ready. But he had these, like, very specific, often

(01:48:21):
pie-in-the-sky demands for what this thing needed to do, how it needed to look, how it needed to feel, that he would promise could be done before they knew that they could do it. And he hired good people, and Apple was able to square those circles, Apple was able to produce. But there is a similarity, because her attitude was very much similar, which is: what I need to do is make the promise, and I need to aesthetically make the case of how this

(01:48:42):
is going to look and feel and what it can do. And as long as we can get enough money doing that, we can brute force the having-a-real-product thing. And it's not exactly the same thing, but it's close enough. Because almost every founder at some point relies on: I don't fully know that what I can do matches what I am selling to the people who are

(01:49:02):
investing money, and I don't fully know that the product that I am promising them I'm really going to be able to deliver, especially for the costs that I'm promising them. And that fundamental reality of how all this works, it gets obscured a lot, because it makes it, it makes it sound like gambling, and in a lot

(01:49:24):
of ways it very much is.

Speaker 2 (01:49:25):
I mean, what's the difference? I mean, there is a difference, in that Theranos did not exist. But what you just described is large language models. Like, it literally is it. It's: look, this thing fucking sucks and it doesn't do what you want it to, but what if we spent more money than anything that's been spent ever, ever, forever? What if it became something completely different? And people will try

(01:49:47):
and differentiate, because, okay, the first step with Theranos is bullshit. They just sent it to LabCorp. They didn't even have the beginning steps. That fucking sucks. But Sam Altman does not have the first steps to AGI. Fucking... Dario Amodei doesn't have the first steps to a fucking shit in a hole in the ground. Like, they're all doing... they're only a few steps from Theranos, and they will be

(01:50:09):
colored very differently, that they were just innovators taking risks. If you end up writing that when this shit pops, you're a mark in the same way that every single person who covered Theranos was, because it's the same shite.

Speaker 9 (01:50:24):
There was.

Speaker 2 (01:50:24):
There was a real good reporter at the Journal that... yeah, yeah, yeah. Several years after... all the major articles came out... John Carreyrou was... twenty fifteen, I think, that came out. There were Theranos articles for years, and there were several blogs before Carreyrou that were like, hey, I looked at the science. None of this made sense. It doesn't seem

(01:50:47):
like it can work. Yeah, and Carreyrou did, and he did incredible reporting, a really great job. But it's like, there were very few people who did. And I think, I think when Sam Altman gets crucified, as I expect, Evangelion style, it's not going to be the same way. And I think everyone needs

(01:51:07):
to prepare for... if you want to frame this as the innovator's dilemma, oh, you know, he was just trying something, that is not a fair way to look at it, because there was never a guidance point from that, whatever AGI bullshit happens.

Speaker 9 (01:51:22):
It's also fundamentally really bad for innovation. Like, the big frustration for me today was walking past probably a cumulative acre or two of different home robots. This is a robot servant. This is a humanoid robot that you can have do tasks for you. You can have it watch your kids, you can have it take your pets out.

Speaker 2 (01:51:41):
You know.

Speaker 9 (01:51:41):
Here we've got one, and we've got a robot dinosaur, and they're trying to get past each other. Watch them. Yeah, I was kind of like, yeah, some of it's cute, some of it's... There were a lot of, like, humanoid robots that were, like, awkwardly strung up like they were on a gibbet to keep them standing, because, like, either they'd broken or they couldn't stay up on their own power. Like, I wasn't

(01:52:02):
sure exactly what was happening there. But in the middle
of all that, I come up, like after like an
acre or two of this stuff, I come across a
little booth where it's like two guys at like a desk,
maybe two desks, and they've got just like a couple
of like stringy looking little like fabric tubes hanging and
then a couple of little, like, gizmos on the table.

(01:52:22):
And I go up and I ask, like, what it is. And it's a photonic muscle, is what they're calling it. And the way it works is... I'm pulling up the picture right here. Ara a Flex is the company, and it's a photonic muscle. And you have these little bundles that look kind of like this, and each of these bundles has an LED light in it, and there's, like,

(01:52:43):
a kind of fiber or something inside it, and when
the light turns on, it causes that that fiber to
contract in a way that mimics a motor or functions
the same way as a motor, and so you can
put them in devices. And the example they had was
there this this device that's like a I think, like
you call like an intake thing on like a car

(01:53:05):
right where you need these little flaps to open, and
the flaps open and close by the use of these
and they take up like half as much space as
a motor would take because instead of a motor you
have these little like synthetic photonic muscles that are much lighter,
much less space, fewer moving parts.

Speaker 5 (01:53:22):
It's the chance to see stuff like that, which is why I keep coming back, and yet I missed this company, because there's a lot to take in.

Speaker 9 (01:53:28):
There, and you, you have to walk past, again, all these, these companies that have put God knows how many hundreds of millions of dollars into humanoid robots, most of which never really have a chance of establishing themselves. And basically none that I saw, outside of, like, the ones meant for specific manufacturing uses, seemed like they had a creative or intellectually interesting reason or solved

(01:53:51):
a problem. And that so vastly outnumbers, in funding and attention... Like, somebody made a muscle with light. That's kind of, that's kind of sweet.

Speaker 2 (01:54:00):
The problem is, like, the idea of a robot that does one thing is not what people expect from a robot. Everyone wants to have one robot that is just a person, a slave. Let's just, let's just cut straight to it.

Speaker 5 (01:54:11):
LG's whole sales pitch was the Zero Labor Home, which, I mean, like, big if true, right?

Speaker 2 (01:54:17):
Yeah, yeah, but it's from some lave.

Speaker 9 (01:54:20):
Didn't you see the Cloyd.

Speaker 3 (01:54:22):
Yeah he's uh your keys willing servant.

Speaker 2 (01:54:25):
Yeah, but he's pretty fruit.

Speaker 1 (01:54:27):
It's demented that it is what people want.

Speaker 2 (01:54:30):
Yeah, it's a slave. It's you either want a completely
faceless machine. And I'm sure I want to make.

Speaker 11 (01:54:38):
Sure is the is the black guy, Like come on,
that's the guy.

Speaker 2 (01:54:49):
You know you can't see.

Speaker 8 (01:54:50):
But I got some hotep rings on, so I know something about slaves.

Speaker 1 (01:54:54):
I know something, you.

Speaker 8 (01:54:56):
Know in my newsletter I wrote it, there's this essay
a that was talking about how the roots of automation
are go back to slavery, right, so part of the
part of the the thing that developed the impetus for
automation is that, you know, you had the abolition of slavery,
and you had the British Empire being like, oh my

(01:55:16):
fucking god, how are we going to keep... you know, plantations are incredibly profitable. How are we going to keep these sorts of profits, because they're also integral to our empire, like for funding our navies and fielding our armies. And so the idea was like, okay, well, you have to make the factory resemble a plantation as much as possible. And I think most people, when they think of plantations, they think, like, okay, you know, it's outside, disorganized a

(01:55:39):
little bit, you have an overseer. But they were, like, very highly regimented, and they developed very specific methods of extracting as much labor as possible and terrorizing people. And so the automation that was developed by this guy, Charles Babbage, was like an attempt to do early divisions of labor and create machines that would be able to, you know, surveil workers, or workers would be able

(01:55:59):
to surveil themselves, or quantify their work, or divide it into pieces. Like, some of the very foundational theories of how labor should be organized going into, okay, well, how do we get a non-human along a division of labor, where we're separating as much work as possible into discrete, repeatable, substitutable parts, you know, which we already

(01:56:24):
designed to resemble slavery. How do we get that into this, like, embodied thing that also just happens to look like a person without a face, that we want everywhere in the house?

Speaker 3 (01:56:34):
So I don't know.

Speaker 8 (01:56:34):
I feel like the heart and the soul of it
does link back to it, and it goes back even further, right,
But I think then also the other question is like
how much do you think this is also colored by
experiences maybe some of these people have where you know,
they're they're like, yeah, why shouldn't I, you know, why
shouldn't I have a person or a thing in my house?

(01:56:56):
And how can I convince other people to do it?
If only by having a robot do it?

Speaker 2 (01:57:00):
And I think that there's, well, there's two parts of that. It's, one, they might have hired help and they're like, well, I have to pay some fucking piece of shit human, Jesus Christ. But also, I feel like there's a division in automation with some people, where there's some people where it's like, I see this as a kind of, as like an amplification of who I am, like an extra arm, like an arm that moves
(01:57:21):
one thing to one thing, which is not a thing I command, it's just a thingy that does an action. That's not a sexy thing you can raise funding on. You can't be like, I got an arm that moves the thing from a thing, which is most industrial machinery. It's just, like, repeating one task again and again and again. The reason that replaces humans is that's a repetitive action, it fucks up humans. But it's not about the subjugation of a person. It's an action. It's not about controlling someone.

(01:57:44):
It's about making a thingy happen. Then there are the people who are like, wouldn't it be awesome if every fucking whim I had was perfectly solved, and I could order someone to, quote, Adam kind of a fear say, don't know, the way I like it. And it's this idea of, like, this bot yesterday: oh, we can do the TikTok dance. Why do you want someone to dance for you?

Speaker 9 (01:58:10):
Part of what's so difficult, though, is that these are... it's a version of the same problem that you have with a lot of, like, genetics and fertility technology, right? Where, do you want to make sure... if you can find out in the womb that a child's going to be born with its lungs outside its body, and you can just kind of snip that out
(01:58:31):
so the lungs are inside the body, do we want that? Everyone's pretty much like, oh, yeah, that's probably good if kids have lungs. Yeah, do you want to be able to tell if someone has autism and snip that? Whoa, suddenly, suddenly... or any of the other horrible eugenics questions, right? So everyone's like, yeah, there's certain things we want to be able to do with genetics technology. But if you

(01:58:53):
say this is okay, people are going to start immediately pushing for all of it. And, like, drawing a line... the fact that you're drawing a line means there's going to be strong lobbies of people pushing for stuff that is really bad, right? And with automation, everyone's like... like, I saw a product today, it's one of... I've seen a couple like this, where it's

(01:59:13):
basically like a big cart on wheels that can go off-road and can be fully autonomous, and you can set: here's where wounded guys are gonna be, here's where the hospital is. Drive up to the front line, people throw a wounded guy in, drive to the, to the medical... good, right? And you don't necessarily... like, it's better if a robot does that, because people can get

(01:59:34):
blown up, right, if you can have... and so. And there's also, like, obviously, I'm sure we all have individual pain-in-the-ass tasks where we're like, well, yeah, I finally have a robot that washes my dishes. But once you, once you're saying, well, we do want to automate some stuff, you're then having a conversation about where the line is. And right now everything's on the table,

(01:59:55):
including, like, the creation of art and beauty and the raising of children.

Speaker 2 (02:00:00):
The stuff we do when we're not working. Basic cognition.

Speaker 9 (02:00:04):
And that's the fact. But you can't if you're going
to talk about some of it. All of it's going
to be on the table at some point, and right
now it is.

Speaker 8 (02:00:11):
Yeah, I think that's the other thing, right, where it's like, you know, to your point, there's automation that saves you labor and enhances productivity around the home, you know, and around certain workplaces, that reduces the drudgery of tasks we have to do. And then there's automation that is deployed in a way to squeeze as much as possible

(02:00:31):
out of people beyond what they might in of themselves
want to do if they had control over to what
degree am I going to augment my workplace or myself
with this stuff?

Speaker 2 (02:00:43):
Right?

Speaker 8 (02:00:43):
And I think that ends up also being the line with a lot of these technologies, where it's like, you know, if we were designing genetic technologies, if we were designing, you know, assistive automation or assistive artificial intelligences and grafting them onto our lives, I feel like the points at which we would let them in and not let them in look different depending on what your concerns and interests are. If

(02:01:05):
you're like, okay, well, I'm looking to get a return, or I'm looking to organize the workplace in a certain way, or I'm looking to just meet a very specific need in this or that context, right, in a battlefield, a hospital, you know, in a public space. So I think, so, you know, part of this is, like, you know, downstream,

(02:01:25):
the consequence of the fact that, I feel like, almost all technical decisions have nothing to do with anyone other than these private investors.

Speaker 1 (02:01:34):
I mean a thought that I have had this whole
last few days is what does it mean to be
optimized or to have your life optimized? What is the
actual outcome of that? What are we optimizing for? And
if the optimization is we're going to remove all of

(02:01:55):
the annoying tasks so you can enjoy life, that sounds like an awesome answer I can totally get behind. On the ground, that's not, that doesn't seem to be what any of the products are like.

Speaker 2 (02:02:05):
I was just thinking of something very simple, which is, like, I'm lucky to live a really good life. And the reason I can is I have a, I have a washing machine and a dryer that work really well, and I have a good dishwasher. Like, it's like, yeah, it's like, the shit that sucks, isn't it? Like,

(02:02:27):
I can afford to use Uber, I can get around. Like, a fifty-dollar dishwasher that's really good feels like it would be more innovative than anything on this floor, because most people can't afford to have a good one. Like, the things that get in the way of existing are labor, as in having to do your horrible fucking job that doesn't pay you enough, or the shitty, like,

(02:02:48):
how the fuck do you look after the children you have? There isn't really, there's not really tech for that. You can't buy more time unless you have actual labor help, yeah, which requires more money, and you can't really replace that with a robot, because the reason they need an everyman robot is because problems are complex. The problems of even

(02:03:09):
dishes are complex.

Speaker 1 (02:03:10):
And decision making is exactly you.

Speaker 2 (02:03:13):
Know what that is exactly it. It's like, it's not the
problems themselves, it's the decisions around them, I mean.

Speaker 9 (02:03:18):
And that's why the most responsible technologies at CES stick to doing the things where there's no real downside if it doesn't work perfectly, unlike raising children, which... I saw a really good child companion robot today, which I look for at every one of these. I love...

Speaker 2 (02:03:37):
But yes, oh I wasn't sure where you going back?

Speaker 9 (02:03:41):
No, there... So there was a really good one that... it's one of my favorite CES categories, where it's like, Southeast Asian tech company marketing to Americans that doesn't fully understand our culture. And so they have, like, a whole... they have a set of bleachers erected and, like, fifty of these little robot dolls sitting in them. I'll show you guys, but they're, like, they're all wearing

(02:04:04):
like sleeveless shirts and yeah, yeah, they're all wearing sleeveless
shirts and like little beanie hats basically like that have
the name of the robot on it, which is.

Speaker 2 (02:04:16):
Booster or is it Booster? Yeah, it's Booster.

Speaker 9 (02:04:20):
But then in front they had one robot that was dressed differently, and I didn't realize at first... it looked like it had, like, a suit and a black hat on. And then the video starts playing and I realize that fucking robot is Michael Jackson, and it's dancing to Billie Jean alongside what I believe is a kid.

Speaker 3 (02:04:35):
I know exactly, and it was well that did happen?

Speaker 9 (02:04:39):
Could not get across to them, hey, guys, maybe Michael Jackson is not the best robot mascot for a children's toy. But immediately... what I've got to show you guys, and this is what I was telling you.

Speaker 2 (02:04:53):
We might cut this in.

Speaker 9 (02:04:54):
We might, just... I want the reaction. This is what happens in that video immediately after the robot's dance. So this is right after Billie Jean.

Speaker 2 (02:05:14):
That is identical to the way.

Speaker 3 (02:05:21):
So weird.

Speaker 9 (02:05:22):
What happens in that video, the instant it finishes, like, dancing to Billie Jean while, like, moonwalking, there's immediately a man who runs in at another version of the robot and starts beating it with a wooden stick, which then switches to a guy putting cinder blocks on top of it and smashing it with a fucking mallet. And then

(02:05:43):
right after that dude comes in with a liquor bottle
and shatters it over the robot's head.

Speaker 2 (02:05:48):
If you are protecting your children from the homeless from the.

Speaker 1 (02:05:53):
Robots, like, what else is that saying? I love that the robot stands up after, you...

Speaker 2 (02:05:58):
You can't do it, you can't. I love that, because it looked almost identical to those things... I don't know if you remember the modeling experiment, like the regularly disproved thing about how children will copy anything, and it's just children hitting a clown with a stick. And it looked identical. It's the same fucking thing at CES, baby. I just, like, I want to be in the room when they

(02:06:21):
were recording that, it's like, hit it again, hit it again,
get a liquor, get a liquor bottle, fuck it up.

Speaker 1 (02:06:27):
It was unclear.

Speaker 2 (02:06:28):
I'll do that for free.

Speaker 9 (02:06:29):
What's the selling point? Because it goes from: it can dance like Michael Jackson. You can beat the shit out of it and it's fine. It'll walk with your little girl and help her learn languages. It's like quizzing her.

Speaker 2 (02:06:43):
Why is that?

Speaker 9 (02:06:44):
And I kind of think maybe they just sort of incompetently realized something that is a brilliant idea, which is: if you, if you market and build a robot entirely on Terminator 2 logic, well, you just show that this robot can't be killed. It's loyal to your child, it'll never get drunk and hit them, it'll never not come home after staying out late at the bar at night. You know, like the whole, the whole Sarah Connor

(02:07:06):
speech from T2.

Speaker 5 (02:07:10):
For your note, I did not see any robots built out of liquid metal, but I mean, I missed some corners of the Venetian Expo, so maybe that's a...

Speaker 8 (02:07:20):
I want... when the AI bubble bursts we'll get to what we really need to get to, which is building Terminators.

Speaker 2 (02:07:25):
All right, as we approach the end of this, Chloe Radcliffe, what do you think of CES?

Speaker 1 (02:07:30):
Just, it's... I mean, truly, I go back to: numbers you don't need. Yeah, I go back to: so what does that do? You know? What's funny? Or: so who gives a shit? It's, it's so unnerving to think about

(02:07:51):
how how much of our society feels like there's no money, yeah,
and then to come to a place where there is
all the money.

Speaker 2 (02:08:01):
Yeah.

Speaker 1 (02:08:01):
And it's that that I think is disturbing to...

Speaker 3 (02:08:04):
An AI industry conference.

Speaker 1 (02:08:06):
Yeah, yeah, yeah, totally. And, and like, I work in entertainment, and such a common refrain in entertainment is like, ah, we just don't have the budgets to do that anymore. We used to have those kinds of budgets and we don't anymore. And I'm like, there is money fucking sloshing around, hitting the sides, coming out of people's noses, like, there is so much money, and it's infuriating to me that, you know,

(02:08:30):
I'm taking a very narrow view right now.

Speaker 2 (02:08:32):
I actually push back, You're not, You're taking a very.

Speaker 1 (02:08:35):
Reasonable... but here, here the narrow, the narrow part of the view is, like, I'm putting aside the fact that it's infuriating that people are living very difficult lives amidst all this money floating around. But, but like, even... I'm just thinking about my job. You know, it's like, in the entertainment industry, the fact that there is relatively little money, that money is tight, and I'm like, it's not... It's also, it's, it's especially relevant. Sorry. And then, a second,

(02:08:57):
it's especially pertinent in entertainment, because so much of entertainment is now owned by the things that own the tech where all the money lives. And so I'm like, what the fuck do you mean there's no money in entertainment?

Speaker 2 (02:09:12):
Do you?

Speaker 9 (02:09:12):
You couldn't be more right, because, like, number one, the other thing is that entertainment makes as much money as ever. It's not that it's no longer making money. But like you said, the companies that now own all of these fucking things are taking those profits and they're currently pouring them into the shit that you're seeing, the broken shit on the floor here.

Speaker 2 (02:09:31):
Yeah, and when all of this is dead and bleeding
on the floor, you're gonna give it to Chloe Radcliffe.

Speaker 1 (02:09:37):
Yeah that's right.

Speaker 2 (02:09:39):
We're gonna wrap here. And I just want to say, on behalf of all the listeners and myself, thank you, Chloe, thank you for joining. It's been an honor. I hope you'll join us next year. Yeah, it's been a genuine pleasure to have you. Where can people find you?

Speaker 1 (02:09:53):
I am at Chloe Badcliffe on all platforms, like my last name Radcliffe but bad, uh. And I'm in Cincinnati this weekend, I'm in DC next weekend, I'm in Philly the week after. Follow me, sign up for my mailing list.

Speaker 2 (02:10:08):
Yeah, yeah. Rob Pegoraro, we'll have your links in there. But thank you so much for joining us. You are, you are welcome, you are one of the best. I love having you here. I try and... you are at everything I'm at, which is genuinely a pleasure. Robert Evans, my boss, Behind the Bastards, thank you for joining us. And of course the other Ed, Ed Ongweso. Love you, buddy. Thank you for having me.

Speaker 8 (02:10:30):
Love you too, Bud, and.

Speaker 2 (02:10:32):
I love you all for listening. We will be back
tomorrow for four more hours. Yeah, I forgot for a.

Speaker 1 (02:10:37):
Sh fucking psychos.

Speaker 7 (02:10:39):
I can't be.

Speaker 2 (02:10:41):
Nah, all of the men I'm radicalizing, and the women that listen as well, and the trans people and everyone non-binary, everyone who listens. I don't care who you are, but I care that you are who you are. It has been a very long day and it's been a very long week. But I love you for listening. We will be back for four hours tomorrow, an hour epilogue on Saturday, and then they are going to shoot me in the

(02:11:01):
head, probably. I'm Ed Zitron. Please subscribe to my newsletter. I did not do premium this week, so please use the link in the thing. I desperately need that. Thank you to Matt Osowski, my wonderful producer, who has been working his fucking ass off all week. Amazing show, more amazing stuff to come. We're also dedicating these episodes to Sean-Paul Adams,

(02:11:22):
who was a friend of the show and of the suite. Sadly, he passed last year. Sean-Paul absolutely rocked, so we're honoring him by donating to the Pediatric Epilepsy Research Consortium. Sean-Paul's son is epileptic, and his family and friends would deeply appreciate your donations, as would I.

Speaker 12 (02:11:37):
Thank you, Thank you for listening to Better Offline.

Speaker 2 (02:11:48):
The editor and composer of the Better Offline theme song is Matt Osowski. You can check out more of his music and audio projects at mattosowski dot com.

Speaker 1 (02:11:56):
M A T

Speaker 12 (02:11:56):
T O S O W S K I.

Speaker 2 (02:12:01):
You can email me at ez at betteroffline dot com, or visit betteroffline dot com to find more podcast links and, of course, my newsletter. I also really recommend you go to chat dot wheresyoured dot at to visit the Discord, and go to r slash BetterOffline to check out the Reddit. Thank you so much for listening.

Speaker 12 (02:12:18):
Better Offline is a production of cool Zone Media.

Speaker 2 (02:12:21):
For more from Cool Zone Media, visit our website coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.