All Episodes

January 8, 2026 121 mins

Welcome to Better Offline’s coverage of the 2026 Consumer Electronics Show - a pop-up radio station in the Palazzo Hotel with an attached open bar where reporters, experts and various other characters bring you the stories from the floor.

In Thursday’s first episode, Ed is joined by Devindra Hardawar of Engadget, actress and standup comedian Chloe Radcliffe, Edward Ongweso of the Tech Bubble Newsletter and Matt Binder of Mashable to talk about the anti-consumer electronics show, how AI buying up all the RAM is going to make computing unaffordable, Dell’s quasi-reversal on AI, why you should be buying all your tech used, and why it’s time to use tech to tell people you love their stuff.

EXCLUSIVE CES SALE! Get a *permanent* $10 off an annual subscription to my newsletter through January 13 2026:
https://edzitronswheresyouredatghostio.outpost.pub/public/promo-subscription/cue848p5sc 

Ed Ongweso Jr.: https://bsky.app/profile/bigblackjacobin.bsky.social 

The Tech Bubble Newsletter: https://thetechbubble.substack.com/ 

Devindra Hardawar:
https://www.engadget.com/about/editors/devindra-hardawar/
http://thefilmcast.com/

Matt Binder: https://mashable.com/author/matt-binder

Chloe Radcliffe: 

https://www.instagram.com/chloebadcliffe/?hl=en

https://punchup.live/chloeradcliffe

Donate in Sean-Paul’s honor: https://www.perc-epilepsy.org/

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

Email Me: ez@betteroffline.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Media, enter the mind gym and pick up some brain weights.
It's time for Better Offline's coverage of the Consumer Electronics Show, and.

Speaker 2 (00:09):
I am your host, Ed Zitron.

Speaker 1 (00:23):
We're back here in the Palazzo in beautiful Las Vegas, Nevada, bringing you another episode. It is Thursday and we're still covering CES. We are still here with an amazing
assortment of guests from the tech industry. We've got an
open bar, we've got tacos, a place to sit down
for members of the media, whether or not they join
us on the microphone. And of course we've got some
new contestants and some old contestants revisiting guests. My first is,

(00:44):
of course, Chloe Radcliffe, standup comedian and actress from Is This Thing On?

Speaker 3 (00:49):
You're never going to get rid of me. I'm like bed bugs.

Speaker 1 (00:51):
I love it. With Bradley Cooper. And of course Matt Binder of Mashable joins.

Speaker 3 (00:56):
Us. Say hey three times in a row.

Speaker 2 (00:58):
What's up?

Speaker 1 (00:59):
And then the wonderful Devindra Hardawar of Engadget. Hey, oh, let's go. And you said you wanted to talk shit on Dell, so let's.

Speaker 3 (01:07):
Give the man.

Speaker 2 (01:08):
Give us the history.

Speaker 4 (01:10):
So last year, Dell did the dumbest thing I've seen a tech company do in a very long time. They were like, hey, we have all these brands you know and love. XPS, that thing that's been selling for decades. People love it. Let's kill all that. Let's call all computers Dell, Dell Pro, Dell Pro Max. Those names may sound familiar, they're a little Apple-y. Yeah. And when that was announced
last year, I was at a press conference and Michael

(01:32):
Dell was there announcing all of this, and my first
question to him is like, what do you have to
gain by copying Apple? Like, what are you doing here?

Speaker 2 (01:38):
Yeah?

Speaker 4 (01:38):
He did not have a good response to that. Fast forward a year, every other PC manufacturer's shipments are growing, Dell's are down, because nobody knew what the hell Dell was producing this year, especially the people who wanted an XPS. So they turned around, they brought back the XPS brand. They're actually doubling down on it, and also fixing

(02:00):
all the stupid issues we've brought up in our reviews
over the last few years too.

Speaker 2 (02:03):
So it's double vindication. I guess, what are the things that they messed up? What have they fixed? Have you seen the XPS lately? I have not.

Speaker 4 (02:11):
So they did this thing. Sometimes designers go a little crazy. They're like, what if, what if you couldn't see a track pad? What if it was just all wrist pad? So they did that. They made an invisible track pad that works and it looks cool, but it's hard to use because you can't tell where it begins.

Speaker 3 (02:27):
It's hard to use because a huge part of using
a track pad is being able to see.

Speaker 2 (02:32):
I feel it, at least feel. I have a problem with my MacBook Pro.

Speaker 1 (02:37):
The track pad is slightly bigger than usual. It is nasty, clammy wrists get on it. But an invisible one, I would snap that thing.

Speaker 4 (02:44):
But also, at least with a MacBook right, you have
an edge.

Speaker 3 (02:46):
You can see what it is. You can see that.

Speaker 4 (02:48):
You don't even need to see it. You need to
feel it, right because you're not looking at the track.

Speaker 5 (02:53):
I could see I could see an invisible track pad
working if they made like the technology to just extend
it so that area is a track pad.

Speaker 3 (03:01):
I mean, what you're saying is that's not the case.

Speaker 4 (03:03):
A lot of it is a track pad, but the very edges aren't. Anyway, with the new redesign, they put just little, little notches so you feel it.

Speaker 2 (03:09):
Wow.

Speaker 4 (03:09):
And these are just like basic usability things.

Speaker 2 (03:12):
To think about. They could have.

Speaker 4 (03:15):
They could have done that first, but anyway. And also, the function row on the Dell XPS for the last couple of years was this like capacitive touch thing. They weren't real keys. There were buttons that kind of changed when you hit a button, and they disappeared in bright sunlight. You couldn't see them. So like your function keys, the volume button, all that stuff just disappeared. And I talked to them. I was like, did you guys not

(03:36):
go outside? Did you not bring this computer outside?

Speaker 3 (03:38):
You're talking to computer engineers, they didn't go outside?

Speaker 2 (03:42):
But like wait, but did they say whether they did?

Speaker 4 (03:46):
They were like it was COVID and we were designing,
you know. It was like the excuses were it was
kind of a rough time to design. I think sometimes
some companies let their designers go crazy. Yeah, we're gonna
out Apple. Apple. The thing about Apple is that they
they at least tend to think about usability and practical functional.

Speaker 1 (04:03):
The company best known for its usability. Yeah, like very
consumer facing. They don't like changing things up other than
liquid gloss, which fucking sucks.

Speaker 4 (04:12):
You could argue about liquid glass. Yeah, but they're charging the mouse on the bottom. Oh, that's like Jony Ive bullshit, but Apple itself doesn't tend to do that. Yeah, so anyway, they're just keys now, they're just keys. And these new XPS fourteen and sixteen are lighter than ever. This XPS fourteen is about three pounds, which is pretty great for a fourteen-inch laptop. So yeah,

(04:32):
total vindication. They fixed a lot of the issues. The branding is back. It is funny talking to people at Dell, because all the executives and the marketing people were like, hey yeah, this rebranding's gonna go great, and all the underlings, all the people who've been working on these computers and have dedicated decades of their lives to them, were like, what is happening here? Like, I built XPS machines, you know. And they felt really bad about it. So on every level,

(04:56):
the branding, the rebranding, was a failure. XPS is back, and I'm happy about that. And we just got to tell Dell, for like that whole thing, when I saw them again: yeah, we were right. We told you, we told you this.

Speaker 1 (05:06):
Well, there's misinformation going around Dell as well, because everyone's saying, oh, Dell claimed that they're backing off AI branding, and people don't like AI, apparently. David Gerard, you chased this down. Journalist David Gerard, Jesus Christ. Apparently that was never said. And Dell is full-scale AI, like, anyone who thinks that they're not: they have forecasted twenty five billion dollars of AI server sales this year.

(05:29):
Do not believe Dell's lies. Do not let them lie
to you and claim this isn't the case. They are
fully AI-pilled and will be punished by the dark
gods when the AI bubble bursts.

Speaker 4 (05:40):
I've had some interesting conversations with people at Dell who were pissed off at Microsoft and pissed off about the AI PC push, because it's literally all marketing bullshit, as you've covered extensively, that has not led to actually useful features for people. So, yeah, Dell is invested in AI. What I have noticed is that they're not saying AI PC. They're not out there shipping, saying like, hey, we got the

(06:01):
latest Copilot Plus systems, because nobody cared. Nobody cared about Copilot Plus.

Speaker 3 (06:05):
They tried, I'm saying. They tried.

Speaker 4 (06:08):
They did, and now it's just like, hey, you know what's great? A computer. A computer that works?

Speaker 3 (06:13):
What if a computer? Yeah?

Speaker 4 (06:15):
What if?

Speaker 3 (06:16):
What if a computer was a thing where you could see the keys?

Speaker 4 (06:21):
You could see it, you could feel the track pad.

Speaker 3 (06:22):
Basically, I will say that, from my, now you know, this is my first year at CES, from my relatively limited exposure to the kind of products that get shown at

Speaker 2 (06:32):
the CES.

Speaker 3 (06:33):
Somebody being like, hey, do you want a laptop where you have no idea where you're touching on it? Do you want a laptop that has a bunch of buttons, but you're going to have to use it in pretty limited facilities to make sure that you know exactly what you are clicking? That's the new product.

Speaker 4 (06:50):
The whole world can be described by, like, I Think You Should Leave sketches, right? Yeah, the car sketches. I want a car where the steering wheel doesn't fly off. And that's basically what we're saying.

Speaker 1 (07:00):
No, no, it's literally, CES is two I Think You Should Leave sketches.

Speaker 3 (07:04):
It's that.

Speaker 2 (07:05):
And it's also a guy who just walks around going
what the hell is that?

Speaker 3 (07:08):
What the hell is that? He sees smart.

Speaker 2 (07:10):
Glasses, and goes, that's a flower pot.

Speaker 1 (07:13):
Oh my god, who walks into his pantry and goes, oh,
what the fuck do I cook?

Speaker 4 (07:19):
The I've got too much shit on me sketch. Yeah, yep,
I just can't serve. That's Victoria Song.

Speaker 3 (07:24):
Yeah.

Speaker 4 (07:24):
The just-too-many-wearables, too many wearables. Although we had a good chat with the guy from Pebble, who's back. Everyone likes Pebble. Good wearables. Actually, I said smart rings were bullshit last year and I still believe that. Sorry, Victoria, but the Pebble smart ring is kind of cool.

Speaker 3 (07:40):
What do you think is bullshit about smart rings?

Speaker 4 (07:43):
The form factor means they can never really do much.
The battery life is always gonna be.

Speaker 3 (07:48):
Sorry, form factor, you mean being a ring.

Speaker 4 (07:50):
It's a ring, where it can never have like a big battery. It can never be that. So it's very limited in terms of what you can do. And all the rings we've seen, I know you've worn a couple, they're just, they're fine, but you know what's better? Just a smart watch, is it not?

Speaker 2 (08:06):
I'll push back on you on that. I find smart watches very annoying, and I agree, like, my weird, fucking, like, oddly dainty wrists. I mean, like, see the muscle there. Thank you, Chloe, just destroying me.

Speaker 1 (08:27):
Like, my Apple Watch, for some reason, just doesn't get a full connection no matter how tight it is. Interesting. And like, this ring for the most part works, but even then, I still don't get a full connection sometimes. Sometimes it doesn't, really, yeah, even then it doesn't recognize a full workout.

Speaker 2 (08:43):
Sometimes it's it kind.

Speaker 4 (08:45):
Of. Wearables are personal, like, that's the thing.

Speaker 2 (08:48):
Yeah, we're all weird, like, what we like to touch, what textures we like.

Speaker 4 (08:51):
I think I've learned. I just don't like ring stuff,
you know, so like that's just.

Speaker 2 (08:55):
It's hard to make a generalized product in that way.

Speaker 5 (08:57):
The thing is, too, you have to be the type of person who wears whatever that wearable is, like the non-tech version of it, too. Like, I don't wear watches, so I won't wear a smart watch. I don't wear rings, so I won't wear a smart ring. I don't wear glasses, so I won't wear smart glasses.

Speaker 4 (09:08):
That's a big one too. Smart glasses, also kind of bullshit. But the Pebble thing is kind of cool, because the Pebble guy, first off, their thing was e-ink smart watches, like back in twenty fourteen. Yeah, those are coming back. He's bringing that back. So Pebble used to have these smart watches that lasted for weeks and they were very basic. They could like

(09:30):
upload notifications.

Speaker 2 (09:31):
Like a Kindle. Yes, so it's like that. It's like that, but a watch.

Speaker 4 (09:35):
Not really an LCD screen but more like papery.

Speaker 3 (09:37):
Yeah.

Speaker 2 (09:38):
Yeah, it's, it's the paper screen.

Speaker 3 (09:39):
And I can read my book on my watch.

Speaker 2 (09:42):
Well, I mean I don't think you.

Speaker 4 (09:43):
Could read your notifications on your watch for sure. But
he's bringing back that because what happened was fitbit butt
Pebble Google butt fitbit, and he went to Google and
just like, hey, you're my software, my Pebble software. Nothing's happening.
Can you open source at and Google for once did
a good thing. I was like, yes, the open sources
so nice. Now now the pebblic guy can go back

(10:04):
and make hardware using that software. So the Pebble watches are back. But he has a ring that looks really cool.

Speaker 1 (10:10):
But the thing is with the ring, from what I've read, you have to physically hold it down.

Speaker 4 (10:13):
Well, so the ring, all it does, it does one thing and it's supposed to do it really well: you hit a button on the ring, it takes notes.

Speaker 2 (10:20):
Yes, do you have to physically hold it down?

Speaker 3 (10:23):
Then? I don't know. We also, we looked this up, and I was saying that, because as a comedian, usually, my phone background says document everything. I record every single set that I do in stand-up in my voice memos. Like, I use the notes and the voice memos in my phone, those are the two things that, like.

Speaker 4 (10:40):
Yes. Perfect for you, perfect for me. Supposedly it will last two years.

Speaker 3 (10:45):
Except also, I love to, like, sometimes I'll just record a meeting and not tell the other person. Now, that's legal in the state that I live in, but it's a way for me to be like, okay, I remember, like if I'm practicing a pitch for a show that I'm trying to sell or whatever. I'm like, I just want to be able to be in the moment, but also remember

(11:06):
how I did this and how they reacted and what questions they asked, whatever. Anyway, this ring sounds like a thing that I would use. Yeah. Then I looked up what it looks like, and it really does look like, yeah, my ring that definitely does not have a microphone in it.

Speaker 2 (11:20):
So I did just look this up.

Speaker 1 (11:21):
You just have to click it once to record, and it's seventy five bucks. I hate to say it, I actually kind of want this, because I'm not trying to surreptitiously record people. But I absolutely, like, usually get an idea thirty seconds before I'm meant to fall asleep, I remember thirteen things I would like a reminder for. Yes, now, perhaps I won't be wearing a ring at that time, but foreseeably I'll walk around and be like, shit, I

(11:42):
need to remember this, shit, I need to remember this.
Or even just if I could say an idea and
it says here it's an on-device LLM, yep, and
that I find quite interesting because.

Speaker 4 (11:51):
Not going to the cloud, not going to the cloud, not.

Speaker 1 (11:55):
Melting giraffes as a means of powering GPUs.

Speaker 3 (11:58):
On which device? On the ring? On your phone?

Speaker 2 (12:00):
That's kind of what I'm trying to work out, and
we don't.

Speaker 4 (12:03):
Yeah, I don't. We just got a whole LLM in
a ring.

Speaker 2 (12:05):
No, that's kind of what I can get there.

Speaker 3 (12:08):
I'm asking such a, like, podunk question: can you fit a whole robot on that ring?

Speaker 2 (12:16):
I actually will push back. That's not a podunk question.

Speaker 1 (12:19):
That's a perfectly fucking reasonable one, because, especially when we're being told we need eighty nine data centers, we must, we must knock down entire neighborhoods and put up a data center the size of, I don't know, New York City, for Meta. I understand the question. I'm going to assume that this thing needs, you need to connect it to your phone, and your phone, it has, I.

Speaker 3 (12:39):
You can't fit a whole AI on your phone.

Speaker 1 (12:41):
Yes, yes, but I'm guessing it's a very simplified, transcription
specific one.

Speaker 2 (12:47):
But I do not know.

Speaker 3 (12:48):
It might it might.

Speaker 5 (12:49):
It might store the recordings on the ring, in case you don't have the connectivity to your phone in the moment.

Speaker 1 (12:54):
Now it says completely on device, you don't require an
Internet connection.

Speaker 5 (12:58):
Okay. Perfect. I'm all about this ring right here, right now.

Speaker 4 (13:02):
They showed it: seventy five bucks, March twenty twenty six, lasts two years. Now, that's the thing. It's not rechargeable. So once, once it's done, you send it back to them?

Speaker 2 (13:15):
Why?

Speaker 4 (13:15):
Because he is very much like, I think this is true, we have too much shit to charge. Like, I'm just tired of it. So he's like, okay, what if a ring lasts two years, and if you actually are still using it by then, send it back and maybe get a replacement. Actually, that is probably better than holding onto a charger that you will lose.

Speaker 5 (13:35):
Actually, what happens a year and a half in, though, if it's got less of a charge in that last thing?

Speaker 1 (13:40):
It can record roughly twelve to fifteen hours and it lasts for two years.

Speaker 2 (13:46):
I'm sorry, I don't know if you're a journalist.

Speaker 3 (13:50):
Wait wait wait wait wait wait wait wait it.

Speaker 1 (13:51):
Runs out fifteen hours recording total. No, no, no, you
can that's what it says here on the website. I
mean I'm not to be able to offload.

Speaker 4 (13:59):
Yeah, we have not gone into the specifics of it, but definitely, he wants it to.

Speaker 3 (14:03):
Hold twelve to fifteen hours total and then you can
dump that.

Speaker 2 (14:07):
Yeah, but you just we have the technology. Clear.

Speaker 5 (14:10):
This is, now I'm on an article, not on my outlet, but Android Authority. It's expected to last for up to two years with normal usage, and normal usage is described as ten to twenty times per day, recording three to six second voice notes.

Speaker 2 (14:30):
I wanted to love this. I wanted to so bad. But if you're thinking as a journalist, and you do, what, so fifteen hours is equal to, like, thirty thirty-minute interviews. That's, you're never going to record.
Speaker 4 (14:45):
It's not for interviews.

Speaker 3 (14:46):
Yeah, you're not going to record a full interview.

Speaker 4 (14:48):
That was for you on the toilet. This is for
you in the shower.

Speaker 2 (14:51):
Do you have a lot of thoughts there?

Speaker 4 (14:52):
It's toilet thoughts, it's shower thoughts. The storage thing, we have the ability to, like, that could be solved.

Speaker 1 (15:00):
But I'm just kind of like, if this is meant to be for quick thoughts, fine. But as someone, especially because it does the transcription, I'm just like, that would be really useful.

Speaker 2 (15:08):
But I guess you can do that with the Apple
Watch down.

Speaker 5 (15:10):
But also three to six second voice notes. I mean,
is that really saving you the time that you couldn't
just whip out your phone and type it yourself.

Speaker 4 (15:17):
Yeah, I'm kind of like, actually, does that add up?

Speaker 3 (15:21):
Yeah, I do actually think that being able to say, like, a very quick thing, captain's log, yeah, yeah, I do think that that is cool. The use case that I would be using it for is, I am always, like, writing down in a conversation, if somebody says a very smart thing or, like, analyzes something in a really cool way, I'm always like, hold on, I

(15:42):
want to write that down. And so if I could just be like, hey, can you explain, like, explain to me why this movie wasn't effective in the way that we both felt it wasn't effective. But I am not articulate enough to put this into words. But you, my boyfriend, who is a smart director, can say it into my ring. But that explanation is going to take sixty seconds. Like, I do think that there's a use case in that. Well, sixty seconds, though. Is your sixty-second note your entire use for the two years?

Speaker 2 (16:16):
Exactly?

Speaker 3 (16:17):
That's what I'm saying. I would be doing that in
my mind too.

Speaker 5 (16:19):
If you give me a time frame like this, if you want this to last two years, you need to use it no more than ten to twenty times a day. You're going to have your phone. But every time I use it, I'd be like, oh, this was just number six. Do I really want to go and push it for the next note?

Speaker 3 (16:32):
And that is how you know that you did not
grow up with money.

Speaker 2 (16:36):
I still.

Speaker 3 (16:38):
That's a very clear class.

Speaker 1 (16:41):
I will give them credit in that they specifically say this is not designed to record your whole life or meetings. It is very much a reminder thing. I can kind of see it. And there are numerous times where I will just be sitting there and be like, oh, I have an idea, like, what if Benoit Blanc interrogated Cuba, for example? But like the useful ones too, but

(17:02):
it's, I also like typing, but again, different strokes for different folks, that's fine. And also, Chloe does literally do this. She's like, can you say that again?

Speaker 2 (17:11):
So I can write it down. This might be a Chloe Radcliffe-approved device, potentially.

Speaker 4 (17:15):
The way I do reviews is often, I'm, like, out just dictating thoughts into my phone. So that's, like, often the thing. The other thing I'll point out is that, you know, the lifespan is the thing, like, will people want basically a disposable device? But I think you send it back to them, it gets recycled, the battery gets recycled, good for them. Electronics

(17:36):
recycling, not that great, but better than, you just need

Speaker 3 (17:39):
To put it in the mail.

Speaker 1 (17:40):
Like, it's just, see, the moment I'm told I have to mail something, I just want to, I want to jump off a fucking bridge.

Speaker 2 (17:47):
Like I was just like, oh, you want me to
go to the post office. Let me just let me,
let me Dwyer mode.

Speaker 4 (17:53):
You get a self-, you know, label or whatever, you drop it off.

Speaker 3 (17:58):
No, no, no, no, you gotta print. You gotta print.

Speaker 4 (18:05):
You know, they have, there's that startup that does the thing for returns right now where you get a barcode and you just bring it to, like.

Speaker 2 (18:09):
Yeah, and that shit. That's cool. Wait, with, like, Amazon returns. And as.

Speaker 5 (18:16):
Long as you go to the ups store with coals
as well, I think, okay, yeah that's Amazon.

Speaker 4 (18:21):
There's also another company that just doesn't.

Speaker 3 (18:25):
Amberjack.

Speaker 1 (18:26):
Amberjack is a shoe company I used, and they have it where it's just, you scan, it was like you can take it to Kohl's or something, which is cool. Like, it's weird, like, there are little things like that which are really useful.

Speaker 2 (18:34):
Like I said, this is, dyspraxia is a physical coordination disability.

Speaker 1 (18:39):
Putting together, like, opening a package isn't fun. Packing a package sucks. It's like the pain box.

Speaker 2 (18:45):
For me.

Speaker 1 (18:45):
It's like, if you're like, return this with a label, I'm like, you want me to touch the thing with the knife on it, and the tape.

Speaker 4 (18:53):
It's also, like, that's where the millennial breakdown happens. It's like, oh, this is a task, not so much that it takes time.

Speaker 2 (19:00):
Let me explain dyspraxia, though. Yeah, it's not like, it's not just a task.

Speaker 1 (19:03):
It's picking up a box and putting tape in a
straight line is genuinely difficult.

Speaker 2 (19:08):
Like my brain.

Speaker 1 (19:09):
It's, and, like, the more complex it is, like, it actually, like, is upsetting. Like, I'm sure the few dyspraxics who listen are like, yeah, fuck you dude, boxes suck. But it is really like that, and it's hard to describe, because it sounds like I'm just being a baby. I wish I was. I wish this was just me being like, ooh, can't put a box together. No, it's really fucking difficult. It's really, it's so difficult for me.

(19:31):
So it's like, oh, I gave you a label.

Speaker 2 (19:33):
You gave me a death sentence.

Speaker 4 (19:35):
To the point of what you're saying there, we have
the technology.

Speaker 2 (19:38):
Yeah, two weeks is cool.

Speaker 4 (19:39):
You have a QR code. You bring it to the store. They're like, I'll deal with this problem for you, and you can walk away, happy millennial.

Speaker 2 (19:44):
Yes?

Speaker 3 (19:44):
Yes. I was, okay, I want to, in defense of millennials, are you a millennial?

Speaker 5 (19:48):
Yeah?

Speaker 3 (19:48):
Yeah, yeah, in defense of millennials. I think the reason, I don't want to do a task, for sure, you're not wrong, I'm not, yeah.

Speaker 4 (19:55):
The things that the simple things that pile up that
take five minutes to do.

Speaker 3 (19:59):
Right, yes, totally. But I think the reason that we, this is sort of off topic and I'm finding it in the moment, so come with me on this, right. But I think the reason that we are so, uh, that tasks feel so alienating and so off-putting is because we, we are, oh boy, she's taking

(20:20):
a swing here, let's go. I think we are, because of our generation's interaction with the Internet's, like, rapid expansion and social media's rapid expansion, we're sort of the first generation that was expected to suddenly do, like, twice as much, in a career, in raising a family, in whatever.

(20:41):
That, like, we are the generation, the first generation, where it's like you're expected to be your own whole small business. And I think the prior generations, excuse me, had more, she's taking a swing, had more free time, yes, and, like, that life was slower, so that, like, you could set aside forty five minutes once a week, I

(21:02):
do my little tasks, and, like, even if you can be like, I don't like doing them, it's like, you have that forty five minutes, and there's not this, like, constant hammering in the back of your head that is, you could be doing these higher-scale things for your job or for your life that are made possible because of the constant access to the internet.

Speaker 1 (21:23):
Yeah, and your job is also not in your pocket all the time. That's exactly it. And it's like, also boomers, and well, Gen Xers of course, hearing this will say, well.

Speaker 2 (21:32):
When I grew up, my parents just left me at
home and we.

Speaker 1 (21:36):
Have just entertained ourselves and we just also cynical, and
of course we could buy houses and college actually guaranteed
you a job, and college was cheaper and housing was cheaper.
I guess not in the nineties, but even then it
was easier to get a house.

Speaker 4 (21:49):
And I mean, you're starting a millennial, you're starting a generational.

Speaker 3 (21:51):
I don't fuck it.

Speaker 2 (21:52):
Y boomers.

Speaker 4 (21:53):
Boomers got us here, by the way.

Speaker 3 (21:55):
Yeah, GenX, we need you, we need you, we need
but they.

Speaker 4 (21:58):
Got us here because they're in action.

Speaker 2 (22:01):
Yes, Gen Xers are the problem. There we are.

Speaker 1 (22:07):
I think both of them had a hand in it. I think boomers with the shitty decisions, but Gen Xers are the ones with their inaction, when they chose, and there were Gen Xers who did.

Speaker 5 (22:16):
Oh yeah, even if you look at, like, how the generations fall politically now, like, you know, the cliche is, like, boomers were the worst or whatever. But if you look, there's a lot of boomers who are very, like, anti-Trump, and I don't see the same with Gen X. In fact, Gen X is the one that went the other way during the pandemic. Like, while everyone else was like, oh, I can't believe Trump is doing what Trump is doing during the pandemic,

(22:39):
Gen X was going, like, I can't believe I have to deal with.

Speaker 3 (22:43):
My shitty kid who I can't send off to school. I'm right wing now. I mean, that's literally.

Speaker 1 (22:48):
And also, boomers will do the whole thing. We're just gonna, I'm gonna get people, I'm gonna get some emails.

Speaker 2 (22:53):
There are a lot of boomers who are like, back in my day, I could just walk into a store and say, good sir, I'd like a job.

Speaker 1 (23:02):
But at the same time, there were lots of, like, old people who were like, just go and get a job.

Speaker 2 (23:06):
It's that easy.

Speaker 1 (23:07):
The Gen Xers, if you make a post on Bluesky, and I've done this several times, just, like, make fun of Gen Xers, being like, yeah, I'd get home, my parents weren't there, so I'd make myself dinner and I'd entertain myself all day every day, and I was so independent. There will be Gen Xers who respond and be like, yeah, that's actually true. They get very personal about them. I don't know, you lived through some of the best music

(23:29):
as well, like fucking stop.

Speaker 5 (23:30):
I mean, we're right. Yes. I mean, all the super rich fucking tech assholes are Gen Xers, like.

Speaker 4 (23:36):
Yeah, to your idea, what you, what you were saying about, back to the millennial thing. Correct. I think you're absolutely right. But also, there's other aspects of it, and people talk about it. It's the way millennials went to school, because we went to school, in America, in this really optimized way, like, the sort of goals that you had to chase were very specific,

(24:00):
the whole gifted and talented crowd, like, we were seen, basically.

Speaker 3 (24:02):
Which I was one of, yes, the same, but that was not.

Speaker 2 (24:07):
It really was not.

Speaker 3 (24:09):
No, it was not you see.

Speaker 4 (24:13):
Online, like, the gifted and talented adults are like, oh, that was kind of wrong. That maybe kind of broke us in terms of how we.

Speaker 3 (24:21):
Do you mean moving through the world with the high expectations placed on us and the expectation of perfectionism, or, like, how that metastasizes, exactly.

Speaker 4 (24:29):
All that, Like we were built for school and for
taking tests and then sent to the real world.

Speaker 3 (24:34):
It's like, oh, and do you think other generations? Oh?

Speaker 5 (24:37):
Absolutely. If you think about it, like, why did
all the first internet culture come from millennials? And that's
because, like, gen X was at the age.

Speaker 4 (24:49):
That's there's.

Speaker 5 (24:52):
I don't know, because we're... no, but I'm talking about,
like, meme culture, and our lives.

Speaker 1 (24:57):
Sorry, that's I think I know what you're saying, which
is our lives were digital in a way.

Speaker 2 (25:03):
There's not.

Speaker 4 (25:03):
Yes. Here's what's interesting: we're getting too
millennial-focused because we're all millennials, I guess. But also,
we knew what the world was very young. Okay, thank you,
good to know.

Speaker 3 (25:15):
Hollywood Castrian.

Speaker 4 (25:17):
We knew what the world was before the Internet. We
saw what the world became with the Internet, and we're
expected to be experts at it post-internet, so
we saw the full breadth of it. Yeah, so that's
why, like, gen Zers and young kids do not know
the world before, and it's kind of tough to describe
what that was.

Speaker 2 (25:37):
We also were children during nine eleven and were,
like, children.

Speaker 4 (25:42):
Teenagers, but yes, the world fell apart.

Speaker 2 (25:44):
Those ages are still children. Two thousand and eight as well.

Speaker 1 (25:46):
Like, I think that it's understated, because two thousand
and eight, I moved to America that year and it
was very much like, welcome to America. No, it happened
as I moved, not because... I don't know. I moved
and it was just like, oh, hey, you know the
whole thing you were told about, you go to college,
everything will be fine? Fuck you. Just, exactly. And I
don't think older generations realize how stark that was,

(26:10):
how hard. Like, the gen Xers probably remember before
the Department of Homeland Security existed, and I just
think the expectations of hyper-digitization and the constant expectation
and notifications, stuff like that. I think it's really easy
to color millennials like, oh, you're so... oh, you want
your little treats and all this. No, we're being harassed

(26:32):
by everyone. Everyone is now harassed by everything all the time.

Speaker 2 (26:35):
We're aware of it every yes, and we know and
we remember.

Speaker 3 (26:38):
It could have been different. Yeah, I don't remember when.

Speaker 1 (26:41):
It's frustrating as well, because, by the way, Michael Dell:
gen X. Yeah.

Speaker 3 (26:46):
To be clear, this is mister Dell of.

Speaker 4 (26:49):
The company. One of the richest men in the world,
by the way, like, he's up there. He's also the
guy behind Trump Bucks.

Speaker 3 (26:53):
How have I never heard of Michael Dell? Oh, is
he the guy who founded Dell? Yes.

Speaker 1 (26:58):
And yeah, it's weird, because they went through this thing,
we will rotate shortly, but it's this weird thing where
Dell went through this real shithouse era when they
were just complete dogshit, their computers sucked, and then they
took themselves off of the public markets, went private again,
fixed things, everyone loved them again, and then they went
public again, and then they appeared to be, like, thinking
about being shit, but they're back to being okay, and

(27:21):
they bet their whole future on AI.

Speaker 5 (27:22):
It's just, one of my earliest computers was a Dell,
and I remember they were really big, and then, sort of,
everyone else beat them in everything, and then no
one wanted it, and they had that time period, to me,
where they were the computer that, at least I did,
I automatically associated with, like, the older crowd,
older people. Because especially as, like, the

(27:43):
gaming PC became big and then building your own PC
became big, Dell was still out there, I feel like,
doing, like, old-school-type marketing to specific audiences during
a time period where, like, everyone was moving beyond that.

Speaker 1 (27:55):
And then they bought Alienware, which was this insane company
where they'd be... these... back in the day, my dad.

Speaker 2 (28:01):
My dad bought me an Alienware once, and I
fucking loved it.

Speaker 1 (28:04):
It was this insane, like, I don't know, like,
three-foot-tall giant thing with, like, a swinging door
and a big alien head in there, for, like, a
teenage boy with no friends.

Speaker 3 (28:13):
It was it was this was mostly what does the alien.

Speaker 2 (28:17):
It's literally just a gaming PC, but it looked like
a tube. It was more tubes.

Speaker 3 (28:23):
His wrists are so muscular, God damn it.

Speaker 2 (28:27):
They just get body shamed.

Speaker 1 (28:30):
No, it's interesting, though, because hearing them, and
as we'll transition into the next episode, obviously, but
it's like, it feels like all these companies are having
a midlife crisis, and I think you know what.

Speaker 3 (28:45):
I mean, like maybe they are.

Speaker 4 (28:46):
They are. Well, that was the time, the millennial midlife
crisis.

Speaker 6 (28:49):
It's like, give us room, very young, okay. But yes,
and coming up now, an ad just for gen Xers.
It's going to be for something called dot beers.

Speaker 1 (29:01):
It's going to allow you to have a social network
where you crow about how you were such an independent
child and how you remember when MTV was good.

Speaker 3 (29:10):
That is actively losing subscribers.

Speaker 1 (29:12):
No, gen Xers will email me now and be like, actually,
you know what, I watched Liquid Television and I was cool.

Speaker 2 (29:32):
We're back in the room.

Speaker 1 (29:33):
We're back with a wonderful cast, of course:
stand-up comedian Chloe Radcliffe, Matt Binder of Mashable, and
Devindra Hardawar of Engadget. And you know what, I do
actually kind of like this gen X and midlife crisis
happening with AI. It kind of makes sense, because it's like,
you look at all these companies and they're all like, fuck,
how do we keep growing? The rot economy, which I've written
and said a lot about. But it's also a degree of like, wait,

(29:55):
what do people want anymore? Because we got so rich,
because gen X was able to accumulate wealth
in the way that millennials couldn't.

Speaker 2 (30:02):
Boomers.

Speaker 4 (30:02):
It's all FOMO. It's all FOMO about what's next.

Speaker 1 (30:05):
FOMO with disconnection from anyone, like that. Yeah, I'm sorry.
Oh, it was a rough time during COVID, we didn't
go outside. Fuck off.

Speaker 2 (30:15):
You could still step outside, just once, just be like
a scar.

Speaker 3 (30:18):
You know, it's so funny. I spent all of COVID
outside because that's where you didn't get the germ.

Speaker 4 (30:23):
For a while.

Speaker 3 (30:24):
It was better to be talking about it.

Speaker 2 (30:26):
Yeah, were you just taking a laptop outside? Or
do you not have an overhead lamp?

Speaker 3 (30:30):
It was that was very stupid.

Speaker 4 (30:32):
But as to what is happening to all these companies, because
I've been covering them since twenty ten, I've been
following the tech industry forever: they're all desperate. They're all
desperate about missing out on the next big thing. Because
Microsoft missed out on mobile, they tried desperately to kind
of get that back, like Windows Mobile, all these things,
making Windows eight. But no, that was Apple, that was the smartphone era,

(30:54):
that was tablets. They never quite got into that. But
they also didn't realize, people like computers. Maybe
you should double down on that, which is what Windows
ten and Windows eleven kind of became. But that was the mobile era.
Then there was, like, tablets and stuff, then wearables.
For a while people were hot, like, oh, this is gonna be the next big thing.
It's a niche. It's a very small niche.

Speaker 2 (31:15):
You wear them, and not everyone wears the same thing.

Speaker 4 (31:17):
To Matt's point, what's happened is that the
world revolves around our phones, because these are the most
personal computers we have. It's my best friend, it's your
best friend. It holds all your deepest secrets and inner thoughts.
It connects you to the world. So how do you
go past the phone? The next stage of computing, to them,
is, okay, a thing that processes data at obscene levels
that we don't fully understand, but it seems like black magic,

(31:38):
and that's what AI is.

Speaker 1 (31:39):
But it's so funny, though, because they're like, well, what
do we replace a device that's all about personal stuff
and our communication with others and our ability to condense
our thoughts and access media with? Well, what if
we took away all personal choice and data entry of
any kind?

Speaker 2 (31:56):
Would they like that?

Speaker 1 (31:57):
But, actually, what you were saying there, I just
had a thought. These companies missed out on mobile.
Microsoft didn't miss out on mobile, they made a shit
mobile phone by acquiring Nokia. And Windows Phone sucked. And
it's like, why do people like the Mac so much?
Because it was good, because the UI was good.
Why do people like the iPhone? Because the UI was good.

Speaker 2 (32:17):
It took so long for Android to even be half decent.
The T-Mobile G1 was the first,
and it was so far behind, and they were so
far behind for long. And Windows Phone was this insane...
it was just, like, tiles. It was Windows
eight but on a phone, and.

Speaker 4 (32:32):
I will say it was cool. It was cool because
it was different, but they again missed out on some
practical functionality like Windows phone.

Speaker 2 (32:40):
Björk is cool and different. But I would not...
but I would not.

Speaker 1 (32:46):
I would not be, like, playing Björk at an
NFL game, like, in a commercial. I'm saying that,
like, there are genres and specifics, and it's like something.

Speaker 3 (32:55):
But if we're connecting with the broad, like, middle... yeah,
I don't think so.

Speaker 4 (32:59):
You're seeing Surface now, which is at the NFL games.

Speaker 2 (33:02):
Oh yeah, no, the surface.

Speaker 1 (33:03):
It's so funny having Surface there, especially with Aaron
Rodgers just smashing them.

Speaker 2 (33:07):
But it's like, they... well, what do
we do? We'll do our spin. It's like, make it good,
make it good.

Speaker 4 (33:13):
I can understand. We're gonna go into history here for
the kids, but Microsoft was making mobile shit for a
very long time, right? The iPaq as well, right,
the iPaq, all that stuff. But it was the BlackBerry era, right? Okay,
rewind to, like, late nineties. It was, like, personal digital assistants,
right, PDAs. Yeah, the Palm Pilots, all those things. The evolution

(33:34):
of that was BlackBerry. Microsoft was doing stuff with Pocket PC,
and, like, they made a couple of ones, but they
never really took off.

Speaker 2 (33:40):
And they were like little laptops, so they were they
were like.

Speaker 4 (33:42):
Little keyboards, or like little BlackBerry.

Speaker 2 (33:43):
Clones, but with, like, a little pen you could use.

Speaker 4 (33:46):
Some of them had, like, a stylus too. But the
thing is, BlackBerry was hot because data was bad, and
BlackBerry created the technology to have quick digital messaging to
all your friends. Go watch the movie BlackBerry, which is
not fully historically accurate, but gets the.

Speaker 2 (33:59):
Tech I founded the show.

Speaker 4 (34:03):
But the movie BlackBerry, wonderful, amazing, has one of the
guys from It's Always Sunny in there too, and
he's just wonderfully offensive and, like, really gives it his...
check out that movie. But BlackBerry, the innovator's dilemma
came for them, because they didn't believe you could have
a phone that was all screen, and that was the iPhone.

Speaker 2 (34:21):
That's difficult, genuinely.

Speaker 3 (34:22):
Do you mean, like, they had considered... they were like,
should we make the phone all screen?

Speaker 4 (34:25):
Yeah? Because BlackBerry, and they were like, no.

Speaker 3 (34:27):
No, we not make keyboard.

Speaker 2 (34:32):
Who plays Dennis from It's Always Sunny in Philadelphia? He
plays Jim Balsillie.

Speaker 4 (34:38):
Want to blow up.

Speaker 5 (34:39):
I remember when I was in high school, the big
phone at the time were those, like, the screen that
flips. The Sidekick, I think it was

Speaker 4 (34:46):
Called phones were so cool. Before the iPhone, phones had
all this like cool stuff I had. I had a
Helio Ocean which which flipped up vertically and also flipped
sideways like sidekick. Amazing stuff. And then the iPhone came
out and it was like, okay, all screen but also apps.
Your phone is now a computer. It's not just like

(35:06):
a limited thing with a garbage version of the Internet. It's
the full Internet. And then cellular speeds got better, so,
like, Internet in your pocket, full computer in your pocket.
You cannot deny that that's the best thing. So Microsoft
just was never able... they didn't catch up to that
quick enough. We talked about Android being shitty early on.
It was, but within a couple of years they kind
of did the Windows thing, where Google just, like, had

(35:28):
the software and had other people come and make the hardware,
and Android took over the world. Like, Android dominated
smartphones pretty quickly on. But it's
hard to deny, everyone is chasing, like, what is the
next thing. They think AI is the next thing. And
I'm glad that there are people out there, like, we're
looking at this, and, like, this is bullshit, this is

(35:48):
not good.

Speaker 3 (35:48):
But yeah, are they sure?

Speaker 5 (35:50):
Like, are they really convinced? That's what's amazing, is that's
not right. Because Mashable published... I didn't do the interview,
but we spoke to the Lenovo CEO, and we asked
him about people who are anti-AI and AI skeptics,
and his answer to that was basically, you can't avoid it.
You just won't be able to avoid it. And it's like,

(36:10):
that's not the... but it's also, like, not a
Ring doorbell you won't be able to avoid.

Speaker 3 (36:16):
Yes, that's the ring.

Speaker 2 (36:18):
It's the whole thing of, it's inevitable.

Speaker 3 (36:20):
Right. Do you believe in this product? Well, you can't
avoid what's coming. I mean, you know, we sell it. Yeah,
we sell it. But I think, having
not read the interview, I could imagine sort
of having the attitude of, like, I don't... who gives
a shit if you don't believe in this,
it's happening. We've invested billions, deal with it, and

(36:46):
why is it on me to convince you? You
can sit there and bang your drum as long as...
and I don't mean this to, I don't mean
to be on AI's side, but, like, I can't imagine...
who gives a... I appreciate the honesty of
the answer. But if.

Speaker 2 (37:02):
Your job is literally to market it to it channel.

Speaker 5 (37:04):
Like Apple. When the iPhone first came out and
people asked about the iPhone, like, are people really gonna use this?
Apple didn't go, it's inevitable. They were like, well,
try it and you'll see.

Speaker 4 (37:13):
I will say, Steve Jobs famously an asshole.

Speaker 3 (37:17):
You're holding it wrong. To back them up.

Speaker 2 (37:20):
The messaging was, you'll get used to it. Like, the
Apple way was, you'll get used to this, we know better.
But that's not what AI is saying. They're saying it's happening.
And the funny thing about AI is, if AI
was... if I could say to my computer, load this, do this,
and it actually did it every time, that'd be fucking sick.

Speaker 3 (37:39):
Well that's what That's.

Speaker 2 (37:40):
What Microsoft's pitching you. But they are lying. It's
just, like, I swear to fucking God, I thought public
companies couldn't just fucking lie. But apparently they can
about this.

Speaker 4 (37:50):
If you're all in the same grift, it doesn't matter, right?
And also, what we're learning is that rules just
don't apply to the rich and the powerful.

Speaker 1 (37:56):
But here's the thing, gravity does, and I think they
will be punished. All, uh... like, when the stock's,
Nvidia's is kind of down today, it's kind of fun to watch.
It's also funny as well because none of it really
clicks with what users desire. Because what users desire right
now is, I wish my phone just fucking worked. I
wish every app didn't need to ping me fourteen times

(38:18):
a day to say have you thought about using me?

Speaker 4 (38:20):
I think, just work. It is just, just do
what I ask you to do.

Speaker 3 (38:25):
Wait, when you're saying that the companies themselves don't even
fully believe... what more is there?

Speaker 4 (38:31):
I've talked to a whole bunch of executives. I bring
this question to every big company I talk to, and
early on, when Microsoft was doing Copilot, I was like,
this thing doesn't work right. Like, you're making me...
you make me want to use Copilot as, like, a
search engine, but I cannot trust it. It's not always
delivering correct information. If I bought a calculator that said
two plus two equals five, I would throw it away

(38:51):
because it's garbage. And the Microsoft people were like, it's
a work in progress, we're gonna get better. They're
making you accept a certain level of bullshit
because they can't stop. They've invested too much in it.
They're practically invested in half of OpenAI, so,
like, they can't stop. It's too big, and their stock
is being rewarded for it, so they can't say.

Speaker 3 (39:13):
Anything. But, okay, and again, just to play devil's advocate,
and the views reflected here are not my own:
you know, look at what AI... take image production, for example.
The images that AI could create four years ago
versus the still images

(39:35):
that AI can create today, they're unrecognizable. They're
so much better than when they were producing the dogshit
images four years ago. If somebody... you know, if you
do the same calculator thing, and they're like, well, yeah,
it's a work in progress, and in four years the
calculator is going to say two plus two equals four
every single time. Again, I

(39:56):
think that that is a fair response.

Speaker 2 (39:59):
That's the thing though, this is one I've answered a lot.

Speaker 1 (40:02):
This is one where I'm like, oh, so that made
sense maybe two years ago. Like, I would say it
stopped being rational to do that maybe, I mean, at
latest, March last year, so twenty twenty five. I would
say right up until image generation with GPT. That was when...

(40:23):
I know people are gonna say Nano Banana. I went
an entire day without saying Nano Banana,
which is now on televisions.

Speaker 2 (40:32):
Nano Banana is Google's image generator.

Speaker 3 (40:37):
Like, the best, the best one on the market, and.

Speaker 1 (40:39):
People like, look, it can generate a picture of a
woman for some reason that I'm using.

Speaker 3 (40:45):
Like yeah, yeah, yeah, yeah. It's like it's like tube generation.

Speaker 2 (40:49):
Y oh god.

Speaker 4 (40:50):
And they can't answer the basic question about the entire
idea of image generation, which is, why? But the.

Speaker 1 (40:57):
Thing is, it's not... but the thing is,
it's not having these meteoric jumps like it did
four years ago. Yeah.

Speaker 3 (41:04):
You're saying, you're saying, like the curve is leveling, Yes.

Speaker 1 (41:06):
So we've reached the point of diminishing returns. Because the
way these models get better is two ways. One, you
feed data into them. Two, you tweak the models,
you basically tweak their outputs. You say, don't do this,
do this. With images, you're right up against
the wall now. They've got about as far as they can get.
And the thing they'll say is, like, look, we've made
some realistic-looking images. The problem isn't making one realistic image.

(41:28):
It's doing the same thing more than once, reliably. Maybe twice, thrice,
a hundred times. It's being able to have a consistent
visual image. And you can kind of do that, but
it takes a shit ton of computational power. And as
we run up against the realm of running out of money,
that's become the problem. But also, most of them have
hit diminishing returns because we've run out of data.

Speaker 2 (41:50):
We are out of data. You mean, things
to feed into the thing? Slop for the slop machine.
Well, pig food for the slop.

Speaker 3 (41:58):
For the pigs too, food they can make to poop.

Speaker 4 (42:01):
But here's the basic question about image generation, Like, I
don't care how much better it's getting. Yeah, I don't care.
Who gives a shit about? Why does this exist?

Speaker 5 (42:09):
That's the thing like with so much of this, like
the like when we talk about like the calculator that
you know has to get the math problem correct.

Speaker 3 (42:16):
Well, there's the use case for that is it gets you.

Speaker 5 (42:19):
Yeah, with AI generation and an AI video image generation
and video generation, it's sort of just like Okay, you
can do it, but what is the purpose?

Speaker 4 (42:28):
Like, the guys are like, oh, you can make
a wallpaper for your TV.

Speaker 3 (42:32):
I don't.

Speaker 4 (42:32):
I don't turn my TV on for a wallpaper. I
don't care.

Speaker 3 (42:35):
It's like who gives a who gives like the point
for me?

Speaker 5 (42:38):
Like, people are like, oh, you can take a selfie
of yourself and insert yourself in different places. Well, for me,
when I want to look at a picture of myself,
it's because I want to remember that experience that I experienced.
But what if I put myself in an AI image
generator and it puts me in front of the pyramids
in Egypt?

Speaker 3 (42:54):
Why the fuck do I care? I have no
memory of me doing that.

Speaker 1 (42:58):
I saw a really evil one where it's, like, a
guy showing his grandmother with dementia a
picture of her with Jimi Hendrix. Parody, the following is parody.
I think that person should be fucking jailed.

Speaker 3 (43:08):
Know, I'm so glad you brought that up, because I have.

Speaker 2 (43:10):
You should be in fucking prison, you disgusting fucking monster.

Speaker 5 (43:14):
Facebook consistently hammers me with recommendations of groups where
people ask others to, like, clean up their family
photos and stuff. And those groups are now full of
people who come in and go, oh, my daughter or
grandma or whoever in my family died, and this is
the only photo I have of them. Can you make
(43:36):
it color, or fix it? And now all the responses
are people putting it through the AI image generators, and
they spit out things that don't really look like the
deceased family member. And some of these people love it,
and some people put the deceased
family member's picture in a video generator that.

Speaker 3 (43:54):
Generates yesifying that grandfather.

Speaker 5 (43:56):
But it's like, you know, at some point these image
generators are going to distort your memory of the actual person.
Because if you only have one photo of someone,
and that photo was quote unquote cleaned up, but it
changed how your family member looks, and you're constantly looking
at that photo, it's gonna eventually actually fuck with your
real memory of what that person looked like or sounded like,

(44:18):
for like the voice cloning stuff. And it's like, I
think that's so fucking unhealthy. And they're gonna have a
world of people who remember their own ancestors as people
who did not even exist because they.

Speaker 3 (44:30):
Don't have an actual.

Speaker 5 (44:32):
photo in their brain of how they actually looked. They
have the AI-generated photo of them.

Speaker 3 (44:37):
I completely agree, and I'm sure that this is already happening.
And this is where I get a little nihilistic,
this is where I get a little depressed. I wind
up going... my impulse as a millennial who
experienced the world pre, you know, pre-ubiquitous internet.

(44:57):
My instinct is that, like, their brains are going to be
fucked up by shit like that, because it feels
good in the moment. And so they keep doing the
drug, and why wouldn't you, if the drug dealer keeps
coming to your house? Of course, it is going to
be very difficult to say no. It's much easier to
say no when the drug is not inside your house
all the time. And so they're gonna keep doing the
drug. They're going to fuck up their brains. They're gonna have

(45:18):
this insane relationship with memory and with emotion and
with human connectivity. And then, I want to say,
at some point they will, in their heart of hearts,
understand that this is bad. In years, decades. I'm not saying,
you know, they'll wake up one week... yeah, but, like,
at some point in their old age or in their
middle age, they will go, wow, this is bad. I

(45:41):
don't know how to describe this. I don't know how
to articulate this, but I can feel in my heart
that something is disconnected from what the human experience should be.

Speaker 4 (45:49):
I think that's partially what we're doing right now, because
people are using these tools and not fully thinking about it,
and there are just so many philosophical questions about this,
like, how do we process information, how do we process memory, how
does this work within our society? What's the anthropological impact
of all this falsified information? Like, there's the immediate harm:
we can't trust reality anymore, we can't trust anything, because

(46:11):
they can just make up facts. But also the eventual
harm is that the idea of memory is just gonna disappear.

Speaker 3 (46:18):
I mean, I mean, but.

Speaker 5 (46:19):
Another thing I've seen, and this is the really,
really depressing one: someone will post the photo of their,
you know, child who died at, like, you know, nine
months, three years, whatever, and they're like, oh, I'd love
to see how they would look if they made it
to, like, forty.

Speaker 3 (46:34):
And it's like, no, don't do that, No, no, no,
right exactly, don't do that.

Speaker 5 (46:38):
You have the actual image or videos of them when
they existed and who they were and what they were,
and just keep that in your memory. Don't look at
this person who never will or did exist right that
they're generating.

Speaker 3 (46:51):
It's just odd and the same, all.

Speaker 1 (46:53):
Of that photo editing stuff, and in particularly the new
things where you can be like, okay, add saw to
the photo, make look like in America, give my grandmother
giant abs, like all of the things that you can
do with AI.

Speaker 3 (47:05):
Now all right now, now, now you've got my.

Speaker 2 (47:08):
Yoke my grandma. A phrase to remember, all
of you.

Speaker 3 (47:13):
Yes, yoke my grandma.

Speaker 2 (47:15):
But here's the thing. I genuinely believe all of this
is going away. No one, no one other than the.

Speaker 1 (47:20):
is really willing to fully go... I think all of
this goes away. I think everything dies within two years.
I think there is going to
be the big egg-on-face moment, and no one
wants to take it to the logical point, which
is the people who are dependent on these things are
not going to have a thingy anymore, unless they're going
to put DeepSeek R1 on their laptop with,

(47:40):
like, thirty seconds between responses. Because I think all of
this is just... I think every image is, like, a
few... but my theory, and this is just gut instinct,
I think it's like a few bucks per image. And
I hear MIT Technology Review did an article middle of
last year: images are cheaper than text, but they're
still very expensive. I was surprised too,

(48:04):
but I forget the actual figure. But all of this,
like, the video generation stuff, that's going away

Speaker 3 (48:09):
first. So, and sorry, that couple dollars per
image or whatever is.

Speaker 2 (48:15):
For them to produce.

Speaker 3 (48:16):
Being paid by OpenAI, or it is.

Speaker 2 (48:19):
For those paying for the GPUs to run.

Speaker 3 (48:21):
Yeah, right, okay, And that's another thing though.

Speaker 5 (48:23):
Most of the AI-generated stuff you see out there,
not professional work, obviously, but, like, things people post on
social media or whatever, they're just using, like, the free
version of a lot of these. Exactly, because no
one wants to actually pay, unless you're using it, again,
for professional purposes.

Speaker 3 (48:39):
No one is paying. It's yoke my grandma for free.
No one wants to pay for the subscriptions.

Speaker 1 (48:45):
And that's the thing. OpenAI has this big thing.
People love to email me and be like, what if
they did ads? If you are someone who's emailed me
about it, and there are, like, seventy-five thousand of you, wow...
what if they did ads? Here's the problem. The Information had a
great story this week: nine hundred million weekly active users,
though they bullshit those numbers. Eight hundred million of those
are outside of the US, thus lower-value advertising clients.

Speaker 2 (49:07):
So, yeah, most of them are free. And
the funny thing is, even the ones that
are the most enthusiastic cost them the most money.
OpenAI loses money even on their two-hundred-bucks-a-month plan.

Speaker 1 (49:19):
So the more you like something, the worse
you are as a customer. It's usually the opposite.

Speaker 2 (49:24):
It's usually like, your power user of Facebook doesn't cost
them that much, because it's basic. It's just... so the
way a website works, like Facebook, is.

Speaker 3 (49:33):
No, all right, ed, I know I'm a lady, and
I know I'm not in tech. No, I'm kidding.

Speaker 5 (49:38):
So when you when you should be in a buck,
so picture a computer.

Speaker 2 (49:42):
No no, no no, don't maybe be sarcasting and be
like do you do you know what CAMI is?

Speaker 1 (49:48):
So, when you run a website like Facebook, where
you access a page with some videos, the streaming requires,
like, CPU and storage and RAM in a big server.
That is intensive because they have shit tons of users,
but it's just because you have a lot
of them. It's not because the process of serving you
Facebook or Instagram is super expensive to do. It's expensive

(50:12):
because they have so many people. The problem with
AI is that having one user who's particularly demanding is.

Speaker 3 (50:18):
Extremely expensive. An ever-bearing cost on the company.

Speaker 1 (50:22):
Exactly, because GPUs are extremely cost-intensive. They
require a lot of energy, but they also put out
a shit ton of heat, which requires a bunch of cooling.
And the bigger the data center, the more intensive the cooling.
And I had someone email me the other day saying
really big data centers, like the gigawatt
ones, can cause, like, weather effects with the amount of
heat that.

Speaker 4 (50:40):
This, and they're affecting the local communities.

Speaker 3 (50:44):
And just to clarify what you were saying earlier, when
you're saying the logical endpoint of your
pessimism about AI, or however you want to frame it,
the logical endpoint is that all of this goes away.
You're saying it collapses from a financial standpoint, yes.
But then the logical endpoint of that
is that the companies, Anthropic, OpenAI, what am I... chat

(51:07):
GPT, go, well, we don't have the money anymore to
run these data centers. Correct. And the data centers are
what's required: when somebody types in yoke my
grandma, yeah, we need the data center to
have the electricity on and have the air
conditioning running. Yes. And when somebody's not paying those bills,

(51:27):
then I can't get my grandma yoked. Correct.

Speaker 1 (51:30):
And there's no economic reason to do it, because, to
Devindra's point, the only reason they're doing this right
now is because their.

Speaker 2 (51:36):
Stock value is growing.

Speaker 1 (51:37):
Microsoft, Google, Amazon, Meta, they've seen their stock grow, not
because of AI revenues, they don't talk about those, but
because AI is so big, so huge. Yeah,
everyone's talking AI. But the thing is,
I've heard people say, oh, well, it's a ChatGPT bubble,
it's not an AI bubble.

Speaker 2 (51:54):
Here's the thing. Oh yeah, really.

Speaker 3 (51:56):
Right, how does somebody thread that needle?

Speaker 1 (51:57):
Because they're fucking stupid. The argument is that, well,
OpenAI is the worst business of all time, it burns
money forever, they have no profitability, but all the other
ones will be okay, there will be other winners. The
problem is that this is a vibes based thing,
and what happens to the vibe when the main
business that everyone knows, like eight hundred or nine hundred

(52:18):
million weekly active users, compared to everyone else where they
have, like, twenty million? Now, you may think, I've heard Microsoft
say they've got hundreds of millions of Copilot users, I've heard
Google say hundreds of millions of Google Gemini users. That's because
they put them in their main products. But even then,
their shareholders are just going to go, right, why are
you spending all this fucking money on this? Why are
you propping up these insanely expensive things? And if you

(52:41):
wonder why I'm so confident about this, none of them
talk about the revenues or the costs, right? Sure.

Speaker 5 (52:47):
The thing also, though, is if AI did disappear,
what do customers, what do consumers lose out on?

Speaker 2 (52:53):
Like?

Speaker 4 (52:53):
What, what do they care?

Speaker 5 (52:55):
Yeah, and the things that people may use in their
everyday life, like AI transcription or voice to text.
We're at the point now where those models actually can
just live on your personal computer. You don't need an
outside server at all, You don't need a company to
pay a monthly fee to.

Speaker 3 (53:11):
You can just download a model.

Speaker 5 (53:12):
It's a few gigs, and you have your own voice
to text and your own transcription. It's that other, more
intensive stuff, the little toys that no
one really needs, that will just go away.

Speaker 4 (53:22):
To me, that is the logical outcome, by the way.
Like, the big AI will die off to a certain
degree, and that will lead to widespread economic devastation. But yes,
like, that will be hard.

Speaker 2 (53:32):
It's going to be Great Financial Crisis level, in my mind.

Speaker 4 (53:34):
Could be worse than the financial crisis, could be a depression
level hit on the world economy. So that worries me.
But the idea of local, the idea of, like, local
AI and stuff, that's just like, okay, my watch is
a little better, I can transcribe on my computer, that

(53:55):
lives locally. That does happen.

Speaker 2 (53:58):
That's already one piece of bad news.

Speaker 1 (53:59):
Though, the only reason those models are being worked on
right now is because of the big ones. And once those
big ones go away, they're not going to have any
fucking reason to do them. They don't care about transcription,
they don't care about the useful stuff. They care about.

Speaker 4 (54:10):
They still need a reason to sell you a new
laptop every year, and I.

Speaker 2 (54:13):
Some of them think this is going to be enough of one.
And my thing is, like, I'm not sure how
we're going to intellectually reconcile with this, because this is
the world pant-shitting competition.

Speaker 4 (54:25):
It's also, everything is so fucking stupid right now, like,
at every level. We've been in the CES
bubble, but looking at the wider world right now, yeah,
it's so dumb. Like, Trump invading Venezuela
was how we started this year. I was born in Guyana,
South America, the country right next to Venezuela, and it's
probably next on the list, because they just found oil.

(54:46):
So Exxon was, like, just out there taking all
that stuff right now.

Speaker 3 (54:49):
Put it back. We didn't find anything.

Speaker 4 (54:51):
Things are so stupid right now. So that's why
I do kind of feel like, yes, we're going to see.

Speaker 1 (54:56):
It. But, and I know this sounds hard
to reconcile, I think this is going to be stupider.
Just, hear me out. Every major CEO.

Speaker 2 (55:06):
has added AI, has talked about AI. It's all they
talk about, AI. Every major tech company, AI, AI, AI. And
a bunch of people we work for, not me personally,
but tons of people we work for, people have said
that AI is the future. World leaders, AI is the future.
Everyone said AI is the future, and I think all of
them are wrong. What are we meant to do with
that information? Because this isn't Trump politics, okay, where there's voting,

(55:29):
there's like a process.

Speaker 1 (55:30):
This is, like, for now, sure. But the foundations
of economic movement, but also knowledge in the world and,
like, consensus reality and paradigms themselves, have been based on
the idea that the people running the money, the people running
businesses, are somewhat intelligent and would not be fucking wrong.
And what's become obvious is almost nobody knows anything, ever.

(55:54):
And I don't mean this in the cutesy like oh,
bosses don't know stuff. I mean these people have just
been fucking wrong, like wrong, wrong, wrong, wrong, wrong, just
like AI will do this.

Speaker 4 (56:03):
But it's not, it's not just AI, I will
tell you that. Like, when I started covering technology and
doing journalism, and I was mainly, like, writing about startups
at the time, like, oh cool, these guys,
their ideas are so interesting, they've gotten billions of, usually
millions of, dollars of investment. I was talking to, you know,
Kevin Systrom and the Instagram guys when it was just
four dudes. I was like, oh man, they must be
so smart and so interesting. No, usually not.

(56:27):
Just, like, you're a nerd who knows how
to code, who came up with, like, one interesting idea,
and the VCs were just throwing money at you. And
even when it comes to, like, business, big business
in general, people rise to the top not often through
their skill. It's that they've been there long enough, or
they're a good enough company shill. Logical.

Speaker 1 (56:47):
My point is, imagine if, in the Great Financial Crisis, yeah,
Microsoft was in housing, Google was in housing, Apple
was in housing, Disney entered housing, everyone bought houses, and
everyone was selling houses, and everyone was offering mortgages. I
want mortgages, mortgages were the future of every single business
in the entire world.

Speaker 2 (57:02):
Everyone on LinkedIn was saying I'm a housing expert. Now
that is it.

Speaker 1 (57:07):
Except when they said housing, they meant cat litter boxes
because I'm joking.

Speaker 3 (57:11):
But it's like, yeah, what if everyone was wrong?

Speaker 2 (57:16):
Wrong?

Speaker 1 (57:16):
Wrong in a way that's not political. It's just obvious
that so many people we thought were smart were just
like able to do mad libs.

Speaker 3 (57:27):
I have a question about the leaders who have backed AI, right. Okay,
as I see it, there's two categories that I understand,
and I'm going to ask about what the middle category is.
The first category that I understand is the tech CEO
whose company has some vested financial interest in
AI being successful, right? Totally get it, absolutely. Like, why

(57:49):
would Google shove Gemini into their products? Great, because they
see a potential financial benefit. Fine.

Speaker 4 (57:57):
Yeah, also Google was holding off. Google was trying
to be cool and chill about it, and then OpenAI
kind of blew the doors open, and then they
had to rush everything out.

Speaker 3 (58:05):
Sure, yeah, okay, that category I understand. Then CEOs of
companies who have been sold the lie that AI can
make your whole company work better. That I also completely understand.
Like, you know, Salesforce saying, I guess this is sort of

(58:25):
an example of the middle category. Somebody at Salesforce being like,
we're gonna add AI, and AI is flashy and sexy
and new and it's the best thing,
and then turning to Home Depot and being like, now
you should pay us extra, because our new Salesforce
product has AI and that's going
to make your life better. And Home Depot goes, great, awesome,
we'll pay more for that. That I understand. Like,
I understand the.

Speaker 2 (58:48):
But even those don't truly make sense.

Speaker 3 (58:49):
But keep going, right, right. And I, like,
disagree with the efficacy, but I understand the messaging there.
What CEOs, or what companies, have added AI, trumpeted AI,
you know, championed AI, that are not in the category
of their company being a direct financial beneficiary of

(59:13):
the technology, or, like, a direct owner of the technology?

Speaker 1 (59:17):
Disney, Disney, right, right, being one of them. Most
journalist outlets, they will claim, oh, AI will generate stories.

Speaker 2 (59:23):
They don't.

Speaker 1 (59:24):
And I actually pushed back on the thing with Salesforce,
because, there's a really cool thing, well, Salesforce,
I will choose another company, in my head Salesforce is
so special, but, like, a software company in this case.
The really simple thing I'll say to that is even
they don't make sense, because it don't work. It don't
do that. Like, it don't, it don't do it. It
don't do fingy like you say. The whole thing, Copilot,
it doesn't work. It doesn't work. It's never worked.

Speaker 3 (59:46):
It works enough. No, it doesn't.

Speaker 4 (59:48):
It is good enough for people to be using it.
My issue is, your computer should be, you should
be, like, I gotta push back. It doesn't.

Speaker 1 (59:58):
There have been, like, eight different times in the last
year where Microsoft's three sixty five Copilot couldn't do
things like prepare a PowerPoint.

Speaker 4 (01:00:05):
Oh yeah, those are, like, specific failures, yes. But on the
general failures.

Speaker 2 (01:00:09):
Yeah, on the general, what's, like, the general?

Speaker 4 (01:00:12):
What I'm saying is, like, you ask Copilot or you
ask Gemini, like, a question, and it will get you
a response. It is good enough.

Speaker 2 (01:00:19):
It will.

Speaker 4 (01:00:20):
You're like, hey, make me a chart of this data. Hey,
that actually works. A lot of that stuff does. Maybe
it's not trustworthy. Like, that's the thing, that means it
doesn't, well, no, no, no. But there are variations of
how badly.

Speaker 1 (01:00:32):
But the thing is, if there's a variation of how
well something works, that just means it doesn't work most
of the time.

Speaker 4 (01:00:36):
Sure, but hundreds of millions of people are still like
using them.

Speaker 2 (01:00:41):
Yeah, because they're being pushed into it by everybody.

Speaker 4 (01:00:44):
But people like using it. Like, for them, the degree
of, like, as good enough as it can be, for them,
it's good enough. That's why we're pushing back, though, and
we're saying, like, you can't trust this data, you can't
trust this stuff. They're not thinking about that, they're
just using it. So we'll continue this debate, it's not
really a debate, shortly after this break. This next
ad is specifically for boomers. It's called Where's My Pills.

Speaker 2 (01:01:09):
There's someone going to be really angry at this one.
They're in the drawer, mate. And we return to our scene.
I'm now joined by Edward Ongweso Junior of the
Tech Bubble newsletter. Hello, my friends. Chloe Radcliffe, of.

Speaker 1 (01:01:32):
Course the actress and stand up comedian who it's you,
And of course Devinger Harder of Anger Jam.

Speaker 3 (01:01:37):
Hello.

Speaker 2 (01:01:38):
And I think we're at an interesting point in the

Speaker 1 (01:01:41):
show where we're all, like, getting to the end, and
we're realizing we have to go back to the real
world, and there's this AI thing, this
whole situation. And to get
back to the debate we were just having, it's like, yeah, it
kind of works, but I don't think that people realize,
because it only kind of works, that when they see
what it actually costs, which is like probably two to

(01:02:04):
five to fifteen times what they're charging, no one's going
to want to pay for the half-assed thing anymore,
and many will not be able to pay for it
at all. Like I was saying yesterday, a lot of
these companies that are just a thing bolted onto an LLM die
just immediately, like, they will.

Speaker 2 (01:02:21):
Turn to dust.

Speaker 1 (01:02:22):
Because imagine if the core fuel of a car, of course,
like, doubled, tripled, quadrupled in cost. Of course.

Speaker 3 (01:02:30):
Okay, interesting proposition here. Two questions.
Do you think that AI companies will start raising subscription rates
before the financial implosion that you predict?

Speaker 2 (01:02:49):
Yeah? I think that. Well, you've already kind of seen signs,
and they're very unfair with this.

Speaker 1 (01:02:53):
No one likes this one. Both OpenAI and Anthropic
have added priority processing for their API customers. API just
refers to, instead of you or me using, like,
ChatGPT or Claude, you as a company would connect your
software with an API to their service and then run
on top of it. Now, when you run something on
top of these models, you're effectively just prompting them.
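To make that concrete, here's a minimal Python sketch of what "connecting your software with an API" looks like from the company's side. The model name and the `service_tier` field here are hypothetical assumptions, not any provider's real API schema, just the general pattern of a backend assembling a prompt on a user's behalf:

```python
# Hypothetical sketch: the model name and "service_tier" field below are
# illustrative assumptions, not any real provider's API schema.
import json

def build_chat_request(user_text, priority=False):
    """Assemble the JSON body a company's backend might POST to an LLM
    provider's API, prompting the model on behalf of one of its users."""
    return {
        "model": "example-model",
        "messages": [{"role": "user", "content": user_text}],
        # Priority processing: pay more for faster, guaranteed throughput.
        "service_tier": "priority" if priority else "standard",
    }

body = build_chat_request("Summarize this support ticket.", priority=True)
print(json.dumps(body, indent=2))
```

The point being, every feature bolted onto an LLM ultimately reduces to requests like this, so per-request pricing, and any tier surcharge, flows straight through to the product's costs.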

Speaker 3 (01:03:12):
API stands for application big interface.

Speaker 2 (01:03:17):
There you go. Yeah, that's, yeah, my brain couldn't
think of the real one, so I can't.

Speaker 1 (01:03:24):
But it's like, those companies cannot survive a product
with a cost raise. I had a listener and
reader reach out and actually tell me that if you need
priority processing, if you're above a certain threshold, your
costs just double automatically. And I think the other
thing that people will say is, oh, the cost of
models is coming down. Classic talking point. Jensen Huang, in

(01:03:46):
his Nvidia press conference, said, oh, the cost
of models is coming down. Yeah, but it's kind of
like if the cost of fuel came down a bit,
but your journey to work took fifteen times longer. Because
now, when you use a new model, it isn't just
spitting out an output. It's using models that use more
computational power to give you slightly better results. So it

(01:04:07):
may be cheaper, well, it's not. It takes longer.
It uses more tokens, just for simplification's sake, the
things that they generate to give you an answer.
It uses a reasoning model, which uses more tokens to
do the output. So while it might be cheaper on
a token basis, you're using more of the fuckers, so
it's not cheaper at all.
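The fuel analogy can be sketched with some invented numbers. The prices and token counts below are illustrative only, not any real vendor's figures, but the arithmetic shows how a cheaper per-token price still loses to a reasoning model that burns far more tokens per answer:

```python
# Illustrative arithmetic only: prices and token counts are made up.
def answer_cost(tokens_used, price_per_million_tokens):
    """Cost of one answer: tokens consumed times the per-token price."""
    return tokens_used * price_per_million_tokens / 1_000_000

# Older model: pricier per token, but a short, direct answer.
old_cost = answer_cost(tokens_used=500, price_per_million_tokens=10.0)

# Newer reasoning model: 40% cheaper per token, but it "thinks" at length,
# burning 15x the tokens before it answers.
new_cost = answer_cost(tokens_used=7_500, price_per_million_tokens=6.0)

print(old_cost)  # 0.005
print(new_cost)  # 0.045, nine times the cost, despite the cheaper tokens
```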

Speaker 2 (01:04:27):
It's actually more expensive.

Speaker 4 (01:04:28):
Yeah. We can argue a lot about the AI industry,
like, that's, yeah.

Speaker 2 (01:04:32):
Sorry, some Indian thing.

Speaker 4 (01:04:33):
I will say.

Speaker 2 (01:04:34):
We are seeing the direct impacts of the rise of
the AI, right. Go on, talk about memory, the
fucking HBM, the memory. I'd love to talk about this.

Speaker 4 (01:04:42):
RAM prices have increased by three to four times across
the board, which is insane, and that'll affect everything. That
affects phones, computers, whatever, because.

Speaker 3 (01:04:51):
Like, everything, because a car is a computer. Cars.

Speaker 4 (01:04:55):
So any device needs some sort of memory. What is
interesting is, like, we are seeing the direct impact of
that already. Dell's XPS fourteen, which should be a computer
that costs under fifteen hundred dollars, because of what it is
and what the competition is. And that was about
the pricing Dell gave me when I first wrote my post,
they said sixteen hundred. The next day, an update: that thing
is going to start at twenty fifty dollars.

Speaker 2 (01:05:17):
Jesus Christ. Because, because they're.

Speaker 4 (01:05:21):
Not going to say why, but I would logically think,
like, that is going to be the highest cost for
a lot of new systems. That is insane. That is,
it's a premium ultraportable, sure, but that thing should cost
under fifteen hundred bucks, or about fifteen hundred dollars. So
we are going to see similar premiums like that.

Speaker 3 (01:05:35):
Are you saying, if you need to get a new computer,
do it now, because it's going to get.

Speaker 4 (01:05:39):
Worse. Do it immediately, and maybe buy used or refurbished,
which is my recommendation for everything. But yeah, buying a
new system this year is going to suck.

Speaker 1 (01:05:47):
So we're going to do an episode on this next
week with Steve Burke from Gamers Nexus.

Speaker 3 (01:05:51):
Nice.

Speaker 2 (01:05:51):
So RAM, random access memory, is, yeah, no, no, no,
this is the thing.

Speaker 3 (01:05:58):
No, I want, I want this. Don't.

Speaker 1 (01:06:00):
The best lesson Sophie and Robert gave me is explain everything,
because even if you think you know something, you don't.
So RAM is for when you quickly need information to do
a task on a computer, so you can't just access the
hard drive. No matter how fast the hard drive is,
it's not fast enough to get something immediately.

Speaker 2 (01:06:14):
AI services are extremely.

Speaker 1 (01:06:17):
demanding on something called high bandwidth memory, which just
allows you to keep a bunch of stuff in
the RAM so you can constantly access it.

Speaker 2 (01:06:24):
Because that's how LLMs work. Someone's gonna email me and
say I truncated that, shut the fuck up. But nevertheless,
because of the demands.

Speaker 1 (01:06:31):
Of Nvidia building GPUs, of Broadcom building their own bullshit GPUs,
AMD building their own bullshit GPUs, the.

Speaker 2 (01:06:39):
price of RAM across the world has increased. Now, eagle-eared
listeners may think, I heard a story, Edward, about
OpenAI taking forty percent of the world's RAM. This
story is bullshit, and everyone pushing it is a fucking liar.

Speaker 1 (01:06:54):
So OpenAI did a deal, quote unquote, with SK Hynix
and Samsung, two companies out in Asia, where they were
going to get hundreds of thousands of wafers of RAM,
just the things that you cut into RAM, which is insane
on its own. But it was a letter of intent, which
is a concept of an idea. It's like, if you and
I did it over email, why do we do this? It means

(01:07:15):
fucking nothing. And everyone's trying to blame that. No, what
it is is just every GPU, anyone doing anything with
GPUs, is.

Speaker 4 (01:07:22):
Just buying RAM. Did you see the new Nvidia supercomputer,
the Vera Rubin thing? I hate that they're using these
actual scientists' names to apply to these things. It seems so disrespectful.
But one point five terabytes of RAM in this Vera
Rubin supercomputer. And that's, that's it. So it's, like, a

(01:07:43):
one point five terabytes.

Speaker 2 (01:07:44):
How much of that? One point five terabytes per
how many GPUs?

Speaker 4 (01:07:48):
I'm gonna have to look into the specifics here, there's
a lot of confusion, and I believe that's in the full thing.

Speaker 3 (01:07:54):
I'm sorry to be a woman who doesn't work
in tech about this. So there's a supercomputer named
after a lady, and we think it's disrespectful. Yes. Tell
me the name again. Vera Rubin. Vera Rubin, okay. The
Vera Rubin computer is a big box and it's really powerful.
And inside there's a smaller, there's a little shoebox
that you can keep all.

Speaker 4 (01:08:12):
The little, yeah, and, like, little smaller computers all
tied together.

Speaker 3 (01:08:17):
Yes, tied together, okay. RAM is like one box where
you have all sorts of random things that you might need.
It's like a, I know, I know, but I'm
just, I'm picturing it in my head.

Speaker 4 (01:08:28):
If you imagine, like, a city block, right? Different
blocks are different components. The RAM is the streets in
between them to deliver information. Okay? So more RAM, bigger streets.
High bandwidth RAM, faster streets, faster traffic.

Speaker 3 (01:08:41):
Right.

Speaker 4 (01:08:41):
Okay. So that's the basic thing. It's just a shit
ton of data. That's what it needs.

Speaker 2 (01:08:47):
Why, oh my god, is it one point five terabytes
per fucking rack?

Speaker 4 (01:08:51):
Per rack, and there's a lot of racks in there.

Speaker 3 (01:08:53):
There's a lot of racks in there.

Speaker 2 (01:08:54):
Yeah, a fucking data center will be full of, like, hundreds.

Speaker 3 (01:08:58):
Racks on racks on racks, literally. Okay, but hold on,
why is RAM limited?

Speaker 4 (01:09:05):
There are only three companies in the world that make it,
and, well, there you go. But that's basically it, there
are only three companies, and nobody expected this. We
were fine, and nobody expected, why do you need
so much goddamn RAM? Nobody was prepared for this.

Speaker 3 (01:09:19):
Yeah, And is RAM a physical thing that somebody makes?

Speaker 1 (01:09:22):
Yes. So, the preparation, it's, I'm going to fuck
this up slightly, so Devindra, please correct me if I get it wrong.

Speaker 4 (01:09:27):
Hey, I am not a chip architect. So there's a company.

Speaker 1 (01:09:29):
Called TSMC, Taiwan Semiconductor Manufacturing Company.

Speaker 2 (01:09:33):
I think I get that.

Speaker 1 (01:09:35):
Also, I respect companies that are just named, like, what
the business is. Companies like, fuck yeah, just tell me what
you do. They have these things called fabs, and they
basically build chips, and there's a whole fucking supply chain
of how chips are made.

Speaker 2 (01:09:46):
You get wafers of RAM.

Speaker 1 (01:09:47):
You cut them up. Sounds delicious, doesn't it? You get, no,
you get, like, a big sheet of, you get a
big sheet of memory. Yeah, and then you cut them up.
Making chips is fucking cool.

Speaker 3 (01:09:56):
But you're saying this wafer is a certain amount of chips.
You print the sheet. You may make a sheet of
a thousand chips, and then a wafer is one hundred chips.

Speaker 1 (01:10:05):
A wafer is just the thing that you carve into chips.
But really simple, just, you don't need to know
that bit, just really simple. There is a limited amount
of machinery to build RAM. There's, what, there's SK Hynix,
Samsung, and who else? Who makes them?

Speaker 2 (01:10:20):
Marvell, Marvell. There's, like, three companies who make RAM.

Speaker 3 (01:10:23):
And we can't make more machines to make RAM? It's.

Speaker 2 (01:10:26):
Just limited space, they're spent.

Speaker 4 (01:10:28):
It's going to take a long time to spin that up.

Speaker 1 (01:10:30):
One important detail. So making chips in general requires something
called a clean room. So think about any scientific process
that cannot be interfered with by any contaminants.

Speaker 3 (01:10:39):
Growing mushrooms in your garage, right, take it three.

Speaker 1 (01:10:44):
Hundred steps higher where it's just like if a single
speck of dust gets in there, you fuck up millions,
hundreds of millions of dollars, Like, it's really you have
massive things, and to run these machines requires very specific
talent that we don't have enough of. But also there's
only a so an amount of space in the world,
so we're just we're at the limits. And while they're
building more fabs where you build these things, there's only

(01:11:06):
so much. Right now, we have these fucking assholes who
are like, I need one point five terabytes to give
you an idea, Like I think this iPhone air has
eight gigabytes of RAM. Terror byte is one thousand gigabytes
of incorrect Yeah, correct, great stuff. So it's just like
you have one big asshole taking all the chips. Now

(01:11:28):
back to my I worry about the collapse thing we're
gonna have. We have all of this allocation being done
for one company. What happens those aren't selling anymore. We're
gonna just have fucking chaos. But until then, the price
of RAM is going to keep going up and it's
going to be harder to get, which will make it
more expensive, which will make every single consumer device more
expensive without exception.
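For a sense of the scale being discussed, here's the arithmetic on those two figures from the conversation, eight gigabytes in a phone versus one point five terabytes in a single rack:

```python
# Scale comparison using the figures from the conversation above.
PHONE_RAM_GB = 8            # roughly what a current phone ships with
RACK_RAM_GB = 1.5 * 1_000   # 1.5 TB; a terabyte is 1,000 gigabytes

phones_per_rack = RACK_RAM_GB / PHONE_RAM_GB
print(phones_per_rack)  # 187.5, so one rack holds nearly 190 phones' worth of RAM
```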

Speaker 3 (01:11:48):
And again, just to ask the stupid question, when we
say the price of RAM is going up, I could
equivalently say the price per RAM chip

Speaker 2 (01:11:56):
is, yes, exactly, sure, great. And it's kind of, it's
like, it's a very simple supply chain thing. Like, if
the price of wheels went up, cars would become more expensive.
I realize that it's silly, but, like, it's a
very simple thing, and it's affecting everything, and that's
what pisses me off.

Speaker 4 (01:12:11):
Yeah, because, like, we can debate about how useful
AI is, but the direct impact to consumers right now,
people building PCs, anybody buying a new computer, you're gonna
be screwed in twenty twenty six. And I'm working on
a piece right now about that. Tell me, tell
us more. Yeah, it's the same thing. AMD also
announced their new high end supercomputer, you know, full AI
servers, also gobs of RAM. So yesterday I sat down

(01:12:33):
and I talked with one of the AMD executives about,
like, what is the state of the computing industry, like,
how do things look? And they admit, like, you know,
it's going to be rough. If you're
a PC owner and you're going to do an upgrade,
you're probably not going to build a whole new PC
this year. Maybe you'll just do a processor upgrade or
GPU upgrade or something. Even those are going to get expensive.
But I think they're expecting a slowdown in this industry too,

(01:12:55):
So it's gonna be it's gonna be tough.

Speaker 3 (01:12:57):
And the solution can't even just be buy used
and refurbished, because those prices.

Speaker 1 (01:13:01):
Those prices will rise too, and we're seeing the after effects of that. Actually,
outside of this, so, weird example, but we're running out.
We also don't have enough power. So gas turbines, which
they're using to power these data centers, which
you should not do but they're doing anyway, are actually
sold out for seven years, so they're buying old used
gas turbines. I wouldn't be surprised if the used RAM
market gets fucked too.

Speaker 4 (01:13:22):
It's already getting bad.

Speaker 2 (01:13:23):
But hell yeah. I mean, who, no one worries,
someone stopped making consumer RAM, right? Of course, Micron.
Micron, my dad, sorry. So.

Speaker 4 (01:13:32):
And I feel bad, Micron owned Crucial, and that
was, like, the first RAM stick I bought when I
was building a computer, and, like, I have memories attached
to that. My bigger worry, though, is they're lighting up
nuclear power plants for this bullshit. And that's where it's like.

Speaker 2 (01:13:45):
I'm not worried about that. I'm worried more about the
gas turbines.

Speaker 4 (01:13:48):
The gas is bad. The gas is bad. Because nuclear
power, well, that's going to be a big fight.
But we should have invested more in
nuclear power. It may be too late, but I don't know.

Speaker 3 (01:13:58):
Hold on. Yeah, you're saying it's bad to light up nuclear
power plants because these are old things that might not
be in good enough shape? Well, it's just more like,
you're not saying, you're not anti-nuclear?

Speaker 4 (01:14:08):
I'm not anti-nuclear power. We need that power for
other stuff, and this power should be going to, like,
residential uses. And it's literally gonna be sucked up by bullshit.

Speaker 3 (01:14:17):
But it's still, and you're, and why are the gas
turbines bad? Gas?

Speaker 1 (01:14:20):
Because gas turbines are basically coal. No, they're not
the same, but, like, they have the same after effect
of belching out gas. And Elon Musk's data centers are
already ruining Black communities because of this very specific thing.

Speaker 2 (01:14:32):
They're poisoning water.

Speaker 1 (01:14:34):
It's really fucking horrible. But also, this RAM thing, I
think, is gonna be everywhere, and the PC industry was
already in decline.

Speaker 2 (01:14:42):
I believe they were. Now, wait, last year they
actually, I thought they were, my bad.
No, only Dell was down.

Speaker 4 (01:14:48):
But here's the thing, for reasons we discussed.

Speaker 1 (01:14:50):
Surely everyone will be down, because everything's more expensive. And Apple,
smartphone sales, like, every bit of this is gonna be bad,
and they are already running out of ways to sell
you a phone. My biggest example being, like, why
else would they ship Apple Intelligence, which is the most
mass radicalization event I've seen in tech? Just, I've never

(01:15:11):
met so many people, like, fuck AI, as people who
use Apple Intelligence.

Speaker 2 (01:15:15):
Yeah, who are just like, fuck this shit, I fucking
hate this. I don't know why my phone is,
like, sparkly rainbows and just wrong all the time, and
giving me a summary saying, like, eight Ubers are coming your

Speaker 4 (01:15:26):
Way, the notifications are bad. Apple wouldn't be doing that
if other people weren't chasing all this. But they're being
forced into it, and you know are they being forced
They're kind of being forced into it because like why
could they not be like because everyone talks stock prices, yeah, shareholders,
stock price. Everyone's like, oh man, Apple slow to AI.
I guess Apple doesn't have the innovation.

Speaker 2 (01:15:46):
You know, it's like someone hasn't shit their pants yet.

Speaker 4 (01:15:49):
Yeah. So Apple rushed out Apple Intelligence. They got out there,
they announced this super smart Siri thing, which was not,
they couldn't ship it, so they took a hit for that.
They took a hit for the Apple Intelligence issues. Again, they
were forced into that, in a way. And hey, I
blame them for the issues. But if Apple were left
to its own devices, as a company that is really slow
about pushing and trying to change new things, I think

(01:16:09):
they would at least try to make it less error prone.

Speaker 1 (01:16:12):
But the point I was making was that Apple
shoving Apple Intelligence in there was a sign that Apple
doesn't really have anything else.

Speaker 2 (01:16:19):
And I will say this, people get so mad when
I like my devices. People.

Speaker 1 (01:16:23):
I get little emails of people being like, fucking hell,
when I'm just like, yeah, I like my phone, and
they're like, fucking piece of shit.

Speaker 2 (01:16:29):
But it's like, Apple phone good, use it, I don't know. I would love an.
I would love an.

Speaker 4 (01:16:32):
Apple phone with great aspects of Apple Intelligence.

Speaker 1 (01:16:35):
Other than the Liquid Ass, the new operating system that
is so bad it makes me want to scream. But it's like,
and it's not because of Apple being a bad company, sir,
it's just there's only so
much you can do with a certain form factor. Your
phone can only do so much.

Speaker 4 (01:16:49):
But there are things like, hey, voice memos instantly transcribed now,
that is sick sick.

Speaker 2 (01:16:55):
Voice notes getting transcribed, it's fucking great.

Speaker 4 (01:16:57):
You can hit record on calls now and those get transcribed.
That's happening because of on-device AI.

Speaker 2 (01:17:02):
And so, that's great, and that shit's dope, but
that shit don't make growth happen. But
also, everyone does that now, and that's the problem. It's like,
how do you sell a new phone?

Speaker 1 (01:17:12):
So they're in this situation where how do we sell
a new phone at a time when they're going to
have to increase the prices on everything because RAM is
more expensive because to make a new phone faster you
need RAM.

Speaker 4 (01:17:22):
But also, to fit models onto your phone, you need
more RAM. You need more RAM because those models
sit in memory and have to sit there and work
all the time. They can't just, like, come off the storage.

Speaker 1 (01:17:32):
Do you remember in like twenty twenty twenty twenty one,
the supply chain crisis that was caused by like basic
goods being delayed. Yeah, this is kind of like that
for computers, and we're going to see how many things
have computers in them: televisions, phones, cars, I mean even
the thing we're recording on right here, that has
RAM in it. Everything has RAM in it that

(01:17:54):
has any electronics these days.

Speaker 4 (01:17:55):
That's my big worry for twenty twenty six, is basically we're
screwed when it comes to consumer... But the thing
is, it will, good for prices, screwed for prices,
for all sorts of things, but oh sorry, keep going. Well,
companies may die out, like, things will just
be bad, but yes, prices is gonna be where it starts.

Speaker 1 (01:18:12):
And the other thing is that we're at a
point when, as you've seen, you are at
the Consumer Electronics Show, you've had a very egregious example
of how these companies do not have a fucking clue
what to sell you. They're still half-heartedly coming
up with ideas like the death of a relationship. It's like, yeah,
we're gonna go per recar guess we fucking hate each other.

(01:18:34):
We're gonna go on what was gonna be so happy.
Everyone's trying to come up with a little idea to
do something with and at a time when they're struggling
for ideas to sell more things.

Speaker 2 (01:18:42):
To grow. The core thing they use to build the
thing is becoming more expensive in a way that you
can't circumvent.

Speaker 1 (01:18:49):
There is no way to fix this. A good example would
be Nvidia. I may have mentioned this in
the previous episode, I'll review here. So Nvidia, crazy thing:
eighty-eight percent of their revenue comes from selling these
things for data centers. A much smaller percentage comes from
selling GPUs for gaming devices, so an Xbox, for example.
The problem with that is they've already raised the

(01:19:11):
price on the Xbox. They're probably going to have to
do it again. And those GPUs, now, Nvidia also
has deals with other companies where they can take an
Nvidia GPU for gamers and they can sell it.
They can say, it's an Nvidia GPU, but it's got
our label on it, we do special things with them.
The problem is Nvidia is no longer including RAM with them,
so every single non-Nvidia GPU company has to now

(01:19:32):
buy RAM separately. So their supply chain just got fucked up.
Imagine if you're a car company and you.

Speaker 2 (01:19:38):
Buy a certain block of car and it has certain bits,
but you customize it and they're like, yeah, man, sorry,
we just don't do steering wheels here anymore.

Speaker 1 (01:19:46):
They got real expensive. You're gonna have to buy them
from the steering wheel guy who is now charging more.

Speaker 4 (01:19:49):
That's already happened to some degree with stuff like
LiDAR and things like that. There was technology that was in
Teslas, like, this is core to Autopilot and self-driving, and
then later got expensive.

Speaker 1 (01:20:00):
But here's the thing, though. This is a really simple
thing, though. This is just, like, part of the motor.
This is the core component of the entire computing supply
chain everywhere that is going to go up. We're going
to see, to your point, Devindra, shit just get
more expensive, and see who can withstand that pain.

Speaker 3 (01:20:21):
I mean, what do you, what do you think the
chances are that inflation actually causes some kind of spark,
conflagration, before the financial collapse of the AI system.

Speaker 1 (01:20:38):
Will lead to the... Yeah, this is a form of inflation,
honestly, on the compute industry.

Speaker 3 (01:20:44):
I'm saying, no, no, no, sorry, that's exactly what I'm saying.
I'm saying, like, you've talked about the financial collapse of
the AI system of borrowing money. Yeah, yeah, yeah, that
could go bad when, if, if... keep going. Yeah, okay,

(01:21:06):
this is like playing a clarinet solo in front of everybody. Okay, uh,
that could, that financial collapse could trigger, if Nvidia stops... Fuck,
we were just talking about this last night.

Speaker 2 (01:21:20):
Yeah, if Nvidia stops.

Speaker 1 (01:21:22):
Most of Nvidia's revenue comes from people borrowing money, yes,
with borrowed money.

Speaker 3 (01:21:26):
Yes, and if people, if, I mean, if the
banks ever come calling for that borrowed money, they're in
doo-doo land, yes. Or if Nvidia stops, if their,
what is it, if their numbers go down, if
their revenue stops growing, okay,
that's the, like, financial system collapse, potential collapse. What do

(01:21:47):
you think the chances are that this inflation just hits,
like, the general populace so severely before the actual, before the
Nvidia financial... With this computer, I understand, but like,
but like, so severe, you know, I mean, people are just,
people are.

Speaker 2 (01:22:03):
Just I think I get through it.

Speaker 1 (01:22:05):
I think I get what you're saying. It will cause
two gut punches. It will be, likely before the AI
bubble bursts, we will have a thing where anyone selling
consumer electronics or even enterprise hardware, so just consumer electronics
for the enterprise, their costs go up, so they will
probably sell less because their customers will be unable to

(01:22:26):
afford as much, which will cause a bunch of tech
companies to not make as much money, which will lower
the values of their stocks. This will then lead them
to go AI AI, and then their shareholders will go, great,
you've been saying that AI bullshit for three years, you
got any money?

Speaker 2 (01:22:40):
And they go, what if I told you I had
less money? Because it lost me money? And at that point
the world pant-shitting competition will end, like.

Speaker 3 (01:22:49):
It's, I guess, I listen to all of this and
I ask my gut, which means almost nothing because I
understand almost nothing of it. But my gut says,
thank you, that the inflationary effects are
going to actually be more immediate than.

Speaker 2 (01:23:10):
The AI bubble.

Speaker 3 (01:23:10):
I agree fully. And I guess I don't mean like, duh, yeah,
they're happening now, but I mean, like, some degree of
severity that people decide is not tolerable. Yeah, and
it will happen before the AI bubble. Sorry, we suddenly
brought you into the... No, no, no, you're good.

Speaker 7 (01:23:28):
I mean, I think, I mean, this raises, there's
something I've just been wondering, where it feels like we
just keep coming up against this problem where we are
having, uh, structural limitations imposed, or, you know, rushed up
against, because, you know, the sector in one way or
another is forcing allocation of resources, allocation of capital, demands

(01:23:51):
of growth, allocation of energy to satisfy it. So, you know,
at what point does it end, where, you know, we
have a shortage of power, now we have a
shortage of RAM, that seems to be in the short
and medium term fundamentally unfixable.

Speaker 2 (01:24:07):
It is, and it's the demands of eternal growth. It really,
it's just, because here's the thing we've talked about a
lot this week: we just want computers that work.

Speaker 1 (01:24:16):
Yeah, that isn't going to grow the company a guaranteed twenty
percent year over year, every single quarter. And it
sucks, because you should just be able to have a
company that grows three percent every quarter. Yeah, the stock
market should like that, but they don't. And as a result,
when things consume, they must consume eternally, to your point.

Speaker 4 (01:24:32):
Yeah, I mean the best example of this recently is
the instant pot, right, Oh yeah, everybody loves an instant pot.
It's too good. It's so good.

Speaker 2 (01:24:42):
Nobody needed to buy a new Instant Pot. Instant Pot,
why did it go out of business? Well, look, they got bought
by private equity.

Speaker 4 (01:24:47):
Too, but also they could they couldn't hit that mass
growth because instant pot too good. It just works.

Speaker 2 (01:24:54):
A pressure cooker is a pressure cooker.

Speaker 4 (01:24:55):
Yeah, and they were one of the first digital ones.
But it's still it works and it works.

Speaker 5 (01:24:59):
Well.

Speaker 4 (01:25:00):
Oh sorry, you couldn't grow fast enough, you're dead.

Speaker 2 (01:25:03):
Yeah. And it sucks because, I don't know, I
have this somewhat, I don't know, maybe this is
nihilist.

Speaker 3 (01:25:09):
Would they have died if private equity hadn't bought them?

Speaker 4 (01:25:12):
No, probably not.

Speaker 2 (01:25:13):
They were private, Yeah, they were. They were private.

Speaker 3 (01:25:15):
They could just keep chugging along.

Speaker 2 (01:25:16):
They could keep it.

Speaker 1 (01:25:18):
If you're a private company, you don't have shareholders. I
guess you have investors, but beyond that, no shareholders saying you have
to grow. You don't have fucking Jim Cramer up your asshole.
And it's, it's unfortunate. But I think that there is
a level of this, I, honestly, I'm glad you
brought this up. I think the thing that happens before
the AI bubble is just the ultimate slowdown, because,
what was it, like the fact that, like, it's kind

(01:25:40):
of insane that Dell did such a big song and
dance last year and then were like, actually, JK, I'm kidding.

Speaker 4 (01:25:49):
That was a hit where it hurts. Yeah, they directly
saw falling sales, and they could draw the line
to the rebranding. Because of that, they're like, oh shit,
what do we do? We got to refocus on consumers.

Speaker 2 (01:26:00):
Which is so bizarre.

Speaker 4 (01:26:02):
Though, if only people had loudly told them that this was
a mistake from the beginning.

Speaker 2 (01:26:06):
If only there were articles at engadget dot com that
they could have read in detail last year. But it's
like, that's, that's really, like, this is very bad, this
RAM crisis, this is bad. It's so, because it's
so scary.

Speaker 7 (01:26:24):
What then happens with, because I think something that
we've talked about also is, I think, like, you know,
kind of like what you were saying off the top, the
financial element of, uh, the AI bubble is one that
we can see for what it is, but there
are a multitude of ways in which they've been able
to juggle, find some new frontier of.

Speaker 2 (01:26:44):
Growth, justify and more.

Speaker 7 (01:26:47):
These structural issues that we keep coming across feel non-negotiable.
So what effect do you guys see them as
having, as being a risk of some sort of breaking point? Yes, in
terms of the AI infrastructure overbuild, in terms of adoption
amongst consumers. I mean, like, what, like, let's say,

(01:27:09):
for, let's say that nothing happens in terms of degradation
of the financial conditions, are the structural limitations that we're
seeing there with the RAM sufficient to impose, like, a crisis?

Speaker 3 (01:27:22):
Yes, is basically what I'm... You're saying it so much
more eloquently than what I'm, what I'm asking, but that's
basically exactly what

Speaker 2 (01:27:27):
I'm asking. Yeah, you were completely on the money.

Speaker 1 (01:27:32):
You were also, uh, here's the thing, and I think,
Devindra, your thoughts on this as well. You
can get more of that, money can be found elsewhere.
Singapore, the country's fund, I remember the name
for that, they invested, they're going to invest in Anthropic.
More money can be found. There's a limit, but we're

(01:27:52):
not at it. Yeah, there are limits to power. And
here's the thing, one point five fucking terabytes per, I'm
going to guess, V two hundred, eight e eight GP,
whatever, the tower. That's what.

Speaker 3 (01:28:03):
I was gonna guess.

Speaker 1 (01:28:04):
Of course. You're constantly, like, the GB300s, four
point five million dollars, how the hell do the
economics work?

Speaker 2 (01:28:11):
What do you mean, Ed? I
don't get this. People are saying that the cost of
inference is going down; it's going up. Let's talk about
high-bandwidth RAM.

Speaker 1 (01:28:22):
You're constantly in my texts with, yeah, you just won't
stop talking about models.

Speaker 2 (01:28:27):
No, but it's just real simple.

Speaker 3 (01:28:30):
Nvidia is.

Speaker 2 (01:28:30):
Saying we're gonna build the biggest, fucking hugest tower ever
and by the way, it's full of more RAM than ever.
They haven't announced the price of then you big enterprise GPUs.
I think they're going to be catastrophically more expensive because
they have more RAM than ever. It's like everything, including
the thing that needs debt, the borrowed money to keep going,

(01:28:53):
is going to get more expensive. Consumer devices are going to
get more expensive. To your point, Ed, it's like, yeah,
they've been able to get away with it because you
can find more venture capital, man, you can find more debt,
you can find more assholes to loan you money. You
can't find more RAM. You can't find more power; it's
not something you just, like, plug into the fucking wall.
There are literal limits. And it's like we're now, we're

(01:29:17):
now seeing the effects of that. I can't get over the
fact they raised the laptop's price five hundred bucks. That's
fucking, that makes it unaffordable for most people. That
is huge, and this stops people upgrading,
and upgrade cycles are what make these companies live and
die, at a time when there's less reason to. You can

(01:29:40):
get a laptop from, you know, even I could get, like,
an M3 MacBook Air and be pretty fucking happy.
I'm using a three-year-old MacBook Pro and it
fucking bangs. That shit, that shit's, that shit's flying.

Speaker 4 (01:29:53):
Computers that work.

Speaker 2 (01:29:54):
But the thing is, why would I upgrade? And that,
that sounds like a kitschy thing, but
that's actually a huge question, because I'm, like, a tech pig,
like, I'm a hog. I love my laptops and my gadgets,
my doodads and my gizmos and shit, and I'm like,
I don't fucking need a new laptop. Why would I, with
all of that going on?

Speaker 4 (01:30:12):
That is a good problem. Like, that is, usually it's
good for consumers. It's good for consumers, it's good,
you made a good product that lasts. Unfortunately, that
is bad for your overall revenues. But it's good that
you were capable of doing that. I wish more
companies were capable of doing that, honestly. I'm also thinking
about, like, where we are. The climate crisis isn't

(01:30:33):
slowing down?

Speaker 2 (01:30:34):
Right?

Speaker 4 (01:30:34):
Like, all these companies that for so long were
talking about, oh, we have all these climate goals,
we're going to reduce energy, we're going to be more conservative.
Once the AI shit started happening, they shut up, because all
their, all their stuff blew away, it didn't matter anymore,
because it was all AI all the time. And yeah,
what, what? The world right now is going to be
more expensive when it comes to gadgets. The world in

(01:30:56):
the future is going to be hard to live in
because of all the stupid decisions happening right now. And
I'm so angry about all of it.

Speaker 1 (01:31:02):
And we are now going to transition to our final
thirty minutes of this episode. This next one is brought
to you by gen Z dot Bears. Now, it's
just a store where you can just buy Zyn to
your door. I don't actually know what Zyn is.

Speaker 2 (01:31:16):
Please, if you're, if you're one of my three Zoomer listeners,
please email me what Zyn is and how you consume it,
and can you tell me what's cool as well? I
haven't known what that was for ten years. And we're

(01:31:41):
back in the room. We're back in the room.

Speaker 1 (01:31:42):
We've got Devindra Hardawar from Engadget. We've got
Edward Ongweso Junior of the Tech Bubble newsletter. We've
got Chloe Radcliffe, actress, star of Is This Thing On?
I got targeted on Instagram.

Speaker 2 (01:31:57):
What's great was, the clip I got was, like, Cradley
Booper being talked to. Sorry, I'm, we'll learn it.
We'll learn it.
We'll learn it.

Speaker 1 (01:32:03):
Thank you, christ and going look you I'm and stand
up and Cradlely Booper going what anyway?

Speaker 3 (01:32:09):
Stand up?

Speaker 2 (01:32:10):
Comedian Chloe Radcliffe. And yeah, we're back talking about RAM
and the slowdown. And I'm gonna be honest, I know
it seems bad now, and it will be bad. The
tech industry needs this. The tech industry needs to be
punished for the growth lust, because there is no, there's
no sustainable future doing this, and this was also inevitable,

(01:32:32):
and I think it's scary.

Speaker 7 (01:32:34):
So they're just, I mean, so then is the, you know,
if we, if ideally, what, it delays the products
that are the most RAM-intensive, to give infrastructure a
chance to fill a backlog and to give the next generation.

Speaker 4 (01:32:50):
There are rumors that they may revive old hardware. There's
a rumor that Nvidia's RTX thirty sixty, which is
a two, three year old GPU, may be back on the market
soon because it's using older, slower RAM too, so
it's not using the newest, fastest stuff. But a less powerful
GPU? Yeah, way less powerful. But it's also, like,
two generations back. It is a sign that you have
to bring back the past because the present is unaffordable

(01:33:12):
and you cannot make it. So, I don't know, that's
never happened before.

Speaker 2 (01:33:17):
Yeah, so we've never heard of that.

Speaker 4 (01:33:19):
We don't know if this is true, but this is
the rumor. I thought that was confirmed.

Speaker 3 (01:33:24):
So, okay, my responsibility as, my person, I
then think about how normal people experience this, because
normal people have heard AI a million times. We've heard
RAM is getting more expensive. But, like, I think I'm smart,
and I didn't know any of this shit. I've heard

(01:33:46):
these words, but I don't know the details. And so,
as a, as a representative of normal people: as stuff
gets more and more and more expensive, so far the
American population has just borne the load of more expensive life.
People have, people complain about it. It's very frustrating. It

(01:34:08):
negatively impacts people's physical health and their ability to enjoy life.
But, like, pretty much everybody has just sort of, God.

Speaker 4 (01:34:16):
The people who can afford it eat the cost, but
then we don't hear about the people who.

Speaker 2 (01:34:20):
But there's, there's also a limit. Sorry, finish your question.

Speaker 3 (01:34:23):
Yes, but I think the people who can't afford it
eat the cost too, I mean.

Speaker 4 (01:34:27):
Like in other ways. But here's the thing.

Speaker 2 (01:34:29):
Really, here's the thing.

Speaker 1 (01:34:31):
You're saying that, for now, there is a limit. And
what it is, is people are eating the cost inasmuch
as they can. At some point, there's a limit because
there are people who, like, Apple's AirPods Max, I
don't really think that's like a regular person thing,
it's like five hundred bucks, but those have RAM in them.
Modern headphones have RAM in them. Like, there are headphones that

(01:34:53):
have RAM in them. Even cheap laptops have RAM in them.
Everything has RAM in it. So yes, the cost of
everything already went up and never came down.

Speaker 2 (01:35:02):
There is a point when people would just stop buying
new electronics at all.

Speaker 3 (01:35:07):
And I have a thesis. Okay, so there's a point
where people will stop buying new electronics at all, and
then there is some further point at which people will
draw a line at inflationary prices overall, whether it's an electronic,
whether it's whatever. Because I'm sure that RAM shortages
are going to affect things that don't have RAM in them,
because the things that are used to ship those things

(01:35:29):
have RAM in them.

Speaker 5 (01:35:30):
You know.

Speaker 3 (01:35:30):
It's like, right, I can imagine that this impacts
prices overall. And what we have seen in trend lines
is that once prices go up, they stay up; they
don't come back down, for the most part.

Speaker 1 (01:35:41):
So, yeah, that might actually not happen with RAM, I think,
because, my real crazy scenario that no one
wants to think about is AMD and Nvidia buying all
this RAM. What the fuck do they do with it when no
one wants to buy their bullshit anymore? All of these
companies that have built massive allocations of RAM won't have any usage.

Speaker 3 (01:35:59):
Yeah, but other like consumer goods, I think once those
prices go up.

Speaker 1 (01:36:03):
The thing is, if no one's buying them, they
will have to, because what we're talking about here isn't
just, like, the top crust, it's everything.

Speaker 3 (01:36:10):
But so far people are still buying them. As we've said,
people are tightening belts. But I think a lot
of people are just spending way more money than they
would have been, and not saving, and, like, because, because.

Speaker 4 (01:36:25):
You're living paycheck to paycheck with no safety net, right,
But I.

Speaker 3 (01:36:29):
Think a lot of people are paying the higher prices
rather than not buying the items there.

Speaker 4 (01:36:36):
Are, yeah, to a certain degree. Like, sometimes you need it,
sometimes you need it. I do think we've got to
look at this culturally too, because people don't just sit
down and read business news and make logical business decisions,
as economists think we do. We respond to culture.
We respond to, like, that's how things are reacting. And
I think it is interesting that the youngs, the Gen
Zs and all, are looking back to pre-smartphone culture,

(01:37:00):
pre-social media, and how prevalent is this? I mean,
it's just like they grew up watching reruns of The Office, right?
And then there's a lot of things about this, they're
theorizing that this is a world they probably will
never actually see, just like having a simple office job,
that's like that. And there's a lot of, like, looking
back towards retro stuff. I think culturally we will, we

(01:37:21):
may regress a little. It's like going back to the
old video cameras, but, like, you regress back to, okay,
we gave too much of our lives to social media,
we will just take a step back.

Speaker 3 (01:37:30):
Okay. So this is, I was thinking about this on
the plane to CES as I was, like, sort of
collating some of my thoughts about tech and AI and stuff.
And I actually think I lay much more of the
sin at the feet of the constant entertainment feed
than I do at the feet of AI. I think

(01:37:50):
that social media and the constant input of entertainment has
numbed us and has, uh, reduced people's interest
in making statements or engaging with societal ills. Yeah, by,
you know, orders of magnitude. And so, to me,

(01:38:13):
I'm sort of like, yeah, AI is whatever bullshit it is,
but I actually think that the core of the social
ills is the constant drip feed of entertainment that just
turns your brain, that just dulls you.

Speaker 4 (01:38:24):
And a lot of people are trying to take a
step back from this information overload, and I think
we will probably see more.

Speaker 3 (01:38:31):
I think we will see more of that. But so
my thesis winds up being that I think the breaking
point for people, for people living in this inflationary environment,
I think that breaking point will be a lot higher
than it would be without social media. I think the
breaking point will take a lot longer than it would
without it. It feels bad, but you can just pick

(01:38:54):
something up and scroll through things that make your brain
go big.

Speaker 2 (01:38:57):
So I agree.

Speaker 1 (01:39:00):
My thing is, they forced AI into every social
networking app too. AI is everywhere, it's in your ear.
I realize I'm doing my same bit I always do.
But it's like, when that disappears, gets weird, it kind
of becomes obvious it was bullshit. That will filter into
the future. You're kind of already seeing it, you're
seeing random people being anti-AI.

Speaker 2 (01:39:19):
Now people I never.

Speaker 1 (01:39:21):
Talked about it, but, two separate things, but it creates
the kind of chaos that makes people be like, wait,
why do I need a new fucking laptop? Just
go on eBay and buy an HP Omen from four
years ago. Or I'll just buy an iPhone sixteen instead, when
what I have, or like, what you have, is fine. Like,
this iPad is over a year old and I will

(01:39:41):
probably use it for another three years or more.

Speaker 2 (01:39:44):
I don't know what they could possibly do.

Speaker 4 (01:39:45):
iPad Pro, by the way, a stunning example of a product
Apple spent years being like, why does this exist? A
super powerful iPad? Who will spend the money of a
laptop to just get an iPad? And they spent so
long figuring out what to do, like they had
no idea what to do with anything. That's endemic of,
like, another issue, like them trying to push into bigger

(01:40:06):
categories and more expensive products. But it's all kind of
part of the same thing, like, you got to push more,
you need more. The information overload: we were not, we
didn't evolve to process this much information. So I
think that has had a cultural impact on our brains.
COVID has, like, rewired people completely. You would think it
would push us to be more, you know, more supportive,

(01:40:30):
more social, supportive, like, try to come together, and
it just broke our brains in other ways. So it's
just like, I don't know.

Speaker 3 (01:40:36):
But I think that the reason it broke people's brains
was facilitated by the entertainment feed. And I want to
differentiate that from a social media platform where you can
go and you know, talk to your friends and family.

Speaker 4 (01:40:50):
Truly, to me, it is they all kind of ended
up becoming entertainment.

Speaker 3 (01:40:53):
This is, but it's why I'm specifically, it's why I
am using the word entertainment. Because, I think that,
how would you define that? A constant flow of entertainment that
is instantly and always accessed.

Speaker 2 (01:41:11):
And is this video, or is text involved in this?
Or is this only?

Speaker 3 (01:41:15):
I think video is so much more poisonous than
text. But I will say, I think text is, uh,
I think text does a very similar thing.

Speaker 1 (01:41:24):
No, but video is much easier to cycle through and
consume a bunch of information.

Speaker 3 (01:41:28):
Just, it's not even, it's not even just the cycling through.
It's that, like, I open Instagram. I opened Instagram this
morning to do a work thing, to, like, find a
specific producer and look at who she follows and be like,
do I know anybody she, like, it's like a,
it's a networking, it's a useful tool. And I opened
it, and it's some two hours later, truly, truly true.

(01:41:49):
And it's because, like, I see in the little carousel,
I see this little fifteen-second, like, a girl being like,
here's what my boyfriend expected on New Year's, and then
here's what he actually got.

Speaker 2 (01:42:00):
And it's like a very funny going around and like looking.

Speaker 3 (01:42:03):
But my brain, my brain is like I see, I
see things.

Speaker 2 (01:42:07):
Move, and that makes me like, I like, I'm so
fucking glad. I'm autistic. I'm so fucking glad.

Speaker 3 (01:42:13):
I see that.

Speaker 2 (01:42:13):
I'm like, I got something else to do. I'm doing,
I'm doing what I just it's so cool.

Speaker 3 (01:42:17):
I don't I see that.

Speaker 2 (01:42:18):
I'm like, yeah, trying to, the fuck, this happened to me.
I'm, like, I'm a pig, but I'm not that
kind of hog.

Speaker 5 (01:42:24):
Yeah, yea, yeah.

Speaker 4 (01:42:24):
Last night I was like, I had hours free for
once, and I was like, I could watch
a movie, I could do all this stuff. Sat down,
started looking at TikTok, and was like, you know what,
this is comfy. This is nice. And also because my
brain has been at, like, two hundred percent the entire time,
so it just needs to, like, cool down.

Speaker 3 (01:42:39):
But it is such a crutch, with, like, with,
like, the brain. Then, I'm gonna put on a movie
and cool down in front of the movie, which is
one bigger... Yeah.

Speaker 2 (01:42:49):
And I think the the pressure is just increasing on
people and will eventually pop. And I think the pressure
is increasing from all sides.

Speaker 3 (01:42:58):
I just think it's going to take a lot longer
now than it would have with the technology we have.

Speaker 2 (01:43:04):
I will say that.

Speaker 1 (01:43:07):
The problem is, first of all, the real world
was never meant to connect with finance this much. Most
people were never meant to know what HBM RAM was.
No one was meant to know what a GPU was
seven years ago. If you told someone what a
GPU was, they would call the police, like, they'd be like,
why are you saying strange words to me? You're a
witch, and no offense to witches. But it's just, now

(01:43:31):
everyone's kind of aware of this, and it's in their
head, and every bastard company is saying, AI, you must
do AI now, you will. You are stupid, you are
a pig and a moron. Banana. No, no, there's a
great video of a comedian, I don't remember.

Speaker 2 (01:43:42):
It's just, try banana, now now. And it's just like,
that's all I think of, this stuff is just
harassing people. I think you're right in that it's going
to take a little longer, because people are so addicted
to feeds, and the feeds are so intense it just
numbs them. But I think that there is a degree
of harassment from the AI stuff that means that when
this starts, the burst, it's going to be like fucking

(01:44:03):
Coruscant in Return of the Jedi. I think people
are going to be fucking excited to see this burn.
I think people are just so fucking angry, and the
fucking price of all electronics is going up at a time
when we're the angriest we've ever been at the computer.
I'm not saying, I don't know the order of events,
but I think this, I know this.

Speaker 4 (01:44:23):
The thing is, there's so much happening in the world,
right? Yeah, there's also the literal destruction of democracy right
in front of us. Right, there is war, we're being
led by warmongers, and also the economy is collapsing. All
this stuff is happening. I tend to look at,
like, what will be the broad societal ways, like,
we move through this, and the only way is, like,
we do have to, like, step away from the tech

(01:44:43):
a little bit. Yeah, it's also, like, I don't know,
doing those social things that we didn't used to do,
like we kind of stepped away from. So, I love libraries. Yeah, yeah,
libraries are great. Libraries, fucking, we let, we let
a lot of people really shit on libraries and
also strip away their resources and stuff. It is things

(01:45:04):
like that where you see the rest of your community,
you can help each other, you learn together. That's the
sort of thing I want to support moving forward, and
we need that.

Speaker 2 (01:45:13):
I will also say, like I said this yesterday on
the episode, it's like I am a rare thing of
like the internet has allowed me to become a person
rather than maybe.

Speaker 4 (01:45:21):
Less just one. That's like I was a socially anxious kid, yeah,
and technology and the internet did help me.

Speaker 1 (01:45:27):
Like, so yeah. And the thing is, I
think there's just a level of intentionality, where it's like
the feeds don't work on me. I get deep anxiety
when I have an infinite scroll. Like, it has the
opposite effect on me. I'm like,
why won't this end? I can't watch everything. I
don't want to look at it, because it's worrying me,
this is bothering me, which, literally everyone else is different.

(01:45:49):
It's so strange. But it's like, these tools also let
me talk to everyone in this room in an instant.
They let me put on an insane podcast for twenty
hours in a week and nobody stopped me. But there
are actual beautiful things that technology can do. And what's
great is you could use the same technology from five
fucking years ago. I'm not even saying put your phone
down and burn it. I'm saying use what you've got now,

(01:46:11):
or use what just happened. Computers, putting aside all the
growth capitalism stuff, computers are fucking great right now.

Speaker 4 (01:46:17):
These are great. People are going back to personal music players, just like a thing that plays music, you know what? That is beautiful. And the younger kids right now, the thing people are saying is, like, they don't know if they're trusting an image that's AI. That's an inherent distrust in AI, and calling it out as a cultural object. I think that's fascinating too. So we are seeing the signs that there is going to be some

(01:46:37):
sort of, like, pushback. I don't want there to be, like, the Battlestar Galactica thing, right, or the dude the.

Speaker 3 (01:46:45):
Edge you get.

Speaker 2 (01:46:48):
It's just we're gonna get just a guy who just like.

Speaker 5 (01:46:53):
You know.

Speaker 3 (01:46:53):
No, it's like, we don't have to go all the way.

Speaker 1 (01:46:55):
It's the tweet where the guy's like, what's the highest number? One million? But at the same time, my pushback... sorry, this is not a pushback, I'm agreeing with everything, but the way through this.

Speaker 2 (01:47:07):
Is to just be a little bit more fucking intentional
with it.

Speaker 1 (01:47:10):
I know, it's: don't look at this. You're gonna look at it. Don't hate yourself, but be aware that something's being done to you with it. And also remember, you can call any of your friends using these things. You can talk to any of your friends if you're feeling lonely. Install Signal.

Speaker 4 (01:47:25):
They brought the landline back. One of the interesting products of, like, the holiday season. I think it's more like uppity rich parents doing this, but there's a startup that made a handset and you can program, like, your friends to it. So your kid's friends, it's a handset, they pick it up, they dial their friends.

Speaker 3 (01:47:43):
Cord.

Speaker 4 (01:47:43):
I think there is a cord. I think, I think there may be a cord. It's a handset.

Speaker 3 (01:47:47):
It's like those phone numbers you'd memorize.

Speaker 4 (01:47:49):
Yeah, it's more like these are your trusted contacts. That's
all this thing can call. It's back to the days
where we were just like.

Speaker 3 (01:47:55):
Get plugged into your wall.

Speaker 2 (01:47:56):
Or is it? Because we can be quite cynical about CES, and my god, have we got another two hours today and four hours tomorrow and one hour on Saturday, so we will be. There's also a degree of, like, as much as we complain, I've got to hang out with my friends all week, and I got to do it, and I got to record using a weird RØDECaster

Speaker 1 (01:48:18):
Pro II, the best thing. It's amazing and we can record the audio. I have one of Mattosowski producing a beat on that.

Speaker 3 (01:48:24):
It looks like a.

Speaker 1 (01:48:27):
Soundboard as well. You can do all sorts of noises. But it's like, you can do really cool shit with the computer now and you can talk to your friends with it. But the reason that things are shit is the people that are making the interfaces, the social interfaces, are getting in the way. Understanding that and starting from that position is what makes things better for you.

(01:48:47):
They're not gonna make it better for you. They're fucking assholes.
Mark Zuckerberg is overseeing a fraudulent operation. Ten percent of Meta's revenue in twenty twenty four was from scams and fraud. That is a fucking reported thing from Jeff Horwitz of Reuters. Like, that should tell you everything.
It does not mean that the computer is inherently bad.
What they've done to the computer is bad, and what
we can do with the computer is the solution.

Speaker 2 (01:49:08):
You can pull.

Speaker 1 (01:49:09):
Together a bunch of people just using text and email. I fucking just did it this week. There are great things you can do. Text, text, text, go meet in real life. You can take fucking photos on your phone.

Speaker 2 (01:49:20):
Awesome. There are still good things to be done. I know this is so hokey, but I think it's right. It's a good point.

Speaker 4 (01:49:26):
It's not about being completely anti tech. It's about it's.

Speaker 1 (01:49:29):
About being anti tech industry at the moment. Because it's like, we like Pebble, because Pebble's like, what if a device had a use case? And everyone's like, fuck this shit, what do you mean you won't grow by twenty two percent year over year, every quarter, you piece of shit. Pebble's private, it doesn't matter. And it's just: buy used at the moment.

Speaker 2 (01:49:47):
That's my big, that's actually my Better Offline twenty twenty-six verb: buy refurb.

Speaker 1 (01:49:53):
Don't give these fucking companies another fucking dollar until they can prove that they can earn it, because that is the real problem. They want to grow forever. You want to grow forever? Make useful shit. But the actual internals of the computer... I also will address something that people email me occasionally, which is, oh, the computer industry used to be good, whatever. I don't know if it used to be good, but it used to be better.

(01:50:13):
But there were beautiful things in the computer.

Speaker 3 (01:50:16):
Yeah.

Speaker 4 (01:50:16):
Part of how we got here is that we didn't
ask cultural questions right. We didn't push back against certain elements.
Like I was there when social media was just starting out.
It was like, oh, yeah, look at this feed of information.

Speaker 2 (01:50:27):
It was so cool.

Speaker 4 (01:50:28):
But it was also, we knew Mark Zuckerberg was a piece of shit from the beginning. Most people did not, but there were enough stories about, oh, look at this kid who dropped out of Harvard and did this thing. But there were stories about what he did before, how, like, the shitty, sexist proto social network he made, who he was. Those stories were out there, and I watched this kid who didn't

(01:50:49):
finish college, but also had no real awareness of the world, be propped up by VCs and the entire industry as some sort of, like, boy king. And what he did was create a thing that just generated money, right? And the entire point of Facebook, you know, late two thousands to twenty tens, was just as much engagement as possible.

Speaker 6 (01:51:09):
It was.

Speaker 4 (01:51:09):
It was just engagement farming. And then everybody copied that, and that broke us, because they didn't want to create something good. They just wanted to create something that addicted us. It's like cigarettes.

Speaker 1 (01:51:19):
It's that, and the scourge of neoliberalism, the idea that everything, that the market is dominant, the market would never incentivize something bad. Yeah, and there was a lack.

Speaker 2 (01:51:30):
Of regulations in the twenty tens; that didn't help. You talk about killing baby Hitler, that's great. I think baby Milton Friedman and baby Ronald Reagan, put them on the top of the pile. We got all sorts of babies, a lot of babies. No.

Speaker 1 (01:51:43):
But the thing is, it's like the incentives are the problem, and the only thing we can do as consumers... I don't think it's stop using tech. I think that's an unrealistic thing.

Speaker 2 (01:51:52):
Like, Chloe, it's your work, like you have to use tech; especially as a standup, you kind of have to use social media.

Speaker 5 (01:51:58):
You have to.

Speaker 1 (01:51:59):
There's just no other way around it. Don't buy a new phone. I'm done buying new fucking shit.

Speaker 2 (01:52:04):
I'm I'm going there.

Speaker 3 (01:52:06):
You haven't seen how broken my phone is?

Speaker 2 (01:52:07):
No, but go on, Chloe, Chloe, go on, buy a refurb. I actually think, I actually think I am done buying new phones. I'm going to buy refurbs forever.

Speaker 4 (01:52:23):
I buy my parents refurbished phones every couple of years from Amazon. Perfect. Yep, it works great. The hardware lasts and then they're still like.

Speaker 2 (01:52:32):
Shit. No new shit unless it's cool, unless it's fucking new and cool and good. Yes, really though, if you can, don't buy new, buy used. Go on eBay, go on whatever, go on Craigslist, like, go and buy from a regular person, circulate that money in the real world, take it out of the fucking stock market casino.

Speaker 1 (01:52:55):
Okay, I'm going a little insane, but I stand by it, because all of the new prices are going to go up.

Speaker 4 (01:53:01):
Yeah? And when it comes to social media, I think we can be more intentional, right. It's like, for the longest time, people were making arguments like, okay, I'm gonna stay on X even though Elon Musk bought it and destroyed Twitter, I'm gonna stay on the site. Now it's freaking CSAM-generating garbage.

Speaker 2 (01:53:13):
Mae.

Speaker 4 (01:53:14):
It's like, I think we can start to make those moral choices, where it's like, career-wise I would be better off staying on X, probably, but you have to have some sort of, like, moral center, and I think we need to ask more of that of ourselves right now. So yeah, get off of X.

Speaker 2 (01:53:28):
I'm honestly like, I want to stay on there for
my fucking publisher and I might talk to them about it. Yeah,
It's just like it's a horrifying pedo generator.

Speaker 4 (01:53:36):
You don't want to be there, and literally nobody's doing anything about the pornography and the CSAM being generated on it, and that just exists. And they just got a bunch of new funding. They got rewarded for this.

Speaker 2 (01:53:46):
Well they did that so they could buy GPUs.

Speaker 5 (01:53:48):
Yes.

Speaker 1 (01:53:48):
Now, actually, I want to end with a fun idea. Here's a fun idea that I think everyone's gonna love. Here is how Elon Musk ruins his life using AI. So, what Elon Musk is doing: they just raised twenty billion dollars. Most of that is going to go to GPUs, AI.

Speaker 3 (01:54:02):
For Grok. And where did they raise that from?

Speaker 2 (01:54:04):
VCs. VCs... fuck, I forget, like, who exactly, which venture capitalists it is. It's mostly VCs and debt and Nvidia. So they're doing that to build more data centers. They're losing over a billion dollars a month just on the current thing. They're going to build more data centers, which will lose them even more money. Elon Musk's money mostly comes from leverage, which means just borrowing on his current

(01:54:24):
holdings. He's not particularly liquid, and he's added the most money-losing thing ever to a social network which already loses money. And r slash Grok on Reddit.

Speaker 1 (01:54:36):
Everyone there should be arrested. It's mostly just people generating horrifying things of women.

Speaker 2 (01:54:43):
Here's the thing. Every time he does that, it's probably five to ten bucks. It's happening millions of times a day. AI could actually lead to the destruction of someone like that. And people say, oh, Elon Musk always finds more money. There are actual limits, to Ed's point. There are actual limits.

Speaker 4 (01:54:59):
The thing I would say for people, Yes, he is
still he is the world's richest.

Speaker 2 (01:55:02):
He's finding ways to do it because he's the richest man on paper. Yeah, and he got so much. Like, the Twitter deal's thirteen billion dollars is fucking nothing compared to the bullshit data centers. I want everyone to just think about and laugh about the idea that this man is building big-ass data centers to just burn money.

Speaker 4 (01:55:21):
He keeps talking about Mars. Let him go to Mars. Let him be hoisted by his own petard. Yeah. If you've read the beginning of Red Mars, you know. It's totally great. There's some ideas there.

Speaker 2 (01:55:34):
Yeah.

Speaker 7 (01:55:35):
Then that first prologue about what might happen to Elon Musk.

Speaker 3 (01:55:39):
Yeah, that's so good.

Speaker 1 (01:55:41):
Yeah. And I realize that it's tough, and I know we've come to the end of this block. I know we've been quite cynical. This is so hokey, and I do this every CES. The best thing you can do when you hear this stuff, when you're like, I'm upset with this, is love your friends harder. I'm telling you, I love everyone in this room, genuinely, so happy to have everyone here. But it's like, that little thing that you think is so immaterial will save lives,

(01:56:03):
will make people's lives happier. Tell people you love their shit. Go online, use social media to say: I think, Chloe, you're insanely funny, and I've watched you only get funnier. Devindra,

Speaker 2 (01:56:11):
I've followed your work for like a decade.

Speaker 1 (01:56:13):
Edward Ongweso Junior, an insanely talented man, and I love reading your work and I love having you on the mic. These things are actually very easy to do, and easier because of digital. If you're listening to this, go and fucking do that too. Literally anyone you know, family member, friend, creator: tell them you love them, love their shit, and why. That is actually something that digital communication allows you to do today that

(01:56:33):
will make the world better. Buy used, buy used. Now, if you want a new thing, buy a used one, it's probably fine. That is the only way we begin fixing things. And these moral choices, like Devindra mentioned.

Speaker 7 (01:56:45):
Or if I want a ChatGPT wrapper to tell me what color my shit is, and if that is gonna be indicative of a health problem.

Speaker 2 (01:56:52):
You could just use Microsoft Copilot, which will be in
a Windows laptop from a.

Speaker 7 (01:56:55):
Year Microsoft three sixty five. Could hear the laptop around
and take a photo and then also at the same time,
look at the chap.

Speaker 3 (01:57:07):
You got this.

Speaker 4 (01:57:08):
We're going to see some cool stuff with these local LLMs. We're gonna go back to punk tech, right, people building their own cool little machines that make cool little things. We're going to be back there, their own little gadgets powered by punk tech. I want to see more of that.

Speaker 1 (01:57:23):
And the thing is, I know it's fucking grim out there, and it really is. One of the problems in society is that we're hyper-connected and full of technology. One of the cool things is that we are as well, and there is a ton of tech that exists today where you don't need to worry about the future.

Speaker 2 (01:57:38):
That will do what you want to do. By us,
I just tell your friends and family you love them,
to creators, you love their work, share their ship.

Speaker 3 (01:57:45):
Give me a little compliment.

Speaker 2 (01:57:48):
Give friends little compliments. You can do that in seconds, and it doesn't matter where they are in the world, as long as they've got an internet connection. It's so fucking hokey, but I think it's easy to miss. And as we come to the end here, the end of this block, it's worth saying that. And you know what, I only have one plug each. Devindra, what are you working on at the moment? What do you want

(01:58:08):
people to read?

Speaker 4 (01:58:09):
Engadget, where I'm doing this stuff. Check out our CES coverage. I also do a movie podcast, the Filmcast, at thefilmcast dot com. Can't wait to see your movie, Chloe, but maybe don't listen to my review. I don't know. I don't know.

Speaker 2 (01:58:20):
Edward Ongweso Junior, what are you excited about writing? What are you writing at the moment?

Speaker 7 (01:58:27):
Working on this very weird essay for prim magazine?

Speaker 2 (01:58:31):
Tell me.

Speaker 7 (01:58:34):
It's, uh, I mean, yeah. I've pitched the same idea... you know, I'm not supposed to do this. I pitched the same idea to two places, but I know one is gonna actually say yes, and the other one... it's, uh. One of my favorite speeches of MLK's is a letter to Christians. It's from the perspective of Paul writing to American Christians, warning and kind of

(01:58:55):
like sussing out the society they have and warning about it. But I wanted to invert it, so I wrote a letter from a future where the fascists won, and it's a letter from the fascists who won to the fascists now, as a way to talk in reverse about stuff that's

Speaker 6 (01:59:11):
Going on now.

Speaker 3 (01:59:12):
That sounds yeah, I don't.

Speaker 7 (01:59:15):
I don't like researching it.

Speaker 1 (01:59:17):
But.

Speaker 4 (01:59:19):
It's been interesting.

Speaker 7 (01:59:20):
It's a fun experiment. Yeah, so stuff like that. I've been trying to do a lot more fiction experiments because I've been doing more readings lately. So yeah, I'm excited for that.

Speaker 2 (01:59:27):
Fucking great. Chloe, where are you playing? Wait.

Speaker 6 (01:59:30):
Then?

Speaker 3 (01:59:30):
I'm doing Cincinnati this weekend, the ninth and tenth or tenth and eleventh, whatever the Friday-Saturday is. Then I'll be in Washington, D.C. the next weekend, MLK weekend, Thursday, Friday, Saturday, and then that following week, Tuesday, Wednesday, Thursday. Not a weekend. I'm doing my solo show Cheat in Philadelphia.

(01:59:52):
Come on out to that. After that, Vermont, Fort Collins, yada yada yada, on the road, Fargo. And I.

Speaker 1 (01:59:59):
Have been putting a link to getting a permanent discount
to my newsletter. And in these things I've not done
my paid newsletter. It's a chunk of my income. Please
fucking God, subscribe to my paid newsletter. I really if
I can in the years time, if I can double
from here, I can just do this.

Speaker 2 (02:00:14):
Yes, really happy?

Speaker 7 (02:00:15):
Also subscribe to mine. Yeah, we'll figure out a special we can do, a.

Speaker 2 (02:00:20):
Better Offline one. Yeah, really though, if you're gonna put your money into anything.

Speaker 1 (02:00:24):
Sean-Paul Adams, who was a friend of the show, friend of the suite, he sadly passed last year. His son is epileptic, so we are honoring him by getting you to donate to the Pediatric Epilepsy Research Consortium. His family and friends would deeply appreciate you doing so, and so would I. Thank you so much. We'll be back for two hours in a few hours. This has been incredible. I love doing this show.

Speaker 2 (02:00:44):
I love you all. Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Mattosowski. You can check out more of his music and audio projects at Mattosowski dot com, M A T

(02:01:04):
T O S O W S K I dot com. You can email me at ez at Better Offline dot com, or visit Better Offline dot com to find more podcast links and, of course, my newsletter. I also really recommend you go to chat dot Where's Your Ed dot at to visit the Discord, and go to r slash Better Offline to check out our subreddit. Thank you so much

(02:01:24):
for listening.

Speaker 3 (02:01:26):
Better Offline is a production of Cool Zone Media.

Speaker 2 (02:01:28):
For more from Cool Zone Media, visit our website coolzonemedia dot com, or check us out on the iHeartRadio app,

Speaker 3 (02:01:35):
Apple Podcasts, or wherever you get your podcasts.