
July 10, 2025 · 34 mins

Senior White House reporter for The Washington Post Isaac Arnsdorf details his new book 2024: How Trump Retook the White House and the Democrats Lost America. Adam Becker examines his book More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hi, I'm Molly Jong-Fast and this is Fast Politics,
where we discuss the top political headlines with some of
today's best minds. We're on vacation, but that doesn't mean

Speaker 2 (00:11):
We don't have a great show for you today.

Speaker 1 (00:13):
Adam Becker stops by to talk about his new book
More Everything Forever, AI Overlords, Space Empires, and Silicon Valley's
crusade to control the fate of humanity. But first we
have a senior White House reporter for The Washington Post,
Isaac Arnsdorf, on his new book twenty twenty four, How

(00:36):
Trump Retook the White House and the Democrats Lost America.

Speaker 2 (00:40):
Welcome to Fast Politics, Isaac. Thank you so much for
having me, Molly.

Speaker 1 (00:44):
So the book is called twenty twenty four: How Trump
Retook the White House and the Democrats Lost America. I'm
going to ask you a Passover question. How is this
book different from all other books?

Speaker 2 (00:58):
This is really the defining, definitive inside account of
the election that we just had from start to end.
Some books have covered just one campaign or the other.
Some have started just kind of with the debate and
in the final stretch. But this really goes back
to the beginning in terms of for both Trump and

(01:22):
Biden the decision to run again, which was not
inevitable for either of them, or that both parties would let that happen, right,
that no one would be able to get their respective
parties to move on from those septuagenarians, and then seeing

(01:43):
that story develop throughout both primaries, through the debate, through
the Democrats' switch, and then finally with the result that
we all saw in November.

Speaker 1 (01:54):
Okay, so let's keep going with this. What made Biden lose?
If you could give me sort of two reasons why
Biden lost.

Speaker 2 (02:02):
The first answer is the most obvious one, which is age.
Most Americans thought he was too old to do the job,
and they couldn't get past that. The mistake that his
advisors made was viewing that as a perception problem rather
than a reality problem, right. The case that they were
making was, well, he can do this job as president.

(02:25):
He doesn't communicate that well, he doesn't campaign that well,
he doesn't debate as well as he used to, remember,
that was the line. But when he's in the situation room,
when he's in the oval office, he can do the job.
And what that argument really fatally overlooks is that communicating
is the number one job of the president. If you
cannot communicate about your agenda and your accomplishments and your leadership,

(02:50):
then you cannot do the job. There is no difference
between doing the job and being able to communicate about
the job. The second factor for Biden is that the
Democrats were not able to. They acknowledged during the campaign that
they needed to come up with a winning economic argument,

(03:10):
and they never got there. They never got there with
Biden and they never got there with Harris. There's actually
some great reporting in the book. During Harris's debate prep
and I asked about Biden.

Speaker 1 (03:20):
I just want to square the circle
with Biden for a minute. I was told by Jake
Tapper that Biden ran a cover up about his age.
So how did Biden run a cover up about his age?
And also the American people didn't vote for him because
he was too old? Is it, as Jake Tapper tells
us again and again and again, a cover up? Or

(03:42):
was it that people thought he was too old? Are
you on team cover up? I'm just curious.

Speaker 2 (03:48):
Our reporting is that Biden and his staff were completely
aware of his limitations, completely aware of the political problems
that those limitations posed. They thought that they could muscle
through them. They thought that they could choreograph events to

(04:09):
downplay them. And they thought that when push came to
shove and Americans were forced to choose between that and
Trump, they'd choose Biden. And they were wrong.

Speaker 1 (04:19):
But the polling suggests that there.

Speaker 2 (04:22):
Was not a cover up, because it sounds like
Americans knew exactly what his administration knew. So is that
a cover up? To me, a cover
up means a secret that was hidden, and Biden's age
was not a secret that was hidden. I'm just curious.

Speaker 1 (04:39):
Look, I love Jake Tapper and I have so much
respect for how many books he sold.

Speaker 2 (04:45):
So talk to me about Harris.

Speaker 1 (04:47):
So Harris gets one hundred days in, Biden finally drops out,
Harris gets to run for president, but she doesn't get to
run for president as if she were a new candidate.

Speaker 2 (05:01):
So explain to us that. Yeah, that's exactly right. That
was one of the things that I
learned from doing the book, how much Harris, you know,
didn't do everything that she could to escape Biden's shadow, right.
You know, she was repeatedly given the option to distance
herself and she wouldn't take it. Now, based on our reporting,
some of that is honestly her position, right, Like, she

(05:25):
was part of his administration and she didn't feel like
she could honestly run from that. The other factor is
Biden made that harder for her because he actually came
to her, even on the day of the debate,
and said, don't distance yourself from me.
It would be bad for you. But right, no daylight. Right,
that's the famous no daylight. That particular word choice is

(05:46):
not the phrasing that we have in our reporting, but
there was a conversation where he told her that it
would be politically bad for her to distance herself. Do
you agree with that? Well, from our reporting, right, I mean,
not based on what the voters wanted to see, and
that certainly wasn't the Republican sense of it.

(06:07):
And in hindsight or even at the time, a lot
of her advisors admitted that the way that she bungled
that question on The View was the biggest mistake of
the campaign. You know, rivaled only by the trans ad,
but the trans ad was something that was from years earlier,
whereas that was a mistake that was made during the campaign.
But even beyond that, a lot of people were obviously

(06:29):
critical of the campaign that Harris ran in twenty nineteen
for the twenty twenty primary, and a lot of people
had a low view of her potential or had concerns
about her potential as a candidate. But the reality is,
you know, whether she could have run a better campaign
for herself or not in twenty twenty four, she really
never got the chance. I mean, basically it was with

(06:51):
one hundred days out, there was no time to do
any of the kind of strategic and tactical thinking that
you have to do to run a campaign that actually
fits the candidate, where the messenger and the message worked together.
There was just no time to do that. They just
moved forward with the Biden campaign as it was,

(07:13):
including with a lot of the same advisors who were
Biden people who had come up with the policies that
the Biden administration was struggling to defend. So, you know,
how is she supposed to? There just
wasn't a chance for her to do a
complete reboot. She never got the chance to run her

(07:34):
own campaign and see if she could have gotten farther
on her own.

Speaker 1 (07:38):
I wonder if you could talk about this sort of
consultant class and where they got involved. Like, you
certainly saw moments, like I saw Tim Walz, and maybe
because I also am friendly with him, I feel a
little responsible for this.

Speaker 2 (07:56):
On some level.

Speaker 1 (07:57):
But like Tim Walz, whenever I interviewed him, I thought,
this guy has really got a little bit of political magic, right,
Like when he speaks, he's pretty good. He's very likable,
very midwestern. Like, I just saw there were a lot
of elements about him that I thought were really good,
and he had a number of really good speeches. And

(08:19):
then something happened when I was watching him and he
stopped and they didn't let him out. And there was
that with her too.

Speaker 2 (08:27):
She sort of had a lot.

Speaker 1 (08:28):
Of fire, and then there was like a month in
the middle where you just didn't see her, And I
wonder if you could talk about that and what that
was and what you sort of saw that to be.

Speaker 2 (08:39):
So that month you're talking about where she didn't do
any interviews, that's another mistake that the team will acknowledge
in hindsight privately. Whose idea was that and why? Well,
the thinking was, you know, for Walz, that he couldn't
do an interview until she did an interview, and with her,
they wanted to wait until the convention to introduce her.

(09:00):
They wanted to be able to define that
story for themselves. Harris also, that.

Speaker 1 (09:05):
Was it, by the way, a huge mistake. I think
we can all agree.

Speaker 2 (09:08):
Well, the Trump campaign certainly used that time to get
out ahead to start this storyline that she was hiding,
and that contributed to this sense that you were hearing
from voters that they didn't know who she was. That
was valuable time that the Trump campaign
was able to capitalize on instead of the Harris campaign.
The other reality that we learned in our reporting is

(09:29):
that Harris personally, this kind of comes from her prosecutorial training.
She requires a lot of prep. Yeah, I've heard that's
how she works. And so the amount of time that
it took her away from the trail to prepare for
every one of those interviews, you know, there was a
trade off there also, and at first that wasn't a
trade off that they were as willing to make. And

(09:50):
it was later on that they
decided either that they were going to take that time
away to do the prep or that she was going
to start doing interviews with less prep.

Speaker 1 (10:00):
Right, yeah, yeah, yeah. What I had heard was that she
didn't really like doing interviews, that she was much better
at debating, which, I mean, she's a really
good debater. I thought she was a really good debater,
and that she took a long time to prep. That
she really wanted to prep like a prosecutor, which is
totally insane because you can't prep for an interview. You

(10:23):
just have to go in there and talk, you know.
I mean, I think anyway, and certainly Donald Trump has
never prepped for anything in his life. What do you
think? Let's talk about the Trump campaign. It
strikes me that they really just put him in the
culture and took the idea that politics is downstream of
culture and just got him in everywhere. My sense is

(10:44):
that he just did everything and it worked. But was
there some method to the madness?

Speaker 2 (10:50):
Yes, that was one of the specific
things that they set out to do. They called it
making him a cultural icon instead of a political figure.
And you know, that came very naturally to Trump. You know,
that's who he was for most of his life. He
was a celebrity. He was not a politician. And you
could see this as early on as the primary. You know,
he went to a frat house for a pregame before

(11:13):
a football game in Iowa. You know, do you know
a lot of seventy seven or seventy eight year old
politicians who could just show up at a frat house
and flip burgers. I mean, those kinds of settings really
played to his strengths based on the kind of personality
that he was for most of his life, and that
was actually an opportunity for younger voters, younger audiences

(11:37):
to introduce him to them. You know, they didn't know that, they
don't remember Trump before he was a politician, and so
that was a really powerful way to reintroduce him.

Speaker 1 (11:45):
You know what's funny is that I actually think that
a lot of politicians could do that if they let
them do that. But Democratic politics, for whatever reason, and
I'd love you to like square this circle, because you
guys just wrote this book about the two campaigns. It
strikes me that the level of carefulness in the Biden Harris,

(12:06):
you know, like keep them cloistered, keep them safe, keep
them from speaking, is the largest sort of fuck up
because, like, had you sent... here's, I'm gonna give you,
this is my own hobby horse, but I'm gonna give you,
I'm torturing you, I know, I'm sorry, first the
Tapper question, now this, but it's my podcast.

(12:27):
So it strikes me that if you had let
Tim Walz or Harris do the number of interviews, and again,
maybe they weren't, Theo Von might not have wanted to
have Harris on, and Joe Rogan may have. You know,
I've heard conflicting stories about what happened with the Joe
Rogan thing, whether he wanted her, whether that was too

(12:47):
far away, whether I mean, who knows, somebody told me that.

Speaker 2 (12:52):
Yeah, so tell me, tell me, tell me.
So Walz had a lot more success
getting on podcasts than Harris, is all.

Speaker 1 (13:00):
Did they not want her?

Speaker 2 (13:01):
Yes, they got a lot of no's. A
lot of non-political podcasts that have left
leaning audiences said no because they didn't want to bring
politics into the space. Like who? The Travis and Jason
Kelce podcast said no. The Bill Simmons podcast said no.

(13:24):
Oh wow, they couldn't get on Hot Ones either, Wow,
the Hot Sauce show, Hot Ones show. Yeah, I'm not
a podcast person. And then, you know, with Rogan,
it was a lot of back and forth, and Rogan
wanted her to come do it in Texas, and so

(13:45):
they set up that rally in Texas. But they
didn't really give
Rogan's team a heads up that they were doing that.
So then by the time they scheduled that, it was
almost too late. And finally Rogan offered the campaign a
time and they didn't take it. So it's

(14:06):
kind of a missed connection. But yeah, it's a.

Speaker 1 (14:09):
Fuck up on their part. The Subway Takes guy said he
thought she was so bad they killed it. They only
aired Walz. The Subway Takes guys, that's a podcast too.
So wow. Okay, yeah, so.

Speaker 2 (14:23):
You know, you've heard Democrats talk about this
asymmetry in the manosphere or the podcast world, right,
that, you know, those right wing
so-called manosphere podcasts were really happy
to bring Trump on and to get political in ways
that either there aren't equivalents to on the left

(14:45):
or the closest things to equivalents on the left
weren't willing to do.

Speaker 1 (14:50):
This is so crazy. I had no idea. At
some point, did they know they were going to lose
in Harris world, and when?

Speaker 2 (14:57):
Our reporting is that their internal tracking polls never
showed her ahead. They showed it very close in the
swing states up until the end. And she personally, you know,
she didn't write a concession speech. She was not expecting
to lose. It did not feel to Harris and Walz,
up through the last night, it did not feel like

(15:18):
a losing campaign. And that has to do with why
you didn't see her on election night.

Speaker 1 (15:23):
Wow, I would love you to talk about something that
I have written about, have seen, and am
really interested in, which is this idea, and
John mcgrache wrote a really good piece about this, that
Trump voters wouldn't believe that Trump was going to do
the things he said he would do on the campaign,

(15:45):
that there was a suspension of disbelief among the voters.
Can you sort of talk to me about if you
saw reporting that spoke to this.

Speaker 2 (15:53):
I think you have to distinguish between different parts of
the Trump coalition. You know, if you're going to a
Trump rally and talking to diehards, they absolutely take him
literally and they're absolutely thrilled about that. But you know, abortion,
you know, there are other issues and other, you know,

(16:16):
more persuadable voters that the Trump campaign was able
to convert. And, you know, we could look at
abortion as a good example of this, where, you know,
the Democrats would attack him as being responsible
for overturning Roe v. Wade and wanting to impose a

(16:37):
national abortion ban. And Trump came out with this position
of leave it to the states, and
when they presented that to these persuadable voters they were
targeting, those voters liked that position. So there was,
you know, there was an ability for voters to compartmentalize
with Trump. I think, you know, that

(16:57):
they could vote for him because of immigration, or they
could vote for him because of the economy, and pay
less attention to the rougher edges that
they weren't as focused on. So interesting.

Speaker 1 (17:13):
Thank you so much for joining us, Isaac.

Speaker 2 (17:16):
That flew by, Thank you so much for having me.

Speaker 1 (17:22):
Adam Becker is the author of More Everything Forever, AI Overlords,
Space Empires, and Silicon Valley's Crusade to Control the Fate
of Humanity.

Speaker 2 (17:32):
Welcome to Fast Politics, Adam.

Speaker 3 (17:34):
Thanks for having me, Molly, it's great to be here.

Speaker 2 (17:37):
So you write about AI. It's great and we're not
going to have any problems.

Speaker 3 (17:44):
Yeah, that's exactly the message of my book, that's right.

Speaker 2 (17:47):
Yeah. Yeah.

Speaker 3 (17:48):
The book is called More Everything Forever because that's something
that we can have and it's a totally reasonable dream.

Speaker 2 (17:54):
Yeah, So talk us through this book.

Speaker 3 (17:56):
So the basic idea is there's horrible and unrealistic visions
of the future that tech billionaires are taking very very
seriously as a roadmap for what we as a species
should do, or more properly, where they should steer us
with their power to control the rest of us. And

(18:19):
these ideas generally involve going to space and living forever
under the watchful eye of an AI god.

Speaker 2 (18:26):
Why do we have to go to space?

Speaker 3 (18:29):
Yeah, it's a really good question. There's essentially no reason
for it.

Speaker 1 (18:33):
Okay, I mean, I like Europe, I like Los Angeles,
I like Chicago, But I don't want to go to
space particularly. I mean, I just don't even get it. Like,
am I stupid? Is that what's happening here?

Speaker 3 (18:45):
No, you're not stupid. They want to go to space
because that's where they think the future is. Because that's
the future. Well, because that's what science fiction told them,
and they've confused science fiction.

Speaker 2 (18:54):
For you, that makes sense, okay.

Speaker 3 (18:56):
Yeah, And so you know, Musk says that we need
to get off of Earth in order to save this
species and preserve the light of consciousness. And so he says,
you know, he wants to put a million people on
Mars by twenty fifty and have that be a completely
self sufficient colony. That's not going to happen. There's just

(19:16):
no way it's going to happen. Mars is a terrible place.
The gravity is too low, the radiation's too high, there's
no air, and the soil is made of poison. It's
a terrible place. Yeah, and Musk wants to put a
million people there.

Speaker 2 (19:29):
Can he be one of the million people?

Speaker 3 (19:31):
I mean, I would like him to be the first one.

Speaker 1 (19:33):
You know, remember when Elon was like, I'm going to
dig a tunnel between New York and DC and it's
going to take seven minutes to get there. And I
was like, yes, my man, because I would take a
tunnel that takes seven minutes to get from New York
to DC.

Speaker 2 (19:47):
What happened with that?

Speaker 3 (19:48):
Yeah, so the Hyperloop. He later essentially admitted that this
was an idea that he put out there because he
wanted to kill the high speed rail project in California.

Speaker 1 (20:00):
Let the record show, I'm giving you an ironic thumbs up.

Speaker 3 (20:03):
Yeah, and he doesn't like public infrastructure, that's why he's
destroying it.

Speaker 1 (20:11):
Oh good, good, just checking. Why doesn't he like public infrastructure.

Speaker 3 (20:15):
Because it's harder to monetize and harder to control.

Speaker 2 (20:19):
Yeah, that's good, just checking. Yeah, it makes me feel
much better.

Speaker 3 (20:23):
I'm glad.

Speaker 2 (20:24):
Yeah.

Speaker 1 (20:24):
Basically this is all the fault of science fiction writers
in the seventies, like my father.

Speaker 3 (20:29):
Uh, sorry, Dad. Yeah, it's all you... a little bit. Yeah, yeah,
I mean I'm a huge science fiction fan. I love
science fiction. I grew up reading enormous amounts of old
science fiction, the same stuff that these tech billionaires read.
I think that sure, part of it is the fault
of the science fiction itself, but a lot of it
is also bad readings of that science fiction, like misunderstanding

(20:51):
it really badly.

Speaker 2 (20:52):
Say more on that.

Speaker 3 (20:53):
So, for example, Peter Thiel, everybody's favorite person.

Speaker 2 (20:57):
Yeah, we love him. Yeah, he's a very
sweaty guy. Have you ever seen him?

Speaker 1 (21:02):
Like?

Speaker 2 (21:03):
Very sweaty? Go on? Hopefully you won't sue me for that.

Speaker 3 (21:07):
Yeah. You never know with Peter Thiel. No.

Speaker 2 (21:09):
You absolutely don't. We actually don't. For a long time,
we wouldn't say his name on this podcast.

Speaker 1 (21:13):
We would just say that guy who sues a lot
of people, like Devin Nunes. All right, go on, go on.

Speaker 3 (21:18):
Yeah, when Devin Nunes sued his fake cow. Love that.
Yeah. No. Thiel said in an interview with Maureen Dowd
that he preferred Star Wars to Star Trek because he
thought that Star Wars was capitalist and that the plot
of Star Wars was driven entirely by Han Solo owing
money to Jabba the Hutt. This is not what

(21:40):
Star Wars is about.

Speaker 2 (21:41):
Speak for yourself, I mean.

Speaker 3 (21:44):
George Lucas said that he based the rebels in Star
Wars on the Viet Cong, right, and Thiel is saying
that that's a capitalist story, so what, is he identifying
with the Empire? He's also said that, you know, we
should base our plans for things on science fiction of
the fifties and sixties, and that the message of that
science fiction is that we should develop space, develop the oceans,

(22:06):
develop the deserts. I don't know. I read Dune a
couple of times. I don't think the message of Dune
is develop the deserts any more than the message of
Star Wars is build a Death Star and blow up planets.
That's awesome. And Thiel is not the only one, right,
Musk has a long history of publicly misunderstanding science fiction,
and so do sort of the rest of these guys.

Speaker 1 (22:25):
Yes, I have to tell you that the idea that
they're reading science fiction wrong, it reminds me of Doge.

Speaker 2 (22:34):
Right, so we have Doge.

Speaker 1 (22:35):
They're cutting government contracts, and it turns out, you know,
they're like reading the numbers wrong. They're publishing billion instead
of million. Right, they said they saved eight billion when it was eight million. You
know that they're just moving so fast and making a
lot of mistakes. Is that what's happening here? Or is
it this sort of intentional misreading in order to I mean,

(22:56):
like Peter Thiel, say what you will, is actually quite
smart and has done a lot of things that have
helped him, right, like suing Gawker. Again, I don't like
it for any number of reasons, but it was like
money that got him huge returns, right, it crushed Gawker.
It was like the beginning of the foray into destroying
the mainstream media. So he is very good at what

(23:19):
he does, whatever that is, I mean, in some ways.
But my thinking with him is that he's always just
so profoundly ideological that he can't see the forest for
the trees.

Speaker 3 (23:28):
Yeah, I think that's probably right. I think that these
readings of science fiction, these misunderstandings of science fiction, are
appealing to them because they allow these guys to believe
stuff that they already wanted to believe and do stuff
that they already wanted to do. Right, It's very very
easy to hang on to a misunderstanding if it's one

(23:49):
that justifies everything that you want to do and makes
you a lot of money. And I also think that
there's a reading comprehension problem. So I guess the answer
to your question is sort of both. I mean, I
think that for a lot of these guys, they don't
think clearly anymore. They have their own self interest and
these dreams that they have that are wildly unrealistic, and

(24:11):
they just sort of aim toward those and anything else
that they think of. It's just very easy for them
to fool themselves into thinking that they're being profound, that
they've understood what's going on around them, as opposed to
living inside of a bubble of their own creation. And
I sort of wonder if this is one of the
reasons why they're so easily taken in by the idea that,

(24:35):
you know, text generation engines, chatbots like ChatGPT, are
actually thinking because they just string words together in a
way that sounds good to them, and that's what they've
confused for thought at this point, because they don't need
to think carefully in order to make their way through
the world anymore.

Speaker 2 (24:51):
I can't tell if I'm being stupid.

Speaker 1 (24:53):
I'm very old, and I've been around long enough to
see the blockchain. Everything's going to go on the blockchain.
Here's why things are going to go on the blockchain.
All of these sorts of fads, right? Crypto the first
time. Producer Jesse, the Voice of God, and I
have an ongoing dispute over, like, is AI going to
take all our jobs and ruin our lives or is

(25:14):
AI just going to go on the blockchain like everything
else that's trendy? That thing you just said about
ChatGPT makes me think that AI is not, that stringing
words together is not. But do you think AI is
going to take all our jobs and render us useless
flesh puppets?

Speaker 3 (25:32):
Yeah? I don't think that AI can do the jobs
that we do now particularly well. And I don't think
that AI can replace humans the way that people who
are saying that it will take all our jobs are
sort of worried about or hopeful for, depending on where
they are in the economy. But I also think that

(25:52):
that may not stop it from taking a lot of jobs,
because the decision about whether or not you replace humans
with AIs is not based on reality. It's based on
the perception of reality up in the executive suite. And
so if there's a mass delusion or a mass you know,
hysteria about losing the AI race or this thing can

(26:14):
really do what all of the you know, people who
work for me do, then the executives are going to
try firing everybody and replacing them with AI and see
if that works. And you know, they're already trying that
and it's not working very well. You know, there's
study after study showing that when companies try to use
AI to replace humans, it doesn't go very well. And

(26:37):
I think we're going to see more of that and
hopefully that leads to a world where AI is, you know,
a tool that is used for certain applications that it's
you know, okay at or even good at, and you know,
it doesn't end up permanently taking all of our jobs.
But it's also being used as an excuse for these

(26:58):
companies to do what they already wanted to do, right,
just like the space stuff, And that's part of why
the idea of this AI is so appealing. The companies
don't much like their workers, right, I'm shocked. Yeah, they
don't want to have to keep paying their workers. They
don't want to have to deal with people who could
perhaps I don't know, organize and form a union and

(27:22):
start actually flexing their power. And so ideally what they
would like is to cut their workers out of the loop.
The AI can't do that, but they'd like to believe it can.

Speaker 1 (27:30):
Right, and that makes a lot of sense. These
tech broligarchs, billionaires, whatever. Some of them are cowering,
very fearful, like Mark Zuckerberg. Some of them are embracing
Trumpism. I wonder if you could talk to us
about, like, we're already seeing, I think, some of a

(27:53):
split between Musk and Trump. I think it's going to
be more or maybe not. So what do you think
is happening there?

Speaker 3 (28:00):
I'm not sure what's happening between Musk and Trump specifically. I
think that Musk likes the idea of having that kind
of influence and power over the president, and Trump likes
that there's somebody sort of capering and cowering and demeaning
himself publicly, and also someone who can take the flak

(28:22):
for the things that Trump and his people want to do.
So as long as that relationship remains useful to both
of them, I think it's going to go on. We
may be seeing the end of its usefulness. Musk tried
to buy the Wisconsin judicial election for Trump and it
didn't work, And there's been a lot of blowback about
Doge and the things that they've done, and they've led

(28:43):
to a lot of court cases that the Trump administration
has lost, and so, you know, perhaps they each see the
other sort of coming to the end of their usefulness.
I mean, certainly, Musk has already gotten a lot of
what he wanted in terms of, you know, getting rid
of the regulatory structures that were threatening his businesses because
he does things that are illegal.

Speaker 2 (29:04):
But many people are saying.

Speaker 3 (29:07):
Yes, exactly, many people are saying that he does things
that are illegal. So that particular relationship I think could
have run its course. I'm not sure, but in general,
the tech broligarchs are very comfortable with the idea of
autocracy and fascism. They do not mind getting rid of
democracy because they see it as irrelevant. And there has

(29:29):
long been this dream in Silicon Valley of getting out
from under the thumb of the government and using their
technology to solve all problems.

Speaker 1 (29:41):
I don't want to be a cockeyed optimist here,
but I'm just curious, why do you think that they
don't care more about science? Trumpism has canceled all
these government contracts there. I mean, Elon Musk made all
his money on science, you know, and other people's ideas.

Speaker 3 (29:57):
Yeah, it's a good question. I think the answer is
they think that they do. They just think that scientists
aren't as good at science as they are. These guys
think that because they're the wealthiest people in history, that
makes them the smartest people in history. They think that,
you know, they know a lot about science and technology
because they run companies that build tech that's based on science.

(30:17):
But that's not actually how the world works, and they
are not scientists, and they actually don't know that much
about science. And so I don't think they see what
they're doing as an attack on science. They see it
as an attack on an entrenched power structure that threatens them, namely,
you know, the independence of academic science. They want science

(30:42):
to serve them, and they believe that they know what
real science will do. Marc Andreessen, one of these broligarchs, right,
he had this manifesto that he
put out about a year and a half, two years ago,
and in that manifesto he said that, you know, he
and his fellow techno optimists were the keepers of the

(31:05):
real scientific method as opposed to you know, unaccountable academics
safe in their ivory tower. How does that work? Yeah,
that's a great question. I don't think that there is
a good answer there. I mean a lot of that
essay was Andreessen, one of the most powerful people in the
the world, trying to recast himself as the plucky underdog
against you know, scientists who are famously wealthy and powerful.

(31:29):
It's really a real misunderstanding of his place in the world.

Speaker 2 (31:32):
Yeah, I mean, just wild.

Speaker 1 (31:35):
What can people who are not billionaires do to protect democracy?
If you were looking at this and you were saying, like,
what would the solution be in your head, from what
you know about this crowd?

Speaker 3 (31:47):
I mean, the long term solution I think is stronger
government regulation, stronger enforcement of anti monopoly laws. And I
would like to see at some point a tax on
wealth that, you know, makes it impossible to amass
as much wealth as these people have. I would like
to see a billionaire tax. You know, that says
that if you have more than, say, I don't know,

(32:09):
five hundred million dollars, then the rest of that,
everything you've got over that, it's got to go to
the government, because five hundred million dollars is enough for
one person. So, I mean, yeah, I know, I know people
think that I'm a Marxist.

Speaker 2 (32:25):
Yeah, people won't like that.

Speaker 3 (32:27):
No, people won't like that, but I think it's a
good idea.

Speaker 1 (32:30):
And what would you say, I mean, do you think
that they're going to, like, end elections? Do you think
that they're going to, or do you think this is a
slower roll than that?

Speaker 3 (32:38):
I think that they would like elections to go away
or be irrelevant for them. I don't know how quickly
that's going to happen. I think it's more likely that
they're going to try to continue to put their thumb
on the scale for the elections, right, because social
media already does a lot of that for them, right,
because the structure of social media makes it easier for

(33:01):
misinformation to spread, makes it easier for very silly, lowest
common denominator ideas to spread rather than having nuanced conversations
that we'd like to have in an informed democracy. And
then you throw in AI's ability to create misinformation at
scale on demand, and, you know, it's bad. I think
that they're going to keep doing more of that, and

(33:22):
that they're going to find other ways to sort of
use the levers of control of the government that they
have right now to make it harder to have free
and fair elections.

Speaker 2 (33:31):
Mm hmm.

Speaker 3 (33:32):
Because that's easier for them to do than try to
end elections entirely. But I could be wrong. I wouldn't
be terribly surprised if somebody tries to stage like a
Reichstag fire type event to end elections. So I don't
have an easy answer on that one. But I do
know that the technology that these tech companies are building

(33:52):
and have been building erodes the fabric of democracy, and
we're seeing the consequences of that now. Thank you, Adam.
Thank you for having me, Molly. It's great to be here.

Speaker 1 (34:04):
That's it for this episode of Fast Politics. Tune in
every Monday, Wednesday, Thursday and Saturday to hear the best
minds in politics make sense of all this chaos. If
you enjoy this podcast, please send it to a friend
and keep the conversation going. Thanks for listening.

Host

Molly Jong-Fast
