Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hi, I'm Molly Jong-Fast and this is Fast Politics,
where we discuss the top political headlines with some of
today's best minds. A Quinnipiac University national poll finds
just sixty seven percent of Republicans
support the BBB. We have such a great show for
(00:22):
you today as the world churns. Andy Levy stops by
to talk to us about the fallout from Trump and
Musk's exploding romance. Then we'll talk to Blood in the
Machine's Brian Merchant about how AI is already quietly killing
job prospects in America. But first, the news, Molly.
Speaker 2 (00:44):
These tariffs, it's never-ending drama, just like Trump and Musk.
The scoop is that it's highly likely tariff deadlines will slide for
countries in talks because, sure, they're in talks.
Speaker 1 (00:55):
I was told that there would be ninety deals in
ninety days.
Speaker 3 (00:59):
It's infrastructure week too.
Speaker 1 (01:01):
It's always infrastructure week, isn't it?
Speaker 2 (01:04):
That's what I'm saying. This is the new infrastructure week.
Speaker 1 (01:09):
So let's talk about the tariffs. Schrödinger's tariffs. There
are so many different machinations of tariffs in different ways.
There is a global tariff that Trump put on
April second, then he paused it on April ninth. Then he
promised ninety deals in ninety days. I mean, so there are
less than thirty days to go. There are sixty deals now,
(01:32):
I'm just kidding, they're not. There's one framework of a
deal, like, you remember concepts of a plan? That's a
concept of a plan, that's with the UK, one extension
that's with the EU. Then there's the China disaster. But
China tends to be very easy to negotiate with and
very much operating in good faith.
Speaker 4 (01:52):
Just kidding.
Speaker 1 (01:53):
By the way, I just want to point out he
could make deals with Canada and Mexico tomorrow. They are
our largest partners, and all the Canada and Mexico
stuff is completely self-imposed, crises that this administration is
creating because Trump is so unpredictable. Like, make a deal
with Canada and Mexico. Stop it. This is so stupid.
Speaker 2 (02:16):
It is pretty stupid. Speaking of stupid, a couple journalists
have identified that there's this feedback loop of right-wing
propaganda that keeps kind of fueling what's going on with
these LA protests. And I have to tell you, personally,
I couldn't believe, when I looked at Twitter this morning,
how much footage I was seeing of twenty twenty protests
(02:37):
being passed off as if they're happening right now, today.
Speaker 1 (02:40):
Yeah, twenty twenty protests, AI videos, stuff from like the
beginning of the Russia Ukraine War, when the right was
sharing all these insane videos where you were like,
this is not Russia, this is a video game. That's
what we have here. But you know, you have one
group of influencers who are, again, like, I don't mean
(03:02):
to get on my hobby horse, but let me please
get on it anyway. Congress refuses to regulate, right,
they've refused to put any level of regulation on the internet.
So there is no fact checker, there is no truth,
there is no agreed-upon truth. And so now if
you're a bad faith actor, which a lot of these
(03:23):
people are, you know, you're either a bad faith actor
or you're stupid, right, because anyone who has been on
the Internet this much knows this is not correct.
You know, if
it looks too good to be true, it probably is.
And I mean that with all the caveats. But what's
scary about this administration is they actually, like in a
(03:45):
normal administration, you'd see a picture and the president would say, well,
I'm going to go to my DNI and
make sure that this is real. In this administration, Donald
Trump is like, we have to go to war, you know.
Like, he doesn't. There's not a lot of fact checking,
and, like, we've seen this again and again, right, go on.
Speaker 2 (04:01):
And this administration. Tulsi Gabbard yesterday was busy rattling off
conspiracy theories on a video for like an hour.
Speaker 1 (04:08):
Yeah, exactly. I just don't understand how we are in
this position again. And there are only a
few Republicans still around who understand
what is real and what is not real. But a
lot of Trump's people are just like, if he can
get the base to believe it, who cares if it's real.
Speaker 3 (04:29):
That seems to be the modus operandi.
Speaker 1 (04:31):
Which is really scary.
Speaker 2 (04:33):
Yeah, it's horrible, and it really is the
same thing, the thing we know, that, like, Musk
has incentivized all this disinformation by paying people. There's that
great article in The Times a week ago about Dom Lucre,
and you know that he just doesn't care about what
he prints because Musk is paying him. Speaking of things, though,
that do break the pattern. Would you believe me if
I said Mitch McConnell was really hard on Pete Hegseth
(04:55):
at the hearing yesterday.
Speaker 1 (04:56):
Yes I would, because he's
like the one Republican who voted against all the crazy
cabinet appointees. So I would believe that. I mean, look,
Mitch McConnell is how we got here, so I don't
think anyone should be, you know, feeling too bad for him.
But that said, he has now realized, you know, he
(05:17):
doesn't have power anymore, he's not in leadership, and he's
realized that this is just a disaster, right, that Russia's
being allowed to do whatever it wants with Ukraine, that
America is no longer the shining city on the hill.
This is what it is, man, this is what it is.
And I think Mitch McConnell understands that he was part
(05:37):
of this, and you know, I don't think he wishes
us well. I do think there's some amount of, like, oy vey,
I did this, going on.
Speaker 2 (05:46):
So we usually don't wade into foreign policy waters. But
one of the things you and I have often discussed.
Speaker 1 (05:51):
Or know anything about it. Yes, go on, we do.
Speaker 2 (05:55):
One of the things you and I often discussed, though,
is what we do want to do is highlight the
things we think are not getting discussed in the mainstream media.
I think it's a little concerning that the US has
reduced the presence of staffers deemed nonessential in the
Middle East as tensions with Iran rise.
Speaker 1 (06:10):
Yeah, look, this is so insane.
Speaker 4 (06:13):
Okay.
Speaker 1 (06:13):
Remember Donald Trump was like, I'm going to make peace
in the Middle East in the first day. I remember
that very well. It's like I'm going to end the tension.
Speaker 4 (06:22):
Yeah.
Speaker 1 (06:23):
And then he was like, we're going to develop Palestine,
kick out all the Palestinians.
Speaker 3 (06:29):
The tension.
Speaker 1 (06:29):
Yeah, is the tension. And now it looks like, you know,
non-essential American employees are getting out of Iraq because
something's cooking again. This is all hard. Like, this is
the thing about Elon Musk: government is actually hard. It's
just hard. It's hard to get a lot of people
to do stuff. It's hard to make it work right.
It's hard. So, like, making peace in the Middle East
(06:53):
is hard. So part of Donald Trump's whole thing is
that he is so brash and cocky, and sometimes his
escalatory rhetoric works. It usually works with, like, sane countries
that would have been our friends anyway, but it doesn't
work with places like Israel or Iraq or Palestine, and
(07:15):
so really it is just, you know, it's so fucking bad.
Excuse my French. And you know, we just have this
super disorganized, very craven, extremely ideological administration that is like
(07:35):
a bull in a china shop and somehow still believes
that Vladimir Putin is a good guy. And that's why
we can't stop things escalating. There is no peace.
All of this that we're seeing play out is
everything we said was going to happen. Yes, we were
wrong about the twenty twenty four cycle, but no, we
were not wrong about what Donald Trump would do when
(07:57):
he got in office. Andy Levy is the co host
of As the World Churns. Welcome to Fast Politics, Andy Levy.
Speaker 4 (08:08):
Thank you, Molly, trying not to do something weird anyway,
I'm here with a New York Times bestselling author, Molly
Jong-Fast. So what's up with that, Molly?
Speaker 1 (08:17):
Well, we were getting ready to record, by the way,
I'm in Chicago. If I don't sound amazing, Jesse is
probably going to kill me later. But I did get
a call from my editor that my book How to
Lose Your Mother, available in many places, is on the
New York Times bestseller list. And you know, this is
writing is like what I've done my whole life. So
to be on the New York Times bestseller list, it's
(08:40):
pretty fucking cool.
Speaker 4 (08:41):
Yeah, that's amazing, especially for a book that's based on
a sitcom that went off the air like ten years ago.
Speaker 1 (08:47):
It's a book based on a meme, right? It's not
a sitcom, it's a meme. It's a meme book. Yeah, okay, yeah,
it's a meme book. Speaking of meme books, I have
a whole thesis and I'd like to shop it to
you and our listeners, and you tell me if you
think that I'm wrong. I'm open to being wrong because
now I'm a New York Times bestselling author, so I have
(09:09):
that self esteem buttress that I.
Speaker 4 (09:12):
Can be wrong. Sure.
Speaker 1 (09:13):
Last week, despite the fact that Trump World sort of
spun it in many different ways, Trump did fight with Elon,
the guy who has all our data. Then Elon called
him a pedo. Then Trump called him a drug addict.
Then Elon said maybe he shouldn't have tweeted all that
stuff. That was this week. And also Trump world lost
(09:33):
in court on the trade war. There are no
ninety deals in ninety days, you know, everything is just
sort of sputtering along. And then also, somehow, we're
going to war with Iran. You know, like, it just
seems like, as of last
week you could still blur your eyes and be like,
Trump two point zero is more organized than Trump
(09:54):
one point zero. But now all of a sudden, like,
that illusion is gone. And at that same time,
Donald Trump decides that he should send the National Guard
into Los Angeles. Probably a coincidence, right?
Speaker 4 (10:10):
I don't know if it's a coincidence. Look, I am,
unlike you and Donald Trump, not a New
York Times bestselling author, so I feel kind of like
the odd man out in this discussion. But I think
two things can be true, as they say. I think
one is the timing of this, I think
you're onto something there. Like, I do think that it
(10:31):
hasn't been a great couple of weeks for Donald Trump
by a lot of metrics, so his bread and butter
is always going to be, you know, for his base,
it's going to be going after brown people and going
after black people, et cetera. So yeah, I think the timing,
if it's coincidental, it's a very, I guess,
(10:52):
I don't want to say it's a good coincidence for
Donald Trump, but it's the kind of thing that may
be coincidental but isn't accidental, I think, is what I'm
trying to say. But we know this is what Stephen
Miller has wanted from day one, and they may have
just seen with regard to the timing, well this happened,
and then they said, Okay, let's do what we've been
(11:15):
wanting to do. Let's send in the shock troops. Let's
send in the jack booted thugs, let's send in the
National Guard, the Marines. I'm worried that we're going to
be fighting a two-front war, one in Tehran
and one at the Grove, and that usually doesn't
go well.
Speaker 1 (11:30):
Full disclosure: I was at the Grove yesterday. There is
no war at the Grove. Now that said, it's a
weird little mall.
Speaker 3 (11:39):
It is a weird little mall yeah.
Speaker 1 (11:40):
It's a weird little mall. Like, what I love about
LA is there are so many weird little spots that make
no sense. I did nearly get hit by a trolley
at the Grove. Why is there a trolley?
Speaker 4 (11:51):
The trolley goes like four miles an hour.
Speaker 1 (11:54):
Yeah, exactly. But you can still if you're tired enough,
you can wander in there. So look, I was in
La till about two hours ago, and I think.
Speaker 4 (12:04):
Did you get breathed?
Speaker 1 (12:05):
Oh yeah, oh yeah. Oh it's just it's mayhem and madness. No,
it's totally fine. There's like a small area where people
are protesting. I didn't hear anything, I didn't see anything.
I didn't see any smoke. You know, it's not what
the far right wants you to think it is. Now,
that is why the far right has, and there's a
really good article about this in Wired and in the
(12:26):
Times, they're spreading all these fake AI-generated videos,
and old videos, saying that, like, Los
Angeles is a hellscape. I would think they would want
Los Angeles to be a hellscape. I guess they do
want Los Angeles to be one.
Speaker 4 (12:43):
Yeah, yeah, no, look, they want all those pictures
to be real, you know. They're even
playing the old hits, the pallet of bricks, right, I
think that's right. I don't remember what year it was
that that first topped the charts, but they're bringing it back.
But yeah, look, I lived in LA for ten years.
I have a lot of friends in LA. Nobody in
(13:05):
LA is talking about the city being on fire or
there being riots all over the streets. That stuff is
simply just not happening, and we're being lied to, for
a change, by the Trump administration and right-wing media.
If you want to talk about the violence, the violence
is very one-sided and it ain't coming from the protesters.
Speaker 1 (13:28):
Yeah, well, I actually have a friend who
was protesting in New York, and she was telling me
that the ICE agents are trying to get the protesters
to act out. Like, there's clearly a want here for escalation,
but it's not necessarily on the protester side. That said,
you know, you can't be held responsible for everyone.
(13:49):
I mean, I'm sure that a lot of people, and
we've seen this with every protest, that
there are protesters who do stuff that they're not supposed
to do. So sure, you know, you don't want
to, like, blanket say that everyone's behaving.
Speaker 4 (14:04):
Well, sure, no, someone is always going to start a
drum circle and that is reprehensible and should not be
tolerated in the year twenty twenty five. But there's going
to be stuff like that, and you can't you can't
tar the whole of the protest because somebody has a bongo.
It's just not fair. And by the same token, you see,
(14:26):
you know, the media, as people have pointed out,
the media love to show cars on fire.
They cannot get enough of a car
being on fire. So there could be one car on fire,
and we have no idea how it happened, but we
are going to see that image a thousand goddamn times,
and there's going to be a lot of handwringing about
(14:48):
the violence because a car in America is sacred, whereas
the lives of individuals not so much.
Speaker 1 (14:54):
Not so much, not so much. No, yes, that's right,
don't burn cars, because cars come before people. Let me ask
you a question: now that Trump and Elon have broken up,
is it still a federal crime to hurt a Tesla?
Or has that been rescinded?
Speaker 4 (15:10):
I mean, that's a good question. And in LA it
seems to be that the protesters are going after the
Waymo vehicles, because all these self-driving cars have cameras
all over them, and the police, law enforcement,
and the feds are using them to surveil. So I
was trying to figure out, yeah, do you think
Elon has mixed feelings about this? Is he
(15:33):
like, upset that they're going after cars? But on the
other hand, these are Google cars that are competitors
to Tesla. This is a tough one to judge. Trump
said he was selling, or he was getting rid
of, his Tesla, didn't
Speaker 3 (15:45):
You say that?
Speaker 1 (15:45):
Yeah, yeah, which doesn't mean he did. He said he was
getting rid of it. He told a reporter in the West
Wing, when he was doing one of his many, many
sprays, or, you know, whatever it's called, that he is
selling his red Tesla. Let's just talk about this
sort of Musk-trying-to-retrench-himself effort. Okay, so,
(16:08):
by the way, the polling on the BBB, the big, bad,
beautiful, buxom bill, is bad, bad, bad, real bad. You
want to know how bad it is?
Speaker 4 (16:19):
How bad is it.
Speaker 1 (16:20):
I'll tell you how bad it is. It's bad, bad, bad.
I mean, if I were polling that badly, it's like
the Black Death. Americans oppose the BBB, the Big Beautiful
Bill, two to one. Independents are three to one negative
on it. BBB: bad, bad, bad. They should have called
it something else.
Speaker 4 (16:40):
Yeah, well, I guess the people are on Elon's side
on that one because he was going after the BBB.
Speaker 1 (16:47):
He was going after it because he wanted it to
give him credits so that his car company could do well,
not because he gives a shit. Talk about the deficits.
Speaker 4 (16:57):
Which, by the way, is the same reason he's apologizing
to Trump. I mean, yeah, the federal contracts with,
you know, SpaceX, and the favors he needs
for Tesla. I mean, I have got to think that
this, you know, pseudo-apology or halfway apology that
he did was spurred on by the boards of
(17:21):
some of his various companies, by the Tesla board, by
the people at SpaceX who were like, dude, if the
federal government cancels SpaceX contracts, we're out of business. I mean,
they can't stay in business without that.
Speaker 1 (17:37):
Let's just talk for a second about this Elon
apology tour. So he had an early morning tweet
on Wednesday, which is either today, yesterday, or five years ago,
and in it he said he regretted going, quote,
too far in criticizing President Trump. I assume that's the
(18:00):
pedo tweet, but it could have been something else. Love
this administration. Musk's effort, and I mean that with all
the disdain in the world, is that irony or is it sarcasm?
It's sarcasm. Musk's efforts to make amends began on Friday
when he spoke by phone with White House Chief of Staff
(18:20):
This is reporting from Axios: Susie Wiles and Vice
President Vance, three people familiar with the discussion said. So,
that's Trump world. Vance and Susie told Axios that
Musk called them. It matters. Why does it matter? American
democracy is going to die, and Musk is trying to
(18:41):
get Trump to take him back. And then: Musk fears
Trump's beautiful bill will add trillions to the federal deficit. Do
you think he really fears that? I don't think he
gives a fuck.
Speaker 4 (18:50):
I don't think so. No, he fears it will add money
to his deficit, yes, you know, not to
the federal deficit. No, I don't believe for a second
he gives a shit about that. I mean, I was
talking to S.E. Cupp last week
on my podcast As the World Churns, available wherever you
(19:10):
get your podcasts, also on YouTube, and, uh, she asked
me who I thought had more to lose in the
Musk-Trump war, and I said, you know, to me,
it was obviously Musk. I mean, Trump doesn't need Musk anymore.
This is separate, by the way, from the Republican Party
maybe needing Musk because it was his money that was
(19:33):
that is filling a lot of their election coffers. But Trump,
at least as of today, isn't running for office again,
So he personally doesn't need Elon really, and he does
not give a shit about the Republican Party. We know that.
(19:53):
So telling Trump he needs Elon because Elon is
good for the Republican Party, Trump doesn't understand that. But Elon,
as we've been talking about, Elon needs Trump, because
his entire livelihood is dependent on taxpayer money, on federal funding.
So it's not really a question. You know, again, if
(20:15):
you ask me which one of them was going to blink,
I would have said, of course it's going to be Musk.
He has to.
Speaker 1 (20:20):
So that's the question. Now there is the fact that
some of the companies that Elon has, you know,
like SpaceX and Starlink, there's no other company that
necessarily does what they do. I mean, other companies could catch up,
but there'll be a lag, right? So I'm not sure. I mean,
I've certainly read pieces that say, and again this is
(20:42):
really, like, beyond me, this is, like, you know,
foreign policy stuff, it's just a little
bit out of my skill set. So I don't know
what, you know, satellite manufacturing looks like in
other countries. You know, I don't know what
the companies are. But there definitely is, I don't think
(21:03):
Elon can, I mean, I don't know that he
can lose all those government contracts right now, but he
certainly can lose, like, EV credits. And Tesla is super overvalued,
right? Everyone, you know, that's sort of a known
known. So I do think, and you know, Elon is
so overleveraged, you know, between X, I mean, he's very rich,
(21:24):
but he's very leveraged, so you know, there could be
the margin call from hell there. I mean, I just
don't know enough about his finances, but I certainly know
the fact that he's trying to get back into
Trump world, you don't do that if things are
going great. It seems hard for me to imagine that
he's going to go back and eat crow.
Speaker 4 (21:42):
Yeah, and look, I think, you know, Starlink may be almost
a one-of-a-kind thing now, you know, with
satellite Wi-Fi or satellite internet, whatever, and all
of that. But again, it is pretty wholly dependent on
government contracts, not just ours, other governments' as well. And
(22:02):
the thing is, if Trump decides he's going to
destroy Elon, you know damn well he's gonna tell other
countries to get rid of their Starlink systems or to
not invest in future ones. And that's really, like,
like you said, they may be the only, you know, player
in town right now; that doesn't mean they always will be.
And they need these future contracts to keep growing as
(22:26):
a company, because that's all that Wall Street et cetera cares
about: growth. So I think that's where, and the
same goes for SpaceX. Yes, they're kind of,
you know, Bezos has his own little penis-replacement
thing going on, but SpaceX is really
the only player in the game.
Speaker 3 (22:47):
But they are the player.
Speaker 4 (22:48):
They're the player in the game, again, because of the
government contracts, because of contracting with NASA and things
like that. So you yank all of those, and you
can say, well, no one else is doing
what SpaceX is doing, but SpaceX ain't going to be
doing that if it's just dependent on the private sector
right now.
Speaker 1 (23:06):
Right. I mean, I think this is a real,
both of these men are not in great situations.
And that's okay. That's okay with me. I'm good with that.
But I just don't understand why there's not more discourse
about this. So I'm going to make you talk with
me about it.
Speaker 4 (23:23):
What, just about your book being a New York Times bestseller?
Speaker 1 (23:26):
No, but it is. I'm very excited. But the BBB
is this bill that got through the House, and then
you had all these House members being like, I didn't
really know what was in it, I just voted for it.
Marjorie Taylor Greene, you know, all these people voted for it,
and it passed by one vote. Okay, so now it's
going through the Senate. It's wildly unpopular. You have Elon
(23:48):
tweeting about it still. And I just wonder, like,
why are Democrats not, you know, like, Rand Paul's
already a no, so we know Rand Paul's a no,
and you know you only need like three more, okay.
And we have Bill Cassidy, the senator from Louisiana,
(24:09):
who's a doctor and should know better. He made a
deal with RFK Junior, where RFK Junior said, don't worry,
I love vaccines, I just have
been saying that they cause terrible diseases. So he fired
the whole vaccine board and he's going to replace them
with new, better MAGA doctors, okay, maybe, or just
(24:32):
not, Dr. Oz, right, and Dr. Phil, and so on.
So this is a guy who's really been rolled. Okay.
So I just don't understand why Schumer
is not out there whipping votes against this bill. Like, what?
Don't laugh at me, don't laugh at me. If I were
(24:55):
Senate Minority Leader, I'd be out there being like, look, man.
Speaker 4 (25:00):
This is what I was going to say, but you can't possibly
expect that Chuck Schumer would do that. I mean, that's
what I'm laughing at, the idea of Chuck Schumer
doing that.
Speaker 1 (25:11):
Well, shouldn't he be doing that? Are you sorry
to tell me?
Speaker 4 (25:14):
Yes, there's a whole laundry list of things he should
have been doing, and should be doing and should do
in the future, and he hasn't really checked off any
of them as far as I can tell. And you
know you're right. Look, the Democrats. You started your question
by saying, why are Democrats, and I thought you
(25:34):
were going to stop there. Like, I didn't realize that
this was going to be about something specific, because the
question I'm always asking is, why are Democrats? Because it's
so frustrating. It's just the fact that there's only like
a handful of elected Dems who understand what is going
on right now in this country. There are the people
(25:55):
like Chuck Schumer who will say that they understand, but
they won't act like they understand. And as many people
have said, like we just spent a whole election warning
about what was going to happen to democracy under Trump
two point zero, and then as soon as he got
elected the Democrats acted like they hadn't been running on that.
(26:16):
So I agree with you. But I think, you know,
putting your faith in Chuck Schumer, you know, it's like
seeing a bowl on the table with popcorn in it
and just assuming it must be, you know, fresh new
popcorn for no reason at all, and then eating it
(26:38):
and getting botulism and dying.
Speaker 1 (26:41):
That was... that was... I had no idea where it
was going. I was dying as you were saying it.
Speaker 4 (26:48):
I had no idea where it was going.
Speaker 1 (26:50):
Andy Levy, Andy Levy, thank you, thank you, thank you.
Speaker 4 (26:56):
Always a pleasure, Molly. Always a pleasure to talk to
a New York Times bestselling author.
Speaker 1 (27:01):
Brian Merchant is the author of the book and the Substack,
Blood in the Machine, and host of the podcast System Crash.
Welcome to Fast Politics.
Speaker 3 (27:12):
Fine, thanks so much for having me, Molly.
Speaker 1 (27:15):
So let's talk about AI. It's going to replace us
all and solve all of our problems.
Speaker 3 (27:21):
Yeah, that's right. Let's just, uh, let's head out to
the park, let's have a picnic, and we can
call it a century.
Speaker 1 (27:30):
Sadly, no. I'm going to put it on the blockchain.
I am married to a venture capitalist; he's a good one.
He's one of the few good ones, because he's for education. Yeah,
but there were like about five years where everything was
going on the fucking blockchain. You know what's on the blockchain?
Nothing. Nothing is on the blockchain. So Trump coin? Yes, on
(27:51):
the blockchain. So can we put this on the blockchain?
Speaker 3 (27:54):
You know? AI is.
Speaker 5 (27:56):
It's interesting, because it's not the same level of vaporware,
right, where, you know, in some cases maybe it would be
better for people who are getting affected by this stuff
if it were. But like, on a couple of different levels:
one, the investment in AI has been through
the roof. It's bigger than, you know, anything related to
(28:16):
blockchain or NFTs or the metaverse, the last, you know,
handful of cycles to come out of Silicon Valley. No,
Silicon Valley has decided that it's, you know, for all
intents and purposes, all in on AI. So that's one
of the things that is so unnerving about this moment
because they're pouring capital into it, whether or not a
lot of these products can do what they say they
(28:38):
can do, whether or not they work, whether or not
they're you know, bullshit machines. Can I say bullshit on
this show?
Speaker 1 (28:45):
Oh please, this is not cable news, just curse up a storm.
My listeners love profanity. I wonder, I really hate AI,
and I'll tell you why: because I'm a writer and
I love beautiful prose, and I care deeply about my
words being written by a human. Am I going to be
(29:06):
just completely fucked in about two years?
Speaker 3 (29:08):
So as writers?
Speaker 5 (29:10):
You know, I'm a writer myself, and we are
used to being fucked, right? Like, we're used to getting it
from all corners.
Speaker 1 (29:17):
I come from a, you know, my grandfather, my mother,
so I am quite used to just watching the business
defenestrate itself over and over again.
Speaker 3 (29:26):
But this is new.
Speaker 5 (29:27):
The scale and the ease by which we can be
fucked now is new and is alarming and is doing
real wrecking-ball-level damage to the industry, especially,
you know, especially certain segments of the industry. Right,
there's, I think, journalism and writing, like for magazines. They're
(29:50):
not going to be replaced outright, but we can already
see sort of the glut of AI-generated content
that's showing up in blogs, that's being used by places
like my former employer, right, I used to write
the tech column at the LA Times, and after I left,
they started using this AI insights tool to try to
add value to it. You know, BuzzFeed is using AI, Sports
(30:12):
Illustrated, CNET, all of these places where, like, AI is
coming in from the sides. And it's
very cheap. It's just cheap content doing what a bunch
of cheap stuff does to any market. It degrades it,
it lowers the value. It makes it harder for people selling,
you know, the real stuff to make a living. And
(30:33):
this is happening to artists and writers, graphic designers and marketers.
This is real and it is you know, like I
wrote a book about the Luddites. So this is exactly
what happened to the Luddites, who made craft goods, who
made, you know, cloth goods, who produced cloth stuff. Automation
flooded the market with cheap stuff, and factory owners, you know,
exploited cheap labor to compete with the people doing the
(30:53):
real stuff, who still wanted to make good-quality garments,
and then put them out of business, pushed them
to the street.
Speaker 3 (30:59):
So I think that's where we're going.
Speaker 5 (31:01):
I mean, I worry, there's a lot, you know,
people, like you said, people are really desirous of having,
you know, human connection, like reading words written by humans.
People don't really seek out AI-generated stuff to watch
or to read, right? Like, yeah. So, like, I think that
there's still going to be a desire for that, but
(31:22):
I do think the market's going to be a lot worse.
And you know, for people like you and me,
I think it might eat away at our margins. But
you know, we've been at it for a while. Like,
you know, I have established contacts at different places, like,
I'll probably be okay, you'll probably be okay. But it's
the people coming up in the field. Like, how do
I get started? Well, like, we just have an AI
(31:44):
that, like, generates our marketing copy or our
blog posts for products, or even does our sort
of introductory editing work, because it's just cheaper to do.
So where are people getting footholds in these industries? Where
are people you know, finding opportunities to sort of even
break in? And I think AI is just like flooding
those zones and making them pretty unnavigable and so like
(32:07):
that, along with the creative industries, is where I really
fear, you know, AI's impact.
Speaker 1 (32:13):
One of the things that seems to super suck to me,
not to put too fine a point on it, is
the fake AI art. Now, one of the things that
I've been told again and again is that AI is
getting better and better and better. So even though right
now you can kind of still tell what's written
by AI and what's written by people, you can still
(32:34):
see that AI art looks a little weird, right there's
still little bits of reality. Even like the AI videos
that I've seen, you can sort of tell it's fake.
I've been promised that it's getting smarter and smarter and smarter,
and that AI is just going to keep going and
that we're just going to be fucked. But is that
really true or is that sort of like the blockchain?
Speaker 5 (32:55):
Well, that's you know, the six million dollar question is
whether or not they can figure out what's called scaling,
because scaling, you know, training on more. Basically, OpenAI, which
has led this AI boom, its key insight can really
just be boiled down to that. They figured out
that if you scale up the models, feed them more data,
feed them more compute, feed them more energy, then it'll
(33:19):
improve the quality of the results. And that has, in
recent months, kind of hit a wall, where
some things are getting better, other things are getting worse.
Speaker 3 (33:28):
Reliability is getting worse.
Speaker 5 (33:30):
The so-called hallucinations, where it just makes stuff up,
are getting worse. That's not to say
that they won't figure these things out, but there may
be sort of a limit to you know, what is
possible and what we can expect. Certainly, I think the
quality won't be exponential, as Silicon Valley
(33:50):
likes to sell us.
Speaker 3 (33:51):
Yes, yeah, right, but I would also.
Speaker 5 (33:53):
Just say, I mean, ultimately, look at what they're
trying to do: make it so that you can take
a picture and turn it into, like, a Disney animated
film quality still, or, famously, Studio Ghibli style art,
and it can do a serviceable enough job. And I think you can
see like the intent here, which is ultimately like why
(34:16):
why do we want this stuff to be so good?
It's like, well, so we don't have to pay animators
anymore and we can just produce this stuff in our
own backyards and sort of cut the you know, the
human artistry element out of the picture, so we can
save some bucks. Especially with this video
generation, especially with the image generation: what else
(34:37):
besides that intent, besides automating human creativity and human work?
Like, why do we even want to do this
in the first place, other than to do that, or
to make an ad for the capabilities of
these AI companies so that investors will keep
feeding money into the machine?
Speaker 1 (34:55):
So I want to talk to you about this super
interesting GOP campaign. I love that nobody in Congress,
left or right, has any interest in passing any regulations, right?
The one thing that they could fucking do to help
the rest of us. No interest. Okay, God forbid we
tell you guys what to do. They don't want to regulate,
but they do want to prevent other people from regulating.
(35:19):
So talk us through this. When I saw this, I
was like, if we all survive this, it will be
because these people are fucking morons. So this is
a proposal to ban
AI lawmaking in reconciliation. How do you put that in reconciliation?
I don't fucking know. You can't cover dental care and insurance,
(35:39):
but you can put you're-not-allowed-to-regulate-AI
in a reconciliation bill? Spoiler: maybe the parliamentarian
has now just cracked and doesn't do any of her stuff,
so maybe she's good with this. But talk to us about this.
So they're gonna prevent regulation.
Speaker 5 (35:57):
Wild, wild stuff. I mean, I'm old enough
to remember last year, when the GOP was the party
of states' rights, when the issue was abortion. It's like, oh,
let's let the states decide, because states' rights, states' rights.
When it comes to AI, though, it's like,
this innovation is just too precious to be regulated. So
(36:17):
we're going to pass this bill, sort of an amendment
nestled into reconciliation. I've never seen anything
like this before. It's totally, totally
wild, and you're going to love the justification that
they give because I called up Energy and Commerce and
I talked to them about it, and, amazingly, they
(36:37):
told me how they're planning on justifying this. So yeah,
So basically, it's this ban on any state's ability to
enact or enforce laws about AI. You just can't.
They're saying you can't do it. Is it constitutional?
How are they going to enforce this ban? These are
great questions, but it's clearly aimed at a few places, right,
(36:57):
like California being the big economy that it is and
having passed influential statewide legislation that has then had, like,
a knock-on effect on everybody. You know, a
good example is automakers, right? California passed exhaust standards, and
then suddenly all the automakers just decided to
meet them for the whole country. So the GOP, yeah,
(37:17):
they don't like this, they don't like that California has
this power. So it's pretty chiefly aimed I think at
California, because it's threatening
to pass a bunch of pretty common-sense
AI stuff. There's nothing that's even that crazy.
Speaker 3 (37:35):
But not allowed at all. Yeah. I talked to one
of the bill's co-authors.
Speaker 5 (37:39):
It's stuff like: if your employer is going to use
AI to, like, hire or fire you, they have to
let you know. Yeah, they have to
tell you. Right, that's beyond the pale,
very common-sense stuff. But anyway, the AI industry, OpenAI,
Meta, even IBM, they've been lobbying full force against this,
and they've got Trump's ear because there are so many
(37:59):
you know, tech folks.
Speaker 3 (38:01):
And they got the money, and they got tech folks
in the in the White House.
Speaker 5 (38:04):
They've got Marc Andreessen and David Sacks and
all these guys.
Speaker 1 (38:08):
Those guys seem great. They seem very yeah.
Speaker 3 (38:11):
Real, real, real shining stars, real wonderful.
Speaker 1 (38:14):
Man.
Speaker 5 (38:14):
The GOP slides this in, and they're going to say,
we're gonna stop California from being able to do this
before they even succeed or try. And their justification is, well,
we are going to start using AI tools to modernize,
like, the Department of Commerce, and we're gonna have a
chatbot in, like, the GSA, you know, the
General Services Administration. And if we have these AI tools
(38:37):
there and we're like you know, buying them from an
AI company and California regulates it, that might make AI
more expensive and therefore it's a budget issue that we
can put into reconciliation. That's their justification: that regulation might
make AI more expensive, and that then,
(38:58):
the government would have to
bear those costs, and therefore they have to ban the states'
Speaker 3 (39:03):
Ability to regulate it. Whatever. I know, they're just making
it up at this point, right?
Speaker 1 (39:09):
I mean, at this point, one of the things that
I've really been impressed with in this administration is just
the ability to completely lie about almost everything, making Trump
one point zero really look like the salad days of moral confidence.
Speaker 5 (39:27):
It's really wild. Yeah, they're just
making shit up to justify it. It does not matter;
they could have said anything, but I guess they
still have to have something. They just can't tell
a reporter, we're just going to do it because we want to.
Speaker 3 (39:40):
That would be refreshing, right? Yeah.
Speaker 1 (39:42):
So what about the, like, doomsday scenario AI? We've seen
this before in movies, right? It automates whatever and ends up
in a sort of Cold War situation. I mean, WarGames, right?
We've seen this in the eighties, the idea that AI
would create nuclear holocaust. Like, what are the odds
of that? Fifty-fifty? Forty-nine, fifty-one? I mean,
(40:06):
tell me what you think.
Speaker 5 (40:07):
So everybody thinks about Skynet, right? But I think the
reality... you know, Skynet is where, like, the computers decide.
Speaker 1 (40:14):
You'll have to explain that to us, because I have
no idea what that is.
Speaker 4 (40:17):
Yeah.
Speaker 5 (40:17):
Yeah. So Skynet is the villain of the Terminator movies.
It's an AI that has been integrated into the,
you know, the military, the Department of Defense. It becomes
self-aware, and when they try to shut it down
because it's become too powerful,
it decides humans are a threat, and then, you know,
(40:39):
the Terminator franchise takes place, where the machines try to
exterminate humanity because it's a threat to the AI, obviously. And
that's a lot of like sort of the worst case
scenarios that are sort of peddled, I think quite usefully,
by the Silicon Valley companies, Like they have an interest
in us being afraid of this all powerful technology of
you know, these doomsday scenarios, because then it kind of
(41:01):
just suggests, right, that their technology is that powerful,
that it has that capacity. And it may not be
an alluring prospect to have humanity go down in flames,
but if the software is that powerful, it can
certainly, you know, replace Joe at his corner office,
and they can sell you some enterprise software for a
profit in the meantime. So they've had, like, an interest
(41:23):
in selling this sort of doomsday scenario. But I think
the real issue is just as scary, and that's that
it starts getting everybody to sort of outsource their thinking
to AI, or to feel comfortable just letting AI do
it and not checking it. And increasingly we are having
these automated systems that aren't so sophisticated that they're going to
(41:46):
methodically wipe us all out, but that are so clumsy
that they allow for grave errors. I think that
extends to geopolitics, right, as we're...
Speaker 3 (41:56):
In we're going to die, so we we may die.
Speaker 5 (41:58):
And you know, one of the things that was going
on in Israel Gaza was that they were using an
AI system for like targeting, and they were outsourcing their thinking.
People were just hitting the button because the AI system
isn't sophisticated enough to really pick the right
target. But it creates that bureaucratic sort of
buffer zone where people can just feel like, oh, well,
(42:20):
it's not my problem, it's the AI. And they're hitting
the button and selecting a new target. And now we
have all these talks about OpenAI
talking to the Pentagon, and Palantir, which does AI, is
taking on a larger role with all this targeting software, and I
think we can expect more of it. Scientists and researchers call
this cognitive offloading. So more people are just cognitively offloading
(42:43):
their tasks and their work to AI systems. And I
think the result could be a disaster. Yeah. Absolutely. Well,
that's good.
Speaker 1 (42:52):
Result could be a disaster, nuclear holocaust, Jesse.
Speaker 2 (42:56):
Any thoughts? So, Brian, I'm particularly obsessed with this not
hiring young people out of college, and what this could
do, that you've written about.
Speaker 3 (43:03):
Could you talk a little bit about that? Yeah.
Speaker 5 (43:05):
So there's an interesting figure that's begun to surface in
the economic data, where you know, typically one of the
most hirable sort of demographics is recent college grads.
Speaker 3 (43:17):
Right.
Speaker 5 (43:18):
They're young, educated, willing to work for less money, and
they're ambitious.
Speaker 3 (43:23):
Right.
Speaker 5 (43:23):
Usually it's like kind of a no brainer for companies
to hire these folks. And in recent years, the number
of unemployed recent college grads has gone up,
and there are a number of
different theories behind why that may be, but a big
one is that more and more companies might just
be filling those roles with AI or a combination of
(43:46):
AI and, like, part-time, algorithmically curated gig work. So
instead it's like, hey, we don't actually need to
hire a new salaried person and give them healthcare; they
were just going to be sort of, like, writing our
website copy and kind of learning
the ropes anyway. And now we can have AI just
write the website copy or do any number of tasks.
(44:07):
And so the fear is that that very important
sort of ladder, you know, that I alluded to earlier
is getting knocked away. And you know, these
entry-level jobs are getting harder to get and
taken over by AI, because they're also the ones that
are the least complex and easiest for
an AI system to perform.
Speaker 2 (44:27):
Yeah, we saw a report this weekend that Amazon coders
are saying they're being treated like the warehouse workers now,
and I thought that was particularly telling.
Speaker 5 (44:36):
Absolutely. I mean, coding... so far, most of the
most vocal opposition tends to come from people
in creative communities. We have all of these lawsuits from
artists and authors and people who are standing up and
you know, the screenwriters and the WGA and the Screen
Actors Guild that like really fought against companies using AI
in their last strike.
Speaker 3 (44:56):
But it's starting.
Speaker 5 (44:58):
To hit you know, the software and engineering community really
hard too. And so far, what I'm hearing from a
lot of folks is less that it's, you know, erasing
my job, and more that it's making it suck. Right?
Like, now my job is just to produce and check even more AI-
generated code. The creativity that went into writing a new
(45:18):
program has largely been taken away because we're being told
to sort of churn more stuff out. And so yeah,
when I saw that Amazon report from workers there who
are saying that it's you know, they're being turned into
sort of assembly line workers or feeling like factory workers,
I could not have been less surprised. I talk to
a lot of workers affected by AI; this is
just what I do. I put
out a call, you know. My ongoing project is
(45:41):
called AI Killed My Job, and so I'm
soliciting testimonies from and interviewing workers who've
had AI come into their workplace. And yeah, I'm hearing
from a lot of coders and tech workers who say that,
you know, it hasn't erased their job outright, but
it's made it a miserable experience, and I think
(46:02):
that is going to be increasingly universal. I also just
talked to therapists, who are being told to integrate
AI into their work process. And certain steps of
the process, if you work for a company like Kaiser,
are being handed over to AIs in some pretty scary ways,
like when you call in if you have a mental
(46:23):
health issue, and now you get an algorithmically dictated system
that is supposed to weed out whether or not you
need to talk to a real therapist. So if you're
in real trouble, then the system is supposed to flag you.
But the system's wrong a lot. And this is just
to cut out a job that used to be a
trained therapist's job: to, you know, say, okay, I understand
that you're a severe danger to yourself or others.
Speaker 3 (46:46):
I'm going to put you through to a therapist.
Speaker 5 (46:47):
Now we're entrusting that job to an algorithm because Kaiser
wanted to save some bucks. And so this is happening
in jobs and professions, you know, everywhere. Some, like illustrators
and artists, are actually, you know, getting
their work replaced and wiped out; everywhere else, it's
this degradation. And this is a thing that we really have to
deal with.
Speaker 1 (47:07):
Thank you, Brian, Thanks so much.
Speaker 4 (47:09):
Molly. Now, a moment of fuckery with Andy Levy and Molly Jong-Fast.
Speaker 1 (47:16):
What is your moment of fuckery?
Speaker 4 (47:19):
First of all, this segment has a familiar ring. I'm
not sure why, but it reminds me of another segment
for some reason.
Speaker 3 (47:27):
It'll come to me.
Speaker 4 (47:28):
Yeah, yeah. My moment of fuckery is Trump, and
I guess he and Kristi Noem together, saying they're
gonna phase out FEMA after the twenty twenty five hurricane season.
They sit there and they say things. Kristi Noem sits there
and says things like we all know from the past
(47:50):
that FEMA has failed thousands, if not millions of people,
and President Trump does not want to see that continue
into the future. First of all, that's not true. Second
of all, if FEMA's that bad, man, why are you waiting
until after the hurricane season? End it today, you know,
if you've got a better idea. You know, this whole
idea of sending it back to the states, when the
whole idea is the states are the ones who ask
(48:11):
the federal government for help when they need it when
they can't do it on their own. But we're in
sort of this post federalist world where the federal government
can nationalize the National Guard on a whim, and then
at the same time, you know, they can take away
state resources by getting rid of FEMA. The whole thing
is just beyond fucked up.
Speaker 1 (48:32):
Yes, Kristi Noem's the one who shot her dog, right? I'm
not sure I want to be taking advice from that person.
I feel like her takes may be suspect from the
fact that she decided that the scene where she shoots
her puppy and buries it in a gravel pit was
a good thing to put in her memoir.
Speaker 4 (48:52):
Yeah. On the other hand, she's now the director of
Homeland Security.
Speaker 1 (48:57):
I don't know, she's something. Maybe she's the Secretary of State.
Speaker 4 (49:01):
She wrote that memoir and then got a promotion.
Speaker 1 (49:04):
Yeah wow. I don't know if working in the Trump
administration is really a promotion for her.
Speaker 4 (49:09):
She thinks it's a promotion anyway. I mean, she
certainly has more power now. So I'm just
looking at these quotes. The FEMA thing has not
been a very successful experiment, and he says, that's what
you have governors for. When you have a tornado or
a hurricane. They're supposed to fix those problems.
Speaker 1 (49:25):
Yeah, FEMA hasn't prevented any hurricanes.
Speaker 4 (49:29):
It's true.
Speaker 1 (49:31):
It has not prevented a single hurricane.
Speaker 4 (49:34):
That's because the Jews are more powerful than FEMA.
Speaker 1 (49:36):
Look, man, you said it, I didn't. I'm just agreeing.
But you know, I certainly don't control the weather.
Speaker 4 (49:42):
I'm just saying, books don't just appear on the New York
Times bestseller list.
Speaker 1 (49:46):
Molly, that's right, we use the weather machine. Thank you,
Andy Levy, Will you come back?
Speaker 5 (49:52):
No?
Speaker 1 (49:53):
Excellent. That's it for this episode of Fast Politics. Tune
in every Monday, Wednesday, Thursday and Saturday to hear the
best minds in politics make sense of all this chaos.
If you enjoy this podcast, please send it to a
friend and keep the conversation going. Thanks for listening.