
July 12, 2025 34 mins

ProPublica’s Megan Rose details how the FDA is failing to properly police generic drugs in your medicine cabinet. Wired’s Steven Levy examines the big tech executives now working within the U.S. Army.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hi, I'm Molly Jong-Fast and this is Fast Politics,
where we discuss the top political headlines with some of
today's best minds. We're on vacation, but that doesn't mean
we don't have a great show for you today. Wired's
own Steven Levy stops by to tell us about the
big tech executives who are now working in the US Army.

(00:24):
Kind of scary. But first we have ProPublica's own
Megan Rose on how the FDA is not properly policing
the generic drugs in your medicine cabinet. Also scary. Welcome
to Fast Politics, Megan. Thank you for having me. We're
so excited to have you. You focus on drugs, regulation

(00:48):
of drugs, medical healthcare, medical stuff, et cetera. Right, and you wrote
a piece called Seven Things to Know About ProPublica's
Investigation of the FDA's Secret Gamble on Generic Drugs. So
explain to us what is happening at the FDA.

Speaker 2 (01:06):
We found that for more than a decade, the FDA
has banned substandard drug-making factories in India and other places
overseas from sending drugs to the US. Essentially, the agency
said this factory is so bad, we can't accept any
of the drugs it makes. But then the agency has
quietly given those same factories a special path to

(01:28):
keep sending certain drugs here, and they really told nobody
about it. We found this happened with more than one
hundred and fifty drugs or their key ingredients, and it
affects a ton of Americans, because we're talking about drugs
for cancer, depression, epilepsy, Lou Gehrig's disease. It really runs the
whole gamut. We're getting these drugs from places

(01:48):
that we ordinarily wouldn't trust to make our medicines.

Speaker 1 (01:51):
So they have decided now to get drugs from there,
even though they wouldn't before.

Speaker 2 (01:57):
Yeah, so the FDA goes over to factories that
are making drugs coming into the US and does
inspections to make sure they're up to par. And there
are times when these factories are really doing a poor job.
The FDA says, okay, nope, we're done, we are banning you.
And normally that would mean they couldn't send anything, but
then they made these choices to say, well, we need

(02:20):
XYZ drug. So even though you're so bad we're banning
everything else, we're going to let you keep making and
sending these particular drugs to us.

Speaker 1 (02:28):
So a secretive group inside the FDA, right, exempted these
medications from the import bans. They sort of said it
was to protect from drug shortages, but really, who the
fuck knows? And I just want to read this because
I think this is important. The FDA allowed into the United
States at least one hundred and fifty drugs or their ingredients,
and these banned factories were found to have mold, foul water,

(02:51):
dirty labs, or fraudulent testing protocols. Nearly all came from
factories in India.

Speaker 2 (02:57):
Yeah, so the problems that the inspectors find at these
factories are pretty gross. They range from pigeon feces on
boxes of ingredients going into batches of drugs, to glass
getting mixed into injectables, like shots that are going to
go into your body, which could kill you. Or
there are lots of problems where they manipulate testing records,

(03:20):
and there are all kinds of ways in which they do it,
but the general outcome is that they make it appear
as if every drug that they have tested for quality
has passed, when in fact many, many batches have failed.
But what they send to the FDA is, oh look,
this is all good, and here are these drugs we're sending.
So a lot of the drugs that we are getting
from these factories are called sterile injectables. I know that's

(03:42):
not a term everybody knows. These are shots
that go directly into a person's veins or their muscles,
so the drug is in the bloodstream really fast. This
type of medication has to be free of
contaminants and be really pure. It could harm you, it
could kill you, like the glass in the injectables. So
there are a lot of rules for factories about how
they need to make these drugs so they're safe. So

(04:02):
picture workers on production lines in gowns and gloves. You know,
there are restrictions on how they can touch things and
when machines need to be cleaned, like very precise instructions
to ensure sterility. But these rules are also the bare minimum.
So when an FDA inspector goes into a factory,
that's what they're looking for. They're checking to see if
the very basic safety standards are met. So that

(04:24):
means that these factories that have been banned have failed
to meet even the lowest bar. And import bans
don't happen often. This is something the agency reserves for
the worst of the worst. So right now, patients here
in this country are going in for surgery, they're getting
treated for cancer, and they're unknowingly being injected with a
drug that's made at one of those factories.

Speaker 1 (04:45):
Jesus Christ, and this is like the good FDA. Like, this
is probably Biden's FDA. Imagine what this FDA is going
to be like.

Speaker 3 (04:53):
Now.

Speaker 1 (04:54):
This group that.

Speaker 2 (04:55):
makes these choices is the drug division inside the FDA,
and they just had massive cuts to their staff. They
were already pretty under-resourced, so it'll be interesting to
see going forward what they are able to do and
not do.

Speaker 1 (05:07):
So, I mean, it's going to be ten times worse. Right,
those are the fears. I mean.

Speaker 2 (05:13):
We talked to Janet Woodcock. She was head of the
drug division of the FDA for almost twenty-five years,
and she's very concerned about all the slashes, because she
was constantly asking for more resources throughout her tenure and
the FDA never really got them. She and former commissioners are really
concerned about whether the FDA is going to be able to do
its job. Beforehand, they had so much to do, approving drugs, reviewing drugs,

(05:36):
tracking harm, inspecting factories, and now they'll have less staff
and less money to do it.

Speaker 1 (05:41):
Can you explain to us why you think this has
not gotten more attention?

Speaker 2 (05:47):
When the FDA made the choice to exempt these drugs,
they did so very secretively, and since it was about
six to eight people in the room making
the decision, they never told Congress, they never told the public.
It was just kept very under wraps, so people didn't
know about it. And if you go on the FDA website,

(06:07):
like if you know enough to say, I'm going to go
see what factories have been banned from making drugs,
you can go look up an import alert on the
FDA's website. It's full of jargon and random codes that
you would then need a dictionary to figure out what
they mean. And then buried in there is a
line that talks about products that are exempt from the ban,

(06:29):
except they don't use that language. You have to
decode the language to get there. And then you would
see the list of the drugs, buried
in this massive document you have to scroll through,
so you would never find it unless you were really
looking for it. And they didn't bother to tell anybody.
You know, it's interesting. They have been doing this since
at least twenty twelve, so it's been a long time.

Speaker 1 (06:50):
Yeah. And has no one ever died from these drugs,
or do we just not know?

Speaker 2 (06:55):
Harm is really hard to track. There are exceptions,
like bupropion, an antidepressant: if it
has a problem, it'll smell like rotten eggs,
so that's really obvious. You open your pill bottle
and you get hit with the smell. But
a lot of drugs don't have such an obvious flaw,
so it's hard to know that, like, I'm having an

(07:18):
issue because I took this drug. So it's really underreported.
But the FDA has a database where they collect complaints.
So if you're a healthcare provider, you're a pharmacist, or
a patient and you suspect your drug has harmed you
in some way, you can file a complaint. And there
are lots of reports about these exempted drugs. So we
looked at two companies alone who are currently under import ban,

(07:40):
and they have three factories that are banned with exempted drugs,
and we found six hundred complaints just from those factories
alone about the exempted drugs, including around seventy hospitalizations and deaths. So,
even though causation is very hard to prove,
there are patterns there that the FDA could have been
looking at to decide if these drugs were really worth the risk.

Speaker 1 (08:00):
I mean, how do you get to a situation like this.
Clearly the federal government is not doing its job.

Speaker 2 (08:07):
We have had very serious drug shortages for a long time,
and nobody has seemed to really dive into wanting to
fix that problem. And granted, it's complicated. The drug
supply is complicated. Drug shortages really started in twenty
ten, twenty eleven, when they peaked. Why? It's an interesting question.

(08:27):
There were closures of some domestic drug-making plants.
There were some critical ingredients in short supply.
It was kind of a
combination of factors that led to these shortages, including FDA enforcement.
So if an FDA inspector goes to a factory and
shuts them down, or if it's a foreign factory, bans them from

(08:50):
sending their drugs, well, we're out of that supply. So
it's kind of like, well, if the FDA does its job
and says these factories can't make drugs, then
we have fewer drugs, and so it's kind of this
self-defeating cycle. So when these drug shortages happened,
Congress was pissed, the public was pissed, it got a
lot of attention. The FDA kind of switched what it was

(09:11):
doing to really focus on drug shortages first, and so
at that point, when they decided to ban a factory,
the agency would first look to see if it would cause
a drug shortage. So if we cut off this factory
from sending drugs here, would there still be enough drugs
for all the patients that need them? For many of
these factories, the FDA says that the answer was no,
although we have no way to fact-check that, because

(09:33):
they won't release shortage data. They don't tell us
what their metrics are for making these decisions, so we
just kind of have to trust them that this would
actually have caused a shortage. But you know, the FDA
told us that nobody wanted to carve these drugs out
of an import ban, but they had to balance the
risks to a patient of having no drugs at all or
having one that's poorly made. But again, the agency

(09:55):
didn't raise the alarms to the public or to Congress
and say, hey, we're facing this terrible choice
because of a drug shortage. Since they've been
making these decisions since at least twenty twelve, that's more
than a decade of lost opportunity to change the equation,
because the FDA stayed silent. So nobody over these years
has had a chance to say, okay, we've ended

(10:16):
up here where we're making this choice between no
drugs or bad drugs, what do we need to do to
fix the system? And you're right, that's a failure of
government all around, because it would take Congress,
it would take the White House, and it
would first take the FDA saying, hey, we have this problem,
we need some help dealing with it.

Speaker 1 (10:34):
Another part of the story is that there's a secret
panel of doctors working in the FDA, right, I mean
are they doctors? Who are these people and how did
they get this job?

Speaker 2 (10:46):
So the FDA has always had a drug shortage staff,
and they're not doctors, although some of them are
healthcare professionals in some way. And I think in the beginning,
before twenty ten, it wasn't a very glamorous job.
They were kind of on the outskirts of the FDA.

(11:06):
And then the shortages happened, and Janet Woodcock, who
led the drug division of the
agency for a very long time, elevated them to
work directly out of her office. And so all of a sudden,
this small crew of people were getting a seat at
the table in all of these decisions that were normally
made by specialists in compliance. So if you were an inspector

(11:27):
reviewing an inspection and deciding what should happen to
a troubled factory, you would make that choice. But now,
all of a sudden, drug shortage people are at the table,
and they're also saying, well, maybe we can't do that
because you're going to cause a shortage. So let's figure out
how these products, which are supposed to be sterile but
aren't being made in the only way that keeps them sterile, can

(11:48):
still come to this country.

Speaker 1 (11:49):
There's no real accountability, right? There's no congressional oversight of
the FDA. Is there or is there not, and why
isn't there, if there isn't?

Speaker 2 (11:58):
Isn't, There is lots of compression oversight of the FDA,
but it's a huge agency overseas, you know, food, medical devices, drugs,
and if the FDA never tells them about something, they
don't know what's happening. And it's interesting there was when
the drug shortages first really started to hit crisis level,

(12:19):
Congress was so mad that it said, listen, you guys
need to send us a report every year that tells
us exactly what you're doing to stop drug shortages, including
the ways in which you're using what the FDA calls
regulatory flexibility, meaning they can kind of fudge their
rules a bit to allow these bad drugs to come in.
But the FDA never mentioned this practice to Congress. So

(12:41):
for years they were telling them all these things they
were doing and never saying, oh, by the way, we're
bringing in drugs from banned factories. Not until their last
report, in twenty twenty-four, did they tell Congress, in
a tiny little footnote, that making exemptions, or what
they call carve-outs from import bans, is one way in
which they're dealing with shortages. So this is the first

(13:03):
time Congress has learned of it, and how many lawmakers
have time to read their footnotes.

Speaker 1 (13:08):
Is Congress ready to start holding hearings on this or
do they not know about it still?

Speaker 2 (13:14):
There is a committee that is very
interested in the FDA and has been dogging them for
the last couple of years, particularly about foreign inspections, because
the FDA was not doing them very often. A
huge percentage of our drugs are generic
meds that come from India, but these factories were
not getting inspected, so the FDA didn't have any eyes

(13:36):
on them. Congress was really upset about that, and
they have been looking into that issue quite a bit.
So it'll be interesting to see if this
import ban practice comes up the next time
they call the FDA to the Hill.

Speaker 1 (13:50):
So this really is a generic drug problem, like a
brand-name drug won't have this problem.

Speaker 2 (13:57):
For the most part. Now, the big brand-name
drugs have their own problems at their factories,
but those companies have substantially more money, so it's less of
an issue. But what we're talking about here are
all generic meds, which are about ninety percent of the
drugs that we all take. It's very rare to be
on a brand these days unless it's the only option, like

(14:18):
when it's still under patent. Otherwise your insurance company says, okay,
now you need to be on the cheaper generic. And
about a fifth of
the world's generic factories are in India,
and they get a lot of their ingredients from China.
And that's the other aspect of this story: those
one hundred and fifty drugs also include the

(14:39):
key ingredients. So these are factories that don't make
a finished drug. You don't take
a pill that they make, but they make the
crucial ingredient that other drug makers are going to put
into their pills. And those ingredients are coming from these banned factories
and going into an untold number of drugs; we don't know which ones.
So we have a whole list of these exempted drugs,

(15:01):
but it wouldn't cover everything, because you could be taking
a drug from a manufacturer who bought their ingredient from
one of these banned factories, and that just
multiplies the problem.

Speaker 1 (15:11):
Oh yeah, yeah. How will the FDA change under the
Trump administration?

Speaker 2 (15:16):
Well, I think we're still seeing how that is going
to play out. But the Trump administration and Robert Kennedy
Junior have thoughts about really slashing and burning the agency.
We've seen so many cuts. They're talking about less oversight
of drugs, faster review of applications. So if you want
to make a generic drug for Americans, you have to

(15:37):
give the FDA an application saying, here's our formula
for the drug, here's how we're going to make it,
here's where we're going to make it, and then the
FDA has to approve it. And part of that approval
is making sure that they can make this drug in
a quality way. And sometimes that includes an inspection and
sometimes it doesn't; it's risk-based. Again, the FDA
won't really tell us how they make those choices. But

(15:59):
if you are speeding up that process, there's going
to be less review happening. So if you
cut people and you require them to move faster,
what is going to fall through the cracks? What things
are going to be approved that maybe shouldn't be? And that's
an important question, because we found that the FDA was
giving its stamp of approval to known bad actors. So

(16:21):
companies that had been in trouble for years, where inspectors
have been finding violations again and again, and yet all
the while the FDA is approving these companies to make
more drugs. So the agency itself, even, you know, staffed
pre-Trump, was not talking to itself in a
way where they were saying, well, wait, what is
the history of this company? Can we really trust them

(16:41):
to make the drugs in a safe, quality way? And
as we've found out, quite often the answer is no.
You know, one inspector told us that we let them
grow to be monsters and now we can't go back.
So if you have fewer people doing the same amount
of work, what's going to happen next?

Speaker 1 (16:59):
I never want to take any drugs ever again. Thank you,
thank you, Megan.

Speaker 2 (17:04):
Thank you so much for having me.

Speaker 1 (17:10):
Steven Levy is a columnist for Wired. Welcome to Fast Politics,
Steven Levy. It's great to be here. We are talking
about perhaps one of the scariest things that I have
stumbled on, a story about tech executives.
Meta's CTO and leaders from OpenAI and

(17:36):
Palantir have joined a detachment intended to make the US
armed forces leaner, smarter, and more lethal.

Speaker 3 (17:46):
Discuss. Yeah, well, this is Detachment two oh one. This
is something that has been in the works for about
a year, maybe more, from someone who is the Chief Talent
Management Officer of the Pentagon, which I learned is actually
a post, someone who was a combat soldier and officer,

(18:09):
then went to Walmart and helped them with veterans, and
then he went back to the Pentagon and came
up with this idea, talking to the Palantir CTO, to
get executives from Silicon Valley and make them soldiers, at
least in the Army Reserve, so they can continue this
trend that's been going on for a couple of years

(18:31):
really of Silicon Valley companies being more willing to work
with the military and for the military in applying their
advanced technology.

Speaker 1 (18:41):
Advanced technology sounds good, right? You know,
I was a person who toured the Tesla factory and thought,
this guy's got it. This was before he became the Elon
Musk that we all know now, but I thought, this
is pretty good, this looks good, many years before he
became what he is. But I just am curious,

(19:02):
why is this bad? Because it clearly is. Can we
go back and talk about what Palantir is up to?
Because Palantir is very involved with our military, and I
wondered if you could just sort of give us a
little background on that.

Speaker 3 (19:17):
Okay, yeah, well, I haven't done, you know, a giant
investigation into Palantir, but it's pretty well known. They're a
company which has been very effective selling technology. It's data
mining, to get information from huge caches of data,
a lot of personal information, you know, to help improve security.

(19:40):
They're increasingly getting into AI, and it's a company that
thrives on government contracting and is, you know, super out front
in saying, we're not going to apologize
for it; if this is scary to some people, it's
absolutely necessary for national security. And there are a lot of
critics of Palantir who feel, you know, they sell to

(20:01):
police departments and things like that, and it's sort of enabling
a surveillance state. So that debate goes on. And I
think the larger debate we're talking about here is, as
you say, gee, it sounds important, and it is important
that our military is up to date. The fact
is, they've

(20:27):
traditionally worked with these big defense contractors that do these
big contracts, and they come up with these white elephants that
cost billions of dollars and hardly work. It's a good idea
to apply the Silicon Valley methodology to this. You know,
you see what happened when Ukraine attacked Russia with these
drones, a really innovative scheme where, you know, they

(20:49):
parked these trucks or whatever and they just
opened up like Trojan horses and sent drones out.
That's, you know, almost like a hacker kind of
scheme, and you want a military to be able to do that. I
think what's different about this case is there's a
line between saying, okay, we're getting the expertise from Silicon Valley,
and saying we're putting Silicon Valley people in the military,

(21:14):
you know, getting them to actually join the military. And this
program lets them into the military and gives them
special privileges that most Army reservists, and certainly
full-time soldiers, don't have. They're not going to get
sent to a battlefield. Maybe they'll visit a war zone, who knows. But their
maybe they visit a war zone who knows. But their

(21:37):
unit's not going to get called up. They are a
separate unit in and of themselves. So, you know, it's not
like saying, I'm in this Army Reserve unit, I'm living
my life with my wife and my
kids and my job, or my partner,
and bang, I'm sent out to, you know, Fallujah,

(21:57):
or some war zone. No, their unit's not going
to get called up, because they say that's not what
it's for. They aren't required to spend all one hundred
and twenty annual hours working on a base
like most reservists do. They can work from home, and
they start off as lieutenant colonels, which is a very
high rank, and it's very unusual to start out at

(22:21):
that rank. And the Army told me, well, yeah, here
are these examples from World War One and World War Two,
but they're not quite the same. So it's a different thing,
which I think gets our attention, because they're still working
for these companies, some of whom are working on
military contracts. So there's this potential conflict of interest that

(22:43):
the Army acknowledges. They say, yeah, we're doing this
in a personal capacity, but it's tough to separate that
from the job they work at every day.

Speaker 1 (22:50):
Can you explain what data mining is, just for those
of us who are not completely versed in it? Because I think it's
important just to explain what that exactly looks like.

Speaker 3 (23:02):
Sure. You know, look, it's something that
Meta certainly does, and national security forces
do it in national defense, like when you fly.
It's taking a lot of information.
It's also used in a business sense, but
what we're talking about here is personal information. And because you're

(23:23):
able to accumulate it, there are things that people
can learn about you, because they have a lot of information,
which isn't necessarily the kind of information you shared.
There might be something that even people close to you
don't know. But because they have this kind of database
and that kind of database, and they know what you buy,
and they know where you went to school and where

(23:46):
you lived and what your critical information is, they know
a lot about you and can make conclusions about you
that are very, very sophisticated. And you can
see this just in the way Meta uses
the like button. A researcher discovered some years ago,
and I wrote about this in the book I wrote about Facebook,
that with fewer than a dozen likes, they can

(24:07):
know your sexual orientation and political leanings. With like fifty,
they'll know you as well as a friend. With three
hundred likes, they'll know you as well as your partner or
spouse might know you. So that's pretty useful information. And
Palantir is an expert at that, and at using it for
national security; various intelligence forces use it for the national

(24:29):
security.

Speaker 1 (24:31):
That is completely insane and terrifying. I mean, yikes. So you
have these guys, Meta, Palantir, other technology companies, involved in
the military. Sort of, where is this going wrong?

Speaker 3 (24:48):
Well, again, you know, this is a new program,
and who knows what's going to happen with it.
I do question a couple of things,
and I talked to the Army about this.
One was, why do you need to have
them in the Army Reserve, right? Why can't they

(25:09):
do this as consultants? And I was told that there's
something special and unique about them being in a
uniform to do this. So I talked to one of
the new lieutenant colonels, the fellow from OpenAI, who
I've known for years, and I like
the guy, and he was saying, well, you know, because
we're in a uniform, we'd be able to connect on that.

(25:31):
I don't know that that's necessarily the case. I mean,
if someone gives you great advice, you're going to take
that advice. And it could just as
well be that the people serving in the rank
and file might resent these people who come in there
at a high rank in the uniform. They didn't
have to go to basic training, and again,
they're not called up to a battlefield.

(25:52):
And people might say, geez, you know, these guys took a
shortcut to get where they are. So we don't know
how that's going to work. After talking to them, I'm
not convinced that this couldn't have been done with
a regular consultancy sort of arrangement.

Speaker 1 (26:09):
But that is not the only problem here, right?

Speaker 3 (26:13):
Actually, the biggest problem, I'd say, is one
that comes from the future of the technology specifically. So right now,
OpenAI, Meta, other companies, you know, Google, Microsoft, Anthropic,
they're creating this technology that they call superintelligence, AI that

(26:39):
could do anything. And there's a question that they talk
about, that this could present an existential threat to humanity,
and they say that they've vowed
to make it safe. And one of the things that
these companies say, OpenAI specifically says this and Anthropic says
the same thing, is that we don't want our AI to

(27:04):
do harm to people, and we don't want our AI
to be used to create weaponry. Okay, so that
seems good, right? That seems great, and, I'd
tell you, it seems necessary.

Speaker 1 (27:19):
Because we all don't want to die.

Speaker 3 (27:23):
We'd hope. If you accept their premise that this
is a world-shattering technology that is going to change everything,
and not everyone does, then you say, well,
let's build it in a way where there are these guardrails
that it can't jump, so that even if the AI figures out
that to get to that goal,

(27:44):
you know, some humans have to go, it can't do that,
right? Okay. Here are these people who are now working
for the military whose goal is the opposite. They've said
that you're there to increase lethality. So what the military
wants to do is weaponize AI to do exactly what

(28:06):
the companies are telling us should never be done. So
here are these folks who are, on one hand, sitting in a
conference room saying, gee, how do we make sure our
AI will never be used as a weapon, and then
go to meet with their military compadres and say, gee,

(28:27):
how do we make AI more deadly?

Speaker 1 (28:28):
I mean, it seems like a recipe for disaster with
all of AI.

Speaker 3 (28:33):
We're in unknown territory here. We don't know how this
is going to shake out. But you know, it would
seem that this was something you'd have to give
thought to before you had people who were working for
the companies building advanced AI then also working to use

(28:53):
it in a way that is going to harm human beings,
which is against the you know, stated ethics of the
companies they work for.

Speaker 1 (29:01):
I would love you to explain to us what the
upside would be here.

Speaker 3 (29:06):
Again, as you pointed out, we need this kind of
smarts and this kind of technology. We have to have
our military be more nimble. We can't be spending
billions of dollars on things that don't work, you know,
white elephants, and there's a smarter way to do this.
There's a company called Anduril, formed by this
guy named Palmer Luckey, who incidentally was fired

(29:28):
from Meta because he supported Donald Trump when that wasn't
cool at Meta, and he loved the military.

Speaker 1 (29:35):
Right now, that's very cool at Meta.

Speaker 3 (29:38):
Yes. As a matter of fact, Mark Zuckerberg fired him,
and just in the last few weeks,
Zuckerberg said, we're going to partner with him, the
same guy, and there's a picture of them smiling together.
Zuckerberg fired him for the things that
Zuckerberg is now doing himself, which is supporting Trump and

(29:58):
you know, becoming a military fanboy. So yeah, so
we've got that. But back to Anduril. You know, they're
making the Army smarter. They're making our soldiers
safer by giving a better view of the battlefield,
and really moving us into the modern age.

(30:19):
You know, the question is, with AI, are we building
a technology with a new level of destruction? Is it
like the next nuclear power, which, fortunately, we've been
careful not to use since we dropped the bomb in
nineteen forty-five, right? The scariest thing that I heard
researching this story is from the spokesperson for the detachment, or

(30:44):
actually it was the talent officer, who said, when he was
conceiving the plan, yeah, we wanted to create
an Oppenheimer-like situation. And I'm thinking, J. Robert Oppenheimer
created the atom bomb. Maybe we don't want an
Oppenheimer-like situation.

Speaker 1 (30:57):
Yeah. What, did they not know who Oppenheimer was?

Speaker 3 (31:01):
I think he was referring to the idea that the
Manhattan Project used America's you know best.

Speaker 1 (31:07):
Scientists, yeah.

Speaker 3 (31:10):
To help fight, you know, a quote good war, right?
But it has implications, considering what AI might
be, that perhaps he didn't intend.

Speaker 1 (31:20):
Are you shocked at how little tech regulation Congress has done.

Speaker 3 (31:24):
It's one of those cases, and you hear this expression
a lot: not surprised but shocked, you know,
right, shocked. You know, a couple of years ago, Sam
Altman went to Congress, the CEO of OpenAI, and said,
regulate us, and everyone in the industry was saying, we
need this regulation, this stuff is so powerful. And
now they're not saying that so much.

Speaker 1 (31:45):
Because they lost interest, or because...

Speaker 3 (31:47):
Because now they see they don't have to be regulated, because
the Trump administration has said, you know, J.D. Vance went
to a big AI conference and said, you know, I'm
not concerned with AI safety. I'm concerned with competitiveness. I'm
concerned the United States has to get AGI before China
gets it, and we'll speed ahead, right. And

(32:09):
the companies didn't say whoa, hold on here, we
need regulation like they did a couple of years ago.
They're saying, Okay, yeah, let's go faster. And you know,
as a matter of fact, now that we're doing this,
why don't we classify all this copyrighted material we're using
to train our models, which we're being sued for? Why
don't we just classify it? We can do that and get
rid of these lawsuits, because we have to be competitive.

Speaker 1 (32:31):
I was at a party and a member of Congress
was complaining about how there's no fact checking on the internet,
and I was like, you know, you could have. Is
there any world in which we see members of the
government deciding to regulate, I mean, do something? So here's

(32:52):
my anxiety. We get so far down this road that
we can't go back. There are no fail safes, there are no,
you know, by the time they realize they have to
regulate, it's too late, you know.

Speaker 3 (33:04):
I did a story about Anthropic, one of the big
AI companies, and that's a company that's going out of
its way to brand itself as we're the ones most
interested in safety and worried about these long term implications,
and we're going to do it right, and we're going
to have what they call a race to the top

(33:24):
by showing it could be done, you know, safely. Other
companies will emulate their methods. But even Anthropic says, you know, gee,
the timeline for AGI kind of comes sooner than the
timeline for our being able to control AI, right. And

(33:45):
I said, you know, so they are the one company
that's saying the right kind of regulation might
be necessary. And I asked the CEO, Dario Amodei,
do you think we're going to need some sort of
AI Pearl Harbor before we really act on this? And
he said, yes, I don't know what that looks like,
but it's bad.

Speaker 1 (34:05):
Oh, good. All right, well, I plan to never sleep again.
Thank you, thank you, thank you, Steven.

Speaker 3 (34:11):
Thank you, Molly. I enjoyed it.

Speaker 1 (34:13):
That's it for this episode of Fast Politics. Tune in
every Monday, Wednesday, Thursday and Saturday to hear the best
minds in politics make sense of all this chaos. If
you enjoy this podcast, please send it to a friend
and keep the conversation going. Thanks for listening.
Host

Molly Jong-Fast
