
June 3, 2025 41 mins

Jon Stewart tracks Elon Musk's White House crash, from the high of being Trump's "first buddy" to the low of his black-eyed DOGE send-off. Now that the 100-day honeymoon is over, Jon also checks in on Trump's other struggling cabinet members, like the FBI's burned-out deputy director, Dan Bongino.

Carole Cadwalladr, the award-winning journalist behind the Substack newsletter “How to Survive the Broligarchy,” talks to Jon about how the U.S. government ignored the huge wake-up call that was the Cambridge Analytica-Facebook data breach scandal – a story Cadwalladr broke and which resulted in no legislative protections for citizens’ private data. She warns about the unregulated dangers that data-mining and AI pose to individual privacy and freedom, and what people and institutions can do to push back on big tech’s authoritarian agenda.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
You're listening to Comedy Central.

Speaker 2 (00:07):
From the most trusted journalists at Comedy Central, it's America's only source for news.

Speaker 1 (00:14):
This is The Daily Show with your host, Jon Stewart!

(00:45):
Welcome to The Daily Show! My name is Jon Stewart. We got a show for you tonight. Carole Cadwalladr is going to be joining us. She's a journalist. She's going to discuss with us the tech broligarchy and how the entirety of mankind, entirety, will be enslaved by a handful of misanthropic data hoarders. Spoiler alert: we don't make it. Speaking of misanthropic data hoarders,

(01:12):
DOGE has finally rooted out one of America's least efficient government workers and marked him for dismissal.

Speaker 3 (01:35):
Elon Musk is no longer a special government employee.

Speaker 4 (01:38):
Friday was the billionaire's last day in charge of the
Department of Government Efficiency.

Speaker 1 (01:44):
He's leaving. He's leaving his job to make more family with his time. I think he just has a mail order sperm farm. No, I don't know how he does it. I'm actually, I'm starting to feel bad

(02:04):
for this guy. Look at him. He's been there four months. Look at poor Elon. It only took four months to go from this.

Speaker 5 (02:15):
To this.

Speaker 1 (02:19):
Guy. He went from tech titan given a mandate to move fast and crush the deep state to guy who had a bad night in Nashville that he can't remember. Yeah, he's got that look like, has anyone seen my shoes?

(02:39):
He's beaten down. He's got that look on his face that I imagine his employees normally have: black eye, thousand-yard stare. This dude has seen some shit. I'd at least like to know how that happened.

Speaker 6 (02:57):
What is your eye?

Speaker 7 (02:58):
Okay, what happened to you?

Speaker 1 (02:59):
Your eye? Was this a bruise or something?

Speaker 8 (03:01):
No, I was just horsing around with little X, and I said, go ahead, punch me in the face, and he did.

Speaker 1 (03:16):
So you're not gonna tell us what happened? Do you need a safe place to stay? Look, I believe these things sometimes do happen when you're roughhousing with your kids. But I'm also sure the one sentence no parent has ever uttered to their child is, go ahead, punch me in the face. But yeah, Elon spent three hundred million dollars

(03:41):
of his own money to get Trump elected, irreparably damaged his personal brand and almost all of his businesses, and is clearly suffering some kind of issue. But don't worry. Trump made sure that Elon got something in return. President Trump heaped praise on the tech titan, presenting him with a golden key.

Speaker 9 (03:57):
And I gave him a little special something.

Speaker 1 (03:59):
We have here, a.

Speaker 9 (04:01):
Very special that I give to very special people.

Speaker 1 (04:05):
I have given it to some, but it goes to very special.

Speaker 9 (04:08):
People, and I thought I'd give it to Elon as
a presentation from our country.

Speaker 1 (04:14):
Thank you, Elon. You couldn't just give him the fucking key. You had to make sure that everybody knows you give them to a lot of people. It's just not that special, you know? I got a bunch of these. I give them to special people. Who's the guy who brings me my Diet Coke? I give him one for every Diet Coke. Anyway,

(04:39):
enjoy your useless key. There was no need for Trump and Elon to commemorate this epic fail with this embarrassing display of theater. Look at these guys, pretending like this is some kind of celebratory, victorious send-off for a job well done. Jesus, look, even Lincoln is looking down. You're not gonna like it, Lincoln. Even Lincoln is looking down, going,

(05:11):
this is the most tedious performance I've ever had to sit through. Even Lincoln can't take it. Somebody Booth me, I don't like this. Too soon? Is that too soon?

(05:37):
And Lincoln, for God's sake. With all the shit going on in the world, I was not expecting the audience to be like, aw, poor Lincoln just wanted to see a play.

(05:58):
Of course, Lincoln wasn't the only one seemingly dissociating in the Oval Office. Of course, there might have been an explanation for that behavior as well. The New York Times reports that Musk allegedly used drugs far more than previously known. Look, whether Elon was using drugs on the job or not,

(06:21):
I have no idea. I do know one thing about
the television industry, though, especially the news industry, and that
is whatever unusual images we have of Elon's enthusiastic time
in Washington, DC, those images will now be repurposed and

(06:42):
given a slightly different meaning in context, almost comically so. Inside Edition, do your worst.

Speaker 10 (06:52):
Musk's departure comes as a jaw-dropping New York Times report claims he was taking a cocktail of drugs while on the campaign trail. According to the Times exposé, Musk's erratic behavior, including waving a chainsaw around and that notorious Nazi-like stiff arm, can be attributed to a daily mix of ketamine, ecstasy,

(07:15):
psychedelic mushrooms, and Adderall. They claim he traveled with a daily medication box that holds twenty pills.

Speaker 1 (07:30):
Why you gotta do him dirty that way, brother? Go ahead, come on, Inside Edition: inauguration party enthusiasm, or plea for madness? And you might be saying, well, look, who amongst us hasn't unwound sometimes with a little mixture

(07:50):
of ecstasy, mushrooms, ketamine, and Adderall? What could be the harm?

Speaker 11 (07:58):
He told people he was taking so much ketamine that
it was affecting his bladder.

Speaker 1 (08:13):
He told people that! This dude is a one-man anti-drug campaign. These are your pants. These are your pants on drugs.

Speaker 12 (08:28):
And I do love the fact though that the detail
is he's the one who.

Speaker 1 (08:31):
told them. I mean, come on. That means things were so bad you had to be like, oh, don't worry, it's nothing weird. It's just an overabundance of ketamine. No, obviously,
we on this program would have been delighted to offer an unedited forum for Elon to discuss his journey from hardworking efficiency expert to drug-addled child star mugshot,

(08:54):
but he chose to go in a different direction by
sort of speaking to your theater loving parents' favorite program,
CBS Sunday Morning. This is true. This was his final
Trump administration exit interview.

Speaker 13 (09:08):
I noticed that all of your businesses involve a lot
of components, a lot of parts. Do the tariffs and
the trade wars affect any of this?

Speaker 1 (09:15):
You know, tariffs is affecting things a little bit. How revealing! Any follow-up?

Speaker 13 (09:27):
Wondering what your thought is on the ban on foreign
students the proposal. I mean you were one of those kids.

Speaker 14 (09:33):
Right, yeah, I mean I think we want to stick to,
you know, the subject of the day, which is like
spaceships as opposed to you know, presidential policy.

Speaker 13 (09:45):
Oh okay, I was told anything's good.

Speaker 15 (09:49):
But no, well no, look what Trump has reduced this
man to.

Speaker 1 (10:00):
He has broken this poor man.

Speaker 16 (10:04):
Just one interview. Can we just talk about spaceships? I was told we'd talk about spaceships. I was told we would both be wearing helmets and talking about it. Just a simple boy with a set of Star Wars sheets and pillows.

Speaker 15 (10:24):
And.

Speaker 1 (10:26):
I really would just like to talk about... I mean, you can't blame him. He sold us his projects: cutting money from the government, trying to find efficiencies, and sneaking a Trojan horse in the back door and stealing all our data. But Trump is spending two hundred billion more dollars than the previous administration did in this amount of time and creating a deficit-exploding big,

(10:47):
beautiful bill that is the antithesis of everything Musk said he was trying to do. And now he's left softly complaining about it to a guy whose normal news segment is explaining to your grandparents how to download a PDF.

Speaker 3 (11:01):
You know, I was, like, disappointed to see the massive spending bill, frankly, which increases the budget deficit, not decreases it, and undermines the work that the DOGE team is doing. I think a bill can be big, or it can be

Speaker 14 (11:16):
Beautiful, but I don't know if it could be both.

Speaker 1 (11:20):
My personal opinion. No, sir, we will not be body-shaming legislation.

Speaker 12 (11:30):
I'm going to tell you something, and I speak for
all the legislation out there, that in this country, a
bill can be big.

Speaker 1 (11:42):
And beautiful. I promised myself I wasn't going to do this. Be brave. Holy shit. Here's my favorite, by the way,

(12:06):
so this actually is my favorite part of the whole interview. So Elon actually expressed some dissatisfaction with what was happening with the Trump administration. It was a turn of events that stunned the reporter on CBS Sunday Morning, who apparently had no idea that this was being recorded.

Speaker 13 (12:24):
Right after our interview, CBS News posted a clip of
it to promote this very report.

Speaker 1 (12:30):
It was that part where Musk.

Speaker 13 (12:32):
Criticizes Trump's spending bill.

Speaker 1 (12:34):
And his remarks became news. It went all the way
up to the White House.

Speaker 17 (12:45):
Yeah, that's what news does. He's saying that, like, so,
am I in trouble.

Speaker 1 (12:58):
I thought we were just killing time until we got another Patti LuPone apologizes update. I don't like any of this. But let this be a lesson to Elon and anybody in Trump's orbit. Whatever your passionate political beliefs, whatever your ideology is, you will go from reaching for the stars to dissolving in a puddle of your own urine and shame and starting a fight club with your kid just

(13:21):
to be able to feel.

Speaker 5 (13:23):
Because Trump Trump, Trump doesn't believe in anything.

Speaker 1 (13:39):
Man, why were you with him? Because of his commitment to rein in big tech? They used big tech to censor you. They used the deep state to spy on you. And we have to make sure that we are protecting the American people's privacy and data rights. When I'm president, big tech will pay. iTunes will have to agree to

(14:01):
your terms and conditions. When I'm president, traffic lights will have to click on boxes containing pictures of you. CAPTCHA that. So, how's that libertarian paradise vision going for you now?

Speaker 7 (14:25):
The Trump administration is expanding its partnership with Palantir. The company is reportedly going to build a master list of personal information on Americans that could give President Donald Trump unprecedented surveillance power.

Speaker 1 (14:41):
It's never a good sign when the phrases master list and surveillance power are coupled. No one's ever like, I've assembled a master list of puppies you can surveil for boops. But hey, how evil can Palantir be? Pal's in the name. Well, look,

(15:03):
it's not like they're handing all of our data over to some crackpot CEO. Well, let's not judge a book by its cover.

Speaker 9 (15:20):
The most effective way for social change is humiliate your enemy and make them poorer. I don't think in win-lose.

Speaker 1 (15:27):
I think in domination.

Speaker 9 (15:29):
I love the idea of getting a drone and having light fentanyl-laced urine spraying on analysts who tried to screw us.

Speaker 1 (15:42):
Well, let's not judge a book by its insides. Well, I've always said, if there's anyone in the country who should have access to all of my personal data, it's the guy who wants drug-laced urine-spraying drones. Though if he is serious about that, he's going to need a source for drug-laced urine. I might know

(16:04):
a guy. No, no, here's the thing. The reality is this. The reality of Trump is he turns even his most fervent and enthusiastic foot soldiers upside down. Take Dan Bongino: ex-Secret Service agent, Fox News host, and guy who looks like he starts fights at Little League games even though he doesn't have a kid playing. He was very excited

(16:26):
for the Trump era.

Speaker 6 (16:27):
Right now we're in charge and this is how shit
gets done.

Speaker 2 (16:30):
Trump ain't freaking around. We are going to hold every one of these people accountable. It is time for total personnel warfare. Fire one hundred people on day one. Fire one hundred more. Fire a thousand more. Fire

Speaker 1 (16:45):
Everyone.

Speaker 6 (16:45):
We're good at flipping the script on dipshit liberal comedians. You know what's coming?

Speaker 1 (16:50):
What's coming? Does anybody know? Well, that last part is going to make for some good B-roll for an Inside Edition story one day. Anyway, Trump made that guy

(17:10):
deputy director of the FBI, because of course. But look what only three months on the job have done to him. I gave up everything for this. I mean, Kash is there all day.

Speaker 6 (17:22):
We share offices. They're linked. He turns on the faucet, I hear it. He gets in at like six o'clock in the morning. He doesn't leave till seven at night. You know, I'm in there at seven thirty in the morning. You know, he uses the gym. I work out in my apartment. But I stare at these four walls all day in DC.

Speaker 1 (17:39):
Yeah, that's called a job. You have a job. That's what they are. You go in at a specific time, six thirty, seven, around there, to a specific room, mostly four-walled, and you're there all day. It's work. It's a job. Yeah, yeah,

(18:00):
there probably is a dude in there that you hear all fucking day. He turns the water on, you hear it. Hey, look at this guy. He's chewing another sandwich.

Speaker 18 (18:10):
I hate this job. It's annoying. Yeah, it sucks. How do you not know that? For God's sakes, you're on the right. Haven't you even read Dilbert, for God's sake? Work sucks. And how are you just finding out about this now? How is having a job

Speaker 1 (18:31):
Now suddenly destroying everything?

Speaker 6 (18:33):
But I stare at these four walls all day in DC, you know, by myself. Divorced from my wife. Not divorced, but I mean separated, divorced, and it's hard.

Speaker 1 (18:41):
I mean, you know, we love each other and it's hard to be apart. I mean, it's just as hard. I don't like it. It's separated, I mean divorced. I'm not divorced. I was a divorce, but not separate. I'm separate together, but not, of course, alone. But who knows, I mean, some people, they leave for work, guys come over and bang it. I don't know

(19:03):
what's happening. Why can't she go to work? Why can't I just bring... Can I bring my wife to work? Would that be okay? We all miss our wives. What the fuck? The only one who's gonna come out of there unscathed is Press Secretary Karoline Leavitt, because I don't

(19:24):
think she's got any principles in there left to die.

Speaker 19 (19:27):
President Trump is truly the most transparent and accessible president
in American history.

Speaker 1 (19:32):
We have truth on our side at this White House.

Speaker 19 (19:34):
I think everybody in the American public believes it's absurd for anyone to insinuate that this president is profiting off of the presidency. It's frankly ridiculous that anyone in this room would even suggest that President Trump is doing anything for his own benefit.

Speaker 1 (19:49):
That's all he's doing. By the way, I think the more she lies, the bigger her cross gets. Is that possible? It's like some sort of weird Pinocchio cross. The president can't be bought. I'm not even upset with this lady,

(20:12):
because just rolling with the punches is clearly the only strategy for happiness when you're working for Trump. Trump's very open secret has always been he doesn't believe in or care about any policy issue at all. He wants attention, he wants his ego stroked, and he wants money. He wants wads and wads of money. Remember his ninety deals in ninety days? He made them, but only for his family.

(20:32):
Those are the only deals he made. Meanwhile, the world
he said he was going to fix is burning like
so many nuclear capable planes in Siberia. And don't bother
trying to call him on it, because before you can,
he's already moved on to pulling some new crazy thing
out of his ass to distract us.

Speaker 20 (20:50):
President Trump is reposting false claims about former President Joe Biden, saying that Biden was executed in twenty twenty and that since then, clones, body doubles, and robots took Biden's place as president.

Speaker 1 (21:06):
You're saying that the Joe Biden who doesn't even know where he is is actually an incredibly advanced cloned robot? How much ketamine are you on?

Speaker 21 (21:24):
A lot?

Speaker 11 (21:25):
When we come back, Carole Cadwalladr will be joining us. All right. Welcome back to The Daily Show. So

(21:45):
I want to talk about my guest tonight.

Speaker 1 (21:47):
My guest tonight is an award-winning investigative journalist who writes the Substack newsletter How to Survive the Broligarchy. Please welcome to the program,

Speaker 21 (21:55):
Carole Cadwalladr. Thank you for joining me.

Speaker 1 (22:14):
Very surreal. It's very real. I want to introduce you to our audience, who might be unfamiliar with your work. Carole was one of the first journalists to break the Cambridge Analytica data story. This was many years ago, and nobody was writing about it. And give

(22:37):
us a little bit of just that, that backstory of
what happened when you exposed what this group was and
what happened to you.

Speaker 22 (22:45):
Well, basically, Facebook lied repeatedly, you'll be shocked to hear. They said, no, I mean, this company, of course it didn't have our data. This is ridiculous.

Speaker 23 (22:58):
And then guess what.

Speaker 22 (22:59):
It turned out that this company, Cambridge Analytica, did have that data: eighty-seven million people's Facebook data, taken without their consent. And yeah, we broke that story, and it turned out it was grossly illegal, and Facebook got fined a record-breaking five billion dollars by the FTC, one

(23:19):
hundred million by the SEC, hundreds of thousands of pounds in other countries around the world. And guess what? Nobody was ever held to account. So Mark Zuckerberg got away with it scot-free. Nothing actually changed. Nothing changed. Nothing changed.

Speaker 1 (23:35):
Actually, the one person who was put through the wringer during all of this...

Speaker 22 (23:40):
Yeah, that's... yeah. Well, that's the thing, I say. In all of this, we found there was gross law-breaking at every level: by Facebook, by Cambridge Analytica, by the Leave campaigns in Britain, by the Trump campaign. And yeah, only one person got put on trial, and that was me.
Speaker 1 (23:57):
Right. You were put on trial for disclosing this very true thing, and they really tried to destroy you.

Speaker 23 (24:04):
Yeah.

Speaker 22 (24:05):
So, well, it was a particular Brexiteer who came after me.

Speaker 1 (24:08):
And the whole term Brexiteer makes it sound like Disney. I hate that. Yeah, I'm a Brexiteer. You're a Brexiteer. But you're not, thank God, I hope. Are you kidding me? I'm in America. We don't care what happens there. No, but here's the question. Does that story now, with Cambridge

(24:31):
Analytica sort of siphoning the data from Facebook, weaponizing what would get people agitated, and trying to influence those elections, does that almost seem quaint by today's standards?

Speaker 22 (24:44):
I mean, I think it's the blueprint in many ways for what we're seeing now, which is that it was the dream of big data: what could happen if you take millions of people's data, vast quantities of it. Because what Cambridge Analytica did, as well as getting access to all of these people's personal data, and it wasn't just, you know, it's every post they'd ever

(25:06):
put on Facebook, every post they'd ever liked, even their private messages. They combined this with masses and masses of commercially available data.

Speaker 23 (25:15):
And then what they did is.

Speaker 22 (25:16):
They brought these together to create algorithms, which they then used to target people. It was weaponized against them, to send them Facebook ads, to sort of provoke them in a certain way. And that, I think, in many ways, you can now see is the game plan of what we are seeing here, which is that there's a question

(25:38):
about how effective Cambridge Analytica's methodology was. But the dream of it was this sort of big data surveillance engine in which they would know everything about everybody, and they would know how to provoke you, how to sort of touch every single person, and how to manipulate us.
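
To make the pipeline she's describing concrete, here is a toy sketch in Python. Everything in it is invented for illustration (Cambridge Analytica's actual models were never published): harvested platform activity is joined with commercially bought records, each person gets a crude psychographic score, and the message most likely to provoke them is selected.

```python
# Toy sketch of the merge-score-target pipeline described above. All names,
# signals, and the scoring rule are invented; this illustrates the idea,
# not the company's actual code or model.
from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    liked_pages: set      # harvested platform activity
    magazine_subs: set    # commercially bought records

def anxiety_score(p: Profile) -> float:
    """Crude stand-in for a psychographic model: count 'anxious' signals."""
    anxious = {"crime-watch", "survival-gear", "home-security-deals"}
    hits = len(p.liked_pages & anxious) + len(p.magazine_subs & anxious)
    return min(1.0, hits / 3)

def pick_ad(p: Profile) -> str:
    """Send the fear-based message to high scorers, a generic one to the rest."""
    return "fear-based ad" if anxiety_score(p) > 0.5 else "generic ad"

voter = Profile("u123", {"crime-watch", "survival-gear"}, {"home-security-deals"})
print(pick_ad(voter))  # -> fear-based ad
```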

Speaker 1 (25:56):
Here's the strange part of that. Because there is a sense, you think of it in the sense of the consumer, like, I'll be talking to my wife about a certain something, and then the very next thing on Instagram is an advertisement for that very thing, and you think, oh, they've weaponized what we're interested in to get us to buy things. But

(26:17):
this is a very different scenario, in that they've weaponized it to defang democratic processes.

Speaker 23 (26:26):
Yeah, and more.

Speaker 22 (26:27):
I mean, I think what is happening now in America
is absolutely they are building a techno authoritarian surveillance state.
We can see that happening in real time. This is
huge amounts of data on every single person in America
that can and will be used in opaque and unaccountable ways,

(26:53):
and it is terrible.

Speaker 1 (26:54):
But on the plus side... So when you hear about that, from Palantir, are you describing something general, or are you describing exactly, like, Palantir is getting all of our data? I mean, that was what they announced.

Speaker 22 (27:12):
Well, DOGE is the sort of tip of the spear here. So this is Elon Musk's unvetted operatives going into every government department, and they're going in to access the databases of those separate departments. And now what's happening is that this company, Palantir, owned by

(27:35):
Peter Thiel, a very interesting...

Speaker 23 (27:40):
Did you see the adjective? Which adjective shall I go for? We went for interesting.

Speaker 22 (27:48):
This company owned by Peter Thiel, right, is now amassing these different pots of data. It's putting it into one massive database where it's merging them. It's applying AI now to this database.
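
For what "merging the pots of data" means mechanically, here is a minimal sketch, with invented agency names, fields, and identifiers (not Palantir's actual schema or code): separate departmental records keyed by the same identifier get folded into a single dossier per person.

```python
# Minimal sketch of cross-agency record linkage. The agencies, fields, and
# the SSN shown are all invented for illustration.
irs_records = {"123-45-6789": {"income": 52_000}}
ssa_records = {"123-45-6789": {"benefits": "disability"}}
dmv_records = {"123-45-6789": {"address": "19 Elm St"}}

def merge_on_id(*sources: dict) -> dict:
    """Fold every department's record for the same ID into one dossier."""
    merged = {}
    for source in sources:
        for person_id, fields in source.items():
            merged.setdefault(person_id, {}).update(fields)
    return merged

master = merge_on_id(irs_records, ssa_records, dmv_records)
print(master["123-45-6789"])
# -> {'income': 52000, 'benefits': 'disability', 'address': '19 Elm St'}
```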

Speaker 1 (28:06):
And what is the AI's purpose? Is it to sift
through the data to target you for messaging or is
it also to prevent you from accessing government programs?

Speaker 22 (28:18):
I mean, it can be used in so many different ways, and I think that's the difficult thing, which is hard for people to get their heads around. But this is a system of control. This is what other authoritarian countries do.

Speaker 1 (28:29):
And there's no regulation on this.

Speaker 22 (28:30):
By the way, there's no regulation. Basically, you've had since we broke this scandal, it's twenty eighteen, the big story then, and you know, one of the key things about it was: you have no privacy legislation in the US. There is nothing to protect your data. You have no rights here whatsoever, apart from California, which now

Speaker 23 (28:50):
Has a bill.

Speaker 22 (28:51):
So you had all of this time to do something
about it, and you didn't.

Speaker 1 (28:56):
And that doesn't sound like us. Normally, we're quite prompt with this type of activity. I'll go even further. In the big, beautiful bill, there is a segment that says for the next ten years, they are not allowed, yes, to legislate or regulate AI in any way, shape,

(29:17):
or form. It is prohibited by an Act of Congress.
If this goes through, what does that do to this?
It gives them a ten year head start on whatever
it is they want to do with it.

Speaker 22 (29:29):
I mean, it's just, it is the tech, you know, it's the technocratic dream. I mean, this is what these tech bros want. They want to just build their beautiful AIs and do whatever they want in any way that they want, so that they can get to Mars and colonize it, you know, great glorious goals such as that.

Speaker 1 (29:50):
So how does any of this square with their so-called libertarian principles? It's nonsense. If there is centralized control of data run by an opaque algorithm that nobody understands but them... I mean, when they want to make changes, Grok, for like a week, no matter what you asked, it would be like, did you know South African farmers

(30:12):
that are white are being killed? Like, they can do whatever they want. And is the idea we're supposed to just trust that it's in our best interest?

Speaker 23 (30:20):
I mean, it's the literal opposite of libertarian.

Speaker 22 (30:23):
But then that's always the thing. Whatever they say, you've just got to realize it's the exact opposite. So the idea that this is libertarian, when it is a state-controlled machinery to surveil and control every citizen in America, to deny them access to services, to make inferences about them, about anybody in the country, to

(30:44):
label them domestic terrorists based upon, you know, what these different databases throw up. That is authoritarian. It's authoritarian, and, you know, it's also the pathway to fascism. That is what it is.

Speaker 1 (30:58):
And here's the other thing that really bothers me about it. So everything that we're doing to China, the tariffs and everything else, is because they don't play fair. They steal our IP, they steal our... What is AI if not vacuuming up anything that is proprietary, not just about our work, but of our souls? Like, it is

(31:21):
sucking up everything that we are and using it for
whatever they want to do. And what's our recourse to that.

Speaker 22 (31:28):
I mean, it's just it's based upon totally illegal behavior.

Speaker 1 (31:33):
It's just theft.

Speaker 23 (31:35):
It's that you have.

Speaker 22 (31:36):
You know, copyright laws are just property laws. You don't go into people's houses and just take their, you know, their furniture and their, you know, their stereo, and then, you know, flog it on eBay and claim that it belongs to you and keep the profit. I mean, that is literally what they are doing.

(31:56):
And I mean, this is the Silicon Valley model. This is the Silicon Valley model, and this is exactly what's now been transported into government, which is: you break the law first, you see if you can get away with it, and generally you do.

Speaker 1 (32:11):
And do you see that? Is it different in the EU? Do they have a sense of the peril of this, and are they acting more robustly? Is this something where you see the United States as the Wild West? Who is keeping an eye on this? And can we get our own AI to keep an eye on their AI?

Speaker 22 (32:29):
I mean, I think the point is, in Europe the dangers of this are much more recent and much more present. So, you know, the one country in the world which does really understand this is Germany. You know, first it was Nazism and then it was communism, and both of those systems used technology to control and

(32:52):
surveil people.

Speaker 23 (32:54):
And that's why there has been so.

Speaker 1 (32:56):
So is this the Stasi, or is this the KGB? Look, this country is rife with examples of government surveillance: MKUltra, FBI surveillance on Martin Luther King and John F. Kennedy and Robert Kennedy. You know, we are not unfamiliar with those kinds of police-state activities.

(33:20):
Is this of a different piece? Is it just that it's more opaque? Is it more efficient at doing it? Are we seeing something really different than we've seen?

Speaker 22 (33:32):
We are seeing something really different. It's really, really systematic. What's happening there is, already, we are all of us emitting, you know, hundreds of thousands of data points a day, a week, which are, you know, they're all being collected somewhere. And so the idea now is that those can be brought together and merged in some vast database that is going

(33:53):
to profile you, that is going to judge you, that is going to make assumptions about you and...

Speaker 1 (33:57):
Use it for predictive purposes.

Speaker 22 (33:59):
Use because it's now we're laying generative AI on top
of that. I mean, you know how bullshit chat GPT is.

Speaker 23 (34:07):
I mean, it's that.

Speaker 1 (34:09):
What do you mean? I do my own work. What are you saying, that these questions, I just plugged into ChatGPT? But it's just easier, Carole.

Speaker 23 (34:26):
It is, But it's also just pretty dumb. It's stupid.

Speaker 22 (34:29):
It gets things wrong. It hallucinates. It makes up references. And this is the system which is going to be deciding whether you get Medicaid or not.

Speaker 1 (34:38):
Boy, did you just hit on, I think, such a crucial point there, beyond even the more sort of dystopian visions of where they're going to be taking this. It's the practical application of a technology that is wildly fallible. You know, they just put out the MAHA Report, and apparently there are seven studies in it

(35:00):
that don't exist, and nobody thought, like, maybe we should check.

Speaker 22 (35:03):
And now, you know, you know the whole thing of, like, trying to get through to customer services at any of these tech companies. If you've ever had, you know, your Facebook account hacked or your Twitter account taken, there's no... you can't get through to anybody. Now imagine that that is you trying to get hold of your, whatever

(35:24):
it is, your benefits or, you know, your Social Security. Exactly. The computer has just said no. You've got no idea why, or what that's based on. And then, like, how do you challenge that? What do you do? I mean, that is the reality that is absolutely going to be faced. You know, millions of Americans are going to be facing it.

Speaker 1 (35:44):
Right. You'll try to make contact, and it won't be anybody. It'll just be, for this, press one. But nobody will ever come on.

Speaker 22 (35:51):
Exactly, exactly. And it's likely, you know, if you get turned down, like I said, you won't know why. But also there's no saying that it's based on accurate information whatsoever.

Speaker 1 (36:03):
And not transparent in any way by removing the people.

Speaker 23 (36:07):
Yeah, it's a black box.

Speaker 22 (36:08):
It's just, everything's going to go into some black box. It's going to get mixed up, and it's going to spit out answers.
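
The "black box" failure mode they're describing fits in a few lines. A deliberately crude sketch, everything invented for illustration and standing in for no real agency's system: the eligibility decision comes from a model whose reasoning the applicant cannot inspect, so a denial arrives with no stated basis and no one to appeal to.

```python
# Toy sketch of an opaque eligibility decision. The "model" is a hash-seeded
# coin flip, a stand-in for any system whose reasoning the applicant cannot
# see. Invented for illustration only.
import random

def black_box_decision(applicant: dict) -> bool:
    """Same answer for the same applicant within a run, inexplicable to them."""
    random.seed(hash(frozenset(applicant.items())))
    return random.random() > 0.5

claim = {"name": "J. Doe", "program": "Medicaid"}
if not black_box_decision(claim):
    # No cited records, no feature weights, no human to call.
    print("Claim denied. No further information is available.")
else:
    print("Claim approved.")
```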

Speaker 1 (36:15):
So you've been looking at this for many years. You have faced a great deal of personal repercussions for doing this. You continue to push on it. Is there anything that you've seen within this movement that makes you feel like we have the ability to in any way slow the

(36:35):
inevitability of it?

Speaker 23 (36:37):
Yes?

Speaker 1 (36:38):
Oh, Carole. Carole.

Speaker 12 (36:42):
I've got to tell you, and I'm going to say this to the audience and all.

Speaker 1 (36:47):
I probably should have started there before everybody was on the ledge. What do you see?

Speaker 22 (36:54):
Because they're just, like all of these tech bros, they are selling absolute bullshit. Okay? The whole AI, the whole AI scam is a scam, and they're making out that it's inevitable, that we have to, you know, that they have to... We're having to chuck millions upon millions of dollars at building better AI because the AI is otherwise

(37:15):
going to kill us and it's going to come anyway,
and it's inevitable. It is not inevitable. It is based
upon illegal behavior.

Speaker 23 (37:23):
Take it.

Speaker 22 (37:23):
Challenge these companies in the courts. Media organizations,

Speaker 23 (37:27):
Stop doing crappy deals with these people.

Speaker 22 (37:31):
They just did a really crappy deal with OpenAI. Or, as I say, you know, it's like they married their rapist.

Speaker 23 (37:40):
Their rapist.

Speaker 1 (37:46):
This was the optimistic part of the show. This was
the part of the show where you were going to
bring us all back from the ledge. And look what
you've done. They're all crying again.

Speaker 22 (37:57):
We can... we have power. We can stand up to these companies.

Speaker 23 (38:01):
We have to stand up to these companies.

Speaker 22 (38:03):
We don't pre-obey. We don't make deals with OpenAI. We do try and stand up for our, you know, defend our legal rights. This is law, okay? And that's the one thing which the tech companies, these Silicon Valley platforms, you know, just can't tolerate. And that's always been their strategy: they subvert it and

(38:25):
they get away with it. And they get away with it because they act fast. We don't realize till too late, and then it seems too late to try to do anything about it. So we can do stuff about it. And also, you know, we are giving our data to these companies. We have to really think about that. You know, Instagram is not

(38:46):
your friend. You shouldn't, don't post your kids' photos on there. I mean, if there's one thing to take away from this, it is that these companies are now allied to your government, which is, you know... I have to get out through the border in a couple of days' time, so I'm slightly wary of what I say, but I mean, yeah, it's not in

Speaker 1 (39:06):
A good place I understand. The only way I would
push back is just to say Instagram is actually my
only friend. Uh, Carol, I just I can't tell you
how I'm pressed. I am with the work that you've done,
that you continue to do, the way that you continue
to stand up for this even when it has cost
you such a great deal in your life. Be sure
to check out Carol Substack It's How to Survive the Broligarchy,

(39:26):
and her nonprofit called The Citizens. Is that correct? Correct. The Citizens.

Speaker 24 (39:31):
Carole Cadwalladr, everybody. Now, one last thing tonight before we go:

(39:55):
we're gonna check in with your host for the rest of the week,

Speaker 1 (39:57):
mister Michael Kosta. What do you got, Kosta, this week? Michael? Well, Jon.

Speaker 4 (40:04):
June is the start of Pride Month, so this week we'll be celebrating all the things that I'm proudest of: my hairline, my athletic frame, and my early preorder of the Nintendo Switch 2. There's just so much to be

Speaker 1 (40:17):
proud of, Jon. So I understand this month is really more about, like, gay and lesbian pride, not just, not just pride in general.

Speaker 4 (40:29):
I see, Well, uh, there was that one time I
convinced a cop to tear up my speeding ticket, but
I wouldn't say I'm proud of it.

Speaker 1 (40:50):
We won't elaborate. Michael Kosta, all this week.

Speaker 15 (40:52):
Here it is, your Moment of Zen.

Speaker 3 (40:56):
There is a New York Times report today that accuses you of...

Speaker 14 (41:02):
Is that the same publication that got a Pulitzer Prize for false reporting on Russiagate? Is it the same organization?

Speaker 8 (41:12):
I got to.

Speaker 2 (41:17):
Explore more shows from the Daily Show podcast universe by
searching The Daily Show wherever you get your podcasts.

Speaker 1 (41:23):
Watch The Daily Show weeknights at eleven, ten Central, on Comedy Central, and stream full episodes anytime on Paramount Plus.

Speaker 23 (41:36):
Paramount Podcasts