Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:11):
You're listening to The Buck Sexton Show podcast, make sure
you subscribe to the podcast on the iHeartRadio app or
wherever you get your podcasts.
Speaker 2 (00:20):
Hey, everybody, welcome to the Buck Sexton Show. Kara Frederick
is with us now. She's formerly of Facebook, currently with
the Heritage Foundation, where she's Director of Tech Policy. We're
going to talk to her about AI, about government snooping
on social media platforms, and all kinds of good stuff
or bad stuff, but interesting stuff. Kara, welcome. Thanks for
(00:41):
having me. So let's start with Elon Musk. He recently spoke to Tucker Carlson on Fox, we saw it, and he said,
not only was he, Elon, flabbergasted by how much access
the government had to the communications and just the general
(01:01):
stuff going on at the social media platform Twitter, I'm sure it's true of other places, but that DMs were easily accessible by the government too. What do you think of that one?
Speaker 3 (01:13):
Yeah, so this, you know, honestly, it's not surprising. I mean,
especially given the backgrounds that you and I had there.
We do know, first and foremost there are legitimate reasons,
or at least there were legitimate reasons to you know,
work with some federal entities when it comes to things
like child sexual abuse material, you know.
Speaker 1 (01:31):
Sniffing out the bad guys.
Speaker 3 (01:33):
I worked on foreign Islamic terrorism when I was working
for a big tech platform and before, so you know,
there are times when you want the government that is
ostensibly there to protect the security of Americans to maybe
have some sort of access to these internal communications. Unfortunately,
over the past two to three years, we've seen that
(01:56):
there's been an abject dereliction of duty when it comes to, you know, number one, keeping United States citizens safe and, number two, really prioritizing the security of Americans,
especially from external hostile forces, and instead looking inward at
the American populace and inflating the definition of domestic extremists
(02:19):
and terrorism and whatnot. So, in my mind, the government
has lost our trust and deservedly so, and they have
been misprioritizing when they should be looking at real, actual threats.
Speaker 1 (02:31):
Then okay, maybe we can think about.
Speaker 3 (02:34):
Some of these surveillance capabilities, but that's all gone, that's
been washed away, given especially what Elon exposed in the
Twitter files, and it is very telling. I think that
we at the Heritage Foundation, we published a report in
February of twenty twenty two, and we said there's a
symbiotic relationship between these big tech companies and the government.
(02:55):
It goes as far as collusion and any sort of
collusion between government actors and these big tech companies to
silence the speech of Americans should be prohibited. And people were like, Oh, come on, you're just a fearmonger.
Speaker 1 (03:08):
What are you doing? But we've been proven right.
Speaker 3 (03:10):
So the fact that the government had full access to DMs at Twitter. Again, we knew Twitter DMs a long time ago were probably compromised, and this just proves it. And frankly, this proves the Heritage Foundation right yet again.
Speaker 2 (03:21):
So you worked at Facebook, I would assume, and tell
me if it's an incorrect assumption that the very very
cozy and all too close relationship between Twitter pre Elon
and the government and as he described it, effectively, it
was a bloated activist organization Twitter masquerading as a social
(03:45):
media platform. I have to assume, and you tell me
if it's not right that the politics of Facebook, Google,
all the rest are basically the same and perhaps even worse.
Speaker 3 (03:58):
I would say post the twenty sixteen Trump election, that's when everything became exacerbated. So you have, you know, a cadre of us who went in before that election. I did, there in twenty sixteen, early twenty sixteen, and you know, we were there to solve these big problems like the foreign Islamic terrorism issue, and you had really
(04:20):
talented people, a lot of them patriots. I came right out of a Naval Special Warfare command where I had
been doing counter terrorism analysis as a targeter, and I
went right into Facebook sort of thinking that, you know,
we were going to do the same thing. We were
going to protect Americans and their users from this you know,
foreign Islamic menace, which was taking the form of ISIS
at the time. And then I would say in twenty sixteen,
(04:44):
something really changed and it was Trump's election, and they
just went hysterical. And the people that they started recruiting
post election, you know a lot of them went in
sort of thinking that they had a mandate to stifle
the conservative voices, I'd say, in America. You know, prior to that, there were some interesting, I would say,
(05:06):
data points when it came to the way that we
did our analysis, for instance, and this is no secret.
Now these companies have been using organizations and NGOs like
the Southern Poverty Law Center to actually help them formulate
their policies and help them formulate the way that they
treated specific actors on these platforms, and they thought nothing
(05:30):
of it. They thought, Okay, the SPLC is an honest broker.
So I think that part of it is ignorance. Part
of it is, you know, the sea that they swim
in in the Bay Area, and then the other part
of it is the twenty sixteen election just really unhinging people and making them come in mission-oriented to sort of stifle the voices of the people that
(05:53):
they disagreed with.
Speaker 2 (05:55):
Is it fixable at these places? Like how would you
actually if you were able to get you know, Zuckerberg
and his top lieutenants in a room and they were
willing to listen to reason. I mean, you know, I
think Facebook has just turned... I'm amazed that there's
still apparently as many people using it in America as
there are. I find it to be like an unwieldy
(06:16):
trash heap of nonsense, Like it's really hard to even
figure out where anything is anymore. It all looks
like it was made by a bunch of you know,
computer engineers. And I don't mean that in a good way,
Like it looks like it's just this kind of cobbled
together thing, and all of the original ease is gone, and they're just throwing all this in. I think it's
(06:37):
turned into a bad user experience and user interface.
But anyway, forget about fixing that for a second. If
you were just trying to fix the fact that Facebook
is not fair to half of the country politically, is
it possible or do we just have to build our
own, build our own thing. By the way, build your own Facebook doesn't sound as crazy. Maybe it's buy your own Facebook, that's more expensive than Twitter. Not to call out Elon on that one.
Speaker 3 (06:59):
Yeah, you know, I will say personally, and when I
was in the company, I thought Mark Zuckerberg's instincts were
more libertarian than, you know, leftist. And that
is sort of the old guard of the builders.
Speaker 1 (07:12):
In Silicon Valley.
Speaker 3 (07:13):
You know, these are engineers, and they think like engineers,
they think like programmers. They want to solve problems and
fix things. Unfortunately, I do think that he's been
there for so long and he's been frankly led astray
by others in the C suite and that other sort
of layer of upper management as well, who are very
concerned with pr fires. And clearly we know that public
(07:34):
relations only goes one way, and that's against Conservatives at
this point. So you look at that and sort of
see how he's been co opted when his instincts were
maybe initially good. I remember a speech that he gave
in October of twenty nineteen at Georgetown University where he
said Facebook had to be the alternative to an authoritarian
China which was propagating its digital platforms here in the US.
(07:55):
You know, what happened to that? That gave me a little bit of hope. But he has since spent four hundred million dollars pushing certain Democrats in elections under the auspices of these get-out-the-vote measures, but we
know that they were in blue states going to Democrat
(08:16):
activist organizations who were then pushing only getting out the
vote for Democrats. So unfortunately, I think those instincts that
he previously had have been quashed, and I don't think
there's any coming back from it. I do think it
is too far gone at this point.
Speaker 2 (08:31):
Let's come back in a second, and talk about TikTok.
Since you mentioned communist China, we'll get into that in
just a second here. But everybody at home, if you
haven't tried the Giza Dream Sheets, I'll tell you right now, you're missing out. Mike Lindell's got a lot of amazing products. Kara, do you have Giza Dream Sheets?
We're gonna get you hooked up.
Speaker 1 (08:48):
Planning on it. They're in the cart, They're in the cart.
Speaker 2 (08:50):
I'm like, all right, there we go. The Giza Dream Sheets from Mike Lindell. You got to get them for the whole family. I'm telling you, they're amazing. Coming as low as twenty nine ninety nine when you go to MyPillow dot com, use promo code Buck. Multiple colors, styles and sizes. You can upgrade your bedding now. By
the way, everybody, I know you think, oh, I've already
got sheets. Sheets only last you a couple of years.
You know, you wash them, they start to get kind
(09:11):
of a little too worn, threadbare, and they're not very
comfortable and they start to, you know, just look like
you need new sheets, because you do. So go to
MyPillow dot com, promo code Buck, twenty nine ninety eight for Giza Dream Sheets. All MyPillow products come
with a sixty day money back guarantee and a ten
year warranty. So go to MyPillow dot com, click on
radio listener specials and use promo code Buck for the
(09:34):
Giza Dream Sheets, under thirty bucks. Giza Dream Sheets, go
check them out today. I sleep on them every night.
All right. So TikTok, I gotta tell you. I mean,
I think that YouTube is much more concerning for
American freedom and everything else than TikTok. And I've been
saying this, and more and more people are saying it now. It seemed to be about a month ago, and it
(09:54):
was like, oh, TikTok. I'm sitting here, I'm like, what about the social media companies? They are already throwing elections anyway. But we'll put that aside for a second. I mean, how
worried are you? Let's just look at TikTok. And I
won't do the, you know, the whataboutism with YouTube, Google, Facebook or Instagram, all that stuff, although I could. How
(10:14):
bad is TikTok really? How big a problem is it?
Speaker 1 (10:18):
Yeah?
Speaker 3 (10:18):
I think TikTok falls into three problem buckets, I would say. The first and foremost is the most obvious one. It's that ByteDance is its parent company. It is headquartered in Beijing and therefore subject to the laws and
policies of the People's Republic of China. One of the
laws that we like to talk about is the twenty
(10:39):
seventeen National Intelligence Law, which effectively compels private companies to
do the work of the state. So if the Chinese state,
the CCP, the Chinese Communist Party, decides that they want
specific data, they want access to this, they want access
to that, then by virtue of this law, ByteDance
has to comply. And this is not my original phrasing,
(11:00):
but I think it illustrates the problem pretty well. China
doesn't have rule of law, they have rule by law.
If you're, you know, the ByteDance CEO, you're kind of powerless to resist at this point, and why would you? When you look at ByteDance, one of the three board members of the main domestic subsidiary of ByteDance is
(11:23):
a card-carrying Chinese Communist government official. If you look at, like, some of the good reporting that Forbes has done, they scoured LinkedIn and found that three hundred plus profiles of current ByteDance employees had either current or former links to a Chinese propaganda arm, to Chinese state media.
So you have active and former members of the Chinese
(11:46):
Communist Party, particularly in the information realm, working in ByteDance.
And there's so many other data points that I could
talk about, but that's the first one: ByteDance's very close links to, and infiltration, frankly, by the CCP. They have
an internal committee, a DOJ report in September twenty twenty
coming out of the Trump administration assessed as much, and
(12:06):
we know that they are deeply involved in the inner
workings of TikTok as well.
Speaker 1 (12:10):
So that's one thing, and again just tip of the iceberg.
Speaker 3 (12:14):
We can go into just really how odious some of
those connections are when it comes to the connection with
American data as well, but I'll table that for now.
And then number two there's that influence campaign aspect, and
you know, we talk about the manipulation of the information environment.
This is something that we were dealing with in the
(12:35):
intelligence community and especially in big tech companies and looking
at it now and what we've seen is pro CCP
narratives pushed on these platforms. We've seen an actual Chinese
state account come to TikTok and say, how can we
push our information? And we've seen information from those accounts
(12:57):
pushed and not labeled as state media as well. So
we know that they're trying to do the proverbial sowing
of discord, such as promoting Democratic candidates in the twenty
twenty two midterm election to the detriment of Republican candidates.
They're trying to push stories about abortion and incendiary things to help sow dissent among the American population, something everyone
(13:19):
always accused, sort
Speaker 1 (13:19):
Of the Russians of doing.
Speaker 3 (13:21):
So that's another aspect of that information environment manipulation.
Speaker 1 (13:25):
And then third, you have the kids. You know what
it's doing to children.
Speaker 3 (13:28):
And we know that TikTok, in particular with the For You algorithm, just lights a
Speaker 1 (13:33):
Fire under these social contagions.
Speaker 3 (13:35):
We know that they're in bed with the transgender lobby,
featuring you know, prominent transgender activists, prominent LGBTQ activists all
over their websites raising awareness. They're very very open about
doing this, and there are pediatric hospitals that are reporting
actual physical manifestations coming out in patients because they use
(14:01):
TikTok, like things called TikTok tics, which most researchers are classifying as, you know, pure social contagion movements. And this is again something that TikTok is very, very efficient at, though it doesn't necessarily distinguish it wholesale from the Instagrams of the world. If you want to talk about whataboutism, we know their stats on
(14:21):
that in terms of mental health harms. But TikTok is poised,
especially because, as Christopher Wray, director of the FBI, says, China controls the algorithm. That's even more problematic when they're
feeding our kids this poison.
Speaker 2 (14:36):
But how much of the algorithm is the kids click
on the things they click on and then it's reflected
back to them, right, Like you know, I asked this
because remember Russia, remember Russia collusion. Back in twenty sixteen,
they're saying, oh, Hillary Clinton lost because Trump worked with Russia.
And then they talked about the Facebook ads. I think
(14:58):
there was like one hundred thousand dollars that were spent
on these bogus you know, or Russian backed or whatever
Facebook ads. And when you look at the ads, I
mean a lot of them were ridiculous. I mean it
looked like a guy named Yuri, you know, working in like sub-basement C of some, you know, FSB outpost on the outer ring of Moscow was, like, looking at, you know,
(15:20):
a little dictionary in English. I mean it was preposterous, right,
I mean their understanding of US politics beyond lock her
up for Hillary Clinton, which you know they got that right,
but their understanding was very weak. I mean the idea
that... I just hear all these people saying, oh
my gosh, you know, the Chinese are going to brainwash
(15:41):
our kids and make them lazy or whatever. I look
at them, like, what do you guys think Disney's doing?
This is the part of it, there's so much more to be upset about with what's going on. I understand the spying thing, and like, if they can suck up your information and data, that's a separate issue. But it sounds to me like there's also just a content component of this, and I don't
(16:03):
understand... I just feel like TikTok is
the shiny object where people in politics and in power
get to pretend that they're doing something that's like, oh,
we're protecting the kids from the bad influences online. They're
protecting them from one of dozens of major and endless
minor influences online that are all being pushed by the
(16:24):
Democrat Party anyway, like transgenderism. Beijing's not pushing transgenderism via
TikTok on kids. The Democrat Disney is pushing transgenderism on kids.
Speaker 3 (16:35):
Well, number one, you know, you're right, But number two,
we also don't know that Beijing isn't pushing this stuff,
and that's, you know, part of the problem too. China is so intimately involved in the algorithm, which it says it won't give up. If there's a forced divestiture of TikTok to an American company, it said there's no way we're giving up the algorithm. So in my mind, that means, yeah,
they want to retain control of it. There's a commercial
(16:56):
element to it because it is really good, but there's
also that information environment manipulation potential there, because they
want to keep it in Chinese hands so badly, and
they've said as much, which flies in the face of
a lot of the assurances that these TikTok executives are
providing to Congress members with their you know, Project Texas potential,
(17:17):
you know, mollification of our representatives.
Speaker 1 (17:20):
But I also do want to address that.
Speaker 3 (17:23):
I think that when you have over sixty seven percent
of American teenagers as of last year on a particular platform,
and you have thirty percent, according to a Pew poll in twenty twenty, of preteens, nine to eleven year olds, on
a specific platform, and we have new data coming out
of the UK saying a decent percentage of toddlers are
(17:44):
now exposed to this content, then you.
Speaker 1 (17:46):
Have a problem.
Speaker 3 (17:47):
Then you have the fact that, you know, number one, all of these children are on it to a much greater degree. And, you know, Facebook, as you talked about, is hemorrhaging users, especially in this demographic; Instagram is hemorrhaging users as well. Then that becomes a source of information,
and we know that they're getting it for their news
now as well. They're not just looking at those cute videos.
(18:08):
They're getting it to be informed about the world. A top Google executive said as much as well at a tech conference. He said when young people want their news, they go to TikTok, and Google is very much aware of that and, you know, keeping their antenna up. And
then you have American adults, so looking at the kid
stuff that matters a lot, but American adults as well.
(18:29):
So you know, in two years, the number of American
adults that get their news from TikTok has tripled. That's problematic,
I think from a civic perspective. I see you want
to say something, but I want to say one more
thing before I let you talk.
Speaker 2 (18:47):
I'm standing in the way of the train. By all means,
go ahead. Sorry, you were saying... yeah.
Speaker 3 (18:52):
So I think the last thing is a number of
enterprising journalists have taken it upon themselves to create their
own TikTok accounts. And what they do is they register
as users aged around thirteen to fourteen, and they have found,
to a man, within minutes, they are fed content that
is composed of self harm content, suicidal content, eating disorder content,
(19:14):
especially if they're registering as girls versus registering as boys.
So we do know that there is something that isn't
just responsive to, shall we say,
Speaker 1 (19:24):
What the children want?
Speaker 3 (19:26):
Granted, the TikTok algorithm is based off your engagement, not necessarily your network, so how long your eyes linger over a specific video. If you're interested in depression, yes, it's more likely to feed you self-harm content and suicidal content.
But this appears to be pretty uniform across the board
for a lot of these journalists experimenting anyway. So there's
something in TikTok that is particularly, I would say, nefarious
(19:50):
when it comes to our children and the noxious content
they're being fed.
Speaker 2 (19:54):
I mean, do you ever go on TikTok because I
gotta say, some of those shuffle dance moves very catchy.
Speaker 1 (20:02):
Buck.
Speaker 3 (20:03):
No, I am not going on TikTok, nor will I ever go on TikTok.
Speaker 2 (20:08):
Okay. This is where I get to point out all the other people in America, usually especially the Democrats, but I get to call them commies and have fun with it.
Apparently on this one, I've got like a soft spot
for communist China. So I'm like, I think TikTok is
super entertaining. I gotta tell you, for me it's, yeah, it's how to sear, like, the perfect steak and different red meat. I follow this
(20:30):
like Max the Meat Guy, who makes like Wagyu and
briskets and all these things. Uh, guys who know how
to make like tomahawks out of the stuff you find
in your backyard, like this is... and of course cute bulldog videos. Like, I was like, what is this?
How is this supposedly in some way doing anything that
(20:52):
is uh, you know, going to damage me? But then again,
I'm an adult and things are different, and you know
that can be a little bit of a challenge. And yeah,
so I got to get to the Oxford Gold Group
here for a second, and when we come back, we're going to talk about AI, because I think Kara Frederick has to explain whether or not Skynet is going to
(21:12):
become self aware and cause the nuclear war that James
Cameron warned us about in Terminator one and two and
probably the other ones, but no one saw the other
ones because the other Terminator movies sucked. So, you know (she knows it's true), there was a recent banking collapse, the nation's largest collapse of a financial institution since two thousand and eight. So stuff can go bad real fast, you know this, and fiat currency is
(21:35):
pretty imperfect, because it's a situation now. We have inflation. We also have thirty two trillion dollars of debt. How
about using gold and silver as a protection for your portfolio.
Have a little gold and silver on hand just in case.
I've got gold and silver right here with me in
the radio studio. Now is the time. Don't wait, because
if a crisis hits, you're gonna need to have it
(21:55):
on hand. And also those prices of gold and silver
are going to go way up. So now's the time
to call my friends at the Oxford Gold Group. Securing
your IRA or four oh one K with real
gold and silver, by the way, is also a fantastic
portfolio protection plan. All you have to do is call
the Oxford Gold Group. You can own real precious metals
just like I do. Call the Oxford Gold Group at
(22:17):
eight three three, four zero four, GOLD. Eight three three, four zero four, G O L D. Okay, so, Ms. Kara Frederick, who is on the task force, I believe, at the Heritage Foundation for dealing with AI-related matters. So you would have some insights into this.
(22:38):
I think that the AI... and look, is Elon Musk both wealthier and smarter than me? Yes. But
I think all this stuff about how the world is
going to end because of AI is pretty crazy. Like
I'm looking at this, I'm like, how does this even happen?
(22:58):
Everyone's getting all freaked out. Do I... I'm usually, actually, no, I usually tell people everything's gonna be okay, because most hysteria is just people wanting attention. But is AI really? I know it's a big deal and it's gonna matter. I'm not saying it doesn't matter a lot, but as
a threat, what is the threat from AI? That's the
part of this that I still haven't No one's really
(23:18):
been able to explain to me. It's like, oh, like
it's going to hurt our democracy because of the disinformation.
Watch CNN, look what they.
Speaker 1 (23:24):
Do, fair point.
Speaker 3 (23:26):
And I do think you stand in good company with some of the skeptics.
Speaker 2 (23:31):
You know.
Speaker 3 (23:32):
One of the things, as you know, in the Intel community, though,
there was always that person who was like, everyone, everyone, China isn't ten feet tall, right? Everyone, everyone, al Qaeda is not looking at external operations. And they, like, are the naysayers, and everyone kind
Speaker 2 (23:47):
Of their whole thing is is they're the uh you know,
they're they're playing the role of what's the what's the
word we're looking for here. You know, when you're just
being in opposition, to be in opposition, I forget what
the word is. You know what I'm saying, contrarian. Thank you.
They're playing the contrarian.
Speaker 3 (24:05):
Yes, yes, yes. So, you know, there's that aspect of, you know, the AI community as well. But I do think there's reason to be worried. And
I like to quote, you know, two authoritarians on this matter.
And I thought this was commonplace, but apparently not. Many
people know that Putin a few years ago said whoever
(24:28):
is going to lead in AI is going to rule the world. I think that that's the significant discussion to be had, especially when Putin's saying that, and then Xi in China says, you know, we want to dominate in
AI by twenty thirty. So, you know, if our adversaries
have their eye on this and they see it as
(24:49):
some geopolitical strategic key, then I think that it's important
to sort of pay attention. And the reason why I
think they know harnessing these technologies is really going to
uh catapult them to the front of the line of
global dominance, is because doing things at machine speed
(25:10):
tends to be better than doing things at human speed.
And when you're talking about war fighting,
when you're talking about things like even you know, the
stuff that the Intel pukes like us used to do,
that computer vision algorithms can do better, you know, instead
of what Project Maven tried to do. Instead of
a human being sitting in front of an FMV screen,
you know, labeling a truck, labeling a rock,
(25:33):
labeling a car, you have machines able to do that.
And then when machines can make decisions, and again, you know,
the autonomous weapons that we have are mostly semi autonomous,
so there's always a human in the loop. At this point,
there's a lot of debate and discussion about that. But
when you get to the point that a machine is
basically cutting out all of that analytical rigor that can
(25:58):
be applied elsewhere to do things only humans can do,
that's going to give the war fighter a massive advantage.
So if you have a computer vision algorithm determining that
that's a rock and that's a tree, and that you
shouldn't, you know, hit it with a kinetic strike,
then that's going to be better. If you have another
analyst sort of doing things like determining if there's a
(26:19):
positive identification of that actual terrorist actor which would cause
that Hellfire to rain down. So I think that
you know, it's offering specific advantages because of that machine
speed vice human speed. People talk a lot about drone
swarms too, so instead of you know, training up a
human pilot necessarily, it's, you know, what my old boss at the Center for a New American Security used to say
(26:41):
is that, you know, the human being sort of acts
as the quarterback, and you let your smaller,
cheaper systems do some of the other work for you,
so you're not sort of wasting a human being and the human capital on that kind of thing too. So
I think those are just two examples of what AI
can help accomplish in war.
Speaker 1 (27:00):
And we know that Xi is on a war footing.
Speaker 3 (27:03):
We know that Putin is currently at war right now too,
so that's another thing. The information environment, like what
you were talking about before, that's a whole nother kettle
of worms, and we can talk about that all day
as well.
Speaker 2 (27:15):
Oh, I don't know, it seems to me like if
there's a little machine that can clean up after me,
and tell me how great I am all the time,
and, you know, not ask any questions beyond that, sounds good. I'm not that worried about it, like, turning into the Terminator and deciding that, you know, it has feelings too, and it's self-aware and all this, I don't know.
(27:36):
I mean, I guess I do think it's definitely
going to be interesting for high school kids who want
to have some program that can easily write their, you know, kind of B-minus-level term paper for them, that
we know it can do pretty quickly. But friends, you
know what you need, not AI, you need chalk. Chalk
(27:57):
provides all natural supplements that help people restore their energy
potential every day. It's a daily supplement formulated to restore
lower testosterone levels in men to the levels that men
used to have. Our diets and stress levels just don't
naturally provide for the kind of testosterone that we need.
Choq's leading ingredient in their Male Vitality Stack has proven
to restore twenty percent of those lower levels in just
(28:17):
three months time. You'll feel the positive effects and experience
an energy potential and focus that you haven't in a
long time. Choq produces their products with a high level of purity that makes them potent and impactful. That's why the
Male Vitality Stack is as effective as it is. Sign
yourself up, take it, take delivery of Choq's Male Vitality Stack,
or any of the other products available via subscription. Get
(28:38):
thirty five percent off any Choq subscription for life when you use my name at their website, choq dot com. That's the website, c h o q dot com. Be sure to use my name, Buck, to get thirty five percent off. Go to c h o q, choq dot com. Use my name, Buck, B U C K, for thirty five percent off. So, Kara,
(28:59):
when you're not trying to save the Internet for the
purposes of freedom, humanity and world peace and all that stuff,
what else do we need to know about Kara Frederick?
You were a Navy intel analyst. Is that what I'm getting?
Because you said vice in a way only people from
the intelligence community ever say vice like that, meaning instead. You know, no one else... You'll never come across anybody who did not work in the IC who will be like, well, I think that's a good idea, vice this other idea. You're like, wait, what, we're the only
This other idea you're like, wait, what, we're the only
ones who do that.
Speaker 1 (29:30):
No way, Oh man about Habita.
Speaker 2 (29:32):
Oh, I'll give you more. I'll give you more: optics. Only people from DC talk about, well, from this optic or from that optic or whatever. There are some things... the Intel people in particular, we have
this weird nerd vernacular. And I caught you doing some
nerd vernacular during this podcast. So at least I know
you're the real deal. At least I know that you know,
(29:53):
you were poring over those reports, eight cups of coffee deep.
Speaker 3 (29:59):
Yeah, it's funny, you know, Get Smart with Steve Carell and Anne Hathaway. It's a movie, it's like, I don't know, a spy caper. And you have the Intel analyst who, like, finally gets this chance to, like, be an operations officer in the field, and he's, you know,
talking to some operator types and he's like, wait, did
nobody read my report?
Speaker 1 (30:18):
Like that was me. I was like, did nobody read
my report?
Speaker 3 (30:22):
That person who would go out with the guys in Special Operations Forces and sort of be.
Speaker 1 (30:28):
Like, hey, everybody, this is what we should be thinking about.
Here's the target. You should go there.
Speaker 3 (30:32):
Uh, so I was always a civilian, a civilian intelligence officer, and we were called targeters when I was working with
Naval Special Warfare Development Group.
Speaker 1 (30:39):
But that was, yeah, that was my job.
Speaker 3 (30:42):
I was an al Qaeda analyst first and foremost, and all-source.
Speaker 2 (30:45):
So what did you do, like, original gangster, old school al Qaeda, or the, uh, AQI? Because I was AQI. Oh, you were? They brought me in to do CTC AQI, and then I got moved to, uh, OIA, and, uh, AQI basically. So no way, yeah.
Speaker 1 (31:05):
No, I was.
Speaker 3 (31:06):
I was looking at, yeah, the original guys. Some of the guys had been hiding in Iran for a little bit; that is now in the open. And yeah, when I deployed a couple of times, that was my thing, looking at al Qaeda external operations operatives.
Speaker 2 (31:23):
So what did you think of Thirteen Hours the movie,
by the way, because I always thought it was so
funny that the case officers in that were like so smug,
which is just great. It was just great. The analysts were all sitting there like, yeah, maybe we were just
back at headquarters making coffee, but like at least we
respected the paramilitary guys.
Speaker 1 (31:47):
Yeah, oh... I don't know.
Speaker 3 (31:48):
I had a lot of OGA friends when I was
out there, so you know, you gotta.
Speaker 1 (31:54):
Keep those relationships warm. So I liked your
side of the.
Speaker 2 (32:01):
You know, but did you notice in the movie for
some reason, just to really hammer it home, like one
of the American case officers in Thirteen Hours just randomly
has kind of like a French accent, and he's turning around to, like, he's turning around to, uh, you know, what's his name, John Krasinski. And like all these guys were all, you know, jacked, and these badasses there, and
he's like, I do fancy things, dude,
(32:23):
Like I am so cool. They're like, where did this... Why is his name like Jacques Cousteau? Where did this guy come from? He's an American? Like, they just... I love it because those guys.
Speaker 1 (32:34):
Uh, the...
Speaker 2 (32:35):
GRS guys in the movie, like, oh, it's funny the
things they would say about the case officer. I'm sorry, I just was observing. I'm just observing.
Speaker 1 (32:44):
Buck.
Speaker 3 (32:44):
Every time I left when we were forward, every time
I left the Tactical Operations Center.
Speaker 1 (32:48):
All the guys in a chorus would go, beat it, nerd. So yeah, it's a... yeah, but you guys were like.
Speaker 2 (32:56):
Hey, we had our reports and our cool stuff, you know, in the meetings that we did too. So interesting. All right. Well, Kara, where should people go to check
out your work and the stuff that you're up to
and all that good stuff?
Speaker 3 (33:10):
Yeah, first and foremost, go to Heritage dot org. So
all of our work is on the Heritage website. I
direct the Tech Policy Center there, and yeah, we're looking at, you know, five big lines of effort. So
go to that website to see what we're up to. Personally,
I'm again kind of in the belly of the beast.
I'm on Twitter Kara A Frederick and on Instagram as
(33:32):
Karafred with two Ds, so you can check out my work.
Speaker 1 (33:35):
I do a lot of... some personal, but mostly...
Speaker 2 (33:37):
And you got a tiny baby too, right? Right now you're like a... yeah.
Speaker 1 (33:42):
Yeah, yeah, yeah.
Speaker 3 (33:44):
So she's an infant and I don't hear her, but
she's probably gonna start crying a little bit.
Speaker 2 (33:49):
So this is... I'm glad we didn't get to the part where you pretend you hear the infant crying so you
can end the interview early. So that's good. That means
we kept it moving here. Check out Kara's stuff, everybody. Kara Frederick, thank you so much. Appreciate you joining us.