
December 3, 2025 52 mins

This week’s guest is Renee DiResta, professor and author of Invisible Rulers: The People Who Turn Lies Into Reality. In part one this week, she talks to Lola and Megan about propaganda, disinformation vs. misinformation, and how social media algorithms push us all into highly individualized, “bespoke” realities.

They discuss how she became interested in this topic after social media algorithms started suggesting anti-vaxxer content to her, how rage bait and other emotionally charged material spreads faster, and why social media makes it seem like other people have more extreme views than the majority of them actually do.

SOURCES

Invisible Rulers: The People Who Turn Lies Into Reality



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Trust me? Do you trust me?

Speaker 2 (00:04):
Right?

Speaker 3 (00:04):
Everly?

Speaker 1 (00:05):
And you strayed? Trust. This is the truth, the only truth.

Speaker 3 (00:09):
If anybody ever tells you to just trust them, don't.
Welcome to Trust Me, the podcast about cults, extreme belief,
and manipulation, from two Visible Rulers who have actually experienced it.

Speaker 1 (00:22):
I Am Lola Blanc and I Am Megan Elizabeth.

Speaker 3 (00:25):
Today is part one of our very informative interview with
Renee DiResta, professor, former research manager at the Stanford Internet Observatory,
and author of Invisible Rulers: The People Who Turn Lies
Into Reality. In part one this week, she is going
to talk to us about propaganda, disinformation versus misinformation, and

(00:46):
how social media algorithms push us all into highly individualized,
bespoke realities.

Speaker 1 (00:52):
We'll discuss how she became interested in this topic when
algorithms started feeding her anti-vaxxer content simply because she
was interested in a healthy lifestyle, how rage bait and
other emotionally charged material spreads faster, and why social media
makes it seem like other people have more extreme views
than the majority of them actually do.

Speaker 3 (01:11):
And next week we will talk about pseudo events, or
nonsense controversies and non-newsworthy news, how much more challenging
it's getting to discern truth from fiction, and lots more,
lots more about our depressing internet landscape.

Speaker 1 (01:29):
Ah, what's real? Megan?

Speaker 3 (01:32):
Yeah, exactly, that's the question. That is, before we ask
her this question, kind of, Renee... Okay, I would love
to know your cultiest thing.

Speaker 1 (01:43):
Okay, sure. Well, first, I want to start off with an apology.
I did not realize that Spotify has comments, so we're
not, we don't really read a lot of the comments,
and it's not because we don't want feedback. It's,
I mean, some of them are, you know, some people
are very... and not just like, I mean, like, really
ugly, like, stop exposing my cult or I'll
kill you, you know what I mean? Like people can

(02:04):
get... Has that happened? Yeah? Yeah, yeah. Why don't you
text me about that? I didn't know. Well, I mean,
it's a specific one, you know what I mean? Okay, okay.
So I didn't realize that Spotify had comments, and I
was like, oh cute, some comments. Not so cute. People
are really upset at me because I talked about using
AI, and I was like, oh, word, like they're right,
and I really didn't realize how bad it was

(02:29):
for the environment.

Speaker 3 (02:30):
You didn't? I thought you were... You specifically mentioned that.
That's because I started reading the comments. Oh yeah, yeah,
that's like the whole thing. That's like, Yeah, I didn't.
I didn't know the data centers are sucking up the water,
all of our energy.

Speaker 1 (02:45):
Yeah, so I'm making higher I was. Yeah. Yeah, so
I was, like, a first, you know, ChatGPT-er,
like much, much longer ago than people would imagine. And
it just really wasn't talked about then that it was
like an energy source. I don't know why. So I
just got kind of lulled into it just being kind
of lazy, not environmentally unfriendly. So anyway, I have built

(03:09):
my life around being friendly to the environment. I do
all the things that I can to make my imprint
as small as possible. Not that anyone else has to
do those things, it's just what I do. It's very
important to me. Y'all are correct. I deeply apologize. Please
forgive me. I erased it all, Sora, off, all of
it, off, gone. Wow, nice, that is more than most

(03:31):
people are doing. Yeah, well, please forgive me. Embarrassing take.

Speaker 3 (03:35):
But I also think that the truth is that most
people are using AI in some form or another.

Speaker 1 (03:42):
And that's how I justified it to myself, where it's
like, I'm in a space, too, like animation, where
I'm like, well, if I'm not using it, other people are,
and then I'll just get lost and be way behind
like people who are just sociopathic and using it. And
it's like, right, correct, that's exactly what will happen in
every space forever because that's how the world works. And
that doesn't mean I have to keep polluting.

Speaker 3 (04:04):
I also try not to use AI unless I have to.
The reality is that every corporation has integrated AI into
their operation, and you're probably going to hear an ad
for AI that we did not approve because they just
that's just what happens. Sometimes they just get they get
put on here, and I mean, you know, it is
what it is. So yeah, that's not really my cultiest thing,

(04:26):
but it kind of is.

Speaker 1 (04:27):
I think that that is wonderful.

Speaker 3 (04:29):
We should all be aspiring to use it as little
as possible. Most apps have incorporated AI into their entire functionality,
so like, I also think it's important to remember that, Like,
while it's important and good for us to all try
to do what we can on an individual level. Ultimately,
like these things are happening on a systemic level, that

(04:50):
corporations are spearheading to try to make more money. And
while it's good to try to amend our individual behavior,
when it's that large of a systemic problem, like
we should be putting the onus on them and holding
the companies accountable, not like pointing fingers at each other
for like I needed a little help exactly, And I

(05:11):
think that's what I mean by like the psychopaths will
just take over. I just mean like the tech giants
are people who are so removed from reality, not everyday
users who aren't in charge of it. Even with something
like recycling, Like recycling is a scam essentially that was
marketed to us so that corporations could make more money
on plastics. Right, like these like individual behaviors, let's all

(05:34):
do the best we can, absolutely, but also like let's
hold people accountable. We need politicians to actually regulate that shit,
you know what I mean? Agreed, So yeah, let's let's
And also it's just like trying to understand we're all
doing our best.

Speaker 1 (05:47):
Yeah, moderation, yeah exactly. All that said as an artist.

Speaker 3 (05:51):
I'm obviously terrified about how these companies are stealing from
human artists to essentially make a profit for themselves, and
I don't think they should be allowed to do that.
So let's try to minimize it, and also let's try
to regulate it.

Speaker 1 (06:04):
Word, what about you? What's your cultiest thing of the week?

Speaker 3 (06:07):
I watched the movie Bugonia, yeah, okay, and have you
seen it?

Speaker 2 (06:12):
No?

Speaker 1 (06:13):
Do you know what it is? Oh?

Speaker 3 (06:15):
Okay, it's Yorgos Lanthimos's new movie. I love Yorgos Lanthimos. Starring
Emma Stone and Jesse Plemons. He always has Emma Stone in it,
and it is I guess it's a remake of another film.

Speaker 1 (06:29):
Did you see the Onion article that was like I
might hang out?

Speaker 3 (06:36):
It's a remake of a South Korean film called Save
the Green Planet, which I have not seen yet. But basically,
I'm not going to spoil anything, but the prem... this
is all in the trailer. The premise is that Jesse
Plemons and another guy are conspiracy theorists who kidnap Emma Stone,
who's a CEO. She's a high-powered yas queen CEO.

(06:57):
What? Or yassified, I should say.

Speaker 1 (07:00):
And it's kind of. It feels like a.

Speaker 3 (07:01):
play, like it's all about Jesse Plemons's and Emma Stone's
characters like trying to outsmart each other, and like.

Speaker 1 (07:07):
Okay, it's so good.

Speaker 3 (07:10):
But also there's just so much interesting psychology in there,
just about like conspiracy theorists and people protecting their belief systems.
Oh my god, and somebody trying to poke holes in
those belief systems. But she's doing it with her own
need for control, you know, or am me, it's yeah,
it's really juicy. Is it in theaters or can I

(07:30):
watch it? It's in theaters right now? Well when at
the time of recording this episode, it won't be by
the time this episode comes out. Okay, but where do
I do I have to go to a theater to
watch it, like this weekend, let's find out?

Speaker 1 (07:43):
Or can I watch it in my bed? No, I'm
just kidding. I love theaters and I love the theater
experience at the movie theater. It's still in theaters. Okay, great, AMC.
So I will be going to the movie theater because
it's important to support theaters.

Speaker 3 (07:56):
It's important to support filmmakers when they're not Marvel.
Yeah you did it, good dog. Anyway, Yeah, I highly recommend. Well.

Speaker 1 (08:09):
I'll see it this weekend and can't wait to get
my thoughts on it too, because what the heck? This
is great. I want to interview that guy who made it.
Would Yorgos come on our podcast? Yorgos? Yorgos?
I love... You know who he is, right? The Lobster?

Speaker 2 (08:25):
All?

Speaker 3 (08:25):
Yeah, yeah, yeah, yeah, yeah, The Lobster. The Favourite. The
Favourite is one of my favorite movies. Honestly, The Favourite
is your favorite? Olivia Colman, nasty work. My favorite of
his is The Killing of a Sacred Deer.

Speaker 1 (08:39):
Have you seen that movie? Yeah?

Speaker 3 (08:40):
I have, so fucking good. And the doctor... anyway, we're just
talking about Yorgos Lanthimos now.

Speaker 1 (08:43):
I love him. Yeah, me too. As Shall we talk
about another thing that you love, which is misinformation.

Speaker 3 (08:49):
I don't love misinformation. I am, however, interested in propaganda and disinformation,
which she does not call misinformation. She does not use
the term misinformation. And we're about to find out why.

Speaker 1 (09:00):
Let's do it. Let's do it.

Speaker 3 (09:13):
Welcome, Renee DiResta, to Trust Me. Thank you for
joining us today. Thanks for having me.

Speaker 1 (09:20):
I was devouring your book.

Speaker 3 (09:22):
This is the number one topic I'm interested in, and
so many of us are at the moment.

Speaker 1 (09:27):
It is my special interest.

Speaker 3 (09:29):
It is called Invisible Rulers: The People Who Turn Lies
Into Reality. So can you start off by telling us
a little bit about your academic background and how you
came to research the things that you research, and tell
us what it is.

Speaker 2 (09:40):
Yeah, sure, well thanks for saying you liked it. I
didn't have an academic background. I got into this kind
of by accident. I was a well, I mean, I
am a mom and a mom of three. And in
twenty thirteen, I had my first baby, and I moved
to California, and I was putting him on these preschool
waiting lists, and I got really involved in the vaccine conversation.

(10:00):
This was before COVID, so this was like the measles conversation,
not the COVID conversation.

Speaker 1 (10:04):
And you know, all of social.

Speaker 2 (10:06):
Media, once you have a baby, starts pushing you content.
It really realizes you've had a kid. All of a sudden,
it's no longer party pictures and your friends. It's like
mom stuff. And for me that turned into California crunchy
mom stuff. It started pushing me a lot of the
stuff around. You know, make your own baby food, do
cloth diapering, and then I did both of those things,
and so it was like, naturally, you must be an
anti vaxxer. It started pushing me in the anti vaccine

(10:27):
groups too, and I am not, and so I thought
that was kind of funny. You know, my background's in tech,
and so I kind of understood why it was doing it,
like I recognized it from the standpoint of like how
those recommendations work. I mostly ignored it until all of
a sudden there was actually a measles outbreak in California,
the Disneyland measles outbreak in late twenty fourteen, and I
got very involved in the sort of question of you know,

(10:51):
what do you.

Speaker 1 (10:51):
Do about that? Right?

Speaker 2 (10:52):
How do you feel like you're sending your kid to
public school, which you know I was going to be
starting that soon, and you just want to feel like
this is not a thing that you should have to
think about. In twenty fifteen, so I started writing a
lot about how effective the anti vaccine movement was at
communicating on social media. I felt like people really needed

(11:12):
to understand it because they still thought of it as
you know, Jenny McCarthy doing talk shows in the morning,
and it had really evolved so far past that, and
so I started actually just writing about it from the
standpoint of both a mom and then a person in
tech with a data science background who wanted to just
kind of explain how the system worked and why this
group of people was so effective as communicators and how

(11:34):
social media was boosting their stuff in response to them
being great communicators and the CDC and the people who
were supposed to be authoritative voices being like candidly abysmal
communicators actually, and I really felt like, you know, kind
of like chicken little maybe, like I was saying, Okay, guys,
like the sky is falling, except maybe it really is,
you know, and maybe somebody should be paying attention to that.

(11:55):
And so I got really really involved actually in writing
about that, and as I was doing it, I started
getting pushed other types of content, right, So I joined
some of the anti vaccine groups with a different account,
clean account that had nothing to do with my stuff,
and it started pushing me other content like pizzagate, right,
and then QAnon, And I was like, oh.

Speaker 1 (12:15):
Man, it's taking you on a whole journey.

Speaker 2 (12:17):
You joined this one group, and like, we're going down
a whole rabbit hole here.

Speaker 1 (12:20):
Aren't we.

Speaker 2 (12:21):
And so I started following that and just seeing, you know,
how how far how far down does this go?

Speaker 1 (12:26):
Where does it take you next?

Speaker 2 (12:28):
And that was that was what I started writing about,
and so it kind of became a career.

Speaker 1 (12:33):
I had a supply chain logistics startup.

Speaker 2 (12:35):
I was just in tech, but it became really something
that became kind of all consuming because I felt like
people really needed to understand just how these dynamics worked
and how this was. I called it at the time
like inadvertent algorithmic radicalization or something like that in these articles,
because we didn't have a name for it. And I
was like, here's why, you know, you think you're
joining a crunchy mom, a cloth diapering group. And

(12:56):
then like bam, here's the QAnon stuff that shows up in
your feed. And that was how I got into it.

Speaker 3 (13:00):
Wow, a fascinating context. I did not know any of
that and it totally yeah, makes sense.

Speaker 1 (13:06):
I love that. That is where you went when they
tried to show you that content.

Speaker 3 (13:10):
Your book kind of starts with setting up how historically
there was the rumor mill before the Internet existed, there
was just the standard, traditional rumor mill amongst the.

Speaker 1 (13:21):
People of the villages.

Speaker 3 (13:23):
Yeah, and then separate from the rumor mill, you had propaganda,
which was when the government or influential people wanted to
get out a particular narrative. Can you talk a little
bit about what is propaganda? What defines propaganda?

Speaker 2 (13:38):
So I use it in the way that it was
used maybe in its origins, which comes from like the
Catholic Church after the Protestant Reformation, realizing that they've kind
of lost control of the narrative, right, and Pope Gregory
says to the cardinals, you have to go out there
and propagate the faith. And what he means by that, that's
where the term propaganda comes from. It's this

(13:58):
verb form with an imperative: you must go out
there and do this. And so it's information with an
agenda that benefits the person who is doing
that propagation, right, or who is requiring that propagation. So
there is some benefit to the person who is doing it.

Speaker 1 (14:16):
I don't think that's.

Speaker 2 (14:17):
Necessarily a pejorative, right, There is always some sort of
agenda in persuasive communication.

Speaker 1 (14:23):
But I think where propaganda goes...

Speaker 2 (14:25):
Where that term evolves is after World War Two, in particular,
it becomes, particularly in the United States, that thing that
the Nazis do, right, that thing that bad people do
to manipulate the discourse. And that's the kind of tenor
that it takes on. For most people who hear the
term right, they think of it as something that is
very inherently manipulative, something where it's persuasion, but it's persuasion

(14:48):
that's done with kind of a surreptitious element, kind of
a manipulative element. The audience isn't fully aware of what's happening.
And that's where I think that it has traditionally been
the purview of powerful people who have the ability to
control mass media, who have the ability to reach large
numbers of people all at once to shape a narrative.

(15:11):
And for most of human history, ordinary people didn't have
that power. Right. It was popes, it was leaders, it
was rulers, it was people who controlled the media establishment
who had that power. And so what I do in
the book is I kind of compare this rumor mill,
these unofficial narratives that pass from person to person, where,
you know, we share information amongst ourselves, compared to this powerful,

(15:31):
much more top down system. And in the age of
the Internet, these two things happen in the same place
at the same time, and that power to spread messages
to reach millions of people, including with an agenda, is
something that we actually all have now.

Speaker 1 (15:44):
And that's what I think is so interesting about this
moment in time.

Speaker 3 (15:47):
Yeah, yeah, I love the way that you framed the
history and how these things got smashed together.

Speaker 1 (15:53):
Is it just like makes it so?

Speaker 3 (15:56):
I feel like I developed such a clear understanding of
how and why we have gotten to where we have
gotten on the Internet, which is a terrifying place. And
before the Internet, you talk a little bit about Noam
Chomsky and Edward Herman and this idea of manufacturing consent,
which is something I'm super interested in and don't don't worry,
I will not linger on this for too long.

Speaker 1 (16:15):
But just set up for people.

Speaker 3 (16:17):
Like before social media, the reason why there was this
idea of media being controlled was because the wealthy were
the ones who were able to own the systems that
produce the media basically right. And then of course there's
advertising incentives and disincentives depending on who is you know,
paying the publication to exist, and you don't want to

(16:39):
piss them off, so you avoid certain topics as well
as like if you have a source in the government,
if you write an article that pisses them off, then
you might lose that source. So there are all these
like different factors that would contribute prior to social media,
to the media being controlled by basically like wealthy people
to some extent, Right.

Speaker 2 (16:55):
Yeah, so exactly. So, Chomsky writes this book called Manufacturing Consent.

Speaker 1 (17:00):
The phrase refers back.

Speaker 2 (17:01):
To Walter Lippmann and this book from the late nineteen twenties,
early nineteen thirties, which is where the title of my
book comes from too, because a lot of the stuff
we've known for a full century now, right, this understanding
of propaganda and influence and the role that powerful, top down.

Speaker 1 (17:17):
Figures can play.

Speaker 2 (17:19):
So what Chomsky says in that book and Herman his
co author, is that you have these filters, he calls them,
which are incentives that the media owners always have in
the back of their mind, maybe in the front of
their mind too, as they're thinking about how do we
shape our coverage so as not to piss off powerful people.
You might be thinking about this right now as you

(17:40):
look at some of the way that coverage is shaped
so as not to offend particular political administrations.

Speaker 1 (17:45):
Right.

Speaker 2 (17:45):
You see this very clearly outside of the United States.
Now you're starting to see it much more clearly in
the United States also. But that is the dynamic that
he writes this book about in the nineteen eighties. Manufacturing
Consent was written in the nineteen eighties, so prior to
the Internet really becoming the thing that it is today,
long prior to social media. And so what I wanted
to do with the book with Invisible Rulers, which is

(18:08):
a reference to Edward Bernays, who was a propagandist in
World War One at the same time, who was a
contemporary of Lippmann, right, was to say, Okay, now we
have a different media ecosystem. It's not newspaper editors and
TV broadcasters sitting there thinking what are my incentives and
how do my incentives shape that coverage. It's actually more

(18:28):
people who are like, how do the social media companies'
incentives shape what they curate for you? How does the
influencer with three million followers decide what to say?

Speaker 1 (18:38):
Right?

Speaker 2 (18:38):
Like, what are the incentives of this media ecosystem, and
are they the same as the last one?

Speaker 3 (18:53):
Another thing that I think we both found super interesting
was like, you do have the companies that are, you know,
curating their algorithms or moderating content in a particular way,
but also there is just like an inherently human element
to this. Can you talk a little bit about two
step flow and influence? I thought it was really interesting

(19:15):
you mentioned a study I believe where you know, I
think traditionally we would think if something is being shown
on the news, that is going to shape opinions. But
in reality, it's been shown that people's opinions are formed
maybe more strongly or equally by their personal connections with
other people. It's people that are influencing them directly. Can

(19:36):
you speak to that a little bit.

Speaker 1 (19:37):
Yeah.

Speaker 2 (19:38):
So that's a study from the nineteen forties, and what
it looked at was some women in Decatur, Illinois, right, and it
started to notice that when people talked about how they
formed political opinions. There was a presidential election that was
happening around that time. As people in the community were
talking about how they formed political opinions, what they were
saying to the researchers is that it wasn't what was

(20:00):
coming to them through their media consumption that was most
influential to them. It was who they were talking to
and there were a handful of women in the community
who were really really plugged into what was happening on
the media. And then they would sit there and they
would talk with their friends about it. They would talk
in their communities. So you can imagine, you know, you
would sit there, maybe you're playing cards, maybe you're at
a club, maybe you're you know, you're doing your thing

(20:21):
with your friends. I remember my grandmother used to come
home and I would hear all the gossip about what
she, you know, about President Reagan. This was in the eighties,
and I would hear these things from her, right, because
she would just she would be out at her club
with her friends. She's an Italian immigrant, right, and she
would be talking about these things with her community, her
church club.

Speaker 1 (20:38):
And she would come.

Speaker 2 (20:39):
Home and she would tell me all about the things
she had heard about the American president, and you know,
and she would relate them back to me in the
context of what she heard talking about her friends talking
with their friends at church. And so this was how
this is how it happens, right, It's not necessarily what
you read in the newspaper or what you hear on television.
It's like mediated through this second step. That's where
the two steps comes from. So the two step flow

(21:01):
is somebody in media says it. Your influential opinion leader
is the term that the researchers give to these influential women.
Your opinion leader says it, and then you are forming
your opinions in part based on how that opinion leader
is mediating that information for you. So you're not necessarily
hearing something on TV and magically changing your opinion. It's

(21:23):
much more through this process of somebody that I trust,
somebody that I like, somebody who's just like me, somebody
who's in the community is making this opinion formation.

Speaker 1 (21:32):
Process happen with me.

Speaker 2 (21:33):
And that's the difference between the old way of thinking
about it, which was called the hypodermic needle model, which
was you see this information and magically, as if you've
been injected, your opinion just changes, and that's not how
you actually form opinions. A lot of people think that
that's how it happens to people they don't like, right,
they think like, oh, well those people over there, you know,

(21:55):
they watch Fox News and then bam, they've changed their minds. Right,
But what's actually happening is much more This process of
like a culturation, Like they hear it, they talk about it,
they're talking about it with their friends. All of their
community feels this way, thinks this way, talks about it
at church, and so there's much more this cultural component.
And if you think about how you know, you hear
something crazy on the news, it doesn't magically change your mind.

(22:18):
So this idea that it would magically change somebody else's
mind is sort of wrong, But we tend to think that.

Speaker 1 (22:24):
Way anyway, right right. I first want to say that
when you were speaking, I was imagining your grandma like clubbing.
When you were saying about the club I was like.

Speaker 2 (22:32):
I was thinking, like like the bridge clubs, and that
is so sad.

Speaker 1 (22:37):
Yeah, what makes somebody more influential in their community?

Speaker 2 (22:44):
I think it's well, okay, so in this in the
study that they were doing, it was actually just that
they had a lot of friends.

Speaker 1 (22:50):
They were just very much plugged in.

Speaker 2 (22:51):
They talked to a lot of people, they were you know,
there are just certain people in a community who I
think are highly relatable, very charismatic. You really go above
and I know, I mean this is where this is
where that that very fine line when you guys reached out.

Speaker 1 (23:09):
I was.

Speaker 2 (23:09):
I watched a bunch of your past podcasts and your
experiences and and even just saying, you know, church groups
sometimes for some people can be like WHOA okay, But
I think it is that that question of what makes
people find other people believable, and it's that resonance. Right,
Are you a good storyteller? Do you have that ability

(23:30):
to draw people in? And there are you know, there
are some people who have that that combination of charisma relatability,
and they're very much out there. And you see that
in the you know, in your local community too. I imagine,
you know, I can think of certain of my neighbors
who have it. Not me, but there are people who

(23:50):
are just you know, in the neighborhood. They know, they
know everything, They're friends with everybody. They can tell you
what's going on in everybody's lives. They're the people who
check in on the elderly neighbors things like this, right,
who are just very deeply, deeply plugged in. And so you
had that in that offline model of socialization, maybe you know,

(24:11):
in the before we were all on the internet, and
then if you were to port that into the age
of social media, this is where you start to see
that figure of the influencer emerge, right, the people who
have that and do it kind of at scale with
an audience that feels like friends. But this is where
you start to get into some of those dynamics of
like para sociality.

Speaker 1 (24:29):
Right.

Speaker 3 (24:30):
We see people who seem like they're just like us,
and so it's sort of simulating that neighborhood vibe, except
we don't necessarily know where they're getting information or what
they're getting paid, if they're getting paid, or who is
paying them exactly? Can you explain the term bespoke realities
and how we are all living in them now?

Speaker 2 (24:48):
So bespoke realities, I was trying to come up with
a way to describe the unique power that we all
have to just decide to pick and choose.

Speaker 1 (24:58):
Almost like, do you remember Choose Your Own Adventure books?

Speaker 2 (25:02):
Yeah?

Speaker 1 (25:02):
Yeah, yeah, I feel like that's a lost art that
my kids.

Speaker 2 (25:05):
I don't think my kids have ever seen a Choose
Your Own Adventure book. Actually, I don't know that they
have them in the age of like Kindle, and maybe
maybe they do.

Speaker 1 (25:14):
But well, video games kind of have it, you know
what I mean?

Speaker 2 (25:16):
Yeah, you can kind of fork it I guess, like
just where you're going.

Speaker 1 (25:20):
But I used to love those when I was a kid.

Speaker 2 (25:22):
But you could also kind of, like, you know, dog-ear
the page and go and look and see if
you're going to die on that one and decide.

Speaker 1 (25:27):
To go together.

Speaker 2 (25:29):
But no, it was the Uh, it's the phenomenon where
you know, you can really choose to self select into
certain media and topical universes where like this is reality
for you, Like this is what you're going to see,
and you have the power now to just say this
is what it is, and it's algorithmically reinforced. Maybe we'll talk

(25:50):
about that. But like when I was going down my
anti vaccine rabbit hole, where I was doing it very intentionally, right,
it was it was a very conscious process. I wanted
to see what was going to happen. So I did
go join the flat Earth group, right, I did go
join the chemtrails group, and I had a sense of
what was likely to happen. But that's because once you
join those groups, what becomes very interesting is that the

(26:14):
platform wants to keep you there, and so it begins
to push you more and more of that kind of content,
begins to recommend to you more and more of that
kind of content, like you've given it a signal, you've said,
I am interested in this, and it.

Speaker 1 (26:25):
Says, okay, she wants that.

Speaker 2 (26:26):
We're going to give her more of that, and then
we're going to show her this other thing that people
who were kind of similar to her statistically want to
see more of. Also, even if she's never typed in
that term, like I never typed in the term QAnon
when it started to push it to me, it was
entirely a push, not a pull, right? Yes. And
so you start to see this process though, where then
I did go and I joined that group, right, And
then here's more QAnon stuff, and then here's more posts

(26:49):
from the QAnon groups in the feed because
these are highly highly active groups because people are in
there trying to figure out reality. Right, Oh, this Q
drop said this thing, let's all go through it and
figure out what the hidden meaning is.

Speaker 1 (27:03):
These are highly active, highly.

Speaker 2 (27:05):
Engaged groups, and so the platform decides, oh, that's fantastic,
here's a highly engaged group. Once somebody's in this, they're
going to spend a lot more time on our platform.

Speaker 1 (27:15):
If they keep engaging with it.

Speaker 2 (27:17):
And then these become your friends, right, these become the
people that you spend all of your time with. It
starts to really supplant the other kinds of stuff that
you might have seen. You're not seeing the baby pictures,
you're not seeing the wedding pictures. You're seeing post after
post after post from this group that you're now spending
your time with, and gradually, over time, like you have
the power to say, you know what, I don't want

(27:39):
to see the stuff that's telling me that this is bullshit.
I just want to see this right. And so you
have the ability actually to just say, like, this is
the information environment that I am now going to self
select into more and more. And so you almost kind
of participate in that process yourself. And I'm giving you
an extreme example in light of like where we are
today on this podcast, but you can also do it

(28:00):
and you know, much less, much less extreme groups. You
can decide, like, you know what, I'm only going to
see left wing stuff, I'm only going to see right
wing stuff.

Speaker 1 (28:09):
I'm only going to see dog videos.

Speaker 3 (28:11):
You, like, want one dog video, then you see seventeen on
your feed the next day?

Speaker 1 (28:15):
Yeah, yeah.

Speaker 2 (28:16):
And so that's the process where you have the ability
to just decide like this is my information environment, now
is it?

Speaker 1 (28:22):
And like this is where I'm going to go.

Speaker 3 (28:24):
But most of the time it's not conscious, right, because
it's just that we are looking at something and it
notices how long we're looking at it, or we click
on it out of curiosity, and then the algorithms will
learn that that's the kind of content that keeps us
on the platform and keeps us engaged, so then it
feeds us more of it. You know, I tend to
have a very panicked news feed. I don't want that

(28:44):
to be my feed. It just knows that that's what
I engage with, you know, that's what will get me
to spend more time on the app.

Speaker 1 (28:50):
And then I have to like be like, oh wait, stop,
stop stop. But you know, I have to force that.

Speaker 3 (28:54):
Into my consciousness or I could just go on autopilot forever.

Speaker 1 (28:57):
And another thing that I found interesting that you spoke
about that I've never really thought about. Well, I have kind of,
but not in the way that you put it. Like
I'm very susceptible to serendipity and coincidences and stuff like this,
and these algorithms create almost like a mind reading effect
where you're like I didn't even say that out loud,

(29:17):
or like, wow, this is so, this is such a coincidence,
exactly what I need, yes, every time, yeah, and then it
feels extra true. I think.

Speaker 2 (29:27):
I think people also don't necessarily realize just how much
data mining is going into what they think is.

Speaker 1 (29:33):
Serendipitous and personalized for them.

Speaker 2 (29:35):
And this is where just the sheer amount of number
crunching that's happening behind the scenes, where you think you're
very unique, and then you realize that it's decided that
you're really just very much you know, like me living
in San Francisco doing these things. Obviously, okay, California, these behaviors,
these things naturally anti vaccine, right like, And it doesn't

(29:57):
mean necessarily that it is true all of the time.
It clearly isn't, but just the higher probability is there,
it is worth it for the algorithm to push it
to you. And if it doesn't feel serendipitous,
you're going to ignore it. But if you do feel
some resonance there, you're going to click. And that's what's
going to be the thing that then makes it decide
to go and do it again and again. You're not

(30:17):
going to notice the ones that don't hit, but the
ones that do generate some curiosity, it's going to start
that process.

Speaker 3 (30:22):
Right, So we're being influenced by all of these different factors.

Speaker 1 (30:26):
At the same time.

Speaker 3 (30:27):
We have like people in our communities or people we
perceive to be in our communities because we have a
parasocial relationship with them. We have maybe governments or corporations
or whoever is engaging a marketing team like actively trying
to sell us stuff. And then we have the algorithm
trying to get us to stay on the app with
whatever is going to engage us the most, which of.

Speaker 1 (30:47):
Course makes it very difficult.

Speaker 3 (30:49):
Once you start thinking about it too much, I'm like,
do I even have my own mind? Like, how do
you even think for yourself? It seems impossible. Yeah, in
this day and age, I mean, so, why are we
so drawn to disinformation and misinformation? What is it about it?

Speaker 2 (31:05):
So, first of all, I think misinformation is a tough word.
I generally don't love it. I know that a lot
of people in my field use it. It's one that
I don't like because I feel like it misdiagnoses the problem.
So it generally is like somebody who is wrong about something,
like they get a fact wrong, but misinformation implies that
if you just gave them a better fact, it would

(31:27):
change their mind, right, like if you know, And that's
not what's happening. Like when I was in the anti
vaccine groups, the people who are in there, they're not
in there because they haven't heard that vaccines don't cause autism, right,
or that vaccines don't cause SIDS. It's that they don't
believe it, and they don't believe it on such a

(31:47):
foundational deep level because they don't trust the people who
are giving them the accurate information.

Speaker 1 (31:52):
So it's not the information. There's no information gap there.

Speaker 2 (31:56):
It's not like, oh, if we just fact checked this
one more time, we would solve that problem. It's a
real trust issue, right, It's very much an issue of trust.
It's an issue of do you believe an authority figure
or This is one of the reasons why when you
look at that progression that the algorithm sends you down,
there's something that begins to emerge from that. You have
some commonalities between anti vaccine and chemtrails and flat earth,

(32:19):
in that these are all, roughly speaking, in the realm
of pseudoscience. But then you might ask, like, what does
pizzagate have to do with it? What does QAnon have
to do with it? And the answer is that there, too,
the sort of core component in those belief systems is
an incredibly deep distrust of government and incredibly deep distrust
of authority, a belief that people are actively lying to you,

(32:40):
that they're keeping the truth from you, that they've concealed
it for years, and that in order to do that,
there is like a degree of depravity in the political
elites that lets them do that. And that's where that
intersection happens. And that's where like the algorithm doesn't understand
what it's keying off of. It can't articulate that that
is the commonality between group A and group B there,

(33:03):
it just knows that there's a very high degree of overlap,
and so that's where it serves it up on
the platter.
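
To make the overlap dynamic she is describing concrete, here is a minimal, hypothetical sketch (toy data and group names invented purely for illustration, not any platform's actual code) of a recommender that only counts which groups tend to be joined by the same people, and therefore suggests QAnon-style groups to anti-vaccine group members without ever "understanding" the shared distrust:

```python
from collections import Counter
from itertools import combinations

# Toy membership data (entirely made up for illustration):
# user -> the set of groups that user has joined.
memberships = {
    "user_a": {"crunchy_moms", "anti_vaccine"},
    "user_b": {"anti_vaccine", "chemtrails"},
    "user_c": {"anti_vaccine", "qanon"},
    "user_d": {"anti_vaccine", "qanon", "chemtrails"},
    "user_e": {"dog_videos"},
}

# Count how often each pair of groups is joined by the same person.
co_membership = Counter()
for groups in memberships.values():
    for pair in combinations(sorted(groups), 2):
        co_membership[pair] += 1

def recommend(joined_group, top_n=3):
    """Suggest the groups that most often co-occur with the one just joined."""
    scores = Counter()
    for (g1, g2), count in co_membership.items():
        if g1 == joined_group:
            scores[g2] += count
        elif g2 == joined_group:
            scores[g1] += count
    return scores.most_common(top_n)

# Joining only the anti-vaccine group is enough to get pushed toward
# chemtrails and QAnon -- a push based on statistical overlap, not a pull.
print(recommend("anti_vaccine"))
```

With this toy data, a user who joins only the anti-vaccine group gets the chemtrails and QAnon groups suggested first, purely because other members' memberships overlap.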

Speaker 1 (33:08):
Now it doesn't do that anymore? Did the platform,
like, change that?

Speaker 2 (33:12):
But that was what was happening at that moment in time,
back in the twenty sixteen kind of time frame when
it was doing this twenty sixteen to twenty eighteen timeframe.
But that's what you're starting to see happening, right.
You're seeing that question of, like, what is the
underlying psychology and motivation that people are
looking for that makes them not only click on this

(33:33):
content once, but become a sustained participant in the community.
And that's where I feel like misinformation is completely inadequate
as a frame for understanding what's happening there. Do you
have a term that you like? I actually just like propaganda.
I feel like it's the best word for the content, right,
because it does get at this question of what is

(33:56):
the motivation behind the creators of the content, behind the
groups that are putting it out.

Speaker 1 (34:03):
They may sincerely believe.

Speaker 2 (34:05):
It, right, The people who are producing that content oftentimes
generally do sincerely believe it, and so it's content that
they're producing with a particular motivation, very often a political motivation.

Speaker 1 (34:15):
Right.

Speaker 2 (34:15):
They want to see legislation to do the kinds of
things that you're seeing, you know, being done in Health
and Human Services today or at a state level, very
particular pieces of legislation that they want to see introduced.
QAnon had a whole set of things that it wanted
to see happen with regard to show trials and things
like this, right, So there is a series of outcomes
that they want to see as a result of the

(34:36):
beliefs that they hold. And so that's why I feel
like propaganda is a perfectly adequate word, and I wish
that we didn't have thirty other words for it, right,
Disinformation I use and that's where I actually spend like
I've written a ton of papers on it. In the
context of state actors, where the question becomes from a
geopolitical perspective, how do you see nation states actually recognizing

(35:01):
that this dynamic happens and that they can use it
to exploit and weaken the societies of their adversaries. And
that's where when you get at questions like what does
Russia do?

Speaker 1 (35:11):
What does China do? What does Iran do?

Speaker 2 (35:13):
You can name pretty much any country and stick it
in that sentence there, like the US does it too?
What is the process by which you can exploit this
ability to create deep distrust and to divide societies from within?
That's where I think disinformation campaigns are. That that I
think is where like the term is actually useful, where
there's a real clear intentionality.

Speaker 3 (35:35):
There, right right, right, I'm so interested in what's happening
for all of us, not just conspiracy theorists. Emotionally, when
we click on something that is clickbait or is rage bait,
or you know, like, I feel like so much of
it seems tied to identity, like I identify as somebody who

(35:56):
you know, feel or rather I feel special when I
know something that other people don't know, or I'm a
good person, And so I'm going to engage in justice campaigns,
whether they be rooted in reality or not.

Speaker 1 (36:11):
Can you talk.

Speaker 3 (36:11):
About how more extreme content we see more of it
than other content, but it's actually coming from a smaller
amount of people.

Speaker 2 (36:22):
Yeah, So there's a term called majority illusion where, you know,
when we were talking about the opinion leaders who are
just very well connected and very popular. A lot of times,
if you form an opinion based on what
it seems like the majority of people around you believe,
and you're in a particular niche, you're going to think
that most people believe, you know, whatever, the opinion is

(36:46):
based on what's around you, whether or not it's actually
you know, you're not necessarily going to go look at
polling to.

Speaker 1 (36:51):
Try to figure that out.

Speaker 2 (36:52):
You're going to say, like, well, everybody in my community
thinks this, that must be what most people believe. So
you start to see these interesting phenomenon where people people
decide what is true or what is real based on
what surrounds them. That's what starts to seem normal. And
what's interesting on social media is this tendency towards the
extremes because social media rewards. The Atlantic had kind of

(37:18):
a Helen Lois. The Atlantic had a nice name for it,
like the extreme of files, right, people who they don't
just you know, you're not just expressing a political opinion.
You're taking like the most extreme version of the opinion
you possibly can. And this is why everything feels like
a caricature. There's no normal, middle of the road liberal opinion.
There's like this crazy, insane you know, like the kinds

(37:41):
of people, the Libs of TikTok goes and, like, I
call it nutpicking, right, like grab some random person and
it's like this is the avatar for like this is
what liberals actually believe.

Speaker 1 (37:50):
All leftists think this, yes, and you see it on
the other side too, right, And and so there's just
these like these sort of you know, kind of extreme fringe.

Speaker 2 (38:00):
You'd have to really kind of delve down into the
belly of TikTok to find some of this stuff sometimes,
but they go and they find it, and they pull
it up and they're like, this is what the left
or the right actually believes.

Speaker 1 (38:10):
And it moves in.

Speaker 2 (38:11):
This direction where you start to see that idea of
this extreme belief becoming the thing that people believe that
a certain political group or identity group actually like.

Speaker 1 (38:23):
They start to think that that is the norm.

Speaker 2 (38:25):
And you can see this actually reflected in political science
surveys where they'll go and they'll pull people on, like
what do you think the majority of conservatives or liberals believe?
And they'll list a whole bunch of different opinions, and
they will have the actual polling, right, what percentage of
liberals believe in defunding the police?

Speaker 1 (38:40):
Right? It's actually a very very.

Speaker 2 (38:41):
Small number, but conservatives will tell you that like ninety
nine percent of liberals I just made that stat up
believe in defunding the police because that's what they think
they're seeing on Twitter, for example.

Speaker 1 (38:51):
Right, it is what they're seeing on Twitter, right exactly,
you know.

Speaker 2 (38:54):
And that's because you start to like, as that becomes
identified as like the liberal belief, for example, people who
don't hold that belief will actually be kind of quiet.

Speaker 1 (39:05):
They will self censor.

Speaker 2 (39:06):
They don't want to be seen as expressing something that
is not the correct belief or the normal belief. And
then on the other and so it actually kind of
reinforces that tendency, and so you see groups that will
tend towards the extreme. And then on social media, that
process of mocking and creating this kind of intergroup warfare
makes it even more costly to speak up and say like,

(39:28):
I don't actually think that right, So you see the moderates,
the people who and I don't mean moderates in like
a political identity sense, I just mean people who don't
hold extreme beliefs, actually become more and more silent. They
choose to self censor rather than to speak up. So
it actually really trends towards the extremes. And then the
algorithm also is rewarding that by surfacing high engagement, high

(39:50):
emotionally resonant content. So you have that process be even
more further rewarded for the influencers themselves, who can actually
make money by being those highly highly divisive content producers
as well, so there's a couple of different incentives that
intersect that move people really out into the extremes.
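
To make the majority illusion she mentions concrete, here is a small, hypothetical simulation (all numbers invented for illustration, not survey or platform data): a view held by only five percent of accounts can look like the majority view in most users' feeds when the accounts that hold it are the most followed:

```python
import random

random.seed(0)

N_USERS = 1000
EXTREME_SHARE = 0.05      # only 5% of accounts actually hold the extreme view
HUB_FOLLOWERS = 400       # the extreme accounts are highly connected "opinion leaders"
NORMAL_FOLLOWERS = 20     # everyone else has a small audience

users = list(range(N_USERS))
is_extreme = {u: u < int(N_USERS * EXTREME_SHARE) for u in users}

# Each author gets an audience; extreme accounts get hub-sized audiences.
audience = {
    u: random.sample(users, HUB_FOLLOWERS if is_extreme[u] else NORMAL_FOLLOWERS)
    for u in users
}

# Invert the relationship: a user's feed is the set of authors whose audience they are in.
feed = {u: [] for u in users}
for author, readers in audience.items():
    for reader in readers:
        feed[reader].append(author)

# Count users whose feed makes the extreme view look like the majority.
fooled = sum(
    1
    for u in users
    if feed[u] and sum(is_extreme[a] for a in feed[u]) / len(feed[u]) > 0.5
)

print(f"Share of accounts actually holding the view: {EXTREME_SHARE:.0%}")
print(f"Share of users whose feed makes it look like a majority: {fooled / N_USERS:.0%}")
```

With these toy numbers, a typical feed contains roughly as many extreme as non-extreme authors, so a large share of users come away seeing a rare view as the apparent majority.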

Speaker 3 (40:09):
Yeah, with all of this stuff, it's so tempting to
be like and it's all because of the shadowy figures
at the top. But there's like, like with the manufacturing
consent propaganda model, there's a variety of filters working together
at the same time to make this happen all at once.
It's not just the one. It's not just Russia, or
it's not just whatever government. It's a lot of things happening,

(40:31):
including just like human tendencies.

Speaker 1 (40:34):
Yeah, and I like you said somewhere in the book,
well you quoted Facebook where it said our algorithms exploit
the human brain's attraction to divisiveness, and that just like
goes to show you have the other and we
are addicted to hating the other, and it'll take us.
I've gotten like twelve hours on my phone before just

(40:55):
hating other people. What, you've racked up twelve hours on
your phone? Jesus Christ.

Speaker 2 (41:04):
I think it's actually, like, an honest admission, right.
It's very easy to do. And there's a there's a
phrase and like you know, trust and safety and tech
policy in the fields that I work in, which is
like the problem with social.

Speaker 1 (41:15):
Media is people. Right, that's, like,
a great joke.

Speaker 2 (41:20):
But because I think there's and this is where a
lot of times I'll get, yeah, you'll get media inquiries
that want you to talk about the algorithm, right, where,
which is always like one word, the algorithm, like the
only one which is there and it's real. And you know,
as we were talking about it, maybe it is unreasonable to
expect people to realize why the recommender system is pushing

(41:44):
them that thing, right? It feels serendipitous. You know, you
know this is not a I think it's actually kind
of unreasonable to expect the average person to understand how
it works and what it's doing.

Speaker 1 (41:53):
And you know, this is this is not.

Speaker 2 (41:54):
You know, you shouldn't need, like, a, you know,
some like operating license to use it.

Speaker 1 (42:00):
But you know I do. I do, like you know,
I have.

Speaker 2 (42:02):
I have an eleven-year-old who likes YouTube a lot,
and I am always trying to explain to him what
the AutoPlay function does and why it's evil. You know,
you know, no, you do like this is not to
benefit you.

Speaker 1 (42:18):
It's just to benefit it, right, this.

Speaker 2 (42:20):
Is why it's showing this to you, you know, this
is why, Like, let me explain to you how much
money the streamer you are watching makes based on keeping
you there for this.

Speaker 1 (42:30):
Like hour long video.

Speaker 2 (42:32):
Let's talk about how many ads are interspersed in this content.

Speaker 1 (42:35):
Let's go and look at the view counts, Like, let
me do the math for.

Speaker 2 (42:38):
You and explain how much money. And this is not
to knock this streamer's ability to make money, like God
bless him. Like he's out there, you know, playing video games.
God knows how many hours a day. But let me
explain to you, like the math and the incentives and
the dynamics that are happening right here. And and I
think that doesn't mean that he listens to me, right,
I still fight with him and we go through all

(42:58):
kinds of Like turns out if you paste a URL,
moms listen to this one.

Speaker 1 (43:03):
So if you paste a URL...

Speaker 2 (43:04):
A URL from YouTube into Google Docs, you can just
watch it embedded in Google Docs.

Speaker 1 (43:07):
So even if you're blocking YouTube, it'll AutoPlay in the
Google doc.

Speaker 2 (43:11):
Yes, yes, So basically you're trying to like appeal to
your kids better nature even as they're trying to just
find a way to get around.

Speaker 1 (43:18):
The parental controls.

Speaker 2 (43:20):
But these are the sorts of things where you know,
it is incredibly entertaining, It really taps into it knows
exactly what you want.

Speaker 1 (43:28):
It is very attuned to you. And I think the
combination though, is of two things.

Speaker 2 (43:34):
One it's helping people understand like how it works, I
think is actually very important. But then on the other side,
it is trying to create these like circuit breakers that
maybe give people a little bit more in the way
of the ability to not fall down those those rabbit
holes quite so, quite so easily.

Speaker 1 (43:51):
That'd be nice.

Speaker 3 (43:52):
Yeah, yeah, I will ask you more about that. But
I, as somebody who's, like, trying to promote an album right
now on social media, which is a fucking nightmare. I
hate it more than anything I've ever done. But please
listen to my album. It's called crost Oh oh my god.

(44:13):
I notice it in myself and I've talked about it
a little bit on the podcast. Like the posts of
mine that do the best, that go viral tend to
be the more polarizing ones, and they are like my
genuine opinions, like I'm not posting anything that I don't believe.
But sometimes I'll catch myself and I'll be like, Okay,
why is it that I'm trying, Why is it that
this is what I'm posting right now? It's because I

(44:34):
know that, like it might make people angry, which makes
people click, and I don't want that to be my
reasoning for anything that I create, you know what I mean.
But like we live in a landscape right now where
first of all, AI has taken over everything. I mean
people in all industries, like we're in precarious financial positions.
Influencer or person who is reaching people with their art

(44:57):
like is still a viable option on social media, So
balancing trying to reach people with like trying not to
let that incentivization, like, rot your brain.

Speaker 1 (45:08):
You're not immune to it even if you know what's happening. Yeah,
I mean.

Speaker 3 (45:13):
It's icky, but I also like I don't know how
to escape it. Like I feel like it's the entire
aquarium and I could, like someone would have to like
dump the aquarium out, you know, Yes, yeah.

Speaker 2 (45:22):
No, You're absolutely right, And it's the feeling of if
you don't do it, someone else will, And that is
the issue of you know, it's.

Speaker 1 (45:29):
The attention game. And I get it.

Speaker 2 (45:31):
I you know, I used to be very very active
on Twitter, including as a you know, political person with
opinions in San Francisco. Right, I lived in San Francisco.
I write about this like a tiny bit in the
in the book. You see it like kind of peek
through here and there where I'm writing about, you know,
my frustrations with the school board and stuff like that,
and I'm like, all right, I know how this works.

Speaker 1 (45:48):
Yeah, I bet you had some banger tweets.

Speaker 2 (45:53):
Well, it's the, you know, morally righteous indignation
is the tone that it keys off of, and people
will respond to that unfortunately.

Speaker 1 (46:05):
Yeah.

Speaker 2 (46:06):
And I say unfortunately because because they then feel like
they are also in the fight, right, in the righteous fight,
and it is the thing that will get people off
the fence. And it's not bad. I mean I think
that that's what activism is. And this is the this
is the ecosystem that we live in now. And as
you know, when you have something creative that you want

(46:26):
to promote, there is that same dynamic of like you
made something beautiful and you want people to see it.
And so this is the ecosystem that you have to
operate with it. I mean I had to promote a
book that's even harder because there's like nothing visual, right,
screenshot the cover a couple more times, right, But no,
it's it is that question of like how do you
break through? And that is that challenge of what do

(46:51):
you do when people are locked into a feed that
they don't control? And so some of what I write
about from a standpoint of if you were redesigning, as
we're designing a better social media, which is part of
what I work on. I'm a professor at Georgetown.
Part of what I study is like can you design
a better system?

Speaker 1 (47:07):
Right?

Speaker 2 (47:09):
One of the things, whatever you think of Bluesky,
I know it has a reputation as, like, lib Twitter,
and it has its own problems with vociferous, insular nutpicking,
but it has feeds where you can just kind
of like click and change from one to the next.

Speaker 1 (47:24):
And I actually love that because it's a way for
users to see.

Speaker 2 (47:29):
In that immediate moment, like what does it look like
if you actually do just want to have the dog
feed for a change, Like if you just want to
turn off the politics, like.

Speaker 1 (47:37):
What does this look like?

Speaker 2 (47:38):
And I think more than anything else when you get
people using it, and I do like interviews where I
ask people how does it make them? You know, how
does make them feel? Why do they create fees? I
ask a lot of feed creators this question, and they
just say, like, I wanted to create a feed that
highlighted art, for example, is one and just so that
I could go and use it and just see a

(47:59):
bunch of art instead of a bunch of politics and other things,
and I wanted other people to have that experience. And
when they talk about these things, it's really interesting because
they see it as a way to give people more
control and create an experience that doesn't have that that
feeling where you come away just feeling exhausted and bad.

Speaker 1 (48:18):
And it also to them feels like.

Speaker 2 (48:21):
They're giving users the you know, they're giving other users
like them the ability to just have a little bit
more agency and control over their own experience. And this
is the sort of thing that I think most of
the big platforms don't do because it doesn't keep you
on site, and it does mean that, you know, the
people who want to push ads at you are not
having that necessarily, that integral experience so interesting.

Speaker 1 (48:43):
It's so crazy that all of this is just for
people to sell like plastic pieces of shit. Yeah, it's
so insane.

Speaker 3 (48:54):
Well, and or to sell ideas. I mean, yeah, yeah,
it's amazing. It's all to make money.

Speaker 1 (48:59):
Yeah.

Speaker 3 (49:01):
And that's where we will leave part one for now.
Come back next week for part two for even more information.
And Megan, I don't know, tell me a content story.

Speaker 1 (49:11):
Oh okay, well this is this is like very the
light version of how this can go. Essentially, there was
this horse, like a pony with rain boots on that
was obviously AI generated, and I wrote under it, This
is my horse, Gumpy, and tagged my friend. Just, I
don't know why I thought it was funny. I woke

(49:33):
up the next day. I'm not really active on the internet,
like the social media thing, thousands of comments that is
not your horse, Yes it is her horse. No it's not.
That's so stupid. And I was like, oh my yeah.

(49:57):
So yeah, the back and forth, and then somebody
is like, my bad, okay, maybe it's her horse, you know,
and the other person's like, no worries, like, it
happens to all of us, and I was just like
oh my god. And again, like you know, this was
back before I before you were yes, so yeah, I
just I think that's, you know, a funny little example.
But people are getting more savvy, are they.

Speaker 3 (50:20):
I have been sent like seven Sora videos this week
from people not like noticing.

Speaker 1 (50:26):
No well the past three weeks.

Speaker 3 (50:28):
But yeah, yeah, oh my god, no, No, it's I mean,
when you log into TikTok, like, no, I know, but
it says Sora.

Speaker 1 (50:34):
Yeah, but not immediately.

Speaker 3 (50:36):
It usually takes like five seconds and like by the
time we see it, like we're looking at the video.

Speaker 1 (50:41):
So it's like it's like the Gorilla.

Speaker 3 (50:42):
Experiment right right with the ball if you all don't
remember google the Gorilla experiment. Basically, it's just an example
of like you can be really focused on something and
miss something else, really blatant in your face because your
focus is directed toward

Speaker 1 (50:55):
The other thing.

Speaker 3 (50:55):
Sure, we've definitely talked about the Gorilla experiment.

Speaker 1 (50:59):
I just want people to watch it because I was
like really into it and was like, damn I missed it.
It's so interesting.

Speaker 3 (51:05):
Yeah, just another example of bespoke realities. You know, whatever we're
focused on is the thing we.

Speaker 1 (51:10):
See yup, yep, yep, yep. Anyway, thank you for spending
another week with us. Make sure to rate us five stars.
We have merch now, we have merch now, big deal.
Uh so do all the things. And we can't wait
to see you again next week. And as always, remember
to follow your gut, watch out for red flags, and
never ever trust me. This has been an exactly right

(51:35):
production hosted by me Lola Blanc and Me Megan Elizabeth.
Our senior producer is Gee Holly. This episode was mixed
by John Bradley. Our associate producer is Christina Chamberlain, and
our guest booker is Patrick Kottner. Our theme song was
composed by Holly amber Church. Trust Me as executive produced
by Karen Kilgariff, Georgia Hardstark, and Danielle Kramer.

Speaker 3 (51:56):
You can find us on Instagram at trust Me podcast
or on TikTok at trust Me Cult Podcast.

Speaker 1 (52:01):
Got your own story about cults, extreme belief, or manipulation?
Shoot us an email at trustmepod at gmail dot com.

Speaker 3 (52:07):
Listen to trust Me on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.