
December 10, 2025 49 mins

Renee DiResta, professor and author of Invisible Rulers: The People Who Turn Lies Into Reality, joins the show again to dive deeper into disinformation, propaganda, and "pseudo-events," or manufactured controversies and news stories.

She explains why it’s so difficult to tell fact from fiction online, how propagandists stoke real, often legitimate tensions to create division, how American politicians are amplifying false information instead of shutting it down, and possible policies that could help reduce our culty social media silos.

SOURCES

Invisible Rulers: The People Who Turn Lies Into Reality


The Image: A Guide to Pseudo-events in America



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Trust me?

Speaker 2 (00:02):
Do you trust me?

Speaker 3 (00:04):
Right?

Speaker 4 (00:04):
Everly? And you wistor tr us.

Speaker 1 (00:06):
This is the truth, the only truth.

Speaker 4 (00:09):
If anybody ever tells you to just trust them, don't.
Welcome to Trust Me, the podcast about cults, extreme belief,
and manipulation from two savvy Internet users who've actually experienced it.

Speaker 5 (00:23):
I am Lola.

Speaker 4 (00:23):
Blanc, and I am Megan Elizabeth. And today our guest
is once again Renee DiResta, professor and author of Invisible Rulers:
The People Who Turn Lies Into Reality. Last week we
talked about propaganda versus disinformation on the Internet and how
we're all living in bespoke realities. And this week we're
going to chat about what she calls pseudo-events, or

(00:45):
controversies or news stories that are entirely manufactured, how easy
it is for images and videos to be taken out
of context and feed disinformation, and why it's so difficult
to tell fact from fiction.

Speaker 6 (00:56):
That's right. She'll tell us how propagandists purposely stoke real,
pre-existing tensions in the countries they're targeting, how politicians
are amplifying false information instead of shutting it down, and
some possible policies that could help reduce our culty social
media silos.

Speaker 4 (01:13):
Before we learn so much more from Renee DiResta
about why we're all fucked on the internet, Megan, what's
your cultiest thing of the week?

Speaker 6 (01:22):
So my cultiest thing is looking into this Korean religious sect.
It's called Grace Road and it's in the news a
lot lately because it's been moving over to Fiji, which
is interesting.

Speaker 5 (01:37):
The leader, her name is Shin.

Speaker 6 (01:39):
I'm not even gonna attempt to say her last name
because I don't want to butcher it. But she did
get arrested because they take part in something called threshing
where essentially the leader would just beat the shit out
of congregants, and my god, she's in jail right now.
But her son is taking over. And again I'm just
seeing reports from people in Fiji being like, dang, there's

(02:01):
a lot of South Korean people here right now. They
all seem to be pregnant because they're trying to grow
the cult by having new members, starting a lot of restaurants,
even a grocery store, which is something we see a lot,
you know, get into the food biz. So the fact
that there's even grocery stores is interesting. The oldest son
or daughter taking over is in effect, and I can't

(02:24):
believe we haven't done an episode on this yet.

Speaker 5 (02:26):
It's really horrible to watch.

Speaker 6 (02:28):
A lot of these videos of them beating people were made public.

Speaker 5 (02:32):
I don't know.

Speaker 6 (02:33):
It's a hard one to look at, but I'm really
curious to see what's going to happen there. And one
of the experts that's talking about it a lot, I'd
also love to have him on, but he is from
South Korea and he's just talking about how the
political distress there has made a really perfect
silo for a lot of extreme belief to start bubbling.

Speaker 5 (02:54):
And then.

Speaker 6 (02:56):
Some leaders are really good at using cultural zeitgeist things
like K-pop and putting it into their religions. So it's
very it's very fascinating stuff, very sad, very scary, but
I want to keep an eye on it, and I
would love to interview a survivor.

Speaker 4 (03:12):
That sounds fascinating. There are so many cults in other
countries that I feel like we have to talk to
people from them because the different cultural context does make
it so interesting. I know, I can't wait to learn
more about it. I guess we'll talk about it in
a future episode. What about you, What's your cultiest thing
of the week. My cultiest thing of the week is

(03:32):
that I almost got scammed again, but more directly, this
time by a social media marketing guru.

Speaker 5 (03:46):
Uh.

Speaker 4 (03:46):
There are so many of these people online, and some
of them have legitimately useful things to say, right, and
others this man is Okay. I saw some videos from
him on like TikTok or whatever, talking about like music
promotion and a lot of the time people will have

(04:06):
really surface level stuff to say, and he was talking
more about like core identity stuff and like staying aligned
with who you really are and, like, values. Yeah,
I mean like stuff that I care about because for me,
like just chasing the numbers it feels really like stupid
and pointless. And he's talking about stuff that appeals to me.
So you can like set up a one on one

(04:29):
on his website and get a consultation or whatever.

Speaker 2 (04:33):
So I do the consultation. It's free, why not?

Speaker 4 (04:35):
And Okay, So let me tell you this guy is
so charismatic. And he's like this guy in New York
who's I don't even know, like he was in a band,
his boyfriend was in a band, something, I don't even
know anything about this man. But like the way that
he talked, I was I felt so seen and so

(04:57):
heard and so understood. And he just had this like
cadence and charisma, and as he's talking, I'm like, oh,
I get it.

Speaker 2 (05:05):
This is how it happened because I.

Speaker 4 (05:08):
Like want to listen to whatever he says to me,
and he's asking me these like deeply personal questions for
a social media marketing call, and like some of that
stuff is necessary, but he's getting like really like, so
what is it about that? What does it make you
feel when that happens? And how do you feel about

(05:28):
what I'm saying to you right now? And you know,
like stuff that I'm like it is relevant, but you're
asking me a lot about my feelings and not telling
me about the price. And we talked for literally an
hour and fifteen minutes and like finally forty five minutes
in he like tells me the like logistics and cost

(05:50):
of the program and it turns out it's just a
pre made YouTube series like little Seminar that you pay
a fuck ton of money for and he had marketed
it as like this one on one thing, but it's
actually that there's group zoom calls and these like videos
and you're supposed to pay a lot of money for them.
And so he says this to me, and I'm like, oh, okay,

(06:14):
and he's like, how does that make you feel? I
can see that something blah blah blah, something didn't strike you.
And I was like, well, you know, like if I
were to spend money, like I don't have a lot
of it, and I would ideally be looking for someone
to like assess my page and like figure it out
with me, that that like stuff where you kind of
do it on your own. I've done that before. That's

(06:35):
not I don't necessarily need that. And he then spent
twenty minutes plus trying to convince me why my fears
were not relevant and I should spend the money on
the program. And I could see he like thought it
was like a fish that he had to like catch
and reel in, yeah. And by the end of it, I

(06:55):
was like, oh, man, Like those first thirty minutes, I
felt so like there was this person who was just
gonna solve all my problems.

Speaker 2 (07:05):
And then of course, damn it, all the feelings were there.

Speaker 4 (07:10):
He like got all the feelings there, and it was
just like just wrong enough that I didn't do it.

Speaker 5 (07:15):
Now, Lola, here's the plan.

Speaker 6 (07:18):
You're going to make a hundred first-impression setups with
charismatic leaders and get their free hour.

Speaker 5 (07:29):
They're all going to sell you.

Speaker 6 (07:30):
At the end, you're gonna get thirty minutes of actual
connection from all of them.

Speaker 2 (07:36):
I just want someone to ask me what I'm feeling.
I'm for someone asking me what I'm feeling.

Speaker 6 (07:42):
And you know what's really scary is, let's say,
you know, I would love to speak to you more
about this, to see how ill-intentioned this man's intent was.

Speaker 5 (07:50):
How ill-intentioned this man's intent was. Ill.

Speaker 6 (07:56):
But uh, you know, like if it was super not good,
just how much information he would have to use against
you and know what your weaknesses are and your fear points.

Speaker 4 (08:08):
And oh I was really mindful of that as we
were talking, because I was like, fortunately, like this is
I wouldn't say this is all stuff I would say
on the podcast, so this is fine. But as
we were talking, I was definitely clocking like somebody could
be sharing stuff that's like super personal yep, and he's
recording the call, and like, I don't necessarily think that

(08:29):
this is some evil guy. I think it's just a
classic like sales person. Yeah, like technique, but it felt
really scammy to me in the way it was set
up for it, like felt like a bait and switch.
But yeah, just like getting that glimpse into like people.
There are people who just have that thing and like
know how to make you feel special, and some of

(08:52):
them are just salespeople and some of them are worse.
So once again, if somebody makes you feel so
uniquely special and understood so immediately before you know each other.

Speaker 6 (09:05):
At all, warning, warning, you should not have sparks with
people from the start.

Speaker 2 (09:14):
Hate that. Just let movies be real.

Speaker 5 (09:17):
I know, I know, no, and sometimes you can and
that's healthy. I'm just I'm just doing a little bit, y'all.
Sometimes it's real.

Speaker 2 (09:26):
Sometimes, but two percent of the time.

Speaker 4 (09:29):
Yeah, I'm gonna go with that made up statistic.

Speaker 5 (09:32):
I agree with it. Well, let's talk to Renee. Let's
do it.

Speaker 4 (09:49):
So you've talked about this idea, the ninety-nine-one rule,
so can you kind of explain what that is?

Speaker 3 (09:56):
Yeah, So most people on social media are like quiet lurkers.

Speaker 1 (10:00):
So ninety percent of people don't post.

Speaker 3 (10:02):
They just kind of take out their phone and maybe scroll,
maybe they'll hit the like button, but they're not really creating content.

Speaker 1 (10:08):
Nine percent or so of people are making something.

Speaker 3 (10:11):
Maybe they're contributing a little bit, maybe they're commenting, maybe
they're doing a quote retweet, maybe they're making a post
every now and then.

Speaker 1 (10:18):
And then one percent.

Speaker 3 (10:19):
Of people are producing an overwhelming amount of the content.
Like that is what you were seeing. It's really overwhelmingly
a very very very very tiny percentage of people who
are actually out there as content creators and out there
even on some of the most divisive platforms. Where again,
like for that majority illusion comment from earlier, you think
you're seeing the reflected opinions of the average ordinary person,

(10:43):
and you very much are not. You're seeing like whatever
that one percent decides they want to be talking about.

Speaker 4 (10:47):
And that one percent like often will rise to the
top because what they're posting is extreme and emotional. It
is emotionally charged, and things that are emotionally charged spread faster,
so therefore we can't escape it.

Speaker 6 (10:59):
Yeah, and I didn't really understand until recently, embarrassingly recently,
how much the algorithm is so random, Like it does
feed off of divisiveness, it does feed off of emotion,
but also there, and I saw that you said in
the book there was like a hot button people at
TikTok were able to press to kind of make oh,

(11:20):
the heating thing, the heating thing. Someone pressed that for me, Yeah,
to like kind of make things pop off. But it
does feel as though it's like super random.

Speaker 3 (11:30):
Depends on the app, right, so I don't make much
video content. It does depend on the app. There's certain
types of you know, every now and then the platforms
will push out one of these creator invitations. I don't
know if you guys have ever gotten one of these.
I got one from LinkedIn, which absolutely blew my mind.
Like a linked influencer, I always say yes when they

(11:55):
do these things because I'm more from a research
standpoint than a content creator in any way, but just
because I'm kind of curious to see what guidelines they give.
They're doing this on Threads now also, like people who
want to be uniquely kind of text-based creators on Threads.
Sometimes they're doing bonuses for creators if you create this
many posts. You know, there's like ways that they're trying

(12:15):
to hook people in and draw them into the, you know,
make the platform have more stuff on it. That was
happening on Facebook

Speaker 1 (12:24):
Videos for a while. I write about that a little
bit in the book.

Speaker 3 (12:26):
They were really trying to promote the watch tab as
TikTok was rising. So you'll see again the platforms are
all in competition with each other, and so they'll do
a little bit of occasionally try to like recruit certain
types of creators onto the platform to sort of boost certain.

Speaker 1 (12:40):
Types of content.

Speaker 3 (12:42):
But there are these things where they're trying to give
creators tips on how to succeed, and when they're doing that,
that's where you occasionally get a little bit of a
glimpse into what they're optimizing for. So on Threads they'll
tell people like responding actually really matters. They want to
build up conversational dynamic so that it's not all just

(13:02):
people broadcasting. So they tell the creators you should really
be in the replies, you should be talking to people
like that raises your profile, not just posting, And so
you can kind of start to get a sense for.

Speaker 1 (13:14):
Like, what does the algorithm reward? It's not only your post.

Speaker 3 (13:18):
They also don't want links, because that drives you away
from the platform if people are clicking out, and that's
why you'll see people like post the screenshot. I also
find it like a complete mystery sometimes like why did
this get a ton of views?

Speaker 1 (13:31):
And that one, nothing?

Speaker 3 (13:31):
And you know it's I think again, it gets at
these questions of like is it the right language? You know,
you'll see people who will really be in there with
a fine tooth comb, particularly on YouTube. I had a
few friends who are willing to kind of sit there
as I was doing the writing and just walk me through,
like this is how I analyze, literally like minute by minute,

(13:52):
the optimization functions that go into making some of these videos.
And that's where you see like for them, how much
of it is, you know, like the incentives that go
into this is the thumbnail I'm going to make, This
is the title I'm going to give it right exactly.
It's not just oh I made a video and it
hit you know yeah. Yeah.

Speaker 4 (14:15):
Like as an artist, it makes me feel so depressed
because I'm like, for me, I believe, like, you know,
if it's good, it'll rise, yeah, and like art should
be something that's authentic and true to you and not
with the audience in mind. But then it's like in
today's modern world, like you're fucked if that's your approach.
But okay, I loved your section talking about pseudo-events,

(14:38):
and I wondered if you could give us some examples
of a pseudo-event and what that means.

Speaker 3 (14:43):
The one thing I always think of is like Kim
Kardashian breaking the Internet, which maybe shows my age, but
you know.

Speaker 1 (14:50):
So pseudo-events.

Speaker 3 (14:51):
That is a term from Daniel Boorstin, who is a
media theorist.

Speaker 1 (14:56):
And I'm trying to remember what year his book was written.

Speaker 3 (14:58):
I think it was maybe the nineteen sixties sometime. I
will look that up and give it to you for
the show notes or something. But he writes this book
and it's an absolutely fantastic book. It's so relevant today
because he's writing, you know, again before before social media.
The pseudo-event is an event that is made entirely
for media to report on. It is a thing

(15:20):
that if it happened in the world, unless you notified
media about it, it would not be anything worth reporting on.
It's like the grand reopening of a store, you know,
where they'll invite media to go there for the ribbon cutting.
But that's not newsworthy. It's only newsworthy because they have
chosen to make it newsworthy. So it becomes newsworthy kind
of through the act of reporting. By having the media there,

(15:42):
it becomes something that, you know, it's almost like the
hype generates the newsworthiness itself. It's entirely constructed as opposed
to a real event happened that actual people should know about.
And he writes about this because it starts to happen
around the time that media, particularly television, becomes this
twenty four you know, it's sort of like twenty four

(16:03):
hour cycle, right, all of a sudden, you have to
be filling the television time. Do you ever watch something
like Election Night kind of comes to mind, But there's
other there's other moments like during the Olympics and stuff
where they'll just constantly be teasing it and teasing it
and teasing it and teasing it like nothing has happened yet.
You know, you're still just watching it, watching it and waiting,

(16:23):
but you know, you know that eventually maybe there'll be
a payoff. But in the meantime they're just kind of
hooking you there or otherwise, it's a really slow news
cycle that day, so you're going to be watching the
television program where it's they're really kind of scraping the
bottom of the barrel, trying to find something to tell you,
you know, to.

Speaker 1 (16:39):
Tell you about.

Speaker 3 (16:40):
And what's interesting on social media is you don't even
realize it, but so much of what you pay attention
to on social media is complete bullshit, right, It's just
these are things where but for the fact that you
have your phone out and there's nothing else going on
on your phone in that moment, that people are kind of,
you know, equally bored and hitting the retweet button about like,

(17:01):
this is not a thing that you necessarily need to
know about in the world, and there's an awful lot
of that kind of thing that that starts to happen.
And so I talked a little bit about these moments
on Twitter, where the entire internet just like erupts in
outrage at random tweets.

Speaker 1 (17:19):
There was one, there was one called Bean Dad.

Speaker 3 (17:22):
I don't know if you guys remember, no, I was
always much more on Twitter than than Instagram. So this
is again my like my personal bias from being more
of a text based social media person. But like, this
guy wanted his daughter to figure out how to open
a can of beans with a can opener, so he
kind of posted about it.

Speaker 1 (17:42):
And like the Internet.

Speaker 3 (17:44):
Blew up with whether or not this was like horribly
abusive that he wasn't teaching her how.

Speaker 7 (17:48):
To like operate the can opener, right, I mean this
like this was the discourse for the whole afternoon Wow.

Speaker 3 (17:59):
Or there was one where there was a woman who
posted about how much she loved having coffee in the
garden with her husband, like and how they never ran out.

Speaker 1 (18:07):
Of things to talk about. It's just like a really
cute you know.

Speaker 3 (18:09):
She was just probably like out there, happy, glowing, sitting
with her husband having her coffee, and she posted about
how wonderful it was that she had coffee with her husband,
and like again, the whole internet.

Speaker 1 (18:18):
Came for her.

Speaker 3 (18:19):
It was like, you know, not everybody is as privileged
as you are. Not everybody has husbands and coffee and
time and gardens, you know, and these are the kinds of
things though we're like you'd open the app sometimes you'd
be like, what are we mad about today? You know, Yeah,
you would never have known that this person existed, or

(18:40):
that that man was doing this with his daughter, and
that, you know, can opener or any of these things.

Speaker 1 (18:44):
These are not real moments.

Speaker 3 (18:45):
That you should have, like raised your blood pressure over
but like, you know, the whole world was like there.

Speaker 1 (18:51):
You know, and it's it's interesting because you really see it.

Speaker 3 (18:56):
It brings people in and these are some of the
stupid ones, but they're also you know, there's also moments
where it leads to two people get into a fight
over a parking space, right, and then the Internet decides
to dox one of them because they feel like somebody
was you know, sexist or racist or this or that,

(19:17):
or they don't see, you know, they misconstrue the interaction,
or they don't misconstrue it.

Speaker 1 (19:21):
Maybe the person was you know, a real jerk.

Speaker 3 (19:24):
And and but there's that moment where like this would
never have been on the internet, and the question is like,
is this really the thing that we need to be
fixating on. Two people fight over a parking space and
but for the fact that someone's there with a camera phone,
you never would have known about it.

Speaker 6 (19:40):
Oh, and you best believe I'm watching it. I'm
watching the whole thing.

Speaker 4 (19:58):
You wrote a little bit about some other instances, including
the Wayfair fake controversy, as well as like videos that
go viral that are completely taken out of context, which
is something that I just like, it makes me feel
so crazy. I encourage people, if you're looking at a
video and there is text written on it about what
is happening, to not assume that the text is accurate

(20:20):
for what is happening unless you literally see that exact
thing happening in the video. Just so many times things
are shared and I'm like, but why, But how do
we know that that's what this random person on Twitter
says it is?

Speaker 2 (20:31):
And so much of the time it turns out not
to be.

Speaker 1 (20:34):
Often often times yeah.

Speaker 4 (20:35):
Yeah, yeah, And there are people who are posting this
on purpose to rage bait, to keep us divided, or
to get clicks.

Speaker 3 (20:43):
Like as you were talking, I was thinking of one.
You know, when the ICE raids kind of started,
there was a video that went around saying that it
was ICE raids up in the Bronx up in New
York City, and it had a baby being taken away
from a mom and the mom was being put in
the squad car and it was going viral on

(21:06):
Bluesky. Somebody sent it to me and I was actually
really curious if it was AI, because that's the other component. Like,
not only is it decontextualized sometimes, but sometimes it's just
not even real now. And that was why it was
sent to me with the question like is this even real?
And I figured out it was real, right? One of
the guys had a jersey on and the number of
the football player matched the number on the jersey.

Speaker 1 (21:27):
AI is not that good yet.

Speaker 3 (21:28):
And I sent it to a few other people who
also do the kind of work that I do, like
against like authentication or where does this come from? These
sort of like content traces, basically trying to figure out
is this you know, is this real?

Speaker 1 (21:39):
Where has it come from?

Speaker 3 (21:40):
And the general consensus was like this was old and decontextualized,
and we were trying to find news articles about it,
and eventually somebody sorted out that it was actually in
fact from like years and years ago and had been
like a custody dispute, and that was where this
footage had come from. But these are
the sorts of things where when you just see, like

(22:02):
and the person who had posted it to Bluesky
actually took it down. I'd sent them a message saying like,
can you tell me where you got this from?

Speaker 6 (22:08):
Right?

Speaker 3 (22:09):
Not in an accusatory way, you want to kind of
like ask that because most people don't know, right they
see something somebody sends it to them, like they really
genuinely believe that they're helping inform their community of something
that is really bad, Like that was clearly the intent
with which this was posted. And so you know, you
try to to get a sense of like where might

(22:30):
they have seen it from? Did they see it on
Instagram and go and pull it or did somebody send
it to them?

Speaker 1 (22:35):
So it's it's just very hard now for.

Speaker 3 (22:37):
People to have any sense that what they're seeing is
real and true and real and true are not the
same thing, right, Real is did a machine make it?

Speaker 1 (22:46):
Or is it actually authentic footage?

Speaker 3 (22:48):
And then true is is the footage used in the
context in which the person is claiming it is? And
that's what you mean when you say, like, is the
text above it accurately representing what is
in the image below?

Speaker 1 (23:01):
And then the third question is like do they know right?

Speaker 3 (23:04):
Are they the account that's trying to be manipulative or
do they sincerely believe that what they're saying is true?

Speaker 4 (23:09):
Right?

Speaker 3 (23:10):
And oftentimes with that you're gauging based on like their
prior reputation and their prior actions, Like, are they somebody
who generally is like a good faith actor who tries
to be accurate and when they find out that they
haven't been, like takes it down or writes an apology
or like clearly indicates that like, whoops, I got this wrong,
here's what it actually is. Or are they just like, well,
I'm going to leave that up and you know, yeah,

(23:31):
keep cashing in the views right right, And so that's
the that's the dynamic under which this kind of stuff
increasingly happens.

Speaker 2 (23:38):
I mean, it's it's.

Speaker 4 (23:40):
So frustrating because like there are genuine injustices occurring in
America and the entire world, including the massive funding of
ICE and, you know, like that becoming a priority in
our country over a lot of other things that are
very important.

Speaker 2 (23:56):
And it just becomes so difficult.

Speaker 4 (23:58):
I feel like in this era now, there's so much
distrust because when there are genuine injustices happening, there are
going to be people who point to examples like that
and say, well, there's been misinformation about this, so this
must be fake. And of course then there are you know,
people trying to influence us to not fight back against

(24:19):
that injustice. And then there are also people invested in
keeping America divided. And that's the piece that I am
particularly interested in right now, is like who, who, first
of all, is invested in America being a divided country
and why what does that achieve?

Speaker 1 (24:34):
So sometimes it.

Speaker 3 (24:35):
Is state actors, you know, you do see that just
from the standpoint of anybody in a... this is where
the geopolitical component comes in. You know, Russia's been doing
this for decades. It's not just recently. It's going back
to the Cold War. You start to see the dynamics
of disinformation and coming up with a story and laundering
it through sympathetic media. The idea that the CIA created

(24:58):
AIDS is one of these canonical examples where you take
a story and something that's likely to divide people and
you move it. At that point, it was through newspapers.
Happened very slowly, but that was you know, that was
an effort to tap into race relations, distrust, you know,
fears of AIDS at the time, Questions about was the
government lying to people about where it came from, a

(25:19):
lot of these different things that went into that operation.
So what you start to see is this, can you
use that real story, that real tension, that real underlying
social issue and can you just exploit it. Very little
of the social division content is new, you know, They're

(25:40):
not like creating social divisions out of whole cloth. They're
using stories to exploit social divisions that are already there.
That's the thing that people often misconstrue. So there
are real issues, and they're just using what they have
to wedge that further apart. But then there's the things
that are much more impactful, right, which is the domestic

(26:02):
side of this, where you do have political parties that
begin to realize that this is advantageous to them.

Speaker 1 (26:09):
And there'll be a paperback version of the book.

Speaker 3 (26:12):
And I was writing the epilogue and my publisher might like,
get mad. But one of the stories that I chose
to include in the epilogue is the story of eating the
pets during election twenty twenty four. So during the twenty
twenty four campaign, there was a moment where in a
random Facebook group catering to the community of Springfield, Ohio,

(26:32):
a woman who sincerely believed this made a post saying,
my neighbor's friend's daughter's cat went missing, and then she
saw the neighbors across the street, Haitian immigrants, hanging it
up as if to skin it. Watch your pets, right,
So she talks about how her so this is a rumor, right,

(26:52):
my neighbor's friend's daughters, or she says it in some
you know, some kind of formulation of those things.

Speaker 1 (26:57):
I don't have my notes in front of me right now.

Speaker 3 (26:58):
But she tells the story of a second-, third-hand
story in which she accuses the Haitian immigrants in the
community of eating this cat. And that screenshot in a
private Facebook group is taken, and then it is moved
to Twitter, where this other account says, see, they're talking

(27:19):
about the cats. But I told you this was happening
with the ducks, and you all laughed at me. But
it's happening with the ducks. And then this post goes everywhere.
So this is a pretty standard run of the mill
rumor construct, like, oh, I heard my neighbor's friend's sister's
dog did this. You know, these kinds of ways in which,
again the rumor mill that we were talking about people
sharing information from person to person. They're doing it because

(27:41):
they genuinely think they're helping their community. But what happens
next with that post is that JD Vance picks it
up as it goes viral. He's running for the
vice presidency and so he's on the ticket with Donald Trump,
and he picks up that story, picks up that rumor,
and he's the senator from Ohio, and he says something

(28:04):
to the effect of, see, I've been talking about this,
the people of Ohio, nobody's been listening
to them. Where is our border czar? Where is Kamala Harris?
So he takes the story and instead of letting it go,
he takes this viral moment because this is trending on Twitter,

(28:25):
and he hops on the trend and this is where
you see that handoff from rumor to propaganda. In my opinion,
when I say these things are happening in the same place.
Now you have an incredibly powerful person with an absolutely
massive following, who has the imprimatur of authority.

Speaker 1 (28:38):
This is a sitting senator running for vice president.

Speaker 3 (28:41):
Instead of saying this is bullshit, he picks it up
and he says, this is absolutely happening, right.

Speaker 1 (28:45):
And he boosts it.

Speaker 3 (28:47):
And what happens next is that it leads to a
two-week media cycle about whether or not the Haitian
immigrants in Springfield, Ohio are eating the pets. So crazy.
And this is expressed in the presidential campaign. And it
sounds insane to think that this is something that we,
America, are discussing, right, because it's racist, right. It's like

(29:09):
it's an egregious rumor. And what makes it worse is
that as the investigators, the investigative journalists, go out there,
they're talking to the Republican governor of Ohio, they're talking
to the Republican sheriffs, the Republican business leaders, all of
whom are saying this.

Speaker 1 (29:25):
Is not true.

Speaker 3 (29:27):
Meanwhile, bomb threats are being called into the schools, into
the hospitals, like this is no longer just online, and
they're not trying to defuse it. Instead they're saying, well,
the media is just not talking to the right people.
The media is trying to cover this up. You know,
those RINOs over there are just not telling you the truth.
And they keep it going. And so the
story that I tell in the epilogue is how

(29:48):
this walks through, basically. This is one
of many stories in the epilogue. But yes, exactly. But
I think it's really important for people to see it,
because this is not to say
that only the right does this or only the left
does this, but in this particular example... We used
to see our political leaders being the sort

(30:10):
of, like, firebreak in this stuff.

Speaker 1 (30:12):
Right, No, this didn't happen, like this is just.

Speaker 3 (30:15):
Some stuff that people online are saying. But we're not
going to give that the imperimature of the United States government.
We're not going to level that up to pretending that
this is real. But instead Advance does the opposite. And
then when media finally says like, look, come on, there
is no evidence that this is happening, what he says

(30:35):
is, if I have to make up stories to make
the media pay attention to the plight
of the people of Springfield, who are having their resources
used by immigrants, then that's what I'm going to do.

Speaker 4 (30:49):
I mean, yeah, we already were living in a world
where it was difficult to discern reality from unreality, and
now we're within an administration that is systematically dismantling scientific institutions,
for example, whatever serves their interests. And then of course
we have just our human instincts, and we have the algorithms,

(31:11):
and we have the brands, and we have AI now
to contend with, fucking Sora. Like, how are we supposed
to navigate this landscape?

Speaker 1 (31:21):
Yeah, Like, you know, it's.

Speaker 3 (31:24):
Funny, because when you guys reached out,
I was like, oh, you know, I remember in twenty eighteen,
I think it was twenty eighteen, maybe a
little later, when I was writing this article. I was
writing for Wired at the time. I actually did reach
out to a psychologist who worked with people who were
leaving cults, and I asked about QAnon, and I was like, look,

(31:46):
you know, this isn't my field, right? Like,
I work in data science and analysis and stuff
like that, but it seems like a decentralized cult.

Speaker 1 (31:52):
You know, Yeah, is that the wrong read?

Speaker 3 (31:55):
You know, tell me if I'm wrong.
And if I'm not wrong, then, you know,
what do you do with this?

Speaker 2 (32:01):
Like?

Speaker 1 (32:01):
What do you do when it's this decentralized?

Speaker 3 (32:04):
Because I think the difference between the old and the
new is that in the old model you had that
in-person component, right, where there was like one figure
and, you know, people were oriented
around that, whereas it seems so much

(32:25):
more decentralized now. There are, like, cult influencers
in QAnon as opposed to one centralized figure.

Speaker 1 (32:32):
And so the question is, what do
you do when that is the dynamic?

Speaker 3 (32:35):
And unfortunately I have to say, like there was not
really an easy answer there.

Speaker 1 (32:39):
At the time.

Speaker 3 (32:41):
The suggestion was, well, you try to stop social media
companies from pushing people into these groups, right, but that
doesn't really do.

Speaker 1 (32:50):
Very much when it's sort of achieved critical mass and
the momentum is there.

Speaker 3 (32:54):
And then at the time, again, in twenty eighteen, twenty nineteen,
it hadn't really been fully, I think, legitimized in quite
the way that it is now. Not just QAnon, but
just this idea that it's normal to just tell lies
like that.

Speaker 2 (33:11):
Right.

Speaker 3 (33:12):
So I think, at this point, where I've come down
is you have to give somebody another story, another sense
of meaning, another way to rebuild trust. You have to,
you know, do all of these different things. So how
do you help that process happen?
And it has to happen on social media, where people are,
And it has to happen on social media where people are,

(33:33):
so what are the ways that you start to
get, you know, these folks communicating better, more accurate information
to the public? And so I've spent a bunch more
time lately actually asking that question, like, who
are the public health influencers?

Speaker 1 (33:48):
You know?

Speaker 6 (33:49):
Yeah, well, you are one. I mean, you have done
tireless work. I don't know how you manage to be
a mother and be like, "and then in my free
time I..." I'm like, what energy source
are you running off of? It's incredible. We need more
of that.

Speaker 3 (34:05):
Well, I mean, you guys are doing a lot of
it too, right, like helping tell people's stories. I think
I think the storytelling component is where so many of
the gaps have been. I think my frustration with the
CDC coming out of the measles situation in twenty fifteen
was that they still really, sincerely, deeply believed that the

(34:27):
public would continue to listen to them just because it
always had, right? Well, we're the government, so we put
out these things, we say this stuff, and that's it.

Speaker 1 (34:35):
That's where it ends.

Speaker 3 (34:36):
You know, people will continue to vaccinate their kids because
they believe us, because we're the government.

Speaker 1 (34:40):
And when COVID started.

Speaker 3 (34:42):
I thought, like, man, this is going to be such
a bloodbath, because now it's going to happen.
They're going to see just how huge that gulf actually
is and just how unprepared they actually are for the
fight that is about to come.

Speaker 1 (34:58):
And sure enough, that was what happened. And so.

Speaker 3 (35:01):
I look at people like Doctor
Mike on YouTube, right, or some of the, you know,
Your Local Epidemiologist or Unbiased Science or some of these
others who are out there, again, trying to just use
content creation and storytelling.

Speaker 1 (35:15):
Hank Green I think is really great at this.

Speaker 4 (35:17):
Also Jessica Knurick is doing it. I
love her, yeah, yeah, yes, yes.

Speaker 3 (35:22):
And I think there are so many of them. You know,
there are a lot of mom
influencers who are out there trying to do it now,
who are just trying to be out there as positive,
empathetic voices that recognize that the facts are not what
people are looking for. They're looking for somebody to help
them understand, somebody that they feel they can trust.

Speaker 1 (35:41):
And how do you communicate like that in this environment?
And that's where I think a lot of
this has to come from at.

Speaker 4 (35:47):
This point, which is so sad because it means influencers are.

Speaker 2 (35:53):
Our hope. Like, that's so depressing. I don't want that.

Speaker 4 (35:59):
But, like, if we really have always as humans
relied on other people in our communities to interpret the
media for us, and that's how we form our opinions,
then yes, that is what we need to be doing.
It's just, like, in this scarce attention economy where people
have a four-second attention span, I mean, it's

(36:21):
just really disheartening.

Speaker 6 (36:23):
Yeah, and I think that that thing we always come
back to with cults in general applies where it's like
are you isolated? Are you speaking to people in real life?
Are you only going to one person? Like, you know,
getting out of that bubble is really important, so hopefully
people can just get the fuck offline.

Speaker 3 (36:39):
Yeah, no, you're right about that. There are a couple

Speaker 1 (36:43):
Of articles that have been written about it lately.

Speaker 3 (36:44):
And I think it's really true that social media is
not really social in quite the same way anymore, right?
It's just entertainment. And Sora is an interesting evolution
in this now, where it's just content, it's just pure content.

Speaker 1 (36:59):
It's like a steady drip of content. And I feel like I only socialize.
like I only socialize.

Speaker 3 (37:04):
I socialize in group chats. Like, yeah, it's
people I know, I know most of them in real life,
but I talk to them in
group chats, not on Facebook. I don't see content
from almost anybody I know on Facebook at this point,
whereas ten years ago that was how I followed weddings
and birthdays and stuff.

Speaker 1 (37:23):
I have a private.

Speaker 3 (37:24):
Instagram account where I only follow and am followed by
people I know in real life, and that's where I
see like, oh cool, this is where people's Halloween pictures are.
And then I have my other one, where
it's, right, this is the one where I'm going to
follow all the people I need to understand professionally, and
that's my content account. Yeah, but it's just a

(37:44):
weird, you know. Even LinkedIn now, the idea that you
use it to, like, connect with people you
might need to get a job from someday, and now
you're supposed to be out there, like, influencing on.

Speaker 6 (37:55):
Influencing. I saw this meme that was like, do people
on LinkedIn know we're gonna die? Okay,
I just had this random question come to my mind,
and we can cut it out if you don't want to
answer it. But, like, were you ever influenced by any
of the stuff that you came across, where you were like,
oh damn, that was crazy?

Speaker 2 (38:16):
Maybe the Earth is flat?

Speaker 3 (38:17):
Yeah, yeah, not so much the pseudoscience stuff.

Speaker 2 (38:23):
I've definitely found.

Speaker 3 (38:26):
I've definitely found stuff on YouTube where I've, like, followed accounts.
My most fun one was it pushed
me the Red Bull Dance Your Style videos, like from
Worlds. I don't know if you guys know what
that is, but it's like this dance competition,
a street dance competition, and it pushed me this
video and I was watching them and I was like, damn,
they look like they're having so much fun. And I
went and found a couple more videos, and then

(38:47):
I wound up down this like street dance rabbit hole.
I used to dance when I was in college and
I'm like in my forties now, and I.

Speaker 1 (38:53):
Was like, oh my god, look at this.

Speaker 3 (38:54):
This is so great, and then it just sent me
down this, like, rabbit hole maybe
a year and a half ago now, and I was like, oh,
this is fantastic.

Speaker 1 (39:03):
I have a new hobby.

Speaker 3 (39:05):
So I think, for stuff
like that, it's, like, you know, I show
it to my kids. I'm like, look at these inspirational
sixteen-year-olds. You know, you could do this if
you practice. But I think, again,
there are, like, amazingly creative, incredible people who are producing extraordinary art,

(39:26):
and that's what I want to see, and that's
what this stuff was intended to be. And I would like social
to go back to being at
least semi-social. Same, I would be happy to make
new friends in a normal group. I just
don't know how that would happen at this point.

Speaker 4 (39:42):
So, yes, we rely on people, so we do need
influencers to, you know, be speaking the truth. However, influencers
are dependent on algorithms, which are created and, you know,
designed by tech companies, which are owned by billionaires. So,
like, on the system level, are there solutions that have
been shown to be effective, just briefly, in terms of
what social media companies can do or what regulation
can do governmentally? Regulation.

Speaker 1 (40:10):
Is tough in the United States.

Speaker 3 (40:12):
I mean, the reality is, platforms have the right
to curate how they see fit. This is something where
I think, again, you would want to see an environment where,
in my opinion, there are a lot of platforms, so
that users can choose among them, so that that lock-in
isn't quite so strong, so that that

(40:33):
kind of stranglehold is released.

Speaker 1 (40:34):
This is why when Blue.

Speaker 3 (40:36):
Sky and Threads started, I thought, oh good, we have
a little bit more options.

Speaker 1 (40:41):
Yeah, yeah, you can go.

Speaker 3 (40:42):
And then, again, what I love
about Blue Sky is the tech. Like, I would love
to see that model of, you know, shift your feed,
have the ability to have more agency, to
just not be so tied into what the platform wants
you to see. I wish that more people knew
that existed and were using it, and I'd like to

(41:04):
see that.

Speaker 1 (41:04):
Exist in more places.

Speaker 3 (41:06):
Europe is where you're actually seeing more of that regulation
come into play. And that's because they just have different
views than our First Amendment, you know, where it's not
the content moderation piece, it's actually that even things like
transparency are considered compelled speech here, so there's certain things
you can't make the platforms say, like you have to make

(41:27):
the following disclosures, and they'll say that in Europe, where
the platforms have to provide a little bit more visibility
into how algorithms work. Yeah, so
users who have their content taken down have a right
to appeal in Europe, so users have a say.

Speaker 1 (41:44):
The laws in Europe are great.

Speaker 3 (41:45):
You know, there's certain things I don't like about them,
but there's just a little bit of a different way
of thinking about that power and those trade-offs. And
I think they're also working much more on what's called interoperability,
which would let you take your data and move it
somewhere, so that the lock-in is less strong. So
a lot of the time people will say, like, well,

(42:06):
if I leave Facebook, for example, I'm going to lose
my, you know, twenty years or whatever, fifteen years of
post history, my photos, my friends, this and that. And
so some of the questions are, would interoperability or data portability
make it so that that stranglehold wouldn't be so tight,

(42:26):
which maybe would make the platform a little bit more
willing to do some more experimentation around things like custom
feeds and stuff like that, so that's something that you're
starting to see regulators in Europe looking at. They're also
looking more at like business model and other kinds of
other changes over there. So in the US, I think
it's just really hard to get anything done. This is

(42:49):
partly just because the polarization here means that the Democrats
and the Republicans just have very different visions for what
social media should be, and that leads to a lot
of tension around what they consider appropriate for governing it.

Speaker 4 (43:01):
Also, corporations are allowed to lobby, and those are
the ones that are basically making those decisions
for us.

Speaker 3 (43:10):
I'll make one other point, which is that with the FEC,
right, the Federal Election Commission, the politicians themselves have not
wanted to pass laws saying that, like, if an influencer is
being paid to promote a candidate... you know,
things that they have.

Speaker 1 (43:28):
To disclose to you when they're promoting shoes.

Speaker 3 (43:31):
Right, they don't, you know, but there are certain political
communications where they just, you know, don't have to tell, right.
And so, yeah, I think that there are just different
ways in which, you know, transparency laws here are really abysmal,
and even that, I think, would shift the incentives if
the influencers had to make clear just how much money

(43:52):
they were receiving in their political endeavors.

Speaker 4 (43:56):
Yeah, if someone is being paid by a government to
make a video about a candidate or an issue, I
want to know who is paying them, because
that will inform whether or not I take that content seriously.

Speaker 2 (44:07):
I think that's incredibly important.

Speaker 4 (44:09):
Okay, Well, anyway, do you have any final thoughts on propaganda,
disinformation and or how we are all living in our
own individualized algorithmic cults.

Speaker 2 (44:21):
No, I think I.

Speaker 3 (44:22):
Really admire what you guys are doing, and I've really
enjoyed listening to your content on this.

Speaker 1 (44:26):
I think there's so much fascinating stuff.

Speaker 3 (44:30):
There's a book called When Prophecy Fails
that I read back in the day when I
started the research. It's a
book about cults, a cult study in academia from,
gosh, I.

Speaker 1 (44:44):
Think the nineteen forties.

Speaker 3 (44:47):
Also, I think it was from around the same time
as that study, nineteen fifties, nineteen fifties. And
it was interesting. It's one of these ones where, like,
you know, it's a doomsday cult, the aliens are going
to come, and then the aliens don't come, and then
they just shift.

Speaker 2 (45:00):
Oh we got the math wrong.

Speaker 1 (45:01):
The aliens are coming. Oh you know yeah.

Speaker 3 (45:04):
And what was fascinating to me as
I read that study was you just see this
deep, unwavering trust that just persists.

Speaker 2 (45:13):
And this is.

Speaker 3 (45:15):
Something that I think watching that dynamic play out in
some of the guru style influencers is something that I
really admire. You know, your ability to kind of connect
the dots for people there, and so thanks for all
the work that you're doing.

Speaker 2 (45:28):
Thank you, Thank you. Where do people find you?

Speaker 7 (45:33):
So?

Speaker 3 (45:34):
Reneediresta dot com has all the various book links
and stuff, and then I am mostly on Threads, Instagram, Blue
Sky, and, uh, yeah, renee dot diresta, or at noupside
is my old Twitter handle, which I still use
on some platforms.

Speaker 2 (45:48):
Too awesome. Thank you so much for joining us today.

Speaker 1 (45:52):
Thanks for having me.

Speaker 6 (45:54):
Yeah, Lola, I'm going to ask you a question today
after part two of Renee.

Speaker 5 (46:01):
Are you prepared?

Speaker 2 (46:02):
I'm prepared?

Speaker 5 (46:04):
Have you ever fell victim?

Speaker 6 (46:06):
Have you ever fell a skew of some misinformation yourself?
I love these made-up phrases, saying everything but what
I'm trying to say.

Speaker 2 (46:17):
Of course we all have.

Speaker 4 (46:19):
I mean, I don't think that I can necessarily think
of, like, big, dramatic, concrete examples off the top
of my head. But I have so many times been
on social media and seen a video or a photo
or a tweet or whatever and been like, what, that's crazy. Yeah,

(46:40):
and then later either decided to google it, or often
in the moment, now I just search things right away,
but later found out that it was just completely made up,
or it was completely taken out of context, or, you know,
it was partially true, or representative of a
thing that was happening but not a video from that
actual time.

Speaker 6 (46:59):
You know, partial truths are the most dangerous thing. Yeah,
why can't people go back to just completely lying? Yeah,
just either make something up completely or tell the truth.
Then we might have some reference to be like that's
not true. But when you include some truth, you're really
making us do a lot of legwork as human beings.

Speaker 4 (47:19):
It's confusing, and most of us don't have
the time to take on that cognitive burden to, like,
fact-check everything. And that is why I do think
we should be holding these platforms more accountable for letting
that stuff spread so much. And, like,
in fairness, there are some efforts that have been made,

(47:39):
and, you know, we talk a little bit
about that. But more good efforts, more good efforts, and laws,
more laws.

Speaker 2 (47:47):
I think more laws is the is the big one.

Speaker 5 (47:49):
Absolutely yeah.

Speaker 4 (47:51):
But anyway, now I'm at a point where I
trust so little that if I see, like,
a news story or a news meme or something, if
it makes me feel strong feelings, the first thing I
do is look it up and make sure, because so
many times it's, you know, mostly just honest mistakes
from people sharing things that they think are real. You know,

(48:12):
like, I'm so afraid to share something that is not accurate,
even though it's bound to happen every now and then,
you know.

Speaker 6 (48:21):
Renee is doing important work. We're so grateful she came
back for part two. As always, remember to go rate
us five stars if you have it within you. Also,
we have merch out now, so go to exactlyrightstore
dot com and pick some up for Christmas.

Speaker 5 (48:38):
Hats, t shirts, what have you.

Speaker 6 (48:41):
Hell yeah. Well, there goes another week. We'll see you
next time. Remember to follow your gut, watch out for
red flags, and never.

Speaker 2 (48:48):
Ever trust me. By bye.

Speaker 4 (48:53):
This has been an Exactly Right production, hosted by me,
Lola.

Speaker 6 (48:56):
Blanc and me Megan Elizabeth. Our senior producer is Gee.

Speaker 2 (49:00):
This episode was mixed by John Bradley.

Speaker 6 (49:02):
Our associate producer is Christina Chamberlain, and our guest booker
is Patrick Kottner.

Speaker 2 (49:07):
Our theme song was composed by Holly Amber Church.

Speaker 6 (49:09):
Trust Me is executive produced by Karen Kilgariff, Georgia Hardstark,
and Daniel Kramer.

Speaker 4 (49:15):
You can find us on Instagram at trust Me podcast
or on TikTok at trust me cult podcast.

Speaker 6 (49:20):
Got your own story about cults, extreme belief, or manipulation?
Shoot us an email at trustmepod at gmail dot com.

Speaker 4 (49:26):
Listen to trust Me on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.