
July 28, 2025 • 12 mins


We dive deep into how digital algorithms shape our thinking and behavior through subtle reward systems rather than direct commands, exploring Michael Aponte's concept of "digitally optimized obedience" and its far-reaching implications for individual autonomy and society. Drawing on Aponte's research, work from the Meadows Mental Health Policy Institute, and findings from Harvard Medical School, we examine how technology is fundamentally reshaping our sense of morality and acceptable speech through invisible algorithmic nudges.

• Digitally optimized obedience works through rewards and incentives, not direct commands or fear
• Algorithms create feedback loops that train users to behave in ways that generate engagement 
• Content amplification functions as implicit moral approval while shadow-banning marks ideas as unacceptable
• Echo chambers and filter bubbles create the illusion of information while narrowing our perspectives
• Algorithms deliberately escalate content toward more extreme versions to maintain engagement
• Digital platforms have knowingly targeted children's developing brains despite awareness of the potential harm
• Self-censorship emerges as users internalize algorithmic preferences to gain social rewards
• Reclaiming autonomy requires conscious awareness of how algorithms shape our choices

Take a moment to consider how deeply algorithms are influencing your thoughts and behaviors. What does genuine freedom of choice look like in our digitally optimized world? Please like, comment, share, and subscribe to Thinking2Think for more explorations into the forces shaping our minds.


Support the show

🎧 Don't forget to like, share, and subscribe to join our growing community of thoughtful individuals!

🔗 Follow us:
📖 Check out my book: The Logical Mind: Learn Critical Thinking to Make Better Decisions:


📲 Let’s connect on social media!

  • https://x.com/Thinking_2Think

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Lyra Morgan (00:08):
Welcome to Thinking2Think podcast.
Today we're diving deep into something really timely: how our digital lives are, well, subtly shaping how we think and act. And a huge thank you to Michael Aponte for letting us explore his groundbreaking work.

Dr. Elias Quinn (00:23):
Absolutely.
The Algorithm Made Me Do It.
It's fascinating stuff.

Lyra Morgan (00:27):
Yeah, we're going to explore how technology,
especially social media algorithms, might be fundamentally rewiring our understanding of obedience, even our sense of morality.

Dr. Elias Quinn (00:36):
That's the plan. Our mission today is really to unpack Michael Aponte's theory, he calls it digitally optimized obedience, and then see how it connects with, well, other research on digital influence that's out there. We've got insights from Aponte, obviously, some of his notes from July 16, 2025, plus research from the Meadows Mental Health Policy Institute, they did a piece on how social media

(00:58):
changes our brains, and some insights from Harvard Medical School on screen time and the brain.
Lots to dig into.

Lyra Morgan (01:05):
Okay, let's unpack this.
So obedience: most people probably think of, like, the
Milgram experiment.

Dr. Elias Quinn (01:10):
Yeah, the classic ones.

Lyra Morgan (01:11):
Direct commands, someone in charge telling you
what to do.
Maybe a threat involved, veryclear cut.
But Michael Ponte, he's talkingabout something different, this
digitally optimized obedience.
How does that work?
How is it different?

Dr. Elias Quinn (01:25):
Well, the key difference that Aponte points out is that it's a new kind of compliance. It's not driven by fear, not like Milgram, where people feared punishment. Instead, it's actually incentivized. The algorithms themselves, by their very design, encourage obedience. You're not being forced, exactly, you're being trained.
Trained, not forced.

(01:45):
So it's about rewards.
How does that reward system actually, like, show up day to day? Can you give an example of how we might be getting rewarded into complying without realizing it?

Lyra Morgan (01:53):
Sure. Think about likes, shares. Your post starts trending. Each one is like a little digital pat on the back. Your brain learns: okay, this kind of post, this way of saying things, it gets me that reward.

Dr. Elias Quinn (02:09):
Ah, the feedback loop.
Exactly, it's this continuous cycle. The algorithm rewards certain behaviors, and that subtly shapes what you post, maybe even what you think is okay to post. It makes it, well, profitable in a social sense to obey and kind of uncomfortable not to. You're not getting a direct order, you're being nudged, guided, and over time you internalize what the algorithm wants. It's obedience by design, not by force.

Lyra Morgan (02:32):
I see the difference.
It's much smoother, almost invisible, compared to Milgram. But calling it obedience, that's a strong word. It implies a level of control. Is there a risk we're overstating how much control these algorithms really have? I mean, people still have free will, right? They can choose.

Dr. Elias Quinn (02:47):
That's a really important question, and Aponte
does stress that, yes, we do have agency, but the systems are so subtle that exercising that agency becomes really, really difficult. It's not like a direct command you can just say no to. It's more like a constant stream of tiny nudges, often happening below our conscious radar, rewarding us with visibility, connection, approval.

(03:08):
It feels good, so it's seamless.
The influence isn't one big decision point, but this ongoing flow of incentives that just kind of reshape our default behaviors.

Lyra Morgan (03:18):
You mentioned this goes beyond just what content we consume. How do these algorithms actually start to influence our deeper beliefs or even what we feel we can say online?

Dr. Elias Quinn (03:28):
Oh, it goes much deeper than just your feed.
Algorithms don't just show you stuff. They actively shape what you come to believe and, critically, what you feel safe saying out loud or online. How so? Well, Aponte argues, and research backs this up, that algorithms optimize for engagement, not truth. Not truth, not nuance. Engagement meaning strong emotions? Exactly, outrage.

(03:48):
Strong agreement, things that get a quick reaction, that content gets amplified, often at the expense of, you know, complex or balanced views. And this creates what you called algorithmic morality. Yeah, that's Aponte's term. Basically, whatever trends, whatever the algorithm boosts, implicitly gets tagged as true or valuable or acceptable, and

(04:09):
the flip side, content that gets shadow-banned, you know, quietly hidden or demoted so nobody sees it, that's implicitly marked as shameful, unacceptable. The Meadows Mental Health Policy Institute talks about this too, how platforms tailor everything to your interests and

(04:30):
behavior.

Lyra Morgan (04:30):
They curate a reality the algorithm thinks you
want.
So the algorithm is kind of setting the terms of what's good or bad online, and that naturally leads us into, well, filter bubbles and echo chambers. We hear those terms a lot.

Dr. Elias Quinn (04:38):
Precisely.
You think you're getting infinite information, right? That's the problem.

Lyra Morgan (04:41):
Yeah, the whole internet at your fingertips.

Dr. Elias Quinn (04:43):
But what you actually get is a highly
personalized slice.
Often it's ideologically very narrow. Your feed becomes, as Aponte says, a mirror reflecting your own views back at you.

Lyra Morgan (04:54):
Instead of a window into different perspectives.

Dr. Elias Quinn (04:56):
Exactly.
The more you click on stuff you agree with, the more of that stuff you see. It makes you feel super informed, like you know what's going on, but you're actually becoming more isolated from different viewpoints, and that just digs the algorithmic influence in deeper.

Lyra Morgan (05:10):
Okay, so we internalize these rules.
Our feed becomes this mirror.
What's the cost?
Aponte talks about this leading to digital self-censorship.
We start policing ourselves.

Dr. Elias Quinn (05:22):
That's right.
You learn the unspoken rules pretty quickly.
What kind of posts get rewards?
How should you phrase things?
Which topics are safe?
Which ones might get you pushback or, worse, get you hidden by the algorithm?

Lyra Morgan (05:34):
So you adjust your behavior.

Dr. Elias Quinn (05:35):
You do, and it's not even a response to a
person disagreeing with you.
Sometimes it's in response to this invisible system. Over time, that genuinely reshapes what you think is worth saying. You adjust your speech, maybe even your thoughts, to fit in. You internalize the algorithm's logic. Wow, it makes you think about, you know, when you search for something random online, like shoes.

Lyra Morgan (05:56):
Oh yeah.
And then you see ads for those shoes everywhere for weeks.

Dr. Elias Quinn (06:00):
Right, it's kind of annoying for shoes, but
imagine that same relentless push applied to your political views or your social beliefs or your self-image.
That's the mechanism.
That's what we're talking about.

Lyra Morgan (06:09):
Okay.
When you put it like that, the consequences seem huge. What are the real costs here for us as individuals and maybe for society overall?

Dr. Elias Quinn (06:18):
The costs are, yeah, profound. For individuals, think about it: you end up performing your identity for this invisible algorithmic audience. Your beliefs might get shaped not by deep conviction but by what gets likes. So it stifles your real self, suppresses honest doubt, because

(06:39):
doubt doesn't trend well.
You post what aligns, what gets traction. Maybe you even delete things later if they don't perform well or attract the wrong kind of attention. It's exhausting and it's not authentic.

Lyra Morgan (06:50):
And for society, if we're all doing this?

Dr. Elias Quinn (06:53):
Well, think about it.
If our collective beliefs, what we talk about as a society, are guided by algorithms optimized just for engagement, what
happens?

Lyra Morgan (07:01):
Fragmented realities, everyone in their own
bubble.

Dr. Elias Quinn (07:04):
Exactly. Fragmented realities, less critical thinking overall, and it really damages our ability to have nuanced public conversations, because dissent, different views, they just get deranked. Conformity gets rewarded. And connecting this to the bigger picture, the Meadows Institute research points out something really stark: algorithms don't just connect hobbyists.

(07:25):
They can identify extreme interests and connect people with shared radical views.

Lyra Morgan (07:29):
Like terrorist networks.

Dr. Elias Quinn (07:31):
Their research found examples, yes. Facebook's own Suggested Friends feature was apparently used for recruitment by extremist groups. In some cases, it learns what you're into and finds others like you, for better or worse.

Lyra Morgan (07:44):
That's incredibly disturbing, and it's not just
connecting people who are already extreme, is it? You mentioned the algorithms might actually push people
towards extremism?

Dr. Elias Quinn (07:52):
That's a crucial part of it.
The Meadows Institute highlights this escalation effect. It's about intensity. You start looking for, say, jogging tips. The algorithm notices you're interested in fitness, so it pushes you towards maybe marathon training, then maybe Ironman competitions. It keeps escalating to hold your attention.

Lyra Morgan (08:09):
Or healthy recipes leading to...

Dr. Elias Quinn (08:12):
To potentially pro-anorexia content, or someone looking for dating advice getting funneled into misogynistic pickup artist stuff. Wow, this escalation, Aponte and others argue, isn't a glitch, it's a feature. It's how the algorithms keep you hooked: find what stimulates your brain and then just keep pushing you down that path,

(08:32):
often towards more extreme versions.

Lyra Morgan (08:34):
And this must be even worse during times of
uncertainty.
Right? Like a pandemic. Our brains are trying to find patterns, make sense of chaos.

Dr. Elias Quinn (08:41):
Absolutely.
When people are stressed, looking for answers, algorithms can really exploit that. They could lead you down these rabbit holes of disinformation, conspiracy theories.

Lyra Morgan (08:50):
Because it feels like you're finding answers.

Dr. Elias Quinn (08:52):
Yes, and because often belonging to the group sharing those theories feels safer than being kicked out for disagreeing. People might accept a conspiracy rather than risk social isolation, especially online. Your brain wants answers and it wants to belong. Algorithms can hijack both needs.

Lyra Morgan (09:08):
Okay, this is heavy stuff for adults, but what
about kids?
If grown-ups with fully formed brains struggle, what are the unique dangers for children and teenagers whose brains are still developing?

Dr. Elias Quinn (09:20):
This is where it gets particularly alarming.
Leaked documents, which both the Meadows Institute and Harvard Medical School research touch upon, show platforms like Instagram knew they were targeting kids.

Lyra Morgan (09:31):
Deliberately.

Dr. Elias Quinn (09:32):
Yes, spending huge advertising budgets to
reach teens, even while internally acknowledging the psychological harm their algorithms were causing: things like increased anxiety, depression, body image issues, especially for teenage girls. It wasn't an accident. It seems to have been a known consequence of their design
choices.

Lyra Morgan (09:50):
So a kid's natural curiosity online, that can lead them down dangerous paths really quickly.

Dr. Elias Quinn (09:56):
Incredibly quickly, a few clicks, a few
hours watching videos.
A child looking for healthy eating tips could end up seeing content promoting dangerously low calorie counts or extreme exercise. Teenage boys looking for dating advice might stumble into that misogynistic content we mentioned. And the key difference from, say, video games...

Lyra Morgan (10:13):
Which are mostly fantasy.

Dr. Elias Quinn (10:14):
Right.
Kids generally know games aren't real life, but social media actively tries to modify real-world behavior: how long you spend on the app, what you buy, how you interact with people, both online and off. Harvard Medical School research points out the developing brain is constantly building connections. Digital media use actively shapes that process, and often

(10:40):
it's what they call impoverished stimulation compared to real-world interactions and experiences.

Lyra Morgan (10:42):
So, faced with all this, the subtle obedience, the
echo chambers, the risks, especially for kids, what's the way forward? What does Michael Aponte suggest we actually do? We can't just unplug entirely, right?

Dr. Elias Quinn (10:51):
No, and he doesn't suggest that. His core solution is deceptively simple: cultivate awareness.

Lyra Morgan (10:56):
Awareness, meaning what, exactly?

Dr. Elias Quinn (10:59):
Meaning consciously noticing those subtle nudges from the algorithms, actively questioning why something is trending, pausing before you hit post or share and asking yourself: why am I doing this? What's my real intention here? It's about bringing mindfulness to your digital life.

Lyra Morgan (11:13):
So it's about taking back control, reclaiming
our autonomy in this environment that's constantly trying to
guide us.

Dr. Elias Quinn (11:18):
Exactly that. Aponte really emphasizes that algorithms aren't magic, they're tools, tools designed by people with specific goals, usually engagement and profit. By understanding that, by being aware, you can start to reclaim your power to choose, to speak your own mind authentically, to question the narratives you're being fed, to think independently, instead of just letting the algorithm guide you

(11:41):
passively.
So the central message, the central message really from The Algorithm Made Me Do It and related research, is that this technology is subtly rewiring us. It's incentivizing conformity, suppressing dissent.

Lyra Morgan (12:07):
Wow, that's a really powerful thought to end
on.
A lot to process there. As you go about your day, maybe take a moment to think about that. How deeply are the systems you use every day shaping your thoughts? What does genuine freedom of choice even look like in a world so optimized by algorithms? What does it mean for you to truly choose? If this deep dive got you thinking, sparked some new

(12:28):
insights, please do like, comment, share this with someone who might find it interesting and, of course, subscribe to the Thinking2Think podcast for more explorations into the forces shaping our world.