
December 5, 2025 · 26 mins

What time of day do you scroll the most?

Have you tried setting limits on your screen time?

Today, Jay dives into one of the defining questions of our digital age: is the algorithm shaping who we become, or are we the ones quietly teaching it how to shape us? He reveals how every click, pause, and late-night scroll acts as a subtle signal: a tiny instruction that trains the system, which then turns around and begins to train us. Before we even realize it, our insecurities become fuel, our curiosity becomes comparison, and outrage becomes entertainment.

But Jay also reminds us that we're not powerless: our agency hasn't disappeared; it's just buried beneath layers of habit. With calm, practical guidance, he shares how we can take our feed back into our own hands, break the doom-scroll cycle, and actually reprogram the digital environment influencing our minds. Whether it's choosing who you follow more intentionally, setting healthy boundaries in the morning, sharing more consciously, or reconnecting with real-world anchors, Jay shows that we're not just participants; we're contributors to how the system works. And when we change how we show up, everything around us begins to shift as well.

In this episode, you'll learn:

How to Retrain Your Algorithm in Minutes

How to Recognize When the Algorithm Is Steering You

How to Build a Healthier, Calmer Feed

How to Use Social Media Without Losing Yourself

How to Strengthen Your Digital Self-Control

You weren’t meant to be overwhelmed by noise or pulled into constant comparison. You were built to create a life rooted in values, peace, and purpose. So take a breath, make one mindful choice at a time, and let it guide the next.

With Love and Gratitude,

Jay Shetty

Join over 750,000 people to receive my most transformative wisdom directly in your inbox every single week with my free newsletter. Subscribe here.

What We Discuss:

00:00 Intro

00:31 Even the Algorithm Has a Glitch

03:04 4 Subtle Ways the Algorithm Shapes You

07:59 How Your Clicks Create the Pattern

09:45 What a Social Network Looks Like Without All the Noise

13:08 Doom-Scrolling Can Give You Anxiety!

14:47 Solution #1: Bring Back Chronological Feeds

15:10 Solution #2: Take a Moment Before Hitting Share

16:06 Solution #3: Demand Algorithmic Transparency

16:29 Why Emotional Mastery and Critical Thinking Matter

19:11 5 Simple Ways to Reset Your For You Page

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Is our destiny coded in the algorithm? If you feel
addicted to social media, this video is for you. If
you feel glued to whatever's on your feed and can't
stop doom scrolling, this video is for you. And if
you're worried about how social media is rewiring your brain,
this video is for you. Don't skip it. The number

(00:23):
one health and wellness podcast, Jay Shetty. I wanted to start today by saying one thing: the algorithm isn't as smart as we think it is. But the deeper I went into my research, the more I realized something unsettling. It's stronger than me, stronger than you, stronger

(00:48):
than all of us because it knows our weaknesses. But
here's what I also found: even the strongest system has a glitch. The algorithm doesn't just know us. It depends on us. And if we learn how it feeds,
we can decide whether to starve it or steer it.
When you google the words will I ever, the first

(01:11):
thing that comes up is will I ever find love?
The second is will I ever be enough? And the third is will.i.am net worth. We go from
love to worth to money really quickly. But this search
for love, worth and belonging is what the algorithm exploits,

(01:33):
but not in the way you think. Picture this: it's midnight. A girl named Amelia lies in bed, phone in her hand. She posts a photo, nothing dramatic, just hoping someone notices. The likes trickle in; her friends comment.
She taps on another girl's profile, prettier, thinner, more followers,

(01:56):
She lingers, she clicks, she scrolls, and the system pays attention.
The next night, her feed feels different, more flawless faces,
more filters, more diets, more lives that look nothing like hers.
Curiosity turns into comparison. Comparison turns into obsession, and soon

(02:18):
every scroll feels like it's whispering the same three words,
You're not enough. Until one night she doesn't see herself anymore.
She only sees the mirror the algorithm is holding up
to her. This isn't just Amelia's story. Fifty six

(02:40):
percent of girls feel they can't live up to the
beauty standards they see on social media. Ninety percent of
girls follow at least one social media account that makes
them feel less beautiful. But here's the real question. Did
the algorithm build that mirror, or did she? Was it coded in Silicon Valley, or in our own clicks?

(03:04):
Let's look at the algorithm first. What do algorithms actually do?
Number one, they watch: every pause, every click, every like,
every share, even how long you hover over a video
or comment. TikTok tracks watch time down to the second.
If you rewatch a clip, it's a super strong signal.

(03:25):
Number two, they predict. Using your history and the behaviors of millions of people like you, algorithms predict what you are most likely to engage with next. If people who
watch fitness videos also tend to watch diet hacks, you'll
probably get diet hacks. Number three, they amplify. The posts

(03:45):
that get more engagement, especially emotional engagement, are pushed to
more people. Number four, they adapt. Every click retrains the system.
Your feed tomorrow is shaped by what you do today.
YouTube's recommendation engine is called a reinforcement system. It's literally

(04:06):
designed to learn from your actions in real time. The
most accurate model is a cycle. First of all, we
click what feels good, familiar, or emotionally hot. Two, the
algorithm learns and serves us more of that to keep
us there. Number three, we become more entrenched and less
exposed to alternatives, and number four, outrage and division spread faster

(04:30):
because anger is more contagious. In plain words, the algorithm
isn't a mastermind. It's a machine that asks one question,
over and over again: what will keep you here the longest?
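
To make that loop concrete, here's a minimal Python sketch of an engagement-maximizing ranker built around those four steps: watch, predict, amplify, adapt. It's an illustration only, not any platform's real code; every class, name, and number in it is hypothetical.

```python
from collections import defaultdict

class ToyFeedRanker:
    """A toy engagement-maximizing ranker: watch, predict, amplify, adapt."""

    def __init__(self):
        # "Watch": per-topic engagement learned from one user's behavior.
        self.affinity = defaultdict(float)

    def observe(self, topic, watch_seconds, rewatched=False):
        # "Adapt": every pause and rewatch becomes a training signal;
        # a rewatch counts as an extra-strong one.
        self.affinity[topic] += watch_seconds + (10.0 if rewatched else 0.0)

    def rank(self, posts):
        # "Predict" and "amplify": order candidates purely by expected
        # time-on-screen. Truth and well-being never enter the score.
        return sorted(
            posts,
            key=lambda p: self.affinity[p["topic"]] * p["engagement"],
            reverse=True,
        )

ranker = ToyFeedRanker()
ranker.observe("fitness", watch_seconds=45, rewatched=True)  # strong signal
ranker.observe("news", watch_seconds=5)                      # weak signal
feed = ranker.rank([
    {"id": 1, "topic": "news", "engagement": 0.9},
    {"id": 2, "topic": "fitness", "engagement": 0.4},
])
print([p["topic"] for p in feed])  # ['fitness', 'news']
```

The one quantity this toy ranker optimizes is predicted time-on-screen; nothing in its scoring function asks whether a post is true or good for you.
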
It's like a maximum security prison. So how do we

(04:50):
get trapped? First, the nudge. Think Netflix autoplay, TikTok's infinite scroll,
the design that says, don't think, don't choose, just keep watching.
That's how you start a Korean baking show you didn't
even know existed, and three hours later you're crying over
a documentary on penguins. A study found disabling AutoPlay led

(05:13):
to a seventeen-minute shorter average session, showing autoplay measurably
extends watch time. It's not a choice disappearing. It's a
choice so well hidden you don't realize you never made it. Second,
the loop. Yale researchers found that when people post moral outrage online,

(05:35):
people reward them with likes and retweets. That person now
posts even more outrage the next time. It's not the algorithm,
it's us. It's real people. As one researcher put it,
we don't just consume outrage, we start producing it because
outrage performs better than honesty. And third, the push. Mozilla's

(05:58):
YouTube Regrets project found that volunteers who started
with neutral searches like fitness or politics reported being steered
toward extremist, conspiratorial, or misogynistic content. Seventy one percent of
the videos people regretted watching were never searched for; they

(06:19):
were recommended. And in a recent algorithmic model study from UCL and the University of Kent in twenty twenty-four, accounts on TikTok were shown four times more misogynistic content on the For You page within just five days of casual scrolling. What does
this do to men and women? Women get more insecure

(06:42):
about their appearance. Men get more exposed to misogynistic content.
Women experience more anxiety and self doubt. Men become more
lonely and disconnected. Women compare their lives to others and feel they're falling behind. Men compare their status to others and feel like they're being left behind. Both end up

(07:04):
in the same place on social media, isolated, exhausted, and
shaped by the same machine. The algorithm will do anything
to keep us glued. There is a huge incentive issue
for the algorithm because in one study where they chose
not to show toxic posts, users spent approximately nine percent

(07:26):
less time daily, experienced fewer ad impressions, and generated fewer ad clicks. The algorithm's goal is not to make us polarized.
It's not to make us happy. It's to make us
addicted and glued to our screens. It is showing you
what people like you are engaging with, assuming you will

(07:47):
stay as well. We talked about what the algorithm does.
Let's look at what role we play. Our clicks build
the cage. False news stories are seventy percent more likely
to be retweeted than true stories are. It also takes
true stories about six times as long to reach fifteen

(08:08):
hundred people as it does for false stories to reach
the same number of people. Algorithms don't see truth or lies,
They only see clicks from people like us. Want to make a real difference this giving season? This December, On

(08:31):
Purpose is part of Pods Fight Poverty, podcasts teaming up
to lift three villages in Rwanda out of extreme poverty.
We're doing it through Give Directly, which sends cash straight
to families so they can choose what they need most.
Donate at GiveDirectly dot org forward slash on purpose. First

(08:52):
time gifts are matched, doubling your impact. Our goal is
one million dollars by year's end, enough to lift seven
hundred families out of poverty. Join us at GiveDirectly dot
org forward slash on purpose. Number two, false news spreads

(09:18):
six times faster than true news because shocking content sparks
more clicks and shares from us, so the algorithm promotes
it further. The content must already have emotional potency. An
algorithm won't manufacture depth or resonance from nothing; it can't make content go viral on its own. Number three, for major media outlets,

(09:40):
each additional negative affect word in a post is associated with a five to eight percent increase in shares and retweets from us. And number four, Facebook studies showed that even when given the opportunity, users click links confirming their bias far more often than opposing ones. Liberals chose cross-cutting

(10:03):
news twenty one percent of the time, conservatives thirty percent
of the time. Here's the twist. The algorithm doesn't pick
sides; we do. It just learns our choices and builds a fortress around them. The danger isn't that we have no choice. It's that we don't notice when our
choices are being shaped for us. So let's do a

(10:26):
thought experiment. Why don't we create a social media platform
without these incentives, one that doesn't play these games with us? They already tried that, and what I'm about to share
with you shocked me the most. A new study out
of the University of Amsterdam tested this by creating a
stripped down social network, no ads, no recommendation algorithms, no

(10:51):
invisible hand pushing content. Researchers released five hundred AI chatbots
onto the platform, each powered by OpenAI, and gave
them distinct political and social identities. Then they let them
loose across five separate experiments amounting to ten thousand interactions.
The bots began to behave exactly like us. They followed

(11:16):
those who thought like them. They reposted the loudest, most
extreme voices. They gravitated into echo chambers, not because an
algorithm pushed them there, but because that's where they chose
to go. The study also found that users who posted the
most partisan content tended to get the most followers and reposts.

(11:37):
Researchers tried interventions: dampening virality, hiding follower counts, even boosting
opposing views, but nothing cracked the cycle. The most they
managed was a six percent reduction in partisan engagement. In
some cases, when they started hiding user bios, the divide

(11:57):
actually grew sharper, and the most extreme posts gained even
more traction. The implication is chilling. Maybe it isn't just
the algorithms that warp us. Maybe social media itself is
wired against our better nature. Think about it like a
funhouse mirror. It doesn't simply reflect who we are. It

(12:17):
stretches our fears. It magnifies our biases and turns our
flaws into a spectacle. As humans, we can live consciously
or unconsciously. We can choose our stronger selves or our
weaker selves. When we choose our weaker selves, we are not just curious; we're programmed to measure ourselves against others.

(12:42):
Comparison is our oldest survival instinct. Envy is the emotional fuel.
The algorithm didn't invent it, but it does exploit it
when we're tired, overwhelmed, and exhausted. Humans are not ruled
by curiosity. We're ruled by comparison, and envy is the
price of admission. The algorithm didn't create envy; it just

(13:05):
turned it into an economy. Now why do we do this?
The first is negativity bias. Evolution tuned us to notice
threats more than opportunities. Missing a berry was fine. Missing
a snake was fatal. Number two, outrage is social currency.

(13:26):
Expressing outrage signals loyalty to your group. It tells others
I'm one of us, and in polarized contexts, this isn't
just emotion, it's identity signaling. Clicking rage is clicking belonging.
Number three, cognitive efficiency. Negative content is often simpler: this

(13:47):
is bad, they're wrong, we're threatened. The brain prefers cognitive ease over nuance. Complex, balanced content demands more effort. Negativity feels immediate, digestible, and actionable. So what do we do about this? Doom-scrolling increases cortisol, anxiety, and learned helplessness.

(14:10):
In that state, people feel like they have no agency,
which can reinforce the sense of doom. So we have
an incentive issue for the platforms because they're just trying
to keep us glued, and we have a lack of
mental resilience on our own side. Put those together, and that's what we're experiencing right now. So what do we do about

(14:30):
the incentive issue? People often ask me if I think
AI will ever have a soul, and my response is,
I don't know if AI will ever have a soul.
I just hope that people building AI have a soul.
The people who created these algorithms would lose millions or

(14:51):
billions if they adjusted the algorithm. Would they do that? Will they recognize, or think, they have a responsibility? It's
a really interesting thing to think about because it's almost
like we're making something that is becoming us. It's almost
like Frankenstein, that idea that whatever system we build has

(15:14):
a part of us in it. If you build a company,
it has a part of you in it. There's an
energetic exchange as well. So what does that feel like
when you're building a platform that millions and billions of
people use? The truth is we can't afford to just
diagnose the problem. And I get intrigued by that sometimes
when people just want to diagnose the problem. But we

(15:36):
need to find solutions, and here are three changes social
media companies could try. The first is platforms should offer
chronological feeds by default, not buried in settings, and give
users transparent control to toggle between chronological and algorithmic. Facebook's
own studies show chronological feeds reduce political polarization and misinformation exposure,

(16:04):
though engagement does drop. The second thing they can do
is actually probably my favorite. Add friction before sharing. For example,
read before retweet prompts, share limits, cooling off periods on
viral posts. Imagine you couldn't share something until you had read it in full. Imagine you couldn't share something until

(16:26):
you'd watched that video in full. Twitter's twenty twenty read

(16:47):
before retweet experiment led to a forty percent increase in
people opening articles before sharing. WhatsApp's forwarding limits dramatically slowed
misinformation in India. This could actually make a difference because
not only are we misinforming others, we're underinformed ourselves. If

(17:08):
you're retweeting something just based on the headline and have
no idea what's inside of it, we're now propelling ideas
that we don't fully grasp and understand.
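
To picture what that kind of friction could look like, here's a tiny hypothetical Python sketch of a share action that stays locked until the link has been opened and a short cooling-off period has passed. It's not how Twitter or WhatsApp actually built their prompts; every name here is invented for illustration.

```python
import time

class ShareGate:
    """Hypothetical 'friction before sharing': sharing unlocks only after
    the user has opened the link and a cooling-off period has elapsed."""

    def __init__(self, cooling_off_seconds=30):
        self.cooling_off = cooling_off_seconds
        self.opened_at = {}  # link -> time the user opened it

    def mark_opened(self, link):
        # "Read before retweet": record that the article was opened.
        self.opened_at[link] = time.time()

    def can_share(self, link):
        opened = self.opened_at.get(link)
        if opened is None:
            return False  # never opened: prompt the user to read it first
        return time.time() - opened >= self.cooling_off

gate = ShareGate(cooling_off_seconds=30)
print(gate.can_share("https://example.com/story"))  # False: not opened yet
gate.mark_opened("https://example.com/story")
print(gate.can_share("https://example.com/story"))  # False until 30s pass
```
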
And number three, require algorithmic transparency and independent audits. Companies must publish how
recommendation systems prioritize content and allow external researchers to study

(17:32):
the impacts. The EU's Digital Services Act is already moving
this way, requiring large platforms to open their algorithms to scrutiny.
Now what do we do about the human nature issue?
I want to share with you one of my favorite stories.
A student once asked the Buddha, what do you gain

(17:52):
from meditation? He said, nothing. The student asked him, why
do you meditate if you gain nothing? The Buddha replied,
I don't meditate because of what I gain. I meditate
because of what I lose. I lose anger, I lose envy,
I lose ego. If the algorithm is made of us,

(18:15):
then changing it doesn't start with code; it starts with character.
We have to remember that we are wired for generosity
but educated for greed. When will we finally start teaching
emotional mastery in schools? How long before we start teaching

(18:37):
critical thinking at an early age? Maybe the real test
isn't to build a happier network, it's to build happier users.
We built a machine to know us, and it became
us. When we started On Purpose, there were only three things that went viral: cats and dogs (sorry to put

(18:58):
them in the same group), babies, and people taking their
clothes off. I had the innocent intention, the naive vision,
of making wisdom go viral. Today, we do over five
hundred million views across platforms every month, not playing into
rage bait, not trying to get people to be angry.

(19:20):
What does it show me? It shows me that people
will choose healthier options if they're available, if it's presented
to them in a digestible way. People will choose a
salad if they know why it's better for them, and
if it's available and has a great dressing. It's our
role to not play into the fear and find ways
to make love more attractive and accessible. It's so easy

(19:44):
to sell fear, it's so easy to sell negativity, it's
so easy to sell gossip. But the truth is, why
sell the things that sell people short? Why not provide
them with alternatives that are healthy, strengthening, empowering, that give them the tools to make a difference in their lives?

(20:04):
Here's the good news. Algorithms do not fully decide your fate.
They're predictive, not deterministic. They rely on your past clicks.
But you can override them by searching, subscribing to diverse sources,
and consciously engaging with content outside of your bubble. So
I want you to take a look at a new account

(20:25):
I started, and at its For You page. The For You page is pretty simple. It's beautiful imagery, introducing me to some scenery, and as I scroll down,
you start to see more of what the average person
would see. The For You page, as you go deeper, shows everything from political podcasts, to people working out,

(20:48):
to influencer content. Now I'm going to show you
how easy it is to change your For You page, because this page is so visual. I'm going to do it through finding quotes, and also, you know I love quotes. So I'm going to go follow some quote accounts. I'm going to like some quotes, like another quote, and I'm going to hover over one for a while. It's really important to actually hover over the quote, to actually

(21:12):
read it, to actually be present with it. And now
I'm even going to share a quote with a friend
who's now going to think they have an issue because
I just shared some wisdom with them. When I refresh, check out my For You page: it's pretty much all quotes. Through three or four simple steps, I transformed my For

(21:32):
You page. This is almost a cleansing, filtering process that
I recommend you do. It's simple. I want you to
follow five people you wouldn't usually follow. Agency isn't eliminated;
it's eroded by habit. People who intentionally curate their feeds,
limit usage, or diversify inputs show significantly less polarization. The

(21:56):
second thing I want you to do is hover over
and comment on five pieces of content you want to
see more of. Your offline life still matters: real books,
real conversations, and communities can counteract the digital echo chamber.
And number three, I want you to share five pieces
of content you wouldn't usually share, and see how that changes

(22:18):
your algorithm. Number four, don't look at your phone first
thing in the morning. It's like letting one hundred strangers
walk into your bedroom before you've brushed your teeth or washed your face. You would never do that in real life; don't do it online. And five, be present with joy. Celebrate your friends' wins and accomplishments. Stop overreacting to negativity

(22:42):
and underreacting to joy. We remember the bad times more
than the good times, because when we lose, we cry
for a month, and when we win, we celebrate for
a night. Here's what I want you to remember. When you like something, you're telling the algorithm, show me more of this. When you hover over something, you're saying to

(23:06):
the algorithm, I pay attention when you show me this.
When you comment on something, you're saying, this is really
important to me. And when you share it off the platform,
you're saying, fill my feed with this. You're co-creating
your algorithm. You're actually coding it. One of my favorite

(23:28):
thoughts comes from F. Scott Fitzgerald. He said, the test
of a first-rate intelligence is the ability to hold
two opposed ideas in the mind at the same time
and still retain the ability to function. One should, for example,

(23:48):
he said, be able to see that things are hopeless
and yet be determined to make them otherwise. That second
part is so needed right now. That's what our stories need:
accepting that things are tough, things are really hard, and
at the same time reminding each other that you can

(24:10):
make a change, you can transform your life, you can
take accountability, you can take action, you do have agency.
Reminding the world that extraordinary things have always been achieved
by a group of ordinary people. I'll leave you with this.
Imagine you walk into a party. At first, it looks fun,

(24:32):
people laughing, music playing, stories being told. But then you
notice something strange. Everywhere you turn, someone's doing better than you,
someone richer, someone prettier, someone with more friends, more followers,
more success. You walk into another room, and this one
feels worse. The room is full of arguments, everyone's shouting,

(24:53):
no one's listening, and the louder and angrier someone is,
the bigger the crowd around them. That's when it hits you.
You never chose to come to this room. You were
invited by the algorithm. That's the cruel genius of social media.
It doesn't force us into comparison. It discovers we're already
drawn to it. It doesn't create division; it learns that

(25:17):
anger holds our gaze longer than joy. The algorithm didn't
create outrage. It turned outrage into entertainment. And here's the
question only you can answer. When you pick up your
phone tonight, are you walking back into that same party

(25:37):
or will you finally leave? Thank you for listening. I
hope you've subscribed. Share this episode with someone who needs
to hear it. And remember, I'm forever in your corner
and I'm always rooting for you. If you love this episode,
you will also love my interview with Charles Duhigg on
how to hack your brain, change any habit effortlessly, and

(25:59):
the secret to making better decisions. Look, am I hesitating
on this because I'm scared of making the choice, because
I'm scared of doing the work? Or am I sitting with this because it just doesn't feel right yet?

Host

Jay Shetty

