Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The rate at which technology develops and changes is really fast, right,
way faster than science can keep up with understanding its effects.
And yeah, that's part of the reason why we're in
the situation that we're in in terms of trying to
understand the psychological effects of different sorts of technologies.
Speaker 2 (00:24):
Welcome to The One You Feed. Throughout time, great thinkers
have recognized the importance of the thoughts we have. Quotes
like garbage in, garbage out, or you are what you think,
ring true. And yet for many of us, our thoughts
don't strengthen or empower us. We tend toward negativity, self-pity, jealousy,
(00:44):
or fear. We see what we don't have instead of
what we do. We think things that hold us back
and dampen our spirit. But it's not just about thinking.
Our actions matter. It takes conscious, consistent, and creative effort
to make a life worth living. This podcast is about
how other people keep themselves moving in the right direction,
(01:04):
how they feed their good wolf. Thanks for joining us.
Our guest on this episode is Pete Etchells, a psychologist
(01:25):
and science writer. He is a professor of psychology and
science communication at Bath Spa University, where he studies the
behavioral effects of playing video games. Pete's writing can be
found in BBC Science Focus magazine and the psychology blog Head Quarters. Today,
Eric and Pete discuss his book Unlocked, The Real Science
(01:45):
of Screen Time and How to Spend It Better.
Speaker 3 (01:48):
Hi, Pete, welcome to the show.
Speaker 4 (01:49):
Ah, thank you for having me.
Speaker 3 (01:50):
I'm really excited to have you on. We're going to
be discussing your book Unlocked, the Real Science of Screen
Time and How to Spend It Better, which is a
really good topic because there's not a lot of nuance
in the screen time discussions, and I am a big
fan of nuance. I don't think anything is ever as
simple as it's presented to us. So I really enjoyed
(02:12):
your book for that reason. And we'll get to that
in a second. But before we do, we'll start like
we always do, with the Parable. And in the Parable,
there's a grandparent who's talking with their grandchild and they say,
in life, there are two wolves inside of us that
are always at battle. One is a good wolf, which
represents things like kindness and bravery and love, and the
other is a bad wolf, which represents things like greed
(02:34):
and hatred and fear and the grandchild stops, they think
about it for a second, they look up at their
grandparent and they say, well, which one wins, and the
grandparent replies, the one you feed. So I'd like to
start off by asking you what that parable means to
you in your life and in the work that you do.
Speaker 1 (02:51):
I've been thinking about this a lot over the past
few days, and for me, it's about understanding that we're
the sum product of the two wolves, right, you know, we
all possess traits of both of them, and that's okay.
You know, it's okay to be angry sometimes, but the
critical thing is that you have choice. So you have
choice over which traits, which behaviors, which habits you want
(03:14):
to nurture. And this feels very relevant to me in
terms of the debates that we're having at the minute
around the influences of things like digital tech. Right, you know,
things like social media or smartphones, video games, the internet
at large are seen as things that very much feed
our bad wolf, you know, and that they only feed
(03:36):
the bad wolf, and kind of because of that, we
don't have choice in the matter. So, you know, if
you go on social media, you're going to have a
bad time because that's what it's designed for, and for me,
that removes the sense of agency from the equation. You know,
there are so many good things that we get from
our online lives, often without realizing it. But if we
(03:58):
feel that all it's doing is feeding the bad wolf,
we're left with a sense that we don't have any
control over that, and maybe the only solution that we've
got left is to ditch the tech, and it becomes
a reinforcing thing. You know, we become more negative, more
toxic in our interactions. So for me, the parable is
(04:19):
about actively thinking about what the good and the bad
things are in our relationship with digital tech and knowing
that you have the power to feed the hopefully good
ones that you want and starve the ones that you don't.
Speaker 3 (04:33):
I love that idea, and I love what you said
in the beginning. We're sort of the sum of different things,
and I think if we want to look at the
impact anything is having in our lives, we have to
look at the sum of it. And most everything in
life is some degree of trade off. There's some good
and there's some bad.
Speaker 2 (04:48):
You know, you have a child.
Speaker 3 (04:50):
There are great things about having a child, and there
are things that are challenging about having a child, and
your experience is sort of the sum of those. Before
we get too much further into that, though, you had
some great reviews for your book, and my favorite one
was from your four year old daughter, who said, it's
not for children, it's for adults because it's boring. I
(05:12):
loved that you actually put that out there. That's very good.
It's very good.
Speaker 4 (05:17):
I think she meant it in a super positive way
as well.
Speaker 3 (05:19):
Right. The other thing that I wanted to hit on
before we go too deep into the book is something
you talk about fairly early on in the book, which
is that you unearthed a journal of yours that you
kept online, something called live journal.
Speaker 4 (05:35):
Yeah.
Speaker 3 (05:35):
It was sort of a blog you kept back in
the day, your teenage self, and you were looking back
on it. And I was really struck by that part
of the book because your teenage self sounds a lot
like my teenage self, like very troubled, deeply troubled. Yeah,
and I'm curious how you relate to that.
Speaker 1 (05:53):
Now, that's a great question. That was a really serendipitous
thing that happened, right, you know. I don't
think I could have engineered it in a better way.
Somebody asked me once, you know, did this actually happen?
And I was like, yeah, it did. And you know,
if I'd have made it up, it would be a
really boring thing to make up that I got this
random email. But you know, it was during the pandemic.
I got this email completely out of the blue that
(06:16):
reminded me that I had this thing, this live journal,
and I think I'd completely forgotten about it up until
that point. And there's a bit in the book where
I have this like horrible realization that it's still there.
Not only is it still there, it's not locked away anymore.
It's you know, literally anybody can see it, and you know,
this sort of really horrifying feeling at the time that oh,
(06:38):
somebody's going to come along and take it and just
put all of my deep seated worries and fears on
the internet. Nobody cares about what I was writing when I
was seventeen, eighteen. But it was a huge nostalgia trip
for me to go through that. You know, I downloaded
all of the posts that I put up and deleted
it so it's not there anymore online, but spent a
long time afterwards just going through it and you know,
(07:00):
like you say, I went through them, and this sense
that I got reading through them was, Wow, this kid
was miserable. This was not a happy person and in
not a happy place. And I think part of that
is teenage angst, you know. I think it's very easy
to put down in writing things that maybe sound overly
(07:22):
dramatic sometimes. And I certainly know that I was a
person like that when I was younger.
Speaker 4 (07:26):
I know that it came from a.
Speaker 1 (07:29):
Place of difficulty, though, so something that I don't talk
about so much in Unlocked. It comes up a little bit,
but is very much more a feature of my first book,
which is about video games, which was actually not about
video games in a way. It was about grief, in
that my dad died when I was fourteen, and I
spent a long time not dealing with that because it
(07:54):
was too big a thing to deal with as a kid.
And I think over time that then starts to come
out in different ways. Right, And this is what I
saw in the live journal posts, that what was happening
here was somebody who was trying to process something really
horrible that had happened and not really letting it in
and not allowing that grief process to happen. And when
(08:15):
you do that, when you start to build walls and
build dams, it leaks out in sometimes not very helpful ways. So, yeah,
it was really heartbreaking in a way.
Speaker 4 (08:24):
You know.
Speaker 1 (08:24):
It's one of those points where you feel like you
wish you had a time travel machine. You could go
back and just say to that kid, You'll be all right.
You know, things will work out. You will never get
over these things, because that's not what happens with grief.
Speaker 2 (08:37):
You know.
Speaker 1 (08:37):
We often think about when difficult things happen to us,
or we experience death, that we will get over it
at some point and then we'll go back to whatever
life was like before. And that's not how grief works, right,
You grow around it. It's always a part of you,
but it stays with you and hopefully gets sort of
smaller over time, or rather it stays the same and
(08:58):
you get bigger. To be able to go back and
say that would be a nice thing, but you know,
such as life.
Speaker 3 (09:04):
Yeah, and I think you even say this in the book.
I'm not even sure that my fifteen year old self
would even believe me if I did, but I agree
with you. I look back at my younger self, and
I'm like, I wish I could just go back and
be like, just relax a little bit. Things are going
to be very different than you imagine. I just look
back at some of the ways I thought at fifteen
and eighteen and twenty one. There's this sense of finality
(09:28):
about things. I mean, I was so young and I
was just like, oh, I'll never find love again. I
mean that kind of thing right where you're like, okay,
you know, it seemed so real then. So let's pivot
now and talk about screen time. I'm going to summarize
the gist of your book and argument very quickly here,
and then we're gonna unpack it and you're going to
(09:49):
clarify it. The gist of it to me is that
there are a lot of voices out there declaring that
screens are really bad for us. You know, we've lost
a whole generation to screens. The kids are not all right.
It's claiming all of our lives, our attention is breaking down,
on and on and on. You can find them constantly.
But that if you look at the science, the science
(10:11):
is not nearly so clear or settled as all of that.
Is that a reasonable position, to again sum up a
lot very quickly?
Speaker 1 (10:20):
Yeah, I think so. This is something that I've always
tried to do in my writing, is to try and
start from an objective viewpoint. I tried really hard not
to go into this going I don't think any of
this is sensible, so I'm going to show that it's
all wrong. What happened when I started writing Unlocked
was that I really wanted to get to a position where, yeah,
(10:41):
maybe there's some things, some topics or some chapters where
we go into the science and actually it supports what
everybody's worrying about, and what we can do is look at,
you know, where things are really scary and justified, and
maybe where some of our worries could be better directed. Frustratingly,
there was a consistent thing that any sort of area
of research where you look at digital technology effects, you
(11:02):
find a very similar story, which is that the public
facing discussions around this particular thing, whether it's, you know,
social media and
Speaker 4 (11:11):
Mental health or screens and sleep.
Speaker 1 (11:14):
Or whatever, it is at face value a very scary,
very definitive, or very confident things being said about them
that you know they are clearly detrimental, and you go
into the research and that's not what you find. You
find some research to support that line of thinking, you
also find research that doesn't support it. And when you
try and look at the entire picture, it becomes very
(11:35):
difficult to get a sense of what we actually know,
or where the general direction of travel is pointing.
Speaker 3 (11:40):
One of the things that psychology can be good at
doing, when done right, is that I think it shows
us places where common sense or our intuition may
not be right. That's one of the things that modern
psychology research seems to do. And we all have an
intuition or a feeling that screens are bad or are detrimental.
(12:03):
Where do you think that's coming from.
Speaker 1 (12:05):
That's a good question. I think a lot of this
comes from a lack of understanding how to use these
technologies in the best sort of ways. So the rate
at which technology develops and changes is really fast, right,
way faster than science can keep up with understanding its effects.
(12:26):
And yeah, that's part of the reason why we're in
the situation that we're in in terms of trying to
understand the psychological effects of different sorts of technologies. I
think it also means, from a day-to-day perspective,
just in our normal, everyday lives, these things appear and
they don't come with a training manual. They don't come
with a list of things like, this is how you
would use this technology to better your life, these are
(12:47):
the things to watch out for. If you use
them in these sorts of ways, you're going to have problems.
There's nothing like that. We wing it, right? So if
you look at smartphones in particular: smartphones have been
around for a while, but they really exploded in popularity around
about two thousand and seven with the introduction of the iPhone.
I remember getting an iPhone when they first came out.
I was nearly a doctor at the time, and
(13:09):
I thought it was great. It really reminded me of
like old school sci fi shows from like the mid nineties.
There was one called Earth Final Conflict that I always
really used to love, where people literally had like a
thing that looked like a smartphone. Right, it was a
device that had a screen where you could make video
calls with people, and it's like, this is the future.
So there was that really like cool, it's happened now,
(13:31):
and then it was coupled with okay, so what do
I do next? And you know, for our generation, for
people who got those early smartphones. There was a built
in gating mechanism, right, So when the first iPhone came out,
it was a glorified phone, right. You know, you could
make phone calls on it, you could go on the
Internet with it, but it wasn't a great experience. You
(13:51):
could do emails, things like that. Social media wasn't really
a thing then. There was no App Store when it
first came out. Those things came out over time, so
it took a few years for the App Store to kick in,
and then a few years after that for things like
Instagram to appear. So that gave us this sort of
natural gating where we could try and figure things out
as we go along, and some people did that really well.
Some people didn't so much. Where things are different now
(14:15):
and I think this is part of the reason why
it feels so much more pressing that there are issues,
is that everything's there now. They've got everything all at once,
and particularly for kids and teenagers who are getting their
first devices, there's no staggering of these things. They're getting
everything all at the same time, with no manual, no
way of thinking about how do we navigate this sort
(14:36):
of technology. And that's where it becomes a little bit
random, in a way, you know, whether you're going to
have a good go of it or not. So we're
in that trajectory of using them, you know, and
it's still the case for our generation now, but for
every generation, what we're doing is figuring out
how to do it as we go along, right, and
(14:58):
that lends itself very easily to having bad experiences. I think
one of the great problems with the way that we
talk about technology and the relationships that we have with
technology is that all the things that we get that
are good from them tend to be entertainment or convenience factors,
and because of that, we don't notice them.
Speaker 4 (15:16):
Right.
Speaker 1 (15:16):
So, you know, you use your phone when you don't
know where you're going, You'll just bring up a map
and then you'll find where you're going, and you don't
get lost, you don't have a bad time, so you
forget about it. You don't have your wallet on you
or your cards, so you can't pay for a coffee
or something or drink when you really need one, but
you can pay with your phone and you don't have
a bad experience, so you don't think about it. So
we tend not to notice them. But when we do
(15:38):
have bad experiences, they are more salient because they don't
align with fundamentally what we want to use the technologies
for. All of us have got experiences of not using
digital tech in a way that aligns with what we
want to do, in a way that feels like it's
messing with our well being, right, And I'm the same,
(16:00):
you know.
Speaker 4 (16:01):
I say this a lot.
Speaker 1 (16:01):
You know, there's been a few times recently that it's
got to like ten o'clock at night and I am
really tired and I want to go to bed, and
an hour later, I'm still there scrolling through Instagram and
it feels not good because it feels really unhealthy, right,
And there's regret there, Like, man, that's an hour that
I could have had a sleep, and I know I'm
going to be tired in the morning, and it's happened again.
(16:23):
And we all have experiences of that, right, And it's
very easy then to feel that we don't have control
over that experience, that this is something that's happening to us,
and that, because everybody has these experiences, it's a feature
of the system.
Speaker 3 (16:37):
Yeah. I felt a couple different things as I was
reading your book; I had a couple of ping-ponging reactions.
One reaction was, this all makes complete sense. The science
doesn't seem to be settled, and people are taking bits
of that science and amplifying it and blowing it up
before clickbait and headlines, which people, you know, happens to
science of all kinds all the time. I had that
(17:00):
reaction where I felt like, Okay, this is good nuance.
The other reaction that I found myself having a couple
times was this is what big tobacco and climate change
did too. They just kept saying the science isn't good enough,
the science isn't good enough, the science isn't good enough,
and so nobody did anything for a long time, and
there really were and are legitimate problems underlying it. And so
(17:25):
that was kind of the two things that were going
on inside of me. When I say that, what does
that bring up for you?
Speaker 1 (17:31):
I think it's a really understandable analogy to make, and
I think a lot of people make it. So one
thing I'll say is that, to use the old phrase,
you know, absence of evidence isn't evidence of absence. What
we're not saying here is, well, you know,
there's no conclusive evidence to show that social media is
bad for mental health. Therefore there's nothing to worry about.
(17:53):
That's absolutely not what people are saying, and I think
it's often made out that that's what's happening, right? So
I think if you talk to scientists on all
sides of these these debates, I think everybody's trying to
do the right thing, and everybody cares about the same
end goal, which is everybody wants better healthier relationships for
everybody with their tech. I think where we differ is
(18:15):
the means and the methods and the journey and the
messaging towards that. I think where analogies with things like
big tobacco maybe break down is that we actually knew very
early on in those debates and in that research that
chemically these things were really bad for us. You know,
And if you look at the intentional use of things
(18:37):
like tobacco, if you use it as intended, as in
like the quote correct way that it was meant to
be used, you know, it has a ridiculously high chance
of killing you. That's almost sort of what it's designed
to do. What we're talking about with digital technologies is
not something that has a chemical interaction with our bodies,
(18:58):
and fundamentally, what it is about is connecting us with each other.
And, arguably, I fully admit that this is maybe
a naive way of thinking about it with something like
social media. I know there are other things going on here,
but fundamentally, what those sorts of technologies are aiming to
do is to bring people together; they're technologies of entertainment,
of pleasure, convenience, of connection, and they're designed to facilitate that.
(19:22):
So the analogy for me, I understand where it
comes from. And it could well be the case that
we get some conclusive research in a few years time
that shows that actually, unequivocally social media is bad for us.
I don't think that's the case, because we're talking about
fundamentally different things and different mechanisms, different ways in which
we interact with them. But I really understand and empathize
(19:44):
where those analogies come from, and they're driven by deep
seated worries about the way that we interact with these technologies,
and I think it's important that we don't disregard them
or just, you know, laugh people off with, that's a ridiculous analogy,
because, you know, you're not smoking phones or anything
like that. The reason that these analogies come up, and
(20:07):
the reason that these conversations can be so vitriolic sometimes
is because people are really worried and people really care
about what the impacts are. For me, there's been a
maybe a bit of a failure both in terms of
science communication but in the way that we do science,
in that you know, there are clearly lessons that we
could learn, even if you look at digital technology research.
(20:32):
You know, you look at the cycle of panics around
digital tech effects. You know, before social media and smartphones,
it was gaming addiction, and before that it was violent
video games and aggression, And you know, you can track
these technology panics back.
Speaker 3 (20:47):
Right.
Speaker 1 (20:48):
You never know whether you're in a moral panic until
you get out of it, right, So we can't say
that this is just another moral panic because we're in
it at the minute. But for me, though, what we've
not done is figure out how to get out of
them quicker and future proof ourselves a little bit more
so that the next time a new technology comes along
or a new thing that we're worried about, we've got
(21:10):
the means to research it quickly and understand its effects quickly.
So that we can figure out what to do with
it and move on, so we don't keep getting stuck
in these conversations.
Speaker 3 (21:20):
Right. One of the big things that your book points
out about the debate around screen time is that screen
time is not a thing. I mean, it's not one thing. Right.
You give a great example in the book of if
we were to try and log an hour of, you
know, either your or my or anybody's screen time, right, it
(21:44):
might be well, I was on email for five minutes,
I was writing my book for twenty five minutes, I
was on Twitter for ten minutes. I'm all over the place.
And then you make the point even further that even
within Twitter, there are times that that might be an
edifying experience, like, I'm connecting with a colleague and we're
having a good discussion, versus I'm in whatever thing you
(22:07):
get into that you know is not helpful to you,
And so that we tend to say screen time or
social media or these big blobs of things are bad
for us. So the first problem is that we're not
making any differentiation there. And then you make the further
point that when we talk about mental health effects, we
are also not doing a very good job of teasing
(22:29):
that apart. What do we mean by bad for
our mental health? And all that stuff is so squishy
that it gets very hard to run the same
sort of randomized controlled trial that you
would use to see whether a medicine reduces blood pressure.
Speaker 1 (22:49):
Yeah, it's a huge problem, not just for the research,
but for the way that we talk about this more
generally as well. So I've really struggled with this in
talking to journalists since the book has come out,
because what will happen is you'll have a conversation where
they'll ask a question around, you know,
what does the research say on the effects of social media
Speaker 4 (23:11):
On mental health?
Speaker 1 (23:13):
And you know, we'll start talking about some studies or
some findings, and you know, we'll talk about the mess
in the area and things like that, and then they'll say, yeah,
but what about the effects on suicidal ideation? But yeah,
what about the effects of these sorts of forms of
social media? And the conversation feels like it's all the
same stuff. It's screen time and mental health, but it
(23:33):
bounces around in lots of different places. And part of
the reason for that is, like you say, we don't
have a good handle on this in the research literature.
Part of the reason why you can point to a
lot of studies that show negative effects if that fits
with your worldview, or you can point to a lot
of studies that show positive or null effects if that
(23:56):
fits with your particular worldview, is because they're all
out there, yes, because they're all measuring slightly different things.
One of the things I tried to do with the book
was to say, okay, well, what if we take everything all together,
what do we see? And that's where you get the
mess, right? You go, well, we don't have clear findings
in one direction or another. The counter to that, I
(24:16):
think in some ways is that the book is about different
aspects of screen time, and I think I was a
hostage to fortune in some ways, in that almost immediately,
as soon as the book came out, the conversation shifted.
Speaker 4 (24:27):
I don't think.
Speaker 1 (24:28):
Anybody really cares about screen time anymore. They care about
social media and smartphones. So the first counter I get
quite a lot is, well, yeah, nobody
cares about screen time, so that feels like a bit
of a straw man argument. I think the same can
be said of social media as a term. I've not
seen a sufficiently good definition of social media, you know,
(24:48):
that you might use for research purposes or also for
regulatory purposes, that doesn't inadvertently scoop up things that
you maybe don't want to regulate, right. You know, very
often the definitions will also include things like text messaging
or WhatsApp messaging. Even things like forums or Google Classroom
(25:08):
and things like that share many of the characteristics of
social media. And this is not me saying, oh, well,
you know, very nerdy. You need to define your variables
because I'm a scientist and I care about those sorts
of boring details. We need to figure out what it
is precisely that we're worried about, because then we'll have
a better chance of doing something about it, something effective.
Speaker 3 (25:45):
One of the parts of the book that I found
really interesting was the discussions on attention. You reference a
person who's been a frequent guest of the show and
a little bit of a friend, Johann Hari and his book.
I think it's called Stolen Attention. I don't remember the
exact title, but the idea is that our attention is
being hijacked and stolen from us. Now, Johann has a
(26:07):
particular view of the world. He's written a number of
books on addiction on depression, and I have both those
things in my past, so I find him interesting. And
he has a view on the world that tends to
take things and make them societal forces. The problem is
bigger than the individual. It's all the stuff around the individual.
And so he's got this idea that these things are
(26:29):
competing for our attention. And yet you say that the
science on attention is just as confused as the screen time science.
Speaker 4 (26:36):
Yeah.
Speaker 1 (26:36):
Again, it's one of those areas where we probably use
the term attention in a general sense as a sort
of catch-all term when we mean other things. Maybe we
mean other things at different times, but I think more
often than not we mean our ability to hold concentration
or not be distracted. And I think that's where a
(26:58):
lot of people feel as though things have gone wrong recently.
And yeah, this is how I lead off that chapter:
you've got a piece of work to do, and
you sit down and try and do it, and you
immediately get distracted by other things. So there's sort of
two realities there as to what's going on. And I
have this conversation with my university students quite a lot,
(27:19):
you know. I'll have students come in and say I
allocated today to writing an essay, and I sat down
at my computer in the morning, and I really wanted
to do this, and then like five minutes in and
I'm on the phone or I'm playing a game, and
the next thing I know, it's lunchtime. And then
I'm in panic mode because that's three hours that I've
(27:39):
lost already. I'm already behind, and I know that I
need to do it now, but I'm struggling to concentrate
because I'm so worried. And then I go and play
video games to try and calm myself down for a bit,
and then you know, the day's gone and I've not
done anything.
Speaker 4 (27:51):
I do this as well. I think everybody has experience
of this, right.
Speaker 1 (27:54):
It's very easy to get distracted because we've got lots
of stuff around us that makes us distractable. The fact
that many of us have this sort of experience is
not sufficient evidence to say that therefore there is a
collective collapse in our attention, that this has happened almost by design.
That's a very substantial claim to make. So I talk
(28:17):
about this in the chapter and think about Well, if
that was the case, we would probably see that signal
in the research literature. You don't even need to do
a specific study on this. You just need to look
at the past fifteen, twenty years of research on attentional
cuing or the Posner cuing paradigm or something like that,
and look at do you see declines in the averages
(28:37):
in those studies over time?
Speaker 4 (28:38):
You don't see that.
Speaker 1 (28:39):
Nobody's noticed it anyway, you know. I think one of the
claims of Johann's that I take to task
in the book is that, you know,
it seems like we've all suffered this like collective twenty
percent reduction in brain power. And again, that's a very
substantial claim to make, and I think if that was true,
that that would be disastrous for human civilization, right, And
(29:04):
I think it would be so overtly obvious, more so
than any claims around everybody feeling a little bit distracted.
I don't think we'd be having this conversation. I don't
think there would have been a debate around it. There's
another one of your guests, I think, who has a
different take on this, Nir Eyal, and I spoke
to him for my book, and we had this conversation
around when you sit down at the start of the day
(29:25):
and you just feel completely distracted, you know. And he
says that the first thing he does is ask people, well,
show me how you planned your day. Again, I've done
this with my students, do you use a diary or
do you use a calendar to plan when you're going to
do things? And seven times out of ten they'll say no.
Sometimes they'll say yes, But even in the times they
say yes, there's not really any structure to that. More
(29:46):
often than not they don't really use a diary or
a calendar. And you know, that's not to be disparaging,
like I didn't use anything like that.
Speaker 4 (29:54):
When I was a student.
Speaker 1 (29:56):
But Nir makes this argument, and he sort of said
this to me when I was talking to him for
the book, that how can you be distracted by something
when you didn't plan your day? And I think, again,
it goes back to this idea that we use technology
in a haphazard way without really thinking about how it
aligns with our goals. And some people they manage that fine,
(30:17):
but for a lot of people that's when we use
it in an inappropriate or an unhelpful way. So you know,
I can think of days where I was writing Unlocked,
where I knew that I needed to hit my word
count for the day. I work really well under those
sorts of circumstances. If I know that I've got a clear,
well defined goal, that
(30:37):
means that by the end of the day,
I know whether I've had a good day or not,
so I can manage accordingly. If I've not had a
good day, I know I need to pick it up
the day after. If I had a good day, I
know I can relax more and pat myself on the back.
And then equally, there are other days where I didn't
really give myself a goal. I was just I need
to write, and those were the days that I got
really distracted. So one of the things that I talk
(30:58):
about in the book is, yeah, we don't have a
clear idea of what we actually mean when we talk
about attention. And I am also painfully aware that that
is entirely useless for everybody in these conversations, for somebody
like me, as I just come along and say, well,
we don't actually know what we're talking about here, it's like, yeah, great,
well done you. That doesn't help us with anything. But
(31:21):
what I try and point to is a newer line
of research, emerging research over the past three, four, five
years that tries to recharacterize attention in a way that
I thought was really helpful for thinking about how we
cope with digital technologies. So attention is not this simplistic
thing that you know, when a notification on your phone pings,
(31:43):
it will automatically and always grab it, and then as
soon as you're on your phone, that's it. You're stuck
for the next hour. That's not really the right way
of characterizing attention. A better way of thinking about it
is, so in the literature they're called priority maps, and
you can think of them like heat maps. So if
you imagine like your visual environment in front of you,
your audio visual environment, and overlaid on that is like
(32:06):
a heat map, and wherever there's a peak in that
heat map, there's something that is worth your attention, and
where it's flat, there's nothing. So for me at the minute,
you know, I've got a microphone in front of me,
a screen in front of that, my phone is off
to one side, my bag's down here. Most of this
map is flat apart from my screen because that's where
(32:28):
my attention is focused, and I want to talk to
you. What affects that map? So, one is bottom up processes.
So if, you know, visually something happens, if somebody came
through the door, that would cause a spike in that
area of the map. For me, if my phone pinged off,
that would cause a spike. But top down processes have
(32:48):
an impact as well. So top down processes are things
like what are your goals, what are your motivations.
Speaker 4 (32:56):
Right now?
Speaker 1 (32:57):
I am really motivated to not sound like I'm talking
complete gobbledygook on this, and I'm finding it an interesting
conversation and this is fun for me. I know that
if my phone were to go off a that would
be really distracting. You know, I'd lose my train of thought,
I'd lose what I was talking about. It would also
(33:17):
be quite rude to you as well, and I don't
want to do that. So I mean, I've got my
phone on mute, so it's irrelevant. But if it did
go off, that little spike on the priority map would
be there a little bit, but it wouldn't be sufficient to
grab my attention. If you think about a different scenario
where I'm sort of in the same environment, but maybe
(33:37):
instead of having this conversation with you, I'm trying to
write an email or I'm trying to write a document
and I've not really thought about it, and I'm struggling,
and I'm worrying about, you know, who's going to pick
up the kids and what we're going to have for
dinner and things like that. I'm not really in the
task because I'm trying to avoid it because I've not
really prepped for it, and then my phone pings off
(33:59):
it will be a much higher spike on that map
and I'll go ooh, something to distract me for a bit.
So thinking about it in those sorts of terms, I
think is more helpful, because when we talk about attentional collapse,
that's such a scary thing that has happened to us.
How could we possibly do anything about that? Whereas thinking
about it in terms of these things in front of me.
(34:19):
These screens are tools. They are there to help me
do the things that I want to do. I just
need to figure out what they are and make them
work for me. You can curate that experience a little
bit more and have more control over what is worth
your attention.
Speaker 3 (34:35):
Yeah, I found those priority maps to be a fascinating
way to think about attention. I think about attention a lot.
I'm a long time meditator, so there's a certain way
in which attention is used there. Attention is, in many
ways one of our most fundamental assets. And I find
the priority map to make more sense to me, you know,
and that it depends on both bottom up and top down processes
(34:59):
makes complete sense to me. I also think that what
you said your students described to you, I can say
was happening to me in nineteen eighty nine. I thought
I was going to be an author. I would sit
down to write, and fifteen minutes later I'd be like, well,
I am now reading. You know, my distractions might have
(35:19):
been different. They might have been I'm reading a novel
instead of being on Instagram. But I didn't know how
to stay on task. I didn't know how to do
any of those things. I do think that there is
also something to be said for we have a whole
lot more coming at us, and I'm a believer that
we have more control over that than we often think
(35:40):
that we do. It's all the classic stuff, turn off notifications,
and you know, I use tools like I don't remember
what this tool is called. It's something I've been using.
I want to call it screen time, but that's the
Apple app. But basically, all it does when I try
and open something that I don't want to open all
the time, it just pops up a little thing and says,
(36:02):
more or less, take a breath, do you really want
to do this? And very often that's enough.
Speaker 4 (36:08):
Right?
Speaker 3 (36:08):
No, Actually I don't want to do that, right, So
I think we can use technology. I think your other
point is an important one, and I talk a lot
about achieving your goals or changing your behavior, and you
know there are structural elements of that, planning your day
and turning off your notifications, right, and then there are
emotional elements of that what's happening inside me, whether it's
(36:29):
in nineteen eighty nine and I think I should be
writing a novel and I instead pick up a novel
to read it, versus today if I were to hop
over and start playing solitaire. You know there's a common
thread underneath there. So problems with productivity didn't just start.
We've been writing about this stuff for a long long time.
And at the same time, I do think to me,
(36:52):
it seems self evident that there are smart people who
are trying to think about how to get me to
spend more time on their app, their streaming service, their whatever.
They know the emotional manipulation tricks, and so there's an
element of individual agency for sure, and there's an element
(37:12):
of somebody trying to get me to do something in
the same way that I could say with fast food,
there's an element of individual agency whether I eat a
big Mac or not, and we know that those foods
are designed in such a way to capture my attention.
So I think it ends up being kind of back
to where we started, right. There is a lot of nuance.
Speaker 1 (37:33):
Here, absolutely, and I think, again, on
all sides of the debate, often there's a perception that
people like me, who are saying that actually the research
doesn't support some of these more fear mongering claims
that are made out there, are saying therefore there's nothing to worry about.
That's absolutely not what people are saying. And I think
a lot of people with similar views to me say
(37:54):
the same thing as me, which is, you can hold
two things true at the same time. One is that
the research is not great in this area and it
doesn't support some of these wilder claims that are made. At
the same time, tech companies absolutely need to be held
more to account and to design these platforms better so
that they're more supportive of our well being. They place
safety and well being at the core of their design,
(38:17):
not just as something that we maybe want to think
and talk about every now and again, but like they're
absolutely fundamental to it. Those two things can be true
at the same time, and I think, yeah, you're right
that thinking about this in terms of there are pressures
on social media platforms, let's kind of stick with them, for example.
for example. So there's this sort of naive view of
(38:38):
what social media is, which is this way of connecting
or facilitating human connection, which is wonderful and utopian and
not really the full story. Right. There is also the
business element of it, which is that you need to
make money out of these platforms, and there are various
ways in which you can do that. More often than not,
it requires holding people's attention by keeping them on those sites.
(38:59):
That's where the tension comes, or one of the tensions,
one of many tensions, that maybe is not with
our best well being interests at heart. But again, for me,
I think a big aspect of this is yes, yeah,
we do need to push more for better safety by
design principles and well being by design, but that's going
(39:20):
to take time, and there are still things that we
can do very immediately that support our well being that
we can do for ourselves. I'll give you an example. So,
and this has happened really recently to me. I was
in a debate, an online debate, recently, and I know
that there was somebody who was sending some not nice
messages on one particular social media platform, which I'd come across inadvertently.
(39:44):
I think one of my friends had sent them to me,
like, hey, have you seen what this person is
saying about you? I'd already blocked them a long time ago,
but I was still able to see them. And you know,
that was not a nice feeling for me. You know,
I felt very low about it, very nervous. I'm constantly
anxious in this debate that you know, have I just
you know, what if I've just got this wrong? You know,
(40:05):
And that's why I sort of try and keep to
the evidence as much as possible. But I wasn't in
a particularly happy place with all of that happening, and
I was like, I remember going, I wonder what they're
saying about me on other platforms, and I went onto LinkedIn.
Actually I went onto LinkedIn, and it turns out I'd
already blocked them on there, so you know, this is
somebody who had had prior experience with and I was like,
(40:28):
I'm going to unblock them, and I know that this
is a stupid idea. This is not going to be
good for me, because what's going to happen is I'm
going to unblock them.
Speaker 4 (40:36):
The best case.
Speaker 1 (40:36):
Scenario is that they've not said anything about me. I'm
expecting that they've said something, and that's going to make
me not feel good.
Speaker 4 (40:42):
Why am I doing this to myself?
Speaker 1 (40:44):
And I didn't know how to unblock anybody on LinkedIn,
so I had to kind of figure it out, and
it turns out LinkedIn has what I feel is
a really healthy system of unblocking: if
you block somebody, you've got to go into a certain
part of the settings, find them, and when you click unblock,
you get a little pop up that says, are you
sure you want to do this? If you do this,
you can't block them again for another forty eight hours.
(41:06):
So I didn't unblock them because I thought, hang on
a sec. This has just given me enough chance to
pause and go, I'm not going to get a good
outcome for myself from this. I was expecting I could
just like unblock them, have a quick look, and then
quickly block them again. I clearly can't do that, and
that leaves more risk open. So I'm not going to
(41:26):
do it, and I think my mental health is all
the better for it, you know. I took more time
to think about it afterwards and go, actually, I don't
care what this person thinks. I don't know them. I've
never met them. They're just not nice when interacting with
me online. And it's probably best that we don't talk
to each other. And it's a fact of life that
some people don't get on. That's okay, and let's just
(41:47):
both go on with our lives. And I think that
was the best outcome. That was thinking carefully about the
tools that we've got at our disposal to curate our
existence on these sorts of platforms and what we want
to get out of it. But in that particular scenario,
there was a bit of a buffer mechanism in place
that almost kind of like protected me from myself a
(42:07):
little bit. And I think that's a really good example
of where little tips and tricks and tools can help us.
And obviously there are lots of other examples on other
social media platforms where things maybe aren't so great. And
I think it's that sort of thing where there's an
educational element to this, which is around thinking about what
do we want out of our tech use, whether that's
(42:28):
a particular social media platform or our laptops or whatever
it is that we're talking about video games. And then
there's the other side of it, which is what can
the tech companies do very quickly to give us more
helpful tools to help us curate that experience? And what
are things that are a little bit harder to implement
that need to be done over the long term, and
(42:49):
how do we push for that? Yeah, And the sad
fact of this conversation, I think is that at some
point regulation needs to come into the conversation, because you know,
I think if you ask industries to self regulate, they're
not good at that, right for obvious reasons.
Speaker 4 (43:07):
Right.
Speaker 1 (43:07):
So for me, then the question becomes around where do
we direct regulatory efforts so that they're most effective for
helping us with these things. And that's maybe where we've
got some of the conversation wrong at the minute.
Speaker 3 (43:20):
I want to pause for a quick
(43:41):
good Wolf reminder. This one's about a habit change and
a mistake I see people making, and that's really that
we don't think about these new habits that we want
to add in the context of our entire life. Right,
habits don't happen in a vacuum. They have to fit
in the life that we have. So when we just
keep going, I should do this, I should do that,
I should do this, we get discouraged because we haven't
(44:04):
really thought about what we're not going to do in
order to make that happen. So it's really helpful for
you to think about where is this going to fit
and what in my life might I
Speaker 2 (44:14):
Need to remove.
Speaker 3 (44:15):
If you want to step by step guide for how
you can easily build new habits that feed your good Wolf,
go to good Wolf dot me, slash change and join
the free masterclass. The part of the book that I
probably struggled with the most was the sections on addiction,
given that I have addiction history, alcoholism, heroin addiction in
(44:37):
my past, and I've been around addiction for a long
long time. And you sort of start that chapter off
by saying pretty clearly you are not addicted to your
cell phone. I sort of understand your argument because I
read the book, but help listeners understand why you feel
confident making that claim.
Speaker 1 (44:56):
That was actually a sentence in the book that I
agonized over a lot. You know, it's a very strong
claim to make, and I fully appreciate that it's one
that doesn't sit very well with many people's experiences of
their phone use. So I try and unpack that in
(45:16):
the rest of the chapter, and I think for me,
one of the big things here is around the language
that we use in terms of characterizing our relationships with
our digital tech and also perhaps the influence therefore that
that's had on the direction of research in the area.
That we use the word addiction a lot in day
(45:39):
to day speak when I don't think we should. You know,
I think we overuse it, and we use it inappropriately,
We use it in a way that I think can
be unhelpful to what we're trying to do sometimes. So
you know, my take on a lot of this is
that when you say I'm addicted to my phone or
I'm addicted to social media. That what people very often
(46:01):
mean is that they feel like they're using it a lot.
They feel like maybe they're using it too much, and
they feel like they're not happy with the amount that
they're using it. That's entirely understandable. My problem with couching
this in terms of an addiction framework is that it
leaves you with very few solutions that particularly when we're
(46:26):
talking about this in terms of it's not just that
you're addicted to your phone, but that you are addicted
by design, that this is something that has happened to
you without you realizing. Like, if you'd have known that
this was an addictive product, then you would have done
something different. You might not have used it at all.
You know, you weren't given the full facts, and therefore
(46:46):
this is the space that we've got ourselves into. It's
totally understandable when you frame it in those terms that
what you then reach for in terms of how to
deal with this is to remove it, to either get
governments to regulate it, or it needs to be kind
of removed, you know. So you see all these digital
(47:07):
detox programs and things like that they're very much grounded
in the language of substance use. They are very different
things that we're talking about. So one thing to say
there is that if you look at the research on
digital abstinence or digital detox, so where you get these
studies where people are asked to either not use their
phone completely for a certain amount of time, or to
(47:29):
not use a particular social media app or things like that,
you get very mixed findings, very weak effects, and certainly
nothing in the sense of long term effects. Now, I
know that there are some scientists out there who disagree
with me on this, and there's actually a big debate going
Speaker 4 (47:45):
On at the minute around that.
Speaker 1 (47:47):
And again it goes back to that what are we
talking about, How are we defining this. I've seen some
people point to a bunch of studies that show that
digital abstinence works, and therefore this is what we should
be implementing for kids. Those are studies on adults
stopping using Facebook, right? That is not what we're talking about.
Kids don't use Facebook, right? It's a completely useless line of
(48:11):
research for the thing that you're worried about. The other
thing to say about this is that if you look
at the trajectory of research on digital behavioral addictions, there's
this sort of similar trajectory that happens over time,
which is people start talking about this thing in the
way that we mentioned earlier, that people use addiction in
a common day to day sense, and so scientists came
(48:33):
along and go, everybody's talking about Internet addiction or something like that,
we should study this, and that's correct, and we absolutely should.
But it always comes from the starting point that this
thing exists and is very quickly and easily definable, and
I think that's where we get things wrong quite a lot.
Speaker 4 (48:49):
Again.
Speaker 1 (48:49):
Nobody's sort of saying that people don't have problematic use
issues with their phone or with social media. But how
we characterize it helps us define it better. It helps
us understand the populations who are really struggling with it,
and that leads to more useful treatments in those situations.
(49:10):
And then for the vast majority of people who are
not addicted to these sorts of devices or these sorts
of platforms, whatever, if you use other frames of thinking
about this, then you open up a whole new tool
set of ways that you can deal with it and
again get better outcomes for yourself. Maximize the benefits, minimize
the harms.
Speaker 3 (49:28):
Yeah, I can see why the research and
the findings in this area are all over the place,
because we've been trying to define alcoholism or drug addiction
for a long time and we still really can't. Generally,
we've moved away from if you answer yes to six
out of twelve questions, now you're an alcoholic. We talk
(49:50):
more about alcohol use disorder that falls along a spectrum, right,
And so I think a similar way of thinking about
phone use could be helpful. I also think that the
other thing that's challenging is and I've had this discussion
on the show with countless people of different stripes. Is
(50:12):
it helpful to call yourself an addict? Is that empowering?
Is that disempowering? Lots of people have different experiences with that.
Some people find that, yes, that's really a helpful
way to think about it, I find that good, I've
gotten recovery that way, and other people go, it just
made me feel like I couldn't do anything. Why would
I say that I'm powerless over something that doesn't empower me?
(50:33):
And so I think the same thing is kind of
going on here, and there's a point made in the
book that you know, sometimes we may be framing something
as addiction that's something else, like it's a self esteem issue,
or it's a coping mechanism, which I think all addictions
are to some degree, right, All addictions, in my experience,
(50:54):
are generally there's an underlying something that you're trying to
accomplish with the thing. So I think it's possible, probably
in the way that I understand addiction to be Again,
I'm hesitant like you to use the word addicted, because
I think gambling is our best example that mirrors it,
meaning that it's a behavior, it's not a substance, and
(51:16):
so I don't think it has to be a substance.
I do think behavioral addictions exist, But I also agree
with you that probably most people in the same way
that if we look at the number of people who
use alcohol, or who use an opiate, or who use cocaine,
it's a percentage of them that go on to have
real problems with it, and I would imagine that it's
(51:38):
a similar thing. It's really hard to use the language
of habits, and I've done a lot of coaching with
people over the years on changing behavior, and boy that
line between a habit and an addiction, like, what even
is it? How do you even really know? It's very muddy,
(51:59):
But I do think that, broadly speaking, claiming
addiction across the board for all of our interactions is
probably not helpful.
Speaker 1 (52:10):
Yeah, I think probably the closest analogy here is gaming addiction,
which is the only digital addiction that is formally categorized
as a clinical disorder anywhere. So the World Health Organization
classified this as a disorder in twenty eighteen in ICD eleven,
the International Classification of Diseases, and there was a big
(52:33):
debate at the time between broadly speaking, two camps of
researchers and clinicians about whether this was the right thing
to do or not.
Speaker 4 (52:42):
So I totally take the point that you made, and the
Speaker 1 (52:45):
Point that the people on the other side of the
debate made in twenty eighteen, which is that it's too
simplistic to say, well, you need to know whether this
is a coping strategy for something else, or whether it's
just better accounted for by depression or something like that, because
very often with addictions they are comorbid with other disorders.
Speaker 4 (53:03):
That's very often the case Chicken and egg, right, yeah.
Speaker 1 (53:06):
Yeah, I think the problem from the side that I
was on, which was that not that there aren't some
people out there that struggle with gaming to the point
that it becomes actively harmful, but that we don't know
enough about it yet. Is this a unique disorder? Is
it better characterized as another form of disorder, maybe like
an impulse control disorder or something like that. And we
(53:28):
need to answer those questions if what you want to
do is help the people that actually need help. So,
if you look on the World Health Organization's website for
gaming disorder, which is what they call gaming addiction, there
is a link there to a twenty twenty systematic review
of papers in the area, and that review covers about
one hundred and sixty papers. And across those one hundred
(53:50):
and sixty papers, there are thirty five different ways in
which gaming disorder is assessed, not one, thirty five, And
across those thirty five studies, you get a prevalence rate
of anywhere between zero point two percent of the gaming
population up to about fifty eight percent. Now, what that
(54:10):
says to me is that you don't know what this
thing specifically is. If either nobody has it or pretty
much everybody has it. That's not a helpful thing for people.
Let's just for the sake of argument, let's just say
that the true rate is three percent. If you go
around thinking that the prevalence rate is sixty percent, you
are diagnosing a lot of people with a clinical disorder
(54:33):
that they don't have. If you think the true rate
is zero point two percent, you are missing a lot
of people that need help. And this is where I
struggle with that debate. Again, me saying I don't think
addiction is the right way to talk about this is
not me saying there aren't people that really really struggle
in a pathological sense, or in a harmful sense, you know,
(54:54):
they are experiencing harm through their tech use. That is
not what I'm saying at all. It's that we don't
have the right ways of talking about it to help
to identify who they are, to figure out what the
unique features are of that particular disorder, and to help them.
And I think just rushing along and saying problematic smartphone
use is actually smartphone addiction and this is how you
(55:16):
test it, doesn't actually help people in the long run.
Speaker 3 (55:19):
Right. Yeah, we could talk about this for four hours
because I find it a fascinating subject, both gaming use
and disorder, and people have been debating what exactly these
things mean for a long long time. I do think
that there's a generally common sense approach we used to
say in twelve step programs, which is, if your drinking
(55:41):
is causing problems, then you probably have a drinking problem.
Exactly where you categorize it, what you call it is
probably not as important as the fact that, like, okay,
something needs to be done here right, something needs to
be tried. And to that end, I don't want to
put this in the post show conversation, which is going
to make this longer than normal, but I don't want
(56:03):
to talk about a problem for a long time without
at all addressing your thoughts on how we work with this,
and you sort of end with if we want to
think differently about the relationship with our screens, we need
to take a couple different steps. So what are some
of your broad takeaways for what is a useful way
(56:24):
to think about this thing that we don't quite know
what to call. We don't know how bad it is,
and yet, as you've said clearly, we know there are
some problems that emerge out of it for some people.
So how do we think about or talk about this
in a way that is helpful.
Speaker 1 (56:39):
I think being reflective on your tech use is a
really helpful thing to do here, and I appreciate that
very often that's not an easy thing to do. So
there's no quick fix, right. This is not like a
here's one weird psychological trick that you can use to
magically fix everything about your screen time life. If there are
things that you're not happy about with your tech use
(57:00):
or your screen time, things like that, actually give yourself
a pat on the back for identifying that to begin with, because
more often than not, we don't notice when we're getting
into these sorts of bad habits and developing, you know,
bad relationships with them. So when you start noticing it,
that's the first step, right. That means that you can
do something about it. If you're thinking about this in
(57:21):
the context of I have the power to do something
over this. You know this is not something that's happened
to me. You know these things are designed to enable
habits and enable bad habits sometimes, but you know I
still have control over this.
Speaker 4 (57:34):
I can fix this.
Speaker 1 (57:35):
Then what happens next is experimenting, so figuring out what
works for you. This is where, you know, I always
like to take an evidence based approach with this sort
of writing, all these sorts of things, like I do.
And this is where it breaks down in the book,
because there's not much good evidence anywhere to say this
is a really effective solution for fixing things, so it
becomes very individualistic. So for example, in the book, I
(57:57):
talk about sleep at one point, and iPhones have a feature
called Night Shift, which turns the screen yellow,
and the idea is that less blue light beaming
out of your screen doesn't disrupt your sleep as much.
There are studies which actually show that this doesn't work.
It doesn't have an effect, so it's a bit of
a placebo effect. But I also say in the book
that I still have night shift mode on my phone,
not because I think it has some sort of biological effect,
(58:19):
but it's a really overt marker. If I'm playing
a game or on Instagram or whatever at night, I've
set it to twenty minutes earlier than I want to
go to bed, so I've got a nice clear marker
that now is the time to start shifting to doing
something else. It doesn't always work, but the important thing
is that when it doesn't work for me when I
still find myself like scrolling or whatever. Every time I've
(58:42):
found that happening recently, it's because I'm avoiding something like
something difficult, you know, and maybe I've had a bad
day or something difficult's happened at work, or even sometimes
it's just been things like I've not been able to
sit down until like quarter to ten at night, and actually,
you know what, I want some time to myself. Yeah,
(59:02):
so giving ourselves a break, I think is a good
thing to do here that you know, we all struggle
with building these sorts of healthy relationships, and that's okay.
You know, it's never too late to start changing things,
and it just starts with those little steps being more
aware of what you're doing, thinking about what you want
out of your tech use, and you know, catering things
(59:26):
to align with that goal. I was in hospital about
six months ago and I downloaded a game I don't
think I would have ever downloaded otherwise. It was like
a city building game. Awful, awful game, Like, you know,
you could spend thousands of pounds on that game. It's terrible,
but I was just like, I just want to play
something mindless like this. For a bit and I started
playing it lots, and I left notifications on and it
(59:46):
was one of those annoying games where other people can
come and attack your city whenever, and I found it
being really unhelpful.
Speaker 4 (59:54):
You know.
Speaker 1 (59:54):
I would be at the dinner table and I get
a buzz on my phone saying somebody's attacking your city,
and I'm like, oh, oh, I need to do something
about that, and I don't want that situation to happen,
so I turn the notifications off. What happened was that
more people started invading my city and killing me lots.
And then I realized, I don't care because that game
(01:00:15):
was something that served a purpose for me when I
was in hospital, and actually it's not now, so I'm
going to delete it. All of the time and energy
and effort that I'd invested to get to a certain
level, it all disappeared. And, you know, it doesn't matter,
because what I'm getting is a nicer experience now. I
value protected time at the dinner table with my kids,
(01:00:35):
and this was something that I'd allowed to happen. I
didn't beat myself up about it, but I thought, next time,
I'm in a space where I'm going to download a
game that does something like that, I'm going to think
about the knock on effects later on, and, you know,
get rid of notifications, be more ruthless basically about when
I stop playing them. Things like that and just little
shifts in thinking like that can help. They're not going
(01:00:57):
to fix everything, but they will help a bit.
Speaker 3 (01:01:00):
That's a great place for us to wrap up. Pete,
thank you so much. Like I said, I really did
enjoy the book. I appreciate nuance. There was a lot
of good nuance, and you know, I didn't feel like
you were dragging the science one way or the other
to suit an opinion, and I think that is always
a useful service.
Speaker 4 (01:01:17):
So thank you, thank you.
Speaker 2 (01:01:34):
If what you just heard was helpful to you, please
consider making a monthly donation to support the One You
Feed podcast. When you join our membership community with this
monthly pledge, you get lots of exclusive members only benefits.
It's our way of saying thank you for your support.
Speaker 3 (01:01:50):
Now.
Speaker 2 (01:01:50):
We are so grateful for the members of our community.
We wouldn't be able to do what we do without
their support, and we don't take a single dollar for granted.
To learn more, make a donation at any level and
become a member of the one You Feed community. Go
to oneufeed dot net slash join The One You Feed
podcast would like to sincerely thank our sponsors for supporting
(01:02:12):
the show.