Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Welcome to Stuff to Blow Your Mind from HowStuffWorks.com. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick.
And this is going to be part two of a
two part episode about the attention economy and the technosphere,
(00:23):
the war for your eyeballs and the fact that we live in a technological economy in which platforms and apps and devices are constantly trying to suck in your attention, and they're succeeding. They're working very well. If you have not listened to our first episode, you should go back and listen to that first. That's where we lay the whole groundwork and try to explain the problem of what's going
(00:46):
on with the attention economy today. But just as a
quick refresher, if you've already listened to the episode, to
put it back in your mind, what did we talk about last time? We basically talked about the attention economy,
the idea that your attention, my attention, it is a depletable,
finite resource. And every time there is a pop up
on our phone, every time we obey the impulse to
(01:09):
check Facebook or Twitter or Instagram or what have you,
it is depleting not only precious time, but precious energy
and precious will. Right, it really matters. It's taking
us away from the things that we would like to
be doing with our time ultimately to get what we
want out of life, the things that really matter to us,
and instead putting us into these easy, habitual modes of
(01:30):
passive consumption of digitally supplied information and entertainment that can ultimately leave us feeling hollow and regretful after we experience it. Right, and even setting aside direct harm, I think there's some evidence that it is actually hurting us, in the sense of what we talked about: by constantly interrupting us.
(01:50):
It's lowering our cognitive performance, making us less smart by
constantly pulling us out of focus on tasks and interrupting
us all the time, and just keeping us always aware of the threat of a new message or
notification or something. So it might actually be harming us.
But even if it's not harming us, it's causing us
to voluntarily spend our time and attention in ways that
(02:13):
are not what we want out of life, and we
end up feeling regretful, and in a way we are victims of future shock. Future shock? Tell me about future shock. Yes, well, okay, so listen. Longtime listeners to the show might remember that we had a two-parter years and years ago about future shock,
(02:34):
Alvin Toffler's nineteen seventy book, and the general idea of future shock. Also, there is a wonderful, I think nineteen seventy two, TV documentary about it. Yes, it has kind of a Twilight Zone-y, Cronenberg-y vibe, and it's narrated by Orson Welles. It's a little overwrought,
(02:55):
as you might imagine, but it's still very good. It's all on YouTube; you should be able to find it. So I've never actually read this book. I know it's a classic work of futurism, and I should read it at some point, but I haven't. So Robert, can you explain to me the idea of future shock? Alright. So the book itself, which is still a fine read even though it is, again, a decades-old book at this point, touched on a number of
(03:17):
contemporary and looming aspects of technological advancement, including overchoice, pressure to keep up with the latest technology, rapidly expanding knowledge, information overload, a computer-fueled society, temporary consumer culture, new transient lifestyles, instant intimacy, as well as more sort of
(03:42):
sci fi ideas you might say, such as cyborgs, modular bodies,
and prosthetics. Okay, but how does this come into the idea of the attention economy? Did Toffler actually predict that we would be in this situation of being surrounded by machines that so efficiently drain away our attention? He did. He did, in fact. So I want to read a passage here from Future Shock. For while we tend to
(04:06):
focus on only one situation at a time, the increased
rate at which situations flow past us vastly complicates the
entire structure of life, multiplying the number of roles we
must play and the number of choices we are forced
to make. This, in turn, accounts for the choking sense
of complexity about contemporary life. Moreover, the speeded up flow
(04:29):
through of situations demands much more work from the complex
focusing mechanisms by which we shift our attention from one
situation to another. There is more switching back and forth,
less time for extended, peaceful attention to one problem or
situation at a time. This is what lies behind the
vague feeling noted earlier that quote things are moving faster,
(04:52):
they are around us and through us. So that sounds like,
for one thing, some recognition of a thing we mentioned
in the last episode, the task-shifting penalty, right, that we're downgrading the quality of our cognition by constantly shifting between focus on different tasks. But it's more than that, right? Yeah, he goes on in that section of the
(05:14):
book and says, quote all this represents the press of
engineered messages against his senses, and the pressure is rising.
In an effort to transmit even richer image producing messages
at an even faster rate. Communications people, artists, and others
consciously work to make each instant of exposure to the
mass media carry a heavier informational and emotional freight. I
(05:38):
want to read just one more quote here from the book, because this touches on, again, sort of the moment. This was the late sixties that he would have been writing this, to ground it, but I think it's very applicable to some of what we're discussing here today. He said, quote: The religious fervor and bizarre behavior of certain hippie cultists
may arise not merely from drug abuse, but from experimentation
(06:01):
with both sensory deprivation and bombardment. The chanting of monotonous mantras,
the attempt to focus the individual's attention on interior bodily
sensation to the exclusion of outside stimuli, are efforts to induce the weird and sometimes hallucinatory effects of understimulation.
At the other end of the scale, we note the
(06:22):
glazed stares and numb, expressionless faces of youthful dancers at
the great rock music auditoriums, where light shows, split screen movies,
high decibel scream shouts and moans, grotesque costumes, and writhing
painted bodies create a sensory environment characterized by high input
and extreme unpredictability and novelty. Now, now that's a fun passage,
(06:45):
because there is this sort of sense of, oh, the young hippies and their craziness. But at heart, this idea of over- and understimulation, this questing for bodily stillness in relation to this overly stimulated world, I think is crucial to what we're talking about here, and I think it kind of supports this
(07:06):
idea that Alvin Toffler warned us. Alvin Toffler told us so; he warned us of this future shock. Well,
it makes me think about the inherent fallacy where we would tend to assume that, boy, having lots of input coming into the mind would essentially be an enriching experience. Right, the more input is coming in, the
(07:26):
more stocks of information and wisdom you should be building up, right,
But I think that's not necessarily the case. In fact,
having many information streams coming into the mind tends to
instead confuse the input processing devices within the mind, and
instead of enriching the mind, it sort of just numbs
(07:47):
the mind. Yeah, like the idea that he touches on about overchoice. I feel overchoice like crazy. I feel like any time that I am just casting about on, say, Netflix to see what I might be interested in watching, there are just so many options that I'll end up just scrolling through them all and not watching anything, and ultimately I have a completely unfulfilling experience.
(08:11):
Because I didn't actually view a movie on this app, this service, this industry that is about delivering me a movie. Now,
something I want to come back to in a minute
is it's interesting because that app is different than many
of the other things we're talking about. Many of the
things we're talking about demand our attention because they are
ad supported, or because they're gathering data on the user,
(08:34):
in which the user's attention and the times they spend
paying attention to the app is literally what's valuable to
the content producer or to the platform. Right. Netflix is
a different thing because it's subscription based and so you're
paying to have access to it. So, like, why would they want you to keep paying attention? That's an interesting question. We can maybe talk about that a little
(08:56):
bit more. But another point about Netflix I do want
to make real quick is that, as we I think
said last time, we don't want to demonize all consumption
of information or entertainment through digital formats, right, because ultimately,
what you want when you use Netflix is to watch
a movie, and we like watching movies. Like watching a
movie is a valuable activity to me, and when it's
(09:18):
a movie I like, that I care about, that is something that I feel is worth doing with my free time. It's something that brings me pleasure.
And I'd say a similar thing for lots of the
other apps I use. I enjoy the digital content I
consume through my books apps. I enjoy the digital content
I consume through my podcast apps. But one of the
things that makes these apps a little more difficult to use,
(09:42):
even though we get more enjoyment out of them, is
that you have to make conscious decisions about how to
use them, right. You have to seek out the thing
you want. I know, I want to listen to this podcast.
I know I want to listen to this musical album.
I know I want to watch this movie. Whereas we
are served this content so much more passively and automatically
(10:03):
by many of these other apps that make us feel regretful,
by our social media apps and by the games that
just sort of launch and then it's an incoming stream
of stimulation. Does that make sense? Yeah. No, yeah, it does make me wonder to what extent these various Alexa-type products, where you have some sort
(10:24):
of a smart speaker that you talk to and then
it fetches what you need. I wonder if that could
be a way to mitigate this effect in the future
where you have this robot that's essentially doing the surfing for you. It's going to be the one dealing with all that, and ultimately you're not having to deal with the overchoice of, say, a Netflix queue. Yeah. Absolutely,
(10:46):
I mean, I think one thing we should explore in
the second episode here is the idea of how because
like I said, we don't want to just demonize technology.
It's not like digital platforms are inherently bad as a concept. It's that they have taken on this form that's taking away our attention and our lives without really giving us a lot of value, without giving us
(11:07):
always the things we want. Though sometimes we do get the things we want out of them, it's just that those aren't presented in such an engrossing and addictive way as the stuff we don't really ultimately want. So, how to make technology work better for us, to give us what we actually want and actually care about: that's something that matters, and it's a question we should address.
(11:28):
But I think maybe we should take a quick break, and when we come back from that break, we will look at an essay about how these apps and devices are specifically engineered to hijack our brains. Alright, we're back. So let's get into some of the specific tricks that are involved here. Like, what are these devices doing?
(11:50):
All right. Here I want to turn to a twenty sixteen essay by a guy we've mentioned, a guy named Tristan Harris, who's a former Google employee. He's one of the founders of this movement now called Time Well Spent, which is focused on
exactly this issue we've been talking about over these past
couple of episodes, the idea of the attention economy and
the fact that our technological devices and apps are not
(12:12):
really giving us what we want in terms of our
attention investment, but they're nevertheless extremely powerful at harnessing that attention. And so Harris in this essay is trying
to explain how it works, how they capture our attentions
so effectively, so powerfully, and how we are so powerless
to resist. So he cites the fact that he was
(12:33):
formerly a design ethicist at Google and he spent his life studying security vulnerabilities in the human brain: ways to trick the brain and subvert its will without the person ever realizing what's happening. In his words, quote, how technology hijacks our psychological vulnerabilities. And he goes on to list ten ways that technology manipulates and controls us invisibly. I'll
(12:56):
try to mention them all, but explain a few of
them in depth. One of them is that when you
have a technological platform that allows people to use it freely,
that platform controls the menu of options that people have
available to them. And yet it still simulates the feeling of having free choice, right? I mean, it's kind of
(13:18):
like asking a five year old what they want for dinner.
You don't let them have free rein. You don't say, what do you want? You say, do you want broccoli or spinach? Right. You give them the choices that you are already prepared to make, and you don't offer a third choice that you are totally not into yourself. Right. And well, that makes a
lot of sense with the child, and maybe sometimes with
(13:40):
certain types of bosses, but that is also how these
apps work. They give us the feeling of having a
lot of choice by giving us a menu of options
of things to do, but also by discouraging us from
wondering why aren't I allowed to do different things that
aren't on this list? Okay, So the idea here is,
again we have an illusion of choice, but we're still
(14:01):
being herded into a few pre selected ideal choices, right,
and generally, what those pre selected ideal choices do is
keep us engaged with the apps and platforms we're using.
So instead of thinking of the open-ended question, who could I hang out with tonight?, the apps on your phone encourage you to think, who
(14:22):
are the most recent people I've been texting with that I could ping about this and continue texting with? Or instead of, what's happening in the world?, an open-ended question
you could investigate lots of ways, it becomes: let's see what's happening on my news app feed. That keeps me engaged with the device and with the app on it. Yes,
(14:43):
and which stories have been updated? And okay, when are they going to be updated? Now? Oh, I better check again, see if that story has been updated once more. The
second point that Harris makes, I think is maybe the
most important point on the whole list. He he draws
a lot of attention to it, And I think this
is very important. We've discussed slot machines on the show before. Yes, we have. I don't think it's unfair for us
(15:05):
to say that we do not like slot machines.
They are these highly advanced, very smart, highly developed parasites
that are engineered perfectly to drain us of money by
appealing to our reward-seeking minds. But why are slot
machines so addictive? What's the trick? One of the things pointed out in Harris's essay is
(15:28):
that he cites the work of the NYU professor Natasha Dow Schüll, author of Addiction by Design, which
found that people get quote problematically involved with slot machines
three to four times faster than they do with other
types of gambling. So why would that be? Why are
slot machines so much more effective at causing problem gambling
(15:50):
behaviors than a craps table or a poker game. Well,
I mean, on one hand, it's certainly a farm more
mindless endeavor. Right, You're just you're just to pull on
the level, You're just essentially hitting refresh and seeing what happens.
Am I a winner? This time, I am a winner.
This time, there's no there's no strategy, there's no there's
certainly no card counting. It's just let me take another
(16:12):
go at it and see what life, a.k.a. the machine that has been programmed to drain my money, allows to happen. Yeah, and apparently
the number one factor making slot machines so devilishly irresistible
and addictive is what's called quote intermittent variable rewards. Intermittent
variable rewards. This means that you create a circuit where
(16:35):
a simple user action, just like you're talking about pressing
a button, pulling a lever, reloading a page, opening an app,
very simple action to do, leads to a variable reward
at a variable rate. So variable reward means that when
you complete the user action, sometimes you get nothing, sometimes
you get a small reward, sometimes you get a big reward.
(16:57):
And the variable rate means that it's not predictable when these rewards will come in a sequence of actions. Yeah, it's like checking Twitter, for instance. Yeah, I refresh,
and who knows what I'll get. Maybe I'll have a notification. Yeah, an at-mention. Yes, somebody's
talking to me or about me, or my favorite comedian
has a funny statement to make. There's a new trailer
(17:17):
for a movie I want, or oh, nuclear war, it
could be anything, right. Or the message inbox, likes on a post or a tweet, the matches on a dating app. It's all intermittent variable rewards. You keep refreshing,
you keep checking, hoping that there will be some reward,
and sometimes there is. Sometimes there's something that's a little
(17:37):
bit socially gratifying, sometimes there's something that's a lot socially gratifying.
Sometimes there's nothing. There's nothing. A lot of times there's nothing. And when there is nothing, that actually keeps you checking; it keeps you searching for that next reward.
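This reward mechanic can be sketched in a few lines of code. To be clear, this is purely an illustrative toy, not anything from Harris's essay: the function name, probabilities, and reward sizes are all invented for the example.

```python
import random

def refresh_feed(rng):
    """One 'pull of the lever': a simple user action that pays off
    at a variable rate with a variable-size reward."""
    roll = rng.random()
    if roll < 0.60:      # most refreshes: nothing at all
        return 0
    elif roll < 0.95:    # sometimes: a small reward (a like, a reply)
        return 1
    else:                # rarely: a big reward (a viral mention)
        return 10

rng = random.Random()
pulls = [refresh_feed(rng) for _ in range(1000)]
print(sum(1 for r in pulls if r == 0), "of 1000 refreshes paid nothing")
```

The point of a schedule like this is that no single refresh tells you anything about the next one, which is exactly what keeps you pulling.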
So the intermittent variable rewards mechanic is an extremely powerful exploit on the human mind. It's a known
(17:59):
vulnerability, and most tech devices like phones and stuff
and the most used apps are making use of intermittent
variable rewards. Now, Harris mentions that this kind of thing doesn't have to be intentional design at first; apps could land on these strategies by accident. They didn't have to say, let's make it a slot
(18:20):
machine from the beginning. But once it works like a slot machine, they realize that that's very effective at maximizing time on site, or user engagement, or repeated pickups throughout the day. Yeah, it's kind of a parallel evolution type situation,
right where this particular form moves the fastest through water.
This is the design that has been selected. Okay. A
(18:41):
third point Harris makes is that these apps encourage us
to think, what if I miss something important? Well, we
have a little bit of a fear of missing out and all of that. But also, yeah, I do want to know if there's nuclear war. I'm not exactly sure why; there may not be much that I can do, but it's an important event, and I need to know that it
(19:02):
has happened. Another thing he mentions: our desire for social approval. They really play on this. Like, we are largely social
creatures with socially shaped brains, and we're highly motivated by
feelings of social inclusion and approval and avoiding feelings of
social exclusion and disapproval. So when somebody thinks to quote
(19:24):
mention you on Facebook. You've seen this capability, right? You can tag a person's name in a post on Facebook. This feels intensely rewarding to people. I got mentioned! It was me! I'm what this person was thinking about! Right. And then you click and you see, oh wait, they tagged everybody. Yeah. But unfortunately, these mentions, as
(19:45):
rewarding as they feel, they don't spring forth directly from
people's social relationships. You'll notice that Facebook tries to get
people to do this. It tries to make you mention
other Facebook users as often as possible, suggesting they be
tagged in photos, suggesting they be named and tagged in
the post you're making, and so forth. Also, have you
(20:06):
ever noticed how the Facebook news feed algorithm favors prominent
and repeated display of posts that have comments saying the
word congratulations? I have not noticed this. Try this sometimes: if somebody posts something that has no congratulatory content, nothing worth congratulating in it, just start commenting congratulations on it.
(20:27):
I think that this moves that post up in the
news feed and more people see it, because they want to get people engaging with some kind of chain of social approval, which keeps people glued to the app. See, this is probably one of those situations where I would have assumed it had more to do with my attention, the idea that I'm more inclined to
notice when somebody else has been congratulated on something, and
(20:50):
that may go in negative or positive directions, right? Because it could be somebody that I deeply care about, or it's me, and of course I'm invested in them getting congratulations. Or I have this kind of bitter feeling where it's like, ah, look at that, they got a new puppy dog and they're getting all these congratulations about it. I don't
have a puppy dog. Well, either way, you're engaged, aren't you.
(21:11):
But I don't know for sure that's the case; I really strongly suspect it is. That's not a point Harris makes about congratulations, that was just something I think I've observed. But I really do see it seeming to get highlighted and moved up. Sometimes Facebook will even, like,
bold the word congratulations in the comments as they display
on your news feed. Another point Harris mentions is playing
(21:34):
upon our feeling of responsibility for social reciprocity. Somebody followed you? Better follow them back. Somebody wrote you a LinkedIn recommendation? Better write one for them. Here's a really devious iteration
of this. Why did Facebook start doing that thing in
Facebook Messenger where it tells the sender that the message
has been seen. Oh yeah, it's like they're
(21:57):
just trying to get you in trouble for dragging
your feet. It puts pressure on the recipient of a
message to respond. You can't pretend you haven't read the message.
It tells them you saw it, so now you really
need to spend some time in the app composing a response,
Or it's something like in a chat, where it's saying, Joe is typing a response, and ooh, what's he typing? Oh, he's really been going at this for
(22:19):
a while. This must be great. And then he never sends it, and you're like, oh, what was it? What was this epic poem that Joe was composing?
remember where I saw this cartoon, but it was hilarious
where I probably just saw it on some social media
feed anonymously in the middle of the ether, but it
was where you you message somebody, Um, hey, what did
(22:40):
you think of my thing? You know, something that you did. What did you think of my poem? And it's like: so-and-so is typing, so-and-so is typing, so-and-so is typing, so-and-so is typing, so-and-so is typing. Response: It was great. Yeah, yeah,
I've seen that sort of effect before as
(23:03):
well. On the whole following-people-back thing, I do have to add that surely I'm not alone when I see, like, a celebrity Twitter account where they'll have, you know, several million followers or whatever, but then they follow zero people, or only one person, or it's some pre-selected fun number. I kind of always
(23:25):
have this impulse to say, you're a monster, what is
the matter with you? Like, I don't have that judgment
about people who have more followers than than people they're following.
I mean, you could ultimately only follow so many feeds, I feel, and certainly if you're a celebrity, there's going to be that imbalance.
But when these people have, like, I follow nobody,
(23:45):
I am my own voice, or, I follow one person and it is the Lord. I don't know. Yeah, it always kind of irks me. Okay. The next point Harris makes is infinite displays: autoplay videos, infinitely scrolling news feeds. There's no natural stopping point. Yeah, you just keep going down, down forever, and
(24:06):
you can never get to the bottom of the screen. It's like playing Tetris. Yeah. And this is a psychological
vulnerability as well, because instead of having to actively select
that you want to continue doing something. Think about if your Facebook news feed was paginated and you had to keep clicking to the next page instead of just constantly scrolling down for infinity.
(24:29):
It's harder to make a decision to stop and change
what you're doing, or to make a decision to do something,
than it is to just continue your passive consumption experience
that's ongoing and unchanging. The next one, this is a
big one: instant interruptions. Does your phone wait and update you about notifications in batches separated by a few hours? No,
(24:50):
of course not. I mean, there are going to be differences between different devices and apps in how they give you notifications by default, but generally they're going to want, by default, to say: something just happened right now, you need to check it immediately.
It interrupts whatever you're doing to tempt you with an
intermittent variable reward, and then once it tempts you in,
(25:13):
you're in there. Yeah. Yeah, I've had to turn most
of those off because it seems like for a while,
I'd have, like, a major news organization app on my phone, and they would send me an update, and I'd think something important happened. I'd get kind of anxious about it. Was there a nuclear explosion? Was there some sort of terrible event that occurred? And then, no,
(25:33):
it's a major sporting event's score. Right. It's like, I didn't even sign up for that, but now I'm interrupted by it and now I'm mad about it.
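As a thought experiment, the batching alternative mentioned a moment ago could be sketched in code: instead of pushing each notification the instant it arrives, hold them and deliver them together. This is a hypothetical sketch only; the BatchedNotifier class, its interval, and its methods are invented for illustration and correspond to no real phone API.

```python
from datetime import datetime, timedelta

class BatchedNotifier:
    """Hold incoming notifications and release them together,
    instead of interrupting the user for each one."""
    def __init__(self, interval_hours=3):
        self.interval = timedelta(hours=interval_hours)
        self.pending = []
        self.last_flush = datetime.now()

    def notify(self, message, now=None):
        """Queue a notification; only flush when the batch window elapses."""
        now = now or datetime.now()
        self.pending.append(message)
        if now - self.last_flush >= self.interval:
            batch, self.pending = self.pending, []
            self.last_flush = now
            return batch      # deliver everything at once
        return None           # stay silent for now

n = BatchedNotifier(interval_hours=3)
n.notify("Kenny tagged you in a photo")       # held quietly
later = datetime.now() + timedelta(hours=4)
print(n.notify("2 new messages", now=later))  # both delivered together
```

The design choice is the whole point: each individual notification loses its power to interrupt, because nothing is delivered until the window closes.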
Two more Harris mentions, actually three more; two I want to mention briefly. One is putting the most profitable
or attention grabbing part of the app between you and
the reason you opened the app. So it's like you
opened the Facebook app to check on an event or
(25:56):
to see a message or something. But it's gonna try
to route you through the news feed. Right, it's kind of the gift shop scenario. Exactly. The next thing is increasing the friction of making choices that tech businesses don't want you to make: just making it more effort and more difficult to unsubscribe or opt out of something.
(26:16):
Oh yeah, my favorite is: of course you cannot unsubscribe through the app or the website, you have to call a number on a particular day. Or how about the ones where you have to opt out to make your account private on something? Things are public by default; you've got to go in and mess with a bunch of toggle settings to keep people from
seeing the stuff you post. Finally, and this is what
(26:38):
I want to discuss in a little more depth. Harris
points out that these apps and services use foot in
the door sales strategies. So, Robert, you've bought a car, right? Yes. And so if you've ever bought a car, you go to the car lot, the salesperson is trying to sell you on the car, and you'll know that
the salesperson makes all these kind of little bids for
(27:01):
your continued engagement that are at no cost to you, right,
and so they feel easy and no risk, like why
not just take a little test drive. I know you're
not ready to buy it. That's fine, you don't have
to do anything today. Just take a test drive. It's free,
it's low investment. Just take a few minutes, just see
what it feels like and make you better inform your
decision ultimately. And then you do the test drive and
(27:22):
you're like, okay, well I want to go think about it,
and they don't want to let you go. They want
to say, actually, you know, I know you need time to think about it, that totally makes sense, we don't want to force you to do anything, but I
do want you to come inside real quick and just
get something on paper so I can show you what
kind of deal we can put together for you that
will help inform your decision making if you're looking at some other cars. You see the same thing with so
(27:43):
many of these services, where it's, try it for seven days and see if you like it. And then when you go to unsubscribe, they're like, why didn't you like it? Is it too expensive? And maybe they'll say, well, I'll tell you what, and they'll throw some sort of deal at you. Or they'll say, how about this: maybe don't unsubscribe, maybe just snooze it for a while, and you know you'll come back to it. A.k.a., you'll forget about us, and then we'll charge
(28:04):
you for the service. Exactly right. But with the car salesperson,
what they're doing every time they make a little bid
for your continued engagement is they're exploiting psychological vulnerabilities. They
know the more time you spend talking to them, and if they can get you into a chair, seated across from them at a desk, looking at a piece of paper, they're psychologically advancing you toward the sale, making it more
(28:28):
and more psychologically difficult for you to back out. Even
though you haven't committed to doing anything explicitly, you're becoming
more and more implicitly psychologically committed, and they just keep
doing that. They're always advancing the sale. And so many
of these social media apps work the same way. How about: Kenny tagged you in a photo, click to see the photo. Well, you might be in the middle of
the photo. Well, you might be in the middle of
doing something, but if you get that notification, you're like, well,
I can look at a photo that just takes a
few seconds, so you click to see the photo. But
in Harris's words, quote, people don't intuitively forecast the true
cost of a click when it's presented to them, because
you think, click to see photo, that I'll just take
(29:08):
a few seconds. Okay, that's fine. I can invest those
few seconds and look at the photo. But then you're
in the app and all of the other stuff in
the app is there, and you could click on other
people or click to see more photos, or end up
back on the news feed, and it's pulling you in.
And that's how it works. They present you the idea that there will just be a quick little investment.
(29:30):
Instagram does stuff like this. You know, you want to see one thing. You've got a message, or you were tagged in a photo; there was just one little thing you need to check, but then you're scrolling. Ultimately,
Harris ends the article by saying, quote, we need our smartphones, notification screens, and web browsers to be exoskeletons for our minds and interpersonal relationships that put our values, not
(29:55):
our impulses, first. And so he's advocating that we need to redesign our technological landscape, to make our technology serve what we want out of life rather than
what's easy for us to do in the moment, because
what's easy for us to do in the moment is
so easily exploited by people who are not working in
(30:15):
our best interests. Yeah, obviously, you know, I was thinking
about this just the other day because, uh, Facebook actually
rolled out this little questionnaire. I don't know if you
received this as well, asking, you know, what can
we do to make the world a
better place? What can we do to help the
product make your life better? And
(30:36):
And you know, on one hand, I appreciated the outreach,
albeit a, you know, corporate technological outreach.
But on the other hand, I wanted to say, like,
I'm talking to a machine. You are not a person.
You are a corporation. You're a business. You're a technology.
And I know what technologies want. I know what corporations want,
(30:58):
and they are almost
certainly not what I want. People think the wrong way
about the evil that gets done by like tech businesses
and stuff. They often think that like, oh, you've got
the Snidely Whiplash executives sitting up there thinking about how
to wreck people's lives. And that's not what happens.
I mean, these businesses are like other businesses. They're full
(31:21):
of people who are mostly decent people who are just
trying to do a job and make
a business and make a product that works. And the
problem is with the incentives. The incentives in the structure
of the business. The business incentivizes captivating as much of
people's attention as possible. This is not presented to the
(31:41):
people who work in these businesses as an overtly evil
thing to do. It's presented in terms of things like engagement.
So we want to make the app engaging. You want
to make it something that people want to use. If
people are using your app a lot, doesn't that seem
like they're getting something out of it. I mean, you're
not forcing them to do it. They're doing it because
they want to. And so you can get locked into
(32:02):
this way of thinking about things without ever making the
decision to try to be evil and upset people's
goals and hurt the quality of their lives. It's kind
of like with video game design, you know, you want
to make a really fun game, but then when the
response is, hey, I'm sorry, you made that game a
little bit too fun, it's making people sad. And well,
(32:23):
there are different ways to make a fun game. You
could make a fun game that people, when they were
done playing, felt good about the time they spent playing it.
They're like, I had fun, that was a fulfilling experience.
I don't regret it. You can also make a fun
game that when people are done they feel hollow and
regretful and aren't glad they spent their time that way. Well,
I'm always reminded of the old saying, uh, leave while
(32:45):
they're wanting you to stay. I think about
that all the time with these big games.
I like a nice short game. Give me a four
hour game, a six hour game, one of those games
that online reviewers criticized for being too short, because that's
probably the right length. It's these big, open-ended games
where it seems like you just play them till you're
sick of it. You're like, this brings me no joy anymore,
(33:06):
And I'm not sure why I'm playing it right now. Yeah,
it's an overabundance problem. There's too much supply of stimulating
games and actually not enough attention to spend on them, right,
and so I think this is a good way to
transition to something that we've both read that's pretty interesting.
It's an interview with another one of the co founders
of Time Well Spent, who's a guy named James Williams,
(33:26):
a former Google employee as well, who started trying
to become a technology ethicist, thinking about how technology
can serve what we want out of life and serve
our values instead of just sucking our attention away in
these mindless activities. He actually wrote an essay called stand
Out of Our Light, and one exerpt from it is
that he pointed out the problem is that digital technologies
(33:48):
privilege our impulses over our intentions, and instead they should
help us in achieving our intentions, not just satisfying what
we impulsively do when given the opportunity. But he
talks about several things in this interview with Nautilus
that I think are really interesting. One of them is
that Williams makes an appeal based on the economics of information.
(34:11):
So he says, like, we used to live in a
world in which information was scarce and attention was abundant,
it was hard to find things out. Sometimes it was
hard to entertain and stimulate your mind. Your local newspaper
was a critical information resource because in many, many cases
it was the only way you could find things out
(34:32):
about the world. And one way I like to think
about this difference over time is that people used to
talk all the time about being bored. To me, boredom
is, in attention economics terms, a state of attention surplus.
In an information scarce environment, you have attention to spend
(34:53):
and nothing to spend it on. It's when the attention
economy is in a state of high demand and low supply.
Now we live in this completely opposite world where information
is more than abundant. Your news source, whatever that is,
is not your news source because it's the only
way you can learn about things. It's because it's your
preferred source among thousands or functionally infinite numbers of sources.
(35:16):
And the entertainment you turn to is not whatever is available.
It's not the one channel on your TV. It's what
you choose from a field of almost limitless options. Now
what's scarce is not information, but the attention you have
to spend on it. And increasingly the role of information
sources is not to provide you with information which you
could access a thousand different ways, but to focus your
(35:38):
attention to let you know which pieces of information are true,
which are false, which are important, which are trivial. My
own observation is that I think people often prefer certain
news sources over others because they learn which ones will
make them feel good, yeah, which ones are going to
confirm their view of reality. Yeah. And so Williams goes
(35:59):
on to describe this modern technological economy, the attention economy
as quote a denial of service attack on the human will.
So a denial of service attack is something you see
in cybersecurity. It's when you get a bunch of computers
or something bombarding say a website or some kind of
user facing service, a website or a network, and they
(36:21):
they bombard it with so much traffic that it can't
serve itself to legitimate users. So you try to go
to the website and it won't load for you because
it's getting so many requests for loading from all of
these fraudulent bots and stuff like that. So a
denial of service attack on the human will is the
idea that there's so much coming at you,
(36:42):
making demands on your attention, you can't really spend your
attention intentionally. Yeah, I mean, it's the very real scenario
of decision fatigue. At the end of the day, you've
made so many choices already, you cannot make
the simplest choice of, say, what do I want to
drink with my dinner? You know, it's like, I don't know,
is it fizzy water? Milk? But I cannot summon
(37:03):
the willpower to make this one final choice. And uh,
you know he touches on some of this in the interview.
One of the quotes was that that stood out to me.
They keep us looking and clicking. I think this wears
down certain capacities, like willpower, by having us make
more decisions. And certainly we've covered decision fatigue on
the show before. We talked about multiple willpower experiments. Uh.
(37:25):
So many of them seem to involve chocolate cake. The
most delicious chocolate cake you've ever seen in your life,
by the way. They'll show this to
somebody and then they'll have to deal with some sort
of additional threat to their willpower, some additional decision challenge,
and so so the idea here is that we deplete
our video-game-esque willpower bar until we have no
(37:47):
more choice. Do we end up clicking on that ad,
eating that cake, or making a digital purchase? And I
find it rather interesting that we take
for granted the ability to purchase items and have them shipped
to us at any given moment, you know, right from
our tiny pocket computers. There's this tremendous convenience in this, certainly,
(38:08):
but it also means that when you're at your weakest,
perhaps at the end of a long day, in a
moment of foolish pride or drunken confidence, you
can simply finalize the purchase of just about anything within
just a few key strokes. It's usually treated as a
source of humor that people say, like, I got drunk
(38:28):
and bought X on Amazon, but I don't know.
I mean, it is kind of funny, but it's also
kind of not funny. Like, you're in your house
and you have essentially like decreased your inhibitions to dangerously
low levels with the power to deplete your bank accounts
on frivolous purchases while you're doing this, and it need
not involve alcohol or any substance. It can just be
(38:50):
a matter of Yeah, at the end of the day,
my willpower was beaten down enough that I decided that
I deserved that Blu-ray of Screamers, and I simply
ordered it. Like, your whole day is kind
of building up to that moment when you finally break
and give in to not only your own desires,
but also the marketing that you are hit
with online. Another thing Williams points out that I think
(39:13):
is actually really worth considering is the way that the
media and attention economy landscape affects political values and changes
in society. Right. Yeah, here's another wonderful, uh slash horrifying
quote from the interview. Quote. Radio was a huge factor
in Hitler's rise to power. It's why he put one
in every house. I think that's an interesting comparison.
(39:37):
Marshall McLuhan, a Canadian media theorist, talked about this. He said,
when a new technology comes out and we still don't
know how to wrap our heads around it, there's an
initial period where our sensory ratios, our perception, is
reacclimating, a kind of hypnosis moment. He makes the point
that the hypnotic effect of Hitler's style of oratory was
(39:58):
amplified by the hypnotic effect of this new media,
which is a type of information overload in people's lives. Yeah,
Williams makes this fascinating connection between the technological attention economy
and the recent resurgence of authoritarian populism. Right. So, the
idea is that technology trains us to live by impulse,
battering down our long-term goal-driven and value-driven
(40:21):
behavior one distraction at a time. And what does it
do to your brain when everything you really want to
do with your time keeps getting interrupted by impulsively tempting
digital candy. Does it make you complacent with the idea
of impulsive decision making in other forms, even driving your
politics towards things that feel good in the moment, regardless
of whatever you think is really right in the long
(40:43):
term or in terms of your moral values. This is
on top of the effect of you know, we haven't
even really touched on the idea of these platforms rewarding
certain types of information that are often very negative in
their political consequences, like fake news being more viral
on social media than real news. Yeah, conspiracy theories
(41:07):
resonating with readers more than, perhaps,
a more stuffy breakdown of what we do know
and what we don't know. Right, Okay, well, I think
we should take a break and then when we come back,
we will discuss options for how to fight back against
this state of affairs. All right, we're back. Let's
(41:27):
get into it. What do we need to
do to aim our video drone cancer gun at the
TV screen that is threatening us? I don't know. I mean,
so there are several options we could look at broadly.
One of them is that we could hope the attention
monopolizers will realize what's going on and stop monopolizing our attention.
(41:48):
You know that they'll like Facebook and all these big
companies and Apple and Google. They'll say like, oh, people
aren't really getting the value that they want out of
all the time that they're spending with these apps and devices.
Maybe we'll just make them less compelling so people go
spend their time on other things. Let's make our
game less fun because people are playing
(42:10):
it too much. Because ultimately, the
sad part about this idea is that we're not dealing
with people. We're dealing with corporations, which operate like machines.
They're full of people. People make them possible, and in
many cases, there is a particular creator involved,
but that creator is only a creator. They are no
(42:31):
longer the master. Yeah, the corporations are mobilized by market incentives.
And market incentives currently in like an ad supported and
data collection supported model of the technology sphere, are going
to incentivize for keeping you glued to the device or
stuck in the app. They want time on site, they
(42:52):
want screen time, they want your eyeballs, they want attention
because that's how they make money. Now, this does make
you wonder, well, what if they had a different way
of making money? Could things change then? Like what if
technology largely turned to a paid subscriber model rather than
an advertising supported model. Then if it didn't matter how
much time you spent in an app or on a device,
(43:16):
would everything be okay then? Because then they wouldn't be
nearly as incentivized to keep you using it. But there's the
Netflix problem, right? That's a paid subscriber thing. But you
notice Netflix introduces things that seem like they're geared to
keep you in the app, even though it's a paid
subscriber model. What's causing that? Like the autoplay functionality,
(43:37):
the thing on Netflix where it starts playing a movie
even though you didn't click on it. Why does it
do that? I think there are maybe a couple of
reasons for this. One is that how much time a
user spends on an app is somewhat predictive of whether
they'll keep subscribing to it. So, even on a subscriber model,
if you don't watch, you might unsubscribe. So they want
(43:58):
you to watch so you'll feel like you're getting value
out of the app and stay a subscriber. It's one
thing to get you to binge watch one show;
they want you to binge watch multiple shows. They want
you to subscribe next month as well. Yeah, because if
you realize, hey, I'm never using Netflix, you probably won't
keep subscribing, right. Another thing is that the more you
use an app, the more data the app can gather
(44:20):
about you, and that data is worth money. If not
externally then certainly internally. What kind of shows are resonating
with our average viewer, or with certain demographics of viewers,
certain demographics of subscribers. Okay, so that's the first option,
hope they'll stop monopolizing our attention. That doesn't seem super likely.
Maybe it would work a little bit better if there was
(44:40):
more subscription and less ad driven support for digital content,
but it's not clear. The second option is maybe we can
hope that attention monopolizers will shift to monopolizing our attention
with things that are truly fulfilling rather than things that
leave us feeling unhappy, regretful, and hollow. I mean, think
about it this way. What if there was something that was
(45:02):
as impulsively addictive as your favorite social media app that
you spend hours just mindlessly wandering through, but it made
you feel as fulfilled as the stuff you really care
about doing. Like, it was full of intellectually stimulating
information and stuff that made you feel like, I'm really
getting value out of this, this is what I want
(45:24):
to be doing with my time. Sounds like you're describing
Stuff to Blow Your Mind dot com. I mean that's
actually what we hope. Like we are digital content creators,
we hope people will consume our content. But I hope,
I mean, I don't get the sense based on our
contact with listeners, that people consume us just mindlessly and
then they're really regretful later. I hope that's not the case.
What we hope is that we provide value
(45:46):
in people's lives. But increasingly there's this thing. I mean,
it sounds a little bit tacky to put it in
these terms, but increasingly we can't just market the
website and say, go check out this website. It has
our content on it. Uh, it has to go through
something like Facebook. Facebook is the way that people get
to your content. Yeah. And so we end
these episodes by saying, like check us out on social media.
(46:09):
Isn't that ironic? But I mean, like that is. That's
sort of like if you've got a store that's in
the mall, and you think it's good for people
to come to your store because you think we have
really good products and people get a lot of value
out of coming to our store, but they do have
to walk through the mall to get to us. Yeah,
and maybe they have to walk by six competitors who
are willing to pay for better positioning within the mall.
(46:32):
I mean you see that all the time. Yeah, So anyway,
can we hold out hope that maybe
everybody will try to do it more like this, that
that all these apps and these platforms and everything will
focus on making their experience something that's deeply fulfilling
and aligned with people's goals. I think we can't just
assume anything like this will happen. We can't hold out
(46:54):
hope for that. One thing I want to mention is
that there was an example of an announcement along these
lines earlier this year. Mark Zuckerberg claimed on a public
post on January 11th that, quote, one of our big
focus areas for 2018 is making sure that the time we
all spend on Facebook is time well spent. So he's
even using the tagline, the phrase time well spent. It
(47:16):
sounds like he's been influenced by Tristan Harris and these
other people who are making this argument right. And the
main way he says Facebook is going to do this
is by altering the news feed algorithm to reduce the amount
of stuff people see from pages and publishers and increasing
what people see from friends and family. And this works
on the assumption that people like spending time seeing posts
(47:37):
and comments by people they know, and this will leave
them feeling less regretful than if they were seeing posts
by you know, check out our insane videos, or by
a podcast that they listen to, say. So, to his credit,
Zuckerberg acknowledged in the post he said, now, I want
to be clear. By making these changes, I expect the
time people spend on Facebook and some measures of engagement
(47:58):
will go down, but I expect the time you do
spend on Facebook will be more valuable. So if we
take him at his word here, it sounds like
he's trying to say, I want to make my product
something that people get value and meaning from in their lives,
something that is more fulfilling to them. I don't know
if the strategy will work. I mean, who knows what
(48:19):
the real motives are. I mean, I try not to
be super cynical about things, but there could be a
lot of messaging reasons for putting up a post like this.
But I don't know if this will actually make things
all that much better. And then, on top of that,
even if it did make things better, the business incentive
for companies generally to do things like this is not there.
Like apparently, when Zuckerberg announced this on the post, Facebook
(48:42):
shares went down four percent. Yeah, because he might be the
monster's creator, but he is no longer the monster's master.
He can't dictate the hunger of this beast, at least
not for long. And I just keep coming back to
the idea of what does the corporation want? And not
just a finite and fixed thing either. We're talking about
a beast that swells and shrinks as
(49:05):
necessary to survive the changing demands of its economic, political,
and social environment. It's a monster that wants nothing short
of complete consumption and endless digestion of you, your loved
ones, and every generation to follow. It's the hungry, jealous
god that we could only dream of in ages past.
I want to start applauding, but it would sound kind
(49:25):
of sad if it's just one person applauding. So imagine
the crowd applauding. Yeah, Okay, Well, I'm being a little
negative and cynical about the whole thing. But if it's
a monster, maybe the creator can't control it. But if
I know anything about monster movies, it's that a hero
can sometimes slay it or drive it away. Right, you
gotta take defensive measures. So that's what we should talk about.
(49:46):
What can you literally do to protect yourself against having
your attention gobbled up by all of these digital slot machines.
One way you can protect yourself, of course, would be
to delete apps or block websites that you discover are
making you unhappy or making you spend your time in
a way you later regret. But that's not always practical,
(50:06):
is it. Like some people us included, need to use
social media apps for work, and sometimes you want to
have the option to stay connected to friends and receive
messages from family through social media, even though you don't
want to spend your time being mindlessly hypnotized through a
flow of content from it. Right, So, if you want
to still have access to these apps and platforms, but
(50:27):
to make them less effective at hijacking your mind and
gobbling up your attention, the Center for Humane Technology recommends
a few steps specific to mobile devices that will help
make them less addictive and less able to hijack your attention.
The first one: turn off notifications, if you haven't already
done this. Basically, unless it's something you really care about
(50:49):
knowing immediately when it happens, maybe like direct messages from
human beings, turn them off. Don't let that just go
to your phone whenever somebody at-mentions you on Twitter
or whatever. Because the notifications are the foot in the
door strategy, right, I can check that one notification. Not
only are they distracting, but they get you in there
and then they try to push you towards the sale
(51:10):
of let's get a couple hours from you on this app.
Another thing, this is a pretty interesting appeal. They say,
switch your phone to grayscale display. And I didn't
even know that was an option. Yeah, get rid of
the colors. Apparently you can go into the settings on
some phones and turn off the color so you just
see your phone in black and white or in grayscale. Well,
(51:31):
you can toggle it off and on right, So if
you want to watch a movie or look at a
photo or something like that, you could turn the color
back on. But if you generally have it in gray scale,
app designers have carefully researched how to use
color to arouse your attention and reward seeking behaviors. And
if you switch your phone to gray scale, it will
have less power to lasso your brain with like red
(51:53):
flag notifications and stuff. Another thing they suggest, sort your
shortcuts on the phone. You've usually got like a series
of screens. You got a home screen, you swipe over
to other screens, and then you've got shortcuts that get
you into your apps. You can choose where to put
the shortcuts and move them around. Right, So, when you
open your phone on your home screen, are the apps
(52:14):
that drain your attention the most efficiently right
there on the home screen? Maybe you should put them
in a place that's harder to reach, like sort them
into a secondary screen, or just deeper than that,
like store them six folders deep, in
a prison of your smartphone. Right, so you really have
to make a decision that you're going to go into
(52:35):
the app. You don't just mindlessly open it every time
you unlock your phone. You know, I did something like
this for a while with with one of my social
media apps. I deleted the app, but I still
could access it through the browser, so it didn't keep
me from checking it, but it made it just a
little more difficult for me to do it, and it
(52:57):
seemed to help, at least for a while. A
big thing, going back to our first episode here, that
I would recommend is putting your phone in a different
room whenever you can when you're in bed. Don't charge
your phone right beside the bed. Charge your phone in
a different room in the house. When you're reading, or
trying to do some work, or trying to watch a movie,
trying to do anything else that you just want to
(53:17):
devote your attention to. Don't put your phone beside you,
put your phone somewhere else. It's a nice idea,
but then you get into issues of like, well,
don't I need the phone next to my bed if
there's an emergency, or like for me, I use my
phone to play white noise every evening, so I've become addicted.
Granted, I could get a white noise machine
(53:38):
and get around that. Well, I mean it's
a question of do you find yourself having a problem.
I mean, if you just use it to play white
noise and that's helpful to you, then that's fine. Actually,
then the device is working the way it should work,
the way that it's good for it to work. It's
helping you get what you want. But if you find
that by having it there, you tend to open it
and you tend to start looking at stuff that's gobbling
(54:01):
up your time and attention in a way that you
later regret. That's when the problem comes in. And maybe it
should still be playing white noise, but from the other side
of the room or something. Maybe there are also
a lot of apps that are designed essentially as defensive
weapons against the addictive power of the other apps. Right,
Like we mentioned one earlier, Moment, which tracks what
apps you use and shows you, so you can be aware.
(54:23):
But there are a bunch of apps that do stuff
like this. There are apps that block your access to
certain websites, so you say, actually, I don't want to
be able to go to Twitter dot com within the
next few hours or something like that. And then
there are more recommendations you can look up if you
go to the Center for Humane Technology website. But we
should point out that these are mundane and specific defenses
(54:46):
against certain types of attention attacks from our current generation
of personal devices. They don't really address the larger problem
that we live in an economy that incentivizes companies to
get as much of our attention as they can. They've
come up with really smart, really effective strategies for doing this,
and their strategies for doing this are only going to
(55:07):
get better over time. And they infect so many areas
of everyday life that it's difficult to escape them without
cutting yourself off from mainstream technological society. I think one
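(An editor's aside on the site-blocking apps mentioned above: many such tools work by pointing distracting domains at your own machine through the operating system's hosts file. The sketch below is illustrative only; the marker string, function names, and paths are hypothetical, this is not the code of any real blocker app, and editing the real /etc/hosts requires administrator rights.)

```python
# A minimal sketch of one common site-blocker technique: redirect a domain
# to the local machine via the hosts file so the browser can't reach it.

HOSTS_PATH = "/etc/hosts"            # on Windows: C:\Windows\System32\drivers\etc\hosts
MARKER = "# added by focus-blocker"  # hypothetical tag so our entries can be removed later

def block_lines(sites):
    """Build hosts-file lines that point each distracting site at localhost."""
    return [f"127.0.0.1 {site} {MARKER}" for site in sites]

def apply_block(path, sites):
    """Append blocking entries to the given hosts file."""
    with open(path, "a") as f:
        f.write("\n" + "\n".join(block_lines(sites)) + "\n")

def remove_block(path):
    """Strip previously added entries, restoring normal access."""
    with open(path) as f:
        kept = [line for line in f if MARKER not in line]
    with open(path, "w") as f:
        f.writelines(kept)
```

In practice the commercial apps add scheduling, password-protected unblocking, and browser-extension fallbacks on top of ideas like this, but the core trick of making the distracting site just slightly harder to reach is the same friction-adding strategy the hosts describe.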
of the more painful things about this whole scenario is
that I feel like the more people grow up
within this world, you know, the more they don't
know what it's like to live without this current level
(55:29):
of overstimulation and hyper-choice and decision fatigue. For instance,
in my family, we limit my son's screen time as
much as possible, but we do let him watch
educational shows, play educational games. But we wonder sometimes if
he truly knows what it's like to be bored. And
even though you hear kids still use the word bored,
(55:51):
they still talk about being bored. But do they really
know what it's like to be left with only your
own thoughts to entertain you and pass the time? Are
they left with depleted skills for reaching out into the
real world around them or the inner world of imagination
to pass the hours of the day. I do worry
about this. I sometimes think about how I'm never bored anymore.
(56:11):
I remember being bored when I was a kid, but
now I've always got ten thousand things to do that
I haven't gotten to yet, and I can access them
at pretty much any time. So I mean, I just
never really experienced boredom. There's always like something I could
be working on, something I could be reading, something I
could be getting to that I've meant to get to,
(56:33):
and boredom might be important. That might be an important
cognitive space, something that matters for our cognitive health and development.
What about larger institutional changes, though? Yeah, that's something we've
got to think about. So here's one thing. Some governments
regulate the design of slot machines, right, which are brilliantly
created techno drugs that are addictive and serve the function
(56:55):
of draining people of as much money as possible, getting
them to play to extinction. Could we also regulate the
addictive properties of techno drugs designed to drain us of
as much attention as possible, to play to extinction of
our attention? Should the government regulate the way things can
hijack our attention? I'm not saying it necessarily should, but it's
(57:16):
worth considering the costs and benefits. I want to
read another excerpt from that James Williams essay, where he
mentions what we can do about this. He says, quote, First,
we must reject the impulse to ask users to just
adapt to distraction. We must also move briskly past the
illusion that media literacy will ever be enough. Nor can
(57:37):
we reply that if someone doesn't like the choices on
technology's menu, the only option is to unplug or detox.
This is a pessimistic and unsustainable view of technology. And
of course we can't expect the attention economy to fix itself.
We must then move urgently to assert and defend our
freedom of attention. So he ends up advocating essentially that
(57:59):
we need to create a discipline of studying freedom of
attention in order to come up with
the right philosophy, the right framework, and the right terms
in which to discuss this problem, so that it's not
just something that a few people in the technology ethics
sphere are observing, but that it becomes a part of
our moral philosophy, or even just a part of like
(58:22):
in the same sense that a medication has to pass
various trials and become approved by a governmental body before
it can be actually used as a treatment measure, to
what extent should we get into a situation where an
app or some other kind of program or social media
interface has to be cleared? It has to meet
certain thresholds of acceptability. I mean, it is difficult because
(58:46):
the ways in which these technological innovations hurt us are
not ways that are acute and easy to identify. It's
not like they cause us to become sick,
or cause us to die in the short term. Instead,
what they're doing is depriving us of the attention and
will to get what we want out of life. And
(59:08):
this is something that only gets recognized in the large
scale over long periods of time. Yeah, until suddenly you're
in a situation where you have to wage a Butlerian
Jihad against the machines. Right. Well, that is a
brilliant idea. So it goes back to the idea in
Dune that you know there there was, ultimately, at some
point in the past history of the Dune sci fi universe,
a war between humans and machines, not all machines, but
(59:31):
thinking machines, machines that were too smart and that had
hurt humanity's own sense of itself by recreating
humans in the image of machines, because they had
created machines in the image of humans. Yeah. And I
think the jihad aspect is key here because you know,
people who are maybe not familiar with the
(59:53):
Islamic use of the term, you might just think jihad
is war, jihad is some sort of violent act, But
jihad means a struggle against oppression, and it
need not be an actual struggle of physical violence
of any form. Yeah. So, I mean, I don't think it's
it's unrealistic or overwrought to think of our struggle against
(01:00:15):
attention fatigue and the design of these various devices
as being a sort of jihad, and to think of
it as a form of oppression. Now that being said,
obviously there's a lot of very real oppression going on
in the world that is UH, that is beyond the
scope of mere smartphones. But then it's not like there's
(01:00:35):
there's no interplay between the two because our our devices
are apps. We're talking about news feedes, we're talking about
political movements, we're talking about um social energy that does
impact actual physical oppression in the world and our understanding
of it, our knowledge of it, and our opinions of it. Yeah, exactly,
So there are two things that I'm trying to juggle
(01:00:56):
in my mind right now, and one of them,
this is a thing Tristan Harris points out, is that
we need to reject the myth that technology is always
just a neutral tool that you can use how you
want it. Technologies have content, and sometimes the content of
the technology has a net positive or negative effect on
us that's intrinsic to the technology itself. You can't just
(01:01:19):
wave your hand and say, well, technology is always
just neutral, it's how you use it. Some technologies really do.
They really are made in a way that gives us
things other than what we want, or affects us in
a way that we ultimately view as negative. But the
other side of this is that we don't have to
be technophobic. We don't have to say that technology necessarily
(01:01:40):
is bad or necessarily hurts us. Technology in general could
serve our best needs. It could be a thing that
helped us get exactly what we want out of life
instead of subverting what we want to get out of life. Right,
And I guess it's part of our job to think
about, like, how we can each individually work to try
to shape the world we live in and the technosphere we
(01:02:01):
occupy to be more like that. To steer
it towards, you know, a positive feedback loop with
our own goals and desires and dreams and interests,
rather than a place of perverse incentives that drives us
towards mindless consumption. Yeah, to use it as a tool,
(01:02:21):
to keep it as a tool, and to sort of
maintain it, to cultivate it as a tool. Alright, so
there you have it. Fight the power, I suppose.
Hopefully in this episode we laid
some kind of harrowing facts on you, but I think
we also provided some hope. We provided some tips, and
we would love to hear from everyone out there about,
(01:02:42):
you know, for instance, how it goes
when you start applying some of these tips to your
own use of social media and various devices in your life,
or if you have additional tips, additional strategies that you've
come across, additional thoughts on the topic in general.
In the meantime, head on over to stuff to Blow
your Mind dot com, where you will find all the
podcast episodes as well as links out to those social
(01:03:03):
media accounts. And big thanks as always to our
excellent audio producers Alex Williams and Tory Harrison. And if
you want to get in touch with us directly, to
let us know feedback on this episode or any other,
to suggest a topic for the future, or just to say hi,
or to tell us your personal story, you can email
us at blow the Mind at how stuff works dot
(01:03:24):
com. For more on this and thousands of other topics,
visit how stuff works dot com.