Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, welcome to Stuff to Blow Your Mind, a production
of iHeartRadio's How Stuff Works. Hey, welcome
to Stuff to Blow Your Mind. My name is Robert
Lamb. And I'm Joe McCormick. And we're back to a
subject that may be familiar to you if you've been
(00:20):
listening to the show for a bit. It's the next
battle in the war against the Machines. That's right, you know,
as we've already discussed on the show, and I think
it's obvious to most of our listeners, the wonders of
interconnectedness that the web has given us have
also unleashed some less-than-satisfying realities. You know, we
worry about smartphone addiction and about the degree to
(00:42):
which these devices and the many apps they have
on them have been engineered to gain our attention. And
then there's been growing attention given to the role social
media plays in, you know,
corporate and state manipulation, endangering democracy, our personal freedom, and
our happiness. We've recorded a few different notable episodes
(01:03):
on these topics. Right, so, I know you and Christian
a while back did an episode. Now this was years ago,
so the research has come a long way since then,
I think. But you did an episode on what the
measurable psychological effects of social media were, and
I think this is still a developing field. Like
I notice a lot of conflicting findings when I read
(01:24):
up on this, like, is social media making us more lonely,
more depressed? Whatever? It seems like the answer to that
question is "it's complicated," right? And I think we'll
discuss this a little later on too, that, you know,
it also depends on how you're using social media, what
your role is, whether you're using social media as part of
your job, etcetera. Yeah. Then we also, you and
I, Robert, did a couple of episodes within the past
I Robert did a couple of episodes within the past
(01:46):
couple of years called The Great Eyeball Wars that were
primarily about the attention economy, about the fact that our
devices, and especially the social media platforms on
those devices, but other platforms also, make
their money by getting you to use them and pay
attention, through, you know, advertiser dollars. Like you are the
(02:08):
product on these platforms. You're not the customer, and
your attention is what is being sold to the advertisers.
They're addicting us, and they're addicting us on purpose, pretty much, right? Like,
one of the I think the handiest metaphors is that
a social media app on your smartphone turns your smartphone
into a tiny slot machine. Yes, but while
(02:29):
a Vegas slot machine is programmed to steal all
of your money, this tiny slot machine
is programmed to steal all of your time, your time
and your attention. Yeah, it wants you using it as
much as possible. And the numbers on this are kind
of freaky. Like, when you actually measure how much
time people spend on their smartphones, especially looking at apps
(02:52):
like Facebook or, you know, other social media apps, they
don't usually like what they find out when those numbers
come in. Right. And, you know, we do have tools
on a lot of our smartphones now to track our
screen time, to keep tabs on it, and even set
up little barriers to excessive use. But as far as
I know, most of that stuff is still very voluntary.
(03:15):
And they're not on as the default settings on
a phone, or certainly when you get a new phone,
you don't tend to bring over your sort of
legacy settings from the former phone. So, you know,
I think there's a huge argument to be
made that these companies are not really, you know,
going in head first in the battle to reduce
(03:35):
your screen time. Right. And then also we did
a more recent episode called The Doppelganger Network. The
jumping-off point for that was an
article that we read by Robert Sapolsky, the neuroendocrinologist
at Stanford, and he made
a comparison between the effects of social media use,
and digital media more generally, and psychological conditions
(03:59):
like the Capgras delusion that cause this rift between
recognition and familiarity in the brain. So it creates a
kind of strange, alienated world where,
normally, you'd pair these experiences of cognitive recognition, you know,
knowing that you recognize something you see, and the feeling
of familiarity with it, but that is kind of torn
(04:20):
asunder by the dynamics of social media. Yeah. And
so, you know, the idea, this
argument that social media is potentially dangerous,
that it has ill effects, this
is probably not new. I think everyone's heard somebody argue
it to some extent, and we've even seen people make
fun of the argument. Like, for instance, one image that
(04:42):
frequently makes the rounds is an old-timey
photograph of, like, a subway car or a
train full of individuals, everybody reading a newspaper, everybody's face
hidden in a newspaper, and then using this to make
fun of the argument that you know, oh, everybody's just
plugged into their phones, they're not connecting with each other,
as if to say, well, this is exactly the same thing,
(05:04):
but it's not. I think it's definitely worth driving
home that the type of psychological involvement that's going
on with a social media platform on a mobile device
is entirely different than what you would find by just
sticking your head into a book or a newspaper. A
book or a newspaper is not a real-time feedback device. Yeah,
(05:28):
I mean, I think one thing that that
kind of poking fun at this argument does reveal
is that you can take the argument too far and
be kind of totalizing about it. Because it's important to
recognize that the reason people use these services is that
they do provide something that people want. You know, people
through social media are able to keep up with friendships
(05:48):
that might have fallen by the wayside otherwise, you know,
maybe long-distance friendships. You know, these
platforms and these devices do enable all kinds of things
in people's lives that are valuable and good. Right. And
you can certainly go overboard into kind of a Luddite
response to it and say, well, the internet is bad
(06:08):
or technology is bad. And I feel like everybody's
gonna have varying opinions. At one extreme, you may
be convinced that social media is, you know,
leaning more and more towards the self-digestion of
human civilization. Or you may see nothing wrong with
social media. You may say, look, Robert and Joe,
I use my Instagram, I use my Facebook.
(06:29):
I keep up with a few friends, you know,
a few celebrities or a few bands or what have you.
I get the news through it, and that's it. And
it doesn't, you know, greatly improve or harm
my psychic experience of reality. Yeah, and if
that's what you believe, our goal is not to convince
you that, no, you don't understand, it's actually ruining your life.
(06:51):
I mean, you may very well have a perfectly healthy,
limited relationship with social media, and you may be one
of the people who's getting more out of it than
it's getting out of you. But for a lot of people,
I think we do want to make the case that
that is not what's going on. Right. I would say
that, for most of us, even if you can say,
you know, honestly, that you have a healthy relationship with
(07:12):
social media, you can probably not say the same for
everyone in your circle. There's probably somebody, or
several people, who display signs of unhealthy usage. So
in this episode, you know, we're gonna look
at some arguments against social media. Specifically, we're going to
(07:34):
be discussing an author by the name of Jaron
Lanier and his two thousand eighteen book Ten Arguments for
Deleting Your Social Media Accounts Right Now. It's a great title,
it gets right to the point. It does. And
it is a fabulous little book that Joe and I
both read for this episode. It's short,
it's something on the order of, what, a hundred and
(07:55):
forty-six pages long. It's extremely accessible. Like, he did not
make this a, you know, high-level computer
science intellectual argument. It is written so that I think
pretty much anybody could understand it. It's very accessible, it's
very ground-level, and I think it makes a pretty
compelling case. His argument is
(08:17):
that social media, in the business model that exists today,
is doing more harm than good, and our best way
of fighting that harm is to have everybody get off
of these platforms, because this will force the companies
involved to actually implement changes. Right. And so
his argument is not one against technology, and it's
(08:40):
not even necessarily one against a certain
form of social media. It's against a particular business model
that powers social media. And that business model, I think,
is what he ends up calling the BUMMER business model.
It's a business model that is paid for by behavior
modification. Yes, BUMMER, B-U-M-M-E-R,
(09:03):
which Lanier says stands for Behaviors of Users Modified, and
Made into an Empire for Rent. It's clever having an
acronym, because it sticks with you. And yeah,
it's clever and it's a little bit funny, and
that can be said for the entire book. Despite being
a fairly serious topic with some potentially serious ramifications for
(09:24):
humanity at large and for individual, you know, self-worth,
it is a humorous read at times, and it
is fun to read. So I can't recommend it
highly enough. It's a book you can read on the
train, on the plane, on the toilet. You know,
it just makes for a great but important casual read. Yeah,
and we've talked about maybe getting him on the
(09:46):
podcast sometime soon, and that would be great to have
a conversation with him. But today we just wanted to
talk about maybe a couple of the arguments that he
brings up in the book and our thoughts about them. Yeah,
we're not gonna attempt to regurgitate the entire book, because
the book already speaks for itself. So
we'll start, though, by just talking about Jaron Lanier himself.
(10:07):
You might be familiar with him already, perhaps you're not.
Perhaps you've just read his name and thought it was
pronounced Jared Lanier, which is how I've been pronouncing it in
my mind. That's how I've said it on the show
a lot. But anyway. He is a scientist, a musician,
and a writer. He is a major figure in the
realm of virtual reality, having founded the VR company VPL
(10:29):
Research in the nineteen eighties, and while he didn't coin
the term virtual reality, which is generally attributed to the French
playwright Antonin Artaud, he did popularize it.
He helped create the first commercial VR products and introduced avatars,
multiperson virtual world experiences, and prototypes of major VR applications
(10:52):
such as surgical simulation. He was involved in the creation
of the Nintendo Power Glove. Yeah, now you're playing with
power. Which, you know, I have to say, the
Power Glove, I never had one as a kid, but
I saw people with them, I knew people who had them,
and it it was this, this instrument of wonder. I
don't think it was all that practical as a gaming device,
(11:12):
but I think it inspired a lot of people. And
I also love seeing, especially in nineteen nineties science fiction, where
they have reused a Power Glove as part of, like,
a cybernetic, you know, outfit for somebody. Anyway, he was
also involved with the creation of the headset,
apparently, for the nineteen ninety-two film The Lawnmower Man. So you're
really bringing out the hits here. Yeah, And I think
(11:34):
this is probably not the stuff that usually gets highlighted
about his career, but it's, you know, some of
the stuff that I think some of our listeners might
be familiar with. Exactly. I should point out that he's
not credited on Lawnmower Man, but he does get a
thanks on the far superior sci-fi work Minority Report.
But more to the point, though, he's an
(11:54):
author of several books, such as the two thousand six
Information Is an Alienated Experience, You Are Not a
Gadget: A Manifesto from two thousand ten, Who Owns the
Future? from twenty thirteen, and Dawn of the New Everything from twenty seventeen. Now,
I tend to think of him kind of as a
technology philosopher, and I really like a lot of his
(12:16):
approach because it's got a healthy skepticism about over hyping
technology and what it can do. And at the same time,
he's not anti technology. He's clearly somebody who loves digital technology,
loves computers. You know, he's worked with them his whole career,
and so he doesn't end up saying throw your smartphone
in the fire, flush it down the toilet, smash it
with a hammer. It's not an anti technology message. He
(12:39):
actually has a very specifically tailored message, trying to identify
exactly what it is about the social media platforms,
as they exist today, that's causing problems for us and
for our society, and how they could be changed. Yeah,
ultimately he's an optimist. Like, he's presenting
an optimistic view of the future, certainly highlighting problems,
(13:01):
but discussing how we can address them, which I love.
I feel like I've spent too much of my life,
you know, looking at more pessimistic views of reality and
dystopian views of reality, and I've gotten to the point
where those just don't serve me anymore. So I far
prefer reading an author like Jaron Lanier. So the
(13:24):
book in question, I have to talk about the cover
of it, because the cover is very simple, you know,
just black and red text on a white background
and a silhouette of a cat walking off the cover
of the book. And the cat is a central metaphor
in the book, right, because as he points out, you know,
as much as we love dogs, I love dogs.
You do. You do love dogs. And I love dogs too, I just
(13:46):
don't own one. But as much as we
love dogs, we domesticated the dog; the cat arguably domesticated itself.
You know, it interacts in our lives more on its terms.
You are at great pains to attempt to train
the cat, as anyone who's ever, you know, tried
(14:09):
to shoot a film about cats can attest. So, Lanier argues that social media is
essentially turning us into well-trained dogs. But we should
really strive to be cats, able to dictate our involvement
in the relationship at hand, to scratch the hand that
feeds us, if we so wish, to sleep wherever we choose,
refuse food, walk on all the furniture. Ultimately, we should
(14:30):
want to be cats. We don't want to
be social media's dog. Yeah. And also, I mean, while
he's advising people to quit social media, he's making an argument,
but it's not a totalizing argument. I mean,
he realizes that different people are in different circumstances,
that it's not a choice that will work for everybody. Right. Like,
he's very clear on the fact that quitting
(14:52):
social media is a privilege and not everybody is able
to do it. A lot of people, you know,
own a business, or part of their job
entails them using social media. I know some people like that,
and they're just trapped in it. Like,
you just can't walk away from it. Or
you might be shackled to it more socially, like, well,
if I stop using, you know, Facebook, how am
(15:13):
I gonna connect with my friends who all
live in another city, when I just moved to a
place where I don't know anybody? You know, there are
all these arguments to be made. And so he's not,
you know, drawing this firm line in the
sand and saying, you know, the winners over here, losers
over there, or anything of the sort. And
he's certainly not arguing that, you know, we should we
should all go and make a big dramatic to do
(15:35):
about quitting social media either, because I think
we all see that occasionally on our feeds too. Oh,
that's the most embarrassing thing, when you see somebody post
a lot about how they're quitting and then they quit
for a week and then they're back. Yeah. Yeah. And
in fact, I actually shared something about
this book on my private social media feed, and
(15:57):
immediately, like, somebody was calling me out for having
posted about quitting social media on social media. But of
course, I think, what's wrong with that? I mean,
where are we going to talk about leaving social media
but on social media, to a certain extent? And
also, well, I mean, this episode where we're discussing his
arguments, again, you know, we're not necessarily saying everybody's got
(16:19):
to get off social media, but I do think these
are some interesting arguments, very worth considering. We're discussing
these on an episode that will be promoted on social
media, because that is part of the distribution business model
of this podcast. Exactly. It's like, this is, you know,
one way that we reach listeners, and if we don't
do this, it won't reach as many listeners. So I
(16:41):
don't know, how do you balance that? Like,
are you actually doing better if you say, well, let's
not post the episode on social media so fewer people
will hear it? Yeah, thus is the world. Thus have
we made it. But at any rate, Lanier also ultimately
says, like, hey, I'm not even saying quit social media forever.
And, you know, because ultimately he's hopeful
(17:01):
that social media can be corrected, that we can come
back to a version of social media that is
not harmful to us in so many ways. And then
also he's saying, like, you know, quit for a while
and come back. That's the only way you'll have
any kind of, like, insight into what it's doing to you.
Like, this will help give you, you know,
the vantage point by which to understand the interaction
(17:24):
between your life and these BUMMER systems. All right, well,
maybe we should take a quick break and then when
we come back, we can discuss a couple of the
arguments from the book and our thoughts about them.
Alright, we're back. So again, the book that we're
discussing is Jaron Lanier's Ten Arguments for Deleting Your
(17:44):
Social Media Accounts Right Now. You're probably wondering, what are
those ten arguments? We're not going to regurgitate all ten
arguments here. If you want to know what they are,
you should pick up a copy of it, because, without
even opening the book, all ten are listed on the
back. Right, which is handy. You know,
you can instantly see what you're in for.
We're gonna be talking about, I think, basically four of
(18:06):
the arguments and discussing them a bit here
for you. Yeah, some in more depth than others. Now,
one thing that we should talk about upfront, because it's
sort of a foundational argument that feeds into all the others,
is this point that, in Lanier's words, due to social media,
you are partially losing your free will. So, one
of his core arguments on which many of the others
(18:28):
rest is that social media is at heart a mass
program of behavior modification for rent. That is how these
companies make money. So if you're Facebook, the way that
you make money is that people pay you to have
some kind of influence on users of Facebook, and that
(18:48):
influence could be a very traditional, normal style of advertising,
the kind of thing that, you know, happens everywhere
and most people aren't bothered by, right, because it's clear
what's happening. You're just seeing an ad for a product
that somebody thinks you might want, and, you know, there's
the ad and you might go buy it. We're not generally very bothered by that. Right. It always reminds
(19:09):
me of the moment in Futurama where Fry encounters,
in the future, an advertisement in his dream. That's a
little creepier. We find it intrusive. And he says, you know,
we didn't have that in my time. We just had advertisements,
you know, all over the place, in the sky,
on billboards. I mean, certainly, even
without social media, we live in an age of
just ubiquitous advertising. Yeah, and, you know, this
(19:33):
is one thing that he does sort of attack:
the problem that the web arose on an advertising pay-for
model. You know, back in the early days
of the web, there was this idea that everything needed
to be free to access. You couldn't charge people to
get stuff on the web. But then how do you
pay for producing that stuff. Somebody's got to make it,
(19:54):
you know, they've got to get paid somehow. So
what happened? Well, you'd pay for it by showing advertising
along with the thing, and the advertisers would pay for
what you're seeing. Right. And of course, again coming back
to podcasting, we're not blind to the fact that that's
essentially what you have with this podcast. This podcast is
provided to you for free, but you have to put
up with advertisements. Right. Now, I don't really mind that
(20:15):
from a podcast point of view,
because when I listen to podcasts and when I make
a podcast, I generally think the advertising that's happening there
is fairly straightforward. It's pretty clear what's going on. Somebody's
pitching you a product. I mean, likewise on television. Of course,
television started off with the same model, and he discusses this.
You know, it's like here's the signal, here's some programming,
(20:36):
but here are also some advertisements. But generally speaking, without
getting into some of the, you know, trickier forms
of television advertising and product integration and so forth, you're
still dealing with a situation where it's like,
I'm watching a show. Okay, now I'm watching an ad. Now
I'm watching the show again. Right. But even
with other things, you know, like sponsored content and
(20:57):
stuff like that, I mean, I think there's a
big difference between what's clear and what's sneaky. I think
generally people tend to be okay with advertising when it's
clear what's going on, when, you know, they're told who's
paying for what they're seeing, and it's clear what the
person who's paying for what they're seeing wants them to do.
(21:18):
Usually it's like, buy my product, or become a member
of my service, or something like that. That's
the kind of thing that I usually feel fine about,
that most people tend to feel fine about. What's going
on with social media, according to Lanier's argument, is that
there is a much sneakier, more subtle, and perhaps more
sinister thing happening, which is that our behavior is being
(21:41):
modified in ways that are not clear, that are not
obvious to us, ways that we're not
aware of, and we oftentimes, in fact most of the
time, don't know who's doing it. Right. And I think
we all have those moments using a product like Facebook
where something, an advertisement,
or perhaps even a post made by a friend, will pop up,
(22:04):
and you'll stop for a second and wonder, why was
that served to me? Why am I seeing this? Yeah,
what have I done? What sort of interactions
on my part, or, you know, demographic information on
my part, has led to this being put in my face?
But there's a good chance that it's either
serving, like, the supply model of Facebook, which is
(22:26):
trying to keep you engaged on the platform for as
long as possible, so that's one thing. They just
want to keep you on there so they can modify
your behavior more, gather more data about you, show you
more ads. They might be
just showing it to you because their vast collection
of data shows that when you see stuff like this,
you use Facebook for a longer period of time, or
(22:48):
you log back on more often later in the day.
So that might be one thing. But the other thing
might be that you're seeing that thing because somebody has
targeted you for some kind of action. They want to
generate some kind of effect. And this could be, like,
a "become my customer" kind of effect, or it could be
something else. It could be an effect like, we want
to make people stay home instead of go out and
(23:11):
vote today. Now, at this point, I do want to
drive home as well that one thing that Lanier is
very clear on is that he is making no argument
that there is like a room at Facebook or Twitter
or wherever where like evil operations go on, that there's
like a sinister cabal in any of these organizations that
are sitting down and saying what can we do to
(23:32):
destroy the human soul, or anything like that? Right, there's
not a mwahahaha committee. Instead, well,
I mean, there are decisions being made by people about
what types of incentives will be algorithmically optimized, and
so that is something where it's not like the humans
who operate in these companies don't bear responsibility. They do,
(23:55):
but they're generally not trying to ruin
the human race. They're not trying to say, what's
the worst thing we can do to our users? Right?
That would just be a simplification of something
that is a lot more complicated, and it's taking place
not on the scale even of an individual or individuals,
but on the scale of a corporation. What
Lanier argues is that bad business incentives which prioritize behavior
(24:19):
modification are leading to the creation of algorithms that sort
of automatically commit these behavior modification schemes. And so,
behavior modification can be linked to traditions in the behaviorist
school of psychology, which flourished in the twentieth century. We've
talked on other episodes of the show about like B. F.
Skinner and behaviorism. Behaviorism sometimes gets demonized, and there are definitely
(24:43):
really good reasons to be historically critical of the behaviorist
trend in the history of psychology, but it also wasn't
entirely wrong. Like, behaviorism, I think, could be credited with
trying to make psychology a more objective science. But
also, you know, a common criticism is that it sort
of treats the human brain like a black box. You know,
conditioning goes in, behavior comes out, and whatever happens inside
(25:07):
just isn't really important. But one thing that it did
show is that if you treat the human brain like
this kind of like mystery machine where you just kind
of like put conditioning in and keep calibrating until you
get the output behavior you want, you can actually change
behavior in an extremely effective way. Like, you know,
behavioral conditioning can be very powerful. And Lanier's argument
(25:32):
is that modern social media is sort of almost a
perfect vehicle for refining behavioral conditioning because it can collect
extremely minute data about you, and lots of it,
because of course it records everything you do, all the stuff you
do on social media. There's actually a lot of things
(25:52):
you do on social media that they can learn a
lot from, from your brain states, to where you are
through location tagging, to what you spend money on and,
you know, your linked accounts, to the words you use
reflecting your moods. Like, they know a lot
about you, and about how what they show you affects
what you do. Right. You know, I would be loath,
(26:12):
perhaps, to say that they know us better than we
know ourselves, but I feel like these platforms often
know us better than we are perhaps
prepared to admit to ourselves. Yeah, well, they know things
about you that are different from the way you think
about yourself. They know about you from a behavioral conditioning
standpoint where they can track in a really minute way
(26:33):
what conditioning goes in and what behavior comes out. We
tend not to think of ourselves that way. We tend
to think of ourselves from the inside out. We think
about our own mind states. You know, we tell a
narrative about our behaviors in which everything is rationalized to
make sense. So we often feel kind of demeaned
when it's suggested that we are vulnerable to behavioral conditioning.
(26:55):
It's like, no, no, no, I'm not. You know, it's
like how people think advertising doesn't work on them. You know, like, no,
I'm not more likely to buy a product because I
saw a commercial for it. But, you know, you probably are.
I mean, we just tend to think we're more mentally
independent of the effects of stimuli than we actually are.
Or we will criticize it and say it doesn't affect
(27:17):
us on one platform, and then on another we'll celebrate
it. Without naming any names, one might use a particularly popular,
you know, music streaming service, and you might go, wow,
I really love the algorithm on this thing. It really
knows what I'm into. Yes, it keeps giving
me musical suggestions that are totally on point. Yeah,
and you have to stop sometimes, like, well,
(27:40):
I guess that's a good thing, but is it, really? Yeah.
So social media platforms can gather all this data
about you, but they also have extreme psychological power over you.
I mean, especially the big one to think about is
Facebook. It's just, like, this almost perfect machine for maximizing
potential in behavior modification. It can make you feel and
(28:00):
think what it wants with astonishing effectiveness, just by showing
you certain things that it has figured out that, when
it shows these things to people like you, you tend
to react by behaving a certain way. And then of
course it can just keep calibrating those efforts more
and more finely, with extremely high-quality feedback, based on
(28:22):
the way it tracks your behavior in all the ways
we mentioned earlier. So this is his basic argument,
and you can see that companies like Facebook are interested
in always getting tighter and tighter control of the
data and feedback about you in these sorts of ways.
I was just reading a story the other day
in the MIT Tech Review about Facebook being
(28:44):
involved in funding research on a wearable headband that could
supposedly read your thoughts. Now, I don't think we should
get overly alarmed about, like, this one particular news story,
because this kind of technology is probably very crude at
this point. You know, it doesn't tell you a lot
today. We're not going to go from, like, zero to Black Mirror
in a year on this particular front. Right. But I
(29:05):
do think the fact that Facebook is funding this kind
of research should tell us something. They want to get deeper,
They want to go further. They want to get closer
and closer to your brain to know exactly how you're
reacting to things in real time all the time. They
want, sort of, like, infinitely precise and finely
(29:26):
calibrated data about you. And why would they want that?
It's because it gives them better control over what you do.
And that control can be rented out to sponsors. And again,
those sponsors could be fairly straightforward. They could be
somebody trying to get you to buy their brand of shoes,
or it could be a government trying to control what
(29:47):
you do on a certain day, or or control how
you feel about a political movement. Absolutely, and and that
certainly they did. The involvement of of of other states
or operator is acting on the part of on the
behalf of other states has has certainly been in the
news a great deal recently. Well, yeah, it's very strange
(30:07):
that, like, I don't know if people would have predicted ten years ago. Maybe they did and I wasn't aware of it. That social media would start to become a major statecraft and national security concern. Yeah. I mean, well,
I think we were all probably in, you know, the same boat. You know, for the most part, when social media started up, it just seemed
(30:30):
like a thing that was a fun way to connect
with a few other people that you knew and maybe
meet some new people. Yeah, and it was, you know, for the most part, just you and friends and potential new friends on there, and your
parents weren't on it yet. Major politicians were not on
it yet. Major governments were seemingly not involved yet. Well,
(30:51):
I mean, I don't want to demonize it too much, but I mean it's kind of the addiction model, right. It's like, you know, your first dose of this drug is free, maybe your first several doses are free, and you get hooked. And then what Lanier talks about, a big issue going on with these companies, is that there is this problem of network effects. Right. Network effects are the
(31:15):
thing that happens in digital media businesses where people tend
to get locked into a service, and once people are locked into a service, it's very hard for there to be an effective competitor to get people to move over there. People are already on Facebook. If you tried to start a better Facebook today, and people have tried to start like nonprofit Facebooks, nobody goes over there,
(31:38):
and it lasts like an afternoon. Yeah, you're already there.
People are on Facebook because that's where their friends are,
and that's where everything is happening. It's
network effects that keep people locked in. Once they're already there,
and once everybody's there, you can kind of like lock the doors and then start doing whatever you want inside. And you can check out any time you like, but you can never leave. All right, so we're
(32:02):
skipping around a bit. But another argument, basically
the fifth argument that Lanier makes is that social media
is making what you say meaningless, and the core argument
here comes down to context. Now, entering into this argument,
I personally thought about the rather simple example that I'm
sure many of you are thinking of right now:
(32:24):
an ambiguous text message or email that is taken out of context, because there are limitations to what we can typically convey in these, you know, generally short form communication formats. An emoji can only do
so much to provide context to what you're saying. You
ever notice how if you listen to a friend of
(32:44):
yours talk about something, it sounds normal and fine. But
if you listen to a snippet of a stranger's conversation, they sound like a complete moron, and it's like embarrassing, or at best like a cryptic alien, where you're just like, what was that? I'll never know. But you know what I mean? And it's not like there's something wrong with people
(33:07):
you don't know as opposed to people you do know.
It's that when you know somebody, you have context based
in their personality and your relationship with them. What they're
saying makes sense to you because you know them and
you know how they generally think about things and all that.
If you didn't know this person and you just happened to overhear them, you know, you walk
(33:28):
by on the sidewalk and they were saying the same thing that would normally sound normal to you as part of a conversation, you might think, like, wow, what a freak. I know, people probably think
the same thing about me all the time. Oh, I'm
sure people think that about me. I mean, I think about this all the time. Like, I'll be out somewhere having a conversation about Highlander II: The Quickening, and I'm like, what do I sound like? What kind
(33:51):
of idiot do I sound like to somebody who doesn't know me? I mean, maybe I sound that bad to people who do know me too. But no. Alright,
So anyway, coming back to Lanier here, some of this applies to his case, but basically he's saying that with context, or the lack of context, you know, it comes down to the lack of individual
(34:12):
control over context on social media. And he points to two extreme examples of this online to sort of,
you know, better illustrate what's happening elsewhere. So, like the
two extreme cases would be when you have legit ads popping up on, say, terrorist recruitment videos on YouTube,
(34:32):
which was a problem at least early on. You would have a legitimate advertiser and ultimately illegitimate content or an illegitimate user, and those would be matched together, and the context would be, you know, accidental and terrible. Yeah, platforms are juxtaposing content without understanding
what that content is, and it very often doesn't reflect
(34:54):
well on any of the content, or certainly on some
of it. Right. And then the other example he brought up were when you have images of, say, women and girls that are sexualized or incorporated into violent media without their consent. Again, it's horrible,
and it is an extreme example, but he argues that
(35:16):
these extreme cases are possible because social media in general
robs us of control over context. When we express ourselves online,
we have no idea how that expression will be presented
specifically to anybody else. How often is there some public
controversy about, like, a tweet, where, you know, somebody
(35:40):
is like, hey, I found a tweet by X celebrity, you know, where they said something that looks really bad, and then that person defends themselves by saying, but you took it out of context. And what you really feel about this, of course, is going to vary highly case by case, because maybe the context changes the meaning of it and maybe it doesn't. But when it's presented as a snippet, that individual doesn't have control of the context.
(36:04):
Maybe you have control of the context, but they don't.
It's similar, and we certainly see this in our political cycles as well. You know, something that a candidate said or wrote ten years ago, twenty years ago, what have you, is taken out as a snippet and presented often either without
(36:24):
context or without complete context as to what they were talking about. And you know, in some cases, I mean, anytime you're quoting somebody, you are literally taking them out of context. There are ways that being taken out of context can be pernicious, and then there are, I mean, totally normal ways to do it too. But then again, when
(36:47):
something is taken out of context, there's kind of implied by that statement that context is usually in place. It's like saying, who left this toy on the kitchen floor? Let's put it back where it goes. You don't want to live in a reality where the toys are always on the kitchen floor, in which context is always out of place. And so
(37:08):
that's ultimately what he's getting at with social media. I'm
gonna read just a quote from his writings here, quote,
we have given up our connection to context. Social media
mashes up meaning. Whatever you say will be contextualized and
given meaning by the way algorithms and crowds of fake
people who are actually algorithms match it up with what
(37:29):
other people say. And he continues, speaking through social media
isn't really speaking at all. Context is applied to what
you say after you say it for someone else's purposes
and profit. So essentially, we have surrendered context to the
bummer platform, he's saying, rendering communication quote petty, shallow and predictable,
(37:51):
and as such, only the extreme voices, the worst of us, the loudest, the, you know, most acidic voices in our culture are going to be the ones that
rise up. Oh yeah, I mean, this is a whole
other point he makes that I guess we're not going
into in depth, but I mean, he argues that these
platforms necessarily promote the worst voices because the worst voices
(38:15):
tend to drive engagement, and the platforms want to drive engagement.
What gets people using the platforms more, keeps people glued
to Twitter, glued to Facebook. It's like, whatever gins up the most negative emotion, right. And one quick word on engagement. One thing that he hammers home a number of times in the book is that when you hear the word engagement used in a social media context, what we're talking
(38:37):
about is manipulation. So try to think about that next time, if you're attending a meeting or reading an article, especially a pro social media argument about engagement, just switch it out for the word manipulation and see how the taste fits you. But call it engagement, call it manipulation, whatever
(38:58):
you like. It ends up creating the environment where everybody
has to play that game in order to be heard.
And of course that means it changes the way that, say, legitimate journalistic publications have to play the game. You know, you have to lean into even more ridiculous headlines, for instance clickbait headlines, in order
(39:20):
to get that precious engagement, and that can have an eroding effect on the institutions of journalism themselves. Oh, absolutely. I mean, that also goes with, just like, how the most toxic voices, or whatever voices gin up the most negative emotion, tend to be promoted on these platforms because they
(39:40):
drive engagement. The same could be true of topics. So
like you might not say that, well, my voice is toxic,
but like, whatever topics get people the most, like upset
and worried and aggravated and angry and all that, those
topics are going to be favored by algorithms that are trying to increase engagement. So, you know, I
(40:02):
think a lot of times publishers of online media know that there are certain topics that get people really upset, and those will be the highest performing articles and videos on social media those days. Yeah. Like, think of chemicals as engagement, like, which chemicals are most engaging when you get them on your skin? You know, I instantly
(40:22):
think of the ones that are going to sting, right, and suddenly I can't think of anything but those stinging spots on my skin. So, you know, again, one
of the things that he comes back around to too
is that, you know, in this context situation, what you are saying is only valuable or relevant insofar as it serves the platform. And again, you are
(40:43):
not the customer on this platform. You are the product. The customers are those corporations or even state players that have the money to pay into the bummer system. Yeah.
Now I was glad to see that he thinks podcasts
have not been ruined yet. Yes, yes. So there's a whole section in this argument where he
(41:04):
goes on to discuss podcasts, and he makes
an argument that podcasts are a rare area of our
media that are not bummer yet. But he does dream
up and describe an absolute nightmare scenario for the future
of podcasts that I hope never comes to pass. And
while I was reading it, I was literally breaking out
in sweats. Yeah. I mean, because think about what we
(41:27):
do on the show. You know, Stuff to Blow Your Mind is full of our personalities and our context, and we have the time and the space to present topics and ideas, to present our takes on topics and ideas, and a chance to share our personalities with the listener in a continuous episodic manner. And as a result, you, the listener, you kind of know us. You know,
(41:49):
when we get something wrong, you have the context to understand what's going on, generally, you know. And what we do on this show is not easily reduced to sound bites. That's true. And this has been
a problem before, right? Yeah, you should trust us on this, because every now and then, not currently, but it did happen with a fair amount of regularity, someone will come
(42:10):
to us and say, we need some sound bites. Yeah, give us fifteen second clips of your show to show what the show is really about. To show what the show is about, that should be the clip. But no, it's impossible to do. When you try to pull out fifteen seconds of our show, nothing sounds right. I mean, nothing really makes sense on its own that's that short. Yeah,
we're not a zinger factory here. We're
(42:31):
not calculating celebrities or want-to-be celebrities, and we're not actively trying to play the sound bite game. We're not trying to play the bummer game with this show. But to clarify all this and really explain what he's talking about, Lanier devises a way to, quote, ruin podcasting. And then he said, nobody do this. Okay, so
(42:53):
somebody's gonna do it. I mean, yeah, I can see it. Basically, he describes a sort of
podcast aggregator app that serves up snippets from podcasts with ads,
of course, and the resulting situation would be that podcasts
would then be incentivized to produce the sort of content
that lends itself to this format, so fiery, you know,
(43:16):
extreme, attention grabbing sound bites. Okay, so imagine clips of podcasts that are prioritized the same way that, like, tweets or Facebook posts or YouTube videos are prioritized by the content recommendation algorithms. So just basically
like pulling out the most egregious and like negative emotion
(43:40):
conjuring moments of things. Yeah, and horribly outside of context. Yeah. And the horrible thing is that I can well imagine the marketing on such an aggregator. You know, it would be something like, don't have time for all the podcasts in your life? Well, now you don't have to. Let Telepod pick out the best parts of the shows and
topics you love and serve it up to you in
(44:01):
an easy to consume dose of wonder that fits into
your busy schedule. And it sounds kind of convincing.
You know, it's like, yeah, I'm busy. I don't have
time for a whole bunch of hour long podcasts. There's
so many of them. Why not let the algorithm slice
out just the choice cuts from all my favorite shows
and give them to me. Friends out there in podcast land.
(44:22):
If this happens, boycott. Do not do this. Unless you're already this type of show, that is, because there are shows out there that do lend themselves well to this sort of thing, not necessarily by design, but just by the sort of topics they're already covering, or the individuals involved, the personalities involved, you know. And that's
(44:44):
not saying, you know, that they're villains because of it. But I think there are a lot of great shows, and, you know, hopefully ours is on that list, where, yeah, you can't cut out these little segments and expect the host organism to survive, or to be able to communicate what it's trying to communicate. It's gonna
be called Pod Butcher. And if Pod Butcher came to pass,
(45:06):
you know, I can tell you that we would not
be able to produce the sort of show we produce now, that you presumably like, if we had to satisfy such an algorithm. And what you like would have even less to do with it, because you would have handed your likes and choices over to Bummer, just as we, you know, would have handed over context itself. And the
(45:26):
interesting thing is, Lanier is not the only person, or really the first, to recognize the death of context in social media. I was looking back at an article from 2012 in Forbes from Susan Tardanico, and she wrote, quote,
every relevant metric shows that we are interacting at breakneck speed and frequency through social media. But are we really communicating? With our communication context stripped away, we are now
(45:51):
attempting to forge relationships and make decisions based on phrases, abbreviations, snippets, emoticons,
which may or may not be accurate representations of the truth. And we can actually go back even further on this notion as well. On the show in the past, I've discussed the work of futurists Alvin and Heidi Toffler. They wrote an important
(46:13):
nineteen seventy book titled Future Shock, which was also made into a kind of more entertaining than informative Orson Welles narrated TV show that's also wonderful in its own right. But the book was pretty great, and they discussed the apparent and possible disruptions of the human experience and human society due to rapid advances in technology.
(46:37):
And I would like everybody to consider this quote from
Future Shock nineteen seventy again quote. Rational behavior in particular
depends upon a ceaseless flow of data from the environment.
It depends upon the power of the individual to predict,
with at least fair success, the outcome of his own actions.
To do this, he must be able to predict how
the environment will respond to his acts. Sanity itself
(47:00):
thus hinges on man's ability to predict his immediate personal
future on the basis of information fed him by the
environment. And me personally, I would extrapolate this to the digital environment that we've grown increasingly dependent upon. You know,
it's like every day, like multiple times a day, over
and over again, we're plugging into the digital environment and
(47:22):
checking out of our physical environment as being a main
determiner of our behavior. But it's not an environment that's
chiefly mechanically dominated by things like you know, everyday Newtonian
physics that we can predict pretty well. It's more like
the main environments that we're participating in are like the giant,
you know, super complex slot machine, where you pull
(47:45):
the lever, and, you know, you can pull the lever in a few different kinds of ways, and you don't
know exactly what's going to come back out at you.
So this is the digital context, the digital environment, and I want everyone to keep that in mind as I continue reading this quote from the Tofflers here, quote. When the individual is plunged into a fast and irregularly changing situation, or a novelty loaded context, however, his predictive
(48:07):
accuracy plummets. He can no longer make the reasonably correct
assessments on which rational behavior is dependent. To compensate for this,
to bring his accuracy up to the normal level again,
he must scoop up and process far more information than before,
and he must do this at extremely high rates of speed.
In short, the more rapidly changing and novel the environment,
(48:28):
the more information the individual needs to process in order
to make effective rational decisions. And of course there are
limits to our speed. So anyway, the Tofflers, I think,
really strike a chord with our current situation in this passage,
at least from my standpoint. I mean, sanity itself hinges
on our ability to predict our immediate personal future on
(48:49):
the basis of information fed to us by the digital environment.
And here we are attempting to communicate, act, and absorb
knowledge in a social media environment that makes it impossible
to control the information fed to us, absorb it all properly,
and control the context of our own voices. Yeah, all true,
I think. I mean, it's kind of scary. I
(49:10):
mean, to think about the fact that I'm quite sure there is nobody at Facebook who fully understands all the decisions made by the, you know, the content
recommendation algorithm that creates the Facebook feed, right? I mean, maybe they have some way of going through an individual one and saying, okay, here are probably some reasons
(49:34):
why you were shown this, why you were shown that. But they can't predict it all. You know, you can't generate one of these feeds just, you know, with your own brain. Yeah. I mean, we're ultimately just continuing to understand what, and to what extent, you know, the harm is. I was looking around
(49:55):
at some various, you know, posts and sort of industry thoughts about how social media has been linked to struggles with depression and anxiety. And there are
actually some serious discussions going on within the journalism field
where journalists often feel the need to maintain a social
media presence, or are mandated to, and are forced to deal with the resulting, I mean, actual trauma and PTSD.
(50:19):
Oh man, if you're a journalist, especially working in some
kinds of fields, you are going to be inundated day
in and day out with hateful messaging from people who don't like whatever you're saying or doing in your career. People telling you to go kill yourself, people telling you that you're a fraud. And, you know, Heaven forbid your gender, your
(50:40):
gender identity, your race, or what have you, you know, happens to put you in the line of sight of, you know, particular troll groups on social media. Totally. And I
know that there is a kind of common reaction, I think,
especially from people who don't deal with this problem themselves,
(51:00):
say, like, you know, it's just people, it's just trolling. Just get over it. You know, it's just trolling. Like, you're probably just failing to imagine what it is like to actually face this kind of, like, hate and abuse all the time. You know, we're highly social creatures. I mean, even if you don't actually fear violence against yourself, which you might have good reason to, depending on, you know,
(51:22):
who you are and who these people are and what they say. But I mean, if anything should be clear, it's that the digital world and the physical world are not separated by an impassable chasm, you know. I mean, online is real life, and we see violence stemming from digital activities. Yeah. But even if you're not
in that category, dealing with an onslaught of just hate
(51:43):
and abuse all the time, it is life ruining. Yeah. And generally we're not trained to deal with it. So a lot of the discussions going on in journalism are, you know, should we be training people to cope with the flood of negative commentary and effectively sort out what should be ignored, what requires attention, and what constitutes an actual threat that should be, you know,
(52:05):
reported to authorities. And to reiterate, of course, it's not
just journalists that deal with this kind of issue, but
like journalism is one field where a lot of times
people have to be on these platforms, whether they want
to be or not, and also by the nature of
their work just naturally just attract a lot of negative attention.
Right. If someone wants to learn more about this,
(52:26):
I should point out that Kyle Bessie wrote a great post just last month at journalism dot co dot uk titled How Social Media Impacts Mental Health in Journalists. And
of course, in thinking about all this, you know, it comes back to the fact that, like, this is the reality we've built for ourselves, you know. I mean, it's through this complex interface of, you know, corporate interests and
(52:48):
algorithms and so forth, but still, like, this is the world we've made for ourselves. We've created, you know, through our use of technology, these new pitfalls and these new perils to social engagement. All right, we need to take another break, but after this we'll be back with more of our discussion. Alright, we're back.
(53:11):
So one of Jaron Lanier's arguments from this book, Ten Arguments to, what is it? How exactly does it go? For Deleting Your Social Media Accounts Right Now. Yeah, there it is. One of them is that, quote, social media hates your soul. Now, Robert, what are we to make of this? So this is argument ten.
(53:33):
So this is, like, the final, all encompassing argument.
So in a sense, it's almost a little bit unfair
for me to skip to it, because it hinges upon all the arguments that he's made. But it's not going to spoil the book. It's not. But to give
you a taste of it, I'm just gonna read the
second paragraph from the first page of this argument quote
to review. Your understanding of others has been disrupted because
(53:55):
you don't know what they've experienced in their feeds. While
the reverse is also true, the empathy others might offer
you is challenged because you can't know the context in
which you'll be understood. You're probably becoming more of an a-hole, but you're also probably sadder. Another pair of bummer disruptions that are mirror images: your ability to know the world,
(54:16):
to know truth has been degraded, while the world's ability
to know you has been corrupted. Politics has become unreal
and terrifying, while economics has become unreal and unsustainable, two
sides of the same coin. That's pretty bleak, And it's
hard to imagine just from that paragraph that he is
ultimately presenting an optimistic message. He is, he is, yeah.
(54:39):
But, you know, he's saying, though, that, look, Bummer's behavioral modification has taken place not only at an individual scale, but at a societal scale. In this, its reach is more like a religion, and it concerns not only the way you live your life online, but what it means to be a person at all. Which, again, may sound a bit extreme and a bit out there, but I think he really makes a strong case for this.
but I think he really makes a strong case for this. Uh.
First of all, coming back to free will, which we
talked about earlier, free will is central to most religions.
You're hard pressed to find a spiritual model in which humans are mere automatons. But under Bummer, he argues, free will isn't destroyed, but it is assaulted. It's degraded. You have less of it, and certain parties wind up
(55:23):
with more, alongside, you know, the wealth and power that they've already accumulated or are accumulating as they take more and more of your free will, you know. And this, actually,
I think, does go along with, I don't know. So we've talked before about the coherence of the idea of free will on this show, and I'm sure we're gonna get a lot of, kind of, like, materialist
(55:43):
pushback on the idea of free will, saying, hey, wait a minute, free will is an incoherent concept. I think you could make that argument based on some definitions of free will, but in this sense, the sense that he uses it, I think free will is an important thing to consider and is a real concept. Basically, free will means the feeling that you have control
(56:04):
over your own behavior by understanding the influences on yourself
and thinking about them consciously. And you know, you can
never understand all of the influences on yourself, but there
are definitely ways in which you can be in a
system where you you feel like you understand most of
the inputs that are coming in on you, and you
can process them consciously in your decision making, versus a
(56:27):
system, like a kind of behavioral modification system, where you in fact don't know you're being influenced. You don't
know who's influencing you, and this is all opaque. You're
just suddenly producing behaviors that feel alien and you don't
know why you're doing them. Yes. Now, another religion-based argument he makes in this chapter is that Bummer ultimately
(56:50):
wants you to believe it is the Internet and it is the main part of your devices. But it ultimately is separate. You know, he says you can have the Internet without social media, and he makes a strong argument for that. You know, ultimately, I think his point is that, you know, you could have a great Internet just by sort of switching everything from the behavior modification pay model to
(57:11):
like a subscription model, where, you know, you could
have something like Facebook, but instead of modifying your behavior
to pay for it, you just pay to get access, right. And Lanier argues that, you know, in the same way that Protestants rejected papal indulgences, you can reject the social media but keep the Internet. You can
(57:32):
reject the version of your faith that is, you know, tiresome or offensive or dangerous, and keep the parts that work for you. I mean, I've said that before regarding just religion in general. You know, whatever religion you adhere to, I can almost guarantee you there is some version of it that is
(57:52):
ultimately more accepting and, you know, more liberal in its outlook, whether it's in your immediate area or you have to go out to find it. As another issue, he also makes the argument
that Bummer activates the pack setting of our minds, and
in doing so, it quote resurrects old conflicts that had
(58:13):
been associated with religion in order to engage people as
intensely as possible. So basically, social media riles us up
and causes the kind of like pack mentality and tribalism
that normally you would have to look to a religion
to do. Yeah. This is part of an earlier argument in the book, but basically it's part
(58:33):
of an argument he makes about how social media is making us meaner and worse by triggering types of thought patterns that are more associated with obsession with hierarchies and in-group out-group thinking and that kind of thing. Yeah, and domination mentality, yeah, dominance culture.
(58:53):
And then he also points out that Bummer ultimately asks you to have faith not in God or gods or a goddess, but in the almighty algorithms that decide what slice of news, political commentary, fake news, parody, conspiracy theory, or just outright hate you see in your feed, and that it's, you know, entirely anti-Enlightenment in
(59:15):
that regard, and that it makes learning subservient to human
power hierarchies. Well, yeah, I mean, by allowing these algorithms to control what you see all day, you are, in practice, whether you think about this or not, giving your consent to somebody to shape who you become as a person. And that's a
(59:37):
lot of power, that's a lot of faith to put in some business. He goes on from here to discuss
another destructive force in human discourse, memes. So this is one of these areas where, when I talk about memes, I feel like I often just come off like an old person, an old, grumpy person who doesn't understand how the kids talk, you know, complaining about memes. But no, I am with you there,
(59:59):
even when I see a meme that's funny, and I see them all the time, you know, I'm not above memes. It's like, yeah, some are really funny
and I like them, but there's a part of me
that always sort of rebels. And I think it has
to do with what we were talking about with context earlier. You know, I worry about meme culture degrading the context of original images and text in a way that
(01:00:22):
just constantly batters down our defenses and batters down our desire to understand things in their original meaning. Yeah.
He writes that memes, at first, you know, when we first engage with them, might seem to amplify what we're feeling or trying to say. And, you know, for instance, somebody posts a bit of news or a thought, and you back it up with, like what
(01:00:44):
you see in so many Facebook responses, or I guess on other social media platforms as well, you see, like, GIFs and memes that seem to be sort of an amen, brother, kind of an agreement, or some other interaction with the content. But ultimately,
this feeling of amplification is an illusion. You're only reinforcing
the notion that virality is truth. Whatever is the most
(01:01:07):
viral is rewarded, and that's a key part of Bummer's design.
But just because it's viral does not mean it is
the truth. Which seems like an overstatement of
the obvious when I say it like that, but again,
think about the way we
interact with it and the way we use memes. I mean,
(01:01:27):
if you go to any fact-checking website, just pick
your favorite, PolitiFact or whatever,
and you scroll down to see,
you know, truth ratings by source, what's always the
number one source of total fabrications? You might think
immediately of your least favorite politician, but no, it's not a person.
(01:01:49):
It's "viral image." Viral image is always the number one
source of false facts spreading around the Internet.
Why is that? Well, because a viral image is an extremely
powerful method for spreading falsehoods. It spreads way easier than
somebody going on TV and saying them. And he goes
on from here to, you know, also point out that
(01:02:11):
many of these Bummer companies, and some of the
key individuals associated with them, also
put a lot of emphasis on grander ideas of
organizing all information, of providing communities with purpose, of creating AI.
And ultimately he says that, you know, this is a
danger to personhood. That
(01:02:33):
there's a spiritual danger to us, in our sense
of personhood, when we start putting too much
emphasis on these non-human models of thinking
and humanity. Which, again, is getting
into headier territory than most of the book.
But I think it's an interesting case. You know,
(01:02:54):
how are these powerful corporations thinking about
what it is to be human? And then how is
that changing the definition, at least the sort of
spiritual definition and self-defining principle, of personhood?
So again, with this argument in the book, there is a possibility that
(01:03:15):
you hear this, or even read it, and you start
asking yourself, well, is that really the case?
Are we really treating social media like it's a religion?
Is it actually serving
a purpose that is akin to religion? And I think
he makes a strong case. I think it's one
of those things where you wake up one day and you realize, oh,
(01:03:37):
I have joined the Church of Social Media, and
I've been attending services for years and I didn't
quite realize it. You know, you tend to
think of a religion as
a church or a temple, as these
images of gods, and you don't think of the roles
that they play in a culture. And how even as,
(01:03:59):
you know, for the most part, in
many cultures, there is a movement away from these organized religions,
there's this possibility that we are
rebuilding something that operates in much the same
way, or maybe just in some of the same worst ways. Yeah,
without providing some of the best.
Because, you know, as I've discussed on
(01:04:21):
the show before, I think religion
and spirituality both have tremendous benefits
to individuals and certainly to societies,
and you can get that sense of
community from a religious organization. They can also do
a great deal of harm if
(01:04:43):
they have toxic beliefs wound up into their fabric,
or they promote toxic personality dynamics. Sometimes, yes, certainly.
But yeah, what we don't want to
do is put all of that aside
and then rebuild, yeah, like you say, a new
digital religion that is based on most of the worst
(01:05:04):
qualities of what came before. Yeah, I mean, I
find most of this book a pretty compelling message, but
I do want to emphasize again that
it emphasizes a lot of what's negative about the current
model of social media. But even with these Bummer companies,
he doesn't demonize. He's not saying, and we're not saying,
(01:05:24):
that all the people who work at these companies
are bad people. We're not even saying that
all the things done by these companies, or all their
products, are bad. I mean, Facebook and
Google and all these companies provide real technologies
and real products that do great things for
people's lives, that make all kinds of things easier. They
provide enormous wealth and convenience and so on. So
(01:05:47):
it's not that everything these companies do is bad.
That's not true by any stretch of
the imagination. It's just that there are certain elements of
the business model, especially of the social media platforms,
and specifically the element that is behavior modification for rent,
that really do need to be reformed. And that's where
his recommendation comes in. He says, if you want to
(01:06:08):
give these companies financial incentive to reform, delete your accounts. Right. Yeah,
the Bummer illness is there in social media, and the
only way to get rid of it, to rid the
host of the infection, is to step away
from it, to create this financial incentive
for these corporations to change. And ultimately, yeah,
(01:06:31):
he has this positive view, this optimistic view, that
it can change. That if we were all to do this,
not all at once,
but perhaps in this gradual awakening to the reality,
we could reach the point where social media can exist
in a form that is not demeaning of our personhood,
that is not modifying our behavior to the benefit of,
(01:06:54):
you know, unknown corporations or state or
non-state players. Yes. So there are a ton
of other arguments he gets into in the book that
we did not have time to address today. Of course,
we're not going to address everything in the book.
If we can get Jaron Lanier to come on the
show sometime, I'd be interested in talking about some of
the other ones especially. But I'm also interested in
(01:07:16):
hearing from listeners who, you know,
agree or disagree, because one thing I recognize is that
I think I have an emotional predisposition to bias in
favor of these arguments, simply because I feel
like in my life I have witnessed a lot of
negativity growing out of social media. I personally, emotionally, don't
(01:07:39):
like platforms like Facebook and Twitter, and so,
you know, I've got a predisposition that makes
this all seem right to me. So I'd be interested
in hearing the arguments presented to the contrary,
the ones that would go against my biases and say, no,
maybe it's not as bad as he's representing. Right. And likewise,
if anyone out there has quit social media, if you
(01:08:01):
have deleted your social media accounts, I'd be interested to
hear your take on how that went. You know,
why did you do it? Did
you accomplish what you hoped to accomplish by stepping away?
All of this is fair game for discussion.
So again, the title of the book is Ten Arguments
for Deleting Your Social Media Accounts Right Now by Jaron Lanier.
(01:08:21):
That's L-A-N-I-E-R. As
of this recording, it's currently available in hardback,
and it's coming out in paperback very soon. Yeah, if it's
not out by the time this episode publishes, it will
be coming out soon. Yeah. So again, highly recommend this
read, and would love to hear from listeners who read
it as well. In the meantime, if you want to
follow what we do, head on over to Stuff to
(01:08:43):
Blow Your Mind dot com. And if you want to
support the show, rate and review us wherever you have
the power to do so, and make sure you have
subscribed. Huge thanks to our excellent audio producer, Maya Cole.
If you would like to get in touch with us
with feedback on today's episode, to suggest a topic for
the future, or to follow up on any of the prompts
we just issued, you can email us at contact at
(01:09:04):
stuff to Blow your Mind dot com. Stuff to Blow
Your Mind is a production of iHeart Radio's How Stuff Works.
For more podcasts from iHeart Radio, visit the iHeart
Radio app, Apple Podcasts, or wherever you listen to your
favorite shows.