Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, welcome to Stuff to Blow Your Mind. My
name is Robert Lamb and I'm Joe McCormick, and it's Saturday,
time to go into the Vault for an older episode of the show.
This one originally aired on August sixth, two thousand nineteen, and it
was called Social Media Is a Bummer. This is where
we talked about Jaron Lanier's book, which was highly critical
of social media. That's right, Ten Arguments for Deleting Your
(00:26):
Social Media Accounts Right Now. I think it's excellent
food for thought as we continue through. Welcome to Stuff
to Blow Your Mind, a production of iHeartRadio's
How Stuff Works. Hey, welcome to Stuff to Blow
(00:47):
your Mind. My name is Robert Lamb and I'm Joe McCormick,
and we're back to a subject that may be familiar
to you if you've been listening to the show for
a bit. It's the next battle in the war against
the Machines. That's right, you know, as we've already discussed
on the show, and I think it's obvious to
most of our listeners, the wonders of interconnectedness that the
web has given us have also unleashed some
(01:09):
less than satisfying realities. You know, we worry about
smartphone addiction and the degree to which these devices
and the many apps they have on them have been
engineered to gain our attention. And then there's been growing
attention given to the role social media has in,
you know, playing into corporate and state manipulation
and endangering democracy, our personal freedom, and our happiness. We've
(01:33):
recorded a few different notable episodes on these topics. Right. So,
I know you and Christian a while back did an episode.
Now this was years ago, so the research
has come a long way since then, I think, but
you did an episode on what the measurable psychological effects
of social media were. Uh, and I think
this is still a developing field. Like I notice a
(01:55):
lot of conflicting findings when I read up on this. Like,
is social media making us more lonely, more depressed? Whatever?
It seems like the answer to that question is it's complicated, right,
And I think we'll discuss this a little bit
later on too. That you know, it also depends on
how you're using social media, what your role is, whether
using social media is part of your job, etcetera. Yeah. Uh,
(02:16):
then we also you and I. Robert did a couple
of episodes within the past couple of years called the
Great Eyeball Wars that were primarily about the attention economy,
about the fact that our devices, and specifically, especially
like social media platforms on those devices, but other platforms
also are because they make their money by getting you
(02:37):
to use them and pay attention through you know, advertiser dollars.
Like you are the product on these platforms. You're not
the customer. Uh, and your attention is what is being
sold to the advertisers. They're addicting us, and they're addicting
us on purpose pretty much right. Like one of the
I think, handiest metaphors is that a social media
app on your smartphone turns your smartphone into a tiny slot machine. Yes,
(03:01):
but while a Vegas slot machine is
programmed to steal all of your money. Uh, this slot machine,
this tiny slot machine, is programmed to steal all of
your time, your time and your attention. Yeah, it wants
you using it as much as possible. And the numbers
on this are kind of freaky, Like when when you
actually measure how much time people spend on their smartphones,
(03:24):
especially looking at apps like Facebook or you know, other
social media apps, they don't usually like what they find
out when those numbers come in, right. And certainly we
do have tools on a lot of our smartphones now
to track our screen time and to keep tabs on
and even set up little barriers to excessive use. But
(03:44):
as far as I know, like most of that stuff
is still very voluntary, and they're not
the default settings on a phone. Um, or certainly when
you get a new phone, you don't tend to bring
over your sort of legacy settings from the former phone.
So, um, you know, I think there's a
huge argument to be made that these companies are
not really, you know, going in headfirst in the
(04:07):
battle to reduce your screen time, right. Uh. And then
also we did a more recent episode called The Doppelganger
Network where we talked about the jumping off point for
that was an article that we read by Robert Sapolsky,
the neuroendocrinologist, I think at Stanford, and he,
uh, then made a comparison between the effects of
social media use and digital media more generally, and, uh,
(04:31):
the psychological conditions like Capgras delusion that cause this
rift between recognition and familiarity in the brain. So it
creates a kind of strange, alienated world
where normally you'd pair these experiences of cognitive recognition,
you know, knowing that you recognize something you see, and
the feeling of familiarity with it, but that is kind
(04:53):
of torn asunder by the dynamics of social media. Yeah,
and so, I don't think, you know, the idea,
this argument that, oh, social media is potentially dangerous,
that it has, you know, ill effects. This
is probably not new. I think everyone's heard somebody argue
it to some extent, and we've even seen people make
fun of the argument, like, for instance, one image that
(05:15):
frequently makes the rounds is, uh, an old,
old timey photograph of like a subway car or a
train full of individuals, everybody reading a newspaper, everybody's face
hidden in a newspaper, and then using this to make
fun of the argument that you know, everybody's just plugged
into their phones and they're not connecting with each other,
as if to say, well, this was exactly the same thing,
(05:37):
but it's not. I think it's definitely worth driving
home that the type of uh, psychological involvement that's going
on with the social media platform on a mobile device
is entirely different than what you would find by just
sticking your head into a book or a newspaper. A
book or a newspaper is not a real-time feedback
(05:59):
device. Yeah, I mean, I think one thing that
that kind of poking fun at this
argument does reveal is that you can take the argument
too far and be kind of totalizing about it, because
it's important to recognize that the reason people use these
services is because they do provide something that people want.
You know, people through social media are able to keep
(06:20):
up with friendships that might have fallen by the wayside otherwise,
you know, maybe long distance friendships. Uh. You know, these
these platforms and these devices do enable all kinds
of things in people's lives that are valuable and good, right.
And you can certainly go overboard in kind of a
Luddite response to it and say, well, the internet is
(06:41):
bad or technology is bad, uh, And I feel like
everybody's gonna have varying opinions, Like to one extreme, you
may be convinced that social media is uh, you know,
it's leaning more and more towards the self
digestion of human civilization. Or you may see nothing wrong
with social media, though you may say, look, Robert and Joe,
I know I use my Instagram, I use my Facebook,
(07:02):
I keep up with a few friends, uh, you know,
a few celebrities or a few bands or what have you.
I get the news through it, and that's it. And
it doesn't, you know, greatly improve or harm
my psychic experience of reality. Yeah, and if and if
that's what you believe, our goal is not to convince
you that, no, you don't understand it's actually ruining your life.
(07:24):
I mean, you may very well have a perfectly healthy,
limited relationship with social media, and you may be one
of the people who's getting more out of it than
it's getting out of you. But for a lot of people,
I think we do want to make the case that
that is not what's going on. Right, I would say
that for most of us, even if you can say,
you know, honestly that you have a healthy relationship with
(07:45):
social media, you can probably not say the same for
everyone in your circle. There's probably somebody or
several people that display signs of unhealthy usage. So
in this episode, you know, we're gonna we're gonna look
at some arguments against social media. Specifically, we're going to
(08:07):
be discussing, uh, an author by the name of Jaron
Lanier and his two thousand eighteen book Ten Arguments for
Deleting Your Social Media Accounts right Now. It's a great title.
It gets right to the point, it does, and
it is a fabulous little book that Joe and I
both read for this episode. It's, um, it's short,
it's something on the order of, what, a hundred and forty
(08:28):
six pages long. It's extremely accessible, Like he did not
make this, you know, a high-level computer
science intellectual argument. It is written so that I think
pretty much anybody could understand it. It's very accessible, it's
very ground level, and I think it makes a pretty
compelling case that, at least, well, his argument is
(08:50):
that social media in the business model that exists today
is doing more harm than good, and our best way
of fighting that harm is to have everybody get off
of these platforms, because this will force the companies
involved to actually implement changes, right and and so his
(09:10):
His argument is not one against technology, and it's not
even necessarily one against, um, a certain form
of social media. It's against a particular business model that
powers social media. And that business model, I think is
what he ends up calling the bummer business model. It's
a business model that is paid for by behavioral modification. Yes,
(09:34):
BUMMER, B-U-M-M-E-R, which Lanier says stands
for behaviors of users modified and made into an empire
for rent. It's clever having an acronym because it sticks with
you, and, uh, yeah, it's clever and it's a
little bit funny. And and that can be said for
the entire book, despite being a fairly serious topic with
some potentially serious ramifications for humanity at large and for
(09:59):
the individual's, um, you know, self-worth, it is
a humorous read at times, and it is it is
fun to read, So I can't recommend it highly enough.
It's a book you can read on the train, on
the plane, on the toilet. Uh, you know, it just
it makes for a great but important casual read. Yeah,
and we we've talked about maybe getting him on the
podcast sometime soon, and that would be great to have
(10:21):
a conversation with him, But today we just wanted to
talk about maybe a couple of the arguments that he
brings up in the book and our thoughts about them. Yeah,
we're not gonna attempt to regurgitate the entire book because
the book, uh, already speaks for itself. So
we'll start though, by just talking about Jarren Lanier himself.
You might be familiar with him already, perhaps you're not.
(10:43):
Perhaps you've just read his name and thought it was
pronounced Jared Lantier, which is how I've been pronouncing it in
my mind. That's how I've said it on the show
a lot. But anyway, he is a scientist, a musician,
and a writer. He is a major figure in the
realm of virtual reality, having founded the VR company VPL
Research in the nineteen eighties, and while he didn't coin
(11:05):
the term virtual reality, this is generally attributed to French
playwright Antonin Artaud, he did popularize it.
He helped create the first commercial VR products and introduced avatars,
multiperson virtual world experiences, and prototypes of major VR applications
such as surgical simulation. He was involved in the creation
(11:28):
of the Nintendo Power Glove. Yeah, now you're playing with
power. Which, you know, I have to say, the
power glove. I never had one as a kid, but
I saw people with them, I knew people who had them,
and it was this instrument of wonder. I
don't think it was all that practical as a gaming device,
but I think it inspired a lot of people. And
I also love seeing, especially nineteen nineties science fiction where
(11:52):
they have reused a Power Glove as part of like
a cybernetic you know, outfit for somebody. Anyway, he was
also involved with the creation of the headset,
apparently for the film The lawnmower Man. So you're really
bringing out the hits here. Yeah, And I think this
is probably not the stuff that usually gets highlighted about
his career, but it's, you know, some of the
(12:12):
stuff that I think some of our listeners might be
familiar with. Exactly. I should point out that he's not
credited on lawnmower Man, but he does get a thanks
on the far superior sci-fi work Minority Report. Okay,
uh. But more to the point, though, he's an
author of several books, such as
(12:32):
You Are Not a Gadget: A Manifesto from two thousand
ten, Who Owns the Future from two thousand thirteen, and
Dawn of the New Everything from two thousand seventeen. I
tend to think of him kind of as a technology philosopher,
and I really like a lot of his approach because
it's got a healthy skepticism about over hyping technology and
(12:56):
what it can do. And at the same time, he's
not anti technology. He's clearly somebody who loves digital technology,
loves computers. You know, he's worked with them his whole career,
and so he doesn't end up saying, throw your smartphone
in the fire, flush it down the toilet, smash it
with a hammer. It's not an anti technology message. He
actually has a very specifically tailored message trying to identify
(13:17):
exactly what it is about the social media platforms specifically
as they exist today that's causing problems for us and
for our society and how could they be changed. Yeah,
ultimately he is he's an optimist. Like he's he's presenting
an optimistic view of the future, like certainly highlighting problems,
but discussing how we can address them, which I love.
(13:37):
I feel like I've spent too much of my life, um,
you know, looking at more pessimistic views of reality and
dystopian views of reality, and I've gotten to the point
where those just don't serve me anymore. So I far
prefer reading an author like, uh, like Jaron Lanier.
So the book in question, I have to talk about the
(13:59):
cover of it, because the cover is very simple,
you know, just black and red text on a
white background and a silhouette of a cat walking off
the cover of the book. And the cat is a
central metaphor in the book, right because, as he points out,
you know, as much as we love dogs, I love dogs,
you do, you do love dogs, and I love dogs too,
I just don't own one. But as much as
(14:22):
we love dogs, we domesticated the dog. The cat arguably
domesticated itself. Uh you know, it interacts in our lives
more on its terms. You would be at great pains
to attempt to train the cat, as anyone who's ever,
certainly, you know, tried to, you know, shoot a
film about cats can attest to. Um. So, Lanier argues
(14:45):
that social media is essentially turning us into well trained dogs,
but we should really strive to be cats able to
dictate our involvement in the relationship at hand, to scratch
the hand that feeds us, if we so wish, to
sleep wherever we choose, refuse food, walk on all the furniture. Ultimately,
we should want to be cats. We don't
want to be social media's dog. Yeah. And also, I mean,
(15:08):
while he's advising people to quit social media, he's making
an argument, and it's not a totalizing argument.
I mean he realizes that different people are in different circumstances, uh,
that it's it's not a choice that will work for everybody, right, Like,
he's very clear on the on the fact that quitting
social media is a privilege and not everybody is able
(15:28):
to do it. A lot of people, uh you know,
own a business, or part of their job
entails them using social media. I know some people like that,
and they're just trapped in it,
like you just can't walk away from it, or you
might be shackled to it more socially. Like, well, if
I stop using uh, you know, Facebook, how am I
gonna connect with my friends who all live in
(15:49):
another city. And I just moved to a place where
I don't know anybody, Um, you know, they're all these
arguments to be made. And so he's not,
you know, drawing this firm line in the sand and saying,
you know, the winners over here, the losers over there, or
anything of the sort. And he's certainly not
arguing that, you know, we should we should all go
and make a big dramatic to do about quitting social
(16:09):
media either because I think I think we all see
that occasionally on our feeds too. Oh, that's the most
embarrassing thing when you see somebody post a lot about
how they're quitting, and then they quit for a week
and then they're back. Yeah. Yeah. And in fact, I
actually shared something about this book on
my private social media feed, and immediately like somebody was
(16:32):
like calling me out for having posted about quitting social
media on social media. But of course I think what's
wrong with that? I mean, where are we going to
talk about leaving social media but on social media, to a
certain extent? Um. And also, well, I mean, this
episode where we're discussing his arguments again, you know, we're
not necessarily saying everybody's got to get off social media.
(16:53):
But I do think these are some interesting arguments, very
worth considering. Um, we're discussing these on an episode that
will be promoted on social media because that is part
of the distribution business model of this podcast exactly. It's like,
this is how, you know, one way that we reach listeners,
and if we don't do this, it won't reach as
many listeners. So I don't know, how do you how
(17:15):
do you balance that? Like, are you actually doing better
if you say, well, let's not post the episode on
social media so fewer people will hear it. Yeah, thus
is the world? Thus have we made it? But at
any rate, Lanier also ultimately says, like, hey, I'm not
even saying quit social media forever. Um, you know, because
ultimately he's hopeful that social media can be corrected,
(17:36):
that we can come back to a version of social
media that is not harmful to us in so
many ways. And then also he's saying, like, you know,
quit for a while and come back. That's the only
way you'll have any kind of, like, insight on
what it's doing to you, Like this will help give
you the uh, you know, the vantage point by which
to to understand the interaction between your life and these
(18:00):
bummer systems. All right, well, maybe we should take a
quick break and then when we come back we can
discuss a couple of the arguments from the book and
our thoughts about them. All right, we're back. So again,
the book that we're discussing is, uh, Jaron Lanier's Ten Arguments
for Deleting Your Social Media Accounts Right Now. You're probably
wondering what are those ten arguments. We're not going to
(18:22):
regurgitate all ten arguments here. If you want to know
what they are, you should pick up a copy of
it because without even opening the book, all ten are
listed on the back, right, which is probably
handy. You know, you can just instantly
see what you're in for. We're gonna be talking about
I think basically four of the arguments, and, uh,
discussing them a bit here for you, yeah, some in more
(18:43):
depth than others. Now, one thing that we should talk
about upfront, because it's sort of a foundational argument that
feeds into all the others, is this point that, in
Lanier's words, due to social media, you are partially losing
your free will. Uh So, one of his core arguments
on which many of the others rest is that social
media is at heart a mass program of behavior modification
(19:08):
for rent. That is how these companies make money. So
if you're Facebook, the way that you make money is
that people pay you to have some kind of influence
on users of Facebook, and that influence could be a
very traditional, normal style of advertising, the kind of thing
that, you know, happens everywhere, and most people aren't
(19:29):
bothered by, right, because it's clear what's happening. You're just
seeing an ad for a product that somebody thinks you
might want, and you know, there's the ad and you
might go buy it. That's, you know, we're not generally
very bothered by that. Right. Always reminds me of the
moment in Futurama where Fry encounters, in the future,
an advertisement in his dream. That's a little creepier. We
(19:50):
find it intrusive. And he says, you know, we didn't
have that in my time. We just had advertisements, you know,
all over the place and in the sky and on
the billboard. I mean, certainly we do even without social media.
We live in an age of you just ubiquitous advertising. Yeah,
and you know, this is one thing that
he does sort of attack is the problem that the
(20:10):
web arose on an advertising pay-for model. Uh, you know,
back in the early days of the web, there was
this idea that everything needed to be free to access.
You couldn't charge people to get stuff on the web.
But then how do you pay for producing that stuff.
Somebody's got to make it, you know, they've
to get paid somehow. So what happens, Well, you'd pay
(20:30):
for it by showing advertising along with the thing, and
the advertisers would pay for what you're seeing. Right And
of course, again coming back to podcasting, we're not blind
to the fact that that's essentially what you have with
this podcast. This podcast is provided to you for free,
but you have to put up with advertisements. Right now,
I don't really mind that from an advertiser, from a
podcast point of view, because when I listen
(20:52):
to podcasts and when I make a podcast, I generally
think the advertising that's happening there is fairly straightforward. It's
pretty clear what's going on. Somebody's pitching you a product.
I mean, likewise, on television, of course, television started off
with the same model and he discusses this. You know,
it's like, here's the signal. Here's some programming, but here
are also some advertisements. But generally speaking, without getting into
(21:14):
some of the you know, the trickier forms of television
advertising and product integration and so forth, you're you're still
dealing with the situation where it's like I'm watching a show. Okay,
now I'm watching an ad. Now I'm watching the show again. Right.
But even with other things, you know, like, uh,
sponsored content and stuff like that, I mean, I think
there's there's a big difference between what's clear and what's sneaky.
(21:37):
I think generally people tend to be okay with advertising
when it's clear what's going on. When you know, they're
told who's paying for what they're seeing, and it's clear
what the person who's paying for what they're seeing wants
them to do. Usually it's like buy my product or
become a member of my service or something like that.
That's the kind of thing that I usually
(21:57):
feel fine about, that most people tend to feel fine
about. What's going on with social media, according to
Lanier's argument, is that there is a much sneakier, more subtle,
and perhaps more sinister thing happening, which is that our
behavior is being modified in ways that are not clear,
that are not obvious to us, ways
(22:20):
that we're not aware of, and we oftentimes, in fact,
most of the time, don't know who's doing it right.
And I think we all have those moments using a
product like Facebook where something an advertisement will pop up
or perhaps a post even made by a
friend will pop up, and you'll stop for a second
and wonder, why was that served to me? Why am
(22:41):
I seeing this? Yeah, what have I done?
What sort of interactions on my part or you know,
demographic information on my part has led to this being
put in my face? But there's a good chance that
it is. It's either serving, like, the supply model
of Facebook, which is trying to keep you engaged on
(23:01):
the platform for as long as possible. So that's
one thing that they just want to keep you on
there so they can modify your behavior more, gather more
data about you, show you more ads. Uh, So that's
one thing, they might just be showing it to
you because their vast collection of data shows that when
you see stuff like this, you use Facebook for a
longer period of time, or you log back on more
(23:23):
often later in the day. So that might be one thing,
but the other thing might be that you're seeing that
thing because somebody has targeted you for some kind of effect.
They want to generate some kind of effect. And this
could be like become my customer kind of effect, or
it could be something else. It could be in effect
like we want to make people stay home instead of
(23:43):
go out and vote today. Now at this point, I
do want to drive home as well. That one thing
that Lanier is very clear on is that he is
making no argument that there is like a room at
Facebook or Twitter or wherever where like evil operations go on,
that there's like a sinister cabal uh in any of
these organizations that is sitting down and saying, what can
(24:04):
we do to destroy the human soul or anything like that. Right,
there's not a mwahahaha committee, right. There is, instead,
well, I mean, there are decisions being
made by people about what types of incentives will be
algorithmically optimized, and so that is something where it's
not like the humans who operate in these companies don't
(24:26):
bear responsibility. They do, but they're generally
not trying to ruin the human race. They're not
trying to say, what's the worst thing we can do
to our users, right. That would just be
a simplification of something that is a lot more complicated,
and it's taking place not on the scale even
of an individual or individuals, but taking place on the
(24:46):
scale of a corporation. What Lanier argues is that bad
business incentives which prioritize behavior modification are leading to the
creation of algorithms that sort of automatically commit these
behavior modification schemes. And so, behavior modification can be
linked to traditions in the behaviorist school of psychology, which
(25:07):
flourished in the twentieth century. We've talked on other episodes
of the show about, like, B. F. Skinner and behaviorism.
Behaviorism sometimes gets demonized, and they're definitely really good reasons
to be historically critical of the behaviorist trend in the
history of psychology, but it also wasn't entirely wrong. Like behaviorism,
I think could be credited with trying to make psychology
(25:28):
a more objective science. But it also you know, a
common criticism is that it sort of treats the human
brain like a black box. You know, conditioning goes in,
behavior comes out, and whatever happens inside just isn't really important.
But one thing that it did show is that if
you treat the human brain like this kind of like
mystery machine where you just kind of like put conditioning
(25:50):
in and keep calibrating until you get the output behavior
you want, you can actually change behavior in an extremely
effective way, like that, like, you know, behavioral conditioning
can be very powerful. And Lanier's argument is that
modern social media is sort of almost a perfect vehicle
for refining behavioral conditioning because it can collect extremely minute
(26:14):
data about you, and lots of it because of course
it has you know, this connection like it it records
everything you do and all the stuff you do on
social media. There's actually a lot of things you do
on social media that they can learn a lot about,
from your brain states to where you are through location tagging,
to what you spend money on you know, your linked accounts,
(26:34):
to the words you use reflecting your moods. Like,
they know a lot about you and about how what
they show you affects what you do. Right, you know,
I would be loath perhaps to say that they know
us better than we know ourselves. But there, but I
feel like they often, these platforms often know us better
than we are perhaps prepared to admit to ourselves. Yeah, well,
(26:56):
they know things about you that are different from the
way you think about yourself. They know about you from
a behavioral conditioning standpoint, where they can track in a
really minute way what conditioning goes in and what behavior
comes out. We tend not to think of ourselves that way.
We tend to think of ourselves from the inside out.
We think about our own mind states. You know, we
(27:16):
tell a narrative about our behaviors in which everything is
rationalized to make sense. So often we feel kind
of demeaned when it's suggested that we are vulnerable to
behavioral conditioning. It's like, no, no, no, I'm not. You know.
It's like how people think advertising doesn't work on them,
you know, like, no, I'm not more likely to buy
a product because I saw a commercial for it, but
(27:38):
you know you probably are. I mean, we just tend
to think we're more mentally independent of the effects of
stimuli than we actually are. Or we'll criticize it
and say it doesn't affect us on one platform, and
then on another we'll celebrate it. Without naming any names,
One might use a particularly popular music streaming service and
(28:01):
you might go, wow, I really love the algorithm on
this thing. It really knows what I'm into. Yes, me too,
it keeps giving me musical suggestions that are totally
on point. Yeah, and you have to stop
sometimes, like, well, I guess that's a good thing, but
is it really? Yeah. So social media platforms, they can
gather all this data about you, but also they have
extreme psychological power over you. I mean, especially the big
(28:24):
one to think about is Facebook. Just, like, it's this
almost perfect machine for maximizing potential for behavior modification. It
can make you feel and think what it wants with
astonishing effectiveness just by showing you certain things that it
has figured out that when it shows these things to
people like you, you tend to react by behaving a
(28:45):
certain way. And then of course it can just keep
calibrating those efforts more and more finely, with
extremely high quality feedback based on the way it tracks
your behavior in all the ways we mentioned earlier. So
this is his basic argument, and you can see that
companies like Facebook are interested in always getting tighter
(29:08):
and tighter control of the data and feedback about you
in these sorts of ways. I was just reading a
story the other day, uh, in the MIT
Tech Review about Facebook being involved in funding research on
a wearable headband that could supposedly read your thoughts. Now.
I don't think we should get overly alarmed about like
this one particular news story, because this kind of technology
(29:28):
is probably very crude at this point. You know, it
doesn't tell you a lot today. We're not going to
go from, like, zero to Black Mirror in a year
on this particular front, right, But I do think the
fact that Facebook is funding this kind of research should
tell us something. They want to get deeper, They want
to go further. They want to get closer and closer
(29:48):
to your brain to know exactly how you're reacting to
things in real time all the time. They
want sort of like infinitely precise and finely calibrated data
about you. And why would they want that. It's because
it gives them better control over what you do, and
that control can be rented out to sponsors. Again,
(30:11):
those sponsors could be fairly straightforward. They could be
somebody trying to get you to buy their brand of shoes,
or it could be a government trying to control what
you do on a certain day, or or control how
you feel about a political movement. Absolutely. And the involvement of other states,
(30:32):
or operatives acting on behalf of other states, has certainly been in the news
a great deal recently. Well, yeah, it's very strange. Like, I don't know if people would have predicted ten years ago, maybe they did and I wasn't aware of it, that social media would start to become a major statecraft and national security concern. Yeah. I mean,
(30:56):
well, I think we were all probably, you know, in the same boat. For the most part, when social media started up, it just seemed like
a thing that was a fun way to connect with
a few other people that you knew and maybe meet
some new people. It's cool. Yeah. And for the most part, you know, it was just you and friends and potential new friends on there. And your
parents weren't on it yet. Major politicians were not on
(31:19):
it yet. Um, major governments were seemingly not involved yet. Well, I mean, I don't want to demonize it too much, but it's kind of the addiction model, right? It's like, you know, your first dose of this drug is free, maybe your first several doses are free, you get hooked. And then what Lanier talks about as a big
(31:40):
issue going on with these companies is that there is
this problem of network effects, right. Network effects are the
thing that happens in digital media businesses where people tend
to get locked into a service. And once people are locked into a service, it's very hard for an effective competitor to emerge and get people to move over.
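The lock-in dynamic being described here is often summarized with Metcalfe's law — our gloss, not a term from the episode or the book — under which a network's value grows roughly with the square of its user count. A minimal sketch, with purely hypothetical user counts:

```python
# Metcalfe-style toy model of network effects: value scales with the number
# of possible pairwise connections. User counts below are hypothetical.

def network_value(users: int) -> int:
    """Value proxy: how many pairwise connections the network makes possible."""
    return users * (users - 1) // 2

incumbent = network_value(1_000_000)  # entrenched platform
challenger = network_value(1_000)     # technically nicer newcomer

# 1,000x fewer users means roughly 1,000,000x less connection value,
# which is why a "better Facebook" empties out after an afternoon.
print(incumbent // challenger)
```

Under this admittedly crude model, a competitor doesn't lose because its features are worse; it loses because the value lives in who else is already there.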
(32:01):
People are already on Facebook. If you tried to start
a better Facebook today, and people have tried to start, like, nonprofit Facebooks, nobody goes over there. Right, I've joined those before, and it lasts like an afternoon. Yeah.
People are on Facebook because that's where their friends are, and that's where everything
is happening. It's network effects that keep people locked in
(32:23):
once they're already there, and once everybody's there, you can
kind of like lock the doors and then start doing
whatever you want inside, and you can check out any time you like, but you can never leave. Alright,
so we're skipping around a bit. But another argument, basically the fifth argument that Lanier makes, is that social media is making what you say meaningless, and the core
(32:46):
argument here comes down to context. Now, entering into this argument, I personally thought about the rather simple example that I'm sure many of you are thinking of right now: an ambiguous text message or email taken out of context, because there are limitations to what we can typically convey in these, you know, generally short-form communication formats.
(33:09):
You know, an emoji can only do so much to
provide context to what you're saying. You ever notice how, if you listen to a friend of yours talk about something, it sounds normal and fine, but if you listen to a snippet of a stranger's conversation, they sound like a complete moron, and it's, like, embarrassing, or at best,
(33:30):
like a cryptic alien, where you're just like, what? What was that? I'll never know. But you know what
I mean? And it's not like there's something wrong with people you don't know as opposed to people you do know. It's that when you know somebody, you have context based in their personality and your relationship
with them. What they're saying makes sense to you because
(33:53):
you know them and you know how they generally think
about things and all that. If you didn't know this person, and you just happened to overhear them, you know, as you walk by on the sidewalk, and they were saying the same thing that would sound normal to you as part of a conversation, you might think, like, wow, what a freak. I know, people probably think the same
thing about me all the time. Oh, I'm sure people
(34:14):
think that about me. I mean, I think about this all the time. Like, I'll be out somewhere having a conversation about Highlander II: The Quickening, and I'm like, what do I sound like? What kind of idiot do I sound like to somebody who doesn't know me? I mean, maybe I sound that bad to people who do know me too. But anyway,
coming back to Lanier here, some of this
(34:37):
applies to his case, but basically he's saying, you know, it comes down to the lack of individual control over context on social media, and he points to two extreme examples of this online to sort of, you know, better
illustrate what's happening elsewhere. So, like the two extreme cases
(34:58):
would be when, um, you have legit ads popping up on, say, terrorist recruitment videos on YouTube, which was a problem at least early on: you would have a legitimate advertiser and an illegitimate content creator or user, and those would be matched together, and the context would be, you know,
(35:19):
accidental and terrible. Platforms are juxtaposing content without understanding what
that content is, and it very often doesn't reflect well
on any of the content, or certainly on some of it. Right.
And then the other example he brought up was when you have images of, say, women and girls whose images are sexualized or incorporated into violent media without
(35:41):
their consent. Again, that's horrible, and
it is an extreme example. But he argues that these
extreme cases are possible because social media in general robs
us of control over context. When we express ourselves online.
We have no idea how that expression will be presented
(36:02):
specifically to anybody else. How often is there some public
controversy about like a tweet where you know, somebody is
like somebody's like, hey, I found a tweet by ex celebrity,
you know, where they said something that looks really bad,
and then that person defends themselves by saying, but you
took it out of context. What you really feel about this,
(36:26):
of course, is going to vary highly case by case, because maybe the context changes the meaning of it, and maybe it doesn't, right. But when it's just presented as a snippet, that individual doesn't have control of the context. Maybe you have control of the context, but they don't.
It's similar, and we certainly see this in our political cycles as well. You know, something that a candidate said or wrote ten years ago,
(36:50):
twenty years ago, what have you, is taken out as a snippet and presented either without context or without complete context for what they were talking about. And, you know, in some cases, I mean, anytime you're quoting somebody, you are literally taking them out of context. But there
(37:12):
are ways that being taken out of context can be pernicious, and there are totally normal ways to do it too. But then again, when something is taken out of context, there's kind of an implication in that statement that context is usually in place. And it's like saying, who left this toy on the kitchen floor? Let's put it back where it goes. You
(37:33):
don't want to live in a reality where the toys
are always on the kitchen floor, in which context is
always out of place. And so that's ultimately what he's getting at with social media. I'm just going to read a quote from his writing here. Quote: we have given up our connection to context. Social media mashes up meaning. Whatever you say will be contextualized and given
(37:54):
meaning by the way algorithms and crowds of fake people, who are actually algorithms, match it up with what
other people say. And he continues, speaking through social media
isn't really speaking at all. Context is applied to what
you say after you say it for someone else's purposes
and profit. So essentially we have surrendered context to the
(38:18):
Bummer platform, he's saying, rendering communication, quote, petty, shallow and predictable. And as such, only the extreme voices, the worst of us, the loudest, you know, the most acidic voices in our culture are going to be the ones that rise up.
Oh yeah, I mean this is a whole other point
he makes that I guess we're not going into in depth,
(38:39):
but I mean he argues that these platforms necessarily promote
the worst voices because the worst voices tend to drive engagement, right,
and the platforms want to drive engagement. What gets people
using the platforms more keeps people glued to Twitter, glued
to Facebook. It's like whatever gins up the most
(39:00):
negative emotion, right. And one quick word on engagement: one
thing that he hammers home a number of times in
the book is that when you hear the word engagement
used in a social media context, what we're talking about
is manipulation. So try to think about that next time.
If you're attending a meeting or reading an article, sort of, like, especially a pro-social-media argument,
(39:20):
you know, about engagement, just switch it out for the word manipulation and see how the taste fits you. But call it engagement, call it manipulation, whatever you like,
it ends up creating this environment where everybody has to
play that game in order to be heard, And of
course that means it changes the way that say, legitimate
(39:40):
journalistic uh um publications have to play the game. You know,
you have to lean into even you know, more ridiculous
headlines for instance, click bad headlines in order to get
that precious and engagement, and that can have an eroding
effect on the institutions of journalism and themselves. Oh absolutely,
(40:01):
I mean, well, that also goes with it. It's the most toxic, or whatever voices gin up the most negative emotion, that tend to be promoted on these platforms because they drive engagement.
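The dynamic being described — an engagement-maximizing ranking favoring whatever provokes the strongest reactions — can be sketched as a toy feed ranker. The posts, scores, and weighting below are invented for illustration and are not any platform's actual algorithm:

```python
# Toy engagement-ranked feed: angry reactions count toward "engagement"
# just like any other interaction, so outrage-heavy content floats up.
# All posts and predicted-reaction numbers are invented for illustration.

posts = [
    {"text": "Nice sunset tonight",             "clicks": 0.020, "outrage": 0.0},
    {"text": "Local bake sale this weekend",    "clicks": 0.030, "outrage": 0.1},
    {"text": "You won't BELIEVE what they did", "clicks": 0.025, "outrage": 0.9},
]

def engagement_score(post: dict) -> float:
    # An engagement-maximizing objective has no notion of "good" attention,
    # so outrage-driven reactions are rewarded alongside ordinary clicks.
    return post["clicks"] + 0.1 * post["outrage"]

feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0]["text"])  # the outrage post ranks first
```

Set the outrage weight to zero and the bake sale wins instead; the point is that the ranking objective, not an editor, decides what rises.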
The same could be true of topics. So like you
might not say that, well, my voice is toxic, but
like whatever topics get people the most like upset and
(40:26):
worried and aggravated and angry and all that, those topics
are going to be favored by algorithms that are trying
to increase engagement. You know, I think a lot of times publishers of online media know that there are certain topics that get people really upset, and those will be the highest-performing articles or videos on social media those days. Like, think of chemicals
(40:50):
as engagement. Which chemicals are most engaging when you get them on your skin? You know, I instantly think of the ones that are going to sting, right? And suddenly I can't think of anything but those stinging places on my skin. So, you know, again, one
of the things that he comes back around to in this context situation is that what you're saying is only valuable or relevant
(41:12):
insofar as it serves the platform. And again, you are not the customer on this platform, you are the product. The customers are those corporations or even state players that have the money to pay into the Bummer system. Yeah. Now, I was glad to see that
he thinks podcasts have not been ruined yet. Yes, yes,
(41:34):
so there's a whole section in this argument where he goes on to discuss podcasts, and he makes an argument that podcasts are a rare area of our media that is not Bummer yet. But he does dream up and describe an absolute nightmare scenario for the future of podcasts that I hope never comes to pass. And while I was reading it, I was
(41:56):
literally breaking out in sweats. Yeah, because think about what
we do on the show. You know, Stuff to Blow Your Mind is full of our personalities and our context, and we have the time and the space to present topics and ideas, to present our takes on topics and ideas, and a chance to share our personalities with the listener in a continuous episodic manner. And as
(42:18):
a result, you, the listener, you kind of know us. You know, when we get something wrong, you have the context to understand what's going on, generally. And what we do on this show is not easily reduced to sound bites. That's true. And this has been
a problem before, right? Yeah, you should trust us on this, because every now and then, not currently, but it
(42:41):
did happen with a fair amount of regularity: someone would come to us and say, we need some sound bites. Yeah, give us fifteen-second clips of your show to show what the show is really about. That should be the clip. But no, it's, like, impossible to do. When you try to pull out fifteen seconds of our show, nothing sounds right. I mean, nothing really makes sense on its own that's that short. Yeah,
(43:02):
we're not a zinger factory here. We're not calculating celebrities or wannabe celebrities, and we're not actively trying to play the sound bite game. We're not trying to play the Bummer game with this show. But to clarify all this and really explain what he's talking about, Lanier devises a way to, quote, ruin podcasting. And
(43:23):
then he adds, nobody do this. Okay, so somebody's gonna do it. I mean, yeah, I can see it happening. Basically, he describes a sort of podcast aggregator
app that serves up snippets from podcasts with ads, of course,
and the resulting situation would be that podcasts would then
be incentivized to produce the sort of content that lends
(43:45):
itself to this format: fiery, you know, extreme, attention-grabbing sound bites. Okay, so imagine clips of
podcasts that are that are prioritized the same way that
like tweets or Facebook posts or YouTube videos are prioritized
by the content recommendation algorithms. So just basically like pulling
(44:08):
out the most egregious and, like, negative-emotion-conjuring moments of things. Yeah, and all horribly outside of context. And the horrible thing is that I can well imagine the marketing on such an aggregator. You know, it would be
something like, don't have time for all the podcasts in
your life? Well, now you don't have to. Let TelePod
(44:30):
pick out the best parts of the shows and topics
you love and serve it up to you in an
easy to consume dose of wonder that fits into your
busy schedule. And it sounds kind of convincing, you know? It's like, yeah, I'm busy, I don't have time for a whole bunch of hour-long podcasts. There's
so many of them. Why not let the algorithm slice
out just the choice cuts from all my favorite shows
(44:52):
and give them to me. Friends out there in podcast land.
If this happens, boycott I. Do not do this. Do
not unless you're already this type of show that is,
you know, leaning, because there are shows out there that
doling themselves well to uh to this sort of thing,
not necessarily by design, but just by sort of the
suit that the type of topics are already covering, or
(45:13):
the individuals involved, the personalities involved. You know, and that's not saying that they're villains because of it. But I think there are a lot of great shows, and hopefully ours is on that list, where, yeah, you can't cut out these little segments and expect the host organism to survive or
to be able to communicate what it's trying to communicate.
(45:34):
It's going to be called Pod Butcher. And if Pod Butcher came to pass, you know, I can tell you that we would not be able to produce the sort of show we produce now, the one you presumably like, if we had to satisfy such an algorithm. And what you like would have even less to do with it, because you would have handed your likes and choices over to Bummer,
(45:54):
just as we, you know, would have handed over context itself. And the interesting thing is, Lanier is not the only person, or really the first, to recognize the death of context in social media. I was looking back at an article from twenty twelve in Forbes from Susan Tardanico,
and she wrote, quote, every relevant metric shows that we are interacting at breakneck speed and frequency through social media,
(46:18):
but are we really communicating? With our communication context stripped away, we are now attempting to forge relationships and make decisions
based on phrases, abbreviations, snippets, emoticons, which may or may
not be accurate representations of the truth. And we can actually go back even further on this notion as well. On the show in the past,
(46:40):
I've discussed the work of futurists Alvin and Heidi Toffler. They wrote an important nineteen seventy book titled Future Shock, which was also made into an Orson Welles-narrated TV show that was kind of more entertaining than informative, but that's wonderful in its own right. But the book was pretty great, and they discussed the apparent and
(47:03):
possible disruptions of the human experience and human society due
to rapid advances in technology. And I would like everybody
to consider this quote from Future Shock, nineteen seventy, again. Quote:
Rational behavior in particular depends upon a ceaseless flow of
data from the environment. It depends upon the power of
the individual to predict, with at least fair success, the
(47:26):
outcome of his own actions. To do this, he must
be able to predict how the environment will respond to
his acts. Sanity itself thus hinges on man's ability to
predict his immediate personal future on the basis of information
fed him by the environment. And me personally, I would
extrapolate this to the digital environment that we've grown increasingly
(47:46):
dependent upon. You know, it's like every day, like multiple
times a day, over and over again, we're plugging into
the digital environment and checking out of our physical environment as the main determiner of our behavior.
But it's not an environment that's chiefly mechanically dominated by
things like you know, everyday Newtonian physics that we can
(48:07):
predict pretty well. It's more like the main environments we're participating in are like a giant, you know, super-complex slot machine, where you pull the lever, and you can pull the lever in a few different kinds of ways, and you don't know exactly what's
going to come back out at you. So this is
the digital context, the digital environment. I want
(48:28):
everyone to keep that in mind as I continue reading this quote from the Tofflers here. Quote: when the
individual is plunged into a fast and irregularly changing situation
or a novelty-loaded context, however, his predictive accuracy plummets.
He can no longer make the reasonably correct assessments on
which rational behavior is dependent. To compensate for this, to
(48:49):
bring his accuracy up to the normal level again, he
must scoop up and process far more information than before,
and he must do this at extremely high rates of speed.
In short, the more rapidly changing and novel the environment,
the more information the individual needs to process in order
to make effective rational decisions. And of course there are
limits to our speed. So anyway, the Tofflers, I think,
(49:12):
really strike a chord with our current situation in this passage,
at least from my standpoint. I mean, sanity itself hinges
on our ability to predict our immediate personal future on
the basis of information fed to us by the digital environment.
And here we are attempting to communicate, act and absorb
knowledge in a social media environment that makes it impossible
(49:33):
to control the information fed to us, absorb it all properly,
and control the context of our own voices. Yeah, all true, I think. I mean, it's kind of scary to think about the fact that, I'm quite sure, there is nobody at Facebook who fully understands all the decisions made by, you know, the content recommendation algorithm
(49:56):
that creates the Facebook feed, right? I mean, maybe they have some way of going through it, where they could look at an individual one and say, okay, here are probably some reasons why you were shown this, why you were shown that. But they can't predict it all. You know, you can't generate one of these feeds just, you know, with your
(50:16):
own brain. Yeah, I mean, ultimately, we're just continuing to understand what the harm is, and to what extent. I was looking around at some
various, you know, posts and sort of industry thoughts about how social media has been linked to struggles
(50:36):
with depression and anxiety. And there are actually some serious discussions going on within the journalism field, where journalists often feel the need to maintain a social media presence, or are mandated to, and are forced to deal with the resulting, I mean, actual trauma and PTSD. Oh man,
if you're a journalist, especially working in some kinds of fields,
you are going to be inundated day in and day
(50:58):
out with hateful messaging from people who don't like whatever you're saying or doing in your career. People telling you to go kill yourself, people telling you that you're a fraud. And heaven forbid your gender, your gender identity, your race, or what have you happens to put you in the line of sight
(51:21):
of, you know, particular troll groups on social media. Totally.
And I know that there is a kind of common reaction, I think especially from people who don't deal with this problem themselves, to say, like, you know, it's just trolling, just get over it. You're probably just failing to imagine
(51:41):
what it is like to actually face this kind of, like, hate and abuse all the time. You know, we're highly social creatures. I mean, even if you
don't actually fear violence against yourself, which you might have
good reason to, depending on you know, who you are
and who these people are and what they say. But
I mean, if anything, it should be clear that the digital world and the physical world are not separated by
(52:04):
an impassable chasm. You know, I mean, online life is real life, and we see violence stemming from digital activities. Yeah. But even if you're not in that category, dealing with an onslaught of just hate and abuse all the time is life ruining. Yeah. And generally we're not trained to deal with it. So a
(52:24):
lot of the discussions going on in journalism are, you know, should we be training people to cope with the flood of negative commentary and effectively sort out what should be ignored, what requires attention, and what constitutes an actual threat that should be, you know, reported to authorities. And to reiterate, of course, it's not just journalists
that deal with this kind of issue, but like journalism
(52:46):
is one field where a lot of times people have
to be on these platforms, whether they want to be
or not, and also by the nature of their work
just naturally attract a lot of negative attention. Right. If someone wants to learn more about this, I should point out that Kyle Bessie wrote a great post just last month at journalism dot co dot uk
(53:06):
titled how social media impacts mental health in journalists. And of course, in thinking about all this, you know, it comes back to the fact that this is the reality we've built for ourselves. I mean, it's through this complex interface of, you know, corporate interests and algorithms and so forth. But still, this is the world we've made for ourselves. We've created,
(53:28):
you know, through our use of technology, these new pitfalls
and these new perils to social engagement. All right, we
need to take another break, but after this we will
be back with more of our discussion. Alright, we're back.
So one of Jaron Lanier's arguments from this book, Ten
(53:49):
Arguments to... what is it? How exactly does it go? For Deleting Your Social Media Accounts Right Now? Yeah, there it is. One of them is that, quote, social media hates your soul. Now, Robert, what are we to make of this? So this is argument ten.
So this is, like, the final, all-encompassing argument.
(54:10):
So in a sense, it's almost a little bit unfair for me to skip to it, because it hinges upon all the arguments that he's made. But it's not going to spoil the book. But to give you a taste of it, I'm just gonna read the
second paragraph from the first page of this argument quote
to review. Your understanding of others has been disrupted because
you don't know what they've experienced in their feeds, while
(54:31):
the reverse is also true. The empathy others might offer
you is challenged because you can't know the context in
which you'll be understood. You're probably becoming more of an
but you're also probably sadder. Another pair of Bummer disruptions
that are mirror images. Your ability to know the world
to know truth has been degraded, while the world's ability
(54:53):
to know you has been corrupted. Politics has become unreal
and terrifying, while economics has become unreal and unsustainable, two
sides of the same coin. That's pretty bleak, and it's
hard to imagine just from that paragraph that he is
ultimately presenting an optimistic message. Though he is. He is, yeah. But, you know, he's saying, though, that, look, Bummer's
(55:15):
behavioral modification is taking place not only at an individual scale,
but at a societal scale. In this, its reach is more like a religion's, and it concerns not only the way you live your life online but what it means to be a person at all. Which again may sound a bit extreme and a bit out there, but I think he really makes a strong case for this.
(55:38):
First of all, coming back to free will, which we
talked about earlier, free will is central to most religions.
You're hard-pressed to find a spiritual model in which humans are mere automatons. But under Bummer, he argues, free will isn't destroyed, but it is assaulted, it's degraded. You have less of it, and certain parties wind up with more, alongside, you know, the wealth and power that
(55:59):
they've already accumulated or are accumulating as they take more and more of your free will. And this actually, I think, does go along with... I don't know. So, we've talked before about the coherence of the idea of free will on the show, and I'm sure we're gonna get a lot of, kind of, materialist push back on the idea of free will, saying, hey,
(56:21):
think you could make that argument based on some definitions
of free will, but in this sense, the sense that
he uses it, I think free will is an important
thing to consider and is a real concept. Basically, free will means the feeling that you have control
over your own behavior by understanding the influences on yourself
(56:42):
and thinking about them consciously. And you know, you can
never understand all of the influences on yourself, but there
are definitely ways in which you can be in a system where you feel like you understand most of the inputs that are coming in on you, and you can process them consciously in your decision making. Versus a system, like a kind of behavioral modification system,
(57:04):
where you in fact don't know you're being influenced. You
don't know who's influencing you, and this is all opaque.
You're just suddenly producing behaviors that feel alien and you
don't know why you're doing them. Yes. Now, another religion-based argument he makes in this chapter is that Bummer ultimately wants you to believe it is the Internet and
(57:25):
it is the main part of your devices. But it ultimately is separate. You know, he
says you can have the Internet without social media, and
he makes a strong argument for that. You know, ultimately I think his point is that you could have a great Internet just by sort of switching everything from the behavior-modification pay-for model to, like, a subscription model where, you know, you
(57:47):
could have something like Facebook, but instead of modifying your behavior to pay for it, you just pay to get access. Right. And Lanier argues that, in the same way the Protestants rejected papal indulgences, you know, you can reject the social media but keep the Internet. You can reject the version of your faith that is, you know,
(58:10):
tiresome or offensive or dangerous, and keep the parts that
work for you. I mean, I've said that before regarding just religion in general. You know, whatever religion you adhere to, I can almost guarantee you there is some version of it that is ultimately more accepting and, you know, more liberal in its outlook, whether it's in your immediate
(58:32):
area or you have to, you know, go out to find it. As another issue, he also makes the argument that Bummer activates the pack setting of our minds,
and in doing so, it quote resurrects old conflicts that
had been associated with religion in order to engage people
as intensely as possible. So basically, social media riles us
(58:55):
up and causes the kind of, like, pack mentality and tribalism that normally you would have to look to a religion to find. Yeah, this is part of
an earlier argument in the book, but basically it's part
of an argument he makes about how social media is making us meaner and worse by triggering
types of thought patterns that are more associated with obsession
(59:18):
with hierarchies and in-group/out-group thinking and that kind of thing. Yeah, and domination mentality, yeah, domination culture.
And then he also points out that Bummer ultimately asks you to have faith not in God or gods or a goddess, but in the almighty algorithms that decide what slice of news, political commentary, fake news, parody,
(59:41):
conspiracy theory, or just outright hate you see in your feed, and that it's, you know, entirely anti-Enlightenment in that regard, and that it makes learning subservient to human
power hierarchies. Well, yeah, I mean, by allowing these algorithms to control what you see all day, you are, whether you think about this or not, in practice
(01:00:03):
you're giving your consent to somebody to shape who you
become as a person. And that's a lot of power, that's a lot of faith to put in some business. He goes on from here to discuss
another destructive force in human discourse: memes. So this is one of these areas where, when I talk about memes, I feel like I often just come
(01:00:25):
off like an old, grumpy person who doesn't understand how the kids talk, you know, complaining about memes.
But no, I am with you there. Even when I see a meme that's funny, and I see them all the time, you know, I'm not above memes. It's
It's like, yeah, some are really funny and I like them,
but there's a part of me that always sort of rebels.
And I think it has to do with what we
(01:00:45):
were talking about with context earlier. You know, I worry about meme culture degrading the context of original images and text in a way that just constantly batters down our defenses and batters down our desire to understand things in their original meaning. Yeah, he writes that memes may at first, you know, when we first engage
(01:01:07):
with them, they might seem to amplify what we're feeling
or trying to say. For instance, somebody
posts a bit of news or a thought, and
you back it up with, like, what do you see
in so many Facebook responses? Or, I guess on other
social media platforms as well, you see GIFs and
memes that seem to be sort of an "amen, brother"
kind of argument, or some other interaction
(01:01:29):
with the content. But ultimately, this
feeling of amplification is an illusion. You're only reinforcing the
notion that virality is truth. Whatever is most viral
is rewarded, and that's a key part of Bummer's design.
But just because it's viral does not mean it is
the truth. Which seems like an overstatement of
(01:01:51):
the obvious when I say it like that. But again,
think about the way we
interact with it and the way we use memes. I mean,
if you go to any fact-checking website, just pick
your favorite, PolitiFact or
whatever, and you scroll down
to see, you know, truth ratings by source, what's
(01:02:15):
always the number one source of total fabrications? You might
immediately think of your least favorite politician. But no, it's
not a person. It's "viral image." Viral image is
always the number one source of false facts spreading
around the internet. Why is that? Well, because a viral
image is an extremely powerful method for spreading falsehoods. It
(01:02:37):
spreads way easier than somebody going on TV and saying them.
And he goes on from here to also
point out that many of these Bummer companies,
and some of the key individuals associated with them,
also put a lot of
emphasis on grander ideas of organizing all information, of providing
(01:02:57):
communities with purpose, of creating AI. And ultimately, he says,
this is a danger to
personhood. There's a spiritual danger
to us, in our sense of personhood, when we start
putting too much emphasis on these non-human
models of thinking and humanity. Which, again,
(01:03:20):
this is getting into headier territory than
most of the book. But I think it's an
interesting case. You know, how are these
powerful corporations thinking about what it is to be human?
And how is that changing the definition, at least
the spiritual definition and the self-defining
principle, of personhood? So again, this argument
(01:03:44):
in the book, there is a possibility that you hear
this, or even read it, and you start
asking yourself, well, is that really the case? Are
we really thinking about social media like it's a religion?
Is it actually serving a purpose for us
that is akin to religion? And I think he makes
a strong case. I think
(01:04:06):
it's one of those things where you wake up one
day and realize, oh, I have joined the Church
of Social Media, and I've actually been
attending services for years without quite realizing it.
You know, you tend to think of
a religion as a church or a temple,
as these images of gods,
and you don't think of the roles that they play
(01:04:28):
in a culture. And even as, in many cultures,
there is a movement away from these organized religions,
there's this possibility that we are
rebuilding something that operates in much the same way, or
maybe in just some of the same worst ways. Yeah,
(01:04:49):
without providing some of the best. Because,
as I've discussed on
the show before, I think
religion and spirituality both have tremendous benefits
to individuals and certainly to
societies, and you can get that sense
of community with a religious organization. They can
(01:05:11):
do a great deal of harm if they have toxic
beliefs wound up in their fabric, or if
they promote toxic personality dynamics, sometimes, yes, certainly.
But what we don't want
to do is to put all of that
(01:05:31):
aside and then rebuild, yeah, like you say, a
new digital religion that is based on most of the
worst qualities of what came before. Yeah, I mean, I
find most of this book a pretty compelling message.
But I do want to emphasize again that it
emphasizes a lot of what's negative about the
current model of social media. But even with these
(01:05:54):
Bummer companies, he doesn't demonize. He's not saying, and we're
not saying, that all the people who work at
these companies are bad people. We're not even saying that
all the things done by these companies, all
their products, are bad. I mean, Facebook
and Google and all these companies provide
real technologies and real products that do great
(01:06:14):
things for people's lives, that make all kinds of things easier.
They provide enormous wealth and convenience and stuff like that.
So it's not that everything these companies do is bad;
that's not true by any stretch of
the imagination. It's just that there are certain elements
of the business model, especially of the social media platforms,
and specifically the element of behavior modification for rent,
(01:06:37):
that really do need to be reformed. And that's where
his recommendation comes in. He says, if you want to
give these companies a financial incentive to reform, delete your accounts. Right, Yeah,
the Bummer illness is there in social media, and the
only way to get rid of it, to rid the
host of that infection, is to step away from it,
(01:06:57):
to create this financial incentive for
these corporations to change. And ultimately, yeah, he has
this positive, optimistic view that it can change,
that if we were all
to do this, not all at once, but perhaps as a
gradual awakening to the reality, we could
reach the point where social media can exist in a
(01:07:19):
form that is not demeaning of our personhood, that
is not modifying our behavior to the benefit of
unknown corporations or of state or non-state players. Yes.
So there are a ton of other arguments he gets
into in the book that we did not have time
to address today. Of course, we're not going
to address everything in the book. If we can get
(01:07:41):
Jaron Lanier to come on the show sometime, I'd be
interested in talking about some of the others.
But I am also interested in hearing from listeners who
agree or disagree,
because one thing I recognize is that I
have an emotional predisposition, a bias, in favor of
these arguments, simply because I feel like in
(01:08:04):
my life I have witnessed a lot of negativity growing
out of social media. I personally, emotionally don't like platforms
like Facebook and Twitter, and so
I've got a predisposition that makes this all
seem good to me. So I'd be interested in hearing
arguments presented to the contrary, that
(01:08:25):
would go against my biases, that say, no, maybe it's
not as bad as he's representing. Right. And likewise, if
anyone out there has quit social media, if you have
deleted your social media accounts, I'd be interested to hear
your take on how that went. You know,
why did you do it? Did you accomplish
what you hoped to accomplish by stepping away? You know,
(01:08:46):
all this is fair game for discussion. So again,
the title of the book is Ten Arguments for Deleting
Your Social Media Accounts Right Now by Jaron Lanier. That's
L-A-N-I-E-R. As of this recording,
it's available in hardback, and it's coming out
in paperback very soon. Yeah, if it's not out by
the time this episode publishes, it will be coming out soon. Yeah.
(01:09:07):
So again, highly recommend this read, and we would love to
hear from listeners who read it as well. In the meantime,
if you want to follow what we do, head on
over to StuffToBlowYourMind.com. And
if you want to support the show, rate and review
us wherever you have the power to do so, and
make sure you have subscribed. Huge thanks to our excellent
audio producer, Maya Cole. If you would like to get
(01:09:28):
in touch with us with feedback on today's episode, to
suggest a topic for the future, or to follow up on
any of the prompts we just issued, you can email
us at contact@stufftoblowyourmind.com.
Stuff to Blow Your Mind is a production of iHeart
(01:09:48):
Radio's How Stuff Works. For more podcasts from iHeartRadio,
visit the iHeartRadio app, Apple Podcasts, or wherever you
listen to your favorite shows.