Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production of iHeartRadio's
How Stuff Works. Hey there, and welcome to TechStuff.
I'm your host, Jonathan Strickland. I'm an executive producer with
How Stuff Works and iHeartRadio, and I love
all things tech. And recently I attended a Techonomy conference
(00:24):
in New York City where people from different parts of
the technology space came together to give presentations and interviews
about stuff like technology, government, society, and business. And most
of the focus of the talk sort of gravitated toward
a big question, will technology save us or will it
(00:45):
destroy us? Now, I think ultimately everyone at that conference
agreed that that question, while compelling, masks the real question,
which is will we work to save ourselves or will
we allow ourselves to be destroyed? Technology, as it turns out,
is really just a tool. It can facilitate either outcome.
(01:07):
The determining factor, however, is us how we design and
implement that technology. Now, that being said, technology can have
real effects on us, and sometimes we might not even
be fully aware of those effects. There was a lot
of talk during the conference about how we tend to
(01:27):
think about the Internet as a connective tissue that allows
us to communicate with practically anyone anywhere on Earth. For
many years, that was how we described the Internet. In fact,
I would argue we still describe the Internet in this fashion.
It's this global network of networks and you can easily
get in contact with practically anybody at a moment's notice.
(01:49):
But in reality, we've seen the Internet also serve as
a means of creating silos of people who grow increasingly
separated and insulated from each other, mostly from people who
don't share their views, so they all end up in
individual echo chambers that reinforce their views while simultaneously dismissing
(02:10):
or denying the views of other people. So rather than
a uniting force, the Internet is enabling greater division than ever.
And there are a lot of reasons for that. And
you could even argue that that is too narrow a view,
that yes, that is happening, but it may be that
that's not the only thing that's happening, or not even
(02:31):
the primary thing that's happening. So maybe someday I'm going
to do a full episode, or perhaps even a
short series about that topic. But today I just wanted
to look at one specific component of this overall picture
that factors into these discussions, and that component is addiction
to technology, and addiction is a strong word. Uh, it
(02:53):
may not be fully appropriate to use the word addiction,
but it is the one that a lot of people
do use to describe the kind of behaviors I'll be
talking about. So for the purposes of simplicity, I'm going
to use the word largely because I don't have an
alternative that is as good at summing up the general meaning. Now,
(03:14):
to understand the ideas behind this addictive nature of technology,
I thought we could look at an older tech that
I've covered before, and uh, it's from a Tech Stuff
classic episode. I'm talking about slot machines. The technology of
slot machines, when you really boil it down, is pretty simple,
But the principle behind slot machines depends upon something that
(03:38):
has nothing to do with the technology when you really
boil it all down. Instead, it's depending upon human psychology
and how through technology we can exploit the way we
humans behave and make a profit from that behavior, because
that's at the very heart of what we're looking at
in tech today. Now, just in case you aren't familiar
(03:58):
with a slot machine, it's a gambling device. Typically it
has a panel that is divided up into columns, and
each column is a reel or a wheel, if you
can think of it that way. But it's a reel
that has a selection of symbols on the outer side
of that reel. So you put whatever the amount of
money is into the slot machine for that particular machine.
(04:19):
You know, there are penny slots, there are hundred dollar slots, and
then of course if you're in another country, it'll relate
to whatever currency that country uses. But then, after
you put the money in, you typically will pull a lever,
or more frequently you'll push a button, and the reels
begin to spin. The reels, by the way, can
be physical or they can be virtual. They can be
(04:41):
digital reels on a video screen, or they can be
actual physical reels that are spinning on an electromechanical spindle.
Then each reel stops. Typically they stop left to right,
so if it's a three reel slot machine, it would
go one to three, and if you have the right
symbols aligned in a specific way, you win.
(05:04):
There's no skill involved in playing a slot machine. There's
only chance. The symbols, like I said, could be permanently
affixed to those physical wheels, or it could be a
video screen that simulates a spinning wheel. I know people
who won't play those games because they think of them
as being somehow fixed, that the video screen can show
you any collection of symbols, quote
(05:28):
unquote, at once, whereas a physical machine is different. But
here's a secret: they're not really different at all. Now,
some games will allow you to add in extra features,
like you could play multiple credits at once, which typically
means you get an increased payout if you do happen
to win um, but it could also unlock extra features
(05:49):
such as numerous potential lines that could represent a win. Typically,
on a very basic machine, the three symbols have to
line up in the center of each of those slots,
and it has to be the same symbol on all
three reels, and then you win. If one of those
is offset, if it's a little too high or a
(06:09):
little too low, it's the same symbol as the first two,
but it's not in line, then it doesn't count. Some
slot machines allow you to have multiple lines of play
so that you can actually have those count. But otherwise
that's not really skill. It's just that
you're spending more money for a slightly increased chance of
(06:31):
winning something. Um now I say slightly, and even that
is a little misleading, because at the heart of a
modern slot machine, there's a microchip that determines what result
you're going to get, and typically it's based off the
exact moment you either pull the lever or you push
the button. When you do that, it stops a random
(06:51):
number generator or RNG, and that RNG
is just cycling through millions or even billions of numbers
every second, and it's just going super fast. So
the moment it stops that number is what determines whether
or not you've won. It determines which stop each wheel
or reel will hit. So you can think of that
(07:13):
reel as having numerous stops, some of which correspond to
a symbol and some of which correspond to the blank
space between symbols, and this random number generator figures out
which ones those are for any given spin, and so
it determines whether or not you win. In fact, you
could technically strip all that other stuff out of a
(07:33):
slot machine, the reels, the symbols, all that stuff, all
the bells, and whistles. You could just have a random
number generator, and you could just have a schedule or
a sheet that indicates whether any particular number represents a win,
and that would be the same thing, but you lose
the theater of slot machines, the presentation, which is part
(07:56):
of the appeal. So you could do that, where you just
have a random number generator that tells you whether
you won or lost, but that's not nearly as sexy
as the slot machine. Modern slot machines use a lot
of sensory stimuli to kind of entice and reward players. So,
as I said, there's no skill in the game, there's
not really any way to meaningfully improve your chances. You
(08:20):
could boost a payout, but your chance of actually hitting
a payout doesn't improve. Um, you could add lines of
play and that might move the needle a little bit,
but so little that it doesn't make a huge difference.
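To make that concrete, here's a rough sketch, in Python, of what a slot machine boils down to once you strip the theater away: one random number draw per reel, mapped onto a payout schedule. The reel layout and payout amounts below are invented purely for illustration; they aren't taken from any real machine.

import random

# Hypothetical three-reel machine: each reel has ten stops, and each stop
# maps to a symbol or to a blank space between symbols. Values are made up.
REEL = ["cherry", "blank", "bar", "blank", "orange",
        "blank", "cherry", "blank", "money_bag", "blank"]

# Invented payout schedule for three of a kind on the center line.
PAYOUTS = {"cherry": 5, "orange": 10, "bar": 20, "money_bag": 500}

def spin(bet=1):
    # The moment you press the button just samples the RNG once per reel;
    # everything after that is presentation.
    stops = [random.randrange(len(REEL)) for _ in range(3)]
    symbols = [REEL[s] for s in stops]
    if symbols[0] == symbols[1] == symbols[2] and symbols[0] in PAYOUTS:
        return symbols, bet * PAYOUTS[symbols[0]]
    return symbols, 0  # anything else, including near misses, pays nothing

if __name__ == "__main__":
    bankroll = 100
    for _ in range(10):
        bankroll -= 1
        symbols, winnings = spin()
        bankroll += winnings
        print(symbols, "won:", winnings, "bankroll:", bankroll)

Everything a player actually experiences, the spinning, the lights, the music, sits on top of those few lines.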
And also, when you pay more money, when you do lose,
it means you're losing more money on that spin than
you would on a regular spin. So how the heck
(08:42):
are slot machines even popular if the odds are bad
and you know the payouts are rare and it's all
based on random number generators. How can they be
so popular? And when I say popular, I mean really popular.
If you visit a casino, especially in the United States,
slot machines will typically take up about eighty
percent of the casino floor space on the playing floor,
(09:05):
which means that you only have twenty percent reserved for
things like table games, games like blackjack and
roulette and craps and that kind of stuff. So slot
machines dominate the physical space of casinos. They also can
account for between seventy to eighty percent of the profits
for the house. It's big business. There's some regions in
(09:26):
the United States where gambling is legal, but the only
type of gambling that's allowed is slot machines. So what's
going on? How come they're so popular if it's just
a random number generator. Well, at the heart of it
is the psychology behind playing a slot machine, and it's
pretty ingenious and more than a little scary. The psychological
(09:46):
design is largely based off the work of a famous
psychologist named B. F. Skinner, who heavily researched human behavior.
So back in the nineteen thirties, B F. Skinner researched
what he was calling operant conditioning. This is a method
of learning that relies upon either rewards or punishment for
(10:07):
a combination of the two, and the basic principle is
pretty darn easy to understand. If a behavior is followed
by pleasant consequences, we're more likely to engage in that
behavior again. If it results in unpleasant consequences, we're more
likely to avoid engaging in that behavior, which again follows
(10:27):
common sense, right? If you do something and something good
happens as a result, you're more likely to do it again.
If you do something and something bad happens to you,
you're probably not as eager to try it again. So
if I bite into a chocolate bar, I'm going to
have the sense of pleasure as I taste the chocolate,
and I'm gonna think, oh, I want to do that
again in the future. But if I bite into, I
don't know, an old sneaker, I'm probably not gonna enjoy
(10:50):
it very much, and I'm probably not likely to go
and do that again in the future. So by the
nineteen forties, Skinner developed what he called an operant conditioning box.
Just about everyone else uses the term Skinner box, and
it was a box that you would put a small
animal into, like a rat or a pigeon, and typically
would have a lever in the box. Um, there would
(11:12):
be, in the classic Skinner box, an electric grid on
the floor. Um, some of them would also have speakers,
a couple of indicator lights, and a food cup. And
he would put animals like pigeons or rats in the
box and train them to see how they'd behave, or not even
train them, just really letting them learn. Through this method,
(11:34):
he wanted to sort out three types of responses that
can follow any given behavior. So these responses, or operants,
fall into three categories. There's the neutral operant, uh, this has
no effect on the probability that a particular behavior would
be repeated or not. Then you have reinforcers. This is
a type of operant that would encourage behavior repetition and
(11:57):
thus increase the probability that the critter inside the box
would do whatever that thing was again in the future.
And reinforcers can be either positive reinforcers or negative reinforcers.
A positive reinforcement encourages repeated behavior in return for a reward, while
a negative reinforcement encourages repeated behavior by removing something unpleasant.
(12:22):
So some people will conflate negative reinforcement with punishment. Those
are two different things. Punishment is the third operant. By
the way, punishment is where you punish a subject for
making the wrong choice, as opposed to rewarding a subject
for making the right choice. Negative reinforcement means removing something
(12:45):
unpleasant when the person or a thing makes the correct choice.
So for example, let's say that I've got a room
and there's a button on a little table in the room,
and you are put into the room, and then the
Ramones are blasting at a very uncomfortable volume, and you
(13:08):
want to just have some peace and quiet, and you
know that if you push the button it will stop
the, um, the Ramones blasting at you. The behavior I'm
trying to get you to do is to push the button.
So you push the button, the negative stimuli goes away.
That's negative reinforcement. Uh, And when it comes back, you
push the button again. So now I have essentially trained
(13:31):
you on this behavior where you know if you push
the button you will prevent having the Ramones blasted at you.
The positive reinforcement would be you come in, You're in
a room, there's a table with a button on it,
You push the button and then you get a reward
of some sort like a slice of pizza or something.
So those are the basic operant categories: neutral, reinforcers, punishment.
(13:55):
So a typical experiment would involve putting a hungry rat
inside a Skinner box, and if the rat were to
move the lever, typically it was done accidentally as the
rat was just exploring its box. Uh, that would cause
a pellet of food to fall down a little chute
into the food cup inside the box, and so the
(14:15):
rat would then get a food pellet, and before long
the rats would learn to associate that moving the lever
meant that a food pellet would come down, and so
they would start to press the lever again and again
in order to get the rewards. Uh. Skinner also did
an experiment with negative reinforcement in which he would run
a weak current through that electric grid in the box,
(14:38):
which would cause the rats to experience discomfort. It wasn't
like a strong enough electric current to truly shock the rat,
but it was not a pleasant sensation, and hitting the
lever would shut off the current for a while. So
the rats would learn to hit the lever and turn
off the current as quickly as possible. Now that bit
doesn't really fall into what we see with slot machines,
(15:00):
because as far as I know, casinos aren't using negative
reinforcement to keep people playing, at least not yet. However,
Skinner then went a step further to learn more about
the reinforcement method of learning. He wanted to see what
would happen if you were to change the schedule of reinforcement.
What happens if you stop dispensing pellets when the rat
(15:21):
hits the lever? So first the rat learns that hitting
the lever dispenses pellets, Then you stop dispensing pellets. How
many times will the rat hit the lever before it
gives up, before it realizes that the thing that was
working is no longer working and that behavior fades away.
How long does it take for it to abandon that behavior.
Skinner worked with other behaviorists to determine things like
(15:44):
response rate, which is how frequently an animal like a
rat would hit the lever in an effort to get
a pellet, and the extinction rate, and that's the rate
at which animals would give up if they were not
being rewarded. Then, by adjusting the frequencies at which a
reward would come out, Skinner and others could determine the
balance that would encourage the most work for the slowest
(16:07):
rate of extinction. So, in other words, how frequently do
you need to dispense a reward so that the rat
doesn't give up and is incentivized to continue engaging in
that behavior. Skinner discovered that the best approach was to
use a variable ratio method, meaning there was no set
number of lever pushes needed between rewards. It would change.
(16:30):
So it might be that the first time you hit
the lever you get a pellet. Then you have to
hit the lever four more times before the next pellet
comes out. Then two times and another pellet comes out,
then three times and another pellet comes out. And they
found that by using this variable ratio approach, you could
extend the amount of time that the animal would engage
in the behavior you wanted it to do. Now, I
(16:53):
would like to tell you that we human beings are
better than rats, but as it turns out that it's
not true. We humans behave the same way, and that's
what the design of slot machines is ultimately predicated upon.
A slot machine promises payouts. In fact, a lot of
slot machines in casinos have a payout rate of or more, meaning,
(17:19):
over the life span of the slot machine, it will
pay out about of the money it takes in in
the long run. But the key phrase of that is
in the long run, not in a play session. Over
the entire lifetime of the slot machine, the likelihood of
any one spin resulting in a jackpot is very low.
(17:39):
It could be just a fraction of a fraction of
a percentage point. Lower payouts have better odds, and if
you sit at a slot machine long enough, you're likely
to at least get a low payout hit. Now, that
hit might not be enough to offset the amount of
money you poured into the machine. In fact, more often
than not it won't be, but it could be enough
(18:00):
to keep you playing the game, and that's the rub.
Here are a couple of other tricks that slot machines
use in order to keep you playing the game. The
odds of a specific combination coming up depends in part
on how many of the corresponding symbols there are on
the stops of each wheel or reel in that machine.
(18:21):
So let's say you've got an old-fashioned three-reel slot machine.
It's a physical one. You've got these three wheels, uh,
and on that outer edge you've got the symbols on them.
And the symbols include things like cherries and oranges and
the good old bar sign. And let's say that the
jackpot symbol is a bag of money. It's got
(18:41):
a bag and a little dollar symbol on it, and
that represents the really big jackpot. Each reel also has
a number of stops on it, and these are all
the possible positions the reel could be in when it
stops rotating, which is completely determined by that microchip. Some
of those stops have a symbol associated with them, you know,
a cherry or an orange or the bag of money.
(19:02):
Some of the stops actually represent the blank space between
symbols on the reel. So let's say our machine has
fifty stops per reel, so each reel could stop in
one of fifty different configurations. The bag of money represents
just one stop on each of the three reels. So
you do have a bag of money on reel one,
(19:23):
reel two, and reel three, but you have forty nine
other things on those reels as well. So what are
the odds that you would hit all three of those? Well,
it's one out of fifty times one out of fifty
times one out of fifty, and then multiplying by a
hundred to get the percentage, which comes out to a zero point
zero zero zero eight percent chance. That's the chance you have
(19:44):
of hitting the jackpot. Not great odds. But we're not
done yet. I'll explain more in just a second, but
first let's take a quick break. All right. So I
mentioned earlier that the big jackpot might have a really
low likelihood of popping up. But slot machine manufacturers have
(20:07):
come up with ways to avoid discouraging players. They built
in systems that would make it more likely that that
bag of money symbol, for example, might appear lined up
on the first two reels more frequently, which gives the
player the sense of a near miss. So you get
a bag of money, a bag of money, and oh,
shoot, cherries or a blank spot. That's nothing right there.
(20:31):
There's no payout with that, but it feels like it
was almost something because you had two out of the
three bags you needed to hit the jackpot. It's like
you almost got that big payout, but in reality, you
didn't almost do anything at all. You've got a result
based on a random number generator in the machine's circuitry
that amounted to a loss. The way the machine displayed
(20:53):
that loss to you was in a way that made
it seem like it was almost a win. So to us,
it feels like we near really grabbed that brass ring,
so if we just stick at it, we're gonna get it. Meanwhile,
other symbols might take up far more space on the reels,
symbols that would result in a much lower payout, thus
increasing the likelihood that you'll at least get some combination
(21:14):
that represents a more modest win if you play long enough.
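Here's a rough sketch that puts those two pieces together: the long-shot jackpot math from a minute ago and the weighted, near-miss-friendly virtual reels being described here. The stop counts and weights are invented for illustration and don't come from any actual machine.

import random
from collections import Counter

STOPS = 50  # fifty stops per reel, as in the example above

# The naive jackpot math: one money-bag stop on each of three reels.
print(f"uniform jackpot odds: {(1 / STOPS) ** 3:.4%}")  # about 0.0008 percent

# A hypothetical weighted machine. The low-pay cherry takes up many stops,
# and the money bag is a bit more common on reels one and two than on
# reel three, which manufactures bag-bag near misses.
def build_reel(bags, cherries):
    reel = ["money_bag"] * bags + ["cherry"] * cherries
    return reel + ["blank"] * (STOPS - len(reel))

REELS = [build_reel(bags=3, cherries=15),
         build_reel(bags=3, cherries=15),
         build_reel(bags=1, cherries=15)]

def spin():
    return [random.choice(reel) for reel in REELS]

if __name__ == "__main__":
    tally = Counter()
    for _ in range(1_000_000):
        s = spin()
        if s == ["money_bag"] * 3:
            tally["jackpot"] += 1
        elif s[0] == s[1] == "money_bag":
            tally["near miss"] += 1
        elif s == ["cherry"] * 3:
            tally["small win"] += 1
    print(tally)

Run it and you get the shape described above: the jackpot lands fewer than a hundred times per million spins, the bag-bag near miss shows up a few thousand times, and the modest cherry win comes up tens of thousands of times.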
At least that's the way it looks on the outside, right?
The odds of hitting a low payout are a lot better
than that. So we get two types of rewards. One
is these small payouts, which, if the machine is
working properly, tend to be less than what we've poured
(21:36):
into the machine in the first place, or at least
the payout is less than what the machine has been
collecting for that day. And the other is these near misses
that make us feel like we're getting closer to that goal. Now.
Once in a blue moon, these machines will do the
jackpot payout. Uh, that's part of what keeps people playing
these games is the promise that it can happen, but
(21:58):
it doesn't happen frequently. The odds alone tell you
that it's pretty rare. Now, in addition to all that,
we humans are also really good at recognizing patterns. This
becomes more important with video slot machines. In fact, we're
so good at recognizing patterns we will frequently perceive a
pattern where there is no pattern, or at least no
pattern that has been consciously created. So, for example, when
(22:21):
you look at clouds and you see faces in them
or other complex shapes, that's your brain applying a pattern
to something that is actually random and not ordered. With
slot machines, we're looking at the creation of patterns. Three
symbols in a row would be a pattern. So there's
another little psychological boost that keeps us playing even if
we're not in a lucky streak, because we're looking for
(22:43):
those patterns. We find that very satisfying. Now, there are
a lot of people who give advice on how to
play slot machines well, but at the very end of
the day, you have to acknowledge the slot machines are
designed to take money. There's no way to skillfully play
a slot machine. You can use some wisdom to help
pick out machines that might have more favorable odds for
(23:03):
a payout, but those odds are still set so that they
ultimately benefit the house. You have very few decisions when
it comes to playing a slot machine, and generally speaking,
in house games, in casino games, gambling games, you narrow
the odds between you and the house in those games
where you have more options as a player, assuming that
(23:25):
you are pursuing optimal play, that you're playing without mistakes.
That's asking a lot because optimal play is a pretty
tough skill to develop. But with slot machines, there's not
really much you can optimize, so you're not really going
to narrow those odds very much. Now, in addition to
all I've just said are the more flashy slot machines
that have come out over the last couple of decades.
(23:47):
Many of these include using licensed IP to attract
players to them. So if you walk around a modern casino,
you're gonna see a lot of slot machines that are
modeled after licensed material, licensed from popular movies like
the Lord of the Rings franchise. That's one that my
friend Shannon really likes, or television shows like the series Friends.
(24:07):
These machines frequently include music and video clips to help
lure people over to play them. They're more demographically
aligned to capture certain types of people, and then the
gameplay is what keeps them there. Um. They might also
include bonus features that make the game more exciting to
play without significantly increasing the likelihood of a big payout. Okay,
(24:30):
so I've covered the psychology of slot machines, So why
did I spend so much time on that. It's because
many developers of both software platforms and apps rely upon
those same sort of designs when they're creating their work.
They design the experience to have that same sort of
reward system for users, and that encourages users to spend
more time on the platform, which ultimately benefits the developer
(24:53):
in some way. Now, that way could be through advertising.
If you go to advertisers with data about how
many users you have and the average amount of time
those users spend per day on your product, you've got
a pretty powerful tool in your sales toolbox. If you've
got a lot of users that spend
a lot of time on your platform, you
(25:14):
can ask for better rates, and you can get better sponsors,
and you can assure those sponsors that people using your
services are spending more time doing that and less time
doing other things. So advertisers want eyeballs, right? If the
eyeballs are all on your app, they're not in other places,
So they want to be where the eyeballs are. That's
your app. So it gives you a lot more leverage,
(25:35):
and it allows you to have much better rates, sales rates.
You're gonna make a lot more money, So you've got
a really strong incentive to encourage people to look at
your app or service for a long time and not
at other stuff. Another possibility is that maybe you're using
pay for content or pay for elements in your service,
and this could be done in place of or in
(25:58):
conjunction with advertising. A lot of mobile games use this method.
You play the game and you make a little progress,
so you receive that little psychological reward for achieving things
in the game, little dopamine going through your system. But
sooner or later you hit a barrier, maybe you run
out of turns or lives in a game. Maybe you
need a boost to get past a certain point in
(26:18):
the game. Whatever the case, you typically have two options
in most of these types of games. One is that
you wait a given amount of time before you can
try again, with no guarantee of success. So you might
have to wait thirty minutes or an hour before you
can try again and you aren't sure if you're gonna
make any progress. Option two is that you can pay
(26:40):
some money and in return you get immediate gratification, either
you get more turns or lives, or maybe some sort
of in game asset that makes it easier to get
beyond that part of the game. Now, this isn't just
something we see in mobile games. It's how the business
model of shareware works too. When I was in high school,
games like Castle Wolfenstein 3D used that model. Developers
(27:03):
would release the first level or the first part of
a game for free so you could easily get your
hands on that version and you could play through to
your heart's content, and when you reach the end of
whatever that free section was, you would typically get a
message that would encourage you to purchase the rest of
the game if you enjoyed what you had done. And
it was an effective tool and one that a lot
(27:23):
of gamers actually approved of because it gave them a
chance to try out a game before committing real money
to it, and if the game was good, many people
were willing to cough up the cash to get the rest.
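Jumping back to the mobile game version of this for a moment, the wait-or-pay gate is simple enough to sketch. This is a toy model, with invented timer lengths and prices, just to show the shape of the mechanic.

import time

class EnergyGate:
    # Toy wait-or-pay gate: you get a few attempts, then you either wait out
    # a cooldown or pay to refill immediately. All numbers are invented.

    def __init__(self, max_attempts=5, cooldown_seconds=1800, refill_price=0.99):
        self.max_attempts = max_attempts
        self.cooldown_seconds = cooldown_seconds
        self.refill_price = refill_price
        self.attempts_left = max_attempts
        self.cooldown_until = 0.0

    def try_level(self):
        now = time.time()
        if self.attempts_left == 0 and now < self.cooldown_until:
            wait = int(self.cooldown_until - now)
            return f"Out of lives. Wait {wait}s or pay ${self.refill_price} to keep going."
        if self.attempts_left == 0:  # the cooldown has elapsed
            self.attempts_left = self.max_attempts
        self.attempts_left -= 1
        if self.attempts_left == 0:
            self.cooldown_until = now + self.cooldown_seconds
        return f"Playing! {self.attempts_left} attempts left."

    def pay_to_refill(self):
        self.attempts_left = self.max_attempts
        self.cooldown_until = 0.0
        return f"Charged ${self.refill_price}. Attempts restored."

if __name__ == "__main__":
    gate = EnergyGate()
    for _ in range(6):
        print(gate.try_level())
    print(gate.pay_to_refill())

The whole design is in those two methods: the delay is manufactured, and paying is what makes the delay go away.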
So while the shareware model followed a bit in the
wake of Skinner's work, it wasn't seen as predatory because
you could just say like, well, you know, I played it.
(27:44):
It didn't really impress me. I'm going to walk away.
Now that's not necessarily the case with apps and services
that are popular today, whether it's the algorithm that serves
up a feed of content on a social platform cough
Facebook cough, or the propensity of game features like loot
crates that became incredibly pervasive over the last couple of years.
(28:05):
We've seen developers exploit the same behavioral tendency, and we
as consumers, have engaged in those behaviors again and again,
just as predictably as the rats in Skinner's experiments. Now,
this also plays into the larger concept of game theory.
How do you balance out different factors to maximize engagement
(28:27):
with games? Developers have to balance challenges and rewards carefully.
If a game is too easy, players may become bored
and move on to something else. If it's too challenging,
they become frustrated and they won't stick with it. So
it needs to be challenging enough to the player so
that when they win a game or achieve a goal,
there's a sense of accomplishment, and this needs to be
(28:49):
repeatable to keep the experience going. The same theory can
be applied to other stuff, not just to explicit games.
So let's take Tinder, for example. The dating app shares
a lot in common with slot machines. Users flick
through photos of other users, swiping left or right, and
there's a quick psychological reward. Either you find someone attractive
(29:11):
or you don't, and you pass judgment, then you move on.
So if that person has similarly passed judgment upon you
and found you attractive, and you found them attractive, then
you can connect. All the elements are there. And I'm
not getting on a high horse here. I'm as guilty,
or probably even more guilty than any of you guys
out there. I find myself having to be mindful to
(29:33):
not be actively using some device that's got a screen
on it. Otherwise that's where you're gonna find me, is
behind a screen. I try to make rules for myself,
and they are hard for me to follow because I'm
pretty darn addicted to screens. It takes effort for me
to not whip out my phone as soon as I
sit down to eat, for example, or if I'm in
(29:54):
line anywhere, I want my phone out in my hand.
But even when I'm at home, I'm usually either
playing a game or watching something online. I have numerous
tabs open all the time in my browsers. I play
some mobile games that are definitely built on top of
this understanding of human behavior, and I find myself growing
more disenchanted with it overall. And we're starting to see
(30:15):
a bit of backlash against this trend in broader culture.
This includes articles, books, public speaking engagements, politicians, activist organizations,
and more. There are lots of articles out there pointing
out our tendency to spend more time in front of screens.
A recent article in The Chicago Tribune cited a study
conducted by a nonprofit group called Common Sense Media that
(30:38):
found US teenagers are spending about nine hours each day
on some sort of digital media. And adults, don't feel
good about yourselves, because we're even more extreme: eleven hours
per day devoted to screen time. Now, this prompts us
to ask the question, is that much time in front
of screens harmful to us? Should we be concerned that
(31:00):
we're dedicating so many waking hours to interacting with digital
media that it is encouraging this behavior and rewarding us
in psychological ways for engaging in it. Could we actually
be hurting ourselves and each other through these practices. Now,
as it turns out, there are a lot of people
who have stuff to say about this particular set of ideas,
(31:21):
and as you might imagine, the responses range from well,
of course, this is dangerous and harmful behavior and we
need to do something about it, to a more cautious
statement like, we honestly don't have enough empirical evidence to
support any sort of conclusion on the matter. So as
tempting as it is to come to one, the responsible
thing to say is we don't know enough yet. So
(31:41):
in our next section we'll dive into some of those arguments,
and some of them will appeal to common sense, But
it's good to remember that common sense isn't always right.
Sometimes we'll draw a conclusion because we all see a
correlation that ends up being impossible to support upon further study. Remember,
correlation and causation are not the same thing. Sometimes the
(32:02):
correlation can indicate a causal sort of relationship between two things,
but it's not always the case. So when we come
back from this break, we'll look further into this issue
and we'll be right back. Okay, So it's pretty clear
(32:24):
people are spending more time with digital media than ever before,
and if we're dedicating time to those screens, we might
also be taking time away from other tasks. So at
some level you could argue that spending time on digital
media can have a negative impact if we're neglecting real
world issues while we are engaged in the digital world.
(32:46):
But to make that argument convincingly, we need more data. Further,
there are many who might argue that because we're spending
so much time on digital media, we're hurting our own
ability to form meaningful connections with other people in the
real world. There's a growing concern that we're relying more
on online interactions than ones in a physical space, and
(33:08):
we're seeing it reflected in multiple behaviors, such as relying
more heavily on text and eschewing phone calls. Most of
the people I know will not answer the phone, even
if they know who it's coming from. They would rather
just have a text. They actively avoid having to talk
on the phone. It's a social interaction that they don't
(33:28):
want to have. So we're choosing interactions that don't connect
us with each other in more traditional ways and arguably
more meaningful ways. But the jury is still out over
whether or not this constitutes actual harmful behavior. Is an
online relationship less meaningful or emotionally fulfilling than one done
(33:49):
in real space? Can we even draw any sort of
conclusion about that in general or does it depend upon
the actual situations and individuals involved on a case-by-case
basis? It may be that we cannot make
a generalization about the health of an online relationship versus
a meatspace relationship. Now, from an armchair psychology standpoint,
(34:14):
you could argue that the online world has deprived us
of a sense of intimacy that we tend to crave.
And I'm not speaking solely of romantic intimacy, though that
can play a part too. Rather, I'm talking about the intimate
connection between two people in a one on one interaction,
not a romantic intimacy, but just a human connection. It
(34:36):
is very tempting to argue that the rise in popularity
of things like ASMR-style videos in
which a creator focuses their attention on the viewer in
some way in an effort to soothe, relax or tingle
that person, is evidence that we in general lack the
sort of connections in our day to day lives, that we
(34:58):
crave that connection and we're not getting it, and that
has given birth to an entire genre of videos that
have become incredibly popular over the last five years. If
you do a quick search on YouTube for the terms
personal attention, you're going to find thousands of videos catering
to that audience. These videos tend to feature one or
(35:21):
more creators speaking to the camera and microphones, which
serve as a stand-in for the viewer, so they
are simulating the sort of interactions people would otherwise have
in real spaces. It's possible that the popularity of this
genre is in part due to a decline in such
meaningful interactions happening in the real world, but without an
(35:44):
actual study investigating the matter, it's not scientifically responsible to
make that claim. You could say, I suspect I have
a hypothesis that this is because of that, but until
I test it, I cannot really be certain. Also, these
sorts of studies end up being very tricky to arrange because
(36:04):
there are a lot of potential variables that could affect
the outcome, and isolating all those variables is notoriously difficult
to do. These sorts of, uh, survey-based and
social scientific studies are very, very challenging. It's much more
challenging than, say, taking a substance and asking, is this flammable?
That's a pretty easy claim to test. These sorts of claims are
(36:27):
much trickier. Now. We might also draw the conclusion that
all this technology is hurting our ability to focus on
specific tasks and interactions. When our world is filled with
notifications and demands for our attention, are we giving anything
our full focus at any given time or are we scattered?
There are a lot of articles suggesting that that's the case,
(36:48):
that we're not capable of really focusing on things anymore,
though again, few studies really back up this claim, And
I should also point out that this doesn't mean the
claim is false, That just because it's not yet supported
by scientific research doesn't mean it's wrong. It just means
we can't be certain. You can make a claim and
not have evidence, and you can end up being right,
(37:11):
but you have to have the evidence to prove whether
or not you're right. People who follow me outside of
tech stuff might know that I used to do a
show called Podcast Without Pretense with my co hosts Eric
Sandean and I as actar, and the show gradually evolved
into a sort of jokey experiment. Each week, one of
us would pick a movie, usually a really poorly reviewed
(37:34):
film that we could watch on a streaming service like Netflix,
and then we would set time aside to watch the movie.
Each of us would make time in the week to
try and watch the movie, and the rule was you
could not have any distractions present while you tried to
watch the film. No second screens, no smartphones out, no laptops,
(37:54):
nothing like that. It was just you and the movie.
And then we'd each write down how long into the
film we lasted before we either grabbed a second screen
because we didn't want to pay full attention to the
film anymore, or we just gave up entirely and turned
the film off, which happened more rarely than you would think.
Usually we would just give up and grab a second
(38:15):
screen and keep on going. Now, this was in reaction
to that same feeling that our dependence on technology meant
that when we do something like try and watch a show,
we'd have another screen open at the same time, and
it would leave us feeling like we hadn't really watched anything,
that we were just sort of there while it was happening.
So this was sort of us testing our ability to
(38:37):
focus on a topic even if that topic didn't deserve
our attention, and dedicate all of our attention to it,
and it wasn't easy, even when the films weren't absolutely awful. Similarly,
claims that screen time is harmful in and of itself, while
appealing on a common sense level, haven't had a lot
of critical scientific study. It's a challenging thing to explore.
(39:00):
You might see a rise in screen time and a
similar trend in people seeking help to treat depression, so
that is a tempting connection there. But does that mean people
are becoming more depressed as they spend more time interacting
with screens or could it mean that depressed people seek
out screens as a means to help alleviate their depression.
(39:21):
One does not necessarily cause the other, and it could
be that both trends have no meaningful connection at all.
They're both on the rise, but maybe they both stem
from a common source that doesn't have a connection between
the two. Now, I say all this to remind myself
just as much to remind all you that these are
complicated issues. I have my own opinions, which typically lean
(39:44):
toward the idea that more screen time and less interaction
in real space is probably not the most psychologically healthy
behavior we could engage in. But I also must admit
that I don't have any scholarly evidence to support this,
and that most of my conclusions are really drawn from
my own personal experience, and as any good scientist will
tell you, anecdotal evidence isn't really evidence at all. At
(40:09):
the same time, for my own mental health, I feel
the need to do something, and so I've been trying
to sort of wean myself off of my dependence on
social media and screens to see if that has a
positive impact on my life. I'm not the only one
drawing these sort of tentative conclusions, and some people are
far less cautious in connecting the dots than I am,
(40:31):
but they also are more educated in the field than
I am, so it's not meant to be a dig. So,
for example, the Chicago Tribune piece that I mentioned earlier
in this episode, the author, Drene Dodge and McGee is
a psychologist and a researcher, and she believes that there
is a connection between an increased dependence upon technology and
the rise of depression among certain populations in the United States.
(40:54):
She connects increased usage with declining mental health, and she
may be right, but I still worry that without some
carefully designed studies, we can't be sure that that's the
actual flow of behavior. That is, that people begin to
experience a decline in mental health as they use more
and more technology, instead of people using more and
more technology as a result of them already dealing with
(41:17):
a decline in mental health. Like, without establishing the actual
causal relationship there, I don't think you can make any
real conclusions. Now, another factor we should take into consideration
is what is the actual behavior being reinforced through these
various technologies and services. If the purpose is to capture
eyeballs for the purposes of generating revenue, and nearly every
(41:41):
case does boil down to that if you go far enough,
the best you might be able to say is that
it's not necessarily doing direct harm to someone. If the
design is meant to convince someone to pour money into
a service, you could argue that the service can be
financially harmful to those who are vulnerable to the reinforcement cycle.
And while social networks can help people connect with one
(42:03):
another and stay in touch, if the algorithm that populates
a news feed is selecting which posts you see and
which ones you don't in an effort to convince you
to spend more time on that platform, you're not really
engaging with those friends and loved ones. Instead, you're seeing
a curated list that isn't meant to create meaningful connections,
but rather keep you on the platform longer so that
(42:25):
the platform can serve you more ads. If you're lucky,
just like with a slot machine player, then some percentage
of those posts you are seeing are actually meaningful and
do help you create those close connections, but otherwise you're
being fed stuff with the intent of keeping you there.
From a business standpoint, I can totally understand that design.
(42:46):
Businesses generate revenue, and generally speaking, you want as much
revenue as you can earn. You want that revenue to
grow year over year to return money on the investment
for building the business in the first place. Publicly traded
companies face growth targets each year with the goal of
creating a return for shareholders, But that just means that
the business has to find ways to make more money,
(43:08):
not to make money in a responsible or a compassionate way.
As long as the business isn't outright violating any regulations
or laws, it's pretty much fair game. And this has
created the environment we see today and has enabled companies
to create services that incentivize us to use them more
and more. They can serve up ads, and in the
case of companies like Facebook or Google, they can use
(43:28):
the collected data that we generate to great advantage. So
even if the use of these strategies isn't directly causing
us harm, it is certainly an attempt to manipulate us
into being ever more dependent upon those services, and it
totally works, just as it worked on those rats in
Skinner's experimental box. Now there are people out there who
(43:50):
are dedicated to breaking this dependence on technology. In some cases,
it all starts with more tech. There's an app called Moment,
for example, and it tracks how much you use your
phone in her tablet and includes a breakdown of which
apps you use most frequently and for the longest amount
of time. Sometimes just getting quantifiable data can help you
get some perspective. For some people, the results might not
(44:12):
be a surprise or even alarming, but for others it
could serve as an incentive to try and reduce the
amount of time spent on devices. There are books and
camps dedicated to helping people break free of technology dependence,
and I think that can be well intentioned. But I
also want to point out that the world we live
in skews pretty darn heavily in favor of people who
(44:33):
are making use of technology. Our communication systems, means of
finding work, uh, commerce systems and more grow increasingly tech dependent.
I think it's not realistic to truly shed your use
of technology, to go off the grid entirely, and still be
meaningfully interacting with society at large. I suppose it's possible,
(44:57):
but it's really really hard to do. I do think, however,
you can make some choices to reduce your dependence on
technology, if that's your goal. Now, for me, I plan
on stepping back a significant amount later this year. And
I'm likely deactivating my Facebook account in June of two
thousand nineteen. And this is a choice I'm making for myself.
(45:18):
I'm not urging anyone to do likewise. I don't pretend
that I have the answer for everyone out there. Heck,
I'm not even certain that cutting back on screen time
and social platforms will have any meaningful positive effect in
my day to day life, but it's something I'm going to
try in an effort to be more engaged with the
people around me and my real world community. So we'll
see if it makes an impact. In the meantime, if
(45:42):
you guys have any suggestions for future topics of tech stuff,
or you have any questions or comments for me, reach out.
You can do so through email. The address is tech
stuff at how stuff Works dot com, or on Facebook
or Twitter. The handle for both of those is TechStuff
HSW. I'm eager to hear what you guys think
about this stuff. Also, remember you can pop on over
to our website, tech Stuff Podcast dot com. I promise
(46:05):
I don't use any of those tricks to try and
keep you on that site forever, but you can search
the archive and look at all the past episodes we
have up there in case you want to see if
there's something specific in our library. And you can also
pop on over to our merchandise store if you want
to get a tech Stuff T shirt or a hat
or a coffee mug or something like that. And remember
(46:25):
every purchase you make goes to help the show. We
greatly appreciate it, and I'll talk to you again really soon.
TechStuff is a production of iHeartRadio's How
Stuff Works. For more podcasts from iHeartRadio, visit
the iHeartRadio app, Apple Podcasts, or wherever you
listen to your favorite shows.