
May 2, 2019 · 33 mins

Welcome to the A.I. revolution that is already transforming our lives, for good and evil. But what exactly are we sleepwalking into? We start by investigating the connections between online dating, terrorism, and screen addiction.

In this episode we hear from: Tristan Harris of the Center for Humane Technology, Gillian Brockell of the Washington Post, Yasmin Green of Jigsaw, and Dr. Helen Fisher.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Sleepwalkers is a production of iHeartRadio and Unusual Productions. So I mean, let's just ground the conversation for a second. There's two point two billion people who use Facebook. It's about the size of Christianity. There's one point nine billion people who use YouTube. It's about the size of Islam. That's Tristan Harris. When I walk into

(00:22):
that Facebook room, you know, I'm walking into a room that's designed to never make me leave, with a thousand engineers using supercomputers to calculate the perfect seductive thing to put in front of my brain. And Tristan knows a thing or two about seduction. I was a design ethicist at Google and have spent a decade understanding some of the invisible forces that shape the way that we see and

(00:44):
make sense of the world and choose in the world. In case you're wondering whether they teach this stuff in school, well, they do at Stanford. I studied at this lab called the Stanford Persuasive Technology Lab that teaches engineering students about the entire discipline of persuasion. Everything from, you know, how to persuade dogs with clicker training, so click click, you know, you've got the food, click click, reward, all the way

(01:05):
to casino design and slot machines, and how do you change the lighting to get people to buy things? And then, you know, supermarket design and choice architecture and putting the candy in the final aisle, because that's the thing that gets you to buy. Tristan and his classmates used what they learned at the Persuasive Technology Lab to define how we live online. My partners in that class in two thousand and six at Stanford were the founders of Instagram.

(01:27):
Half my friends built some of these products, including the like button. Think about that for a moment. The people who designed the apps we use the most learned how to make them as appealing as possible by borrowing the science that trains dogs and gets people hooked on gambling. And now all of that is baked into a device that's basically become an extension of our body.

(01:48):
We check our phones about eighty times a day, and that's the conservative number. And you know, their incentive is to calculate what is the perfect, most seductive thing I can show you next, the most seductive red color for that notification, or the most seductive video that you know you can't help but want to watch next. At a time when technology is changing faster than our ability to

(02:09):
understand it and seeping into nearly every corner of our lives, what kind of murky future are we sleepwalking into? What can we do to take back some control? And how is emerging technology changing our lives for the better? This is Sleepwalkers. So welcome. I'm Oz, and I'm doing this show

(02:39):
because I'm fascinated with how we relate to the technologies that are changing our lives, whether they end up being something like Dr. Frankenstein's monster or AI-crafted seltzer water. In this episode, we look at how our technology gets into our heads and we take on some distinctly modern monsters, from the trick to successful online dating to deterring potential

(03:01):
terrorists with invisible technology. And I'll have some company along the way. Hi. That's Karah Preiss. She's my friend and she hosted a show called Talk Nerdy to Me for the Huffington Post. Um. So, I saw this article in the Times, the New York Times, about how all of these Silicon Valley execs are taking away their kids' screen time,

(03:23):
you know, telling their kids' nannies that the nanny can't use screens. And there's actually this quote from Mark Zuckerberg's former executive assistant, this woman Athena Chavarria. She says, I am convinced the devil lives in our phones and is wreaking havoc on our children. Is that like the devil is in the details, or is it like the devil who lives in hell? No, I think it's the red

(03:45):
devil who wears no clothes, that guy's in the phone. Yeah, because you know what he does. He leads people into temptation. That's right, that's the devil's show. That's really right, and it is. Have you ever seen a child on an iPad? Have you ever seen me? Well, have you ever seen me? Yes. Mesmerized. Clickity clack, don't come back. Actually, Sean Parker, who was the first president of Facebook, said God only knows what it's

(04:05):
doing to our children's brains. God or the devil. Here are all of these technology executives who have built basically what we use every day to do everything from being in touch with our friends to meeting our loves to getting from A to B. And suddenly they're saying, well, you guys go ahead, but not for my kids. Yeah, yeah. I

(04:26):
think it's interesting that this feels very similar to the
conversation we have surrounding you know, junk food, sugar, tobacco, alcohol,
that these are things that your parents are supposed to
protect you from up until a certain point, whether it's
when you go to college or whatever. And now our
technology use, whether it be how much we're on social
media or how much time we're spending gaming on our screens,

(04:48):
is something that parents have to regulate. The problem is, one, like I'm too old for my parents to regulate it, so what am I going to do? And also, like, my parents use it. So it is sort of like sugar in the sense that there are parents that tell their kids not to eat sugar who have major sugar addictions. We had those conversations in the past, and now we're starting to have them about technology. But how do we tell our kids to regulate their use of technology?

(05:09):
And we can't regulate our own. And what's to say to us that we can't regulate our own technology? Here's Karah with a story of what happens when ad services think they know everything about you, but actually get it all wrong. I just wrote the whole thing in

(05:30):
like thirty minutes, just, you know, banging the keyboard, just sobbing the whole time. Dear tech companies. Gillian is a social media power user. She uses Twitter for work at the Washington Post and Facebook for her social calendar. Even her wedding is on YouTube. And in late twenty eighteen, she was

(05:51):
entering the final months of her pregnancy. She and her
husband Bobby were doing the normal what to expect when
you're expecting stuff, preparing for their newborn son to come
into their lives. I can remember about two days before
everything happened, soaking in the tub, and I was just thinking, God,
this has just gone perfectly, Like I've never been accidentally

(06:14):
pregnant despite not always being responsible, and we had no
problem getting pregnant when we wanted to, despite my being
thirty eight. So I was just like, wow, one and done.
This totally worked. A couple of days later, I was
feeling some pain, so I called the doctor's office and

(06:36):
I said, well, I haven't felt him move today, and they said, okay, come in. And so I went in and they put me on the sonogram, and I knew immediately. I was just screaming no, no, no. A week or so after Gillian's son died, she was being haunted by

(06:58):
targeted advertising that assumed she had given birth to a healthy baby boy. When I would scroll through Facebook or Instagram, I would get maternity wear ads, and so I was like, okay, I have to teach it that I'm not pregnant anymore. Gillian did the only thing people can do with ads they don't want to see. She clicked the three dots in the corner of the ad

(07:19):
and gave feedback. So I would say I don't want to see this ad, and then it would say why, and I would say, because it's not relevant to me, which is like so hard to acknowledge. But Gillian learned the hard way that the Facebook algorithm isn't programmed for the outcome of a stillbirth. Then when I got, when I got that Experian email, I just, I can't

(07:41):
even, like, I just snapped. Finish registering your baby for lifelong credit tracking. I just, I was just like, you have got to be kidding me. Gillian was angry, and she knew she had a platform, so she wrote the letter and posted it on Twitter. Dear tech companies, I know you knew I was pregnant. You probably saw me

(08:02):
googling maternity plaid and baby-safe crib paint, and I bet Amazon dot com even told you my due date when I created that Prime registry. And silly me, I even clicked once or twice on the maternity wear ads Facebook served up. What can I say, I'm your ideal engaged user. But didn't you also see me googling baby

(08:25):
not moving? And then the announcement posts with keywords like heartbroken and problem and stillborn and the two hundred teardrop emoticons from my friends. Gillian's letter struck a chord. It got retweeted twenty-eight thousand times, and shortly after Gillian posted her letter to Twitter, she received a response tweet from Facebook's VP of Ads, Rob Goldman.

(08:49):
I'm so sorry for your loss, he said, and your painful experience with our products. We have a setting available that can block ads about some topics people may find painful, including parenting. It still needs improvement, but please know that we're working on it and welcome your feedback. So I turned it off, and within a few hours I got

(09:09):
an ad for adoption agencies, and the next day I got an ad for, um, father-son matching onesies. I've taken screenshots too. Every time I get one, I just take a picture of it, just like, it's not working. Yeah, the adoption one, g f y. You know,

(09:33):
Rob Goldman's advice may have changed the ads, but it
didn't solve the fundamental problem. The algorithms couldn't stop reminding
Gillian of her loss. Other people were like, well, don't
be on Facebook at all, you know, don't do any
of those things. And first of all, I don't think
that's realistic, especially because for Gillian, Facebook was also providing

(09:53):
comfort through it all. It was helpful to have my friends chiming in and saying, we're so sorry, you know, what can we do? And you know, a bunch of people made an Uber Eats fund for me and my husband so we could just have food delivered for when we got home. It's like, you know, the twenty-first century bringing over lasagna. And this was all organized on Facebook. Yeah,

(10:16):
it was on Facebook, and ironically it helped her connect with people going through the same thing. A woman who used to go to my church, she had a stillbirth on Christmas Day a few weeks after I did, and the memorial service for her baby was posted to Facebook. I wouldn't have known about it, and I wouldn't have gone if I hadn't checked Facebook. Gillian isn't planning to

(10:39):
delete social media. She just wishes it could be better. She's still going to use it to keep in touch with friends and family. But whereas she was once comfortable with her wedding on YouTube, there are now some things she won't be comfortable sharing. And I have to say, now having this experience, you know, I knew it was being tracked, but having the tracking revealed to me in such a

(11:01):
garish display. If we do have a living child someday, I think that's going to be actually really easy for me to just be like, no, Internet, you don't get to have that. Even after the experience, Gillian doesn't hate Facebook.

(11:24):
At the same time, she wouldn't want her future children tracked, a bit like the Silicon Valley execs we talked about earlier taking their own children off social media. But what if the same targeting technology that harmed Gillian could help others and keep people who are hurting from damaging themselves and society? And what if it wasn't a broad group like new mothers being targeted by ad bots, but a

(11:47):
specific group of people having their personal search results changed? I heard about a program at Alphabet, Google's parent company, trying to do just that, and I wanted to know more. We'll get there after the break. My name is Yasmin Green.

(12:09):
I am the director of Research and Development at Jigsaw.
We are sitting in one of our rooms in our
New York office. The name of the meeting room is
Smoke Signal. Communication technology? Yeah, they're all named after different
communication technologies. Jigsaw is a part of Alphabet and Google

(12:29):
that's focused on technology that addresses big security challenges. So how did Jigsaw begin? Two decades ago, when we were designing the platforms and apps, we were not really imagining that repressive governments and criminals and terrorist groups were, like, salivating and innovating to use these platforms

(12:50):
just as well as everyone else. Um. And now we realize that we really need to understand their goals and their activities if we want to keep people safe online. It's Yasmin's job to look at how those repressive regimes, criminals and terrorist groups operate, find trends, and try to counter them. Our approach is to try to understand the human journeys and to understand the role of technology, and see

(13:10):
if we can build technology that stops people getting to the point that we would consider them a violent extremist or a terrorist. So if you want to understand terrorists, where do you start? Well, to better understand extremism, Yasmin's team used some somewhat extreme methods themselves. One of the first things we did in two thousand and eleven was we brought together eighty-four former extremists and survivors of terrorism.

(13:36):
And we had Islamist people who'd gone to fight in Afghanistan, or al-Shabaab from Somalia, Nigerian Islamist militia, but we also had a former violent Israeli settler. We had former Christian militia. We got them all together in one place. Can you imagine the security concern for Google in helping us convene everyone in one place? We had snipers on the roof. We had six months of

(13:57):
vetting for everyone. We cared about them being public and associated with an effort to bring people out of those extremist groups. And beyond gathering all the former terrorists in one place, Yasmin's team also invited the victims of terrorism. We had survivors. So we had people who celebrated nine eleven and people who had lost family members on nine eleven.

(14:17):
We had a lady who said that she had woken up in hospital with a name band on her wrist that said gender unknown, because her body was, like, so mutilated. The point of bringing everyone together was to say, what is common in the human radicalization path, even when you're looking across ideologies, and does technology have a role to play?

(14:39):
In other words, Yasmin wanted to understand how people become terrorists and what role the internet plays. But Jigsaw's role was never just to have summits and collect the information. They wanted to take action, and ISIS was the obvious place to start. In March of this year, ISIS actually lost the last of its remaining territory, but when the

(15:00):
caliphate was at its height, thousands of foreign fighters were recruited online. They left their homes in places like Germany, England, America for the battlefields of Syria. This trip that we took to Iraq, where we spoke to defectors of ISIS, was really instructive for us. We had these face-to-face conversations with these young men who had left home,

(15:20):
gone to Syria and Iraq, trained with ISIS, did their religious indoctrination with ISIS, got their postings, some of them as suicide bombers, some of them as technical drivers, some of them as night watchmen. And they had realized that it was all a lie. So they're telling us about it, and we say, if you knew the day that you left everything that you know now, would you still have gone? And they invariably said that they would still

(15:44):
have gone. Honestly, I'd already finished Yasmin's sentence in my head. I thought she was going to say they invariably said that they wouldn't have gone. But in fact, knowing they were facing not just heroic martyrdom but bread queues, lack of medical care, splintered leadership, they still

(16:04):
would have left the comfort of their home countries again
places like America, England, Canada, to travel to Syria, an
active war zone. So Jackson needed to understand what what
ICE is doing to lure people in that was so
very powerful. We identified the recruitment narratives and they were
largely around people who thought that this was the devout, correct,

(16:25):
religous thing to do, that this wasn't mslim utopia than
was going to lead to a healthier, happier life than
remaining in the West. People who are interested in the
military conflated. So there were several and we generated a
targeting strategy to reach these people based on their online browsing,
specifically like their online searches take a moment to absorb that.

(16:47):
Based on online browsing history and searches, Jigsaw were able to identify potential extremists and serve them ads to push them towards alternative content. But wouldn't that raise alarm bells for the people doing the searching? Yasmin and her team had to be subtle. If you were interested in fatwas about jihad, religious edicts about jihad, we would

(17:09):
give you those, just not the ones that ISIS was proposing, but alternative ones. Or if you were interested to understand what life is like in the Caliphate, let's show you citizen journalism of the long queues for bread or the state of the hospitals in the Caliphate. So you're still getting something that addresses your interest. Um, it would just be an alternative information source to the one

(17:29):
that you were looking for. And it was really important to us to make sure that we were targeting, you know, and finding people who were really at risk, as opposed to just people who are interested in ISIS. That's right. Looking at browsing and search history, Jigsaw and Google can look at two different people looking up fatwas online and only serve redirecting ads to one. When you get a

(17:49):
level more granular, you can start to set up a targeting strategy that really does differentiate mainstream interest in this group from the people who are, you know, sympathetic and potential joiners. It has turned out to be really effective. Since my conversation with Yasmin, a terrorist attack on two mosques in Christchurch, New Zealand killed fifty people and

(18:11):
focused the world's attention on the threat posed by far-right terror. I emailed with Yasmin and she told me that she and her team are, quote, attending to the far-right threat with increased urgency. I think it's nice that they're trying to fix things, but you know, it's like the Anheuser-Busch family funding a study on alcohol, right?

(18:32):
I guess, except we kind of know alcohol is bad, whereas we can still hope that the internet is neutral, although it is clear the Internet has become a key method of radicalization for terrorists of all stripes. And so for me at least, it's good to know that the technology companies are acknowledging that and smart people like Yasmin are working on it. Although it does also mean living in a world where our main internet search providers also

(18:54):
edit the results. Do we want Google controlling what we know about the world or recommending what else we should know? Right. And then what if Google turned that immense power to influence directly onto me or you? More on that after the break. Tristan, we've been talking about Jigsaw and their

(19:18):
manipulation. When you look at elections and the way extremist groups are using technology to radicalize and proliferate, should tech companies be trying to offset some of the damage they're doing by intervening for good? Yeah. Well, this is an incredibly nuanced topic, because essentially the way to see what's going on with technology today is there's a massive asymmetry.

(19:41):
It would be easy for you and I and Yasmin to say clearly ISIS is bad and terrorism is bad, so let's redirect when you're searching for things that look like terrorist videos. But if you suddenly then open the door and say, you're about to watch a video on climate change, and people at Google, for whatever reason, believed that climate change wasn't real, and they said, we're

(20:04):
going to start redirecting you away from that. Again, who would they be to say? And that's actually a critical juncture that we're at right now, because there's kind of a pluralism where we all believe and think different things, and we want to say that's your truth and not my truth. And so we have this collapse of, well, who is the moral authority on these topics? And after all, we're all Americans, well not me, but citizens of

(20:27):
a democracy. We have a voice in our democracy. What about as users of the Internet? I think it's important that we not try to just have a philosophy cocktail conversation and debate it. We have to actually recognize that this is having real world consequences. But the difficult thing to reckon with is that it all comes back to attention. YouTube has a tilt, where on the one side of

(20:48):
the spectrum you have the calm science, Carl Sagan slowly explaining, and on the other side of the spectrum you have UFOs, Bigfoot, conspiracy theories, and Crazy Town. If I'm YouTube and I want you to watch more, let's say you start at the science section. If I want you to watch more, which direction between those two am I going to steer you? I'm always going to steer you to Crazy Town. So

(21:11):
at scale, even in language that the engineers don't speak, the algorithms had figured out that Crazy Town is really, really good at getting people's attention. And we know the effect isn't just on politics. What about regular people who get pushed into joining ISIS or taking other extreme positions? If you start a teen girl on a dieting video, the YouTube algorithm recommends anorexia videos, because it's calculating

(21:34):
away and figures out those are the things that are really good at keeping people of that age demographic on YouTube. And if you start someone on a September eleventh news video of the planes crashing into the towers, the recommendations are all going to be the nine eleven conspiracy theory videos. And Alex Jones, YouTube actually recommended Alex Jones videos fifteen billion times, two billion of which were viewed. Even if

(21:57):
only one out of a thousand of those people believed it, so one out of a thousand people watching those two billion views believed that, that's like printing, you know, a Church of Scientology sized cult about, you know, once a month, in terms of the scale. We are jacked into these systems. We are jacked into these environments that are telling us and steering us towards flows of attention and ways of

(22:17):
seeing the world that are inevitable. And the question is, what flows of attention do we want? And in the end, it all comes down to who's steering the ship and what they're steering you towards. Who watches the watchmen? Yasmin argues that Jigsaw takes ethics heavily into consideration, but not all companies do. And Tristan actually has a

(22:39):
radical idea. He believes that we need to wean ourselves, as an economy and as a society, from business models that compete for our attention to sell us things. I think we really underestimate that this is affecting every layer of society. And when you have three or four tech companies choosing what will show up in the minds of two billion people every day, you know, our minds

(23:02):
are the source of all of our actions. But don't we bear some responsibility for our own actions? Isn't that part of the American dream, after all? No one forces you into a casino or to buy that bar of chocolate as you're paying at the supermarket, right? You know, you're the one clicking on those YouTube videos. You're the

(23:22):
ones who are clicking on those Russian ads. You're the ones, you know, checking your phones because of those bright colors that are shining at you all the time. We're just giving people what they want. You're responsible for your choices. What we have to do is flip the table around. We're not giving people what they want. We're giving people what they can't help but watch. And once we acknowledge that, we can start to find some reasons to hope. We've

(23:43):
found, by getting Apple and Google to both launch, um, these digital well-being initiatives, which take the blue light out of your screens and do grayscale late at night and show you ways of minimizing the time you spend on your phone and things like that. It's a baby step in the right direction. But that happened against their own business interests, because the engineers themselves started to

(24:04):
see it this way. Of course, this addiction economy is fueling so much more, from hacked democracy to radicalization, to mental health issues and the epidemic of loneliness. The list of problems that we need to solve is much bigger than just making our screens grayscale, and that's what has to happen next. Happily, not everything in life

(24:25):
that makes us feel good is bad for us, and
some target recommendations can make us very happy. Indeed, So
let's talk about love, because one of the ways that
you might have felt AI touched your life is through
online dating. Maybe the person you woke up next to
this morning was matched to you by a computer algorithm.
And according to a study released in January Stanford, straight

(24:50):
and a majority of LGBTQ couples now meet online. Tinder processes two billion swipes per day. That's a lot of data, and it feels pretty good to meet someone you like, no matter how it happens. The LGBTQ statistic makes a lot of sense to me, because we don't have to come in contact with judgment, with hate speech, with homophobia, whatever it may be. There's a lot less of

(25:12):
it when it's a peer-to-peer thing on your phone, hopefully. The other thing is, the heterosexual dating pool is a pool. The, in my case, lesbian dating pool is a puddle, and that puddle at least can be amplified on an app like Hinge and Tinder and whatever. I mean, I have noticed a lot more gay couples in the New York Times wedding section, which I read religiously. The

(25:33):
other thing that I notice is that a lot of
the couples, both straight and gay, meet on apps. It's
interesting to see that it's influenced the culture in that
way so much so that you know, X amount of
people per week are meeting on those apps. And actually
in the States right now, there's been quite a significant
uptick in interracial marriages. That's pretty cool and it's something

(25:54):
to really celebrate as well. I mean, I was on Instagram a few days ago and I saw somebody who uploaded a picture of their wedding cake which had the Tinder logo on it. Um. So you know, people feel real gratitude and excitement about meeting their partner, obviously. And one person who had a successful Tinder experience is the producer of the show, Julian Weller. That's right. So I met my girlfriend on Tinder. I've actually been to a

(26:16):
couple of weddings also that they called Tinderella stories. If the iPhone case fits. Yeah. But you know, I think in essence, the use of something like Tinder is that it just makes your dating pool bigger, to Karah's point. But there are other apps, right, there are other approaches where people can enter lots more information about themselves beyond just this sort of casual way to meet. We wanted

(26:37):
to find out more about the data that drives who we fall in love with, so we spoke to Helen Fisher. I'm Dr. Helen Fisher. I'm a biological anthropologist at the Kinsey Institute. I've written six books on love, put people in brain scanners, and actually I'm also the chief scientific advisor to match dot com. We've been talking a lot about computer algorithms influencing behavior in this episode. But

(27:01):
what motivates us in the first place? Well, there are pathways and rules in the brain, not unlike a computer. I developed a questionnaire some time ago that's now been taken by fourteen million people in forty countries, actually. And there's patterns of behavior. I mean, we walk around with algorithms in our head. I mean, the brain is constantly responding to all kinds of things, and it certainly is

(27:23):
a series of algorithms. So if the brain is like a collection of computer algorithms, but much more complicated, what are the inputs? What drives who we fall in love with? There are four brain systems, each one of them linked with a constellation of biological traits, and they are the dopamine, serotonin, testosterone, and estrogen systems. And so I

(27:45):
created a questionnaire to see, you know, how you expressed the traits in each one of these systems. Once Helen had developed the quiz, she validated it using a brain scanner. And, now at fourteen million people, Helen got an insight into the brain system that makes us experience love in the first place. When we put the people in the machine, I had expected that

(28:05):
brain readers, link with the emotions and cognitive processes, would
become active, and they do. But what everybody had in
common is activity in brain readers way at the base
of the brain link with drive, with craving, with focus,
with motivation, with energy. In other words, Helen found that
the ability to fall in love is just as deep
in the brain as other survival drives things we have

(28:27):
no control over. But it lies right near the factories that orchestrate thirst and hunger. Thirst and hunger keep you alive today. Romantic love drives you to fall in love, form a pair bond, and pass your DNA into tomorrow. That brain system,
the one that makes us feel that we need to eat,
or need to drink, or need to be with someone,

(28:48):
is also the one that drives addiction, and that has
implications for the technologies we use every day beyond just
dating apps, not only the substance addictions, but the behavioral
addictions like gambling or food addiction. That brain region became
activated not only among people who are madly and happily
in love, but also among people who were rejected in love,

(29:10):
and even in people who are in love long term.
And it is linked with the addiction centers in the brain.
And perhaps at some point we're going to come to
understand a much broader view of the word addiction. This
brings us right back to what Tristan was saying. Our
apps and smart devices are hijacking some of the deepest
and most powerful systems in our brain. The truth is

(29:31):
that Tinder, apps, social media validation, they all generate the same feelings as romantic love. So of course we're prone to be addicted. We want to look at somebody, to hear somebody, to have somebody respond to us. If you were an alien who came to Earth and looked at the way humans touch their phones, always in their hand, caressing it, reaching forward with a panicked look in their eye,

(29:54):
responding to it with a smile, you would say the same thing is happening. Well, it's very different, but it may well use some of the same brain systems. I mean, you know, when you feel fear, you feel fear. As Helen found, love is fundamentally about the survival of our species, so it's handled by the same part of our brain as hunger, thirst, and addiction. And it's those

(30:16):
very brain centers that Twitter and Facebook on Instagram appeal
to using behavioral science to keep us engaged, to keep
us sharing. So when thousand Twitter users retweeted Gillian's open
letter to the technology companies, they did it because on
a very deep level, it triggered a survival response, and
in turn, that feedback loop is addictive. You know, I

(30:38):
did a study with Matt and I asked the singles
in America and do you feel that these machines are addictive?
And some like eight over said yes, And people said
that they would like to go back to dating without
any of them, and knowing how tempting it can be
to keep swiping, keep searching. Helen has some advice for
those of us looking for love. It's very well known

(31:00):
in this community that the more choices you have, the
less likely you are to choose anybody. So one of
the things that I say to people is, after you
meet nine people, stop and get to know at least
one of those people more. Because all the data show
that the more you get to know somebody, the more
you like them and the more that you think that
they're like you. You know, for millions of years, we

(31:21):
weren't doing this over the internet or even the telephone.
We were doing this in person. I mean, that's the
way people met, and the brain is built to meet
in person. Helen Fisher still believes in love. She just
doesn't want you to sleepwalk into endless swiping, endless dates, spurred by an addiction to see what's next, what's possible,

(31:42):
what's right around the corner. We know we're at a
dangerous crossroads and that we're susceptible to manipulation, and companies
know they can manipulate us for good and evil with
technology that touches us at our evolutionary roots. But the
future of those technologies isn't inevitable, and we still
have the power in our hands to decide what to

(32:05):
allow in our lives, because the ways we decide to
live with new technologies could have as much impact on
our lives as the Constitution or the New Deal. What
we decide to do next and how we do it
will have consequences. And on this series, we'll be interviewing
some of the world's greatest thinkers, the people who are
changing the future of our food, and the researchers helping

(32:26):
disabled people control machines with their minds. Together, we'll see
what we can do to pry our eyes open from
this dangerous sleepwalk. I'm Oz Woloshyn. See you next time.

(32:50):
Sleepwalkers is a production of iHeartRadio and Unusual Productions. There's so much we don't have time for in our episodes but that we'd love to share with you. So for the latest AI news, live interviews, and behind-the-scenes footage, find us on Instagram at Sleepwalkers Podcast or at sleepwalkerspodcast dot com. Sleepwalkers is hosted by me, Oz Woloshyn,

(33:11):
and co-hosted by me, Karah Preiss. We're produced by Julian Weller, with audio editing by Jacopo Penzo and Taylor Chicoine, and mixing by Tristan McNeil. Our story editor is Matthew Riddle. Recording assistance this episode from Chris Hambroke and Jackson Bierfeld. Sleepwalkers is executive produced by me, Oz Woloshyn, and Mangesh Hattikudur. For more podcasts from iHeartRadio, visit

(33:32):
the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
