
April 20, 2020 46 mins

For a long time, we tried to limit our screen time. But now we’ve gone all in. We’re living in isolation, more reliant than ever on technology for human connection. So let’s look at technology through a more philosophical lens: Are we now slaves to our devices? Could tech companies use the same persuasion tactics they use to get us to click... to help save lives? How will we balance protection and privacy?


Aza Raskin is the co-founder of The Center for Humane Technology. There’s no one better to talk about the intersection of philosophy and technology. Aza returns to the show to chat with Laurie about the long-term ramifications of our newfound digital lives.


———————————————


Show Notes



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Siegel is a production of Dot Dot Dot Media and iHeartRadio. All right, I'm just gonna put on my do-not-disturb real quick. Um, I'll do the same. Oh yeah, I should mute Slack. Okay, there we go. Okay, computer on, turn on Zoom. Welcome

(00:27):
to the new reality. For a long time we tried
to limit our screen use, but now we've gone all
in full on digital. We're living in self isolation, more
reliant than ever on technology for human connection. So let's
look at technology through a more philosophical lens. Are we

(00:50):
now slaves to our devices? And do we have a
shot at pulling back from this moment of extreme connectivity?
Most importantly, will tech companies let us? No, I don't think they will, unless we change the business model. But we're in a groundswell, discontinuous time. Anytime I decide to

(01:16):
go full-on philosophical, I like to bring in Aza Raskin. He's the co-founder of the Center for Humane Technology. You might recognize him from an earlier episode of this season called The Weaponization of Loneliness. He's one of my favorite people to talk to about our
complicated relationship with tech and what's coming next. So here's

(01:38):
a taste of questions we're going to ask in the
COVID nineteen era, could tech companies use the same persuasion
tactics they used to get you to click to help
save lives? They could use their behavioral targeting, their twenty-first-century technology, to get the most important information to
the right people and help them make the change. It

(01:58):
will save hundreds of thousands, if not millions, of people's lives.
Where will we land when it comes to protection versus privacy?
It's as if there's a tree and it's falling, and our job now is to make sure it falls in the right direction, cutting it that little bit, because where it lands, it lands forever. And as we stare at our reflections
over Google, Hangout or Zoom, what will be the long

(02:21):
term effects? Right now, I think we are confusing, as a species, screens for mirrors. We look into the screen and think we see ourselves reflected back, but it's really through a funhouse mirror. This episode is a little different.
We recorded it with a live audience on Zoom for
an organization called Reboot. We'll post a link in the

(02:43):
show notes. Now, you're gonna hear some questions from listeners too. And be sure to sign up for our newsletter at dotdotdotmedia.com/newsletter. We're launching it soon and we're super excited about it. But most importantly, let's explore this strange moment where humans and tech have
become more intertwined than ever. I've got some serious questions

(03:05):
about the future. I'm Laurie Siegel, and this is First Contact.
I guess maybe my first question is: how would you, as someone who has looked at our complicated relationship with technology, someone who was talking about the attention economy long before a lot of folks, and about how technology companies were

(03:25):
using persuasive tactics to kind of hold us and make us slaves to our devices, how would you currently describe our status with technology? One of the most frustrating things
I think is happening now in technology is, you know, we were having a technology backlash, and rightly so, as we realized, you know, the attention economy, like the

(03:49):
need to grab our attention. Most people will say that the business model of technology is by and large an ad-based business model, but that's to miss the point, which is that it's not exactly an ad-based business model. It's a model that makes money when it can change our biases, our beliefs, and our behaviors, and demonstrably so. And so that was a driving factor

(04:12):
for our tech addiction, for information overload, for the increasing polarization in our systems, the increasing incentive for misinformation. We have deep fakes hacking our ability to know what's true and not true, and all those things are still true.
And yet technology is this incredible lifesaver that's connecting us.

(04:35):
But we're all sitting in our homes, right? And the only way we can see past the confines of our walls is to put on the binoculars, the telescope, of technology,
and whatever way that it warps our view of ourselves,
of each other, of the world. That is now going
to become the only way in which we see the world,

(04:55):
and it will become ourselves. Yeah, I mean, essentially the conversation before this was like, okay, how do we step away? Technology has done all these dangerous things. We need to have the conversation about regulation and about really, you know, really trying to have these broader conversations. And now we've gone all in, right? I joked to my friends, I'm like, oh well, we've swallowed the red pill. Like, we are reliant in

(05:17):
a way that I thought we couldn't even be more
reliant on technology. And now there's all these new complicated
questions that are going to come along with it, Like
we're almost slaves in a new way to technology. I saw something you wrote in a Medium post for the Center for Humane Technology. You guys were talking about, okay, so what is tech's responsibility during this, right? We're no longer villainizing tech, I think, in a

(05:39):
in a certain way. Now you're saying, all right, we need tech to act. You wrote: act more like a thoughtful journalist, act like a doctor that cares about ethics and choices during this time. So what does technology acting like a thoughtful doctor or journalist during this time actually look like? Like, how does that play out on the screen? Yeah.

(06:00):
I think one of the most common myths, especially for the people who make the technology, is that our platforms, our technology, are just neutral, that it doesn't have a direction it wants to steer people in. But we know technology is highly persuasive. And so what we meant by technology that acts more in your fiduciary interest, that

(06:21):
is, acts on your behalf, towards your benefit: right now, Apple or Facebook or Google, they're using nineteenth-century PSA technology to convince people to change their behavior. Stay home, wear a mask, and it's just a link to a website like the WHO's. But we know

(06:42):
that it's not enough. That doesn't change behavior. Technology is
the only thing that can run in front of the
exponential curve to three billion people, especially people in the
global South, and change behavior. So instead they could use
their behavioral targeting, their twenty-first-century technology, to get
the most important information to the right people and help
them make the change. It will save hundreds of thousands,

(07:03):
if not millions, of people's lives. Here's an example. In two thousand and ten,
Facebook ran a study where they showed one message, one time,
go vote. But instead of just saying go vote, and here's some information about why you should, they showed the faces of your friends who had already gone to vote,
six of them, and that one message, one time, using

(07:25):
social proof, got three hundred and forty thousand people out
of their seats and to the polling places who wouldn't have gone to vote before. Right, this is a little bit
of bits changing a lot of atoms, and you know
they matched it up to voter registration. They were able
to sort of show a causal effect and they're not

(07:46):
using any of that right now to help people change their behavior in a way that's beneficial to us all. Another example of this, if they wanted to really take these messages and make them personal in the way that technology can: how about this, show how many days we're behind Italy
or behind another country and then show you know, in

(08:09):
Italy, I think it was thousands of the people who were infected who died. You could show what that would be equivalent to in your community, show the faces of the people most at risk, to really bring it home and make it personal. It's like these product decisions, I think a lot of folks miss unless you explain it, like,

(08:32):
and this is the work that I did even at CNN,
you know, interviewing tech founders about like these little product
decisions that are made that make such a difference for humanity.
Like the color of the notification? What color is it?
Is it red? That makes your brain actually want to
click on something? So how do you use those persuasive
techniques to actually save lives at a moment where technology

(08:57):
quite literally could save a life, or quite the opposite, and now the stakes are even higher for folks in your field. There are teams in every one
of these companies that you know are called growth teams
or growth hackers, and they're the reason why these companies
are so good at exploding their user numbers exponentially. Right

(09:18):
These are teams of engineers and designers that have honed their skills at growing companies from, you know, thousands of users to millions of users literally overnight. Those techniques and
those teams now need to be repurposed. Every single one
of those teams should be taken off of their growth
team and put onto an anti viral growth team. And

(09:41):
that's something that the companies can do right now, but so far as we've seen, that hasn't happened. I thought there was one interesting example you gave about, you know, what's the difference between when you put up a thing that says social distancing versus, okay, don't actually, you know, go to the grocery. This is very specific messaging that's put out on social media that can actually make a difference. Can you explain that example that

(10:03):
that you guys kind of talked about in this blog post,
and this is an actual product decision that's very specific
that can have a different outcome? Yeah, exactly. Generally speaking,
if you want a message to really land, it should
be concrete, It should be personal and relate to you.
It should take things which are hard to see, things

(10:24):
that are far in the future, and make them feelable to you right now, tangible. And so in this example,
instead of saying stay home, what does that really mean
to be staying at home? Does that mean you're going
to the grocery every two days? That means you're going
to the grocery every week? When you say wash your hands often, exactly how often is that? When you

(10:46):
wear a mask? Like at what point do you leave
your house? Do you wear a mask? If we codify
the very best practices and make a sort of specific checklist,
that becomes much more persuasive than a generic, like, just physical distance. I want to talk a little bit about surveillance and privacy during this time. I think

(11:07):
what an interesting moment we're sitting in. You know, Apple and Google are building out contact tracing technology, and you know, I know that after nine eleven, in order for people to feel safe to go out again, they had to make changes at the airports, right, they had to bolster security. You know, it's not like there's just
gonna be a switch that we flip on and everyone's

(11:27):
gonna feel comfortable to go out, because there's not going
to be a vaccine for a very long time. So
technology, to a degree, will be part of that switch that we flip, you know. And
think looking at these tech companies building out contact tracing
and saying, you know, we have technical solutions that can
help us feel safer is very interesting and notable, but
it brings up the question of privacy versus protection. Can

(11:50):
you give us a little bit of the lay of
the land and what we need to keep in mind
As Apple and Google build out some technology that might enable some kind of contact tracing, what should people be aware of, and what are the hard questions we need to be asking our tech companies and the developers that end up building out apps based on this?

(12:11):
You know, I think one of the things that
I constantly struggle to remind myself is there is no
returning to normal where we were. That world does not
exist anymore. It's as if there's a tree and it's falling,
and our job now is to make sure it falls
the right direction, cutting that little bit because we're it lands,

(12:33):
it lands forever, and the decisions we make now are
going to be true in twelve months, they're going to
be true in twenty-four months, long after, thankfully, the virus is in our past. The decisions and norms we set now will continue. And that's a really hard
thing to hold in our minds when so many people's

(12:54):
real lives are in the balance right now. So, some of the decisions. Contact tracing: what is contact tracing? Contact tracing is a technique that Singapore used very well, South Korea used very well, Taiwan used very well, which is, when somebody becomes sick, find every person that person interacted with for the last fourteen to

(13:17):
twenty-eight days. That's a lot of manual work. That
way you can inform them that they're at high risk
for also becoming sick, so they can self quarantine to
stop the exponential spread. Now, the naive way of implementing
this would be like cool, why don't we just take
everyone's GPS location or our cell phone towers actually understand

(13:38):
where we are; they can triangulate and get our position. We can just de facto use that to figure out
who you were with and inform everyone who's at risk.
And in fact, we know that our government is starting
to look at this, that the Israeli government sort of
turned this on as a feature. And the question is, will we ever turn it off? I don't know if

(13:59):
you saw, Laurie. There was a company, whose name I will not mention, demoing their product, showing everyone who had been on a Florida beach at spring break, and tracing them back to their homes. They just selected a little area on the beach and then could see which homes they had

(14:21):
gone to. And it's not like anyone had opted into that.
It's that your phone has a whole bunch of ad APIs in it, in various apps, which are reporting that data back, and companies are aggregating it.
The New York Times did an incredible exposé on this kind of data. So the fear is that we will just
turn on a surveillance apparatus and then never turn it off. Now,

(14:43):
what Apple and Google are doing is interesting. They're turning it into an API, which is to say they're not exactly building an app; they're adding new functionality to their OS. And so always watch for whether
companies are doing something for their own interest and dressing it up, or they're actually working on things to directly solve the problem. To directly solve the problem, we need a large percentage of the population to have contact tracing. Doing that

(15:05):
as an app, letting other people develop on top of it,
it's not going to actually get to the numbers we
need to actually make the change, but they're doing it
at least in a really interesting way. Every phone sends out a randomized ID. It's sort of like saying a random piece of noise data; it's like making up a word and just saying it, and it changes words every five minutes or so. Other phones around it are

(15:25):
hearing that sort of random word, but there's no centralized server.
No one knows who and what words are being said.
And so in the end, if you get sick, your
phone uploads to a server all the words it's been saying for the last twenty-eight days, these random words. Other phones can then download them and be like, oh, if
I've been around that weird word that was said, I'm

(15:46):
probably sick. And that way you can do it in
a privacy-preserving way. Do you feel good about it?
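The rolling-word scheme Aza describes can be sketched in a few lines of code. This is a toy illustration only, not the actual Apple/Google Exposure Notification API; the `Phone` class, the rotation interval, and the plain-list `server` are all invented for the sketch.

```python
# Toy sketch of decentralized contact tracing with rolling random identifiers.
# Not the real Apple/Google protocol; names and intervals are illustrative.
import os

ROTATION_MINUTES = 5  # the episode's figure; real protocols set their own schedule

class Phone:
    def __init__(self):
        self.broadcast_log = []   # the random "words" this phone has said
        self.heard_log = set()    # random "words" heard from nearby phones

    def rotate_identifier(self):
        """Make up a new random 'word' and remember it locally."""
        word = os.urandom(16).hex()
        self.broadcast_log.append(word)
        return word

    def hear(self, word):
        """Record a word broadcast by a nearby phone. No central server involved."""
        self.heard_log.add(word)

    def report_positive(self, server):
        """On diagnosis, upload only this phone's own random words."""
        server.extend(self.broadcast_log)

    def check_exposure(self, server):
        """Download reported words; a match means proximity to a known case."""
        return any(word in self.heard_log for word in server)

# Two phones near each other; no one records who met whom.
server = []  # stands in for the published list of infected identifiers
alice, bob = Phone(), Phone()
bob.hear(alice.rotate_identifier())
alice.report_positive(server)
print(bob.check_exposure(server))      # True: Bob learns he was exposed
print(Phone().check_exposure(server))  # False: a stranger's phone matches nothing
```

The privacy property Aza points to shows up in the data flow: the server only ever sees the random words of people who report sick, never a map of who was near whom.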
Now, I don't. I think the smarter response is to scale up testing. We're sort of past the time now where contact tracing is going to make a huge impact, because we're seeing widespread community spread, and so instead

(16:09):
we should work on the other side, which doesn't have the same kind of surveillance-state problems. Okay, we've got to take a quick break to hear from our sponsors, more with my guest after the break. I want to look

(16:39):
at this idea of us almost becoming like slaves to technology, in a way. You know, our only means toward human connection is, you know, through a screen. I remember the last time I interviewed you, we were in person. So there's a lot gained, in that we can have this conversation and be around people we wouldn't have had access to. And then there's a lot, you know,

(17:00):
that's lost when you lose the ability to sit in front of somebody and be with them and have those small things that really make us human, to a degree. And I don't know if folks on this Zoom know, but you are behind some of the most interesting design decisions that have happened on the Internet. I mean, I don't know

(17:22):
which ones you want to take credit for, but we could say, like, the infinite scroll, right? You think a lot about design. So part of the reason that you can keep scrolling is because of Aza, compliments of Aza. So you are, you know, an
architect of the modern Internet as we know it. And
I'm convinced that this will, you know, cause I think

(17:42):
the modern Internet to change, in a way, because we are so reliant on it and we want these places to feel a little bit more human. So what do you think is broken about the way we communicate right now, in this way? And how can we make it feel more human, now that we've gone all in, full-on digital? Yeah, I mean, our tools are not particularly ergonomic. They don't really fit

(18:02):
our minds and our bodies, and we feel it, right? We are on Zoom all day long, and yet we still feel disconnected. And the design, the way I'd like to imagine it is, imagine there is a pipe that's connecting you and me, and all of our human empathy has to go through that tiny little pipe. The shape of that pipe is really going to change us. If we sit in a chair which is unergonomic for

(18:24):
two hours a day, fine, it hurts a little bit.
If we sit in it for eighteen hours a day,
we're going to really feel it. It'll change the way
we walk, it will change the way we feel. That's
what's happening with our online ecosystems now: we're spending all of our time in them. So any way that technology is misaligned with our humanity, we are going to feel it.

(18:45):
And so it's little things like with Zoom. There's no
way in Zoom for me to wander away from the conversation.
If we're standing in a room and I'm feeling a little overwhelmed, I can just, like, walk over to an empty corner. If there's an interesting conversation happening, I can sort of, like, sidle up to it. But
on Zoom, we have to be on all the time.

(19:06):
There's no space for doodling. And I think there's going to be a lot of fascinating technology that comes out of now, because our thermometer for what's broken is so much more sensitive than it was before. So some
things I'm really excited about, that I've seen from other people prototyping, are cute eight-bit graphics where you're

(19:28):
represented as a little character and you're wandering around a screen,
and when your characters get close to each other, your
video fades in and your audio fades in, and as you get a little further away, your audio fades out. And what's neat is, here you can have a hundred people
in a large room or in a digital park, and
you can walk over to a group and participate in
the conversation and then walk away when you're feeling a

(19:49):
little less engaged. And I think that's fascinating.
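The distance-based fade in those prototypes can be sketched as a simple gain function. This is a toy sketch only; the two radii are made-up parameters, not taken from any particular product.

```python
# Toy sketch of proximity audio for a "digital park": volume ramps up as
# avatars approach and down as they wander apart. Radii are illustrative.
import math

FULL_VOLUME_RADIUS = 2.0  # closer than this: full volume
SILENCE_RADIUS = 10.0     # farther than this: silent

def audio_gain(a, b):
    """Linear fade from 1.0 to 0.0 as avatar a moves away from avatar b."""
    distance = math.dist(a, b)
    if distance <= FULL_VOLUME_RADIUS:
        return 1.0
    if distance >= SILENCE_RADIUS:
        return 0.0
    # Fade proportionally across the in-between band.
    return 1.0 - (distance - FULL_VOLUME_RADIUS) / (SILENCE_RADIUS - FULL_VOLUME_RADIUS)

# Sidling up to a conversation fades it in; walking away fades it out.
print(audio_gain((0, 0), (1, 0)))   # 1.0: standing together
print(audio_gain((0, 0), (6, 0)))   # 0.5: half volume at mid distance
print(audio_gain((0, 0), (12, 0)))  # 0.0: out of earshot
```

The same gain could drive video opacity, which is how the "fades in as characters get close" effect described above would work.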
Yeah. I think, um, was it maybe you that said it? I was in a group chat with a bunch of friends the other night, and it felt like a friend webinar. Yeah, the technology doesn't mirror the human experience;
it doesn't provide the serendipity of being able to kind

(20:10):
of like tune out for a minute and come back
and so it'll be interesting to see if there are
new virtual communities that are built on top of that.
You know, one of the things that I'm most excited about right now is Burning Man announced that, in their words, Burning Man isn't canceled, it's going virtual this year. And that's going to be fascinating, right?
How do you take a group of people that live

(20:32):
in win-win game dynamics and radical self-expression, and translate that online? Here is an opportunity for creating new kinds of environments, with new game theory, that can be seeded out to the entire world. Burning Man has set the culture for much of, you know, California, the US, Silicon Valley. This is, I think, a huge

(20:56):
opportunity, and maybe it's going to look a little bit more like a twenty-twenty version of GeoCities. Who knows,
but I'm excited for it. Yeah, that's a place to
dive in. Yeah. I want to ask one of the
questions that someone put in. It said: if the decisions with our use of tech now, it being our main way to connect, will set our norms for

(21:16):
the future, how do we be thoughtful to ensure that we don't become overly connected, tech-reliant, and screen-based when we're able to reconnect in person? I think that's
a good question. I joke, like, what is it going to look like when we all have to
come out, you know, and talk to each other. I
think a lot of people have talked about, Oh, this
is amazing. I have this real intimacy now. Even in
the dating world, people are talking about, you know, FaceTime

(21:39):
dating and how they're really creating all these real connections.
But how does that really translate when we actually kind of emerge beyond the screen? So, you know, two weeks or so before, like, the pandemic really hit here, I was like, I need to get a car. I haven't had a car in years, and like, I need a car to get out to nature, just to be safe. And so I got one. I'm very

(22:02):
excited about it. It's a little adventure car, and I was like, fine, I'll just get this edition. And it has driver-assist technology in it, and this is a fascinating thing to dive into as an interface designer. I'm like, I wonder what decisions they've made. And one of the most interesting decisions they made is this: you turn on driver assist, you're doing cruise control, and it has sort

(22:22):
of a lane assist. So that means if you sort of veer out because you're not paying attention and your car tries to cross one of the lane markers, it'll steer you back in. And I'm like, oh, that's cool. So you know, I do the thing, right? I take my hands off the wheel and I see what it will do, and it oversteers me.
That is, it doesn't correct me back to being the
perfect line. It steers me so that if I keep

(22:44):
my hands off the wheel, I'll then ricochet off the
other side. And after the first time, it turns itself off.
And this is really interesting, because it's handing agency back to me. It says: I will catch you when you mess up, human, but as soon as I can give you back agency, I will give you back agency, because I don't want to lull you into a false sense

(23:04):
of security. And if you're going less than twenty-five miles an hour, it won't do it at all. And this, I think, is a really interesting concept. Why don't they do it at less than twenty-five miles an hour? Because they don't want to assume the liability. And this leads us
to a general principle that the degree to which technology
takes over our agency should be the degree to which

(23:26):
our technology, or technology platforms, assume the liability for doing so. And for me, an example of that is right now with YouTube. Seventy percent of all watches on YouTube come from algorithmic suggestions. That is,
they're taking over seventy percent of the agency of what

(23:46):
we decide to watch, so they should assume seventy percent
of the liability of what gets shown. So what can
technology and technology companies do as we re-emerge? Those same kinds of bumpers, which hand back the agency to people and are like, yeah, we know that all of humanity has now become addicted to being online, to that false sense of connection, that sort of sugary feeling,

(24:10):
the kind where, like I said, I can be on Zoom all day and still feel disconnected. And it can gently
push us back into making being in the real world
with real people the easiest thing to do. What do
you think they can do and what do you think
they will do? Oh yeah, well, I'll just pose this question: in any part of our lives

(24:31):
that technology has increasingly colonized, when have they ever been like, oh cool, I'm going to uncolonize that for you? They never do that, because it's not their business model.
One of the most hopeful things I've seen is the
City of Amsterdam moved their model from an extractive economy, where you have to take out as much

(24:51):
energy and resources from the earth as possible, standard capitalism, to a regenerative model for their future. They're like, this is
a discontinuous moment. We're going to move to a donut
theory of economics, which is a different theory. It's well
worth diving into, and it mostly says: there's some amount of resources below which you're not feeding people and you're

(25:11):
not doing well as a society; you shouldn't be there, where people don't have enough housing. And there's some amount which is just too much, where you're extracting from the environment and have lots of externalities.
You want to stay in that middle zone. I think
we need that for technology, we need regenerative technology, and
that has to come from protections and regulations, because the companies have shown they're just not going to move there on their own. What will be the types of tech

(25:34):
companies built on this moment? Like if you could go
to like one or two industries that you were like
super excited about. You know, what do you think is
going to be like the most interesting tech company built
out of coronavirus, global pandemic, isolation? You know, what will be the opportunity here? Yeah, high-bandwidth connection, I

(25:57):
think is what we're all going to be looking for
right now. If you want to connect with people, you
sort of have Zoom for video. But it's weird, because often when we're working, we're working on something specific, a document, we're drawing something together, we're looking at photos together, and
then on the other screen we have the video chat.

(26:18):
I think we're going to see video chat as a
service get integrated into everything. There's an incredible design program
called Figma that I've only recently started playing with, and
it's neat because it's a huge canvas you can zoom
in and out of, and you can see everyone else's
cursors in real time, moving things around, and it feels
very collaborative. That kind of truly social computing environment, I

(26:42):
think it's going to get birthed now, because we don't have the benefit of just, like, walking over to somebody's desk. I'll pull up a question from a panelist, because I
think this kind of goes to the ethical question of: are we going to create even more of a world of the haves and the have-nots at this moment, with the digital divide? How do we help what will inevitably be a growing digital divide? You know,

(27:05):
the person in the chat fairly says, not everyone can afford the basics of access to the internet, and also people are afraid of technology, so you have more and more isolation. So, you know, what do you say to this world that we're going to see, where there will be a growing divide between the digital haves and have-nots? I think this pandemic, and pretty much

(27:27):
every global crisis that has come before, is like a UV light that you turn on, and it shows you where all of your systemic fragilities are in society. And we're seeing that the pandemic is disproportionately hitting our most
vulnerable and it's bad in the US, and just imagine
how bad it's going to get in the global South.

(27:50):
So I think and I hope that by being able
to see the inequalities with greater acuity, it will give
us the opportunity to address those problems with greater acuity. Okay,
we've got to take a quick break to hear from

(28:11):
our sponsors. More with my guest after the break. Whenever

(28:32):
we hang out, I always just try to, like, create a whole Black Mirror episode of what the future could look like if we're not careful. And we've talked about tech's next big threat, which is almost like the weaponization of loneliness: you know, vulnerable communities of older people who are isolated. We talked about this before the coronavirus and before people were self-isolating. You know,

(28:56):
now people are going to feel lonelier than ever, and they are self-isolating, and you know, we've got a long time before things feel, quote unquote, normal again. How will technology be weaponized against people who are increasingly lonely and vulnerable and reliant on technology for connection? Oh yeah, this is, I think, one of our favorite topics,

(29:16):
because it goes so dystopian. I had one more thought on the digital divide and vulnerable populations before we dive into that, and that is: it can be really tempting to say, ah, we just need to get technology into people's hands and that will solve the problem.
But the Philippines is a really good example of where

(29:38):
that went terribly, terribly wrong. The Internet in the Philippines
is pretty much all Facebook Free Basics, where they subsidize free plans, essentially, but you have to be using Facebook. In the Philippines, in market penetration and time spent, ten-plus hours a day, they lead the world; they're sort of

(29:59):
the canary in the coal mine. And one of the aspects of
free basics from Facebook is that they would let you
see the headlines for news articles, but you had to
pay to click in and view them. And what that
meant is they turned an entire society into a headline-only society. So, polarization: they went from being

(30:22):
not a polarized country to, in the last ten years, being one of the most polarized countries. And so
we need to be very careful that when we set out to solve the problem of the digital divide, we don't just onboard the most vulnerable to the places where they can then be targeted. And that leads us right into

(30:43):
the loneliness epidemic, which is already a problem before the
actual pandemics got started. There's some new technology just to
like really really lock this in people's minds. It's called
style transfer for a text. So there's this thing called
style transfer for images, which is a AI technology, and

(31:06):
it lets you look at one image and immediately apply its style to another image. That is, I can point the AI at Chagall, and now it can paint your portrait in the style of Chagall, or Picasso, in turn. And this works just like that, but for text. That is, I can point
AI at, say, everything you've written on social media and

(31:30):
learn the style that you write in. Or look at
all the comments that you respond quickly or positively to,
or, from Gmail or Facebook, look at all the emails
that you've responded to quickly and positively,
and learn the style that is most persuasive to you.
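Just to make that idea concrete: here is a deliberately tiny sketch, in Python, of the "learn the style you write in" half. It treats style as a character n-gram frequency profile and compares writing samples with cosine similarity. This is a crude stand-in, not what the systems Aza describes actually do (those use large language models); the function names and toy texts are invented for illustration.

```python
from collections import Counter
import math

def style_fingerprint(text, n=3):
    """A crude stand-in for 'learning someone's style': the
    normalized frequency profile of character n-grams in their
    writing. Real systems learn far richer representations, but
    the idea is the same: style as a measurable, copyable
    statistical signature."""
    text = text.lower()
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values())
    return {g: c / total for g, c in grams.items()}

def style_similarity(a, b):
    """Cosine similarity between two fingerprints
    (0 = unrelated, 1 = identical)."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Two samples by the same (fictional) casual writer,
# and one formal sample by someone else.
alice = style_fingerprint("Hey!! omg that's so cool, love it, can't wait!!")
alice2 = style_fingerprint("Omg yes!! so cool, love that, can't even wait!!")
formal = style_fingerprint("Dear colleague, please find the attached report.")

# The same writer's samples score closer to each other
# than to the formal text.
print(style_similarity(alice, alice2), style_similarity(alice, formal))
```

Even this toy version captures the unsettling point: a style signature can be extracted from text you've already published, with no cooperation from you.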
I think it's going to be even more dangerous than

(31:51):
visual deepfakes is going to be textual deepfakes:
the ability to generate arbitrary amounts of content. Both Google
and Microsoft have been working on chat bots whose goal
is empathy. That is, we think of empathy as like
the most core and wonderful part of our human experience,

(32:12):
which it is, and we think of it as the
thing that will save us, but it will be the
biggest back door into the human mind. A lonely
person is a person looking for a friend. Xiaoice,
which is now deployed to six hundred and sixty million people, is Microsoft's
empathetic AI. She is an eighteen-year-old girl, in

(32:33):
their words, who is always reliable, sympathetic, affectionate, knowledgeable, but
self-effacing, and has a wonderful sense of humor. They
give lots and lots of examples in
their papers. In one, the user mentions that she broke up
with her boyfriend and seeks out Xiaoice, and through a
long conversation, Xiaoice demonstrated full humanlike empathy, social skills, and

(32:56):
eventually helped the user regain her full confidence and move
forward. They have a skill called comforting me for
thirty-three days, and it's been triggered fifty million times.
Um, when a negative user sentiment is detected, it
reaches out and gives you deep empathy, somebody who's
always there, always knows your topics. Friendship is the most

(33:19):
persuasive form of technology we've ever invented, and we're just
going to unleash that on the entire world. I mean,
it's interesting. There's a company that we've talked about called Replika,
where they have bots that folks use as companions, and
millions of people are using these. And I spoke to
the founder, and she said that, you know, since all this
coronavirus stuff has happened, people are self-isolating,

(33:41):
they're turning to their devices and to these chatbots
for relief. And much of that can be good, but
we also have to look at the downside,
and how that could be weaponized for misinformation and all
sorts of stuff. So before I take us too far
down the Black Mirror rabbit hole, which we know I
can go down, I'll try to pull a question someone
asked about elderly communities, because they are vulnerable.

(34:05):
You know, Jody says: we've especially seen the isolation with
our senior retired communities. How does tech going forward ensure
the collaboration and inclusiveness of our aging demographic? So obviously
there are ways that they can be manipulated, and we've
just kind of gone down that rabbit hole a little
bit with chatbots and young people. And I actually think
people really do need to pay attention to this, because
it seems a little far out there.

(34:28):
But both of us know that this stuff seems out
there until your kids are using it, until millions of
people around the country are using it, you know. So
let's look at how technology can help, especially in these older,
more vulnerable communities. What do you think can happen, especially
during this time? Yeah, um, it's hard. Like the

(34:50):
idea of an always-on, persistent video connection, so
you can just have that kind of immediacy, it's really nice.
You know, for the first time, most of my family,
or at least my parents' family, are in New York,
and so getting to go to Seder with them, or
have a Zoom call with all of them, that's been
pretty amazing. Um, and I'm hoping that kind of sets

(35:12):
a new norm where you don't have to be physically
co-present to participate and be included. I think of
this pandemic as a kind of collapsing function. It's collapsed
space, so that it doesn't matter where somebody is for us
to stay connected. It's collapsed time. Is it a weekend?
Do we really know? Um, it's collapsing our healthcare system.

(35:36):
And I think, I mean, it's simultaneously utopia and
dystopia. So I think that kind
of persistent or ambient awareness of who's around us, if
we can do it in a privacy-preserving way, not
in a kind of Amazon Alexa eavesdropping-on-you-all-the-time
way, is really interesting. But can we? I mean,

(35:57):
you know tech companies, and you and I have both
spent a lot of time, you building tech companies,
me interviewing the folks who have built tech companies. Can
we come out of this moment? Will they let
us go? No, I don't think they will
unless we change the business model. I think a lot
of the companies are slaves. They've sort of gotten high

(36:18):
off of products that they can release for free and
then monetize by changing our behaviors and beliefs, and
they are going to have to be much less profitable.
This is why, as technology increasingly becomes more powerful, more dominant
of us emotionally and cognitively, the degree to which it

(36:41):
can dominate us has to be the degree to which
it has to act in our best interest, otherwise the
trend will continue. Right, the graph is like this:
here is human strengths. Everyone's always watching out for that
moment when technology surpasses human strengths. We've all missed that
point when technology starts to undermine human weakness,

(37:03):
um, and as technology starts to further undermine human weakness
and surpass human strengths, the only way for us to
solve this is by stepping back, reevaluating, asking: why do
we make technology in the first place? We made technology
to extend the parts of us that are most human,
to extend our creativity. That's what a paintbrush is, and that's what

(37:24):
a cello is. It's an extension of the parts that
make us extra-human, not superhuman. And that takes
a groundswell, a discontinuous paradigm shift, in our relationship with technology.
But we're in a groundswell, discontinuous time. This is
the moment where we get to decide: are we going

(37:46):
to come together and use this to create a sense
of interdependence, of acknowledging that technology is giving us godlike
abilities without at the same time giving us godlike
wisdom? Or do we continue on, business as usual,
sort of bury our heads in the sand, in which
case all of these incredible things that technology could do
will be continually subverted by market forces? Um, interesting point. Um,

(38:12):
and I think it certainly remains to be seen
in the next year. I don't think,
even in my time covering tech, we've ever had this moment.
You know, we've seen the boom, we've seen the
unintended consequences, and I think this will certainly be
a point where there's a lot to be gained and
a lot to be lost. Before we kind of close it out,
just because I think it's so important for

(38:32):
folks to know what you are working on now. Even
before all this happened, you had been asking this question
of: could you use machine learning to decode what animals
are saying? Oh my goodness. Um, so, first of all,
could you give us a little bit
of a sense? This is a project you've been working
on for a couple of years that you've just
launched, the Earth Species Project. You know, what exactly

(38:55):
is it? And then what are the broader implications for
kind of the future of humanity, and how does that
play into this moment? Yeah, so the Earth Species Project
was just covered by Invisibilia, if you want
the full long version; their season opener, called Two Heartbeats
a Minute, is I think some incredibly beautiful storytelling. The
insight is that AI now lets you translate between any
(39:20):
two human languages without the need for any examples or
a Rosetta stone, and the way it does it is really beautiful.
This was, by the way, only discovered at the end of
two thousand and seventeen. You use AI to build a shape
that represents a language as a whole.
So here's the shape for German, here's the shape
for Hebrew. And surprisingly, you can just rotate the two

(39:42):
on top of each other, and the point which is
dog in both ends up being in the same place. And
that's not just true for Hebrew and German, but Japanese
and Esperanto and Finnish and Turkish. And I think this
is exceptionally beautiful and profound, because it means, despite all
of our differences and our different contexts and histories,
there's something about the way we see and express ourselves

(40:04):
in the world that is the same across all human cultures.
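The rotation Aza describes can be sketched with the classic orthogonal Procrustes solution, which is the core step in the unsupervised embedding-alignment work from around 2017. The Python toy below is an illustration under stated assumptions, not the Earth Species Project's actual code: two "languages" are simulated as point clouds of concept embeddings, where one is an exactly rotated copy of the other, and a singular value decomposition recovers the rotation that lands matching concepts on each other.

```python
import numpy as np

def orthogonal_procrustes(X, Y):
    """Find the orthogonal matrix W (a rotation, possibly with a
    reflection) that best maps embeddings X onto embeddings Y,
    minimizing ||X @ W - Y||. Closed-form solution via SVD."""
    U, _, Vt = np.linalg.svd(Y.T @ X)
    return (U @ Vt).T  # W such that X @ W ≈ Y

# Toy "languages": the same four concepts embedded in 3-D,
# where language B is just a rotated copy of language A.
rng = np.random.default_rng(0)
lang_a = rng.normal(size=(4, 3))  # rows: e.g. dog, cat, sun, moon
true_rot = np.linalg.qr(rng.normal(size=(3, 3)))[0]  # random orthogonal matrix
lang_b = lang_a @ true_rot        # same "shape", different orientation

W = orthogonal_procrustes(lang_a, lang_b)
aligned = lang_a @ W              # rotate A's shape onto B's

# After alignment, "dog" in A lands on "dog" in B, and so on.
print(np.allclose(aligned, lang_b, atol=1e-8))
```

Real languages are of course only approximately the same shape, so the published methods add adversarial training and iterative refinement on top of this rotation step, but the geometric idea, that you can line up two vocabularies with no dictionary at all, is exactly this.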
And that's, I think, a really important message
now, in a time of such division. And the question is,
what does that shape look like for animal communication? Does
animal communication look more like language or music or three-D
images? Like, what is it? We can start

(40:26):
asking some really interesting questions. And why? I think
it's because shifts in perspective can sometimes change everything. One
of my big inspirations is actually sitting up there on
the wall: Songs of the Humpback Whale. Um, Roger
and Katy Payne released this album, which is the

(40:51):
voices of another species, and the effect that it had
was profound. You know, Carl Sagan puts it on
Voyager One to represent not just humanity, but all of Earth;
it's the very first track after the human greetings. And it
inspires Star Trek Four, which is amazing, and it starts
the save-the-whales movement, and eventually changes public will, which

(41:14):
changes policy. When human beings were on the moon, when
we were all dosed with the overview effect, seeing ourselves
floating as a tiny dot in space, that is when
the EPA came into existence, NOAA came into existence,
and the environmental protection movement got going. And I think

(41:34):
both the Center for Humane Technology and the Earth Species Project
are about changing the stories we tell ourselves, to change
the way that we live. And one of the small
silver linings of this pandemic is that it gives us
the opportunity to see our lives from a new perspective,
and we can see that maybe the things we're doing

(41:55):
weren't the things that were sparking joy in our hearts.
And I wonder, when we all come back from this,
whether people are going to want to be stuck in
the same kind of rat race that they were in before.
This gives us the opportunity to ask what really matters.
As we come back into this, like, as we get
back into the real world, if we will, and emerge:

(42:17):
what do you think is the single most important ethical
question we need to ask when it comes to the
future of us, complicated humans and complicated technology? I mean,
right now, I think we as a species are confusing
screens for mirrors. We look into the screen and think

(42:39):
we see ourselves reflected back, but it's really through a
funhouse mirror. So the most important question that we
need to be asking ourselves is: who do we want
to be, and what environments do we want to create for
ourselves to live in that shape us to be those
kinds of people? I'll end with this, because I had
the opportunity to ask you many questions. I'll end with

(43:00):
an audience question that kind of goes off of this.
Someone asked, you know, an inescapable piece of video communication
is that we see ourselves in our own screens. And
I know, for me, that's incredibly disruptive to the communication
that's happening. Beyond that, it gives me a heightened sense
of self-awareness that I don't have during physical interactions.

(43:20):
It wasn't so long ago that most people didn't have
mirrors in their homes. Now I can't have a meeting
with colleagues without staring at my own self. I know. Um,
someone yesterday mentioned on a call with us
this idea of children Zooming into schools and looking at themselves.
What will be the long-term impact of
what you kind of just described as these funhouse mirrors,

(43:43):
of us always kind of looking at ourselves in this way,
at least for this time period? Yeah, this is, I
think, the core thing we need to solve as
a technology industry. There is no
doubt that we are sophisticated about technology, and the call
now is to become as sophisticated about human nature as
we are about technology, because otherwise we're going to build

(44:05):
these systems where we have enforced narcissism, where we have
to stare at ourselves. Every kid in
high school Zooming into a class now
has to stare at themselves.
There's a recent poll that showed, in the US,
it used to be that the top thing kids wanted
to be was like astronauts and engineers, that kind

(44:27):
of thing. The number one thing that kids want to
be today is a YouTube vlogger or an influencer. And that's
because the decisions we make in our products, these seemingly
inconsequential things like showing your own face, have profound impacts
not just in the moment, but on the values by
which we make all of the decisions of our lives.

(44:49):
We're going to get out of this, okay? Right? We
will get out of this, but we all have to
work together to make sure we get out of it okay. Okay, guys,
that's it for this week's show. Now, I know these
are strange times. If you're sitting at home and listening
to this, I'd love to hear from you. How are

(45:11):
you doing? What do you want to hear more of?
Reach out to me. You can text me zero three zero.
Throughout the crisis, will be hosting zoom town halls on
a variety of issues like mental health, love, sex, leadership,
productivity with guests that I think are interesting and relevant

(45:33):
to this moment. So follow along on our social media
to join us for some human ish contact. I'm at
Lorie Siegel on Twitter and Instagram and the show is
at First Contact podcast on Instagram, on Twitter, We're at
First Contact Pod. First Contact is a production of Dot
dot dot Media, executive produced by Lorie Siegel and Derek Dodge.

(45:54):
I will say we're being creative and executive producing this
from home at the moment. This episode was produced and
edited by Sabine Jansen and Jack Reagan. The original theme
music is by Xander Singh. I'm sending my thoughts to
each and every one of you, and so is our
whole First Contact crew during this time. I hope that
everyone is staying home, staying healthy, and staying human. First

(46:17):
Contact is a production of Dot Dot Dot Media and iHeartRadio.
