
April 20, 2020 46 mins

For a long time, we tried to limit our screen time. But now we’ve gone all in. We’re living in isolation, more reliant than ever on technology for human connection. So let’s look at technology through a more philosophical lens: Are we now slaves to our devices? Could tech companies use the same persuasion tactics they use to get us to click... to help save lives? How will we balance protection and privacy?


Aza Raskin is the co-founder of The Center for Humane Technology. There's no one better to talk to about the intersection of philosophy and technology. Aza returns to the show to chat with Laurie about the long-term ramifications of our newfound digital lives.


———————————————


Show Notes



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.

Speaker 2 (00:11):
Right, I'm just gonna put on my do not disturb really quick.

Speaker 3 (00:16):
Oh, I'll do the same. Oh yeah, I should mute Slack.

Speaker 2 (00:19):
Okay, here you go.

Speaker 1 (00:23):
Okay, computer on, turn on, zoom. Welcome to the new reality.
For a long time, we tried to limit our screen use, but now we've gone all in, full-on digital. We're living in self-isolation, more reliant than ever on technology for human connection.

Speaker 2 (00:44):
So let's look at.

Speaker 1 (00:45):
Technology through a more philosophical lens. Are we now slaves to our devices? And do we have a shot at pulling back from this moment of extreme connectivity? Most importantly, will tech companies let us?

Speaker 3 (01:02):
No, I don't think they will, unless we change the business model. But we're in a groundswell, a discontinuous time.

Speaker 1 (01:15):
Anytime I decide to go full-on philosophical, I like to bring in Aza Raskin. He's the co-founder of the Center for Humane Technology. You might recognize him from an earlier episode of this season called The Weaponization of Loneliness. He's one of my favorite people to talk to about our complicated relationship with tech and what's coming next.

(01:37):
So here's a taste of the questions we're going to ask. In the COVID-nineteen era, could tech companies use the same persuasion tactics they use to get you to click to help save lives?

Speaker 3 (01:49):
They could use their behavioral targeting, their twenty-first-century technology, to get the most important information to the right people and help them make the changes that will save hundreds of thousands, even millions, of people's lives.

Speaker 1 (02:02):
Where will we land when it comes to protection versus privacy?

Speaker 3 (02:06):
It's as if there's a tree and it's falling, and our job now is to make sure it falls in the right direction, cutting that little bit, because where it lands, it lands forever.

Speaker 1 (02:16):
And as we stare at our reflections over Google Hangouts or Zoom, what will be the long-term effects?

Speaker 3 (02:23):
Right now, I think we as a species are confusing screens for mirrors. We look into the screen and think we see ourselves reflected back, but it's really through a funhouse mirror.

Speaker 1 (02:35):
This episode is a little different. We recorded it with
a live audience on Zoom for an organization called Reboot.
We'll post the link in the show notes. Now, you're gonna hear some questions from listeners too. And be sure to sign up for our newsletter on dot dot dot media dot com slash newsletter. We're launching it soon and super excited about it. But most importantly, let's explore this

(02:58):
strange moment where humans and tech have become more intertwined than ever. I've got some serious questions about the future. I'm Laurie Segall, and this is First Contact. I guess maybe my first question is, how would you, as someone who has looked at our complicated relationship with technology.

(03:19):
You were talking about the attention economy long before a lot of folks, and how technology companies were using persuasive tactics to kind of hold us and make us slaves to our devices. How would you currently describe our status with technology?

Speaker 3 (03:35):
You know, one of the most frustrating things that I think is happening now in technology is this: we were having a technology backlash, and rightly so, as we realized that, you know, the attention economy, the need to grab our attention, was the problem. Most people will say that the business model of technology is by and large

(03:57):
an ad-based business model. But that's to miss the point, which is that it's not exactly an ad-based business model. It's a model that makes money when it can change our biases, our beliefs, and our behaviors, and demonstrably so. And so that was a driving factor for our tech addiction, for information overload, for the increasing polarization in our systems,

(04:20):
the increasing incentive for misinformation. We have deepfakes hacking our ability to know what's true and not true. And all those things are still true. And yet technology is this incredible lifesaver that's connecting us. But we're all sitting in our homes, and the only way we can see past the confines of our walls is to put on

(04:41):
the binoculars, the telescope, of technology, and whatever way it warps our view of ourselves, of each other, of the world, that is now going to become the only way in which we see the world, and it will become ourselves.

Speaker 2 (04:57):
Yeah, I mean, it's essentially become.

Speaker 1 (04:59):
The conversation before this was like, okay, how do we step away? The technology has done all these dangerous things. We need to have the conversation about regulation and really try to have these broader conversations.

Speaker 2 (05:10):
And now we've gone all in, right? I joke to my friends.

Speaker 1 (05:13):
I'm like, oh well, we've swallowed the red pill. We are reliant on technology in a way that I thought we couldn't even be, and now there are all these new complicated questions that are going to come along with it. We're almost slaves in a new way to technology. I saw something you wrote in a Medium post for the Center for Humane Technology.

Speaker 2 (05:32):
You guys were.

Speaker 1 (05:32):
Talking about, okay, so what is tech's responsibility during this, right? We're no longer villainizing tech, I think, in a certain way.

Speaker 2 (05:41):
Now you're saying, all right, we need tech to act.

Speaker 1 (05:43):
You wrote, act more like a thoughtful journalist, act like
a doctor that cares about ethics and choices during this time.
So what does technology acting like a thoughtful doctor or
journalist during this time actually look like?

Speaker 2 (05:57):
Like, how does that play out on the screen?

Speaker 3 (06:01):
I think one of the most common myths, especially for the people who make the technology, is that our platforms, our technology, are just neutral, that they don't have a direction they want to steer people in. But we know technology is highly persuasive. And so what we meant by technology that acts more in your fiduciary interest, that is, on your

(06:22):
behalf, towards your benefit, is this: right now, Apple, Facebook, Google, they're showing and using nineteenth-century PSA technology to convince people to change their behavior: stay home, wear a mask. And it's just a link to a website, like the WHO's.

(06:42):
But we know that's not enough. That doesn't change behavior. Technology is the only thing that can run in front of the exponential curve to three billion people, especially the people in the Global South, and change behavior. So instead, they could use their behavioral targeting, their twenty-first-century technology, to get the most important information to the right people and help them make the changes that will save

(07:03):
hundreds of thousands to millions of people's lives. Here's an example. In twenty ten, Facebook ran a study where they showed one message, one time: go vote. But instead of just saying go vote, and here's some information about why you should, they showed the faces of six of your friends who had already gone to vote. And that one message,

(07:24):
one time, using social proof, got three hundred and forty thousand people out of their seats and to the polling places who wouldn't have gone to vote otherwise. Right? This is a little bit of bits changing a lot of atoms. And they matched it up to voter registration; they were able to sort of show a causal effect.

(07:45):
And they're not using any of that right now to help people change their behavior in a way that's beneficial to us all. Another example of this, if they wanted to really take these messages and make them personal in a way that only technology could: how about they show how many days we're behind Italy, or behind another country, and then show, you know, in

(08:09):
Italy it was, I think, ninety-five out of a thousand people who were infected died. You could show what that would be equivalent to in your community. Show the faces of the people most at risk, to really bring it home and make it personal.
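
The localization Aza describes is, at bottom, simple arithmetic: scale a reference region's rate to a local population. A minimal sketch; the function name and all figures are invented placeholders, not real epidemiological data:

```python
# Scale a reference region's fatality rate to a local community,
# the way Aza suggests making abstract statistics concrete.
# All numbers below are placeholders, not real data.

def localize_message(deaths_per_1000: float, community_pop: int,
                     days_behind: int, reference: str) -> str:
    projected = round(deaths_per_1000 / 1000 * community_pop)
    return (f"Your area is {days_behind} days behind {reference}. "
            f"At {reference}'s rate, that's roughly {projected} lives "
            f"at risk in a community of {community_pop:,}.")

print(localize_message(95, 50_000, 10, "Italy"))
```

The point of the framing is the last step: a rate per thousand somewhere else becomes a count of neighbors here.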

Speaker 1 (08:27):
It's like these product decisions. I think a lot of folks, unless you explain it, and this is the work that I did even at CNN, interviewing tech founders, don't see these little product decisions that are made that make such a difference for humanity.

Speaker 2 (08:44):
Like the color of the notification? What color is it?
Is it red? That makes your.

Speaker 1 (08:48):
Brain actually want to click on something? So how do
you use those persuasive techniques to actually save lives at
a moment where technology quite literally could save a life, or do quite the opposite? And now the stakes are even higher for folks in your field.

Speaker 3 (09:04):
There are teams in every one of these companies that, you know, are called growth teams or growth hackers, and they're the reason why these companies are so good at exploding their user numbers exponentially. Right? These are teams of engineers and designers that have honed their skills at

(09:24):
growing companies from, you know, thousands of users to millions of users literally overnight. Those techniques and those teams now need to be repurposed. Every single one of those teams should be taken off of their growth work and put onto an anti-viral growth team. And that's something that the companies can do right now. But so far as we've seen, that hasn't happened.

Speaker 1 (09:46):
I thought there was one interesting example you gave about, you know, what's the difference between when you put up a thing that says social distancing versus, okay, don't actually,

Speaker 2 (09:53):
You know, go to the groceries. These are these are
very specific.

Speaker 1 (09:56):
Messages that are put out on social media that can actually make a difference. Can you explain that example you guys talked about in the blog post? This is an actual product decision, very specific, that can have a different outcome.

Speaker 3 (10:11):
Yeah, exactly. Generally speaking, if you want a message to really land, it should be concrete. It should be personal and relate to you. It should take things which are hard to see, things that are far in the future, and make them tangible, feelable to you right now. And so in this example, instead of saying stay home,

(10:35):
what does that really mean to be staying at home?
Does that mean you're going to the grocery every two days?
Does that mean you're going to the grocery every week?
When you say wash your hands often exactly, how often
is that? When you wear a mask? At what point
do you leave your house? Do you wear a mask?
If we codify the very best practices and make a
sort of specific checklist, that becomes much more persuasive than

(10:57):
a generic, like, "just keep physical distance."

Speaker 1 (11:01):
I want to talk a little bit about surveillance and privacy during this time. I think, what an interesting moment we're sitting in. You know, Apple and Google are building out contact-tracing technology. And I know that after nine eleven, in order for people to feel safe to go out again, they had to make changes at the airports, right? They had to bolster security. You know,

(11:24):
it's not like there's just going to be a switch that we flip on and everyone's going to feel comfortable to go out, because there's not going to be a vaccine for a very long time. So technology, to a degree, will be part of that switch that we flip, you know. And I think looking at these tech companies building out contact tracing and saying, you know, we have technical solutions that can help us feel safer

(11:44):
is very interesting and notable. But it brings up the question of privacy versus protection. Can you give us a little bit of the lay of the land and what we need to keep in mind? As Apple and Google build out some technology that might enable some kind of contact tracing, what should people be aware of, and what are the hard questions we need to be asking our

(12:05):
tech companies and the developers that end up building out
apps based on this.

Speaker 3 (12:10):
Yeah, you know, I think one of the things that
I constantly struggle to remind myself of is that there is no returning to normal, to where we were. That world does not exist anymore. It's as if there's a tree and it's falling,
and our job now is to make sure it falls

(12:30):
in the right direction, cutting that little bit, because where it lands, it lands forever. And the decisions we make now are going to be true in twelve months; they're going to be true in twenty-four months, long after, thankfully, the virus is in our past. The decisions and norms we set now will continue. And that's a really hard

(12:51):
thing to hold in our minds when so many people's real lives are in the balance right now. So, some of the decisions: contact tracing. What is contact tracing? Contact tracing is a technique that Singapore used very well, South Korea used very well, Taiwan used very well, which is, when somebody becomes sick, find every person that

(13:14):
person interacted with for the last fourteen to twenty-eight days, because that way you can inform them that they're at high risk of also becoming sick, so they can self-quarantine to stop the exponential spread. That's a lot of manual work. Now, the naive way of implementing this would be like, cool, why don't we just take everyone's GPS location? Or, our cell phone towers actually understand

(13:38):
where we are. They can triangulate and get our position. We could just de facto use that to figure out who you were with and inform everyone who's at risk. And in fact, we know that our government is starting to look at this, and that the Israeli government sort of turned this on as a feature. And the question is, will anyone ever turn it off? I don't know if you

(13:59):
saw this, Laurie. There was a company, whose name I will not mention, demoing their product, showing everyone who had been on a Florida beach at spring break and tracing them back to their homes. They just selected a little area on the beach and then could see which homes they had gone to. And

(14:22):
it's not like anyone had opted into that. It's that your phone has a whole bunch of ad APIs in it, in various apps, which are reporting that data back, and companies are aggregating it. The New York Times did an incredible exposé on this kind of data. So the fear is that we will just turn on a surveillance apparatus and then
never turn it off. Now, what Apple and Google are

(14:44):
doing is interesting. They're turning it into an API, which is to say they're not exactly building an app; they're giving new functionality to their OS. And so always watch for whether companies are doing something for their own interest and dressing it up, or whether they're actually working on things to directly solve the problem. To directly solve the problem, we need eighty percent of the population to have contact tracing.

(15:05):
Doing it as an API, letting other people develop on top of it, is not going to actually get to the numbers we need to actually make the change. But they're doing it in at least a really interesting way. Every phone sends out a randomized ID; it's sort of a random piece of noise data. It's like making up a word and just saying it, and it changes words every five or twenty minutes or so. Other phones around it hear that sort of random word.

(15:27):
But there's no centralized server; no one knows who is saying what words. And so, in the end, if you get sick, your phone uploads to a server all the words it's been saying for the last twenty-eight days, these random words. Other phones then connect, download them, and are like, oh, if I've been around that weird word that was said, I'm probably at risk. And that way you can do it in a privacy-preserving way.
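
The "random words" scheme Aza walks through can be illustrated with a toy simulation. This only sketches the matching logic; the real Apple/Google exposure-notification system derives rotating identifiers cryptographically and exchanges them over Bluetooth, and every name below is invented:

```python
# Toy model of the rotating-token matching described above.
import secrets

class Phone:
    def __init__(self):
        self.said = []      # random "words" this phone has broadcast
        self.heard = set()  # words heard from nearby phones

    def new_token(self):
        token = secrets.token_hex(8)  # rotate to a fresh random word
        self.said.append(token)
        return token

def encounter(a, b):
    # Two phones near each other exchange their current words.
    a.heard.add(b.new_token())
    b.heard.add(a.new_token())

def exposed(phone, uploaded_tokens):
    # If I've heard any word a sick person's phone uploaded, I'm at risk.
    return bool(phone.heard & set(uploaded_tokens))

alice, bob, carol = Phone(), Phone(), Phone()
encounter(alice, bob)                 # Alice and Bob were near each other
uploaded = alice.said                 # Alice tests positive, uploads her words
print(exposed(bob, uploaded))         # True  -- Bob was exposed
print(exposed(carol, uploaded))       # False -- Carol never met Alice
```

Notice the privacy property: the server only ever sees meaningless random tokens, and matching happens on each phone.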

Speaker 2 (15:49):
Do you feel good about it?

Speaker 3 (15:52):
No, I don't. I think the smarter response is to scale up testing. We're sort of past the time now where contact tracing, I think, is going to make a huge impact, because we're seeing widespread community spread. And so instead we should work on the other side, which doesn't have those kinds of surveillance-state problems.

Speaker 1 (16:16):
Okay, we've got to take a quick break to hear from our sponsors. More with my guest

Speaker 2 (16:20):
After the break, I.

Speaker 1 (16:39):
Want to look at this idea of us almost becoming like slaves to technology, in a way. You know, our only means toward human connection is, you know, through a screen. I remember the last time I interviewed you, we were in person. So there's a lot gained, in that we can have this conversation and be around eighty-six people we wouldn't.

Speaker 2 (16:58):
Have had access to.

Speaker 1 (16:59):
And then there's a lot, you know, that's lost when you lose the ability to sit in front of somebody and be with them and have those small things that really make us human, to a degree. And I don't know if folks on this Zoom know, but you are behind some of the most interesting design

(17:19):
decisions that have happened on the Internet. I mean, I don't know which one you want to take credit for here, but we could say, like, the infinite scroll, right? You think a lot about design. So part of the reason that you can keep scrolling is because of Aza, compliments of Aza. So you are, you know, an architect of the modern Internet as we know it. And I'm convinced that this will, you know, cause, I

(17:41):
think, the modern Internet to change in a way, because we are so reliant on it and we want these places to feel a little bit more human. So what do you think is broken about the way we communicate right now, in this way? And how can we make it feel more human, now that we've gone all in, full-on digital?

Speaker 3 (17:57):
Yeah, I mean, our screens are not particularly ergonomic. They don't really fit our minds and our bodies, and we feel it, right? We are on Zoom all day long, and yet we still feel disconnected. And the way I like to imagine it is, imagine there is a pipe connecting you and me, and all of our human empathy has to go through that tiny little pipe.

(18:20):
The shape of that pipe is really going to change us.
If we sit in a chair which is unergonomic for
two hours a day, fine, it hurts a little bit.
If we sit in it for eighteen hours a day,
we're going to really feel it. It'll change the way
we walk, it will change the way we feel. That's
what's happening with our online ecosystems now is we're spending
all of our time in it. So any way that

(18:40):
technology is misaligned with our humanity, we are going to feel.
And so it's little things like with Zoom. There's no
way in Zoom for me to wander away from the conversation.
If you're standing in a room and I'm feeling a
little overwhelmed, I can just like walk over to an
empty corner. If there's an interesting conversation happening, I can

(19:01):
sort of like sidle up to it. But on Zoom
we have to be on all the time. There's no
space for doodling. And I think there's going to be
a lot of fascinating technology that comes out of now
because our thermometer for what's broken is so much more
sensitive than it was before. So some things I'm really
excited about that I've seen from other people prototyping are

(19:26):
cute eight-bit graphics where you're represented as a little character and you're wandering around a screen, and when your characters get close to each other, your video fades in and your audio fades in, and when you get a little further away, your audio fades out. And what's neat is here you can have one hundred people in a large room, or in a digital park, and you can walk over to a group and participate in the conversation, and then walk away when you're feeling a little less engaged. And I think that's fascinating.

Speaker 1 (19:54):
Yeah. I think, was it maybe you that said it? I was in a group chat with a bunch of friends the other night, and it felt like a friend webinar.

Speaker 3 (20:02):
Yeah, a friend webinar.

Speaker 1 (20:04):
Yeah. The technology doesn't mirror the human experience. It doesn't provide the serendipity of you being able to kind of tune out for a minute and come back. And so it'll be interesting to see if there are new virtual communities that are built on top of that.

Speaker 3 (20:17):
You know, one of the things I think I'm most excited about right now is that Burning Man announced that, in their words, Burning Man isn't canceled; it's going virtual this year. And that's going to be fascinating, right? How do you take a group of people that live in win-win game dynamics, radical self-expression, and translate

(20:37):
that online? Here is an opportunity for creating new kinds of environments, with new game theory, that can be seeded out to the entire world. Burning Man has set the culture for much of, you know, California, the US, Silicon Valley. This is, I think, a huge opportunity,
and maybe it's going to look a little bit more

(20:59):
like a twenty-twenty version of GeoCities. Who knows? But I'm excited for it. Yeah, that's a place to dive in.

Speaker 2 (21:05):
Yeah.

Speaker 1 (21:06):
I want to ask one of the questions that someone put in. It said: if the decisions around our use of tech now, as our main way to connect, will set our norms for the future, how do we be thoughtful to ensure that we don't become overly connected, tech-reliant, and screen-based when we're able to reconnect in person?

Speaker 2 (21:22):
I think that's a good question.

Speaker 1 (21:24):
I joke about what it's going to look like when we all have to come out, you know, and talk to each other. I think a lot of people have talked about, oh, it's just amazing, I have this real intimacy now. Even in the dating world, people are talking about, you know, FaceTime dating and how they're really creating all these real connections. But how does that really translate when we actually kind of emerge beyond the screen?

Speaker 3 (21:48):
So, you know, two weeks or so before the pandemic really hit here, I was like, I need to get a car. I hadn't had a car in years, and I needed a car to get out to nature, just to be safe. And so I got one. I'm very excited about it. It's a little adventure car, and I was like, fine, I'll just get a twenty-twenty edition. And it has driver-assist technology in it.

(22:10):
And this is a fascinating thing to dive into as an interface person, and I'm like, oh, I wonder what decisions they've made. And one of the most interesting decisions they made is this: you turn on driver assist, you're doing cruise control, and it has sort of a lane assist. That means if you sort of veer out because you're not paying attention, and your car tries to cross one of the lane markers, it'll steer you back in. And I'm like, oh,

(22:34):
that's cool. So, you know, I do the thing: I take my hands off the wheel and I see what it'll do. And it oversteers me. That is, it doesn't correct me back to being in the perfect line. It steers me so that if I keep my hands off the wheel, I'll then ricochet off the other side. And after the first time, it turns itself off. And this is really interesting, because it's handing agency back to me.

(22:55):
It says: I will catch you when you miss, human, but as soon as I can give you back agency, I will give you back agency, because I don't want to lull you into a false sense of security. And if you're going less than twenty-five miles an hour, it won't do it at all. And this, I think, is a really interesting concept. Why don't they do it at less than twenty-five miles an hour? Because they don't want to assume the liability. And this leads us to a

(23:17):
general principle: the degree to which technology takes over our agency should be the degree to which our technology platforms assume the liability for doing so. An example of that, right now with YouTube: seventy percent of all

(23:40):
watches on YouTube come from algorithmic suggestions. That is, they're taking over seventy percent of the agency of what we decide to watch, so they should assume seventy percent of the liability for what gets shown. So what technology companies can do as we re-emerge is build those same kinds of bumpers, which hand the agency back to
(24:01):
people, and be like, yeah, we know that all of humanity has now become addicted to being online, to that false sense of connection, that sort of sugary feeling, the one where I can be on Zoom all day and still feel disconnected. And they can gently push us back into making being in the real world, with real people, the easiest thing to do.

Speaker 2 (24:22):
What do you think they can do and what do
you think they will do?

Speaker 1 (24:25):
Oh?

Speaker 3 (24:25):
Yeah, well, I'll just pose this question: in any part of our lives that technology has increasingly colonized, when have they ever been like, oh cool, I'm going to uncolonize that for you? They never do that, because it's not in their business model. One of the most hopeful things I've seen is that the city of Amsterdam moved their model from an extractive economy, where

(24:48):
you have to take out as much energy and resources from the earth as possible, standard capitalism, to a regenerative model for their future. They're like, this is a discontinuous moment; we're going to move to a doughnut theory of economics, which is a different theory, well worth diving into, and mostly says there's some amount of resources below which

(25:08):
you're not feeding people and you're not doing well as a society, so you shouldn't be there; people don't have enough housing. And there's some amount which is just too much: you're extracting so much from the environment that you have lots of externalities. You want to stay in that middle zone. I think we need that for technology. We need regenerative technology, and that has to come from protections and regulations, because the companies have

(25:29):
shown they're just not going to move there on their own.

Speaker 1 (25:32):
What will be the types of tech companies built on this moment? Like, if you could pick one or two industries that you're super excited about, you know, what do you think is going to be the most interesting tech company built out of coronavirus, global-pandemic isolation? You know, what will be the opportunity here?

Speaker 3 (25:53):
Yeah, high-bandwidth connection, I think, is what we're all going to be looking for right now. If you want to connect with people, you sort of have Zoom for video. But it's weird, because often when we're working, we're working on something specific: a document, we're drawing something together, we're looking at photos together. And then on the other screen

(26:16):
we have the video chat. I think we're going to
see video chat as a service get integrated into everything.
There's an incredible design program called Figma that I've only
recently started playing with, and it's neat because it's a
huge canvas that you can zoom in and out of
and you can see everyone else's cursors in real time,
moving things around, and it feels very collaborative. That kind

(26:40):
of truly social computing environment I think is going to
get birthed now because we don't have the benefit of
just like walking over to somebody's desk.

Speaker 1 (26:48):
I'll pull in a question from a panelist, because I think this kind of goes to the ethical question of, are we going to create even more of a world of the haves and the have-nots at this moment, with the digital divide? How do we help what will inevitably be a growing digital divide? You know, the person in the chat fairly says not everyone can afford the

(27:09):
basics to access the internet, and also people are afraid of technology, so you have more and more isolation.

Speaker 2 (27:16):
So you know, what do you say to.

Speaker 1 (27:18):
This world that we're going to see where there will
be a growing divide between the digital haves and have not.

Speaker 3 (27:24):
Yeah. I think this pandemic, like pretty much every global crisis that's come before, is like a UV light that you turn on, and it shows you where all of your systemic fragilities in society are. And we're seeing that the pandemic is disproportionately hitting our most vulnerable. And it's bad

(27:44):
in the US, and just imagine how bad it is going to get in the global South. So I think, and I hope, that by being able to see the inequalities with greater acuity, it'll give us the opportunity to address those problems.

Speaker 1 (28:09):
Okay, we've got to take a quick break to hear from our sponsors. More with my guest after the break.

(28:31):
Whenever we hang out, I always just try to, like, create a whole Black Mirror episode of what the future could look like if

Speaker 2 (28:38):
We're not careful.

Speaker 1 (28:39):
And we've talked about tech's next big threat, which is almost like the weaponization of loneliness: you know, vulnerable communities of older people who are isolated. We talked about this before the coronavirus and before people were self-isolating. You know, now people are going to feel lonelier than ever, and they are self-isolating. And you know this,

(29:01):
we've got a long time before things feel quote-unquote normal again. How will technology be weaponized against people who are increasingly lonely and vulnerable, and reliant on technology for connection?

Speaker 3 (29:13):
Oh yeah. This is, I think, one of our favorite topics, because it goes so dystopian. But I had one more thought on the digital divide and vulnerable populations before we dive into that, and that is, it can be really tempting to say, ah, we just need to get technology into people's hands, and that'll solve the problem.

(29:35):
But the Philippines is a really good example of where that went terribly, terribly wrong. The Internet in the Philippines is pretty much all Facebook Free Basics, where they subsidize free plans, essentially, but you have to be using Facebook. In the Philippines, there's one hundred percent market penetration, and they

(29:56):
spend ten-plus hours a day on it. They lead the world; they're sort of the canary in the coal mine. And one of the aspects of Free Basics from Facebook is that they would let you see the headlines for news articles, but you had to pay to click in and view them. And what that meant is they turned an entire society

(30:16):
into a headline-only society. So, its polarization: it went from not being a polarized country to, in the last ten years, being one of the most polarized countries. And so we need to be very careful that when we set out to solve the problem of the digital divide, we don't just onboard the most vulnerable to the places where

(30:39):
they can then be targeted, and that leads us right
into the loneliness epidemic, which is already a problem before
the actual pandemic got started. There's some new technology,
just to really, really lock this in people's minds. It's
called style transfer for text. So there's this thing

(31:00):
called style transfer for images, which is an AI technology
that lets you look at one image and immediately
apply its style to another image. That is, I
can point the AI at Chagall,
and now it can paint your portrait in the style of
Chagall or Picasso. And this works just

(31:22):
like that, but for text. That is, I can point
AI at, say, everything you've written on social media and
learn the style that you write in, or look at
all the comments that you respond quickly or positively to,
or, if I'm Gmail or Facebook, look at all the
emails that you've written and responded to quickly and

(31:43):
positively to, and learn the style that is most persuasive
to you. I think what's going to be even more
dangerous than visual deepfakes is textual
deepfakes: the ability to generate arbitrary amounts of content.
Both Google and Microsoft have been working on chatbots whose

(32:04):
goal is empathy. That is, we think of empathy as
like the most core and wonderful part of our human experience,
which it is, and we think of it as the
thing that will save us. But it'll be the biggest
backdoor into the human mind. Like a lonely person is
a person looking for a friend. Xiaoice, which is

(32:25):
now deployed to six hundred and sixty million people, is
Microsoft's empathetic AI. She is an eighteen-year-old girl,
in their words, who is always reliable, sympathetic, affectionate, knowledgeable,
but self-effacing, and has a wonderful sense of humor.
An example? They give lots and lots of examples
in their papers, of a user mentioning that she broke
(32:48):
up with her boyfriend and seeks out Xiaoice, and through
a long conversation, she demonstrated full human-like empathy and social
skills, and eventually helped the user regain her full confidence
and move forward. They have a skill called Comforting Me
for Thirty-Three Days, and it's been triggered fifty million
times: when negative user sentiment is detected, it

(33:11):
reaches out and gives you deep empathy with somebody who's
always there, always knows your topics. Friendship is the most
persuasive form of technology we've ever invented, and we're just
going to unleash that on the entire world.
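[To make the "learn the style that is most persuasive to you" idea concrete, here is a deliberately crude, non-neural sketch. It builds a word-frequency profile from an author's past messages and scores candidate texts by cosine similarity to that profile. All sample texts are invented for illustration; real style-transfer and persuasion systems use large neural language models, not word counts.]

```python
from collections import Counter
import math

def style_profile(texts):
    """Build a toy 'style signature': word frequencies across an author's texts.
    Real systems learn this with neural language models; word counts are only
    a stand-in to make the idea concrete."""
    counts = Counter()
    for t in texts:
        counts.update(t.lower().split())
    return counts

def style_similarity(profile, candidate):
    """Cosine similarity between a style profile and one candidate message."""
    c = Counter(candidate.lower().split())
    dot = sum(profile[w] * c[w] for w in c)
    norm = math.sqrt(sum(v * v for v in profile.values())) * \
           math.sqrt(sum(v * v for v in c.values()))
    return dot / norm if norm else 0.0

# Invented sample texts standing in for "everything you've written on social media".
target = style_profile([
    "honestly I just think this is great",
    "honestly you should just try it",
])

# A message mimicking the target's style scores higher than an off-style one.
print(style_similarity(target, "honestly just try this"))
print(style_similarity(target, "please review the attached quarterly report"))
```

[The point of the toy is only the asymmetry: any system with enough of your text can rank candidate messages by how much they sound like messages you respond to.]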

Speaker 2 (33:25):
I mean, it's interesting.

Speaker 1 (33:26):
There's a company that we've talked about called Replika, where
they have bots that folks use as companions. Oh, and
millions of people are using these, and I spoke to
the founder, and she said that, you know, since this happened,
the coronavirus, all this stuff, people are
self isolating, they're turning to their devices and to these chatbots
for relief, and much of that can be good, but

(33:47):
also we have to look at the downside as well
and how that could be weaponized for misinformation and all
sorts of stuff. So before I take us too far
down the Black Mirror rabbit hole, which we know I
can go, I'll try to pull in a question someone asked
about elderly communities, because they are vulnerable. You know,
Jody says, we've especially seen the isolation with our senior

(34:09):
retired communities. How does tech going forward ensure the collaboration
and inclusiveness of our aging demographic. So obviously there are
ways that they can be manipulated. And we've just kind
of gone down that rabbit hole a little bit with
chatbots and young people, and I actually think people really
do need to pay attention to this because it seems
a little far out there. But both
of us know that this stuff seems out there until

(34:31):
your kids are using it. Until millions of people around
the country are using it, you know, so let's look
at how technology can help, especially in these older, more
vulnerable communities.

Speaker 2 (34:42):
What do you think can happen especially during this time.

Speaker 3 (34:45):
Yeah, it's hard. Like, the idea of an always-on
or persistent video connection, so you can just have that
kind of immediacy, is really nice. You know, for the
first time, most of my family, or at least my parents'
family, are in New York, and so getting to go
to Seder with them or have a Zoom call with

(35:08):
all of them, that's been pretty amazing. And I'm hoping
that kind of sets a new norm where you don't
have to be physically co present to participate and be included.
I think of this pandemic as a kind of collapsing function.
It's collapsed space, so it doesn't matter where somebody is

(35:28):
for us to stay connected. It's collapsed time. Is it
a weekend? Do we really know? It's collapsing our healthcare system.
And I think there's I mean, it's simultaneous utopia and
dystopia at the same time. So I think that kind
of persistent or ambient awareness of who's around us, if
we can do it in a privacy preserving way, not

(35:50):
in a kind of Amazon Alexa eavesdropping-on-you-all-the-time
way, is really interesting.

Speaker 1 (35:56):
But can we I mean, you know tech companies, and
you and I both have spent a lot of time
you building tech companies, me interviewing the folks who have
built tech companies.

Speaker 2 (36:06):
Can we come out of this moment? Will they let
us go?

Speaker 3 (36:10):
No? I don't think they will unless we change the
business model. I think a lot of the companies are enslaved too.
They've sort of gotten high off of products that they
can release for free and then monetize by changing
our behaviors and beliefs, and they are going to have
to be much less profitable. This is why, as technology

(36:31):
increasingly becomes more powerful, more dominant of us emotionally and cognitively,
the degree to which it can dominate us has to
be the degree to which it has to act in
our best interest. Otherwise the trend will continue, right?
The graph is like this: here are human strengths. Everyone's

(36:53):
always watching out for that moment when technology surpasses human strengths.
But we've all missed that earlier point, when technology starts
to undermine human weaknesses. And as technology starts to further
undermine human weaknesses and pass human strengths, the only way
for us to solve this is by stepping back, reevaluating,

(37:13):
asking why do we make technology in the first place.
We made technology to extend the parts of us that
are most human, to extend our creativity. That's what a paintbrush is,
and that's what a cello is. It's an extension
the parts that make us extra human and not superhuman.
And that takes a groundswell, paradigm-discontinuous shift in

(37:37):
our relationship with technology. But we're in a groundswell,
discontinuous time. This is the moment where we get
to decide: are we going to come together and use this
to create a sense of interdependence, of acknowledging that technology
is giving us god-like abilities without at the same
time giving us god-like wisdom? Or are we going to continue on with

(38:00):
business as normal, sort of burying our heads in the sand,
in which case all of these incredible things that technology
could do will be continually subverted by the market forces.

Speaker 1 (38:10):
Interesting point, and I think it certainly remains to be
seen in the next year. I don't think even in
my time covering tech, we've ever had this moment. You know,
we've seen the boom, we've seen the unintended consequences era,
and I think this will certainly be a point where
there's a lot to be gained and a lot to
be lost. Before we kind of close out, I just,
because I think it's so important for folks to

(38:32):
know what you are working on now. Even before all
this happened, you have been asking this question of could
you use machine learning to decode what animals are saying?

Speaker 2 (38:42):
Oh my goodness, So.

Speaker 1 (38:44):
First of all, could you just could you give us
a little bit of a sense? This is a project
you've been working on for a couple of years
that you've just launched: the Earth Species Project.

Speaker 2 (38:53):
You know, what exactly is it?

Speaker 1 (38:55):
And then what are the broader implications for kind of
the future of humanity and how and that play into
this moment.

Speaker 3 (39:01):
Yeah. So Earth Species Project was just covered by Invisibilia,
and if you want, like, the full long version, their
season opener, it's called Two Heartbeats a Minute, is
I think some incredibly beautiful storytelling. The insight is that
AI now lets you translate between any two human languages

(39:22):
without the need for any examples or a Rosetta Stone,
and the way it does it is really beautiful. This was,
by the way, only discovered at the end of twenty seventeen.
You use AI to build a shape that represents the shape
of a language as a whole. So here's the shape
for German, here's the shape for Hebrew. And surprisingly, you
can just rotate the two on top of each other

(39:43):
and the point which means dog in both ends up
being in the same place. And that's not just true for
Hebrew and German, but Japanese and Esperanto, and Finnish and Turkish.
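[The "rotate the two shapes on top of each other" step corresponds to aligning two embedding spaces. The toy sketch below, with invented data, fakes a second "language" by rotating the first and then recovers the alignment with orthogonal Procrustes. The unsupervised 2017 methods Aza alludes to learn the alignment without any known word pairs; this sketch cheats by assuming matched pairs, to show only the rotation step.]

```python
import numpy as np

# Invented toy embeddings: 5 "words" in a 3-dimensional space for language A.
# (Real word embeddings come from models trained on each language separately.)
rng = np.random.default_rng(0)
lang_a = rng.normal(size=(5, 3))

# Fake language B as the same "shape", just rotated, with matching word order.
theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])
lang_b = lang_a @ rotation

# Orthogonal Procrustes: the best rotation mapping A's point cloud onto B's.
u, _, vt = np.linalg.svd(lang_b.T @ lang_a)
w = u @ vt
aligned = lang_a @ w.T

# After the rotation, "dog" in language A lands on "dog" in language B.
print(np.allclose(aligned, lang_b))  # True
```

[The design point: because the mapping is constrained to a rotation, the internal "shape" of each language is preserved, which is exactly why the alignment can be found even without a dictionary.]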
And I think this is exceptionally beautiful and profound because
this means that despite all of our differences, our differences
in content and histories, there's something about the way we
see and express ourselves in the world that is the

(40:05):
same across all human cultures. And that's just that's I
think a really important message now in the time of
such division. And the question is what does that shape
look like for animal communication? Does animal communication look more
like language or music or three D images? Like what
is it? We can start asking some really interesting questions

(40:28):
and why? I think it's because shifts in perspective can
sometimes change everything. One of my big inspirations is actually sitting
up there on the wall: Songs of the Humpback Whale.
This is Roger and Katy Payne; in nineteen sixty-eight they released

(40:49):
this album, which is the voices of another species, and
the effect that it had was profound, you know. Carl
Sagan puts it on Voyager One to represent not just humanity
but all of Earth; it's the very first track after
the human greetings. And it inspires Star Trek IV, which is amazing,

(41:10):
and it starts the Save the Whales movement and eventually
changes will, which changes policy. When human beings were on
the Moon, when we were all dosed with the overview effect,
seeing ourselves floating as a tiny dot in space, that
is when the EPA came into existence, NOAA came into existence,
and the environmental protection movement got going. And I think

(41:34):
both the Center for Humane Technology and the Earth Species Project
are about changing the stories we tell ourselves, to change
the way that we live. And one of the small
silver linings of this pandemic is that it gives us
the opportunity to see our lives from a new perspective,
and we can see that maybe the things we're doing

(41:55):
weren't the things that were sparking joy in our hearts,
and I wonder when we all come back from this,
whether people are going to want to be stuck in
the same kind of rat race that they were before.
This gives us the opportunity to ask what really.

Speaker 1 (42:09):
Matters as we come back into this, like as we
get back into the real world, if we will and
we emerge, what do you think is the single most
important ethical question we need to ask when it comes
to the future of us complicated humans and complicated technology.

Speaker 3 (42:28):
I mean, right now, I think we as a species
are confusing screens for mirrors. We look into the screen
and think we see ourselves reflected back, but it's really
through a funhouse mirror. So the most important question that
we need to be asking ourselves is who do we
want to be? And what environments do we want to create

(42:50):
for ourselves to live in that shape us to be
those kinds of people.

Speaker 1 (42:55):
I'll end with because I had the opportunity to ask
you many questions, I'll end with an audience question that
kind of goes off of this. Someone asked: an inescapable
piece of video communication is that we see ourselves in
our own screens. And I know for me that's incredibly
disruptive to the communication that's happening. Beyond that, it gives

(43:15):
me a heightened sense of self awareness that I don't
have during physical interactions. It wasn't so long ago that
most people didn't have mirrors in their homes. Now I
can't have a meeting with colleagues without staring at my
own self, I know. Someone yesterday mentioned on a
call with us this idea of children zooming into schools
and looking at themselves too. What will be the long

(43:38):
term impact of what you kind of just described as
these funhouse mirrors of us always kind of looking at
ourselves in this way, at least for this time period.

Speaker 3 (43:47):
Yeah, this is, I think, the core thing we
need to solve as a technology industry: there is
no doubt that we are sophisticated about technology,
and the call now is to become as sophisticated about
human nature as we are about technology, because otherwise we're
going to build these systems of enforced narcissism,

(44:08):
where we have to stare at ourselves. And it hadn't
dawned on me that every high schooler zooming into a
class now has to stare at themselves. Look at this:
There's a recent poll that showed in the US, it
used to be that the top thing kids wanted to
be was like astronauts and engineers, this kind of thing.
The number one thing that kids want to be today

(44:31):
is a YouTube vlogger or an influencer. And that's because the
decisions we make in our products, these seemingly inconsequential things
like showing your own face, have profound impacts not just
in the moment, but on the values by which we
make all of the decisions of our lives.

Speaker 2 (44:49):
We're going to get out of this, Okay, right, we.

Speaker 3 (44:51):
Will get out of this, but we all have to
work on it together to make sure we get out of this.

Speaker 1 (44:55):
Okay. Okay, guys, that's it for this week's show. Now,
I know these are strange times. If you're sitting at
home and listening to this, I'd love to hear from you.

Speaker 2 (45:11):
How are you doing? What do you want to hear
more of? Reach out to me.

Speaker 1 (45:15):
You can text me at nine one seven five four
zero three four one zero. Throughout the crisis, we'll be
hosting zoom town halls on a variety of issues like
mental health, love, sex, leadership, productivity with guests that I
think are interesting and relevant to this moment, So follow
along on our social media to join us for some

(45:37):
human ish contact. I'm at Lori Siegel on Twitter and
Instagram, and the show is at First Contact Podcast on Instagram;
on Twitter, we're at First Contact Pod. First Contact is
a production of Dot dot Dot Media, executive produced by
Lori Siegel and Derek Dodge.

Speaker 2 (45:54):
I will say we're.

Speaker 1 (45:55):
Being creative and executive producing this from home at the moment.
This episode was produced by Sabine Jansen and Jack Reagan.
The original theme music is by Xander Singh. I'm sending
my thoughts to each and every one of you, and
so is our whole First Contact crew during this time.
I hope that everyone is staying home, staying healthy, and

(46:15):
staying human. First Contact is a production of Dot dot.

Speaker 2 (46:18):
Dot Media and iHeartRadio.

Speaker 1 (46:25):
First Contact with Lori Siegel is a production of Dot
dot Dot Media and iHeartRadio.