Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
First Contact with Laurie Segall is a production of Dot
Dot Dot Media and iHeartRadio. You are building out a
new computing platform, You're building out a new social world,
and I don't actually know if people realize what a
big deal this is.
Speaker 2 (00:17):
I could not agree more. This is like the beginning
of computing again. You know, this is going back to
the nineteen sixties, the nineteen seventies, when the computers that
we know now, even the ones we have in our pockets,
really come back to the same fundamentals that were designed
way back in the middle of the last century. And
with VR and AR, suddenly it is a different paradigm.
Speaker 1 (00:52):
He's sitting in one of the most influential seats at Facebook, building out a new virtual social world.
Speaker 1 (00:59):
Andrew, better known as Boz, was among the first fifteen
engineers at Facebook. He still remembers the exact day he
started in two thousand and six. Over the years, he
helped build News Feed, he oversaw Facebook's advertising efforts during
the twenty sixteen election, and he's also become known for
his controversial and unfiltered opinions, which he's sent out in
(01:20):
company wide attention grabbing memos. Now Facebook is pursuing an
ambitious vision of virtual reality and augmented reality, and it's
Boz who's in charge. There is a lot at stake
as Facebook builds out what has potential to be a
whole new dimension. What will AR and VR look like
down the line, and how do you make virtual interaction
(01:43):
feel as natural as in person interaction? What will this
mean for the future of remote work? How close is
too close in the virtual world? And when it comes
to creating your virtual self, who owns your identity? These
are all ethical questions, and these are the types of
questions Facebook is looking at as they invest pretty heavily
in building out AR and VR, and as the company
(02:06):
creates another social layer. I think it's important to ask
Boz how confident is he that Facebook won't recreate the
problems they've faced as a platform over the last decade.
But before we get to Boz, I want to tell
you about something new from Dot dot Dot that I
am really excited about. It's our new email newsletter, The
Gray Area. Each month, the Gray Area confronts the complex
(02:28):
issues facing technology and humanity, issues that aren't necessarily black
and white. In October, we're exploring the controversial topic of
technology and neutrality, including an unfiltered perspective from a well
known Silicon Valley founder who says tech should not try
to be neutral and those building platforms now should start
(02:49):
building with that in mind.
Speaker 1 (02:51):
It's actually a fascinating take, and I hope you guys
don't miss out. You can sign up at dot dot
dot media dot com slash newsletter. And now, it's time
for Boz. I'm Laurie Segall, and this is First Contact. Boz,
you are called Boz. That's the nickname. That's the correct
way I should describe you?
Speaker 2 (03:11):
That's right. You're also welcome to call me Andrew. I
do respond to both names. From time immemorial, people have
preferred to call me Boz.
Speaker 1 (03:19):
Great. Currently VP at Facebook Reality Labs. But you've been
at Facebook since two thousand and six.
Speaker 2 (03:25):
That's right, January ninth, two thousand and six.
Speaker 3 (03:28):
I love, by the way, that you know the date
that you started.
Speaker 2 (03:30):
That's a big deal. It was also two days
after my birthday that I started. My birthday was January seventh. Wow.
Speaker 3 (03:37):
Do you remember your first day vividly?
Speaker 2 (03:39):
Yeah? Absolutely? And it was a bunch of really great
Facebook engineers joined at the same time that I did,
Mark Slee, Dave Fetterman. It was a fun time for
us and quite a crazy change. It came from Microsoft,
and so going to this little startup where like there
was just no one even to greet you just kind
of wandered in and found yourself a desk. It was
pretty different.
Speaker 1 (03:59):
Was your first conversation, like, with Mark that day? What
was the first thing he had you do?
Speaker 2 (04:03):
We had to go fix bugs, which is a tradition
that continues to this day. It was like, hey, here's
your desk, here's a computer, go fix some bugs.
Speaker 3 (04:11):
Right.
Speaker 1 (04:11):
I mean, for folks who are listening, the context
is you're one of the first fifteen engineers at the
company, and you helped build News Feed, Messenger, Groups. So
you have just been a part of every major milestone
of the company to this day.
Speaker 2 (04:26):
I've had the good fortune of working across a huge
breadth of the company. Of course, there's always so much more,
a lot of things that I've never gotten to work
on that are great. But I've had a really fun
set of projects to work on, connecting people and creating
these communication tools that people use every day.
Speaker 1 (04:43):
It's really satisfying and what an extraordinary time to be
sitting in your seat.
Speaker 1 (04:47):
And also, one of the things I love about you,
having been in the industry for a while and
having followed the company closely, is you're kind of one
of the executives that says how he feels, which we
as journalists, but also just as people, really appreciate.
But it is really fascinating because you've said some things
throughout the years that always get a lot of attention,
but you are a person at the company, at a
(05:10):
big corporate company who's kind of known for saying how
you feel.
Speaker 2 (05:14):
Yeah, I think it's saying how I feel, but also
trying to bring voice to conversations that are important is
a bigger part of it. I think it's tempting for
any company that gets big to get comfortable or to
get into a habit of not asking the hard questions.
And I've always wanted to be someone that doesn't matter
if I'm just a regular employee or if I'm an executive.
(05:34):
I want to be somebody who invites the hard questions,
who brings those to the surface to make sure that
we're always doing that work. And it's never been more
important than it is now.
Speaker 1 (05:42):
Yeah, I want to get to what you guys are
announcing and virtual reality and augmented reality, which I just
think is actually fascinating. You know, I think a lot
of times in technology, everybody has one conversation and we
totally don't look the other way. And I think
that you're sitting in probably one of the most influential
seats at Facebook.
Speaker 3 (06:03):
Given everything that's happening.
Speaker 1 (06:06):
You're almost building out you are building out a new
computing platform, You're building out a new social world. And
I don't actually know if people realize what a big
deal this is. You know, I know people have been
talking about VR for a long time. You've been talking
about augmented reality. I know that Mark, in
his non New Year's resolutions that he always posts on Facebook,
did this year a non New Year's resolution
(06:27):
where he posted something about augmented reality, fast forward, pandemic
remote work. It seems to me that you have one
of the most important roles at the company right now.
Speaker 2 (06:39):
It's hard to gauge its importance. You know, how would
you weigh the importance of new computing platforms versus bringing
greater integrity and privacy and security to existing platforms. I
think they're all important. I can certainly tell you it's
one of the most fun jobs at Facebook right now,
and I really want to double down. You said you
don't think people know how important this is, and I
(06:59):
could not agree more. This is like the beginning of
computing again. You know, this is going back to the
nineteen sixties, the nineteen seventies, when the computers that we
know now, even the ones we have in our pockets,
really come back to the same fundamentals that were designed
way back in the middle of the last century. And
with VR and AR, suddenly it is a different paradigm.
(07:23):
It's not just flat two D windows that you directly
manipulate somehow, whether with a mouse or your finger.
It's like in the world there's a bunch of elements
that the computer can't control and has to adapt to.
Not only was that impossible previously from a standpoint of
the displays, which still don't exist yet, but we think
they can, the sensors all that. It's also the AI,
(07:45):
the intelligence you need to be able to be useful
in that kind of scenario. So it does feel like
we're at the beginning of a really big arc in
progress for technology. Whereas the mobile phone was maybe the
end of the last arc of progress and that's exciting.
Speaker 1 (08:03):
But give me the sell, because, like, I've been in
this for a while, right, And how many times have
we heard people say, like VR, the next big bet
is VR and AR? And we both know
that people have been saying this for a really long time.
I think my instinct is saying no, no, no. Something
about this actually feels kind of different. And maybe it's
because of all the external things happening with remote work
(08:25):
and with our reliance on screens and our craving humanity
in a different way, but something does feel different. So like, Boz,
give me the sell. Like VR hasn't really hit one
hundred percent. Maybe you can argue with me on this.
But why now do you think this is the moment
for virtual reality augmented reality.
Speaker 2 (08:45):
Well, in the case of virtual reality, you have the
mobile phone to thank. Mobile phones created an incentive for
technology to miniaturize, improve performance per watt, improve things like
cameras and make them smaller and cheaper and higher fidelity,
and improve things like displays very very small, tightly packed displays.
Without all that, the physics of virtual reality, which yeah,
(09:08):
have been around since the eighties, were just unworkable. You know,
some of the early VR headsets were so heavy they
had to be suspended from the ceiling by steel cables.
They used to call that the sword of Damocles because
if it fell off, it would kill you while you
were using VR. You know, today we've got Quest two
weighing in fifteen percent lighter than a generation that was
launched just a year and a half earlier, four times
(09:29):
more powerful, fifty percent more pixels, and it's one hundred
dollars cheaper. That is, you know, the benefit that we
have of working on a supply chain that was really
developed for mobile phones but works beautifully for VR as well,
not to mention tremendous wealth now of three D content
thanks to years of heavy investment in the industry around
(09:50):
three D gaming in particular. So I do think it
does feel different. It reminds me if you went back
and you had like a Palm Pilot or you had
a handspring. Those were awesome devices and you could glimpse
in those devices what the iPhone would become, but they
weren't the iPhone. I think the previous generation of VR
was kind of like those Palm Pilot, Handspring type devices. Right, yeah,
(10:11):
I get it, Like, if you could do this, it
would be cool, but you can't do it yet. We
can do it now. It's exciting, Like it's here. If
you've used it, you're like, oh my gosh, this is it.
This is what I've been waiting for. Now you can
use your hands. It can be very natural. Augmented reality,
we can see, but we're still trying
to solve some of the fundamental physics problems, you know,
like how do you literally make certain wavelengths of light?
(10:34):
How do you bend those wavelengths of light in the
right way. So we're at a little bit more of
a fundamental stage with AR, but the same technology should
allow us to cross the bridge, and also huge advances
in wireless technology are critical as well.
Speaker 1 (10:48):
Do you think when it comes to the future, And
I want to get into specifically some of the platforms
that you guys are launching and what you guys just announced,
But I mean, do you think that the future of
Facebook is you know, I look at the history of Facebook,
and I look at Instagram and WhatsApp and all of
these different products that are owned by Facebook to some degree,
and that have been integrated with Facebook. Do you think
(11:09):
that when we look at Facebook in ten years, and
maybe this is just to kind of talk about the
stakes of this and to talk about how important, you know,
building out another world is and what will come along
with that. Do you think that these worlds that you
guys are building, the ones that we're about to talk about,
will be kind of the next dimension, the
next layer of Facebook, and that in maybe ten years
(11:31):
we might not even be on the Facebook we know,
we'll be in these different worlds that you're building today.
Speaker 2 (11:38):
I love that you use the word layers there, because
that is how I think of it, you know. I don't know.
We still make phone calls today, and that's a layer
of communication that we as a society laid the foundations
for one hundred years ago, and then we added text messaging,
and we're increasing the speed and the fidelity and the
richness with which we can exchange information when we're at
a distance. Look, nothing is as good as being in
(12:00):
the same room as somebody you love. You know, that's
a high standard to meet. But can it
be better than VC today? Absolutely. Like, we can do
better than this. And I think all the time about bowling. Laurie,
have you ever been bowling?
Speaker 3 (12:15):
Totally? I was on a bowling league when I was younger.
Speaker 2 (12:18):
I love it. What a weird thing for
us as a species to do. Can you imagine if
we saw like ants go bowling or like a
dog bowling? It would be the greatest sensation of all time.
Why do we go? We want
to have something, just any excuse for us to have
a shared experience, to create memories, to have an excuse
to go to a place and be together. And I
think when you're in virtual reality. Look, I've done a
(12:40):
lot of you know, happy hours with friends over portal
and they've been great, but at some point they kind
of drop off the calendar because you don't have a
reason to do it. There's not like a thing that
anchors it. With virtual reality, with augmented reality, you potentially
do have those things. So do I
think that they replace Facebook as we know it today? No,
there's still plenty of opportunity there where I want to
(13:01):
either asynchronously communicate through sharing and broadcasting or multicasting, or
I want to communicate one on one, or do so really richly.
And by the way, video calling for two people is
pretty great. You can see my full facial expression. It's
really rich. You have a good sense of my emotion.
So there's a lot there. All that value still exists. There's
going to be new forms of value, which yeah, I
(13:22):
think a year ago would have been maybe a tougher sell.
But now that people are in lockdown, they experience what
it's like to be in quarantine. You get it. It's like,
oh yeah, yeah. And for some people, for immigrants,
that's what life is like every day. They don't know anybody,
Their loved ones are far away, and there's nothing they
can do about it. So I really believe in this
direction for us as a society, and I think it's
(13:43):
also important, as we're seeing now for people who want
to collaborate at work.
Speaker 1 (13:47):
Yeah, you know, I remember I started covering tech in
two thousand and nine, right out of the recession two
thousand and eight, and there was all this innovation that
happened because you saw that there's so much broken and
I think this experience, even being on Zoom and the
fact that you know, I don't think we'll go back
to work in the same way though I hope we
all go back to work in some capacity, but some
of this will remain.
Speaker 1 (14:07):
Right, these ideas around remote work. So there is a
tremendous opportunity, you know, for these platforms. And I can
see that Facebook, you know, in many ways, wants to
own that, right And of course I think that comes
with so many interesting questions on the human side about
what comes along with that. You know. But this experience
right now we have is pretty broken, right, and I
(14:28):
think people are sick of the Zoom apocalypse, and will
people you know, want to zoom when we're kind of
going back to work and what will be that in
between human connection? It seems like that's the thing that
you're thinking about.
Speaker 2 (14:40):
Well, I also want to be clear, like, I
don't know that I want to own it. I just
want to make sure there's space for it. I mean, honestly,
if you look at there's lots of examples of technology
that we use that we don't primarily use to communicate
and connect with people. And this also goes back seventy,
eighty years. There is a very deep divide in the history
of computation where some people felt, hey, this was
(15:01):
about a tool being useful for me as an individual,
and it would make me more powerful. And there's the
people who said, no, this is a tool to connect
with other people. That's why it's so incredible that in
nineteen sixty eight Doug Engelbart, who debuted the computer mouse,
also debuted video conferencing and shared document editing. You know,
it was important to the early pioneers of the current
(15:21):
generation of computing that this be not just about oh,
I can do spreadsheets more effectively, but also enabling the internet.
Ethernet came out of Xerox PARC, like incredible leaps
forward in our ability to connect across distances, and for
us at Facebook, that is what we care the most about.
And I do legitimately worry that if we're not in
there at the pioneering stage of these new technologies, that
(15:42):
other technology providers will just cut that use case out
and it'll still be great devices that'll be super useful
to you, but they won't help you connect with other people.
And so I don't need to own the whole thing.
I'm happy to play in a lot of different systems.
I need to make sure there's space for this really
valuable work to happen and we're the ones who care
the most about it.
Speaker 1 (16:01):
And I want to get into that, by the way,
because I do think this idea of connection has come
along with complicated questions, and so I want to talk
about how you guys are thinking about that as you
build out a new computer interface, as you build out
these layers.
Speaker 3 (16:12):
But I don't want to speak around. I want to
talk specifics.
Speaker 1 (16:15):
You guys launched quite a bit, you made quite a
few announcements in the last couple of weeks with Facebook Connect,
So talk to me. I mean, let's start with Horizon.
I thought Horizon was super interesting. What exactly is it?
What's the experience like? And it's kind of weird to
talk about it over a podcast, but like if you
can just like close your eyes and pretend like we're
there and, like,
Speaker 3 (16:34):
describe to people what you see when you're in Horizon.
Speaker 2 (16:36):
Yeah, I mean, Horizon is a virtual world. It's
got things to do. There's rooms, there's spaces that hopefully
a large community of creators can build out more and
more spaces, and those could be performance spaces or if
you wanted to be do an artist or do poetry
or do a performance, those could be little game spaces
(16:57):
like we played a little laser tag game. They could
be a puzzle, you know, a little puzzle space. Escape
rooms are one of the popular early ones that some
of our internal devs have done. Again to my point
about Bowling earlier, it's a place where you and I
could go and just have a shared experience and it's social.
You've got an avatar. I've got an avatar, and they're
not high fidelity, but you do get that sense. And
(17:18):
what we're all about at Facebook Reality Labs is that sense of presence
you have, that sense that I am with somebody, that
sense of being with somebody and having an experience that
the two of you share together. And Horizon, it's
just an open beta right now, and it's pretty cool.
It's early yet. We're still kind of working out some
of the scaling issues and getting the avatars just right
(17:39):
and getting the quality just right. But yeah, you
can think of it as a virtual space
for people to go and be together.
Speaker 1 (17:50):
More from Boz after the break, and make sure you
sign up for our newsletter at dot dot dot media
dot com slash newsletter. We'll be launching in October. You
(18:16):
kind of touch on like this idea of social VR
right and being able to be around other people and
be present with other people and do things together. I
always think that's fascinating and important, especially now. But I
also when I look at these tech platforms being built right,
and you see the wonderful videos and bas you guys
put these together so you know them better than anyone,
(18:38):
like where it's all the amazing things you can do together.
And all of a sudden you're in the virtual world
and you're playing games and you're building things. It's not
like you guys are putting warnings on it, right,
and you've got to be really careful as you're building
out a new dimension of some sort of new layer,
that you don't recreate many of the problems that you
have on Facebook's OG platform, because obviously you guys have
(19:02):
been dealing with many complicated issues over the last decade,
even you know, especially over the last five years. So
what are you thinking about as you build out a
new layer? You know, what are the ethical issues you're
thinking about? What are kind of some of the human
problems you're thinking about.
Speaker 2 (19:18):
Yeah, they run a huge range. I mean here we
benefit so much actually from being a part of Facebook
because we do stand on the shoulders of all the
lessons learned over the last ten or fifteen years, working
through the platforms that we've built there, and we benefit
from all the technology that's been deployed around that. We
also have a few additional advantages. For example, in Horizon,
which you just talked about, if there is some kind
(19:40):
of an abuse happening, you have the ability, which is
unique to virtual reality, to literally freeze the world, find
the offending individual, and like disappear them. That person just
doesn't exist for you anymore, and you get to go
about your day. And so you have a lot of
power in virtual reality to control your own experience, which
makes sense because, you know, it's all just virtual.
(20:01):
And so I think we've got two advantages on virtual
reality which are really valuable, one of which is the
history and technology that Facebook brings to the table, and
the second one is the nature of the medium is
a little bit more empowering. And then you know, the
other thing we talked about at Connect last week is,
for example, Project Aria. Project Aria is a research vehicle
that we're rolling out one hundred kind of handbuilt pairs
(20:23):
of glasses that have sensor packages on them. They have
outward facing sensors, they have inward facing sensors, they have GPS,
and you know, somewhere on the order of one hundred
Facebook employees and contractors fully trained are going to be
wearing them around in the Bay Area and in Seattle. And
this is inviting, very intentionally inviting a conversation about, Hey,
what is the nature of what we should expect or
(20:45):
allow as a society when it comes to these types
of devices. To be clear about Project Aria, we're being
very careful with it. All data is quarantined for three days.
That gives us time to scrub any faces out. We
blur faces, we blur license plates. We don't use the
data at all with those things intact. The people
are wearing shirts, they're all identifiable. But set aside the specifics,
(21:05):
the more general question is like, hey, augmented reality could
be incredibly valuable in the really specific use case. You know,
we're partnering with Carnegie Mellon University to say, hey, could
these help people who are visually impaired navigate physical spaces? Right,
It's a device that could help them be able to
actually not see, but navigate physical spaces more effectively than
(21:27):
they could otherwise. That's pretty good. But it's also got
a bunch of cameras on it that are going to
like see other humans doing things in the world. What
is the impact not just on the person wearing it,
which is the major focus when it comes to like
mobile phones, what is the impact to the people who
aren't wearing it, What is the impact to underserved or
marginalized communities who come into contact with this technology. And
(21:50):
so there's really tremendous opportunity for good and there's you know,
obviously a huge amount of risk. We're now trying
to start that public conversation today, or last week, I suppose,
because we're years away from having a
consumer product out. And so let's have it. Let's figure
it out as a society. What do we think is
a good trade off? What is an acceptable level of protection.
(22:12):
We're not going to get rid of all the harms,
but we can hopefully find a balance that we find
equitable as a society. And so that's a really big shift.
I mean, for you, covering Facebook for ten, fifteen years,
you know, like that's a shift. We're trying to get
this stuff out way in advance of when the product
arrives so that there's no surprises.
Speaker 1 (22:29):
Although we all know you put a product out there
and just people misuse it, right, people use it in
all sorts of ways that will shock you.
Speaker 3 (22:36):
How could you have seen that Russia was going to
do what it did for the election? Right?
Speaker 1 (22:39):
So it's almost also how do you anticipate the unintended consequences?
Speaker 2 (22:44):
Yeah, so by definition, I suppose you can't find all
the unintended consequences, but we can certainly find a lot
of them. Yeah, that's really the work that we've been doing,
certainly the last five years, really intensified the last three
where it's like, hey, okay, what are all the forms
of harm that you're going to expect from
nation state actors? You're right, we weren't looking at nation
(23:05):
states before. We are now. There is a list of them.
Will we catch all of them? No.
But I don't think consumers really hold people to that standard.
They just want you to like control the obvious negative externalities.
You know, people, you can make a mistake, you just
don't want to make it twice. And so for me
at least, I think we are trying to take all
those lessons learned what we call the responsible innovation principles,
which are informed by the entire history of abuse that
(23:31):
we've observed on other platforms, and saying, okay, let's run
everything we build through every single one of those types
of abuse and understand what the risks are, what the
opportunities are, and how to do our best to mitigate.
Speaker 1 (23:44):
I mean, I think it's going to be so fascinating
to be building this right now. Can you take us
inside? I guess you're not in Menlo Park right now.
I mean, take us
to the virtual rooms where you guys are discussing some
of these issues, right? Like, I remember covering, in virtual
reality, a woman who had been harassed in the virtual world, right,
(24:06):
and she talked about not having the physical control to
like push someone away, but you hear people's voices. I
mean that was insane to me, Boz, right? Like, she
talked about, you know, this idea that she couldn't actually
physically move someone away and they could continue to harass her.
And this was through Oculus, right, but it was,
again the game developers hadn't really understood that harassment could
(24:29):
actually happen like this in this world. So, like, take
me to the behind the scenes conversations that you guys
are having about these. Like, is there anything specific that
you guys are talking about? I just feel like these
jam sessions, because you talk about how you are trying
to think through some of these scenarios beforehand, they've got
to be pretty interesting.
Speaker 3 (24:48):
Take us to them.
Speaker 2 (24:51):
Yeah, I mean, so you know, to use the Horizon
example just as it's convenient. Yeah, that's a scenario that
we obviously did think through. And in fact, one of
the debates we have inside is what is the distance
that you can allow avatars to get close to each other?
Because for some people, close talkers in virtual reality give
them a sense of unease. But for other people,
they want to be able to do things like whisper
(25:12):
quietly and have intimate conversations. And so that's an example
of a conversation we're having right now inside the company
of like, hey, like, how much social distance is required
in virtual reality for, like, maximum comfort. When we get
those situations, we err on the side of comfort. You know,
obviously you have to start with how you're building the team. If you don't
have a team that's diverse, inclusive, and equitable, then you're
(25:33):
not going to have even the eyes on the problems.
You know. One of the things that I think we
were lucky for early on is we had quite a
few women in the Oculus organization who were testing the
headsets out. It wasn't working for hair, it wasn't working
for makeup. So now we've got the accessory program for
Quest two, which should allow for a more diverse variety
of hairstyles. Mine is admittedly relatively easy to operate for
(25:55):
those who don't know, I am bald, but that's obviously
not a thing that you want to optimize your headset for.
So, you have to start with a team. You
have to create space for those conversations. Frankly, a lot
of it is also working with external parties, with experts.
You know, we announced two RFPs last week for over
a million dollars on understanding the impact of technology like
(26:16):
AR on underserved communities and unrepresented communities, and that's worked.
Why would you expect us at Facebook to be able
to do that work, Like, you know, we're not almost
by definition an underserved community. We have to be doing
outreach in those spaces. You have to be paying attention
and inviting the criticism and the hard conversation that comes.
So it's three things, right. One is like looking at
(26:37):
historic harms that you've observed. Those are kind of the
best case. We know how to manage those pretty well,
and so you can talk through those. The second one
is having a team that's really agile and able to
hear and understand criticism as it's coming in and internalize it.
And then you also have to have the reactive Okay,
we didn't see this one coming and no one did
and something bad happened. How do we adapt quickly? You
have to have all three of those muscles, and we
(26:59):
use them at different times.
Speaker 1 (27:00):
You know, I was doing, God, it was some
kind of demo. It wasn't the Horizon demo. It was
another demo, I think, that you know. It was about
kind of a virtual work type space, and they created
an avatar for me, and I mean, I swear to God,
Boz, like, you know, it was.
Speaker 3 (27:17):
First of all, everyone's pantless, right? Like we should just
say that. Right, yeah, legless? Okay, I mean it's just,
I'm just saying, like, it's a little, everyone's legless.
Speaker 1 (27:28):
And I think, like, they made
my body look very strange, and as a woman, I
was like, this is just so weird.
Speaker 3 (27:33):
And it was all these dudes
Around me, like talking to me about work, and it
was a very strange experience where let me just say this,
like the person who sees, who can see the
future and see the point, was like, oh, this is,
we're onto something real. But then I was saying to
myself as I was sitting there legless, sorry not pantless, legless,
like my arms were like seven feet long, it
(27:56):
felt like.
Speaker 3 (27:57):
And I was surrounded by like tech, kind of tech
Speaker 1 (27:59):
bros, who were speaking at me about, you know, the
future of work, and I could barely get things to move.
Speaker 3 (28:05):
I was like, this feels a little weird, you know,
So you gotta.
Speaker 1 (28:09):
This is why I ask you about these things, because
I mean, I do think it's probably the future, and
you've got to be kind of thinking about the little
things if this is like the world that we're building
out that we will eventually in some capacity be living in.
Speaker 2 (28:23):
Yeah, so it sounds like you landed yourself in the
uncanny valley, and you know, it's unquestionable that you really
need representations in digital spaces to either be cartoonishly inaccurate,
where no one expects accuracy, or really accurate, to
the point that you really are proud of it.
You know, listen, hey, you know, you may have combed
your hair today. I oiled my beard. Like, we did
(28:44):
things to look presentable, and this is just a video conference.
I'm not even sure if anyone... this is a podcast. Like, I
made myself look pretty in the face for a podcast,
or this is as pretty as I get,
my apologies to the audience. But, like, so, of course
people care about how they present themselves in all spaces,
digital spaces included. You can do either; the stylized approach
is the one we've taken
with Horizon and Venues. So far, it's very consistent. You know,
(29:06):
it's in line with the Facebook avatars that they've launched.
It's all built by this Facebook Reality Labs team. And
then we do have a vision to get to what
we call codec avatars, which are incredibly rich, realistic reproductions
of faces. Bodies are pretty much always going to be estimated.
Other than faces and hands, humans don't really key in
on specific details that much about one another. Faces and
(29:28):
hands are a tremendously rich communication surface; we're all tuned,
in our brains, to identify small movements in those things.
The rest of it we can kind of estimate. Legs
are particularly hard. I'm sitting down. Do I look short
to you? I don't know. Like, how do we want
to play that? Virtual spaces do have some challenges;
the extra degrees of freedom cause some challenges as well.
(29:49):
But I do think, like you have to take that
glimmer and realize, honestly, it's not that far. I think
we're actually, honestly, we're seeing tremendous vertical adoption of virtual reality.
It's early yet, but that's how these things always start.
And what I like about it is the place that
we are is not a place free of problems. I
don't like to be in a place free of problems.
It's a place full of problems that I believe we
(30:11):
can solve and people will care when we do. That's
where I like to be.
Speaker 3 (30:15):
What do you what do you mean by that? What
do you mean?
Speaker 2 (30:18):
Consider News Feed. When I first joined Facebook,
you know, I worked with Chris Cox and Ruchi Sanghvi
on News Feed, and we just knew it was going
to be a hit, because the way that people used
Facebook before that was insane. They would click around, yeah,
profile to profile to profile to see what had changed,
and we're like, oh, we can do better than this.
(30:38):
Messenger: people were already doing so many kinds of tricks
and hacks to try to get chat to work on
mobile phones, to get around SMS fees, which were monstrous.
Now looking back, think about SMS fees. What a monstrous
thing that was. How amazing is it that the Internet
has freed us from, like, you know, ten-cents-a-message
nonsense. Like, we're doing amazing. I like
(30:59):
it when you're at the beginning of something and you're like, oh,
this is not only, do I kind of glimpse it, it's
pretty good, but I see a hundred things that can
make it even better. Right? That's where I think VR
is. VR has, like, gotten good, and I just see, like,
a hundred things ahead that can make it even better.
Speaker 3 (31:13):
Totally.
Speaker 1 (31:13):
I mean, but go back to Newsfeed, right, I remember
when you guys launched newsfeed. Newsfeed was the one that
everybody was like, Facebook is over?
Speaker 2 (31:19):
Right?
Speaker 1 (31:20):
Was that was that the one that everyone was like,
everyone was very upset about it at first.
Speaker 2 (31:25):
I think you're describing every change we've ever made
Speaker 3 (31:28):
in history. No, I think, I feel like News Feed.
Speaker 2 (31:31):
I don't know.
Speaker 3 (31:32):
When there were protests? Was that the one, the one with the protests?
Speaker 2 (31:35):
News Feed was the first thing that we had done that
people were... so, I don't want to overfit the curve.
I think sometimes when people really are upset, they really
are upset, and sometimes when they're upset, it's because there's
been a change and there's an adjustment period to that,
and learning to distinguish those two is part of the
art, I suppose, of being in these jobs. Let me
give you my analogy for News Feed. We launched News Feed.
(31:57):
You ever have that thing where you're at a party and,
you know, everyone's talking, music's up, everyone's loud, and
then for whatever reason, like, the music gets cut, everyone
gets quiet, and the last thing that you said just
hangs in the air and everyone can
hear it? That's what News Feed was like. Like, we did
that to everyone all at once because everyone had been
out there posting on walls and doing things, and yet
technically those things were discoverable. They didn't know, right, they
(32:18):
didn't think it would be discovered, and then we like
organized it differently, So we basically record scratched the entire community.
So the takeaway from that wasn't, oh, we did
it perfectly, ignore them. No, we were like, oh man,
we really screwed the rollout up. We needed to
tell them what we were doing, why we were doing it,
roll it out steadily, and, like, we didn't do that
back then, and so we learned from that. So each
time we really learn. Every time things happen, oh okay,
(32:41):
like we screwed that up, let's not make that mistake again.
So Newsfeed had like certainly a very strong visceral reaction. However,
from a product standpoint, it solved the problem we were
solving for almost immediately, Like we saw usage double kind
of overnight and never go back down because people were
having more success finding the content they were looking for.
(33:02):
They weren't having to click around. Like, do you remember,
the entire center of the home page used to just be, like, pokes,
like, number of pokes?
Speaker 1 (33:08):
Yeah, poking was so creepy, Like why did I just.
Speaker 2 (33:13):
Feel like that's why you guys probably had too
many men working at Facebook. I've never worked on it,
I've never worked on Poke. That's one, I can't take
questions on that.
Speaker 1 (33:23):
But let me, but let me just say, so, yes,
News Feed was so disruptive to everything. And I'm
not one to say, like, you know, whoa, like they
should have never introduced News Feed. News Feed changed
Speaker 3 (33:34):
Everything, right, It truly changed the web.
Speaker 1 (33:37):
It changed the web, and everyone at first was kind
of like, oh, you know, like we talk about you
talk about kind of the party where you did this
all at once. I really think that to some degree
the stuff you're working on and maybe maybe I could
be completely wrong, but could could have depending on timing
and what kind of comes out and this moment we're in,
like, you know, could have similar impact, right? But
we can't deny the last years. We can't deny
(34:01):
the fact that News Feed also, you know, let's not
oversimplify this and be like tech is good and bad, right,
like News Feed also, you know, created misinformation, and
people are trying to figure out what truth is, and
there are filter bubbles, and, you know, people have
talked about the ad model at Facebook being, you know,
one of the most disruptive and terrible things ever.
(34:22):
So I mean it created this host of problems as well.
That doesn't mean it should have gone away or should
have never been done, but it did create this host
of problems. So as you sit there, and I
go back to my first question, in one of
the most important seats at Facebook, which I don't think
people maybe realize. And I'm just calling it one of
the most important seats because I think it's a very
important seat. Like, you know, I'm assuming
(34:46):
you've got to be thinking about it like this, right,
like you've got to be thinking about it with the
same stakes as
Speaker 3 (34:51):
You guys did maybe with Newsfeed. Am I is that
incorrect to say? Or am I just being dramatic?
Speaker 1 (34:56):
I know you sometimes you think journalists are dramatic, but
I certainly feel passionate about that.
Speaker 2 (35:03):
I love the passion, and I think it's important. I
don't think journalists are dramatic. I think journalists are doing
a great job. I think we live in
tremendously interesting times, certainly much more interesting than the several
decades that preceded them, and it invites, it deserves, the
degree of public debate that we're having on the issue.
So, you know, far from it, I don't agree with
a lot of what you just said. And, you know,
early on, I think I said News Feed was built by
(35:25):
a core team of three people, Ruchi Sanghvi included. There
were more women at the beginning of Facebook than people
seem to recognize. Maybe you've seen a fictional film about it.
Don't believe it, that's fiction. So I don't
agree with a bunch of things that you just said. However,
transitioning to the question of, like, what is the thing
that we're doing: the work that I'm doing right now
feels important, yeah, in the capital-I sense of the word.
(35:48):
I think it's important for society. I think it's got
tremendous opportunity for impact. Of course, it's very hard to
separate positive impacts from negative impacts, and thinking those through
really rigorously is something that we said publicly we weren't doing
in the early parts of Facebook, and that clearly
was a mistake. It's one that we're, you know, trying
to correct now with massive investment, and that's one that
(36:09):
I get the benefit of. It's a mistake I don't
plan to make again. It's a mistake that we actually
don't have to make, you know; I get to benefit from
all of Facebook's learning on it, and their technological investment
on it comes to bear. I do think it's very different, though.
you're going to find very different problems. You know, news
Feed dealt primarily in information, and it raises important questions
(36:29):
about free speech and democracy, and who's allowed to say
things that aren't true, and who's allowed to, you know,
get distribution, who's allowed to be listened to. Those are
hard questions. The problems that we'll deal with in virtual
reality and augmented reality are different. I don't think it's
going to be at all like news Feed because unfortunately,
even as affordable as we've made Quest two at three
(36:50):
hundred dollars, three hundred dollars is pretty far away
from zero dollars, and it's going to take a long
time for us to continue to get this technology deployed.
I'm worried about, hey, access. I'm worried about, hey, can
only rich people get access to this technology, which is
potentially very empowering? What's the problem that comes with that?
That's novel, That's not something you have to deal with
for Facebook. Facebook is an absolutely wonderful consumer aligned business
(37:12):
model that allows us to deliver a tremendous amount of
services that used to cost ten cents a message for free,
and we don't think about that, maybe because we have
means and we're not thinking about all the people who
are benefiting tremendously. So those are the types of things
that do worry me a lot. And, you know, obviously
we're in a different business than Apple. You know,
we're not charging on these headsets the amount that, you know,
(37:36):
someone trying to make margin on a business would charge
for them, so we are taking a different approach as
a consequence of that. So I think it's very important technology.
I think it's important to empowering a workforce that's global,
that doesn't have geographic or economic mobility, if you follow
Raj Chetty's work, first at Stanford and now at Harvard. So
I think it's important work and I do take it
(37:57):
very seriously, and I'm grateful for the resources I have
at Facebook that helped me do it better.
Speaker 1 (38:01):
I think it's so interesting what you say when you
talk about the digital divide. I think that's probably one
of the most important things that's probably really on display
right now. I mean, it's always been an issue, but
it couldn't be more on display right now. You know,
during the pandemic, where people, children, are having to do
remote work. You know, there's that image of, you know,
(38:22):
kids trying to get Wi-Fi from a Taco Bell
parking lot, you know, a Taco Bell. So I wonder,
I mean, I do think, you know, you talk about
even the future of work, and I've looked at the
demo you guys did. It's really interesting, you know,
where you put on the headset and, you know, there's
presence and there's the ability to be around people.
Speaker 3 (38:43):
But you're absolutely right that that's for.
Speaker 1 (38:45):
Some people, you know, in an increasingly divided world and
what we're seeing.
Speaker 3 (38:51):
So what do you think is the solution as you
kind of wade into
Speaker 2 (38:56):
This, so, a couple of things. From a technological standpoint,
you know, this is a huge area of investment for Facebook. As you know,
you know, Internet dot org was on the premise of, like, hey,
can we get internet to more people at a more
affordable price, and it's wild to me that that became
a controversial program in any sense. It is literally trying
(39:18):
to fill a gap that other companies and public services
have completely failed to fill, leaving people in a very
precarious position as it relates to access to information, which
ultimately, you and I agree, is probably access to education,
to jobs, to a bunch of other pieces, potentially to mates. Like,
it's a huge issue and it's one that we're passionate
about as a company. There's things that we can kind
(39:39):
of do. So one example I just saw recently:
video calling. This kind of video calling takes a tremendous
amount of bandwidth. You have to have not just an
Internet connection but a very good Internet connection to sustain
this over time. And my heart goes out. You know, I
have a kindergartener who's on Zoom right now, I think,
and, can you imagine, it's hard enough right now for him,
(39:59):
a five-year-old, to be in a Zoom class.
Can you imagine if the video is cutting out, it's choppy,
he can't contribute because when he
unmutes it's too late, and he can't draw? It's awful.
So something that we could do, for example, we've seen
demos internally where we can use the type of technology
that powers deepfakes, which we're concerned about and doing
a bunch of detection on, but instead say, hey, what if
(40:20):
we recreate a little point cloud of your face and
then transmit it, very lightweight, over the wire and then
reconstitute it on the other side, so that you can actually
have richer, more lifelike video communication at lower bandwidth? That's
the kind of research that my team is doing that
I think could have a huge impact on our ability
to communicate richly under a range of conditions. And indeed,
(40:41):
if you look at our responsible innovation principles, things like
how does this behave in low bandwidth conditions? How does
this affect those who are economically disadvantaged is one of
the things that we look at. So, yeah, this is
an area that's that we're all passionate about at Facebook
and I think probably as an entire industry.
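[Editor's note: the compression idea described above, sending a sparse set of facial keypoints instead of pixels and reconstituting the face on the far side, can be illustrated with some back-of-the-envelope arithmetic. This is a toy sketch, not Facebook's actual pipeline; the frame size, keypoint count, and packing format are all assumptions chosen for illustration.]

```python
import struct

# Assumed parameters, for illustration only.
RAW_W, RAW_H, CHANNELS = 640, 480, 3      # a raw, uncompressed VGA video frame
NUM_KEYPOINTS = 70                        # e.g. a sparse facial-landmark set

def encode_keypoints(points):
    """Pack (x, y) keypoints as little-endian 32-bit floats for transmission."""
    flat = [coord for point in points for coord in point]
    return struct.pack(f"<{len(flat)}f", *flat)

def decode_keypoints(payload):
    """Reconstitute the keypoint list on the receiving side."""
    flat = struct.unpack(f"<{len(payload) // 4}f", payload)
    return [(flat[i], flat[i + 1]) for i in range(0, len(flat), 2)]

points = [(float(i), float(i) / 2) for i in range(NUM_KEYPOINTS)]
payload = encode_keypoints(points)

raw_frame_bytes = RAW_W * RAW_H * CHANNELS        # 921,600 bytes per raw frame
print(len(payload))                               # 560 bytes per keypoint frame
print(decode_keypoints(payload) == points)        # True: lossless round trip
```

Even before any real video codec or entropy coding enters the picture, the keypoint payload is more than three orders of magnitude smaller than an uncompressed frame, which is the headroom that makes "lifelike video at low bandwidth" plausible.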
Speaker 1 (40:59):
More from Boz after the break, and make sure to
subscribe to First Contact wherever you listen to your podcast
so you don't miss an episode. I'd be curious to
(41:25):
know, you talked about codec avatars earlier. Yeah, not to
geek out over these things, but, I mean, for folks
who are listening, you should, like, go look at this.
Speaker 3 (41:33):
It is really.
Speaker 2 (41:35):
Fool your mother. We've got to work on the insides
of mouths. That's what gets you: as soon as someone
starts opening their mouth, you're like, oh, okay, that's fake.
Speaker 1 (41:42):
Can you describe it? Like if we were in VR,
and like it's like I would be seeing you, but
it really looks like you, right like I.
Speaker 2 (41:49):
Codec avatars are extremely lifelike reproductions of somebody's face and
the musculature that powers their face. And what it hopefully allows
us to do is have really high-fidelity interactions with
lots of people at low bandwidth, because we're not actually
sending a video of your face. We're sending a small
number of key points and machine learning metadata that allows
(42:10):
us to reanimate the avatar of you on the other side.
And like I said, there's a famous concept, the uncanny valley,
where if things are only kind of lifelike, that's
very bad. They either need to be clearly representational or
pretty literal. And codec avatars are clearly over the uncanny valley.
They are on the other side; they are clearly good enough.
(42:33):
The challenge we have with codec avatars is generating them. Right now,
it's like thirty minutes of you saying funny phrases
that get your mouth to animate in certain ways and
expressing certain emotions in, like, a camera capture rig
to get to a codec avatar. That's obviously not something that's scalable.
Can we get to where you can just take
ten pictures with your phone at home and we can
(42:53):
do it that way? So we've got to do a
lot to miniaturize that and hopefully deploy it as something
that anyone could do over Messenger. You could say, hey,
like, you know, I didn't have time to shave today,
I'm a mess, let me just get my codec avatar
in the game and animate it that way for this conversation.
Or maybe I'm in a low bandwidth area, or maybe
there's gonna be a lot of people on the call
(43:13):
and it's gonna start to break down. And then obviously
as you move to virtual or augmented spaces, it's the
only way to work. You know, how am I
gonna get somebody to have a fireside
conversation with me if I don't have some kind of
representation that looks like them and causes me to feel like, wow,
we're having a meaningful talk right now?
Speaker 1 (43:32):
Yeah. I mean I can see the virtual space, you know,
the workspace. It's kind of like it is really kind
of hard to take someone seriously if they're kind of
like a human pickle or something.
Speaker 3 (43:40):
You know what I'm saying, Like it it.
Speaker 2 (43:41):
Is hard, it's so real. So I have a weekly
meeting in virtual reality in kind of one of our
infinite office prototypes. I had it on Monday, and literally
my team just like it's out doing themselves. Like one
of my one of my team members fight because he
comes in wearing a Santa shirt, Santa Claus like outfit
and like a Captain cook hat. Another guy came with
(44:01):
red mohawk and a We are having serious work conversations
about serious topics, but at some point you're just like
it is hard to focus.
Speaker 3 (44:08):
When that's going on, right?
Speaker 2 (44:10):
No, we really want to keep driving on codec avatars.
Speaker 1 (44:13):
Well, and so let me ask you, with codec avatars,
they look so real. Like, could someone, this is where
I ask you about, like, deepfakes, could someone take
my identity and turn themselves into me with a codec
avatar in the virtual space? Like, not to get too
Black Mirror on you, but, you know, I think you
guys have to anticipate these types of things. Like, could
I pretend to be Boz, assume your identity, with my
codec avatar?
Speaker 2 (44:34):
Yeah, this is exactly the kind of threat vector I'm
talking about when I say, yeah, this is obviously
something we're worried about. We're thinking of ways to ensure that
you have unique possession of it. This is actually an
area where I think we have a pretty strong ability to
create guarantees, because the codec avatar will be somewhat computationally
specific in terms of what it takes to create it. Like,
I don't think other people will very readily, in the
(44:54):
near term, be able to kind of create their own
codec avatar version of you, and so we'll be able
to kind of store it and ensure it's just for
your exclusive access. I don't think we'll allow the loaning
of avatars anytime soon. Deepfakes are a little different,
right, because they start with footage that's already in public,
and that's something that, you know, we've talked a little bit about.
That's more Schrep and the AI team focusing
(45:16):
on that. How do you ensure the provenance of an
image is certainly one of the open areas of investigation,
not just for Facebook but for the industry, to kind
of ensure that we have a greater sense of the provenance
of an image, that it's real and hasn't been tampered with.
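[Editor's note: one basic ingredient in the image-provenance work alluded to here is registering a cryptographic digest of an image at capture time; any later tampering changes the digest and fails verification. A minimal stdlib sketch follows, with the in-memory `registry` dict standing in for whatever trusted store a real system would use, and placeholder bytes standing in for a real image.]

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """SHA-256 digest of the raw image contents."""
    return hashlib.sha256(image_bytes).hexdigest()

# At capture time: record the digest in a trusted store (assumed here).
registry = {}
original = b"\x89PNG...raw image bytes..."   # placeholder for real image data
registry["photo-001"] = fingerprint(original)

def is_untampered(image_id: str, image_bytes: bytes) -> bool:
    """Verify a presented image against the digest registered at capture."""
    return registry.get(image_id) == fingerprint(image_bytes)

print(is_untampered("photo-001", original))       # True
tampered = original.replace(b"raw", b"fake")      # simulate manipulation
print(is_untampered("photo-001", tampered))       # False
```

A real provenance system must also bind the digest to the capture device (e.g. via signed metadata) and survive benign re-encoding, neither of which a bare content hash provides; this only shows the tamper-detection core.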
Speaker 1 (45:30):
I mean, God, the applications, and I don't want to
get into because we don't have too much time, but
I mean even thinking about, like you know, the future
of death and mortality and remembrance, and with codec avatars,
I mean, there's so many interesting use cases I'm imagining
you guys could use, especially since you guys have so
much data on our lives and so much about us.
(45:51):
I don't know, I just it certainly seems like there's
a lot there.
Speaker 2 (45:55):
There's, yeah, I think there's real potential there. You know,
One of the things that we do is we're working
with Stanford on a project to, for example, do really
rich volumetric captures of historic sites for those who have
been paying attention over the last twenty years, in particular,
under some regimes like the Taliban, really amazing historic sites
have been destroyed systematically and intentionally, and that's a loss
(46:17):
for historians, it's a loss for children, it's a loss
for people who would like to go see and experience
that thing for themselves. So we are already trying to
do those things. I recently saw a paper, not from Facebook,
of somebody using machine learning to recreate what they
thought the Roman emperors actually looked like,
using their sculptures and working backwards. And there's an appetite
(46:38):
for those things. So the idea of having like that
kind of potentially autobiographical exposure to people who are both
living and deceased is very interesting to me.
Speaker 1 (46:47):
I'd love to go back to Project Aria, because
we just kind of brushed over it. I mean, it's
really interesting. Can you give us really quick what exactly
it is one more time? And I just want to
dig into it a teeny bit more.
Speaker 2 (47:00):
Yeah. I mean it looks like a pair of conventional glasses,
except that you'll notice it's got cameras facing outside, it's
got cameras facing towards the wearer's eyes, it's got a GPS,
and it's connected to you know, an app. The researcher
who's wearing it has no access to the data and
it provides them no value, Like, they get nothing out
of this except that we pay them to wear it around.
What we get is, once it's been scrubbed
(47:22):
and quarantined and cleaned of identifying data, then we get
to use it to validate, you know, what sensors do
we need to provide augmented reality. For example, why do
you need outward-facing sensors at all? For two reasons,
one of which is to locate you in space. You know,
for us to be able to put somebody on the sidewalk in
front of you as a codec avatar, we have to
know where the sidewalk is. Otherwise their feet are gonna
(47:44):
be floating, or their feet are gonna
be under the surface. It's not gonna look correct, and
it's gonna take you out of your reality. You know, you're not
gonna believe it. You want to play a Jenga game on
the table? How do you do that if you don't
know where the table is, and when I pull a
piece out and I drop it? I need to know
what the world looks like. So I need to be
able to localize you in space and understand the topography.
The second thing is it's potentially very useful. You know,
(48:05):
if I'm walking up to a restaurant and I'm looking
at the menu, oh, my friend took a picture of
one of these menu items, can we overlay that? Like,
so it's potentially useful from a, you know, giving-you-value-for-wearing-these-glasses
point of view, once they
have a display, which is obviously where we're planning to go.
At the same time, they raise these tremendous questions. Hey,
like, who else is this video taking in?
Speaker 1 (48:24):
Like?
Speaker 2 (48:25):
Do I have access to the camera feed? Can I
post photos from it? Can the government subpoena access to
the camera feed? These are tremendously deep questions
for us. We have a goal of understanding. Hey,
one thing that's really nice is we
want to capture as little data as possible. The reason
is data capture is very expensive in augmented
(48:46):
reality glasses. They're tiny, they're going to fit on
your face. We don't have that much battery, we don't
have that much compute. We have to dissipate thermal energy
without burning you, so we don't have that much thermal space.
So we would like to capture as absolutely little data
as possible to deliver great experiences to you. So with
these research glasses, we're trying to figure out, hey, how
much data do we need to deliver these experiences. What
(49:08):
is different about egocentric data? You know, we can't just
use data from, for example, cars that have been driving
around forever for Street View, because it turns out when you're
on the sidewalk, you're walking under trees, and there's different
lighting conditions; it's not the same. So how does it
perform in different weather conditions? How does it perform with
a human wearing it, who is constantly taking it on,
taking it off, doing different things with it, fussing with it?
How often is it getting smudged? How often is,
(49:29):
like, the video quality compromised? Can we even detect that?
So a huge amount of questions that we need to answer,
you know, and we want to answer them now years
before the technology is actually in a consumer device. So
for us, we've got technological questions, that's one goal, but
we've also got social questions and kind of frankly societal
questions about the use and benefit of these technologies versus
(49:51):
the trade offs.
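[Editor's note: the "where is the table?" problem described above is, in its simplest form, fitting a plane to the 3-D points a headset's outward-facing sensors return. Real systems fit surfaces robustly, e.g. RANSAC-style, inside a full SLAM map; the toy sketch below shows only the core least-squares step on noise-free made-up points, using nothing beyond the standard library.]

```python
# Fit a plane z = a*x + b*y + c to 3-D points by least squares,
# solving the 3x3 normal equations with Gauss-Jordan elimination.
def fit_plane(points):
    # Build the augmented system [A^T A | A^T z] for rows (x, y, 1).
    m = [[0.0] * 4 for _ in range(3)]
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                m[i][j] += row[i] * row[j]
            m[i][3] += row[i] * z
    # Gauss-Jordan elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                factor = m[r][col] / m[col][col]
                for j in range(col, 4):
                    m[r][j] -= factor * m[col][j]
    return [m[i][3] / m[i][i] for i in range(3)]   # a, b, c

# A toy "table": a grid of surface points 0.7 m off the floor.
table = [(x * 0.1, y * 0.1, 0.7) for x in range(5) for y in range(5)]
a, b, c = fit_plane(table)
print(f"z = {a:.3f}*x + {b:.3f}*y + {c:.3f}")   # a, b ≈ 0, c ≈ 0.7
```

Once the plane is known, virtual objects (a Jenga tower, a codec avatar's feet) can be anchored to it so they neither float above nor sink below the real surface, which is exactly the failure mode described in the conversation.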
Speaker 1 (49:52):
Yeah, I saw one person was saying, you know, there
are proactive steps that we should be taking: declaring biometric
data as health data, legislating more consumer protections, making privacy choices
simpler and better informed.
Speaker 2 (50:04):
What do you think? I don't know the specifics that
you're referring to. I don't know enough about what it
means for something to be health data or not. I
do think that, fundamentally, as Facebook has been saying for
a while, we'd like to have a unified privacy, you know,
legal framework that we can work within. You know, Facebook
has been really open about this. Like, you want legislation,
(50:24):
you want legislation written by people who understand technology enough
for the legislation to make sense, which isn't always a guarantee.
And so we are, like, very in favor of kind
of legislation that makes clear how to handle things
like face recognition, which you see as a patchwork right now;
it's come up in Illinois and Portland. Patchworks are hard. It's
(50:44):
really hard to deploy scaled technological solutions to them. It's
hard to invest all of our energy in getting it
right when there's, like, four or five different jurisdictions, and
that's just in the United States, let alone in Europe.
And so I think for us, like, yeah, the more
clarity we have, I'm like, hey, here's the data framework,
here's the privacy framework, here's what's allowed, the happier I'm
going to be because I can put one hundred percent
(51:05):
of my engineering resources on executing on that. I literally,
like don't know the specifics, just to say that I
want to have a more unified framework. We do have
teams who are spending a lot more time than I
am on trying to like make sure that there's some
progress there.
Speaker 1 (51:18):
You have something called personal API. I've read something you
wrote about personal API and learning when to say no.
Can you talk to us about your personal API? And
then I want to talk about the last time you
said no?
Speaker 2 (51:30):
Yeah. So personal API is just about, like, we exist
in these egocentric bubbles where our worlds are so clear
to us, and we sometimes don't understand why it's not
clear to other people who are around us. And, like,
why should it be clear? You have to
tell them, like, hey, here's who I am, here's what
I'm trying to accomplish. You know, for me, it's like, hey,
(51:50):
I used to have these really strong visions of myself
in one light, and it was hard for me when
people would, even if they were complimenting me, they would
compliment me in a way that didn't align with my
internal vision; that was a miss. I find, like,
talking openly about, hey, you know, I like to work
on technology that allows people to connect, it turns out
I like to work on technology across a huge range.
I can do it in broadcast, multicast, one-on-one, virtual reality.
I like building things that then let two other people
find some connection. That's very satisfying to me,
and it's what I've dedicated
myself to doing, and I enjoy that. And the more
(52:31):
people understand that's where I'm coming from, the easier it is to
work with me, the easier it is to understand me. And I
think Facebook, you know, we can do a better job
as a company. Hey, like, we're trying to connect people,
and we've got to do a better job of
getting rid of the bad stuff, but there's a lot
of stuff that we really value, and we've got to
figure out how to get right the balance between those things.
(52:52):
So I write a lot about these things for others' benefit. Honestly,
these are almost always hard-learned lessons for me. What
ends up happening is I screw something up for
ten years and then I finally have an epiphany, or
somebody coaches me, God bless, and says, hey, you're screwing
this up, and then I'm like, oh, right. And then
I try to write it so that hopefully somebody else
gets that epiphany a little sooner than I did. You know,
that's definitely been the story of my career: making
(53:13):
mistakes, often out in public, often in embarrassing ways,
learning from those, and trying to help other people with them.
And so I actually I wrote two notes. I wrote
one called say Yes, and I wrote one called say no.
And it's intentional. I love working at these, like, balances.
A lot of times people's instinct is to say no
because they're busy. They're not saying
no because it's a bad idea; they're just saying, no,
I'm not ready to hear a thing right now, and
(53:33):
then you should just say yes. However, at the same time, just
as often, people want to say yes to everybody because
it feels good to say yes. But then at some
point you say yes to so many things, you start
letting people down because you can't do all the things.
And so sometimes the best way to show
respect to somebody is to say, hey, I'm sorry, I can't, I'm
not going to do it, or I'm not interested in it.
You know, I'm just saying. I'm just saying not interested
(53:54):
in it. At the risk of getting myself in trouble
with some of your peers. I say no to interviews
that I want to do, interviews that I would love
to do, people who want to get me on their podcast,
and I just say no,
I can't commit to it, because I take this stuff
seriously and I want to put in the time and make
sure that every one of these I do is quality.
You'll get to tell me how I did on that.
(54:16):
And so I do say no to other people, but
I said yes to you, so it's not all bad news.
Speaker 1 (54:21):
One of the things I've always thought was interesting. I mean,
when you talk about mistakes, you know, you talk about
learning from your mistakes. And I know we hear a
lot of executives who talk about mistakes they've
learned from, and we hear the company line over and over again.
But I think the thing that people crave more than
anything is just humanity, right, for people to just be human.
Speaker 3 (54:42):
And I think that.
Speaker 1 (54:42):
Goes a long way. Like, maybe you could just
be human with us for a moment. What do you
think is, like, the biggest mistake you've made?
Speaker 2 (54:51):
Oh, it's easy. It's that stupid memo, the ugly
memo that I wrote years back. You know, I wrote
a thing. It was pursuant to a bunch of internal
conversations which have since been lost to time. It was
relevant to them; it made sense in the context of
what they were. But I wrote it glibly. I wrote it,
I think, in like five minutes. I took one pass,
did an edit, nobody reviewed it. I put it
(55:12):
up there. People hated it. It kicked off a discussion that
I thought was valuable, though. And I was like,
cool, that discussion is kicked off, let me move on, you know.
And it really is an instance of that. I've
actually since gone back and written what I had intended
to write the first time. The second one I wrote
was thoughtful, it was really nuanced. It had all
(55:33):
the points. Of course, no one cared about that one,
because that was not this glib, shoot-from-the-hip
thing. I think sometimes people confuse, you know,
being controversial with authenticity. That's not authenticity. My
authentic self is incredibly nuanced, you know. I have layers
upon layers of feelings about things, and sometimes they conflict
(55:54):
and I want to work through them in all their meaningfulness.
And so when that leaked, it
was really, you know, embarrassing for me. It was embarrassing
for the company. And I still to this day, you know,
I think back on how foolish it was to have
written it in the first place the way that I did.
I wrote to the company, I said, Hey, the solution
to this isn't for me to write less. It's for
(56:15):
me to write more. It's to not give the glib,
hot take one liner that's like, you know, punchy and
gets a reaction. It's to write the thoughtful, nuanced thing.
And so yeah, no, I you know, you live and learn.
It's an embarrassment to me still and I think it
will be to the end of time. But it can
also be a valuable lesson to me and too others.
(56:37):
And I've tried to turn that into something positive. But
I know, you know, I do tend to wear my
mistakes on my sleeve. I do think we live, to
your point, Laurie, in an age of authenticity, an age
where when we see something that's perfect we almost want
to tear it down more. When we see imperfections, it's
more relatable, it's more understandable, it's more real, and we
trust that a little bit more. And well, at least
(56:58):
I'm hoping that's the case, because I'm a case
study of it.
Speaker 1 (57:01):
So, you talk about how that's
not the authentic you, but the authentic you is thinking
about some of these things in a much more nuanced way.
What are you thinking about now that you would
say is the more appropriate Boz memo now?
Speaker 2 (57:15):
Right?
Speaker 1 (57:16):
What is the thing you're working on now
that you just think is super important and needs
attention and needs debate and needs discussion?
Speaker 2 (57:27):
Yeah, for us, you know, I think there's two pieces.
One, you know, from the body of work I've put out
that the world is exposed to right now,
I think it's just a question about the nature of
democracies in general. You know. I think a lot of
times these things are couched in terms of free speech,
and sometimes we're the ones who use that language. For me,
it's about democracy. It's it's like, hey, you know, who's
(57:48):
allowed to have an opinion? Are people allowed to be wrong?
Should we return to an era of central gatekeepers? Who
watches the watchers? You know, it's a tremendous challenge,
and it's an area where I see a lot
of nuance myself, and I don't see as much nuance
in the public sphere in the conversation around it. It's
(58:10):
partisan all the way to the bottom. And listen, I'm
partisan in a way, you know. I have politics. They're
not hard to discover for those who look. But, like, I do
also believe in democracy, and I'm torn on
some of these issues that sometimes my liberal brethren find
so cut and dry. I'm a little more anxious about
eventually taking that power and putting it in the hands
(58:30):
of the other side when the democracy swings the
other way. So that's one set of things where
wish we had a little more nuance collectively in the conversation.
I'm not the only one there, and this probably isn't
the time for it. I'm saying this to you because
you asked the question. But right now, it's
National Voter Registration Day, so everybody go out and register to vote.
(58:50):
I realize that this won't air today, so
whatever time it is, vote in the next election
you can. But, like I said, it's not the time
right now for that conversation, and I'm not pushing it,
but it is in my head and it's something that
I think about day to day. The more operative thing for
me is thinking about how to get this technology out
to people, you know, making it more accessible, making it
(59:11):
more universal, making it more user friendly, making it so
it doesn't feel weird to you to be in a
conversation with people, making it so that you do feel
like your virtual self is a representation of your real
self and you're comfortable with that. Like I said before,
this is my favorite part of working on products. It
does feel like the beginning of news Feed, or the
(59:32):
beginning of the ads work that I did, or the
beginning of Messenger, or the beginning of groups. It feels
that way. It feels like the potential is here. I
can see it, I can feel it, I experience it
every day. How do I get it to everyone? Everyone
should have this power, everyone should have this opportunity. And
so that's where I am day to day.
Speaker 1 (59:49):
So tell me, I mean, you've been
in the war room at Facebook since two thousand and six.
The highs, the lows, the criticism, the moments. You know
all of it, and we know there's been tons
of it. Why do you do what you do?
Speaker 2 (01:00:05):
It really, for me, comes down to a joy at
these small experiences that were not possible before that are
possible now: two people connecting. And I don't judge other
people's motivations. For some people they want to empower civilizations
and revolutions, and some people want to just tell their
(01:00:27):
mom and have their mom be using something. And some
people want to be famous and they want to be
in the news. I don't know, those things don't
really motivate me very much. The only thing
I really find is, every day I get up, it's like, hey,
there's some people who tomorrow will do a thing that
was not possible before I did the work that I do,
(01:00:49):
and that is immensely satisfying for me. I guess I
do think a little bit about legacy,
leaving something that outlasts me, having done something that leaves
some kind of impact on the world. The only impact
in the world that matters is the impact you have
on the people of the world and making their lives
a little bit better. And I get a chance to
(01:01:10):
do that every day, and it's pretty dang fun.
Speaker 1 (01:01:12):
Do you think you're at Facebook for life, or
do you have Facebook for life?
Speaker 2 (01:01:16):
If Mark's asking, no. You know, I'm really happy. I
certainly can't predict more than a few years out
in my life. But so far, at every turn, I've
found more new and exciting work that kept me deeply engaged.
So you know, I really am as excited today as
I was that first day January ninth, two thousand and six.
Speaker 1 (01:01:47):
For more from dot dot dot, sign up for our
newsletter at dot dot dot media dot com slash newsletter.
We're launching in October with an exciting guest and topic,
a prominent Silicon Valley founder who believes the toxicity found
on the biggest platforms could be avoided simply by taking
a stance from the get go. He says that tech
shouldn't be neutral, it should be opinionated.
Speaker 3 (01:02:09):
It's a fascinating take.
Speaker 1 (01:02:11):
And follow along on our social media. I'm at Lori
Siegel on Twitter and Instagram, and the show is at
First Contact Podcast on Instagram. On Twitter, We're at First
Contact Pod and if you like what you heard, leave
us a review.
Speaker 3 (01:02:23):
On Apple podcasts, or wherever you listen. We really appreciate it.
Speaker 1 (01:02:27):
First Contact is a production of Dot dot Dot Media,
Executive produced by Lori Siegel and Derek Dodge. This episode
was produced and edited by Sabine Jansen and Jack Reagan.
The original theme music is by Xander Singh. First Contact
(01:02:53):
with Lori Siegel is a production of Dot dot Dot
Media and iHeartRadio.