
August 24, 2020 54 mins

Molly and Emily delve into the world of kratom, the Midwest’s favorite legal drug! What is kratom and why are opioid users using it to get off opioids? Molly’s tales of caffeine psychosis. Will Emily try kratom for the pod? The outlook is good! For the second half we are joined by Tess and special guest tech journalist Nitasha Tiku to talk about unfriendly AI, the singularity, the paperclip experiment, the dark enlightenment and more! Why racist workplace culture thrives in Silicon Valley, and how employees get punished for calling it out. Are the promised imminent human-linked neural networks just gonna be more vaporware? Who is Roko and what is their basilisk? Bad machine learning, racism at Pinterest, and other algorithmic fables about the limits of rationalism, on a new Night Call!


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
It's one eleven AM in a box full of ones
and zeros on planet Mars, and you're listening to Night Call.
Welcome to Night Call, a call-in show about our dystopian reality.
My name is Emily Yoshida, and with me on the

(00:22):
other line is Molly Lambert. And we will be joined
in the second half of the show by our special
guests Nitasha Tiku, and also Tess will be jumping in then.
But for now, it's just me and Emily. It's just
us and our new show, our new segment, Chat and Kratom. Um. I, uh,

(00:47):
this is my own night call, too, that... I actually
considered doing a night call about this when I was
going out traveling east, uh, from Los Angeles, because basically
as soon as I got into the South, kind of,
parts of Texas as well, I just noticed that kratom
was everywhere, kind of with the ubiquity that, um, CBD

(01:09):
is in you know, well just about everywhere now too,
but um definitely in California UM, and I vaguely knew
what it was, but I was sort of like, why
why is it? Why is there so much here? Why
does this feel like something that exists, uh, like kind
of outside of the uh, the coastal norms like you

(01:31):
don't see kratom necessarily in your smoke shops and your
head shops in California. And I don't know that
that's because... it's, we've got kratom. It's, yeah, it's legal,
it's there. It's just not pushed. It's not everywhere,
but I've definitely seen it in the Hollywood Boulevard smoke
shops, and some big signs that say, like, we have kratom.

(01:53):
Can you tell me what kratom is? Well, I, I
mean, I was actually thinking I should guinea pig and
do it myself and report back, but I haven't gotten
there yet. But, um, it was moved
to the top of my awareness again because my friend
that I'm staying with here does use it, so I
was like, oh my gosh, you have to tell me
all about it. Um. And the deal with

(02:14):
kratom is that it's, uh, it's a plant, and it
kind of works with your brain in a
similar way that opioids do. It attaches to the
same receptors, um, and has an effect that a lot
of people compare to, um, opioids, and a lot of
people therefore use it to, uh, self-medicate and get

(02:37):
off of opioids, for instance. But there's like a lot
of other uses for it. There's um people use it
for all sorts of reasons. I'm gonna just read off
the Wikipedia: uh, Mitragyna speciosa, commonly known as kratom, is
a tropical evergreen tree in the coffee family native to
Southeast Asia, indigenous to Thailand, Indonesia, Malaysia, Myanmar and

(02:58):
Papua New Guinea, where it has been used in traditional
medicines since the nineteenth century. Kratom has opioid properties and
stimulant-like properties. Yeah, best of both worlds, baby. Makes
sense that it's related to coffee, then. Yeah, yeah, I
mean, I think, you know, it's one of these things
where, actually, everything that I've read about it

(03:21):
says there's comparatively little formal research done about it, which
sort of becomes this catch-22 when it
comes to regulating it, because there have been efforts to make
it a Schedule I substance for years now, um, that
have not gone through, but that
would make it um kind of on the same level

(03:43):
as LSD, heroin, ecstasy, and, uh, therefore also harder to research. Um,
there's a lot of limitations on researching those substances. Um. Yeah,
I guess the DEA said it has no
legitimate medical use. Yeah, clearly that's not true. Yeah,
I mean, I think there have been, um,

(04:07):
I think fifteen deaths linked to kratom, maybe, this past decade.
I'm not sure of the span of time, but it's
somewhat recent, um, which is, you know, pretty low.
But again, we don't really... the
one thing that people do know is that, uh,
even though it does create a sensation in

(04:28):
some people that is like opioids, it does not, um,
stop breathing. That's, you know, a signature of
opioid overdoses, um, people stop breathing. But, um, so you
can basically, you know, I guess still get that, um,
that same sensation without the risk, with lower risk. Um.

(04:48):
The other main risk that I saw was that
there was a kratom-linked salmonella outbreak. So,
uh, salmonella is a possible risk of kratom, but also,
um, of eating chicken. So, yeah, there were like
a hundred and ninety-nine cases of kratom salmonella,

(05:10):
and so you know, there's very little report of like
lethality or danger or anything with this. And the the
impression that I got, but I would like night callers
who have firsthand or secondhand experience with it to give
us a call at 1-240-46-NIGHT.
But my impression is that this feels like a lot

(05:31):
like, um, a lot of other stuff we've talked about on
the show, uh, when it comes to people finding solutions, um,
where healthcare is not, um... when you
can't afford healthcare, when the health
care system is not working for people, um, when doctors
aren't equipped to address certain concerns. Um. This feels like

(05:55):
another one of these. You're saying this is like a
homegrown way that people are getting opioids? Yeah, that
seems to be what a lot of, uh, people
have used it for in the various reports that
I've read. And it makes sense. I mean, um, yeah,
I... it just, I was sort of... it felt

(06:15):
very regional to me. It felt like a thing that
I started to see really pop up in UM in
these parts of the country that I had not spent
much time in recently, so it felt like, you know,
I noticed it. And I'm always interested in, um, regional
drug use. Uh, what is more popular and for what

(06:38):
reasons, um, in different parts of the country.
And I think I would wonder if in New England
and a lot of places where there there have been
huge opioid epidemics, UM, if it's as prevalent there as well.
I mean, obviously opioid abuse and death
has been all over the country, but you know, in

(06:59):
the parts of the country where it's been
particularly bad, I wonder if it's popular there.
There's something really Midwestern about being like, we're going to
grow a plant to treat it, right? Well, it
kind of gives me vibes of salvia, which is like,
I mean, even though it's a totally different experience, I think.
But just the drugs that are like not technically illegal,

(07:23):
but also there's no research done on them, because most
of them are, like, entheogens for somebody. Including kratom,
I guess, which is used in some rituals, yeah, but also
used to just, like, keep people awake at work in factories,
it says. Yeah, there's that part too. I mean, I'm...
color me curious, you know. Maybe, maybe next week.

(07:47):
I think it's crazy that caffeine is legal, right? As
we're on the beverage pod. Yeah. Like, I've had some
of the most fucked up experiences of my life from
having too much caffeine. You know, I have never really
freaked out from, like, smoking too much pot, but I've
definitely had, like, a panic attack from caffeine, from coffee.

(08:07):
Well, see, everybody's body is different. Everyone's different. Yeah,
and, uh, yeah, so I'll see. I guess I'll see,
uh, what happens when I ride the green wave. And yeah,
we're going to take a quick break, and when we
come back, we're going to be joined by Nitasha and
Tess, and, uh, dive into the world of unfriendly AI.

(08:29):
So stick with us. Hello, night callers. Have you experienced
any strange weather recently or any natural disasters in your
lifetime that are meaningful to you, or something you want
to share with the show? Are you a storm chaser?
Do you actively seek out natural disasters? Are you Helen

(08:50):
Hunt from Twister? Give us a night call at
1-240-46-NIGHT, or a night email at
nightcallpodcast@gmail.com. Share your experience
with us. We're getting ready to dive into some scary
weather territory. So yeah, let us know what kinds of
storms and earthquakes and hurricanes you've been in. Things are

(09:13):
weirder than ever and we want to hear about it
from you. Welcome back to Night Call. We are now joined
by our special guest, Nitasha Tiku. Nitasha is the tech
culture reporter for The Washington Post, in the paper's San Francisco bureau.

(09:36):
She's been covering tech for more than a decade. She's
been at Wired, BuzzFeed News, The Verge, Valleywag, among others.
Hello, Nitasha, welcome to the pod. Thanks for having me, guys.
I'm so excited for you to be here. We have
a night email that is, I feel like, right up
your alley. Hi, Night Call. Relatively new listener here, catching

(09:59):
up on some recent episodes, and I wanted to add
to the discussion you've been having on Elon Musk, AI,
and fascist imaginings of the future. There's an article from
2017 about a movement in Silicon Valley called the Dark
Enlightenment and its neo-reactionary followers that I think you
would enjoy, a link to which we will include
in our show notes. The too-long-didn't-read version

(10:20):
is that a group of neo-reactionaries believe that the
Singularity, quote, a vision of the future that anticipates artificial
intelligence both surpassing the human mind and merging with it,
end quote, is inevitable, that the resulting society will be
ruled by a superintelligent AI, and that Elon Musk
is using Neuralink to help build the tech to
merge the human brain with his AI superintelligence and

(10:41):
create an AI god-king. It involves transhumanism, fascism,
and is as absurd as it is terrifying. Some of
Silicon Valley's most influential people, including Musk and Peter Thiel,
are associated with this movement's thinkers and true believers. Elon
even found Grimes on Twitter because she made a joke
about Roko's basilisk. And anyway, I thought this article would
be a good read because it deals with a lot

(11:03):
of what you've been talking about on the pod with
Meredith and amongst yourselves, while putting a terrifying, conspiratorial twist
on it. When I first read the article, it felt
like a conspiracy theory. But it's something I keep returning
to over the years for its explanatory power. Big fan
of the pod and I hope you found this as
interesting as I did. Thanks, Justin. This was quite an article.
Um, we will be linking to it. It's long

(11:26):
and indeed is a long read, but it's a great read.
This is by Shuja Haider for Viewpoint Magazine. Yeah, the
Dark Enlightenment, the... or what is it, the DE?
Or, no, no, no, no, that's... the neo-reactionaries,
they're the NRx? Like, of course it has
to have an extreme abbreviation that sounds like an energy drink.

(11:51):
It sounds like DXM, which is what people robo-trip
on. Um. Nitasha, how long have you been following
the Dark Enlightenment tech narrative? For about nine or ten years, actually. Um,
back when I lived in New York and worked for
Jared Kushner, um, at the New York Observer, I had

(12:13):
written a story for them about UM, you know, trying
to find like a New York angle on UM, this
particular ecosystem of people. And so in the article... although
I haven't had a chance to read that one,
basically a central figure there is Eliezer Yudkowsky. Have
you guys talked about him on the pod,

(12:34):
or you haven't talked about him on the pod? Well,
he's featured heavily in this Viewpoint article. Um, you know,
so he's a main character in the story. So yeah,
but do you want to explain who he is
for listeners? Yeah, um, so he is the author of
Harry Potter and the Methods of Rationality. UM. I don't
know if you guys know that, but it's a it's

(12:56):
a central text for, um, for this movement. Um,
it kind of stems off of LessWrong.com,
which is a forum that a lot of these people
participate in. So at the time, their group was
called the Singularity Institute, not to be confused with Singularity University.

(13:17):
They've since changed the name, I believe because there was
some money exchanged. Um, but now they're called MIRI, the
Machine Intelligence Research Institute. And that's where the Thiel funding
came in. And in fact, I think it's an
incredible investment from Peter Thiel, because he only put in
like maybe a million dollars for the first donation and

(13:38):
then a couple hundred thousand after that. Like they haven't
received a ton of money, and yet, like, think about
the, like, the return on investment in terms of being
associated with futurism and enlightenment and contrarianism. And, like,
same with his seasteading investments. It's actually very minor.

(14:00):
And since then, um, there have been a lot of
other people, like, um, the Ethereum guy, uh, the Skype co-founders, Kazaa...
like, you can look at, um, who has donated to them.
But in any case, they actually have
a pretty simplistic argument, um, about unfriendly AI, general, you know,

(14:21):
artificial intelligence. That, um, you know... it's
sort of like... and it's really infected the effective
altruism movement. Um. So it's like, if you don't do
everything you can in your power right now to prevent
unfriendly AI from coming into existence, unfriendly AI will

(14:41):
will, you know, kill you and kill everyone, in revenge, exactly.
Like, Eliezer's the guy behind the paperclip experiment,
just the idea that, you know, it just could happen:
you tell them to maximize for peace, and the quickest
way to peace is to kill all the humans. Um,
you know. So that's how all of humanity ends.
And Eliezer's, like, not... you know, not a comment

(15:04):
on his intelligence, but he is a third-grade dropout
and, like, does not have a background in this stuff.
So it also causes a really big rift within the
AI community. Like, the Google founders definitely do not appreciate, um,
you know, this, uh, this kind of very cinematic

(15:24):
take on unfriendly... Yeah, sorry, I just want to, right,
but it's super fascinating. Right. So, the place I learned
about the singularity was from the Ray Kurzweil books from
the eighties, I believe, which sort of put forth the
idea of the singularity, the moment when AI will merge

(15:46):
with human intellect or surpass it. Is that where a
lot of this comes from originally? Yes, Ray is definitely
a foundational thinker, although he doesn't get referenced, I think,
that much anymore. But I mean, honestly, they talk about...
like, the members that I've met with do talk about,
like John Connor and Terminator to try to explain it

(16:08):
to other people. Although, you know, they've been warned against that,
but it just helps you understand, it helps you
picture it. And Google and everybody, like, they're
anti- that, just because they've invested so much in, like,
neural networks and, you know, machine learning and all this,
and so, you know, they potentially are playing the role

(16:30):
of Skynet here, which maybe is not a role they
want to be perceived as playing, even if they are.
To give them a little credit, um, I think it's
actually based on their own understanding of how neural nets
work and how, um, you know, machine learning works. I
think they're just saying, like, that is not... um, it's

(16:53):
just that, like, far-fetched potential reality is just going
to distract you from, like, the very real things that
happen when you put in, like, bad data sets and, um,
you know, you have a bunch of developers and, um,
machine learning experts who don't think about the implications of
their work. Like, it's... you know, people have

(17:16):
been talking about self-driving cars forever. Like, the chances
of a self-driving car... it's almost as far off as,
or vice versa. Yeah. Yeah, one of the things in
this article, um, that I thought was interesting, because I'm
less familiar with this kind of stuff, uh, and
I didn't even realize that this was a concern that
people had. I'm quoting from the article, but this is

(17:38):
Roko's hypothesis, um, that AI may, quote,
develop a survival instinct that it will apply retroactively.
It will want to hasten its own birth by requisitioning
human history to work towards its creation. In order
to do this, it will institute an incentive that dictates
how you will be treated after you come back to life.

(17:59):
So is that kind of the center of this panic,
that if we create machines that reach singularity, you
know, that if we reach the singularity point, the
machines will somehow, like, become angry at us and try
to bring us back to life and torture us forever?
Because that was alluded to in this article. And I

(18:20):
was like, is that actually what Roko's first post, like,
did to people's brains? Is that where their minds
went with that? Well, I feel like it's sort of
this thing where it's like, if you take rationalism far
enough, you're just gonna, like, be rationalizing away human beings,
because there's nothing rational about human beings. Like, that

(18:41):
feels like the crux of this. But then I think
the thing that's funny about the Roko's basilisk thing is
that it's like a cursed chain letter or something.
It's like, if you... exactly, then your brain is poisoned forever,
and now you are a part of the scheme to
bring about the unfriendly AI. Or you have this moral responsibility
to somehow coach AI to be beneficent, benevolent or whatever. Uh,

(19:05):
And that if you don't act, if you don't
act on that, they will be evil, because now
you know, and so you have some kind of an
obligation to do that. Yeah, I mean, it honestly sounds
so much like the white racists being afraid that when
minorities have power, they'll come treat the white people like

(19:26):
white people have been treating them all along. And that
is something that I thought was interesting because it's like,
what if they put as much money and time and
thought into dealing with the racism issues that they have
in tech as they do to dealing with like a
potential unfriendly AI, which is the thing that doesn't exist,

(19:48):
unlike racism, which does. Right. I think there's, like,
the two potential, um, you know, dreadful outcomes. One is
the, you know, uh, sentient AI that judges you
for your actions, and then the other is that, like,
paperclip factory example, where, um, you know, you just
put in an input to maximize one thing or the

(20:10):
other, and just, like, without even attributing human attributes to
the machine, it just destroys humanity, because you
didn't have enough people writing white papers and thinking about it.
And the way that it's, like, you know, become
such a priority for the effective altruism movement, I think,
dovetails into exactly what you're saying about, like, um...

(20:32):
So that's a movement, like... money that potentially initially was
fixated on how you can make the most impact on,
like, you know, poverty or, um, you know,
global desperation in its many forms, and is now prioritizing,
as one of the top issues, this, um, potential eventuality. Yeah, no,

(20:55):
I, uh, does anybody know who Roko is? The
person who kind of floated this idea to begin with,
that's become, you know, that's had this much consequence financially
for the tech world? Does any... like, I didn't, I
couldn't find anything about who this dude was. I forgot,

(21:15):
but I think I looked it up when Grimes and
Elon Musk got together, because I just could not believe she
would... because it's forbidden from being discussed now. So maybe
it's from Harry Potter and the Methods of Rationality? I
wouldn't be... I'm looking: it's named after a legendary serpent
creature from European mythology that kills with a look. I was just

(21:37):
wondering about Roko. Yeah, the dude Roko. Or Roko's obviously
the sentient robot from the future who created this to
warn people. Yeah. Um, but yeah, I mean, you know,
the pop-cultural implications also of this idea, that
it did bring Grimes and Elon Musk together to have

(21:58):
their math baby, So you know, it's the tendrils reach
into all sectors of society and culture. Basilisks, by the way,
Molly, are still popular pets, those lizards. They can walk
on water. But the whole thing about the basilisk is because
it's a monster. The reason that they called it
Roko's basilisk is because it's a monster that you

(22:19):
die if you look at it. So that's the whole
thing of if you are exposed to this theory about
the unfriendly AI, then you are now a part of it.
Because the idea, we should clarify, is that
by putting that idea into the Internet, by typing it
onto the internet, the thing that the AI are supposedly

(22:40):
going to be learning from, you have now exposed them
to this idea, and they're like, hey, great idea, human.
When I'm smart enough, I'm going to destroy you if
you didn't help me exist. Um, that's that kind of
feels like the gist of it. I thought it was
also a little bit that when you spread that idea
among people, like they can't stop thinking about it. Like

(23:01):
people were going, like... users of effective altruism or LessWrong
were like... yeah, people were really, really losing it. Yeah. Yeah.
Why do you think so much time gets spent on
theoretical problems like this when there are so many real
problems in tech that you've been covering for a decade
that seemingly have not gotten fixed or even really worked on. Well,

(23:24):
it's, I mean, it's an incredibly effective tactic. Um. I
don't know if you guys are familiar with Data &amp; Society,
but they have a bunch of researchers who do, like,
great work kind of tying historical discussion on automation with
how things are going these days. I mean, it's
just, like, as simple as... it looks like you,

(23:48):
you know, gesture over here and, um, make it sound...
make the future sound inevitable, right? Like, if self-driving
cars are inevitable, then you don't need to worry about people, um,
you know, complaining about this or that, um, and it
makes the fight seem futile. Like, it just

(24:08):
distracts you. I mean, this is also how, like, companies
work with IPOs, you know, it's...
or whatever, like, the stuff like public companies with the
stock market. It's just always the next thing. Like, WeWork
is looking at co-living, and then I don't
know what they're looking at... like, wave pools, that's actually
a real one. Um, I was gonna say spaceships, um,

(24:29):
you know, or Elon Musk looking at Mars, when you
know what they're doing right now is, uh, cheaper work
that has already been accomplished. So if you think that,
like, eventually this planet is going to blow, um, then
of course why would you worry about the piddling concerns
of those who won't even end up on Mars? And

(24:51):
it's probably cheaper to, like... Elon Musk has enough
power that he could, you know, create a green economy,
probably more or less single-handedly, if he wanted to.
That might prevent our need to go to Mars. What
is your... what's the pod's, like, unifying theory on Elon Musk?
Like that we hate him, right, We're not into it.

(25:18):
And I think, in general, that the... in particular, the
Mars thing feels pretty, um, pretty dark. Well, it's like
super-colonialism from somebody whose money all comes from apartheid. Yeah, yeah.
And I totally believe Azealia Banks, who said that people
who want to colonize Mars are planning to do it

(25:39):
entirely with robots and leave all the humans behind, except
for the god-king humans who will get to go
to Mars. I know. Sometimes I've been trying
to think about, like, you know, a good sci-fi
plot for landing on Mars with Elon, and it's
just so absurdist. But I do think he's, like...

(25:59):
did you guys read Ashlee Vance's biography of him? He's just,
like, an incredible salesperson, and he knows, like, the narrative
arc that they have perfected. Like, multiple times in
the book, you know, he tries to refer back to
like I was interested in space when I was eight

(26:20):
years old. Like, these are the... this is how the
trope goes, you know, like some outlandish thing. And he
also is frequently talking about how he sleeps under his desk,
and, you know, from all of his employees
who have been overworked and super stressed, like, I'm sure...
I don't doubt that, but it's part of this narrative.
Like, if you look him up on Quora, all the

(26:43):
questions are about, like, how can I be like Elon
and emulate this, like, non-sleeping, hard-working lifestyle. But,
you know, when he meets his second
wife, the actress Talulah, in the book, he's like,
I hadn't taken a vacation in years, and I
was so sick. But my friends they forced me to

(27:03):
go to this club. And then I was at this
club and I met this actress and I was I
was basically forced to, like, pursue this actress. But it's
just, like... this is what he knows to say. Yeah,
it's like this turbo'd Steve Jobs origin story. It's like,
you know, the romance of the garage, like, you know,

(27:24):
times twenty, and, um, just kind of made more toxic. Yeah.
I hadn't really thought about this before, but I think,
like, maybe, like, Steve Jobs did it for consumers. I
feel like Elon does this to have his workers, and
to, like, direct money to certain areas and interest
in things. Yeah, that was a really interesting distinction that's

(27:47):
made in this UM, in this viewpoint article, and I'm
sure other people have made as well, is that there's
been this shift in Silicon Valley culture. And I'm not
saying that like one is necessarily more virtuous than the other,
but this consumer-facing, um, kind of mindset, like, you're
kind of the previous generation, your Steve Jobs, and your,

(28:08):
uh, yeah, all of that era, and then, um,
this sort of current era, which is sort of more
techno feudal, much more interested in UM kind of creating
empires than necessarily creating a product for consumers. UM, like
the product for consumers is sort of incidental to creating
these sort of factions, these sort of like mini empires,

(28:29):
which is um, yeah, that really clarified a lot, because
that's there's definitely a shift in how like when a
company like Uber or Lyft says that the drivers are
not their employees. Um, which is a big part of
some big news that happened in California this past week. Um,
that is part of that, you know, that's that's part
of the new tech culture that we have now. Yeah,

(28:52):
and I think part of the thing about Elon sleeping
under the desk and this whole uh you know origin
story of tech where everybody is like, I gave up
my entire life to build this thing, and you know,
I worked on it so much. But you see that
in all kinds of industries where you know, the founder
or CEO, especially in like startup culture, will be like

(29:13):
I'm the first person here and the last person out,
like, therefore nobody else can work less hard than me. Yeah.
I mean, for Elon, I think it's especially interesting, given
that his, um, early companies were just software companies. His
first one was, like, quasi a media guide
company or something, and then, um... but I think, like,

(29:34):
that romanticization of somebody who's gone from like just code
and like not that inventive to hardware to you know, like, um,
the kind of the most difficult kind of hardware just
makes him such a... Um, I mean, I think, like, the
cultish implications of everything we've been talking about all tie together,

(29:58):
because, like, pre-Tesla and pre-SpaceX, it's not like this
is somebody who has some singular vision. Um, it seems
really scattered. Was PayPal a singular vision? But yeah, these are, yeah,
these are, like, little... these are bits of code.
Yeah, that's software. It's not, it's not like
something like the Mac or something like that or Windows

(30:21):
even. It's, it's, uh... It feels like it's all
stepping stones, that each individual product or service is not
necessarily the thing that's going to be the legacy. It's
more just, like, amassing power and going on to the
next thing. And yeah, just progress, progress in scare
quotes, for progress's sake, um, going, going to the
moon for the sake of going to the moon, because

(30:43):
it's more attractive than investing in, um, our planet. It's
like fetishizing empire. Like, the growth, you know, the growth,
which is, like, um, you know, uh, enforced by the
stock market, by investors, everything becomes, like, the

(31:04):
end in itself. And I think that is because they,
like a lot of these people do think of themselves
as having good intentions and a and a smarter, better
vision for the way the world should be. So if
you use that argument in reverse back, it's like, yes,
the end goal of Jeff Bezos should be to you know,

(31:24):
just have the biggest empire possible, and then at some
point you'll switch to, you know, benevolent mode. And
that was... I mean, that's like a weird side note
of the Roko's basilisk thing, is that it starts with
this, like... it starts with this argument about, um, about

(31:45):
altruism and creating this, uh, potential solve for the altruists,
um, the altruist burden, the altruist... See, when you start
with a phrase like "altruist burden," you know it's just,
like, not going anywhere useful. But the solve is the
quantum billionaire trick, which is basically, like, um, you become

(32:07):
a billionaire via many-worlds, and, uh, then you can
solve the world. That's the most effective way to solve
the world's problems. Like you invest some money and then
there's like one universe in which you end up with,
you know, three million dollars or whatever. You can do
whatever you want with it to help the world. And
so that's the most powerful thing you can do in
this world to help other people. It just made me

(32:30):
so angry. As we approach the fall season, it's the
perfect time to share your ghost stories with Night Call. Please
leave us a night call at 1-240-46-NIGHT,
or a night email at nightcallpodcast@gmail.com,
with all of your tales of the supernatural, spooky,

(32:51):
and just the mundanely weird. You wrote a great thing
this week about the Chan Zuckerberg Initiative and how sort
of the philanthropy arm of these tech companies and tech

(33:12):
billionaires really doesn't compute with their own personal choices as
business people. Could you talk about that a little bit?
Sure. So, um, so the article was looking at what's
been going on inside the Chan Zuckerberg Initiative over the
past few, uh, months and years. Um, so the Chan

(33:35):
Zuckerberg Initiative is a limited liability company, not a foundation,
not a nonprofit, which means that... um, so, Mark
Zuckerberg and his wife, Priscilla Chan, who's a pediatrician, they
have pledged their Facebook stock to go into that, but
because it's not a foundation, that doesn't mean the money
has been committed. It also means they're allowed to make

(33:59):
investments in for-profit companies and allowed to do political advocacy.
So when they announced this, which was like with the
birth of their first daughter, it was like this, um,
letter to their daughter about, um, about wanting
to think long-term, and the focus at the time

(34:19):
was on science. So it's like, cure all disease in
your lifetime, and the biggest issues facing, um, the next
generation. Um. Many people, including myself, were very skeptical about, like,
how are they going to exercise this political power. There's
much less transparency UM and uh, you know investing in
for profit companies, and I think the problems that have

(34:43):
manifested themselves recently just show you, you know, you can't...
sometimes you just can't even anticipate the right problem. Because, um,
unlike Bill Gates, say, who, like, stepped away... not...
he was still on Microsoft's board, actually, up until earlier
this year, but he stepped away from his CEO job.
So Zuckerberg is in this position now. They are

(35:04):
the largest, one of the largest funders in the US
of criminal justice reform, and through their investments in FWD.us,
which, um, worked a lot on, like, DACA
and with Dreamers, um, they are also one of the
biggest funders of immigration reform in the US. So
what happened this summer is that the black employees at

(35:26):
CZI, um, their group is called Building
Leadership and Knowledge, BLK. They saw Facebook's decision to
leave up that Trump shooting-and-looting post, um, and
then they saw their comments on Black Lives Matter, and
they decided to write a letter to Priscilla Chan, who
actually runs the day-to-day, to articulate this kind
of, like... it's a much more sophisticated diversity problem than

(35:49):
what's happening at most tech companies, or the level of
discussion, because they're working on criminal justice reform. They have
a whole section called Justice and Opportunity that's focused on
housing, criminal justice reform, immigration, movement and capacity building. And
what they were saying is, we've been asking you for
years to prioritize racial equity. So obviously there's various ways
to to define it, but what they mean is like

(36:11):
really looking at you know, these are progressive causes in
and of themselves. No one is saying that. You know,
like the end goal is is different. But if you
don't look at how you know, who are the organizations
who are receiving this funding, how is this impacting communities,
especially because their work in places like criminal justice reform

(36:34):
has focused, as most corporate philanthropy has, on clean-slate initiatives.
So that's, like, expunging records, um, which has gotten a
ton of bipartisan support. Like, Center for American Progress is on
it, um. So are the Koch brothers. Uh, you know,
there's a lot of, um...
it's just more momentum and cohesion across the spectrum than

(36:58):
most things. But, like, you know, there's a chance that
it could be easier to obtain for
white prisoners, or, you know... so if you
don't look at those implications, the fact that you're one
of the biggest funders is not gonna mean anything. And
they were saying that the same thing has happened to
them within CZI: like, they're not promoted at the

(37:18):
same rate, they're not listened to. They were hired for
their expertise in these areas and the fact that they
worked closely with these communities, um, and they're not being...
you know, their wisdom, their lived experience is not being
prioritized or respected. And I mean, just, like, the layers
of irony are just it's just it's too much. Like

(37:38):
because at the same time, um, organizations like Color of
Change and the NAACP, they, like,
they had just been in the news talking about... and
I feel like the critiques have gotten extremely precise
and really strong about Mark Zuckerberg, which is like,
no one is trying to say that... it's your perception

(38:00):
of yourself as good-intentioned that is the problem here. Like,
you don't recognize your own blind spots on these issues.
And part of it is just like pressure from the
right because you are trying to maintain your empire, so
you have to... and you're under attack from the left
and the right. But part of it is choices.
You choose to listen to the conservative critique, um, you

(38:23):
know more than anything, Uh, let's see what happens if
you know the administration changes. But they're saying that you know,
you're going for these bipartisan initiatives. UM. And they talked
about like not so much in the letter, but in
my conversations with, um, with employees current and past,
they were talking about, like, your desire to appear equal

(38:45):
is disproportionately hurting Black leaders and Black-led
solutions. And this just echoed what
I've heard from Facebook employees, what I've heard from everyone,
and the irony of one of the biggest funders of
this movement being in control also of what threats get

(39:06):
amplified against Black Lives Matter to billions of people. It's
just, it's too much power. I mean, it's a lot of...
It's like, I think it's just so apparent when you
look at these huge companies. When you look at a
Google or Facebook, or Alphabet, I guess, they all have,
you know, all these little
projects. And, like, sometimes if you look at it at face

(39:27):
value, like, what does this have to do with, you know,
forwarding the Google agenda or something? But then you look:
there's always a thread. There's always some reason, whether it's
just data gathering, data harvesting UM or it's usually that,
but like you know, UM, there's always some way you
can connect it back to like Okay, this is how
this helps them. They're not just giving you free books

(39:49):
on Google Books because it's a nice thing to do.
There's always some kind of UM agenda behind it. And
I think it's always so apparent when you see the
philanthropic efforts of any of these companies that they
don't have that connective tissue to what the, um...
like, they're completely not, uh, cohering or part

(40:12):
of the same conversation as what the rest of the
company is about UM in a way that you know,
allows there to be these huge discrepancies between what where
they put their money and where they you know, do
their good deeds. It's, like, completely... it's unlike
Alphabet, where you have all these, like... it's not
a part of a network of activities in the same

(40:32):
way that you know, any number of uh Facebook initiatives
could be like you know, creating Messenger or whatever. Right,
It's almost like a, like a countervailing force to
the work that they do. Like, I just, um...
I just find it so interesting, like, the work in

(40:54):
science versus the amount of misinformation spread on Facebook; the
work in education versus, you know, the impact on information,
access to information, and news gathering; and, um,
you know, with Justice and Opportunity, um, that also has
it... like, they're working in housing. Uh, you know, Facebook

(41:16):
has been sued by HUD, you know, for, um,
for discriminatory ads, um, you know, basically redlining when
it comes to what housing ads are served to Black users.
It's so it's just crazy to me because like when
people thought that Zuckerberg was going to run for office,

(41:38):
I just, like... you know, people have
some idea about his personal beliefs, or the personal beliefs
of Google's founders. And when your main existential threat is,
like, empire maintenance, your personal beliefs do not matter.
And it's honestly so I think it's one of the

(41:59):
most, like, dangerous forces, this faith in yourself as
a good person. And feeling like... imagine these discussions,
you know, if it's, like, giving books, or, like, Chromebooks
or whatever. Um, you know, if
you're inside Google and you're like, well, we know that
we care, and, you know, we trust ourselves. Like, you

(42:21):
just... and they don't kind of see, maybe, the
irony in the fact that they have the power to
make these seemingly magnanimous gestures because they've siphoned from,
you know, X, Y and Z. And that's where I
think also the Zuckerberg... uh, like, we thought the problem
was going to be, he's going to impose his personal beliefs,
but the problem is that he's imposing and advocating for...

(42:45):
It appears that he's imposing and advocating for beliefs based
on what will allow Facebook to continue for years.
I don't know that he has any personal beliefs, And
that's kind of the problem. Like, that, and the idea
that... I mean, this thing feels like a really
fundamental difference between running a country versus running a company, um,

(43:07):
and trying to run a country like a company, say,
is that, um, there's no policy in the same way
in a company that there is in a
country, or in a government, in a functional government.
Like, you have to make those, um, those
fail-safes for, um, human failure and human error. Uh,

(43:31):
You know, at least in a
functional society, you have to have those things. But
in a company, everything is... if you're part of the company,
then you're a part of the activities that
create wealth for that company. So, uh, you don't need
to be questioned in the same way. There's not...
it is assumed that you are on the team. Well, also,
I think you know, you can have good beliefs, but

(43:52):
you can also really maintain the idea that the only
way to achieve any good is to hang on so
tightly to your power and your empire that you can
achieve it. And that kind of also ties back to
the article that we were discussing, because if you feel
a moral responsibility to be able to have that much
power because you do feel like your vision is aligned

(44:14):
with something good, then you also probably feel a
moral responsibility to increase your profits, to have more money
to do good things. But then, as Nitasha pointed out,
it's at the expense of people who are already marginalized
and already saying like, hey, you know, we're being left
out of this um. It's so complicated. I mean, everyone's
always so hard on Zuckerberg, and we are definitely quite

(44:36):
hard on Zuckerberg on this podcast, but I think it's
worth mentioning that, I mean, it is complicated, like it
is very complicated, and it's, I think, good. But what
you were saying, too, about blind spots, it's like, you know,
an actually smart person would be aware that they have
blind spots. But because of the sort of emperor-god-

(44:58):
king-of-Dune way that tech situates itself around these
personalities and has these sort of cults of personality, they
don't listen to the people that are pointing out the
blind spots. And something I thought was really interesting,
because I do feel like it's, like,
they think about the optics, they make sure that there
are like some black people at the company, but then

(45:19):
they don't empower the black employees and it just creates
like an endless loop. But the thing you wrote about
Pinterest, where they were doing some sort of front-facing,
like, we're making it more diverse, and we're gonna, like,
work on racial equality, and, like, we love BLM, and
we're not going to have plantations on the boards anymore.

(45:39):
And then you profiled a couple of black employees of
Pinterest who were like, Pinterest has a huge race problem, um,
and they're not really doing anything about it except these
sort of superficial, Mickey Mouse, you know, look what we're
doing to fix racism. Right. So, Ifeoma and Aerica
both worked in public policy, and they were responsible for

(46:02):
all of these things that Pinterest had been doing that
looked so progressive, such as, um, you know, uh, stopping
the ads that were placed on plantation wedding content, um,
stopping the spread of health misinformation. And actually, uh, you
know what was the one of the catalysts for their

(46:24):
decision to speak out was the fact that their internal
documents had been leaked to, um, Project Veritas, about
their efforts to get Ben Shapiro designated a white supremacist.
And they weren't even saying like take his content off,
but they were warning that this is going to become
an issue, and, you know, you need to put

(46:46):
a warning on that. And so, again, like, the
irony is just... it's too much. Because that
is what led them to realize the company does
not support them. They ended up having to... they warned
the company that, you know, hey, if you have Project
Veritas going after you, this means, like, something has
been breached. Um, they're going to come after the, you know,

(47:07):
women and, uh, like, minority employees. They have
a pattern of doing it. And they wouldn't... they didn't
take them seriously. They had to pay for the organizations
that they work with to vet health misinformation content. They
had to pay them to look out for them
being doxxed, and they were doxxed. And, um, and then

(47:30):
then Pinterest hired that company, it's called Storyful, and
had them look into, is Ben Shapiro really a white supremacist?
And... I didn't see that conclusion. Yeah, yeah, exactly.
It's like they have people telling them the information that

(47:50):
they need to know and they're just not listening. Uh.
And that seems to be the problem across the board
at all kinds of companies. Yeah, that actually happened at
Facebook. Oh, it was, um... they only had one
Black executive in the room when they decided to, um,
make the decision about Trump's shooting-and-looting post. That
is, like, their global head of diversity, who's been

(48:12):
working at the company for seven years. Everybody has been
telling them that she should be reporting to Zuckerberg. They
just only promoted her to report to Sheryl Sandberg. And
the Black employees inside Facebook were also trying to explain
to him, like, how these dog whistles work, how, you know...
So, uh, it's just, it's just the same story everywhere,

(48:35):
and it's been the same story. Like, it's been... yeah,
such an opportunity, because now people are sort of willing
to talk more about it on the record, because they
see how their, um, you know, their employer's decisions
are making real-world impact, and they have a little
bit of a buffer, with everything that's been happening, for
their own, like, safety and jobs and, um. But they've

(48:57):
been talking. I mean, it's just the same
thing. Yeah. And I think this mirrors a lot
of this stuff, especially, you know, having the information told
to you and not doing anything about it, because at
the end of the day, you're not that interested in
learning or changing or admitting that you've done anything that
wasn't helpful or constructive. Like, that feels like that happens

(49:18):
in every sector. That happens in journalism. That's been the
basis of a lot of recent dust-ups that have
happened in journalism and editorial and everything. Uh,
it's just that when you're talking about a company... and, like,
I don't know, Pinterest feels more on the
level of, like, a Condé Nast or something. But, like,

(49:38):
when you're talking about Facebook, you're talking about, um,
a company that has really injudiciously gotten itself, um, mixed
up in all of these different aspects of American life.
Then, you know, the unwillingness to learn becomes even
more of a hazard. I also think the, like, free
speech obsession, which, for these kinds of straight white guys

(50:01):
that are in charge of these companies, manifests as, like, well,
we can't, we can't tamp down any kind of speech.
We just have to see what happens. And what happens
is like it goes towards fascism. But there's this like
anti-interventionist attitude of, like, well, if it wants to
go towards fascism, like that's where it wants to go,
and we just have to let it, you know. Uh yeah,

(50:24):
I mean, it's not even anti... I mean, I would
say that it appears that it's not even anti-interventionist
so much as, like, anti paying for the labor to,
uh, to intervene. Like, if you look at Zuckerberg's founding letter,
uh, you know, which they put out when they went public,
or Google's, they don't say anything about free speech, um.

(50:44):
You know, early books on Zuckerberg, they don't talk about
free speech. Um. And recently there's some amazing reporting in
the Journal, and from NBC News' Olivia Solon, um, about
how their own researchers... Facebook's own research has found that
their moderation policies were, um, taking down more content from

(51:05):
Black users. And, like, you know, they told
them to stop researching it. So it's just like, obviously
there's hypocrisy everywhere, but it just really does not hold up.
And that is one of the blind spots. Like, the
idea that, um, you know, the kind

(51:25):
of free speech you're concerned about protecting is somebody who
has this massive platform, right, And it's like, you know,
Zuckerberg's own employees at CZI... this was
first reported by, um, by Recode, but one of the
employees at CZI was like, you should resign
from Facebook or resign from CZI. Um, and,

(51:45):
you know, he laughed it off and was like that's ridiculous.
Of course not, and brought up free speech. And they said, well, like,
what about the free speech of the protesters? They're dead.
When they're dead, they can't talk. Like... you know,
it's just like fighting with this massive myopia. Yeah, but
when you don't think you're going to die, when death

(52:07):
isn't a factor for you, then what do you care
about other people's lives? And you'll be on Mars, You'll
be You'll be a robot. You'll be in a box
on Mars. You'll have a robot dick. Number one priority.
Get the robot dick. Somebody allegedly already has one. SpaceX. Um, Nitasha,

(52:29):
thank you so much for joining us. We are all
such huge fans of your work, and I really look
up to you for, you know, tirelessly covering this thing
that seems very soul killing to look at in depth
all the time. Now we're lucky to have your eyes
on it. Um, you're doing a great public good. Thank

(52:51):
you so much, Nitasha. So people can find you at
the Washington Post. You're also on Twitter at Nitasha Tiku,
N-I-T-A-S-H-A T-I-K-U.
Is there any, uh, anywhere else that you can point
people, or anything recent you want to,
you wanna hype, that you wrote? I want to hype
something that I didn't have to... no, no, but go read

(53:14):
this NBC investigation about them telling researchers not to look into, uh,
not to look into how Facebook moderation policies are
disproportionately impacting Black people. It's just got a lot of
good stuff. Perfect. Will you send us a link that
we can get in our show notes? Perfect. Thank you so much, Nitasha.
Please come back sometime. Yeah, for sure. Thanks, stay cool.

(53:37):
Thank you so much for listening to Night Call. You can
subscribe to us on iTunes or wherever you get your podcasts.
Leave us a review and a rating while you're at it,
as long as it's nice. You can also follow us
on social media. We're on Twitter at Night Call Pod
and Instagram and Facebook at Nightcall Podcast, and you can
support us on Patreon. We are at patreon dot com

(53:59):
slash nightcall, where you can get bonus episodes, our monthly newsletter,
merch and all sorts of fun stuff. So check us
out there and we will see you all next week.
Goodbye!