Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to another episode of Internet Hate Machine. As always,
I am joined by my lovely producer, Sophie. Sophie, thank
you for being here. Happy to be here as always. Bridget,
how are you? I am, as I told you earlier,
I kind of have the blahs. I'm not drinking for
dry January, which is taking its toll. But I'm good.
(00:23):
I'm good. How about you? I was telling Bridget off
mic that I was like, I'm thinking about doing thirty
days no dairy, but I don't know how to do
that without cheese, And Bridget was like, cheese is one
of the only purely good things that we have in
this world. You can't give it up. And I take
that to heart. I'm like, you're You're so right. We
just don't have that many things that are just pure
(00:44):
good in this world. We need to protect them, indulge them,
respect them. I appreciate that. So, as you know, so far
on this podcast, we have talked so much about things
like harassment and abuse and those kinds of things happening
on social media platforms that we know disproportionately impact marginalized
(01:04):
people like black women, so much so that black women
are sometimes called canaries in a coal mine when it
comes to technology harms because, as I hope I've demonstrated
in this podcast, first, something harmful online will happen to
black women, and those black women speak up about it,
and nobody with power really does anything to address it.
And then that harmful thing online happens to more and
(01:26):
more people, and then it just becomes a normalized thing.
That is everybody's problem. I have argued many many times,
and I think that our digital media ecosystem is broken
because lies and conspiracy theories and harassment are amplified and incentivized.
But I want to be clear that it does not
have to be this way. I firmly believe that something
(01:46):
different is possible. In fact, many of the technologists, organizers,
and activists who are doing the work of trying to
build that better Internet future come from the same marginalized
backgrounds that are so often harmed online. But doing that
critical work of making the Internet safer comes at such
a high cost for these folks, high cost financially, professionally,
(02:09):
and personally, and it shouldn't have to be this way,
and I think that's really part of the problem. One
of the reasons why this problem is such an ongoing
persistent one is because a lot of the people who
are trying to make technology safer are at best ignored
and at worst like actively punished for it. And until
we change that aspect of our culture and technology, I
(02:31):
don't think we're gonna get anywhere. And a story of
one of my favorite technologists, Efoma Uzoma, I think illustrates
exactly what I mean. So this episode is gonna be
a little bit different. Um. I want to tell you
a bit about I Foma and her story, and then
we're gonna hear from her in her own words. So
let's get into it. I love a format up. Yeah,
(02:53):
switching it up. I think it's it's great whenever you
can hear somebody's story in their own words. Oh, it's
It's like why I became a podcaster. I feel like
there's something you know, you could read someone's op ed
or read someone's writing, but there's something about hearing someone
tell their story and how they felt and where they
were at in their own words that just I don't
know it. It hits to the quick a little a
(03:15):
little better. I don't know completely agree. So Ethoma is
a public policy expert and technologist. She worked at Google,
and also worked on anti hate speech initiatives and community
standards at Facebook before landing at the social media platform Pinterest,
where she was one of the first hires on Pinterests
then newly formed public policy team. Sophie, Do you do
(03:37):
you use Pinterest? Uh No, but I definitely did, Like
I stopped really using it around then as well. Go
back and look at some of the things that you pinned,
Like if you were like, oh, my dream wedding board
and just have a, I hate myself, I had the worst taste.
(03:59):
It's, yeah, if, that's... I hope that doesn't still exist.
I am sorry to say it might. Stops recording and scrubs. Um, yeah,
I don't know. I am actually googling if I have
stuff still. Oh man, I'm pretty sure it does exist though,
But I don't think it's like there's much on there.
(04:20):
There's definitely like a photo of me that should not
be on the internet because it's atrocious haircut style. But
you know, I had to go in and like wipe
mine because also that was the era of American apparel
kind of like hipster stuff. I was definitely into that.
So it was a lot of just really questionable fashion
(04:42):
choices that I am happy to say that I
deleted from the Internet. Okay, so I have to I
have nothing that I've created on Pinterest. I have two
saved posts. One is a post that's like twenty-five
ridiculously healthy foods and it's like a graphic. It's like cool, weird,
problematic diet stuff. And then I have a quote that says,
(05:03):
your flaws are perfect for the heart that is meant
to love you. Oh, Sophie. And this amazing side-bang
situation, if you can see it. Oh my god,
it's a version of me. Oh my god. I I
wish I could hear from that version of Sophie. I
(05:25):
didn't know. I didn't. I didn't know Side Bang inspirational
quote Sophie. Oh she did a smokey eye. Oh incredible.
I have like a Jon and Kate Plus
Eight haircut in this situation. Like, it's like, I definitely didn't.
But this crop is like not helping me. Oh boy boy,
(05:47):
howdy, what a time. Anyways, this is not about me. Pinterest, yeah,
I mean, it is such an interesting sort of
digital time capsule of times gone by. I guess I'll say.
And so Ifeoma, you know, she gets hired at Pinterest,
and before that she had been working at Facebook, and she
describes the company culture of working at Facebook as very direct.
(06:10):
She told Time Magazine about a time where she directly
questioned Mark Zuckerberg, at an all-hands meeting, about why
he had not spoken up after the death of Heather Heyer
in Charlottesville, and Zuckerberg actually admitted in front of
everybody that it was a mistake to not do so.
And so she was like, oh, yeah, I came from
this culture of, you know, directness that was paramount at Facebook.
(06:31):
So if the culture at Facebook was direct, at Pinterest
she found it was very, very different. Interesting. Yeah, it is.
I think like it's interesting because I think that Pinterest
kind of had a, i guess perhaps unearned reputation of
being kind of a softer, cuddlier tech company. You know,
(06:51):
nobody's gonna mistake somebody like Jeff Bezos or Mark Zuckerberg
for like a cuddly nice guy running a cuddly nice platform, right,
We all, it's clear the kind of platform that they're
running and the kind of people they are. Can't even
imagine Bezos giving somebody a hug. Oh oh, I don't.
I don't want it because it would be very sharp.
I wouldn't want it. But yeah, Pinterest had this reputation
(07:13):
of being like a gentler, cuddlier kind of a tech
company that cares, for lack of a better phrase. Ifeoma's
job at Pinterest was working on the platform's policies to
create safer and better online experiences for their users. She
says that from her very first meetings at pinterest, she
could kind of tell the vibes were off. She told
Time that in her very first meeting with senior leadership,
(07:35):
she questioned the company's decision to keep Infowars'
Alex Jones on the platform and got pushback. She says,
quote, the entire strategy was lay low, don't weigh in
on anything. And I was like, this is not controversial.
This is someone who is harassing the parents of Sandy Hook. So,
you know, wanting to kick somebody like Alex Jones off
(07:56):
of the platform, that shouldn't have been controversial. And in
fact, it's the kind of thing that she was hired
to do at Pinterest. I happen to know Ifeoma.
She is a get-shit-done type of woman, and
so she does not lay low when it comes to
her work, and therein kind of lies a little
bit of the dilemma for women who are tasked with
making technology and the internet safer. You might get praise
(08:19):
for it publicly, but then be punished for it privately.
And that's exactly what happened to her, right. That, I know,
it's not, it's not great. So listeners might actually know
a little bit about Ifeoma's work without necessarily, like, knowing
it was hers or knowing her name. She spearheaded a
lot of policies at pinterest that earned the platform a
lot of public praise and like really good press. While
(08:42):
working at Pinterest, she ushered in the first real medical
misinformation policy at a social media platform, which included limiting
vaccine misinformation in searches, in twenty nineteen. Um.
This came after a measles outbreak, which you might remember,
that was really driven by, you know, parents being like,
why am I going to vaccinate my kid against the measles?
Because it's trendy. And this was really important because Pinterest
(09:05):
is a kind of platform that a lot of parents use,
a lot of moms use it. It's where you might
spend time if you're planning a wedding or setting up
a nursery. Eight in ten mothers and thirty percent of fathers in
the United States are on Pinterest, according to research. And
unlike a platform like Twitter or Facebook, it's a platform
where you might not be mentally prepared or primed to
(09:27):
expect to encounter medical misinformation, so you're not really like
you can be more susceptible to it. If you're just scrolling,
you know, inspirational quotes and nursery ideas on Pinterest, you
might not be expecting to encounter vaccine misinformation, and then
your guard is going to be a little bit down
for sure. And it's a platform that skews heavily female,
(09:49):
and we know that women are more likely to be
the sort of medical decision makers in their household, and
so them being targeted by misinformation on Pinterest, it's kind
of a big deal. And so Pinterest coming out with
this policy to ban medical misinformation was like kind of
a big deal in the misinformation space. This was also
(10:09):
way before COVID was a thing, So becoming the first
platform to proactively deal with medical misinformation something that we
know would become so important during COVID, was like a
huge deal, right. So, like, this was, this was, we
saw this happen quite a lot, and, you know, sometimes
in a performative way but also, like, in a good
way during COVID. But we're talking about, what is this? This
(10:35):
is twenty nineteen. Okay, so that's still, that's, like, very, I
feel it's very relevant that it was happening before. I mean,
having done a little bit of this work myself. When
big platforms make a change like this, it can be
the domino effect that has other platforms do the same.
And so you know, if Reddit does something, it might
(10:57):
create the conditions for other platforms to be like, oh,
I kind of have to do it. And so Pinterest
becoming pretty much the first platform to do this kind
of was a big deal, and I think it I
think it did create the conditions for other platforms to
be like, oh wait, this is something we have to
take seriously. Okay, let's let's let's figure it out. Yeah
that's very, uh, before the times. Yeah, thank you, Ifeoma.
(11:18):
So a lot of people will probably associate Pinterest with
planning weddings, and folks might recall that in twenty nineteen,
Pinterest also announced that they were going to no longer
allow plantations that used to house enslaved people to be
listed as you know, romantic wedding venues on the platform.
That was also Ifeoma's doing. She worked with Jade
Magnus of the racial justice organization Color of Change to
(11:42):
change this policy at pinterest. That got so much good press, like, wow,
this company is like proactively banning slave plantations as a
place that you can describe as a romantic venue for
a wedding, And it kind of created a new conversation around,
you know, platforms and the responsibility they have to ethically
(12:03):
talk about things like plantations. Yes, we know that Jennifer
Lopez and Ben Affleck were not using Pinterest. They're so
wealthy. But they got married, what, like a few months
ago, on a plantation. I don't understand it. Blake Lively,
I'm like, Okay, well, maybe it's possible that if you're
from the West Coast and before we were kind of
(12:24):
having the national conversation about this, I'll allow that maybe
you just never thought about it. Jennifer Lopez, I feel
like there's no excuse. I'm also looking at you, the
Biebers. Did, did they get married on a plantation? I'm
nodding. Listeners, I am nodding my head. In South Carolina.
Oh no, yeah, yeah, just let me double check my
(12:46):
fact check that one. I mean, I wouldn't know, but
I wouldn't I wouldn't put it past him. The first
thing that pops up is "plantation weddings will no longer
be freely promoted on Pinterest." Thank you, Ifeoma. This
article starts: the idea that celebrities as prominent as Blake Lively
or Justin Bieber would choose to get married at
a former concentration camp is particularly inconceivable, and yet both
(13:08):
Lively and Bieber in fact chose to get married at
the sites of former forced labor camps in recent years. Yep,
my insult stands. Yes, your insult stands. Um, so Ifeoma
basically was like a one-woman positive press shop
for Pinterest. She got all of this positive press for
making these progressive, proactive policy decisions and setting a good
(13:30):
example for other platforms. But rather than being praised for
this work that made the company look so great and
so progressive and so cuddly and warm and YadA YadA YadA,
and also added a lot of value to the company
as it was preparing to go public. She was retaliated
against for this work by her own colleagues. So here's
where it starts to go down. Ifeoma suggested that
Pinterest start adding a content warning to posts made by
right-wing grifter Ben Shapiro. I know, he, I can't
right wing grifter Ben Shapiro. I know he, I can't
even and I hate him so much. At the time,
y'all might remember that Shapiro was doing a lot of
espousing of the Great Replacement conspiracy theory, which we know
is a white supremacist conspiracy theory that posits that non
(14:13):
white, lesser people are threatening the white majority with the
help of globalists and elitists, i.e., Jewish people. Uh,
and so Ifeoma suggested that these posts from Ben
Shapiro get a content warning on Pinterest. When she suggested this,
one of her colleagues at Pinterest, who was a software developer,
(14:34):
doxxed her. He shared her full name, her cell phone number,
her email address, and her picture with Project Veritas, which
led to her personal information being published on 8chan
and 4chan, which we already know, you know, are, like,
sites full of extremists and abusers and bad actors who
already had roles in harassing women, like Adria Richards, who we've
(14:54):
already talked about on the pod, i.e.,
Donglegate. Exactly. Um, she gets death threats, rape threats,
and she was terrified. Ifeoma starts keeping a gun,
and eventually she had to move because the harassment got
so bad by her own fucking colleagues first, like like
her own fucking colleagues. Also, it's like to hear that,
(15:15):
like Ben Shapiro, like infiltrated Pinterest is so wild to me,
Like that just like that as like a concept is
it just like doesn't comprehend in my brain. Yeah, I
get what you mean, And I think it really speaks
to the way that misinformation and conspiracy theories work these days.
Where again, if you're just if you're just pinning pictures
(15:36):
of recipes and meal planning ideas and flowers or whatever
on Pinterest, you might not be expecting to encounter this
kind of harmful, anti Semitic, racist conspiracy theory. But oftentimes
on those platforms, these kind of dangerous conspiracy theories can
be dressed up intentionally to target women. So you might
(15:58):
be just, like, looking for nursery ideas, or looking for
ideas on how to can your own vegetables. Next thing
you know, you're ten pins deep into, like, great replacement,
tradwife, incel, manosphere bullshit. And I think
that's that's kind of how we're seeing a lot of
these conspiracy theories spread online today, in these in these
(16:19):
very unlikely corners of the Internet. So when Ifeoma
reported what her coworker had done, rather than investigating whether
or not, you know, one of their employees had actually
jeopardized the safety of another employee, Ifeoma says that
(16:41):
Pinterest instead wanted to investigate whether or not Ben Shapiro
was actually indeed a white supremacist. She told the Washington
Post quote, instead of focusing on security and making sure
that we were fine and validating the concerns that we had,
their concern was, is what you said valid? Almost like
the employee had a legitimate reason to share my personal
(17:02):
information all over the internet. Pinterest did not help
Ifeoma get the information removed, nor did they ever punish
the employee who was responsible for doxxing her and
spreading her information on the internet. I'm glad I haven't
used this since. Yeah, it's awful. So separate from all
(17:22):
of this, Ifeoma had lodged a pay discrimination complaint.
While working at pinterest, she had been doing all of
this groundbreaking work, but she had been hired at a
much more junior role than she should have been compared
to the workload that she actually had. She said that
she was doing the same work as her manager. Ifeoma
was a good performer during her time at Pinterest,
like, all of her performance reviews were stellar. She earned
(17:44):
two raises and a promotion during her two year span
at the company, but none of this addressed the fact
that she had been hired initially at the wrong level,
which means that she was losing thousands of dollars in
stock options just because of the like particularities of compensation
packages at tech companies. She's still getting these great marks
with the company, but when she starts pushing for fair pay,
(18:07):
even going so far as pursuing it with a lawyer,
the company starts to further alienate her. Like during a
performance review, while she was talking about her work getting
platforms to stop promoting slave plantations as wedding venues, her
manager said that Ifeoma should have provided the pros
of promoting slave plantations as well as the cons. Sorry,
what? Yeah, like, what are the pros to continuing to
(18:30):
have these on your platform? I would say none. It's awful.
And honestly, if you think that's bad, during our conversation
that you'll hear in a moment, Ifeoma actually dropped
another little nugget of something that she found out about
this particular manager that will make it all make sense.
So if you think that's bad, listen to that conversation,
because it gets a lot worse. What is this person
(18:53):
that's like, what? Yeah, so basically just both-sidesing slavery. Yeah,
I'm sorry. Like, read, read the room. What, do you
think about what you're saying before you say it? This
person should not be anybody's manager. That is so dangerous
and harmful. Exactly. So in May 2020, Ifeoma quits Pinterest. She gets
(19:18):
six months severance, and honestly, that might have been the
end of it, right, Like she talks about how she
wanted things to be professional. She really was loyal to
the company until the summer of 2020, during the racial justice uprisings in
the wake of the death of George Floyd. Now, I
remember this time because it was like it just was
like a weird time where do you remember, like people
(19:40):
would put their black lives Matter posts up and it
would be like, well, actually you're racist, or like actually
you said this to me? Like it was like a
time where I was like, before you post your black square,
really make sure that your own house is in order.
Like, who was the woman from Glee who was like,
I stand with Black Lives? Was it Lea Michele?
(20:02):
It was Lea Michele, and Amber Riley, her black Glee castmate,
was like, oh, Lea Michele, Black Lives Matter? Did black
lives matter when you called me a black bitch and
told everybody you were going to shit in my wig
when we were on set? Oh god, yeah. It's just
it's it's like during Pride Month when you're like, Burger King,
(20:24):
what are you doing? Uh yeah, Burger King stands with
the queer community. You're like, can I have my whopper?
You're making me uncomfortable. So Pinterest puts up a statement
in support of Black Lives Matter. They say, with everything
we do, we will make it clear that our black
employees matter. And Ifeoma, did she roast them?
(20:48):
She roasted them. So cool. So she tweets: I recently
decided to leave Pinterest, which just declared solidarity with
Black Lives Matter. What a joke. As a black woman,
seeing Pinterest's middle-of-the-night "Black employees matter" statement
made me scratch my head, after I just fought for
a full year to be paid and leveled fairly, a
year in which I was (a) doxxed by a white
(21:10):
male colleague. He shared my cell phone number, photo,
and name with violently racist and misogynistic parts of the Internet,
followed up by a dangerously inadequate response from Pinterest. (b)
continued to serve as the leader of and spokesperson for Pinterest's
biggest public policy wins, (c) scores of interviews and articles
on addressing health misinformation, emotional well-being, stopping the promotion
(21:30):
of plantation wedding venues, (d) kept all of the above
quiet for professionalism and in the hope that Pinterest would
do the right thing. Instead, they doubled down on retaliation.
Now Pinterest is claiming to be listening and acting mere
weeks after replacing me and another black woman colleague who
also decided to leave with, you guessed it, probably a
(21:51):
white person. I am so proud of the initiatives that
I led during my time here. Addressing health misinfo decisively
is no longer novel, thanks to that work. That
shit wasn't stifled by the racism, gaslighting, and disrespect
from my manager and the company's legal and HR leadership.
Racism is dehumanizing and exhausting. I busted my ass at Yale, Google,
then Facebook before Pinterest recruited me as the second hire
(22:14):
on the global public policy team. I led work that
raised our public policy profile globally. It didn't matter because
I'm a black woman. I've seen examples of genuine contrition,
even reparation, this past week from others. I hope Pinterest
takes this opportunity to not only express their solidarity but
also follow through on their commitment in taking action. Sharing
(22:35):
this is scary, especially after being doxxed and knowing the
many forms that retaliation can take. I owe it to
myself and black colleagues still there to hold the company
to the commitments it's made. Pinterest: Black employees do indeed matter.
Pay us fairly. Period. Period spelled out and then a period,
an actual period. Period. So her tweets
(22:57):
go hella viral. This is when she was first on
my radar. I was like, oh my god, who is this? Like,
who is this amazing woman calling out this company publicly
and, like, demanding accountability? So cool, so cool. So staffers
at Pinterest, they hold a walkout in support of Ifeoma.
And Ifeoma was taking a huge risk. She had an NDA,
(23:19):
so she's taking a huge risk in speaking up. So
when she tweeted about her experience, she was breaking an
NDA, uh, and so Pinterest could have pursued legal
action or sued her for speaking up about her experience
at the company. That's because of a loophole in the law.
Oh yeah, yeah, yeah, yeah, yeah. Discrimination. There's a, there's
a discrimination loophole exactly, exactly. So you were allowed to
(23:41):
break an NDA if you were experiencing gender-based harassment
or discrimination, but not racial harassment or discrimination. And that's
because the STAND Act, which is the Stand Together Against
Non-Disclosure Act, which passed in response to the Me Too movement,
nullified NDAs in cases of sexual harassment, assault, or discrimination.
I have to be honest, it's like one of those
things where, like, as powerful as Me Too was, a
(24:04):
lot of the legislation and stuff that was championed and
came out of that movement was not always done with
like, an intersectional framework. And so since Ifeoma was
alleging that she was treated unfairly because of her race,
not specifically her gender, she could not legally break her
NDA, and she was taking this big risk
in speaking about what she experienced, and in doing so,
(24:27):
she became a whistleblower. Her speaking up alerted shareholders, other employees,
and the public to what was found to be rampant
gender and racial discrimination at Pinterest. There was an investigation
into the workplace culture at pinterest and the findings were
not good. After all this became public, Pinterest had to
settle a lawsuit with their shareholders, who sued the company
(24:48):
saying that executives had breached their fiduciary duty by perpetuating
or knowingly ignoring the long standing and systemic culture of
discrimination and retaliation at Pinterest. Separately to all this, after
Ifeoma and another black staffer at Pinterest started
speaking up about the discrimination that they faced at the company,
a white woman who had formerly been the company's COO,
(25:09):
named Françoise Brougher, also filed a gender discrimination lawsuit
against the company, and she won. Pinterest had to pay
her a record twenty-two-point-five-million-dollar settlement,
which was the largest known settlement for gender discrimination in
US history, with two point five million of that being
earmarked to commit to nonprofits that support underrepresented and marginalized
groups in technology. Françoise Brougher said that Ifeoma and
the other black staffer at Pinterest, Aerica, speaking up
is what inspired her to do this, which honestly is
is what inspired her to do this, which honestly is
like great for her. But Françoise Brougher was already a
millionaire before coming to work at Pinterest, and she got
twenty million dollars when she spoke up and filed this
discrimination lawsuit. Ifeoma got almost nothing. She got six
(25:53):
months severance and was, like, punished for doing her job. And
so Ifeoma really did something that we see so
often when it comes to black women in technology. She
used this horrible personal experience to make things better for everyone,
not just for black women, not just for herself, but everyone,
all of us. She drafted and worked to successfully pass the
(26:14):
Silenced No More Act, new legislation in California that makes
it illegal for companies to bar employees from speaking out
about harassment and discrimination, and that actually became law
in California. So thank you, Ifeoma. This was also
around the time when a lot of other tech whistleblowers
were speaking up, like Frances Haugen and Sophie Zhang,
speaking up about the harm that they witnessed at tech companies.
(26:37):
Ifeoma published the Tech Worker Handbook, which is a
free online resource for folks who are thinking about becoming
whistleblowers at tech companies. It's not meant to encourage people
to be whistleblowers, but it's just a, like, fact-based,
step-by-step guide of, like, yo, if you're gonna
blow the whistle on some harm that you're seeing at your
tech company, here's all the considerations that you should take.
(26:58):
Here's how you should do it safely so that you're
not being trapped, and here's what you need to know
before you do it. To date, Ifeoma has successfully
gotten companies to adopt policies or language that allow workers
to speak up about discrimination even if they have signed
NDAs, which will, ultimately, I think, make
everyone safer, because everything in technology is better when people
are not beholden to stay quiet about harms or wrongdoing
(27:21):
that they have witnessed within these companies. And so I
think Ifeoma's story is one that really helps us
understand, I think, one of the reasons why this
problem will not get better until we have a massive culture
shift in technology. I think that we need to really
rethink who has the power in tech, who we center,
(27:41):
whose experiences we center and value in technology. And
I don't think it will, until we do that, or
until people like Ifeoma, who, you know, speak out
and try to make things safer and better, are welcomed
instead of being ignored, isolated, alienated, and punished. So with that,
let's hear from Ifeoma in her own words. Thank
(28:02):
you, Ifeoma. My full name is Ifeoma Ozoma, and
I'm the founder and principal of the consulting firm I
started that's called Earthseed, which is actually named after
the community that Octavia Butler created in Parable of the Sower
(28:26):
and Parable of the Talents. So I love that, love
Butler's work. Really having extra meaning the last few
years. So I would love to start by talking about some
of your work at pinterest. You know, before before things
went south. It sounds like you had a lot of
really great wins there. Tell me, tell me a little
(28:47):
bit about how you came to be working there. It's
so it's even crazier because as things were going south,
I was still doing the work. But I'll back up
and start at the beginning. My entire career before starting
my own consulting firm has been in tech, starting in DC
at Google, working on public policy and public affairs work,
(29:10):
primarily with elected officials at the federal level and a
few at the state level. And then after that I
moved from Google to Facebook in California, and
there I was doing a lot of international work around
hate speech and programs that we ran with civic institutions,
(29:32):
and that's where I started getting more exposed to content
moderation and to UH particularly hate speech, but how content
moderation worked at the international level and in the international context,
especially when companies based in California are making those decisions,
and the disparate impacts that exist when that's the case.
(29:55):
And then I was recruited to Pinterest to be the
second person on the public policy team there, and my
first week on the job, I pushed our GC
and our Trust and Safety team and the content policy
team to make the decision that we ended up making
on Alex Jones and removing him entirely from the platform
(30:18):
at a time. This is in at a time when
he was not really being addressed by any platform. And
the argument I made at that point UM was first
that he already violated a lot of the policies that
we had his content UM and two important things. One
that if you're acting on misinformation. At that point, no
(30:39):
other platform had a misinformation policy other than, like, Medium.
And my point was, well, if we're making decisions because
we know content is misinforming, we have a misinformation policy.
We need to just write it and be clear and
stand in our convictions and post it on the site.
A few researchers will pick it up, others may adopt it,
(31:00):
most people won't, But I do think when you're making decisions,
you need to be transparent about what those decisions are
and why and so. From there and the policy that
came out of that, then I was able to push
UM for a lot of the health misinformation work that
I did, which started with getting a landscape analysis. That's
(31:20):
something that I feel has been missing from a lot
of platforms. If you don't know what's on your platform
and others don't know, how do you address it properly?
If you don't know who it's impacting, because misinformation
is not an equal opportunity harm. It's mostly targeted at
people of color and at women, and when you're looking
(31:42):
at health misinformation, it's especially start And so having that information,
I was then um able to push for things that
I was retaliated against because of But that's a whole
other set of stories. I've been working in the misinformation
space for a while, and you're exactly right that a
few platforms had any kind of public-facing policy
(32:03):
around it, and so pinterest was able to really position themselves,
at least from from my perspective as an outsider at
that point, as a leader in this space who could
walk with conviction. Deep down, I think I knew it was,
you know, there's probably a black woman leading this work somewhere.
But they were really able to enjoy this public persona
(32:24):
as a company with a little bit of backbone and
was going to take responsibility. And yet while enjoying the,
you know, while enjoying that reputation, they were internally
making your life harder at Pinterest for that work that
they were getting so much acclaim for from folks like me
from the outside. Like, how does that sit, how
are you able to sit with that? Um, with a
(32:46):
lot of peace that karma is real, um, and it
will come back to get them. But, but also that
was one of the reasons why I had to go
public about the retaliation that I faced when I raised
pay discrimination concerns, about the doxxing that I experienced
from a white male supremacist who happened to also work
(33:09):
at pinterest after I pushed for decisions to be made
around white supremacist content that had existed on the platform.
There are a lot of intersections, as you know, with misinformation,
and so even though pinterest was most well known because
of a health misinformation work that I did, we took
a lot of steps to address other types of misinformation.
(33:33):
And where those two, I guess, circles in a Venn diagram
met was on an anti-choice site that had been
posting misinformation around birth control and access to abortion services
being targeted specifically at black communities to, uh, push a eugenics
(33:57):
agenda from the pro-choice movement, and the ways
in which the misinformation aligned. I pushed internally that, like,
this is this is why we have to look at
misinformation in general. We have to look at the specific
ways in which it it harms people. But if you
(34:18):
just take a "well, this is an opinion" point of view,
then you're missing that this is both health misinformation and
it's also political because it's targeted at a group using
language and using imagery. Because they're very good at using
images of black mothers and black children on these websites
that are run entirely by white supremacists, that you may
(34:41):
miss some of the context if you only look at
the content and not what their entire website is pushing.
That's a really good point, and I think it's a
point that we miss about how misinformation works, you know,
especially on a platform like Pinterest that is so visual.
You might be on Pinterest just, like, you know,
looking for inspiration for your new baby nursery or something. You
(35:03):
might, you're, you're not, your users might not really think
that they're in a space where they're going to be,
you know, encountering this kind of racially charged, racist,
sexist imagery that's meant to mislead them about their health. Right.
You might be just like using it casually because you're
looking for for pretty pictures. And I think that's so
that's so key that people's guards are down often times
(35:25):
when they're on a site like Pinterest, so they don't
even realize, like, oh, I'm actually being, I'm actually consuming
content specifically meant to lead me astray and misinform me.
And that's why it was particularly harmful. And that was
my, even though the platform had never really done anything
public in the policy space and certainly wasn't known by
(35:49):
any of the reporters I ended up working with for
policy decisions. One of the reasons why I felt so
strongly about this is it's a platform that was going
towards IPO at the time that I had started,
so I started pre-IPO, and part of
the messaging around that is, we're a platform with eight
(36:09):
in ten moms in the US on here, lots of
women use the platform around the world, and women are
often decision makers when it comes to financial choices for
their households. So it's a great platform for advertisers. But
at the same time, that's what made it a prime
target for misinformation purveyors because you have a captive audience
(36:31):
folks who are not attuned to looking for mis-
and disinformation because they're not on Facebook, they're not on Twitter,
they're in a place that feels safe to them, and
so they're the perfect opportunity to then hawk whatever goods
you're selling. A point that I made often because I
(36:52):
get invited by the WHO, CDC, and others
to talk about this health misinformation work, um, that they
had not thought about as much, is the financial
incentives that are tied to a lot of misinformation. Whether
it's Alex Jones selling his nonsense T shirts and supplements
and whatever else. These people are scam artists. That's their
(37:13):
number Their number one job is scamming folks. They use
the values that people have, they use the fears that
people have to then sell their products. But at the
end of the day, these are spammers and scammers, and
so you need to also be looking at what it
is that they're trying to push on your platform. For
(37:33):
almost every single health misinformation site, they were selling supplements.
So if you would address dangerous supplements on the platform
as spam, why would you not consider this at the
same level of harm to the platform and ultimately harm
to legitimate advertisers. That's a really good point I had not thought of.
(37:55):
I mean, it makes so much sense. I think that
we're so used to thinking about scammers as people, you know,
selling fake Gucci on the street, and like, no,
people can, can scam online, and they're misleading you in
order to get you to buy whatever bullshit product they're hawking.
I would actually argue that the person selling the Gucci
(38:15):
handbag that's fake, that's not harmful. You got a cheaper
bag if it's made well, it looks pretty good, like
you get a deal. They get a deal. Gucci doesn't
get a deal, but what do they need one for? Um?
But that's not harmful in the same way that telling parents,
and especially at the point at which most parents make
(38:36):
decisions about vaccines, in the last trimester before they
have the kid, that they instead of getting a vaccine
for their child, which will save their child's life, they
should instead go buy your vitamin K supplement. That is
so harmful and dangerous in a way that we need
to take it more seriously. It's not a difference of opinion.
(38:58):
It's actually costing people's lives definitely. And I think to
your point about how many moms are on the platform,
you know, as we go into talking about you know,
vaccine rollout for COVID and things like that, it is
a lot of times moms who are making health decisions
for the family, and so if moms are being inundated
with really harmful health misinformation on this platform where they
(39:20):
think they're going to be safe, it is a real
problem that could could have a real human cost. And
I think, yeah, the person scamming fake Gucci belt on
the street, other than not giving Gucci more money, which frankly,
I'm not really that mad at. You know, we have
to look at the kind of harm that these platforms
can really be responsible for pushing on communities who are
oftentimes already marginalized or underrepresented. That's exactly right. So I
(39:45):
want to switch gears a little bit. So a thing
in technology that is so frustrating is marginalized people being
punished for speaking honestly and accurately about things like hate
and extremism. And I feel like we're never gonna get
anywhere in addressing this if we are not even allowed to
speak honestly about it without consequences. A perfect example is,
you know, what happened at GitHub around January sixth.
(40:08):
So folks might recall that during the January sixth insurrection,
a Jewish staffer who worked at the tech company GitHub
watched it all unfold on television like we all did,
and he saw these insurrectionists storming the Capitol, some of
whom were objectively holding Nazi flags. So he posted a
message in Slack warning his coworkers to stay safe. He wrote, like,
(40:29):
stay safe, there's Nazis about. And GitHub fired him
for this. Now, eventually the company apologized and all of that.
But you've dealt with a very similar kind
of thing, where you spoke honestly and openly about hate
and extremism within technology and you were punished for it. Yeah,
and not only punished, I was personally targeted. So the
(40:50):
story I was referring to earlier, the white supremacist colleague
who wasn't someone I worked with closely, but worked on
the engineering side of trust and safety, he saw a
message that I posted in exactly the right place for
me to post. It wasn't a general conversation area, but
I posted that a pretty popular white supremacist was in
(41:13):
fact a white supremacist. I linked to the content that
violated our policies that was of concern, and then I
put in a note as well that the platform or
the folks working on trust and safety should be mindful
of these terms, like, here's a set of terms that
are dog whistles unless you're a white supremacist or unless
(41:34):
you're the target of the white supremacist harm. And these
are what we need to look out for, because these
folks aren't going to title their videos on YouTube as Hey,
I'm a white supremacist and this is my view. It's
going to say something about population control around white replacement theory,
which a lot of folks are not aware of, but
(41:57):
is a huge red flag and it is a calling
card for many white supremacists. A few months after posting
that warning, sharing the context and the content, which was
my job as a public policy person who helped inform
content safety decisions that we made, I was then doxxed.
This person doxxed me and two other women, another one
(42:20):
who was a black woman and a woman, a white woman,
he assumed to be a lesbian, um,
and we only know that because of the comments that
came up on Gateway Pundit and in other places where
we had been targeted. And for me, I guess he
took a particular disliking to me and so shared my
phone number as well, and all of the identifying information
(42:43):
that you would need to find me. At this point,
I had already separate from all of this and separate
from the work I was doing, I had already raised
pay discrimination complaints with, uh, the appropriate leadership at the company,
my manager, manager's manager, HR, etcetera, and was getting serious
pushback from them. And so then when I was doxxed,
(43:05):
the lack of response from them to take care of
my safety to address what was going on was so
apparently part of the retaliation that I had already been
facing on the pay discrimination side UH that it was
pretty traumatizing being at a company where it was clear
I was not safe. I was not necessarily safe at
(43:28):
home because it's not very difficult to track someone down
once you have enough information UH. And then was also
dealing with everything else at the same time. So I
really related to the GitHub story because of the
doxxing that I experienced and the lack of response
from the company. But then also later on that same year,
(43:51):
uh, Color of Change had come to me, because I
was the liaison with outside groups, academics, and civil society
on content safety issues. They had come to me sharing
that they were still seeing slave plantations pop up as
wedding venues and suggestions for weddings. If you know Pinterest
or you know someone who uses it, the number one
(44:13):
use of the platform might be planning a wedding or
preparing for some sort of celebration. And I agreed with them,
it's completely inappropriate that the platform would be pushing concentration camps,
which is what they were uh and torture sites as
a celebration venue. And so I brought it to our
(44:33):
content safety team with my recommendation, shared exactly what color
of Change had shared with me, and then I got
pushed back from the head of that team, particularly the
head of content policy, who happens to still be there
and still speaking on behalf of the company. Um. What
I later found out was that she was married on
(44:55):
a plantation, that she never shared that in all of
the flak that I got. But she ended up working
with my manager, who had already been retaliating against me,
to ding me on my performance review. So even though
Pinterest ended up doing exactly what I recommended, ended up
getting praised in forty plus headlines because of the decision
(45:20):
to stop promoting slave plantations and the decisions that I
had pushed to be made, I was dinged on my
performance review, which affected my pay. Well, first of all,
the reveal that this woman had her own, I mean,
yikes. The level, the levels. Like, just like what you said, right,
(45:44):
These companies get to enjoy the positive press that makes
them look like a woke company or a company that
really cares, like I heard so many times people say, oh,
Pinterest is a company that has a you know, like
they are a company that prizes like empathy, YadA, YadA, YadA.
And then to hear the inner workings of how this
happened is so it's such a disconnect. I feel, it
(46:06):
really is, you know, it really illustrates how so many
different levels can come together to suppress and push out
and harm a black woman for doing her job right,
Like this was your job. It's not like it's not
like you were overstepping bounces. Is what you were hired
to do. And I also think this, you know, and
(46:27):
this is the conversation that I feel comes up again and
again and again where black women are punished for like
doing the right thing, for practicing public courage, public morality,
for doing their jobs trying to make things safer or better.
Someone else gets to enjoy the benefits of that work.
(46:48):
But that work is at best, you know, unappreciated, underappreciated,
at worst dangerous and risky for your own personal safety,
right? Like you, like you did this work of making
Pinterest a safer, better platform at great personal cost and
at great risk to your own, like, safety. Yep. And,
and all of those things happened. Not only was I
(47:11):
paid unfairly and I had to pursue legal action because
of that, I was then also my life was starting,
So yeah, I got all of it while I was there.
And people often like to call black women canaries in
the coal mine or whatever term they want to use.
(47:32):
But I would like to just do my job, get
paid fairly, and not be put in danger for for
doing the right thing for a company. I mean, uh,
something like half, more than half, of the articles that
were written about Pinterest in the month before the public
(47:52):
offering, which happened in April, referenced my work. So not
only did my work have value to the actual users
of the platform, it ended up pushing Facebook and others to
have to respond about why they weren't addressing health misinformation,
particularly around vaccines. And remember that was during a different
(48:12):
health issue, a measles outbreak measles outbreaks on the East
and West coast. Uh So, not only did it have
that sort of impact, which is a slam dunk if
you're in a policy space, but it also had material
benefit to the company in the form of the I
p O. And I still was not treated fairly. Yeah,
(48:34):
and the company still hasn't actually acknowledged that anything they
did in my case was wrong. I mean, what is
that like for you? Like you, you seem like someone
who and maybe this is just my you know, we're
talking over squadcast, you know, Uh, you seem like someone
who has a lot of peace for how horribly you
(48:55):
were treated. And I just, I have to, like... I've
had a lot of rage. I've had a lot of
rage as well. You know, I really do believe in karma,
and I believe it's not just you, It'll be the
next seven generations that are hit with whatever evil you
put out into the world. And so I take the
(49:15):
long view. I'm like Arya from Game of Thrones. I
have a list. I'm making my way through the list,
but you'll get yours eventually. That's that's my long view.
And then also I'm a student of history and political science,
None of this is new. What I'm dealing with is
(49:36):
not new, It's not unique to my situation. Does it suck, yes,
has it been miserable, Yes, I've paid physical consequences for
it because my actual health was impaired for the two
plus years I was in a legal fight with them.
But like, I'm good at the end of the day,
(49:57):
I'm at peace with every single decision I made. I've
never lied about what I experienced there. And so when
they're out here lying about what they did to me,
um and getting called out on lines, that is enough
for me. Wow. And then I have to add, you know,
reading that a former Pinterest COO, a white woman, was
(50:18):
able to get, what was it, a twenty-million-dollar,
uh, discrimination, uh, payout from Pinterest, and she
explicitly said, you and your other black women colleagues speaking
up about what you experienced at Pinterest, that was a
big part of why she felt she was able to
get this, to get, to get this settlement. And you know,
and you, if I remember correctly, like, didn't even
(50:40):
get a year of severance from Pinterest. That's correct. And I
laid the groundwork. She was not going to speak up
publicly, and actually in the Medium post, she didn't
reference us when she first went public. Um, but
she herself said that women had come to her over
the course of her career and she actually had not
(51:00):
been a helpful ally to other women. So I think
she said everything. Um. But yeah, no, I I what
was crazy about that situation is people got to see
in real time what it means to lead a movement
and then be left out of whatever progress comes. And
black women are often I mean, when you think about
(51:21):
Me Too, who has Me Too actually benefited, if not
white women? Maybe, maybe in a sense all women, but
it has most benefited white women. And yet it was
started by Tarana Burke, a black woman. So this is
not it's not new, it's not I didn't speak up
because I expected them to do the right thing. I
(51:45):
from the jump expected them not to. I expected them to
denigrate my name, my experience, which they've done, all of that,
and then not to pay me what they owe me.
But seeing it all happen in front of everyone, I
think was a lesson, not only for the folks who
(52:05):
were watching it and just expecting that the right thing
would happen, but then also some of the reporters who
worked on the story and were the very first ones
to reach out and be like, how is it possible
that you? Like, I literally remember talking to you and
then talking to her several months later. You are the
reason why she came forward. You are the reason why
(52:27):
she had a strong case, and then this happens. Yeah,
I mean, I think you're right that, for better or
for worse, it always seems that black women we are
the ones building, Like we're the ones building, and then
other people are the ones who are benefiting. And I
think I see that in politics, I see that in tech.
(52:49):
I see that in so many different It just seems
to be you know, I at this point it almost
you know it is what it is. But like that
seems to be our lot on this, on this earth, like,
building things that we then don't get to use ourselves, right?
Like I think it was this writer Clarissa Brooks who
(53:10):
once wrote like, I, as a black woman, I don't
want my back to be used as a bridge to
a world that I'll never see, right? I feel like
I feel like that is our lot, and I see
it particularly in tech, but in so many different different avenues.
And that's why even though uh, this was painful personally painful,
of course to have been the one who experienced all
(53:32):
of this and then have someone else benefit from it.
It was instructive for everyone to see it and to
see the timeline and how quickly and the different way
in which they responded to her. And then also I
think it was a helpful lesson for people who considered
themselves allies to see as well, because there were a
lot of people who think of themselves as allies who
(53:56):
saw it and were like, wait, what, how is it
possible that this is happening. This happens all of the time.
You just don't usually see the dollar amount that's attached
to the progress that certain people get and others do not. Absolutely,
you know, kind of kind of connected to that idea,
you know, when we're thinking about how to combat some
(54:19):
of the more harmful things that black women and other
underrepresented communities face online, disinformation, misinformation, online harassment, I do
have this feeling that there is this underlying assumption that
for black women, like, tech and the internet and all
of these domains are, like, not our rightful domain,
so we should not, we are not able to expect
(54:41):
experiences that are not harmful in these spaces, and I
think for me, it's kind of this vicious cycle where
that dynamic is mirrored at tech companies and so black
women engineers, black women technologists are, are pushed out
of these companies and are not, are not being listened to,
and thus these platforms are not able to prioritize
(55:05):
our safety. So it's like I feel like we, in
my mind, will never address the true harm that platforms
have been responsible for if these tech companies cannot figure
out a way to really have black women be meaningfully
centered and heard, because it just seems like a this
like horrible cycle. I'm not sure this makes any sense,
(55:25):
but um no, it makes complete sense. And it's not
a lack of figuring it out, it's a lack of desire.
They... I said to, I think it was Charlie Warzel,
for a column a few months ago about Facebook and it
maybe being a lost cause, and I said, platforms
(55:45):
reflect the people who lead them. That you're only seeing
on Facebook, on Twitter, on Pinterest, wherever, what the people
who lead those platforms want you to see. And so
if they're operating in a white supremacist structure and worldview
and that is where their actual interests align. That's what
(56:06):
we're going to see on the platform. So none of
this is by accident. None of this is all of
a sudden out of control. This is exactly what they
designed, working in the ways that they want it to work. Yeah,
it always comes down to that question of, is it
a "they can't" problem or a "they won't,
they don't want to" problem? And I think, if anything,
(56:28):
your situation really reflects what the answer might be.
It's like, they could, they just don't want to, because
this is, you know, they've made these choices. They've, they've
made choices about who and what to prioritize and who
and what to, you know, suppress and shut down and
not prioritize. And the example I make, usually the
audiences that I give this or share this analogy with
(56:51):
our health focus, public health professionals, experts, etcetera. And I say,
if you want to understand how non accidental any of
this is, think about pornography. How often do you randomly
encounter porn on Facebook or Twitter or YouTube or wherever else?
Not that often, not often, it's not often often, and
(57:13):
yet you see misinformation every other post you see hate
every other post. There's a financial reason for that. Advertisers
have said they don't want pornography next to their content,
and so what have platforms done. Poured every single resource
into making sure that's the case. They've made a choice here,
and the choice is not on the side of safety.
(57:35):
I would argue that if it's legal consenting adult pornography,
that is way less harmful to randomly encounter on a
platform than health misinformation or Nazi content. Right. I mean,
this is such a silly example, but for the longest
time on Instagram, and maybe it's still the case, if
you showed your nipples on Instagram, or, like, if you
(57:57):
were breastfeeding on Instagram. There was a woman who had
a picture of, of her, where she had
her period, on Instagram, and that was removed. But they
were not removing, you know, they were very diligent about
removing this stuff, but also not removing, like, disinformation
or, like, violent content or, like, hate speech. You know,
it really it really you have to ask some questions
about priorities, you really do, yep. And the priorities are
(58:20):
clear in all of our experiences on these platforms. Absolutely absolutely,
and I think, just, like, you know, black women on
the Internet. I think that I believe that we have
come to expect that our experiences with technology and online
will not be safe. And I want to get to
a place where we can radically rethink that, radically rethink
(58:43):
what our experiences on these platforms should should be like
and can be like. I think that we need to
really... I would love to see some, like, radical
rethinking about what we can expect from our Internet experiences.
I agree, And that's the basis of all the work
that I have done, the work that I'm doing out
through my consulting firm on tech accountability, whether it's in
(59:05):
the health misinformation space or whether it's on the organizing
side and providing protections for whistleblowers, um. And I
think it's it's a conversation that it's unfortunate that we
have to bear the burden of since we're the people
who are harmed by it. But even just a few
days ago, I was part of a conversation on Clubhouse
(59:26):
UM around content moderation and around the decisions that platforms
made to deplatform Donald Trump. And before we went
into that conversation, I sent a note to everyone on
the panel, saying Clubhouse is a place that I am
not frequenting by choice because black women are often targeted
(59:46):
and people use very loud dog whistles basically just short
of using the N word and using straight up misogynistic language.
But there's a ton of that going on, and I
put the onus on every one on the panel to
if that happened in the course of our conversation from
anyone from the audience, to not make it be my
(01:00:09):
responsibility to be the only one to say something, and
every single one was was great and said absolutely, of course,
But that should be the way that we're setting up conversations.
It shouldn't be the responsibility of the person who is
most likely to be harmed to say, like, hey, I
hate to be the one to bring the mood down,
(01:00:31):
but this could happen, so can we please watch out
for it. God, I have been that person a thousand times,
and it's kind of like what you were saying. It
just sucks like you you want to you want to
do your job and be paid what you're owed for
doing that work. I feel that black women are just
not often afforded the ability to just do your work
(01:00:52):
and keep your head down. It's like you have to
take on all this often unpaid, might I add, extra
labor, extra energy, extra everything, just to exist and do
your job and put your message out there. Really, it
is exhausting. I know exactly that that feeling of like,
oh God, I'm gonna have to be
the person that raises this again and, like, everyone's gonna groan.
(01:01:14):
I just know that feeling, and it sucks. Yeah, it
does suck. So I have one last question for you.
What do you think platforms or policy folks or
anybody who has power, decision makers, what should folks
be doing to keep dis- and misinformation off of platforms? Well, um,
(01:01:35):
I think the first step is on the platform side,
having people who are empowered not just there um, who
are also knowledgeable about the ways that dis and misinformation
target specific community. Is the ways that it's showing up
differently on each platform, because these folks are recycle a
(01:01:57):
lot of content, but they know what content works best
and on which platform, and so the platforms also have
to be aware of the ways in which their platforms
are being used. On the policy side, I think it's
great that the administration is passing the bar that was
on the ground from the Trump administration for diversity and
(01:02:21):
so at least they've cleared that low bar and standard.
But I am not seeing enough um black women in
positions of leadership when it comes to misinformation and tech
policy specifically. Um, I was encouraged by the science, I
(01:02:42):
think it's some sort of science-focused department within the
administration, that was announced recently, that Alondra Nelson will
be on. That's incredible. Uh, but on the tech side,
it cannot just be pulling in that same set of people,
like Eric Schmidt and other white tech executives, to then
(01:03:06):
reform the same industry that they've made billions off of. Like,
that's just not how it works. That's not how it
should work. UM and I and I think it's important
to pull academics, but you also need practitioners who have
experienced things on the inside of these platforms to be
informing the decisions and any regulation or reform that comes
(01:03:27):
as a result. Definitely. Actually, I do have one last question
for you. I know, I said, I realize, but,
you know, we've talked a lot about, you know, black
women's experiences in tech. I'm sure that you know this.
I know this from, from doing the work. So
much of the infrastructure of what we rely on to
make the Internet safer and better. So people who are
(01:03:49):
fighting disinformation and misinformation have been for a long time.
So much of that infrastructure is Black women. What is
it like to know that we have such a big
role in doing a lot of the work that is
making the Internet safer and better for everybody? I mean,
I it's tough because on the one hand, when we
(01:04:14):
when progress is made that we push for everyone benefits
um and often we benefit the least. And so it
is it's just a role that many Black women have
taken on to protect themselves and our communities. On the
other hand, I don't blame any black women who are like,
(01:04:34):
you know what, this is not my fight, this is
not my battle. I'm tired. I'm just trying to live
during a pandemic. I'm trying to feed my kids, I'm
trying to feed myself. I'm trying to take a damn
nap, like. I subscribe wholeheartedly to the Nap Ministry and
the work that the Nap Ministry has been doing, because
I think sometimes we have to say, you know what,
(01:04:55):
I told you so, now I'm going to rest. That's it.
I'm done, I'm done, I'm bowing out. And so I
allow the space for that at any point, while also
hoping that when black women say, you know what, this
is work that I want to do, that we're uplifted
and we're empowered. The flip side of that is making
(01:05:15):
sure that allies or supposed allies are not then saying,
oh my gosh, you're so good at this, you need
to be the one leading it. No, no. After the
fifth and the results in Georgia, when everyone was posting,
and not black women, because black women were not
doing this, but when everyone else was posting about what
Stacey Abrams needs to be doing now. If she wants
(01:05:38):
to go to a spa for the next month, for
the next decade, that is her decision, and that's what
she should be empowered to do, and those same people
trying to demand labor of her should donate so that
she can have her spa time for as long as
she wants. Like, that is the kind of allyship
that I want to see, not just finding new work
for us to do when we're the only ones paying
(01:06:00):
the price for the work. Internet Hate Machine is a
production of Cool Zone Media. For more podcasts from Cool
Zone Media, check out our website coolzonemedia.com,
or find us on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.