Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, this is Annie and Samantha, and welcome to Stuff
Mom Never Told You, a production of iHeartRadio. And in
our continuing end-of-the-year favorite episodes wrap-up
bring-back, we're also choosing two of our favorite Bridget episodes.
(00:27):
Those are so wonderful they're hard to choose from. One
that really stuck out to me was the one we
did about Facebook's abortion blackout, where we were talking about
what Facebook was doing. They're saying they're doing free expression,
but in fact they're not doing that at all and
(00:50):
not doing anything with transparency, and it's just such a relevant,
unfortunately relevant conversation, and it's just really I think it's
a good reminder that we need to be aware of
what these companies are doing and just your existence and
(01:11):
data and how it operates in that space. So yeah,
please enjoy this classic episode.
Speaker 2 (01:27):
Hey, this is Annie and Samantha.
Speaker 1 (01:28):
And welcome to Stuff Mom Never Told You, a production of iHeartRadio.
And today we are once again thrilled to be joined
by the fabulous, the fantastic Bridget Todd.
Speaker 3 (01:45):
Welcome Bridget, thank you for having me back. I'm so
excited to be here.
Speaker 1 (01:49):
We're so excited to have you always, but yes, we
were discussing beforehand. There's a lot for us that we
would like your expertise on. So, as always, we appreciate you
coming on. There's so much going on right now. That
being said, how are you, Bridget?
Speaker 3 (02:06):
You know, this week was I don't know if you
all feel it. I feel like the last few weeks
have been pretty rough. It feels like things are different.
It just feels like a shift in the air, which
can be hard to exist in, let alone make content
in that doesn't make people fearful and want to check out,
which I think maybe we're all kind of navigating.
Speaker 4 (02:29):
Yeah, is that something that resonates with you two?
Speaker 1 (02:32):
Yes, ah, yes, it's we make a lot of content
and we do try to mix things up, but we
also don't want to make more things. But I mean,
for instance, yesterday I just had a lot of trouble
concentrating on my work because I was thinking about all
this other stuff. I'm like, what's going on in the world?
What can I do? Which I've always maintained that if
(02:54):
you want to get more productivity out of people, then
they're going about it the completely wrong way. But that's just
a very small personal gripe of mine. Uh, but yeah, yeah,
it's it's been it's been difficult.
Speaker 2 (03:09):
You know. What I've discovered is these tiny little coloring
books that are really like easy coloring, so it's not
detailed because I've discovered as much as I like to
do those things, I cannot stay within the lines. And
when they get really like fancy, the adult coloring book versions,
like they get fancy and you have to do the shading,
and like, what is this? Why do I have so
(03:30):
many things to color? So I've discovered these tiny ones
that's just, like, cutesy large pictures of, like, a milkshake.
It has been really nice that I can just check out,
stay inside the lines and color with a marker like
those little like you know, paint like markers that.
Speaker 5 (03:46):
That's really satisfying. Nice to, like, zone.
Speaker 3 (03:49):
Out, Sam, you are speaking my language, because, I have,
I've spent so much money on this, but alcohol
markers. Like, there's nothing quite like a good marker.
A new set of markers. You're like, this is gonna
change everything. My future starts today. I've got this new
set of markers. Yes, and the ones
(04:10):
that write really well are so satisfying. I
know I sound like a crazy person, but genuinely, the
appeal of finding the right set of markers can change everything.
Speaker 2 (04:23):
Oh no, I am obsessed with pens, and well-writing
fine point pens, like it has to be fine point,
like the Chinese and Japanese have a market because they
have some of the best, like nicely flowing pens.
Speaker 5 (04:36):
Even though my my handwriting.
Speaker 2 (04:37):
is really really bad, but I love the feel of
like a nice smooth write. But with these, like the
alcohol markers you're talking about, the only problem I
have is the color that they say they
represent on the cap doesn't actually, like, translate in the marking.
Speaker 5 (04:53):
So I'm like, oh, this is a yellow and it
turns orange.
Speaker 2 (04:56):
I'm like, wait, this is, this is not, but it
is very smooth and it is very satisfying because it
fills all the lines, and you're like, yes, I'm a professional.
Speaker 4 (05:04):
Colorer, of course.
Speaker 3 (05:06):
This is how, this is how seriously
I take this. When I get a new set of markers,
the first thing I do is a little color swatch
to see, oh, they say orange, but let's see what their
orange looks like.
Speaker 4 (05:16):
Just so I know, no surprises.
Speaker 5 (05:18):
You know what?
Speaker 2 (05:19):
That's good to know. That is great advice as a newly
marker-purchasing person.
Speaker 4 (05:24):
So thank you.
Speaker 2 (05:25):
I'm gonna have to i'ma have to get some like
scrap paper just so I can do that.
Speaker 3 (05:29):
This is, we can do a whole episode on this.
Let's just talk pens, markers, my friends. It gives me
head tingles even just talking about it. I love
it so much.
Speaker 2 (05:39):
So since I did the, like, pre-show, everything's the
worst, I have to bring in, this is a solution,
and it's coloring, these small coloring books with wonderful markers.
Speaker 3 (05:48):
Yes, yeah, that can be the antidote to our troubling times.
Have you considered just diving headfirst into the world of
markers and penmanship and calligraphy and coloring?
Speaker 1 (06:01):
It is quite nice.
Speaker 5 (06:02):
It is quite nice.
Speaker 4 (06:03):
Annie.
Speaker 2 (06:05):
I will leave some for you. I know you're about
to house sit for me, so I will leave some
for you to try out yourself.
Speaker 1 (06:10):
You know I have. I have two main thoughts from this.
One is that I hate when this happens where your
birthday is coming up, Samantha, and I wish I had
known this earlier. Oh no, that would have been a
great gift. But then I think, bridget you should come
on one time and let us talk about something that's
not so stressful. Let's let's give let's give ourselves a
(06:34):
little break. We can talk about markers. I know you
mentioned like reality TV. We could have a whole thing
where it's not something.
Speaker 2 (06:41):
Since you do all the, like, dark stuff, we
should let you come in with the joys that
you have.
Speaker 5 (06:47):
Because we just talked about the Adirondacks and everything.
Speaker 3 (06:51):
Yeah, I feel like people who listen to my content
might not know that I experience joy. I have, I have.
Speaker 4 (06:59):
The only things.
Speaker 3 (07:00):
I talk about are not the crushing weight of fascism.
Speaker 4 (07:05):
I also enjoy reality television. I think this would be fun.
Speaker 1 (07:09):
I think we should look at it. I like it. Oh,
but unfortunately, we're not doing that today. This
is also, the timing is interesting, because Samantha and
I are working on an update on CPCs, crisis pregnancy centers.
Speaker 4 (07:27):
I know them very well.
Speaker 1 (07:29):
Yes, and we in our research ran into a lot
of stuff about how tech companies were basically paying for
them to advertise or accepting their money and being misleading
about things. So this is very much related. What are
we talking about today, bridget.
Speaker 4 (07:46):
Well, that is such a good transition because it's all related.
Speaker 3 (07:49):
But tech companies really have put their thumbs on the
scale when it comes to being able to get accurate
content about abortion on social media. Your point about crisis
pregnancy centers and the way that Google essentially is
like paying an advertising network for them to exist is
a great example. But today I really wanted to talk
(08:09):
about how social media is heavily moderating and even in
some cases like suppressing and censoring content about abortion. I
think that we all talked about this back in January.
But do y'all remember when Mark Zuckerberg had that moment
that people sort of talked about as his mask off
moment back in January when Trump came back into office?
(08:30):
I think that we were talking about how he really
started dressing like a divorced nightclub promoter and was saying
things like, oh, we're taking the tampons out of the
washrooms here at Facebook HQ. He just really
was sort of having a moment where he was saying
a lot of things.
Speaker 4 (08:47):
Do you remember this.
Speaker 1 (08:49):
Oh yes, oh yes, he was like, he leaned in hard.
Speaker 5 (08:54):
He's been waiting for this moment, And yes, oh my gosh.
Speaker 4 (08:57):
You could tell.
Speaker 3 (08:58):
I mean I also I almost quibbled when people were like, oh,
it's his mask off moment, because I don't think that
Mark Zuckerberg has any kind of like I don't. I
wouldn't call it a mask off moment because I think
that he is the definition of a hollow, empty person,
and so I think he is the mask.
Speaker 4 (09:15):
He will say anything.
Speaker 3 (09:17):
I think that he has no he's I'm honestly fascinated
by him as a tech leader because I think that
he has no value, scruples, morals, there's just nothing. He
will say anything, he will do anything. However the wind blows,
that's how he will blow. And I don't think it's
fair to call that a mask-off moment when truly, like,
the mask is not hiding anything. This
(09:37):
is just genuinely like who you are, who you always
have been, just the soulless person who was waiting to
see who they should kiss up to and will
do that if it means holding onto power.
Speaker 2 (09:47):
Right, he was just waiting in the background, like his
true personality was just waiting in the shadows.
Speaker 4 (09:52):
And then it's like, oh, oh ho, this is my moment,
and so when that all was going on.
Speaker 3 (09:57):
He also announced that Meta was going to be scrapping
their community notes feature and scrapping all third party fact
checking on the platform, because, as he said, it was
time for the company to get back to their roots
when it comes to free expression. I will play a
little bit of a video that he put out talking
about this.
Speaker 6 (10:16):
Hey, everyone, I want to talk about something important today
because it's time to get back to our roots around
free expression on Facebook and Instagram. I started building social
media to give people a voice. I gave a speech
at Georgetown five years ago about the importance of protecting
free expression and I still believe this today. But a
lot has happened over the last several years. There's been
(10:38):
widespread debate about potential harms from online content. Governments and
legacy media have pushed to censor more and more. A
lot of this is clearly political, but there's also a
lot of legitimately bad stuff out there. Drugs, terrorism, child exploitation.
These are things that we take very seriously and I
want to make sure that we handle responsibly.
Speaker 3 (10:58):
So much to say about that. First of all, very convenient
rewriting of history that frankly wasn't that long ago,
and that, if you're listening and you're my age, you
you probably remember because we all know it is not
a secret that Facebook. Mark Zuckerberg created Facebook as a
college student to rank the looks of the women on
(11:19):
his college campus. Somehow we sort of let
him get away with being like, I created Facebook to
protect free expression. Okay, sure, I don't know. I always
have to, like, quibble at that, because I guess
I saw a video where he said I created Facebook because
I wanted people to be able to have debates about
the Iraq war, and it's like, no, you didn't.
First of all, I was an organizer in the anti
(11:42):
war movement. Nobody was communicating on Facebook then. Like,
Facebook wasn't for that, so that's just not true. I
really have a thing where people lie to your face
about recent history that you remember, that
Speaker 4 (11:56):
you were there for, that you were part of. So that's bullshit.
Speaker 3 (11:59):
But even more than that, he's talking about how the
content that he really wants to focus on in terms
of moderating the platform is illegal content, right, child safety harms,
drug trade, organized criminal activity, all of that. So this
is when he was really talking about how important it
(12:20):
was to protect free expression on social media platforms.
Speaker 4 (12:23):
You might recall that
Speaker 3 (12:24):
Around this time he was in the headlines for saying
that he felt the Biden administration had been trying to
pressure Facebook into removing COVID misinformation. The White House had
a different take, saying, quote, when confronted with the deadly pandemic,
this administration encouraged responsible actions to protect public health and safety.
Our position has been very clear and consistent. We believe
tech companies and other private actors should take into account
(12:46):
the effects their actions have on the American people while
making independent choices about the information they present. So, you know,
Zuckerberg in this moment was like, we are not going
to be moderating political content the way that we have been.
We are going to lift restrictions on topics that are
part of mainstream discourse and really just focus on the
enforcement of, like, illegal and high-severity violations.
Speaker 4 (13:09):
So yay for free speech, right?
Speaker 3 (13:11):
That all sounds great. Well, all of that is only
the case if that part of the mainstream discourse is
not abortion, which Facebook continues to suppress and moderate quite
heavily with zero transparency and zero consistency.
Speaker 4 (13:26):
So it seems like if.
Speaker 3 (13:28):
you're spreading COVID misinformation, well, that is protected speech that
needs to be left up for freedom. If you are
sharing accurate information about abortion that isn't even against Meta's policies,
they will take it down.
Speaker 1 (13:41):
Yeah, and like you said, without warning or transparency or
anything, it's just gone. And you might not know why,
or, well, you could probably figure it out. But one
of the things that's really frustrating about all of this
is that, you know, like you said, they're kind of
lying to our faces, right, Like they're saying one thing
and doing something completely different.
Speaker 3 (14:02):
Yeah, that's what really makes me angry about this. You know,
I cover a lot of tech companies. The thing that
gets me is when they lie when they say one
thing publicly, when they publish something in their rules and
regulations and policies. You know, no one's putting a gun
to their head and making them put these things in
their rules. They put them in their rules, and then
they do a completely different thing, and then when advocates
(14:24):
or organizers call them out on it, there's just no
they're just like, oops, what are you gonna do?
For some reason, that just really gets me because they
are allowed to enjoy all of this positive press of
putting this thing in their policy and then continue doing
the shady work of going against that policy. It never
(14:45):
it never comes back at them like they're able to
just do whatever they want while saying one thing and
doing another. And I just don't feel like they really
get held accountable. And so, a Meta spokesperson said that
taking down abortion content goes against Meta's own intentions. A
spokesperson told The New York Times, quote, we want our platforms to
be a place where people can access reliable information about
(15:05):
health services, advertisers can promote health services, and everyone can
discuss and debate public policies in this space. That is
why we allow posts and ads discussing and debating abortion.
But they're not doing that at all. Because the big
thing to know here is that Meta says one thing
in their policies and then does a completely different thing
(15:25):
when it comes to how they are actually moderating abortion content.
Speaker 1 (15:29):
Yeah, and it's, it's so difficult right now to get
that good information, and there's so much misinformation and disinformation
out there, and to remove it is just really piling
onto a problem that really doesn't need any more piling
onto. It is already really bad and people are already
(15:51):
very confused. This is not helping.
Speaker 3 (15:56):
No, that's a really good context to set. You know,
we're in a time where the Supreme Court struck down Roe.
It is so much harder to access accurate information about
health so that people can make health decisions for themselves.
And when social media platforms like Facebook put their thumb
(16:17):
on the scales and make these kinds of moderation decisions
with no transparency that go against their own stated policy,
it just makes that climate so much harder. It makes
it harder for the people who are trying to do
this work, abortion providers and abortion advocates. It makes it
harder for people who need to make decisions about their
health and the people that support them. It makes it
(16:37):
so that people cannot access information to figure out what
they want to do with their own bodies and lives.
And these companies do that while saying, oh, we promote
the ability to use our platforms to get this kind
of information. I would prefer that they say we don't
like abortion, we don't want people using our platform to
(16:58):
talk about abortion, so we take that content off.
Speaker 4 (17:01):
At least that would be honest.
Speaker 3 (17:02):
But what they are doing is lying to people about
what they're actually doing while doing it. It just, it's,
it's really adding insult to injury.
Speaker 2 (17:11):
Right, I mean, the true, honest answer probably is that
they are taking money or they know that they are
just buying time until the entirety of our rights and
reproductive rights may be completely dismantled in every way. And
that way they can already say, hey, leaders of this
fascist regime, we have done everything for you, so can
(17:33):
you keep supporting our platform and give us more money? Ugh.
Speaker 3 (17:37):
I mean, the way that you've got the fox watching
the henhouse here, the way that platforms are able
to cozy up to really, I mean, it's not even
really the.
Speaker 4 (17:47):
Trump administration, just whoever is in power.
Speaker 2 (17:50):
Uh.
Speaker 3 (17:50):
And then that administration is also the administration that
is meant to be overseeing and regulating them. It's horrible
and so, I'm glad that you brought
that up, because I think that helps us peel back
the layers of what exactly is going on here and
why it's so unacceptable.
Speaker 2 (18:06):
And in understanding that, the whole confusion part is
probably the point.
Speaker 4 (18:12):
I think that's true.
Speaker 3 (18:13):
I mean, in looking at some of the ways that
Facebook says one thing and does another when it comes
to moderating abortion content.
Speaker 4 (18:18):
I think that's exactly the point.
Speaker 3 (18:19):
It's like, you know, if we and we'll get into
this a little bit in a moment, but if we
create a confusing, inconsistent, not transparent climate, people will just
stop posting this information on our platforms.
Speaker 4 (18:31):
And so we don't have to crack down on all
of it.
Speaker 3 (18:33):
We don't have to have a policy that does not
allow for abortion content to be on our platform.
There'll be a chilling effect and people will do it
on their own. They'll just stop posting on their own.
And I think, in my opinion, that's the why of
why this is happening. So Meta says that they really
(18:58):
want to focus on moderating posts that deal with illegal content.
Side note, they don't always do such a great job
of that either, but that's for another episode. So Meta's
Dangerous Organizations and Individuals, or DOI, policy was supposed to really
be like a narrow policy focusing on preventing the platform
from being used by terrorist groups or organized crime like
(19:19):
violent or criminal activity. But according to the Electronic Frontier Foundation,
over the years, we've really seen those rules be applied
in far broader and troubling ways with little transparency and
significant impact on marginalized voices. And this has essentially allowed
Meta to suppress factual content about abortion that does not
actually break any of the platform's rules. So the reason
(19:42):
we know this is because the Electronic Frontier Foundation,
or EFF, have really given us a snapshot into what's
happening and provided some very clear receipts with their
Stop Censoring Abortion campaign. EFF collected stories from individuals,
healthcare clinics, advocacy groups, and more, and together they've revealed
one hundred examples of posts and resources being taken down,
(20:03):
ranging from guidance on medication abortion to links of resources
supporting individuals in states with abortion bans. What is important
to note is that the posts that they found that
were taken down, or that sometimes resulted in a ban,
did not break any of Meta's rules. EFF said, we
analyzed these takedowns, deletions, and bans, comparing the content to
(20:25):
what platform policies allow, particularly those of Meta, and found
that almost none of the submissions we received violated any
of the platforms' stated policies. Most of the censored posts
simply provided factual, educational information. So it really is a
system where you don't know, I mean, I guess you
(20:45):
could guess why this content is being taken down. There's
no consistency, there's no transparency, and Facebook just gets to
be like oopsie when it happens. Here's a great example
of a post that was removed from a healthcare policy
strategist named Lauren Carer discussing abortion pill availability by mail.
Her post reads, FYI, abortion pills are great to have around,
whether you anticipate needing them or not. Plan C Pills
(21:08):
is an amazing resource to help you find reliable sources
for abortion pills by mail, no matter where you live.
Once received, the pills should be kept in a cool,
dry place. The shelf life of mifepristone is about
five years. The shelf life of misoprostol is about two years.
There is a misoprostol-only regimen that is extremely safe, effective,
and very common globally. So that post is just here
(21:30):
is some factual information about these pills. However, Facebook removed
that post, and the explanation they gave Lauren was that
they don't allow people to buy, sell, or exchange drugs
that require a prescription from a doctor or a pharmacist.
But as you can tell, that post isn't about selling
or buying or trading medication. It is just fact based
(21:52):
information about that medication.
Speaker 1 (21:56):
Yeah, it's one of those things where you read it
and you're like, I don't see the, I do see
the thing, the thing that you're saying is there. It's
just, it's just information. Ugh, it makes me mad.
Speaker 3 (22:12):
Yeah, and EFF points out that this post does not
break any of Meta's rules and should not be removed.
But you don't have to take their word for it
or my word for it, because Meta said the exact
same thing. Eff points out that Meta publicly insists that
posts like these should not be censored, and if February
twenty twenty four letter to Amnesty International, metas Human Rights
(22:32):
policy director wrote, organic content i e. Non Paid content
educating users about medication abortion is allowed and does not
violate our community standards. Additionally, providing guidance on legal access
to pharmaceuticals is allowed. So what the hell he suck? Like,
why if it's allowed, why are you taking it down?
Speaker 1 (22:53):
I'm so curious about this, because if the moderators are essentially
kind of removed, then is this just, do they have,
like, a keyword? Like, how is this happening? Is there
a person, or...
Speaker 4 (23:06):
That is a great question.
Speaker 3 (23:07):
If I had to guess, I would say, just knowing
what I know about content moderation, I would say this
is probably overuse of AI moderation and then not
caring enough to correct that. That's what I would say,
because honestly, content moderation is a job
for not just a human, but a culturally competent human.
Speaker 4 (23:28):
When you don't have culturally.
Speaker 3 (23:29):
Competent humans making the moderation decisions, it's a problem, and
it's a problem that leads to the content of marginalized
people being suppressed much more on these platforms. Right, So,
if I had to guess, I would say, this is
somebody using AI content moderation and then not
Speaker 4 (23:45):
Caring enough to correct that.
Speaker 3 (23:47):
It is consistently taking down content that does not break
any of the platform's rules.
Speaker 4 (23:51):
That's my guess.
Speaker 1 (23:53):
Well, and that kind of relates to another thing I
know you're going to talk about, which is something Samantha
and I have also talked about on some of our
episodes, which is shadow banning.
Speaker 3 (24:02):
That's right, I mean shadow banning is one of those
issues that I find very interesting because who among us
has not posted something on social media, had that post
not perform as well as you were expecting, and wondered,
am I shadow banned? I have definitely thought this myself. If
you've ever thought that, you are not alone. But it
does really happen. So shadow banning is when a social
media platform limits the visibility of someone's content without telling them.
(24:26):
And this is happening to people and organizations that make
content about sexual and reproductive health. And it's a real
problem because, as you were talking about before, the Internet
in twenty twenty five, like, that is really where people
are going to find information about their health, especially in
a landscape where that information is more difficult to come by,
where it's criminalized and cracked down on. So people need
(24:49):
the Internet as a resource, and so if the people
and advocates and organizations who provide that information online are
shadow band it becomes that much harder to access what
is so often life saving information to help people make
health decisions. Earlier this year, the Center for Intimacy Justice
shared a report called the Digital Gag Suppression of Sexual
(25:10):
and Reproductive Health on Meta TikTok Amazon and Google, and
they found that of the one hundred and fifty nine nonprofits,
content creators, sex educators, and businesses that they surveyed, sixty
three percent had content removed on Meta, fifty five percent
had content removed on TikTok. And this suppression is happening
at the same time as platforms continue to allow and
elevate videos of violence and gore and extremist and hateful content.
(25:34):
And this pattern is troubling because it only becomes more
prevalent as folks turn more and more to social media
to find the information that they need to make decisions
about their health. And so I like that context because
we really do have a social media landscape that allows
for violent content, gory content, extremist or hateful content to
(25:54):
stay up while taking down accurate content about reproductive health
that they agree does not violate any of their policies.
Speaker 1 (26:05):
It's pretty telling, too. You have some examples here,
and one of them is from a place near us
that I was like, oh dear, oh dear, Emory. Yeah, yep,
But I mean it's also, as we're doing this research
(26:25):
on the CPC episode, I consider myself pretty you know,
pretty informed about abortion and all of it, but I
had to look up some stuff about like I'm not
sure is that legal there? I don't know, Like I
was feeling like I don't I don't know if I
can trust this information. And then you try to go
to a place where you're like, Okay, I know this place,
(26:47):
and then you find out it's taken down, it doesn't
have anything about it. Yeah, it's not a good climate.
Speaker 4 (26:54):
Yeah.
Speaker 3 (26:55):
And then you have Google allowing CPCs to stay, you know,
high ranked in their search. And then when you go
to CPCs, they tell you all kinds of misinformation about
pregnancy and abortion. They are allowed to just essentially lie
to people people who are in vulnerable situations. And so
it's already a climate where it's hard to find trustworthy,
(27:17):
accurate information. And then the clearly not trustworthy, clearly not
accurate information is not just allowed to exist,
but they put their thumb on the scales in terms
of making it more accessible than information that is factual.
Speaker 4 (27:36):
Yep.
Speaker 1 (27:37):
So let us get into some of these examples, including
the one near us.
Speaker 4 (27:41):
So let's talk about what happened at Emory University.
Speaker 3 (27:43):
So RISE at Emory University, the Center for Reproductive Health
Research in the Southeast, they published a post saying
let's talk about mifepristone and its uses and the
importance of access. So they post this online. Two months later,
their account was suddenly suspended, flagged under the policy against
selling illegal drugs, which they were not selling or offering
(28:05):
illegal drugs. They were just giving fact based health information.
They tried to appeal, but that appeal was denied, leading
to their account being permanently deleted. Sarah Read, the director
of research and translation at RISE, told EFF, as a team,
this was a hit to our morale. We pour countless
hours of person power, creativity, and passion into creating the
(28:25):
content we have on our page, and having it vanish
virtually overnight took a toll on our team. And you know,
I really think, like think about how critical that information
is these days, and how critical social media is these days.
They are already doing sensitive work in an area where
that work is threatened, and so losing your social media
(28:47):
that you've put so much time into is like losing
a lifeline, both for the staff and for the community
that you're trying to do that work in. As EFF
puts it, for many organizational users like RISE, their social
media accounts are a repository for resources and metrics that may
not be stored elsewhere. We spent a significant amount of
already constrained team capacity attempting to recover all of the
(29:09):
content we created for Instagram that was potentially going to
be permanently lost. We also spent a significant amount of
time and energy trying to understand what options we might
have available for Meta to appeal our case and recover
our account. Their support options are not easily accessible, and
the time it took to navigate this issue distracted from
our existing work. So I totally feel what they are
(29:32):
saying that when you are doing work that is that critical,
you know, time sensitive, having to stop that work to
figure out, well, how are we going to appeal this
decision to Meta? Are all of our
years and years of work on Instagram just lost forever?
Speaker 4 (29:47):
That is a real problem.
Speaker 3 (29:48):
And again, they weren't doing anything wrong. Nothing
that they posted on their account was against Meta's policies.
It's just arbitrary, and so luckily they were able to
eventually get their account back, but only because they knew
someone who knew somebody who worked at Facebook personally, which
is really the only way to appeal when this kind
of thing happens. If your account is taken down for
(30:10):
no real reason by Facebook, I am sorry to say,
unless you have a friend who knows somebody who works
at Facebook, you're probably not going.
Speaker 4 (30:16):
To be able to appeal again. Because a lot of
these decisions are AI right.
Speaker 3 (30:19):
It can be very very hard to escalate to a
human and the only real way to do it is
to just know somebody there. And again, I just feel
that in these situations where Meta agrees these posts are
not in violation of their rules and that they admit
they made a mistake, it should not come down to
knowing somebody at Facebook to have these decisions be reversed
(30:41):
when Meta agrees the mistakes are on their part.
Speaker 1 (30:45):
Now I'm trying to think if I know someone at
Facebook. I used to, I don't know if they're still there. Well.
Another issue with this is, as you said, if people
(31:06):
are worried that their content might be deleted or shadow banned,
or just they've seen this happen to other organizations or
something like that, then they might not post it anymore.
Speaker 3 (31:21):
Yeah, and I have to assume that is the point.
Eff rights. At the end of the day, clinics are
clinics are left afraid to post basic information, patients are
left confused or misinformed, and researchers lose access to these audiences.
But unless your issue catches the attention of a journalist
or you know, someone at Meta, you might never regain
access to your account. And so I really think that
(31:44):
that is the sort of so what here that Meta
is doing this to sort of not explicitly discourage organizations
and advocates and people from posting this kind of information
on their platform while saying the opposite, because it is
going to have a silencing effect. You know, nobody wants
to risk losing their entire platform, years and years and
(32:07):
years of content and research and resources they've collected.
Speaker 4 (32:09):
Yeah, no one's going to want to take that risk, right.
Speaker 2 (32:13):
It's interesting that their policy with, like, the things that
we're going to actually moderate is about terrorism and gangs
and child endangerment, which is kind of a dog whistle
for what the Republican platform has been, from jump,
to all of this morality level of issues, and that
(32:35):
the fact that Zuckerberg is like, you know what, yeah,
we're going to adopt this too, but it's purely to
protect the people's We're just protecting the people's And again
it does seem like, see see we're doing like you,
we got your back. We also agree with this, this
is the only way or this is the best way
to control what information is being out there.
Speaker 4 (32:57):
Yes, and if you actually I mean, this is a
whole other topic.
Speaker 3 (33:01):
But when you look at the way, so they say, Okay,
we're only gonna be cracking down on content that creates
harm for kids, this dog whistle that they love to
pull up, and then when you look at the kind
of harm for kids that they either allow or advocate,
part of me is like, what content are you actually
taking down? I don't know if you all saw the
(33:21):
recent reporting. There was a very interesting report I think
from the Wall Street Journal where they have gotten their
hands on an internal policy document. So this is something
that somebody at Facebook said, this is our policy, totally
fine to have in writing no problem. That said that
Meta's chatbots were allowed to engage in sensual play with minors,
(33:46):
so kids, it was okay with Meta if their chatbots
engaged in like sensual I won't say sexual, but I
would say I've seen some of the content and it
is sort of spicy, but it was okay if their
bots do that with children. And part of
me is like, I cannot believe you would put this
in writing. I cannot believe that someone at Facebook said, yeah,
(34:08):
this is a this is a document, I'll attach my
name to this. Lo and behold. When the Wall Street
Journal asked about it, they were like, oh, no, we
have since walked that policy back. That's no longer our
official on the record. Our official on the record policy
is no longer that it's okay for our bots to
engage in sexy role play with kids.
Speaker 4 (34:26):
We walked that back. Like, I bet you did walk that
Speaker 5 (34:29):
back, today, as you asked this question.
Speaker 3 (34:32):
I'm sure it happened right after the Wall Street Journal
called them and asked them about it.
Speaker 4 (34:36):
I'm so sure that it was like an hour later
we walked it back.
Speaker 5 (34:38):
Now that we got this. No, we would never, right?
Speaker 1 (34:45):
Well, yeah, and I mean you were here, Bridget. I
guess it was years ago and you were talking about
another kind of whistleblower account of Facebook knowing it
was harming young girls.
Speaker 4 (34:58):
Yeah, Frances Haugen is the whistleblower. That's why we know that.
Speaker 1 (35:02):
Yeah? So it's it is very galling for them to
be like, we want to protect the children, and then
you have these things that again directly show that clearly
you don't.
Speaker 3 (35:14):
Not really. And to be clear, that is knowingly harming kids.
So I just think it's very interesting that Facebook gets
to say, well, we're too busy focusing on taking down
content that harms kids to really care that much
about what's going on with our abortion content, and that's
that's the content we're really working on.
Speaker 4 (35:33):
But really we're not doing that either.
Speaker 3 (35:35):
You know what I mean? Like, they get, they really,
it just infuriates me, it really does. And I think
the issue really to understand is, in twenty twenty five,
when you have a question, well, when I have a question,
the first place I go is.
Speaker 4 (35:49):
The internet, right? We all, that's, I think that is the reality
the reality.
Speaker 3 (35:52):
For most of us, and the internet and social media
really has become this lifeline for folks trying to get
information about the world around us, including our sexual and
reproductive health. And if folks are not able to find
what they need in their own communities, which I'm sorry
to say is becoming more and more of the reality
these days, they are going to go online to turn
(36:14):
to social media to fill those gaps. That access really
matters most for folks whose care is being cut off,
like abortion seekers or trans or queer youth living in
states where healthcare is under attack. And so if you
have these social media platforms kind of adding to a
landscape where that information is difficult to access, even if
that information is not against their rules, it's just making
(36:37):
it that much more difficult.
Speaker 4 (36:38):
And these decisions really do matter.
Speaker 3 (36:41):
I mean, some of them are life or death,
and they really have real world impact on people's lives.
Speaker 1 (36:47):
Absolutely. And unfortunately, this is, this is part of
kind of a larger issue, kind of a larger attack,
a gendered attack. Correct?
Speaker 3 (36:58):
Yes, so this is a thing I find so interesting
and I actually should come back and do another episode
on it.
Speaker 4 (37:03):
I'm in the middle of some research on it right now.
Speaker 3 (37:05):
But the Center for Intimacy Justice, whose report I mentioned earlier,
they have another report that really shows how platforms routinely
suppress sexual health content for women and trans folks, while
leaving content aimed at supporting the sexual health of cis
men largely untouched. Right? So, I know lots of people
who run businesses that are focused on like the sexual
(37:29):
health of people who are not cis men. Right? So
if you have pelvic pain, if you need sex toys,
like all these different things that are aimed at people
who are not cisgender men. I have lots of friends
who run businesses like that. They are essentially not able
to do any kind of advertising on Facebook because Facebook
does not allow it. However, Facebook certainly allows information about
(37:54):
the sexual wellness of cisgender men. So we really have
a climate where, let's be honest, mostly men who run these platforms
are able to determine whose sexual health is important and
whose is not, whose healthcare is healthcare, and whose is
like something perverted that needs to be suppressed and isn't
allowed on their platform.
Speaker 4 (38:15):
Yeah.
Speaker 2 (38:15):
Now, thinking about it, as I've been looking at Instagram,
the amount of GLP-1 ads that I've been getting,
which is interesting because I thought that was medication that
you had to get through a doctor, is overwhelming. But also,
on the vice versa of that, is men's, you know,
sexual health meds. Those are the two ads that
(38:37):
I get. Definitely nothing about women and birth control. Rarely
there's a few, but as of late, I think zero.
But the amount of GLP-1 ads, I'm like, whoa,
what is happening, Instagram?
Speaker 5 (38:49):
I thought we weren't allowing this.
Speaker 3 (38:51):
Yes, I mean the amount of ads I get, specifically
for the erectile dysfunction medication BlueChew, and the ads
are, have you ever seen these ads online?
Speaker 4 (39:01):
The ads are clearly targeted.
Speaker 3 (39:03):
at women, so it's, it's a cute woman being like, ladies,
get your man to get BlueChew. BlueChew is
gonna rock your world. Get your man on BlueChew.
And that's a medication, that is, that is an erectile
dysfunction prescription medication. But these platforms have just decided, oh no,
that's okay.
Speaker 4 (39:23):
That's that.
Speaker 3 (39:23):
You can you can show that all day long, no problem,
you can boost it, you can put money behind it.
Speaker 2 (39:28):
Totally fine. Again, like, and then the other part being
the weight loss medication, which, slowly, a lot more information
comes back on, like, oh, there's side effects, this might not
be as good as you think.
Speaker 4 (39:38):
By the way, Oh yes, I'll just say yes.
Speaker 5 (39:42):
I'll just say yes.
Speaker 2 (39:43):
Well, that's, like, as of late, and we know
this was coming. We knew this was coming because they've
also got variations that are not FDA approved, which I
guess means very little at this point in time. But
again, this is the rampant amount of ads, like
every two, like, scrolls on Instagram it pops up, and
on Facebook too, which is I'm like, I don't even
(40:03):
go to Facebook. I just need to know people's birthdays.
That's all I need, That's all I really want. But again,
this seems to be like, I thought, once, if that
was your policy from jump, then how are these ads
paying you, I know, paying you millions?
Speaker 5 (40:19):
How are these okay?
Speaker 3 (40:20):
Yeah, their policy is totally inconsistent, seemingly arbitrary, and seemingly
biased against any kind of marginalized identity. Like that's just
what's going on. They don't have any transparency. They say
one thing and do another. Uh, and that just is
the norm for them, and I really
do think that we should be talking about how you know, again,
(40:43):
let's be honest, we're talking about mostly men, and mostly
not just men, like a specific kind of man, white, moneyed,
coastal, all of that. How we have given them so
much power to define what knowledge is acceptable, whose voices
are amplified, whose bodies are left at risk, and when platforms
decide what can and can't be shared, they
when platforms decide what can and can't be shared, they
(41:04):
are making public health decisions with global consequences in ways
that are often contrary to public health, and then also
reinforce systemic and equalities. And so I just think, you know,
this is not just we're not just talking about like
vague policy language. I know that I've spent a lot
of time talking about that because it annoys me. But
it's really about them deciding who gets to speak, who
(41:26):
gets seen, who gets access to the information that they
need to make decisions about their own health and bodies.
When Meta and these platforms silence accurate, essential sexual and
reproductive health information, they're not just enforcing inconsistent rules.
Speaker 4 (41:41):
I mean they are, but they're not just doing that.
Speaker 3 (41:43):
They are also shaping people's lives and really deciding whose
health matters and whose doesn't. And in a world where
we know the internet has really become this lifeline, that
is not just annoying, although I am annoyed, it is
dangerous, because free speech shouldn't come with a disclaimer that
your body is just optional and up to the whims
of Mark Zuckerberg.
Speaker 1 (42:04):
No, I don't want to live in that world.
Speaker 3 (42:05):
No, No, I mean I think about this all the time,
the ways that these, that these individuals, like a handful
of individuals, mostly white guys, get to define what our
worlds look like in these very concrete ways.
Speaker 4 (42:22):
And I've never met Mark Zuckerberg.
Speaker 3 (42:25):
Although I have met Sheryl Sandberg, but I've never met
Mark Zuckerberg. I don't want Mark Zuckerberg in charge of
deciding anything for my life. I don't think Mark Zuckerberg
and I have any common idea about what it means
to have a good, fulfilled life.
Speaker 4 (42:39):
I don't want him designing what my future looks like.
Speaker 1 (42:43):
I think that's very wise.
Speaker 2 (42:45):
I mean, that one movie made him look like pretty
much a dick.
Speaker 3 (42:48):
So, oh my god, you mean The Social Network, one
of my favorite movies.
Speaker 4 (42:53):
Oh my god.
Speaker 3 (42:54):
I don't want to I don't want to spoil it,
but the ending of that movie is my version of
Citizen Kane.
Speaker 4 (43:00):
Have you both seen it? Yeah, okay, you've not
seen it.
Speaker 2 (43:04):
I've only seen clips because I'm like, I don't
even want to know, but he seems
Speaker 4 (43:07):
Like a dick.
Speaker 3 (43:07):
Go home and watch it tonight. I know your birthday
is coming up. Maybe it's a fun birthday watch.
I mean, I'm such a nerd. I say it's a
fun watch, but if you're looking for a movie to
watch on your birthday, that might be that might be
the one.
Speaker 1 (43:22):
It's a good, it's a solid like, oh yeah, you're
just a sad man ending.
Speaker 4 (43:27):
Yes, you know what I'm talking about.
Speaker 3 (43:29):
It's his Citizen Kane. The sled moment at the end
of that movie haunts me, and I think, if you haven't,
I think if you've seen it, it gives context for
some of the stuff we've talked about when it comes
to Zuckerberg today.
Speaker 2 (43:43):
All right.
Speaker 5 (43:43):
Without even seeing it, I was like, all right.
Speaker 1 (43:48):
Also, I love it. It's such a dark soundtrack.
Speaker 4 (43:52):
Who is it again?
Speaker 3 (43:54):
Nobody does a haunting soundtrack like Trent Reznor. Gone Girl
soundtrack, soundtrack to Challengers, also Trent Reznor. And that's the
soundtrack I put on when I'm writing. If you need
to focus and just, like, put on some headphones and
be like, we are writing, that is your soundtrack. Reznor
can write a dark movie soundtrack like nobody's business.
Speaker 4 (44:17):
And I love that.
Speaker 1 (44:18):
It was in this movie about this college kid trying
to get a girl to like him? Oh lord. Yes, okay,
well, we'll revisit that later. We'll do, uh, maybe, you know,
we had a fun time trash talking some Hugh Hefner
that one time. We'll come back with a fun thing
(44:40):
for you, Bridget, and you can decide, because you always
bring us these heavy topics of your choosing. But before then,
thank you so much for being here. Where can the
good listeners find you?
Speaker 3 (44:53):
You can find me on my podcast, There Are No
Girls on the Internet. You can find me on Instagram,
I know it's owned by Mark Zuckerberg, I don't like
it either, at bridgetmarieindc, TikTok at bridgetmarieindc,
and on YouTube.
Speaker 4 (45:05):
And There Are No Girls on the Internet.
Speaker 1 (45:08):
Yes, go check all of that out if you haven't
already. Listeners, if you would like to contact us,
you can. You can email us at hello@stuffmomnevertoldyou.com.
You can find us on Bluesky at momstuffpodcast, or on
Instagram and TikTok at stuffmomnevertoldyou. We're also on YouTube. We have new
merchandise at Cotton Bureau, and we have a book you
can get wherever you get your books. Thanks as always to
our super producer Christina, our executive producer Maya, and our contributor Joey,
(45:30):
Thank you, and thanks to you for listening. Stuff Mom Never
Told You is a production of iHeartRadio. For more podcasts
from iHeartRadio, you can check out the iHeartRadio
app, Apple Podcasts, or wherever you listen to your
favorite shows.