Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to another episode of Internet Hate Machine. Sophie, thank
you for being here. It has been such a long, exciting, fun, challenging,
lovely couple of months covering all of these topics with you.
Today we are wrapping everything up. Can you believe it?
I really can't. It feels like it's
(00:23):
been like we've been doing this for so long, but
it feels so short at the same time. I agree.
I think it kind of makes sense to take a
pause and sort of wrap up where we've been in
the last four months because we've actually covered a lot.
So for folks who are just joining us, I hope
this is not your first episode because it is also
the last, but just to sort of recap where we've been.
(00:46):
We started the season talking about Adria Richards and Donglegate.
I really wanted to start there because for me, that
was when this kind of problem of targeted, coordinated online
harassment of women, particularly Black women, really got on my radar.
We talked a bit about the End Father's Day campaign,
like a 4chan/8chan troll campaign to make
(01:08):
Black feminists look bad. We talked about Leslie Jones, SNL actor,
comedian extraordinaire, and the horrific harassment campaign that she faced,
led by Milo Yiannopoulos and abetted by none other than Steve Bannon.
We talked about elected officials after the midterms and
how they're facing coordinated online harassment tactics that keep them
(01:29):
from being able to run for office and serve
their communities. We talked about Elon Musk. That's right, there
was a time when he did not own Twitter. Doesn't
that seem like forever ago? Well, we talked a
bit about how he is empowering extremists like Kanye West
and Nick Fuentes online. Although, by the way, did you
see that Nick Fuentes was re-banned on Twitter? So cool.
(01:52):
It's almost like content moderation is a thing, and there
was a reason why he was banned in the first place,
and unbanning him just to re-ban him a couple of
days later is like an exercise in futility. And you
could have saved all that time if you just had
a little bit of humility coming into Twitter. You
know what, don't let me go off
on a rant about Elon Musk. The point is
(02:14):
he got re-banned. Thank you. We talked about the director of
the film Cuties, and how there is still an open
court proceeding against her because one guy in Texas just
won't let it go. We talked about Shaye Moss and
Ruby Freeman, election workers who were targeted by Trump and
(02:34):
his cronies, which, by the way, did you see that
Biden awarded them a presidential medal? Okay, I thought
it was... I was gonna say the Presidential Medal of Freedom,
but I wasn't sure if that's a different thing. Just
a casual presidential... Yeah, just a casual presidential medal. One
thing I did love about them getting that medal is
we talked a bit about this in the episode, about
how like one reason why I feel like they were
(02:57):
targeted specifically was that they have a very
specific kind of look, and I loved
that when they were on stage accepting this very important
medal from President Biden, they kept the same look, the
same hairstyle, the same fashions. And I was like, yes,
they look like themselves. And
I was really heartened to see them showing up looking
(03:18):
like themselves, not in hiding, not changing the way they
look to appease racists and sexists and extremists, but like
themselves on stage accepting a presidential medal. Really, it felt
kind of like a full circle moment for the show,
to be honest with you, Yeah, for sure. And then
we wrapped things up talking about critical race theory, Cecilia
Lewis, and kind of the state of our education
(03:42):
when it comes to this kind of thing. And then
lastly we talked about my girl, Ifeoma Ozoma, and how
even as someone who is trying to right some of
these wrongs within our broken technology ecosystem, she was punished
for it. And honestly, that did seem like kind
of a good place to leave things, because there
are so many people who are trying to make our
(04:03):
digital world and our technological world better and safer and
more inclusive and less broken. But we're never going to
to get anywhere as long as those voices continue to be
punished and targeted for the work of trying to make
those spaces better for all of us. It's so
fascinating to go through what we've discussed and to see
all these different elements still in play currently in so
(04:27):
many different stories. Specifically, when Robert and I were talking,
Robert Evans and I were talking about Andrew Tate and
how he got so popular, there's so much Steve Bannon
and IGE in there. There's so much of that,
and it's really just a sight to see.
(04:49):
We're like, if we had taken just, you know,
a couple of seconds to actually not move on to
the next thing, then maybe more bad things would stop happening. Absolutely.
I mean, the Andrew Tate stuff. First of all,
I'm so glad that y'all talked about that.
I've seen folks championing that episode everywhere, and I think
(05:10):
it's important to talk about who empowered
these specific people to be able to have the platforms
that they have now, and asking some questions like, yeah,
when we saw Steve Bannon empowering folks like
Milo Yiannopoulos and allowing him to harass people and abuse people,
where were people in power being like, actually,
(05:32):
maybe we should talk about what this means and the implications?
And now we have Andrew Tate. Like, do you want
Andrew Tates? That's how you get Andrew Tates.
So these are just some of the stories that I
chose to tell for this series. But by no means
are these the only women who have stories like this
that are out there. The research is super clear that
(05:55):
this problem is systemic. It is not individual. It is
not just a handful or a dozen or two you know,
individual women who have faced individual harassment campaigns. It is
a systemic, deep cultural problem, and it comes with consequences.
Black women are disproportionately targeted for things like harassment and
disinformation and hate speech online, and the kinds of disinformation
(06:15):
campaigns that target us, as we've demonstrated in this series,
often traffic in racialized and gendered stereotypes and tropes about
Black women. And the research is really clear also that
when this happens in digital spaces, it has very real
world consequences. Let's start by looking at some of the
consequences for the Black women who are targeted. When racialized
(06:36):
gender disinformation and harassment festers and thrives online, people become
a lot less likely to trust Black women as leaders,
and in turn, those Black women are less likely to
be civically engaged because they know that even to casually
participate in civic discourse is to risk becoming a target
of this harassment. And that is, honestly, something that really
(06:57):
stands out to me about all of these stories:
how in almost every instance, these are women who
were being punished and targeted for merely participating in
public and civic life. You know, they were doing their jobs,
they were speaking their minds, they were making films, making art,
they were serving their community or running for office or
helping people vote. You know, these are just the stories
(07:18):
that I decided that I wanted to tell in this
limited series. But there are so many more that people
will never hear. For every, you know, Hollywood actress who
is speaking up about being the target of this kind
of harassment, there's a Black woman, or a Black girl,
or a Black trans person or a Black queer person whose
story we will never hear, who will not have the
kind of resources to speak up and have somebody do anything about it.
(07:42):
And so, you know, as important as the stories
that we heard this season are, I think it's important
to also just honor the fact that there are so
many more that we will never hear, and we should make
space for those stories as well. Like, you shouldn't have
to be Meghan Markle or Leslie Jones to get your
story told. You shouldn't have to stand on a stage
next to President Biden to have people take this seriously
(08:03):
and understand what you've been through when it comes to
this kind of thing. And I think that we can
really draw a straight line from those kinds of attacks
to our current political landscape. You know, when people with
power watched black women be targeted for coordinated online harassment
and abuse just for doing their jobs and basically did nothing,
they basically set the stage for the same kind of
(08:24):
attacks to be used on different people, many of whom
are still marginalized people. You know, they're still LGBTQ folks
or folks who are otherwise marginalized. But it's the same
kind of attacks that are meant to keep them from
showing up in public and civic life as well. A
good friend of mine, Sydette Harry, who is just
kind of a general brilliant tech troublemaker. She argues that
when black women speak up about the abuses and harm
(08:47):
that we are facing online and people with power don't listen,
that harm is then allowed to be experienced by others.
She writes: harmful behavior toward Black women isn't enough to
inspire change until others are harmed, but by then the original harms
are often lost by journalists tasked with covering tech. The
power and rhetoric that went unchecked then become common, and
the tactics used against Black women for lulz become weapons
(09:10):
used in conspiracies destabilizing the very nature of truth,
from the swarming of victims, to posing as Black women,
to destabilizing communities or countries. I would add democracies to
that list, and I would also add that that is
exactly what we are seeing today. I wanted to focus
on Black women for this series because so many of
the stories of our harassment are sort of lesser known
(09:31):
and get less attention. But as Sydette argues in that
piece for Wired, when those stories go undealt with and unexamined,
they are allowed to fester and become everyone's problem. And
I think that's where we are right now. So what
does this look like out in the real world. There
are so many examples, but I wanted to talk about
a couple of specific ones. First is the abortion and
(09:54):
reproductive justice advocacy space. Some of y'all might know that
I cut my teeth in the abortion rights movement working
at Planned Parenthood, and I'm still sort of in that
space more broadly, more on the abortion advocacy
and tech digital security intersection. But I can tell
you that right now, coordinated harassment and online abuse campaigns
(10:14):
are threatening people, the vast majority of whom are women
and women of color doing abortion advocacy work. And if
you care about a future where abortion access is protected
and people actually have true, meaningful reproductive justice and bodily autonomy,
you need to care about what's happening to women online.
Because the women who are abortion providers, abortion advocates, folks
(10:37):
fighting for abortion access, those are the same women being doxxed,
being smeared, being lied about, being attacked online, and also
facing cybersecurity threats. And so if you care about abortion
access and protecting abortion access, which we all should, you
need to care about what's happening to women online. Another
one is the suppressing and silencing of survivors. You know,
(10:58):
the tactics that we talked about on Internet Hate Machine
are the very same tactics that we've been seeing used to
silence people who speak up about abuse. Anybody who posted
about the Johnny Depp and Amber Heard defamation trial, for instance,
probably has experienced a little bit of what I'm
talking about firsthand. Yeah, I mean, it's wild to me because they
(11:22):
kind of got me on this one. Like, I saw
how clear it was that people who stood up for
survivors in that instance, how quickly and how obviously coordinated
it was to have people be like, here's all the
reasons why you're wrong. Like, a swarm of people online. And so I
(11:42):
was like, well, I'm not gonna wade into this because
I don't want to deal with that. And it has
such a clear silencing impact, where, first
of all, survivors don't feel like they can speak up,
and then anybody who would support that survivor or be
an ally to that survivor doesn't feel comfortable speaking up.
So it has this really obvious, intentional silencing effect. Yeah,
I think what you said there is extremely important to
(12:03):
point out the word coordinated, because these weren't just one-off
people who had an opinion about something and were
trolling you. This was a coordinated effort, and it was
en masse. Correct. That's so important. Like, it's not just
one or two people being like, actually, I don't agree.
It is a coordinated effort. Those were really fun
(12:25):
for me to see too: actually, I disagree. Actually,
I could do an impression of a well-actually guy
in my sleep because I deal with them all day long. Yes, yes, yes, yes.
And then what's so interesting in the case of like
(12:47):
suppressing and silencing survivors is it's not just the survivor
or the person who would support them, it's also, you know,
reporters or journalists who are just trying to provide
coverage of what's happened. A lot of reporters who worked
on the beat of covering, you know, survivors of abuse
recently have talked about how they too become targets of
(13:08):
coordinated harassment. And so, if you have an ecosystem where
reporters can't even provide balanced, fair, accurate reporting of what's happening,
you have a real problem. And it's clearly meant to
suppress survivors. It's clearly meant to silence survivors who would
be speaking up against powerful abusers. When you look at
(13:34):
attacks on educators and LGBTQ youth,
you know, I believe that Twitter accounts like Libs of TikTok,
which target teachers and youth perceived as LGBTQ or
as allies to the LGBTQ community, I believe that that account
and accounts like it, saw that you could get a
whole lot of traction and attention and engagement from an
(13:55):
audience if you use the same kind of tactics that
we've talked about on this series to target marginalized folks.
You know, and just like so many of the stories
that we've talked about, from Cecilia Lewis's story to End
Father's Day to the director of Cuties, they saw
that you can get actual like traction and media support
for doing this, right? So Tucker Carlson might signal boost you,
(14:17):
the mainstream media might start platforming you and your cause
in this kind of you know, both sides framing as
if it's a reasonable position to hear from both people
who are just trying to live their lives and the
people coordinating to try to stop them from
doing that. And we're at this point where people are
doing these types of things in order to get attention
from the Tucker Carlsons of the world. They're being
(14:40):
as big of asshats as they can possibly be
in order to get that ten seconds on his show,
and he rewards them. He rewards that behavior. And it's bullshit.
It's bullshit. And what's worse, it's this self-perpetuating
cycle where people use a dishonest tactic, they
(15:03):
get engagement and coverage from it, and it just incentivizes
others to do the same. I mean, I'm sure folks are sick of
me using the word ecosystem, but when I say ecosystem,
I mean ecosystem: one thing feeds
off of the other, and it kind of will
keep going and keep going until somebody stops the cycle.
And I feel like that's where we're at. We're just
(15:24):
seeing people weaponize disingenuous, racist, sexist attacks and attacks on
LGBTQ folks, and it is a cycle. It is
a cycle that is going to, I think, make us
all the worse for wear. You know, when this
is what gets engagement and eyeballs and clicks and attention,
you know, not only is it dishonest and homophobic and
(15:45):
sexist and racist and all of that stuff. It absolutely is,
but it also incentivizes people not doing the thoughtful coverage
that they should. It makes it that much easier
to slap together a story that's like, oh, a cop
died from looking at fentanyl, and running with that.
It creates an ecosystem where it's
(16:06):
not incentivized to do thoughtful work that actually helps us
get smarter and understand issues. Yeah, I mean, you said
all the words. So what's really kind of scary to
me is the way that these kinds of folks are
using these tactics to amass political power. As we heard
(16:28):
in that story about Cecilia Lewis, it's not just about
targeting individuals. It is about amassing electoral power political power,
and in a lot of cases, I am sad to
say that it is kind of working. You know, while
we're all focused on national politics and these like big
flashy presidential races, extremists can parlay these kinds of online
(16:49):
hate campaigns into far right takeovers of school boards and
local offices. And I think that it is scary how
effective it can be. Also, and I hate to... I mean,
I don't hate to say it: it is scary to me how mainstream media
will just help them do this. We're at
(17:09):
the point where, you know, someone will say, oh,
I'm a concerned parent, and it's like, actually, you are
running a campaign, like an astroturfed campaign, to
take over a school board, so you're not just a
concerned parent, you know. The way that mainstream media
will help bad actors and extremists do their work for
them is really appalling and I think a pretty scary
(17:31):
consequence of letting all of this stuff go unchecked. And
I think that it's critical that we really understand kind
of what has been normalized as a part of our discourse.
You know, disinformation, online abuse, harassment, and hate. That's not discourse.
And when we allow it to be treated like discourse,
we all lose out, except for bad actors. Sorry, I
(17:52):
know I say that a lot, but that's the only
term I can use. Yes, if you're
playing the drinking game, you're probably tanked by now.
It is bad actors and tech billionaires who win while
we all lose out because they get engagement and they
get richer. So the real question is do we want
to have a media ecosystem where the rest of us
are paying such a high cost so that Mark Zuckerberg
(18:14):
or Elon Musk can add another comma to their bank account.
I would argue no. I would argue no. So that
leads us to our kind of big "so what"
question of the series, which is: what needs
to be done? The first thing that I think absolutely
needs to be done is that institutions and people with
(18:34):
power, I'm talking about like elected officials and government
folks, need to listen and do something. Here's where things get
a little bit squishy on this one. I know that
one route that people talk a lot about is, you
know, that we need more policy solutions, like governments and
elected officials need to be doing more on the policy front.
I a thousand percent agree that in the United States,
(18:55):
even just taking this seriously as a problem will be
a step in the right direction. My hope and my
sense is that you know, this might be changing a
little bit. Like, I think back to a time when I once
had to call law enforcement for online harassment against myself,
and this was, like, many, many, many years ago, before
(19:15):
we were having lots of conversations about what harassment looked
like and its impacts. And I remember, the
thing that was very scary was, like, I was pretty
used to people saying mean things to me, racial slurs,
at that time, and I was
really trying to build up my career as, like,
a person who had opinions in public. And the thing
(19:37):
that was scary was that I used to live in
an apartment that shared a wall with like a popular
bar here in d C. It's no longer there, but
it was like a very popular bar, and I used
to always talk about how like, oh, I'm going to
this bar again, going to this bar again, because it
was like kind of my spot. And somebody tweeted at
me that they knew my address because they knew that
I lived next door to this bar, and that was
(19:59):
very different than, you know, someone calling me a slur
or someone being mean to me online. Like, this person
actually knew where I lived. A real safety issue, yes. And so I remember calling my
parents, and my dad was like, we need to
get you a taser. Okay, that was his solution,
you know. And I was pretty young. It
(20:20):
was, like... this was when I was very early career,
I guess I would say. And I ended up calling
the police, and the police told me, like, what you
need to do is just stay off the internet. And
I remember thinking... what you're basically telling me... Cops
are so bad at everything. What that basically means
is, like, start a new career, go back to college.
(20:42):
Because at that time, my entire career trajectory
had been around, you know, making a public name for
myself as somebody who had, like, opinions in public, and
I had done a lot of that work online.
You can't really do that in the modern era;
an online presence is essential. So basically, he just showed
me how much this person did not understand the impact,
(21:05):
and the fact that if I just closed my computer,
this threat wasn't going to go away. So I do
think that things are changing in the sense that like
people with power are perhaps taking it a bit more seriously.
I would hope that if I had to call law
enforcement today, I would get a different answer.
And so I do think that we have gotten to
a little bit of a place where institutions are taking
(21:27):
it more seriously. When it comes to whether or not
I think that governments should be pushing for and advocating
specific policy, I gotta be honest that this is really,
like, not my ministry. I know tons of really smart
people in the tech public policy space who are doing
a lot of advocacy around this to get governments to
adopt better policies around tech regulation. I don't want to
(21:49):
throw any cold water on any of that, but personally,
it is just like not my area of expertise on
how this could be done. And I know from like
being in these spaces that there is not even necessarily
like broad consensus in terms of what needs to be done. Right,
Like I'll be sitting at a coalition table and one
group will be like, oh, well, we need to revoke
(22:10):
Section 230, and another group will be like, no, no, no,
we need to strengthen 230. And so
it's like I wouldn't even necessarily say there is like
broad consensus in terms of people who want to do
something that or want governments to do more. There's not
even a broad consensus on what should be done. I
do know that when the question of what kind of
policy solution is needed gets brought up, one of
(22:32):
the common solutions that gets floated around is like, well,
what if we had a rule where you had to
use your real first and last name if you wanted
to post on social media. What I find so common
about government intervention, this one specifically, is that sometimes people
will put forth policies that will actually end up causing
more harm, particularly to marginalize folks of like abortion advocates
(22:54):
or providers, or activists or sex workers. If there was
a policy that you had to use your first and
last name, like your government name, in order to post
on social media, it would be a disaster for these folks.
Insert clip of that senator being like, now,
what is a finsta? I mean, that's so true. And, like,
(23:14):
I guess that's another reason why I don't really hold
a lot of hope. I don't put a lot of
my eggs in the elected official government basket, just because
oftentimes we have people who are not digital natives, who
do not understand the technology that it is supposed to
be up to them to regulate and to
make decisions about. You know, I guess the, like, top
(23:37):
quote-unquote Democrat is Chuck Schumer. Chuck Schumer's got a flip phone, right?
So we're like, if you are a young queer person
experiencing targeted harassment, you are meant to trust that
Chuck Schumer understands what you're going through and is going
to, like, advocate for you. I mean, again, I don't
want to throw cold water on anything, but it's just
(23:59):
not... it's just not where I personally choose
to put a lot of my
energy and hope. And so I can say
that whatever government and public policy ends up being advocated for,
we have to make sure that it centers the voices
of the people who are most impacted, and those are
(24:20):
marginalized people. Right. So, like I see this time and
time again where folks with power will say, oh, well,
we're gonna make this policy and it's going to
protect X, Y, Z marginalized group, but it ends up
with those people with power just speaking over those communities
and kind of telling them what they need. And I
think we need to really make sure that we're flipping that,
(24:41):
so we're listening to marginalized communities, working with them, paying them,
centering their voices and experience, and treating them equitably.
Let's start there: equitably. So often... I mean, this is like,
I could go off on a whole rant here. I'll
try to rein it in. But, you know, don't rein
it in. Do it. Do it. One thing I see
(25:03):
quite a bit is, like, people who are very well-meaning,
who I do think are genuinely interested in, you know,
centering these voices, but who don't understand that if you are
asking someone from a marginalized community to show up and,
you know, tell you about their experiences for free, so that
they can help you with, like, whatever policy
(25:23):
position you're trying to take or advocate for, or, like,
help you make your platform, you know, better, or whatever,
when you're making a million dollars from it, that's
not equitable. And I think that, like, we really have
to make some changes in how we work with and
alongside and amplify the needs and the voices of people
who are marginalized, because so often it's just really not great.
(25:52):
So while I can't really speak to some of the
policy suggestions I hear thrown around a lot, what
I can talk about is platforms. In my book, it
is social media platforms and the people that run them
who are the real bad guys because they're making money
off of breaking and poisoning our Internet ecosystem. I could
give a lot of piecemeal things that I think
(26:14):
platforms should be doing. Platforms need more transparency in their
moderation policies. For instance, if someone's Instagram account keeps getting,
you know, taken down because of coordinated attacks, they should
be able to tell you exactly what's going on. It
shouldn't be a mystery to the person. It should be
clear what's happening. Oh god, you're so right. Platforms need
(26:39):
to enforce the moderation policies they do have instead of
making excuses for doing fuck all. I could give you
all a million different examples of the ways that platforms
will have policies on the books and then be like, oh, well,
we're gonna not do that. You know, this was
widely reported, and I know that I probably say it
a lot across a million different podcasts, but Mark Zuckerberg
(27:00):
personally intervened to make sure that Alex Jones would not
be permanently banned from Facebook. He was banned for being awful,
and Zuckerberg stepped in and kind of created a loophole
specifically for Alex Jones, so that people who were reposting
Alex Jones's content, which is a lot of people,
would be freely able to do that even if Alex
(27:22):
Jones could not post himself. And so they created a
loophole. It's actually the exact same loophole
that Andrew Tate exploits on TikTok. You know, when you
have a policy, actually enforce that policy rather than finding
ways to sidestep it, Mark Zuckerberg. And platforms need
more inclusion. You know, right now, we have such a
(27:44):
very narrow subsection of men, like even more narrow than
just white men, like specifically wealthy white men in coastal
cities like New York or, you know, San Francisco,
who are the ones making policies and rules for
the rest of us. And these people are so often
removed from the consequences or harm of the rules that
(28:05):
they make. And this is not just within the United States.
This is particularly relevant for parts of the world like
the Global South, where Facebook and Facebook-owned properties, so
like Facebook, Instagram, WhatsApp, basically are the Internet, the
same way that in the nineties America Online was the
Internet for a whole bunch of folks. In a lot
of these places across the globe, Facebook is the Internet.
(28:28):
And so when you have a very very narrow subset
of wealthy white men in coastal United States based cities
making decisions for these big pockets of the globe, that
is a real problem. So platforms really need to work
to create a better culture of inclusion so that the
people making the rules for platforms actually reflect the people
(28:49):
whom those rules impact, you know, so that the
next time somebody like Ifeoma Ozoma comes along
to try to make better policies, she's not targeted and
punished and pushed out for doing that work. And so
I think a real deep culture of inclusion is something
that really needs to happen, rather than just ban this person, ban
that person individually, because when you do that, you're
playing whack-a-mole. The problem is much deeper,
(29:13):
and it's a cultural one. We need to change the
culture that says that tech leaders just do not have
to be accountable to us, you know, the people that
use their technology. Their platforms would not exist if it
was not for you and me, all of us users,
and so even if you're not, you know, an engineer,
a developer, a coder, tech leaders still need to feel
accountable to the way that their platforms shape the world
(29:35):
that we all live in. It is in their best
interest to have us think that we're too stupid or don't understand
the technology that they build, and that thus, like, we
don't get a say, we don't get an opinion. I
can't tell you how many times I've watched Mark Zuckerberg
be hauled before Congress, and he acts like, oh, well,
(29:56):
you know, this is just too complicated for you all
to understand, Like I'm smarter than all of you, so like,
don't even worry about it. But trust me, you are
not more stupid than Mark Zuckerberg or Elon Musk. There
is no way that you are more stupid than Mark
Zuckerberg or Elon Musk, right that it is in their
best interest to trick all of us into thinking that
we don't understand this technology, we don't understand how it works.
(30:17):
Therefore, what do we know? Why do we
get to have an opinion? When we believe that nonsense,
we are empowering men like Mark Zuckerberg and Elon Musk
to continue doing harm with no accountability. So the second
that you catch yourself thinking, I don't really get a
say in all this, they have won. We all need
a deep cultural change where we unlearned the idea that
(30:39):
we don't deserve to have a voice and we don't
deserve to take up space when it comes to conversations
about the Internet and technology, because they impact all of us.
From the way that you get your groceries, the way
that you vote, whether or not you're gonna be criminalized
for going to a protest with your face showing all
of that. These are all decisions and policies and rules
that have such a deep, lasting impact on all of
(31:01):
our lives, and the leaders that get rich off of us,
they need to be accountable to how it impacts our lives.
And so I really want to have there be a
massive culture shift around technology and how we all think
about it. And so, you know, I'm sure you're thinking, like, well,
that sounds great, but I don't know Mark Zuckerberg, I
don't know Elon Musk, I don't know Jack Dorsey, I
(31:22):
don't know Jeff Bezos. What can I do? Well, we
don't have to wait for tech leaders to do the
right thing. You know, we don't have to wait for
government intervention. We can shift culture by acknowledging and being
real about the impact that it has. One of the
reasons I was so excited to be telling these stories
with Cool Zone Media and the audience that you all
have built, Sophie, is that I know that it's this
(31:44):
engaged audience of people who really care about what's going
on in the world and really feel empowered that they
can be the change. They can, you know, rethink the way that
they see these power systems playing out in their own
lives and make something a little
bit different happen. Yeah, I mean, and we also know
(32:04):
that like Elon Musk isn't going to save us, the
government is not going to save us. Rich tech bros
aren't going to save us. The only people that we
can rely on is ourselves and each other and mutual
aid. Exactly. And you know, sometimes it is said that
a good marker of how we're doing as a society
is how the people who are the most marginalized are treated.
(32:27):
And I think that that should be the case for
our digital world too. And I just gotta tell you,
the research is really clear, and I hope this
podcast series made it clear that the people who
are the most marginalized online right now are black women.
And I think that if you are a leftist or
a radical, or you care what's happening in our world,
(32:48):
you care what's happening to our democracy, you care what's
happening with the state of activism right now, you've got
to care what's happening to black women in real life
and online too. Like, it's just not something
you can sit out. It is a fight for all
of us, not just because it's the right thing to do,
not just because, you know, everybody is
better served when black women's voices are centered and included
(33:09):
in the conversation, but because it really does have a
direct impact on all of us. I believe that if, way
back in the day, somebody had listened
when all of these black women like Adria Richards and
Leslie Jones were speaking up about what they were seeing,
if somebody had done something, I don't
know that I could say that we would be in
the same position that we're in right now. I truly don't.
(33:29):
And so it really matters what happens to marginalized communities,
the way that people who are marginalized on the internet
are faring. You know, as they go, so go we all.
And it really comes back to community, right? Like, I
want people, when they're having online experiences and they
see something where they're like, this seems like a coordinated attack,
(33:50):
or someone is not just disagreeing with this person,
they are trafficking in racism or sexism to harm them
and to suppress them, to, you know, shame them out of participation in
civic discourse. Acknowledge that, right? Like, don't let it be
(34:10):
just our fight. It's all of our fights. And I think
for so long it's been something that we don't really
talk about, where we are the ones experiencing this
kind of harm and it's kind of like not polite
to talk about it, and like, who wants
to listen to somebody talk about, you know,
what they're experiencing online. But I just really encourage everybody
(34:32):
listening to take up this fight, because our fight
is your fight too. And truly, we all deserve better.
We all deserve an Internet ecosystem that's not so broken,
that's not so toxic, where people can, you know,
have experiences that feel like something other than harassment and hatred,
that feel like discovery, that feel like inclusion, that feel good.
(34:56):
Right when I was first coming up online, it was
the Internet that helped me understand who I am today.
It was the Internet where I went to ask questions
that I couldn't find the answers to in my tiny
small southern town, right? And so I genuinely worry
about the generation coming up after us, the kind
of Internet ecosystem that we're leaving for them. I want
them to have the same kind of you know, experiences
(35:18):
that I did when I was coming up. We're all
better off when the Internet and our media ecosystems
are actually there to foster thoughtful discussion, thoughtful
discourse, democracy, and, you know, actual debate, not hate
speech, not disinformation, not harassment, and not online
abuse. And when it's safe like that,
(35:41):
it can also be fun. It's also a place where
you can have a creative outlet, where you can learn,
you can laugh, you can love. And if we're not
careful, that goes completely away, and we're trending
that way. Remember when the
Internet used to be fun? Um, absolutely it was. It
(36:03):
was cool. Remember changing your background on MySpace, being like,
look at me, this is my favorite song? Hello. Oh
my god, I would, like... That's how I taught
myself how to code, with, like, you know,
hiding the pause button in my MySpace profile. So I
was like, oh no, you're gonna listen to this Panic!
at the Disco song and you're not getting... It was
(36:25):
always Panic! at the Disco. I don't know why.
If you had an automated song playing on
your MySpace, it was always Panic! at the Disco.
Mine had like falling stars, like glitter stars. It definitely
crashed those like old boxy fucking computers. But it used
to be fun. It used to feel like discovery. And
now it's like, I don't want to live in a
(36:47):
world where the Internet isn't a place to discover who
you are and have fun and explore. And I I
feel like we are trending in the wrong direction and
I want us to have a hard reset. I know,
full circle. The Internet's bad. Panic! at the Disco broke
up like yesterday. God, what's a millennial to do? Right? Oh, Bridget,
(37:15):
this has been such a delightful series to do with you.
This has been so fun and I couldn't have asked
for a better person to help guide me through these
these rocky internet waters. Thank you for the opportunity this.
I want this to not be the end of the conversation.
Well, it isn't. It isn't. We were getting towards
the end of this limited series and I was like,
(37:36):
I can't be without Bridget. I'm like, I
want to see, like, I hope she'll say yes.
It was like me being nervous asking somebody
to the school dance. I was like, do you think she'll say yes?
I was telling my team, like,
I hope she does. I really love her. Bridget is
gonna be doing a monthly episode on our Cool
Zone Media It Could Happen Here series, which is
(37:56):
our daily show. A monthly Bridget, thank god,
So please stay tuned. This is not the end of
the conversation. And yeah, until then, keep in touch, be
good stewards of the Internet and each other. Is there
anything else you want to plug at the end here,
of work you're doing, or where people can follow you
on the interwebs? Anything? Yeah, you can follow me
(38:18):
on Instagram at BridgetMarieInDC. You can
follow me on TikTok. I just started on TikTok, at
BridgetToddMakesPods, and you can follow me on
Twitter at BridgetMarie. I'm less active there than I
used to be, but I am still there. We would
love to have you and it is currently on hiatus,
but you can listen to my other internet podcast on
I Heart Radio called There Are No Girls on the Internet.
(38:39):
We'd love to have you there as well. And yeah,
just keep in touch. I want to continue the
conversation about these issues and thank you so much for listening.
It means a lot. Internet Hate Machine is a production
of cool Zone Media. More podcasts from cool Zone Media,
check out our website cool zone media dot com, or
find us on the I Heart Radio app, Apple pod Us,
(39:00):
or wherever you get your podcasts.