
March 10, 2022 46 mins

Does it seem the whole world has lost its collective mind? We are a polarized country, for sure, but Imran Ahmed says fundamentally we are no different than we've ever been. It's just that social media and big tech are using behavioral psychology, mathematical algorithms, and the protection of current legislation to have their way with our heads and, consequently, as evidenced by January 6th, our fists. Imran gives us practical ways to stop the madness and predicts new protective legislation for us and our kids is coming.


If you have questions or guest suggestions, Ali would love to hear from you. Call or text her at (323) 364-6356. Or email go-ask-ali-podcast-at-gmail.com. (No dashes)


Links of Interest:

Center for Countering Digital Hate: www.CounterHate.com

The murder of Jo Cox, MP:

https://www.bbc.com/news/uk-38079594

The Great Reset:

https://www.bbc.com/news/blogs-trending-57532368

Facebook Whistleblower Frances Haugen:

https://www.nytimes.com/live/2021/10/05/technology/facebook-whistleblower-frances-haugen

Section 230 of the Communications Decency Act of 1996:

https://www.theverge.com/21273768/section-230-explained-internet-speech-law-definition-guide-free-moderation

Antitrust:

https://www.cnbc.com/2021/09/23/ftc-chair-khan-outlines-vision-for-antitrust-enforcement-consumer-protection.html

Kids Online Safety Act:

https://techcrunch.com/2022/02/16/senators-propose-the-kids-online-safety-act-after-facebook-haugen-leaks/

https://www.washingtonpost.com/technology/2022/02/16/kids-online-safety-act-unveiled-blackburn-blumenthal/


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Go Ask Ali, a production of Shondaland
Audio in partnership with iHeartRadio. As a stand-up
comedian, of which I am not. I tried it.
When you're hilarious. I've been your fan forever. You know,
I should say right now, I'm married, so I'm off
the table. We can do weekends. Get your bullshit detector
and get it honed. Are you mad about something? Go

(00:24):
out and seek people who are mad about related things,
and also listen to them if part of what they're
mad about is you. You actually look for those little
kernels of hope. Yeah, well, that's, that's good stuff.
I think it is good stuff, and I think
we need good stuff. Always. Welcome to Go Ask Ali.

(00:44):
I'm Ali Wentworth, and this season I'm digging into everything
I can get my hands on, peeling back the layers
and getting dirty. Okay, I'm terrified about social media. I'm
terrified about all the dark sides of this new crazy
ride we're all on. And this episode is all about

(01:05):
navigating the treacherous world of social media and online hate
and misinformation. I mean, we live in a very polarized
country right now, because there are two sides that are
constantly slinging zingers at each other and playing fast and
loose with "facts," in quotes. So how do we protect
ourselves and, more importantly, how do we protect our kids

(01:26):
from this kind of trauma? And I've always been concerned
about social media in terms of how it affects the
mental health of kids, but that is just a tiny
portion of what misinformation and hate can do to our culture.
This is a minefield of issues. This is political hate,

(01:47):
hate groups, bullying. It should be terrifying, and we've got
to figure out a way to get a handle on it.
And there's no one better to kind of help us
through this than Imran Ahmed. Imran Ahmed is the founding
CEO of the Center for Countering Digital Hate, based in Washington,
D.C. He is a recognized authority on the social
dynamics of social media and the dark side of those spaces,

(02:11):
such as identity-based hate, misinformation, conspiracy theories, and modern extremism.
Imran regularly advises politicians in the US, UK, EU,
and elsewhere on policy and legislation. He holds an
MA in Political and Social Sciences from the University of
Cambridge in England. Hello, Imran. Hi, Ali. We're just

(02:35):
going to dive right into this topic. I've said
it before that the Internet is like the Wild West
and we don't have any laws, and I worry every
day about all of it. So first of all, tell
me how you even got involved in it. You're at Cambridge,
you're taking classes, there's ivy covering everything. Well, I mean,

(02:57):
it happened a long time after Cambridge. When I
left Cambridge, I spent a few years working in
commerce and understanding how business works, and realizing, you know,
I learned a lot of lessons there, like, basically, companies
are amoral organizations. They don't have an instinctive morality

(03:17):
beyond making money, but that can lead to some problems,
distortions in the way that they operate that can cause harm.
And I was actually working in investment banking in the
sort of early aughts when some of the, some of
the real bad behavior was coming out, and then switched
to working in politics. And I worked in the UK
Parliament for a number of years, when in twenty fifteen

(03:41):
something happened that was really outside of any recent political
experience in the UK, which is one of our political
parties became irredeemably infected with anti-Semitism. It was the
left-wing party, the Labour Party, and I was serving
in it at the time, and it was insanely shameful to

(04:02):
me that the party that I felt so much for
was starting to spout, both in terms of members and
some other senior leaders, some really scary anti-Semitic rhetoric.
And then six months later we had a referendum in
the United Kingdom on leaving the EU, and in that referendum,
conspiracy theories, misinformation, hate were flowing at a phenomenal rate.

(04:24):
And at the end of that referendum, my really close
colleague Jo Cox MP, who was a Member of Parliament
in the UK, she was assassinated by a far-right
terrorist who had been radicalized online. In fact, when he
killed her he shouted "Britain first, death to traitors," which were
sort of Internet slogans. They were the twenty-sixteen

(04:45):
version of "stop the steal" or "lock her up." And,
you know, six months later, what happened in the US
but the rise of a politician, the election of a
politician who had used these movements, these conspiracist movements.
He had been showing them some affection, acting as though
he was the one who would represent them in Washington,

(05:06):
and he built a political movement out of fringe conspiracy movements,
fringe ideologies and bolted them onto you know, a major
political party. And I thought, well, crumbs, something's happening here.
And it's not about left or right, it's not about
UK or US. This is a global thing. And when
something changes in multiple places simultaneously, when politics becomes much

(05:29):
more unstable and hate and misinformation start to flow, it
means that there's a systemic, a global problem. And what,
of course, had been changing, we realized very quickly, was
the impact of social media on political discourse and the
way that we share information, form our values as
a society, and decide what attitudes and behaviors

(05:53):
we accept. So it terrifies me that, for instance, there
could be one or two people that seed information that goes
out there and, like a snowball effect, you know, suddenly
becomes fact, whatever that is. And do you identify these individuals?

(06:13):
I mean, how do you keep an eye on them,
or how do you call them out, or how do
you form any kind of criminal action against them? Well,
one of the things we realized very early on was
exactly as you say: a small number of bad actors,
bad people, are able to create a disproportionate amount of

(06:33):
noise, and they can infect millions with misinformation and the
precursor lies that underpin hate. And what we've become quite
well known for as an organization is going into
particular sectors of disinformation. For example, I think most famously,
we did a very notable study on anti-vax misinformation
(06:56):
and found that twelve people produced sixty-five percent of disinformation shares
on social media. And you call them the Disinformation Dozen, correct?
Is this what we're talking about? That's right, that's right.
And I mean, you know, that was a statistic that
was cited by Joe Biden when he criticized Facebook for
not taking action against these people. It's really helped health

(07:18):
communicators to understand that actually they're not up against a
social thing. It's not that suddenly people spontaneously generate anti-vax
beliefs. It's that it is seeded by a small
number of bad actors. And when you know who it
is that's doing it, well, two things happen. First of all,
it helps us to plan out who it is
that we're up against, and what are their actual agendas.
How are they economically motivated? It helps us to

(07:41):
bust apart some of the disgusting ways in which they
make money from this. But the second thing is, we
told the companies, we told the platforms: look, you've got
rules against misinformation that's designed to hurt people. You can
now take action, because we've given you a list. So,
like, Joe Mercola, Robert F. Kennedy Jr., which is really
depressing because he's a Kennedy, but he's the bad Kennedy,

(08:01):
you know. And then there's people that you won't have
heard of, like Del Bigtree, and like Ty and Charlene Bollinger.
Ty and Charlene Bollinger are two, like, basically snake oil
salesmen who have been doing videos like The Truth About Cancer,
The Truth About Vaccines for years and years. But they
realized that because they're kind of good old boys,
they can get all the QAnon people. So

(08:22):
they started marketing their stuff. They started hybridizing their ideology
with QAnon, and they created this thing called
the Great Reset, which is now really big. So QAnon
has basically gone away, and it's been replaced
by the Great Reset, which was driven by Joe Mercola,
Robert F. Kennedy, and the Bollingers. So yeah, we know
who they all are. It is to my great regret

(08:42):
that, in fact, of the nine social media accounts held
by those twelve people, forty-two are still up, which
is just depressing. You know, why bother going half
the way and not going the whole hog when it
comes to dealing with people responsible for making people not

(09:03):
anti-vax but vaccine hesitant, so as to overwhelm people with
misinformation so that they can't make a decision on whether
or not to vaccinate. And the problem
with COVID is, if you wait to get vaccinated and
you catch it... There are people right now in intensive
care units as we're talking, and by the end of

(09:24):
this podcast, one or two of them may have died
telling their doctors, "But I thought the vaccine would hurt
me." And COVID, you know, you've had COVID
and I've had COVID. It is a brutal infection. I
would much rather have the vaccine and the slightly sore
arm than go through that again. Well, so I want

(09:46):
you to walk me through, for example, the Disinformation Dozen.
I want you to walk me through how they're
being paid, how they get the money for the disinformation
that they put out there, how they put it
out there, and then how does it bleed? So, I
mean, they're making money in lots of ways. One of
the easiest ways, for example, is that if

(10:08):
you can persuade people: you know what, you can't be
sure about the vaccine. Here's some misinformation about the vaccine.
My cousin's, you know, testicles got rather large and his wife,
his girlfriend, left him. So you need to wait
before you take it. But while you wait, why don't

(10:29):
you take my supplement? Because my supplement is proven to
be effective in helping protect you against COVID. That
is one of the main monetization methods of these economically
motivated bad actors. And, you know, what's so remarkable
is that these pills, they keep advertising them for everything
that comes along. So, you know, you'll find them spreading

(10:52):
misinformation saying cancer treatments don't work; actually, what works is
my pill. You know, H1N1? If you
want to protect yourself from it, you should take my pill.
But also, once they can persuade someone not to trust doctors,
not to trust, you know, the public health professionals, and
to trust them instead, well, then they've got a sucker

(11:12):
on the hook, haven't they? And a snake oil salesman
knows they can sell them books, they can sell them
access to special email lists, they can sell them nebulized...
How so? One of the world's leading anti-vaxxers, who
was splashed on the front of the New York
Times based on our research, Joe Mercola, he recommends nebulized

(11:32):
hydrogen peroxide, which is inhaling bleach. Yeah. Yeah. But what
I don't understand is how he can get away with it.
If this guy is telling people to do something that
could kill them, like inhale bleach, why can't he be
held liable for endangering the public? You can't just yell
fire in a crowded theater. Well, theoretically there are other

(11:53):
laws that would be used in those instances. So telling
someone a lie about health, for example, that may cause
them to lose their lives, there can be liability. But
we've been really bad in the US at prosecuting them,
and it tends to be done at a civil level
via FDA fines. So, for example, Joe Mercola was

(12:13):
told halfway through the pandemic to stop claiming that vaccines
were bad and that's why you should take vitamin D,
quercetin, and something else, which he sells on his
Amazon-branded web store. But it's a year into it,
and, you know, by then he's already made two million dollars.
We tend to treat sales as, we almost give it

(12:34):
more leeway than we would someone shouting fire. When it
comes to the companies, the platforms that literally then megaphone
fire through the theaters, they have this special liability shield,
which is really, really weird, and if it was any
other type of business, they would be sued for
breach of their duty of care. All right. So then

(12:56):
you and your company, let's say you go to Facebook
and say, hey, listen, you know, people are actually
not only dying from COVID, they're actually dying from these
made-up concoctions that people are trying to sell. And
as you say, they go: deny, deflect, delay. Yeah,

(13:16):
you know, I've been in this game now since
my friend was killed, my friend Jo Cox MP.
That's when I started putting my mind to this, and
for the first three years we tried to talk to
the platforms. We said, look, we can identify for you
the spaces in which bad things are happening. Will you
do something about it? And for the first three years
we got involved in policy teams and discussions, and they

(13:36):
told us, we really want to deal with this too.
And it was only after that time that I realized
that we were being gaslit. We were being engaged in
a fake policy process designed to deny responsibility, deflect it to
someone else, and then delay taking action. And that's the,
that's the playbook of deceit that we see from the

(13:58):
big tech companies. So we are very explicit now. And
of course we sent our report to senior executives at Facebook.
I think I sent it to Sheryl Sandberg, Mark Zuckerberg,
and Monika Bickert myself, and, you know, Joe Biden mentioned it.
A couple of months later, twelve attorneys general wrote a
letter to Facebook based on our report. When they went

(14:19):
before the House Energy and Commerce Committee, those politicians asked
them about it, Republicans and Democrats, because at the time
there was of course bipartisan agreement that the vaccine was
a good thing, because it was a vaccine developed on
President Trump's watch, and so it was at the time uncontroversial.
Facebook decided not to take action. In fact, they later

(14:41):
released a statement saying that CCDH, the Center for
Countering Digital Hate (which is a 501(c)(3), a
nonprofit that I run), that our report was a faulty narrative. Now,
here's the thing, Ali. A few weeks later, Frances Haugen
released her documents, and what she revealed to the world was that, in fact,

(15:01):
on the very same day our report came out, an
internal report came out at Facebook confirming our findings. So
what they had done for six months was, they hadn't
just not taken the action required. They had also known
that we were right, and they'd gone out and told
people that we were liars. Unbelievable, really. I mean, they deflect,

(15:22):
they deny, they delay all the time with everything, and
it's time for a short break. Great, let's get back
to it. Let's talk about these crazy conspiracy theories where

(15:45):
they put out on the internet, that, you know, people
should be killed, or Hillary drinks the blood of babies.
I've actually spoken to people who truly believe that. They've
read it on the internet. They believe Hillary Clinton drinks
the blood of babies. How do you counter that in any way?
Because if I went on television and said that's not true,

(16:05):
they'd say, oh no, you liberal. You know, there's no
way of countering this information once it's out there. It's
like if I called you a pedophile for the hell
of it. If I put it out on social media,
that would live on forever, and there's no way of
erasing it. And now that's what politics is. Politics is
just throwing false narratives back and forth so that you

(16:28):
don't know anything that's real or true. Well, I mean, look,
something terrible is happening right now. And I gave a lecture
to some students yesterday, high school students, about how to be
safe online, and I explained to them, the creation of
social media was a brilliant thing. It was a real positive,
but it's had these real negative side effects of destabilizing

(16:50):
our ability to form consensus, of polarizing us, and
actually now fundamentally undermining our ability to have democratic, liberal,
tolerant societies in which we can commonly address the huge
problems that our world has, like climate change and lots
of other things. Lots of things are affected by disinformation.
And I said to them that the reason for it

(17:12):
is because basically this is San Francisco libertarian
capitalism gone crazy. It's like driven to the nth degree,
where they don't care at all about the impact of
their businesses. They feel no duty of care. And in fact,
under US law, there's something called Section 230 of
an act passed in 1996, which couldn't have predicted how this would

(17:34):
be misused. But under US law, a platform cannot be
held liable for any content created by a third party
on their site. And that specific law, Section 230
of the Communications Decency Act, essentially creates a system of
impunity in which platforms feel no obligation to take any action.

(17:55):
In fact, for them to sort out their platforms,
it means reducing the engagement on their platforms, and they have to
pay to clean it up. So it's actually a lose-lose
for them. And there are a number of malignancies
that that's allowed to emerge in our society. And it's
a bit like the climate, you know. Climate change comes
out of having cars and factories and broader access to

(18:18):
consumer goods, which are good things. But then it
put carbon into the atmosphere, which creates a problem,
and so you seek to address that problem. Well,
we're sort of polluting not just our physical ecosystems,
but our information ecosystems as well. But the trouble is
that with information ecosystems, if you collapse your democracy, really,

(18:39):
really terrible things happen. People die. You know, societies and
democracies are fragile, they're precious, and we have to work
every day and commit ourselves every day to the values
that underpin them. You start to undermine those at a systemic level,
and you can create a real problem. But, you know,
this hasn't happened over a period of a hundred years

(18:59):
like the Industrial Revolution. This has actually been relatively quick,
and our ability to address it, as well it should be,
can be fast. It's not going to be as obscenely
expensive as solving climate change will be. We're much more
able to deal with this information crisis that we face,
and at the center we do three things to work
on this. The first thing is we help people understand

(19:23):
what's changing and how to more healthily navigate digital spaces,
so that the users know what's going on: your data
is being used by these platforms, here's how it's being used,
why it's being used, here's why you see what you see,
and here's how you can interpret that and not fall
down the rabbit hole of conspiracies. And so to the
second thing: we need to deal with the platforms themselves.

(19:45):
And, you know, I'm quite a cheeky person by nature.
I am a bit of a troublemaker, and so I'm
very good at getting lots of attention to the issue
of the harms that they create. You know,
we've made a lot of noise, and that
forces the platforms to kind of go: oh no, we
can't just hide it anymore. We've got to do something

(20:07):
about it. And the third thing is that we're working
with legislators all around the world, because what's very interesting
is, just a year ago is the first time
I stopped being asked the question, but isn't this just online?
Because January the sixth and the pandemic have made it
clear to anyone that bothers to look that social media

(20:30):
has created problems for our society that are offline, that
are real, and the cost is paid in lives lost.
And this year, what's so wonderful is, since the start of it
I'm having a lot of conversations with legislators and
with other folks around the world at the moment on, okay,
we now know there's a problem, what should we be doing?
And I was in London last week speaking to legislators

(20:55):
about the new legislation being brought forward, at our urging,
in the UK, an Online Safety Bill. In the European Union,
there's a new act called the Digital Services Act coming along.
There are countries all around the world that are seeking
to legislate. And just before Christmas, I actually gave evidence
for the first time to Congress. I gave evidence to
the House Energy and Commerce Committee, which is a legislative

(21:18):
committee. They had forty different bits of legislation that
they were examining, the Republicans and the Democrats. And really,
what's amazing is that, for different reasons, these platforms have
managed to unite the GOP and the Democrats on one issue,
which is that something needs to be done about social media. Well,

(21:39):
thank God for you, because, you know, the past few years,
the cultural screaming has been, you know, Facebook and Mark Zuckerberg,
and this is terrible, I'm going to delete my Facebook account.
And my feeling is like, why aren't we taking
to the streets a little bit? Why
do we all just sort of sit around and go, like, oh,
Facebook is awful, hashtag? You know, I mean, there has

(22:02):
to be legislation. Yeah, I mean, the irony is that,
you know, to go and protest Facebook, people go on Twitter.
Exactly, exactly. One of the dirty secrets of tech regulation
is that most of the organizations that are lobbying for
changes to tech regulation are funded by big tech. So

(22:23):
most of them I find, ironically, in meetings running around
saying, you don't want to go too far. And I'm going,
aren't you meant to be the people saying we need
a better social media? And they're saying, yes, let's keep
it as it is, because that's better than whatever
the regulated one will be. And so Google and Facebook
spent more money than Exxon and Philip Morris put
together on lobbying, a hundred and twenty million dollars

(22:46):
in Washington, D.C. alone. They're lobbying in the UK,
they're lobbying in Europe, in Australia, New Zealand, Germany, all
around the world, because they're trying to stop regulation coming forward.
And the honest truth is that we still have a
fight on our hands, a huge fight. I mean, every
huge fight with misinformation... I even think about, in our

(23:07):
country, the fight with OxyContin, you know, the
misinformation of that, and, Jesus, how can normal citizens
go up against huge pharmaceutical companies? It feels like the
same thing when it comes to tech. They have too
much money, they're too powerful. They do. But we're right,
and I think that a lot of people increasingly

(23:28):
agree with us. One of the things that happened over
the pandemic is, a lot of my friends, for example,
who are parents of young children, who would never talk
about mental health normally, some of my sort of
blokey bloke friends, they've turned around and they suddenly are
talking about post-traumatic stress disorder after the pandemic. And
I think that we are acutely aware of the mental

(23:51):
health of our children in this particular moment right now.
And they've had the chance to see how their kids
use social media, because they've been at home, and they're
just horrified. They know now that their kids will go
on Instagram, and they'll come out and they'll start
talking about sort of how they feel about their body,
and they'll be thinking, why does my beautiful child feel

(24:12):
that way? And they know why. And so I think
that one of the things that's changed over the last
year is parents are really engaged in this battle. And
you know, there's a few things that parents can do,
which is to be really cognizant of their children's social
media use. But my contention is politically that when parents
start to mobilize in the numbers that they are right now,

(24:33):
and when there is awareness of the problem as there
is right now, politicians tend to follow, because that is
something that transcends left, right, urban, rural. We all have kids,
and it's going to be, I think this is going
to be a rough year for the social media companies,
which are going to see the first legislation coming forward,
and in the next couple of years I expect to

(24:54):
see even the US legislate in this space, either to
break up the companies to create more competition, or
to regulate to allow them to be sued where
they are failing in their duty of care to their users.
But we will see changes, and I'm deeply optimistic. I
think social media, I mean, as a core concept,

(25:16):
the idea of being able to send a message to
another person anywhere in the world for free is incredible,
and they're networking two point five billion people. It theoretically
liberates humanity to operate beyond borders, beyond time and space,
to create this sort of meta-mind that binds

(25:36):
us all and makes us an even more incredibly creative
and integrated species. To get there, though, we have
to deal with the malignancies, because right now, ironically, these
companies are messing up, they're screwing up their product by
being lazy and just making as much money as they
can right now. If they invested in it, by having

(25:57):
proper moderation, enforcement of their rules, building the safety
protocols in place, wow, I mean, this thing could
genuinely liberate humanity. What did you tell your high school
students that you spoke to the other day?
What are you saying to that generation of people? Because
these are the people that grew up with a cell
phone in their hand. And so how are you telling

(26:19):
them to be the people that go into battle and
figure this all out, and help turn this into
a good thing for humanity and not a bad thing? Well,
I mean, I'll be honest. The first thing I did
was apologize to them for our failure to once again
consider their interests and the world that we were bequeathing

(26:41):
to them in our zeal to make money from this technology.
Right now, they're going to deal with much more mature
versions of the problems that we've created unless we start
to deal with them now. And then I committed to
them that we're going to do our best, those of
us who can see the problem and can see that

(27:02):
this is not a technological problem, this is a moral
question of what are you willing to make money from?
How is it that you're willing to accrue your wealth?
Are you willing to accrue it through content that kills people?
Are you willing to accrue it through being lazy and
not fulfilling your duty of care to administer your product
in a safe way. And then I took them through

(27:24):
the way that trolling shapes the discourse in politics. Trolling
is purposeful communication. It's designed to make journalists, politicians, scientists,
anyone in society feel fearful of saying certain things.
That's why they troll you. Targeted abuse is used to
cleanse those spaces of certain types of people. You know,
if I had to open my door and every time

(27:45):
I opened it someone shouted, you know, you dirty brown
X Y Z, I just wouldn't bother leaving my home.
I would stay at home. So if every time I
got on social media I know I'm going to get
a wave of abuse, why would I use that space?
It's designed to make people not want to enter those spaces.
And that is so regressive, because, you know, what have
we spent the last two years doing? Bringing women, gay people,

(28:08):
brown and Black and Asian and Hispanic people
into public life. And now social media, a primary way
by which we establish our brands, in which we communicate,
we establish relationships, is being cleansed of those people.
It's the counter-Enlightenment, you know. It's the reversing of
science and tolerance. The truth is, it's not free speech.

(28:29):
This is a platform weighted towards the most controversial. You
want to know why we're more polarized, why we're more angry?
We're not. The truth is that we're being slowly tilted
that way by platforms on which most of our political
discussion and opinion forming happens. And those are weighted towards
the angry, the hyperbolic, the kind of dumbest take possible,
you know, the one that's least thought through,

(28:53):
that's least considered, that's the worst advice that you could
possibly want to take, is the one that they will
promote the highest, because they know that it will get
people angry and shouting at each other. And while you're
shouting at each other and you're on the platform, they can
serve you ads. That's all they care about. And so
it's really important that we think about that aspect.

(29:14):
And of course, the way that the drip, drip of
misinformation recolors the lens through which they see the world,
they have to be very cautious in using those sites.
I mean, I gave them advice on how they can
use those sites, and I reminded them that these sites
are entertainment. You don't know who's posting there. So taking
that stuff as gospel is as dangerous as drinking
water from a stream. You don't know if there's been

(29:35):
an animal carcass upstream, or if, you know,
someone's been dumping toxins into it deliberately, which is what
social media is like. But I also encouraged them that
this, I think, is going to be an area that
we're going to have to work on for decades to come.
Even if we have regulation today, even if we have
social action today, even if we educate ourselves, this is

(29:56):
not going back in the bottle. And we're going to
have to evolve to learn how to deal with these
sorts of environments and these spaces, because they're vital to
our future as a species. But at the same time,
they're pretty bad right now. Well, particularly because a lot
of the seedy, dark side of our culture, you know,
a lot of these platforms are watering and feeding it.

(30:17):
When you think about sex trafficking, drugs, terrorism, hate groups,
all that stuff. You brought up January sixth. I mean,
this stuff all can happen because of social media. And
I remember years ago I did a panel where
I talked about its effect on children, and I did

(30:39):
it in Silicon Valley and all the tech people that
were there came up to me afterwards and whispered to me, Yeah,
we know how bad it is on our kids. That's
why we don't let our kids do it. And I thought, well,
that's interesting, you know, you won't even let your own
kids do it. That is absolutely true. I mean, I
personally don't use social media. I don't have a Facebook profile,
and my Twitter and Instagram are managed for me

(31:00):
by staff, and I don't look at any of the
abuse I received because it's designed to make me stop
doing my job, and I refuse to stop doing my job.
You know, I talked about companies being amoral, like
lacking morality. They're not human beings. But I have over
time changed my opinion on that. I think that when
you are aware of a problem, and when you know

(31:22):
that you're creating harm, and you've had the time and
you have the enormous wealth to change that, and you
refuse to do so and instead keep profiting from those harms,
that you become immoral companies. And I mean I genuinely
believe that over the next year or two will also
see another change happen in society that I think we

(31:42):
used to look at companies like Google and Facebook and
think they're cool places, they're good places to work,
but these are like working for Philip Morris or
Exxon, you know. I want a situation where, when
a kid comes home to tell their parents, I got
a job at Facebook, their parents will say, oh my lord,
what will the neighbors think? What have we done wrong? Yeah,

(32:06):
we'll be right back, and we're back. So does it
make me a hypocrite if I agree with everything you're
saying and yet I'm going to post this podcast on
my Instagram? I mean, can you be in for all

(32:29):
the right reasons? Of course, and that, of course, is
one of the problems. And that's why we need to
have antitrust. You know why I'm really pleased that
Joe Biden has appointed Lina Khan, Tim Wu, and Jonathan
Kanter, the three horsemen of the apocalypse when it comes
to antitrust? Because their job is to make sure
that there is competition. If there's no competition, then there's

(32:50):
no competition both economically but also on morality, where people
compete on the morality of their platforms, the quality of
the enforcement. If a platform came along tomorrow and
said that it's genuinely safe for kids, and we knew
it was safe for kids, I think you'd see a
lot of people migrate there very, very quickly. But at
the moment you've got no choice. It's Facebook, Twitter, and

(33:12):
TikTok or bust, and that's a terrible thing. And, you know, personally,
I always say I won't be bullied off those platforms,
and I certainly won't be scared off those platforms by
the way that they behave, because I actually find it
quite delicious to use Facebook to undermine Facebook. And
what do you say to cheeky moms like me when

(33:32):
I say, what can I do? You know, in the
old days, it was, what can I do? And everyone would
say, well, call your congressman. What can we do, where
we feel involved in changing the way social media has
been up to now? There is one central bit
of logic and advice, from the sort of mathematical
analysis by CCDH, on how to deal
with stuff that you see online. And I wish I

(33:55):
could tell everyone in America this and everyone in the
world this, because I think that if we all lived
by it, it would change the mathematics of what we see
online very quickly. When you see trolling, when you see hate,
when you see misinformation, don't engage with it. These platforms
are designed to make you look at stuff that will
piss you off, that's controversial, that kind of gets you

(34:17):
emotional and triggers you to want to engage, and when
you engage with it, you literally tell the algorithms this
is higher engagement content and so therefore show it to
more people. And so our recommendation is to ignore it,
block the person sending it, and take a time out
if it's hate and if it's something that might have
affected you, and then report it to the platform so

(34:37):
you know that you've done your bit as well. The
second thing is there's going to be a lot of
talk about legislation, and so actually now is the time.
Write, email, tweet at, send a Facebook
message to, or DM via Instagram your congressman and
senator and tell them: I expect you to protect my
children by putting in place proper protections on social media.

(34:59):
And ask them why they get away with something that
no other industry does. Why have they been protected through
Section 230 from any liability for the harms they produce?
There is actually a mathematical solution to the hate that
we see on these platforms, and that's something that users
can do. But there are lots of good organizations as

(35:20):
well doing work in this space. I work with the
ADL on antisemitism. I work with Color of Change
on anti-Black racism. I work with a number of
organizations, and organizations like mine, the Center for Countering Digital Hate,
and all of them need the resources to fight back
against the enormous wealth of these companies. Just as you know,

(35:41):
my colleagues in the climate change sector, who we work
with very closely on climate disinformation, have had to fight
back against Exxon, against these enormous, great big companies. We
are too, and so every donation matters, and, you know,
you can donate at CounterHate.com.
And I assume that these big tech companies have fought
back and tried to take your organization down. You know

(36:01):
that they've offered our staff two times their salary to
go and work there. They've put out PR saying that
we lie. But what I'm really confident in
is that we are absolutely on the mark, that we
are a very cautious research organization, and everything I say
is evidenced. And you know, there's a lot of

(36:22):
like, extraordinary claims that I make. I can provide evidence
for every single one, because, believe me, they've tried.
We've had a number of lawsuits against us; not
one of them has even gotten past the first letter,
because our lawyers in every instance have been able to
say, the first defense against defamation is truth, and here's
the evidence. So yeah, I mean, it's going to
be a tough fight, but, you know, most people are,

(36:45):
even those people at Facebook, fundamentally good
human beings, like all of us are. But they're doing
the wrong thing. And I think if you get enough of
us working and pushing with all our might, we
will slowly start to right this wrong that's been done
to our society. And our societies have proven incredibly resilient.
That's the great power of democracy, and especially of American democracy,

(37:05):
which is so decentralized. I'm really excited about the next
few years. I think we're going to win this battle
and then, you know, there's other things to be done.
It's all about money, isn't it? Of course. Wow. All right,
so a lot of what I talk about in my
podcast is protecting children. And I know there's something called
the Kids Act that was introduced in Congress recently. Can

(37:26):
you tell us about the Kids Act and what it
would do? Well, I mean, the Kids Act is an
act introduced by Senators Markey and Blumenthal that would stop
the addictive aspects of the design being targeted at children specifically,
and, you know, I wish it could be stopped from
being targeted at anyone. I think it's really, really important
with kids, who have these incredibly malleable minds

(37:48):
and who don't know, in the same way
that adults perhaps might, the way in which they're being
manipulated psychologically by these platforms, by the core programming,
the sort of tweaks to the service based on behavioral
psychology that make it so addictive. I mean, what they
do is they amp up emotion to eleven, and that

(38:11):
is really really unhealthy to a child's mind, a mind
that isn't formed yet, whose frontal lobe is still growing.
You know. The reordering of what we see by social
media algorithms to prioritize the hateful, to prioritize the angry,
to prioritize misinformation, has started to normalize these things within
our society more generally, like we've now become used to it.

(38:34):
There's almost an extent to which we're like, yeah, we know,
we know that this is caused by social media, but
that's how I feel now. You know, with adults,
you're rewiring the brains, but with kids, you are fundamentally
wiring their brains in that way. They will never have
known any other way of being. I can't
describe it any other way than it is just child abuse. Yes,

(38:56):
you're wiring the brains the same way that it would
be if they lived in a household where there is
just violence and anger and shouting and screaming, and that's
just terrible. We're traumatizing a generation digitally. Yes. And I
look at my own children, you know, and they're the
guinea pigs for all this. We didn't grow up with
cell phones in our hands, but they did. And it's

(39:18):
hard apparent when you didn't go through it yourself. And
speaking of children, I wanted to ask you about the
metaverse because the metaverse is made for them, and I
just think it's going to be an even more dangerous
area for them, because, I mean, talk about unregulated. It's
so new. So tell me what you're working on when
it comes to the metaverse. Well, in a few weeks ago,

(39:39):
we actually released some research, in The New York Times,
on the metaverse. And when the metaverse was announced
by Mark Zuckerberg, he said, I'm going to put safety
and security, especially of children, at the very heart of
the experience. And we wanted to test, well, is that true?
And of course, the metaverse is just virtual reality, and Facebook

(40:00):
already has a virtual reality service. It's called Oculus, and
you may have seen it advertised before Christmas. They were
telling everyone to buy it for their kids. We actually
spent twelve hours studying what Oculus VRChat is like,
and our researchers sneakily recorded the footage, and what we
found was, every seven minutes, something happened there that was

(40:23):
hateful or abusive. We saw children being harassed sexually, we saw pornography,
we saw someone literally turning their avatar into the Twin
Towers and crashing a plane into it. We saw someone
preaching white genocide, white supremacist mythology, to kids. So every
seven minutes. And I mean, our contention is,

(40:45):
if you're buying Oculus for your kids, you need to
be aware that you have no oversight. You can't even
look at their phone when it comes to VR, because
it happens in the moment. We've got a
big way to go on that, because these platforms have
not proven themselves fit and proper to run even text-based
and image-based services. Why on earth are we

(41:06):
giving them monopolies in virtual reality? So yeah, look, the
technology keeps evolving and so does the nature of the fight.
But luckily my research team are ahead of the
curve on these issues and they've been tracking it all
the time. And I would think once you start to
get laws in place, these are laws that will affect
things that haven't been created yet. You know, five years

(41:28):
from now, these laws will be applied to whatever is
discovered or invented. Well, I mean, the first country to
have proper legislation on the books is the UK. It's
got the Online Safety Bill, which is actually having its
second reading in March in the UK. And under that bill,
if they fail to clean up their platforms from harms
that they know are being created, things like

(41:49):
really serious stuff, whether it's racial hatred or it's
misinformation that might kill people, if they fail to fulfill
their duty of care, they can be fined a percentage of global
revenues, and executives can be put in jail. So the
time of accountability for these platforms is coming. My job
now in the UK is to make sure the politicians

(42:11):
don't go back on what they've promised they're going to do.
But, you know, we're past the stage where we're begging
for legislation, because legislation is coming. Good. Imran, I just
sit in my cozy chair and ask people a lot

(42:33):
of questions and so at the end I like to
let my guests ask me a question about anything. And
so now you're up. So I'm asking everyone this, what's
the one thing you couldn't do last year that you're
excited to do this year? Ha ha, uh, one thing
I couldn't do... like, I mean, let's say that

(42:55):
I can. Actually, I will say I'm hoping this
year to travel. It is something I haven't done in
a long time. I would like to not eat my
own cooking. I would like to see some other faces
that I haven't seen. I'd like to actually go and
do a chat show where there's an audience and not

(43:17):
just a few cameramen and a PA, so
that the laughter is real. But I think travel is
the biggest thing, you know, let me know that there's
a bigger world out there. Yes, definitely. I haven't been
in a studio for a year and a bit now,
so you know, being able to sort of go out
and actually talk to someone, get into a real debate

(43:38):
that isn't over a screen. But I actually didn't hug
anyone that wasn't my wife last year. So for me,
it was the hug. Wow. I secretly hugged friends and family,
so, you know, I'm sure I'll get arrested
for that, but I did. I safely hugged people.
Thank you so much for such amazing work. You know

(44:00):
which to me is like Sisyphus. And if you can
actually get this rock over the hill, then, you know,
you should get a Nobel Peace Prize. So thank you,
thank you. And anything I can do personally, and my
listeners can do, to scaffold you, we would like
to do. Just tell us how. Well, I mean, the
first thing they can do is go and find us
on Twitter @CCDHate or on Instagram
@CounterHate. So, @CCDHate on
Twitter and @CounterHate on Instagram. And then, you know,
start amplifying our material, get it out to your friends.
The greatest gift you can give someone you love is
the gift of good information. Thank you. My pleasure. First
of all, Imran is such an amazing speaker. I

(44:43):
didn't even want to ask him questions. I just wanted
to hear him talk. Um. You can go to counter
hate dot com and learn more about his not for
profit and geo. And basically, I'm left with the incredible
idea that we are in a much bigger war than
I thought against digital technology. It's completely changed how we communicate,

(45:06):
how we have relationships, our knowledge, our politics. And if
this is going to be the new way we exist,
we need to figure out a way to make these
platforms safe, particularly for our children. We have to come
together and stop all this misinformation and the different platforms
of hate and start to find ways to govern it

(45:30):
in a real and practical way. And I think that
everybody has a hand in this, and everybody has to
join together and set these rules. As always, thank you

(45:50):
for listening to Go Ask Ali. Also, we posted some
links in our show notes if you'd like more information.
Be sure to subscribe, rate, and review the podcast, and
follow me on social media, on Twitter at Ali E
Wentworth and on Instagram at The Real Ali Wentworth. Now.
If you'd like to ask me a question or suggest
a guest or a topic I can dig into, I'd
love to hear from you, and there's a bunch of

(46:12):
ways you can do it. You can call or text
me at three two three, three six four, six three
five six, or you can email a voice memo right
from your phone to Go Ask Ali Podcast at gmail
dot com. If you leave a question, you may hear
it on Go Ask Ali. Go Ask Ali is a

(46:34):
production of Shondaland Audio in partnership with iHeartRadio.
For more podcasts from Shondaland Audio, visit the
iHeartRadio app, Apple Podcasts, or wherever you listen to
your favorite shows.