April 4, 2025 32 mins

On this month’s show, we’re joined by Alia Al Ghussain, a researcher and advisor with Amnesty Tech, who talks about the huge need to centre human rights in the technology sector.

“This is a bad situation and it’s primed to get even worse,” she says of Meta’s move earlier this year to end its practice in the US of fact-checking information posted on its platform. According to Alia, the development is a backward step designed to curry favour with the Trump administration: “It marks a very clear retreat from the company’s previously stated commitments to responsible content governance and I think it also shows that Meta hasn’t learned from its previous recklessness, or if it has learned anything that those learnings have been discarded, basically.”

Amnesty Tech has previously investigated and released reports on the role that various tech companies have played in human rights abuses. One of the first major reports to shed light on the issue – the responsibility that digital platforms now need to own – examined the persecution of the Rohingya Muslim community in Myanmar, looking at what role Meta played and how hate speech on its platform promoted violence against the Rohingya. The report found that, “Meta’s algorithms proactively amplified and promoted content which incited violence, hatred, and discrimination against the Rohingya – pouring fuel on the fire of long-standing discrimination and substantially increasing the risk of an outbreak of mass violence.” And it concluded that: “Meta substantially contributed to adverse human rights impacts suffered by the Rohingya and has a responsibility to provide survivors with an effective remedy.” Another major report Alia was involved in looked at Facebook’s role in contributing to violence during the brutal two-year conflict in Ethiopia’s northern Tigray region, which began in 2020 when the Ethiopian government launched military operations there against the region’s ruling party. Amnesty concluded that Meta had once again – “through its content-shaping algorithms and data-hungry business model” – contributed to serious human rights abuses.

In this episode, Alia explains how big tech players make their money from users’ data and discusses the harmful impacts of algorithmically curated content. She also explains how content goes viral, emphasising that it’s not necessarily because the content is good, but because it elicits an emotional reaction amongst users and generates a lot of engagement.

Ultimately, Alia believes that digital platforms need to be redesigned with human rights at the centre and she calls for more governance in this area: “I think that governments and some regional bodies, like the EU, really need to double down their efforts to rein in big tech companies, like Meta and others, and also to hold them accountable… because this is about people’s lives at the end of the day. This isn’t an intellectual debate.”

Presented and produced by Evelyn McClafferty

With thanks to our donors: Irish Aid.

Note: The views and opinions expressed in this episode do not necessarily represent those of IRLI or Irish Aid.
