
May 26, 2024 • 5 mins

Em Gillespie joins Jonesy & Amanda with the latest in entertainment. 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
I don't entertainment, put on your dance and shift.

Speaker 2 (00:05):
Don't give me your.

Speaker 3 (00:06):
Fast shot from The Daily Aus, she's here: Emma Gillespie. Hello.

Speaker 2 (00:11):
Good morning. Well we're talking AI this morning, as we
so often are, but there've been a couple of big,
high profile AI fake voices making the headlines, including our
own Sandra Sully. This is a really interesting one. I
want to play you a grab that was used in
a true crime podcast.

Speaker 1 (00:30):
Saint Patrick's college teacher arrested and facing multiple alleged child
abuse charges.

Speaker 2 (00:36):
That's not Sandra Sully. So in this podcast they say
before they go to the grab, this is an AI
generated voice reading out a headline.

Speaker 4 (00:46):
So it's not generic, then; they've modeled Sandra's voice.

Speaker 2 (00:49):
They disclose it as AI, but they don't say that they've modeled Sandra. Clearly, it sounds like that iconic voice we know. Sandra actually did an interview with The Project talking about it, and she said she didn't feel violated, but she felt more concerned about the future of news and what that means. Here's a little bit.

Speaker 1 (01:05):
So if, you know, I'm arguably some sort of trusted voice in news, then how can that be manipulated? And for what purposes? Good, bad or evil? And that's pretty scary.

Speaker 3 (01:17):
Amanda, you do a great Sandra. Oh, not as good as that.

Speaker 4 (01:21):
So six people were taken out of the forest in body bags.

Speaker 2 (01:26):
Oh my god, Amanda Keller, that was fantastic and terrifying. I would have sworn that was an AI.

Speaker 4 (01:31):
But that is scary, because people won't know the context. They will hear that on a podcast and think it's Sandra. And it may be a benign circumstance in this one, maybe not another time.

Speaker 2 (01:40):
Well, exactly. We've seen another really high-profile similar story in the news this week. Scarlett Johansson has accused OpenAI. Now, they're the tech company that owns and created ChatGPT. They've released a new voice. She's called Sky and she's an AI bot assistant. Scar Jo says that she sounds just like her. She's released a statement saying that it sounds eerily similar to her, and it's particularly interesting because she

(02:03):
turned down an offer from OpenAI to be that voice. So we've got some grabs here. We've got the AI voice, and whether or not you think it sounds like Scarlett Johansson, and also, playing me off, Ryan? Blame Ryan. We've also got a grab of the movie Her, which you might remember from twenty thirteen, and Scarlett Johansson voiced

(02:24):
an AI assistant in that movie, so a bit of art imitating life. So this first bit is the AI.

Speaker 3 (02:30):
This is the AI. I'm on stage right now, I'm doing a live demo, and frankly, I'm feeling a little bit nervous.

Speaker 1 (02:35):
Can you help me calm my nerves a little bit? Oh, you're doing a live demo right now?

Speaker 4 (02:41):
That's awesome.

Speaker 1 (02:42):
Just take a deep breath and remember you're the expert.

Speaker 3 (02:47):
Now that sounds... that sounds faintly like her? I can't tell.

Speaker 4 (02:49):
So here's the actual Scarlett Johansson, then.

Speaker 2 (02:51):
Good morning, Theodore. Good morning. You have a meeting in five minutes.

Speaker 3 (02:55):
You want to try getting out of bed? Too funny.

Speaker 4 (03:00):
So I wouldn't have said they're identical, but she claims that all her friends and family and people in the industry think it sounds like her.

Speaker 3 (03:06):
Yeah.

Speaker 2 (03:06):
I think that stand alone, that the voice doesn't sound
exactly like her, But given that they tried to get
her to voice it, I think that's where the concern is.
I mean, it does sound different enough, but you can
hear that kind of warm raspy sort of Scarlet Johansson
tone open AI have paused the voice, so they've said, fine,
we won't run it because Scylet Johnson's threatened legal action.

(03:29):
But they also said that it's not her voice, it's not supposed to be her voice, and that they put out an open casting call to actors months before they reached out to Scarlett Johansson, allegedly. Who knows if that's true. But funnily enough, in that open casting call they specifically called for actors who were not in the union, not in SAG-AFTRA, because they can exploit them. And SAG-AFTRA went

(03:50):
on strike really hard last year for rights for voice actors and AI.

Speaker 3 (03:54):
They could be mimicking us. You wouldn't know.

Speaker 4 (03:56):
I saw this great thing the other day about AI.
They say, you know the biggest problem with pushing all things AI? Wrong direction. I want AI to do my laundry
and dishes so I can do art and writing. Not
for AI to do my art and writing so I
can do laundry and dishes, and so on.
Humans doing the hard jobs on minimum wage while the
robots write poetry and paint is not the future I want.

Speaker 3 (04:15):
That's not the future.

Speaker 2 (04:17):
Point well, thinking about the future scarily enough. You know,
next year we've got a federal election due and the
Australian Electoral Commission even came out last week and they
warned that ahead of the election, we're going to see
a huge rise of AI voices and deep fakes in
the political space, and that they don't have any power
to do anything about it. There aren't laws in Australia

(04:38):
that are strong enough to prevent these kinds of deep fakes.

Speaker 4 (04:41):
Or laws anywhere. So America's having an election, India's having an election, Britain's having an election, Australia's having an election next year, and they're expecting AI deepfakes and misinformation everywhere.

Speaker 2 (04:53):
Well, there was that call in the US, a voicemail thing that was Joe Biden calling people telling them how to vote, and that was a deepfake. We've already seen lots of examples of how it can impact politics, but now that's crossing over into pop culture and the arts, you know, who knows where.

Speaker 3 (05:07):
Don't believe everything you hear. So for example, you hear this,
you know that that's fake.

Speaker 4 (05:14):
I can smell it.

Speaker 3 (05:16):
Fascinating. Em, thank you, thank you. Check out Em, editor of The Daily Aus.