
May 21, 2024 · 17 mins

Scarlett Johansson says OpenAI ripped off her voice for ChatGPT 4.0 even after she said no. No means no, Sam Altman! 

Here’s what their behavior says about consent in tech.

Scarlett Johansson Says OpenAI Ripped Off Her Voice for ChatGPT: https://www.wired.com/story/scarlett-johansson-says-openai-ripped-off-her-voice-for-chatgpt/

AI Art and the Problem of Consent: https://artreview.com/ai-art-and-the-problem-of-consent/

 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome to
There Are No Girls on the Internet, where we explore
the intersection of technology, social media, and identity and all

(00:24):
of those things intersected in a major way this week
with Scarlett Johansson and OpenAI. So let's talk about it.
Full disclosure, I was already working on putting together an
episode rewatching Spike Jonze's twenty thirteen movie Her, starring Scarlett
Johansson's voice as an AI assistant. I really wanted to
compare and contrast what the movie thought AI integration with

(00:45):
our life would be like and what it actually has
been like ten years later. I'm really excited that the
movie Her is part of the public conversation right now
because it's one of my favorite movies. If you haven't
seen it, I don't want to give too much away,
but Scarlett Johansson is the voice of Joaquin Phoenix's AI software.
The movie imagines a future where AI is less like
Siri and more like a real human. People in the

(01:08):
Her universe fall in love with AI. They have friendships
and real meaningful relationships with AI. And that's partly because
AI sounds like a real human person who speaks to
you and behaves like a person would, not like a
robotic voice. And as I was preparing for that episode,
the whole thing with Scarlett Johansson really blew up. And

(01:30):
the more I thought about it, honestly, the madder I
got. Last night, I was getting ready for bed, and
I was sort of angrily brushing my teeth, and I
found myself thinking about this yet again, and the kind
of chorus in my mind that I kept saying over
and over to myself was that these tech guys just
think they own whatever woman they want. Because to me,

(01:50):
this is not even really about Scarlett Johansson. It is
about what happens when consent in technology is violated again
and again and again, and how it erodes the trust
that we should be able to count on being at
the center of our tech experiences, and how it reinforces
that the most powerful companies in our world who are
shaping our collective futures, consistently demonstrate that they cannot be

(02:13):
trusted to simply respect people, especially when those people are women. Okay,
so here's what's going on. OpenAI, the company that
makes ChatGPT and a major player in the AI space,
has been flirting with integrating voice technology into ChatGPT
since around last year, but last week OpenAI finally
revealed a new conversational interface for ChatGPT that they called

(02:36):
Sky. Yep, just like a lot of voice technology, Sky
has the voice of a woman, but Sky also has
a voice that is really similar to the one that
Scarlett Johansson used to play the AI assistant called Samantha
in the movie Her. But then OpenAI suddenly disabled
this feature over the weekend. Grand opening, grand closing. And

(02:56):
this comes after OpenAI's head Sam Altman, who you
might remember we made an episode about. He was fired
for something, we don't totally know what, but it seemed
to be related to his lack of honesty, and then
he was rehired and is now basically doing whatever the
hell he wants. Well, Sam Altman was talking up this
integration and comparing it to the movie Her and talking

(03:17):
about how we'd finally have AI that felt like a
real human that you could be friends with, which is
a plot line right out of the movie. Which, spoiler alert,
I do think that some of these tech geniuses might
actually be low key misunderstanding the takeaway from the movie.
But anyway, so when they shut down this new voice technology after
Sam Altman was driving so much anticipation about it, everybody,

(03:40):
myself included, was like, what is going on? Like what's
the story there? So then on Monday we get the
real tea, which is that Scarlett Johansson told Wired in
a statement that OpenAI actually reached out to
her to ask her to be the actual voice of
their new conversational interface and she declined twice, and that
OpenAI basically just used her voice anyway,

(04:02):
or at least a voice that sounds a lot like
her voice. And OpenAI's Sam Altman even tweeted a
reference to her work in the movie Her when announcing
that new ChatGPT voice interface, so there isn't really
a ton of plausible deniability on his part, even. Okay,
so this is what Sky, OpenAI's not-Scarlett-Johansson
voice integration, sounds like.

Speaker 2 (04:24):
I don't have a personal name, since I'm just a computer
program created by OpenAI, but you can call me Assistant.

Speaker 1 (04:31):
What's your name? And here is Scarlett Johansson as the
voice of the AI Samantha from the movie Her.

Speaker 2 (04:38):
Well, right when you asked me if I had a name,
I thought, yeah, he's right. I do need a name,
but I wanted to pick a good one, so I
read a book called How to Name Your Baby, and
out of one hundred and eighty thousand names, that's the
one I like the best.

Speaker 1 (04:49):
Well, you read a whole book in the second that
I asked you what your name was?

Speaker 2 (04:53):
In two one-hundredths of a second, actually.

Speaker 1 (04:54):
Wow, it sounds pretty similar to me, and ScarJo
agrees. Here's what she told Wired in a statement.
Last September, I received an offer from Sam Altman, who
wanted to hire me to voice the current ChatGPT 4.0
system. He told me that he felt
by my voicing the system, I could bridge the gap

(05:14):
between tech companies and creatives and help consumers to feel
comfortable with the seismic shift concerning humans and AI. He
said he felt my voice would be comforting to people.
After much consideration and for personal reasons, I declined the offer.
Nine months later, my friends, family, and the general public
all noted how much the newest system, named Sky, sounded

(05:35):
like me. When I heard the release demo, I was shocked,
angered, and in disbelief that Mister Altman would pursue
a voice that sounded so eerily similar to mine that
my closest friends and news outlets could not tell the difference.
Mister Altman even insinuated that the similarity was intentional, tweeting
a single word, "her," a reference to the film in

(05:55):
which I voiced a chat system, Samantha, who forms an
intimate relationship with a human. Two days before the ChatGPT
4.0 demo was released, Mister Altman contacted
my agent, asking me to reconsider. Before we could connect,
the system was out there. As a result of their actions,
I was forced to hire legal counsel, who wrote two

(06:16):
letters to Mister Altman and OpenAI setting out what
they had done and asking them to detail the exact
process by which they created the Sky voice. Consequently, OpenAI
reluctantly agreed to take the Sky voice down. In
a time when we are all grappling with deepfakes
and the protection of our own likeness, our own work,
our own identities, I believe these are questions that deserve

(06:36):
absolute clarity. I look forward to resolution in the form
of transparency and the passage of appropriate legislation to help
ensure that individual rights are protected. So I really applaud
Johansson here, and I think this is the first time
that there has been a legal dispute over a sound
alike that is, as far as we know, not AI generated,
and I think it could set a precedent for this

(06:57):
kind of thing going forward, especially for voice actors and
creative professionals who can't afford lawyers' fees or a big
lawsuit if their likeness or voice is used this way
without their consent. Her statement is also just a good
reminder that Johansson has been here before. She is one
of the most targeted celebrity figures for AI deepfaked images.
So finding out that OpenAI actually asked Scarlett Johansson

(07:21):
to work on this twice and when she said no,
they just found a sneaky workaround to do it anyway
enrages me. It enrages me as a voice professional, it
enrages me as a creative, and it enrages me as
a woman. You know, when I say on the show
that the exploitation of women is baked into technology in
a lot of ways from the ground up, that these

(07:41):
are features and not bugs. This is a great example
of what I mean. It matters that a company like
OpenAI would build their anticipated voice system in a
way that has the exploitation of a woman baked into
its earliest foundation, and this is not happenstance. It colors
how they see women and other marginalized people as just

(08:01):
available to take from in service of them making money
to create their vision, a vision that by design ignores
and exploits us. Like, don't these people understand that no
means no? I should say that OpenAI says that
they did not actually steal her voice, but I also
want to say that I one hundred percent do
not believe them at all. Here's OpenAI's statement. We

(08:24):
support the creative community and worked closely with the voice
acting industry to ensure we took the right steps to
cast ChatGPT's voices. Each actor receives compensation above top-of-market
rates, and this will continue for as long
as their voices are used in our products. We believe
that AI voices should not deliberately mimic a celebrity's distinct voice.
Sky's voice is not an imitation of Scarlett Johansson, but

(08:46):
belongs to a different professional actress using her own natural
speaking voice. To protect their privacy, we cannot share the
names of our voice talents. So here's my opinion about
what's actually going on. I believe that they probably did
work with a human voice actor, and they probably intentionally
picked a voice actor that sounded a lot like Scarlett Johansson,

(09:06):
and I think they had this person ready to go,
whether Scarlett Johansson agreed to do this or not. Like,
I don't think they really cared about actually having Scarlett
Johansson's permission, and they were going to either use this
sound alike or use Scarlett Johansson's real voice. Because in
addition to his single word Her tweet, Sam Altman, the
head of OpenAI, also said that the new AI

(09:28):
voice technology quote feels like AI from the movies. OpenAI's
chief technology officer, Mira Murati, said that that was
all a coincidence, but even still, it's like they want
to have it both ways. They obviously want us, the
public, to be associating their new technology with the AI
in the movie Her, and they're clearly trying to capitalize

(09:49):
on that for this rollout, But they want to have
all of that and benefit from all of that without
actually having the consent from the real human woman behind
the voice in the movie that they're referencing. As Bethenny
Frankel might put it, it is a cheater brand. Let's
take a quick break. And we're back. So I'm not an attorney,

(10:24):
so I can't really speak to the legal frameworks at
play here, but I do think it is a good
opportunity to talk about consent in technology. So in my opinion,
looking at the facts and the timeline, I believe OpenAI
picked a voice actor that sounded a lot like
Scarlett Johansson to be able to complete this vision that
they had been hyping up that merges creativity and technology

(10:45):
for their voice tech. And I think they were betting
that legally they were in the clear because they were
using a sound alike. But beyond what might happen in
a lawsuit, to me, this is the classic story of
someone trying to cover their ass after they weaseled out
of getting consent. Like the question that OpenAI is asking
is probably would we lose a lawsuit? The questions that

(11:07):
they're not asking are: Is this ethical? Are we crossing
the boundaries of what this person said they don't want
to do? Or questions like, are we further damaging the
trust between the technology that we make and the people that we
want using it? And these are important questions that they
should be asking, because once that trust is damaged, it
is not easy to win back. And the entire thing
makes me incredibly uncomfortable because I can't not see that

(11:30):
we're talking about a woman whose body is being used
to financially benefit people who are mostly men. And these
people who are mostly men have appointed themselves the architects
of our future and have shown themselves to be the
kind of people who are not interested in clear affirmative consent.
I can't help but think that they feel as entitled
to the body and likeness of a woman as they

(11:52):
do to everyone's data and information online that they use
to train their AI without consent for their own profit.
It's about entitlement, and why would anybody watching trust these people?
If OpenAI thinks, like their statement says, that this
was done working with the creative voice talent industry and
meant to be done in a respectful way, I think

(12:13):
it's fair to say that as an industry, we are
not buying it. We're smart enough to see these tricks
and know that if this is how you're moving that
none of us should trust you, and I think that
open ai really needs to understand and internalize why we
are also skeptical. It's because of their own behavior. This
is a natural consequence of their actions. Scores of writers

(12:36):
and artists and creative professionals have already had to sue
OpenAI, specifically for using their intellectual property to train
their AI without their permission, and a big bulk of
the Hollywood writers' strike was about AI stealing the work
of creative professionals and then being used to replace those
same creative professionals. Like, OpenAI doesn't get to act

(12:57):
all shocked that people are then naturally skeptical of their
intentions. And there just is a better way. On the
podcast I make with the Mozilla Foundation called IRL that
explores ethics in AI, we spoke to artists Matt Dryhurst
and Holly Herndon, who are building AI consent systems at
haveibeentrained.com that allow people to search popular

(13:20):
AI training data sets to see if their work shows
up in them and make it easy for people to
opt out of having their work used to train AI models.
Holly is a musician and she built this publicly available
version of her own voice that anybody can play with.
So they're both big advocates of what can be created
using AI when consent is at the forefront. In a

(13:41):
piece for ArtReview called AI Art and the Problem
of Consent, they underscore the importance of the intersection of
AI and creativity being grounded in consent, writing: My hope
is that a new era of abundant media will affirm
this social value of art and artists, and this is
why establishing new protocols of consent is so vital. I
am a deep believer in AI augmented expression and have

(14:03):
no desire to limit experimentation or enshrine outdated IP laws.
I would simply like to avoid what a world looks
like in the absence of consent. When consent is absent,
beautiful relationships and connections are stymied that could instead have
been nurtured. I hope the dress rehearsal of an AI
art future we are witnessing highlights what an opportunity we
have to finally fix those damned broken links. And I

(14:26):
think that's exactly what we have here. Trust has been eroded,
consent has been violated, and those links have been broken.
So what's next, while Scarjo is not backing down and
Homegirl is notoriously not afraid of a lawsuit. This is
the woman who mounted a legal contract dispute against Disney,
a notoriously litigious company, and walked away with a settlement.

(14:49):
So I don't see Scarlett Johansson's team backing off without
a fight. OpenAI did pause the use of Sky
and apologized directly to Johansson, saying we are sorry to Johansson
that we didn't communicate better, and I want to get
into all of this more in our episode breaking down Her.
But even the conversation about them wanting to use Scarlett
Johansson's voice tells me a lot about how OpenAI

(15:12):
is thinking about technology in this moment. It's like they
want to give us the illusion that they're finding thoughtful
ways to link creativity, connection, and technology in our futures,
which is why they're trying to reference Johansson's voice from Her,
a movie that is at its core about sad, lonely
creative people surrounded by technology and searching for connection. But

(15:33):
it's just that, like, a shallow, empty movie reference. OpenAI
cannot actually give us connection or community. The best
they can do is a movie reference that uses a smooth,
stolen voice to kind of mimic those things if you
don't look at it too closely. Sam Altman told Scarlett
Johansson that using her voice would give people comfort and

(15:54):
ease around this technology, people who maybe were a little
bit anxious about how quickly AI technology has proliferated, and
it's like they can't actually sort out a way to
make technology that gives people a genuine reason to feel
comfort about it. They can only offer a quick workaround,
and worse, in trying to reference this, tech leaders are

(16:14):
showing that they think it's okay to do all of
this without a strong foundation of consent. They think they
have rights to whatever woman's body or whatever woman's voice
they want without permission. And none of this bodes well
for the kinds of futures they're going to use technology
to try to create. And if these are the people
who have appointed themselves to build all of our futures,

(16:35):
then for everyone's sake, they have got to be better.
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at
Hello at tangoti dot com. You can also find transcripts
for today's episode at tangoti dot com. There Are No
Girls on the Internet was created by me, Bridget Todd. It's

(16:56):
a production of iHeartRadio and Unbossed Creative. Jonathan Strickland is
our executive producer. Tari Harrison is our producer and sound engineer.
Michael Amato is our contributing producer. I'm your host, Bridget Todd.
If you want to help us grow, rate and review
us on Apple Podcasts. For more podcasts from iHeartRadio, check
out the iHeartRadio app, Apple Podcasts, or wherever you get
your podcasts.