
June 28, 2025 8 mins

The Danish Government is set to allow citizens to claim property rights over their features and voices in a bid to tackle the impact of AI deepfakes.

The proposed legislation would give people the right to ask platforms to take down deepfake content that infringes their copyright.

Copyright lawyer Rick Shera is sceptical about the prospect of this new law making a difference.

"The real issue for me is that it doesn't really address the underlying problem - which is the harm that's caused by deepfakes of a particularly intimate nature."




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
You're listening to the Sunday Session podcast with Francesca Rudkin
from Newstalk ZB.

Speaker 2 (00:12):
Denmark is making moves to allow people to copyright their own features. The Danish government has made the decision in an attempt to fight AI and online deepfakes. So changes to their copyright law would give people the right to their own body, facial features and voice. It's believed to be the first law of its kind in Europe. So is this a move in the right direction? To discuss,

(00:34):
I'm joined by leading internet and copyright lawyer Rick Shera.
Good morning, Rick, how are you?

Speaker 3 (00:39):
Good morning.

Speaker 2 (00:40):
What do you make of this move by Denmark?

Speaker 3 (00:43):
Yeah, I think it's very interesting. I actually don't think it's going to work. The problem with it is this: normally with copyright, if you take a photo, you, the person who takes the photo, own the copyright in the photo. In this case, what they're trying to do is turn that on its head for what they're referring to as

(01:04):
deepfakes. Of course, deepfakes can take all sorts of forms, and the problem I see is, what does this mean if I take a photo of a few friends at a party and then post it online? Does that mean that I'm infringing their copyright? What about a photographer who takes a photo of something on the street, a public

(01:26):
news event; does that mean they have to get the permission of everybody to publish the photo? I can see a lot of issues with it, and the real issue for me is that it doesn't really address the underlying problem, which is the harm that's caused by deepfakes of a particularly intimate nature.

Speaker 2 (01:42):
It's a very preventative kind of approach, isn't it? I presume, though, that you would have to have altered it, or used AI, or turned it into something for it to become an issue. Like if, as you say, you were at a party and people posed for a photo and you put that photo online, that would probably be okay. If you took that photo and used AI to manipulate it and made everyone naked, then yeah.

Speaker 3 (02:05):
It's unclear from the Danish proposals whether that's what they're suggesting. It seems to be that they're just suggesting that, look, I will own my image, and therefore, if you want to use my image, you have to get my permission. Now, there is a form of that in the United States, where you have a personality right: if you want to commercially exploit someone else's image, then you have to get their permission. That's commercial exploitation. We've also had various

(02:29):
issues both in New Zealand and overseas in terms of using photos in ways which are objectionable. The old case of Mike Hosking and his kids being photographed in the street created a privacy tort for us, where photos which are taken in a public place could be deemed to be objectionable if they were used in that sort of way.

(02:51):
So there are all sorts of areas that might impact here.
I do think though, that really you have to focus
on the harm that's caused as opposed to just some
sort of blanket prohibition on people using other people's likenesses.

Speaker 2 (03:05):
Okay, so we've had MP Laura McClure on the show. She was on a few weeks back with her proposal around deepfakes. What she's proposing is a little bit different, but is that a better approach?

Speaker 3 (03:17):
I think New Zealand is interesting in the sense that we have an online piece of legislation, the Harmful Digital Communications Act, which is quite unique in the world and quite groundbreaking. That was brought in in twenty fifteen. It needs some tweaks in this area. I

(03:40):
think AI has created a gap in that legislation. We introduced section twenty-two A a little while ago for intimate visual recordings. The problem is that an AI-generated or manipulated image may not be a recording in the sense that the Act uses it. So that's what Laura McClure's Private Member's Bill is trying to address. It's to say, well, look,

(04:04):
if we thought that publishing intimate visual recordings was bad, then what we need to do is tweak it to make sure it covers these AI-generated types of images.

Speaker 2 (04:14):
The bill currently lacks government support here. Would you like to see more action around this? Do we need to act? You know, I think we tend to sit back on issues around technology and AI, right, and things pass us by quite quickly, don't they?

Speaker 3 (04:30):
They do. And you know, the law is a very cumbersome instrument to deal with technological advance in the way that AI is advancing. But I mean, copyright itself is up for grabs in terms of AI. There are cases coming out every day now, Meta and Anthropic, two major cases in the United States, holding that in some senses AI use of copyright material may be

(04:55):
infringing if it's been scraped. So there's all sorts happening in the area. But I think, yes, as I say, we saw it as important to introduce law which addressed the taking of intimate visual recordings. All that we're doing here is tweaking it to make sure that

(05:15):
law has the proper ambit to cover manipulated images. So
I don't think that's a big stretch, and I think
it's quite appropriate.

Speaker 2 (05:24):
With the Danish plan, part of it will also cover realistic digitally generated imitations of an artist's performance without consent, which is quite an interesting move, isn't it? They're trying to cover a lot of bases here.

Speaker 3 (05:37):
Yeah, and that's the problem, I think. You know, copyright is a very nuanced area, and as I say, AI at the moment is completely throwing the copyright balls into the air, as it were. So throwing this into the mix as well just creates another problem. I mean, they've said, for example, that they wouldn't be restraining satire

(05:57):
or parody. But how do you define that? It's easier to recognise it when you see it than to describe it, and when you see an intimate visual recording, that is pretty obvious. But what about, I mean, the in-the-news example, they've had that picture of the Pope dressed in a puffer jacket. Well, is that what

(06:18):
they're talking about? Because that doesn't seem to me to be something that copyright law should address. It looks like parody. So if that's going to be an exception, well, where does the exception go? And that's always a difficulty when you're creating these very wide laws: you end up having to create so many exceptions that the law is too difficult to enforce. And actually,

(06:39):
that's one other area where copyright is problematic: copyright infringement is not a crime, and therefore it would be a self-help remedy. The better approach, I think, for these sorts of very harmful intimate digital recordings is to treat them as an offence,

(07:00):
to treat them as a crime, and therefore that doesn't rely on the victims having to take their own action.

Speaker 2 (07:05):
Rick, is this something which one country can solve? So like, if Denmark puts this into place, they're not going to have any control over content posted outside of Denmark, right? So is it something that an individual country can come up with a solution to, or do you need the buy-in from tech companies as well? Would we all kind of need to be on the same page, do you

(07:26):
know what I mean?

Speaker 3 (07:28):
I think the reality these days is that intellectual property laws, copyright, trademarks and so on, are jurisdictional. They're done country by country. There are international treaties which give some form of reciprocity, but the reality is that, as we speak at the moment, the platforms are more powerful than the legislation. So I think

(07:51):
the way to address these is to have the platforms take some responsibility, and that's what the Danish law is suggesting; I think the platforms would be required to take this material down. I don't think copyright is the way to do it, but I do think that having laws which make it an offence to do these sorts of things, and then having the platforms required to take the material

(08:13):
down is the right way to go.

Speaker 2 (08:15):
Rick, really appreciate your time this morning. Thank you so much.
That was internet and copyright lawyer Rick Shera there.

Speaker 1 (08:21):
For more from the Sunday Session with Francesca Rudkin, listen live to Newstalk ZB from nine am Sunday, or follow the podcast on iHeartRadio.