Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Joining us now: Young Voices contributor and author of Montalbano
Mondays, Sarah Montalbano.
Speaker 2 (00:05):
Good morning. Thanks for having me.
Speaker 1 (00:07):
Absolutely. So, I wanted to get your thoughts on what's
happening with this Take It Down Act. Explain this: is this what's
going on when it comes to social media and AI
out there, where people will, you know, take images of
people, their face, and put them into, you know, sexual
situations or just, you know, compromising situations, or make them...
(00:29):
I mean, we're not talking about the little babies that
look like Donald Trump walking around talking about policies. We're
talking about something destructive here.
Speaker 2 (00:35):
Right, yeah, absolutely. I'm glad you're talking about this,
because there are a couple of different efforts going on in
Congress right now on, you know, digital privacy and what
we do about these AI deepfakes, as they're calling it,
where it's, you know, a non-consensual image of someone
who obviously has not consented to be
(00:55):
in this. You know, the Take It Down Act really
is focusing on those intimate images. Another one that is
important is the No Fakes Act, which creates kind of
a right to a digital replica, and that can go,
I think, in two broad directions. But there are a
couple of interesting things going on to, you know, require
(01:17):
notice-and-removal of, you know, these images, where
the tech platforms end up being liable for this, frankly. So, yeah,
there are a couple of interesting things going on.
Speaker 1 (01:30):
Well, you know, when you've got these images out there
like this, and people, I mean, AI is everywhere, people
are doing all kinds of things with it, you know,
fun things and also horrifically bad things, and
that's what this is all addressing here. So what happens
when these images pop up? Who's liable all of a sudden?
Are we going to be seeing court case after court
(01:52):
case, with social media platforms like TikTok and X and
Facebook, and, you know, are they liable for allowing this
crap on?
Speaker 2 (02:01):
Yeah, in a lot of ways they are. And it's
a very tight window, actually, to take these things down.
The Take It Down Act requires removal within forty-eight hours
of receipt of a request, and what that happens to do,
and this is also true for the No Fakes Act,
which I've studied more heavily, what that does is it
(02:22):
puts the burden on the tech platforms in a lot
of ways to ensure not only that they are not
allowing uploads of these things, but that they are also preventing
future uploads of that same content. So what I think
is going to happen here is we're going to see
a lot of kind of preventative stuff, where, you know,
maybe, you know, this is meant to be a
(02:45):
parody of some sort. Maybe there's an image
that is supposed to be, you know, falling under First
Amendment protections or parody law or what have you. But
there are going to be a lot of tech platforms that
are overly encouraged to take down materials as a preventative
measure. I think that's what's going to happen, for sure, if these
(03:06):
laws go into effect.
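A minimal sketch of what that notice-and-takedown burden might look like in code, assuming a hash-based blocklist; the forty-eight-hour deadline comes from the Take It Down Act as described above, but every function name and data structure here is a hypothetical illustration, not anything either bill prescribes:

```python
import hashlib
from datetime import datetime, timedelta

# Hypothetical in-memory stores; a real platform would persist these in a database.
TAKEDOWN_WINDOW = timedelta(hours=48)      # removal deadline described in the interview
blocked_hashes: set[str] = set()           # fingerprints of content already taken down
pending_notices: dict[str, datetime] = {}  # content_id -> time the notice arrived


def fingerprint(content: bytes) -> str:
    """Exact-match fingerprint. A real system would likely use a perceptual
    hash so that trivially re-encoded re-uploads still match."""
    return hashlib.sha256(content).hexdigest()


def receive_takedown_notice(content_id: str, content: bytes) -> datetime:
    """Record a notice, block the content's fingerprint, and return the
    deadline by which the platform must complete removal."""
    received_at = datetime.utcnow()
    pending_notices[content_id] = received_at
    blocked_hashes.add(fingerprint(content))
    return received_at + TAKEDOWN_WINDOW


def allow_upload(content: bytes) -> bool:
    """Gate new uploads: re-uploads of already-removed content are rejected."""
    return fingerprint(content) not in blocked_hashes
```

Notably, nothing in a flow like this validates the claim before blocking, so a frivolous notice removes content just as effectively as a legitimate one, which is the over-removal incentive discussed next.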
Speaker 1 (03:08):
Well, let's talk about that First Amendment, I mean, you know,
versus disgusting, you know, content, like you said. Yeah, there's a
lot of things that are public out there, and, you know,
satire is fun. But who decides what decency is? Because
I've got to tell you, there's a lot of stuff
out there that a lot more people would say is
(03:30):
not decent, get rid of it, and others are still
letting it fly.
Speaker 2 (03:34):
Yeah, it's an important thing to talk about. You know,
I think that the No Fakes Act in particular
probably goes a little too broad in
its attempt to keep things very, very decent for
people on the internet. You know, the whole point of
(03:56):
these laws is that they rely on someone to make a
report that, you know, their likeness has been used
without consent, or that this is, you know, obviously a
non-consensual image. And what we'll see is there's
a lot of incentive to make the claim first, even
if it's misleading, it's misinformed. You know, it could be
(04:18):
entirely false, but the onus is on the tech companies
to take it down, and that really creates the heckler's veto.
You have to take down this content when you receive
a notice, pretty much regardless of whether the claim is
frivolous or mistaken, because they face fines if they delay
or they're seen as not acting in good faith under
the No Fakes Act. So that, I think, is dangerous
(04:41):
in itself, as kind of a censorship risk. You know,
obviously I think there's a smell test, right, where you say, oh yeah,
this is obviously not going to, you know, meet this
satire, First Amendment benchmark. But the problem with it is
that there's not any time allowed to have this play out.
It just has to be taken down pretty much
(05:04):
immediately.
Speaker 1 (05:05):
Well, I think we all know, you know, where that
line should be. But, you know, some people say, well,
I don't care that we're not going to have a
line, because it's freedom of speech, and, you know... Well, Sarah,
thank you so much. I really appreciate your input this
morning on all of this. Young Voices' Sarah Montalbano, and
once again, check her out at Montalbano Mondays.