Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
On this episode of Newt's World. On Monday, President Donald Trump
signed the Take It Down Act, bipartisan legislation that enacts
stricter penalties for the distribution of non consensual intimate imagery,
sometimes called revenge porn, as well as deep fakes created
by artificial intelligence. The legislation, which goes into effect immediately,
(00:25):
was introduced by Senator Ted Cruz, a Republican from Texas,
and Senator Amy Klobuchar, a Democrat from Minnesota, and gained
the support of First Lady Melania Trump. The law makes
it illegal to knowingly publish or threaten to publish intimate
images without a person's consent, including AI created deep fakes.
(00:47):
It also requires websites and social media companies to remove
such material within forty eight hours of notice from a victim.
Here to talk about the Take It Down Act, I'm
really pleased to welcome my two guests, Senator Ted Cruz,
representing the great State of Texas, and Scott Berkowitz, founder
and president of RAINN. Welcome to Newt's World.
Speaker 2 (01:18):
It's great to be with you, Newt.
Speaker 1 (01:19):
I'm really curious how did you become involved in the
Take It Down Act? How big of an impact was the
meeting you had with Elliston Berry and her mom, after
Snapchat refused for nearly a year to remove an
AI generated deep fake of the then fourteen year old.
Speaker 2 (01:37):
What had happened to Elliston was infuriating and it was wrong.
And over a year ago, Elliston was then fourteen. She
was a freshman in high school and one morning her
phone started blowing up. All her friends were texting her.
And what had happened is a classmate of hers, who
was a boy in her class, had taken a perfectly
(01:58):
innocent picture of her from social media and had used
an app that uses AI or artificial intelligence, to create
a deep fake, and then sent to all of her
classmates in ninth grade what appeared to be naked pictures
of Elliston. And the technology is such that it's not
a photoshop with someone's head stuck on a body, but
(02:19):
it appears perfectly real. But it was entirely fake, and
Elliston understandably was in tears. Listen, it's hard to be
a teenage girl today. As the father of two daughters,
I know the pressures that are on particularly our girls.
It was even more frustrating because the way the law was,
what the boy had done was not illegal. It was
perfectly legal to do that. The Take It Down Act
(02:42):
corrects that now. And what it does is two things. One,
it makes it a crime, a felony to post non
consensual intimate imagery, either real pictures so called revenge porn,
or in Elliston's case, deep fakes of real people. But two,
and this goes right to your question, Newt, it puts
a legal obligation on the tech platforms to pull the
(03:05):
content down.
Speaker 1 (03:06):
When you decided to sponsor the Take It Down Act,
I notice you got your colleague, Senator Amy Klobuchar. How did
you and Amy come together to make this a bipartisan bill.
Speaker 2 (03:16):
The way the issue came to my attention initially is
because Elliston is a Texan and her mom, Anna is
a Texan. They live in Aledo, Texas, which is North
Texas outside Dallas Fort Worth. And her mom called my
office and said, you're my senator, can you help me?
And my staff elevated it to me and we looked
at what happened to Elliston and thought it was horrible,
(03:38):
and we discovered this was a phenomenon that was happening
all over the country. Last year there was a three
thousand percent increase in deep fakes, and over ninety percent
of the victims are women and teenage girls. And so
we drafted the Take It Down Act to fix this problem.
And Amy Klobuchar and I worked together quite a bit.
We're on two committees together. We're on Judiciary together, and
(04:00):
we're on Commerce together. I'm the chairman of the Senate
Commerce Committee. And she and I have had a good
relationship for thirteen years, and so I approached Amy. She's
been very active in the tech space, as have I,
in reining in abuses of big tech, and I asked,
you want to join me on this and she said yes.
And in fact, the first time I met Elliston and
Anna was when they came to d C to join
(04:22):
Amy and me for the press conference announcing the bill.
And we're sitting in my office and I asked, I said, well,
whatever happened to the pictures? And Anna, Elliston's mother,
expressed enormous frustration. She said, you know, this was nine
months ago. This happened and she said, I've been on
the phone, I've been emailing Snapchat over and over and
(04:44):
over again, and I've been running into a complete stonewall.
And, Newt, I turned to my staff, I said,
I want you to get the CEO of Snapchat on
the phone today. I want those pictures down today. Within
two hours, they pulled them down. But it shouldn't take
a sitting senator making a phone call to make that
(05:04):
happen. And the Take It Down Act, now that President
Trump has signed it into law, gives a legal
right to any victim that if you notify a tech
platform that picture is me. It is explicit, and you
do not have my consent, they have a federal statutory
obligation to take it down immediately.
Speaker 1 (05:24):
It's impressive that at a time of great partisanship, the bill
that you developed passed the Senate unanimously and then passed
the House in a four hundred nine to two vote.
I mean, there are not many things that have that
level of unanimity. So obviously you had found a sweet spot.
And I'm curious, how did you craft this to fit
within First Amendment rights?
Speaker 2 (05:45):
Well, we made it explicitly focused on non consensual intimate imagery.
And there are a lot of laws across the country
that deal with so called revenge porn, where say a
boyfriend and girlfriend are in a romantic relationship and they
choose to take explicit pictures or videos, and then they
have a bad breakup and one or the other decides
(06:07):
to post that to the world, to stick it to
their former boyfriend or girlfriend, and that I think is
a grotesque violation of privacy. And those laws have been
upheld across the country because you don't have a right
to do that to somebody else. Texas has a revenge
porn law. The problem is Texas's law did not cover
deep fakes. Deep fakes are a new enough phenomenon that
(06:30):
most of the laws in place don't address them. That's a hole
in the law, and so this did not meet the legal
definition of child pornography, even though these appeared to be
naked pictures of a fourteen year old girl and a
real fourteen year old girl. Because they were deep fakes,
they didn't fall under the definitions of child porn, and
so we define it. It's clear the First Amendment does
(06:51):
not cover the right to put out revenge porn and
target an individual. And it's also clear the First Amendment
doesn't cover child pornography, and so we put deep fakes
in that explicit category. And I'll tell you, on the
take it down obligation, Newt, what we did is we
borrowed from an existing legal framework. So there's a long
(07:12):
standing law called the Digital Millennium Copyright Act that if
you tweet out a song from the Lion King, they'll
take it down within hours because you don't have a
right to violate someone else's copyright or trademark. And so
every tech platform has an office that deals with notice
and takedowns where they get notified Okay, this is copyrighted
(07:35):
material and they pull it down. Well, what we did
is we put this in that same bucket. So that
same office that is pulling down the tweet of the
Lion King. Now if a teenager or a woman notifies, hey,
this is nonconsensual intimate imagery, that same office and that
same mechanism has the statutory obligation to remove the content.
Speaker 1 (07:56):
I know that as you were developing this, First Lady
Melania Trump had been leading her Be Best campaign. One
of her concerns was online safety. How helpful was she
in drawing attention to the whole concept of take it Down?
Speaker 2 (08:09):
So the First Lady was incredibly helpful. In the last Congress,
we passed Take It Down through the Senate, passed it unanimously,
and then in the House, unfortunately it failed to get passed.
House leadership had added it to the Continuing Resolution in
December that was going to pass, but then if you remember,
that Continuing Resolution got pulled down and much of the
(08:32):
stuff that had been added got stripped out of it.
So it did not pass last Congress. And so this Congress,
starting in January, we passed it again through the Senate,
and the real challenge, Newt, you've been Speaker, you know,
House leadership has a million different demands in terms of
where to allocate floor time, and so the challenge was
(08:54):
getting this to rise up the priority so that it
gets a vote on the House floor. And the First
Lady reached out and called my office and said she
was very interested in this bill and she wanted to help,
which I thought was fantastic. So I invited the first
Lady to come to Capitol Hill and we did a
roundtable with the victims. And so the first Lady heard
(09:14):
from Elliston directly, heard her story. She heard from Francesca Mani,
who is a fifteen year old girl in New Jersey
to whom the exact same thing happened that happened
to Elliston. She also heard from Brandon Guffey, who is
a state rep in South Carolina, whose story is even
more tragic. His son got a direct message from what
(09:36):
he thought was a cute girl, and this supposed cute
girl convinced him to send naked pictures of himself to her. Well,
it turns out that that cute girl was in fact
a con man who began sextorting him and saying, send
me money now, or I'm going to take these naked
pictures you just sent, I'm going to send them to
your parents, to your family, to your friends. And tragically,
(09:59):
this young man, Gavin, took his own life. He committed suicide.
And Brandon told that story. The time between the first
direct message from the con man and when his son
took his life was ninety minutes. And we've seen suicides
all across the country from this growing problem. And so
when the first lady came and heard these stories firsthand,
(10:22):
and she leaned in at the roundtable, the Speaker of
the House was there, Steve Scalise, the Majority leader, was there,
Brett Guthrie, the relevant committee chairman in the House was there,
and Melania asked them, will you commit to getting this done,
and House leadership said absolutely, And I think the first
Lady's involvement elevated and sped up the progress dramatically and
(10:45):
helped us get it over the finish line.
Speaker 1 (10:47):
Teenage suicide has become a terrible problem and now
extends down to like eight, nine and ten year olds,
and a lot of it I think comes straight out
of the effect of social media and the effect of
isolation because of people paying so much attention to their
cell phones. Let me ask you, do you see additional
legislation evolving as we learn more about these kinds of problems,
(11:11):
and do you think they can evolve in a bipartisan way.
Speaker 2 (11:14):
I think Congress needs to do more to protect kids online.
When you and I were teenagers, there were challenges to being
a teenager, but we didn't face all of these forces.
Our kids, we give them a phone, and every predator
on earth has access to them, every evil force, the
pressures that are on our kids. You look at social media,
(11:35):
they push substance abuse, they push self harm, they push
body image issues and self worth issues, and it increases
depression and increases anxiety and increases suicidal ideation. So one
example of a bill that I want to move is
the Kids Off Social Media Act, or KOSMA,
(11:56):
which I've introduced with Brian Schatz, Democrat from Hawaii, and
it does three things. Number one, it prohibits children under
thirteen from having social media accounts. I think there's no
reason for a child that young to have a social
media account. Number two, it prohibits tech platforms from using
algorithmic boosting for kids under sixteen, and boosting is how
(12:20):
they push particular messages at kids. And then number three,
there's a provision in a bill that I introduced separately
called the Eyes on the Board Act, that says that
any schools that receive federal funds have to block social
media on campus. That if you're in class, there's no
reason for you to be on Snapchat, you ought to
(12:41):
be listening to the lessons. And that bill we passed
out of the Commerce Committee with overwhelming bipartisan support. It
is yet to pass the Senate, but I think that
would be another important step. And I'll tell you what's interesting,
Newt, our colleagues that are in their seventies and eighties, frankly,
when it comes to this issue, most of them are
(13:02):
a little puzzled by it. They just haven't dealt with
it firsthand. You know, the co sponsors that I have
on this bill, they're almost exclusively senators in their forties
and fifties who have kids at home, who have teenagers
or adolescents. And every parent I know who's dealing with
this right now is frustrated and doesn't know how to
(13:24):
fully protect their kids online and on their phone. And
I think we need to do a lot more to
help them.
Speaker 1 (13:29):
I say this as somebody in the age group you're describing.
You may want to encourage the older senators to talk
to their grandchildren.
Speaker 2 (13:36):
Yeah, and I am doing exactly that.
Speaker 1 (13:39):
Suddenly they'd be in a new world. You are a
leader on so many different topics, but on this one,
I suspect you will literally be able to say over
the years that you saved several thousand lives from
suicide, and you saved tens of thousands of
people who otherwise would have been deeply humiliated and deeply affected
psychologically. So, Ted, thank you for your time.
(14:02):
I know how amazingly busy you are, and this was
a real act of leadership on your part, and something for which
the country owes you a debt of gratitude.
Speaker 2 (14:09):
Well, Newt, thank you. And let me just say I
appreciate the first lady very much. Her leadership was critical.
I appreciate President Trump. In the State of the Union,
he introduced Elliston Berry and he called on Congress to
pass the bill. I would be remiss if I didn't
say thank you, especially to the victim advocates, because they
could have taken their pain and just hurt from it,
and instead they decided to stand up and lead. And
(14:31):
the bravery, particularly Elliston and Francesca, but also certainly Brandon
and Breeze Liu and other victim advocates. The bravery they've shown
is extraordinary and this would not have gotten done without
their courage.
Speaker 1 (14:43):
That's great, That's tremendous. Listen, have a wonderful day and
good luck in Texas.
Speaker 2 (14:48):
All right, take care. Thanks, Newt.
Speaker 1 (15:06):
Scott, thank you for joining me. Can you talk about
how the Take It Down Act gained bipartisan support?
Speaker 2 (15:12):
Yeah?
Speaker 3 (15:12):
Absolutely, a lot of credit to Senator Cruz and
Senator Klobuchar. Sexual abuse has always been a
non partisan issue. We're fortunate to have really great support
from both sides in Congress. I think what Congress saw
here is that tech enabled sexual abuse is the fastest
growing form of sexual abuse. Every month, we're seeing a
huge increase in cases. Rather than waiting until this is
(15:37):
a problem that is too big to be fixed, Senator
Cruz and Senator Klobuchar jumped in and wanted to
do something about it so that we can try and
slow it down, try and put an end to it
before it hits every school in the country.
Speaker 1 (15:50):
The whole issue of the internet's psychological impact is enormous. You
wouldn't have thought it thirty years ago. But a piece
of making America healthy again is getting our hands around
this whole internet and social media sector, which has become,
particularly for very young people, an enormous part of their lives.
Speaker 3 (16:08):
You're absolutely right. Creating non consensual intimate images has become
just shockingly easy. There are dozens of apps and websites
now that within minutes a kid can submit pictures of
their classmates and generate nude photos of them, and within
minutes that can be distributed around the school and all
around the web. So this is a new form of
(16:30):
abuse that is having a really devastating effect on victims,
and it's just growing like crazy. And keep in mind,
we're just at the beginning of this AI revolution. A
few years from now, when these tools become even more
accessible and easier to use and more universal, this problem
is going to be so far out of control if
(16:50):
we don't do something about it now.
Speaker 1 (16:52):
The degree to which younger people in particular live in
that world, their reality is in many ways an electronic reality.
It shapes them. A lot of states, as I understand it,
were trying to find a way to address this. So
I think there was something like forty eight different states
that enacted some kind of law. What's your sense of
the state level effort.
Speaker 3 (17:13):
There has been some good action in the states, but
the laws are very inconsistent, and states sometimes have difficulty
because the Internet is open, it's worldwide. State by state
regulation ultimately isn't going to work. There needs to be
some federal regulation so that we can criminalize the distribution
of these images across state lines.
Speaker 1 (17:33):
Forty eight states had enacted some kind of law criminalizing
non consensual distribution of intimate images. South Carolina and Massachusetts
had not. What was happening in those two states?
Speaker 3 (17:44):
They were slow to come to the game. I think that
initially there was some opposition on one side on First
Amendment grounds, the idea that we're going to be
making certain images illegal. But I think the Take It
Down Act did a really good job at carving out legitimate uses.
So doctors can still share intimate images for medical purposes,
law enforcement can still distribute them for investigative purposes, but
(18:09):
when they're used to harass or abuse the subject, we've
criminalized them now.
Speaker 1 (18:15):
So if somebody actually goes out now and deliberately sends
out an artificial intelligence generated image, are they personally at
risk for having sent it out?
Speaker 3 (18:26):
They are if they are distributing it without the consent
of the person pictured in the image. And I should
say distributing intimate images of children has always been illegal
and still is. This expands that to intimate images that
might not have been caught under child sexual abuse material laws,
as well as to images of adults that are being distributed.
Speaker 1 (18:49):
The key part of implementing this is going to be
companies like Meta, TikTok, Google. What's their responsibility?
Speaker 3 (18:57):
They do have really a key role here. One of
the most important provisions of the Take It Down Act
is a requirement that the big tech platforms, once they're
notified of a nonconsensual intimate image on their platform,
they have to take it down within forty eight hours.
That's the provision that the Take It Down Act is
named for. We are hoping that the tech companies will
(19:19):
really embrace the responsibility here. This is about looking out
for their customers, and this is about treating their users
well. It's incredibly traumatic for a kid to find
out that their naked image is circulating around on TikTok,
around on Instagram and not have a way to get
it down. And so this finally makes it easy for
someone to identify an image tell the platform, and the
(19:42):
platform then has two days to take it down.
Speaker 1 (19:45):
This bill gets signed into law in May, and
yet you're setting a deadline of July fourth, which is
by normal federal standards, amazingly fast. What was your rationale
for these very big corporations moving this quickly?
Speaker 3 (19:58):
We're being optimistic here, but the law gives the federal
government a year to fully implement it, and there's the
whole process of the Federal Trade Commission having to come
up with regulations around it. But what we're asking is,
we've worked with a lot of these tech companies before
they can move very quickly. This is not a complicated
process or complicated technology that they need to build to
(20:20):
implement this. They need to add a button on their
site that lets people click and report an image, and
then they need to set up an automated process and
that image gets taken down off their site. So this
is something if they are committed to it, they could
get done in days.
Speaker 1 (20:37):
And the normal process of lobbyist in Washington. I'm surprised
there wasn't a huge effort to delay implementation. What would
the cost in human terms have been if you had
delayed the implementation for a year?
Speaker 3 (20:50):
Literally millions of new victims who would have
had no recourse. Every one of these images, once they're
out on the web, can be distributed a thousand times and
can have millions of viewers, So the human cost would
have just been extraordinary. And one of the reasons that
this was passed by the House so quickly this year
(21:12):
was the involvement of the First Lady. Melania Trump decided
to take up this cause and to push for the
passage of the Take It Down Act. This is actually the first
policy that she worked on when she became First Lady
this term, and that really motivated the leadership of the
House and members of Congress to do something about this,
to move this quickly. So I got to give her
(21:34):
huge credit for being in the lead on this and
understanding that this is such an important issue for the
health and well being of kids.
Speaker 1 (21:42):
I suspect, given her background as a world class model,
she probably fully understood the importance of imagery and
how devastating it could be to have the wrong image
out there and what it does to kids, because
there's a period there in adolescence where they're very sensitive,
very insecure. And of course you had a frightening increase
(22:03):
in suicide, which I think is partially associated with cyber
bullying and cyber isolation. And so I think her role
in this was a great place for her to focus
her prestige and her experience. But it wouldn't have happened
(22:33):
I think without your organization, RAINN. Talk about how RAINN
came to be and what its focus is.
Speaker 3 (22:40):
Sure, so RAINN is the largest anti sexual violence organization
in the US. We work on public policy, trying to
improve the criminal justice system, trying to make sure that
more perpetrators are caught and convicted. We work on public education,
trying to help affect what the country understands and how
they react to sexual violence and motivate them to help
(23:02):
prevent further harm. And then we help victims. We run
the National Sexual Assault Hotline. We also run a hotline
for the Department of Defense for members of the US
military around the world, and our victim service programs help
more than thirty thousand survivors every month. We've helped more
than five million survivors of rape and their loved ones
since we started up.
Speaker 1 (23:23):
What does RAINN stand for?
Speaker 3 (23:25):
Rape, Abuse and Incest National Network.
Speaker 1 (23:27):
When was it created?
Speaker 3 (23:28):
We started it up in nineteen ninety four.
Speaker 1 (23:31):
And what was the impetus, and who started it?
Speaker 3 (23:32):
I started it. The impetus was a service gap:
there was a need for
a national hotline somewhere for victims of rape to go
to get support, to get advice, to get information and help.
We originally started by launching the National Sexual Assault Hotline,
and then within a couple of years started our work
on public policy and really had great support going back
(23:56):
to when you were speaker, you were terrifically helpful on
these issues and supportive of the work we were trying
to do, which I'm really grateful for.
Speaker 1 (24:04):
During the twenty one years that you've been doing this,
has the hotline had a series of trends that have evolved,
or are there patterns you can see that are somehow informative?
Speaker 3 (24:14):
We've seen a huge growth in usage of the National
Sexual Assault Hotline. We were helping about forty thousand people
a year back in nineteen ninety five. Now we're helping
nearly forty thousand people a month, So the demand has grown.
People being willing to reach out for help. We've seen
that grow and we've seen a big evolution in every
(24:35):
year more and more kids reach out to us. About
half of victims are minors, and that's reflected in the
people who are contacting the National Sexual Assault Hotline and
asking for help.
Speaker 1 (24:47):
Do you think the increase represents the increased problem in
the society or an increased awareness of the hotline.
Speaker 3 (24:54):
I think that the increase is primarily because of
increased awareness, increased comfort in talking about the issue, and
the diminishment of the stigma around it. I think that
kids more and more have a better understanding that they're
not to blame here anymore than a victim of a
mugging or a victim of any other violent crime is
to blame. This is a violent crime that the FBI
(25:18):
ranks second only to murder in terms of seriousness,
So there's nothing to be ashamed about for a kid
who's a victim or for an adult. The other trend
we're seeing is a huge increase every month in the
number of calls that we're getting about technology enabled
sexual abuse, like nonconsensual images. We're seeing that increase every
(25:40):
single month, and I think that five years from now,
I think that that's probably going to be the majority
of the cases that we're getting.
Speaker 1 (25:47):
If people do want to call with any help and support,
what number do they call?
Speaker 3 (25:52):
They can reach us by phone at one eight hundred
six five six HOPE, or they can get help
through online chat at hotline dot rainn dot org.
Speaker 1 (26:05):
And we'll post both of those on our show page,
so anybody who knows somebody who needs help can give
them that information and help them get to Scott and
to his team. I think there's a significant step forward
in dealing with what sadly is a crisis, and I
think frankly, what you've done with the take it Down
Act begins to get ahead of the curve. We're going
to face huge challenges and how we deal with an
(26:28):
electronic world that we're beginning to realize has many different
negative effects on our kids. And I think you're a
piece of that solution. And I think it's great that
you were able to work in a bipartisan way and
work with the first Lady to get this moved all
the way into law, which is in this Congress a
substantial achievement.
Speaker 3 (26:47):
Well, thank you, we appreciate it. We're thrilled to have
worked on this. And I think that this is the
first big piece of legislation that has passed that puts
restrictions on this sort of abuse and puts requirements on
the big tech companies to address sexual abuse. So we're
really optimistic that this was passed quickly and early enough
(27:07):
to really make a difference and hopefully slow down the
growth of this and help victims.
Speaker 1 (27:13):
Scott, I want to thank you for joining me. Our listeners
can learn more about the Take It Down Act by
visiting your website at rainn dot org. And I
encourage people to read about it and understand that this
law will help protect our children and our grandchildren. Thank
you to my guest Senator Ted Cruz and Scott Berkowitz.
(27:35):
You can learn more about the Take It Down Act
on our show page at newtsworld dot com. Newt's World is
produced by Gingrich 360 and iHeartMedia. Our executive producer
is Guernsey Sloan. Our researcher is Rachel Peterson. The artwork
for the show was created by Steve Penley. Special thanks
to the team at Gingrich 360. If you've been enjoying Newt's World,
(27:55):
I hope you'll go to Apple Podcasts and both rate
us with five stars and give us a review so others
can learn what it's all about. Right now, listeners of
Newt's World can sign up for my three free weekly columns
at gingrich360 dot com slash newsletter. I'm Newt Gingrich.
This is Newt's World.