Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Good Morning BT, News Talk eleven ten, ninety-nine
three WBT. It is Wednesday, December tenth. Bo
and Beth here in the Tyboid studio. And the last
time I was corresponding with our next guest, she was
in Palm Beach. I know why she was there, and
(00:22):
we'll get to that in a moment. But where are
you now?
Speaker 2 (00:25):
I'm back in Charlotte, Yay.
Speaker 1 (00:28):
I have to ask because Teresa is just all over
the place. That's always been the case with her, and
she is very likely, many days when we're talking to
her, to be just about to go on or just coming off
of one of the national morning shows, whether it's Good
Morning America or the Today Show or Fox and Friends,
whatever it is. She's in high demand all across the country,
the founder of Fortalice Solutions, and you can always follow
(00:51):
her on X, and we recommend that because a lot
of times the conversations that we have here on the
air extend there and people ask her questions and she's
very interactive that way. But it's Teresa Payton, our cybersecurity expert.
Good morning to you, and what were you doing in
Palm Springs?
Speaker 2 (01:06):
Yeah, good morning. Well, the weather wasn't that great.
It was raining the whole time I was there, but
inside it was definitely sunny. I was meeting with HID
Global and their channel partners and all of their clients,
and we were talking about how AI is changing or
needs to change the way everyone thinks about physical security. So,
(01:29):
for example, if you're focused on, I'm going to present
to you a key card for access to a building, or my
biometrics, that is not going to be good enough anymore
with deepfakes, AI, the stealing of biometrics, and so
we were having sort of a futuristic conversation around here's
what's coming next, and so here's what you need to
do to prepare today, to change your designs so that
(01:51):
we can be safe in the future.
Speaker 3 (01:54):
That's one of the things that I've seen a lot
of chatter about online is the stealing of biometrics, the
fact that biometrics are being utilized more and more. And
it does feel and you and I have talked about this, Teresa,
it does feel very Minority Report-esque. But what can
happen if someone does steal your biometrics? I mean that
seems even on a next level of privacy violation.
Speaker 2 (02:19):
Yeah, I mean if somebody has your biometrics, and the
right safety nets are not in place, they could potentially
present your biometrics for access to a building, to your
computer, to your online banking.
And so this is why anybody who uses biometrics in
(02:41):
addition to your user ID and password has to be
designing different layers of safety nets because user ID, password
and biometrics is not going to cut it. And we're
starting to see the reproduction of biometrics and things like
that, and the speed of it, with generative AI and other technologies
(03:03):
coming out, and it can be very convincing. You know,
there's something called proof of life that's used to authenticate
in financial transactions, and even that is being run through
the paces with deep fake technology. So I always say, like,
it's not that that technology is no good. It means
(03:25):
you have to have other things that you do in
addition to just in case you're presented with a voice
clone or a deepfake video or the biometrics of somebody
and it's not actually them.
Speaker 1 (03:37):
I've got to ask you, switching gears here. One of
the biggest stories out there this past week, especially in
the entertainment but also the digital realm has been what
happens to Warner Brothers and Discovery, And initially the story
was Netflix had emerged as the exclusive negotiator, and then
over the weekend it was reported that Netflix had struck
a deal, and then earlier this week, Paramount submitted an
(03:58):
all-cash thirty-dollar-per-share tender offer to Warner
Brothers shareholders. This is a fascinating story and even more
fascinating to sort of think about however this goes, how
many things it affects.
Speaker 2 (04:13):
I know, and I can't wait depending on how this
turns out, there should be a movie about this, and
I can't wait to watch it, you know, just the
politics and the backroom deals, and you know, I don't
I don't really love either one option. But obviously here
we are, seeing things like Coca-Cola and McDonald's doing
(04:37):
one hundred generative AI-created commercials. You know, this is
having an impact on TV shows, on the movie industry.
I didn't love the Netflix announcement because I do love
a superhero movie in the theater. I like to also
watch it again a second or maybe more than second
(04:58):
time at home. But Ted Sarandos said that he doesn't
really love theatrical releases. He actually called theaters quote an
outmoded idea. So I didn't love the Netflix thing, but
I just I don't know where it's going to land.
But I do know this. I do know that because
of technology, the authenticity of the TV and the movies
(05:21):
that we watch is at risk. And the artists who
have brought us so many great TV shows and movies,
they're at risk as well.
Speaker 3 (05:32):
Speaking of AI, President Trump actually is dropping
an AI executive order this week. Is this in favor
of propelling AI and making this a quicker process, or
is this trying to put up some safety rails?
Speaker 2 (05:51):
Well, I didn't read a lot about governance, guardrails, and
ethics in what I read of this bill, and maybe that's
sort of a part two. But in part one, what
they're saying on the surface sounds like it makes sense,
which is that they want to pass
a federal law. And so the executive order, executive orders
(06:12):
cannot compel the states to do anything. All
an executive order can do is compel the departments and agencies,
or the military or the White House to do something.
But what people sometimes forget is the connectivity: departments
and agencies tend to be regulators of private-sector industry,
so that compelling can have sort of a downstream effect.
(06:35):
But what's interesting is that basically what they're saying is, if
you have a state law, there will be a federal law,
and the federal law will supersede the states, which obviously
you have to ask a question about states' rights here.
Florida has a pretty strong AI law, and from what
I saw, Governor DeSantis mentioned that, you know that they're
(06:56):
going to keep their law in effect. So we'll see
where this heads. It's a great question, and I
think it's another stay tuned and get out the popcorn
and see kind of where this executive order for a
federal law that supersedes state laws will land. You know,
I could see the positive side of things because we
don't have a federal privacy law, and look at what
(07:18):
a mess our privacy is as it relates to big tech,
Silicon Valley, social media type of things. We don't really
have a privacy bill of rights as citizens of the
United States, so you could see where, well, having something
on a federal level could be helpful. But I don't
see, in this executive order, the really good things that
are already in the state laws that have been passed,
(07:39):
so more to come there.
Speaker 1 (07:40):
Speaking of more to come, next week will be our
annual conversation with you about predictions for twenty twenty six.
And if you've listened to the show over the years,
when Teresa makes predictions, they often come true, and often
come true pretty quickly. She's always sort of out there,
a little ahead of things, and so I always look
forward to this yearly conversation. So we'll do that and
(08:02):
wrap up the year as well, and wrap up things
right now. Thank you so much, as always.
Speaker 2 (08:08):
Well, Beth and Bo, it's always great to be with you, and congratulations. I
saw the big announcement about WBT, so I guess the
next time I talk to you, you'll be under the
new umbrella.