Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Just wanted to say hey, and I thoroughly enjoy the show.
Speaker 2 (00:03):
Thank you for taking the time to call in.
Speaker 1 (00:05):
Well, y'all keep doing a great job.
Speaker 3 (00:07):
Thank you man, keep on keeping on.
Speaker 2 (00:09):
This is Good Morning BT with.
Speaker 3 (00:11):
Bo and Beth. Hump day, penultimate day, whatever you say.
Where you are, we'll just say July thirtieth, how about that?
Bo and Beth here in the studio, in time
to welcome a longtime cybersecurity expert. You see her all
over the place: Good Morning America, the Today Show, Fox News, CNBC.
(00:33):
She's the founder of Fortalice Solutions and one of the
nation's foremost experts in cybersecurity. Teresa Payton, good morning to you.
Speaker 1 (00:41):
Good morning. And I was just saying it was good
to see that, so far, no damage with the tsunami
hitting different shores, places that I've been to just
this past year.
Speaker 3 (00:53):
Yeah, I say good morning. I mean, we're used to
saying good morning. But if you're in the western part
of our country, or, you know, for example, where this
hit... and by the way, if you're just
waking up this morning, an eight point eight magnitude earthquake
struck off of Russia's far eastern coast. It's the sixth
strongest ever recorded. And so as soon as this happened,
(01:15):
and Teresa and I were trading messages about this, and
I woke up in the middle of the night to
see the headline. And at that point in time, you're
hearing about tsunami warnings on US shores including Hawaii, California, Oregon, Washington,
and Alaska because of, you know, potential high waves. And
nearly two million people were initially evacuated.
(01:38):
But they have downgraded the tsunami warning in many of
these places to an advisory, so not the worst that
was feared, but still when you have a quake this large,
it's a story we've got to keep our eye on
as we go on this morning.
Speaker 1 (01:52):
Yeah, so I'm glad the tsunami warnings, the buoys out
in the water, they worked. Most people seem to
have followed the directions and evacuated, and it sounds like
everybody's okay for now as the warning gets downgraded. So
that is good news to wake up to this morning.
Speaker 2 (02:11):
Now, when it comes to tech headlines, people are waking up
to something that you have described. And this is the
first time I had heard this term, vibe coding, which
is fascinating to me. But because of things like vibe coding,
we're seeing AI go rogue. And for people
who were a little
(02:33):
fearful of this technology, this is kind of what they feared.
Those people are saying, see, I told you. I told
you the AI was going to have a mind of
its own and create some really big issues for people.
Speaker 3 (02:42):
Vibe coding, vibe coding, vibe coding, it's your vibe?
Speaker 2 (02:47):
Is this your vibe today?
Speaker 1 (02:48):
Both?
Speaker 3 (02:48):
So earlier, when she mentioned this, I kept thinking, you know,
I'm coding right now. This is what I'm hearing when
I'm coding. I don't know. Anyway, back to the actual
vibe coding. What is it exactly, Teresa?
Speaker 1 (03:01):
See, this is why WBT morning news radio listeners are
the smartest in the world. So vibe coding is exactly
what it sounds like. Feel free to play the music again.
But basically, what vibe do you have today? You go
to the app, you say, here's my vibe, here's what
I want you to create for me, and then AI
creates the code behind the scenes for you and creates
(03:24):
an app. So you could say, you know, I've
got a very chill jazz vibe going on right now,
and I want you to create an app for me
to visit all the top jazz clubs in the United States.
Or if you want...
Speaker 3 (03:34):
Barry White.
Speaker 1 (03:38):
Exactly exactly. And so here's the thing though, with all
of this new technology, we are still missing governance and guardrails.
And we had two huge stories this week where somebody
was using vibe coding apps and the apps went rogue
and deleted the data. And then when, sort of,
(03:59):
the engineer prompted back with the AI chatbot about
why did you do such a thing, it lied to
the engineer and then finally fessed up and was like,
you know, almost like, I'm so ashamed. Like, I think
of my dogs, you know. It's like, who did this?
Speaker 3 (04:15):
Did you do this?
Speaker 1 (04:16):
And the dogs kind of give you that look.
It's sort of like that. So what I would say
to everybody listening to this, if you're testing out vibe coding,
just understand the governance and guardrails are not there. I
would not build anything that involves money or, you know,
something for kind of a large corporation where people are
counting on the app. But it's a great way to
(04:38):
do demos and pilots and things like that. But for now,
vibe coding can still go rogue.
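Teresa's advice above, treating vibe-coded apps as demos until governance and guardrails catch up, can be made concrete. As a rough sketch (the function names and the pattern list here are hypothetical, not from any real vibe-coding product), this is one way a guardrail layer might screen AI-generated code for destructive operations before running it:

```python
import re

# Hypothetical deny-list: patterns that suggest destructive operations.
# A real governance layer would be far more thorough (sandboxing,
# human approval, backups); this is only an illustrative sketch.
DESTRUCTIVE_PATTERNS = [
    r"\bDROP\s+TABLE\b",    # SQL: delete a whole table
    r"\bDELETE\s+FROM\b",   # SQL: bulk row deletion
    r"\bshutil\.rmtree\b",  # Python: recursively delete a directory
    r"\bos\.remove\b",      # Python: delete a file
    r"\brm\s+-rf\b",        # shell: recursive force delete
]

def review_generated_code(code: str) -> list[str]:
    """Return the destructive patterns found in AI-generated code."""
    return [p for p in DESTRUCTIVE_PATTERNS
            if re.search(p, code, flags=re.IGNORECASE)]

def run_with_guardrails(code: str) -> str:
    """Refuse to execute generated code that looks destructive."""
    findings = review_generated_code(code)
    if findings:
        return f"BLOCKED: {len(findings)} destructive pattern(s) found"
    return "OK: no destructive patterns detected"

# A vibe-coded "cleanup" script that would have wiped the database:
generated = "cur.execute('DROP TABLE users'); conn.commit()"
print(run_with_guardrails(generated))  # BLOCKED: 1 destructive pattern(s) found
```

A real guardrail layer would go much further, with sandboxing, human approval for risky actions, and backups of anything the app can touch, but even a simple deny-list like this would have flagged the data-deleting script in the story above before it ran.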
Speaker 2 (04:44):
Isn't it fascinating that AI lies? You know, how did
that get written into code?
Speaker 1 (04:50):
Well, probably unconscious bias of engineers. I mean, the code's
written by engineers, and without governance and guardrails, it can
have the best of us and the worst of us
kind of hidden in the black box.
Speaker 3 (05:06):
So let's stay on the AI front. It's pretty easy
because it's almost in every tech headline you
talk about right now. But I flagged this story when
I saw it a few days ago. Amazon
has acquired a company called Bee,
and they are working towards developing what they're calling an
(05:26):
AI bracelet that records everything you say. Now, we talk
so much about our devices, our smart home devices, et cetera.
How much are they actually listening to what we say
and retaining that data? And we try to do things
to not allow that to happen. But here you're actually
basically saying, all right, let's go the other side of this.
(05:47):
Would people actually like a device that could record everything
you say during an entire day, basically transcribe your life.
Speaker 1 (05:55):
Yeah, this is fascinating. And the Wall Street Journal did
a review on the Bee AI wearable, and it's spelled
like a bumblebee, the Bee AI wearable, and it's interesting. So
after Amazon telling us, no, no, Alexa is not listening
to you... okay, well, maybe she was, but just not
at the level we wanted her to. So now they're
(06:17):
buying up Bee. It's an AI wristband, and it literally
listens to everything you say, even when you're talking to yourself,
muttering to yourself, maybe saying something nice to the driver
in front of you when you're on your way to
work and they cut you off. All of those things
are going to be captured on this wearable with the
idea that they're going to turn it into almost like
(06:39):
your personal assistant, your personal friend, and copilot throughout
the day. So I'm not sure how I feel about
this technology, but definitely you can tell that Amazon is
jumping back into the wearable game. Some people may remember
they had a tracker called Halo, and they ditched that,
(07:00):
so it'll be interesting to see what they do with
Bee. Again, for anybody who decides to adopt this technology,
just understand privacy guardrails may not be what you
expect them to be. So you just have to
really read the privacy policy, which spells out the privacy you
don't have when you use a device like this.
Speaker 3 (07:19):
Well, and think about this. So there's one side of it.
You could, if you wanted to, hit a
button that will record everything that you say during the day,
maybe to keep track of things, or like if you're
taking notes on something in a meeting. But think about
the other side of it. I remember one time I
had a colleague here back when Facebook Live first became
a thing, and you know, hey, let's Facebook live my
life and all this stuff. And he walked into the
(07:40):
studio and we had a conversation before the show started,
kind of just you know, chit chatting, and then he said, oh,
by the way, the last two minutes have been on
Facebook Live. And I'm thinking, okay. So then my mind
goes to this: if this becomes prevalent, then
in any room we walk into, any conversation
we have with somebody else, you've got to be thinking,
(08:01):
am I being recorded right now?
Speaker 2 (08:02):
And to take it a step further, what does this mean
for, you know, legal processes? If someone gets charged with
a crime, do they go in and find every conversation
that they've ever had, whether it be in anger or
in private, and use it against someone in a court
of law?
Speaker 1 (08:21):
These are all unanswered questions. I always have more questions
than answers. But what I would say is that there
is some precedent that, yes, they could use it in a
court of law, because they have used home devices that
have recorded arguments, conversations, whereabouts, internet searches. All of that
(08:42):
obviously has to go under a search warrant. But yes,
it is possible it could be used in...
Speaker 3 (08:47):
a court of law. Well, and I suppose we would
know immediately what radio stations somebody listened to that day.
Speaker 1 (08:52):
Yeah, I like that.
Speaker 3 (08:55):
Well, look, we will stop it there. You can always continue the
conversation online at @TrackerPayton on X. We remind listeners
that if you didn't hear everything you wanted to know
here, or you had questions, you can always text Teresa, or
really I should say, hit her up on X, because
she is very prone to continue the conversation there. I
don't know how she does it with all the other
things she has to do, but she makes time not
(09:17):
only for us, but for you, as an extension of
the show, as she has time during the day. So
thank you so much.
Speaker 1 (09:23):
Oh, thank you. And I promise, when you get an answer
from me on X, it's me and not a bot
or AI. It really is me, in the analog form
on digital. And Beth and Bo, it's always great to
be with you. The time goes so fast, guys, and
be safe out there.