Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_01 (00:06):
Happy episode
fifteen from Wired Together with
your host and hostess, Melanie Winter.
SPEAKER_04 (00:15):
And Jason Winter.
I guess I'm the host.
SPEAKER_01 (00:18):
Yes, you're the
you're the host.
I'm the hostess.
SPEAKER_04 (00:21):
Alright.
SPEAKER_01 (00:21):
The hostess with the mostest.
SPEAKER_04 (00:23):
Ah yeah.
SPEAKER_01 (00:24):
Um, and continuing with our October theme, I guess we're kind of going with that spooky feel.
So, you know, we're not much into the gory or, you know, even the overly psychological, crazy horror mess,
(00:46):
no.
But the macabre noir.
Oh yeah, mostly we could so fall into that.
Yeah, you know.
You know, a little Edgar Allan Poe, a little, you know, Twilight Zone.
Oh yeah, Twilight Zone.
Yeah, that's always been a classic.
Old movies.
Yes.
We're gonna be a little more on that end of things.
SPEAKER_04 (01:05):
Yep, definitely.
SPEAKER_01 (01:07):
And so we're kind of gonna go in this um little fun, little jazzy film noir style.
SPEAKER_04 (01:16):
And you kind of had an idea, as far as it not being like MythBusters, but bringing up um things that are maybe myths or things that are often talked about, and where is maybe the truth in all this, you know, right?
SPEAKER_00 (01:33):
Yeah, yeah, so kind
of your dark side of AI.
Yeah.
So let's go into the dark side.
SPEAKER_04 (01:41):
All right, here I go.
It was another night in the digital city.
Neon code raining down the alleyways of the web.
Somewhere out there.
A machine was learning.
And it wasn't just learning my name, it was learning
(02:04):
everything.
SPEAKER_01 (02:12):
I love that.
SPEAKER_04 (02:13):
Okay.
SPEAKER_01 (02:16):
So a little
detective movie going on, so
we'll set the scene.
SPEAKER_04 (02:20):
Yep, that's right.
SPEAKER_01 (02:21):
You know, a little
drink, a little toddy.
SPEAKER_04 (02:25):
Yep, that's right.
Alright, well, I think we set the scene, so we'll just stop right there.
SPEAKER_01 (02:33):
We'll have a little
fun with it.
SPEAKER_04 (02:35):
Alright.
That was cute.
SPEAKER_01 (02:37):
Right?
I thought it was cute.
SPEAKER_04 (02:40):
So I guess what is
the first thing that you want to
talk about?
SPEAKER_01 (02:46):
Well, the first thing you think of with AI, and that's the creepy, the dark and stormy side of AI: is it really listening to everything?
SPEAKER_04 (02:55):
Oh gosh, everyone talks about that, you know.
And I mean, I think it was just the other evening, and this technically isn't AI, but it goes into the global idea of what technology is able to do.
And we've all heard about, like, Facebook heard me.
Well, I don't remember what it was, and I guess it
(03:15):
doesn't matter, but I think someone's name was randomly mentioned, and this name is not a common name.
And you know, I think our oldest daughter was having a conversation with you, and then I'm scrolling through Facebook, and that name popped up on something that matched whatever,
(03:37):
and I just kind of flashed my phone at Melanie, like, all right, here we are.
You know, and I guess I know what happens; I'm a little hesitant to really buy into that, because I know algorithms can be algorithms, and I know a lot of times it's because you searched something, you know.
Oh my gosh, it knows I want to buy a grill.
Well, it's because you went to Home Depot's website and you
(03:58):
were looking at grills, and now Facebook says this grill is, you know, 30% off, and you're like, oh my gosh, it's listening.
No, this was not the case.
SPEAKER_01 (04:07):
Well, as devil's
advocate, sometimes you're not
searching and you're just saying or thinking.
Right.
And then all of a sudden it pops up.
Exactly.
That's when it gets really eerie.
SPEAKER_04 (04:18):
It is, and you know,
where do you put that?
Right.
But it kind of becomes the case of the eavesdropping assistant, which is kind of what we were thinking this might fall into.
And so, does AI actually do it?
SPEAKER_01 (04:32):
Um, and well, the short answer: you know, obviously there's no listening device, there's no spies.
SPEAKER_03 (04:40):
Right.
SPEAKER_01 (04:40):
What, to spy on every single person in the country would be almost impossible.
SPEAKER_03 (04:46):
Right.
SPEAKER_01 (04:47):
But there is kind of a learning of what your genre, or not genre, you know, but, like, your demographic.
SPEAKER_03 (05:00):
Right, right.
SPEAKER_01 (05:01):
You're probably gonna go in this direction, you're probably gonna think this way, you know, that kind of thing.
And so there is a lot of creepiness when it comes to that, because yes, there's things that are correct, right?
And you're like, oh, I was thinking about a grill.
SPEAKER_04 (05:17):
Well, it does extrapolate, and marketing has spent a lot of money on this to predict your next move.
So it creates a profile for you.
And like on ads on Facebook, if things pop up, you can click on it, like when you go to try to report it, and you could say, Why am I being shown this?
(05:38):
And if you dig enough, sometimes they will say, Well, it's because you are attached to this page or group, you're within this age group, you know, maybe you bought a house in the past two years, and it will try to give you reasons for why that information was served to you.
(05:58):
So, yes, it might seem like magic, but when you piece certain variables together, you can really extrapolate something that may be fitting to you.
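To picture how those profile variables get pieced together, here is a minimal, purely hypothetical Python sketch. The rule names, thresholds, and the served ad are invented for illustration and do not reflect any real ad platform's actual logic.

```python
# Hypothetical sketch: combining profile variables to pick an ad and
# the "Why am I being shown this?" reasons. Not any platform's real rules.

def pick_ad(profile: dict) -> tuple[str, list[str]]:
    """Return (ad, reasons) from a simple rule-based profile match."""
    reasons = []
    if "grilling" in profile.get("groups", []):
        reasons.append("You are attached to a grilling page or group")
    if 30 <= profile.get("age", 0) <= 50:
        reasons.append("You are within this age group")
    if profile.get("bought_house_within_years", 99) <= 2:
        reasons.append("You bought a house in the past two years")

    # The more variables line up, the more "magic" the ad feels.
    ad = "30% off grills" if len(reasons) >= 2 else "generic ad"
    return ad, reasons

ad, reasons = pick_ad({"groups": ["grilling"], "age": 42,
                       "bought_house_within_years": 1})
print(ad)       # 30% off grills
print(reasons)  # the explanations served back to the user
```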
SPEAKER_01 (06:06):
So, and you know, as we often do, there is the old and the new coming together.
Yeah.
Um, and the beginning of communication was the telephone.
Yeah.
And in the early stages of the telephone, there were party lines.
(06:27):
And so you did have, uh, nosy neighbors.
The snoopers, yeah.
Yes, and so, uh, not everybody hung up when they realized that the phone call was for you.
And sometimes they stayed and listened.
SPEAKER_03 (06:40):
Right.
SPEAKER_01 (06:40):
And you had to be
aware of that.
Yeah.
And back around the turn of the last century, you had to be aware of the fact that, you know, you're gonna have neighbors listening to your conversation.
Well, in a way, we've kind of gone full circle.
Sure.
SPEAKER_04 (06:57):
So you're saying
that some of these antics are
not unique now.
SPEAKER_01 (07:01):
Your Alexas and
things like that, the Google uh
echoes.
SPEAKER_03 (07:06):
Right.
SPEAKER_01 (07:06):
They're not gonna listen to everything.
There's what's called a wake word.
SPEAKER_04 (07:12):
There's a word that will wake it up.
It knows, when I hear this, now to jump on board.
SPEAKER_01 (07:19):
Right.
Well, sometimes you will say something that has completely nothing to do with that, and it sounds enough like the wake word that they wake up.
SPEAKER_04 (07:27):
And it's like, oh,
maybe they thought because I
said this, yeah.
SPEAKER_01 (07:30):
So that gets thrown into the cloud, and data gets, um, you know, thrown in there, and yes, it does get understood, and it kind of gets to that point of what's eavesdropping versus what's, uh, sure, just following whatever the algorithm says to jump on.
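As a rough illustration of that wake-word idea, here is a small hypothetical Python sketch: the device only starts sending audio onward once something close enough to the wake word is heard, which is also why similar-sounding words can false-trigger it. Real assistants do this with on-device acoustic models rather than text matching, and the wake word, threshold, and phrases below are made up.

```python
# Hypothetical sketch of wake-word gating, not any vendor's actual code.
# Text similarity stands in for the on-device audio matching real assistants use.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.7  # made-up cutoff; close-sounding words can still slip through

def heard_wake_word(snippet: str) -> bool:
    """Return True if any transcribed word is close enough to the wake word."""
    best = max(
        (SequenceMatcher(None, WAKE_WORD, word.lower()).ratio()
         for word in snippet.split()),
        default=0.0,
    )
    return best >= THRESHOLD

for phrase in ["what's the weather alexa", "I saw Alexis yesterday", "pass the salt"]:
    if heard_wake_word(phrase):
        print(f"wake: start streaming to the cloud -> {phrase!r}")
    else:
        print(f"ignore: stays local -> {phrase!r}")
```

In this toy version, "Alexis" scores close enough to trip the gate while ordinary kitchen chatter does not, which mirrors the false wake-ups described above.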
SPEAKER_04 (07:49):
It makes sense.
SPEAKER_01 (07:49):
And so yeah, that gets creepy, um, but at the same time, that is kind of part of it, um, you know, when it comes to, if you've got something like that in your household, yeah, you know, I mean, nine times out of ten, you're not saying things that are, like, truly nefarious.
I mean, we have an Alexa in our kitchen; it's probably
(08:12):
listened to a lot of math.
SPEAKER_04 (08:14):
It's a lot of math
homework, a lot of boring
conversations.
SPEAKER_01 (08:17):
A lot of boring conversation, finishing up about, you know, Winter Netweb, um, you know, talking about friends to the girls, you know, stuff like that.
I mean, if it really wants to listen, wow, how boring.
I know, right?
SPEAKER_04 (08:34):
Yeah.
Um, if we had anything to really cover up, I guess we would have everyone's cell phones in an aluminum bag.
Right.
And put on our tinfoil hats.
SPEAKER_01 (08:44):
You know, yeah, it's wake words, but no, it's not actually spying.
Right.
Because that would be so much data, and so ridiculous.
It would just be crazy.
SPEAKER_04 (08:58):
And we've seen a lot of lawsuits related to that kind of thing.
I mean, if the technology were able to do it, the blueprints are out there, and, you know, there's enough curiosity that people would have researched it.
And I think we can kind of put this one to bed and say it's not actually spying on every single syllable.
No, it's just trained to jump.
And the algorithm as a whole is trained to predict your next
(09:22):
move.
So that's probably what's going on.
And you're like, oh my gosh, wow, you know, so alright.
SPEAKER_01 (09:33):
Um, and then, you know, of course, there is the darkness of AI.
It's impossible to talk about this without talking about, say, the dark web.
Um, so strolling down the alley of the dark web, how is that, um,
(09:53):
you know, uh, something to live through, be comfortable with, that kind of thing?
SPEAKER_04 (09:58):
Right.
And you know, well, there's a difference with the web.
You have the, uh, what people call the deep web, and then you have what is the dark web.
Right?
SPEAKER_01 (10:14):
Right.
So those are two different, uh, subjects here.
SPEAKER_04 (10:18):
Right.
And they're often kind of put together.
SPEAKER_00 (10:21):
Um So what's the
deep web?
SPEAKER_04 (10:24):
Well, the deep web, of course it's deep, just like, say, fishing or whatever.
It's the area that is beyond what's easily accessible, and that's usually behind a password wall.
So your account information that may be part of whatever thing you are signed up with.
SPEAKER_00 (10:42):
Like your
authenticator stuff.
SPEAKER_04 (10:43):
Well, yeah, I mean, anything behind the authenticator wall is your password-protected area, so it's private to you, so it's deep.
Can it be accessed?
Well, yes, there are ways, and we've had a lot of data breaches, and that's what accesses that.
But the dark web is something entirely different.
(11:09):
And you don't just accidentally go, hey, I'm in the dark web.
You know, there's no, like, vignette that comes over your screen and you realize, I'm in this area.
SPEAKER_01 (11:22):
It's, um... Yeah, you need software.
You need to be invited into it, if you will; you need, um, to be nefarious in your existence.
Right.
SPEAKER_04 (11:35):
Oh gosh, this kind of reminds me of the 90s and all... well, yeah, anyway.
So there are areas of the web, let's just put it this way, that are on a network of servers, which is the internet, alright?
But this is a network of servers that kind of coexists in its own
(11:58):
little circle.
Now, you can find a way into this.
I kind of think of it as like the Grand Theft Auto world; that tends to be where people that maybe don't have the best
(12:19):
intentions may be anonymously sharing information, um, selling things, involved in things.
But, you know, some of your trained police force are also there.
So that is why it's kind of like the dark web, because it's like
(12:40):
its own world where people have gone underground to try to conceal certain dealings and communication.
And I mean, even some to a good level, maybe.
Um, you know, there are times where, you know, white hats may
(13:00):
be involved, um, sharing of information that they don't want to be uncovered if sniffed at the public, standard level.
Um, journalists and things may be there as a way of trying to get information for a story to get pushed through.
(13:21):
Um, and it's called dark just because it's, you know, again, kind of outside, it's on the other side, in the shadow.
Right.
So, uh, yeah, but yeah, it does exist.
SPEAKER_01 (13:35):
Yeah.
Villains live there.
SPEAKER_04 (13:37):
Yeah.
Yeah.
SPEAKER_01 (13:38):
But, uh, that's the thing with pretty much anything.
I mean, you take, you know, the gangsters of the twenties and thirties, you know.
It's kind of the same concept: you know where the gangsters live, you know you don't go there.
Or you know where the nefarious things happen, so, you know, make sure you, you know, steer clear.
SPEAKER_04 (14:01):
Right, the bad part
of town.
SPEAKER_01 (14:02):
The bad part of
town.
SPEAKER_04 (14:04):
These are the houses
you don't visit during
Halloween.
I mean, you kind of look at it like, no, we're not going down there.
SPEAKER_01 (14:11):
Right, and, you know, so protecting yourself is as simple as, you know, you're probably not in it, so you'd have to really try.
SPEAKER_04 (14:22):
If you don't know
about it, there's probably a
good reason why.
Right.
And, you know, just pretend it doesn't exist and keep moving.
But yeah, it's, um, really interesting though.
Um, so yeah, there was that.
There's the dark web.
There's a dark web and the deep web, so, right.
SPEAKER_01 (14:45):
Um, and well, you know, as most people are starting to really pick up on nowadays, um, AI can fake things pretty well.
Um, and so here's the doppelganger kind of situation, the digital doppelganger, if you will.
Um, so there's been some case studies, um, some cases
(15:08):
we're gonna talk about, the detective cases, where imposters can call.
And they will actually fake your voice.
They will call people that you know, and they know your name and stuff like that.
So, you know, how do we get ourselves through this?
Yeah, this is the wild west of, okay, these bad things are happening, how do we deal with this?
SPEAKER_04 (15:29):
Right.
I guess all this started a little bit more than a decade
ago, where actually, and this is just a really scary, bad, dark concept, where someone would have a child or someone call, say, a grandmother, and say, I've gotten into an accident, I'm at
(15:50):
the hospital, I need money, or maybe, you know, well, just if you could send money here, wire money here, I could be taken care of.
SPEAKER_01 (15:58):
And that actually happened to your grandmother.
SPEAKER_04 (16:00):
It actually did.
SPEAKER_01 (16:01):
They told her that it was you.
SPEAKER_04 (16:04):
Uh, well, I mean, it wasn't me, it was someone else in my family that was, um, yeah.
But that, you know, then she called me.
I remember I was painting a building, huh?
SPEAKER_01 (16:19):
Smart.
SPEAKER_04 (16:20):
Right.
And yeah, that was, you know, and she said, Are you okay?
I heard this, and now I'm like, Yes, I'm fine.
And, you know, she was obviously upset, because, I mean, anyone, if you're called by somebody... so they're exploiting families that maybe don't see each other very frequently, and, um, but the technology has gotten better.
(16:40):
Now you can actually, I mean, honestly, someone could take this podcast right now and grab my voice and then throw that into a computer system and have me read the Declaration of Independence, sounding just like I read the whole thing, you know.
So this is how vulnerable we kind of become
(17:02):
technologically, and of course, technology can be used for good purposes or bad purposes.
I mean, think of chain letters back when emails came about, and, you know, other things.
Well, if you do this, or how about that... uh, I'm still waiting on the several million dollars from that prince in Nigeria.
So it comes around, but you know, there are different
(17:26):
scams and things.
So anytime that technology allows for something, you have some people that go, Wow, this can make life better, this can do this.
You have other people saying, Who can I, you know, screw over by using this?
SPEAKER_01 (17:41):
Who can I hurt?
SPEAKER_04 (17:42):
Right, who can I
hurt?
SPEAKER_01 (17:44):
That's so unfortunate, but it's been a, uh, human experience from the get-go.
Right.
Uh, from the beginning, with rocks.
I mean, it is a human experience where some people are gonna be, um, doing the nefarious thing, right?
And then, you know, others will gain from the knowledge of
(18:06):
it.
And so, um, how do we protect ourselves, yet gain, and not completely ignore it?
We can't just, you know, completely throw AI in the trash; it's gonna be here.
So, how do we protect ourselves, you know?
And the big thing is, you know, as Jason says constantly, look up the actual phone number.
(18:26):
Uh, if somebody is calling and something feels off.
SPEAKER_04 (18:30):
Oh yeah, use your
gut.
SPEAKER_01 (18:32):
Yeah, use your gut, and call that person back with the actual phone number and make sure whether that's really them.
SPEAKER_03 (18:40):
Right.
SPEAKER_01 (18:41):
Um, that's gonna be your... just don't give money out at all.
SPEAKER_04 (18:46):
Right.
SPEAKER_01 (18:46):
And in this, if anything's been considered, like, you know, well, if you do it through, um, a gift card or something: no.
Your family member is not gonna ask you to go get a gift card.
SPEAKER_04 (18:59):
Right.
SPEAKER_01 (19:00):
That is a scam.
SPEAKER_04 (19:01):
Or to wire money.
I mean, that would be, like, weird, you know.
Or yeah, exactly.
SPEAKER_01 (19:05):
So call the actual family member, even if it sounds real.
I know, just make sure; do that double check.
SPEAKER_04 (19:12):
And I know they, of course, always have another reason.
You know, you might be thinking, well, this is not your number, why are you calling me on this?
And the quick retort is, well, my phone, you know, it's not with me.
My phone battery died, uh, it got, uh, wrecked in a crash.
I'm calling from my friend's number, you know; so they always have a reason, because they know you're going to question.
(19:33):
Right.
But you need to get right back to, like Melanie's saying, you know, just go back and dial the original number, and you may find, hey, how you doing?
Oh, well, I just heard, you know, and then, okay.
SPEAKER_01 (19:44):
And they're in the
kitchen cooking.
Right, exactly.
Well, hey, grandma, why are you talking about that?
SPEAKER_04 (19:48):
Yeah, when you call them: hey, hope you're doing well.
SPEAKER_01 (19:50):
Right, and that's
yeah, there you go, you know.
SPEAKER_04 (19:53):
So, and you get back
to the truth.
SPEAKER_01 (19:54):
They have no idea that something has happened.
So, you know, make sure you call those people back.
Sure.
Um, when it comes to certain companies, as we said before in the last podcast, if there's a company that's, like, making something urgent, find that company's number.
More than likely, the real number.
SPEAKER_04 (20:16):
Yeah.
SPEAKER_01 (20:17):
Um, more than likely
the company's not gonna sit
there and call you.
SPEAKER_04 (20:20):
And don't go to Google to look for it.
And I know you're like, where else am I gonna go?
Well, you may have an email from that company, like from an actual statement or something like that.
Try to find a real way to communicate, because, um, we know Microsoft, okay, and, um, there are people that exploit
(20:42):
that, antivirus programs, and, um, I've dealt with recently where people look up Microsoft's number to try to reach them.
Well, there's so many people that create profiles that say they work for Microsoft, or these different sites that say they are support for Microsoft-related things, and again, it's more
(21:04):
open source than Apple, but you call that number and it's actually a scammer.
And you go, okay, well, let me be smart, let me look up this person.
He said his name was, uh, you know, Michael Johnson, let's say.
And you look up online: well, I'm smart, let me look this up.
Michael Johnson, yes, he does work for Microsoft.
Well, yeah, he said his name was Michael Johnson because there is
(21:25):
someone from Microsoft that you can find easily online with that name.
So they try to build that credibility.
So, and it's more than just audio, but video too.
A lot of the videos we're seeing are being faked.
SPEAKER_01 (21:38):
Right.
And, um, a lot of these calls are, um, just so you know, gonna happen morning or night.
So late at night or early morning, uh, that's when they catch you off guard.
The point is to catch you off guard.
Companies will not ever do that.
SPEAKER_04 (21:54):
Yeah, that's outside of their normal hours.
SPEAKER_01 (21:56):
And then, if they're doing that, cloning a family member's voice, again, they're gonna bet that you're not gonna go ahead and call that family member early in the morning or late at night.
Yep.
And so that's kind of what they're betting on.
SPEAKER_04 (22:09):
That goes with what your brain's gonna react to, like, whoa, this is outside the normal hours.
SPEAKER_01 (22:14):
So before wiring
anything or doing anything, wake
up your family member.
Yeah.
Don't worry about it.
Exactly.
Because they're gonna rather you do that than send somebody something that's not true.
SPEAKER_04 (22:25):
Exactly.
So we're good, good.
And just, um, want to talk about some of the video ways in which this is done, how AI is contributing to, you know, and a lot of this is just playful material.
You take a video of some, uh, scene or event, and then you have
(22:49):
AI see what it can do with it.
So, and of course in the political realm we see, did that congressman really say that?
Or did the president say this?
And these videos of things, a lot of it is just to get a chuckle, but there are people that see it, and it's believable enough that you feel like, oh my gosh, I didn't know
(23:09):
that, and then you share it.
So we're sharing a lot of mess that isn't even true, on both sides of the aisle, and we just need to be very careful about that, and, um, you know, just making sure that we realize that yes, the technology is getting better, it is getting believable, but if it doesn't feel right, it probably
(23:30):
isn't.
Exactly.
SPEAKER_01 (23:31):
And we do have to kind of go back to our moral standard here, right.
Right.
SPEAKER_04 (23:35):
And there's the too-good-to-be-true, you know, concept we need to go with, and also, with a lot of, uh, scammers, there's the I-can't-believe-this-happened situation, and we want to fix that, and they want to be the hero.
And, um, you know, oftentimes there is a scenario that may or
(23:56):
may not have happened, and you're believing that this person will help you through it.
So you're actually allowing the scammer in.
Don't call the numbers, um, don't believe the pop-ups, um, don't believe the emails.
Um, and here's something worth noting.
I was talking with, uh, someone that follows our podcast very
(24:20):
recently, and it never dawned on me, but it used to be, the very nature of it was, a lot of the emails you got, you could tell were written by someone that maybe didn't speak English natively.
So, you know, the keywords were kindly sir, there you go, kindly sir, do this.
(24:40):
They're now using AI.
And AI makes it sound a little more professional.
Not only that, but very believable, because they're doing their research.
Right.
So the email you're getting sounds like it came from them, because they're finding other writings on the web.
And this ended up being a really good phishing attack.
(25:01):
Um, you know who you are, I'm not gonna call you out, but, uh, cheers.
Um, but yeah, so you know, it's always good to have a certain level of mistrust.
We all want to keep our families safe and everything like that.
SPEAKER_01 (25:14):
Which has been since the beginning of time.
From the beginning of time, and, you know, there are certain things you do trust, again, with your moral construct.
Yes, and then, you know, yeah, if it sounds too good to be true, it is.
It is, right.
If it sounds, you know, just a little beyond, it is.
Yeah, you know, um, that's kind of the way to protect
(25:37):
yourself: you know, make sure that it's legit first.
SPEAKER_04 (25:42):
Right.
And I know we don't, like, go overboard on this, but we are cautious, and we've even instilled some of this in our kids.
And like, we go on vacation, and I know we're at a hotel, and, you know, Dad, can you go down and get, you know, it's always some type of candy or something in the evening and all?
(26:02):
And I come back and, um, I'll knock on the door, and they were trained: what's the code word?
Now, I'm not gonna tell you what the code word is.
They know what the code word is, and to them it's funny, but it's still instilling a training of, you know, hey, this is Dad, you know, and there are not many situations where
(26:25):
maybe you won't be in a room with them or something, but if you weren't and the kids were just there, I mean, they're at the age now, but still, you know, this is your dad.
I mean, you know, don't just let them think it is.
It's, what's the code word, and now it's like, oh.
And if I don't say the code word, they don't let me in.
SPEAKER_01 (26:43):
Right.
But, um, that's okay.
SPEAKER_04 (26:45):
Yeah, it's okay.
So, and then, you know, but anyway, it's just, a code word is helpful.
Yeah, in this world we're in.
SPEAKER_01 (26:54):
So um set your
families up with a code word.
SPEAKER_04 (26:57):
Right.
SPEAKER_01 (26:58):
Just that word that means something to you and your family.
SPEAKER_04 (27:01):
Exactly.
Yeah.
SPEAKER_01 (27:02):
That's helpful.
SPEAKER_04 (27:03):
But you were saying this isn't new, in the broader historical sense of things, right?
SPEAKER_01 (27:07):
It's not, you know.
Um, uh, throughout history we've had to pretty much, uh, pull in all technology, you know, at some point.
Yeah.
Um, back in 1919, uh-huh.
Um, somebody actually yelled fire in a movie theater.
(27:30):
In 1919.
SPEAKER_04 (27:33):
Well, when my great-grandparents were born, yeah.
SPEAKER_01 (27:35):
Yeah, a very new concept, and, um, it became, obviously, a hazard.
Um, since 1919, it's been illegal to yell fire in a theater.
Well, it's one of those situations where you've got to kind of pull things in.
You've got to make sure that, um, if it's causing some sort of, uh,
(27:58):
civil unrest, that kind of thing, well, you know, the law has to kind of pull things forward and say, okay, we're not the wild west here.
We're not able to just sit there and let things run their course.
Right.
And, um, so that's where kind of our next little piece is:
(28:20):
where is the law in this?
You know, um, so in our film noir, of course, you know, well, you've got to have your renegade lawman or your trusty old shiny-badge sheriff.
Right.
Um, so where is this in AI?
AI is such an uncharted territory; it's, um, so new.
SPEAKER_02 (28:41):
Yeah.
SPEAKER_01 (28:42):
And, um, you know, we're getting there.
We're coming up with some regulations.
Um, the EU has the AI Act.
Right.
Um, they're pretty much kind of setting the standard.
Yeah.
Um, the regulations for us here come in from the FTC, and, you know, making sure that companies are accountable.
(29:04):
Um, AI is not a person.
Sure.
AI cannot be sent to a courtroom.
Right, it's a tool.
You can't blame it.
And so it is the companies that create it: you know, okay, you're creating this, you're putting this out there.
Um, just like the radio.
SPEAKER_04 (29:21):
Yeah, and if you're
implementing it, you need to
make sure that you're doing it in a responsible way.
SPEAKER_01 (29:26):
Right.
Radio waves, um, started to really be tapped into, you know, obviously, um, much, much later.
Um, turn of the last century, getting a little bigger.
Right.
And so the FCC came up with, you know, okay, yes, the radio waves
(29:47):
are out there.
Yes, you can kind of almost do anything with that.
So we've got this, you know, big boom of this uncharted territory.
Right.
So the FCC is, you know, okay, let's rein this in.
Let's make sure that we're putting something out there for the public.
All of the public needs to be able to hear this.
(30:08):
So one of the things they came up with, and George Carlin would love this:
the words you're not allowed to say.
SPEAKER_04 (30:15):
You brought up George Carlin.
SPEAKER_01 (30:16):
And it's the very George Carlin concept of: these are the words you're not allowed to say.
SPEAKER_04 (30:21):
We will not say those. I'm not going to say them either, but they're running through my mind at the moment.
SPEAKER_01 (30:27):
If you want to look up George Carlin, he makes a really funny kind of a spin on that one.
SPEAKER_04 (30:33):
Comically shows what you can't say.
But yeah, you know, it was a way of saying, okay, the public's going to use this, and we need to make sure, I mean, because we're using it as a technology.
I mean, think of the emergency broadcast system.
Right.
So people relied on the radio for communication, even today, in the same way, you know, any, um, disaster-type thing is,
(30:54):
like, have a battery-powered radio accessible for communication if there's a tornado or something.
So, you know, your family's going to be listening.
SPEAKER_01 (31:03):
So make sure that... even with television, you know, it used to be, you know, you'd get the beep-beep on television and all that.
Well, not everybody... people stream nowadays.
Not everybody has the same local, regular television channels.
Exactly.
And so you can't do it through television.
No.
So, uh, I think the emergency broadcast system has turned into
(31:24):
the phone.
Yep.
SPEAKER_04 (31:26):
Because you're at least acknowledging everybody has the phone: you get that text, you get that whatever, and so, like the Amber Alert, it, you know, scares us all when we're driving down the road or concentrating on something.
SPEAKER_01 (31:37):
Right.
Oh my god okay very importantthing to have that because
that's kind of getting sure amessage out there very quickly
something like an amber alertyep huge um need for that to be
fast yep and and make sure thatpeople know very quickly in a
specific area.
And so you know we've come upwith these things.
(31:58):
Hopefully well as AI isfunctioning in a a a deeper
level we're going to be comingup with more and more of this.
We're going to have probablymore unfortunately more um
courtroom settings where wheredo we draw the line and and you
know some policies coming upagain yelling fire in the middle
(32:20):
of the theater back in 1919people get trained realize that
was not a great idea.
So same concept uh radio wavesgoing out there uh in the turn
of the century we're not goingto be able to say some things
children could be listening trueso we're you know we're gonna
kind of slowly regulate thisconcept and the technology to
(32:41):
this fear of the Wild West we'regonna get there.
Yeah and and the main thing is Ithink with AI it's like is it
white hat is it black hat and Ithink the concept is what hat
are you wearing?
It's both yeah it's it's the hatyou're wearing we're all using
(33:02):
it really makes the theunderstanding of what what this
means for you.
That is true.
If you're wearing the white hat, and you're saying, you know, okay, I'm using this as a tool to benefit myself, as far as Google search on steroids, that's going to be useful to me.
SPEAKER_02 (33:20):
Yeah.
SPEAKER_01 (33:21):
You know, I'm not doing this to go against people, I'm not doing this to enter into some sort of, you know, web that I have no reason to touch.
Yeah.
You know, I'm doing this for something that's beneficial.
You know, that's really up to you.
That's your version of the white hat.
Yeah, um, can it be used for nefarious reasons?
SPEAKER_04 (33:43):
That, along with everything else, yes, yeah, absolutely; a lot of things, you know, can fall into the wrong hands, but can also be positive to those that intend it to be, and, you know, so what is AI good for, you know, um, why are we using this?
SPEAKER_01 (34:00):
Why does this matter
can we just completely ditch it?
Well uh not necessarily it it'sbeen proved to be extremely
helpful when it comes to weatherpatterns um you know yes we can
make really good predictions ashumans AI can do it a lot faster
and so farmers really rely onthat.
SPEAKER_04 (34:22):
Yeah um it catches
fraud yeah so even though it can
create fraud it can catch fraudextremely easy in the same
patterns I was talking aboutwhere it wants to suggest me
purchasing something based on mypatterns of life it also says
you are not the kind of personthat's going to buy a$300 pair
(34:42):
of shoes from a company inCalifornia.
Right.
And that has that has been areal case where they're like
they called me up and I'm likewhen Chase Card Services calls
you're like oh no and it's likeno I did not buy those shoes
you're exactly right.
Well it didn't it didn't fityour pattern it's like I
appreciate it go ahead and blockmy card I did not do that.
(35:03):
So that is you know and thatstory comes 15 years ago that
essentially is AI.
Right.
It is now of course themachinery given into it you know
the supercomputing power it hasnow is where we're all seeing
what AI is able to do on a dailybasis.
(35:24):
And so conceptually it's beenthere but we flipped a switch
about three years ago that andwe're not we're not going back
because it's not hey this newcreature we got it's the
supercomputing power that we'reable to put into it has gotten
so much you know more powerful.
(35:45):
Oh yeah and we can't go back itit's no it's it's diagnostics.
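To make that pattern-based fraud idea from the card story concrete, here is a tiny hypothetical Python sketch: it learns a customer's typical spend and locations from past purchases and flags one that falls far outside them. The amounts, the three-sigma cutoff, and the $300-shoes-from-California example are illustrative only, not how any card issuer actually scores transactions.

```python
# Hypothetical sketch of pattern-based fraud flagging; thresholds,
# fields, and example purchases are invented for illustration only.
from statistics import mean, pstdev

history = [  # a customer's recent, legitimate purchases
    {"amount": 42.10, "state": "VA"},
    {"amount": 18.75, "state": "VA"},
    {"amount": 63.00, "state": "VA"},
    {"amount": 27.40, "state": "VA"},
]

def looks_fraudulent(purchase: dict, history: list[dict]) -> bool:
    """Flag a purchase that breaks the customer's spend and location pattern."""
    amounts = [p["amount"] for p in history]
    avg, spread = mean(amounts), pstdev(amounts) or 1.0
    unusual_amount = purchase["amount"] > avg + 3 * spread
    unusual_place = purchase["state"] not in {p["state"] for p in history}
    return unusual_amount and unusual_place

shoes = {"amount": 300.00, "state": "CA"}  # the $300 shoes from California
print(looks_fraudulent(shoes, history))    # True -> "did you really buy these shoes?"
```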
SPEAKER_01 (35:48):
Yeah, exactly, in the healthcare industry, right, yes; if we think about AI and how fast it can diagnose something based on all of the information, exactly, because we don't always think of all of the information, sure, but if you plug something in and you start to really explore, yeah, a diagnostic situation, yes, it can come up with a diagnosis and an early
(36:13):
detection of disease much quicker.
Of course, so that's a huge benefit.
It is, uh, a benefit that I don't think humanity's willing to just let go of yet.
SPEAKER_04 (36:23):
No; unfortunately, if I can leverage it and I can do better in my practice, then I have leverage over my competition.
But if you sit back and go, well, I don't know, well, your competition's going to use it.
The same thing with the internet, the same thing with any technology.
Well, if we use this and they don't, or if they use it and we
(36:45):
don't, then we're behind.
SPEAKER_00 (36:47):
Right.
SPEAKER_04 (36:48):
It's nothing new to
us with all this other
technology out there.
SPEAKER_01 (36:53):
With radio, with television, with, um, streaming, with, you know, again, it's kind of what you choose, um; the same thing, we've had television, um, in the home since, you know, probably the 50s, maybe everybody in the 60s, yeah, um, where it's almost in every single home, right, and so
(37:14):
you get to that point where, okay, this has potential for being nefarious.
Now our children are kind of staring at the screen.
SPEAKER_04 (37:25):
So, one-way communication; are they going to believe everything they see?
SPEAKER_01 (37:30):
But, you know, there's also certain situations, as you and myself grew up with, you know, um, Mister Rogers' Neighborhood, oh yeah, you know; you don't have that without that ability to be out there on the airwaves, exactly, yeah, and certain things like that where it really opens up
(37:50):
the, um, spiritual and moral, uh, development of children, exactly, you know, that kind of thing, that's a very big deal, right; uh, radio waves, you know, you've got, you know, sure, you could listen to really awful stuff, you can also listen to, you know, uh, connecting spiritual music, you can listen
(38:13):
to things that are going to uplift you, yeah, and so that's really your choice when it comes to whether you're pushing that dial, right, when it comes to, back in the day, with an actual dial, yeah, an actual dial, yeah; it's the exact same thing when it comes to, literally, tuning in, yeah, right, when it comes to
(38:33):
tuning into, um, this pattern with AI: are you gonna use it wrong, right, you know, yeah, exactly, I know, yeah, makes sense; and, um, I think that's really why we probably need to go ahead and get that, um, sense of morality through God, through the
(39:01):
spiritual connections, right, and our connections with people, and our want to go back to, um, why that was important to begin with.
I agree, it's important today.
It is, and, um, that's the thing about us and the concept of both past and present: that, you know, yes, we're very
(39:21):
technological, but we also really focus deep into the past, yeah, um, and our moral past, because that is kind of your, um, your solid foundation.
It is, and I think that's what we're called to do.
I think that's what we need to do when it comes to all of this new technology coming out there that is, um, scary.
(39:44):
Yeah, it is, you know, in the big, you know, October way of spookiness, it's more spooky than that.
Yeah, but if we maintain our spiritual calling, yeah, and our morality, you know, in everything that we do, we have to, you know, maintain that within our household and within our
(40:08):
families, and so throughout that community, you know, we want that message to be heard.
Yeah, let's maintain this, for sure, you know, and nefarious things will happen.
Yeah, and, you know, those of us that know, um, some things about the technology, we're here to fight with you.
SPEAKER_04 (40:26):
Yeah, mm-hmm, and I've always said that, you know, yes, technology, uh, allows for a lot of things, and, you know, um, if technology is able to do things, and, you know, I mean, think back to the Jetsons, um, you know, with, uh, um, was it Rosa?
(40:50):
Rosie, Rosie, I was thinking Rosa, uh, Rosie, the, uh, the robot that, you know, took care of stuff in the household, and of course the jobs are seen as just pushing buttons, which, I mean, we're close to that, and so, yes, technology eliminates, you know, the physical and the grind, if you will, but that allows for
(41:15):
us to be more human.
But what does human mean?
And we need to really look inward and realize where we are in this equation, because, uh, we really are two sides with this computer in the middle, and, you know, we need to be louder than the noise; we need to make sure that our moral compass, and the way in which, you know, we participate with it, is with
(41:38):
good intentions.
You're always going to have someone bad on the other side; we can't get rid of it.
If we abandon it, then they take over.
Right.
So it's here.
SPEAKER_00 (41:53):
It's here.
SPEAKER_04 (41:54):
So, um, I guess we just need to be aware of that and keep that in mind.
And like I said, being louder than the noise means, if we have concerns with it, then don't just say, you know, well, let's, you know, wish it away.
It's not going to go away.
We need to educate and stay up to speed, because it's gonna be
(42:16):
here for our kids.
Just like in the school system, there's, you know, a lot of research and a lot of discussion, with, um, and of course devices like mobile phones have been pulled away, and there's been mandates in Virginia and other states where they can't have these phones during class.
(42:38):
And while I do understand it being a distraction, I also can't help but see that in the future, when, you know, the magic day of graduation comes and a company picks you up, they're gonna be like, okay, we're using these computer systems, we're using
(42:59):
AI, what's your familiarity with that?
And if you haven't been taught this, can it come back on the school systems as neglect? And I'm reading more and more about that.
So I think, while we're trying to play a safe move or a convenient
(43:22):
move, we're trying to, you know, juggle the distractions of technology, we really need to make sure people understand what it means earlier, because, again, it's not going away.
If toddlers have a tablet and they're interacting with it, but then go into school and don't, you know, what are they really
(43:46):
learning?
Who's actually looking over their shoulders at what they're doing? We actually need to explore this, and our political leaders need to also understand that, if we're not doing it right, and of course I'm not expecting government to be the savior, ever, but as a community we need to make sure that we understand
(44:09):
what is being put in their hands and to what level, and we know in the real world it's gonna be there, so we need to make sure that we're preparing our next generation.
SPEAKER_01 (44:22):
And then we're preparing them morally.
Exactly; as opposed to black hat and white hat, we can't just throw the hat in, because if they're gonna make a decision between white and black hat, yeah, then if we throw the hat in, we're not making that decision at all.
SPEAKER_04 (44:37):
Yeah, throwing in the hat, right; so let's, uh, support the white hat, let's support the white hat, listen to those that know, right, allow them to educate the right path, and I think we'll be okay; but a knee-jerk reaction may not be the best way of solving this, right, so maybe you need it
(45:00):
in your hand, but understand that this is a moral dilemma that you need to navigate through.
Exactly, and, uh, to have the experience with it in the right way, because in the real world, again, it's there, so you need to know, you know, don't just, like, don't wait until you're, you
(45:21):
know, 16, to then, all right, here you are, you're 16, here's a car, here you go, and, uh, oh, should have told you the brake was on the left first; but anyway, that comes with experience; let's not let the first experience be when you finally reach the age but you've had no experience.
SPEAKER_01 (45:41):
Well, and fortunately, with driving a car, you don't end up with some sort of moral dilemma.
Right.
When it comes to, you know, these devices, yes, you're gonna have a lot of moral issues, yeah, and you might as well start that conversation, you know; we need to start it now, we need to start it in, um, the bigger picture.
SPEAKER_04 (46:00):
Yeah, yes, exactly, right; so, well, I think we've done enough today, and, um, were we good detectives? Well, I mean, part of being a detective is, you know, obviously, looking at things from both sides, and also understanding, you know, um, where
(46:21):
the evidence is, and how, you know, what the past means, and sometimes history repeats itself; but, um, I think, uh, we're gonna be winding down, all right; um, always know that you can find our podcast at, um, winnoniteweb.com.
(46:42):
Yeah, we have it there and all the popular platforms, you know, of course Apple Podcasts, Spotify, you know, being some of the leaders there, but, um, we appreciate all the support and everything, and, um, unplugging for now.
SPEAKER_00 (46:57):
But always stay connected.