Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Hello, hello.
Welcome again to another podcast.
We've got Blake here.
Speaker 2 (00:07):
Hello, everybody. So today we're going to talk about some of the cybersecurity headlines that are going on right now. Which one did you see that interests you?
Speaker 1 (00:21):
The Google Calendar
being targeted by hackers is
interesting.
Speaker 2 (00:28):
I'm a huge fan. I don't know, something about North Korea has always fascinated me. So apparently there's a North Korean state-linked group called BlueNoroff that's essentially been using malware written in Objective-C to hack Mac computers.
(00:50):
But you don't see a lot of Mac viruses out there.
That's surprising.
Speaker 1 (00:56):
So is the quick version to protect yourself on a Mac just to update and reboot, or is there something else that people need to do?
Speaker 2 (01:05):
I actually use software called CleanMyMac, which essentially has malware removal in it. But of course, you always keep your computers up to date, especially Macs. I remember there was a rumor going around a long time ago that Macs can't get viruses.
Speaker 1 (01:22):
Yeah, that was a myth
a long time ago.
Speaker 2 (01:26):
No, they can. Obviously, I think people just choose not to write Mac viruses because Macs are a small percentage of the computer market. Why would they spend all this time, energy, effort, and resources creating software that affects a small percentage of computer users? So, no, just stay up to date.
(01:52):
CleanMyMac is a good remover. What else? Let's see.
Speaker 1 (02:02):
So let's look at this Google Calendar thing. It's called C2 or CNC, command and control. It refers to a hacker-controlled server that's used by cybercriminals to send commands to, and receive data from, computers that have been compromised by malware.
So they're saying, in this case, there's a new proof-of-concept
(02:22):
exploit. Google has not observed it in the wild, but it's been shared recently on a bunch of hacking forums.
I'm just trying to see if there's anything that people need to do.
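The proof of concept being described (often reported as a "Google Calendar RAT") reportedly works by having the implant poll a calendar event's description field for commands. A minimal sketch of that polling pattern, with the Calendar API replaced by a stub and the "cmd:...|result:" framing invented for illustration, might look like:

```python
# Hypothetical sketch of a calendar-based C2 polling loop.
# The real PoC talks to the Google Calendar API; here the API call is
# replaced with a stub so the control flow runs standalone, and the
# "cmd:<command>|result:" framing is made up for illustration.

def fetch_event_description(event_id):
    """Stub standing in for a Google Calendar API read."""
    return "cmd:whoami|result:"

def poll_for_command(event_id):
    """Extract a pending command from the event description, if any."""
    desc = fetch_event_description(event_id)
    if desc.startswith("cmd:") and "|result:" in desc:
        return desc[len("cmd:"):desc.index("|result:")]
    return None  # no command queued

print(poll_for_command("example-event"))  # → whoami
```

The point of the design, and why defenders worry about it, is that the traffic looks like ordinary calendar syncing to a legitimate Google domain.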
I don't know about you, but I've noticed, maybe with the
(02:49):
holidays coming around, an uptick in phishing and smishing and all sorts of social engineering attacks around deliveries and delivery notifications. Or payments, ACH payments.
(03:10):
I don't know if you've gotten those, but yeah, they're basically preying on the chance that you're going to click on one of these things.
The other thing that was kind of interesting, that I was going to talk about, is a lot of these class action lawsuits that happen, and a lot of times it's hard to
(03:30):
distinguish if the email is legitimate. I'm talking about, like, sometimes they'll send an email and say, oh, XYZ company was involved in a class action lawsuit; if you want to claim damages you have to click this link. And the first thing I'm thinking is, I'm not clicking on those links. And they register these super long domain names
(03:53):
that are notorious for being used in hacking and phishing and stuff. So I don't know, what about you? Have you ever participated in any of that stuff? I just avoid it.
Speaker 2 (04:07):
Yeah, realistically it probably isn't even worth my time, but I was part of the Experian one, or one of those credit bureau issues. By the time you submit your paperwork and do all that, you get like five or six dollars. Is it really worth it? There should be a higher cost associated with it. I mean, they need
(04:30):
to set some kind of minimum damages clause. If you look at my customer profile from any company, my data is worth thousands of dollars, because they can use my data to deliver ads to influence purchasing or buying habits. I mean, my data is worth thousands of dollars.
(04:53):
Every person's data out there, arguably, is probably worth the same.
Speaker 1 (04:58):
And I would argue that it's probably worth more than that, depending on how detailed the data is. Companies would be willing to pay, I would say, tens of thousands of dollars if they can get certain data points.
Speaker 2 (05:09):
Exactly. So there you go. But yeah. So it's kind of offensive when my Social Security number or my credit information is compromised and they give you five dollars.
Speaker 1 (05:23):
Yeah, well, I think also, my point is that the mechanism, in this case email, is insecure, yet email is so relied upon as the vehicle to transmit this information. I feel like there should be a government website, like a portal:
(05:44):
hey, you've been involved in this class action lawsuit, you can claim damages of $20 or whatever, and then you should be able to log into the dashboard of the government website and claim your damages. You shouldn't be going to a weird
(06:05):
email address that you've never seen before and clicking on links. That's just a recipe for disaster, like you said. Oftentimes it's going to lead you down a rabbit hole: you're going to fill out a bunch of paperwork and you're going to get five bucks.
Speaker 2 (06:18):
There's been one for my car recently. And I got an email a few days ago about my pressure cooker; there was some lawsuit. They're like, oh, the gradient levels on the side of it
(06:39):
aren't marked properly, it could expose you to boiling, to burns, don't use your pressure cooker. But the remedy is, click here and register your pressure cooker and they'll send you a new gradient or something. So yeah, I've been part of a few of them, and I definitely think the government should have some type of resource to step in and mandate these recalls,
(07:02):
because they're the ones that are forcing these recalls. The companies don't want to do that; they lose money.
Speaker 1 (07:11):
Well, it just goes back to: there's just no standardization, it's like a free-for-all, and everybody's just kind of on their own. I mean, I was talking to one of our partners, and he's a CMMC Registered Practitioner, and he was saying that the CMMC, or
(07:36):
the Cyber AB now (it used to be the CMMC-AB), changed the requirements so that you can't just be an RP on your own anymore; you have to be associated with an RPO, a Registered Provider Organization. And obviously we're an RPO; we're both RPs and we're associated with the firm, the RPO.
(07:58):
Well, anyway, since we're good partners, I allowed him to join our RPO (we're partners anyway, so might as well), and that gives him access to some of the training that the Cyber AB has come out with. And I heard that basically CMMC is on Biden's desk right now,
(08:18):
just needs to be signed. So that's good news. I feel like it's so important for standardization.
And there's also a new track now. I don't know if you're aware of this, Blake, but there's now a Registered Practitioner Advanced track. So fun stuff for us.
(08:38):
Now we've got to take another training, though I guess, I haven't done it yet, I haven't looked through it. But long story short, I went to try. I haven't logged into the portal in a long time. I go to log into the portal, and of course it doesn't like the password I saved in my password manager. So then I go to the password reset, and it's like, oh no, this
(09:01):
doesn't work, you need to email, blah, blah, blah. So I email, and then I'm waiting; I don't have access to it. It's like one thing leads to the next. But my point, bringing it full circle, just to close the loop on the class action lawsuits and things like that:
lawyers in general, I feel like there's just not a lot of
(09:22):
standardization, too much emphasis placed on insecure email. I mean, I was doing some stuff with a law firm and they're like, yeah, fill out this paperwork, whatever. And I'm like, well, do you have encrypted email or a portal? And they're like, no, no, we just use Dropbox. Like, no, I'm not using Dropbox, sorry. I'm like, here's my encrypted email.
(09:44):
But my point is that we're smart enough to see that, and then react and provide a solution. But what does the average consumer do? The average consumer is relying on the company or the vendor or the government to provide these resources that are supposed to be secure. But I feel like we're in the wild west now, where
(10:08):
everybody's got to have their own solution. You know what I mean? There's just no standardization on any of it. So I hope that CMMC coming into this will help streamline regulation and reduce confusion, because in my opinion it's been quite the bumpy road. I mean, it's been expensive, super expensive, for us to
(10:31):
participate in the ecosystem. We've helped a lot of great clients, but I still feel like there are just so many potential prospects and clients out there that are really doing nothing, just kind of waiting on the sidelines. And if this thing gets signed, which I really do hope it does, I feel like that's going to be almost like a
(10:51):
gate lifted for people needing help and finally coming off the sidelines.
Maybe. I mean, I'm trying to be optimistic about it, but I don't know. What has it been, almost four years now, right? Yeah, yeah, I remember it was signed a long time ago. Well,
(11:15):
the executive order was signed, and then there were updates, but CMMC was just kind of in beta; then it went to 1.0, then it went to 2.0. So anyway, hopefully, maybe by the time this thing airs, we'll have a signed-off CMMC ecosystem and program. And I think once that happens, not only will people and
(11:36):
companies maybe take it more seriously, but maybe vendors will take it more seriously too, because I feel like vendors are, again, another free-for-all. It's like, well, we're not really compliant with that, but we have this, and it's not even the same thing. It might be CSF or some version of NIST, but not quite 800-171 or 172,
(11:58):
and nowhere near CMMC. And then the same thing with HIPAA compliance. So I think all of this would get simmered down and simplified.
Speaker 2 (12:08):
Yeah, I mean, obviously there are so many cybersecurity and compliance mandates that one company has to jump through. I would like to see it centralized, you know; CMMC could be used in all businesses.
Speaker 1 (12:28):
Yeah, that's kind of my point. And also, to come off of that, the reason it's beneficial for all businesses, in my opinion, is because of the maturity, the structure. You really need the documentation, the policies, the procedures, the organization of all of it. It's only going to help your business grow.
Speaker 2 (12:50):
Right, yeah. I mean, if you have the ability to take somebody's money, you should have the ability to protect the information they give you. I mean, it's like, if you have kids and something happens and somebody breaks into
(13:11):
your house, and I know this is the worst analogy, but you, as a father, are expected to defend your family, right? A lot of these companies take your money, but they don't defend your data. They're just like, oh well, too bad, sorry. I mean, all the big players do it. There's no regulation,
(13:36):
right? So if you're taking somebody's money... Well, not only is there no regulation.
Speaker 1 (13:39):
There's no... I guess the better way to put it is: there are no clear rules and guidelines, there are no clear blueprints on how you do this, and then there are no boots on the ground supporting enforcement. So I think that's the big mess with
(14:01):
the way things have been for a while. It's just like, oh yeah, we're regulated, we need to comply with XYZ compliance, and then really, in my opinion, the only ones that play ball are the ones that get pressure from a vendor.
(14:21):
The vendor is the bigger company, typically the more mature company. They're like, you want to be our customer?
Well, where's your cybersecurity pen test evidence? Have you done a third-party audit? Where's your gap assessment? Where's all this stuff? The vendor is asking for all of this, and the small business typically is like, what's that? And then they go down these rabbit holes of, how do I
(14:44):
answer this question? It usually starts with cybersecurity insurance or some questionnaire there, and then that leads to something else, maybe with the vendor, that requires supporting evidence of certain things being done. So my point is, it's usually this kind of journey that happens.
Speaker 2 (15:03):
Yeah, I mean, that's not the way to secure your business; that's not the way you legitimize your business. If you are taking money from somebody, there is an exchange: you're getting compensated for your work, your effort, your security, your compliance.
(15:24):
It's a weird world where people just say, oh, I can get X money from the government and do nothing for it. I mean, we live in that world, where people think they can take money for nothing.
Speaker 1 (15:39):
And we talked about this on the last episode. I think we also live in a time where everything is so trustless: you can't trust anyone, can't trust any company. Everything needs to be verified on so many levels. And just look at the headlines. I mean, banking. I don't know if you remember back in 2008, with
(16:02):
the whole crisis and banks over-lending and overstretching with credit. In my opinion, the writing's on the wall: it's all pretty much happening again, and I think this time around people are starting to catch wind of, hey, maybe this isn't the best system anymore.
(16:24):
So I think people are getting smarter and there are different perspectives on things, but a lot of people don't really understand how it all works, and there's a lot of misinformation out there.
So we're in this time period where there's so much overload,
(16:44):
I mean, with everything, like AI, artificial intelligence. Elon Musk announced that he put out Grok. Did you see that? No? So he launched Grok, which is a competitor to ChatGPT and Google Bard. It's Elon Musk's take on, hey, these guys made a chatbot, so I want an AI chatbot, and
(17:08):
supposedly it's, I guess, less censored. So certain topics that were kind of off the table, where maybe OpenAI won't respond, or ChatGPT won't work, or Google Bard won't work. This is supposed to have more of, almost like, a humor to it,
(17:29):
or a sarcasm to it.
I haven't tried it yet. Apparently there's a waitlist, so I got on the waitlist to play around with it. But there's just so much information, and I feel like it's hard to pick apart what information is trustworthy and what's not, because what a lot of people
(17:50):
don't realize, too, is that a lot of the media is paid off by certain groups, and there's bias and influence, so it's really hard to get the truth. Anyway, we're going down a rabbit hole; that kind of deviated.
Speaker 2 (18:08):
I know, I know. AI has been a buzzword. It was crypto for a while, and now it's AI. And did you see there's a company that appointed an AI bot as their CEO? Did you see that?
(18:28):
No? Yeah, so I saw this article on Bloomberg or whatever. I mean, is that fabricated, or... I think it's just a PR stunt.
Speaker 1 (18:40):
Yeah, that's what I was just going to say. I feel like that's something that just kind of gets press.
Speaker 2 (18:46):
Exactly, that's the buzzword, right? So it's a Colombian rum company, I'm not going to say their name because clearly they're getting all the free press, that appointed an AI robot as the company CEO. Mika is a research project between Hanson Robotics and the rum company, who customized the CEO to represent
(19:09):
the company and its unique values. Mika said that, with advanced artificial intelligence and machine learning algorithms, I can swiftly and accurately make data-driven decisions. Okay, cool. We'll see how long that lasts.
Speaker 1 (19:29):
Yeah, well. I know Elon, when he launched Grok, was basically saying that he sees AI as the future, where it's going to basically take everybody's job and we're all going to live an elevated lifestyle. And I just don't see that. I mean, I understand the whole Terminator thing, and
(19:50):
I've taken a lot of AI certifications and things like that. I do think that there's a use case for a lot of it. I think it's helpful. One analogy is that it amplifies your skills and gives you almost like superpowers to do certain things, and it does more of the drudge work that a lot of people don't want to do.
(20:11):
It's really good at that, right? But I don't really see it completely replacing a human, especially at the higher levels. I do think that sorting data or data classification, things like that, really just mind-numbing work,
(20:33):
I think that's great for AI. And I do think it'll eliminate some of the quote-unquote jobs in the lower bracket, but it'll also create new jobs, jobs that are different, that either don't exist yet or are brand new. And, like I said,
(20:54):
those mundane tasks will just keep getting distilled and drilled down.
Speaker 2 (21:01):
Yeah, I mean, I think there has to be a human element to everything, except for, like you said, data sorting. There are a lot of jobs out there that people don't want to do, right, and AI may fill the gap for those jobs. But AI is not going to, you know,
(21:22):
be your trash pickup on Saturdays. AI is not going to do that. Maybe, maybe.
Speaker 1 (21:27):
I mean, I do think that might be a bad example. I do think that one day there could probably be a reinvention of trash pickup, where you have an autonomous vehicle pick up your trash, and, you know, who knows, maybe that'll work out. I think there are going to be, like I said, good use cases and
(21:48):
bad use cases. In our world, for compliance, I do think that in the future, when CMMC is law and CMMC puts a lot more pressure on vendors like Amazon AWS GovCloud and Microsoft Azure, there will be push-button
(22:11):
compliance from a security control perspective. What I mean by that is, you may be a defense industrial base contractor, and, who knows, in 20 years you may be able to sign up for Microsoft's GCC High and click a button, or fill out a questionnaire, and it'll auto-harden the
(22:32):
environment. I could see something like that, but our jobs wouldn't be eliminated. We as humans and cyber experts would still be needed to work with the humans and the other aspects outside of the configuration of the endpoint or the system level. Does that make sense?
(22:52):
Like, the security hardening, the manual process, all that stuff that has detailed instructions, you can program software to do that. But when you get to a higher-level discussion or a tabletop or simulations, you can't really automate all of those things,
(23:15):
especially the more complicated ones.
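The push-button idea above, answer a questionnaire and have the environment auto-hardened, can be sketched as a simple mapping from questionnaire answers to baseline settings. Everything here (the questionnaire keys, the settings, the merge rule) is invented for illustration; it's not any real CMMC or GCC High feature:

```python
# Toy sketch of questionnaire-driven hardening. All names and settings
# are hypothetical, purely to illustrate the "click a button and it
# hardens the environment" concept discussed above.

BASELINES = {
    "handles_cui": {"disk_encryption": True, "mfa_required": True},
    "public_facing": {"tls_min_version": "1.2", "waf_enabled": True},
}

def build_hardening_plan(answers):
    """Merge the baseline settings implied by each 'yes' answer."""
    plan = {}
    for question, yes in answers.items():
        if yes:
            plan.update(BASELINES.get(question, {}))
    return plan

plan = build_hardening_plan({"handles_cui": True, "public_facing": False})
print(plan)  # → {'disk_encryption': True, 'mfa_required': True}
```

The mechanical half (mapping answers to settings) automates cleanly; deciding which answers are actually true for a given contractor is the human half.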
Speaker 2 (23:18):
Yeah, I agree. And something I thought about too, while you were talking, is the old Terminator movie. I think it's Terminator 2, with the liquid guy. Yeah, that's the second one. But yeah, in
(23:39):
that movie there's a scene where the young kid is crying, and he's like, what's wrong with your eyes? And he's like, oh, don't worry about it. The robot doesn't understand emotion, right? And the only reason I say that is because there are a lot of decisions that have to be made emotionally, in day-to-
(24:03):
day operations, decisions that businesses need to make. It could be hiring somebody, it could be firing somebody, it could be moving from vendor to vendor, for the best of the business, for the best
(24:24):
of the future of the company. Those are emotional decisions that nobody, no robot, no software, no script, can make, and these AI bots and algorithms aren't designed for tasks like that, right?
(24:45):
So I don't know. I agree, I don't see them taking over.
And I'm surprised that Elon was doing something like that, because of what he was saying. At first I thought he invested in ChatGPT, and I don't know if that was just a rumor, but then he talked about how ChatGPT was the worst thing ever.
(25:07):
But, being real, Elon Musk is obviously known in tech, and he's known by everybody. But yeah, it took a lot of government funding to get him to where he's at, to build Tesla and to build these EVs and these solar panels. He's secured a lot and
(25:30):
developed a lot of his success off the backs of the taxpayers, right? He had a lot of companies that failed before PayPal, you know.
Speaker 1 (25:41):
Well, and there's the whole... I don't know if you've been following, but there's a whole perspective on the electric vehicle thing too. I heard recently in the headlines that sales are down and that people are starting to realize there are just a lot of gaps in the current market with EVs:
(26:03):
range, for example, and lacking infrastructure. So there's this push to go in this direction, to not need gas, right? But I looked at my fuel usage for last year. I don't drive a lot, but, you know, it was less than $10,000
(26:24):
for the year. And I'm like, well, I can buy an electric vehicle, but an electric vehicle costs a whole lot more money than, typically, a gas vehicle, and I'm restricted. I always have to have it in the back of my head: well, where can I go, and what if I get stuck? People don't realize that if you go faster, or if you're
(26:48):
pulling something like a trailer, or you're loading the vehicle up, you're straining the battery more, so your range reduces. If you go certain speeds or do fast acceleration, things like that, you're draining the power faster, so then you need a charge. So, yeah, it's getting better. Hotel partnerships are being created
(27:08):
all the time. I envision a world one day where the vehicles may charge themselves on the highways, but again, that requires a revamp of infrastructure, right?
Speaker 2 (27:19):
There are solar panels now that you can outfit, I think it's Tesla, but it's like a solar roof panel that will charge your car as well.
Speaker 1 (27:30):
But wouldn't... I mean, so I would envision that Tesla just embeds that technology into the roof or the paint of the car?
Speaker 2 (27:37):
Yeah, yeah, I think that's the direction, you know.
like.
Speaker 1 (27:45):
You know we see, or
the you know the whole push
towards EVs and less reliance onother countries, for I get all
that.
I support all that you know,but might not be the cure.
All solution for all situationsis my point.
You know what I mean.
Like there, there's a purpose.
Maybe if you're going to use itas a you know, just a commuter
(28:08):
car, take your kids to school orsomething like that, maybe
that's a great way to use it.
But if you know you want to goto the mountains or the beach or
you want to take a road trip,it's a little risky in my
opinion, because you have tokind of plan your route based on
charging.
And then what if you have asituation where you get stuck or
something happens, or you knowwhat I mean.
Like so it.
Anyway, my point is that samewith AI.
(28:31):
You know AI.
Ai has been around before.
They kind of amplified thewhole marketing buzzword of AI.
You know automations have beenaround for a while and just
using, like junk mail filtering.
You know Gmail labels andsifting and sorting of.
You know auto, auto suggestionsfrom Google.
(28:53):
You know all that's AI driven.
You know it's been aroundforever.
So it's just kind of, I thinkit's just how you apply it and I
think that there are good waysto apply it, but I don't think
that everybody's, I don't thinkthere's going to be a mass
execution of jobs, you know, andnobody's going to be able to
find work or anything like that.
I do think that there are coolthings being developed and I
(29:14):
think that that will continue tohappen and, you know, most will
fail and some, will, you know,come to fruition and I think,
overall, it's just like anyother technological advancement.
Speaker 2 (29:28):
Yeah, I don't think the government's going to hand over the keys to the missile silos to AI like they do in Terminator. Not anytime soon. I don't think that's going to happen. And yeah, I mean, the future's here. I remember, again, I'm a movie guy, but watching Back to the Future, I think it was 2015
(29:51):
when they traveled into the future, and they had flying cars in that movie, right? And it's like, where are those at?
Speaker 1 (30:02):
Well, that kind of reminded me of, remember, this was, what, five, maybe seven years ago now? Remember Amazon was really pushing drones? Yeah, yeah, drone delivery, drone delivery. I don't know about you in Las Vegas, but here they have drone delivery for certain things
(30:22):
now. Do you have that?
Speaker 2 (30:25):
I think so.
Yeah, I think so.
Speaker 1 (30:26):
Or you could order
like lunch.
Speaker 2 (30:31):
Yeah, I think so. I don't really pay too much attention to that, but I think so. What they do have in Vegas is self-driving cars. Have you seen those?
Speaker 1 (30:40):
I haven't seen them face to face or anything, but I've heard of them, and I've seen them in the news everywhere. Oh really? I heard of something they were testing, I don't know which vehicle it was, but a woman got caught under it. Did you hear about that?
Speaker 2 (30:56):
No, I didn't hear
about that.
Speaker 1 (30:59):
Yeah, somebody got trapped under the car and was dragged under it, and it was an autonomous vehicle. So it was a mess; it was horrible.
Speaker 2 (31:09):
Yeah, the ones they're using here in Vegas are like Hyundai Ioniq 5s or whatever. They're like little SUVs, and they have these spider arms that come off the top, and they have cameras on every corner, all across the bumper. There are cameras pointing in every direction. I've
(31:34):
never ridden in them, but they have to have somebody behind the wheel of the car should some incident happen.
Speaker 1 (31:45):
That's what I thought the Tesla design was for. I thought that on Teslas you can have full autonomous mode, but you have to be there to help, or whatever.
Speaker 2 (31:59):
As it should be. And then, if you come to the convention center in Vegas, if you take an Uber or a rideshare and it's a Tesla and you're going to the convention center, they take you underground in these little Teslas, and then the car just starts driving itself through these, I mean, little slim
(32:20):
tubes. You could almost literally reach out and touch the sides. I mean, it's crazy. Oh, that's wild. The experience, you know, for first-time people going through it. Because it's like, how is this car driving through here? It's so narrow, just barely big enough for
(32:42):
the car to fit through. That's crazy. Yeah, it's pretty crazy. And the car does the whole thing, drives the whole route. So if you're in Vegas, try that out. Cool.
Speaker 1 (32:58):
What's that?
Speaker 2 (33:01):
So we should probably
end on that one.
Speaker 1 (33:03):
Yeah, I was just going to say, I think that's a good drop-off point. But yeah, I think the summary is: AI is not going to eliminate your job anytime soon, unless you're doing one of those mundane tasks we talked about. Yeah, and it would probably be better for you anyway to not continue doing that and to find more rewarding work.
Speaker 2 (33:23):
Yeah, I guess the future will be an AI president, right? We'll see what happens with that. See if this AI can do a good job at this liquor company, and then, I mean, that could be the future, right, where AI makes government decisions. Who knows?
Speaker 1 (33:42):
I think AI is going to severely disrupt the legal industry. Yeah, the legal industry is really going to get upended by a lot of AI-driven software and tools around wills, trusts, all that stuff. I think it's going to be huge around tools that
(34:08):
kind of systematize that. I don't think it's going to mass-eliminate the need to have lawyers; I just think certain types of work like that will get eliminated.
Speaker 2 (34:22):
Yeah, yeah, I can see it. There are websites right now where you can get contracts written up for next to nothing using software. So, you know, I agree.
Speaker 1 (34:33):
Yeah. So, just a caveat: I would consider using a tool like that, but I would never use it alone. I would still want a human lawyer to review it in the end, because there are just so many things. I guess my point is, if it costs $5,000 for a lawyer or a
(34:55):
group of lawyers to do, like, a will and trust kind of thing, I'd rather pay some software $500 to do it and automate everything, then pay the lawyer another $500, or whatever their hourly fee is, to review it. In the end I'm still saving money, because I paid, using that scenario, maybe $1,000 instead of $5,000. I'd still have a lawyer, so it's still a disruptor, but I'd
(35:17):
rather have the peace of mind knowing that, okay, this has been reviewed. I think there's great value in that, and I think that's a human executive function, and that's kind of my point, and that's the takeaway here. It's the same with cyber and compliance: we can leverage the latest and greatest tools, but we need our team, our people, as humans, to go through it.
(35:37):
You can automate vulnerability scans and some of these things, but it's never the same as a real hacker or a white hat going through things.
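As a small illustration of that split between what's automatable and what isn't: a script can triage scanner findings by severity on its own, but judging which of the remaining findings actually matter is the human part. The finding format and threshold below are made up for illustration:

```python
# Hypothetical finding format; a toy triage step of the kind a scanning
# pipeline can automate, leaving the judgment calls to a human reviewer.

def triage(findings, threshold=7.0):
    """Split findings into auto-flagged (score >= threshold) and needs-review."""
    flagged = [f for f in findings if f["cvss"] >= threshold]
    review = [f for f in findings if f["cvss"] < threshold]
    return flagged, review

findings = [
    {"id": "F-1", "cvss": 9.8},  # critical: flag automatically
    {"id": "F-2", "cvss": 4.3},  # medium: a human decides if it matters
]
flagged, review = triage(findings)
print([f["id"] for f in flagged])  # → ['F-1']
```

The threshold rule runs unattended; everything in the needs-review bucket is exactly the part the hosts argue still needs a person, or a real white hat, looking at context.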
Speaker 2 (35:47):
Yeah, I mean, we can obviously look at things from other perspectives, shed light from different angles, in ways that software can't, right?
Speaker 1 (35:58):
And with fewer constraints, I guess, less stringent tunnel vision around it.
Speaker 2 (36:07):
Yeah, like, for example, some of the compliance software that we use. If we were to run a sweep of all their documents, it's like, oh, this company's not compliant, right? And then we go over and manually override it, saying, oh, this control doesn't apply to that company for X,
Speaker 1 (36:25):
Y, or there might be
like a way to sanitize the data,
to make it out of scope.
Speaker 2 (36:34):
Yep, yep, all right
guys.
Speaker 1 (36:38):
Thanks for listening.
Speaker 2 (36:40):
Yes, we'll see you on
the next one.
All right, take care.
Thank you.