Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joshua Schmidt (00:04):
You're listening to the Audit, presented by IT Audit Labs.
My name is Joshua Schmidt, your co-host and producer, and we have the usual suspect, Eric Brown, Managing Director at IT Audit Labs, joining us today.
How are you doing, Eric?
Doing great. Busy week, but doing good.
I know you're busy.
Thanks for taking the time to hang with us today.
And then we have Matt Starland. Matt, how are you doing?
Thanks, I'm doing well.
(00:24):
Thanks for having me.
It's been a few months since you've been on the podcast.
I'm glad we roped you in.
Usually we have Nick, but Nick's tending to other things that are important as well, but we're glad you can be here.
Eric Brown (00:38):
Probably picking up
another cat.
Matt Starland (00:45):
This is a better looking face, isn't it, than his, you know, a little bit more clean-shaven. We won't go there.
Eric Brown (00:47):
But you know, that's funny, because I did say to Nick, we've got this competition that's coming up here in a couple of weeks, and Nick and I are on two different teams. It's a capture-the-flag event, and I said to Nick, how
(01:07):
confident are you that you're going to be able to beat the team that I'm on?
And he felt pretty confident.
So I said, how about, if you lose, you've got to shave off half your beard?
So he wasn't that confident.
Matt Starland (01:19):
Yeah, you could do like a two-faced look. Shave half his beard here, then shave the other half of his head there, so it's just kind of like a checkerboard. Different.
Eric Brown (01:25):
So we did a dry run last night and it didn't go well. We jumped into this competition and most of the team wasn't there. It was just myself and one other guy on the team, and I think out of a possible like eight or nine hundred points in
(01:50):
two hours, we got a hundred points.
So, Matt, the door's open if you're joining Nick in that event.
Matt Starland (01:59):
Yeah, I mean, you could make me a plant. I could plant myself in there, and with the right sum of money, maybe, I could put in a little booby trap or something.
Joshua Schmidt (02:09):
I think that's
what we should do for our next
game night, Eric.
We should put a little money down.
Eric Brown (02:14):
Sounds good to me.
Joshua Schmidt (02:17):
Make it
interesting.
Or we can shave eyebrows.
Just an eyebrow. Just one. Half beard, other eyebrow, other side of head.
There you go, go right up the face. We do have the tattoo machine. Yeah, well, you know, you can tattoo a wild curly eyebrow or something, you know, once you shave. Is this a permanent tattoo machine, or is this...?
Eric Brown (02:37):
It's the temporary one.
The Prinker, from, um, what's the event in Vegas? The CES. When I went to CES a couple of years ago, I picked one up.
Joshua Schmidt (02:48):
Nice, nice.
I've not seen that in action yet, so I'll be looking forward to a demo on that one next time I see you at the office there.
Let's jump right in here.
We all picked out an article today, and the first article we're going to start with is Matt's article.
This is "E-ZPass toll payment texts return in massive phishing wave."
This is from bleepingcomputer.com.
I don't know about you guys, but I have gotten many of these, so
(03:11):
maybe you have some tips on how I can get away from this phishing scheme.
But an ongoing phishing campaign impersonating E-ZPass and other toll agencies has surged recently, with recipients receiving multiple iMessage and SMS texts to steal personal and credit card information.
The messages embed links that, if clicked, take the victim to a phishing site impersonating E-ZPass, The Toll Roads,
(03:34):
FasTrak, Florida Turnpike and other toll authorities that attempts to steal their personal information, including names, email addresses, physical addresses and credit card information.
And this is a great example of what they look like.
Lord knows, I've seen these pop up. Have you guys seen these pop up? Uh-huh.
Let's just start with the tips. Is there any way to get around this or avoid these types of
(03:57):
phishing scams? Blocking numbers doesn't seem to work.
Matt Starland (04:00):
I mean, they'll just keep randomly generating different numbers, different email addresses.
This is the new vector. I shouldn't say it's a new vector, it's just another vector that historically hasn't really been used before.
You know, we've always seen these phishing emails for years, and it seems to be in the last, I don't know, five, ten years
(04:30):
or whatever, that malicious actors have now thought of, hey, guess what? Let's use this other threat vector, one that is known for good use, for everyday use, for productivity purposes, to try to sneak something in. And how can we get the person to click on it and open it up?
Because, guess what, most people and even most organizations don't have protections on devices that are going to stop or filter this type of stuff out.
(04:53):
I can't think of something that they're going to easily be able to do that's going to really help filter this out.
I mean, if you're technical, maybe something like a Proton VPN or a NordVPN might have some sort of malicious filtering on this, because, guess what, you're going to tap on this, and then it's, do you have some sort of a proxy or barrier mechanism, like
(05:15):
a firewall, that's going to help figure out, is this a malicious URL or link?
A lot of organizations, um, you know, while they have their enrolled workstations, desktops, servers behind some sort of a firewall, that might not always be the case with
(05:35):
mobile devices, like a smartphone like this, and usually the best way to do it is probably to set up a VPN on them, so they're always flowing through some sort of a firewall that has some sort of URL filtering.
So, from the home user perspective, if you've got some sort of a URL filtering mechanism through your own home
(05:56):
firewall, you could VPN in from there, but that's not something most people have, or would even know how to set up.
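For listeners who want a concrete picture of what that URL-filtering layer is doing, here's a minimal sketch in Python of the kind of triage a filtering firewall or VPN egress might run on a link that arrives by text. The domain lists are purely illustrative placeholders; real products use continuously updated threat-intelligence feeds, not hard-coded lists:

```python
from urllib.parse import urlparse

# Illustrative placeholders only; a real URL filter pulls these from
# threat-intelligence feeds that are updated continuously.
KNOWN_GOOD_TOLL_DOMAINS = {"example-toll-agency.com"}
SUSPICIOUS_BRAND_KEYWORDS = ("ezpass", "e-zpass", "toll", "fastrak")

def classify_link(url: str) -> str:
    """Rough triage of a link that arrived by SMS or iMessage."""
    host = (urlparse(url).hostname or "").lower().removeprefix("www.")

    if host in KNOWN_GOOD_TOLL_DOMAINS:
        return "allow"    # exact match on a domain we already trust
    if any(keyword in host for keyword in SUSPICIOUS_BRAND_KEYWORDS):
        return "block"    # look-alike domain trading on a toll brand
    return "inspect"      # unknown: warn the user or send for deeper analysis

if __name__ == "__main__":
    for link in ("https://example-toll-agency.com/account",
                 "https://ezpass-payment-due.example.net/pay"):
        print(link, "->", classify_link(link))
```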
Eric Brown (06:04):
You could do a couple of things, I think, Matt, and none of them are perfect.
One would be signing up for the Do Not Call Registry, but again, that doesn't necessarily stop a criminal from using your phone number even though you've opted out of solicitations.
It's kind of like putting up a no-soliciting sign on your front
(06:25):
door. Or, you used to go into convenience stores and you'd see the sign on the door that says, you know, no shirts, no shoes, no service. That, you know, doesn't always stop people from going into the stores without appropriate attire.
Same thing here.
Signing up for the Do Not Call Registry keeps the honest
(06:46):
telemarketers honest, but the quick thing you could do is actually just log into your E-ZPass account and see if there was a balance due on it. You could manage the account that way, not going through a link, which, Matt, is the same thing that we would recommend for people anyway: if you're getting a call or an email that's requesting something of
(07:10):
you, to actually just go to the real website to log in.
Matt Starland (07:15):
And that's a great point, because that goes back to just good old general security practices.
It's, you know, defense in depth, having different layers.
So one, you've got the training, the human being, making sure that they're aware and know how to handle, to think twice about, the things that are coming in through channels that are normally used for productivity purposes, you know.
(07:37):
And then you have the technological layer of, do you have something to help filter that out, for that one event where you did get socially engineered, to maybe help protect you?
And then do you have anything else watching your device, you know, to protect you, almost like an endpoint protection kind of thing?
And when you talk about the, you know, the training and being on the lookout and watching for this, and
(08:00):
this is why I picked this one. I was this close to clicking on this.
Here's the reason why. You know, I get all sorts of things like this all the time, and then they're coming in with different names that I don't recognize. You know, they're trying to contact somebody who doesn't even have that name on that phone number, and sometimes they get my name right, but it's something I didn't sign up for.
(08:22):
So my family and I went down to Florida about a month ago, and I had never received one of these texts.
So we get off the plane, had our car reservation and everything, getting ready to explore Florida during spring break, which, if you realize, too, that's when all of this started flooding out, during spring break.
(08:42):
So they timed it right. These malicious actors knew what time of year it was and when all the schools were going on spring break, and all the family members, so they timed it right.
We get off the plane, get to our car rental place, waiting in line. Okay, fill out the paperwork, got everything good to go. Within 10 minutes, my phone gets one of these. Ten minutes
(09:03):
after I signed all the paperwork. And I thought, oh crud, did I forget to give them something? Because one of the things they asked at the car checkout was, hey, do you want to sign up for the E-ZPass tollway system so you don't have to pay each time? Yeah, yeah, yeah, sign me up, do all that. So I get this text.
(09:24):
I look at it, I'm like, oh, did they screw something up on my enrollment or whatever, you know?
And so I'm sitting there looking, and this goes back to training, even being a cybersecurity professional, you know, all the things we learn and do, and I'm like, it just seems too generic. There's no, hey, Matt Starland, hey, your account, the car, you know,
(09:47):
there were no details that were unique about my enrollment, which had literally happened 10 minutes, 10 minutes before I got that text. And so I thought to myself, you know what, I'm just gonna let this go, and if I get any more calls or texts, you know, maybe I'll look at it closely again.
(10:09):
So, again, six hours later, got another one, but it was the same thing, very generic, you know.
And the first thing I thought to myself was, you know, maybe I'll just call up a number that I know is associated with the car rental place and ask them, hey, is this something that you guys would be sending out, or is this something that
(10:30):
you forgot to enroll me in? What's going on here?
Or two, I was on vacation, I didn't care. You know what, they can bill me later or figure this out when I return the car.
Eric Brown (10:42):
And so I took the
latter approach.
Matt Starland (10:44):
No, I don't wanna mess with this now. I'm on vacation, I'll deal with them when I get back.
And then, after going through those three, four days, I talked to some other people, like, yeah, I'm getting all these E-ZPass things too. I was like, wow, okay, so these malicious actors knew what time of year it was and just nailed everyone, shotgun approach, hoping, you know what?
(11:06):
We're going to cover 100,000 people. Guess what, if we get 0.01% to click, we're still going to get a decent amount of money off of this. So why try to target when we can just make it sound really good?
And I was close, close to falling for it, and it was mostly because of timing.
(11:27):
Timing was key.
Eric Brown (11:29):
Yeah, that's what we're seeing.
We see that with the tax returns as well, right, in that tax season, an increase in that type of activity.
And then towards the end of the year and the holiday season, we see the UPS package alerts pop up.
(11:51):
It's all about the timing.
They're getting more sophisticated, where they know that there's going to be some sort of filtering involved. The threat actor, when they build these applications, is assuming that they're going to go through some form of scrutiny.
And we see it on the email side as well, this one here.
(12:14):
If you were to open this on a computer, it wouldn't take you to that E-ZPass page, but from a mobile device it does, right. It knows the type of device that you're coming from.
Unless you spoof that on a computer, you're not going to get to the same thing as you would on the mobile device.
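A common way analysts check for exactly that kind of device-based cloaking is to fetch the same suspicious URL once with a desktop User-Agent and once with a mobile one and compare the responses. A minimal sketch, assuming the third-party requests library and a hypothetical suspect URL:

```python
import requests

DESKTOP_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0 Safari/537.36")
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) "
             "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 Safari/604.1")

def fetch_body(url: str, user_agent: str) -> str:
    """Fetch the page as a given device type would see it."""
    resp = requests.get(url, headers={"User-Agent": user_agent},
                        timeout=10, allow_redirects=True)
    return resp.text

def looks_device_cloaked(url: str) -> bool:
    """Flag URLs that serve very different pages to mobile vs. desktop."""
    desktop_page = fetch_body(url, DESKTOP_UA)
    mobile_page = fetch_body(url, MOBILE_UA)
    # Crude heuristic: a big size difference often means the lure page
    # (the fake toll-payment form) is only rendered for mobile visitors.
    return abs(len(desktop_page) - len(mobile_page)) > 5_000

if __name__ == "__main__":
    suspect = "https://ezpass-payment-due.example.net/pay"  # hypothetical
    print("Possible device-based cloaking:", looks_device_cloaked(suspect))
```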
(12:35):
And we see the same thing in email. The malicious URL may be dormant for a period of time, knowing that it's got to go through the inbound filters.
And as long as there's nothing behind that URL, the filters are going to assume it's benign, and then it's activated post-
(12:59):
delivery. And the other thing that we've seen are malicious payloads that, when they go through the sandbox environment, know they're in a sandbox environment, not on a user's workstation, because the sandbox environments kind of all
(13:21):
look the same. They're designed to detonate these URLs in a controlled environment, and it's a virtual machine with very little memory or very little processing power dedicated to it, so the malware can fingerprint the machine that it's on, know that it's not on a
(13:42):
user's workstation, and then not detonate.
So those are pretty cool ways that it's looking to get around the controls that a lot of companies have in place.
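To make the sandbox-fingerprinting idea concrete, here is a simplified sketch of the kind of environment check an evasive payload might run, and the reason defenders provision detonation sandboxes with realistic specs. It isn't taken from any particular sample, and the thresholds are illustrative:

```python
import os

def looks_like_analysis_sandbox(min_cpus: int = 4, min_ram_gb: float = 8.0) -> bool:
    """Heuristic similar to what evasive payloads use: tiny VMs with few
    cores and little memory are more likely detonation sandboxes than
    real user workstations."""
    cpus = os.cpu_count() or 1

    ram_gb = 0.0
    try:
        # Linux-style check; other platforms would need their own lookup.
        with open("/proc/meminfo") as meminfo:
            for line in meminfo:
                if line.startswith("MemTotal:"):
                    ram_gb = int(line.split()[1]) / (1024 * 1024)  # kB -> GB
                    break
    except OSError:
        pass  # memory size unknown; fall back to the CPU check alone

    return cpus < min_cpus or (ram_gb > 0 and ram_gb < min_ram_gb)

if __name__ == "__main__":
    # An evasive sample would simply stay dormant when this returns True,
    # which is why detonation VMs should be sized like real endpoints.
    print("Environment looks like a sandbox:", looks_like_analysis_sandbox())
```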
Matt Starland (13:55):
And you throw AI in the mix, and AI is going to generate these to be much more timely, relevant.
Joshua Schmidt (14:03):
Timely, perhaps even personalized.
Matt Starland (14:06):
Now, you know, the big thing for me is, if it sounds too professional, I don't want to click on it, because nobody talks like that. Yeah, maybe the AI will start to incorporate some kind of OSINT on your social media. Hopefully, hopefully not anytime soon.
Joshua Schmidt (14:19):
But this one didn't get me, because we don't have tolls in Minnesota, right? We do have that fast pass, um. So I did do a double take.
But yeah, I think another really interesting part of this article that I was picking up on was these phishing-as-a-service platforms like Lucid and Darcula.
I haven't heard of those until I read this article.
(14:39):
Have you guys been hip to that, and how that's changing, kind of changing the game?
Eric Brown (14:44):
No, but just to echo your point, I was just going to show you. I got a USPS text at 8:44 this morning about my package that can't arrive, and that I would need to respond with "Y," then exit the text message, open it again,
(15:05):
click on the link, copy it into your Safari browser and open it.
Right, so you probably can't see that on the camera, maybe, but yeah.
This one came from a hotmail.com address.
Joshua Schmidt (15:21):
Oh, Hotmail. So that's how you know it's professional and from the post office.
I'm surprised it wasn't AOL.
Yeah, Hotmail. The USPS is using Hotmail these days, huh? They're getting DOGE'd, right. Yeah, the budget's getting tight there.
Someone traveled with their DeLorean back to 1995 and spun
(15:42):
up a Hotmail account. I'm assuming that these services are online and available, and it raises the question why they're allowed to operate if they're seemingly only for nefarious purposes.
But we'll have to shelve that for a different day and do another deep dive.
We should do a deep dive on...
Eric Brown (15:58):
They're probably using Evilginx as the back-end platform for bypassing and stealing the MFA tokens. But actually, we've got an Evilginx environment here that we're spinning up, for research purposes, of course, so that would be a fun one to dive into as a deep dive. Yeah, and while we're live here, that's a great suggestion, and if you have any
(16:19):
other suggestions for topics that you'd like us to cover, or news articles, drop us a note.
Joshua Schmidt (16:24):
YouTube or LinkedIn, or send us a message. We have a website that's being revamped. You can check us out there too and also sign up for information like this, so check that out: itauditlabs.com.
Again, from one of our favorite outlets: FBI, US lost record $16.6 billion to cybercrime in 2024.
Cybercriminals have stolen a record $16.6 billion in 2024,
(16:56):
marking an increase in losses of over 33% compared to the previous year, according to the Bureau's annual Internet Crime Complaint Center report, that's called the IC3 report.
The IC3 recorded 859,532 complaints last year, amounting to an average loss of $19,372.
Of course, we all probably anticipated this next paragraph.
The most impacted group is older Americans, especially
(17:18):
people over 60, who filed 147,000-plus complaints linked to approximately $4.8 billion in losses.
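A quick back-of-the-envelope check of those figures (a sketch; the headline total is rounded, so the per-complaint average won't match the report to the dollar):

```python
# Figures as cited in the episode / the FBI IC3 2024 report.
total_losses = 16.6e9      # ~$16.6 billion in reported losses (rounded)
complaints = 859_532       # complaints recorded by IC3 in 2024
losses_over_60 = 4.8e9     # losses linked to victims over 60

print(f"Average loss per complaint: ${total_losses / complaints:,.0f}")
# -> roughly $19,300, in line with the ~$19,372 average cited above
#    (the small gap comes from rounding the $16.6B headline number).

print(f"Share of losses tied to victims over 60: {losses_over_60 / total_losses:.0%}")
# -> roughly 29% of all reported losses.
```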
So as these tactics become more and more prevalent and more sophisticated, what's the messaging for our family members, especially, you know, seniors, around receiving these things and how to interact with this type of content?
Eric Brown (17:41):
Josh, this one really is, it's disappointing, it's painful, it's, you know, it's all the things. And the number is probably low, right, because this is only the people that have reported the crime.
I was just at an event in Chicago earlier in the week and I was at a dinner afterwards, and the topic turned to information
(18:03):
security, and one of the people was saying that their grandfather was impacted by something like this, and they had lost, it was tens of thousands of dollars, and the grandfather had finally reached out to the person for help,
(18:23):
because I think at that point they didn't think something was right. Like the threat actor had continued to ask for more money, you know, where it might start off at, like, you know, 500, you know, to get in on something that's too good to be true, and then another 2,000, and so on and so forth.
But the person finally reached out to the person who was with
(18:44):
me at the dinner, and unfortunately the grandparent was kind of embarrassed by this, so was reluctant to share a lot of information after it was confirmed that it was indeed a scam, and the grandson didn't know if the grandparent had
(19:04):
actually gotten back any of the money. Didn't think that they did.
But it was really hard to hear those sorts of things, especially as people who are older are probably not working and probably on a fixed income, and that's going to be really impactful for them.
So, to answer your question, Josh, I think the one thing that
(19:26):
we've got to do is we've got to talk about it. Any opportunity that you can, where we really just, Thanksgiving table, birthdays, anniversaries, anytime the family's together.
You know, you don't have to get out a cybersecurity book and start preaching, but just to talk about it and say, you know, this is real, right, this is happening, this is in the news,
(19:49):
here's what we need to do. And just, you know, quick and impactful, and I think, almost every time you have that conversation or you're able to interact with them, just to bring up something, so that it's top of mind.
Joshua Schmidt (20:03):
My family can attest. So yesterday was my two-year anniversary at IT Audit Labs, and my family can attest that I've now become that person at the dinner table, preaching the gospel of cybersecurity to anyone and everyone that will listen.
A recent breach, or did you hear about this recent phishing attempt? Or this AI voice taking over people's phones and convincing
(20:30):
them? Because I think we all know people that have fallen victim to these types, whether it's hijacking your browser with the flashing warning sign and the loud noises, or a phishing call.
One tip I would have is just to normalize that people fall victim. Even Matt Starland, the mighty Matt Starland, can fall victim to an E-ZPass phishing text attempt.
(20:52):
So I think, because some of these people are so embarrassed, right, that they fell victim or they gave away some money, it just kind of reinforces the insidiousness of it, because they're afraid to talk about it. So then it kind of just keeps the wheel turning.
I think we need to be easy on people and just be supportive and have them be open to talking about it, so we can spread the
(21:14):
news.
Matt Starland (21:15):
I do want to say I didn't fall for it, I wasn't, I didn't, but it came close.
But it came down to good training. And 'cause that's exactly what the same article is about, you know.
So, whether it was vishing, you know, getting called through a phone and somebody impersonating a US Bank, Wells Fargo, whatever it might be, and saying, hey, we
(21:39):
found these charges or whatever, and then trying to get more information out of you.
I think the lesson learned here for those family members is, don't expect the institutions that you do business with to call you and start asking for more details.
(21:59):
If you ever get that phone call or text, hang up, don't respond, and instead go find what the actual support number is. You know, look up the actual website. If it's US Bank, you know, go to usbank.com, or do a Google search for the right one. Find what that customer service number is that's posted
(22:21):
publicly, you know, from their legitimate website. You do your own search and come to that website.
Then call them up and ask, hey, I had a representative contact me. Did you guys actually have something on my record that you need to get a hold of me on?
So don't respond to the anonymous callers. Instead, you be the one to take the initiative and act, and then follow up with a more legitimate phone number that
(22:45):
you're able to look up or search on your own, not one being provided by an anonymous person calling.
Eric Brown (22:51):
Matt, I had one the other day where it was an obvious phishing call.
Right, I didn't recognize the number. Answered the call because I wasn't quite sure; I was kind of expecting a call, but I didn't know. But right away, off the bat, I could tell it was going to be a phishing call.
(23:11):
It sounded like it was going to come from a bank, right? They were calling from, I don't know, some financial institution that I hadn't heard of.
They said that the call was going to be recorded, and I was like, oh, that's a good idea, I'm going to record this, because this would be great material for the podcast. Right, I'll replay it on the podcast.
So we have the ability, if you have an iPhone, to record
(23:33):
the call now. So I hit record a call, but unfortunately the iPhone announces, well, fortunately in some cases, unfortunately in this case, it announces and says this call is going to be recorded.
As soon as it made that announcement, the guy just said thank you and hung up.
So I know it was, you know, a fake call, but that would have
(23:56):
been a fun one.
Matt Starland (23:56):
So now you're going to have to have two phones with you at all times and turn the recording on.
You know, kind of do one of these things, hold it up, you know, be that person. But it would have been fun to hear you try to reverse phish them, give them Nick's info, yep.
Joshua Schmidt (24:15):
That's great.
Yeah, this dovetails, and once again it's flowing right along, right into the article that I picked out.
You guys got your tinfoil hats ready today? All right, you got one already. Ready?
Matt Starland (24:31):
Yeah, I got to go to the kitchen and get the...
Joshua Schmidt (24:32):
Yeah, well, this is kind of in that zone.
I'm gonna start here in the middle of this article, but I guess we'll shout out computerworld.com. This is an article about Meta putting the dead internet theory into practice.
The article is a bit dated, but I still think it's very relevant.
Have you guys heard about the dead internet theory? I have not. Okay, well, it kind of breaks it down right here.
(24:55):
It's a belief that most online content, traffic and user interactions are generated by AI and bots rather than humans. As a business plan, instead of a toxic trend to be opposed.
If we think back to, like, the Wild West days of the internet, when it was, you know, just so random, and you know, the graphic design was way off and it was, you know, Windows-based stuff,
(25:16):
and it was so exciting, right, and you didn't know what you were going to find. You'd click on a link and, just, it really felt like you were exploring something.
I mean, there were some really crazy websites back in the day, uh, that we'd get talked into visiting, you know, rotten.com, things like that.
(25:37):
Things of that nature. 4chan, you know, kind of popularized all this craziness, right, and it kind of put it all in one place, and then we still have elements of that, you know, that can take effect.
So I think that's what's really been the driver of this kind of
(26:16):
a theory.
And then, you know, things like this. Mark Zuckerberg, you know, he never said no to a bad idea. It seems he's putting some of this into practice here by, I haven't come across this, but apparently adding AI bots to Facebook interactions or any Meta-type interactions to try to
(26:37):
drive engagement.
The article says the company plans to host millions or billions of fake AI-powered users. It's being rejected by real users. No surprise there. Don't follow Meta's bad example.
Obviously we have a little bit of a bias here, but it goes on to say Meta's mission statement is to build the future of human connection and the technology that makes it possible.
(26:58):
But what it's really putting time and energy into is some of these projects, like the fake celebrity project.
This is not something I was aware of, once again, but let me know if you've seen this. In September 2023, Meta launched AI chatbots featuring celebrity likenesses, including Kendall Jenner, MrBeast, Snoop Dogg and some others.
(27:20):
So yeah, to no surprise of me, this was rejected by users. No one wants to talk to fake Snoop.
Eric Brown (27:28):
And we've seen fake influencers.
Joshua Schmidt (27:32):
So have you guys run across this at all?
I had a call, I guess, with a support, I can't remember the company, it was just a couple of weeks ago, but it's escaping me, where I had a support call with a bot, and it actually went really, really well, and I actually preferred it over the typical customer service experience, because I didn't have
(27:53):
to wait for the line to open up, and the bot was fairly adept at answering my questions and kind of got to the bottom of it a lot quicker than I would have traditionally.
Eric Brown (28:05):
Matt, I know you've got something to say here, but I wanted to just jump in on that one. Sorry.
Back in the day, and I'm going back into like the late 90s, I'm trying to remember who it was. But when you made a support call, it was to a relatively large
(28:28):
company at the time, I'm going to say NetApp, but it wasn't NetApp, it was something like that, where you call in for support, they actually had a DJ on, and while you waited you could request different music, right.
So it's kind of cool, kind of kitschy for them at the time.
But wouldn't it be cool if you could request a bot personality,
(28:52):
like if you wanted a Snoop Dogg bot that you're going to interact with?
You know, when you call Delta to, you know, work on your ticket or whatever, and you know you're going to be routed to an AI agent, wouldn't that be cool if, like, Delta licensed Snoop Dogg's likeness and you could chat with Snoop Dogg as your
(29:14):
support agent?
Matt, that's a million-dollar idea right there. We should run with it.
Matt Starland (29:19):
You're already behind the times, man. Garmin already did this.
Garmin did this with the different voices you could download years ago, where you could get Samuel L. Jackson to narrate where you're going.
I mean, the guy was cussing me out left and right, trying to tell me where to go, left and right, and I didn't appreciate it.
So we started getting into a yelling match.
(29:41):
But I don't know, that probably wasn't very healthy.
Joshua Schmidt (29:44):
But yeah, so it's like... so I see that.
Matt Starland (29:46):
Garmin. Now, that was before the AI days, you know, that was just a pre-programmed voice.
So I thought of two locations. One, the movie Ready Player One, or the book, where, you know, you plug into the digital world and you've got NPCs, non-playable characters, walking around that you can interact with.
What was the,
(30:12):
the archives? Or the library archives, the search for the founder of that world, where he's talking to, uh, you know, an AI chatbot helping him figure out where all these video recordings and life things were. And so it's like, wow, I didn't realize we're there already. But I just experienced this this week.
So Amazon, uh, I had a package shipped. Um, said it was delivered. I'm looking around, nobody delivered it on my porch.
(30:32):
Checked my mailbox, and it said it was put in my mailbox.
Now, we have mailboxes that are locked, so it's like nobody stole it out of my mailbox, so clearly they put it in the wrong one and somebody else has it.
So I waited for a few days just to see if it would show up, asked around, uh, on Facebook or whatever, hey, any neighbors, did
(30:54):
you see something show up for me from, you know, Amazon, whatever, and never heard back.
So I waited a few days, and um, so then I reached out to Amazon and went to their customer support, and it was an AI bot.
Um, pretty much, hey, what can we help with? And the conversation was just so fast and quick. And the thing that amazed me, though, I wanted to do a replacement, but
(31:19):
I was like, well, it just hasn't come yet, you know, which, what do you want to guide me to do? Because I'm not trying to steal, say it didn't show up and here it's sitting on my front porch. It literally didn't show up. How should we take care of this?
And it's like, you know what, no worries, Matt, we'll take care of it. We got another one out to you, whatever, blah, blah.
And I'm like, that's amazing that AI, they had already
(31:41):
programmed it and had appropriate methods built in to even remediate the problem for me. Not kind of triage, a self-help, have you tried this, have you tried that, and then, oh, we hit my limits, let me pass you on to a human to figure out what is the best
(32:02):
scenario. They had already pre-programmed it based on certain contexts so that it could take an action for me.
It did the refund, or not refund, but it sent out a new one and everything, and, to Josh's point, the interaction was amazing.
Eric Brown (32:19):
There's a pretty cool AI voice, conversational voice tool. It's called Sesame, at sesame.com, and it's right now in demo mode.
You can go and you could choose Miles or, I think, Maya, are the two voices, and then you can just have a conversation with that
(32:41):
AI bot. But it sounds very conversational. It adds ums and ahs and it has different inflections, and it's only in demo right now, so it's pretty limited in what it can do overall, like it can't search the internet in real time.
Joshua Schmidt (32:56):
Very easy to talk to someone like that. I think that is kind of why I brought this article up.
So I think customer service is a great use for AI. It's probably not a job that people really enjoy. I know they take a lot of heat, those people in the customer call centers. It's probably a high-stress job.
(33:19):
But the dark side of things, we've had the lowest birth rate in, like, what, forever and ever, and we have people that are more isolated than ever.
And my concern, and maybe it's yours as well, is that people really lean into this and become even more isolated and really stop interacting with other people, because the bar of entry is just so low. You don't have to leave your bed, let alone your house, or go
(33:40):
to a social situation and be uncomfortable. Right, you can just kind of stay in your pajamas and get all the socialization you need. You put on your Apple Vision...
Eric Brown (33:50):
Pros, and all of a sudden, the four walls that you're around, you can be anywhere you want, yeah.
Joshua Schmidt (33:56):
I mean, it sounded like that's something we wanted, you know, back in the day. Going back to the 90s, that sounded like an awesome future, but I think the closer we get to it, the more it seems dystopian to me.
I don't know, what's your guys' take on that? I mean, we have guys that are having, you know, AI girlfriends. That's really taking off in popularity, and yeah, I
(34:24):
just, I worry about how that's going to affect our kids and things like that, and even ourselves.
Matt Starland (34:26):
Yeah, you know, I think with technology, there's good to the technology and there's bad to the technology.
You know, you look at the Internet and there's a lot of dark side to it. There's a lot of good that's come out of it, and I just hope, and pray, that it's the same thing with some of this too, that there are more positives that come out of it.
But I have the same kind of dystopian mindset too, Skynet,
(34:48):
oh man, AI is going to take over, kind of thing.
You know, not that it's to the point of where it's worrisome, but it starts to pop into the back of your head and you're like, are we getting there? It's kind of weird.
But then at the same time, too, it's like, you know, look at the Internet, the amount of information, the knowledge sharing, things, technology.
(35:08):
And you know, there's a lot of good that has also come out of it too. So, you know, I guess time will tell. But yeah, I look at what you're talking about, it almost sounds like Ready Player One. Throw the Google or the Apple eyes, whatever, the Vision, on it.
(35:31):
There you go, and the world is burning around you, but it's real cozy in this virtual space, though.
Um, yeah, I do want to say, you know, this is new to mankind. This is very new territory that we're starting to tread. You know, for thousands of years, mankind has always, or
(35:54):
humankind has always, you know, interacted in person, communities, fellowship, and so we're definitely treading into new areas, and there are definitely new studies coming out.
This is why, like we see now, social media getting banned for kids under a certain age, because look at the findings that are coming out of it.
(36:14):
So sometimes technology moves so fast and we didn't think about, should we have done it? It's kind of a reminder. I think there's a Jurassic Park quote, you know, where, uh, they talk about, yeah, it was there, so we did it, and then we just let it do its thing, versus, should we have asked ourselves first, should we have gone down this route?
(36:35):
And sometimes, you know, you follow the quick buck.
Joshua Schmidt (36:40):
Sometimes the technology was a quick buck and we didn't think twice. Yeah, we were so busy thinking whether we could, we didn't stop to think if we should. Yeah, exactly.
And so, I don't know, inversely, though, I could see this solving a loneliness problem for senior citizens, for example. I don't know, the jury will be out for a while on whether that's a healthy way to get interaction with humans, or
(37:04):
human-type interaction.
But, you know, if you go on to read the article, you can really see that people are really pushing back against this in certain contexts, especially social media, and the art world has gone crazy about all the images being generated and people saying that's not real art. And it's a valid argument, you know.
(37:26):
But it is changing so fast that we're going to have to try to stay ahead of it.
But, Eric, I know you're a big adopter of AI and you've been even doing some education around that to stay on top of things. Where do you see this going, or what's piqued your interest lately?
Eric Brown (37:42):
Well, I saw a real example of this quite recently, as we were working on a redesign for a website, and, as part of that process, working with an SEO firm. The SEO firm makes suggestions on things that you can do technology-wise within the website in order to get higher search engine rankings,
(38:05):
right. And the search engine rankings are all calculated through bots and through automation. It's not humans that are ranking these things, it's spiders and other technology that's crawling websites.
So we're essentially using technology to create technology
(38:26):
that is then viewed and scored by other technology.
So the article was very poignant in that there are bots already out there talking to bots, and as we went through the exercise of the website redesign and everything, it was just a moment that I had for pause, of, wow, now we've got
(38:51):
to spend cycles thinking about how we get automation to train other automation on where the site should come out from a ranking perspective.
Joshua Schmidt (39:03):
For people interacting with AI, especially when it comes into the context of phishing, vishing, all that, squishing, all the ishings. What's squishing, isn't that one? Or is that quishing? There's quishing and then there's smishing. This should be squishing. I'm going to use that one for Foursquare, whatever that, that old Tumblr.
So here we go: be skeptical of perfect content.
(39:26):
You know, I think Matt was alluding to that when he was reading his text, it was just, like, a little too squeaky clean, very corporatized.
I think we can develop a sense of what is AI and what's not AI, and I think, to your point, Eric, just immersing yourself in the technology is another tip that I got here in my research of how
(39:48):
to identify what AI content is, right.
If you're not using it and you're not seeing results from your own interactivity, it's a lot harder to develop a nose for what type of content AI is generating.
But I've spent quite a bit of time on Midjourney recently, and just by interacting and generating some content I've
(40:09):
already got some other insights, and some better insights, into what that's doing there and maybe how to identify it. Not a perfect process, but it helps.
Eric Brown (40:20):
Is Midjourney still kicking out humans with multiple fingers? It had a hard time getting the fingers right for a while.
Joshua Schmidt (40:27):
The fingers were a big problem. We're on version 7.0 now and it's gotten incredibly powerful, and then you can dump it into other apps and things that will animate the photo, so we can take it a step further and create videos out of static images. But, yeah, the fingers seem to be solved.
(40:48):
I'm sure there are still some hallucinations floating around, depending on the image and what kind of prompt you're typing in.
I think when you're typing in a photorealistic image, that's still probably harder than whipping up a Boho or an Andy Warhol design or something like a comic book. I think there's a little bit more room to wiggle there for
(41:10):
the AI.
Don't be afraid to mute and block AI slop. I don't know if you guys have come across this on YouTube, but there's just a proliferation of channels now that are AI-content generated.
There are even music channels that are streaming 24-7 with AI-generated lo-fi hip hop beats, which they can't copyright
(41:31):
because it's all trained on copyrighted material.
But you can upload it to your YouTube channel and then generate a thousand, uh, you know, AI images and then have those kind of scrolling through as you're listening to the music, and you can capture monetized revenue from the interactivity there.
So, um, yeah, I was even telling Matt about, uh, brain rot,
(41:54):
where we're getting hip to skibidi, so, uh, we'll have to come back around on that one too.
But, uh, don't be afraid to block that stuff, and I think, as always, just be paying attention to what your kids are doing on there.
Can you think of any other ways that we can protect ourselves from, from kind of the frontier of the AI revolution?
Matt Starland (42:14):
I think, besides, you know, where you talk about paying attention to what your kids are on, there's also teaching your kids. You know, it just goes back to the same type of user training that we're talking about for, you know, security and stuff.
And I don't mean teaching your kids to be secure, but just how to be respectful, you know, how to be mindful of things out
(42:34):
on the internet like that. And I know that's hard, but I've come across some books out there about, you know, talking to your kids, talking to your teenager, about social media.
You know, before the days of social media, when you had gossip stuff going around high school or middle school or whatever, you know, it might make it to a certain extent, and then new news comes up and everyone forgets about it.
(42:55):
Whatever, you move on, and you can just stay away from those people.
But now, in the social media days, once you post something, expect it to be there for the rest of your life. And so I would say, you know, make sure to be mindful and talk with your kids about those types of things, and, you know, whatever you're going to say, is that something you'd want your parents to hear, you know, or somebody you respect, and
(43:18):
you wouldn't want them to think less of you?
So don't be posting things like that on the Internet, and, you know, just be, you know, respectful of each other. So be careful.
Joshua Schmidt (43:29):
One of my favorite pieces of advice was from a guest we had probably 20 episodes ago, named Andre Champagne, and one of his suggestions was to keep the devices in a common area in the house. Don't let your kids lock themselves in the room with a personal computer or even a phone. Have that out in the open, and yeah, and then just be
(43:49):
transparent.
But I think those, uh, prioritizing human-to-human interaction, and then training your digital intuition, keeping the lines of communication open, are super important. And go for hikes and camping, get outdoors and leave the electronics behind.
Matt Starland (44:04):
I like it, learn how to socialize again. Maybe we do an IT Audit Labs...
Joshua Schmidt (44:08):
Hike, Eric, we could do that. I could see, I could see you die inside a little bit.
Eric Brown (44:17):
Mission accomplished.
Joshua Schmidt (44:19):
Well, I don't think we're getting Nick today, fellas, unless we have anything else we want to add. I think it's been a really fun conversation. Thanks again, Matt, for joining us. Thanks, Matt. Thanks for taking the time. I know you're busy, so always a pleasure to be chatting with you guys about cybersecurity.
We're going to see you guys in a couple of weeks, game night, right? I'll be there, man. Awesome.
Yeah, and like, subscribe and share.
(44:40):
Drop us a comment, and we will try to be doing this again in the near future.
We are whipping up some stuff that might be coming up on LinkedIn or YouTube in the future, so give us a follow, if you're not already, and subscribe to our YouTube channel.
We're also on Spotify with video. We have full episodes every other week, and we've been putting up some Flipper Zero and some fun tech videos in between,
(45:01):
as well as shorts, so check us out: IT Audit Labs.
You've been listening to the Audit. My name is Joshua Schmidt, your co-host and producer. We've been joined by Matt Starland, our guest today, and, as always, Eric Brown, our managing director.
Thanks a lot, fellas. We'll see you soon.
Eric Brown (45:17):
You have been listening to the Audit, presented by IT Audit Labs.
We are experts at assessing risk and compliance, while providing administrative and technical controls to improve our clients' data security.
Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact, while our security control assessments rank the level of maturity relative to
(45:40):
the size of your organization.
Thanks to our devoted listeners and followers, as well as our producer, Joshua J. Schmidt, and our audio-video editor, Cameron Hill. You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials and subscribing to this podcast on Apple, Spotify or wherever you
(46:02):
source your security content.