
December 2, 2024 38 mins

From Gmail 2FA bypass warnings to SEO poisoning campaigns, we’re diving into the latest cybersecurity headlines reshaping the industry.  

We explore how attackers are using hyper-specific search terms—like the legality of Bengal cats—to deliver malware and manipulate search results. Plus, we discuss advancements in AI-powered behavioral analytics, from cutting down false alerts to streamlining incident response. With real-world insights and actionable tips, this episode is packed with must-know updates for IT professionals navigating today’s ever-evolving threat landscape. 

In this episode, we'll discuss: 

  • Gmail session cookie theft and bypassing two-factor authentication. 
  • SEO poisoning campaigns delivering malware via niche search terms. 
  • AI-driven behavioral analytics improving incident response. 
  • Real-world social engineering and user behavior risks. 
  • Balancing usability and security with tools like passkeys. 

Thanks for tuning into The Audit. Subscribe on Spotify, Apple Podcasts, or YouTube to stay informed on the latest in cybersecurity. Don’t forget to follow us on social media and share with your network! 

#CybersecurityNews #2FA #BehavioralAnalytics #IncidentResponse #SEOPoisoning #ITSecurity #DataProtection 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joshua Schmidt (00:04):
Coming at you live.
Today you're listening to The Audit, presented by IT Audit Labs. My name is Joshua Schmidt, I'm your co-host and producer, and today we're joined by the usual cast, Nick Mellum and Eric Brown. How are you guys doing today?

Eric Brown (00:19):
Doing good. You were mid-rant, though, here, when we clicked the go-live button, so let's get back to that. Tell us more about that.

Joshua Schmidt (00:28):
So, yeah, how much time do you have? I was griping about, well, we'll just let the company go unnamed, but just the terrible service of cable companies in general. I guess we still call them cable companies, even though no

(00:51):
one's buying their cable services anymore. But, um, yeah, now we call them ISPs, internet service providers.

Eric Brown (00:53):
Yep, it's not the cable guy anymore. You guys remember that movie, The Cable Guy? It was one of my favorites.

Joshua Schmidt (00:58):
I upgraded to Quantum Fiber and, uh, yeah, the usual: the guy didn't show up on time. He didn't get there until, uh, three days after he was supposed to, and then installed my modem in the wrong place, after I showed him exactly where I wanted it. Then he put it in a different place and plugged it into the same

(01:19):
outlet that I have my dehumidifier plugged into in the basement. Yeah, then a wire went down up the road, and then they completely left all of their crap out on the street in a tangled mess under the telephone pole, and it's been there for over a month.

Nick Mellem (01:37):
Haven't cleaned it up. Your internet wasn't down, though? My internet wasn't down.

Joshua Schmidt (01:42):
Okay, to be fair, I'm not quite sure whose it was at this point. Well, they're guilty by association either way. Yeah, probably, exactly. I called the power company, I called the cable providers, I called the city, and no one seems to be taking responsibility for it.

Eric Brown (02:04):
So there's a good book, Josh, called Tubes, by Andrew Blum, and it explains essentially how interconnected the internet is. I believe he was a journalist for Wired, and one day he was having some internet connectivity issues, and he looks out

(02:27):
at the end of his backyard and he sees the cable coming into the house, and maybe a squirrel running on the cable or something. And he decided from there to track where that cable went, and then all of the downstream implications thereof: the ISPs and the super-ISPs, and where all of the cables go.

(02:48):
So it's a quick read and it's a good read, and it's written from the perspective of a non-technologist. Well, that was your icebreaker of the day, I guess.

Joshua Schmidt (03:00):
No, no, I've got a different icebreaker, so let's get to that: favorite arcade or video game. You know, I'm coming from the Nintendo generation. I don't know about you, Nick, you're a little younger than I am, but do you have a go-to game? What's the game you spent the most time on?

Nick Mellem (03:18):
I was wrestling with this for a little bit, because I have a deep love for Donkey Kong, but I probably spent the most time on, remind me, I can't remember the name, but one of them was Duck Hunt. Oh, awesome. But what was the one with the mailman, where you were, like, riding the bike down and going off the jumps? You remember? The paper boy. Paperboy!

(03:42):
Yes, Paperboy, Paperboy. And I remember, like, Vampire League, and then N64 came around, and then it was Donkey Kong. So you didn't get into the computer games, like Myst or Diablo or anything like that? A tiny bit. I'm not a gamer, I wouldn't classify myself as that.

Joshua Schmidt (04:02):
I dabbled. Growing up in the northern climes, there wasn't a whole lot to do, but I did get thoroughly into Myst for a while, if you guys remember Myst. I don't. But it was hella confusing, especially pre-internet, because it's such a puzzle game where, you know, you're going through this kind of 3D world.

(04:23):
Yeah, M-Y-S-T.

Nick Mellem (04:26):
Because I thought you said NIST, like the National Institute of Standards.

Joshua Schmidt (04:30):
See, I'm dating myself now. Eric, back me up on this, man.

Eric Brown (04:44):
I'm right there with you. Yeah. I think for me, I was young at the time, but if you remember this, this is the opening line of the game: "West of House. You are standing in an open field west of a white house, with a boarded front door."

(05:06):
The game is Zork. It's a text-based game: there's an open mailbox, and so on and so forth, and you go through and play the game by just inputting text at the command prompt. So that, I don't know, I was young at the time, but it really opened up how fun computer games could be, or can be, or are.

(05:27):
And from there it just progressed into more games. And then in college, and right after college, the MMORPGs, the massive multiplayer online role-playing games, like World of Warcraft and things like that. Yeah, I got in a little bit before World of Warcraft. Certainly a lot of

(05:48):
World of Warcraft time, but before that, Dark Age of Camelot. So my buddies and I spent quite a bit of time in that game. That was such a time sink. I didn't get into that.

Joshua Schmidt (06:02):
I actually kind of lost a buddy in high school. He played so much that World of Warcraft just overtook his life. Yeah, we needed to have an intervention for that guy. It was a little bit too much.

Eric Brown (06:14):
I don't think he was alone on that. Yeah, and then you had what was colloquially known as the "Chinese farmers" in those games. You guys know what I'm talking about there? Yeah, I know about farming.

Joshua Schmidt (06:27):
Yeah, it's where you just do repetitive actions to gain coins, or stats, or something, right?

Eric Brown (06:35):
And then they would be sold for fiat. Right, you'd sell, you know, a hundred thousand of whatever, in-game gold bullion, on eBay for, you know, what was it, like a hundred dollars or whatever. It's the original crypto.

Joshua Schmidt (06:51):
Yeah, speaking of crypto, have you guys been paying attention to what's been going on? Wow, dude, crazy. Yeah, I had to get in a little bit. I had a little FOMO. I had to throw a couple of jelly beans on there late at night on Coinbase. Nice.

Nick Mellem (07:08):
You got actual Bitcoin or something else.

Joshua Schmidt (07:11):
I had Bitcoin before, but I bought some XRP, and then I'm going back into Doge again. I'm a glutton for punishment.

Nick Mellem (07:21):
I got fleeced on Doge last time. Actually, I did okay on Doge, and the other one was, I think, Shib, the Shiba coin. Yeah, I believe they call those shib coins. Yep.

Eric Brown (07:40):
There are probably over 10,000 coins out there now.

Nick Mellem (07:45):
Well, it sounds like, again... wasn't Tesla taking Dogecoin at one point?

Eric Brown (07:48):
I know Elon talked about it a lot. I don't know if they were.

Joshua Schmidt (07:52):
Yeah, I think there were rumors. We'll have to see what happens. It looks like we're in for a wild crypto ride. What a time to be alive. Yeah, I had to get in.

Nick Mellem (08:05):
I didn't want the fear of missing out to take over.

Joshua Schmidt (08:07):
So hopefully you don't get fleeced again. No, I've just gotta hodl, as they say. So, yeah, speaking of current events, we're jumping into a news episode today. We're going live here. We will be recording this podcast as well, and then publishing it with all of our audio on Spotify, video on YouTube again, and Apple Podcasts, among many others.

(08:27):
So we're going to jump into ittoday.
Our first article is about this new Gmail 2FA attack warning: "Stop the email hackers now." It says, if we scroll down here a little bit: the Federal Bureau of Investigation published, on October 30th, a public alert relating to the theft of what are known as session cookies by cybercriminals in order to

(08:49):
bypass 2FA account protections. The FBI Atlanta division's warning stated that hackers are gaining access to email accounts by stealing cookies from a victim's computer. Gmail, being the world's biggest free email service, with more than 2.5 billion active accounts according to Google, is naturally a prime target for these ongoing attacks.
So, yeah, I had a couple of questions that I kind of

(09:11):
prepared for this. For our non-gamers and non-computer nerds: what are cookies? I think I know what they are, but I'd love to have a pro explain it to me.

Eric Brown (09:24):
Yeah, I think, you know, if I was sitting around a Thanksgiving table: maybe a cookie is a way of collecting information about either the user or the compute resource, and using that information to enable the user to have a

(09:44):
potentially better browsing experience.

Joshua Schmidt (09:48):
So in this case, this is what the hackers are going after in order to gain credential information. It seems like they're using the cookies in your browser to then bypass the 2FA, rather, the two-factor authentication, or multi-factor authentication.

Eric Brown (10:09):
Yeah. Nick, how would you explain that from a cookie perspective?

Nick Mellem (10:13):
Yeah. So what I was going to say is, I think what they're getting at here is they're probably trying to, you know, get these cookies and then reroute you to a fake login page. Right, that would be my worry. We see it all the time. We do it in our, you know, events: social engineering, pen testing, whatnot. If you can reroute somebody to what looks like a legitimate

(10:34):
Gmail page, in this case, you're able to harvest tons of credentials without the, you know, innocent party being aware. They think they're just logging into Gmail, or Outlook, or whatever. That's what I get out of this one. But going back to the cookie thing, that's exactly what Eric said. You know, it's just a way of, and I think we see it

(10:57):
with, like, marketing, right? They're collecting those cookies, and they pitch it as having a better experience browsing the web: seeing what you like to see, what you want to see, and then feeding that to you.
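To make the cookie-theft mechanics concrete, here's a hedged sketch (not any real Gmail internals; all names are hypothetical) of why a stolen session cookie sidesteps 2FA entirely. The server checks the password and second factor once, then trusts the random token it set as a cookie, so whoever presents that token looks like the victim as far as the server can tell.

```python
import secrets

# Minimal sketch -- not any real Gmail internals; names are hypothetical.
# After a user passes password + 2FA once, the server issues a random
# session token and from then on trusts whoever presents it.
SESSIONS = {}  # token -> username

def login(username, password_ok, second_factor_ok):
    """Issue a session cookie only after both factors succeed."""
    if not (password_ok and second_factor_ok):
        return None
    token = secrets.token_hex(16)
    SESSIONS[token] = username
    return token

def handle_request(cookie_token):
    """Later requests are authenticated by the cookie alone --
    no password prompt, no 2FA prompt."""
    return SESSIONS.get(cookie_token)

# The victim logs in legitimately, completing 2FA.
victims_cookie = login("victim@example.com", True, True)

# Infostealer malware copies the cookie off the victim's machine...
stolen = victims_cookie

# ...and the attacker replays it, getting treated as the victim without
# ever touching the password or the second factor.
print(handle_request(stolen))  # -> victim@example.com
```

This is also why changing a password alone isn't enough after a compromise: the active sessions themselves have to be revoked.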

Joshua Schmidt (11:08):
One thing that helped me understand the cookie thing is that "Remember Me" checkbox, where, instead of signing in each time, you can click Remember Me. So that would be leaving some sort of an identifier of your credentials so that you can bypass some security features. Correct?

Nick Mellem (11:25):
Are you clicking the Remember Me thing?

Joshua Schmidt (11:27):
Not anymore. Attaboy. I have in the past, you know. Per the recommendations of the podcast here, I actually got a password manager working for me: Bitwarden. I've actually talked a few other friends into getting it as well. The one thing I will say about it is it's really hard to use on your iPhone.

(11:48):
Um, the browser extension is very clunky, as far as I know, so maybe you guys can show me something, I don't know. But it's a pretty smooth experience on the computer, though.

Eric Brown (11:59):
Yeah, on the mobile device I just copy and paste. You know, go into Bitwarden, copy, paste.

Joshua Schmidt (12:04):
I might have to try that out. And that gets into this boundary between usability and security that we often talk about. So how do you guys speak to an organization, or people working within an organization, to convince them to take that one extra step, even if the usability becomes slightly more clunky, kind of like what I'm talking about

(12:25):
with Bitwarden on my iPhone? You know, I know that's what I should be doing, but it makes my day a little bit more frustrating. So what's the messaging around that when speaking to organizations?

Nick Mellem (12:37):
Yeah, I think it's become increasingly difficult, because once you put an extra speed bump in front of an organization or an individual user, people are apprehensive, because unless you're a professional in the space or you know a lot about it, it is making their day a little bit more difficult. It takes monotonous, easy tasks and makes them take a little longer.

(12:57):
I think we found success in training. I know I've said that a lot, preaching the training and user-based education. What are they getting out of this? Why do they want to use it? Speak to it on, I don't want to say their level, but something they might understand: why we'd want to do this, protecting their credit, Facebook, whatever it may be.

(13:18):
Some things that we're using at current organizations, where we're going to other technologies, are like YubiKey single sign-on methods, which make things even a little bit safer than plain 2FA. Right, I mean, it's a form of it, but you would sign in with this device. I think a lot of us are familiar with Yubico's YubiKey, like I just mentioned.

(13:38):
I have a YubiKey right here, and it allows the users to plug this into their computer. If they've got single sign-on, and applications allow single sign-on, it'll let them bypass that in the first place. So I went off on a little bit of a tangent, but it's multi-factor authentication. And it actually, to me, makes multi-factor authentication a

(13:59):
little bit easier, because of that small device. And, you know, it takes the users out of the equation a little bit, because it's making their life even easier, versus trying to have a conversation about why they should take an extra step.

Joshua Schmidt (14:12):
So is that like a passkey then, Eric?

Nick Mellem (14:14):
It is. It is a physical passkey. Yep, I'm glad you brought that up. I think they talk about it in this article. Passkeys, you know, and there's a difference, too. Right, we talk about passphrases, right, and passkeys. This is a physical passkey where, instead of using a password, you'd put this into your computer, machine, whatever you're using, desktop, and you'd put in a six-digit PIN

(14:37):
that you know, and that's how you get your multi-factor authentication.

Eric Brown (14:42):
And the passkeys: you can use the physical one, the YubiKey, like Nick's talking about. Google's got passkey authentication where one side of that key is stored with a server, the other side is stored locally with you, and then you're essentially just

(15:03):
comparing the two keys. Does the key fit in the lock? And you're typically getting to that key through some form of biometric authentication, like maybe your fingerprint on the computer, or, you know, a passphrase, to get in and essentially unlock the key on your side, and then match it with

(15:24):
the key on the server. Fortunately for the user, you don't have to know any of that. When you go through and set up that passkey with Google, that's what's happening behind the scenes.
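Eric's "does the key fit in the lock" description is a public-key challenge-response. Here's a toy sketch of the idea using textbook-RSA numbers that are laughably small and insecure; real passkeys use WebAuthn with proper elliptic-curve keys, so everything below is illustrative only.

```python
import hashlib

# Toy textbook-RSA keypair: n = 61 * 53, e public, d private,
# with e*d = 1 mod lcm(60, 52) = 780. Demo numbers only -- insecure.
N, E, D = 3233, 17, 2753

def sign(challenge: bytes) -> int:
    """Authenticator side: sign the server's challenge with the private
    key, which never leaves the device."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % N
    return pow(h, D, N)

def verify(challenge: bytes, signature: int) -> bool:
    """Server side: check the signature using only the stored public key."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % N
    return pow(signature, E, N) == h

# Login: the server sends a fresh challenge, the device signs it, and the
# server verifies. No shared secret ever crosses the wire.
challenge = b"server-nonce-42"
sig = sign(challenge)
assert verify(challenge, sig)                # genuine device passes
assert not verify(challenge, (sig + 1) % N)  # tampered signature fails
```

Because RSA with a valid (e, d) pair is a permutation of the residues mod n, a tampered signature can never verify against the same challenge.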

Joshua Schmidt (15:35):
So, Eric, what is your messaging around getting people to take that extra step, whether it's organizationally or for personal information?

Eric Brown (15:42):
Organizationally it's a little bit easier, because if you hired us to come in and help with information security, it's like, you know, there's a new sheriff in town. This is just how we're going to do it. So, like it or don't, this is the way it is, and organizations can set that policy, right, because people are coming there to perform a function, they're getting paid

(16:03):
to perform that function, and the company is saying: this is how we're going to operate, these are the standards we're going to operate by, to protect the greater good. It's much harder when there's not that relationship of employee and company, in our personal lives. It sounds like you've had some good success in getting your

(16:26):
friends to do it, but all too often, unfortunately, it's that someone has had a bad event occur to them. They've had information theft, they've had something happen to them, and now they want to get clean, and then they're certainly willing to take the steps that it takes.

Nick Mellem (16:47):
Yeah, that's absolutely true, and unfortunately sometimes it does take, let's say, a disaster for an organization, or an individual, to wake up and see how important this is. We see it all the time with organizations not wanting to spend the money on cybersecurity. Then something big, something notable happens, there's a loss, whatever it is, and IT Audit Labs is getting a phone call, and

(17:11):
we're in there trying to help them remediate or, you know, make them whole again.

Eric Brown (17:16):
And, you know, we've got some really cool tools on the corporate side, and they're similar, just less expensive, on the consumer side, where we can come in and essentially create the ability to lock all of the corporate accounts. So all of the usernames and passwords that are used to get into servers or network devices can essentially be locked, and

(17:39):
then require a checkout process to get the credential. That's relatively easy to set up. The hard part is getting the adoption, and using the tools and configuring the tools in a way where it's not overly cumbersome, because it's really easy to turn the tool on and walk away.

(18:01):
But the usability side of that can be difficult if the tool is not set up and configured properly. Or maybe you purchased a tool that might have been a little bit cheaper, but it doesn't have the functionality to make it easy for the users to interact with.
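The checkout process Eric describes is the heart of privileged access management tools. A minimal sketch of the pattern, with hypothetical names and no resemblance to any particular product: credentials are locked by default, a checkout records who took what and why, and a second requester is refused until check-in.

```python
import time

class CredentialVault:
    """Toy PAM-style vault: credentials are locked by default and must be
    checked out (and back in) before use. Illustrative only."""

    def __init__(self):
        self._secrets = {}      # account -> password
        self._checked_out = {}  # account -> user currently holding it
        self.audit_log = []     # (timestamp, user, account, reason)

    def store(self, account, password):
        self._secrets[account] = password

    def checkout(self, account, requester, reason):
        """Release a credential to one requester at a time, leaving a trail."""
        if account in self._checked_out:
            raise PermissionError(f"{account} is already checked out")
        self._checked_out[account] = requester
        self.audit_log.append((time.time(), requester, account, reason))
        return self._secrets[account]

    def checkin(self, account, requester):
        """Return the credential so others can check it out."""
        if self._checked_out.get(account) == requester:
            del self._checked_out[account]

vault = CredentialVault()
vault.store("root@db01", "hunter2")

password = vault.checkout("root@db01", "alice", "patch window")
# While Alice holds it, a second checkout is refused.
try:
    vault.checkout("root@db01", "bob", "curious")
except PermissionError:
    pass
vault.checkin("root@db01", "alice")
```

Real products add the parts Eric calls the hard work: approval workflows, automatic password rotation after check-in, and session recording.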

Joshua Schmidt (18:19):
And stop clicking the Remember Me box.

Nick Mellem (18:24):
You tell them Josh.

Joshua Schmidt (18:26):
That was new to me. So, you know, if there was one takeaway for me today, that would probably be it, just because that's seemingly an issue now that probably wasn't, you know, not that long ago. Cool, thanks for your advice on that, guys. We're going to head on to the next article.

(18:47):
Nick had some insights on this one as we were getting prepared for today. This article is from The Hacker News again: "5 Ways Behavioral Analytics is Revolutionizing Incident Response." And it goes through how this whole AI tooling, and the AI analysis of data, has really matured over the last handful of years, and is actually useful now, whereas before

(19:09):
it might have been a little more clunky. You know, it might have been flagging things that were maybe not any kind of a threat, or kind of an anomaly. It was just kind of clunky. So it sounds like things have changed. The article states: "Behavioral analytics, long associated with threat detection, is experiencing a renaissance.

(19:29):
Once primarily used to identify suspicious activity, it's now being reimagined as a powerful post-detection technology that enhances incident response processes. By leveraging behavioral insights during alert triage and investigation, SOCs can transform their workflows to become more accurate, efficient and impactful. Fortunately, many new cybersecurity products, like AI

(19:50):
SOC analysts, are able to incorporate these techniques into their investigation capabilities, thus allowing SOCs to utilize them in their response processes." That's a lot there, but maybe, Nick, you could break it down and speak to whether you've used any of these tools, or seen any of this crop up in your work.

Nick Mellem (20:09):
Yeah, I mean, I think most of us have seen it, right? We're seeing all these new tools come out with some sort of AI benefit, some way to make your job easier, and in a lot of ways it is. You know, I think something a lot of organizations have in common is bandwidth issues. You need a lot more personnel than you may have, so

(20:30):
a lot of your, I don't want to say junior employees, but people in your organization wear multiple hats. I think a downfall to this in the past was it was very noisy. You'd get a lot of false alerts and end up chasing something that's not important, but maybe looked important, or alarming, I should say, when the threat comes in, but it's totally benign and has no inherent risk.

(20:51):
But, you know, for me the big takeaway here is: how do we save time, how do we fill positions, or, you know, manpower, with tools? And I think it's starting to do that now. Something that I connected to is, recently we had the presidential election, and, you know, we were all hands on

(21:11):
deck for some of our clients, watching network traffic and many other things, but most notably, you know, what's going on, what's coming in and out, from insider threats to everything. And not getting those false alarms is huge, especially when maybe you're overnight and you've got one or two guys on, gals on, watching alerts. So I think that's a big one for me.

(21:35):
And then also, Josh, if you scroll down a little bit further, they've kind of spelled it out, one through five. I think it was number three. Oh, number two as well. I was reading that and finding it to be very interesting and important, because a lot of times, let's say we get an alert that somebody is working out of the country, right, and we can't get ahold of them, or, you know,

(21:59):
the time change is so great that we're maybe getting an alert. Well, if we don't have to contact, or wait for, the user to contact us, you know, with these tools we can take immediate action to make sure there's no potential threat, or neutralize the threat right away. And then I think it was number four, if you'd scroll down a tiny bit further. Yeah, "enhance deeper investigation," and this, to me,

(22:21):
goes kind of back to the bandwidth issue. Right, we don't like to abandon any searches before we can get as deep as possible and find the root cause. But a lot of times it's either a bandwidth issue, or a know-how issue. And I think when we get these tools, it allows us to maybe get to that root cause a lot faster, and then we're able

(22:47):
to teach ourselves and our teammates, you know, maybe some new tactics on how to get there and how it actually happened, you know, inherently making the organization that we're working for at the time much safer. But to me, out of all the articles, this one had the most to chew on, I think.

Joshua Schmidt (22:55):
That sounds great, like streamlining your job, making it a little more accurate. You know, I've definitely had my credit card shut off while traveling. You forget to notify the bank, and all of a sudden you're in a different city and your credit card's not working. So what other examples might there be of detections that might go off, or set off some red flags, on these kinds of tools

(23:16):
that might alert professionals that there might be some malicious activity going on?

Eric Brown (23:22):
You know, Josh, the tools are getting pretty good. But I go back to the end user, and it's that end-user education piece: the behaviors that we have, just ingrained, to take shortcuts in life, right?

(23:42):
To make things easier. The malicious actors are able to do that now at scale with AI, so we've got to be able to respond at scale. But just a couple of weeks ago, during the election cycle, I went in to do the pre-voting a couple of days early, where you

(24:03):
go in, you get a ballot, an absentee ballot, you fill it out and take it to an election judge, and then the election judge reviews it, looks you up in the system, and gives you a ballot to vote.
During that process, I'm standing in front of the election judges. There were about three of them, and they're all

(24:24):
helping different customers, and I look down. He's sitting there in front of me, and I'm standing, and I look down at his computer: a pink sticky note with the password of that election judge's computer.

Joshua Schmidt (24:43):
Your favorite, sticky notes. You've got an eagle eye for sticky notes, I must say, Eric.

Eric Brown (24:50):
Well, this one was hot pink, Josh. I mean, you couldn't avoid it, right? And I took a picture. I've got to pull the password up here.

Nick Mellem (24:59):
Social engineering at its finest.

Eric Brown (25:02):
I mean, they're social engineering themselves, right? Absolutely. It's just egregious that there are no checks and balances. Nobody on either side of the person looked over and said, what are you doing? To be an election official... it's very

(25:22):
painful. I've got to find this thing now, but the password was something like, you know, maybe whatever the name of the building was, plus "one." But so, going back to the tools, Josh: from a security practitioner's standpoint, I think, leveraging

(25:43):
technology to where we can have the interactions with the users be transparent, and then create a positive experience for that user when they do report something. So one of the examples there was: if the user is out of the country, if the user is connecting in from a location

(26:07):
that's normally not their own, those are great things to trigger on. In email now, which is of course the number one threat vector, there's a lot of content coming in that's really well written, used to socially engineer the user into taking an action, and we've talked in the past about QR codes coming in.

(26:29):
The user then scans the QR code with their phone, and it breaks them out of the walled garden of the corporation. There are some cool tools now for when that does happen to make it into their mailbox. And I'm a big advocate for not putting it in a quarantine folder, not marking

(26:57):
it as potentially malicious, just deleting it, right? You're hardly ever going to get an email that is so critically important. If it looks suspicious, it probably is, so just get it out of the user's box completely, so that they can't interact with it. But for the ones that do make it through: if the users do choose to report that, which hopefully

(27:19):
they do, they report it, and then the system will take another look at it and allow the user to ask questions, like: why was this malicious, or why was this flagged, or whatever it was, right? So, you know, if the user reports it as suspicious and the system comes back and says, no, this is clean,

(27:40):
the user can ask those questions: oh, well, this looks suspicious because of X, or why is this not a malicious email? Rather than opening a ticket and waiting to hear back from somebody, the user is getting this nearly instantaneous response, which I think is really good. And I think we're

(28:01):
going to see more of that, just across the board, with AI and user behavior and analytics flagging what is outside our normal routine.
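One of the detections Eric mentions, a login from a location that's normally not the user's own, is often implemented as an "impossible travel" rule. A simplified sketch (the threshold and coordinates are illustrative): flag any pair of logins whose implied ground speed exceeds what an airliner could cover.

```python
import math
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def impossible_travel(login_a, login_b, max_kmh=900.0):
    """Flag two logins whose implied speed exceeds max_kmh (roughly
    airliner speed). Each login is (datetime, lat, lon)."""
    (t1, lat1, lon1), (t2, lat2, lon2) = sorted([login_a, login_b])
    hours = (t2 - t1).total_seconds() / 3600
    if hours == 0:
        return True  # two places at once
    return haversine_km(lat1, lon1, lat2, lon2) / hours > max_kmh

# Minneapolis at 9:00, then the "same user" from Sydney at 10:30 -- flagged.
mpls = (datetime(2024, 11, 5, 9, 0), 44.98, -93.27)
sydney = (datetime(2024, 11, 5, 10, 30), -33.87, 151.21)
print(impossible_travel(mpls, sydney))  # -> True
```

A production rule would also handle VPN egress points and shared corporate IPs, which is exactly where the false-alarm tuning Nick described comes in.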

Joshua Schmidt (28:16):
Yeah, and that's what the article got into. I think they're just finding more algorithms or, for lack of a better word, more models of people's behavior, and using even more data points in those models, even like the way people interact with applications, for example, their tendencies and their workflow. I do wonder what this does for our privacy, even though

(28:40):
it's kind of shot anyway. It feels like another gateway to giving away even more of our privacy. But perhaps, like everything else, it'll be kind of a seesaw effect, balancing out the risk versus the reward there.

Eric Brown (28:59):
There was a company in Eden Prairie, Minnesota, I believe, and I'm going back now 10 years. I forget the name of the company, but a former colleague worked there for a while. Small kind of startup, and their idea was they were going to, or

(29:20):
they did, map how we type. So there was essentially an agent that would sit on the machine and map how you, as a user, type, and then that became the way the machine would authenticate you. So rather than you logging in with, you know, either a fingerprint or a password or facial recognition, whatever, and

(29:43):
then, you know, you essentially go away, and then a malicious actor could take over your computer: the likelihood that the malicious actor had the same typing patterns... And I think they were detecting, you know, the milliseconds of time between keystrokes, how long the key was pressed, all those sorts of

(30:04):
things, which would create a digital characterization of how we type, and that would authenticate us. And it was just continual authentication, like every six seconds it was kind of running this check. So really cool in theory, and I think, in limited-run practice, it worked pretty well. I think some of the breakdowns were maybe if you had, like, an

(30:27):
injury. But, you know, we're probably not getting that injured that often in our hands. Maybe Nick's getting his hands scratched by his cats or whatever, but I don't know if that impacts his typing. You know what I mean, right? So I don't know what happened to that company. Now that we're talking about it, I want to look it up. Yeah, I've got you, Nick. As a guitar player?

Nick Mellem (30:45):
I've had a stint.

Joshua Schmidt (30:47):
You got it?
No, actually I got it.

Nick Mellem (30:52):
Oh, I didn't even see it right now. Shoot, I've got to go. See you guys. Rawr, rawr.

Eric Brown (31:02):
For those of you who don't know why it's a joke: Nick's got a lot of hairless cats. For some reason he keeps getting more.

Joshua Schmidt (31:09):
Yeah, Nick, you might want to pay attention to this article, buddy, because we've got a new attack vector for Nick. It's getting really granular here.

Nick Mellem (31:19):
Did you have AI make this up before we came on?

Joshua Schmidt (31:22):
I actually found this.
I had to kind of work it intothe episode today.

Eric Brown (31:50):
It's a good one, though, josh, because it really
is talking about SEO poisoning,or searchGhoulish, which is an
attack vector that and it's verysimilar to this search engine
optimization compromise, butSoshGhoulish is spelled
S-O-C-G-H-O-U-L-I-S-H.
Threat actors compromise awebsite and it's typically maybe

(32:11):
not a website that is, you know, run by a corporation with a dedicated security team.
I know one of the examples was MADD, Mothers Against Drunk Driving; I think this was a while back.
Their website was compromised with this malware, and what the malware does is, when the user connects to the website, it

(32:34):
pops up a message that will say, for example, your browser is out of date, you need to refresh your browser, and it will attempt to entice the user to click on that and then go through the process of injecting and installing the malware, so

(33:06):
it's checking whether it's in kind of a sandboxed environment or if it's actually a live user coming in to visit that website. And if it's something coming through a virtual machine or a sandbox, then potentially it won't execute.
But very similar to where we're going with this on the search

(33:27):
engine optimization poisoning where, same thing, websites are compromised and then users are socially engineered, so to speak, when they type in a search query that they're looking for.

(33:48):
In this case it was around, Idon't know, it was at Bengal
Cats in Australia.
That would send them to sitesthat contained a malicious zip
file that would then, you know,potentially be downloaded.
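The victim filtering Eric describes, showing the lure only to what looks like a real user while staying silent for crawlers and sandboxes, can be sketched as a toy check. The request fields and rules here are hypothetical illustrations of the technique, not taken from actual SocGholish code:

```python
def should_serve_fake_update(request: dict) -> bool:
    """Toy sketch of victim filtering: show the fake browser-update lure
    only to what looks like a real user in a real browser, and stay
    silent for crawlers, automation, and repeat visitors. All request
    fields here are hypothetical."""
    ua = request.get("user_agent", "").lower()
    if "bot" in ua or "headless" in ua:
        return False  # search-engine crawlers and headless scanners
    if request.get("webdriver"):
        return False  # browser automation, a common sandbox tell
    if request.get("seen_before"):
        return False  # many campaigns fire only once per visitor
    return True
```

Real campaigns do this server-side and in injected JavaScript, profiling many more signals, which is why an analyst visiting from a VM often never sees the payload at all.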

Joshua Schmidt (34:15):
Yeah, we might have to come up with another name.
You know, social engineering? It might have to be feline engineering. I'll leave that one to you, Nick.
I see they have SEO poisoning up there.
That's kind of interesting.
But yeah, basically you summed it up, Eric.
I'm just going to read this for the people that don't know.
In an unusually specific campaign, users searching about

(34:38):
the legality of Bengal cats inAustralia are being targeted
with the Goot Loader malware,the goot loader malware.
In this case, we found thatgoot loader actors using search
results for information about aparticular cat in a particular
geography being used to deliverthe payload are bangle cats
legal in australia?
Uh, sophos researchers, trangtang and hikaru koike I think

(35:02):
I'm saying that right, and Asha Castle and Gallagher, said it in a report published last week. Nick Mellum?
Yeah, so basically what you said, Eric. You know, really hyper-specific, which is why I did find this interesting, and it was bonus points for having the cat in there.

Nick Mellem (35:18):
But you guys nailed it this week.

Joshua Schmidt (35:20):
Not sure about the legality of Bengal cats here in Minnesota. Maybe Nick could let us know. Or I know Texas has got some looser laws.

Nick Mellem (35:30):
You know whatever you want.

Joshua Schmidt (35:31):
Maybe Tiger King? Was it Tiger King down there?
That was Florida, though. Oh, of course.
So this one is actually going even further to say there had been a newer attack, making it less funny. I guess I can't remember exactly what the instance was. It was making a different type of a website.

Nick Mellem (35:53):
I would be very interested to have an industry professional in this space come on, because this is something I'd be very curious about learning more on. Are you an industry professional in this specific space?

Joshua Schmidt (36:09):
But thank you very much, eric yeah, here,
here's the, here it is what Iwas referring to.
Um, they changed the seopoisoning to california law
break room requirements, whichis probably a lot highly cert,
more highly searched than, uh,the bengal cats.
I'm assuming the californiansneed a little extra break time.
They're looking for that breakroom, want to get the laws

(36:30):
around that.
But yeah, that seems to be like a higher-traffic kind of keyword SEO than the Bengal cats.

Eric Brown (36:37):
Remember when you used to be able to smoke in the
break room?
No, I don't.
That's going back years.
Yeah, just the fact that youhave a break room.

Joshua Schmidt (36:47):
We have a great break room at it audit labs.
I must uh call out the pinballmachines, eric.
Thank you for those fun timesyeah hey, I know we got a run
today.
Um, we got a little bit ofshorter episode, but uh, that's
all right, um, and we thankeveryone for listening.
Uh, you've been listening tothe audit presented by IT Audit
Labs.

(37:08):
My name is Joshua Schmidt,co-host and producer.
We're joined today by Nick Mellum and Eric Brown.
Please like, share, and subscribe, and catch us again in two weeks.
Thanks, all.

Eric Brown (37:19):
You have been listening to The Audit, presented by IT Audit Labs.
We are experts at assessing risk and compliance, while providing administrative and technical controls to improve our clients' data security.
Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact.
Our security control assessments rank the level of

(37:41):
maturity relative to the size of your organization.
Thanks to our devoted listeners and followers, as well as our producer, Joshua J. Schmidt, and our audio-video editor, Cameron Hill.
You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials and subscribing to this podcast on Apple, Spotify, or wherever you source your

(38:05):
security content.