Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Joshua Schmidt (00:00):
Hi, I'm Joshua Schmidt, and welcome back to The Audit. As we close out the year, I wanted to take a beat and look back at some of the conversations that really stuck with us. The episodes that made us think differently, the guests who challenged our assumptions, and the moments where we realized cybersecurity isn't just evolving, it's fundamentally transforming. This is our 2025 year in review.
(00:25):
Okay, for our first clip, I wanted to share something from episode 72 of The Audit that honestly changed how I think about hiring. We sat down with Justin Marciano and Paul Vann from Validia to talk about the deepfake workforce. And this isn't just some distant threat. This is happening right now. Now, towards the end of our conversation, they showed us just how ridiculously easy it is to create a convincing deepfake for a job interview. We're talking minutes, not hours.
(00:46):
And the implications? Well, they're staggering. We're looking at national security threats, intellectual property theft, and entire industries vulnerable to infiltration. So settle in and let's take a look back at a year that proved one thing beyond any doubt. In cybersecurity, standing still means falling behind. Let's dive in.
Logged back in here on The Audit to, um, talk with, uh, I
(01:09):
don't know. Who are we talking to today, fellas?
Justin Marciano (01:11):
Who do we have? Uh, it's Justin Marciano here, in, uh, in a different body, in my roommate's body. Shout out, shout out Edward Masarro. Um, sorry for putting you on the podcast here. Uh, but, uh, you did give me permission to use your name, image, and likeness.
Paul Vann (01:28):
Uh, so here we are. And then we've got, uh, we've got me as Justin Marciano here, uh, a live deepfake we prepared, uh, a little bit before this call.
Nick Mellem (01:38):
That is really wild.
Justin Marciano (01:41):
So yeah, honestly, as a great explanation, here's kind of two different versions, right? Paul's is a live deepfake that was pre-recorded. Um, we can stream live via, uh, like if we wanted to actually do a live deepfake, we can. Um, the purpose of what I'm doing, or this actual product here, is more for people that are on the road, on a ski lift.
(02:02):
You can essentially just be in a controlled environment, and you train a model on that. So that's what's running in the background right now through this camera and with the, uh, with the voice. And then on Paul's end, you know, you can legitimately produce real-time deepfakes nowadays, where, you know, you take someone's face, um, and use an audio-changing tool at the same time and have a conversation just like that.
Eric Brown (02:25):
And Justin, is that tool called Pickle, the one that you're using?
Justin Marciano (02:28):
Yeah. So I'm using Pickle, and then, uh, Paul, what tool did you use again? There's a, there's a million open source ones.
Paul Vann (02:35):
The video that I recorded is actually fully open source. It's, uh, using Deep Live Cam. You can install it on your Mac and you can connect your webcam and in real time swap your face. Like I said, this one's pre-recorded, but yeah, we did this one live and just screen recorded the live rendition.
Nick Mellem (02:50):
Meaning as Elon Musk?
Justin Marciano (02:52):
Oh yeah. There's a lot of, there's a lot of videos on, uh, on X of people doing that, like a live stream with his face, which has actually caused some pretty significant scams too. That's the reality of it.
Joshua Schmidt (03:05):
The one that Justin's using, Pickle, I could see how someone could use that today, and then maybe, maybe they freeze it intentionally and just go, oh hey, my screen's frozen, or my camera's frozen.
Justin Marciano (03:16):
And, uh, so this is me in a similar environment, not the same environment. Give it a sec to start the lip sync control. Um, I probably filmed it right in this room, the same room. So give it a second, and then, uh, we'll be able to do the, yep, lip sync is now back on. So yeah. A little bit wider of a mouth for sure, but it goes to show
(03:38):
you can have different personas. It's supposed to be just of you, for context, but you know, adversaries and people who use technology for whatever purpose they want. So I, uh, got to use my roommate there too. Might get me banned from the platform, but it is what it is.
Nick Mellem (03:57):
I just downloaded Pickle.
Eric Brown (04:00):
Well, there you go, Paul. You switched it. Nice.
Paul Vann (04:03):
Yeah, I actually just used, uh, like, so to create these deepfakes, you have to have a virtual camera. I was just able to swap my, uh, my virtual camera. It's pretty cool though. You can actually see that I can almost double up. Uh, I can double up in a way and have a little bit here, a little bit there. Uh, but, uh, yeah, you know, virtual cameras are fantastic.
(04:23):
That's, I mean, that's how people are creating these deepfakes today.
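[Editor's note] Paul's point about virtual cameras is the key mechanical step: a face-swap model rewrites pixels in each frame before a virtual camera driver presents the result to the meeting app as ordinary webcam input. As a hedged, toy illustration (not the Deep Live Cam code itself), the per-frame compositing step can be sketched like this, assuming a face detector has already produced a bounding box:

```python
import numpy as np

def composite_face(frame: np.ndarray, swapped_face: np.ndarray,
                   box: tuple[int, int, int, int]) -> np.ndarray:
    """Paste a generated face crop into a video frame.

    `box` is (x, y, w, h) from a hypothetical face detector;
    `swapped_face` is the model's output resized to (h, w). Real
    tools blend edges with a mask; this toy version does a hard
    paste just to show the data flow.
    """
    x, y, w, h = box
    out = frame.copy()
    out[y:y + h, x:x + w] = swapped_face
    return out

# A fake 480x640 RGB frame and a fake 100x100 "swapped" face crop.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
face = np.full((100, 100, 3), 200, dtype=np.uint8)
result = composite_face(frame, face, (270, 190, 100, 100))
```

In a live pipeline this runs in a loop: capture a frame, swap, then hand the result to a virtual camera device (for example via a library like pyvirtualcam), so Zoom or Teams sees it as a normal webcam.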
Joshua Schmidt (04:28):
All right, we're on to clip number two. This is where things get real interesting, and kind of nerdy. Bill Harris joined us to talk about quantum computing and what happens to all of our encryption when quantum becomes practical. Now, Bill's knowledge goes way beyond just the security implications. Now, I know what you're thinking. Parallel universes on a cybersecurity podcast? Just trust me on this one. I'll let Bill break it down.
Bill Harris (04:51):
So I got one more thing that actually goes into philosophy a little bit here. So, um, I want to call out something that Google uncovered, uh, and then something that NASA uncovered. So Google was doing some quantum operations. Uh, this wasn't that long ago. I think it might have been last year. And there's a blog post about it on their
(05:16):
site that said they got some unexpected results that they believe lend credence to the theory of parallel universes. And they put this right out there on their blog and wrote a bit about it, which I found really surprising. But what was even stranger was that there is this story circulating. And the story itself, the first part of the story, is true.
(05:39):
NASA shut down its quantum computer in February of 2024 because they got some unexpected results that they said challenge contemporary thinking. And they shut it down, and with no further comment, they began looking into it.
Eric Brown (05:54):
Well, hold on a minute. Is that like the Catholic Church shutting down science in the 1400s because the sun is no longer the center of the universe? Well, not necessarily.
Bill Harris (06:07):
So it might be that, but you're doing what people are doing, right? People were wondering, well, what's happening here? So imaginations have really run rampant. And the stories right now range from something, you know, like, well, they just found, you know, some new math that they're trying to resolve, to they've stumbled into some alternate reality, or they have, uh, stumbled across some type of
(06:30):
extraterrestrial intelligence. Yes, like the movie Contact. Remember that?
Joshua Schmidt (06:37):
Great flick.
Nick Mellem (06:38):
Yes. I, yeah, I had it sitting back here for like six months and I finally threw it out.
Joshua Schmidt (06:43):
Get it out, because I'm bringing it there. Here we go. Have you heard of the Mandela effect? I want to say yes, but go ahead. Hit us with it, Josh. This has been going on, this has been going on like wildfire over the last probably five-plus years on the internet. Have you heard of it, Bill? I've not. Maybe it was the, what's the particle
(07:05):
accelerator in Switzerland called? Is it the Large Hadron Collider? Yeah, yeah. The theory goes that CERN started messing with that. We slipped into a different timeline. So there's people that remember things inaccurately. And Nelson Mandela, uh, dying is one of those, where some people remember him passing away, and then some people
(07:27):
remember him being released from prison. Another example would be the Berenstain Bears, the way it's spelt: the Berenstain Bears or the Berenstein Bears. Um, it goes into pop culture, uh, verbal cues, um, like mirror, mirror on the wall from Snow White. It's actually magic mirror on the wall. So they're saying that we slipped into a different
(07:47):
timeline at some point.
Eric Brown (07:49):
I'm gonna use this in my next argument at home with my wife, right? Where it's like, I'm constantly, I'm like, yo, you're in a different, uh, dimension here. That's the best case.
Bill Harris (08:01):
So maybe that's, you know, lending some credence to, uh, to this. Yo, so it's funny you should say that, because that's exactly kind of where some of these arguments are going. They're drifting off into quantum memories. Um, and the theory that you actually never
(08:22):
really die, because at every juncture in your life, like Schrödinger's cat, you either live or you die. And so a version of yourself will live in perpetuity as it always junctures off. And so you'll die an infinite number of times, but there's always another one of you out there in some type of an alternate universe.
Eric Brown (08:42):
Why did, why did they shut it down if they found something interesting?
Bill Harris (08:46):
They didn't understand the results. Um, so the results that they got from their quantum machine, uh, did not correlate with their understanding of physics. Or they just didn't like what they got back.
Joshua Schmidt (09:01):
All right, moving on to clip number three. Our next clip, we're jumping back in time to episode 75 with Alex Bratton from Lextech. Now, Alex came on to talk about something every business is dealing with right now
(09:10):
AI integration. And here's the thing: companies are racing to adopt AI tools faster than they're thinking about the security implications. Alex breaks down why businesses need to pump the brakes and actually think through the ramifications of integrating AI before it's too late.
Alex Bratton (09:28):
And that is the challenge that we've got right now, because people are blazing straight ahead. And rewinding a little bit, I'll use the iPad as the example, uh, when it came into the business world. It was brought into the business world by business leaders who were like, I need this, I want this, and whether that was a doctor or a CEO, it didn't matter. And the technology teams were so far behind that it
(09:49):
made it difficult for everybody. They couldn't figure out, okay, how does this connect to the network? How do I secure it? How do I do anything with it? And many IT teams today, at any company size, are doing the exact same thing. They're struggling. There are so many ways that we can empower teams, but is this tool secure? Did somebody just sign up for a tool that's free that's stealing all of our information?
(10:10):
So step number one is this stepping back and communicating with the team around, hey, what are our expectations? What is responsible use of AI? It doesn't mean here's the five tools you're allowed to use, or, something we see very frequently in big companies, the answer to AI use is no. Well, that's not real. Come on. What are we gonna do with it?
(10:31):
What does responsible mean? Number one, you have to understand what are the licensing terms of this thing we're using. What does it do with our information? And for non-geeks, looking at legal agreements, that's not awesome. Um, internally, we wrote a custom GPT to analyze that. So we could give it a tool name, it would go out, grab all the documents, bring it down, and say, yes, they're gonna train
(10:52):
on your data. And that's the magic question. Um, but the simple statement being: if it's free, you're giving them your data. Forget it. Don't ever do that. If it's paid, okay, well then get somebody involved if you're gonna use it officially. But while you're tinkering, don't give it state secrets. That doesn't work. Um, but taking those to the point where we know what's going
(11:16):
on. Um, now for me personally, again, coming in on the Vision Pro, um, one of the things I love about Apple is their privacy stance. I think one of the huge challenges, and I don't know that folks see it coming, is that when we couple AI with a company that hosts all of our emails and our documents and that makes money by selling advertising, that's a bad combo.
(11:40):
And we have two megacorps that sit there at the center of that, that's dangerous.
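[Editor's note] Alex's custom GPT for terms-of-service analysis is proprietary, but the core check he describes, "are they gonna train on your data?", can be roughly approximated with a plain keyword scan. A minimal, hypothetical sketch (the phrase list and sample text are illustrative, not from the episode):

```python
# Toy terms-of-service scanner: flags phrases that commonly signal a
# vendor may train models on customer data. A real review still needs
# a human (or a far more capable model) reading the full agreement.
TRAINING_PHRASES = [
    "train our models",
    "improve our services using your content",
    "use your data to develop",
    "machine learning purposes",
]

def may_train_on_data(terms_text: str) -> list[str]:
    """Return the suspicious phrases found in a terms-of-service text."""
    lowered = terms_text.lower()
    return [p for p in TRAINING_PHRASES if p in lowered]

# Hypothetical sample clause for demonstration.
sample_terms = (
    "We may use your data to develop and train our models, "
    "subject to the opt-out described in Section 9."
)
hits = may_train_on_data(sample_terms)
```

Here `hits` contains "train our models" and "use your data to develop", surfacing automatically the "magic question" answer Alex talks about pulling out of vendor documents.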
Nick Mellem (11:47):
You lobbed it up perfectly, Alex. I was thinking about these things along the same lines. If organizations haven't stood up a governing body for AI, get a board of people, four or five people, whatever you decide on, and start deliberating on what direction you're taking with AI.
(12:07):
Because it's coming, you gotta do it. You have to do it now, because we run the risk, if we don't do that, your employees are gonna do it on their own anyway.
Alex Bratton (12:15):
Shadow IT is real, it's gonna happen. Of course.
Nick Mellem (12:17):
Yeah, we've seen it all the time through, uh, plugging Proofpoint. You know, we see it at one of our clients: they're emailing between their work and their personal accounts, back and forth. So what do you think they're doing, right? They're taking the information that they're working on at work, they're sending it to their LLM on their personal machine, and they're sending it back with the output, right? So they're using it already.
(12:38):
Uh, sometimes they might be using it, uh, you know, on their machine. But so first off, it's, you know, figure out the direction you want to go. Let's peel the hood back and figure out where you want to go, what we want to use it for, if you're gonna use Claude, if you're gonna run in the cloud or if you're gonna run on-prem, right? How are you gonna do this? Um, and then you have to get policies and procedures out.
(13:01):
You have to, uh, educate your staff, and then you have to train your staff on how to actually use it, right? It's one thing for us to say, oh, don't use state secrets, right? That's just a given. That's something the three of us just know is a no-no, we're not gonna do that.
Alex Bratton (13:16):
But what does it really mean?
Nick Mellem (13:17):
Yeah.
What does it really mean?
Exactly, right?
Alex Bratton (13:20):
Yeah, totally agree. And it's helping them be comfortable with it. And again, for me, that's about how are we baking this into our culture? This is the new norm. This isn't as simple as, hey, we're just gonna hand Excel to everybody who doesn't have it. We're not giving you a tool. We have to change how we're thinking about work. Every time I'm about to do a task, how could AI maybe help me
(13:41):
with this? And just giving simple guidance, not IT scary stuff of no, no, no tools. Like, no, we have to embrace the whole company in surfacing this stuff. IT can't own all of it. Totally agree.
Nick Mellem (13:52):
Totally, totally agree. I actually think organizations could run a risk of overprotecting this. You can tighten the bolts too hard. Um, and we run that balance in cybersecurity all the time, right? People need to be able to function, right? I mean, if we're in a perfect world, we would just disconnect from the internet, right? And then we're good, right?
(14:13):
In the cybersecurity world, that's not an option for us, right? So we've got to ride that razor-thin line of tightening the bolts too hard. And we're having the same problem, I think, with AI. I think a lot of organizations are gonna come in and they're gonna put too many policies and procedures around it, and then we'll be deleting some, or we'll realize we need to add things back.
Alex Bratton (14:32):
It's interesting when you hit on the potentially clamping down too hard. Um, that's something that we see very, very frequently in the big businesses, in the Fortune 500, because they have very sophisticated IT, security, and data teams. So, especially over the past year or two, many of those teams have raised their hands saying, we own everything. Nobody touch it, nobody do anything.
(14:55):
And unfortunately, many of them have been successful in wrapping their arms around it and kind of shutting everybody out. Which, again, has totally missed: why does this technology exist? It doesn't exist for that group of people, it actually exists for the people who aren't the experts, aren't the geeks. Wait, I can just talk to something, I can chat with
(15:15):
something. And so where we're seeing people be successful is actually focusing on the employees. Maybe it's an airline pilot or a flight attendant, maybe it's a salesperson. What would help this person do their job? What would empower them? What would take friction out of the way? And the part that, again, the groups that are locking it down are missing is: if we can identify the couple things we want to help people with, we can just trace a thin line through
(15:38):
the back-end systems, the data, the policies. We can figure out that part first. And then we can do the next line and figure out that part. Instead, we're boiling two oceans. We're boiling the ocean of what's all of our data, make it accessible. Well, you know, to power something for this person over here, you might not even have the right data yet. But most folks say, hey, we have this, so let's take our
(15:59):
ocean and figure that out. And then let's look at the systems. How do we AI-enable all of these systems and make them all accessible? I appreciate the thinking, but spending three years on that means you are gonna be so far behind. When I look at mid-market and smaller businesses, I love the passion of business leaders saying, we have to do this, hey team, let's figure this out. And they are moving so much faster, which is actually
(16:23):
hinting to me that I think we're gonna see a lot of leadership positions in a bunch of different industries change.
Joshua Schmidt (16:28):
Is that a sea change in thinking of employees not just as another cog in the wheel to accomplish a task, but treating each individual like their own entrepreneur within the company? Um, is that kind of where you're coming from with that AI enablement?
Alex Bratton (16:43):
Uh, that's an interesting way to put it. And that is my core belief: things like AI should be giving people superpowers. We should be helping that one person be 10 or 100 times more effective, not how do I implement AI so I can fire my whole team? I think anybody doing that, number one, you're focused on cost cutting, and that's not good for growth, that's not where
(17:05):
you grow. Number two, the great ideas of where AI is going to transform the business come from those people. And they're the ones that have the ideas. So once they get comfortable and they can lean in and have that entrepreneurial, always-learning, hey, what if? When they start to ask the what-if at times 10 or 100 or 1,000
(17:26):
employees, that changes everything.
Joshua Schmidt (17:28):
Seems like it started with kind of an altruistic stance. I mean, it's baked into the name, right? OpenAI. It started out open source, I believe, but now we have this kind of AI arms race going on with nation-states. Um, how do you see that squaring off? It seems like things should be moving more to open source if
(17:49):
everybody's enabled to kind of be their own entrepreneur and, um, everyone has Leonardo da Vinci and Albert Einstein in their pocket. How do we get from where we are, keeping things closed, but allowing people certain liberties? And how are we going to balance all that moving forward?
Alex Bratton (18:06):
That's an interesting one. Um, I don't know that it necessarily needs to go open source. And I think, again, back to what we the humans control: we control being great communicators, we control coming up with the ideas and framing what it is that we want the AI to do. Um, and then even, hey, here's the process. When we're building things, for example, even in the agentic
(18:26):
world, I mentioned ChatGPT earlier. You know, it's a simple go-to. But you know what? If I'm writing code on my Mac and I want to have it generate different types of code, I might be pointing at different systems. So the simple statement of not ever being locked into a single vendor is more important now than ever in technology. We have to be crafting things in such a way that there's
(18:48):
flexibility there.
Joshua Schmidt (18:51):
And on to our final clip. This comes from episode 65 with Dan Schaefer and Adam Warner, where we talked about Pi-hole. Now, Pi-hole is a great network-level ad blocker. You may have heard of it. But what really struck me about this conversation wasn't just the tool itself. It was what Dan and Adam showed us about how tech communities can come together. This is open source at its best. People who care about security and privacy, building something
(19:13):
valuable, sharing it freely, and creating a whole community of like-minded folks who just want to take back a little control over their digital lives. Here's Dan and Adam on what makes the Pi-hole community so special.
Eric Brown (19:26):
Like we were talking about earlier, this is really enterprise-grade technology at a fraction of the cost. So you could bring it in and plug it into a network, and it doesn't consume many resources at all. And a really big shout out to the Pi-hole community, because
(19:47):
there are some people out there that are curating and generating lists that we all then consume and use. And the lists are up to date. I think some of them are updated daily, if not more frequently, with really new and emerging malicious, uh, sites.
(20:08):
And we can consume that, and then we're just as safe as an enterprise organization.
Joshua Schmidt (20:14):
Adam, maybe you could speak to just the value of the community, and how those people have been generating those lists, and how you integrate that information into what you and Dan are working on.
Adam Warner (20:27):
Every blocklist that's out there is community-maintained. We don't have an opinion. As the software itself, we don't care what you're blocking. You can block as much or as little as you like. Um, it's really up to you how you use it. Um, so when we initially install, just to make sure it
(20:47):
works and just to sort of lower the barrier to entry, we have one suggested list, um, which we found works quite well, doesn't block too much, doesn't break a lot. Um, and that's just there to get people started. But, uh, yeah, there are, I mean, certainly on Reddit, there's a guy, uh, forget his name, Wally3K, um, who, uh, maintains a
(21:10):
list of lists. So not just his own lists that he puts together, but he also, um, I think he goes through and kind of optimizes, uh, a few other people's lists. Um, Firebog.net, I believe, is where he keeps those. Firebog. And then there are just so many people out there that are coming up with different
(21:30):
things.
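[Editor's note] The community lists Adam describes are mostly plain text in hosts-file format, one entry per line, which is part of why they are so easy to share and consume. A hedged sketch of how such a list can be parsed (the sample entries are made up; Pi-hole's own gravity importer is far more thorough):

```python
def parse_blocklist(text: str) -> set[str]:
    """Extract blocked domains from a hosts-format blocklist.

    Lines look like "0.0.0.0 ads.example.com"; comments start with '#'
    and blank lines are skipped. Bare-domain lists (no IP column) are
    handled too.
    """
    domains = set()
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        parts = line.split()
        # hosts format: "<ip> <domain>"; bare format: "<domain>"
        domain = parts[1] if len(parts) > 1 else parts[0]
        domains.add(domain.lower())
    return domains

sample = """\
# sample community list (hypothetical entries)
0.0.0.0 ads.example.com
0.0.0.0 tracker.example.net  # telemetry
bare-domain.example.org
"""
blocked = parse_blocklist(sample)
```

Pi-hole's gravity process aggregates many such lists into one deduplicated set, which is exactly why a curated "list of lists" like the Firebog is so valuable to the community.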
Joshua Schmidt (21:31):
One list to rule them all, it sounds like. And there you have it, just a small sample of the incredible conversations we had in 2025. If you enjoyed this look back, make sure you're subscribed so you don't miss the things we have coming in 2026. Because trust me, we're just getting started. You can find us wherever you get your podcasts. Check out our YouTube channel for video episodes and shorts,
(21:53):
Spotify for video and audio, and connect with us at ITAuditLabs.com or on LinkedIn for more cybersecurity insights. Until next time, stay curious and stay secure. This is Joshua Schmidt with The Audit, and we'll catch you in the next one.
Eric Brown (22:07):
You have been listening to The Audit, presented by IT Audit Labs. We are experts at assessing risk and compliance while providing administrative and technical controls to improve our clients' data security. Our threat assessments find the soft spots before the bad guys do, identifying likelihood and impact, while our security control assessments rank the level of maturity relative to
(22:31):
the size of your organization. Thanks to our devoted listeners and followers, as well as our producer, Joshua J. Schmidt, and our audio/video editor, Cameron Hill. You can stay up to date on the latest cybersecurity topics by giving us a like and a follow on our socials, and subscribing to this podcast on Apple, Spotify, or wherever you source your
(22:53):
security content.