Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Drew Thomas (00:04):
Fast Fact: in 2023, cyber attacks accounted for over 343 million victims. I'm Drew Thomas, and you're listening to Bank Chats.
(00:39):
Welcome to the next episode of AmeriServ Presents Bank Chats. I am Drew Thomas, and as we have discussed in the past, cybersecurity is absolutely an ocean of a topic that you can dive into just as deep and wide as you want to. And so we're going to once again revisit the topic of cybersecurity. In this case, we're going to talk a little bit
(01:01):
about not so much specific scams and things, but really about the different types of technology that are out there, and how they might impact your cybersecurity in the real world, in your day-to-day life, and talk about some of those terms and how they work and what exactly they are. And, once again, I am very pleased to welcome back some previous
(01:24):
guests that we've had on the show. We have Kevin Slonka once again with us, as well as Michael Zambotti, both of them from Saint Francis University. Welcome back, guys.
Kevin Slonka (01:35):
Thanks. Nice to be
back.
Michael Zambotti (01:36):
Hey, glad to
be here.
Drew Thomas (01:37):
Yeah. Let me give you guys a chance to explain your, your credentials to those that may not know you from past episodes, and then we'll go from there.
Kevin Slonka (01:45):
Sure. So, Kevin Slonka, I teach Computer Science and Cybersecurity at Saint Francis University. And I've also worked in industry since about 1999.
Michael Zambotti (01:56):
Mike Zambotti, I have worked in the financial services arena in the past, and now I teach Cybersecurity and do consulting as well.
Drew Thomas (02:05):
All right, fantastic. So, in the past, we've, we've done sort of a general overview of cybersecurity. There's an episode that we did on that; if you haven't heard it, you can go back and listen to it. It sort of touches on all things cybersecurity at a very, very shallow level. We also did an episode where we talked about various scams and went more in depth about things like ransomware and phishing and things like that. Today, we're
(02:28):
going to talk about some of the different technologies that are out there, and how they might impact you in the cybersecurity world. And I think that one of the things that we were talking about that we want to start with is the cloud. Everybody talks about everything being in the cloud, as though it's some sort of a mythical place where you can go and visit. Let's talk a
(02:50):
little bit about what the cloud is.
Kevin Slonka (02:52):
It's up in the
sky, isn't it?
Michael Zambotti (02:53):
Yeah, whenever you talk about the cloud, you have to look up. Yes, it's, that's where it is, your data is up in the sky. If it rains, you can't get to your data.
Drew Thomas (03:02):
I, you know what, that is possibly something someone might think. I mean, but really, your data is not being hung out just somewhere in the ether, it lives somewhere.
Kevin Slonka (03:13):
Yeah, I mean, saying the cloud and saying your stuff is in the ether is, I mean, it's a legitimate way to think about it, because it's not like physically within our possession on our computer. It is, you know, I'm putting air quotes here, somewhere else. But, you know, the way we always like to explain the cloud is that it's just somebody else's computer. Right? So, if you're storing your data on, say, Dropbox, or Microsoft OneDrive,
(03:37):
you know, something like that, where it's some external service you have to log into out on the internet to store your files, that's the cloud. But that is literally just on somebody else's computer. You know, if you're storing your files on Microsoft OneDrive, those are physical servers in a Microsoft data center somewhere,
(04:01):
right? Whether it's in California, whether it's on the East Coast, they have data centers everywhere, but your data lives somewhere physical. It is somewhere, but to us, we don't have to care about that. And that's the benefit of, air quotes again, the cloud, is that we don't have to care where our data is. It's just magically accessible to us, and somebody else takes care of the physical
(04:23):
part and the storage part, you know, we don't deal with that.
Michael Zambotti (04:28):
Yeah, there's definitely benefits to the cloud. There's challenges as well. And we'll, we'll cover both and kind of dip into those a little bit throughout the episode. But yeah, as Kevin said, it's, it's really somebody else's computer. Your data is in a data center. You know, if it's in Microsoft, East Coast, probably somewhere in Virginia. There's a giant building, which has more servers than you can
(04:48):
even imagine, and that is the cloud. Interestingly, Microsoft actually did a proof of concept where they put kind of a carrier of servers underwater, just to see if they could do it. And they ran services from basically an underwater data center. An underwater cloud? An underwater cloud, yeah. That's almost like fog,
(05:08):
right. But yeah, the cloud could be anywhere, your data could be anywhere in the world. Maybe someday there'll be data centers in space. Who knows.
Kevin Slonka (05:16):
And that's a really important thing that you just mentioned, you said the world. That's something that we also need to be cognizant of when we're storing our data on these random websites. Where are those data centers? Like, we tend to think, oh, you know, California, Virginia, Idaho. But could it be Russia? Could it be India? Could it be China? You
(05:40):
know, the internet is everywhere, these data centers could be anywhere. So, I mean, that has its own host of problems if you start dealing with crossing country lines.
Michael Zambotti (05:50):
Yeah, whenever we think about challenges, you know, think about our data. If your data is stored in a country like China, last time I checked, they don't use the US Constitution in China. They have their own set of laws, and their laws apply to data that is in their data centers, whether it's China, Russia, or Ireland. Ireland is actually a country with a lot of data centers.
Drew Thomas (06:10):
So, in a strange way, for people that might be really old school when it comes to technology, it almost has this sense of being a mainframe. The old idea where you had a mainframe computer that took up a room, and you had terminals throughout the building that all accessed the data on that mainframe. It's just a much more modern,
(06:30):
internet-capable version of that idea.
Kevin Slonka (06:32):
Yeah, the cloud is mainframe 2.0.
Michael Zambotti (06:34):
And you might even say, well, oh, you know what, I've never used the cloud. I just use things like OneDrive on my computer, but I don't use the cloud. Well, you do use the cloud. Many of these services, Google Drive, Microsoft OneDrive, you're on the cloud. Congratulations, you made it.
Kevin Slonka (06:50):
Yeah, do you have an email account? Right, because where's your email stored? It's not on your computer, it's in the cloud somewhere.
Michael Zambotti (06:56):
So, it's something that really impacts pretty much everyone that uses technology.
Drew Thomas (07:00):
And it's, it's a very convenient thing in some ways, because if you are trying to access your email on your phone, and then you want to also be able to access your email on your laptop, and you want to be able to access your email on some other device, the only way to do that is to have that email stored somewhere not on your device, which is in the cloud. And that's how you can reach the same email from multiple devices
(07:21):
in your home.
Kevin Slonka (07:22):
I mean, think of it like your physical mail, you know, where you can only access your physical mail in one place, right, at your house where your mailbox is. But with digital technology, in the cloud, we can access it anywhere.
Michael Zambotti (07:34):
Yeah. Which, which is great. You know, in security, we often will talk about functionality versus security. And as far as functionality goes, that's awesome, that's great technology. But on the security side, there's also some interesting challenges. If you can access your data anywhere, maybe somebody else can as well.
Drew Thomas (07:50):
So, if you're using something that is in the cloud, which it sounds like is pretty much anything, everything, yeah, I mean, everything's out there, right? Is there something you should look for when you're deciding where to set up your email, for example? Because you don't really have any control over where they store your data, correct? Right.
Kevin Slonka (08:04):
Yeah, you can't say, I want my data here or there. Yeah, so, I mean, go back and listen to previous episodes where we talked about passwords. You know, that's the first great step: if your data is somewhere else, it had better be behind a good password, so people can't break into it. But also, if you're looking, you know, to use a new file storage service, or whatever, do some research on the company first.
(08:27):
Google the company name and see where it's based out of. Would you rather use a company that is based out of California, or a company that is based out of Denmark? Not saying there's anything bad about Denmark, but that's a different country, different laws.
Drew Thomas (08:40):
Something is rotten
there.
Kevin Slonka (08:43):
Do you want your data there? You know, I don't know. But that's a choice you have to make. And you have to be cognizant of the companies who own the services you're using.
Michael Zambotti (08:53):
Interestingly, also, probably not a road we want to go down, but Europe has different privacy laws. So, if your data is stored in Europe, or you're working with people that are in Europe, you know, maybe a topic for another episode, but it's something that a business might think of. As an individual, you're probably not going to think of that too closely. You're not going to be too concerned about that.
Kevin Slonka (09:10):
Yeah, here, here's one thing that I can almost guarantee is going to apply to everybody: TikTok. We were talking about this off-mic before, but with TikTok, do you know who owns TikTok? It's, it's a Chinese company. So, when you use TikTok, and you make an account there, you've created a
(09:32):
username and a password that China has access to. Do you think that's okay? All of the stuff you're looking for on TikTok, all of your usage on TikTok, is on Chinese servers now. Do you think that's okay? And you're using an app that you've installed on your phone that was developed by China. Do you think that's okay? Could that not have malicious code in it? We would like to think not,
(09:56):
but you're essentially giving, we talked on a previous episode about how China has a whole portion of their military whose job is to hack other countries, you're giving them free rein to your cell phone if you have an account on TikTok and you have the app installed on your phone. You know, you're essentially just opening up the door to them. So, looking into companies that make apps on your phones is
(10:22):
also as important as, like, Dropbox or file storage services. You really want to know where your data is going.
Michael Zambotti (10:28):
Well, years ago, also, maybe even during the Cold War era, whenever Russia wanted to spy on the US, they would actually physically send a spy to the US, who would get a job here, integrate into society, and send back reports about what was happening in the United States. That's how they knew what was going on here. Now, you know, like Kevin mentioned with TikTok, the content is also going directly
(10:49):
to these other countries. It's giving the other country, possibly an adversary, really, really deep insights into what we do here every day, you know, and what our people are doing.
Kevin Slonka (10:59):
And that's a good word that you just used, adversary. A lot of people don't hear that unless you're, like, in the federal world. But just to make it clear to people, the two largest enemies of our country, from a political perspective, are Russia and China. So, when we talk about, you know, do you want your data being in Russia? Do you want your data being in China? We're saying that as a bad thing. Like, you don't want that. Because those are the two
(11:22):
countries that are constantly trying to attack us and steal things from the United States.
Michael Zambotti (11:29):
So, it's almost like we've given them the ability to do espionage without even trying. Yeah, we're providing the information for them, let alone,
Kevin Slonka (11:37):
yeah, they don't need to send a spy anymore, right. We're just giving it to them.
Michael Zambotti (11:39):
Right. We're feeding them that information. So, you know, we have touched upon artificial intelligence, and we'll talk about that on future episodes as well, which is a topic that's hot in the news. For artificial intelligence to work, you know, at its core, it has to learn from something. So, there's thinking that the owners of TikTok are using all these videos, the millions
(12:00):
of hours of videos, to teach artificial intelligence, to learn about US customs, our society, how things work here, what our people do, how we speak. And so, whenever we talk about possibly deep fake videos, and let me get my tinfoil hat back on.
Kevin Slonka (12:16):
We never took it
off.
Michael Zambotti (12:17):
Right, exactly. But, you know, we think about the emergence of deep fake videos, where a video might show you saying something that you never said. Maybe it was born out of a TikTok video that you made a couple of years ago. So, not to, not to scare anybody, again, the tinfoil hat is firmly on my head, but it is something that's
(12:37):
possible, something that we need to think about as people that are trying to defend our, our nation and our data.
Drew Thomas (12:43):
Well, again, in a previous episode, you know, we talked about some of these different scams, right? And we said that one of the biggest things that you can do to help protect yourself is to simply not give the keys to the kingdom to the person that's trying to steal your stuff, right? I mean, speaking from a bank perspective, you know, we can put as many firewalls and protections around our servers and information as you like. But no matter how much we defend
(13:06):
that data, if you, as an individual, give the keys to your data to someone else through some sort of a scam, there's nothing anybody can do about that. Right?
Kevin Slonka (13:17):
If you put your debit card numbers out on joestoyshop.com, yeah, you get what you get.
Drew Thomas (13:22):
So, in a way, this sounds like the same thing. You know, it's exponential. I mean, the amount of data that we're exporting out to the internet, in terms of YouTube videos, TikTok videos, social media, is just absolutely amazing. And you're just handing that information to somebody else.
Michael Zambotti (13:38):
I've seen statistics, and it's probably hard to actually quantify, but we create now, as a society, more data in a day than was created in, like, decades at a time before the internet was born.
Drew Thomas (13:50):
So, we were talking about the cloud being sort of in different places around the world, right. And I think that leads to a conversation about this thing that, that people are hearing now called VPN. This idea that if you're on a VPN, you're protected. If you're on a VPN, magic, you can pretend that you're in a different place, you can pretend that you're in a different part of the world, you can, you can somehow protect
(14:10):
yourself. So, let's, let's talk a little bit about what a VPN is and what it really does.
Kevin Slonka (14:15):
Yeah, so you brought up two points there. One is that you're safe, and two is that you can pretend you're in a different part of the world. So, point number two is true, VPNs can do that. Do they make you safe? Depends on how you use them. So, it's important for people to know this, because you all listening have probably seen the commercials on TV. You know, there's one company specifically, NordVPN, that has
(14:38):
commercials during everything. So, you've probably seen that commercial where they tell you, you know, install this NordVPN service, and your internet browsing will be safe, everything will be private. Magically, you know, everything's awesome. Yeah. So, so what is a VPN? I think the easiest way to explain it is, is to give the example of, like, an employee who works from home.
(14:59):
So, if you have a job and you're in your office, you're actually at the company's office, you can access all the stuff that is in that building, the servers that are in that building, your files are in that building. And you can only access it when you're in that building. Well, with people working from home, companies need to give a way for people to access those things that are only in that building while they are at home. So,
(15:21):
enter the concept of a VPN, which basically makes this secure connection, this secure tunnel, between your computer, wherever you are, and your company's building. So, it makes it look like you're in the building physically, even though you can be anywhere. But the key of the VPN is it's encrypted. So, if anybody tries to spy on your traffic, your work-related
(15:44):
traffic, they're not going to be able to read it. It's all scrambled, it's encrypted. So, NordVPN is trying to sell you this tool that kind of started off as, as a business tool to allow people to work from home, as a way to say that, you know, we are sending your data through our encrypted tunnel, and therefore your data is private and safe. And it's true to a
(16:07):
point. But the, the key question you have to ask yourself is, who are you trying to hide your data from? So, when you browse the internet, what you type in, say you're logging into a website, that goes out across the internet, and it goes through your internet service provider. You know, around this area, you may have Comcast, you may have
(16:28):
Breezeline, whoever you have, so your data is going through their servers. Technically, you could say they could see your data, because it goes through their servers. And then it goes through however many other servers until it gets to the final destination. So, if you didn't want your internet service provider to be able to see your data, you could use a VPN, and then it would be encrypted, and they wouldn't be
(16:50):
able to see your data. NordVPN, that's, that's what they're saying, is that if you use us, your ISP won't be able to see your data, your service provider can't see it.
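Kevin's tunnel idea, and the catch he raises next, that whoever sits at the far end of the tunnel can still read everything, can be sketched with a toy cipher. This is purely illustrative (a simple XOR keystream, not real cryptography; actual VPNs use protocols like TLS or WireGuard), and the key and message below are made up:

```python
# Toy illustration only (XOR keystream, NOT real cryptography):
# your ISP sees only scrambled bytes inside the tunnel, but the
# VPN provider, which holds the shared key, can read everything.
from itertools import cycle

def xor_tunnel(data: bytes, key: bytes) -> bytes:
    """Scramble (or unscramble) data with a shared key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"shared-secret"
message = b"login: drew / hunter2"
on_the_wire = xor_tunnel(message, key)      # what the ISP sees: gibberish
at_provider = xor_tunnel(on_the_wire, key)  # the VPN endpoint undoes it
print(at_provider == message)  # True
```

The same operation scrambles and unscrambles, which is the point: encryption hides the traffic from everyone in the middle, but not from the party holding the key at the other end.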
Drew Thomas (17:00):
And so what's the
advantage to that?
Kevin Slonka (17:02):
Exactly, yeah. And there really is no advantage from that specific argument unless you think your ISP is spying on you, and you don't want them to see your stuff. Because the, the point that nobody ever thinks of, and the point that they don't say in the commercials, is that NordVPN can see your data, because the other end of that encrypted
(17:22):
tunnel is coming out on NordVPN's network. Do you know who NordVPN is? Who works there? What country they're in? Where their servers are? Like, you know nothing about this company, but yet you're paying them and using their VPN service, and you're giving all your data to them. So, yes, it allows you to make your data private and to hide it from certain people. But you're also now exposing it to other
(17:43):
people who you may not want to have access to your data.
Michael Zambotti (17:49):
As Kevin mentioned, NordVPN is a paid service, so that's bad enough, you are actually paying for the service, and you're, you're getting it, and ostensibly, they're hopefully not looking at your traffic. There are a host of free VPNs, quote, air, I'm gonna borrow Kevin's air quotes, the free VPNs, which...
Drew Thomas (18:05):
Audio only is
rough.
Michael Zambotti (18:07):
Exactly. These are even worse. If you see a free VPN, building a virtual private network infrastructure is expensive. So, why would a company give away that product for free? You know, one of my favorite sayings is, if you're not paying for the product, then you are the product. There have been cases where free VPNs were actually injecting
(18:28):
advertisements into your, your browsing experience. And in some cases, actually just snooping on your traffic, which they can see because it's going through their servers. Yes, it is encrypted, you have an encrypted connection with them, with the provider of the virtual private network, so they can see all your traffic. And in those cases, you know, I would say, stay away from generalities, or absolutes, but I would say almost
(18:51):
absolutely stay away from any free VPNs.
Kevin Slonka (18:54):
Yeah, and if you really are concerned about people seeing your web browsing traffic, the best thing you can do is not go out and buy a VPN, it's make sure that every website you browse is encrypted in and of itself. And that's very easy to see, because most web browsers, in the address bar, will show you a little padlock to let you know that that site
(19:15):
is encrypted. Or if there is no padlock, you can look at the URL and see that it starts with https. If you see that letter "s", that's telling you it is secured, it is encrypted. So, as long as you make sure that the websites you're browsing and giving your personal info to are encrypted, you don't need a VPN,
(19:36):
because your data is already encrypted within that browsing session. So, just make sure that the sites you're browsing have that HTTPS, that they are encrypted, and that is technically good enough.
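Kevin's padlock test can be made concrete with a minimal sketch in Python. The URLs here are hypothetical examples, and this only checks whether an address claims to use HTTPS; it says nothing about who actually runs the site:

```python
# Minimal sketch: does a URL even claim to be encrypted, i.e.,
# does it use the https scheme? Example URLs are hypothetical.
from urllib.parse import urlparse

def is_encrypted(url: str) -> bool:
    """True only when the URL's scheme is https."""
    return urlparse(url).scheme == "https"

print(is_encrypted("https://example.com/login"))  # True
print(is_encrypted("http://example.com/login"))   # False
```

As the conversation goes on to explain, passing this check is necessary but not sufficient: a phishing site can be served over HTTPS too.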
Michael Zambotti (19:46):
Now, also, one thing to be aware of with the encryption: you need to look for that lock. Some browsers will have the little green lock, or just the closed lock. Yes, you have an encrypted connection with that site. But some attackers, and we did talk about this a couple episodes ago with the phishing emails and landing pages, where if you click on a link, it goes to a rogue landing page. It might look like you're signing into a certain resource, like Amazon, but it's actually
(20:08):
controlled by the attacker. And attackers have gotten smart. They actually will purchase what's called a certificate, so their website will also show as encrypted. And yes, while you do have an encrypted connection with that website, unfortunately, that is a malicious website. So, if you see the lock, it doesn't guarantee your safety.
Kevin Slonka (20:28):
Or that that website is the website you intended to go to. Yeah, you know, somebody didn't swap it out on you.
Michael Zambotti (20:37):
Right. So, the encrypted traffic, you have this encrypted connection with the, the attacker. Congratulations, no one else can see that correspondence except you and the attacker, which is, you know, which is great. Except for, you know, you just gave your credentials to somebody. It's relying on one thing versus looking at a couple different factors.
Drew Thomas (20:59):
So, we're talking a lot about things, we keep mentioning traffic, internet traffic, back and forth. When we talk about internet traffic, we're talking about data being sent out from your devices and data being received by your devices, right? Sure. So, and when we talk about devices, we tend to think of things like our cell phone, our laptop. What we
(21:19):
don't always think about, I think in today's world especially, is the fact that there are so many other things in our homes that are connected to the internet that are sending and receiving traffic all the time. And they call that the, that the Internet of Things, that IoT, right? So, you know, is there a danger? Maybe danger is the right word, maybe it's not. Is there a danger to connecting my, my
(21:42):
refrigerator to my network so that I can monitor the temperature of the freezer? Let's talk a little about that.
Kevin Slonka (21:48):
Yes, throw them
all away.
Michael Zambotti (21:51):
I saw an interesting story about what's called a smart refrigerator that was connected to the internet; it would send you an alert if the door was opened. So, refrigerator door's ajar, it would say, hey, the door's ajar. So, the next thing was, well, if it was that smart, why not just shut the door? Why tell me?
Kevin Slonka (22:07):
But yeah, that's, I mean, you bring up a good point, that anything we buy that has the word smart in it, a smart whatever, the word smart basically tells you it's connecting to the internet, right? You have to configure it for your Wi-Fi, or plug it into a network cable somehow. But, and we have mentioned this in a previous episode as well, all of these devices are computers. Like, yeah, our phone is a computer; if you have a smart
(22:30):
refrigerator, there is a computer inside of your refrigerator, just the same as your laptop. And literally, it is a computer. And a lot of these devices, what we see is that the people, the manufacturers who are making them, I'm gonna say this, they basically don't care about security. They never
(22:50):
tested for security, they tested for functionality, like Mike had talked about before. Does it operate as a refrigerator? Does it alert you when you're low on milk? Does it do all the refrigerator things? But because it's a refrigerator, nobody ever thinks to protect it as if it were a computer. So, having all of those devices on your network that were never tested for cybersecurity vulnerabilities, that just opens up what we call
(23:13):
the attack vector, the ways that a hacker can break into your home network. And once they get inside your house, what do they have access to, right? Your personal laptop that has your credit card data on it, your personal phone, you know, whatever is in your house connected to your home Wi-Fi, which is everything, right? It's everything.
Drew Thomas (23:32):
So, I think that, that brings up a good point, that having a secure home Wi-Fi is important. I mean, when you buy a router at your local electronics store, and you take it home, and it has a default password of admin.
Kevin Slonka (23:47):
Never changed.
Drew Thomas (23:50):
Easy to remember. I mean, somebody could literally be sitting in a car parked out front of your house, looking completely innocuous, and be checking to see what Wi-Fi networks they can reach just from sitting out on the street. And if you haven't changed your password, they now have access to everything, you know, all that traffic on your network and inside your home.
Kevin Slonka (24:11):
Yeah, this is hacking 101. That is literally the first thing I teach my students: when you're trying to break into a device, try default passwords. Yeah, try admin admin, try admin password. Try all the default things that companies might put on their devices first, because chances are somebody didn't change it.
Michael Zambotti (24:28):
Well, what I do is I change my passwords to "incorrect," so that if I type the wrong thing, it says, your password is incorrect.
Kevin Slonka (24:34):
It is not a real
thing. Don't do that.
Drew Thomas (24:36):
That's a great joke. Don't do that. And yes, also, for the longest time, the most common password was, what, 123456 or something?
Kevin Slonka (24:44):
Yeah, it still is.
Michael Zambotti (24:46):
Yeah, the list of top 10 passwords has unfortunately been the same: 123456, password, password with the "@" sign as the "a", you know, "$" signs as the "s". But consistently, and that's what attackers will do, like Kevin was saying, hacking 101, or ethical hacking 101: if you're trying a password, you try the simple ones first. And, you
(25:06):
know, if you try 100 people, maybe one or two you get, and that's all you need. You have the opportunity then.
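Mike's point about the top-10 lists can be sketched in a few lines of Python. The short list below is illustrative only, a small sample of entries that recur on published worst-password lists, not any complete list:

```python
# Illustrative sketch: a tiny "is this password on the common
# list?" check. Real checkers compare against millions of leaked
# passwords; attackers genuinely start with a handful like these.
COMMON_PASSWORDS = {
    "123456", "123456789", "password", "p@ssword",
    "pa$$word", "qwerty", "admin",
}

def is_too_common(password: str) -> bool:
    """Flag a password that attackers would try first."""
    return password.lower() in COMMON_PASSWORDS

print(is_too_common("123456"))    # True
print(is_too_common("P@ssword"))  # True
```

Lowercasing before the lookup catches the common trick of capitalizing one letter, which, as the hosts note, does not make a common password meaningfully stronger.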
Drew Thomas (25:11):
Stealing from Spaceballs, the movie, that's the kind of password that somebody has on their luggage. Yeah.
Kevin Slonka (25:19):
It's true, though. I mean, who really changes the default password on their router? Most people probably don't even know you can. They just get the router from their ISP, plug it in, they have Wi-Fi, they're happy.
Michael Zambotti (25:31):
They want to get up and running. Yeah. They don't necessarily want to change the password. Or they say, well, what if I forget it? You know, what if, because how often does the average person access their router?
Kevin Slonka (25:41):
Yeah, I mean, after it's originally set up, you probably never need to log in again.
Drew Thomas (25:45):
And that's it, you make a good point, that so many people get their routers from their ISP, they don't even buy their own anymore, right. They're using one provided for them by their internet service provider. And that ISP has a default password that they use, that you can change, but most people just allow that person to set up their network, and then, as long as they can get online, that's good.
Michael Zambotti (26:04):
That's the goal. All right, yeah. And, you know, we look at these Internet of Things devices, and it is really a two-edged sword, because, you know, I can see the use cases. I was at a party, and somebody was, the woman looked at her phone, and she's like, I'm gonna feed my cats now. So, she logged into the cat feeder back at her apartment, and was actually feeding, I was like, wow, that's pretty lazy. But it used to be, you'd have just a friend go over to your house and maybe toss some food out for the cats.
Drew Thomas (26:06):
And if I can borrow your tinfoil hat, I think we're all wearing them at this point.
Michael Zambotti (26:29):
Yeah, but there's functionality, and people like that technology, people, they like, want to interact with technology. But like Kevin said, these Internet of Things companies, sometimes, especially the drone companies, you know, a lot of them, the drones are manufactured in China, they don't think about security, or they're actively looking for ways to sabotage and gain passwords or gain access. So,
(26:50):
you know, the functionality is there on one hand, but on the other hand, we do have scenarios where, hey, they're not thinking about our security, we need to think about that.
Drew Thomas (27:17):
It really makes you wonder, if these drone companies are not somehow sending that visual data back to a server somewhere, and getting, I mean, they can get good maps of neighborhoods, they can get down to the street level, you know.
Michael Zambotti (27:19):
GPS
coordinates, everything, you
know.
Kevin Slonka (27:21):
Well, you just brought up an interesting point of, you know, spying on that visual data. Do our listeners think that they would ever just willingly give bad guys an open microphone in their house, or an open video camera in their house, so anybody can watch them? But we do this all the time. I see where you're going. Yeah, how many of you out there have an
(27:43):
Amazon Alexa, or, like, a Google Assistant, something that responds to your voice at home? Or how many of you have a small child, and you have a smart baby monitor near the crib that has a video camera? You can do a quick Google search and see that there are people out there who are breaking into these smart video cameras to, like, spy on babies.
(28:04):
And because you can speak to your kid through them, these people are talking to other people's children. Because these are smart devices that are on the internet, and people have found a way to break into them.
Drew Thomas (28:18):
That, that is terrifying.
Kevin Slonka (28:19):
Yeah, think about it, I mean, yeah, you are just giving it to the bad guys. And, you know, what's the number one reason why this is happening? Because those devices have default passwords, and people don't change them.
Michael Zambotti (28:30):
And why do so
many people use these devices?
Because, and it's you know, thefunctionality versus security
debate. They are functional,they do have a purpose that is
making people's lives easier,right. But if we only think
about the function, that's onlyone side of the coin. Also that
security side. Yeah, I've seenstories to Kevin, like you
mentioned about the babymonitors, strangers talking to
your child, which to me is justthe height of creepiness, in
(28:54):
your own home, where you would, you expect to be in a safe
environment.
Drew Thomas (28:57):
It makes you
wonder, what would possess
people to do this? And I think sometimes it's simply the
challenge. I think certain people find, you know, they go, I
just want to be able to see if I can do this, and they do it.
That's creepy enough. But for, for somebody that's also then
using it for a nefarious purpose, you know, to be able to
tell when you're at home, where you might be located in your
(29:17):
home or something like that.
That is, that's really scary.
Kevin Slonka (29:19):
Yeah, especially
if you have, you know, smart
security cameras. So, not just baby monitor cameras, but
security cameras around your house. If somebody can access
that, they might be able to record the footage of you in a
compromising situation, and post it online to blackmail you. So,
you don't want to be giving people free rein to see inside
your house. Right? With the Amazon Alexas, the same thing
(29:41):
with listening inside your house. Like have you ever really
thought how a Google Assistant can respond when you say, okay,
Google? It has to be listening 24/7, that microphone has to be
on 24/7 so that it can hear you say okay, Google. What is it
capturing and recording in all of that downtime when you're not
(30:03):
saying, okay, Google, while it is waiting for those trigger
words, and where is that data going?
Drew Thomas (30:08):
Yeah, and this is,
there's another guest that we
would like to have on in the works, with the gentleman that does
look at this from like a legal standpoint. And I would love to
sort of have this conversation with him here on a future
episode to talk about the legalities that, that are
involved in some of this, because there have been legal
filings, suits and things like that, that get filed against
(30:29):
some of these people to prove that the data that they're
collecting, when you're not actively using the device, is not
being used for a purpose not intended. Whether or not that's
ever been completely proven, or anything else, is anybody's
guess. And to the point that was made earlier, and I can't
remember which one of you made it, that there are functional
(30:49):
uses for these things. I mean, it's, we're not sitting here
saying that, you know, you should just strip every piece of
technology out of your home and never ever, ever use them.
Although Kevin might be.
Kevin Slonka (31:00):
I do have some
smart devices in my home.
Drew Thomas (31:03):
But, if you are
going to use them, to use them
as safely as feasible. And that means doing the basic stuff,
like changing passwords, when you receive devices, you know,
initially right out of the box, because, and correct me if I'm
wrong, I would have to think that if someone is trying to
access your home network, they're not going to waste a lot
(31:24):
of time trying to break into a network that has a changed
password, if they can also find one, a block away that isn't
protected, right?
Michael Zambotti (31:32):
Most attackers
are opportunistic, they're going
to go for the low hanging fruit.
They're going to try yours, they're going to try, it's
almost like if you broke into a hotel, and hopefully none of our
listeners are breaking into hotels, but if you did, you're
going to run down the hall and check every single door. Hey,
you found one that was open, it doesn't matter whose room it
was, it just matters that you found an open room, that's where
you're going to explore. So, you're an opportunistic
attacker. And generally the majority of cyber
(31:54):
criminals will have that opportunistic mentality. Some
are motivated, where they're gonna go after a specific
target, we might see that with maybe a state-sponsored
espionage-based cybercriminal group. They want to go after,
they want to get the plans for a jet, fighter jet. Well, I don't
have the plans for a fighter jet, but Lockheed Martin does.
So, that's a motivated attacker, they're going after that
(32:17):
specific target. That's in the minority, most attackers will be
opportunistic.
Kevin Slonka (32:22):
Yeah, this
actually happened, I can give an
example from a company that I previously worked for. There was
a vulnerability that got publicized with Microsoft
Exchange, which is the Microsoft email server that companies can,
can run on their corporate servers. And, you know, our
(32:42):
adversaries found out about this. And what they did was
basically exactly what Mike said, they did kind of, they
took an opportunistic take to it. And they just launched that
attack against the entire internet, basically, just to see
who they could break into using this new vulnerability. But they
weren't looking for money, or, you know, just to take what they
(33:04):
could, they were looking for specifically, like government
type data. So, we had found that one of the clients that we
managed was breached because of this vulnerability. So, as we
were investigating, we realized that they breached initially and
then stopped. They didn't move on to phase two of the breach.
(33:25):
And the reason that the attackers probably did it is
because this company wasn't a government contractor, they had
no data of interest. So, yeah, like Mike said, you know, you,
you could be hacked right now, every one of us could be hacked
right now and not know it.
There's always the chance that they just steal whatever they
could steal, but there's also the chance that they didn't find
what they were looking for so they move on.
Drew Thomas (33:48):
Something you said
about being opportunistic, and
then you also said about the hotel example you gave me, I
wanted to circle back to the idea of public Wi Fi. And the
fact that, you know, we're talking about protecting your
home Wi Fi and having passwords and so forth, but, you know,
there's a lot of businesses, a lot of hotels, coffee shops,
whatever, that, you know, say free Wi Fi, you know, come sit
down, open up your laptop, and you know, work here while you
(34:11):
eat, drink, whatever. But is that ever a good idea?
Michael Zambotti (34:15):
Public Wi Fi
is something to be aware of, and
yeah, it's easy. Hey, you go to the coffee shop, and you go to
Starbucks, and it says Starbucks free Wi Fi. I have a device
called a Wi Fi pineapple, okay, it's something that anybody can
purchase. You can set up what's called a rogue access point, a
rogue access point, which means I can sit at Starbucks and
create an access point that says Starbucks high speed. Okay, and
(34:39):
just sit there. Anybody that connects to that access point, I
will see all the traffic. So, whatever it is, if they type
passwords, I will see the traffic. They will have no idea
because they will also connect to the broader internet, they'll
get the traffic that they were expecting. They want to go to
their bank, type in the credentials, they're not going
to see that I was able to capture those credentials. So,
this is something that can happen whenever you're using
(35:00):
public Wi Fi, you don't even know if you're connecting to the
resource that you think you are.
It could, you know, you can make a Wi Fi access point called
anything once you have the proper hardware.
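The rogue access point Mike describes can be made concrete with a short sketch. A minimal Python example (not from the episode; the scan results, SSIDs, and hardware addresses are all made up) that flags one classic "evil twin" warning sign, the same network name advertised from more than one hardware address. Legitimate multi-AP networks can look like this too, so it's a prompt for caution, not proof of an attack:

```python
from collections import defaultdict

# Simulated Wi-Fi scan results: (network name, hardware address) pairs,
# roughly what a laptop sees when it lists nearby networks.
scan = [
    ("CoffeeShop-WiFi", "aa:bb:cc:00:00:01"),
    ("CoffeeShop-WiFi", "de:ad:be:ef:00:02"),  # same name, different radio
    ("HomeNet", "11:22:33:44:55:66"),
]

def duplicate_ssids(results):
    """Return network names advertised by more than one hardware address."""
    seen = defaultdict(set)
    for ssid, bssid in results:
        seen[ssid].add(bssid)
    return {ssid for ssid, bssids in seen.items() if len(bssids) > 1}

print(duplicate_ssids(scan))  # {'CoffeeShop-WiFi'}
```

The point of the exercise: your device picks a network by its name, and anyone with cheap hardware can broadcast any name they like.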
Kevin Slonka (35:10):
This actually
happened, I watched a YouTube
video of an ethical hacker who was demonstrating this. And he
went to a hotel, I think it was a Marriott, and the name of
the Wi Fi, the real Wi Fi network, was just Marriott. But
he sat out by the pool and set up his own Wi Fi access point,
and he called it Marriott pool.
Sounds logical, right? If you're out by the pool, that's probably
(35:33):
the one you want to connect to, right? If you're a guest, you're
not even thinking twice, right?
You're connecting right to it.
And what he would do is he would watch people's traffic, figure
out who they were, and then walk over and tell them, hey, look
what you just did. I just saw all your stuff. You should
probably be more careful. Yeah.
Drew Thomas (35:50):
We talked about
like being able to review that traffic,
right? Do you have to have specialized software to
understand what that traffic is?
Or is it just a bunch of computerized ones and zeros
streaming past? And it's not the Matrix, right? I mean, you're
not.
Kevin Slonka (36:04):
I mean, it is the
Matrix. And yeah, you do need
specialized software. But it's not like you have to pay for it.
You know, all of these tools are free. Anybody with a couple of
quick Google searches can get software to be able to read that
traffic, it's not difficult.
Michael Zambotti (36:18):
And things
like YouTube are excellent for
learning, you can learn so many different things on YouTube. You
can also learn how to do things like malicious activities, like
setting up a rogue access point.
If you want to learn those skills, you know, the tools are
cheap, or free. And you can go on YouTube and find out exactly
how to do it, it will take you maybe a couple hours to get up
to speed. So, it's not something like you have to have a
tremendous amount of hacking ability or computer ability. You
(36:40):
can get up to speed pretty quickly, and do some malicious
things. So, not to scare anybody, but public Wi Fi is probably
something that I would avoid, unless you, you know, unless it
was an emergency, I don't know if there's emergencies where you
need to have Wi Fi. But an alternative would be
using your phone as a hotspot.
So, you have control over the Wi Fi, or purchasing a
(37:04):
hotspot, or maybe just saving your net activity for whenever
you're in a Wi Fi that you control.
Kevin Slonka (37:09):
Yeah, if you
absolutely have to use public Wi
Fi, probably the only thing we could say is, at least use
public Wi Fi where they make you put in a password to connect to
it, because at least then your wireless connection is
encrypted. If you're connecting to one that doesn't require a
password to connect, then that is plain text. And anybody could
(37:30):
be sitting there like Mike with his pineapple sniffing that
traffic and reading your passwords as you send them. So,
you know, at a minimum, make sure you're at least connecting
to one that requires a password.
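Kevin's plain-text point can be shown directly. A minimal Python sketch (not from the episode; the captured bytes are simulated, and the hostname and credentials are invented for illustration) of what a sniffer on an open, passwordless network sees when someone submits an unencrypted login form:

```python
from urllib.parse import parse_qs

# Simulated bytes of an HTTP POST as they travel over an OPEN Wi-Fi
# network: no link-layer encryption, so anyone nearby with free
# sniffing software receives exactly these bytes.
captured = (
    b"POST /login HTTP/1.1\r\n"
    b"Host: example-bank.test\r\n"
    b"Content-Type: application/x-www-form-urlencoded\r\n"
    b"\r\n"
    b"username=drew&password=hunter2"
)

# The body sits after the blank line separating headers from content.
body = captured.split(b"\r\n\r\n", 1)[1].decode()
creds = parse_qs(body)
print(creds["username"][0], creds["password"][0])  # drew hunter2
```

On a password-protected (WPA2/WPA3) network those same bytes are encrypted on the air, and a site using HTTPS encrypts the body end-to-end regardless of the Wi-Fi, which is why both layers matter.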
Drew Thomas (37:40):
But going back to
your previous point, then you're
sharing your data with someone, you're sharing your data
with that company, theoretically.
Kevin Slonka (37:48):
You have to trust
in Starbucks, if you're sitting
at Starbucks, that they're not reading your data. Yeah, this is
probably the one case where a VPN might be a good idea. If you
have a legitimate VPN, that is not some, you know, random
garbage one that's also stealing your data.
Drew Thomas (38:04):
So, people that
travel extensively, things like
that, and maybe, maybe, maybe that's...
Michael Zambotti (38:08):
Maybe invest
in a hotspot, yeah, you know a
lot of the cell providers, you can purchase a separate hotspot,
or use your phone as well.
Kevin Slonka (38:15):
And it's probably
only what, $10 more a month
to turn on the hotspot feature for most companies, that's
definitely a good thing to do, right.
Michael Zambotti (38:21):
But it comes
down to your particular use
cases. If you are traveling a lot, maybe you do want to have
internet. So, we can't go 100% on the security side, we do need
some functionality.
Drew Thomas (38:30):
And I think that's
one of the reasons why we're
doing all these episodes, and, you know, why we've done the
ones in the past that we've, that we've already released.
Because we live in an era of technology, unless you really
want to live off the grid, and just shun almost all of modern
(38:50):
society, it's almost impossible to not interact with technology
on some level. And the trick is trying to do it as safely as you
can. Which is not to say that it's 100% safe no matter what
you do. But then again, getting in your car in the morning and
driving to work is not 100% safe, it's an acceptable level
of risk that you take when you drive to work every morning
because you feel that the benefit is going to outweigh the
(39:11):
risk. And that's kind of where you are with these pieces of
technology, the Internet of Things, the voice assistants,
whatever, you have to personally be comfortable with whatever
level of risk you're willing to take for the amount of benefit
you're willing to receive.
Michael Zambotti (39:27):
What we need to do is
consider the threat, you can
never eliminate risk. You want to get in a car, there is a
nonzero probability you can be in a car accident, but you can
also put into play controls. You put controls in place, you have
seatbelts, you have safe driving processes, you have airbags, all
(39:50):
those things, there will always be a residual risk. So, we're
not saying hey, don't use technology. We're saying let's
use it smart. Let's use it in a way that we're aware of the
threats and we're putting controls in place, you know. And
one thing I did want to mention, you know, going back to our
Internet of Things discussion.
We were talking about inside of the house. To open up another
can of worms, what about things like Ring doorbells and outside
of the house? You know, I see, you walk down the street in the
(40:12):
city, everyone has a Ring doorbell. It's almost like
you're on camera, if you're walking around.
Drew Thomas (40:17):
Oh, it's, it's to
the point where, you know, you
see that all the time, even on TV. Law enforcement will go
around a neighborhood that has had some sort of an event, you
know, people breaking into cars or doing something. And
they'll go to the neighbors, and if they notice they have a
Ring doorbell, they'll say, you know, can, can we look at your
Ring doorbell footage, you know?
Michael Zambotti (40:33):
Well, one step
beyond that, they'll actually
subpoena Ring. They'll subpoena them and actually get the
footage. So, you have no say.
Kevin Slonka (40:40):
Yeah, you actually
agree to that whenever you set
up and log into your Ring doorbell, for that, uh, that end
user license agreement that nobody ever reads for all of
their stuff. There's a little line in there that says, by
using this doorbell, you agree that, you know, Ring has access
to all of your videos and can provide it to law enforcement at
any time. So, you don't have a say about it.
Drew Thomas (41:01):
That's, we, we were
again, this is one of those
things we were talking about it before we started recording
today, but it comes up, you know, and it goes to risk
appetite, I guess too, I like that term as well. If you want
to use these things, sometimes you don't have a choice, right?
We were talking about, like captcha, and things like that,
like, you know, there's a, there's a functionality that may
(41:21):
be beyond the obvious for some of these things. But if you want
access to that site, you don't have a choice, you know, you got
to complete the captcha.
Kevin Slonka (41:28):
The only choice is
not to use it.
Drew Thomas (41:30):
Yeah, you know, and
the same thing, you know, if you
want a Ring doorbell as an example, you have to sign that
license agreement, or you can't use it. So, again, it's one of
those things where you should read those things. Right?
Michael Zambotti (41:40):
That's the
thing, though, as consumers, we
are kind of behind the eight ball. If we read it, and there's
something we don't agree with in there, we can't cross it out. Do
you want to use this product or not? Okay, well, are you not
going to use this product because of one line in the User
Agreement. That's, you know, most people are not going to
make that decision. Most people are not going to read it. I have
(42:01):
heard stories of companies that will put easter eggs in their
end user license agreements, where it will say if you read
this line, email this, and you get something. One company was
giving out, I think $1,000 or $2,000. And it took like six
months before somebody claimed it, it was a long time. But it
just shows, hey, in classes, I've done that in my syllabus. I
have put in there, if you read this line, email me, and you get
(42:23):
five extra credit points.
Generally 20% of students will email me. So, even in a college
course, a syllabus, that's only a couple pages, we're talking
end user license agreements that are hundreds of pages in, what
is it, like two-point font? Yeah. I mean, yeah. And
it's in legalese. All the sentences are, you know, a
paragraph long and you read the first line, and you're like,
this is not happening today.
Drew Thomas (42:45):
Yeah, it's a
strange world, I mean, that we
that we live in. We've talked about, you know, just that
exposure that you're, you're putting yourself out there for
even when it comes to using email or doing the basic stuff,
like being able to send and receive text messages and stuff.
I mean, that has a risk. We talked about, you know, not
clicking on links, things like that in previous episodes, but
theoretically, just owning a cell phone has a certain level
(43:06):
of risk, because it has to connect. I mean, I don't know
that you can buy a cell phone these days without having a data
plan, you know, without having some internet connection on that
phone.
Kevin Slonka (43:17):
Well, and the way
that the cell companies are
going now is that all voice will be done over data in the future.
So, you really don't have the option anymore. Yeah.
Michael Zambotti (43:27):
Right. And we
think about it, you know, in the
context of us purchasing technology and using it, you
know, our Alexas or our Ring doorbells. But what if we don't
even have one? What if our neighbor has a Ring doorbell,
we've never even agreed to anything. But every time we go
out of our house, we're on camera. Our neighbor can see us
and what can we do? You know, what's our defense against that?
Yeah, really none. Move, I guess.
Kevin Slonka (43:49):
There's no right
to privacy in public.
Michael Zambotti (43:51):
That cabin in
the woods that we are, we aspire
to live in, covered in tinfoil.
Yeah.
Drew Thomas (43:58):
All right. Well,
guys, I mean, I, once again,
lots of great information and information that we could go
into deeper detail on, but in the spirit of trying to keep
things from becoming too overwhelming for our listeners,
we'll try to wrap things up here. So, I think the, the
takeaways from our discussion today really are, understand when
(44:18):
you adopt some of this technology or agree to use some
of this technology, you may be agreeing to some
things you don't necessarily know about. And when you get these
things at home, you know, change the password. I mean...
Michael Zambotti (44:29):
Yeah, it goes
back again to basic blocking and
tackling. You know, we talked about it a couple episodes ago,
having a different password on every website, changing a
default password on an Internet of Things device. These are not
things that, you know, are super complex, but they can really
improve your, your personal security posture.
Drew Thomas (44:46):
All right. Well,
thank you very much for once
again, having a great discussion. We appreciate it.
We'd love to have you back and do some, some additional
discussions on these things.
Because again, I mean, we, my gosh, we could talk about all
kinds of different stuff that we haven't even, there are words we
haven't even uttered at this point that we can do entire
episodes on.
Kevin Slonka (45:05):
I'll be interested
to see the comments that you
get, you know, yelling at me for telling people to stop using
devices or not.
Michael Zambotti (45:11):
Well they
can't comment because they've
gotten rid of their computers.
That's right.
Drew Thomas (45:14):
What I mean, in all
seriousness, though, we
definitely encourage comments, we encourage suggestions, we
would love to hear your questions as well, because, you
know, we do intend to have Kevin and Mike back and, you know,
give us some feedback about what you want to know, give us some
feedback about what you'd like to hear, what you'd like, you
know, some feedback on and some information on that, that
(45:35):
maybe we haven't covered yet. If you haven't had a chance to
listen to the previous episodes on cybersecurity, be sure to go
back and listen to those as well. There's some really,
really great information in there that we did not
necessarily cover in this episode. And we hope to
definitely revisit some of this stuff again, you know, in the
not too distant future. And with that, yeah, thank you very much.
Excellent, thanks. All right.
This podcast focuses on having valuable conversations on
(45:56):
various topics related to banking and financial health.
The podcast is grounded in having open conversations with
professionals and experts with a goal of helping to take some of
the mystery out of financial and related topics, as learning
(46:17):
about financial products and services can help you make more
informed financial decisions.
Please keep in mind that the information contained within
this podcast and any resources available for download from our
website or other resources relating to Bank Chats, is not
intended and should not be understood or interpreted to be
financial advice. The host, guests, and production staff of
Bank Chats expressly recommend that you seek advice from a
(46:39):
trusted financial professional before making financial
decisions. The host of Bank Chats is not an attorney,
accountant, or financial advisor, and the program is simply
intended as one source of information. The podcast is not
a substitute for a financial professional who is aware of the
facts and circumstances of your individual situation. Jason
Dorsey of the Center for Generational Kinetics once
(47:03):
observed that back in the 1990s, a young Generation X was
considered to be the most tech-savvy generation, those most
likely to both be using the newest tech and the ones most
likely to understand how it worked. The people and the
technology grew up together. Gen X is now middle-aged, and the
young Gen Z and older Boomers are finding that they have
(47:25):
something in common, they are more likely to be tech-dependent
rather than tech-savvy. While it isn't necessary to know
everything about how technology works, knowing at least a little
can really help make you more aware of how cybercriminals may
be trying to reach you. Our sincere thanks yet again to
Kevin Slonka and Michael Zambotti for lending us their
time and expertise on the show.
(47:47):
We also welcome your comments and feedback. You can click the
links in the description to get in touch with us. Thanks to our
producer and part-time co-host, Jeff Matevish, for all of his
hard work and dedication.
AmeriServ Presents Bank Chats is produced and distributed by
AmeriServ Financial Incorporated. Music by
Rattlesnake, Millo, and Andrey Kalitkin. If you enjoyed the
show, please consider liking or following the podcast. For now,
I'm Drew Thomas, so long.