
November 24, 2021 · 30 mins

Mary Aiken is a cyberpsychologist and Chair of the Department of Cyberpsychology at Capitol Technology University in Laurel, Maryland. She is a member of the Interpol Global Cybercrime Expert Group.


Want to learn more? Here are links to additional resources Dr. Aiken recommends:


The Cyber Blue Line (Aiken & Amann) https://www.europol.europa.eu/europol-spotlight/europol-spotlight-cyber-blue-line


The Cyber Effect: How Human Behavior Changes Online (Aiken, 2016) https://www.amazon.co.uk/Cyber-Effect-Pioneering-Cyberpsychologist-Explains/dp/0812997859


Manipulating Fast and Slow (Aiken, 2018) https://www.wilsoncenter.org/article/manipulating-fast-and-slow



15 Minutes of Shame, documentary, HBO Max https://www.youtube.com/watch?v=dhJrnNdH-aw



Master of Research (M.Res.) in Cyberpsychology https://www.captechu.edu/degrees-and-programs/masters-degrees/cyberpsychology-mres


Doctorate in Cyberpsychology https://www.captechu.edu/degrees-and-programs/doctoral-degrees/cyberpsychology-phd



Solvable is produced by Jocelyn Frank, research by David Zha, booking by Lisa Dunn. Sachar Mathias is the managing producer and Mia Lobel the executive producer.




Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:15):
Pushkin. This is Solvable. I'm Ronald Young Jr. We have a
whole generation that have grown up as virtual shoplifters, pirating music,
pirating movies. By the time they're in college, they're pirating cracked software.

(00:35):
So for that cohort, the line between right and wrong
becomes a little blurred. I went to college in two
thousand and two. Back then, we were still on BlackPlanet,
using LiveJournal, exchanging AOL screen names, and leaving up
pointed emo song lyrics as a subliminal away message for
the crush who broke our hearts. And I was also

(00:57):
a virtual shoplifter. Before we had music streamers, before Apple
Music and Spotify, we downloaded everything from what were called
peer-to-peer networks: music, movies, that hot new Warcraft
game we wanted to play. And for poor college students,
this was just the exciting new way of experiencing the world.

(01:18):
We didn't see anything wrong with what we were doing.
It was predicted that this year cybercrime would cost the
global economy just over one trillion US dollars. To put
that in perspective, that's like over six million Teslas being stolen,
at one hundred and fifty thousand dollars each, or approximately three
thousand Hope diamonds, valued at three hundred and fifty million dollars each. It's

(01:40):
just about equal to the entire infrastructure bill President Biden
just signed into law. Some estimate that if cybercrime is
not addressed, it could cost the global economy over ten
trillion dollars by twenty twenty five. It would seem that
with every advancement in technology, criminals and opportunists find a
new way to exploit it. The pace of technological innovation

(02:03):
seems to move too fast for regulations to keep up,
and not everyone is on board with all the ways
we could bring order to the madness. Right now, when
I talk to people about surveillance, they get really upset
about this, and they're like, I don't want any
government or any law enforcement agency practicing surveillance because I
don't want to be surveilled. But you are being surveilled

(02:28):
by social technology companies. You are offering everything up. They
can practically predict what colored socks you're going to wear tomorrow.
It's just surveillance by a different entity. Doctor Mary Aiken
is an expert in forensic cyberpsychology, which is the study
of criminal, deviant, and abnormal behavior online. So that means, unfortunately,

(02:51):
she's kept pretty busy. In addition, she's a researcher and
teacher and an academic advisor to Europol's European Cybercrime Centre,
the EC3, and she's a member of the Interpol
Global Cybercrime Expert Group. She's doing a lot. I'm not overwhelmed.
I maintain my sense of humor. I remain optimistic: cybercrime

(03:11):
and online harms are solvable. In two thousand and sixteen,
NATO ratified cyberspace as an environment, acknowledging that the wars
of the future would take place on land, sea, air,
and on computer networks. This is a space, and this

(03:33):
space comes with incredible opportunities but also risks. And I
would argue that cybercrime and online harms are solvable problems
if we understand the cyber behavioral dynamics in this space.
So let's think about it like an iceberg. Whatever search

(03:54):
engine they're using, like Google or Safari or Yahoo, that's
what we call the surface web, and that's between one
and three percent of the Internet. It's the tip of
the iceberg. And then there's the whole rest of the Internet,
which is what we call what lies beneath, and that's

(04:14):
the deep web, and certainly this space helps to facilitate
cybercriminal behavior. And what happens is that human behavior mutates
or changes in this environment. So anonymity is a powerful
psychological driver. In other words, it's a superhuman power. It

(04:38):
is the age-old mythical power of invisibility, and that
comes with tremendous responsibility. So let's take anonymity. Often,
when I debate this topic, people will push back and say, oh, no, no, no,
we can't do anything to change anonymity online because anonymity

(05:04):
is a basic, fundamental human right. No, it is not.
And yes, we want people in oppressed regimes to be
able to post or blog or do whatever they want.
But at what cost. And if the cost of that
is cyber fraud and cyber crime and exploitation and coercion

(05:27):
and extortion and what is described as revenge porn and
all the things that we see that are going wrong online,
then maybe the cost is too high. Now, before you
get too far into that, let me let me back
up for a second. I liked what you said about
anonymity as being one of the vulnerabilities and the pitfalls

(05:49):
of the human-technology relationship. What are some other big
vulnerabilities of the human-technology relationship besides anonymity? Well, you
have the online disinhibition effect, which dictates that people will
do things online that they will not do in the
real world. So it's like a form of inebriation or

(06:10):
being drunk online. So what you see is that human
behavior changes. You can also see more vulnerability expressed online.
Hypochondria is excessive concern about your symptoms. Cyberchondria is you

(06:31):
have a headache which could be from too much coffee
or could be a hangover, and you start googling symptoms
about your headache, and you end up reading about brain
tumors, and you start feeling anxiety as a result. That
sounds familiar. So you might be perfectly well but end

(06:52):
up with a nasty case of health anxiety as a
result of careless escalation during search. So with the pandemic,
we had the infodemic, this information overload, which actually increased
people's anxiety in the general population. And what we saw

(07:15):
was that cybercriminals are incredibly adaptable and agile. They tapped
into that anxiety by creating malicious URLs, or malicious
links, offering you discount personal protective equipment, offering you a
vaccine, click here, before vaccines were even readily available, offering

(07:38):
you all sorts of cures. So people are anxious, they
want to protect themselves and their families, so they're far
more likely to click on a link and compromise their tech.
And that's how the cybercriminals moved in. This doesn't
feel different or that different from how crime and scams

(07:59):
happen even when they were analog, you know, before
they went digital. Because we're talking about things like,
you know, snake oil. And what you're talking about is
opportunism, and people using vulnerable periods of time in which
to actually enact a crime or to make someone a victim.
In this case, it feels like the existence of this

(08:20):
sort of technology, the existence of the Internet, creates a
space for constant vulnerability. It makes it much easier. It
opens up the range of victims. You weren't going
to fly halfway across the world to
target somebody, burgle their house, and then fly all the way back,

(08:42):
but you can do that online. And, you know,
your point about this feeling like, you know, old school
crime: I would argue that in an age of technology,
it's almost impossible to commit a crime that doesn't have
some technology component to it. In twenty twenty, so last year,

(09:05):
it was predicted that this year cybercrime would cost the
global economy just over one trillion US dollars. It's predicted
going forward that by twenty twenty five, the cost of
cybercrime is going to be over ten trillion dollars. Do

(09:34):
you think that the same people who would commit regular
crime are the same people who will commit cyber crime? Again,
it's complicated. You have existing hardcore organized crime groups who
look at technology as a faster, better, cheaper way of

(09:58):
conducting regular crime. You then have a group of people
who are not career criminals but sort of get involved
in using technology to engage in sort of entry level crime.

(10:20):
Because we have a whole generation that have grown up
as virtual shoplifters. So they've been pirating music, you know,
at nine, ten, eleven. Then they're pirating movies. By the time
they're in college, they're pirating cracked software. So for that cohort,
the line between right and wrong becomes a little blurred. Yes,

(10:45):
and you're having a problem with your finances, and somebody says, look,
there's this great scheme whereby you can sign up to
be some sort of logistics person for a corporation. All
you've got to do is open a bank account. They
put ten thousand dollars in. Then you just have to
transfer it on somewhere else and you get to keep

(11:06):
a thousand dollars. And you might think, oh, it feels like
there's something real wrong with that, but it's a fast
way to make money, and I need to pay my
rent. To be clear, you know, that's money laundering. It's
a crime. So I think there's that cohort growing up
where the boundaries are blurred, and that's where we've really

(11:27):
got to educate. And then I think you've got another
cohort who are young entrepreneurs, but they're cybercriminal entrepreneurs. This
is a career path. And then you also have activists,
you have state-sponsored and state-condoned actors.

(11:48):
You know, there is a wide range of
people engaging in harmful and criminal behavior in cyberspace. What's
the next step? What do we do? What happens in
cyberspace impacts on the real world. What happens in the
real world impacts on cyberspace, so this is a continuous relationship

(12:08):
between the two. I think the first thing is that
we've got to really think about governance and policing in cyberspace.
And what we see from a law enforcement point of
view is that law enforcement started in the real world,
and as technology has evolved, law enforcement has tried to

(12:32):
evolve to keep up with technology. But that presents a
lot of challenges. For example, the encryption debate. You remember
all that with Apple, you know, hack me if you can? Yeah,
law enforcement wanted Apple to hand them encrypted data, and
Apple refused to do it because they said, this is
our thing. It's encrypted. If you want to hack it,

(12:53):
you can get it, but we're not just going to
hand you encrypted data from our customers. There were all
sorts of issues around that, about backdoors, about encryption, but
these are really important issues to talk about as a
society because we want law enforcement to deliver on safety
and security in the real world. How are they going

(13:16):
to deliver on safety and security in cyberspace when there
exist crypto domains that are effectively operating beyond the law?
And if we accept there's a relationship between the two,
then it's going to continue to be more and more
complicated going forward, so you know, we have to be

(13:39):
involved, have a voice, in the governance of this space,
just the way we have a voice in the governance
of the real world. I am listening to you talk
about policing and in and I think that there definitely
is a way to regulate, especially criminal activity online. I
was a victim of a cyber crime, and it occurred

(14:00):
to me that there is really no way that I
could ever catch this person or chase them down
based on what had happened. I felt pretty helpless at the time.
I will say also, I was a victim of a
real crime in which my car was broken into and
my bag was stolen, and the police were not very

(14:21):
much help in that case either. There was no way
they were going to be able to catch this person
and chase them down. I felt the same in
both scenarios. The other part of this is, historically,
for some marginalized groups, policing has been a problematic conversation,
and so even when we're talking about it now, there's
a part of me that's like, I don't know who

(14:42):
I want regulating this online, especially when, you know, historically
there has been, you know, racism, misogyny, homophobia, all those
things involved in policing, also taking part in defining what
crime is, which is a whole other conversation. So how
do we regulate and actually catch criminals, catch cybercriminals,

(15:06):
without also overcorrecting in a way that continues to oppress
the marginalized groups that continually feel oppressed by unjust policing? Ronald,
you're right about policing and racism and all the stuff
that's going down. So I'm trying not to be prescriptive there.
And you don't want a non-national telling Americans how
to govern themselves either. I'm sensitive to that. So I

(15:29):
think that that's why we need to have the conversation.
And the conversation might be, we don't need any police
on the internet. That might be it. If we want to
have a conversation, then everything should be on the table.
For example, we could say, well, how about the companies
who profit in this space? How about they police it?

(15:50):
Let's not use the word police, let's say govern cyberspace.
Because this is the problem. You have huge corporations that
ultimately fly under the radar and may have aspirations of
statehood operating in cyberspace, and law enforcement, a thinly stretched resource,

(16:11):
trying to clean up, you know, as they go along.
And what's happening in countries like the UK? I work
closely with government there, and we have a new piece
of legislation coming through called the Online Safety Bill, and
it's going to impose a duty of care on those

(16:31):
corporations that profit in cyberspace. So let's take crimes like
cyberbullying or harassment or what is described as revenge porn,
and let's make these companies responsible for cleaning that up.
Then let's take mis- and disinformation. And while that's not
criminal activity per se, it can certainly lead to racist attacks,

(16:55):
and it can lead to all sorts of threats, and
it can hate speech, can lead to when you know,
this is a broad spectrum of what we describe as
online harm. But are these issues policing issues? Are are
these civil society issues? And let's think about, for example,
child pornography. If you have a young person who's sexually curious,

(17:17):
and they're eleven or twelve or thirteen, and they take
an image and they send it to their boyfriend, or
girlfriend who's also thirteen or fourteen. If you've generated an
image and it's explicit and you are under age, then
de facto you are generating and distributing child pornography, albeit
of yourself. It doesn't make sense to criminalize that behavior.

(17:41):
So let's take all of that behavior and think: is
this a police issue or is this an issue that
needs to be dealt with separately? And then if you
look at youth hackers, you know, if you've got a
tech-talented youth upstairs in the bedroom of their parents' home,
and if that kid becomes curious and starts probing around
the edges of a network and then breaches it, accidentally or

(18:05):
through curiosity, is that child a hacker or is that
somebody who needs to be educated? You know, if we
can't identify tech talent in young people, how can we
stage interventions? We have IQ, we have EQ, we have CQ,
but we have no TQ, no technology quotient. So we can't

(18:29):
screen for them, we can't identify them, we can't educate them,
we can't stage interventions. But what we can do is
when they're fourteen or fifteen, prosecute them. It sounds
like you're saying that there are multiple ways that we need
to examine the behavior that's happening online. And then we
need to also define what criminal activity is and who

(18:51):
can be criminals online, and then, three, we need to
regulate that behavior. Am
I mischaracterizing that as being a solution? No, I think
that's fair. And we have a chance here to renegotiate
the social contract that has existed for thousands of years
and more recently in the US. I'm European, so I

(19:12):
can say thousands, okay. But the social contract, we can
talk about that. We can say, right, let's take online harms,
this range of undesirable behaviors, harmful behaviors, in some cases
criminal behaviors. Well, how about we make Facebook or Twitter

(19:33):
or Instagram responsible for that and say, guys, you profit
in this space, and in fact, many of your technologies
exacerbate these problems, So you go figure and go deal
with that. Then let's take young people and what they do,
what we describe as juvenile cyber delinquency, pathways into cybercrime,

(19:54):
and let's deal with them separately and not criminalize young
people who are tech curious, who don't have parents who
can teach them, who don't have teachers who are as
knowledgeable about technology as they are, and let's deal with them separately.
And then let's look at technology solutions to technology facilitated

(20:15):
problem behaviors, because we can't solve this with human intervention.
We're going to need AI, artificial intelligence, and ML, machine
learning, solutions, because there's just too much stuff happening. But
we have to be able to equip law enforcement to
carry out investigations, to actually deal with issues around encryption,

(20:39):
to deal with issues around privacy and surveillance. And when
I talk to people about surveillance, they get really upset
about this, and they're like, I don't want any government
or any law enforcement agency practicing surveillance because I don't
want to be surveilled. But you are being surveilled by

(20:59):
social technology companies. You are offering everything up. They can
practically predict what color socks you're going to wear tomorrow.
It's just surveillance by a different entity, by people who
are not elected, by people who don't have a duty
of care, who don't have a mandate. Are
you optimistic that this is going to be solved? Because

(21:20):
I hear you talking through a lot of theories. Are
you optimistic about the actionable steps? I am increasingly optimistic.
If you'd asked me that ten years ago, I would
have said no. I'm increasingly optimistic for a couple of reasons. One,
I understand cyber behavioral science. Secondly, I work closely with

(21:42):
law enforcement, with policy makers, and I'm solutions focused. And thirdly,
class and group actions. This is the money piece. We've
been here before: cigarettes, asbestos. There is going to
come a point when the social technology companies who are
responsible for some of these harms are going to have

(22:08):
to do better. And they won't do better because we
tell them to, and they won't do better because the
government asks them to. They will do better
when it financially hurts them not to do so. And
I think that gives people hope, and we always want...
you know, humans need hope. Doctor Aiken, you talked

(22:46):
about asking companies that profit in this space to take
action to regulate themselves. You mentioned Facebook and Twitter and Instagram,
and I'll add YouTube and 4chan and 8chan,
all places where disinformation can flourish. And I have to
say I'm not so impressed with their self-regulation so far.
I mean when I think about January sixth, the insurrection

(23:06):
at the Capitol, which happened just earlier this year, I
get emotional. I'm angry. I'm just like, y'all have to
be responsible for this disinformation. And I mean, that's easy
for me to say from a soapbox: y'all need to
regulate this. But how are you feeling about this, as
an expert in cybercrime and criminal psychology? I think, and

(23:27):
this is just my opinion, I think they know a
lot more than they admit to. I think social media
companies have their finger on the pulse of the Internet.
I think they know long before any of us when
something is moving. That said, can they admit that they should?
And when you talk about the events of Capitol Hill,

(23:49):
what we saw there was very interesting in terms of cyberpsychology.
I mean, tragic, but interesting, because that's the first real-world
example of what happens when the online world facilitates
the normalization and socialization of fake news and misinformation, syndicates

(24:14):
to bring people together and creates virtual echo chambers to
reinforce and normalize and socialize belief systems, and then that
explodes into the real world with tragic consequences, and I
think that got the attention of all agencies. In fact,
the day after it happened, I got a call from

(24:34):
a senior person in a US agency who said, Mary,
you stood in front of us five years ago and
told us this would happen, and it did. And I
just wanted to say that you were right. So I
don't want to be right after the fact. I want
to say this is eminently predictable. This is a very

(24:56):
very powerful tool, technology, the Internet, the connectivity. And people
ask me about causation, you know. They say, does the
internet cause bad behavior, online harms, criminal behavior? And there are
two ways of looking at this. So what we can

(25:17):
say is the connectivity, how we are connected with each
other afforded by technology. We can say, yeah, maybe that
causes bad behavior. Maybe it causes us to do things
we would normally not have done. Or maybe what it's
doing is shining a very bright light into the darkest

(25:39):
reaches of the human psyche, what Jung called the
two-million-year-old man or woman. And maybe we're all
just Game of Thrones underneath it all. Doctor Aiken, you
are blowing my mind. I just I want to end
on a lighter note. There's a show called CSI Cyber

(26:00):
where Patricia Arquette plays a character based off of you.
How does that feel? Incredible. I'm going to write another book,
and I'm going to write a book about how to
get ahead in Hollywood without even trying. I have survivor's
guilt for all the authors and the people who pitched
TV shows, and I know how hard they work. I

(26:21):
had one meeting, one meeting, and they commissioned the
show on the spot. Wow, that's amazing. That was it.
And I had one meeting with CBS and the head
of Broadcasting and Entertainment. It was fifteen minutes. It turned
into three hours. They stood up at the end of
the meeting. The CBS lead looked left and right and said,

(26:45):
are we all agreed? And then they looked at me
and they said, we want to make a TV show
CSI Cyber, based on you. And I issued the immortal words,
can I get back to you? CBS called me back
into the meeting and they said, we're going
to make you a producer on the show. I

(27:05):
didn't know what it meant, though, so I said, I'm sorry,
I can't do that. And they said, is this a
negotiation strategy? And I said, no, I just don't have
that sort of money. And they're like, no, no, we
don't want your money. We want to give you more money.
I've only ever seen the movie The Producers, you know, where
they were trying to raise money for the show, and

(27:27):
I just thought they were asking me for money. So
they tell that story: I was the first
person they'd ever offered this deal to, and I was like, no
thank you. Thanks, but no thanks. Doctor Aiken, before we wrap,

(27:48):
I want to ask you about what listeners can do
to get more involved. You've invited them to the conversation,
but what would that look like? I think we have
to think about global solutions because cyberspace is a global construct.
But I think equally we have to be respectful of
national criteria, of cultural differences in different countries that want

(28:13):
to approach cyberspace in their own way for their own population.
So I think that you have solutions that are local,
country by country, and then you have agreed solutions in
shared spaces in cyberspace. We have maritime law for shared waters,
we have aviation law for the shared airspace. So I

(28:37):
think there are basic things we can agree on, and then
there are some things that will need to be legislated country
by country. I would ask listeners to reach out to
their local legislators, their local politicians. There are a number
of bills being debated at the moment in
the US, for example, Senator Warner's SAFE TECH Act, which will

(28:58):
address some of these issues. So get involved, become an
activist in cyberspace, a good activist. One thing that I
can mention is that for any listener that is now
fascinated by cyberpsychology in the way that I am, we
are offering the world's first online Master's at Capitol Tech

(29:23):
and online PhD in cyberpsychology. So if you feel like
you want to complete your education, reach out to us online.
Doctor Aiken, this has been a great conversation, very eye-opening.
Thank you so much for being on the show. Thank you,
Ronald absolutely enjoyed it. Doctor Mary Aiken is a cyberpsychologist

(29:46):
and chair of the Department of Cyberpsychology at Capitol Technology
University in Laurel, Maryland. You can find links to her
recent publications about cybercrime and to the degree programs she
mentioned in our show notes. Solvable is produced by Jocelyn
Frank, research by David Zha, booking by Lisa Dunn. Our
managing producer is Sachar Mathias and our executive producer is

(30:09):
Mia Lobel. I'm Ronald Young Jr. Thanks for listening.