Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_01 (00:26):
And welcome to Technology Tap. I'm Professor J. Rod. And this episode, well, it's a special episode: it's an episode of my students doing their own podcast. Let's tap in.
(01:08):
And welcome to the Technology Tap. Hi, I'm Professor J. Rod. For those of you who don't know me, I'm a professor of cybersecurity, and I love helping my students pass their A Plus, Network Plus, and Security Plus. And I also consider myself an amateur historian, so sometimes I drop episodes on the history of modern technology, you know, the origins of different technologies.
(01:32):
But this episode is a little different. Usually around this time and during the summer, I have to take a break. So, not to leave you out cold, because I know how much you fans love hearing the sound of my voice. But unfortunately, this time you won't. I usually have students do a podcast during
(01:54):
this time of the year, and I post it. So how I do it is I have them do it as an alternative to an assignment: either write an essay or do a podcast based on the presentation they did earlier in the semester.
Well, as the years have gone by, I think this is maybe the second or third year that I'm doing this, and I'm getting
(02:16):
fewer and fewer podcasts.
I think a lot of it has to do with AI, so they'd rather write the essay. So I'm gonna have to change that. I'm gonna change that next semester. So those of you who are listening and who are my students, just know that changes are gonna be made next semester. But I'm grateful to have Aaron, Antonio, Mark, and Joe put
(02:40):
together this episode on the Cambridge Analytica Facebook scandal. And I hope you enjoy it. They worked really hard on it.
SPEAKER_04 (02:55):
Antonio Garcia.
SPEAKER_03 (02:56):
Saya.
SPEAKER_04 (02:57):
Uh Joelle Torres and
Mark Diakitos.
SPEAKER_03 (03:01):
And in today's episode, we are talking about the Cambridge Analytica Facebook data scandal. Now, we're gonna have Joel start off.
SPEAKER_02 (03:10):
Alright, thank you, Aaron. So we're gonna be defining the data scandal, all right? The Cambridge Analytica scandal. Just hearing the name, it sounds so bad. But what exactly happened here? At its core, this was a massive breach of trust. Personal data from 87 million Facebook users was collected by
(03:35):
a political consulting firm called Cambridge Analytica, and it was all done without proper consent. These weren't just random data points; these were intimate details about people's personalities, preferences, political leanings, and behaviors. Cambridge Analytica used this data for political advertising,
(03:55):
attempting to influence voter behavior. They built sophisticated psychographic profiles, essentially psychological maps of millions of voters, to target them with tailored political messages. The two primary players here were Facebook, the social media giant that housed all this data, and Cambridge Analytica, the
(04:17):
consulting firm that exploited it. Now, here's what makes this particularly disturbing. This wasn't a quick hack that happened overnight. This data collection occurred quietly between 2013 and 2016. For three years, millions of people's data was being harvested and weaponized without their knowledge.
(04:40):
It only became public in March 2018, when whistleblowers came forward. Their goal? Cambridge Analytica built these psychographic profiles specifically to influence elections, including the 2016 US presidential campaign and Brexit.
SPEAKER_04 (05:04):
What did they do exactly, again? And why was this a problem?
SPEAKER_02 (05:08):
Well, Cambridge Analytica misused data from about 87 million Facebook users by using it for political advertising instead of the purpose users believed the data was for. Yeah. Alright, alright. Next we're gonna be talking about the data harvesting mechanism. Alright.
(05:29):
So how did they actually pull this off? Cambridge Analytica started with a seemingly innocent personality quiz app called This Is Your Digital Life, created by a researcher named Alexander Kogan. Now, personality quizzes were everywhere on Facebook back then. You remember them, right? Like, which Disney character are you?
SPEAKER_03 (05:50):
Or yeah, like BuzzFeed quizzes. Yeah, yeah, that's what I was gonna say. Yeah.
SPEAKER_02 (05:55):
Like they seemed harmless and fun. But this quiz was different. When users agreed to take the quiz, they consented to data collection for what they thought was academic research. The consent form mentioned academic use, which sounds legitimate enough. But here's a crucial detail most people miss. By taking this quiz, users weren't just sharing their own
(06:16):
data. They were unknowingly exposing their friends' data too. This brings us to the consent loophole, and this is where Facebook's own policies became the weapon. Users consented to share their data, but they had no idea they were also handing over access to their entire friends' network information. Imagine taking a quiz and accidentally giving someone
(06:37):
access to the personal data of everyone you know. That's essentially what happened. Cambridge Analytica exploited Facebook's Open Graph platform, which before 2015 allowed apps to access extensive friend information. This wasn't technically hacking in the traditional sense. It was using Facebook's own features in a way that clearly
(06:58):
violated the spirit, if not the letter, of user privacy. And at scale, each user who took the quiz gave access to approximately 160 friends' profiles. Do the math on that, and you quickly reach 87 million users globally. That's more than the population of the United Kingdom. These profiles contained treasure troves of information like
(07:20):
interests, location data, relationship status, political views, everything needed to build the psychographic profiles I mentioned earlier.
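To make that scale concrete, here is a rough back-of-the-envelope sketch in Python. The 270,000 quiz-takers, the roughly 160 exposed friends per taker, and the 87 million total are the figures quoted in this episode; the arithmetic below simply shows how the friend multiplier bridges them, and the "implied average network size" line is an illustration rather than a reported statistic.

```python
# Back-of-the-envelope sketch of the friend-network multiplier described above.
# Figures are the ones quoted in the episode; real friend counts varied widely
# and friends of different quiz-takers overlapped.

quiz_takers = 270_000          # users who actually installed the quiz app
friends_exposed_each = 160     # approximate friends' profiles exposed per taker

raw_reach = quiz_takers * friends_exposed_each
print(f"Naive reach: {raw_reach:,} profiles")   # roughly 43 million

# Facebook's reported estimate of affected accounts was about 87 million,
# implying the average exposed network was closer to ~320 profiles per
# quiz-taker once larger friend lists are accounted for.
reported_total = 87_000_000
print(f"Implied average network size: {reported_total / quiz_takers:.0f} friends per quiz-taker")
```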
SPEAKER_00 (07:30):
So I have a question for this slide. The question I have is: who were the main organizations involved in the Cambridge Analytica scandal, and what role did each one play, according to this slide?
SPEAKER_02 (07:47):
The main organizations involved were, of course, Facebook, which provided the platform where the data was collected, and Cambridge Analytica, which used that data for political consulting, basically scamming its users.
SPEAKER_03 (08:03):
So they got around 87 million users out of, what was it you said, how many people? Yeah, you got 87 million people's data just from the people that did the quiz and then taking their friends lists. That's crazy.
(08:24):
But exactly. Now that we've talked about how they got the data, let's talk about what they actually did with it. Because they didn't just use it for regular, standard advertising. The whistleblowers described it as psychological warfare, which I'll explain in a bit. So on the corporate side, this was a joint venture between SCL Elections and Robert Mercer, a billionaire who put
(08:44):
millions of dollars into the project, as well as Steve Bannon, one of the board members, who was a key political strategist in this whole thing, too. But their main secret actually comes back to what Joel said, which was psychographic profiling. Most advertisers usually target you based on common demographics
(09:05):
like your age or where you live. But Cambridge Analytica claimed they could target your personality, who you are. So they actually used what's called the OCEAN model, which scores people based on openness, conscientiousness, extroversion, agreeableness, and neuroticism. So pretty much by analyzing your Facebook likes, they could
(09:26):
predict these traits with crazy accuracy. But going back to what I said about the psychological warfare thing: with this system, they could even know your deepest fears and insecurities, and then use that to spam you with content designed to trigger a feeling or emotional reaction in you personally.
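For listeners who want to picture what like-based OCEAN scoring could look like, here is a minimal, hypothetical sketch. The five trait names come from the OCEAN model described above, but the example likes and their weights are invented purely for illustration and are not Cambridge Analytica's actual model or data.

```python
# Purely illustrative sketch of like-based OCEAN scoring.
# Trait names come from the OCEAN model; the example likes and weights are made up.

TRAITS = ["openness", "conscientiousness", "extroversion",
          "agreeableness", "neuroticism"]

# Hypothetical lookup: how strongly a given page-like correlates with each trait.
LIKE_WEIGHTS = {
    "sci-fi novels": {"openness": 0.6},
    "horror movies": {"neuroticism": 0.4, "openness": 0.2},
    "planner apps":  {"conscientiousness": 0.7},
    "party photos":  {"extroversion": 0.5},
}

def score_profile(likes: list[str]) -> dict[str, float]:
    """Sum trait weights over a user's likes to get a crude OCEAN score."""
    scores = {trait: 0.0 for trait in TRAITS}
    for like in likes:
        for trait, weight in LIKE_WEIGHTS.get(like, {}).items():
            scores[trait] += weight
    return scores

print(score_profile(["horror movies", "party photos"]))
# Leans toward neuroticism and extroversion: the kind of signal the episode
# says was used to tailor fear-based messaging to particular people.
```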
Their next goal became to actually show the scale and the
(09:48):
impact of the tool in the real world. And in this case, that was the elections. So it actually started with the Ted Cruz campaign, in which Cambridge Analytica was paid around six million dollars to build these psychological profiles and test if they could actually influence the voters. And once that technology was made better, they actually moved
(10:09):
on to the Donald Trump campaign, in which it went from a six-million-dollar contract to roughly 1.5 billion ad impressions. Just crazy. But that pretty much made it impossible for the media or
(10:32):
political opponents to actually fact-check them at that time. But their strategy was now split into two things: finding supporters of Trump and riling them up, and finding people that were likely to vote against Trump and his supporters and targeting them with negative content to, in a
(10:53):
way, discourage them from even voting at all. Which is crazy. And part of the reason why that's so important is because it pretty much shifted politics and elections away from the public, open discussions they used to be, toward private, personalized attacks on social
(11:16):
media. Does anyone have any questions on that slide?
SPEAKER_02 (11:20):
Yeah, sorry.
I got a question real quick.
SPEAKER_03 (11:22):
How did the ads work again? Okay, yeah. So the dark posts are pretty much ads that are specifically made for one user so that only they can see it, but once they actually see it and open it, it vanishes right afterwards. It just disappears. So that pretty much made it so it was difficult for people to actually
(11:42):
keep track of these ads and made it harder to pinpoint them back to Cambridge Analytica, if that makes sense. So they got away with it for a long time. I see, I see.
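For a concrete picture of the "dark post" idea described above, here is a small, purely hypothetical sketch of a targeting record. The field names are invented for this example and are not Facebook's actual ads API; the point is simply that an ad shown only to a narrowly defined audience, with no public listing, leaves nothing for journalists or opponents to fact-check.

```python
# Hypothetical illustration of a "dark post" targeting record.
# All field names and values are invented for this sketch.

dark_post = {
    "creative": "fear-framed message about candidate X",      # hypothetical ad content
    "audience": {
        "ocean_trait": "high neuroticism",                    # psychographic segment
        "location": "swing-state county",
        "likely_vote": "opponent-leaning",
    },
    "publicly_listed": False,       # never appears on the advertiser's public page
    "visible_to": "targeted users' feeds only",
}

# With no public copy of the ad, there is nothing to point to after the
# impression is served, which is why these posts were so hard to trace.
print(dark_post["publicly_listed"])
```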
SPEAKER_04 (11:55):
Um, the whistleblower, right? Christopher Wylie, he was a former Cambridge Analytica employee, and he basically revealed everything about the data hijacking. He had proof, and there was a bunch of evidence confirming over fifty million Facebook users were harvested, on which they spent
(12:20):
over a million dollars, right. And then both Facebook and Cambridge Analytica, right? They denied everything. They denied Wylie's evidence, even though he showed it in court. And even then, didn't Facebook also know in 2015? But
(12:42):
they didn't tell anyone about it. They only told everyone once people found out. And this doesn't apply to just the United States. This also happened with Brexit in the UK, in Trinidad and Tobago with the 2010 campaign to influence young voters,
(13:05):
and even involving the Russian Federation, basically interfering with multiple elections, and Wylie testified that Cambridge Analytica had talks with Russia.
SPEAKER_00 (13:19):
We're actually gonna describe and explain the different ethical and moral failures. We are going to talk about things such as the accountability deficit, data minimization failures, lack of transparency, and violations of informed consent. So based on these failures and what
(13:42):
went on, this will cover the aftermath in terms of the penalties and the breakdowns in the Cambridge Analytica scandal. Facebook failed to enforce proper data governance, allowing data collection beyond legitimate needs. Users were unaware of the extent of data harvested, and
(14:04):
informed consent was violated because data was used for political purposes without permission. The scandal underscores the importance of transparency and accountability in digital platforms. Does anyone have any questions on Facebook? Well, actually, I do.
SPEAKER_03 (14:20):
So how did Facebook
contribute to the accountability
deficit?
SPEAKER_00 (14:25):
Okay. So in terms of contributing, Facebook itself had inadequate data governance, and it also failed to enforce its own policies effectively.
SPEAKER_04 (14:40):
Yeah. Um, I also have a question. Yes. What was the main ethical issue in the Cambridge Analytica scandal? Because I'm having a little trouble understanding.
SPEAKER_00 (14:53):
Okay. I can totally explain that for you. So the main violation was primarily the violation of informed consent. There were users who basically never agreed to their data being used for political advertising.
(15:15):
So when that happened, that's when the whole violation came into play.
Yes.
Oh, all right.
Yeah.
All right.
So we have some regulatory and financial consequences. We will be talking about the different types of penalties. We have the FTC penalty, and the UK ICO fine, which is from the
(15:38):
Information Commissioner's Office. We have the SEC penalties, which were basically fines for misleading disclosures about the data misuse that was reported. And then we have the Meta settlements, which took place when the directors agreed
(16:02):
to settle over how the scandal was mishandled. So this slide covers the aftermath in terms of penalties and corporate fallout.
SPEAKER_03 (16:12):
Wait, actually, I have a question about the last thing you said. So what actually happened to Cambridge Analytica after the scandal?
SPEAKER_00 (16:21):
For the people that don't know, just in case: after the scandal, they filed for bankruptcy back in May 2018. I believe that's when that happened. Oh wow.
SPEAKER_03 (16:37):
So after that, they actually filed for, like, Chapter 7 bankruptcy then?
SPEAKER_00 (16:41):
Yeah, yeah, yeah. That's good. Yep. So as I said, this covers the aftermath, which dealt with penalties and the corporate fallout. Facebook themselves faced record-breaking fines from the FTC, SEC, and UK ICO, as I explained earlier, for
(17:02):
privacy violations and misleading disclosures. Now, Meta agreed to settlements, and Cambridge Analytica filed for bankruptcy in May 2018, which I explained earlier. These different consequences demonstrated how regulatory bodies are stepping up enforcement and the financial
(17:22):
risks of poor data practices.
SPEAKER_03 (17:27):
Actually, I have one more question. Do you know what major privacy laws emerged after the scandal?
SPEAKER_00 (17:35):
Privacy laws? So I think the major privacy laws would be the GDPR in Europe, and in California there's something called the CCPA. Yeah, that took effect in California, of course. Okay, okay.
SPEAKER_02 (17:53):
Yeah, I also have a question, to be honest. Yeah, yeah. How did Meta respond to the backlash again?
SPEAKER_00 (18:00):
So after they dealt with the backlash, which was pretty crazy at the time, they had to restrict third-party access to user data through their API changes. Okay. Mm-hmm.
unknown (18:17):
Yeah.
SPEAKER_04 (18:17):
So basically, through what we all discussed today, the Cambridge Analytica scandal showed just how powerful and dangerous personal data can be when it's misused. What happened wasn't literally a data breach. It revealed how personal information could be weaponized to influence elections and distort public
(18:39):
opinion. And this also exposed major failures in transparency, accountability, and data governance, especially on the part of Facebook and Cambridge Analytica.
Basically, because of this entire thing, there was massive public backlash, including movements such as, as someone
(19:02):
said, Delete Facebook, right? Which pushed people to question how their data was being used. And then also, in other countries, this helped spark stronger privacy laws such as GDPR in Europe and CCPA in the United States, along with stricter platform policies about data access.
(19:24):
But the one true takeaway from everything is that ethical data practices and user trust are essential for democracy in the digital age. So if companies don't protect our data, the consequences can affect not just us, but everyone.
SPEAKER_03 (19:53):
I mean, honestly, all of us would recommend checking out the documentary The Great Hack on Netflix. I don't know if everyone fully saw it or not, but I did, and it was really good, honestly. We've thrown a lot of numbers and dates at you guys today, but this documentary actually follows the real people that were involved in this whole scandal.
(20:15):
So, including the whistleblowers that we mentioned earlier and the actual professors and other people who fought to get their data back, you know. But fun little fact, it actually features Brittany Kaiser, who was a former director at Cambridge Analytica, and it shows her journey from inside the company to becoming a whistleblower.
(20:37):
So it's really interesting. And it also really helps you visualize how that invisible data and strings of code can actually turn into something way bigger, like real-world political problems and chaos, you know. But I don't know if anyone has any more questions that they would like to ask before we wrap this up.
SPEAKER_02 (20:58):
Yeah, I got a question real quick. Okay. What's up? You know how I mentioned psychographic profiles earlier? How is that actually different from regular advertising?
SPEAKER_03 (21:08):
Yeah, so we actually both mentioned it a bit, but it's still a great question because we didn't answer it enough, honestly. So regular advertising usually targets demographics, like your age, gender, and where you live, but psychographic advertising targets your personality. So, you know, Cambridge Analytica categorized people based on traits like neuroticism and openness, which I mentioned
(21:31):
a bit with the OCEAN model. So if the data showed that you are a fearful person, like you get scared easily, they would purposely show you something that was designed to try to scare you. So it wasn't just about selling a potential candidate, but also about triggering an emotional reaction out of people, too. I don't know if anyone has any questions.
SPEAKER_00 (21:53):
Yeah, I do have a question. So earlier, I don't remember if you mentioned any super PAC concerns, but why does it even matter if they worked for both the campaign and the PAC?
SPEAKER_03 (22:08):
So glad you referenced it, because it's actually super important. It's a major legal issue, especially in the US, where official campaigns and super PACs, which can raise pretty much unlimited money, are strictly forbidden from coordinating their strategies with each other. So since Cambridge Analytica was working for both the Trump
(22:29):
campaign and the Make America Number 1 super PAC simultaneously, there were huge concerns that they were acting as a bridge to pretty much illegally share data and strategies, which essentially bypasses campaign finance laws.
Wait a minute.
SPEAKER_04 (22:47):
How did they even get all this info? And did all those people really take the quiz?
SPEAKER_03 (22:53):
Yeah, I don't remember if Joel mentioned it, I think he did, but pretty much, no. That's honestly the scary part. I don't know if Joel wants to add to it or not, but only about 270,000 people actually took the quiz and, you know, downloaded the app. Those were the users who consented, and the app also
(23:14):
scraped the data off of all their Facebook friends without them knowing. So that was the no-consent part. That whole friend-of-a-friend loophole they ended up using was how they really managed to go from 270,000 people that took the quiz to 87 million profiles they
(23:35):
could use, from such a small group in comparison. But I don't know if anyone else has any questions.
SPEAKER_02 (23:42):
I mean, yeah, like that one group, like one friend or one person who took the quiz, could spread it to like 160 friends, basically. Yeah, and that's how crazy it is.
SPEAKER_04 (23:54):
Yeah, so it's like a sickness, like it spreads the more you come in contact, basically.
SPEAKER_02 (23:59):
Exactly. Yeah, like once one person takes the quiz, the 160 friends around them basically get their information stolen and stuff.
SPEAKER_03 (24:07):
Yeah, so if you take it and you have, like how Joe said, up to like 160 friends added on your profile, they could just take it, and they do. They just straight up took their profiles and their data as well. So they kind of just used the users who took the quiz not only
(24:27):
for their data, but also to kind of take everyone else's data too, which is crazy. But I think that's it, honestly. Yeah, I just want to say thank you, and I'll see you guys for another episode sometime soon. I don't know if anyone has anything to say.
SPEAKER_00 (24:44):
Yeah, I appreciate it. Yeah, I do want to give my thoughts. I do appreciate being in this kind of group and doing this podcast, you know. It was definitely a lot of fun, yeah, you know.
SPEAKER_03 (24:54):
Yeah, I'm really
glad we were selected together.
It was a lot of fun.
It was very nice.
SPEAKER_00 (24:59):
Yeah.
SPEAKER_03 (25:01):
Alright, guys, we
will see you for another episode
sometime soon.
Have a great day, guys, and goodbye.
SPEAKER_01 (25:07):
Alright, I hope you enjoyed that episode. I know the gentlemen worked really, really hard on it. I'll see you back after the holidays for original episodes of Technology Tap. I think I have one more of these that I'll release, and then I'm gonna have to go into the archive and see if I can find some from older students, or maybe I'll just replay some of the
(25:30):
earlier ones from a couple of years back. But I want to wish everybody a safe and happy holidays and a fabulous new year, and as always, keep tapping into technology.
(26:08):
This has been a presentation of Little Chacha Productions, art by Sabra, music by Joe Kim. We are now part of the PodMatch Network. You can follow me on TikTok at Professor Jrod, J R O D. Or you can email me at ProfessorJrod, J R O D, at Gmail dot com.