
April 27, 2018 70 mins

Love it or hate it, Facebook is one of the biggest success stories of the past century. Billions of people have profiles, and for many users it's the only source of online news or social interaction. And Facebook collects as much information as possible on each of these users, ostensibly to better target advertising campaigns. So what happens when someone starts using this vast collection of data for something else? What is Cambridge Analytica, and how deep does this rabbit hole go?


They don't want you to read our book: https://static.macmillan.com/static/fib/stuff-you-should-read/



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is
riddled with unexplained events. You can turn back now or
learn the stuff they don't want you to know. Hello,

(00:24):
welcome back to the show. My name is Matt. My
name is Noel. They call me Ben. We are joined
with our super producer Paul, Mission Control, Decant. You are you,
you are here, and that makes this stuff they don't want
you to know. As you're listening to today's show, feel
free to drop in your own emoji reaction, whether it's

(00:45):
a wow, a sad, a mad. I don't know the
names of those reactions available on Facebook. One of them
is the oh face. It's just the open mouth. I
guess that's wow. Yeah, I call it wow. It could
be wow, or it could be... Are there official names?
Is there a company line on what these reactions should

(01:06):
be known as? I hope they all have specific, proper
noun names like Derek or Calliope. I'm sure there actually
is a title to each of those images. Right. It
seems like a pretty um everything in its right place
kind of company. Facebook. They don't really let much fall
through the cracks. Yeah. Speaking of Facebook, while you're listening
to this, feel free to peruse our Facebook page, Stuff

(01:29):
They Don't Want You to Know, at Conspiracy Stuff, or
our new, uh, Here's Where It Gets Crazy group on Facebook.
Just you know, hang out on there the whole time
you're listening. It's fine. Nothing bad will happen to you,
nothing bad that hasn't already happened. We'd like to introduce
you to Facebook, if for some reason you managed to
listen to podcasts and, uh, never heard of this before,

(01:52):
and you're wondering what the heck is Facebook? I mean
the Facebook, right, not a Facebook. And we'll get to
that, because there was an earlier physical iteration of Facebook.
A little known fact: if you're like over two
billion people across this fair planet of ours, you use
Facebook on a daily basis. Mobile traffic is a huge

(02:12):
part of Facebook's operation nowadays, because originally it was desktop, and
it was only for members of certain universities or colleges.
But now anyone can get on it, provided they
have the right equipment and internet connection. Isn't it crazy,
too, how mobile has advanced so exponentially? Remember a time
where you still had a phone that barely had stuff

(02:33):
you could do on it? It had Snake and, like, a calculator. Yeah,
what has it been, twelve, sixteen years that it's
been really, really developing? But it's really ramped up in
the last, more like, ten years, you know, with, like,
I mean, the iPhone's only, you know, a little
more than a decade old. Right, Yeah, it's true, and

(02:55):
it's an exponential growth that has prompted some questions in
the industry and outside of the industry, and in the
minds of fringe researchers as well. Did this stuff grow organically?
Was it state-supported on some level? The answer to that
question, to both of those questions, is actually yes, but maybe

(03:17):
not in a super nefarious way. Regardless of the origin
of this exponential growth, we know that mobile is, as
Noel said, huge, and many people would not have predicted it.
It's safe to say. But as of December twenty sixteen, there were
one point one five billion mobile daily active users on

(03:38):
Facebook's platforms, whether mobile through an OS, which is the
safer way, or mobile through the Facebook app. Does that include
Instagram as well? Since Facebook owns Instagram. No, that does
not include Instagram, And that's a great question. That number
shoots even further through the roof if you include Instagram,
which is also hugely popular right, Instagram has been one

(03:59):
of the fastest growing, especially since Facebook has been taking
an increasing amount of flak over the years. That one
point one five billion number, that was an increase of
twenty-three percent from the previous December. And this means, you know, if we
read between the lines that there are a lot of
people who do not have a desktop computer or a laptop,

(04:21):
have have no intention of buying one, and say I
can just do everything in my digital life on my phone.
It's becoming increasingly more true. I mean, there's
so many things that we would rely on a laptop or
computer to be able to do that you can totally
do on your mobile device. And it's so much more
convenient and easy and you always have it right in
front of your eyeballs. Yeah, and you can put your

(04:42):
friends in Pokémon Go so that their faces will be
in the system, even if they don't
have an online profile or a desire to create one.
You can poke your friends. If you're not into Pokémon,
you can just poke them. So, okay, here's
my big question, and most of you already know this,

(05:03):
but you think to yourself, well, Facebook is this massive
thing with all of these people walking around on their
phones looking at their profiles and stuff. It's all free though,
I don't have to pay any money to download Facebook. Yes,
if you have a service for free that uses ad revenue,
you are not the customer. You are the, what, the product?

(05:26):
M Yep, that's true, folks. You are not the customer
of social media. You are the product of social media
unless you are paying a monthly fee for it. And
even then, if you are paying a monthly fee for something,
uh if it's not social media. If you're paying a
monthly fee for your cable company or your internet connection,
they're also taking your data. You are more than one

(05:48):
revenue stream for them, and for Facebook. This is hugely important.
We looked into the advertising revenue right where Facebook publicly
makes a lot of its money, and mobile advertisement alone
represents eighty-eight percent of all Facebook's ad money from
Q three. Quick explanation, in case we're getting

(06:12):
a little too inside baseball. In businesses, as you may
or may not know, profits and losses and revenue are
divided up into four segments per year that are each three
months long. They're referred to as quarters, and in business speak,
in the lands where people say things like Wheelhouse and
Granular and Cadence. These quarters are simply referred to as

(06:37):
Q one, Q two, Q three, Q four. So this
mobile advertising is huge for Facebook. Facebook needs this stuff
to happen, and for that to work, they need to
have someone looking at ads. They need to have users.
That's where you come in. Uh. You might be saying,

(06:59):
not me, fellas, not me, you jabronis, because I
don't have a Facebook profile. Well just hold on, hold on,
just for now, pretend you do. Uh, let's, let's
talk a little bit about you. Let's talk about you,
the Facebook users. What do we know about you? So, um,
twenty nine point seven percent of users range in age

(07:20):
between twenty five and thirty four years old. Uh, and
that is the most common age demographic of Facebook use.
And that's also a pretty sweet marketing spot too, isn't it?
Oh yeah, all those young professionals, all that new income
fresh for the harvesting, or you know, the folks who
are staying at home with their parents because they can't

(07:41):
afford to buy, you know, pay rent anywhere. Because that's
a whole other issue we're having to deal with. Well
that's a lot less fun, Matt. But anyway. Yeah, we
do know that. We should do an episode on
the looming financial crisis of the millennial generation and the
post-millennial generation. Stay tuned. As a matter of fact,

(08:02):
if you're listening and you're in that age group, write
to us and let us know what you think
the biggest misconceptions are about your generation. And everybody does
want to have some sort of attention or connection on
social media, right. That's why it works. You know, you
you say something that is a funny one liner or whatever,
and then you check back a few hours later you're like, wow,

(08:25):
twenty four people know I exist. You know what I mean?
Sweet addiction, isn't it? It is? And we do see
dopamine spikes both when you check your mobile phone, just
in general for text messages and when you look at
Instagram or Facebook or Twitter or what have you. So
we are going to help you a little bit with this.

(08:47):
It turns out that Facebook users in general are most
often on this platform in the middle of the week
between one to three pm, and they're eighteen percent more
likely to engage with whatever is posted. That means throw
a like or comment or reaction of some sort, if it's Thursday
or Friday. So if you want a lot of people

(09:09):
to see your stuff and talk to you, do it
middle of the day on Thursday or Friday. So what
part of the bell curve am I if I'm on
it Monday through Friday? We're a little bit different
in that regard because in interest of full transparency, we
use Facebook for work. That's that's what I tell myself anyway.

(09:29):
And there are a lot of people, I think,
who would be in the same
boat and say, well, I may only check it every
so often, but I get notifications automatically pushed, or I
am always logged in because Facebook wants you to feel
like it's more convenient to always be logged in. Not
all of these users, though, are real, right. CNN estimates

(09:54):
there are eighty-three million fake profiles floating around, and that's
probably a lower number than it is in actuality. Wow,
I mean, that's a sizeable chunk of one point something billion,
you know, users, But that still leaves you with over
a billion users, even if it's a hundred million, even

(10:16):
if it's two hundred million. There's still so many human beings
using this thing. And there are also people who have
multiple accounts of their own, so there's still a meat
sack at the other side of the keyboard or the phone.
But does that mean that only one of those profiles
or accounts is real? Well, it's true. It's like, you know, um,
some folks that we work with will make themselves a

(10:37):
work related Facebook what they call a public facing Facebook profile,
and then they'll keep their personal one, so they just
you know, friend their actual friends. Not that we're
not all friends with, you know, the people that like
the show and that we interact with. We're all internet friends.
But you know, definitely you have to separate it. I,
on the other hand, I only use my one because
I don't care and I'm an attention hog. Same. Actually,

(11:01):
on average, regardless of whether the user is real or
a bot, uh, the user will tend to spend twenty
minutes on Facebook per visit. Advertisers love this number because
that is an amazing amount of time to spend on
one website, especially one that's continually refreshing ads for you.

(11:21):
Can I ask you guys a quick question? Have either
of you ever been taken in by an online ad?
Has an online ad ever done anything more to you
than just register annoyance? Have you
had a successful interaction with an online ad where it
knew something about you and served you just the right
product or service that you were immediately like, I must
have this. Do you mean like a civilian thing? Yeah,

(11:42):
instead of going to complain about the ad? Never. I
always wonder, it just seems very ham-fisted
most of the time the way you get served ads online.
I'm just wondering, like, how successful is it really? I
don't I don't know, I can't think of one. It's
interesting because you're right, it reminds me of the early

(12:03):
days in the Terminator universe when they say the first
androids were easy to spot, right, they didn't sweat or
bleed or whatever, and they got increasingly sophisticated. That's what's
happening now. So it may be, well, in the ideal
world for some of these ad providers and data aggregation centers,

(12:23):
it may be that they want you to not know
that you are being taken in by an AD. They
want you to feel like you just had an epiphany
and realized you needed a Whopper. Yeah, you gotta wonder, like,
maybe even like the ads that I see as being unsuccessful.
Maybe they're somehow implanting that Whopper idea in my head

(12:44):
through subliminal means. I don't know, I don't know, or
they're just doing it outright. If it's a beauty product
or something like what my wife watches on social media,
it's, you know, someone that she likes, is interested in,
and follows, but then all of a sudden, a beauty
product shows up that the person is holding and will
mention just quickly, and then somehow she wants to buy

(13:07):
it all of a sudden. Well, and that's you know,
that's a super effective thing that's happened ever since we
were children with toys. Right. And then there's, right now,
for a lot of Facebook ads. If you have a
Facebook account, you have probably encountered the following cartoonish situation.
You have, let's say, made a big purchase. You have

(13:31):
purchased a new car and like, this is my new
Toyota Corolla, and that information bounces around backstage where you
can't see, and then boom, you all of a sudden
are inundated with ads for Toyota Corolla. And it makes
no sense, but that's what happens. You know, they're like,
you know what, this person who just bought a boat

(13:53):
for the first time probably wants five more boats. Why
not like a boat chamois, though, or like a boat
trailer, you know? Why not, like, you know, get
a little smarter and granular with it, because you're right
every time, like or even if like I've looked at
a product on a site and maybe I didn't buy it.
Maybe this is smart. It inundates me with ads for

(14:13):
that same product that I almost bought that I could
see being like, well, I guess I could buy it.
I keep getting reminded, maybe I do need it after all.
But the whole boat after a boat thing just seems
like a waste of algorithm. But if twenty people
ignore it and one person says, you know what, Yes,

(14:35):
is that the ratio, you think? Two boats? No, that's way
too high. It's way less. But we know that regardless
of the reaction, people are seeing this, and you don't
always have to be on Facebook to see it. On average,
those like and share buttons that go across all these websites,
they're viewed across ten million websites daily, again estimated probably

(14:56):
a lower-end estimation. Every sixty seconds on Facebook, five
hundred and ten thousand comments are posted, two hundred ninety-three
thousand statuses are updated, and one hundred and thirty-six thousand
photos are uploaded. And one in five page views in
the US occurs on Facebook. One in five, twenty percent

(15:19):
of every single page view. That is crazy. But how
did Facebook get there? How did it start? What
led us to this strange thing? Well, you have to
go back to the year of our Lord two thousand and four,
February fourth, in fact. It was the brainchild of a
man named Mark Zuckerberg, Marcus Zuckerberg. He and his roommate.

(15:45):
You've all seen the movie. We've all seen the movie,
Fantastic Movie. It's pretty good. Remember when you first heard
there was gonna be a Facebook movie and you're like,
how's that gonna work? Yeah, that's gonna be dull. I
had no idea there was actually compelling stuff behind the
creation of this. But yes, Zuckerberg and his Harvard roommate
Eduardo Saverin, they created this website, and it was built
after or off of another program he'd already created called

(16:08):
FaceMash, all one word. Yep. And, um, that one,
that one was created in two thousand three, a year prior
to it. So FaceMash was developed by Zuckerberg. He wrote
all the software for it, um, he made a website
when he was in his second year of college, and
the website was set up as this, it's like a
hot or not game. You've seen it before online. At

(16:29):
least around that time, there was even Hot or Not
dot com. That was a thing. You just say, yes,
this person is attractive or no, this person is not attractive.
And it was entirely female students, two female students side
by side, and then you would vote on who was
hot and who was not. So this sounds, yeah, like

(16:50):
Tinder, basically? Um, you know, I don't, I don't
participate in online dating apps. But was that
information stored? Was it just some kind of leaderboard,
I thought? I remember hearing something about that. Like, was
it literally, you do it and it goes away? Hey, man,
I don't know the ins and outs of FaceMash. They did,

(17:11):
they did store, they did store the data, and they
also at Harvard at the time, they had a physical
thing called a Facebook. There was more than one, and
these just had a picture of someone, staff, alumni, or
current student, and a little blurb of information about them,
sort of who's who kind of thing. And so Facebook,

(17:33):
the online platform, was named after this, because Zuckerberg
thought they would try to digitize it, and he's like, look,
I could do it better, and I could do it
in a week. And it turns out he was right,
because today Facebook is one of the largest repositories of
this sort of information on the planet. And you know,
to your question, I wonder, I wonder if somewhere deep

(17:57):
in his own secret files completely air gapped, if Mark
Zuckerberg is still going through his hot or not and
thinking for the world, for the world. Well, and you know,
it's one of those things where when you're creating a
Facebook like that, you're gathering information and one person or
a team of people are creating that physical book with
that information. Right. What Zuckerberg realized is that people would

(18:22):
voluntarily insert their own blurbs and then add pictures and
then videos, and then check into places and say where
I've been. He was onto something pretty crazy. That's kind
of deep in the heart of all of us. It's
very true. It's very true, and you know that's why
we're gonna get into this later. But it's hard to
accuse people of stealing your data when you give it

(18:46):
so willingly. Yeah, it's, it's that, I think. The
common human trait we're talking about is narcissism, is
look-at-me-itis, right, which we suffer from as
a species, but also compels our species to do some
amazing things. And now what happens. What happens when one
company has all of this information at its fingertips? Fast
forward way past two thousand four, but not as far

(19:08):
as you would think, and enter Cambridge Analytica. That's right. Um,
this company, Cambridge Analytica, which I knew precious little about
until, you know, the kerfuffle of late, was started in twenty thirteen.
The company marketed, and continues to market,
itself as a source of consumer research, targeted advertising, and

(19:30):
other data related services to both political and corporate clients.
So you know, in short, they're they're kind of mining
data and crunching the numbers, running it right. According to
The New York Times, it was launched with fifteen million
dollars of seed money, backed by billionaire Republican donor

(19:52):
Robert Mercer and Steve Bannon. Steven. Stevie B. Yes, the
advisor for the Trump campaign and later for a time
an adviser to the Trump administration, now a bitter enemy. He also
was the kind of power behind the throne of Breitbart,
this very divisive right-wing, um, website. And everything was

(20:15):
going along swimmingly until, that is, March of twenty eighteen, when
the house of cards began to tumble. And we will
sort through the debris when we return from a quick
sponsor break. Here are the facts. On March seventeen, reports

(20:38):
from the Guardian and The New York Times revealed that
Cambridge Analytica, again a data analysis firm that had worked
with President Trump's campaign, had harvested the personal information of quote,
around fifty million Facebook users without permission. Whether that's permission
from Facebook, whether that's permission from the users like they

(20:58):
would ask. This revelation came from a whistleblower named Christopher
Wylie, W-Y-L-I-E, who states that he
helped build the firm and worked there until twenty fourteen. Uh.
The way this worked is that Cambridge Analytica retrieved the
data from an outfit called Global Science Research, or GSR,

(21:19):
a company owned by University of Cambridge professor Alexander Kogan. Yeah,
and he collected all of this data in twenty fourteen. He used
a psychographic personality test app, um, and it was
called thisisyourdigitallife, all one word, just
This Is Your Digital Life. There were around
three hundred thousand people who actually, like, got this

(21:42):
app and installed it, and then they gave it permission
to collect their personal information from Facebook, including the city
where they set up the profile, the content they liked,
and the information about their friends. Now this is the
most important part. You gave it permission to collect
information about your friends, so that three hundred thousand people
jumped up exponentially. It's like even when you hear about

(22:05):
like data recon that, um, law enforcement agencies or
the FBI do, where you can know something about who
people interact with without actually knowing who they are, kind
of like you can see the web of communications and
it starts to give you a really clear picture even
without knowing the specifics about who's in the chain. Dude,
you can do that with the white pages, that's right,

(22:26):
I'm saying, like, with this, it's like they literally have, like,
all the information. So it's like there's no
end to the kind of stuff you could learn about
people with this kind of access, right? So this
is what happens. You get three hundred thousand people to
bite on the hook, then all of a sudden you've
got the information on over fifty million people, at the time.

(22:48):
That's what we estimated, exactly. And then the
other sticky problem with this. Without going too far
into big data, we can recommend that you check out
our earlier episode on it. If you would like to
learn some frightening things about Target, you can find that
on our website. Uh, one of the sticky and

(23:08):
insidious things about this is what Facebook would refer to
as frictionless sharing. Right. It's a very, very fluffy,
euphemistic phrase for the process that Noel and Matt are
describing here, and it does expand. It's like that, uh,
six degrees of Kevin Bacon game, although, using

(23:31):
their information that they had aggregated, Facebook later found people
are more like three and a half degrees separated from
each other, including maybe Kevin Bacon. I don't know, it's
an average. I mean, how often have you met
a new random person, and then you've
ended up befriending them on Facebook and realized that
you have several mutual friends, even if it's somebody from

(23:53):
like another state, you know what I mean? Like, it's
not that far-fetched. So the web starts to shrink,
or at least the access to it, the more
you think about, you know, these connections. Yeah, absolutely.
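
For anyone curious what a "degrees of separation" number means computationally, here is a minimal Python sketch: the average shortest-path length over a friend graph, found with breadth-first search. The tiny five-person graph is made up for illustration; Facebook's three-and-a-half figure came from estimating this same quantity, at enormous scale, on its real friend graph.

from collections import deque

friends = {  # hypothetical mini friend graph, for illustration only
    "ben": {"matt", "noel"},
    "matt": {"ben", "paul"},
    "noel": {"ben", "paul"},
    "paul": {"matt", "noel", "kevin_bacon"},
    "kevin_bacon": {"paul"},
}

def degrees(a, b):
    """Shortest number of friendship hops between a and b, via BFS."""
    queue, seen = deque([(a, 0)]), {a}
    while queue:
        node, hops = queue.popleft()
        if node == b:
            return hops
        for nxt in friends[node] - seen:   # unvisited neighbors only
            seen.add(nxt)
            queue.append((nxt, hops + 1))
    return -1  # not connected

people = sorted(friends)
pairs = [(x, y) for i, x in enumerate(people) for y in people[i + 1:]]
avg = sum(degrees(x, y) for x, y in pairs) / len(pairs)
print(f"average separation: {avg:.2f} degrees")  # 1.60 for this toy graph
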
And so, March seventeenth, this all goes public. Right,
the next day, Facebook bans Cambridge Analytica, and they ban, uh,

(24:13):
its parent company, and then they banned Kogan himself. They
also, on the same day, ban Christopher Wylie, the whistleblower.
March nineteenth, Facebook stock plummets, takes a nosedive. On the
same day, Congress starts to sniff around. Senator Edward Markey
from Massachusetts says, hey, we gotta hold some hearings against

(24:33):
these Facebook and Cambridge Analytica jokers, and, paraphrasing,
specifically he wrote to the ranking members of the Senate
Committee on Commerce, Senator John Thune and Senator Bill
Nelson of South Dakota and Florida, respectively, and Markey says
we need to move quickly to hold hearings. Since Facebook
is required to quote obtain explicit permission before sharing data

(24:57):
about its users. Facebook holds an emergency meeting. Uh.
They also, and this is enjoyable, uh, they also, for
everyone at the meeting, their version of an all-hands,
they have a poll for the employees, like, you know,
what do you think? Uh. Zuckerberg doesn't say anything about this
for three days, and it's not until March twenty-first

(25:17):
that he addresses it with a post on his Facebook account.
Got to respect the consistency. And he says what happened
is his responsibility, but there are changes made to
the platform that will prevent it from happening again. He
proposes a three step plan going forward to investigate third
party apps, restrict developers' access to user data on the whole,
and give users tools to see what information of theirs

(25:40):
is being shared. So, man, if I didn't install This Is Your
Digital Life, was my data breached? Yeah, quite possibly.
So can you explain? So there's this thing
where in the rules of Facebook, if you're an advertiser
and you are collecting data in this way, it says
that you can, as the developer of an app, take

(26:01):
like glean the information of friends of the people who
have actually installed your app or whatever. However, you have
to use that data in a very limited way, and
only to improve your app that you're using. You can't
actually sell that data to a third party or use
that data for any other reason. That was the
original like rules that were set up. And then just

(26:22):
to clarify, that's exactly why they made that change where
the third parties don't have quite carte blanche access to
um, your friends' data, only to yours directly, because you
have opted in to that third party. Again, it's restricted.
That's a lot like, well, that's a lot like flame
retardant or water resistant. It's not proof, not water or

(26:44):
flameproof. It's still not banned. Yeah, there's
still a fire. And this is where some dicey things
start happening. On March twenty-second, this all happened so quickly,
President Donald Trump goes on Twitter, which is his social
media of choice, and he appears to boast about a
partnership with Cambridge Analytica. He doesn't name them, but he

(27:06):
says people are no longer claiming his use of social
media was less robust than Clinton's, that being Hillary Clinton,
during the twenty sixteen campaign and election. And while he doesn't
outright name them, the timing suggests that he was talking
about Cambridge Analytica. It's sort of like that boast he made where
he says, I don't pay my taxes, so I'm smart.

(27:27):
It's kind of like him saying I figured out how
to exploit social media, so I'm smart. Yeah, I guess
you know. I don't. I don't follow his tweets very closely.
Isn't it, don't you think, sort of an
odd thing to say, given how controversial this stuff has
been for him to come out and say that, if
he wasn't kind of trying to vindicate himself in some

(27:47):
way or say that, like, I don't know, it just
seems a little boastful to me. It's puzzling, indeed.
Yeah, he did. I mean, it's clearly a boast. On
March twenty-third, Elon Musk, our modern Tony Stark, our
modern Tesla, I guess he would want us to say,
gets spooked and deletes his SpaceX account. He deletes

(28:08):
the Tesla account. Zuckerberg is invited on the same day
to testify in front of the House of Reps, House representatives,
many of whom will go on to thoroughly embarrass themselves
with their lack of knowledge regarding social media platforms and
technology in general. I saw a really good meme where
it was like a screenshot of all the people
that were quizzing him, and it just had like fifty

(28:30):
spyware search bars like stacked up on top of themselves,
like in their browser, you know, like those ones that
like, automatically installed, and you can't get rid of them. Well,
they can't, they'd fill their whole screen. Go find his photoshopped
visit. It's worth it. Yeah. Well, and here's
the whole other thing that we're not even gonna get
into too deeply in this episode. But yeah, some of

(28:53):
the some of the ways that Cambridge Analytica was using
this data included influencing, just, you know, the American
presidential election, the entire Brexit situation. Yes, that is true.
That is true. And this is, well, they're a
UK-based company, too, Cambridge. So this is the primary

(29:15):
impetus for officers from Britain's Information Commissioner's Office getting involved.
They're a data watchdog group, so they raid the offices
at Cambridge Analytica in London. They take their time, they're
there for seven hours. I heard an NPR piece where
someone, like, was able to get their kind of dossier,
their digital you know, dossier that Cambridge Analytica assembled about

(29:39):
them particularly, and it's not the most alarming kind of
sounding stuff that you might imagine. It's like, who
are you? What party are you more likely to identify with?
You know, it represented that kind of stuff. And Zuckerberg,
even in his testimony, said this is all stuff people
would share openly themselves. So, you know, that was him
in an attempt to absolve himself. I was wondering what
you guys think, like, how is this different? Like,
what makes this so different? I want to walk
you guys think, like how is this different? Like how
what what makes this so different? I want to walk
through the rest of the timeline and get to the
testimony here, because that is a good question. So the organization,
the Information Commissioner's Office, was given the search warrant by
the High Court judge to determine whether the group tampered

(30:22):
with Brexit, as, Matt, you just brought up, and
they are currently still analyzing and considering the evidence. Right,
then in March, Zuckerberg, who has become at this point known
as the Zuck in a lot of online forums, takes
out a full-page ad in multiple UK and US papers.
It's a profound apology, repeats his three-step plan.

(30:44):
The FTC the next day confirms that they're investigating Facebook
for these privacy practices. Then Zuckerberg refuses to meet
with British lawmakers. He sends one of his employees, but
he does agree to meet with US Congress. And while
Britain is investigating this, Wylie, that whistleblower from earlier,

(31:05):
tells Parliament that Facebook uses microphones to improve ad targeting,
which we've talked about before on this show. What do
you guys think? Do you think that's a real thing?
Do you mean like on your phone? Yeah? Yeah, yeah,
I've certainly, and we all have, I think, had a
situation when we were chatting about something and then got
served with an ad about it the next day or later.
I mean, I yeah, I think it's highly plausible. I'm

(31:26):
utterly convinced, even though I have not seen absolute, um
like, damning evidence, I'm convinced. You know, we
have to ask ourselves, at what threshold can the sheer
number of anecdotes and firsthand experiences be accepted as as
you said, Matt, damning evidence. I think we're close to

(31:47):
that threshold. There's something we should fast forward to:
on April fourth, buried in a blog post about changes
Facebook is making to, quote, clarify its terms of service,
like that, clarifying, not correcting, the company casually increases the
number of people affected by Cambridge Analytica's efforts from
fifty million to eighty-seven million. Okay, just a couple more,

(32:11):
just a few, just a few. Almost half again,
these are individuals that were compromised, right? Yeah, yeah, yeah.
And they do go out of their way to say
people, not profiles, because, they think, their PR team is
working around the clock to humanize it. Do you think they've
called out, like, a lot of it? We talked about

(32:31):
fake profiles and bots and stuff. I mean, how do
we know that number? Like, how many in that number
do you think could be fake profiles? And like, I
don't know, do you remember the number of fake profiles
from the top? No. It was high, eighty-three
million? So what if these are all fake? No,
they're not all fake, it's just, it's a proportion.
I'm sorry, I just I think it'd be hilarious if

(32:53):
they only got fake profiles. That would be you know what,
that would be a relief for everyone involved. Except for
the bot masters. But they continue, to their credit. Facebook
suspends two more companies doing similar stuff, one called CubeYou
and one called AggregateIQ, and then they
start informing users who were affected by the breach. And folks,

(33:15):
you're listening to this today, the odds are you probably
have some sort of Facebook profile, And if you have
some sort of Facebook profile, the odds are not insanely low.
It's not super likely, but it's quite possible
that you opened your Facebook one day and you saw

(33:36):
a notice that says that you may have been affected
by this, because either you or a friend used the app
This Is Your Digital Life. Oh, This Is Your Digital Life.
That sounds like a soap opera or something for like
the future. Yeah, oh man, or a Black Mirror episode, right,

(33:57):
that'd be a good one. I think, yeah, we should
pocket like at that. So they appear to have also
harvested not just the what we will call like the metadata,
the connections, the time of log in, the time spent somewhere,
but they've also harvested content, private messages. Because that we're
harvested again, that's isn't that that's dark to me? That's

(34:19):
really creepy. The other synonym is scraped. Yeah, I hear that a lot. Yeah,
uh, so, they said a small number of people who
logged into This Is Your Digital Life also shared their
news feed, timeline, posts, and messages, which may have included
posts and messages from you. That's according to Cambridge Analytica and Facebook.
They also gained access to information from the friends of

(34:41):
the app's users, and it still, honestly, is not clear
how many users were affected. Sounds like a disease. It's
like, it's, you can contract it from others. It's a
vector, from contact. And astute listeners, you noticed earlier that
Mark Zuckerberg himself refused to go to British lawmakers, but

(35:06):
did consent to go to US lawmakers. And we were
kicking this around a little bit off air. Why why
do you think that is? Well, I mean, he spent
two days giving testimony to the United States lawmakers.
It sounds so much better when you call
them lawmakers instead of just you know, the House or

(35:27):
Congress. Um. But anyway, yeah, he spent two days there,
and he had not much of a problem at all
fielding questions, like you said, Ben, from people who had
barely a grasp on what online advertising is, or
what an app actually is, or how data transfers occur,
and correct me if I'm wrong. But it's my understanding

(35:50):
that British lawmakers do, more so than American lawmakers,
or at least they know how to dig deep and
hold people's feet to the fire, maybe a little
bit more than we do. It felt like this is
a bit of a softball kind of situation. Oh agreed. Yeah,
And this goes into maybe journalistic culture too, because BBC

(36:11):
reporters are generally thought to be harder hitting as journalists.
I've got a short list, just a couple of choice
questions, if you want, lawmakers asked, uh, Mark Zuckerberg. Would you
like to hear some? Please. Okay. Senator Lindsey Graham from
South Carolina asked Mark Zuckerberg, is Twitter the same as

(36:34):
what you do? Lindsey Graham asking, like, Lindsey Grandpa, am
I right? Oh boy. Uh, Florida Senator Bill Nelson is
fond of a particular type of chocolate and he had
mentioned that fact to some Facebook friends and now he's
seen ads for that chocolate. So he said, what if

(36:54):
I don't want to receive ads for chocolate? Yeah. Uh.
And then Senator Roy Blunt from Missouri says, my son
is dedicated to Instagram, so he'd want to be sure
I mentioned him while I was here with you. It's
not even a question. Can I, can, can
you sign this for me, Mark Zuckerberg, for my grandson? Yeah. Man,

(37:18):
Good lord. How much more? This is comically softball.
And again, like, I mean, you know, there's probably a
committee that, like, knows how the internet works. Or why,
why isn't the FCC involved? I don't know. It
just seems very, like, what's the word? Just
a lot of pageantry and no real, like, action,
like a lot of theater. And you

(37:39):
can, you can read, uh, in depth some of these,
some of these questions. Matt, you really
took one for the team, because you watched
a lot of this. I did. I started watching the
first day, and I got lost after about forty minutes.

(37:59):
I just, it became noise, and I just shut
it down. Can you do an impression of Zuckerberg? I
don't know that I can. I was listening to NPR
and they were kind of summing up some of the
major parts. And I heard him say this is a quote,
so I guess I'll try it here. We didn't take
a broad enough view of our responsibility and that was
a big mistake and it was my mistake, and I'm sorry.

(38:23):
I started Facebook, I run it, and I'm responsible
for what happens here. And I swear I am human. Yes,
I swear. Hello, friend, it's just the responses that were
given to a lot of the major things. As you
can expect, since he's under oath, he is, he's on

(38:43):
the surface. Everything is on the surface, and he just
seems pretty surface-y in general. And I don't know, I
don't know much about the Zuck, man. I feel like
when I talk about the Zuck, I have this bit
of envy, just about how young and successful he is. Really? You
think he's happy? Nah, I think he'd feel really isolated,
and and I don't know, you know, more money, more

(39:06):
problems than me. He's just trying to connect to the world. Well, Mark,
if you would like to hop on our show in
the future for an interview, or if you just need
some actual friends to maybe go bowling with you or something,
let us know, hit us up on Facebook where we
are conspiracy. Yes, just so I will say one last thing.

(39:27):
There was one, even though it's kind of a
low-blow shot. Senator Dick Durbin, at one point,
all of a sudden, asked Mark Zuckerberg what hotel he
was staying at. He's like, well, so, what hotel
are you staying at while you're here? Would you be
comfortable revealing that information exactly? And you know, Mark of
course is like, no, absolutely not, because again, all of

(39:49):
the information sharing here that we're talking about is voluntary. Yep.
And he does, he has a point. It's an
interesting argument because a lot of this is based on
the inherent problems of social media. But it's a contract
that we all sign on for and we accept the
trade off, you know, because surely we all took the

(40:12):
time to read the terms of agreement in full. Now, Ben,
I would think it's possible, mind you, possible that you
have. It is quite possible. But one thing that is
impossible for us to do now is to pretend that
this is not an ongoing thing, or pretend that there's
a full, solid number of people that

(40:33):
we know have been impacted. We can't predict the future,
but what we can do is explore the troubling implications,
after a word from our sponsor. Here's where
it gets crazy. This, granted, even though the three of

(40:57):
us are hilarious, sounds kind of dry, you know
what I mean. But that's because we're at the beginning
of a much longer story, in a much larger revelation.
It's a story that's been going on untold silently for
you know, since shortly after Facebook was opened to anyone

(41:19):
who wanted to log in. And while those investigations are ongoing,
it's safe to say more and more unpleasant discoveries linger
on the horizon waiting to be discovered. One of the
first questions is what other operations are occurring at this point.
We don't know. Very few people actually know. Like,

(41:39):
just before we hopped in the studio today, a
few hours ago, it was revealed that Cambridge Analytica was
planning to launch its own cryptocurrency. They want to raise money,
don't they? They want to, they want to raise money
to pay for the creation of a system to help
people store and sell their online personal data to advertisers.

(42:00):
Come on. True story. Called Dragon Coin, after a famous
gambler in Macau. But I thought we agreed that, like,
while in aggregate, on the whole, our data is worth
a lot, my data is not worth a lot. Your
data is not worth a lot. I couldn't sell my
data and have it... Maybe, I mean, maybe, correct
me if I'm wrong. Oh no, the thing

(42:21):
is they were hoping to build a system that would
combat the same kind of activities they did, the idea
being that, who knows better how to build a
henhouse than the fox who got fat? Ah, okay,
that's interesting. I understand what you're saying. Almost like a
password wallet or something like that, or like a vault
that you can keep your stuff under wraps, which would

(42:44):
also be supervised by Cambridge Analytica and their parent company.
There's this other question, which is, uh, is it
that bad? What gives, you know? The effort for
this was overseen by their former CEO, the disgraced
chief executive, a guy named Alexander Nix. He was forced

(43:06):
out of the company in March of this year, because he
was caught on tape bragging about his company's approach to
political work in other countries across the world, including using
shell companies and strategies to entrap opponents. Yeah. They
famously state, and this is Cambridge themselves, that their database
contains over five thousand data points on nearly every American consumer.

(43:30):
And if you do some due diligence, like Noel did,
then you will see that a lot of it seems
kind of innocuous. That's what I was talking about earlier.
I think I jumped the gun a little bit in
talking about that NPR piece I heard, where a gentleman, um,
was able to obtain his Cambridge Analytica profile, let's
call it. I think I said dossier. Um. And he
was pretty sure that it wasn't all of it, because

(43:52):
it certainly wasn't that five thousand points, but it was,
you know, a pretty basic little collection of data points that, yes,
you could absolutely glean from just kind of trawling someone's feed,
their page, you know, their wall, all the
kind of, like, is this person a Republican or a Democrat,
you know, most likely? Um, what kind of
income bracket are they in? And this is all stuff you

(44:13):
could kind of figure out by looking at pictures and
seeing the kinds of stuff people actually post about themselves already,
but then, much fewer than five thousand, and he was
pretty sure that they weren't going
to give him all five thousand. It would be stuff...
I love that point you make about a lot
of this stuff being things that you or I could

(44:34):
find just by looking at someone's page without even being
their Facebook friends. Well, it's not my point. It's Zuckerberg's point,
he said that to US lawmakers. That was sort of
his defense, right, I mean, and that's why I think
this is a really interesting question that you you raise, Ben,
isn't that bad? And I want to know
what's the difference between the stuff we share and what
Cambridge Analytica is doing with this stuff they're harvesting from us. Yes, okay,

(44:59):
It becomes murky and dangerous when it goes into
political influence. So Cambridge Analytica did use this information, this
data, to influence elections. Per Matthew Oczkowski: at Cambridge Analytica,
we break up our data into three buckets, political, commercial,
and first party. We work with several of the main

(45:21):
voter file providers depending on the preference of our clients
to access vote history and voter profiles, and then they
couple that with top commercial data providers, on a
licensing basis, for stuff like general demographics, geographics, purchase history,
interests, and so on. Then they get R and D

(45:42):
project data, internal surveys, research, what they call exclusive data relationships, which,
as you know, in this kind of conversation, the most
blase phrase is usually hiding the strangest things. Uh. And
data collection through direct response projects, which is actually asking
people what they think. Whistleblowers, not just Wylie, alleged that

(46:05):
Cambridge worked with Steve Bannon to sway the vote starting
as far back as twenty fourteen. And this is not really
a big deal, it would seem. It's not unique, either.
Everybody we ever meet is going to try to persuade
us to do something without falling into relativism, right, Like,

(46:28):
the three of us are pretty opinionated people, and we're
trying to persuade each other of stuff all the time. Usually
we agree, but if it's a larger company, that doesn't
make that impulse itself inherently nefarious. Until we get to
what Nix actually said. Yeah, he was secretly recorded being

(46:50):
a bit of a braggadocio, a bragga-douche, trying to
squeeze a line in there somehow. But yeah, he was
bragging about how his firm, Cambridge Analytica, was largely responsible
for the election of Donald Trump as president of the United States.
He described all kinds of questionable practices, um, that they

(47:13):
used to influence foreign elections. Like, because, remember, this is
a British firm, and they're influencing the American election. But
wait, I thought Russia was influencing the election, you might say. Oh, well,
maybe it was also our friends over in Britain. I mean, look,
I just want to say I pulled up this press
release from Cambridge Analytica's website and they they lay out

(47:33):
a few points here. One, no laws were broken. Cambridge
Analytica did not hack Facebook. Two, Cambridge Analytica did not
use the data or any derivatives of this data in
the US presidential election. Uh, three, Cambridge Analytica did
not work at all on the Brexit referendum. Four, Mr.
Wylie is not a whistleblower. It goes on. Well, well,

(47:53):
in this secret recording of Alexander Nix, the former chief
executive of Cambridge Analytica, he said that his firm did all
the research, analytics, and targeting of voters for Trump's digital
and TV campaigns. He was recorded saying that. He also
said that he had personally met Trump, when he
was the candidate many times, and that they deployed dirty

(48:14):
tricks including old school espionage stuff like honey traps, setting
someone up with a sex worker, prostitute, fake news campaigns,
fake bribery scandals, operations with ex-spies and espionage agents
to swing election campaigns. And this was recorded
by an undercover reporter working for The Observer. What they
did is they posed as a member of a wealthy

(48:36):
family from Sri Lanka seeking political influence. And then, when
the reporter, as this wealthy Sri Lankan person,
asked if Cambridge Analytica could offer investigations into damaging secrets
of rivals, digging up dirt, essentially, Nix said that Cambridge
worked with former spies from Britain and Israel to look

(48:57):
for dirt and said they were doing it as they
were speaking. He also volunteered that his team was ready
to go further than uh, just a keyboard investigation. He's like,
I've got one guy, he's a master of disguise, we can
send him, he can be anybody. It's like, also, we
can entrap people with some sex workers, bring a few
Ukrainians with us on the trip, if you know what

(49:19):
I mean, have a pee-pee party. That was the
implication, something, something untoward, right. And, uh, then he said,
he said, deep digging is interesting,
but you know, it can be equally effective to just
go in and speak to the incumbents and offer them
a deal that's too good to be true and make
sure the video of it is recorded. Hence the dirty tricks.

(49:42):
And also, okay, so according to Wylie, the whistleblower,
he says, quote, we exploited Facebook to harvest millions of
people's profiles and built models to exploit what we knew
about them and target their inner demons. That was the
basis the entire company was built on, alleged the whistleblower. Bro,
that's what we have to... Oh yeah, we're gonna get

(50:02):
alleged heavy here. Allegedly heavy. That's not a bad name
for an album of some sort. We should put that
in the book, Allegedly heavy. If it's heavy, allegedly, I
feel like there's a good, uh, we've got a concept album
here now. Two discs. Uh. But the question remains, like,
what is wrong with this, in a legal sense?

(50:27):
If we have signed a contract, essentially, if we have
assented to having this data traded in exchange for using
this free service, then what's the big hubbub? The
problem here is that it was used without notifying Facebook users,
and Facebook really does not want this to be characterized

(50:49):
as a data breach for a number of reasons, one
of which is a majority of American states have laws
requiring notification in different cases of data breaches, including California,
where Facebook is based. Have you ever gotten one of
those emails from your bank or Target or something saying, hey,
you were part of a potential bundle of a trillion

(51:10):
users who may or may not have had their information compromised?
You should probably change your password. They didn't just send
that to you out of the goodness of their heart,
did they? No. No, there's a lot of CYA
there. Uh. And as we know, being a family
show, that stands for cover your actions, or aardvark,
or cover your aardvark, which I like better, actually. And

(51:31):
the ultimate goal of Kogan, who worked at the University
of St. Petersburg, by the way, and received Russian grant money,
was not just to collect this and build an
anonymous database. He was working with Cambridge Analytica to build
a database of identified profiles, or what they would call
matched profiles. They started with the aim of making two

(51:53):
million matched profiles, meaning tied to electoral registers in eleven states,
with room to expand much further, ultimately aiming to get
all fifty states. To what end? To send them
targeted messages about candidates, very specific, personalized, targeted messages.
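
As a purely hypothetical sketch of what building "matched profiles" could look like, here is a naive Python join of harvested social data against a voter file. Every field name, the sample records, and the name-plus-city matching rule are invented for illustration; the actual pipeline was never made public.

# Hypothetical "matched profile" join; names, fields, and the matching rule
# are invented for illustration, not Cambridge Analytica's real schema.
harvested = [{"name": "jane roe", "city": "atlanta", "likes": ["fishing", "nascar"]}]
voter_file = [{"name": "jane roe", "city": "atlanta",
               "registered": "GA", "vote_history": [2012, 2016]}]

def match(social, voters):
    """Naive join on (name, city); a real matcher would be fuzzier and richer."""
    index = {(v["name"], v["city"]): v for v in voters}
    return [{**person, **index[key]}                 # merged "matched profile"
            for person in social
            if (key := (person["name"], person["city"])) in index]

# One merged record: app-derived interests plus registration and vote history,
# which is what makes per-person targeted messaging possible.
print(match(harvested, voter_file))
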
So, like, if you're working for, uh, if you're working for

(52:16):
the Republican Party or something that's your client, right, and
you want to split a vote, you could say, well,
here's some fake news about Bernie Sanders or Hillary Clinton
or something. Or you could also put out some fake
or embarrassing news about a Republican rival and say, oh,

(52:41):
Senator Ted Cruz is, I don't know, stealing Peeps from
kids on Easter Sunday. Yeah. And you can do it
with a Facebook ad, Like you could buy a Facebook
ad spot that only targets one human person. And I
think another thing that came from this, in terms of
Facebook changing their guidelines, there are going to be, well,
they say they're going to be much more strict

(53:02):
about selling political advertisements, that they're going to be much
more, like, aware of where the business is coming from, or
something, right? Yeah, and that's a
tough question too, because, you know, it's very easy to
vilify Facebook, no? But it's similar to the problem
with YouTube, like how can YouTube catch every falling sparrow

(53:25):
of inappropriate stuff people post? The Wild West, man. You know,
I bet you it's gonna come out in this that
Facebook didn't really break the law. That's not necessarily
true of Cambridge Analytica, but it doesn't seem like it's,
like, the thing. It's like, why wouldn't you do something
if there's no monetary consequence in
place for doing it, you know what I mean? Like,

(53:45):
that's what regulations are all about, right? So, but you
also don't know how to make a regulation until the
case comes up where it's needed. And I think that
we're going to see that now. But I do think
that Facebook probably didn't break the law. It's interesting because
you know, legislation lags so far behind technological innovation, you know,

(54:06):
which we've talked about in the show for for long
time listeners. But there's something else, a big implication that
a lot of people are just now becoming aware of,
in a real spooky one that that might surprise some folks.
And that is the question. Okay, let's say I don't
have a Facebook profile. Let's say I've never logged onto

(54:28):
Facebook or Instagram, I know, they're fingers on the same hand,
and Twitter or whatever, and I didn't even, I didn't
even mess with MySpace or Friendster. How much
could Facebook know about me? Well, they know a whole
heck of a lot because of things that are called
shadow profiles, called shadow profiles by literally everyone but Facebook. Yeah,

(54:50):
so, like, we're going to call them shadow profiles
because it's an awesome sounding, ominous thing. Here we go.
You may have never started Facebook, right, you never even
opened it up, not once, not a single time. But
guess what, I hate books and faces. Yeah right, I'm
with you, brother, but I'm off the grid. Matt, you
are off the grid. But you've got this friend that

(55:10):
works over at the Piggly Wiggly. Here's the problem. She
loves Facebook. She's on it all the time, right, and
you know, you guys, you're just pals, so you're texting every
once in a while. On my flip phone. Well, yeah,
you're on your flip phone, but she's got that sweet
iPhone five and uh, you know she's on Facebook with
her app. That could be a problem for you. And

(55:31):
then she maybe saves your number and labels you as
Hot Maddie. Yeah, Hot Maddie, in her contacts. That's
the off-the-grid guy that I am. And my
Piggly Wiggly paramour. Yeah, yeah. And so, yeah,
so then Facebook has that piece of information.

(55:52):
And then let's say, because you allow Facebook to see
your contacts. Yes, and depending upon your interaction, or
the amount of trouble you're willing to go to, it's
very easy for Facebook to do that. In some cases
you cannot avoid it. And you do have to opt
in for that, though. You notice, in any apps like that,
as soon as you do an action on that app
that requires it to have access to your camera or

(56:14):
have access to your microphone or have access to your contacts,
a little pop up comes up and you have to
say yes, and you know it's a permission thing. But
I guess once you've done that. That's just for you,
it's not for those around you. Well, it's also if
you use an iPhone, it's going to be a tad
bit better about that. And the bluewear can be put

(56:34):
in by a provider. I see that in that bluewear,
maybe such that you would have to perform a process
called rooting your phone to actually get rid of it.
You mean on a on an Android, like on an
because, if you've... I've always heard that PCs
are more susceptible to spyware and things. I guess that's
true of the phones as well. Yeah, I think yeah,
I could easily see that. So let's just let's just

(56:56):
quickly jump into why Facebook hates that term, shadow profiles.
Oh, why do they? Well, it's because it sounds like
they're straight up making these hidden profiles somewhere on your
phone or your app, or maybe just in Facebook at large,
profiles of people, people in your contacts list, that don't
have a Facebook profile. And they say, that's not what

(57:17):
they're doing. It doesn't do that. But in twenty thirteen,
Facebook said it discovered and fixed a quote unquote bug.
The bug was that when a user downloaded their Facebook file,
which you can still do today, and which will give you

(57:38):
all the information they say they have on you. It
will give you, at least, all the information you're
allowed to see that they have on you, all your likes, comments,
et cetera, your messages. The thing was that this
quote unquote bug included not just people's visible contact information

(57:59):
for their friends, but also their friends' shadow contact information.
So they were seeing stuff that they weren't supposed to see.
And the problem with the bug for Facebook was not
that all of this stuff was lumped together. It was
that it had shown people that it existed, so the
extent of the connections that it built around every user

(58:19):
was supposed to only be visible to Facebook. And they
admit that this is, in their phrase, getting information
from a friend or someone you know or might know.
But what does that mean? That means anyone at any
point who might have somehow labeled your phone number, your email,
or even your physical, trying not to curse here, even

(58:42):
your physical address, will be added to that agglomeration of information that is you. So whether it's the
piggly wiggly Hot Maddie, whether it's an old email address
from college that a friend of yours has, you know,
and it's like, um, it's like hot dot Maddie at

(59:06):
u g a dot e d u. It was Obadiah fourteen at A O L! I knew it was him. Dang, it was him the whole time. Well, Obadiah. I always thought he looked like an Obadiah. Yeah, I've always thought that. Yeah.
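To make that agglomeration concrete, here's a minimal sketch of how contact uploads from many users could be merged into one record keyed by a shared identifier. This is a guess at the general technique described above, not Facebook's actual code; the names and identifiers are made up from the hosts' example.

```python
from collections import defaultdict

# Illustrative contact uploads from two different users' address books
# (hypothetical data echoing the example in the conversation).
uploads = [
    {"uploader": "piggly_wiggly_pal", "name": "Hot Maddie",
     "phone": "+1-555-0100"},
    {"uploader": "college_friend", "name": "Matt",
     "phone": "+1-555-0100", "email": "hot.maddie@uga.edu"},
]

# Group every upload by a stable identifier (here, the phone number).
# A real system would normalize numbers and cross-link emails too.
profiles = defaultdict(lambda: {"names": set(), "emails": set(), "sources": set()})
for entry in uploads:
    record = profiles[entry["phone"]]
    record["names"].add(entry["name"])
    if "email" in entry:
        record["emails"].add(entry["email"])
    record["sources"].add(entry["uploader"])

# One phone number now ties nicknames, emails, and uploaders together,
# even if the person it belongs to never signed up for anything.
for phone, record in profiles.items():
    print(phone, record)
```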
So this means that the average user is much more

(59:27):
exposed than they were led to believe. And Facebook's position
is that mistakes happen and things will be improved. And I think, Noel, you make a really interesting point and prediction here. Will
it turn out to be something illegal? Do laws exist
for this sort of situation? It seems like right now

(59:49):
not many do. It seems like this is a case
that will lead to change. We're already seeing it, but
you know, big, big changes that tech companies will have to abide by. And it also makes you wonder, is this
going to affect user experience in a way that is unpleasant?
Because I think it might. I think some of the
things that we take for granted on an open internet

(01:00:12):
or on an app like Facebook are largely because of the lack of barriers in some of these kinds of things, and we may find ourselves being inconvenienced
because of fallout from the story. Right, I mean, that's a good point, and it's a great perspective to raise, because as we know, history
proves that privacy as a concept is a relatively recent phenomenon,

(01:00:39):
you know, the idea of privacy as we understand it did not exist in centuries past, and it may well not exist by the end of this century, or even by the end of this next decade. But the other thing is, this is the current lay

(01:01:02):
of the land, folks. The other thing is that, as
far as Facebook is concerned, none of that information that
other people have about you counts as your information. Yeah,
it belongs to the people who uploaded it. They're the
only ones with any control over it. That's why, for instance,
you can remove a tag of yourself on a photograph, right,
and this is not shadow stuff. This is public profile stuff.

(01:01:25):
You can remove a tag of yourself in a photograph, but if you want the photograph itself taken down, you have to get the person who put it up to take it down.
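A tiny sketch of that ownership model, with hypothetical classes rather than Facebook's real API: the tagged person can remove their own tag, but only the uploader can delete the photo.

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    # The uploader "owns" the photo; tags merely point at other people.
    uploader: str
    tags: set = field(default_factory=set)

    def remove_tag(self, requester: str) -> bool:
        # Anyone can untag themselves from the photo...
        if requester in self.tags:
            self.tags.discard(requester)
            return True
        return False

    def delete(self, requester: str) -> bool:
        # ...but only the uploader may take the photo itself down.
        return requester == self.uploader

photo = Photo(uploader="friend_of_yours", tags={"hot_maddie"})
print(photo.remove_tag("hot_maddie"))  # True: the tag is gone
print(photo.delete("hot_maddie"))      # False: you'd have to ask the uploader
```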
This is dangerous stuff, and this is why, even though it's not illegal, it could put people in real danger. Someone who cannot
have their identity known. Yeah, like an undercover cop for instance,

(01:01:48):
or let's say a prosecutor for criminal cases, anybody in
law enforcement. Anybody in law enforcement. Let's say people who make their living in, uh, well, it's not legal in a lot of the US here, but somebody who's working
in the sex trade or something. And you can find
these various stories of people who deleted their Facebook because

(01:02:11):
they were in a profession like this and they got
the scariest thing that can happen, which is someone they knew, maybe even someone they, like, put away in jail or something, pops up on the People You May Know list,
or, for instance, there are these little things, they're just Twilight Zone moments. Like, you'll hear stories about someone who

(01:02:33):
works as a receptionist at an office, meets somebody for the second time, and calls them by their nickname
in their second conversation because they popped up on their
Facebook list. Or what if you are a therapist and
your clients start popping up on each other's Facebook lists.
This is, I mean, this is dangerous stuff, especially when

(01:02:54):
what we're getting to is, um, regardless of the intent,
what we're getting to in action is a violation of confidentiality.
It's a dangerous thing. But then we have to ask, and I keep going back to this question, Noel: how can governments police international entities like this? Should they even be trying? You know, it seems like the EU is

(01:03:17):
taking this more seriously than the US. But there's pretty compelling evidence that this is not in any way new. We've talked about Alphabet and Google, talked about the CIA
and Facebook, Cambridge Analytica. It's just, it's a gimme. It's a home run. If we were intelligence agencies, or

(01:03:38):
if we were these businesses, we would feel like it
was foolish not to be doing this. It feels like
there needs to be some kind of United Nations style
thing just for digital matters, for international digital matters.

(01:03:59):
Some really good points. Something where it's like an international body that can make decisions and say, anything that occurs on networks that go across the lines of states and countries, you are beholden to our group. I say we set it up and then immediately neuter

(01:04:20):
the heck out of it. Yeah, yeah, yeah, let's give
it the League of Nations treatment. Uh, and I think it needs to be run by a group of AIs. Paul's nodding. Robots. Mark Zuckerberg will be the head. Oh boy,
have you seen those memes where they like paint him
up like Data from Star Trek? Yes, it's really

(01:04:42):
good, and it eerily works. But I am kind of serious, with people that actually know what they're talking about and understand it. I mean, I think there
should be a body that at least can enforce regulation. Well, there's a wealth of bodies in the US

(01:05:03):
and abroad that were built with that purpose. But of course, as you mentioned before, they're a little close to the entities and institutions and industries that they are supposed to oversee. Well, the question is, what happens next? Is this gonna
be a flash in the pan, news-wise? Will
the news cycle move on to a different thing as

(01:05:25):
the US and Russia continue to wage war for the
pipeline in Syria? I'm sorry, the, wait, what did you say? Proxy war? Freedom? Whatever it was. But, you know, is this going to
stay in the headlines? Will we ever know the full story,
the full cast of players, the extent of their involvement.

(01:05:47):
Will anybody at Facebook be charged with a crime?
Will anybody at Cambridge Analytica? Did Facebook even commit a crime?
You know, that's what I keep going back to.
Cynics in the audience would probably say, no, guys, a few token scapegoats will be propped up for some public pitchforks and political theater, and then it'll be business as usual. Yeah, man,

(01:06:13):
but, uh, that's our show for today. By the way, we're on Facebook. Yeah, yeah. If you have any thoughts on Cambridge Analytica and Facebook's use of data, or data, however you say that, uh, you know, just write to us. You can find us on Facebook. We are Conspiracy Stuff. Like, really, just find us there. Make

(01:06:36):
sure you put any and all identifying information just directly into the comment box. Okay, um, but just, you know, put as much as you possibly can. List your fears, your blood type, things that you worry about when you walk into an elevator full of strangers, you know.
We had a funny realization the other day. We were
looking at some of our breakdowns of like listenership and

(01:06:57):
there's a really huge percentage of our listenership that comes up in the data that we get as unknown. Or, yeah, and Ben immediately was like, well, that's
probably because our listeners might be more likely than those of some other shows to use a VPN. Yeah, we know what
you're doing right now with all your encryption, and we
applaud it. We think it's awesome. VPN, virtual private network. Right, yes,

(01:07:23):
let us know what you think about this. What do
you think about the scandal? For example, would you be okay with it if you received some sort
of compensation other than the use of that platform, like
if you got a check every three months or something, where they said, hey, great job, here's for being you.

(01:07:43):
Perfect. Yes, here's for being you. Here's two dragon coins. Two dragon coins, uh, and a couple of embarrassing pictures of you from middle school. Don't thank us.
We'd like to thank you. And, you just know which side your bread is buttered on. Yeah. Just remember, if, you know, you stop using our services,

(01:08:04):
We do have these pictures and they may or may
not end up in the hands of your enemies. It'd be a shame if your enemies found out about your headgear, your brace face. So do you
click, like, angry, or wow? What was it, oh face? Yeah, oh face. I don't know. I do the heart. Yeah, that's my favorite, the heart. So let us know.

(01:08:26):
Thank you so much as always for tuning in. Stay
tuned as we return with more strange, unusual, fascinating, and
at times terrifying stuff they don't want you to know.
In the meantime, as we have said multiple times on this podcast, you can find us on Facebook, but hey, we're also on Instagram.

(01:08:50):
We're also on Twitter. You can find every episode we've
ever done on our website, stuff they don't want you to know dot com. You can also call us directly,
should the spirit so move you. We are one eight
three three S T D W Y T K. You got it.
That's correct. We're not going to give you the numbers.

(01:09:12):
You can figure it out. It's a fun little puzzle.
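If you'd rather let a script solve the puzzle, the standard telephone keypad letter-to-digit mapping does the trick; this little sketch just decodes the vanity number quoted above.

```python
# Standard letter-to-digit mapping on a telephone keypad.
KEYPAD = {letter: digit
          for digit, letters in {"2": "ABC", "3": "DEF", "4": "GHI",
                                 "5": "JKL", "6": "MNO", "7": "PQRS",
                                 "8": "TUV", "9": "WXYZ"}.items()
          for letter in letters}

# Decode 1-833-STDWYTK into plain digits.
print("1833" + "".join(KEYPAD[c] for c in "STDWYTK"))  # -> 18337839985
```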
Just do it if you want. You can write us some snail mail too. That's on our website somewhere, right? Sure,
we probably shouldn't read that over the air. It's fine.
You can look it up. HowStuffWorks' offices are at Ponce City Market in Atlanta, Georgia, where we live.
Come find us. Well, we'll just be hanging out. Paul
will greet you at the door. Um, he'll probably help

(01:09:34):
you get a beverage. Um, I will assist you, you know.
I can do some laundry for you while you're here.
Whatever you need. I'll give you a nice shoulder rub.
I will do a slow clap. Okay, or, you know, don't come. And if you don't want any of that, which, God knows, you can write to us.

(01:09:55):
We are conspiracy at how stuff works dot com.
