Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Fellow conspiracy realists. First off, we want to thank you
for joining us this evening. Statistics prove that you are
probably listening to this show on a mobile device.
Speaker 2 (00:13):
It's true. Or recording this show while occasionally glancing down
at your mobile device. No, not typically, but we're doing classics,
so I was just checking the feed. But see, yeah,
we're kind of tethered to these things, aren't we.
Speaker 3 (00:24):
We are. Looking at mine here, it says something about, who's
this Lindsey Graham character? And then a bunch of.
Speaker 1 (00:31):
Old... ladybugs lady? Yes, Ladybug Lady. How is Ladybug Lady?
Speaker 2 (00:37):
Uh?
Speaker 1 (00:38):
Back in back in twenty twenty, right around Halloween, the
most wonderful time of the year, folks, we asked ourselves
about the sudden proliferation of the technology called the smartphone,
which very much looked like a thing out of sci-fi
(01:00):
franchises, similar to Star Trek. It had been predicted
for a long time. DARPA and parts of the Deep State,
just to be honest, had cell phone technology way before
the public. But we were worried about what we later
decided to call the dopamine casino of it all.
Speaker 2 (01:22):
How quaint our take must be compared to the
episode on ChatGPT psychosis that we just recorded.
Speaker 3 (01:30):
Oh geez, just think about all the things you do
with that phone that's in your pocket right now, or
that you're staring at because now you're contemplating.
Speaker 2 (01:38):
It, because we made it.
Speaker 1 (01:42):
We put that thought in your mind for you, and
so now, now the boys called you out.
Speaker 4 (01:48):
And you're looking at your phone.
Speaker 2 (01:51):
Well, there's a concept, too, that's come up
a lot in terms of, like, the way our attention
is divided. Matt Damon was recently doing the rounds for
the Netflix movie that he's in. It's about some
kind of heist, I think; I forget what it's called,
something like The Heist. But he talked about second screen viewing, about
how so many screenplays are required to be written
(02:13):
or notated to death by executives in order to accommodate
what's called second screen viewing, the idea that our attention
is always on our phones, so you'd better make sure
that people who are half paying attention to your movie
can follow along.
Speaker 3 (02:27):
What were you guys saying?
Speaker 2 (02:28):
Sorry?
Speaker 1 (02:29):
Yeah, you can always have exposition, similar to how NPR,
whenever they come back from an ad break, will tell
you what was happening. So Dylan, Matt, Noel, we are
a conspiracy show, and we're talking about what happens when
we become our phones.
Speaker 2 (02:49):
Yeah, let's roll this quaint take right now from twenty
twenty.
Speaker 5 (02:54):
From UFOs to psychic powers and government conspiracies, history
is riddled with unexplained events. You can turn back now
or learn the stuff they don't want you to know.
A production of iHeartRadio.
Speaker 3 (03:19):
Hello, welcome back to the show. My name is Matt,
my name is Noel.
Speaker 2 (03:23):
They call me Ben.
Speaker 1 (03:24):
We are joined as always with our super producer Paul,
Mission Control, Decant. Most importantly, you are you, you are here,
and that makes this stuff they don't want you to know.
Shout out to everybody listening to today's episode on a
smartphone or mobile device. In a way, this is very
(03:44):
much for you. It's for everybody all the time, but
this one's especially for you. One question I thought we
could start with today, Guys, what did you do on
your phone today? Just however much you feel comfortable disclosing
to millions of people.
Speaker 2 (03:59):
And how much is too much? I mean, it's like
the mission control center of my life. I do a
lot of my work on my phone. I, like, enter
data into Google Docs on my phone. I do research
on my phone. I do social media on my phone.
It's kind of embarrassing, honestly.
Speaker 3 (04:18):
Well, yeah, it's interesting. I'm not sure it's this
way for all of us, but I'm on a
laptop, and almost all of my work happens there. But
if I ever have to leave my home office for
any reason, that phone is just in my hand and
it's doing all that same stuff. Or of course playing
Call of Duty or Best Fiends.
Speaker 1 (04:39):
Right, there we go, check that one off the list.
We got in front of it this time. Awesome. So
you know, that's a really great point, because, a peek
behind the curtain: your individual mileage may vary, folks, but
Matt, Noel, and I do a ton of stuff all
the time. As a matter of fact, before... I
(05:00):
guess it's okay to say this publicly... I've complained
to you guys about it, and I've said, you know,
the fact that we're kind of always available makes us
like very underpaid anesthesiologists. We're just sort of always on
the clock, you know.
Speaker 3 (05:15):
What I mean? Ready, like someone prepared to fix
a problem. Surgeons, firefighters. There you go, surgeons. I think
we're surgeons.
Speaker 2 (05:27):
Well. We also refer to like problems in the podcast
world as fire drills quite often, so I think we
are sort of firefighters in our own ways.
Speaker 3 (05:35):
Nice fire surgeons.
Speaker 1 (05:37):
There we go, fire surgeons, rocket fighters, there we go.
Put it on the T-shirts. And so, personally, I
did a lot of stuff on my phone today, and
I did something that I don't usually do. Well,
Speaker 4 (05:49):
I did this yesterday.
Speaker 1 (05:50):
I kept track of what I was doing. I didn't
note every single time I touched my phone, but I noted
all the weird things. I bought stuff, I looked through
social media, and blah blah blah, all the yadda yadda yaddas.
But one of the things I did today was post
(06:10):
an informal question on Twitter. I asked, has your phone
or an app on your phone ever seemed to know
too much about you? And if so, what freaked you
out the most? You can find the responses there at BenBowlinHSW.
But for our purposes today, one thing that most people
(06:33):
talked about was, of course, targeted advertising. We see stories
about someone saying, you know, I coughed around my phone,
and then instantly, when I pulled up Instagram or something,
I was receiving sponsored ads.
Speaker 2 (06:48):
For Halls, yeah.
Speaker 1 (06:50):
Yeah, and a lot of people reporting, like, their microphones
somehow seemed to know stuff, or their conversations with someone,
even away from their devices, somehow led to what
they see as a targeted ad. There's a little bit
of Baader-Meinhof in there, for sure, but there are
also a lot of shenanigans. But the thing is, there's
(07:12):
a problem with this. We've talked about targeted ads before;
everyone knows they exist. These are the things your smartphone makers,
your telecoms, your ad industries want you to know about.
They want you to see an ad for Halls or
Best Fiends or whatever. But that's because it's the stuff
they want you to know that they know about you.
(07:34):
But today's episode is different. Today's episode goes in a
different direction. It is about the stuff they don't want
you to know about you and your phone.
Speaker 4 (07:43):
Here are the facts.
Speaker 2 (07:44):
So what is a smartphone? It's sort of like just
a phone these days. But you know, I know people
that have decided to not go that route and still
use a dumb phone. I guess nobody really calls them that,
but they're basically phones whose Internet connectivity, if they have
any, is rudimentary at best. Maybe they can send text
messages; they probably can't send pictures very well. Oftentimes, when you're communicating
(08:07):
on a smartphone with somebody with a dumb phone, things
appear in blocks or weird ways. You have little "message
sent" kind of things, because these are much more old
school communication tools. But smartphones, they're phones first
and foremost. I remember I used to have a Razr.
You guys remember the Razr? Yeah, yeah, and that was
and foremost, I remember I used to have a razor.
You guys remember the razor? Yeah, yeah, and that was
that was like, oh god, it's a color screen and
(08:29):
it was like, you know, a tiny little color screen,
which, looking back at it now, seems laughable. But a smartphone
is essentially a cell phone with a bevy of highly
advanced features, like what we're talking about. Yeah. Oh man, Matt,
is that your burner? For the side hustle?
Speaker 3 (08:44):
This was my first cell phone, you guys.
Speaker 4 (08:47):
Oh my god, is that a Motorola.
Speaker 3 (08:49):
It's a Kyocera.
Speaker 2 (08:50):
I don't even know that brand.
Speaker 3 (08:52):
QCP five. Oh, those were amazing. Still...
Speaker 1 (08:59):
I have a Nokia, but I just attached a chain
to it and use it for personal defense, because those
things are indestructible. But I love the point you're
making, because it's like, if you own a smartphone, or
if you buy an iPhone or an Android or something,
saying that you're gonna call someone or knowing that you
(09:21):
can use it to make phone calls is very much
like a lower priority feature. Now you want it to
be online, you want to have certain slick apps and
so on.
Speaker 2 (09:33):
Well, yeah, and we recently talked, on a Listener
Mail episode, about, you know, robocall scams and things like that,
and how they're often targeted at older generations, because they're
the ones that are still using phones to make calls,
or have landlines. You know, I haven't had a
landline in probably fifteen years. And I very rarely...
(09:55):
I mean I use my phone to make calls for
work often because we do conference calls, but now that's
pivoted to you know, Google hangouts or Microsoft teams or
Zoom or what have you. So very rarely use my
phone to make calls. And oftentimes if I get a
call from a number I don't recognize, I
don't answer it. And if someone calls me out of
the blue, I low key resent them for it. Sorry
(10:18):
about that. No, not you, Matt; I'm always
happy to hear from either of you guys. But I'm
just saying, in general, an out of the blue call
feels like, why didn't we schedule this? You know, this
should have been like on the books. You know, you
can't just call me out of nowhere because it implies
it requires an immediate reaction. Although there's also that kind
of expectation with texting; people text you aggressively because
(10:39):
they want you to text back right away, and if
you don't, it can kind of weird people out sometimes.
But basically we are in the world of smartphones. There's
no putting that Badger back in the bag or Genie
back in the jar or what have you. Because people
are using their phones for so many other things, calling
being absolutely secondary, and at this point it really might
(11:00):
as well be an extra feature. But I guess it's
bundled in because the telecom companies still want to have
that infrastructure for making calls, right, I mean yeah.
Speaker 1 (11:08):
And they want that voice data, right, so they can
Frankenbite you later. All...
Speaker 4 (11:16):
Right. Allegedly. Sure.
Speaker 2 (11:19):
But the.
Speaker 1 (11:21):
One thing that's fascinating to me about this is how
quickly this became normalized, especially in the West. I mean,
if you ever want to have a fun rabbit hole,
and if you're a hip hop head like I am,
try looking into all the
old technological brags of hip hop songs of yesteryear, you know,
(11:45):
when, like, Jay-Z is like, "I'm so cold, Motorola
two-way page me." Like, that was fancy. That was
a Razr at the time. And now that stuff seems
so antiquated, because it evolves so quickly. But that's the
front-of-the-curtain evolution. The behind-the-curtain stuff is
much different, and also evolving at a much faster pace.
Speaker 3 (12:05):
Yeah, but it's harder to trace the stuff with
a two-way pager.
Speaker 2 (12:07):
I'm just you know, that's true.
Speaker 1 (12:10):
That's true. That's why it's so cold. In a very
short amount of time, as Drake would say,
smartphone usage went from zero to one hundred. And
as we record this episode right now, this is a
statistic that baffled me. There are approximately ten billion mobile
devices in use, and yes, to confirm, that's more than
(12:33):
the current human population of the planet. I checked just
in case.
Speaker 2 (12:37):
So wait, is that, like, people that have
two phones? In what part of the world? And why
would you do that? That's very suspicious. Oh, that's true,
you might get issued one for work. That's a good point.
Speaker 3 (12:48):
Side piece burners, also everywhere. Also true.
Speaker 1 (12:52):
I also think there may be more than ten billion
in use as we record this, because one thing
I think a lot of people, especially in
the US, don't know is that there are some phones
that are designed to have two SIM cards in the
phone so they can function like two different phones. I
found out about it when I saw some news in China,
(13:18):
I can't remember which city, about these guys who were
cheating on their spouses, and then their secret SIM
cards got found.
Speaker 2 (13:26):
Tut tut. That's like prison rules, man. Yeah, like seriously.
And also, you know, because of these smart devices, we
hear a lot of other stories like that, Like I
think you know Gavin Rossdale from the band Bush he
used to be married to Gwen Stefani, you know, amazing
solo artists, but also from the band No Doubt, And
(13:46):
I believe that he got caught cheating because he had
the family iPad synced with his personal device. Because with
Apple, you sync all your Apple products together, and so
it came up on the family iPad, the, like, nudes
that his mistress was sending him, and so his wife
found out, possibly even his kids. I can neither confirm
(14:07):
nor deny that.
Speaker 1 (14:08):
Okay, so prison rules, you're talking about multiple SIM cards,
not marriage.
Speaker 2 (14:13):
I'm talking about multiple SIM cards. Okay, yeah, yeah, for sure.
By the way, we don't hear enough about SIM cards.
Remember that heroin-smuggling cat? That cat also had SIM cards
in the little pouch around its neck.
Speaker 3 (14:25):
It really feels like we need to consult Professor Wilson
on this whole thing. I feel like he would know
a lot about all of this legend. Legend.
Speaker 1 (14:32):
Professor Wilson, if you're listening, Doc: we started
a little late today because we had to talk about
the amazing emails this guy's been sending. Just check Professor
Wilson on Twitter. I'm sure he'll show up, and I hope
he's well. Professor Wilson probably has a cell phone, or
smartphone, excuse me, not a cellular phone.
(14:53):
gather some up-to-date statistics about just the lay
of the digital land for smartphones, and we've got info
from as recent as we could get it. We've got
a lot from twenty twenty. There's some things where we
had to go back to twenty nineteen, and there are
a couple of little slips, but some of this is
going to be news to a lot of us, just
(15:14):
the enormity of what's going on with smartphones.
Speaker 3 (15:17):
Yes, so we talked about the ten billion devices that
are being used. Well, as of this year, there are
three point five billion humans, well mostly humans hopefully, who
are actually using these smartphones. And it's pretty crazy because
in twenty nineteen, eighty one percent of the United States
(15:39):
residents, people living in the US, had a smartphone, and
that is obviously a number that is going to continue
to rise. And here are some of the stats. Forty
seven percent say they absolutely could not live without a smartphone.
I certainly believe that, as I'm sure you do too.
In comparison, ninety-nine point three percent of all internet
users in China use mobile devices to go online. So
not a laptop, not a laptop at all. When you
think about forty seven percent saying they couldn't live without one,
I'm just going to assume that a good portion of
that number has to do with people who just use
their smartphone to access the Internet and things. Sure,
in comparison, in China, it's like so many people are
(16:21):
just using their mobile devices.
Speaker 2 (16:22):
It makes me want to ask you really quickly, Matt, like, what
were we doing before smartphones? Clearly not living. And what
was that...
Speaker 3 (16:29):
Like? We were blackberrying, we were, as Ben said, razoring,
we were playing Snake, texting each other, forgetting... We were
actively forgetting all of the phone numbers that we had
in our heads.
Speaker 1 (16:44):
Game Boys, we were playing Game Boys. The Nintendo DS,
I guess, is the Homo sapiens to that Neanderthal. But yeah,
I would say it's affected our
Speaker 4 (16:55):
Cognition for sure.
Speaker 1 (16:57):
We're thinking less deeply about things, and now we're thinking
about more things. We've said this before. We're thinking across
the axis rather than up and down.
Speaker 2 (17:09):
So we know, like we.
Speaker 1 (17:10):
We have, like, first-paragraph-of-Wikipedia knowledge about
ton of stuff, but we don't have as much depth
of learning.
Speaker 4 (17:19):
Right.
Speaker 1 (17:20):
This is not a ding on any individual. This
is a purposeful, brutal ding on.
Speaker 3 (17:24):
Our species. Because the part of us where we would store
all that stuff, it's now here in your hand. Oh god,
that was kind of weird. Here... it's not here, it's
on this. Oh, look at that. Whoa, ooh. Refresh the
tape for this episode.
Speaker 2 (17:42):
I think even in the before times we
had, like, kind of bar conversations about what happens to
that part of our brain that used to store phone numbers.
Does it evolve into something cooler, or do we just
lose it? And I don't know the answer to that.
I tend towards: we just lose it, or it just
gets subsumed with more kind of useless, shallow-dive facts
(18:05):
that we can then spit out at parties.
Speaker 1 (18:07):
Man, I would say we, and every one of
our colleagues, are, like, the worst at parties, which is why
we're gonna have to start a game called The Actuallys.
I'm kidding. I do... I have put concerted effort into
not being that person at parties. But you're right,
(18:27):
I would say, if we wanted to be optimistic about it,
perhaps we could say that that skill of memorization,
rather than being atrophied, has just been re channeled, because
for a time we were memorizing many more passwords, right,
which would I think occur in the same part of
(18:48):
the brain that retains phone numbers. But now you know
increasingly, your phone or your device will recall passwords for
you, or just say, hey, let me go look
at your face. All right, here's all your banking information.
That's pretty common with iPhones especially, and we use them
(19:08):
a lot. I'm still working this one out, and I
wanted to see how you guys felt about this number.
You, Matt, you know, everybody listening. So overall, it seems
that the average smartphone user across the planet checks their device,
whatever it is, at least fifty eight times per day.
We also tend to vastly underestimate how often we check
(19:32):
our phones, and so fifty-eight is probably still a
really low number, because there's a study we're going to
get to later in today's episode that found in the
United Kingdom, people on average thought they checked their phone
thirty-seven times a day. That was their best guess
when they asked all these people, but the actual number
was much closer to eighty five times a day. Does
(19:54):
that sound reasonable to you guys, or does that sound
off base one way or another?
Speaker 3 (19:59):
Yeah, that does not sound off base. No, it tracks, sure.
If you use an iOS device and you
want to get insights into yourself and how you use
your phone, open up the thing that's in your settings,
it's very close to the top, that's called Screen Time,
and you will see... I have, just so you know,
(20:21):
I've disabled mine on my phone. I disabled it almost
as soon as it became a feature, and I think
you should too, unless you are using it for a
specific reason, like preventing your children from using your phone,
or preventing yourself even from using your phone at certain
times, or from certain apps, as a self-regulating technique. Because
(20:44):
that thing alone, that little feature stores a ton of
information about you, as your phone does.
Speaker 1 (20:54):
So wait, though, Matt, does disabling it prevent the
device from collecting that information, or just prevent it from
displaying that information to you?
Speaker 3 (21:03):
You can disable a lot of stuff in your iOS
device that has to do with what we're going to
talk about today, and that is one of the things
that you can do.
Speaker 2 (21:13):
It's also a good way to feel like an absolute
technology junkie when it, like, serves you up that
information weekly, automatically. I think by
default it gives you reports, like, your screen time is up,
you know, blah blah blah. And I'm certain that because
of all this COVID business people's screen times are way
the hell up, you know, and then it becomes like, Okay,
(21:36):
I'm an alcoholic, I need to drink less. You know
it's bad for me. How do I self regulate? Like
you say? And the fact that we even have to
think about it like that sort of proves the whole
idea that it is hijacking our brains and making us
crave that feedback that we get from checking the phone.
What if there's a new thing. What if I get
(21:56):
a notification? Oh, let me drag down and refresh my
Facebook feed. Maybe there's a cool new thing that I
must know about instantly.
Speaker 1 (22:05):
Exactly, yeah, and well put. So we know that
that number tracks; if anything, it's probably going
to be a little bit low for,
logically, about half the audience.
Speaker 2 (22:17):
Right.
Speaker 1 (22:18):
More than sixty percent of smartphone users have made a
purchase of some sort on their device. Not to pick
on Apple, but the way that they sandbox is purposely
designed to make it easy, or, they would say, seamless,
to buy things on the device, to the chagrin,
I imagine, of many, many parents who will later find out
(22:41):
that their kid has bought yadda, yadda, yadda.
Speaker 3 (22:43):
And husbands and spouses. Just kidding, just kidding.
Speaker 2 (22:48):
So while we're on.
Speaker 1 (22:50):
The subject of money, let's also point out, in-app
advertising is going to rise to two hundred billion dollars
US by just next year. Four years after that,
by twenty twenty-five, this number is projected to grow,
as long as we don't burn everything down,
to around two hundred and twenty-six billion,
(23:12):
it's gonna keep going. We spend so much time staring
at these little screens for one reason or another. In fact,
smartphones, as the doorway to the Internet for
a lot of people, are leaving other media platforms in
the dust. Something like seventy percent of all media time
(23:33):
ends up being spent on smartphones, which is nuts.
That's not to say that people are, you know, necessarily listening
to a ton less radio or something. There are just a
ton more people with very easy access to stuff.
Speaker 2 (23:48):
It's just all integrated. I mean, I remember when the
first iPhone came out. It seemed like magic, you know,
everything from the touch screen implementation to the way the
keyboard was virtual. And obviously it was clunky at first,
but it seemed like magic to me at the time. And just
the idea that, oh, I can watch YouTube on my
phone, or I can... it's literally like a handheld computer.
And it is more so with every single new generation, because
(24:10):
of the way it rose
to prominence so quickly. That's just how technology works now,
because of the speed of processors and the exponential improvement
of technology and streamlining of technology. Not to mention that,
like, you know, all of this is built on the
idea that stuff is free. You know, oh, the apps
are free. You can buy in-app stuff if you
want to, but you don't have to. But it sure
(24:32):
does make it a better experience if you buy those
extra widgets or gems or power ups or what have you,
or you pay for the premium version so you don't
have to get served those ads all the time.
Speaker 4 (24:43):
Freemium.
Speaker 1 (24:44):
Yeah, that brings me to another question, only tangentially related.
We all know a lot of early adopters, right, just
the nature of our social networks and friendships. Are you
guys people who buy the latest version of an iPhone?
Like, were you twitterpated about the iPhone twelve?
Speaker 2 (25:01):
Absolutely not.
Speaker 3 (25:03):
My first iPhone was a four.
Speaker 2 (25:05):
I had an early one. I think I
had one of the real small ones, the original, that
had kind of beveled edges, you know, that was kind
of shiny on the back. I had one of those.
And the thing about those, they just become bricks, man.
It's probably in a drawer somewhere,
you know, no idea, because they become useless, you know,
because they basically design the software so that it no
(25:27):
longer works on the old phones. And obviously we understand
all of that about planned obsolescence, and that's a
whole nother discussion. But no, I wait till it's affordable.
I wait till it's like part of a deal to
re up my plan or whatever. I'm never going to
drop the full price on a new device. I'm not
that guy.
Speaker 1 (25:45):
I gotta halfheartedly apologize; I misled you guys just
a bit. Part of the reason I'm specifically asking about
the Apple iPhone twelve pro is that I got a
ton of questions over the past few days, people asking
whether I had done the vo for an Apple announcement.
(26:07):
And we record so much stuff that, you know, it's
easy for the three of us to not remember all
the stuff we may have recorded, and I went back
and checked after like the fifth or tenth person reached
out to me, and I've got what I've decided to
call a voppelganger, like VO plus doppelganger. That dude
sounds like me. I don't know, Apple may have, like, done
(26:31):
something. It's weird.
Speaker 3 (26:33):
Okay, tell us how to find it? How do we
find it?
Speaker 1 (26:36):
Yeah, you can check it out by going to YouTube
and searching for the Apple event, October thirteenth. It's weird.
I don't think there's anything nefarious there. I think it's
just that somewhere out there, I have a vocal twin.
Who are you, mysterious voppelganger?
Speaker 2 (26:57):
I got one of those too. Somebody sent me one.
I don't even remember what it was, but it was
some, like, ad or something, and maybe it was on
one of the message boards or the show pages for
either this or Ridiculous History. But I went back and
listened to it, and it was real. It was spot on.
Like, it freaked me out a little bit, you know. So,
I don't know, what's the play here?
Speaker 1 (27:15):
You're next, bud, you're up next. But anyway, we digress.
We're just confirming that we're not secretly recording stuff and
claiming we don't know what it is. With all this
stuff in mind about smartphones and about how they have
entrenched themselves into society and culture, it shouldn't come as
(27:36):
a surprise that millions upon millions of people somehow work
in this industry. The current estimates I found say that
there are about fourteen million jobs that are directly related
to the mobile industry, but the number of associated jobs.
Speaker 2 (27:51):
Has to be much much higher.
Speaker 1 (27:52):
Right. Think of marketing firms where there's one person who
specializes in mobile experience, right, or someone at a
tech company who specializes in mobile UX. They're still associated.
Speaker 3 (28:06):
There is a great deal of money. We already talked
about it. There's more on the horizon. And make no mistake,
if you own a smartphone, or somebody who lives in
close proximity to you, especially in your house, if they've
got a smartphone, you are a piece of this ecosystem
and equation. And just get ready, because as we keep
(28:26):
going in this episode, it's gonna get weirder and scarier.
Speaker 1 (28:31):
Yeah, right, And if you're listening to this show, you
know you're probably already very well aware of how this
process works. You, like us, have seen the social dilemma.
But more importantly, you've experienced it yourself. Your data on
social apps, that's a revenue stream, and you don't get
a piece of that pie. Your data from your telecom carrier,
(28:53):
same thing, your internet browser, every single app, or virtually
every single app on your phone. Each of those aggregations
of information is packaged up and sold to different
places with varying levels of anonymity.
Speaker 4 (29:07):
And you do not get a piece of that either, which.
Speaker 1 (29:10):
I think is one of the main things, one of
the pettiest things we whinged about in the Big Data
episode. So instead we all stare into this digital abyss,
and there's this dizzying, neurochemically manipulative ping of likes and follows, subscribes, notifications.
They blow up our synapses like the gaudy lights of
(29:30):
a casino slot machine.
Speaker 4 (29:32):
And that's where we're at.
Speaker 1 (29:34):
Like you said, this has caused some people
to consciously limit their time, or to maybe be more
mindful of it, like, get those focus apps that say,
you know, hey, jabroni, get out of Instagram. You're here
for your Outlook email, or something like that.
Speaker 2 (29:51):
Yeah, there's some apps that will even block you from
accessing certain other apps. It's literally like a digital
chastity belt or something like that,
you know, or you have to make sure that you
won't be tempted by you know, the sweet, sweet evil
fruit of digital connectivity. You know what I mean.
Speaker 1 (30:12):
Yeah, how is that different from medicine that is prescribed
to address the side effects of another medication?
Speaker 2 (30:18):
That's exactly... no, I think it's not different
at all. Terrifying.
Speaker 1 (30:23):
But today's question, right, is this, how far does this
data gathering go? How will it affect you, your loved ones,
your pocketbook, your region, and this species in the future.
Why are more and more tech-savvy people becoming, just
like in The Social Dilemma, doomsday prophets telling us you are
(30:43):
your phone? We'll tell you after a word,
Speaker 4 (30:47):
Oddly enough, from our sponsor. Here's where it gets crazy.
Speaker 1 (31:00):
Hyperbole aside, they are saying it because they are absolutely correct.
They've been trying to warn us about this for a while.
But this goes back to one of my
problems with The Social Dilemma. I think we've talked about it,
but maybe we haven't. While those folks
are providing valid, valuable insight, they're doing it after they
(31:21):
made a fortune. And morally, I'm conflicted, because, you
know, I don't know how to address the immense
privilege of that, right? Like, you feel bad after you've
got your dragon's hoard of gold, and you sit on
that hill and tell people about the dangers of the
thing you created. But I don't see them giving that
(31:41):
money away.
Speaker 3 (31:42):
Yeah, but what would the money do? This
is my only counterpoint. I mean, I guess
if they were just trying to do good in the
world and just be, you know, proactive about giving their
money to some charity, right, but we know that doesn't
always work, just throwing money at a problem. I'm
not saying I'm completely on their side. I just
(32:03):
know for a fact that the human beings,
the very low number of human beings that were in
the rooms that created the pioneering apps and
things that we use now ubiquitously, those people didn't fully
understand what they were doing. They knew they were trying
to build something that was addictive, to some extent. Right?
(32:27):
Wasn't there, guys, wasn't there a class or
something that everybody took? There's, like, connective tissue between
a lot of these things.
Speaker 2 (32:35):
There is, and I would need to double
check what the name of the class was, but essentially
it was a class offered at Stanford. And
we know Stanford is kind of like a feeder colony,
I guess, for, you know, Silicon Valley startup culture, essentially,
and there was a particular class that everyone who ended
up in high levels of UX design for these
(32:56):
types of apps took. And it essentially had to
do with harnessing psychology and weaponizing it. I mean,
they really wouldn't have said it like that at the time,
but everyone kind of knew what they were doing, and
they thought it very clever, and obviously it was. But in hindsight,
a lot of these folks who spoke to this in
(33:17):
The Social Dilemma kind of make the argument of, oh,
we didn't realize how far it would go, but they
knew exactly what it was. So I see your dilemma, Ben,
that there is sort of a you can't have it
both ways thing. One of the guys who's featured in
The Social Dilemma, Gus, sorry, guys, I'm being bad with names,
but he's one of the main speakers, and he does
(33:37):
a lot of TED talk type things about the evils
of this technology, and he has essentially a program that's about,
you know, basically kindness in AI, or sort of
rethinking AI and rethinking these kinds of technologies. But you're right, like,
(33:58):
is he giving the money away? Like, he made a mint,
you know, being a part of this thing. But he
was sort of the guy who flagged it at Facebook early on.
He probably is the one who has the best argument.
Speaker 3 (34:09):
That's the Tristan guy, but it's spelled one way and pronounced differently, that Tristan.
Speaker 2 (34:15):
He has the best argument for being somewhat benevolent
in this whole thing, because he flagged it early at
Facebook, and even after it made
a bunch of noise, this idea that this was going
to lead to something really bad, he got
sort of shut down, and then he left because he
couldn't be a part of it anymore. So I kind
(34:36):
of see him as the winner, the
guy that comes out looking a little bit more like
he's practicing what he preaches in this documentary.
Speaker 1 (34:45):
Yeah, understood, and thank you, guys, again. I don't know
how to intellectually navigate it, but The Social Dilemma is
worth watching. My personal principle is I don't like to
judge people too harshly when I'm seeing a curated version
of them, like on a documentary or something. So The Social
(35:07):
Dilemma exists, though tangentially related.
Speaker 4 (35:10):
To this episode.
Speaker 1 (35:11):
To go back to the concept of what we're talking
about today, you are your phone as a concept goes
far, far beyond targeted advertising and psychologically manipulative UX design.
As far back as twenty fifteen, and, to
be honest, before twenty fifteen, people were sounding the alarm.
(35:35):
A few years back, there were a couple of
startups in Silicon Valley that had a bright idea. They
were lenders who wanted to be disruptors in
the financial market, and, you know, disruptors is one of
those buzzwords that gets thrown around. Here's what they wanted
to do. They said, why don't we base personal loan
decisions on analyses of data that we get from people's phones?
(35:59):
That means exactly what you think it means, folks. Knowing
how you use your phone on a granular level
reveals a lot about you. What's another buzzword? They were going blue sky, right,
and they were saying, look, we want all the texts,
we want all the emails, the GPS coordinates, we want
social media posts, we want all the receipts that are sent,
(36:21):
you know, for like your Uber ride or your purchase
through an app. And we want to take all of
that stuff and then build a collage of this person,
build a very accurate persona, like a profile of the user.
And these patterns of behavior that by themselves are grains
(36:44):
of sand, innocuous, together they can be fused to
make this gigantic glass sculpture of you, which is a weirdly
bad analogy. Okay, these aren't all going to work, but
you see what I'm saying. And this behavior can correlate,
in their minds, with a person's likelihood to repay a loan, or.
Speaker 2 (37:05):
To default on a loan.
Speaker 1 (37:07):
And we've got, like, specific stuff they found. Like, it's
even more obscure than "this person, you know, goes to
Applebee's at five thirty every Thursday."
Speaker 3 (37:19):
Which would be a great indicator that you're
going to get the best loans, that you are a paragon.
But yeah, it's things like: how often do you recharge
your phone's battery? How many miles do you travel in
a given day? Oh, here's my personal favorite one,
(37:40):
and thank you for finding this.
Speaker 2 (37:41):
Ben.
Speaker 3 (37:42):
When you enter a new contact into your phone, do
you take the time to put a last name in there?
And are you capitalizing the first letter? Are you making
it look like a true good Rolodex within your phone?
Because if you are, you, my friend, are worthy
of the highest of credits.
Speaker 2 (38:01):
Or do you just put Steve Weed guy?
Speaker 4 (38:04):
You know?
Speaker 2 (38:04):
Like, yeah, these things matter. I mean, it's interesting, because
it's not that far removed from the current
credit rating system. It's all about mapping our
decisions and penalizing us or rewarding us accordingly. I
mean, it's like, do we.
Speaker 3 (38:20):
But they're all financially linked. Credit is
linked to your financials.
Speaker 2 (38:25):
I get it. But what I'm asking is, do
you opt in to being tracked by Experian? Do you
opt in to being tracked by all these credit reporting agencies?
It just happens by virtue of you having a Social
Security number. And when you get
that Social Security number, you're an infant. You don't have
the ability to say, I don't want that, get me
off the grid, you know. So you are basically by
(38:47):
default part of this system. So we're already kind of
in this. This is just an escalation of it, or
something almost approaching Sesame Credit, which we've
talked about, and the way that in China, I believe,
they use social media statuses to track you and approve
you for entrance and exit and premium flights and
all that kind of Black Mirror stuff. You know.
Speaker 1 (39:09):
So wait, mission control, give me a record scratch rewind.
Does this mean, some of us are asking, that always
having your phone at, like, fourteen percent charge or something
means you'll be considered a risky person, loan-wise? The answer
is, yeah, well, something like that is in the cards.
Speaker 3 (39:28):
Well let me plug this in then, right.
Speaker 1 (39:31):
The only real question in the conspiracy here is how
much data they can get, and this is their conspiracy,
these kinds of entities, and how much they have to
disclose about their analytical methods, their proprietary algorithms. It is
safe to say and This is an assumption, but it's
safe to say number one, they want all of the
(39:51):
data picture some version of that Gary Oldman meme or
GIF where he's screaming everything. And two they want none
of the old Why would you this as a business?
It loses its edge if other people can do the
same thing. So wait, you might be saying, well, guys, guys, guys,
I'm a longtime listener conspiracy realist. I don't have a
(40:13):
bunch of garbage apps on my phone, okay, and you know, heck,
now that I know this might affect my financial future,
I'm just going to stick to text. I'm just going
to stick to calls because if I don't generate a
bunch of information, then these people will have nothing to
use against me. And to that we can only respond what.
Speaker 2 (40:37):
Don't be so sure. Yeah, I wouldn't bet on it.
A study published in Science, that's right, the journal Science, found
that the metadata alone of this information, you know, in the
same way that we know the NSA was getting
really good info using only metadata. They didn't have to
monitor your call. They could just tell things by the
length of the call, and analyze that web of who
(40:59):
you're calling and who those people are calling, et cetera, and
then, just tracking all of that, they can get a
pretty decent picture based on that alone. And this
study pointed out that this information actually can reveal interesting
economic-status-type information, just things like when the calls
were made, the length of the calls, when they were received,
(41:22):
when text messages were sent, and which cell phone towers
the texts or calls were pinging off of. The
researchers analyzed, again, just the metadata that we just described,
and were able to build one of those profiles
that we're talking about, or, to take it a step further,
an algorithm that could predict behavior and could correlate that
(41:44):
with wealth, or the lack thereof, of a given phone user.
And it gets even crazier, even more granular.
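As a rough illustration of how much structure plain call metadata carries, here is a sketch that aggregates per-user features of the kind just listed: call counts, durations, time of day, and which towers were seen. The records, field layout, and feature names are all invented for this example; they are not the study's actual dataset or code.

```python
from collections import defaultdict
from statistics import mean

# Each record is (user, hour_of_day, duration_seconds, tower_id):
# metadata only, no call content, mirroring what the study describes.
calls = [
    ("ada", 9, 120, "t1"), ("ada", 14, 300, "t2"), ("ada", 20, 60, "t1"),
    ("bo", 2, 30, "t7"), ("bo", 3, 25, "t7"),
]

def metadata_features(records):
    """Aggregate simple per-user features from call metadata alone."""
    by_user = defaultdict(list)
    for user, hour, dur, tower in records:
        by_user[user].append((hour, dur, tower))
    feats = {}
    for user, rows in by_user.items():
        feats[user] = {
            "calls": len(rows),
            # average call length
            "mean_duration": mean(d for _, d, _ in rows),
            # share of calls placed during business-ish hours
            "daytime_share": mean(1 if 8 <= h <= 18 else 0 for h, _, _ in rows),
            # distinct towers is a crude mobility proxy
            "towers_seen": len({t for _, _, t in rows}),
        }
    return feats

print(metadata_features(calls)["ada"]["towers_seen"])  # 2
```

Feature vectors like these are exactly the kind of input a downstream model could correlate with wealth, which is the step the study took next.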
Speaker 1 (41:52):
This is, I don't know. Whenever we talk about this stuff,
I'm always torn between being like, that's amazing, objectively fascinating
and impressive, and going, that's terrifying and objectively frightening.
Speaker 3 (42:06):
Because they're able.
Speaker 1 (42:07):
To, uh, these scientists were able to answer even more
specific questions, again, just based on that very bare-bones data.
They were able to say, this house obviously doesn't have electricity,
and therefore, you know, they can't pay their electric bill.
How does that work out for their loans?
Speaker 3 (42:26):
Oh, because it wasn't being charged while it was being
geolocated there. Wow.
Speaker 2 (42:31):
Okay, but is this somewhere in the terms of service,
like at some point you have to sort of give
consent for this information to be used in these ways?
Speaker 1 (42:40):
Right?
Speaker 3 (42:41):
Or no?
Speaker 2 (42:42):
Is it just by virtue of having the device? Like
at what point? At what line of text? In these
massive terms of service?
Speaker 4 (42:48):
Is that?
Speaker 2 (42:49):
Is it per app? Is it for the whole phone
at large?
Speaker 4 (42:52):
It's tricky.
Speaker 3 (42:53):
Yeah, here's my mission for you, you know: find that
piece of text. I dare you. A hundred percent.
Speaker 2 (43:00):
I mean, basically speaking, I'm speaking not hypothetically, but it's,
of course, a needle in a haystack situation.
If it even exists.
Speaker 1 (43:10):
It does exist. One of
my old colleagues, actually, I've mentioned this person on air before,
but I'll try to keep it kind of anonymous. They
were having some pretty good success making readable terms of
service and conditions, and then they got their funding pulled,
(43:31):
because there is no benefit to many of these entities
in the end user understanding those TOS documents. And
usually the line will be something like, I'm not citing
a specific example here, but the line will be something like:
by agreeing to this service, the customer, the user, also
(43:53):
agrees that the provider can use data collected and provide
it to third parties as necessary to improve service. And so
there, "third parties" is a huge chasm of a term,
and so is "improve service." Right? They're saying, we're improving
(44:13):
your service. If they can find any kind of rationalization
for that, then they can do whatever they want, especially
because the other thing that always goes with it, just
like, as Forrest Gump would say, peas and carrots, is
that lovely little line: the terms of service
can change at.
Speaker 3 (44:28):
Anytime, and they do. They get updated pretty frequently.
Speaker 2 (44:32):
But doesn't, I mean, you have to, like, do they
serve it to you again? I don't recall having been
re-served terms of service to agree to again when things have
been significantly changed.
Speaker 1 (44:43):
Sometimes I'll have to say there's a privacy announcement or something,
but you don't have to sign anything.
Speaker 4 (44:49):
They're just telling you what's happening.
Speaker 3 (44:50):
Yeah. Well, and sometimes you'll notice, not necessarily with iOS,
but with other devices and other apps and things like that,
with an update, you'll just have to click "I agree"
again, and you won't even realize that you're doing it.
I'm serious. You're opening an app that's just been updated.
You're like, oh, let me get past this. You don't
even realize. I've done it so many times in my life.
(45:11):
It's insane.
Speaker 4 (45:11):
It's so ridiculous.
Speaker 1 (45:12):
And I love the slick psychological UX design where
the yes
button is disguised as something like, uh, okay, or thank you, continue, continue, yeah, yeah,
and the no button is nested several interactions down.
(45:35):
It's like under a "learn more," under a "learn more,"
under a "learn more."
Speaker 3 (45:39):
Or you just can't move forward. If you
say no, then you can't use it.
Speaker 2 (45:44):
You will occasionally see, this is rarer and rarer, I
think, ones where they will actually force you to, at
the very least, scroll all the way to the bottom,
the implication being, okay, I blasted through this, all
of the text was in front of my eyeballs at
some point. And that's, say, maybe to provide more protection
for the developer, you know, for the company.
(46:05):
But it doesn't seem like a necessity. It seems like
something their legal team was like, we should probably do this
just to, like, cover our assets a little more.
Speaker 1 (46:13):
In the most token effort possible. And so here we
have proven that it doesn't take a bunch of data.
Speaker 2 (46:20):
Right.
Speaker 1 (46:21):
So as for the strategy of using a phone less
and less, right, or disclosing less and less info to
later be used against you, or to just be used
in some way with your "informed consent," which we just
talked about how tricky that is, as for that strategy,
we're going to have to return to the study we
(46:42):
mentioned at the top of the show, and we'll do
that after a word from our sponsor.
Speaker 4 (46:53):
So we're back.
Speaker 1 (46:54):
Here's the study we mentioned about how often you touch
your phone or interact with it. There were four researchers
based in the United Kingdom. They installed a usage tracking
app consensually on the smartphones of twenty three students and
faculty at the University of Lincoln. They let it run
for two weeks and then they examined the results, and
(47:16):
there's an excellent summarization of this by an author named
Nicholas Carr.
Speaker 3 (47:21):
They realized that just by measuring one simple thing, they
could tell a whole host of information, a whole bunch
of things about a person's daily routine. And all that
is is when an individual is using their phone. That's
really all you have to measure, and you can see
almost the whole picture.
Speaker 1 (47:39):
When are you asleep? Phone usage will tell you. When
do you wake up? It turns out that all of
these test subjects except one used their phone as an
alarm clock, and one hundred percent of them said the
last thing they do before they go to sleep is
check their phone. As for gaps in phone usage during
the day, pretty good indicator of a nap.
Speaker 4 (47:58):
It turns out.
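The sleep-window inference here works on timestamps alone. A minimal sketch, assuming all we know is the hours of day the phone was touched; the idea that the longest idle gap approximates the sleep window is the inference being described, but this particular function is invented for illustration, not the researchers' code.

```python
def longest_gap(hours):
    """Given the hours-of-day when the phone was touched, return
    (start, length) of the longest idle gap, wrapping past midnight.
    That gap is a rough proxy for the user's sleep window; shorter
    daytime gaps would similarly flag naps."""
    hours = sorted(hours)
    best_start, best_len = 0.0, 0.0
    for i, h in enumerate(hours):
        nxt = hours[(i + 1) % len(hours)]
        gap = (nxt - h) % 24  # wrap the last-to-first gap across midnight
        if gap > best_len:
            best_start, best_len = h, gap
    return best_start, best_len

# Phone touched through the day, last at 23:30, first pickup 07:00.
start, length = longest_gap([7.0, 8.5, 12.0, 18.0, 21.0, 23.5])
print(start, length)  # 23.5 7.5 -- i.e., asleep roughly 11:30pm to 7am
```

That one number, computed from nothing but touch timestamps, already recovers the bedtime and wake-up routine the study reports.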
Speaker 3 (48:00):
How crazy a nap.
Speaker 2 (48:02):
I can't. I don't know how to nap. I'm really
bad at napping. Do you guys know how to nap?
Speaker 4 (48:06):
Oh?
Speaker 3 (48:06):
We need to ask everybody out in the UK, like,
is it easier to nap there? Or what's the deal?
Speaker 1 (48:12):
Is it the gloomy weather that helps?
Speaker 2 (48:13):
Oh? I could do it. I could do it.
Speaker 1 (48:15):
I actually have the opposite problem though. I can't sleep
for like six hours in a stretch, so I just
kind of the pandemic has made it real weird too, you.
Speaker 2 (48:27):
Know, grab grab it where you can, a few winks
here and there, as they say, exactly.
Speaker 1 (48:33):
And why does this, not sleep patterns, but why does
it matter that so much can be gleaned from this
innocuous information? Well, it's because both of those studies depend
entirely on this very basic info, and that means that
there's an entire new universe out there when we look
(48:53):
at the other stuff, the other data that can be pulled.
We get to the world of apps, how we use them,
when we use them. When, for instance, do you tend
to order delivery food? When do you search, and what
do you search for? What do we watch? What do
we listen to, and when do we do that? What kind
of pictures do we post on social media, and what
kind of pictures do we like? Even if those are
(49:16):
private to you, they are not private to the companies
that have the ability to wield this information. We're building
these sculptures of ourselves, and most of the time we
don't know how sophisticated they are.
Speaker 2 (49:29):
Well, and I don't know if we've ever gotten fully
to the bottom of this, but I think we've all
had that experience, or we had some little throwaway conversation
about a thing, a topic, a movie, a record, what
have you, and then all of a sudden you're getting
served up ads for that thing, and in your mind
it could only be based on this notion that your
phone is listening to you. And I think we did
(49:52):
cover something along these lines where you know the microphone
is on and transmitting or what.
Speaker 3 (49:58):
I don't know. Well, especially if you're using an
iOS device, Siri, or on any other device, what
is the term for that, the concierge
or your assistant, whatever the personal assistant is, it
is always listening. There is a Google Home, like, on
the other side of this wall. I'm pointing directly ahead
(50:20):
of me, for everyone listening. And I realize that while
I'm in my office with the door closed, on
the other side of that wall, it can hear me. So
I am just now, I'm having this realization in
real time. Every time we podcast, my Google Home is
listening to everything I'm saying. Hey Google, no, no.
Speaker 2 (50:41):
No, no. But you did it. Everyone at home's Google Homes
just went bonkers.
Speaker 3 (50:49):
Okay, sorry, I don't even know where we were.
That really just weirded me out. I'm gonna have to
disable that thing somehow.
Speaker 1 (50:59):
Here's a, I'm not being sarcastic, because it's
a genuinely fun thing I've been doing, an experiment
that you can try at home, folks. When I was
convinced about this, and that was several years ago, based
on, you know, our collective work here on Stuff They
Don't Want You to Know, I decided a while back that
I was going to try to hack my targeted ads
(51:21):
if I couldn't get away from them, because you can't.
It's very difficult to one hundred percent eliminate this phenomenon.
So I said, okay, well, maybe I'll just try to
make the digital autobiography of me into a very, like, elitist,
wealthy person. I think we mentioned this earlier, right, when
I started just searching for fancy watches.
Speaker 2 (51:43):
Yeah right, yachts, Yes, yachts, And I've had it.
Speaker 4 (51:47):
I've actually had some good.
Speaker 2 (51:48):
Results, well, exclusively listening to Steely Dan, you know, really
putting out that vibe into the universe. You know that
you are a yachtsman.
Speaker 1 (51:58):
Yeah, searching for stuff like "dangers of peasant uprising"
and "endangered meat."
Speaker 3 (52:07):
You know, all the hits. "Best fox-hunting dogs."
Speaker 1 (52:11):
Best fox-hunting dogs. Oh, that's a good one. Hang on, I'm
going to make a note. But you're right. I mean,
here we are. The companies that are seeking to
use this data are going to make big, life-changing
decisions based on their self-perceived ability to
predict what they see as your future actions. This could
(52:32):
be one of the most amazing, noble, beautiful things in
the story of human technology, because think about it. This
would mean that it is possible for the groups that
are aggregating information like this to step in and, say,
provide preventative financial or mental health counseling. If their algorithm
(52:53):
indicates that someone's going to be in trouble down the road,
they could save people in a very real way.
Speaker 3 (52:58):
Oh yeah, and that's exactly what's going to happen.
Speaker 2 (53:01):
Yeah, well, you'd think, right? Are we relying on, like,
benevolence on the part of these technology companies?
Speaker 3 (53:12):
Well, I mean, maybe we're seeing a turn, guys. Maybe
this is this is like our bright future. Maybe we're
gonna be okay.
Speaker 1 (53:19):
Yeah, maybe they'll all be like Bill Murray's character at
the end of Scrooged. I don't know, but, and you
can tell we're a lot of fun at parties, folks,
that's not going to happen. That's not what's happening now.
There's just more money to be made in creating these
extremely educated guesses about your future and then punishing you
(53:42):
for those guesses in advance. Again, they are guesses. To
paraphrase Minority Report, this is sort of like financial pre-crime.
And I like the point you made earlier about how
nobody really opts into credit bureaus, in the US at least,
(54:02):
and it's the same thing. But to be fair, it
would be misleading for us to talk about the inarguably
terrible consequences inherent in this strategy if we didn't also
point out the arguments made by the proponents. And these
arguments are pretty valid, if theoretical. This is sort
(54:23):
of the other side of the money grab.
Speaker 2 (54:25):
Yeah, I mean, there is, of course, this notion that
this is exactly what we want, this is exactly what
we need, we deserve this, this is giving us better results.
It's allowing us to be served better, less redundant information,
you know. First off, as is discussed in "Behavior Revealed
(54:45):
in Mobile Phone Usage Predicts Credit Repayment" by Daniel Björkegren
and Darrell Grissen, many households in the developing world do
not have formal financial history, period, no established credit, meaning
the odds are massively stacked against them when it comes
time to ask for a loan. So, outside of what
I said initially, the idea of serving us better results,
(55:08):
this is a very, very important and potentially positive thing.
I know plenty of people who just decided they were
too freaked out to ever get a credit card, and
now they're thirty-six, thirty-seven years old and have
zero credit. Where do you go from there?
Speaker 3 (55:21):
Well, thankfully, you've got your mobile phone and you're
doing all your stuff on it. And these companies now
have an opportunity to look at that data, and maybe,
just maybe, by using it, they're going to be able
to extend credit to you, or to anyone
on what would be considered the fringes
(55:42):
of the financial world, people who don't have that established credit.
And that data is giving them insight
to essentially predict what your credit will look like, what
your financial history would be like. Oh, that is so weird.
Speaker 2 (55:59):
Yeah, I mean.
Speaker 3 (56:00):
That's a positive, right, somebody who needs a loan or
needs to establish credit, but hasn't yet. Would this could
be a way in right?
Speaker 1 (56:09):
And their method actually is better than the traditional methods
using credit bureau information.
Speaker 2 (56:14):
But wouldn't you think you would have to be like, okay,
I'm on board, here's my stuff, take a look, make
an assessment, in the same way you'd submit documents for
a credit check or whatever, for, like, getting a mortgage?
Like, they already have it. That's my question. I mean,
I guess the answer is yes. I would just want
to know that I could sort of turn the switch
(56:35):
on and off and be like, okay, I need you
to use this information here, I consent for you to
use my data in this way. But that's blue sky.
I mean, I don't know.
Speaker 1 (56:45):
Yeah, it's tough, and it's hidden in byzantine
fine print pretty often. And when we're talking about this approach,
or at least this specific study, they're talking about the
developing world, and it is valid. It could be huge,
because this would possibly open the doors of financial
(57:06):
opportunity for people that the industry calls unbanked, which I
think is a dehumanizing term. It basically means people who
don't have that financial history. They don't have mortgages, credit cards,
checking accounts, et cetera, et cetera. They probably use their
mobile phone or an entirely online banking system or
(57:28):
transactional app to trade stuff. So this could help those people,
of course, if the models are accurate, which they seem
to be, and if they are used in an ethical way.
Speaker 3 (57:38):
Hah, are these anywhere?
Speaker 2 (57:42):
Yes, yes. It's not too, I mean, it's sort
of an escalation and a different, you know, step in
this process. But it's like Venmo gave people who didn't
have a bank account the ability to digitally send and
receive funds, and that's a big deal for a lot
of people too. So, I mean, I can
see the potential positive side of this.
Speaker 3 (58:04):
Yeah, but they had to have bank accounts or something
for the money, though. Don't you have to have some kind of account?
Speaker 2 (58:09):
Oh?
Speaker 3 (58:09):
Well, okay, so you have to have a PayPal account.
Speaker 2 (58:11):
That's, I don't think they even have to anymore. You
can get a Venmo debit card
independent of any bank. You can link a bank
account to Venmo, but Venmo money can go onto a
Venmo debit card. And then there's even stuff for kids,
like my daughter has this thing called Greenlight that's
a debit card for kids, and she's not
old enough to have a bank account. So, you know,
(58:32):
there are things that, again, on this financial side of things,
could be seen as, you know, forward thinking and
positive, because if you don't have access
to funds online, you are cut out of a whole
lot of the economy, you know, especially being stuck
at home like we have been for so long. I mean,
if you don't want to go to the store because
(58:53):
you're immunocompromised, you need to be able to buy
your groceries using, you know, one of these apps, and
you need to be able to pay for it somehow.
Speaker 1 (59:00):
Yeah, and I think this really comes into play again
in the developing world. You live in a region with
no infrastructure, you know what I mean, there's literally not
an ATM within eighty miles of you, or something like that,
or the kilometers thereof, results may vary. So this stuff,
(59:21):
we can agree, can be tremendously helpful. But what we're
talking about is the next step in that process, using
that information for or against someone. And there are people
who are trying to take a more nuanced approach, people
like Alain Shema. He's been working on limited applications of
this strategy, trying to build some fences around what can
(59:45):
and can't be used. Just last year he wrote something
called "Effective credit scoring using limited mobile phone data," and
he says, look, there are tons of potential privacy risks
in these companies knowing everything about you, especially if you
don't have, again, a real kind of informed consent. So
(01:00:06):
he says, we can get the same results with a
more ethical approach. And all that he did was measure
what's called airtime recharge data, or whatever you want
to refer to it as. This is a
term that describes when people paid, or re-upped, on
(01:00:27):
their prepaid mobile phones. Right? So, you know, I've prepaid
for, like, a month or whatever, or X amount of hours,
and then I come back and I re-up on
this given schedule. Just using that, they can build a
similar credit prediction plan.
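A sketch of what scoring on airtime recharge data alone could look like: just the days on which prepaid top-ups happened, reduced to cadence features. The function and feature names are invented, and the assumption that a steady top-up rhythm is the useful signal is our illustration of the limited-data idea, not Shema's actual model.

```python
from statistics import mean, pstdev

def recharge_features(topup_days):
    """Reduce a list of top-up days (day numbers, ascending) to two
    cadence features: how often the user re-ups, and how regular
    that rhythm is. Nothing else about the user is needed."""
    gaps = [b - a for a, b in zip(topup_days, topup_days[1:])]
    return {
        "mean_gap_days": mean(gaps),  # typical spacing between top-ups
        "gap_stdev": pstdev(gaps),    # 0 means perfectly regular
    }

steady = recharge_features([1, 31, 61, 91, 121])   # tops up monthly
erratic = recharge_features([1, 4, 40, 44, 121])   # no clear pattern
print(steady["gap_stdev"], erratic["gap_stdev"])
```

The privacy appeal is visible even in the toy: both users average a top-up every thirty days, but only the variability separates them, and nothing here required reading texts, emails, or location history.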
Speaker 2 (01:00:45):
But isn't the whole point of a prepaid phone that
you have anonymity and that you aren't your name or
identity isn't tied to it. I guess that's not the
same as a burner phone. You could have like a
prepaid service where you use the same phone and then
re-up. But those are two completely different things.
Speaker 3 (01:01:03):
No, no, there are a lot of people in the world
who have a phone where they just pay for the
amount of data, rather than, you know, a monthly charge
or anything like that.
Speaker 2 (01:01:16):
But there's also ones where you can just buy a card,
pay for it at the gas station, and then just
enter the code and then you don't have any identity
information tied to it.
Speaker 1 (01:01:25):
That's all similar technology put to different uses, you know what
I mean? Got it. And for a lot of people,
the entry point of buying a full phone and then subscribing
to a yearly plan is just too high.
Speaker 4 (01:01:40):
It's just unrealistic.
Speaker 1 (01:01:42):
So mobile phones, I would say prepaid mobile phones,
have, to a degree, greatly democratized access to information.
But again, none of this stuff is without potential for misuse.
And then here's the third thing. The third
argument for it is that lending institutions are capitalistic. They don't
(01:02:06):
get to continue existing if every loan they make results
in a default; then they're in deep water. So we can't
expect them not to use every single lever available to
maximize their profits while minimizing their risk. And to be
extremely blunt, we cannot expect these institutions to adhere to
codes of ethics unless those codes are continually enforced with
(01:02:29):
serious consequences. Crime, and fines for crime, have, since we
began this show, been just a cost of doing business
for most banks.
Speaker 3 (01:02:40):
Yeah, I think we've proven that countless times just on
this show. And you know, history and articles and reality
have further proven it way more than that.
Speaker 1 (01:02:55):
And so, you know, that kind of resigned groan
means that we're close to the end of the show.
There are some other things we've yet to talk about
that can also leverage this technology: the medical industry using
similar methods to predict future or present medical conditions, the
law enforcement industry expanding its existing use of these tactics
(01:03:16):
to predict crime, other rogue entities, or, if we want to
get really crazy, foreign powers, whatever your country is,
somebody from another country using those tactics to, say, identify,
target, and turn assets. The dark sky is really the
limit here. We're way past the blue-sky version. And it's
not a conspiracy theory. It's not some scary future thing.
(01:03:39):
It is a conspiracy. It's real, it's happening, and a
lot of people are just not aware that their data
is being used in this way, or will be used
in this way very soon.
Speaker 2 (01:03:50):
Yeah, so what do you think? I mean, is this
something that keeps you up nights? Are you like many
folks: you know we've done the research, we know
this stuff has nefarious kinds of implications, but it's like
a trade-off, I get my cool free apps
and I get all this other stuff, and it's worth
it to me to make that trade-off? That's what
I'm saying, speaking the royal we, hypothetically. Do you feel
(01:04:12):
like that, or do you feel like you need to
limit your use of this stuff and get as off
the grid as humanly possible in this day and age?
Is this a good thing? Is this a bad thing?
Let us know what you think.
Speaker 3 (01:04:22):
Yeah, and especially contact us if you have any good
ways to subvert some of the plans that are at
work here. We're particularly interested in those, you know, anything,
we've discussed it before, anything from a VPN to, you know,
your actual settings on your device. Like, what are
(01:04:43):
some things that you do to protect yourself? You
can help out your fellow listeners by letting them know
about it. You can find us on Twitter, Facebook. Oh god, wait,
what are we doing again? Instagram? We're on Tumblr and
what else?
Speaker 2 (01:05:00):
Bumble, Tinder, you can find us.
Speaker 1 (01:05:04):
Yeah, we're on Kiva. Micro loans. Where are you?
Speaker 3 (01:05:07):
It's all social media, it's all coming together, it's all one.
We are one, like the castile soap. Wait, what? I thought
you were going to say.
Speaker 1 (01:05:16):
We are one, eight three three, STDWYTK.
Speaker 2 (01:05:22):
We are definitely that one, for sure. Uh, honestly, that's
probably the least connected way you can get in touch
with us, where you can just send
us your voice in electronic form and let us know
if you want to be anonymized, or don't want us
to use it, or if you want us to refer
to you as something else, or what have you. Well,
we will abide by those things, because we are not cold,
(01:05:44):
uncaring algorithms. We are in fact human people, and we'll
treat you as such. It's true.
Speaker 3 (01:05:49):
But we just found out that these guys are trying
to take our voices. Right, what's going on?
Speaker 4 (01:05:56):
Doppelgangers?
Speaker 3 (01:05:57):
We've gone too far? Oh no, and.
Speaker 1 (01:06:02):
Sorry, be afraid, be very afraid. I mean, you know,
like Noel said, we are humans just like anybody
else, and we want to hear from you. So feel free to send
us an email. If you don't care for social media
or telephones, you can reach us twenty-four-seven,
(01:06:22):
until this whole thing burns down.
Speaker 3 (01:06:24):
We are conspiracy at iHeartRadio dot com. Stuff they don't
(01:06:47):
want you to know. Is a production of iHeartRadio. For
more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts,
or wherever you listen to your favorite shows.