Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production from iHeartRadio.
Hey there, and welcome to TechStuff. I'm your host,
Jonathan Strickland. I'm an executive producer with iHeartRadio
and a love of all things tech, and I figured
I'd do an episode or two about tech and privacy.
(00:26):
I've titled this episode how Tech Spies on Us, But
right from the get go, I have to admit that
that is a little disingenuous because in many cases, not all,
but in many cases, we're essentially giving tech permission to
just harvest information about us, which is a little different
than spying. It's kind of like saying, no, I'm cool
(00:49):
with there being a live camera on me all the time,
just broadcasting to some other location. You can't really call
that spying if you were in on it at the top.
But the way tech tends to go about this
usually follows a pathway that isn't transparent, and it's not
(01:10):
always obvious, and sometimes it relies on people not really
being aware of what they've agreed to. So I really
wanted to go into that so that we can all
have a deeper understanding of how tech gathers information and
then how companies leverage that and exploit our information, and
(01:31):
furthermore to kind of, you know, think about whether or
not we're okay with this. Some of you might be,
and I'm not casting any shade, I'm not making
any judgments, but others might not be. And that's why
I wanted to kind of go through this and really
get down to why it seems like our tech has
almost a preternatural ability to know things about us. So
(01:53):
let me paint for you a hypothetical situation. You are
hanging out with a friend and you're both chatting about,
you know, all sorts of stuff, and the two of
you decided to go to a nice restaurant with an
outdoor seating area because you know, we're still kind of
post pandemic type stuff, and you know you're being super
responsible with regard to health concerns and all of that. Anyway,
(02:15):
you have a conversation with your friend who is telling
you all about this movie they just watched. And after
your meal, you say goodbye to your friend and you
head on home. Later that same day or evening or
night whatever, you're on the internet and you see something
kind of weird. You actually start seeing ads for that
film your friend was talking about popping up on different
(02:37):
sites and services, and you don't remember seeing ads for
this particular movie before. So maybe it's cherry picking. Maybe
it's the whole "that van was always parked by the corner" thing
where you only notice something after it's been brought to
your attention. But maybe it's not. So what the heck
is going on? Was your phone spying on you? Was
(02:57):
it listening in on your conversation that you had with
your friend at dinner and then sent that information off
somewhere to be processed and analyzed, then sold to advertisers,
And then when you get back, now you have
targeted ads aimed at you because your phone was eavesdropping
on you. Now, I mentioned this sort of thing in
(03:20):
a recent tech News episode, and I'm not gonna make
you wait through this whole episode to answer that hypothetical question.
Your tech is most likely not listening in on your conversations,
not because that would be hard to do. It
actually wouldn't be hard to do that, but because there's
really no need to do it. There's no need to
(03:43):
go that far. Also, if tech were actually actively listening
in on you all the time, governments around the world
would be extremely interested in that, either to use it
for themselves as a kind of surveillance of their
own citizens and people who are within the country, or
they would want to go after tech companies for violating
(04:05):
various privacy and security laws. Now, the point is that
the way our modern tech works, and the way companies collect, share,
and barter our data, means there's no need to listen
in to what we're saying. Our tech knows who we are,
it knows where we go, it knows what we're doing,
(04:26):
and because the people around us also carry similar tech,
it knows who we're with, and by cross referencing data,
it knows what sort of connections exist between us. So
today I thought I'd talk a bit more about how
our tech collects data and the history of that, from
the stuff that we willingly surrender to the stuff that
we might not even realize was being shared. We'll talk
(04:49):
about how this has evolved and how geolocation has played
a really big part in this, and this will lead
us to some recent events in which we've seen big
companies like Facebook and Google try to skirt around attempts
to give users the option to opt out of some
forms of data collection. Now, as always in TechStuff,
anyone who's listened to a TechStuff episode knows I love
(05:12):
going through history because it's not like there was just
a switch that was flipped one day and then suddenly
all of our personal information got hoovered up by the Internet. Also,
we should keep in mind that there are a lot
of different privacy laws around the world that can restrict
this kind of you know, data collection and how data
companies can use that information. So, for example, the European
(05:36):
Union has laws that provide a decent amount of protection
for citizens. But as I am located in the United States,
where the Internet first got its start, and thus where
the attitudes and practices regarding private information on the Internet
got their start, at least with regard to you know,
the web and the Internet in general, I'm mostly going
(05:58):
to be focusing on the good old US of A,
which isn't really that old, and frequently isn't that good either.
So here in the United States, a right to privacy
was not one of the guarantees laid out in the Constitution,
at least not explicitly. However, the various forms of government,
(06:18):
including the Supreme Court, over the course of decades and
several court cases, have essentially found that four of the
first five amendments to the Constitution extend over towards a
right to privacy. So those would be the first Amendment.
That's the one that guarantees the right to assembly,
the freedom of religion, the freedom of speech, and the
(06:41):
freedom of expression. Then we skip over the second Amendment
and we head over to Amendment number three, which prevents
the government from being able to station soldiers in private
homes of citizens. Then there's the Fourth Amendment, which protects
citizens from unreasonable search and seizure. That's a big part
of privacy obviously. And then we've got Amendment number five,
(07:03):
which has a little bit of Monica in my life. Wait, no,
I'm sorry, that's Mambo No. 5. Amendment number five says
that US citizens enjoy certain legal protections. For example, you
can't be tried for the same crime twice, right, that's
double jeopardy. That obviously doesn't include stuff like the appeals process,
(07:23):
but that's separate. Also, citizens can take the Fifth Amendment
to avoid self incrimination during a trial, so they cannot
be compelled to confess to a crime. That again
relates to privacy. Now, none of those four amendments provide
explicit rights to privacy, but they all extend towards that
direction in different ways. So that was back in seventeen
(07:46):
eighty nine, which means it was like two centuries before
we had to worry about the Internet. But a century
after the Constitution was ratified, that being eighteen ninety,
an article written by Samuel Warren and future Supreme Court Justice Louis Brandeis
argued for a more explicit right to privacy. It was
the early days of photography in the eighteen nineties, and
(08:09):
already the two saw the potential for people to have
their privacy violated due to the fact that now it
was easier to capture a moment forever and then potentially
distribute it widely through the press. In nineteen fourteen, we
got the founding of the Federal Trade Commission, or FTC. Now,
the primary responsibility of the FTC is to ensure that
(08:32):
companies within the United States are playing fair. The FTC
can go after companies that use deceptive commercial practices, but
later on, really like starting around the nineteen seventies, the
FTC would also focus on companies that violated citizen privacy.
Going global for a moment: in nineteen forty eight, the
(08:53):
United Nations drafted the Universal Declaration of Human Rights, which included
within it article number twelve, which states, quote no
one shall be subjected to arbitrary interference with his privacy, family, home,
or correspondence, nor to attacks upon his honor and reputation.
Everyone has the right to the protection of the law
(09:14):
against such interference or attacks. End quote. Now, apart from
the non inclusive pronoun usage there, this was a really
big step toward positioning privacy as a basic human right.
In nineteen sixty a legal scholar named William Prosser published
an article titled, fittingly enough, Privacy, and in that article
(09:36):
Prosser outlined four cases in which infringing on privacy should
allow for the victim to pursue a civil lawsuit against
the perpetrator. Those four cases were intrusion upon seclusion or solitude,
so in other words, someone's barging in on you when
you're just trying to be alone, or intrusion into private affairs. Similarly,
(09:59):
public disclosure of embarrassing private facts, which, when you
think about it, means the Internet and the way it works
would just catch on fire if people really took
that seriously today. Publicity which places a person in a
false light in the public eye was another case. And
appropriation of one's name or likeness. So if someone were
(10:22):
to go about posing as Jonathan Strickland and it's not me,
it's not someone else who's actually named Jonathan Strickland like
they were trying to pose as me, I should be
allowed to sue that person based on that criterion. Now,
obviously in these cases there's a lot of leeway
because there also are issues where it kind of starts
(10:45):
to track a little bit toward issues with the First Amendment. Right,
So in case of public disclosure of embarrassing private facts,
if you're talking about a public figure, there's a lot
more leeway there because it may be that that public
figure's private facts have a public bearing, like if
it's a politician or something. So it does get a
(11:07):
little dicey, but these matters always do. We'll skip ahead
to nineteen seventy four, though I should say that there
were some court cases and some scholarly articles that furthered
the framing of privacy. In nineteen seventy four, the US
passed the Privacy Act, which placed limits on how federal
agencies can collect and use personally identifiable information. Now granted,
(11:30):
these restrictions were all about government use, federal government use,
not state government, and not corporate use, so it's not
like this applied across the board. In nineteen ninety-one, the
US government passed the Telephone Consumer Protection Act, and later established
the Do Not Call Registry, and these were meant to
reduce the number of solicitation calls that citizens would get,
(11:52):
you know, the spam calls we would call them today,
and it would give people the opportunity to opt out
of those calls by being on that registry, so that in theory,
you wouldn't get those cold calls. Now, this is one
of those things that in recent years has become an
area of focus because stuff like robo calls and spoofing
have really sidestepped the protections that were in place all
(12:14):
the way back in nineteen ninety-one, rendering them almost meaningless. And
you've heard a lot of calls, even in, you know,
the halls of Congress, to have a new version of
this to address things like companies that use spoofing and
robo calls to do a widespread targeting of spam. Over
(12:37):
in Europe, the EU adopted the Data Protection Directive in
nineteen ninety-five, and then in twenty eighteen the EU replaced that with the
General Data Protection Regulation, or GDPR. This
is the set of rules and restrictions that really prevents
largely like big tech companies from following the same sort
(12:58):
of strategies in the EU that they follow here over
in the United States. So you'll hear a lot about
companies having to readjust how they work in the EU,
because if they were to continue to operate as they
do in the United States, they could be held legally
liable for lots of violations. There are other rules that
(13:20):
are important. There's COPPA. That's the Children's Online Privacy Protection
Act here in the United States. That act places much
tighter restrictions on companies when it comes to the collection
and use of data about people who are younger than thirteen.
That law and the enforcement of it have created some
fairly tricky situations for companies and for others, like notably
(13:41):
content creators on platforms like YouTube. But I'm pretty sure
I'm gonna have to do just a full episode on
COPPA and its consequences in the future, because those consequences
include stuff that I think is valuable and stuff that
because of the interpretation of the law and the implementation
of policies on platforms like YouTube, can also be harmful.
(14:06):
It's complicated, so that's why it would require its own episode.
In nineteen ninety-nine, the US passed the Gramm-Leach-Bliley Act, which
requires financial institutions to explain how they use and share
private consumer data. So this was for things like credit
organizations, banks, that kind of thing. It also requires that those
companies provide a means for customers to opt out of
(14:30):
having their information shared, which is a pretty good point
to focus on for a second. In the United States,
the general approach, the default, is to require companies to
provide an opt out option, but a lot of companies
bury these sorts of settings or features so that they're
not easy to find or activate, and often any disclosure
(14:53):
of there being an opt out feature can be buried
inside a long terms of service end user agreement page
which most people don't take the time to read. And
I'm just as guilty of this as other people. I'm
not throwing shade here, folks. I've done this.
And some people argue that
(15:14):
it would be better to have an opt in approach,
so this would be where you would sign up for
a service. You would be told, hey, do you mind
if we use your information to do X, Y and
Z in a very clear and transparent way, not buried
in like pages and pages of terms of service, and
(15:35):
you would then click a box to allow it to happen.
But since so much revenue depends intrinsically on the ability
of companies to collect and share data, there's an extreme
financial incentive to go the opt out route. More on
that later. On the state level, presently, two states in
the US have passed privacy laws meant to protect the
(15:56):
private information of citizens of those states, and they are California,
which passed the California Consumer Privacy Act in twenty eighteen, which went
into effect in twenty twenty, and then Virginia, which very
recently passed the Consumer Data Protection Act just this past
March, in twenty twenty-one, but that law won't go into effect until January of twenty twenty-three.
(16:20):
The other forty eight states of the US pretty much
just followed the general federal rules to some extent. So
that's the legal background on privacy in the United States
and to a lesser extent, the EU without going into
like super detailed analysis. The idea being privacy is clearly
a thing, but it's not as heavily protected from a
(16:43):
legal standpoint in the United States as it could be. Meanwhile,
obviously technology advances at an incredible pace and frequently leaves
the legal system in the dust. But what about the
actual collection of personal information in the tech age. Well,
again we need to think back historically. Now, in the
(17:04):
old old days, you know, pre digital age, businesses essentially
kept an eye on customers, particularly repeat customers, and kept
an eye on what they were buying. This was necessary
in order to just keep stores stocked with the stuff
people actually needed. And I think this kind of tracking,
which is being really generous, uh, it was much more
(17:26):
general approach. It's pretty easy to understand and to forgive.
So imagine that you are a store owner and you
notice that every two months farmer Betty comes in and
she buys two bags of grain. You would learn pretty
quickly that you need to make sure you have grain
stocked every two months because typically that's when she would
come in and buy them. So you want to make
(17:48):
sure you had that on hand. And you would do
this with all your customers, whatever it was that they
were buying. You would make sure that you were stocked
up on and you might also try and experiment and
stock some stuff that was related to the products that your
regulars were buying, and assuming that you speculate correctly, everyone
benefits from that. You make more sales and your customers
(18:10):
end up getting stuff that they need, even if they
didn't know they needed it at the time. Fast forward
a whole lot to the point where we had electronic
means of keeping inventory. The invention of stuff like computer
systems and the bar code in the nineteen seventies
made it possible for us to keep an up to
date electronic record of inventory and sales, and these weren't
(18:34):
directly tied to customers just yet. That would still largely
depend heavily on observation, but now it was much easier
to spot buying trends and respond to them in close
to real time. That particular branch would lead to data
driven marketing, in which experts in marketing would look at
(18:54):
various goods and services and use data to determine which
regions they should focus on and which ones might prove
to be less profitable. So let's say you are
the owner of a chain of grocery stores in the Southeast.
You might notice that grocery stores in Atlanta are selling
a lot of a particular type of product, and then meanwhile,
(19:16):
a store in you know, Charlotte isn't selling that so much,
but it is selling a different product at very high volume.
These are the sort of data trends you would want
to know so that you could make sure you had
the right stock on hand, you could do advertising campaigns,
and you can maximize your sales and minimize your waste.
It was incredibly valuable data; that was undeniable. Information was
(19:41):
the key to maximizing profit and reducing costs as much
as possible. In the nineteen eighties we saw the rise
of direct marketing, in which marketers would customize and personalize
efforts and aim them at specific shoppers. Now we're no
longer looking at regions, we're looking at individuals. And typically
(20:02):
they did this through direct mail sales. And this was
a pretty primitive approach and based solely on the customer's
past purchases. So I can give you an example of this.
I grew up in the eighties. I was one of
those kids who occasionally would buy comic books, and of
course stuff on the backs of comic books would sometimes
be really tempting, Like man I really would like that
(20:24):
hand buzzer, that seems like it would be a real hit.
And you go and you send your, like, dollar seventy-five
through the mail and you buy one. And then
next thing you know, you start getting catalogs for all
these sort of novelty gifts. And that's how I ended
up on a billion mailing lists that were all aiming
at me in different ways for different types of weird
(20:44):
or novelty type stuff. That was kind of the approach.
It was pretty primitive. It was not, you know, super sophisticated,
but it did set a foundation. That particular branch would
later extend further into stuff like loyalty programs, in which
stores would issue cards or tokens, often using a bar
(21:06):
code right that they would scan, and that would link
purchases to specific customers. Now you actually know who it
is that's buying stuff and how frequently they're buying it,
and when linked to other information such as a person's
email address or their snail mail address, that can allow
stores to proactively reach out to a customer and alert
(21:27):
them of sales, maybe offer up coupons, all in an
effort to sell more stuff. And it gave the stores
way more information about the preferences of their customers on
that individual basis. That's sort of a microcosm of what
we're looking at with personal data on the Internet. I'll
explain more after this short break. Okay, we have gotten
(21:56):
up to the nineteen nineties and the birth of the
World Wide Web. Now, the original purpose of the web
was to create a collection of documents that could be
linked to one another using hypertext links. So it was
just a means of sharing information and linking information together
which could allow you to create a type of contextualization.
(22:17):
So think of the average experience you might have on
a site like Wikipedia. Sure you might start off reading
about sloths, but then through a series of clicking on
various links in different articles, you ultimately end up reading
about the communist revolution in Cuba. It didn't take too
long after the initial launch of the earliest web pages
(22:38):
for commerce to follow onto the web. In nineteen ninety-four, the digital
magazine HotWired introduced something new: the banner ad. HotWired
was an online branch of the print magazine Wired.
Now HotWired doesn't exist anymore, but Wired now occupies
both print and digital formats. However, the banner ad became
(23:01):
a really important stepping stone in our story. The first
banner ad was for AT&T, which reportedly
paid HotWired a fee of thirty thousand dollars so that
the banner ad would be placed at the top of
HotWired pages for the duration of three months. During
that time, according to one source, at least I could
(23:22):
not verify this. It's widely reported, but I feel like
they're all pulling it from the same source, So take
this with a grain of salt. But apparently that banner
ad enjoyed a click through rate of forty-four percent. So, quick aside,
just in case you're not familiar with online ad terms,
click through is what it sounds like. It's how many
(23:44):
people actively clicked on an ad, which would then link
the person to some other page. It would send them
to a page that might have a little more information
about a specific service or product, and frequently it would
also include some means of signing up or purchasing that
service or product. So if you're an advertiser, you want
(24:04):
a pretty decent click through rate because one, it shows
your ad was effective, and two, it makes your client
happy because presumably they're going to get more sales. And
that click through rate was indescribably effective. It's insane when
nearly half of all the people visiting a web page
(24:26):
are clicking on an ad that's on that page. That's incredible.
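[Editor's note: click-through rate is simple arithmetic, clicks divided by impressions. Here's a minimal Python sketch; the function name and the raw traffic counts are illustrative inventions, chosen only so they land on the rates discussed in the episode, roughly forty-four percent then versus roughly 0.35 percent now.]

```python
def click_through_rate(clicks, impressions):
    """Click-through rate as a percentage: clicks / impressions * 100."""
    return clicks / impressions * 100

# Illustrative traffic counts (not real measurements), picked to match
# the rates from the episode: ~44% in 1994, ~0.35% for display ads today.
early = click_through_rate(440, 1_000)
modern = click_through_rate(35, 10_000)
print(f"early banner ad: {early:.2f}%")    # 44.00%
print(f"modern display ad: {modern:.2f}%")  # 0.35%
```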
These days, the average click through on display ads is
somewhere in the neighborhood of point three five per cent,
so less than half a percentage point. Banner ads, by
the way, get even less, sometimes as low as point
(24:46):
zero five percent click through. So yeah, that first banner ad was unthinkably successful.
It probably set a lot of very unrealistic expectations, if
I'm being honest. But it was also a brand new thing.
People had yet to develop ad blindness to banner ads, right.
They hadn't sort of trained themselves to just ignore
(25:07):
everything that's at the top or the right of the
page or sometimes the bottom. They were looking at everything.
And also ad blockers were not yet a thing, so
it wasn't like people were using means to prevent ads
from showing up in the first place. Now, in those
early days, the general thought was that a conversion, that is,
(25:28):
converting someone from viewing an ad to acting on it,
was pretty much limited to sales. So, in other words,
people thought of ads as only being successful if someone
was actively buying stuff through interacting with that ad. But
pretty early on that attitude began to change because there
were other aspects of conversion to consider. Clicking on the
(25:51):
ad showed interest, and interest alone could be a conversion.
It might mean that the company that was in charge
of whatever product or service the ad was
linked to might have to do some extra work, but
you have at least identified a prospect. Downloading a
coupon was another type of conversion, and companies began to
(26:15):
have a way to track data trends through online interactions,
at least as to how well their ads were kind of doing.
At the same time, a developer named Lou Montulli created
a new type of computer file that would transform the
experience of using the World Wide Web. These files, called cookies,
are integral to how we experience the web. The cookies
(26:37):
keep track of which websites we visit, how frequently we
visit them, and what we do on those sites, including
stuff like if we make a purchase or if we
click through an ad. They allow us to do stuff
like log into a service online, like you know, put
in your user name and password, and then you can
stay logged in even if you navigate away from the
(26:58):
site and you come back to it later, you're still
logged in. Well, that's thanks to cookies. That way, when
we come back, we're already logged in. We don't have
to go through that process again unless you're using something
that has really high security, in which case you would
have to do it because the security is the most
important part. Similarly, if you're using a site that allows
(27:18):
you to set certain preferences, the cookie file on your
computer tells that site what those preferences are, so that
when you return, those preferences are already in place for
you automatically. It creates a level of convenience that makes
the web more usable. Now, when you visit a site,
the site can request the information stored in that cookie file.
It has to because if the site needs to adjust
(27:41):
things for you, it has to know what to do right.
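[Editor's note: that exchange, the site hands the browser a cookie and the browser hands it back on the next visit, can be sketched with Python's standard http.cookies module. The "theme" preference is a made-up example, not anything from a real site.]

```python
from http.cookies import SimpleCookie

# Site side: build a Set-Cookie response header that stores a preference.
# "theme" is a hypothetical preference name, purely for illustration.
jar = SimpleCookie()
jar["theme"] = "dark"
jar["theme"]["path"] = "/"
set_cookie = jar.output(header="Set-Cookie:")
print(set_cookie)  # e.g. Set-Cookie: theme=dark; Path=/

# Browser side: on a return visit, the stored value comes back in the
# Cookie request header, and the site parses the preference back out.
returned = SimpleCookie()
returned.load("theme=dark")
print(returned["theme"].value)  # dark
```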
And this is where we start to get to a
point where personal information becomes a hot commodity online. The cookie,
which had clear benefits for users, would have even more
significant benefits for businesses online. So let's switch back to
the ads side of the story. At first, companies would
(28:02):
follow AT&T's lead and purchase banner
space or right rail space on a website for a
set amount of money for a set amount of time.
But it didn't take long for that to change.
Netscape and Infoseek migrated to a new pricing model.
Instead of creating a one size fits a few approach
(28:25):
to selling web page real estate, which meant companies would have
to pay the same amount whether their ads were working
on that page or not, and for you know, whatever
duration they had picked, the new approach was the good
old CPM. CPM stands for cost per mille, with mille
being the Latin word for thousand. So it sounds at
(28:46):
first like you're saying cost per million, but you're really
saying cost per thousand. So the thousand in this case
tends to mean impressions. That is the number of times
the ad is presumably seen by visitors to that particular website.
So what this really breaks down to is that web
pages that are really popular and they get lots of
(29:07):
visitors can demand a higher CPM because more folks go there,
So ads that are placed there are going to have
higher visibility than an ad placed on you know, old
Joe Bob's Duct Tape Museum website. A site that gets
a lot of traffic can demand more money per thousand impressions,
kind of like how in the United States, if you
(29:29):
want to have a commercial played during the Super Bowl,
it's gonna cost you millions of dollars compared to, you know,
putting an ad on some niche television channel for late
night TV. On the flip side, for advertisers, they can
negotiate a rate with websites for a certain number of impressions,
which effectively replaced the duration limit for an ad. So
(29:50):
instead of saying I want an ad running on this
page for three months, you would say, all right, the
web page has a CPM rate of ten dollars, meaning
that you have to pay ten dollars for every one
thousand views, and you might negotiate for a million impressions,
which would mean that the advertiser would have to pay
(30:11):
the website ten thousand dollars to carry that ad in
order to generate a million impressions. And if the page
got a million impressions in a couple of days, boom,
that ad campaign was done. If it takes weeks, well,
then the campaign would last longer. And obviously that CPM
was different depending on the popularity of the website. A website
(30:32):
that's not that popular would have a low CPM, and
it would also take a longer time to reach whatever
the agreed upon number of impressions was. By the way,
this is also why there are a lot of websites
that have things like quizzes and galleries where you have
to scroll through each item. Like any website where it's
(30:52):
top ten movies that feature ghosts that have hair in
front of their face, and every single entry is its
own web page. That's because every one of those web
pages you go to, that's another impression. So it's a
way of expanding the number of impressions an ad would
get by making you have to reload the page over
(31:15):
and over and over again. If all ten of those
things were on one page, you would get one impression
for that article. These things, by the way, also have
massive drop off rates, like if they're not really compelling,
people will bail on them within like two or three entries,
so there's a diminishing returns thing going on with them.
(31:35):
That's just some insight into how web pages generate revenue.
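[Editor's note: the CPM arithmetic from a couple of paragraphs back, a ten dollar CPM bought against a million impressions, works out like this. The function name is my own; the numbers are exactly the deal described above.]

```python
def campaign_cost(cpm_dollars, impressions):
    """Cost of a CPM deal: the per-thousand rate times thousands of impressions served."""
    return cpm_dollars * impressions / 1000

# The episode's example: $10 per thousand impressions, one million impressions.
print(f"${campaign_cost(10, 1_000_000):,.0f}")  # $10,000
```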
I've been on that side, and it is not always
fun. Anyway, a big part of all this is that
the ad campaign wasn't tied to other performance metrics, right,
So it wasn't whether or not people click through the
ad or whether they actually made a purchase through the ad.
(31:57):
This was just about how many people actually were exposed
to the ad itself. The ad had to do the
rest of the work. Then you had the emergence of
DoubleClick, a company which is now owned by Google.
It was actually the focus of a big antitrust lawsuit
that I'm not going to get into, but it was
(32:18):
a big deal anyway, and it let companies know which of
their ads were most effective. So DoubleClick would give
feedback to companies about which ads got the most impressions
and click throughs, and those companies could then get a
better idea of where they needed to spend their digital
marketing dollars. DoubleClick's relationship with advertisers meant that there
was a lot of rapid innovation in the web marketing
(32:40):
space as companies began to home in on what strategies
worked and which ones didn't work. From there, you add
all sorts of changes in the space. Some ads moved
to pay per click models, so that was kind of
going back to that conversion approach, where in this model
an advertiser would only pay a website for the number
(33:02):
of clicks that an ad received. That created an incentive
for the website operators to try and make sure they
were pairing ads with pages that would potentially drive the
most traffic. So if you had a web page about
I don't know, chainsaws, you probably wouldn't try and pair
that page with an ad for perfume. Now I'm not
(33:24):
saying there's no crossover between people who are interested in
chainsaws and those who are interested in perfume, but from
a marketing standpoint, you're probably not going to get the
value that is the best use of your resources. And
another big change around this time was to how people
were accessing information on the web in general. While it
(33:46):
might be theoretically possible to navigate the web simply by
following various hypertext links to get from point A to
point B, to actually do it would be monumentally inefficient.
I mean, you might have to pass through hundreds or
thousands of sites to get from A to B. If
you're limiting yourself to just following links that are on
(34:07):
the page of A and trying to get to B
that way. This can be a fun game, by the way,
a kind of six degrees of separation kind of game,
but it's not an efficient way to navigate the web.
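To make that inefficiency concrete, here's a small sketch of the link-following game over a made-up set of pages; a breadth-first search finds the shortest chain of links from A to B:

```python
from collections import deque

# A tiny, invented web: each page links to a few others.
LINKS = {
    "A": ["news", "blog"],
    "news": ["weather", "sports"],
    "blog": ["recipes"],
    "weather": ["B"],
    "sports": [],
    "recipes": ["B"],
}

def link_path(start, goal):
    """Breadth-first search for the shortest chain of hyperlinks."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in LINKS.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of links connects the two pages

print(link_path("A", "B"))  # ['A', 'news', 'weather', 'B']
```

Even on this toy graph it takes three hops and several dead ends; on the real web the chains are vastly longer, which is part of why search engines won.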
So Internet search engines were way more useful. You would
put in your query in the search bar and boom,
you got results that, in theory at least, best
matched whatever it was you were looking for. But this meant
(34:31):
that search engines were incredible aggregators for user behavior. Not
only could search engines keep a tally of which search terms
were the most popular at any given time, they could
also consult those cookie files on users' computers and see
what other sites users had been visiting. And search engines
could sell ads, too. In fact, search engines made a
(34:53):
subtle shift from being all about indexing the web to
being more about monetizing search and getting into the data
collection and advertising businesses. That's what happened to Google. I
mean, years ago you might have called Google a
search company. Uh, this was before the days when
tons of other Google products emerged, like Google Cloud and
(35:15):
Android and all that stuff. But at its heart, what
Google really was wasn't a search engine. It was an
advertising company. Companies could pay Google to have their sites
pop up in searches as, you know, results in a
search. They would be ads, they would be clearly
marked as ads, but they would be advertising spots. And
(35:35):
it gave those companies way more visibility than they might
otherwise have. And Google can target specific users with ads
that might appeal to them through a collection and analysis
of all that personal data, both from the user's cookies
and their own search history. And if that person
happens to be using Chrome or an Android device, well
(35:56):
that's another kettle of fish entirely. So things took a
massive downturn shortly after search engines began to dominate the
data collection space, and that's because of the dot com
bubble bursting in two thousand, two thousand one, and the
world slid into an economic recession in general that particularly
(36:16):
hit the online world hard. That was further exacerbated by
the terrorist attacks on the United States on September eleven,
two thousand one. But the industry did continue. There were
just fewer players. Some of the bigger companies were able
to survive all that turmoil, and the fact that the
smaller competitors were largely wiped out meant that these companies,
(36:37):
companies like Google and Amazon, now had an even
more dominant position in the online world, so it would
be very hard to overtake those companies, and both Google
and Amazon are famous for jealously guarding their dominant positions.
Move forward a few more years and you get the
rise of social network platforms. MySpace was one of
(36:58):
the first really big social networks. It launched in two
thousand three, but Facebook soon followed in two thousand four.
MySpace would be the most popular social network until
about two thousand eight. That's when Facebook overtook it. But
Facebook also introduced Facebook Ads in two thousand seven. Facebook
offered even more data points about people than search engines could.
(37:21):
I mean, people use social networks to connect with friends
and to share stuff about themselves and the people they like.
So people were willingly giving up tons of information that
an advertiser might find extremely relevant. And so Facebook's business
model largely fell into the realm of collecting and analyzing
(37:42):
information about users, both from their activities on the platform
itself and through the consultation of cookies, and Facebook could
use that information to place highly relevant ads on specific
user pages. This became the absolute foundation of Facebook's strategy
(38:03):
for the desktop experience, and how they generate revenue, and
the value proposition Facebook has for advertisers is incredible. They
can say, Hey, practically everyone in the world is on
our platform, and we know what they all like and
what they dislike because of cookies and their activities on Facebook. Plus,
(38:23):
we've developed algorithms that keep people on Facebook longer. So
if you advertise on us, we can make sure that
the people who are most likely to respond to your
ad are the people who see that ad, and we
can make sure that they see your ad a lot.
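That pitch boils down to matching ads against what the platform knows about each user. Here's a deliberately simplified sketch of that kind of interest matching; all of the profiles, names, and keywords are invented for illustration:

```python
# Hypothetical sketch of interest-based ad targeting: score each user's
# recorded interests against an ad's keywords and show the ad only to the
# users most likely to respond. Every profile here is made up.

USERS = {
    "alice": {"toasters", "kitchens", "baking"},
    "bob":   {"chainsaws", "camping"},
    "carol": {"baking", "coffee"},
}

def pick_audience(ad_keywords, users, min_overlap=1):
    """Return users whose interests overlap the ad's keywords, best matches first."""
    scored = [(len(interests & ad_keywords), name)
              for name, interests in users.items()]
    return [name for score, name in sorted(scored, reverse=True)
            if score >= min_overlap]

print(pick_audience({"toasters", "baking"}, USERS))  # ['alice', 'carol']
```

Real platforms use machine-learned models over thousands of signals rather than simple keyword overlap, but the basic idea, score every user and show the ad to the best matches, is the same.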
This also began to introduce the era of retargeting.
(38:45):
So if you've ever had the experience of going to
different websites but seeing the same ads play on those websites,
you've been experiencing red targeting. Maybe you were doing some
comparison shopping for I don't know how, with a brand
new toaster, but for whatever reason, you didn't pull the trigger.
Maybe you went as far as looking at a specific
(39:06):
toasters page on a shopping site like Amazon, but you
didn't move forward with the purchase. But now suddenly everywhere
you go you seem to be seeing ads for that toaster,
or maybe it's a different but similar toaster. This is retargeting.
The cookies on your computer have a record of you
visiting that toaster page, and they also include the fact
(39:29):
that you didn't actually purchase the toaster when you went
to that page. Those cookies allow various websites that have
deals with ad companies that are working with this toaster
manufacturer to dynamically insert ads for that toaster on all
the various sites that you go to with the goal
of convincing you to finally follow through on your purchase
and buy that dang toaster. All right, we are still
(39:53):
only just approaching the water line from the
tip of the iceberg. The really valuable stuff is below
the surface, and we have Apple to thank for it.
I'll explain more when we get back. On June twenty ninth, two thousand seven,
(40:18):
Apple launched the iPhone. While other smartphones predated the iPhone,
it's safe to say that the iPhone was the first
truly successful consumer smartphone that appealed to the mainstream market. Previously,
smartphones pretty much just targeted geeks and executives. Now everybody
wanted one. The smartphone would open up brand new opportunities
(40:38):
for data collection. In two thousand eight, Apple launched the
first iPhone to include a GPS chip. This allowed for
really useful features for the users, such as real time
maps that can give you an accurate view of your
current location. That was huge, right, enormous benefit, But it
also meant that along with all the data that companies
(40:58):
could access thanks to cookies and social media posts and
search engine activity, they could now also add location data
to the mix as well. Now companies could know what
you were doing online and where you were in the
actual world. Now, to be clear, companies could do that
for folks who were accessing the Internet on desktops or
laptops as well. It's just that we don't tend to
(41:21):
carry desktops around with us at all, and for those
of us who do use laptops, we don't have them
on and active all the time. But a smartphone, that's different.
That's a device that can ping back to home base
several times a day, sometimes more than a hundred times
a day, and that ping can include stuff like how
much screen time you've spent on the device that day,
(41:42):
what apps you've been using, what sites and services you've accessed,
and where in the world you happen to be. And again,
this allows companies to target more specific ads your way,
and now those ads could be location based as well
as activity based. So maybe you're wandering around a new
city and you start seeing ads for specific locations like
(42:03):
restaurants or shops or amusement parks or whatever. On the
one hand, that could be really useful as you try
to find things that you might want to experience in
a new place. But on the other it indicated that
the smartphone was really gathering a ton of information about you.
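A toy sketch of what that kind of location-based ad selection could look like; the businesses, coordinates, and radius here are all made up:

```python
import math

# Hypothetical location-based ad pick: given a phone's reported coordinates,
# serve ads for nearby participating businesses. All locations are invented.
PLACES = [
    ("Pat's Pizza",  33.7490, -84.3880),
    ("Toast Museum", 33.7600, -84.3900),
    ("Gear Outlet",  34.0500, -84.0000),
]

def nearby_ads(lat, lon, radius_km=5.0):
    """Return names of places within radius_km of the reported position."""
    def dist_km(lat2, lon2):
        # Equirectangular approximation; accurate enough at city scale.
        x = math.radians(lon2 - lon) * math.cos(math.radians((lat + lat2) / 2))
        y = math.radians(lat2 - lat)
        return 6371 * math.hypot(x, y)
    return [name for name, plat, plon in PLACES
            if dist_km(plat, plon) <= radius_km]

print(nearby_ads(33.7500, -84.3885))  # ["Pat's Pizza", 'Toast Museum']
```

Real ad systems fold location into much richer profiles, but simple proximity filtering like this is the core of "you're near our restaurant, here's an ad."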
And then there's that scenario I mentioned at the top
of this episode. If you hang out with other folks
(42:24):
and everyone happens to have a smartphone, everyone is generating data,
whether they're actively using their phones or not. And part
of that data isn't just what they're doing or where
they are, but also who they're with. And now our
relationships with one another, the time we spend with each other,
and the places where we spend it, that all becomes
(42:45):
part of the data grab as well. It represents another
way that companies can leverage and exploit the information we generate.
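Here's a deliberately simple sketch of how co-presence could be inferred from two phones' location pings; the thresholds and data are invented:

```python
# Hypothetical co-presence inference: two devices pinging from nearly the
# same place at nearly the same time get flagged as "together."

def were_together(pings_a, pings_b, max_meters=50, max_seconds=300):
    """Each ping is a (unix_time, x_m, y_m) tuple on a flat local grid."""
    for ta, xa, ya in pings_a:
        for tb, xb, yb in pings_b:
            close_in_time = abs(ta - tb) <= max_seconds
            close_in_space = ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= max_meters
            if close_in_time and close_in_space:
                return True
    return False

phone_a = [(1000, 0, 0), (5000, 800, 120)]
phone_b = [(1200, 30, 10), (9000, 4000, 4000)]
print(were_together(phone_a, phone_b))  # True: pings 200 s and ~32 m apart
```

Run over weeks of pings from millions of devices, even crude logic like this starts to reconstruct a social graph nobody explicitly shared.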
One of the industries that grew out of all of
this was the data brokerage industry. These are companies that
collect and maintain massive data repositories about well about us.
(43:06):
These companies buy and sell personal information as if it
were any other commodity, because for a lot of entities
out there, that's exactly what our personal information is. So
even companies that might not have the means to
collect your personal data directly through their own services can
pay to get hold of those sweet ones and zeros
(43:27):
by using data brokers. The U.S. state of Vermont
passed a law a couple of years ago that mandated
that any company that bought or sold third party personal
data had to register with the Secretary of State. As
a result, one hundred twenty one companies registered with the
Vermont state government. The companies didn't have to provide information
(43:49):
about who was in their databases. They didn't have to
say how many people were in them. The rules didn't
require that these companies make available any information about consumers
to those consumers, which means if you wanted to check
to see how much dirt any of those companies might
have on you, and you, you know, lived in Vermont,
(44:12):
well you would kind of still be out of luck
because the law didn't go that far. The law did
require that the companies had to inform the government about
any kind of opt out features that the companies provided,
but that only mattered if they actually were providing an
opt out feature for the average consumer. What this means
is that there are more than a hundred companies trafficking in
(44:34):
personal data, and yours could be among them, and to
opt out of that system, you would have to contact
each and every one of these data brokers and go
through whatever process they might have in order to opt out,
because by default we are all opted into that system.
In some cases, we did have a choice, in the
(44:55):
sense that the choice was whether or not we wanted
to use a specific service or platform or visit a
specific website. Again, this is frequently where those long user
agreements come in, the ones most of us skip right over,
so that we just click on that I agree button
and get on with it. We might not be aware
that we just gave consent to have all of our
data collected, but that's arguably on us if we didn't
(45:18):
bother to read the terms and conditions. But in other cases,
we might not ever have really had a chance to
opt out of a specific data broker's collections at all
because we never had direct contact with some of those
data brokers. Because again information is bought, sold, and traded
like crazy. So some of these companies are just buying
(45:39):
up data that was collected by someone else and we
only ever had contact with that initial point. So as
an example, let's say that you sign up for a
social networking platform. We'll call it SpaceLook. So you
sign up on SpaceLook, and there's this long, boring
passage of information that you've got to scroll through so
(46:00):
that you can click the I Agree button, so you
zoom past all the dull stuff so that you can
finally get to uploading photos of your adorable kitty cat online.
But in that dull passage, there are terms that explain
that SpaceLook will be collecting information about you and
then using the information to serve you ads. And in addition,
SpaceLook might also sell or share your personal information
(46:23):
with other third party entities. So you've effectively signed over
your personal data to SpaceLook for it to do
whatever it wants with it within the confines of the
agreement that you've clicked on, and your info might get
sent to various data brokers that otherwise you have never
heard of and never contacted. But wait, it gets worse. Recently,
(46:47):
Facebook and Google have been in the news for putting
up a bit of a fuss when it comes to
the types of data collection that are available to them.
Facebook got upset at Apple for a new change in
privacy settings on iOS devices, one that requires Facebook to
inform users about how it wants to collect data regarding
user activity on those iOS devices. So like iPhones and
(47:11):
you know, iPads, things like that, that activity includes the
use of other apps on the device. So, in other words,
Facebook collects information on users app activity even if that
activity isn't directly involving Facebook itself. So if you do
any banking on your phone, or shopping, or maybe you
play certain mobile games, well, Facebook wants to know about
(47:34):
all that because that data is valuable. Apple's change requires
Facebook to alert users that it wants to collect that data,
kind of like this: this app wants to know your location.
Is that okay? It's similar to that, And it gives
the chance for users to opt out of that process
right then and there in that alert. And Facebook hates
(47:57):
that because if you give people the choice to
not be tracked, often they take that choice. They just,
you know, have to know about it first. So if
you aren't allowed to bury that notice in page two
of settings or in a deep user agreement, Well, then
a lot of folks might just take the option to
(48:18):
opt out, which means that's a hit on Facebook's revenue,
and Facebook, not a fan of that, has launched
a little bit of a campaign that essentially argues that
Apple's new rules are harmful to small businesses and that
people should feel badly about opting out, which is, um,
super disingenuous if you ask me. Facebook, I don't think,
(48:40):
is at all concerned about small businesses except to the
extent to which those small businesses spend money on Facebook. Similarly,
Google was in the news when internal documents leaked showing
that the company had leaned hard on Android device manufacturers
to hide geo tracking opt out features deep in settings.
(49:04):
And so Google was essentially saying, hey, that's cool that
you want to make this Android smartphone, do us a
solid for the geo tracking stuff. Hide that like on
page two or three of your settings, so that nobody
ever bothers to go that far. And the goal was
just to make it harder for people to find where
they could turn off geo tracking, so that the company
could continue to collect that sweet data without too many
(49:26):
people opting out. Dirty pool, Google. Just a few years ago,
all this data would have been alarming, but it also
would have been kind of burdened by the fact that
there's just so much information that it's hard to do anything
with it. So it's one thing to collect enormous amounts
of personal information, but it's another to actually find a
(49:47):
meaningful use for all that data. You get a lot
of noise along with a little bit of signal. But
over the past few years, data analysis has advanced dramatically
and it's become much more sophisticated. Big data, which was
a buzz term that came largely from the marketing world,
is a real thing now, and when paired with systems
(50:07):
that use strategies like machine learning, advertising companies are able
to get incredibly detailed looks at who each of us
happens to be. Our data can be parsed and contextualized
in millions of ways that are incredibly valuable to countless
people and organizations. Some of those might be largely benign,
(50:28):
or at least no more malicious than your typical capitalistic endeavor,
but some might be way more malevolent. And of course
I haven't even touched on the burden of good stewardship
when it comes to protecting data. Many of those one
hundred twenty one data broker companies have been targets of hackers, and more
than a few have had some pretty nasty data breaches
(50:50):
over the years. So I guess the whole point of
this episode is to really explain how technology in the
digital age largely centers around the collection and exploitation of information,
and that a lot of that information comes from people
like us, and that if we feel strongly about that,
we have to take steps to address this issue. Unfortunately,
(51:13):
right now, those steps are often laborious and convoluted. It's
easy to get discouraged. It's easy to prioritize convenience over privacy.
It's easy to give in to the statement that Mark Zuckerberg
famously made in two thousand ten when he said privacy is dead.
But as we see implementations of systems that exploit our data,
(51:37):
and as these become undeniably more invasive, it might benefit
us to look at them more closely and act in
our own self interest, because I assure you most of
these companies are not going to do that for us.
I'm talking to you, Stephanie, yes, you. I've personalized this
episode for every listener, and you, Stephanie, need to
(52:01):
take action. I'm kidding. I didn't personalize this podcast, and
anyone not named Stephanie is probably just confused, and anyone
who is named Stephanie is probably flipping out. But no,
I just wrote that as a joke. But it is
the sort of thing that websites and apps and other
Internet related services can do for reals. In fact, technically,
(52:21):
with enough work, a podcast like this could probably do
it too. It would just require me to go through
a very long list of names and record them and
then have dynamic insertion of that statement in the podcast,
targeted to specific listeners. It would be a lot
of work, is what I'm saying, something that I am
not willing to do, but it could be done, and
(52:42):
maybe that's not always a good thing. All right. That
wraps up this very long soapbox edition of tech Stuff.
Hope you guys learned something in that and found some
value in that discussion. I will be doing some more
episodes about privacy related materials like COPPA. I do want
to talk about that and the intent of that legislation
(53:05):
as well as the actual impact of it, So look
forward to that in the future. But if you have
suggestions for things I should cover in future episodes of
tech Stuff, reach out to me. The best place to
do that is on Twitter. The handle for the show
is TechStuffHSW, and I'll talk to
you again really soon. Tech Stuff is an I Heart
(53:29):
Radio production. For more podcasts from I Heart Radio, visit
the I Heart Radio app, Apple Podcasts, or wherever you
listen to your favorite shows.