
June 9, 2021 49 mins

The Children's Online Privacy Protection Act, or COPPA, has been around since 1998. So why was it suddenly in the headlines in 2020? We explore this law, its purpose, and what consequences it has for the world of online content and advertising.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to Tech Stuff, a production from iHeartRadio.
Hey there, and welcome to Tech Stuff. I'm your host,
Jonathan Strickland. I'm an executive producer with iHeartRadio
and I love all things tech. And on Monday's episode,
I looked at the topic of privacy, or rather, privacy and

(00:26):
the Internet and how companies collect and use private information. Essentially,
the point of that episode was to show how in
the era of big data, practically everything we do generates
at least a data point that goes into a constantly
more detailed depiction of who we are, digitally speaking, and

(00:49):
that numerous companies are profiting off that information. And sometimes
it means we as individuals actually see some sort of
benefit from that, right? Like targeted personalized experiences can
be a benefit, but sometimes it really just means we're
being exploited in some way. Now, you could say that
for adults, this is just the price you pay to

(01:10):
be a modern citizen of our digital world, particularly when
it involves joining various online platforms and services that include
statements that explain that the service is going to collect, share,
and leverage data from users. And you could argue, hey,
you agreed to do this when you signed up for
an account, even if you didn't bother reading the fine print. However,

(01:33):
it is a different matter entirely when it comes to kids.
Kids don't have the awareness of what it means to
share information about themselves. They don't understand the potential consequences.
So it wouldn't be ethical to convince a kid to
sign any sort of binding agreement without that kid's parent

(01:53):
or guardian present and consenting to it. And so when
it comes to the privacy of kids, things take a
pretty serious turn. In the United States, that includes the
passing of a law in nineteen ninety eight called the Children's Online
Privacy Protection Act, or COPPA, C-O-P-P-A. Today,

(02:14):
I want to explore what prompted the drafting and passage
of COPPA, and some cases in which companies have been
found guilty of failing to comply with the rules of COPPA,
and how that has affected content on the Internet in general.
Generally speaking, when it comes to technology and the law,

(02:34):
there is a lot of opportunity for messiness. Technology advances
at a far faster pace than what we see in
the legal world. Heck, the folks who draft legislation tend
to be let's say a bit on the older side,
which can also mean that they can sometimes be a
little bit out of touch when it comes to modern technology.

(02:56):
And we see this all the time with politicians just
struggling to understand the scope and the effect of technology
while simultaneously trying to draft legislation that in some way
intersects with technology. It doesn't always go well, and often
this means there's a figurative disconnect between the law and
the tech, and that can lead to unintended consequences. At

(03:19):
the same time, we can see the clear need for legislation,
at least in some cases. The fact that there aren't
hard and fast legal protections for online privacy in general
has led us to this condition in the United States
in which our data becomes digital currency. Only it's currency
that we, the generators, the people that are making that data.

(03:43):
We don't get to profit off of that, at least
not you know, directly. Other people and companies are making
billions of dollars off the data that we create, which
doesn't seem terribly fair, does it? That we're not compensated
for generating all this information. Our compensation tends to be

(04:03):
the use of a service or platform. Anyway, my point
is we can have situations in which we need legislation,
and yet we can also have those situations where the
laws we get may not measure up to what we
actually need. So let's take a look at COPPA and
figure out if it does what it's supposed to do. Now,

(04:25):
of course, this means we also have to look at
some history, and a predecessor to COPPA is an industry
organization called the Children's Advertising Review Unit, or CARU,
C-A-R-U, which formed in nineteen seventy four out
of the Better Business Bureau here in the United States. Now, traditionally,

(04:46):
marketing targeted adults, right? It was adults who heard all
the different ads, But by the nineteen seventies, television programs
that were aimed specifically at kids were really becoming a
regular thing. And if the audience for those shows were kids, well,
that meant that the ads that were being displayed during
those shows were also being directed towards kids, and that

(05:09):
meant marketers had to craft ads that would appeal to
kids potentially so that those kids would go and then
convince their parents to buy whatever stuff they were seeing
on television, whether it was a toy or a cereal
or whatever. But again, this brings up some pretty tough
ethical questions. How do you market to kids who aren't

(05:29):
old enough to make decisions for themselves? Rather than risk
having the government step in and get involved, the television
and advertising industries saw the wisdom in a self regulating body,
and CARU created some basic core principles which would also
become important for COPPA. The guidelines that CARU essentially set,

(05:52):
they say stuff like kids may not be aware that
they're even being advertised to, and they might have limited
experience with the persuasive nature of advertising. So therefore, advertisers
need to show a special responsibility when they are marketing
towards kids. And as someone who grew up in the
seventies and eighties, I can tell you that it could

(06:15):
get pretty tricky to separate the ads from the content
back in those days, partly because a lot of the
shows geared towards kids were really nothing more than, you know,
grandiose commercials for lines of toys and stuff. I'm looking
at you, Transformers and G.I. Joe and He-Man.
The guidelines stress that advertisers need to substantiate

(06:37):
their claims when it comes to ads directed at kids. So,
in other words, ads should not give kids the idea
that they'd be able to do stuff with a product
that just isn't possible, you know, like buying a He-Man
plastic sword and thinking that if you just say, by
the power of Grayskull, I have the power, you
will somehow magically get a page boy haircut and giant muscles.

(07:00):
I can tell you from personal experience that's just not
the case. Other guidelines say that advertisers should not advertise
products or services that are not appropriate for kids to kids,
So you shouldn't be getting like ads for car dealerships
in the middle of your Saturday morning cartoon block or

(07:20):
something like that. And ads should include diverse
representation within them; they shouldn't all feature the same ethnicity.
And also, ads should reinforce positive social interactions, like
being honest and such. In other words, the idea was, yeah,
we're gonna allow for advertising to kids, but let's do
it in a way where at least it seems like

(07:41):
it's wholesome. Now, I think it's fair to say that
these guidelines, which would again go on to inform the
rules for COPPA, weren't the product of a necessarily sincere
concern for children so much as an effort
to head the government off at the pass. And I
say this as someone who has studied advertising and marketing

(08:03):
a little bit, and I walked away with the distinct
impression that ethics are, you know, mostly something that
happens in other industries compared to marketing and advertising. Sometimes
ethics were seen as something of a drawback in that field,
particularly if you study the advertising of the fifties and

(08:23):
sixties. The Stuff You Should Know guys have done episodes
on advertising from that era, and those episodes are phenomenal
and they really detail how sleazy that world could be.
And while that kind of foolishness might be aimed at
adults without much of a blink of an eye, the
story is different when there are kids involved. That being said,

(08:47):
CARU takes the issue of advertising to kids in a
responsible way as a serious thing, because, again, if the
organization fails in this regard, if the industry starts to
fall short, the government is going to step in. There
will be strong pressure from the citizens for
the government to step in and do something about it.

(09:08):
And like many industries, the advertising world is not super
keen on the idea of regulation. But let's move forward.
So while you might look back on the children's programming
and advertising from the eighties and into the nineties and say, huh,
it seems like those guidelines were pretty loosey-goosey, and
that a lot of programming may not

(09:29):
have followed it all that closely, the fact was that
CARU was in charge of making sure things didn't go
too far astray in the United States, and then in
the early nineties, the mainstream public became aware of this
thing called the Internet. Now, those of you who have
listened to this show for a long time, or those

(09:50):
of you who have studied the Internet, know that the
Internet and its predecessors like ARPANET have actually been
around for a really long time. But outside of a
relatively small population of researchers and students and
government officials, hardly anyone knew anything about it. And that
changed with the launch and evolution of the Worldwide Web.

(10:11):
The Web was a much more user friendly and intuitive
way to access the Internet. The adoption of the Web
by the mainstream didn't happen overnight, but without the Web,
I think it's pretty safe to say that the general
awareness about the Internet and what you
could do with it would have lagged behind by several

(10:32):
years at the least. Anyway, once websites started to really
be a thing, and once companies began to kind of
cautiously dip their corporate toe in the web and see
how they might conduct commerce and further, how they might
advertise to people online, CARU really began to get a
bit proactive in the online space. There was a legitimate

(10:54):
concern that one of the things that makes the web great,
that being that it's easy to access stuff if you
just have a browser, an internet connection and some basic
web skills, that also makes the web a potential pitfall
when it comes to how children access and process information. Moreover,
as I mentioned in Monday's episode, it didn't take very

(11:16):
long at all for companies to use the Internet in
order to start gathering information about users in order to
advertise to them more effectively, And it wasn't long at
all before companies began to build out databases of user information.
And while you might make the argument that a mature adult,
or you know, at least someone who's of an adult

(11:37):
age has the wherewithal to sign over their right to
privacy with an understanding of what that actually means, the
same can't be said of kids, and thus CARU focused
on how various sites and services would gather and use
information online when it came to the information of children.

(11:57):
In the mid-nineties, CARU published a new set of guidelines called
Interactive Electronic Media. This was in an effort to get
ahead of issues that were starting to pop up thanks
to the web. The goal was to create new guidelines
for the online space that would protect the privacy of
those under the age of thirteen, which was kind of

(12:17):
an arbitrary age to pick. It reflected the ages that
CARU was focused on as part of advertising in general,
and it was easy to see the need for this
approach because even as early as the mid-nineties there were companies
that were advertising to kids online and they weren't necessarily
following the CARU guidelines that had been in place for

(12:39):
like television advertising. Plus, there was the issue of companies
collecting information about these young consumers without parental consent. The
canary in the coal mine for this issue turned out
to be a website called KidsCom, which, as the
name implies, was a child focused site. KidsCom used
tools like registration forms, pen pal programs, and contest

(13:03):
entries to gather information about their users, that is, kids
who are visiting the site, and hey, this is a
good time to remind you that if you happen to
be entering a sweepstakes or contest online or even offline,
what you're really doing is you're handing over your information
to some third party, and you can bet that information

(13:24):
is going to be used in some way. It may
be used directly by the entity you hand it to.
So it may be that, for example, it's a magazine
publisher and you filled out this information. Well, now the
magazine publisher is going to market other magazines directly to you.
Or it might mean that your information gets put into
a database that other companies can pay to access, or

(13:48):
it could be a combination of the two. Anyway, KidsCom
was doing this, but the big problem was that
a lot of KidsCom users were, you know, kids.
The FTC investigated KidsCom after receiving a complaint letter
about the site from the Center for Media Education. The
FTC found KidsCom in violation of several FTC rules

(14:10):
with regard to the collection and sharing of data, and
it had turned out that KidsCom was in fact
sharing a database of user information with third parties. On
the bright side, the data was in aggregate, so it
was not formatted to reveal personal information unique to individuals.
It was all pooled together so that

(14:31):
there were no personally identifiable people in there. It could
have been worse, is what I'm saying. KidsCom agreed
to change its ways and to conform to the rules,
and that site stuck around till two thousand nineteen. But
the case of KidsCom laid bare the potential dangers
of data collection when it comes to young kids. This
matter was seen as a serious one and it led

(14:53):
to the Federal Trade Commission, or FTC, drafting the
Children's Online Privacy Protection Act, COPPA, and again that was
passed into law in nineteen ninety eight. That law used CARU's Interactive Electronic
Media Guidelines as a foundation. The law applies to websites
and service operators that either directly target children for the

(15:14):
purposes to collect, use, or disclose personal information of those children,
or that have actual knowledge that those sites and services
are in the process of doing whatever it is they
are doing, also collecting, using, or disclosing children's personal information. So,
in other words, whether a company is setting out explicitly
to collect information about specific kids or that just happens

(15:38):
to be a byproduct of whatever the company is doing,
the law applies to those types of entities.
But as we'll see, this approach is not quite as
black and white as it sounds. We're gonna take a
quick break and we'll be right back. Okay, So I

(16:01):
want to clarify something I said before the break, which
is how the law applies to sites and services online. Now, essentially,
an Internet based entity would need to follow COPPA if
that entity's service was targeting those under thirteen and also
collecting personal information about those users, or allows a third
party to collect information about those users, like an advertiser,

(16:25):
or if the entity runs some sort of ad network
or uses plug-ins that also collect information, such as
the Honey extension; that one collects personal information, and
if Honey knows for a fact that among the users
of its service there are children under the age of thirteen,

(16:47):
it has to comply with COPPA. So, in other words,
if you know for a fact that the information you
are collecting includes information from kids, COPPA applies. Third, if
your site or service aims for a general audience, but
you happen to know that within that general audience there
are people under the age of thirteen, and you are
gathering information on your audience, COPPA applies. And this is

(17:11):
kind of at the heart of what would cause issues
for content creators on YouTube. There are channels that clearly
target younger audiences, and there are some that, based on
the nature of their content, clearly aren't meant for younger audiences.
But there are tons of channels where it hits this
kind of gray area in which it might seem at

(17:33):
a casual glance to be aimed at a younger viewer group,
but in fact the channel contains content that is not
appropriate for kids. And this gets into a bunch of
tangential matters that open up so many cans of worms
that I'm just gonna touch on it here. We're not
going to dive into it. That would be an entirely
separate episode. So what I mean by all this is

(17:54):
that there are some forms of media that traditionally people
associate with children's media. So I'm thinking about stuff
like puppets or animation or video games, and yeah, I'm
sure most of y'all listening to this can think of
plenty of examples of those forms of entertainment that are
definitely and definitively not for kids, like the Broadway show

(18:20):
Avenue Q that features puppets, but that show ain't for kids.
There are lots of different animated series like anime that
are far too sophisticated and contain content that would not
be appropriate for children, might disturb them and upset them,
and it's just not meant for younger viewers. And of course,
we also know that the idea that video games are

(18:41):
for kids is really an outdated concept. I mean, nearly
forty percent of the video game playing population is between the
age of eighteen and thirty four. Another significant percentage is
over the age of sixty five. So when you look
at it, kids make up a minority of the people
who are playing video games, and yet we still have

(19:02):
this association that video games are for kids. But the
fact that these types of entertainment have this traditional association
with children's media is an ongoing issue that I'm sure
we're gonna loop back to before the end of this podcast. Anyway,
COPPA passes in nineteen ninety eight, and while I've talked about who is
subject to COPPA, I haven't really covered the actual rules yet,

(19:25):
so we're gonna go over those. And these rules come
straight from the FTC's website on COPPA. Those entities
that are subject to COPPA must, and I quote, post
a clear and comprehensive online privacy policy describing their information practices
for personal information collected online from children. They must provide

(19:45):
direct notice to parents and obtain verifiable parental consent with
limited exceptions, before collecting personal information online from children. They
must give parents the choice of consenting to the operator's
collection and internal use of a child's information, but prohibiting
the operator from disclosing that information to third parties, unless

(20:06):
disclosure is integral to the site or service, in which
case this must be made clear to parents. They must
provide parents access to their child's personal information to review
and or have the information deleted. They must give parents
the opportunity to prevent further use or online collection of
a child's personal information. They must maintain the confidentiality, security,

(20:31):
and integrity of information they collect from children, including by
taking reasonable steps to release such information only to parties
capable of maintaining its confidentiality and security. They must retain
personal information collected online from a child for only as
long as is necessary to fulfill the purpose for which

(20:51):
it was collected, and delete the information using reasonable measures
to protect against its unauthorized access or use. And they
must not condition a child's participation in an online
activity on the child providing more information than is reasonably
necessary to participate in that activity. End quote. So,
what that last

(21:12):
one means is that you can't require kids to fill
out like a lengthy online form with stuff like their
full name and home address and phone number and email
just so that they can play a game, because it's
not necessary to have that information in order for the
kid to play a game. That's what that last one means.
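For the programmers listening, that data-minimization idea is easy to picture in code. Here's a minimal, hypothetical sketch in Python; the field names and the split between necessary and excessive fields are illustrative assumptions, not anything from the FTC's actual rule text.

```python
# Hypothetical illustration of COPPA's data-minimization rule: a kids'
# game signup should only collect what the activity actually requires.
# The field names and categories below are made up for illustration.

NECESSARY_FIELDS = {"screen_name", "parent_email"}  # e.g., needed for the consent flow
EXCESSIVE_FIELDS = {"full_name", "home_address", "phone_number", "email"}


def excess_fields(form_fields):
    """Return any fields the form collects beyond what's reasonably necessary."""
    return sorted(set(form_fields) & EXCESSIVE_FIELDS)


# A form demanding a home address just to play a game would get flagged:
print(excess_fields(["screen_name", "parent_email", "home_address"]))  # ['home_address']
```

The point is simply that under the rule, handing over that extra information can't be made a condition of participating in the activity.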
The FTC also goes on to define what constitutes personal information,

(21:35):
so specifically, the FTC when they say personal information, they
mean the first and last name of a person, their
home address, including street name and town or city,
their online contact information, their screen name or handle that
can serve as online contact information on certain services and platforms.

(21:56):
A telephone number counts as personal information, a Social Security number,
or a persistent identifier that indicates a specific user across
different user sessions and different sites and services. So this
is something that would be unique to each person, but
not something that they would have necessarily provided to the service.
The service provides this to the person. So let's say

(22:18):
I'm the third person to ever sign up for Facebook,
and Facebook assigns me user number zero zero zero zero, et cetera, et cetera, three;
that's my personal identifier. It's unique to me. It means
Facebook can track me and my activities. That's the
kind of thing they're talking about there. That counts as
personal information, even though it's something that the site assigns

(22:39):
to the person, not something that the person gives to
the site. It also includes any file that has the
child's image or voice in it, so any sort of
video file or image file or audio file. It also
includes geolocation data that can pinpoint where the child is,
including a street address. It includes information collected by

(23:00):
the service from the child or child's parents that complements
the other types of information mentioned. So that's like the
registration forms and stuff I was talking about, and some
of these types of personal identification subsets, such as that
geolocation data, were not part of COPPA originally. Back
in nineteen ninety eight, geolocation wasn't much of a

(23:23):
thing because we were still waiting for GPS to be
kind of opened up to the general public. It was
still very much a military centered thing, and we were
just in the very early days of having that change.
But in twenty thirteen, the FTC expanded COPPA a bit, given that,
you know, tech like smartphones had really created a new

(23:44):
way to collect data and new types of data that
could be useful to collect. So that's when we started
to see some of those other things added in,
like geolocation and the persistent identifier number. In twenty seventeen,
the FTC would expand COPPA again to apply it
not only to web browsers and services on computers and smartphones,

(24:05):
but also to other types of Internet connected devices like
IoT type stuff. And that was because with the proliferation
of IoT sensors and toys and games, the FTC wanted
to make certain that the rules would apply to these
new technologies and protect children. I did an episode I
think it was on Forward Thinking actually the Forward Thinking

(24:26):
podcast from several years ago, where we talked about the
Internet connected Barbie doll and how that ended up posing
a potential security and privacy threat for children, and
that's the sort of thing that the FTC really wanted
to get, you know, a handle on. Now, assuming that
an entity that must comply with COPPA is following the rules,

(24:49):
then that entity should be legally in the clear, right?
As long as they're following the rules
and they're making sure they're getting parental consent first, they
can still collect data, they can still use that data.
They have to delete it after they use it, but
they're allowed to do it. But if there's an entity
that is supposed to comply with COPPA and fails to

(25:10):
follow the rules with regard to personal information, what happens then?
Well, the first thing that someone has to do is
file a complaint with the FTC or a state authority
and explain the nature of this complaint with regard to
the handling of children's information. The FTC would then investigate
this claim or a state authority would, and they would

(25:33):
see if the accused online site or service was in
fact in violation of COPPA. And if the authority determines
that a violation occurred, they can then
bring a civil lawsuit against that site or service. The
rules of COPPA state that the civil case can seek
a penalty of forty three thousand, seven hundred ninety two dollars per violation.

(25:58):
That's the maximum amount per violation, but the FTC might
not seek the maximum amount depending upon circumstances. So determining
factors include the number of children affected. If it was
a few children as opposed to a lot, then that
changes things, as does how egregious the violation was. If it's something
where it's found that a company was collecting user names,

(26:21):
for example, but no other personally identifiable information, they might
not get hit as hard. The types of personal
information involved and how the information was used factor in as well.
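For a rough sense of how those per-violation penalties can scale, here's a hypothetical back-of-the-envelope calculation in Python. The cap shown is the inflation-adjusted figure as of 2021 (older cases used lower caps), and treating each affected child as one violation is a simplifying assumption, not how the FTC actually computes what it seeks.

```python
# Hypothetical worst-case COPPA exposure: per-violation cap times the
# number of violations. Real settlements come in far lower, for the
# determining factors discussed above.

PER_VIOLATION_CAP = 43_792  # dollars; inflation-adjusted cap as of 2021


def max_exposure(violations, cap=PER_VIOLATION_CAP):
    """Theoretical maximum civil penalty if each violation drew the full cap."""
    return violations * cap


# If, say, thirty thousand children were affected and each counted as one violation:
print(f"${max_exposure(30_000):,}")  # $1,313,760,000
```

That theoretical billion-plus figure is exactly why those determining factors matter: the FTC scales what it actually seeks to the circumstances of the case.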
And like I said, COPPA allows for state agencies to
enforce compliance with COPPA with respect to entities that are
within those states' jurisdiction, so there can be state level

(26:43):
COPPA cases as well as the federal level ones. And
then we get to the concept of safe harbor within COPPA.
So safe harbor is a general concept in which a
person or company faces limited legal liability so long as
that person or company is following certain conditions. And I
know that's super general, but the concept applies to a

(27:05):
lot of different situations, so it has to be general.
So for example, a user generated content platform typically enjoys
a certain amount of safe harbor protection because those platforms
are not responsible for the content that's published by their users.
Right If I join a content platform as a user

(27:26):
and I upload stuff that is against their policies, that's
on me, not on the platform. However, as I mentioned,
safe harbor typically only applies as long as certain conditions
continue to be met. So for a user generated content platform,
one of those conditions could be that the platform has
to take down an instance of user generated content if

(27:47):
it's proven that that instance includes copyrighted material that doesn't
belong to the user. So if I start uploading, you know,
They Might Be Giants tracks, and the platform I'm uploading
them to gets a notice, hey, this guy is doing
this without our permission, then that platform would be expected

(28:08):
to you know, ban me or delete my material or whatever,
and then it would continue to enjoy the protections of
safe harbor because it actually took steps to address the issue.
If platforms don't follow whatever those rules are, and it's
a case by case kind of thing, then they no
longer enjoy the safe harbor protection. Well, with COPPA, it's

(28:28):
possible for industry groups to file for safe harbor status
with the FTC, and these groups have to establish a
set of rules and policies that are at least the
same as or stricter than those defined by COPPA. And if
they do that, then they can apply for safe harbor
status under COPPA. And then regulation and enforcement kind of

(28:49):
falls to that industry group and the organizations that belong
to that industry group. So if a company becomes a
member of a particular industry group, and if that industry group
has safe harbor status, the burden of responsibility really falls
to that industry group to make sure that all the
different member organizations are in compliance with that group's policies.

(29:11):
And I actually have a real world example I can
cite of a company that at one point belonged to
a safe harbor group and it later got in trouble
with the FTC. That company was the game developer Miniclip,
which in two thousand nine joined the Children's Advertising Review
Unit's safe harbor. So, yeah, CARU, it does have a

(29:32):
safe harbor industry group under COPPA. But in two thousand fifteen,
CARU terminated Miniclip's status as a member of that group.
I'm not actually sure why that happened. I don't know
what led to Miniclip's status terminating under that safe
harbor group, but the FTC pursued a complaint against

(29:53):
Miniclip because, according to the FTC, Miniclip continued to
display a message on its websites saying that the company
was part of CARU's safe harbor group well into two
thousand nineteen, which was years after the company had its
membership terminated, and the FTC's main complaint was that
Miniclip was misrepresenting itself. It was presenting a falsehood that

(30:15):
it was still a member of this COPPA compliant group,
when in fact that was no longer the case. So
this showed that COPPA applies not just to the direct
activities that companies engage in that involve the collection and
use of personal data that belongs to kids, but also
how those companies represent themselves or in this case, misrepresent themselves. However,

(30:36):
plenty of companies have been found guilty of violating COPPA
rules with regard to children's data. For example, back in
two thousand eight, Sony BMG, the music label, was
sued by the FTC for collecting information on an estimated
thirty thousand users below the age of thirteen through its websites.
See, Sony BMG had sites that included some social

(30:59):
networking aspects to them, and that required users to include
stuff like their names and addresses and email address and
that kind of thing, you know, the standard stuff that
you have to fill in when you create an account
on a social networking site. Now, Sony claimed on these
sites that they were not meant for people under the
age of thirteen. It was for thirteen or older, but

(31:21):
there were no measures in place to actually prevent
kids from signing up. There was no sort of age
gate process there, and the FTC alleged that not only
was Sony not preventing it, the company was aware that
thousands of users were under the age of thirteen. In
the end, Sony BMG agreed to pay a one million

(31:42):
dollar settlement to the FTC. And by in the end,
I mean the day after the FTC filed the lawsuit,
Sony agreed to settle. Sony also agreed to delete all
personal information related to users under the age of thirteen,
along with some other measures that were mandated by the
terms of the settlement. Sony's big issue was that, while

(32:03):
it proclaimed that the sites were intended for those thirteen
and older, it really had no measures in place to
actually enforce that, and kids under that age could register
and even include their actual age in the process, and
that meant Sony was knowingly collecting data belonging to people
under the age of thirteen. That's a big deal. Like,

(32:23):
if you unknowingly collect the data of people under the
age of thirteen, you actually have a bit of a
defense if you can prove that you did so unknowingly.
But when you knowingly collect it, it's a different kettle
of fish. When we come back, we'll talk about a
few other cases in which companies have tried to sidestep
the issues of COPPA entirely, and also how COPPA really

(32:46):
freaked out a huge population of YouTube creators back in
twenty nineteen. But first, let's take another quick break. So when it
comes to companies trying to avoid dealing with COPPA, we
got to talk about Facebook. Pretty much. Since the time

(33:09):
Facebook opened up beyond college students, it has maintained that
the service is for people thirteen or older. To make
a Facebook profile, you have to include your name, an
email address, and a birth date. Even if you
don't intend on showing anyone else your birth date, Facebook uses
that information, in part, to check your age. So

(33:29):
if you're younger than thirteen, Facebook won't let you make
a profile. Why? Because then Facebook would be subject to
the rules of COPPA, and as a platform that hosts
user generated content, that would be really hard for Facebook
to comply with. And moreover, it would mean that Facebook
would have to implement some serious restrictions on how it

(33:50):
gathers information, because, as I've pointed out many times, the
vast majority of Facebook's massive revenue comes from how the
platform harvests and uses our personal information. So Facebook has
the age gate approach. If a user fakes their birthday
to get in, well, that's not really Facebook's fault, is it.
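An age gate like the one just described is simple in principle: compare the self-reported birth date against a cutoff. Here's a minimal sketch in Python (illustrative only; the function names are mine, not Facebook's actual code):

```python
from datetime import date

MINIMUM_AGE = 13  # COPPA's threshold for parental-consent requirements

def age_on(birth_date: date, today: date) -> int:
    """Age in whole years as of `today`, accounting for whether the
    birthday has come around yet this year."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def may_register(birth_date: date, today: date) -> bool:
    """Gate registration on the self-reported birth date."""
    return age_on(birth_date, today) >= MINIMUM_AGE

# An eleven-year-old is turned away; an adult gets through.
print(may_register(date(2010, 6, 1), date(2021, 6, 9)))  # False
print(may_register(date(2000, 6, 1), date(2021, 6, 9)))  # True
```

Of course, the check is only as honest as the birth date typed into the form, which is exactly the loophole at issue here.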
That's just someone being dishonest, and Facebook isn't knowingly collecting

(34:13):
the data of a child. The company would still be
collecting that data, mind you, but as long as there
was no indication that the profile actually belonged to a child,
the company would have the defense that, according to the
information submitted to Facebook, that user was over the age
of thirteen. And so, by positioning itself as a company
that has a social networking site intended for thirteen and older,

(34:36):
Facebook isn't subject to COPPA. The system isn't perfect by
any stretch of the imagination, but it's hard to argue
that Facebook is just using smoke and mirrors to seem
as if it's complying. It might be easy to circumvent
the rules, but there are rules. At various points since
COPPA became law, lawmakers have considered expanding the rules and

(34:59):
upping the age to eighteen years old. Some lawmakers have
asked why thirteen has been this arbitrary age, which kind
of dates back to the old CARU guidelines, and so
far the government has not increased the age limit on COPPA,
which is a good thing for platforms like Facebook because
a change like that would have a massive effect on
that social network, as well as on countless other sites.

(35:23):
And then we come to YouTube. Yikes, alright, so YouTube
has had more than a rough recent history when it
comes to content and kids. First of all, YouTube has
a policy that's pretty much the same as Facebook's. You
are not supposed to create a YouTube profile unless you're
at least thirteen years old, and like Facebook, Google age

(35:46):
gates this. Of course, someone might lie about when their
birthday was, or a parent might set up an account
and fudge a birthday so that their kid can watch
stuff on YouTube. It's also entirely possible to just go
to YouTube without being on a profile at all and
just watch content on YouTube as an anonymous user. So
while YouTube age gates profiles, the platform doesn't actually age

(36:10):
gate the general content on YouTube itself, at least not
to that extent. And what's more, YouTube slash Google. When
I use YouTube as a company name, you can probably
substitute Google or even Alphabet in there, because it's all
one big, dysfunctional family. Anyway, YouTube knows all about this,

(36:31):
which I mean no surprise that company is in the
business of collecting and understanding our data better than we do.
So in meetings with big companies like Hasbro and Mattel,
you know, big toy companies, YouTube has bragged about how
it is a platform that is incredibly popular with children,
like kids between the ages of two and twelve and
stuff like that, so they're well aware that young kids

(36:55):
are watching YouTube. And then there's YouTube Kids, which is
the actual app that's supposed to filter content by age
group so that you can select using the app which
videos should be shown to your kid, as in, like
which videos that are appropriate to certain ages are allowed
to be shown to your kid. Uh. The ages I

(37:15):
think are five, eight, and thirteen, and it just depends
on which one you set when you set up the account.
Like the standard version of YouTube, YouTube Kids generates revenue
through ads, and Google got into trouble with this with
the Campaign for a Commercial-Free Childhood and the Center
for Digital Democracy. They both argued that the way that

(37:38):
ads were being presented to children on YouTube Kids made
it seem like the ads were part of the content itself,
which was considered to be misleading and in violation of COPPA.
So Google subsequently made those ads stand out a little
bit more from the content itself in order to make
a more clear divider between what was an ad and

(37:58):
what was content. And my guess is you've heard the
stories about weird and disturbing videos popping up both on
YouTube and ultimately on YouTube Kids that were specifically targeting children.
These videos usually involve recognizable characters, ones that clearly have
not been licensed, but characters like Elsa from Frozen or

(38:22):
Spider Man from Marvel, and they're engaging in all sorts
of weird and sometimes upsetting activities. Sometimes it's live action
people dressed in costumes. Sometimes it's crude animation. Often the
videos are completely wordless and just set to music, which
means there's no language barrier there. So they can have
a pretty wide appeal globally, and they frequently involve activities

(38:44):
that kids find fascinating. You know, stuff that happens in
the world that kids think is really unusual or strange,
like pregnancy or toilets or, you know, all sorts of
things of that nature. Getting a shot from the doctor.
That's a big one too. It's definitely not high cinema,
but it's the kind of stuff that kids really fixate

(39:06):
on to different degrees and for different reasons. So through
a combination of using meta keywords and other means, these
videos would perform really well in YouTube's algorithms and
often make the transition to YouTube kids, even if arguably
the content was not appropriate, and then kids would stumble
across them, and that in turn would upset parents who

(39:28):
found out that their kids were watching these things, and
that ultimately made the news. Now, this issue had been
going on for years, really, but it really came to
light in twenty seventeen. That's when it made headlines
in the United States and it brought a pretty harsh
spotlight on YouTube as a result, so the company began
banning thousands of these garbage content channels, but now found

(39:50):
itself under some serious scrutiny. And that scrutiny included people
who were concerned that videos that seemed like they might
be aimed at kids were in fact not kid friendly,
and that because YouTube's revenue model depends heavily on advertising,
it's also meant that kids might be seeing ads that
weren't really appropriate or didn't comply with CARU guidelines, and

(40:11):
also that YouTube might be collecting information about kids without
parental consent. In September twenty nineteen, the FTC and the
New York Attorney General reached a settlement with YouTube regarding
an allegation that the company had been collecting personal information
in the form of persistent identifiers used to track specific

(40:32):
users as they navigate through the site and beyond, and
that that included children under the age of thirteen and moreover,
that YouTube did not first notify parents about this and
get the parents' consent, which thus was a violation of COPPA.
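A "persistent identifier" in COPPA's sense is just a durable token, often a cookie value, that lets a service recognize the same browser across visits and build up a viewing history around it. A toy sketch of the mechanism (illustrative only, not YouTube's actual implementation):

```python
import secrets

class CookieJar:
    """Stands in for a browser's cookie store."""
    def __init__(self):
        self.cookies = {}

def track_visit(jar: CookieJar, page: str, visit_log: dict) -> str:
    """Assign a persistent ID on first visit, then reuse it to link later visits."""
    uid = jar.cookies.get("uid")
    if uid is None:
        uid = secrets.token_hex(16)  # survives across sessions until cleared
        jar.cookies["uid"] = uid
    visit_log.setdefault(uid, []).append(page)  # a profile accumulates per ID
    return uid

log = {}
browser = CookieJar()
first = track_visit(browser, "/video/abc", log)
second = track_visit(browser, "/video/xyz", log)
print(first == second)  # True: same browser, same identifier across visits
print(log[first])       # ['/video/abc', '/video/xyz']
```

The point of the FTC's complaint was that identifiers like this were being attached to viewers the service had reason to believe were children, without parental notice or consent.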
That settlement meant that YouTube would have to pay one
hundred seventy million dollars in fines, which was the largest

(40:55):
amount for a COPPA case at that point. The FTC
chairman said, quote, YouTube touted its popularity with children to
prospective corporate clients, yet when it came to complying with COPPA,
the company refused to acknowledge that portions of its platform
were clearly directed to kids. There's no excuse for YouTube's
violations of the law. End quote. The complaints stated that

(41:18):
YouTube positions itself as a general audience site, but in
fact has many channels clearly geared toward younger audiences. For example,
videos of people unboxing toys that are meant for little
kids seem to fall pretty neatly into that category. And
so YouTube, as a platform that knowingly played host to
child directed channels, had a responsibility to make certain those

(41:42):
channels and then, more importantly, the advertising on those channels
complied with COPPA rules. Well, there's nothing like a big
old fine to incentivize a company to change its policies,
and that's what YouTube did. Creators now have to indicate
whether their channels are child directed or not. They can
also label specific videos on a case by case basis

(42:05):
as to whether or not that video is directed toward children.
Any video determined to be child directed, whether because the
creator made that indication or because YouTube decided that was
the case, would not be allowed to include certain ways
of collecting personal
information from viewers. Now, most creators don't actually collect information

(42:29):
from their viewers at all, at least not to
this level. Most of the creators who are YouTube partners
and who are running ads on their videos are relying
on YouTube's algorithms to serve up advertising. They don't usually
actually have a say in what kind of ads are
going to play against their videos. And because of YouTube's
dynamic ad program, two different people in two different parts

(42:50):
of the world who watched the exact same video with
the same number of ad breaks could see very different ads.
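That dynamic ad behavior comes down to whether the ad server has tracking data for a given viewer: personalized ads draw on a profile, and contextual ads fall back on the video itself. A hypothetical sketch (the ad pool and profile shapes here are invented, not YouTube's system):

```python
import random
from typing import Optional

def pick_ad(ad_pool: dict, user_profile: Optional[dict], video_topic: str) -> str:
    """Pick an ad for one viewer: personalized if tracking data is available,
    otherwise contextual, matched against the video's own topic."""
    if user_profile is not None and user_profile.get("interests"):
        # Personalized: match ads against the viewer's tracked interests
        candidates = [ad for ad, topics in ad_pool.items()
                      if set(topics) & set(user_profile["interests"])]
        if candidates:
            return random.choice(candidates)
    # Contextual: match ads against the video itself, no viewer data needed
    candidates = [ad for ad, topics in ad_pool.items() if video_topic in topics]
    return random.choice(candidates) if candidates else "house_ad"

pool = {"sneaker_ad": ["sports"], "toy_ad": ["toys"], "camera_ad": ["tech"]}
# Same video, two viewers: one tracked, one anonymous
print(pick_ad(pool, {"interests": ["sports"]}, "toys"))  # sneaker_ad
print(pick_ad(pool, None, "toys"))                       # toy_ad
```

Under the COPPA settlement, child-directed videos are effectively locked to the contextual branch, which is why the pool of eligible ads shrinks.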
But for videos that can't collect personal information, that
selection of ads gets whittled down quite a lot. So
that meant that creators who had child directed channels would
find it more difficult to monetize their work and they

(43:10):
would see a lower return on investment, and for some
creators that could be severe enough to make it unprofitable
and thus unsupportable to continue making videos on YouTube. Creators
complained to YouTube, saying that it was difficult to determine
whether or not content was really child directed or not.
Maybe it was just family friendly but not specifically child directed. So,

(43:32):
for example, videos of Let's Plays, in which content creators
play video games and often provide commentary and so on,
they kind of fall into that gray area because there
is still that perception that video games are for kids,
even though most of the population playing games these days
are older than thirteen, and most games contain content that's

(43:56):
not really kid friendly. I'm pretty sure no one would
claim like Grand Theft Auto is appropriate content for small kids,
or that the Resident Evil franchise is great for a
six year old. But because there's this social association of
video games with children, it could be difficult to argue
that videos featuring gameplay are not actually child directed. And

(44:17):
since the FTC's decision meant that creators could be held
liable for future violations of COPPA, and because these creators
are dependent upon ad revenue as part of their income,
that gets to be a big problem. If kids are
watching the videos, then that means the kids are also
being served advertisements, and if that's happening, it means that
there's a chance those kids are having some form of

(44:39):
personal identification shared with those advertising parties without parental consent,
and thus we get back to the violation of COPPA.
It doesn't matter that the creators themselves wouldn't have access
to that personal information. Just by serving as the conduit
through which kids' data could be hoovered up by an advertiser,
the creator would be on the hook. That means that

(45:02):
that content creator on YouTube could potentially have to pay
up to forty two thousand dollars per video that violates
the rules, which is a big ouch. In January twenty
YouTube put the new rules in place, and videos that
were either selected as being child directed or that were
subsequently determined to be child directed would have their various

(45:24):
features turned off. For example, comments would get turned off
for those videos because the comment section could serve as
a means for someone to collect data about members of
the audience. So these videos would no longer have personalized
ads because of course, those ads depend upon tracking personal data,
so you can't have a personalized ad if that's not allowed,

(45:44):
so child directed videos can't use that. So instead the
ads would be contextually appropriate based upon the content of
the video. Other monetization features, which include stuff like a
merchandise option, those would also be turned off because those
require users to hand over information in order to interact with
those features, so those were not allowed. Playlists, the mini

(46:07):
player notifications, all of these features would be turned off
as well. That also means that these videos would be
adversely affected when it comes to YouTube's recommendation engine. Videos
that get a lot of engagement tend to go up
on those lists, but the child directed stuff by default,
has a lot of those different features that affect metrics

(46:28):
turned off, so that leaves the creators with a disadvantage.
Their content is less likely to be seen and discovered,
which in turn means lower revenues for those content creators.
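Taken together, those restrictions behave like a single "made for kids" flag that switches off every feature depending on viewer data. A hypothetical sketch of that logic (the feature names are mine, not YouTube's API):

```python
from dataclasses import dataclass

# Features described in this episode that rely on tracking or user data
DATA_DEPENDENT_FEATURES = {"comments", "personalized_ads", "merch_shelf",
                           "playlists", "miniplayer", "notifications"}

@dataclass
class Video:
    title: str
    made_for_kids: bool  # set by the creator, or overridden by the platform

def enabled_features(video: Video, all_features: set) -> set:
    """Child-directed videos lose every feature that collects personal data."""
    if video.made_for_kids:
        return all_features - DATA_DEPENDENT_FEATURES
    return all_features

features = DATA_DEPENDENT_FEATURES | {"contextual_ads", "playback"}
kids_video = Video("Toy unboxing", made_for_kids=True)
print(sorted(enabled_features(kids_video, features)))  # ['contextual_ads', 'playback']
```

The knock-on effect described above follows directly: with comments, notifications, and playlists gone, the engagement signals that feed recommendations largely disappear too.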
And that's kind of where we are now. It's a
rough place because, on the one hand, you certainly
see the FTC's point in that you don't want children

(46:49):
to be exploited. You don't want companies to be collecting
data about kids when there's no real accountability there. You
definitely want the parents to be involved in all of this,
and it's very easy for parents to be left out
of the loop, even well intentioned and you know, attentive
parents can be left out of the loop. So there

(47:12):
does need to be some sort of measure in place,
and honestly, I can't really blame YouTube for
passing these rules. I know a lot of YouTube creators
were really upset when YouTube made these announcements, but the
company has an obligation as well, and ultimately is trying
to protect content creators because those are the people who
are going to be held responsible if they're found in

(47:33):
violation of COPPA. I also feel for the content creators though,
because a lot of them are creating content that they
never intended to be directed towards children, and yet sometimes
children end up kind of, you know, latching on to
that content. Well, what is the creator supposed to do
about that? Because they're in the business of making stuff

(47:56):
that they're trying to entertain people with. They're not
necessarily intending it for kids, but kids are coming to
watch it. That really puts them in a
tight spot too. So it's tough. Like, there's no easy
answer to this. There are a lot of competing
motivations and obligations going on here, and there's not

(48:19):
really a simple way forward. So while we are in
a bit of a mess, I'm not certain
that there's really a neat way of doing this. If
you think otherwise, I'm curious to hear your thoughts. You
can reach out to me on Twitter. The handle for
the show is TechStuffHSW. I'm sure I'll
do more episodes that relate to privacy and technology and

(48:43):
the ways that those are in conflict with one another,
and what we can expect from that and what maybe
we should do about it. I'm sure I'll do a
lot more of those in the future, but I suspect
that next week we're going to be covering some totally
different topics, so stay tuned for that, and I'll talk
to you again really soon. Tech Stuff is an I

(49:09):
Heart Radio production. For more podcasts from iHeart Radio,
visit the iHeart Radio app, Apple Podcasts, or wherever
you listen to your favorite shows.
