Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
There's something very strange happening on the YouTube Kids app,
and some of it is certainly not appropriate for children.
I'm Jonathan Strickland, and this is TechStuff Daily. YouTube Kids
is an app that Google markets directly to parents of
young children. In fact, on the landing page for the app,
you'll find a description that reads a safer online experience
(00:27):
for kids. Just below that is a caveat, however, and
one that seems to be in response to a flurry
of news stories about questionable content popping up on the platform.
The caveat says YouTube Kids is designed to filter out
inappropriate videos for kids, but no system is perfect. If
a video that's inappropriate shows up in the app, you
(00:47):
have the power to block it, flag it, and bring
it to our attention for fast review. That probably comes
as small comfort to parents who have seen their kids
react to some truly bizarre and, in some cases, outright
disturbingly violent or provocative content. Media outlets including Medium
and The New York Times have reported on the large
(01:09):
number of odd and sometimes upsetting videos that come up
either through searching specific terms or as suggested videos picked
by various algorithms. So, what the heck is going on there? There are,
not surprisingly, several elements to this story, and part of
it is just due to the sheer amount of video
being uploaded to YouTube. That's about four hundred hours of
(01:30):
video joining the platform every minute. In other words, within
ten minutes, four thousand hours of video jumps on board.
At that scale, it's literally impossible to have human beings
comb through the video content and make sure everything is
on the up and up. You either automate or you
shut down. If humans were responsible for reviewing all the footage,
(01:52):
they'd find themselves falling further behind with every minute of
video they reviewed. It would be as if they were
in one of those horror movies where a character runs
down a hall only to see the door at the
end get further away. Another issue is that we've gotten
pretty good at automation in general, not great at it,
but good enough for it to be a problem. For example,
you might create an algorithm that looks for the most
(02:13):
popular search terms used on a platform like YouTube Kids. You
get this list of terms, which likely have little to
do with one another, and you use a different algorithm
to piece together a nonsensical video that takes those different
components and creates a sort of mash-up. The resulting
video would likely be pretty awful, but still pop up
in search or along a list of related videos in
(02:36):
a sidebar. There are also automated programs that drive up
video viewing numbers. In other words, some of these videos
may have been designed and even assembled by bots and
shown to other bots. Based on the quality of some
of these videos, I think it's safe to say that
robots have exceedingly low standards when it comes to entertainment.
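To make that mash-up idea a little more concrete, here is a minimal, purely hypothetical Python sketch of the kind of pipeline described a moment ago; the trending-term list, clip library, and title assembly are invented placeholders rather than anything YouTube or these video producers are known to actually use.

import random

# Purely hypothetical sketch: mash unrelated popular search terms into a
# keyword-stuffed title and a crude storyboard, as described in the episode.
# The term list and clip filenames below are invented placeholders.
TRENDING_TERMS = ["spiderman", "elsa", "finger family", "learn colors",
                  "surprise eggs", "superhero", "nursery rhyme"]
CLIP_LIBRARY = {term: term.replace(" ", "_") + "_clip.mp4" for term in TRENDING_TERMS}

def make_mashup(num_terms=4):
    """Pick unrelated trending terms and stitch them into one 'video' plan."""
    terms = random.sample(TRENDING_TERMS, k=num_terms)
    title = " ".join(t.title() for t in terms)        # keyword-stuffed title
    storyboard = [CLIP_LIBRARY[t] for t in terms]     # clips strung together
    return {"title": title, "tags": terms, "storyboard": storyboard}

if __name__ == "__main__":
    video = make_mashup()
    print(video["title"])       # e.g. "Elsa Surprise Eggs Spiderman Learn Colors"
    print(video["storyboard"])

Output like this is cheap to generate at scale and easy for search and recommendation systems to surface, which is exactly the dynamic the episode describes.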
(02:56):
If that were all there were to this story, it
would be curious but not terribly important. But those videos
are also seen by children, not just brainless algorithms, and
some of the videos can be quite upsetting. It's likely
that so called bad actors made some of those videos,
particularly the more egregiously violent or inappropriate ones, whether they
(03:18):
were doing it as an attempt at satire or just
to take figures that appeal to kids and put them
in inappropriate situations. The problem is that some of these
videos are slipping through the filters on YouTube Kids and
are being seen by actual children. So why would anyone
bother to do this? If we set aside those who
just enjoy stirring up trouble and causing distress, it likely
(03:39):
comes down to money. Specifically, we're talking ad revenue. If
you want to make a lot of money off of
YouTube videos, you need to get as many eyeballs on
those videos as possible while bringing down the cost of
production as much as you can. That's where all that
automation comes in. The videos don't have to have good content;
they just have to be good at being discovered. A
(03:59):
quick search through the kids' videos will uncover plenty that
have nonsensical titles consisting solely of popular keywords.
If people were involved in any step in that process,
it was with a light touch. Children also tend to
fixate on certain characters, songs, nursery rhymes, or other elements that
video producers can exploit. The result is an onslaught of
(04:21):
low quality and sometimes wildly inappropriate content. The money side
of this issue is a huge problem. The video publishers
are making money by serving up ads against these videos.
YouTube takes a cut of that money as well, so
there's a lot of incentive to make tons of content
optimized to get lots of views, and little incentive on
YouTube's part to crack down on it. In addition, short
(04:44):
of developing incredibly effective algorithms that can somehow judge quality,
there are few logical options open to the company to
combat the trend. At the moment, the onus seems to
fall on parents. They are meant to report troublesome videos,
which can be removed by YouTube, but by then
the damage may be done. Personally, I do think parents
have a responsibility to make certain their children aren't being
(05:06):
exposed to inappropriate material, but I also think YouTube should
hold itself accountable for marketing an app to kids without
being able to make certain the content on that app
is actually appropriate. If the company cannot guarantee that, I
think it should probably not offer the app. But there's
a lot of money to be made, so I suspect
it will stick around. That's all for today. To learn
(05:28):
more about streaming media, online communities, and other tricky subjects
in the age of the Internet, subscribe to the TechStuff
podcast. We publish on Wednesdays and Fridays, and we
explore all things tech. That includes the good, the bad,
and the ugly. I'll see you again soon.