
May 24, 2023 · 39 mins

"If you open a hole on the internet," UCLA professor Sarah T. Roberts tells us, "it gets  filled with sh*t."

The tragic death of Megan Meier was a turning point for MySpace. As the first social media company to operate at a massive scale, MySpace, along with its users, was forced to grapple with the consequences of that scale.

In this episode, Joanne is joined by Thomas Kadri of the University of Georgia School of Law to discuss how our legal system was ill-equipped to deal with the social media era. UCLA professor and author Sarah T. Roberts chronicles the early days of content moderation. And Bridget Todd and Scott Zakarin are back to talk about bullying in the MySpace era.  

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
This is an iHeart original.

Speaker 2 (00:15):
What's interesting, though, about the Drew case, of course,
is that I think it's a very sympathetic impulse to

Speaker 3 (00:21):
Want to use some.

Speaker 2 (00:22):
Sort of law here to convey that what Lori Drew
and her daughter did in this case, not only did
it have a tragic ending, but the act itself was
quite cruel.

Speaker 1 (00:39):
I'm Joanne McNeil, and this is Main Accounts: The Story
of MySpace, episode seven. MySpace users regularly experienced online
harassment: hateful and hurtful comments, even bullying, especially

(01:06):
users who were very young. Yet at the time, with
social networking still new and regarded as an experiment, these
experiences often weren't taken seriously. And to those who did
experience online harassment in the context of MySpace back
in the aughts, it was so unexpected or

(01:27):
bizarre that they often didn't know how to process it.
Take, for example, Roommates, the web series MySpace produced from
two thousand and seven to two thousand and eight. The
show was available to watch on MySpace. It was also
promoted through the social network. It was critical for driving engagement.

(01:50):
The team set up accounts for individual characters on the show;
audiences could interact with the characters like they were real people.
It was exciting, but as Roommates creator Scott Zakarin soon
found out, there were drawbacks to being this publicly available.
Some of the comments made about the characters played by

(02:11):
real people, actors, were really cutting.

Speaker 4 (02:16):
Yeah, I mean it's happened a few times. I mean
in our early shows, when people would see the photos,
people would comment on their looks. Any actor, male, female,
it was really painful for them. At first, we removed
the garden-variety hater, or we warned them. We

(02:38):
only would go to MySpace if there was something that,
you know, was beyond what we should be doing, so
you can get a sense of you know, okay, this
guy can be salvaged or ignored, but somebody else if
they start to get you know, too sexual, or you know,
something that goes beyond our standards and practices, that's when

(03:00):
we would kick it up to MySpace.

Speaker 1 (03:01):
The line between stan and stalker is kind of a
thin one. Like, the people who really get
enthusiastic about a celebrity, it can very
easily tip the other way.

Speaker 4 (03:13):
Because I think people were more nervous about doing that
back then; they didn't know what it would have caused. Now,
tearing people apart is common interactivity.

Speaker 1 (03:26):
Most users on MySpace did not have access to
MySpace the company like the Roommates cast and crew did.
They could not make special requests for the company to intervene. Typically,
it was expected back then that if you were harassed online,
the only thing to do was ignore it, don't feed

(03:47):
the trolls. This is something Bridget Todd, host of Beef
and There Are No Girls on the Internet, commented on
in our interview.

Speaker 5 (03:56):
I think for far too long, people in positions of power,
or like parents and educators and administrators, people who are
in positions where they're meant to help young people understand
the world around them, we have been telling them this
complete fiction that what happens on the Internet is just
the Internet, and like your real life is your real life,
and like who cares what they're saying, It's just words

(04:17):
on a screen. And so when young people are facing
this kind of thing, there's oftentimes not a lot of
adults in the room who can really understand what's happening
and like talk to them about it in a real way.
That's going to be meaningfully helpful when we're talking about
things like online harassment. It is really important to keep
that in mind. And I think that we're seeing that
that attitude sort of slowly change, but I think it's

(04:40):
changing far too slowly to actually, you know, deal with
the problem at any kind of scale.

Speaker 1 (04:48):
The tragic death of Megan Meier, which resulted in major
news coverage ongoing for years following the Lori Drew trial,
was a reckoning. The thinking until then, a mix of
"On the Internet, nobody knows you're a dog," as
a famous New Yorker cartoon caption from nineteen ninety

(05:09):
three put it, combined with a moral panic over youth
online that we addressed in earlier episodes, belied how sometimes
the real threat of online harassment is more prosaic: your
own neighbor could create indescribable pain for your family. Lori Drew,

(05:31):
the mother of Megan Meier's classmate and a neighbor of the Meiers,
became virtually universally scorned when her role in the bullying
of Megan became public. But while most people familiar with
the case believed that her behavior toward Megan was cruel,
there was no clearly drawn path to accountability. There were

(05:53):
no laws that perfectly prevented someone else from behaving the
way that Laurie Drew had on MySpace. She was taken
to court in the case United States Versus Drew and
faced felony computer hacking charges. Drew was charged in two
thousand and eight with misdemeanor offenses of unauthorized access to MySpace.

(06:17):
This was overturned in two thousand and nine and Drew
was fully acquitted.

Speaker 6 (06:22):
Now to a new development, in the case of the
Missouri teenager who took her own life after she was
harassed on the internet. Her family wanted the mother allegedly
behind the hoax to be prosecuted, but authorities hit a roadblock.
Now there has been a surprising development.

Speaker 1 (06:37):
What kind of reactions to the Drew case do you
get from your students?

Speaker 2 (06:41):
Specifically, I would say that by the time in the
semester when I'm introducing the students to this case, they've
already been quite outraged by the scope of the CFAA
in certain other cases, so I think many of them
are primed to think that the prosecutor here was really overreaching.

(07:04):
They already have Aaron Schwartz's story in their minds as
somebody who was massed downloading academic articles from jstore and
was prosecuted and then ultimately tragically took his own life
after being charged under the CFA. They have him in

(07:25):
their minds. They have other types of cases in their minds,
where, you know, LinkedIn is trying to stop another company
from scraping its website, the public profiles that people have
posted on LinkedIn, and they're trying to use the CFAA
for that, and so by the time they get to

(07:48):
the Drew case, I think many of them are already
a little skeptical of the ways in which especially sort
of corporate actors can use a law like the CFAA
to exert forms of power and control over websites that

(08:12):
they create.

Speaker 1 (08:16):
That's Thomas Kadri. He teaches at the University of Georgia
School of Law, and he's an affiliated researcher with the
Cornell Clinic to End Tech Abuse. And as he just mentioned,
United States v. Lori Drew is a case that he
brings up in his classes.

Speaker 2 (08:33):
What's interesting, though, about the Drew case, of course,
is that I think it's a very sympathetic impulse to
want to use some sort of law here to convey
that what Lori Drew and her daughter did
in this case, not only did it have a tragic ending,

(08:56):
but the act itself was quite cruel. And so I
think there's at least, you know, there's this perception that
what happened was, at the very least, uncivil
and mean-spirited. And so I think
the students do have a tough time squaring their opposition

(09:22):
to this very far-reaching federal law that probably makes
all of them cybercriminals in my classroom, in
every class, when they drift off for a second and
they go on a website and they, you know, they
violate a term of service without even realizing.

Speaker 1 (09:43):
After the break, we'll learn more about the CFAA and
find out why this and other laws proved slippery
tools for holding Lori Drew to account.

Speaker 2 (10:01):
Essentially, the CFAA, the Computer Fraud and Abuse Act, is
a federal criminal law that makes it a crime to
access a computer without authorization or to exceed your authorized
access on a computer. Now, what all of those magic

(10:22):
words mean has been the subject of now decades of
scholarly debate, different court decisions, many different interpretations have been
kind of put forward, and different cases have kind of
tested the boundaries of what sort of those key terms
might mean, especially the idea of what it means to

(10:43):
access a computer without authorization or to exceed your authorized access.
That concept of unauthorized access is really at the heart
of a lot of these disputes. Colloquially, the CFAA
is talked about as the federal hacking law, but of
course even what constitutes hacking is a kind of disputed

(11:06):
concept, and some of the confusion surrounding the interpretation of
the statute reflects some of those kind of colloquial
tensions as well.

Speaker 1 (11:17):
One thing that I was just curious about: because it
is a law that, unless I'm mistaken, has been on
the books for decades now, has its perception changed
over the decades because of changes in the technology, or
what does it mean to have a law that's that old?

Speaker 2 (11:35):
Absolutely.

Speaker 7 (11:38):
Yeah.

Speaker 2 (11:38):
So one of the interesting things about the CFAA is
that I think perceptions surrounding the law have changed,
and the law itself has also changed. It's been amended
by Congress several times since it was initially passed in
the nineteen eighties, and so we've got these kind of
two parallel changes that are going on, and they're not
always synced up. So sometimes I would say public perceptions

(12:01):
surrounding the law have changed in response to a case
like the Lori Drew case, or a situation like Aaron Swartz,
one of the founders of Reddit, who was famously charged
under the CFAA. We've had other high profile cases more
recently than those two, but there are these kind of

(12:22):
moments where there's increased public consciousness surrounding the CFAA, usually
paired with opposition to how it's being enforced or interpreted.

Speaker 1 (12:34):
The origin story of the CFAA, possibly apocryphal, is that President
Reagan watched WarGames at Camp David. It's that Matthew Broderick
movie from nineteen eighty-three. You know it. "The only
winning move is not to play." Yeah, that one. Reagan
was allegedly so disturbed by the hacking depicted in this

(12:55):
movie that he whipped up legislation that extremely broadly outlawed
computer access without authorization. In the summer of twenty twenty one,
there was a Supreme Core case which narrowed the scope
of the CFAA. The way that the CFAA was applied
in the laureatory trial would probably be considered obsolete. But also,

(13:19):
I am very, very obviously not a lawyer, so I'll
let Thomas Kadri take it from here.

Speaker 2 (13:26):
I still teach the case to my students because I
think it really helps to kind of ground the stakes
of what the Supreme Court was really doing in this
case last summer in saying, well, there's this one way
that we could read the statute that might cover all
of these forms of conduct, some of which may be harmful,

(13:48):
but maybe not harmful in the way that this statute
was designed to cover, and others of which may not
be harmful at all, maybe really innocuous. And of course,
one of the main things at issue in the
Drew case and in many other cases involving the
CFAA is whether the violation of terms of service or
some other form of contractual agreement or written policy, whether

(14:11):
violating that kind of a restriction that isn't bypassing some
sort of technical restraint on access to a computer, that
is really doing something that you're not supposed to be
doing under some sort of rule that's written down or
that's conveyed to you in some way or maybe that's implied.
Those were the cases that always, I think, gave judges

(14:34):
some of the greatest discomfort in saying that the CFAA
should apply there. But some courts and some judges, including
the judge in the Lori Drew case, felt compelled to
reach that conclusion, in part because while the terms in the
statute say that you're doing something, you're accessing a
computer, without authorization, the statute gave no specialized definition of

(14:58):
what without authorization should mean, and so often when judges
are faced with interpreting a law like that, they just
look to the ordinary meanings, and we know what without
authorization means. It's synonymous with things like without permission, not allowed.
And so if something is forbidden in a written policy
and you go ahead and do it anyway, it sort

(15:20):
of makes sense to talk about that as lacking authorization
of some kind, lacking permission. And so judges, like the
judge in the Lori Drew case, felt compelled to say, well:
because Lori Drew and her co-conspirators, as
the court puts it, co-conspirators here being her daughter

(15:40):
and her eighteen-year-old employee, the mother's eighteen-year-old
employee, violated various terms of service that MySpace
had laid out, they were acting without
authorization, and therefore they violated the law.
And so I still teach it to my students because
it's a fascinating case to kind of show. I think

(16:01):
a lot of people, my students included, they have some
sympathy with the idea that operators of websites should be
able to set certain rules and if those get violated,
it's not just a question of oh, you breached the contract,
but you did something that violated a criminal law, and
so we should be able to use the criminal law

(16:23):
to get at those kinds of permissionless uses of computers.
But I think the Drew case kind of pushes some
of those impulses to say, well, if this is allowed,
then this is the extent.

Speaker 1 (16:37):
That you could go to. One question I have about
this case, going back to this moment in
time in two thousand and seven, two thousand and eight:
I think a lot of reactions to the story of
Megan Meier were that something terrible happened, and, you know,
what does justice look like in this situation?

(17:02):
What was on the books at all? And if the
CFAA is imperfect legislation, was there anything at
the time that could have been more suitable? Or in
the years since then, have there been developments to enforce
cases of what I would call extreme online harassment of

(17:24):
this nature, or any nature?

Speaker 2 (17:26):
At the time, there certainly weren't as many laws that
would apply as there are now. And that's one of
the reasons why I would imagine federal prosecutors reached for
a law like the CFAA that, given its interpretation at
the time, was something of a catch-all, or at least
it could help fill in the gaps where some other

(17:49):
laws wouldn't apply. And so you might have had certain
laws that prohibited forms of harassment but that didn't yet
apply to internet based harassment. Or you might have had

(18:11):
claims that could be brought for intentional infliction of emotional distress.
That's a tort that had existed for a long time,
but the government can't bring that as a criminal charge.
That's a private lawsuit that needs to exist between people.
So, you know, Megan Meier's parents, for example, might have
been able to sue for intentional infliction of emotional distress,
but actually there are all sorts of very complicated reasons,

(18:33):
which we won't get into, why it's difficult for parents
to sue when something like that happens to a child.
But anyway, the bigger point is that, yes, that's
one of the reasons why prosecutors reached for a law
like the CFAA, where you can bring it in.

(18:54):
It gives a legal basis for our sense of moral
outrage that something bad happened and somebody needs to be
held responsible. Since the Drew decision, which, remember,
even though she was convicted, her conviction was
ultimately overturned because of a constitutional challenge that she raised

(19:18):
to her conviction. Since then, there have been a whole
slew of cyber bullying and harassment and stalking statutes that
have been passed in many states across the country, including
one in Missouri, the home state where these events kind
of mainly took place. Missouri passed Megan's Law, which was

(19:41):
a statute designed to get at various forms of cyber
bullying and cyber harassment. The terms of the statute
certainly seem much closer to what happened here, right:
it's actions that are taken, you know, with the purpose

(20:02):
to frighten and intimidate and cause emotional distress. There are
different provisions that apply depending on whether the perpetrator is
a minor or an adult. So laws like this have
since been passed, but they've also been subject to a
lot of constitutional challenges as well, usually First Amendment challenges
based on the freedom of speech. So courts tend to get,

(20:26):
let's just say, a little more skittish when laws make
it illegal to communicate with people with the intent to annoy,
with the intent to, yeah, pester. You know, if it's with

(20:46):
an intent to threaten, if it's with an intent to harass,
generally, you know, courts are a little less likely
to strike down those laws as unconstitutional. But the story
of kind of cyber bullying laws across the country has
been one of a few successes and many failures in

(21:09):
terms of those laws standing, of their
being upheld by courts when

Speaker 1 (21:19):
they're challenged. Yeah, that was a really helpful explanation there.
It raised a question that I have now, which is
it seems like with online harassment legislation you have different stakeholders,
the users, the people, the business, the executives of a platform,

(21:44):
the victims of cyberbullying or online harassment. How do
you negotiate, I mean, again, I imagine it's a very imperfect process,
but how do you negotiate that balance between
First Amendment rights and accountability? Who is responsible

(22:04):
for what? And how has this process evolved over the
past couple decades.

Speaker 2 (22:13):
Yeah, it's constantly evolving. It is by no means settled.
And I'll add one additional complication, not that we need any more;
we've got enough to be getting on with. But law
is only one possible regulatory tool that can be used
here to address some of these harmful forms of conduct communication,

(22:36):
interaction right that are conducted through technology. Technology itself is
another regulatory force here. Technology can enable and constrain different
forms of behavior in ways that are certainly not a
direct analog to law, that can be complementary to law

(22:56):
and sometimes not so complementary to law. And there are
other regulatory forces as well, right. There can be certain
market constraints on some of these forms of behavior, and social
norms are working in the background as well to again
push certain types of behavior, enable it, or constrain it.

(23:17):
But technology in particular is really important to think and
talk about in this context because your question asked about,
you know, how do we navigate, for example, First Amendment
rights to free speech or just the political value of
freedom of expression right with laws and other forms of regulation,

(23:40):
including technology that might seek to regulate this kind of behavior.
And this is a constant process of evolution. I would
say that we see play out, right, everything from when
a former president of the United States gets kicked off Twitter,

(24:00):
whether that is a First Amendment issue, a free speech issue,
whether those things are one and the same. Right, they aren't,
but they often get lumped together. The question of online
harassment by cyber mobs, doxing, non-consensual distribution of intimate images,

(24:26):
other forms of kind of networked harassment, the values that
are at stake in each of those different situations, the
types of regulations that might be appropriate to deal with them,
the constitutional issues at stake. In some ways, I like
to think that, you know, they are all deserving of

(24:47):
a very distinct treatment because they do often raise very
different questions of how to try and mitigate or address
some of those harms, and yet at the same time
they're all intimately connected. The types of lines that you
draw in one context will inevitably at least have to
be reckoned with in the other context, even if they
don't directly apply. And so if we want Twitter to

(25:11):
be able to, or, you know, let's use MySpace,
right, since it is still around, if we want MySpace
nowadays to be able to address certain forms of
networked harassment or targeted threats that are you know, communicated
through its platform, that has a certain vision of the

(25:34):
ability of those platforms to kind of govern and police
their spaces that they've created online. That might also apply
in the context of trying to de platform somebody or
remove somebody's ability to engage in these kinds of expression,

(25:55):
and how they go about doing that, right. Sometimes it's
going to be a question of law. Sometimes it's going
to be a question of other forms of regulation that
they might put in place, But it's all pretty connected.
In this ecosystem.

Speaker 1 (26:07):
Social media at scale is difficult to govern. Any proposed
law that might aim to rid social networks of online
harassment and prevent future Lori Drews could backfire in countless ways.
But while online harassment is real, what you believe constitutes
online harassment depends a lot on who you are.

Speaker 2 (26:31):
When I write in this area, and when I teach these issues,
I can't just teach law. I have to teach technology
as well. I have to teach to some extent social
norms because they're all interacting in this space. There are
occasionally laws that are going to be a major motivating factor,

(26:53):
but often there are going to be other forces that
are actually pushing some of the key protagonists in this
space to act in certain ways, to remove certain types
of content, to protect people from certain types of harm.

Speaker 1 (27:05):
While it seemed like no one on the corporate side of
MySpace cared what the users were doing, in fact, there
were workers at MySpace who were tasked with removing
objectionable content from the social network. More on the MySpace
content moderators after the break. MySpace seemed like a free

(27:35):
for all, a place where you could post or upload anything,
and some took advantage of the lax rules. There were
users who uploaded incredibly vile content.

Speaker 7 (27:48):
I once interviewed, very early on in my research time,
a person who had been an executive at a digital
media company, and that person said to me, very, very
wryly and sagely, if you open a hole on the
internet, it gets filled with shit, and that was like,

(28:12):
you know, like mic drop. So MySpace opened this hole
in the Internet for people to fill in with photos
and imagery.

Speaker 3 (28:23):
You know.

Speaker 7 (28:23):
There were also, like, you know, kind of crude
computer graphics that were part of it too. So you
can imagine how quickly swastikas would have shown up, or
you know what I mean, just whatever crappy thing people
could do, they took the opportunity to do it.

Speaker 2 (28:40):
You know.

Speaker 3 (28:40):
It reminds me of like.

Speaker 7 (28:43):
When there's a fresh piece of sidewalk cement that they've
put in, you know, and they put some barriers around
it when they put it down, and then in no
time people are in there writing on it and putting
their face in it, like Michael Scott in The Office,
and just doing stuff to it, you know.

Speaker 3 (29:02):
And that's what this is. Like, it was like this
blank slate and then.

Speaker 1 (29:06):
That's Sarah T. Roberts, author of Behind the Screen
and professor at UCLA. MySpace was the first social media
company at massive scale, which meant that things like kicking
people off the platform for, say, posting swastikas was not
an easy process.

Speaker 7 (29:27):
There's no size of labor force that you could employ
that could have even gotten to all the material on MySpace,
you know, much less on some of the platforms that
are out there now that are just exponential in comparison.

Speaker 1 (29:42):
A lot of Sarah's research and writing focuses on content moderation.
Big social media platforms like Facebook and YouTube employ massive
teams of workers, usually contract workers, to remove photos of
violent or sexual content that users have uploaded. The worst
thing you can imagine? Well, someone has probably tried to

(30:04):
get that up on a social media platform at some point.

Speaker 7 (30:09):
Almost every major platform thinks of content moderation a little late,
like they think of it because some crisis has precipitated
a new conversation within the firm, like, uh oh, we
actually have to have some policies, or oh my god,

(30:30):
I didn't think someone would nefariously do this. But here
are a bunch of people doing this thing with our
tooling or our systems, and not only is that distasteful
to us, but maybe it's illegal, you know, in the
case of circulating child sexual abuse material, which people do
all the time, all the time on social.

Speaker 3 (30:50):
Media to this day, and it is illegal.

Speaker 7 (30:53):
Right.

Speaker 3 (30:54):
The thing about content moderation of social media is that
it's treated as a trade secret.

Speaker 7 (31:03):
The practices, the specifics, who does what
where and exactly how. There's no consortium of social media
companies getting together and being like, hey, we all have
the same problem.

Speaker 1 (31:19):
MySpace, as the first social media company at a massive
scale and one that was largely image based, was the
first social network to grapple with the consequences of scale.

Speaker 3 (31:35):
There were like a series of maybe moral and ethical.

Speaker 7 (31:42):
Responsibilities that MySpace felt, and then there were also
maybe some potential legal ones that kind of came into play,
and so all of that necessitated some gatekeeping of
some sort. But the firms have a hard time

(32:05):
thinking about that kind of activity, gatekeeping, taking material down,
enforcing rules, thinking about what can't be done.

Speaker 3 (32:15):
They have a hard time thinking about that as revenue generating.

Speaker 1 (32:19):
If you were a user who encountered some of this
vile stuff, maybe someone left a testimonial with a picture
of dead animals, it wasn't clear how to flag this material,
and it wasn't clear what would happen if you did.

Speaker 7 (32:33):
They often would have had no idea where it was going,
and I think in many cases probably just presumed, oh,
I'm sending it off to the computer, whatever that meant,
when in fact, you know, they were sending it off
to people. But they were doing some labor on the
front end of triaging that material already. So like maybe
at one point you had to go through a series

(32:55):
of menus to find where you would report. Now,
usually the convention is to have that much
more available to users, like those buttons, the
red button or something: I've got to report this.

Speaker 3 (33:09):
But you know, it was a process of.

Speaker 7 (33:13):
Like flow chart logic where you would find this place
to report, and then this is the macro category of
why it's.

Speaker 3 (33:21):
A problem because it's violent, or because it's.

Speaker 7 (33:25):
Inappropriate sexual material, or because it's some other kind of thing.

Speaker 3 (33:29):
I mean, I would argue that making.

Speaker 7 (33:32):
A better, safer, more comfortable place for people ultimately will
generate revenue, but that's kind of a
longitudinal argument for companies that want quarterly returns, so it's
hard to make that case. So what happened was, in
the case of MySpace, you know, they had to build
up a content moderation department, which meant they also had

(33:57):
to create a bunch of policies simultaneously, because the policies
governed the operations of content moderation.

Speaker 1 (34:04):
Executives often rationalize these haphazard content moderation workforces with haphazard workflows.
They assume it will all get automated eventually, and for
those who work as content moderators, the experience can be traumatizing.
Sarah talked to moderators from multiple platforms for her book,

(34:27):
including someone who moderated MySpace content.

Speaker 7 (34:31):
She said, well, for the three years after I worked
at MySpace, if I met someone, I wouldn't shake their hand.
So I said, can you tell me more about what
you're saying with that? She said, well, I know how
people are, and people are nasty and they're gross, and
I don't want to touch a stranger's hand because I

(34:52):
know the stuff they do. So this is kind of
how she went forward in the world after that experience.
She told me that she had coworkers or employees that
she was worried about unleashing back into the world
because of the harm that they underwent in what they

(35:14):
were doing and seeing.

Speaker 1 (35:16):
You know.

Speaker 7 (35:16):
She told me, maybe some of these people started out
a little bit weird and this job just you know,
took them to the mat psychologically. And she said, you know,
she often worried about what became of those people. Where
did they end up?

Speaker 1 (35:31):
And in case you're wondering, automating content moderation would be
extremely difficult to do. In fact, many of the much-heralded
AI applications depend on this kind of labor too.
A recent Time magazine feature revealed that workers in Kenya

(35:52):
moderate and filter ChatGPT content for less than two
dollars an hour.

Speaker 3 (35:59):
Does it have to be that way? I guess.

Speaker 7 (36:05):
Companies think so for now, and they throw a lot
of resources at it, you know, again, computationally, but there's
no getting away from the human.

Speaker 3 (36:18):
The human ability to discern that is so uniquely human.

Speaker 7 (36:27):
To take all of these inputs, symbols, language, cultural meaning,
you know, the specificities of a particular region in Mexico,
for example, and the political situation in that place,
and having someone who knows that area intimately, uh, and

(36:52):
can respond to it like.

Speaker 3 (36:54):
That is nuanced, and it's so uniquely human
in some ways.

Speaker 7 (37:02):
It's like that discernment and judgment, like yes, if it's
you know, there's too much boob in the photo, okay,
a computer can, like, make a decision about that, yes.
But when we bring in all of these elements, language, culture, symbols, politics,

(37:25):
you know, like regional politics, in some cases very specific, religion,
all of these elements that are so complex that people
spend entire careers studying them or you know whatever, and
then ask very lowly paid people in a completely different
part of the world to decide about it, or we

(37:45):
try to create algorithms that can imitate those decisions. You know,
things fall through the cracks and it's a really hard,
hard problem to solve under the current business model of social media,
which says post it and we'll sort it out later.

Speaker 1 (38:05):
MySpace still has to deal with content moderation, because MySpace
still exists. It is still around. It exists as a company,
it exists as a platform. It collapsed, certainly; no one
I know has used it in a decade, but people
still work there, people post on it. What is MySpace

(38:26):
now in twenty twenty-three? In the next episode, we're
going to explore what's left of it. Thanks for listening
to Main Accounts: The Story of MySpace, an iHeart original
podcast. Main Accounts: The Story of MySpace is written and
hosted by me, Joanne McNeil. Editing and sound design by

(38:47):
Mike Coscarelli and Mary Do. Original music by Elise McCoy.
Mixing and mastering by Josh Fisher. Research and fact-checking
by Austin Thompson, Joson Sears, and Marissa Brown. Show logo
by Lucy Kintania. Special thanks to Ryan Murdoch, Grace Views,
and The Heat Frasier. Our associate producer is Lauren Phillip,

(39:10):
our senior producer is Mike Coscarelli, and our executive producer
is Jason English. If you're enjoying the show, leave us
a rating and review on your favorite podcast platform. Sadly,
my MySpace page is no longer around, but you can
find me on Twitter at jomc. Let us hear
your MySpace story, and check out my book, Lurking. Main Accounts:

(39:34):
The Story of MySpace is a production of iHeart Podcasts.