Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Last time on The Armchair Sociologists.
Speaker 2 (00:04):
Today we are discussing social media.
Speaker 1 (00:07):
If it serves a profit motive, there is very little
incentive to treat your people on your platform like people.
Speaker 2 (00:14):
Your phone goes with you everywhere, and so does social media. The more people that use these platforms, the more powerful the platforms become. So there are hundreds of issues that stem from the immense power that only a few single rich people have over the entire digital world. But today we're going to be talking about two: misinformation and censorship. It is a business over anything else,
(00:39):
and we are nothing but customers. But they cannot exist
without funding. So these platforms need as many users as
possible looking at advertisements on their screen for them to
make any money. These platforms do not care why the
users are engaging with the content, nor what the content
even is. The more controversial the subject, the more people talk about it, the more money the platform is making from it,
(01:00):
and the longer you stay on the site. This dangerous
equation is why these platforms have rampant misinformation issues. Studies
do show that political and health misinformation are the most
popular types of misinformation, followed very closely by misinformation about
social issues.
Speaker 1 (01:16):
What misinformation will you spread on social media? Let's find
out on this week's Armchair Sociologists.
Speaker 2 (01:25):
Yeah, so it turns out that misinformation can and does
hurt people, and TikTok is known very well for this problem.
Some of the health misinformation that I've seen on TikTok
would be really funny if it was not actually leading
to extremely serious health issues. Have either of you heard
about the borax trend?
Speaker 1 (01:43):
I have a feeling I'm about to yes, because I
teach high school.
Speaker 2 (01:47):
So there's a trend on TikTok from influencers telling people to drink borax, you know, the cleaning chemical, in their water, because it has so many health benefits, like weight loss.
Speaker 1 (01:58):
I guess, you know, if you're puking your guts out,
you will lose some weight.
Speaker 2 (02:03):
Yeah, so like, of course it helps with weight loss, because it will kill you, but at least your skeleton will look really snatched as hell. But it will kill you.
Speaker 1 (02:12):
That's a small price to pay for beauty. You know, I miss Tide Pods. Can we go back to Tide Pods?
Speaker 2 (02:17):
That was a fun one that was also not great,
but it wasn't.
Speaker 1 (02:21):
Borax. Yeah, a little less deadly, like, it was... that was an emergency room visit, that wasn't a gravestone.
Speaker 2 (02:28):
Yeah yeah, yeah yeah. So TikTok has a notoriously very
poor approach to moderating and removing misinformation. But they did
remove all of the videos that promoted this trend.
Speaker 1 (02:38):
That's the most surprising thing you've said so far. Yeah.
Speaker 2 (02:40):
The problem is that anyone who already saw it isn't
being told that it's deadly unless they somehow also come
across another TikTok of something warning people not to do it.
Speaker 1 (02:49):
Right, So, yikes.
Speaker 2 (02:52):
Not all health misinformation on TikTok is completely deadly, but
a lot of it is still very dangerous. Have either
of you ever seen the energy drink Prime? I am sure Katie has, considering you teach fourteen-year-olds.
Speaker 1 (03:06):
Yeah, it sits on their desk in my class because
I didn't ban drinks because I'm not.
Speaker 2 (03:10):
A fascist, exactly exactly.
Speaker 1 (03:13):
It is some YouTuber's energy drink, right? Yes, yes. Oh yeah, it's Logan Paul, right? Oh okay, yeah, I know. It came in the MrBeast meal Lunchables thing. Yep. And it tastes bad. Now who's the terminally online one, Diana? I would like to thank my friends on YouTube dot
I would like to thank my friends on YouTube dot
com for informing me that those existed.
Speaker 2 (03:33):
So, Katie, I want to hear, what do you consider Prime to be? Like, marketed for children? It's like a status symbol if you have a Prime drink, right? Yeah, I.
Speaker 1 (03:42):
Mean, like my impression is it's like, it's not even
the Varsity Chucks. It's not the Jeffrey Chucks. You drink
it because they think it's cool because Logan Paul punched
a guy in the face once or something like that, exactly,
that's it.
Speaker 2 (03:57):
Yes. So it's extremely heavily marketed to children. As you said, it's literally in, like, the MrBeast... or no, not Feastables, that's the candy bar... the Lunchables knockoff, like, for children.
Speaker 1 (04:09):
The one that had the moldy cheese in it.
Speaker 2 (04:11):
Yes, yes, that one exactly, which we all care about
the health of children obviously. So I know that when
I worked at a summer camp, almost every single kid
was drinking a Prime drink at lunch every single day,
and the kids were using them as their refillable water
bottles just to like show off to other kids that
they had had one at some point. Are you ready
(04:31):
for this? One bottle, so twelve ounces, of Prime has about the same amount of caffeine in it as six cans of Coke.
Speaker 1 (04:42):
Oh no, sick. That explains why those two kids are
like literally bouncing off the walls like tuamas. Well, caffeine is famously great to overdose on as a child. It doesn't, like, stunt your growth or make you insane. Yeah, caffeine is a drug, and we should really talk about that more often. Yeah, I say, as I sip my espresso.
(05:03):
But like, yeah, like I say, as a caffeine dependent
person myself, but that's so much caffeine.
Speaker 2 (05:09):
Yeah, I'd argue that most parents wouldn't give their kids, like, pack them six cans of Coke in their lunch box. Probably not, but they'll throw in a Prime drink. So I don't think parents understand how dangerous that is. The extremely large majority of Logan Paul's audience is under the age of twenty-four, with a very large bulk
(05:31):
of that being under eighteen. It goes without saying that children shouldn't be consuming two hundred milligrams of caffeine, which is twice the amount that is safe for them, in one drink, right? Like, we can all agree on that one.
on that one.
Speaker 1 (05:46):
Yeah, it seems pretty cut and dry to me.
Speaker 2 (05:48):
But you'll see Prime advertising on TikTok, like, pretty much every time that you open the app.
Speaker 1 (05:54):
Well, devil's advocate. Go ahead. Children should be more responsible consumers. I think this isn't their fault. I'm going to defenestrate you. And that's a hot take, jeez. But no, yeah, that is, that's nuts. But like, so I'm over here connecting neurons with my little, like, serial killer mind map red string, two different ideas, to try and map out
(06:16):
the guy's path or whatever. I was just thinking, this is social media adjacent, but we have all of these bad health trends, and all of this health misinformation rather, that gets passed around social media like it's candy, or Prime energy drinks. Sure, and then this whole thing with, like, the sheer amount of caffeine in this one drink that is so popular with kids. And then I
(06:37):
go and look over at WebMD, Google "what is the effect of caffeine," you get all of these ad-riddled pages that are really hard to read, that are very unintuitive, and I'm like, well, like, no wonder we have disincentivized curiosity. Yeah, because it makes learning anything very difficult unless it comes from me, your high school teacher who
(07:00):
you didn't pay attention to, who in ten years you will say, well, why didn't they teach us this in high school? Because you were asleep. Well, that's because they hadn't had their Prime for the day yet. That's true.
Are we old enough, Rain, Are you old enough to
remember when Google worked? Yes? Okay, okay, okay, I don't
remember exactly when Google stopped working and started showing you
like only ads.
Speaker 2 (07:21):
Only like in like the last five years.
Speaker 1 (07:23):
I was gonna say, like, I think it was still
working when I went to Japan. It was definitely still
working when we graduated undergrad. Yeah, definitely for sure, because
I mean I had to be taught how to use
Google because Google didn't exist when I was born. But
it used to work very well, better than all of
the other search engines. I had used Yahoo and Ask Jeeves and all the other old ones. But, like, it did
used to be much easier to verify information and misinformation
(07:47):
as long as you knew how to use a search
engine and not to believe everything you saw on you know,
some guy's blog. But now we've kind of just... the whole Internet is ads. And that's not very... I'm gonna say it. That's not very good. No. And like, having just taught research papers, it is so hard
(08:07):
to find information nowadays unless you are actively in the
class learning how to research or you know, you've been
one of us who've been alive long enough to remember
when the internet still worked. And I don't know, I'm just, like, wondering if this misinformation is so easy to accept because it's coming from people that you perceive as peers on TikTok, because, oh, you follow them
(08:30):
or whatever.
Speaker 2 (08:31):
And that gets you to go become a sociologist. Oh, you're getting radicalized into sociology.
Speaker 1 (08:37):
Yeah, legally Rain has to give you her master's degree because I already gave mine away to Zoe, people. I already have one. I don't need another one. Too bad, you have another useless master's. Except, God, now Katie has three degrees. Damn. I know it's slightly a sidebar, but it's all connected.
Speaker 2 (08:54):
That'll probably be an episode of this podcast.
Speaker 1 (08:56):
Probably. It stresses me out just thinking about it.
Speaker 2 (08:59):
Uh huh. Yeah, it's icky. It feels icky.
Speaker 1 (09:02):
Why do you guys do sociology if it's so stressful?
Speaker 2 (09:05):
That's a great question.
Speaker 1 (09:07):
My job is to make it less stressful and more funny, and I'm failing miserably. You know, it's still hilarious, but it's hilarious in the, like, America's Funniest Home Videos equivalent. Yeah, sociology.
Speaker 2 (09:19):
Yeah, the world is ending and we're all on fire.
Speaker 1 (09:21):
Yeah. If I can stress everyone out even a fraction
of how much I'm constantly stressed out, then I will
have done my job as an educator. I understand everything
about you now. Yeah.
Speaker 2 (09:31):
Yeah. I went into sociology because this is the type
of stuff that I would just like see and I
would get so enraged, and I would go to every
one of my friends and family and be like, doesn't
this enrage you? And no one else would get to the same level of enraged as I was. And
as a teenager, that was really confusing, of like why
does no one care? Why does no one give a
fuck that the world is on fire?
Speaker 1 (09:52):
And so yeah, and that's when the sociology police come
to your house and they make you a sociology major.
Speaker 2 (09:57):
And then that's when you get indoctrinated by the police
and you become a sociologist completely against your will, and
then it forces you to get a master's degree in
sociology because you need to know everything that there is
about sociology so that you can inform as many people
as possible and be like, please understand, this has impact
and it will always have impact. And even if you
(10:18):
think that you directly are not being impacted, you live in a society. We live in a society, sorry. And so you are also being impacted.
Speaker 1 (10:25):
Exactly. I need to walk into the sea. The other way to become a sociologist is if one bites you.
Speaker 2 (10:30):
That's so true. Rules that would have been an easier
path for me. I wish that happened.
Speaker 1 (10:35):
You stay on that side of the wall. Well I
can't bite you, Katie. I need you to not know
what sociology is until the end of this episode.
Speaker 2 (10:40):
Yeah, and then you can get bit Yeah.
Speaker 1 (10:42):
Now, unfortunately, the third eye is already opened and I
need to go lay face down on the floor for
a bit. Welcome to Hell.
Speaker 2 (10:49):
We're out. Diana, do you have anyone else we can bring in here and throw Katie out?
Speaker 1 (10:53):
Oh yeah, absolutely. You know too much now. Sorry, let me get rid of that. All right, so let's keep talking about it. I want more problems with TikTok.
Speaker 2 (11:02):
Yeah. So another huge issue that TikTok has is the
way that it's built promotes something called an echo chamber.
Uh huh, I know Diana knows what that is. Diana,
would you like to give a description of what an
echo chamber is?
Speaker 1 (11:15):
Yeah? Absolutely. An echo chamber is when you are in
a media environment, so say like Facebook, that is a
media ecosystem, so to speak, and when it only shows
you stuff you agree with or stuff that you can
be really righteously angry about because you disagree with it.
And also you get to see all of your like
minded people saying, well, this sure sucks that it happened.
(11:38):
Let's all be righteously angry about it.
Speaker 2 (11:39):
Yeah.
Speaker 1 (11:40):
And pretty much every social media platform algorithm is tuned
to make.
Speaker 2 (11:43):
Them, exactly. And when you are seeing something over and
over and over again, you are more likely to believe
it as fact. Yeah. And so when every single post
that's coming up on your Facebook is political misinformation about
this thing happened, this thing happened, and it's just nonsense,
and they're just over and over and over drilling that
into your head that that happened you're gonna believe that
(12:06):
that happened, like, so much more than just the very first time that you saw it. But every time
you interact with it, the algorithm is learning that that
is what you will interact with, and so it just
keeps throwing it back at you. And then you have
this false idea that everyone around you all believes in
the same thing because everything you're seeing everyone's believing in
the same thing.
Speaker 1 (12:26):
And then when you're like, hey, can you look at
this article that says the opposite of that, they've never
heard of that point of view and are very resistant
to it exactly.
Speaker 2 (12:34):
And I should say, even though I am, I don't know if y'all can tell, anti-right-wing, this also happens with liberal media like crazy. Like, it happens to everyone.
Speaker 1 (12:45):
Yeah, it's not.
Speaker 2 (12:46):
It's not just right wing, it is... This is a social media issue in general.
Speaker 1 (12:50):
Well, everybody likes a confirmation bias. I just did kind
of an experiment on my children. Oops, that's fine. We
did an anticipation activity for a novel. It's called
Long Way Down by Jason Reynolds. It has to do
with cycles of violence and systemic issues that evolve in
a neighborhood, it's written in rhyme, it's great, it's whatever.
The anticipation activity I do with the kids is I
(13:11):
put these questions up around the room, and the questions
are controversial but extremely generalized. So like my personal favorite
that always generates a lot of conversation is I trust
the police, and you have to write down whether you
agree or disagree and what your reasoning is. Wow. Or
another one is kids these days hate books or women
(13:35):
have more emotions than men, And it is fascinating to
watch these kids with pencil and paper, their phones are
stuck in the little pockets on the side of the room they're not allowed to touch, them engaging with each other
and going oh my god, you don't actually think the
exact same thing I do. Yeah, what is this? That
might be the first time they've ever realized that. Yeah,
And some of them get the kind of I guess
(13:56):
confirmation bias effect where they're like, oh yeah, of course,
like everybody thinks that, And then I say, okay, who
agrees with the statement I think ghosts and spirits are real?
And the entire room raises their hand, except for like
three kids who are looking at the others like they're
the ghosts, like, Yeah, what are you talking about? And
it's fascinating to see, but you kind of don't get
(14:17):
to see it in a digital landscape. Yeah, that's very
hard to do. And also the digital landscape is disincentivized
to do that because if you see something that you
really don't like but that you can't like be righteously
furious about, you close the app and then you stop looking at Candy Crush ads.
Speaker 2 (14:33):
Yeah, exactly. So TikTok has an incredible algorithm. Like, genuinely, props to them for figuring it out, because they have genuinely nailed it. As you're scrolling, it is learning more and
more about what you like and what you want to see.
So if it sees you interacting with misinformation, it's going
to continue spitting misinformation at you. The more exposure that
you have to that misinformation, the more likely you are
(14:53):
to believe it.
Speaker 1 (14:55):
It's time for a break. It would be really funny
if X dot com, the everything app formerly known as Twitter,
sponsored this podcast. Let's find out together.
Speaker 2 (15:09):
But we need to move on to Twitter. Diana. I
know you have a lot of thoughts about this platform.
If you would like to go.
Speaker 1 (15:14):
Ahead, I do. I'm going to save some of them
for a little further down the paragraph here, But yeah,
Twitter was very much the town square of the Internet.
I think they used that in their marketing for a
long time too, where you could talk to the president
for the first time, or celebrities or things like that.
A lot of comedians also used it as, like, their
(15:35):
the jokes gym, I have heard it referred to as, where
they test out material, and dear God, Elon Musk has
ruined that website. As someone who never really liked Twitter,
even I am sad to see what has become of it. Yeah,
I have seen this referred to as digital colonialism, where
there is a common space that people exist in, people
(15:56):
live their daily lives in, insomuch as one lives on Twitter. You know, some people more or less do, Rain.
And then he came in and bought up the whole
neighborhood and turned it into Naziland. And people still have
to live there because that's where their friends live. And
that sucks to me so bad.
Speaker 2 (16:13):
Yeah. I could talk about the downfall caused by Elon
Musk taking it over like a filibuster. I could go
on for days talking about how much it bothers me.
Speaker 1 (16:22):
Watch out, Cory Booker. It reminds me a little bit of, when I worked at Starbucks, they used to talk about the third place all the time. And yeah,
social media very much filled that role, right? Yes. And oh no, it's gone, or oh no, I have to pay seven ninety-five for a cup of coffee to
enjoy the third place. Yeah, and also Starbucks doesn't want
(16:45):
you to stay there anymore, so they're taking out the chairs.
Speaker 2 (16:49):
Crazy business model.
Speaker 1 (16:50):
Fun fact, the third place is a sociological concept. So you've been doing sociology this whole time. Indoctrinated. And they were just telling me I was just drinking blonde coffee or whatever. Yeah, that sociology is making coffee, it's the same thing. Well, I mean, you could argue that because of the, like, way the global supply chain works and, like, the... Yeah, I mean, I could argue anything. But also, yeah, it's not you for twenty years. All right,
(17:15):
let's keep talking about Twitter, Okay.
Speaker 2 (17:16):
Since his takeover, this platform has been widely labeled as
the very worst social media misinformation offender, Like it was
voted that he.
Speaker 1 (17:27):
Worked for it. He put in the time, Yeah I did.
Speaker 2 (17:29):
Yeah, this is definitely because Musk opted to withdraw from the twenty eighteen Disinformation Code of Practice that is commonplace among all top social media apps. The upsurge of extreme right-wing propaganda that has consumed the app because of that decision is unbelievable. When he first got Twitter, when he first owned it, he reinstated thousands of right-wing
(17:54):
profiles that had been banned for hate speech following the insurrection.
Speaker 1 (17:58):
Yay. That's right, isn't that when he said he was gonna make Twitter funny again? Because he filled it full of clowns.
Speaker 2 (18:05):
Yeah yeah, yeah, remember the insurrection.
Speaker 1 (18:09):
I watched it in horror, I watched it live.
Speaker 2 (18:12):
Those people shouldn't have a platform for their hate speech. Hot take, that's all I'm saying.
Speaker 1 (18:18):
Ye speech me.
Speaker 2 (18:21):
So, we're gonna head over to our sociologic... ah, whoa. We're gonna... whoa, Rain, get it together.
Speaker 1 (18:28):
Let's head to the sociological consultant corner with me, Diana
to talk about the sociological implications.
Speaker 2 (18:33):
Hell yeah, come.
Speaker 1 (18:34):
With me, friends, away. So, the first thing I
want to actually say before I launch into what I've
prepared with my notes, is that I think of freedom
of speech and tolerance of other viewpoints as not a
moral good. I don't think that that is something we
do because it is the right thing to do. I
(18:55):
view it as a social contract that we do because
we would like to enjoy the benefits of having free speech.
Speaker 2 (19:00):
I agree heartily.
Speaker 1 (19:02):
In my opinion, if your speech is trying to infringe on other people's right to speech, or right to exist in public, for example, like with trying to make trans people not exist through legislation, bathroom bills,
Speaker 2 (19:13):
Whatever, you're violating it.
Speaker 1 (19:15):
Yeah, you have violated the contract, and therefore that contract
does not apply to you anymore. That's how I feel
about hate speech is if you're saying things that infringe
on the rights of other people to live freely, then
the contract is broken and all bets are off. That
is my personal opinion. I am not a free speech absolutist.
I am a free speech contract theorist.
Speaker 2 (19:35):
I agree.
Speaker 1 (19:36):
You're saying this in a social contract space, not a terms of service, which Twitter's used to specifically forbid. Exactly. I'm saying this in a... This is a de facto contract we have in America. Or, arguably, we never really actually had freedom of speech, that's a different episode, but when you are in a polite society, that society.
(19:56):
Every society has rules. All of them have different rules, and there's consequences for breaking them, exactly. And when you are using your free speech to hurt other people, that is breaking the rule of, we live in a society and we want to be able to have civil discourse, right?
We make these social contracts with each other, not actively,
but it happens over time in societies so that we
(20:18):
don't kill each other with swords.
Speaker 2 (20:19):
Exactly, or guns.
Speaker 1 (20:22):
That is their purpose: we are making freedom of speech so that we don't do violence to each other. When your speech is advocating doing the violence, why the hell would the contract that's supposed to be preventing violence apply? Period. Did that make sense?
Speaker 2 (20:36):
Naps?
Speaker 1 (20:37):
Yes, good job, thank you. So that's my two cents
on that. But let's go over to talk about X, Twitter, X Twitter. So, talking about monetization of posts. We
haven't actually studied how this monetization has affected the quality
of posts on Twitter yet, because it only became a
thing pretty recently. Social science takes a while to catch up.
(20:59):
Human research takes forever to do. But I think anyone
who still uses it agrees it sucks and is full of Nazis now. Yes, we do have past studies, though,
on this topic of monetization that we can talk about.
From being a former user of Twitter, though, I can
say anecdotally that when Elon Musk bought it, the algorithm
shifted massively in favor of conservative talking points, which up
(21:21):
to that point I had been in a left wing
echo chamber on my Twitter feed because of who I
am as a person, and it shifted very dramatically, basically overnight.
That is the kind of control of the ecosystem he
purchased for forty four billion dollars. The ads also started
skewing right wing, because, surprise, companies don't want their products advertised next to the words "Hitler actually had some good points."
(21:42):
So you know, all the big advertisers pulled out because
of all the Nazis, and now Elon Musk is saying
that's discrimination. Actually it gets hubersane. Well, advertisers don't want
to buy his service of showing ads, and that is
a hate crime. Oh my god, that's the real hate crime. People.
Won't somebody think of Elon Musk? I wish I thought
(22:03):
about him less. I... yeah, no. So for anyone who's not using Twitter, I'm gonna call it X, actually, from here on out, because for me, this is emblematic of what makes X X as opposed to Twitter.
Speaker 2 (22:15):
That's that's fair for.
Speaker 1 (22:16):
Anyone who's not using it. I do want to talk
first about what the get paid to post system is
so that you all understand it, both you guys and
the listener. The way it works is that if you have a paid subscription to the website, or are a verified user with a blue check, you pay like five dollars a month for that, I think. You have a chance to earn money by making posts that a lot of people comment on, like, retweet, etc. It works similar to how
(22:38):
YouTube works, where if a lot of people watch your
YouTube video, you get ad revenue, so you're getting money
from the ads people see underneath your post. However, those
people do also have to be subscribers, so only subscribers
seeing ads counts. So it's not that big of a
cash cow. But I have seen some people saying they make,
you know, tens of thousands of dollars on it, so
(22:59):
it's not absolutely nothing. But I imagine most people are
getting maybe five or ten bucks, not that much. Probably
not right The problem with social media in general is
that we've talked about this a little. It exists to
keep you using it. The longer you're using the website,
the more ads you're seeing, the more money the platform
X in this case gets to make. For a long
time now, we've been seeing that posts that piss more
(23:20):
people off, get more engagement, more people click it, more
people share it to add their reactionary commentary about how
they hate it. More people like it or dislike it,
or send it to their friends in a text message,
and discuss it. This outrage and need to correct people and be righteously furious keeps people using the website. And we
have a lot of studies that prove that. It is
(23:41):
called the confrontation effect by some of them, which I
think is a good term for it. And that's going
to keep people on the website. Rage bait, Yeah, that
keeps people looking at your ads, and then they see
more rage bait, and then they share it to say
how it's wrong, and then they see more ads, and
on and on the hamster wheel goes. Social media is already habit-forming without monetary incentives, we've talked about this, Rain.
(24:02):
You don't get paid to go on TikTok, but you
do it for eight hours a day.
Speaker 2 (24:05):
Oh god, I'd make so much money if I did.
Speaker 1 (24:09):
It's basically a Skinner box, like, you know, they put
the rats in it. They have the levers that give
them treats, and sometimes you get a treat when you
push the lever, and sometimes you don't, and then they
get addicted to pushing the lever. It's like gambling for rats.
Speaker 2 (24:21):
And I'm nothing if not a rat.
Speaker 1 (24:22):
I mean, aren't we all little rats in our own way?
Like, Pika society. We live in a society, with Pikachu. So it rewards you at random intervals for sharing stuff
and posting stuff, and the reward is people liking your
post or commenting on it. And the more you post
and the more you share, the more misinformation is likely
to slip the net just by volume. This has also
(24:43):
been studied, and I mean just think about it. The
more stuff you are doing on social media, the less
likely you are to be able to vet all of it.
Who really has time to read every single article before
they share it? Me? Only me? I have that kind
of time, or I will once I'm done with the
capstone project. Hello. Hi. We have studies that prove it.
(25:04):
The one that I'm citing here specifically is called "Sharing of misinformation is habitual, not just lazy or biased."
it really puts the onus back on the social media
platform as the responsible party, not the people using it
and getting trapped in this cycle.
Speaker 2 (25:16):
But blame anyone else, Yeah.
Speaker 1 (25:19):
Because it's really not. I really view, like, people who took ivermectin or drank bleach, they are victims of the social media companies. Yeah.
Speaker 2 (25:25):
Yeah, for sure.
Speaker 1 (25:26):
As much as I can personally get frustrated with a person, I realize it's much larger than that, and it's very... They're in the rat box getting cocaine whenever they press the lever, but only every three times they press the lever, or four. It's the randomness that keeps you in there. It's why gacha games work. Yes, yes, that's true. Yeah,
it's just it's classic just gambling, but with dopamine. Yeah.
(25:48):
So you're getting sorted by the almighty algorithm into these
echo chambers of people who share your beliefs, or really
highly polarized spaces with tons of outrage bait, so that
you and the people who share your beliefs can get
mad together. It's easy for rumors to spin out of
control in those spaces just from how the system is designed.
And it is designed that way. It doesn't have to
be this way. It is this way on purpose, which
(26:08):
is something I try to really emphasize with people. So now, in addition to the built-in feedback loop, the Skinner box we just described, we add money to the equation as another incentive. Yeah, that's capitalism, baby. Your brain is
already conditioned to like posting and sharing because you have
had a Facebook since two thousand and nine. Your brain's
(26:30):
also very susceptible to outrage, clickbait, and sharing pithy comments about things that you don't agree with. Some of us post on social media; refined gentlemen make the podcast. Oh, we're in the exact target audience for that. Yes, that's me. I've always said I'm in a higher-risk demographic for podcasting, and look at me now, I've relapsed.
Speaker 2 (26:51):
You get so angry that you go twenty thousand in
the hole to get a master's degree on the subject.
Speaker 1 (26:56):
Yeah, pretty much. Yeah, so your brain's already susceptible, at least, and now someone's gonna pay you to do it. You're on a platform that's already skewing very right wing due to Elon Musk's personal politics and the things he did to the algorithm, and so are the other subscribers to the website. So the people paying for a subscription, who you only get money from if they see your posts, are gonna be fans
(27:18):
of his because they are paying a subscription to be
on his website. How do you get the people who like Elon Musk to engage with your content as much as possible? Be as inflammatory as possible, because Elon Musk is a serial troll. And people are doing just that.
I will add as one last note here in my
little analysis that I did finally manage to find a
(27:40):
study that looks at how monetary incentives affect the quality
of misinformation on social media. It is not about Twitter specifically,
they made a fake platform to test it on. But
the study is called "Does incentivization promote sharing true content online?"
And there's a couple problems with our analogy here. This
study is about incentivizing true information and was done on
(28:00):
a fake social media platform where the researchers did just that.
If you posted true, good, accurate information, they gave you
a little bit of money. It also was conducted in India,
not the United States, so it may not completely map
to what we're talking about today. I would be an
irresponsible sociologist if I didn't say that the study was rigorous.
They had a control group. They also happened to go
(28:23):
to Harvard people who did it. So while it's not
a perfect one to one, I think it was worth
talking about. Yeah, for sure, the study showed pretty definitively
that people who were incentivized to post and share more
did, and as we said earlier, the more you post and share, the more likely it is you're going to miss that. Misinformation sharing was also more likely among those
(28:43):
who leaned right, which is interesting for our discussion of X.
So we do have some data showing that that monetary
incentive increases posting in general. And in this case, obviously
Elon Musk does not care if it's true or not, if it brings in any kind of revenue for the website,
because it's not getting much from ad revenue at this point.
Speaker 2 (29:00):
It appears that he prefers when it's not.
Speaker 1 (29:03):
It does seem that way. But you know, I don't want to throw stones at Elon Musk. The man's, you know, everyone already is talking about Elon going through a lot. Have some empathy. Won't somebody think of the billionaires?
Speaker 2 (29:16):
Please, won't someone think of the richest man in the world.
Speaker 1 (29:19):
I know you guys told me to laugh when the joke is funny, and, like, it's funny, but, like, also the existential dread is setting in now. You... sorry. Won't somebody think of Musk.
Speaker 2 (29:29):
Welcome to sociology, baby.
Speaker 1 (29:32):
That's why I went to grad school, so I could learn how to fix it, and I decided to make
a podcast instead.
Speaker 2 (29:37):
Well, you're educating people, and the more people that know about how fucked up this is, like, that's something, right?
Speaker 1 (29:44):
I view this as a vaccination. This podcast is the
vaccine to the misinformation social media machine. So if you're listening,
you're now inoculated. Welcome to the resistance. That's it for this week's thrilling, high-octane episode of The Armchair Sociologists.
(30:04):
What will we learn next week in the thrilling conclusion to the three-part social media trilogy? Tune in to find out.