Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello, fellow sociologists, this is your friend Diana coming at
you with a quick update. There's not going to be
an episode next week because I have not made it yet.
These two series of episodes, the Social Media and Generative
AI episodes, were done as my capstone project at the
University of Alabama at Birmingham with my partner Rain Schroeder.
(00:23):
At this time, I am working on the podcast alone
going forward, and as such it takes a bit longer
to make episodes. It took two people a whole semester
to make these two, so with one person it goes
a bit slower. I have been brainstorming a workflow that
will allow me to release on a more regular schedule,
(00:43):
but for now, I am going to keep working on
the next episode, which will be on vaccine hesitancy and misinformation,
and we will be back with you as soon as
I can. I may also release some shorter solo episodes
in the meantime, discussing basic sociological concepts or current events,
but I wanted to give you a heads up. The
(01:03):
schedule will be a little less regular going forward, but
I will try to keep something coming out. If you want regular updates on the situation, you can join our mailing list. There's a link in the description, or you can go to the Armchair Sociologist dot carrd dot co. That's c a r r d dot co, and there's an option
(01:24):
to sign up for our mailing list there and you'll
get emailed every time there is a new episode, provided
I have time to send out the email that week.
So thank you so much for joining us, and let's
get going with this episode. Last time on The Armchair Sociologist:
Speaker 2 (01:39):
Misinformation can and does hurt people. There's a trend on
TikTok telling people to drink borax because it has so
many health benefits, like weight loss.
Speaker 1 (01:49):
I guess if you're puking your guts out, you will
lose some weight because it will kill you.
Speaker 2 (01:54):
TikTok has a notoriously very poor approach to moderating and
removing misinformation. I work at a summer camp. Almost every
single kid was drinking a prime drink at lunch. One
bottle has about the same amount of caffeine in it
as six cans of Coke. Children are going to be consuming
two hundred milligrams of caffeine, which is twice the amount
(02:16):
of what is safe for them in one drink.
Speaker 3 (02:19):
Misinformation is so easy to accept because it's coming from
people that you perceive as peers.
Speaker 2 (02:24):
The way that it's built promotes something called an echo chamber.
The more exposure that you have to misinformation, the more likely
you are to believe it.
Speaker 1 (02:31):
It's really easy for rumors to spin out of control
in those spaces just from how the system is designed.
And it isn't designed that way by accident. It doesn't have to
be this way. It is this way on purpose. The
more you post and share, the more likely it is
you're going to miss that misinformation. This time on the
Armchair Sociologist, will Katie continue to spiral into existential depression?
(02:51):
Will Diana be a big nerd about social media some more?
Find out now on the thrilling conclusion to the Social
Media series.
Speaker 2 (03:02):
So I also want to talk about the problem with
verified accounts. There is a genuinely very funny article by
someone named Anita Hag, I hope I'm saying that right. We'll call her Anita. What's up, queen. She talks about
all of the ways that Elon Musk haters on Twitter
have taken some of his power away by simply just
(03:23):
trolling the hell out of him, using his own stupid policies. Hell,
I was hysterical when I was reading this article because
I remember when all of these happened in real time,
and it's just memory lane. You know, a blue check
mark next to someone's name on Twitter used to mean
what you would expect it to, that this person has
(03:44):
been verified to be the official Twitter account for that
individual or that brand. It was a way to stop
people from making parody accounts, or at least believing those
parody accounts.
Speaker 4 (03:55):
Oh yeah, I remember the parody accounts.
Speaker 1 (03:57):
Those were fun.
Speaker 2 (03:58):
But then Elon Musk made the verification check mark something
you could simply just buy, no matter who you are,
completely eliminating the entire point of why it existed at all.
So naturally, users started making parody accounts and then paying
for a verification check mark so that they could fool
(04:19):
users into believing false stories. Are you guys familiar with
the Eli Lilly scandal?
Speaker 4 (04:24):
Yes?
Speaker 1 (04:25):
I watched it in real time and it was great.
Speaker 2 (04:28):
It's so funny. I love talking about it. So, Eli
Lilly and Company is a company that manufactures insulin, because for some reason, you can own patents on properties that are related to life-saving medicine. Say it with me, our slogan:
Speaker 1 (04:42):
That's capitalism.
Speaker 2 (04:46):
Katie got in there, she knows now. But anyway, one
user had changed their name to at Eli Lilly and Co.
And then they bought the verified symbol. They then tweeted
in quotes, "We are excited to announce insulin is free now." Yay. Not only did millions of people believe that tweet,
(05:10):
but it also forced the actual Eli Lilly company to
tweet a correction that they weren't actually giving up their
profits on insulin, which led to a dramatic four point
three seven percent stock market decline for the company.
Speaker 3 (05:23):
Yay.
Speaker 1 (05:24):
Twitter's good.
Speaker 2 (05:25):
Actually, it's the funniest thing, it makes me laugh every single time,
because what did you think was going to happen?
Speaker 1 (05:34):
It's heartwarming.
Speaker 3 (05:35):
I misheard that as heartworming, and I was like, well,
Eli Lilly can.
Speaker 4 (05:38):
Help with that.
Speaker 2 (05:40):
So now we are at the halfway point of this episode,
which damn, I hope you're still listening. So we're gonna
switch gears into censorship and information governance, Like what even
is all of that? Basically, these platforms have the power
to decide what you see, but more importantly, they have
the power to decide what you do not see. So
(06:01):
what do you guys know about typical social media content regulations?
Like, what are you not allowed to post? Like, the standard stuff, usually, like porn, dick pics. Exactly.
Speaker 3 (06:12):
I was gonna say, you're usually not allowed to be
a trans woman on most sites. Yeah.
Speaker 2 (06:17):
And that was a sociological point of view, like you
really are, you're coming through?
Speaker 4 (06:23):
Can I go back to film school?
Speaker 2 (06:26):
So yeah, while that is all, like, violence, porn, all of that, while that is all a form of censorship in a way, it is different from what I'm actually
talking about. Censorship or information governance is specifically the moderation
that happens completely behind the scenes. It might be written
into policy, but it would be included in those terms
and conditions that we all agree to without ever actually reading,
(06:47):
because who the hell reads those? Lawyers everywhere just started
sweating with me saying that, but like, genuinely, who the
hell reads those?
Speaker 1 (06:53):
They're designed to be unreadable. Exactly.
Speaker 2 (06:56):
This type of censorship is usually done by the algorithm.
These platforms write moderation rules, like hiding specific words or phrases
that someone might say into the algorithm and it does
everything for them, so.
Speaker 1 (07:08):
It like doesn't actually delete the post or prevent you
from posting it.
Speaker 3 (07:11):
Right.
Speaker 1 (07:11):
What we're talking about.
Speaker 2 (07:12):
Is, like, it just kind of shadow bans it.
Speaker 1 (07:14):
Yeah, taking a hashtag and making it so if you
search it nothing comes up. That kind of stuff.
Speaker 2 (07:18):
Yes, yeah, So, following the same format as before, we're
going to start with Facebook, and boy, it's a doozy.
Before I get into it, have you heard anything about
Facebook censorship or information governance specifically on Facebook?
Speaker 3 (07:32):
There are so many different opportunities for that to have
happened that I can't point to a specific one.
Speaker 2 (07:36):
But yeah, like that exactly. So have either of you
heard of drill rap? I know Diana has because I
know that you wrote a whole section on it. But
prior to that, do you know what drill rap is, Katie?
Speaker 1 (07:48):
I did not. Actually I had to learn what it
was so that I could participate in the episode properly.
Speaker 4 (07:53):
Drill, like the funny Twitter user who said he's going to walk backwards into hell?
Speaker 1 (07:59):
No, not that drill. So I wish that that drill
had a SoundCloud that I could give him money on.
Speaker 2 (08:06):
No, I agree. So drill rap is essentially, for lack of a better term, defined as really aggressive or even violent rap music.
Speaker 4 (08:16):
Oh okay, So.
Speaker 2 (08:18):
Mark Zuckerberg was caught censoring drill rap music videos on Facebook.
So it turns out that the reason why he did
this is because he took an under the table informal
deal with the UK police that was having issues with
these proclaimed And this is in't quote, aggressive and violent
youth that didn't respect authority making these music videos and
(08:40):
then sharing them on Facebook. If he can accept informal
requests of what to censor to the masses on his platform,
what the fuck is stopping him from making way more
under the table deals about what to hide on his website.
I need to hear your immediate thoughts on this.
Speaker 1 (08:58):
Nothing, it's nothing, nothing is stopping him from doing that.
Speaker 4 (09:01):
Oh, that's it. Broken. Brain broken. Like, it just, my mind is an enigma. The milk carton falls over. I've got nothing.
Speaker 1 (09:09):
Well, don't worry. Come with me, jump on my back. We're
going to the sociological consultant corner.
Speaker 4 (09:15):
Yeah, I discipline my shoulder.
Speaker 1 (09:17):
Got you. So this is a can of worms. I will admit I didn't know what drill rap was before you posed this question to me. And we are specifically talking about drill rap as it exists today in the United Kingdom, correct? Oh, okay,
it is a rap subgenre. It tends to be pretty
explicitly about the realities of being poor and black in London.
I'm saying that it's the UK version because the genre
(09:40):
actually has its roots in Chicago, I think in the
seventies and eighties. But they cover similar themes and have
similar names, and they sound different, like it's a different music format, but very similar content-wise, and about two different cities. Got it. So I want to make that
distinction at the top. But naturally, because it's an art
form primarily practiced by black people and talks about lived experience,
(10:01):
the police immediately started blaming it for causing crime.
Speaker 2 (10:03):
Classic police behavior.
Speaker 1 (10:05):
This is a classic American tradition that goes back at least
to the nineteen fifties. And I know we're talking about
the UK, but stick with me. When house parties used
to get broken up in the fifties for playing ska
and reggae music because that was just corrupting the minds
of the children, all those trumpets and chill vibes. I
know that all these genres are also very political and
were at the time too, and still are now. But
(10:27):
also it's very funny in twenty twenty five to think
of that as being a "think of the children" genre
of music. It is the same anti black bias, just
a different music genre and a different country.
Speaker 4 (10:37):
I mean, like it's jazz from the twenties as well.
Speaker 1 (10:40):
Yeah, jazz from the twenties had the same thing. Anti
rock and roll sentiment was anti black. It was called
race music. Hip hop was blamed for causing the violence
when we grew up, So you know it goes on
into infinity.
Speaker 2 (10:50):
Yeah, this is just the current-day genre that's getting blamed.
Speaker 1 (10:53):
Yeah, this is just what we do with black culture.
So while drill rap does deal with themes of violence
and crime, that itself doesn't cause violence and crime. And we have plenty of data, gathered scientifically, backing this up. Exactly.
As always, my sources are down in the show notes.
The lyrics do, however, reflect the lived trauma of these
(11:14):
black artists, and that scares the bejesus out of white people, and I believe that is the scientific term, bejesus.
This ties into a much larger topic in history, which
is general criminalization of blackness. I'm going to bring in
arguably the father of American sociology here, even though we're
talking about the United Kingdom, because anti blackness is really
an American invention. We really started that. So let's talk
(11:37):
about W. E. B. Du Bois.
Speaker 4 (11:42):
Yo, I know that name.
Speaker 1 (11:43):
He's huge in education. Yeah, he's huge in education. He
wrote about a ton of stuff. He was also a
huge civil rights activist and the first black man to
earn a doctorate from Harvard.
Speaker 2 (11:51):
Oh.
Speaker 1 (11:51):
Yeah. He was born in eighteen sixty-eight and died in nineteen sixty-three, so he was really right in the thick of it, historically speaking, of the civil rights movement. Yeah.
He talked about crime as an expression of rebellion against
one's social environment and basically viewed criminal behavior as a
symptom of an unequal society because black Americans weren't able
(12:12):
to advance after the Civil War ended, period. This wasn't
just a coincidence that, oh, they just didn't know how
to make money after they stopped being slaves, nor was
it because slavery had only recently ended and they were
acclimating to that new status quo. Though that is certainly
a piece. The criminalization of blackness, many have argued, is
due to a loophole in the thirteenth Amendment, which abolished
(12:33):
slavery except as a punishment for crime. In the South especially, and this is well-documented history, politicians would talk about this in speeches. It wasn't a secret. It was the point. Crazy. It creates an incentive: if you arrest a
now free black person, they can be forced to work
for free again, and slavery can essentially continue, only now
(12:55):
you don't even need to pay for the initial purchase
of a slave. The state does it for you.
Speaker 4 (13:00):
Oh, I saw the John Oliver episode on this.
Speaker 1 (13:02):
Yes, it's a very good one. That's a good talking point.
If you want to learn more about this, actually, go watch John Oliver's.
Speaker 4 (13:08):
Think it was the prison industrial complex and the prison labor.
Speaker 1 (13:11):
Yes, prison labor, so yeah, go type John Oliver prison
Labor into YouTube, go watch it. This is how you
get things like black codes, which were laws designed to
criminalize existing while black and make it nearly impossible to
exist in public space without getting arrested. That's actually where
we get laws against loitering, for example.
Speaker 4 (13:27):
That's crazy, Yes, yeah, that makes sense.
Speaker 1 (13:30):
That was it, just standing while black is the crime that
loitering is. This is also where we get laws against
you know, the classic things you think about when you
think segregation in the South, colored water fountains, black people
are forced to sit on the back of the bus,
the works, laws against being unemployed while black yeah, laws
against being homeless, vagrancy. These are all crimes that
(13:51):
came out of this period of time because the more
black people you can arrest, the larger your free labor
force gets, so it becomes very profitable to criminalize as
many aspects of being black as you can. That brings
us back to black music being thought of as violent
or dangerous. Because America has spent so long criminalizing just
blackness in general, anything associated with it also becomes taboo,
(14:15):
especially if you're a white person engaging with it.
Speaker 4 (14:18):
It's almost a phobia.
Speaker 1 (14:20):
It's almost a phobia, it really is, actually. Fat phobia
is also rooted in anti blackness because fatness was seen
as something only black people were, so being thin meant you were more white than a fat white person. This is also how Irish people ended up being considered black in the early nineteen hundreds, because Irish people were poor, and poor people get poor nutrition and gain weight
(14:40):
because they're eating more carbs to make up for it. It's Fearing the Black Body, if you want more on that topic. It was really eye-opening for me. So American culture
is exported en masse all over the globe through movies, music, TV,
the Internet, and its influence is especially strong in Europe,
especially the United Kingdom. So it doesn't really surprise me
(15:01):
to hear that police are trying to censor and criminalize
drill rap at all. It's just part of a much
longer story of conditioning society to think being black is
the same thing as being a criminal. Yeah. So now
(15:21):
that we're all depressed, let's get back to Mark Zuckerberg.
Speaker 2 (15:24):
Yeah, he's so good at that. I love when that
man does anything. It like really helps bring my
Speaker 3 (15:30):
Mood up, like surfing in Kauai in, like, terrible sunscreen to distract from the fact that he bought half the island, and I thought that would, yeah.
Speaker 1 (15:38):
And that all of the native Hawaiians are really upset
about it because it sucks. Yeah, that's another episode.
Speaker 2 (15:45):
Speaking of Zuckerberg making us really happy. Another fun thing
that he's been caught doing was manipulating his users' psychology in order to make them as addicted to his platform as possible.
So there's a really interesting article by someone named Douglas Yuvan, or Yvonne, from this year that really drives home the
power that this one man has over the psychology of
(16:09):
our brains. He argues that Zuckerberg is pretty much immune
to needing to take any accountability for his actions because
of how powerful he is as a person, despite the
fact that there is actual data that shows that his
manipulation of the algorithm to make us as addicted as
possible has led to broad mental health declines and a
(16:30):
more divided society. Do you guys know about the policy
change that Zuckerberg made this year about the way that
Facebook manages censorship?
Speaker 1 (16:39):
Uh? Huh of course?
Speaker 4 (16:40):
Oh yeah.
Speaker 3 (16:42):
Is this the one where they said we're not going
to moderate it anymore?
Speaker 2 (16:46):
Yeah?
Speaker 1 (16:46):
Yeah, why would we do that? By the way, if
you listen to our Generative AI episode, you already know this.
But if you didn't, the real reason they did that
is because the moderators working for them in Kenya unionized, and then all of a sudden it was like, just kidding, we're not gonna do content moderation anymore. You guys do it for us for free, okay? You, the user.
Speaker 3 (17:06):
Wow?
Speaker 2 (17:07):
Like the content here writes itself.
Speaker 1 (17:10):
It really does.
Speaker 4 (17:12):
That's so.
Speaker 2 (17:14):
Okay. So I'm gonna move on to TikTok censorship because
in my research for this episode, I came across the
most shocking information about TikTok censorship that has straight up
become my Roman empire ever since, and I don't know
if I'll ever move on from it being my Roman empire.
There is a thesis written by someone named Rebecca, and
(17:36):
I'm skipping her middle name Boosto or Busto. I can't
pronounce names. I don't know if y'all can tell. But
this explores the algorithmic censorship that TikTok uses to control
what content is able to come up on your for
you page. So TikTok as like very publicly put in
(17:57):
out whoa TikTok has, very publicly put out their policies
on what they use to train their algorithm, and the
things that are on there are so foul they have
moderators go through and train the algorithm to ignore and
to, like, shadow ban and hide people that are deemed
(18:19):
to have abnormal body shapes, too many wrinkles on their face,
to have physical deformity in the form of disability, just
and those are public policies that they have come out
and said, we.
Speaker 1 (18:36):
Stand by these. This is how we do it.
Speaker 2 (18:39):
We hide videos from content creators that have like physical
I guess, unattractive attributes, which, sure. Or fat people. Yeah, fat, yes.
Speaker 3 (18:49):
And this is the ByteDance company that owns them, so the Chinese company, correct.
Speaker 2 (18:55):
Not only do they have the physical appearance there, but
they also are hiding videos that show like backgrounds of
like what they call shabby environments, so like a crack
in the wall, or like a landscape that's not, like, your yard's not mowed. If you have backgrounds that are
(19:16):
also shown to be unattractive, it will also hide those.
There are so many implications of this. Not only is
it, yeah, it's, it's classist, it's fatphobic, it's racist, it's, like, it's ageist, it's all of, every ist, it
is every is that is like that you can imagine
(19:38):
that is absolutely crazy. That that is just public knowledge.
I mean it's public knowledge in the sense of like
if you go searching for it. I had no idea
about this, and I use, like we said, I use TikTok nonstop. But I had to go searching for this for this episode. Yeah, that is like the most mind
blowing thing that I can think of, that they trained
their algorithms to specifically censor all of these videos that come
(20:03):
from all of this. So the part that like that
is so incredibly mind blowing to me is that people
who are in shabby environments, as they say, and they're
trying to make TikToks to maybe promote a small business,
to maybe like.
Speaker 4 (20:19):
Change their status in society.
Speaker 2 (20:21):
Exactly, Yeah, to maybe better their circumstances that they're in,
are automatically being discriminated against by the algorithm, a robot.
Speaker 1 (20:31):
They made the robot racist, exactly.
Speaker 3 (20:33):
I want to be surprised, I really do.
Speaker 2 (20:39):
It's also important that I at least play devil's advocate just a little bit here. Obviously, the algorithm does
not completely eliminate all of those videos. It is just
that they're less likely to make it onto a for
you page in order to get picked up by the
algorithm and have a lot more people engaging with it.
Speaker 1 (20:56):
It's like they're still there, but again, they're shadow banned, they're not able to get the kind of reach that a thin white person with a nice backdrop, et cetera, et cetera, is able to get. It's just, they just baked the discrimination into the machine. Exactly right. Finally we
can automate racism.
Speaker 3 (21:13):
Yeah, about time. Well, we've already automated all the fun parts of humanity like art and writing, so yeah, why not the racism too.
Speaker 1 (21:20):
Though, I have heard a little bit about a different
kind of censorship on TikTok, specifically after it came back
because Trump resurrected it from the dead.
Speaker 2 (21:29):
So TikTok has been in the news lately for another
kind of alleged censorship. And I'm, and I'm being very clear with my word alleged here because, please don't sue. Alleged. Please don't sue us. Please.
Speaker 1 (21:41):
I have no money. I'm a sociology major.
Speaker 2 (21:43):
Yeah, I'm a full time grad student. So like, I
don't got, I don't have anything. In fact, I'm like so deeply in the hole I don't know how I'm going to get out of it. Maybe that's the price of just getting educated at all, unless you listen to this podcast for free.
Speaker 1 (22:00):
We said it here first, folks, We are providing you
at least twenty thousand dollars worth of value per episode,
so really, can you afford not to listen? Exactly.
Speaker 2 (22:09):
So some reports are coming out with claims that TikTok
is suppressing political information after the ban in order to
make Trump look better allegedly, but many users have reported
searching for certain words or phrases in America, such as
election rigging and getting a screen that shows no results found.
TikTok's official statement on this matter simply says that they
(22:32):
do not suppress political information and that different countries have
different policies of what's allowed to be discussed on the
app and what's not. So, like, say in the UK, these same phrases do show information, like, they do come up with videos, and it does not say no videos found. But, great, that's, that's just an interesting point.
Speaker 1 (22:53):
I did see some stuff about, like right after that happened,
people using Canadian VPNs and then searching the same thing
and getting way more results. So like this particular censorship
seems like it allegedly might not be coming from TikTok,
but allegedly could be coming from the United States government instead,
because Trump made a backroom deal to bring TikTok back,
even though the Senate and fucking House had already
(23:16):
banned it and it had been signed into law.
Speaker 2 (23:18):
So I don't know necessarily if saying that the different
countries have different policies on what's allowed to be discussed
on the app and what's not really covers sure, the
whole issue that's being brought up. But again that's just alleged.
So it is what it is. So we're gonna move
on to Twitter, or X, whatever. It's Twitter to me.
(23:40):
One issue that has become incredibly noticeable to every user
since Musk's takeover is the amount of bots on the
damn app.
Speaker 1 (23:48):
But Elon Musk said he bought it to get rid
of the bots. Are you saying Elon Musk is either
lying or incompetent?
Speaker 2 (23:54):
Allegedly, let's say both. Very popular tweets, mostly from paid verified accounts, will be filled with countless replies from bots.
It is widely thought that the only reason for these
bots is to try to sway public opinion in the
favor of conservative politics.
Speaker 1 (24:12):
Allegedly. That's not dystopian. No, it's not at all.
Speaker 2 (24:15):
The information governance on Twitter is only getting more and
more severe because as a business, Twitter is failing. Yes,
since Musk took over, Twitter has lost seventy-two percent of its entire financial value. That is a massive loss. So he is flailing, trying to control what
(24:37):
people are able to see on the site in order
to try to get more, just any kind of success here,
because he's failing. Tons of users have reported having right
wing propaganda pushed onto their feeds despite not following or
even engaging with any right-wing content. I'm included, Diana is included, he's included. That's, and you can say allegedly all you want, but just, all three of us have.
us have.
Speaker 1 (24:57):
Experienced that anecdotally. I don't know anyone who did not
have that experience that was using.
Speaker 2 (25:02):
Exactly. So fairly recently, Musk also changed the way
that blocking works on the platform. Even when you have
blocked someone, they can still see your content, they just
can't engage with it.
Speaker 4 (25:14):
And this is when I that's when I left.
Speaker 1 (25:16):
Officially, I was like, Oka, that is when I deleted
my account.
Speaker 4 (25:19):
My block pat doesn't work. The whole account's going private,
and I installed the app.
Speaker 2 (25:23):
And so let's talk about the implications of that.
Speaker 1 (25:25):
The reason I ended up just deleting my Twitter entirely
after that. So I first privated it and then I
realized that, like even the people who already followed me,
I didn't know if all of them were people I
could trust necessarily because I had someone who abused me
when I was in college who I really don't want
to know things about what I'm doing and where I am.
She's the first person I block on every social media
(25:48):
platform I've ever used. Not having the block function is
incredibly dangerous. I also know other people who were abuse
victims who like their abuser, could use their Twitter to
figure out where they are and go physically harmon exactly.
It's absolutely catastrophic. I do wonder if Musk just did
this because so many people blocked him once he started
forcing posts on your feed, well allegedly allegedly, and also
(26:11):
because you can block advertising accounts in order to not
see there at Yeah, I.
Speaker 4 (26:15):
Used to do that.
Speaker 1 (26:16):
That's something I actively did same all the time. So
I'm wondering if those are the reasons. But it just
it's such a huge safety concern. Yeah, it's also for like, yeah,
for kids that need to be able to moderate their
own safety.
Speaker 2 (26:32):
All of it's insane. In fact, I got into an
argument on Twitter yesterday with someone who I have had
blocked for like nine months. Oh great, what the hell?
Speaker 1 (26:41):
So the system's working?
Speaker 4 (26:42):
Why are you back?
Speaker 2 (26:43):
How the hell are we able to interact with each other? Yeah, exactly,
Like there's no way to just rid someone from your feed, and that is, like, so frustrating when there is someone that, just seeing them pop up at all, even if they're not even talking directly to you, like, deteriorates your mental health. Yeah, there are so many people on Twitter I don't like,
and I would just like to protect my own brain
(27:06):
and I can't because they will show up regardless. Like
it does, like, the blocking essentially does absolutely nothing now. Interestingly,
both Apple and Google have incredibly strict regulations on their
app stores that social media apps must have a proper
blocking function because it is a violation of privacy and
safety to not. However, Twitter is still available on both
(27:29):
the App Store and the Google Play Store. That's interesting.
Speaker 1 (27:32):
Why? How? Money.
Speaker 3 (27:35):
But like, I'm like, isn't this why they made Tumblr get rid of porn, like, in twenty eighteen or so? Yeah, well, so that it could be on the App Store, you know. You'll never guess, but it's also still all over Twitter.
Speaker 1 (27:45):
Is it porn?
Speaker 2 (27:46):
It's porn. Yeah, every time I open that app, I
am faced with stuff I don't want to see, both
politically and individually.
Speaker 1 (27:54):
It sounds like a great app and you're having a good
Speaker 2 (27:56):
Time, Yeah, I am, it's awesome over there. So everything
that we've talked about here, I want to get both
of your final thoughts about the amount of power that
both the company aspect of a social media platform, but
also the power of the individuals behind these platforms have.
What did you learn? And especially Katie, I want to
know what you learned.
Speaker 3 (28:16):
I, I knew some of this already. It's just, seeing it, it's like taking a step back and seeing the whole forest. Like,
oh shit, there is so much more than, you know,
people are actively thinking about, especially for something that's as ubiquitous
Speaker 4 (28:33):
as social media. And, like, my other thought was.
Speaker 3 (28:36):
Oh, I'm glad that Jeff Bezos also doesn't have a
dog in this fight. But he does because Amazon owns
like most of the data storage for everything that is
cloud based. A ton of it is on Amazon servers.
So like he already has control over some of this stuff.
We just don't even know it yet. Yeah, and I'm
so tired. I have a poster in my classroom that
(28:59):
says how to fight the fake news invasion, and it's
like a ten step thing about how
Speaker 4 (29:05):
to learn, like, what you read online. I don't think the kids have looked at it, to be completely honest.
Speaker 1 (29:12):
I'm sure some of them did.
Speaker 3 (29:14):
Yeah, the Shakespeare poster next to it is actually somehow
more interesting to them, so huh.
Speaker 1 (29:19):
But nerds.
Speaker 4 (29:21):
But like I feel a little bit like.
Speaker 3 (29:27):
This is the oil baron era from the nineteen tens and nineteen twenties. Everybody controlled these small stakes and things,
and something had to give, and it eventually did, and
it brought in a cycle of regulation.
Speaker 4 (29:46):
I think that's coming.
Speaker 3 (29:47):
But what concerns me is who is in power when
that regulation starts to come through, right yep. On the
one hand, we could you know, these things can get
broken up and they could get given to more different
people I don't know, or it could just be put
into new shell companies for the same three people.
Speaker 1 (30:08):
Yeah. But I think that's a really good analogy. Like
the robber barons that we all learned about in history class. Which is also why I think I resonate with the digital colonialism angle too. Yes, yeah,
because it just, it feels, it rhymes. It's not as
violent on its face as like real colonialism is, for example,
but it is taking what used to be a public
(30:29):
space and it's enclosing it. It's putting a fence around the public park and putting anti-homeless benches in so people can't stay there. Yeah, another episode. But yeah,
I mean I think, and this is my sociological opinion
that it is bad when, like, three dudes can control all of the information we see. Oligarchy, and.
Speaker 4 (30:50):
Then isn't it an information oligarchy?
Speaker 1 (30:53):
Like, I just, it really is. That's a good term. I'm stealing that for my next paper. Like, it is
really an information oligarchy, and it works more efficiently than
any kind of government censorship really could ever dream to be. Yeah,
I'm glad we got to talk about this because this
is my white whale and this is what I went
back to graduate school to study, is how social media
(31:15):
works and how it changes people. And you did a
great job with the research on this one, Rain.
Speaker 2 (31:19):
Oh, thank you, you did as well.
Speaker 4 (31:23):
Fascinating.
Speaker 2 (31:24):
Yeah, it's a little sad that every episode, everybody, including
our guests and all of our listeners are going to
end each episode being like hey.
Speaker 1 (31:35):
Yeah, like well, this is a bummer.
Speaker 4 (31:38):
I watched John Oliver for fun. He admits that it is.
Speaker 3 (31:42):
Yeah, white guy tells you sad statistics for an hour,
punctuated by dick jokes.
Speaker 4 (31:47):
Like I get yeah, I know why I'm here, but.
Speaker 1 (31:51):
The format works, but, uh, yeah. Thank you for hosting. Thank you for having me. Thank you.
Speaker 2 (31:57):
That's all we have for this week. Thank you for
listening to The Armchair Sociologist, where what you don't know
can and will hurt you actively. As always, all sources
used in the research of this episode are in the description,
along with how to find us on all of these
social media platforms that we just talked a bit about.
Until next time. The Armchair Sociologist is an independent podcast
(32:19):
produced by Diana Haslin. Any opinions stated are her own, unless they're somebody else's.
Speaker 4 (32:24):
Our theme song is
Speaker 2 (32:25):
Armchair, provided by our in house band, Karl Marx and
The Dialectics.
Speaker 1 (32:29):
If you or a loved one are in a high-risk demographic for podcasting, don't stay silent.
Speaker 2 (32:34):
Help is available