Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
So one of the first videos I saw of yours
was this dolphin video.
Speaker 2 (00:18):
Yeah, I made a dolphin language.
Speaker 3 (00:21):
It's called... and it only has vowels that are made
in the front of your mouth to best imitate dolphin speech.
Speaker 2 (00:25):
Here are some example conjugations.
Speaker 3 (00:27):
All the sentences are six vowels long, but the meaning
changes based on vowel placement and length. So the sentence
"I eat" would be pronounced, but if you want to
conjugate that to "I will eat," since it's an "een"
verb, you have to lengthen the penultimate vowel, which would.
Speaker 2 (00:39):
Change it to E.
Speaker 1 (00:42):
Adam Aleksic is a linguist, and he's known online as
the Etymology Nerd.
Speaker 2 (00:47):
I was, like, combining something goofy with something academic,
because I have both of those impulses constantly working in me.
So the academic impulse to me is like, you know,
what if we try to make a minimalist language, see
what we can do with the boundaries of language? Here,
the goofy impulse was, let's just sound like a dolphin. Kind
of the same thing happened with my book here. Like,
what if we actually seriously study the origin of slang words right
now, where they're coming from? But also, what if I
(01:08):
just get to write a book about Skibidi Toilet and
get paid for.
Speaker 1 (01:10):
It. Like, if hearing the phrase Skibidi Toilet creates
a fight-or-flight response, or even if you don't
know what that is yet, it's fine, I'll explain later,
and I promise you, you are absolutely the target audience
for this episode. So Adam's just released a book called
Algospeak: How Social Media Is Transforming the Future
(01:30):
of Language. And overall in the book, he shows that
the short-form video content that we see on Instagram
Reels or on TikTok, and the algorithms that they run
on, are changing how we talk in real life. So
I wanted to talk to him about how we got
here and where we're going.
Speaker 2 (01:48):
We somehow think, oh, the history of right now is
not important, but in fact it's extremely important, because it's
an indicator of what it means to live and exist
right now in the present, and the fact that we
are existing in the Skibidi Toilet era means that we
should maybe look into that.
Speaker 1 (02:00):
I'm sorry. We're existing... we're existing in the Skibidi Toilet era.
Speaker 2 (02:03):
You're not wrong. I'm saying goofy things, but I'm serious
about it.
Speaker 1 (02:07):
No, no. You're like... everything you're saying isn't incorrect. Everything
you've said is right. It's just, when you say it,
it hits a little bit.
Speaker 2 (02:14):
It's a lot of fun. I'm a little self-aware
of my job, that it's a little goofy, I'm afraid.
Speaker 1 (02:27):
From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm
Dexter Thomas.
Speaker 4 (02:36):
I'm sorry, I'm sorry.
Speaker 1 (03:00):
Good God. Let's even back up a little bit, but
feel free to talk to me like I'm a five-year-old.
Yeah, how do you define algospeak?
Speaker 2 (03:11):
Traditionally, algospeak refers to speech made to circumvent
algorithmic censorship. The quintessential example there was unalive, like,
instead of kill, because you can't say kill on TikTok,
you say unalive. Seggs instead of sex, because you
can't say that on TikTok. And so you can literally
see language rerouting around the algorithm with those examples.
(03:31):
But it's also like a game of whack-a-mole, right?
The algorithm censors certain words, and then a new word
will spring up, because humans are always very good at
finding ways to express themselves, and that historically has been
referred to as algospeak. How I define algospeak, I
think, is a little bit different, because that's just the
easiest-to-point-to example of algorithms shaping our speech.
(03:54):
We also have where words are coming from, how words
get popularized, how quickly those words spread, how creators are
interacting with algorithms, how users are interacting with algorithms. All
of those things are also going to affect the way
we speak and relate to each other.
Speaker 1 (04:06):
How do you explain to somebody that you're taking skibidi,
you're taking rizzler, you're taking gyat, you're taking all this stuff seriously? Yo,
I'm gonna write a book about it, and this is
going to be something that's academically sound.
Speaker 2 (04:20):
So I'm going to acknowledge that we are talking about
silly things, but even the silly things are points of
human connection. They're important trends. The fact that rizz was
the twenty twenty three Oxford English Dictionary Word of the Year,
the fact that Skibidi Toilet has been viewed more times
than the moon landing, the fact that middle schoolers are
out here... that's how they connect with each other.
Speaker 1 (04:37):
Well, pause. Sorry, no, no, I'm not... no, you can't
steamroll through that. Sorry, skibidi... say that one more
time. Than the moon landing? Are you serious?
Speaker 2 (04:46):
I think, yeah, the Apollo moon landing was viewed like one
hundred and ten million times on TV, and then each
Skibidi Toilet video by itself has over like one hundred
fifty million views on YouTube Shorts or something like that.
Speaker 1 (04:58):
That's just one video. All right, so this is where
I do the podcaster thing and bust out with the
explanatory comma, but like the extendo-clip version. So let's
go down the list. Rizz: rizz you could think of
as like short for charisma. So if you have rizz, that
means you've got style, charm, attractiveness, people like you, that
(05:20):
kind of thing. Unalive is what it sounds like. It's
a euphemism for death, so killing someone would be unaliving someone,
suicide would be unaliving yourself. Gyat is a little complicated,
but short version: it's a big butt, big in a
good way. Sigma is also good. Usually it's like the
(05:41):
apex version of an alpha male. Okay, and you know what,
now that I've done all that, I might as well
explain Skibidi Toilet. So, okay, paraphrasing from the Wikipedia, because yes,
there is a Wikipedia entry for Skibidi Toilet: Skibidi Toilet
refers to an armed conflict, sprawling across dozens of episodes,
between humanoids and singing human-headed toilets called skibidi toilets.
(06:04):
Skibidi toilets are called this probably because, in the first video,
a human head pops out of a toilet and starts
singing like this.
Speaker 2 (06:12):
Brrr, skibidi dop dop dop, yes yes.
Speaker 1 (06:24):
And then the head flies out of the toilet and
into the camera. And now you know why kids are
saying skibidi. And if your head hurts after hearing all that,
you now understand the last phrase, brain rot, which is
basically a catch-all phrase for everything I just talked about.
All right, back to the interview.
Speaker 2 (06:43):
The word skibidi actually is functionally no different than the word
scooby-doo. That's how I like to explain it. They
both come from non-lexical vocables, that's scat singing,
so just a random, like, phrase that rolls off the tongue.
We can't ignore cultural phenomena. Language is always a proxy
for what's going on in culture. It's the little things
we can see that point us to the things that
we can't see. And by looking at these examples, I
(07:06):
hope to broadly demonstrate that algorithms are having insane impacts
on our society as a whole, but specifically through the
lens of communication as well.
Speaker 1 (07:19):
You brought up the censorship thing. How have you seen
algospeak develop in response to censorship?
Speaker 2 (07:26):
Censorship is a productive force, which is what linguists call
something that generates more language. One example I really like
is in Chinese: the word for censorship is censored, so
users turned to the word for harmony, in allusion to
the Chinese government's goal of building a harmonious society, and
that's like héxié or something. And then that started
(07:47):
being censored as well. Now it goes down again, new
mole pops up. Now people start saying river crab, because
river crab sounds like the word for harmony, and so
that's héxiè with a falling tone. And then that
word starts being censored on some platforms as well, and
then users start saying aquatic products, simply because it's similar
to river crab. So you're not going to stop us
from talking about this stuff. And then sometimes it bleeds
(08:10):
through to the offline. My book opens with examples of
kids writing essays about Hamlet contemplating unaliving himself, or
the Seattle Museum of Pop Culture releasing an exhibit commemorating
the thirtieth anniversary of Kurt Cobain unaliving himself.
Speaker 1 (08:24):
And hold on, just to clarify here: Adam is talking
about an actual thing. Last year, a placard on an
exhibit about Kurt Cobain said that he quote "unalived himself
at twenty-seven." Some people did not like this, but
it's real.
Speaker 2 (08:38):
It's bleeding through. We like to pretend that the online
is a separate world, but in fact it does affect
our reality. It does affect the way we relate to
each other.
Speaker 1 (08:48):
Censorship, often when we think about it, it's done by
a government, it's done by the church, something like that.
There seems to be something a little bit different about this,
and maybe it's just the rate at which it's happening.
And I think that the Chinese model is a really
good example, because in a lot of ways, I think
what's happening in English is something that has been happening
(09:11):
in China since, you know, the two thousands.
Speaker 2 (09:14):
Our algorithmic models are all based off of ByteDance's
infrastructure that they built for Douyin, because China is censoring
certain language and imposed regulations on internet companies there. So
the reason English-language algorithms are so good at censorship
and detection and that kind of stuff is based on
the Chinese model. They're ahead of us.
Speaker 1 (09:35):
And it isn't just the Chinese-based apps like TikTok. YouTube, Instagram, Facebook,
all these apps have algorithm-based censorship, and in response,
people are making up words to get around those censors.
It's hard to keep up with.
Speaker 2 (09:51):
I think it's happening faster and it's more compounded. It's
hard to prove that quantitatively, because it's hard to even,
like, identify what's a slang word or something. How do
we know what's happening faster?
Speaker 4 (09:59):
Right?
Speaker 2 (09:59):
But it seems pretty logical, right? When we talk about
productive forces in linguistics, the more you censor
a word, the more people try to come up with
new words. That just logically follows. It makes sense that
algorithms are bringing us more language than ever before. The
underlying mechanisms are not new. Underlying language is
this feeling that humans just want to express themselves and
(10:20):
communicate to one another, and we are very good at that.
We always find ways around that. When people... well, I
can't even say it here, probably, but there's like a trend
recently where people are talking about somebody doing the thing,
and everybody seems to know what the thing is, because
you can't talk about it, and they're asking, you know, what,
what are we going to do when the thing happens?
I can't wait for the thing to happen. But this
(10:40):
is not something you can actually talk about, but you're
still expressing yourself. This is like the most taboo of
concepts in American society, and yet we're still finding ways
to talk about it that we can't actually put into words.
Speaker 1 (10:54):
Yeah, and I know exactly what you're talking about. How
does that work? How does it work that when somebody
says "the thing" or "doing the thing," how are people
able to pick up on that? Because linguistically, I have
no explanation for that.
Speaker 2 (11:11):
No. Well, linguistically, language is cooperation. And when I'm talking about, oh,
I think somebody should do the thing, quote unquote,
I am signaling to you... because I'm not saying
it, the signal I'm sending out is: this is something
I can't talk about. And you pick up on that signal.
Now you're thinking, like, what are all the things I
(11:31):
can't talk about right now? And then you start to think, well,
maybe it's something really illegal. Maybe it's something that, like,
you could go to jail for, as you say. So you
think of those things and you think, oh, there's one
obvious conclusion. Also, confusion generates engagement. Engagement pushes it further
in the algorithm. There probably are some people confused about what
the thing is, and then they might comment and ask questions,
(11:54):
and that'll push it further.
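(Another aside, to sketch that feedback loop: below is a hedged illustration of engagement-weighted ranking, where comments, including confused "is this real?" questions, raise a score that decides whether a video gets pushed to more feeds. The weights, threshold, and names are invented for illustration; no platform publishes its actual formula.)

```python
# Toy engagement loop: confusion -> comments -> higher score -> wider reach.
# Weights and threshold are made-up illustrative values, not a real ranker.

from dataclasses import dataclass

@dataclass
class Video:
    views: int
    likes: int
    comments: int  # confused "what is the thing?" replies count here too

def engagement_score(v: Video) -> float:
    """Weight comments heavily: a comment signals more than a view."""
    return v.views * 0.01 + v.likes * 1.0 + v.comments * 5.0

def should_boost(v: Video, threshold: float = 500.0) -> bool:
    """Push the video into more feeds once its score clears the bar."""
    return engagement_score(v) > threshold

clip = Video(views=20_000, likes=150, comments=80)
print(engagement_score(clip))  # 750.0 = 200 + 150 + 400
print(should_boost(clip))      # True: comment-driven confusion tipped it over
```

Under these assumed weights, eighty confused comments count as much as forty thousand passive views, which is the mechanism Adam is describing: bafflement itself becomes distribution.)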
Speaker 1 (11:56):
All right: social media is changing our language, whether we
like it or not. But is it actually leading
to quote unquote brain rot? That's after the break. Let's
(12:17):
talk about brain rot for a second. This is a
while ago. I remember this actually really clearly. I was
at the graduation ceremony for my niece. She was graduating
from middle school.
Speaker 2 (12:27):
That's the ripe age for brain rot.
Speaker 1 (12:29):
Perfect, right? And what I started doing was, you know,
there's a bunch of downtime, right? And I started
texting her, because she's sitting up in the front row,
she's waiting for her name to get called. I was like, yo,
this ceremony is not bussin at all, no cap, on
God, for real for real. And she gets mad at
me and texts back, like, all these angry emoji and,
(12:49):
why are you sending me brain rot? And I thought
it was interesting, because the kids who we associate with
using brain rot, kind of as a... they also call
it brain rot.
Speaker 2 (13:03):
I think it's important to unpack that. Obviously, words have
multiple meanings. Yeah, brain rot can be used in the
sense of, this is bad for your brain. Oh, that's
brain rot. YouTube slop, AI slop is brain rot. Like,
that's one way the word can be used. Another way the
word is used, and I would argue the more common
way it's used, that people ignore, is that brain rot
refers to a meme aesthetic of nonsensical repetition. So things
(13:26):
are trending online: "rizz skibidi gyat Ohio" is trending online. Yeah.
And it's funny to say that as a sentence, because
you're calling attention to the algorithmic oversaturation of these words.
I think in the sense that she used it, that's
brain rot as the meme aesthetic there.
Speaker 1 (13:43):
It's funny, because I can't think of another time when
the people who use a slang have labeled it with something that,
even on the surface, is pejorative.
Speaker 2 (13:57):
What about the word slang? Slang is, on the surface level, pejorative.
The word slang was coined in the seventeen hundreds as
a way to differentiate upper-class language from lower-class language.
That's all it was.
Speaker 1 (14:08):
The algorithm is changing how we talk, and it's not
just the words. There are entirely new accents that are developing
in response to the algorithm. You'd usually think of an
accent as something you pick up from living around people
who talk a certain way, but the accents here aren't
happening because of other people. Well, not directly. The influencer accent:
(14:30):
Can you explain that to me?
Speaker 2 (14:31):
Well, there's a stereotypical influencer accent, and then there's the
more nuanced explanation. But I'll start with the stereotype: the
"Hey guys, welcome to my podcast." I'm using rising tones
as a way of retention. That kind of uptalk
at the end of each sentence draws you back in,
it reels you back in. There's stress on more words,
that keeps you watching videos. So these are algorithmic retention
(14:51):
tactics that keep you watching the video, that survive as
viral accents, and then they're replicated by people consciously or unconsciously.
I use a different kind of influencer accent. I use
what I call the educational influencer accent. I'll talk quickly,
I'll stress certain words to keep you watching my video.
Then there's the MrBeast-style accent. He also very
meticulously knows what he's saying, intentionally, to go viral. I
(15:14):
just bought a private island, and today I'm giving away
a million dollars. Every word is like shouting at you.
Every word is sensationalized. You look at an interview of
MrBeast talking in real life: he doesn't talk like that.
It's a show, it's a presentation. He intentionally is very
good at manipulating the algorithm. There's people at this point
who just assume that's the accepted way to speak online
and replicate it. So you have people with no followers, like
(15:37):
one hundred followers, and the first time they decide to
post a video on TikTok, they'll talk in that influencer
accent, simply because that's what they think is the norm.
Part of the argument in this book is that algorithms
compound and amplify natural human tendencies. So there's a more
exaggerated... or, what I call, there's the word flanderization, which
means, like, pulling out a personality trait and exaggerating it.
Speaker 1 (15:58):
Quick aside here, because this is an interesting phrase. So
flanderization refers to the actual character Ned Flanders on The Simpsons.
He started out as this well-meaning, dorky neighbor
who also happened to be religious, but the fans thought
that religious part was really funny, and the writers ended
up dropping the complex parts of Ned Flanders and making
(16:19):
that his whole personality. So Ned Flanders is now hyper religious.
He's become a simplified caricature of himself.
Speaker 2 (16:26):
Okily dokily. I think I constantly see that happening with
influencers online. We have to play personas. I play a
persona of myself. I do care about etymology, I
am very excited about this stuff, but I will exaggerate
my voice, I will speak in a hyperbolized manner, because
I know that I'll get views, which will help me
earn a living, right?
Speaker 1 (16:45):
If a lot of people are doing this because they
think that in order to post anything online, what they
should be doing is basically becoming a very small niche
of themselves, and then also it affects our language, which,
we're not going to go full Noam Chomsky here, but you know,
also affects your thinking in some way, then I think
(17:06):
that starts to get a little weird there.
Speaker 2 (17:08):
I think language definitely influences thinking. We categorize the world
a certain way now, and these categories determine how we act,
because we're conforming to these categories. And algorithms, although they
purport to push you into your niche or create more
specific subcultures, they actually have these broad, flattening trends that
(17:28):
push less nuanced versions of reality.
Speaker 1 (17:33):
What Adam's saying here is that these algorithms are flattening
our culture, even with the expansion of language, which in
some ways means our thinking might be getting flattened also.
But this isn't the first time that human language has
evolved in response to technology.
Speaker 2 (17:48):
We have changes in mediums that affect the way we communicate.
The change from oral tradition to written chapters meant that
we could, like, segment our words differently. During oral traditions,
we had to have rhyming, meter, that kind of stuff,
which helps us remember our stories better through songs.
Speaker 1 (18:08):
I actually never thought about that, the function of rhyme
helping you remember stuff. Yes, I'd never thought of it
like that.
Speaker 2 (18:14):
The way we're telling our stories always reroutes around the medium.
Once we have television, things could be serialized. Once we
have the Internet, we have the written replication of informal speech.
I think we're at another inflection point, one where the
medium is now different. It's a new medium, and each
new medium is going to affect how we communicate uniquely.
So the fact that we have algorithms now means that
(18:36):
our language is rerouting around algorithms, to the same degree,
maybe, as the shift from oral tradition to writing things down.
Language itself is just humans doing what we've always been doing,
which is using tools to express ourselves. And when we
say things like unalive, we're acknowledging our presence
in this communicative medium and the social context in which
(18:57):
we're relating to each other.
Speaker 1 (19:00):
Language changing in response to new technology is not a
new thing. But there does seem to be something unique
about this latest trend of algospeak. There's a new
force in play: confusion.
Speaker 2 (19:12):
I talk about this boundary of confusion being a productive
force in language change, where slightly confusing turns of phrase
are good for going viral.
Speaker 1 (19:21):
Really? Like what?
Speaker 2 (19:23):
I talk a lot about the boundary between irony and authenticity.
So I have a chapter on incels, for example,
incels being "involuntary celibates," this far-right misogynistic group.
There's phrenological filters, like canthal tilts and hunter eyes and
interocular distance. These are all incel concepts, and they
popularized these categorizations of, like, people's faces. But they were
(19:43):
kind of funny as a joke. People came up with,
like, mewing, which is a jaw-strengthening technique. It's funny
because it's a joke, but there's some people who believe
it's real. So it spread as a joke, but then
some people reinterpret it as real, which spreads it further.
Same with, oh, hunter eyes, canthal tilts. It's funny, haha,
that incels think these things are important. But now we
have beauty influencers on TikTok showing you how to put
(20:03):
on eyeliner using your canthal tilt. They wouldn't have
been doing that five years ago, but because incel
concepts somehow wormed their way into the mainstream through this
hopscotching between what's real and what's fake. Yeah, that confusion
also generates comments from people saying, is this actually real?
And then once you comment, the comment is engagement. It tells
the algorithm, let's push this further. And now the word
sigma is a very popular kind of middle school slang word.
(20:26):
Right now, sigma just means, like, somebody who's a cool,
dominant male. Everything's about classification, about what's your attractiveness level,
and now, where does this put you in what they
call the socio-sexual hierarchy.
Speaker 1 (20:39):
And I know some of y'all might be saying here, wait,
hold up. If we're going to talk about slang here,
especially quote unquote Gen Z slang, there's something else we
got to talk about. Not too long ago, somebody told
me about this new Gen Z slang they'd heard, which
was finna. And I had to tell them that my
grandmother said finna, like, I'm finna whoop your behind if
(20:59):
you don't do your homework. That word, a lot of
this quote unquote new slang, is just pulled from what
a lot of linguists now call AAE, African American English.
You might have heard it called AAVE, or Ebonics,
or just Black English. And the flow of AAE being
adopted into mainstream slang has been around forever, words
(21:19):
like hip, cool, woke, I could keep going. And AAE
does have a heavy influence on Internet slang to this day.
But the Internet now also has another source of slang
and ideas: incels.
Speaker 2 (21:35):
There's a rule of thumb of slang on the Internet,
that it's either from 4chan or it's from AAE, right?
Those are the two big sources. There's occasionally gonna be
stuff from other things. I talk about other echo chambers
that make language, like Swifties or K-pop stans, but
really it's either Black people or it's far-right misogynistic
trolls on 4chan. How did that happen? Right? There's
(21:57):
increasingly porous edges to echo chambers that allow ideas to
travel through if they are compelling, and they're compelling if
they're funny or if they seem cool. Incels are
very good at weaponizing their memes to be funny. Sometimes
people use those memes to make fun of incels as well,
so it wasn't always like the incels themselves pushing
this ideology, but incel language spread because it was funny, right?
(22:18):
And then people try to talk more like that, and
then in doing so, they perpetuate it to increasingly peripheral groups,
and words transcend cultural boundaries. And it also happens faster
online, where there's context collapse, where you see this video,
you're like, oh, this person's talking to me, it must
be okay for me to also use this word. So
words more quickly transcend filter bubbles like that, and all
(22:39):
of our slang is either African American English, mm-hmm,
or it's incel rhetoric.
Speaker 1 (22:44):
Unfortunately, this context collapse that happens online means that words
lose their origins and it gets easier for them to
travel and be used by communities that are unfamiliar with
the original context. It's not necessarily a bad thing, it's
how language works. But it also means people get more
comfortable with using or alluding to certain words even when
(23:05):
they know the context. I see a lot of people
throwing around the N-word more than they used to,
just kind of in a new way.
Speaker 2 (23:13):
The ninja emoji, that's the common, like, algospeak
euphemism for it, right? Instagram Reels just straight up allows
it now, because they loosened their content guidelines after Trump
got reelected. I think Instagram Reels is the worst social
media ecosystem to be on right now, because there's
some really, really racist stuff on there. Really, there's one
video with thirty million views of a swarm of shirtless
(23:35):
Black men running towards a KFC while the underlying audio
says the N-word repeatedly. It's AI-generated, and that's
just like one example. There's a bunch of videos like
that on Instagram Reels right now. Because, one, Meta is
allowing AI-generated content, they are actively incentivizing it. Two,
they're allowing for more extreme content, because they know that
gets attention better, and at the end of the day,
(23:56):
they want your attention so they can then commodify.
Speaker 1 (23:58):
It instinctively in some way. I've had an idea of
what it is that attracts, say a white person to
really want to be able to say the N word.
It seems like there's different incentives now given algorithms.
Speaker 2 (24:13):
When I interviewed the influencers behind these AI slop accounts
that are making really racist videos, I DMed dozens of them,
and five or six of them got back to me.
I asked them, why are you making these videos? What's
the underlying motivation? And what I kept getting wasn't, oh,
I'm a racist and I want to spread racism. It's,
I want views, I want followers, I want likes. The
(24:33):
institutional structures are there for stupidity and racism and these
terrible things to occur. Once Instagram starts incentivizing that, people
will make content, because that's how they're going to get
views and likes and followers, and thus money.
And it's purely because Instagram's incentivizing it.
Speaker 1 (24:52):
Okay. So now I have to ask the question that
probably a lot of people are wondering right now: is
algospeak, the way that we communicate online and
thus in real life, is it making us stupider?
Speaker 2 (25:04):
Now, I can confidently say language is not making us stupider.
It doesn't matter whether you're saying skibidi or scooby-doo
or whatever. Language is a mechanism for humans to relate
to other humans. It's a way for us to capture
our worldview and then transmit that to other people. We
are doing it perfectly, even if we're talking about Skibidi Toilet,
even if that's quote unquote brain rot. Culture, on the
(25:25):
other hand, is a subjective thing that is constructed internally,
and there are definitely concerning cultural trends: declines in literacy rates,
shortened attention spans. We know these things are happening.
Speaker 1 (25:37):
Throughout this whole conversation, I had this feeling that there's
not quite a word yet for something I've been seeing
that's happening because of algospeak. So Adam and I
made one up. That's after the break. Protests are still
(26:02):
happening in Los Angeles, and as I've been out documenting it,
I've noticed something different. One example that I've seen that
really kind of hit me is IOF.
Speaker 2 (26:11):
That stands for Israeli Offensive Forces, which is a pejorative
algospeak euphemism for IDF, the Israel Defense Forces.
Speaker 1 (26:19):
Right, exactly. If you say IDF on TikTok or Instagram
or whatever, you will potentially be shadowbanned, or your
posts will not get shown to as many people. What
I started noticing, though, is I'm in Los Angeles, and
there's a bunch of protests here, the anti-ICE-raid protests,
and I started hearing people chanting about IOF, and it
(26:40):
took me a second to realize what was going on. Yeah,
so people are chanting it, but also, dig this: there's
graffiti, and people had tagged IOF and crossed that out.
Speaker 2 (26:52):
And that just comes from algo speak. That's a fascinating example.
Speaker 1 (26:56):
It's nuts, because, I mean, if you think about it,
graffiti by nature... you can write whatever you want
on a wall, and so why would you censor yourself in graffiti?
Speaker 2 (27:05):
Well, it's not even censorship anymore. The word's taken on
a new life. I mean, online, IOF is a metalinguistic
wink. It's saying, yes, I'm submitting to the algorithm
while I'm saying this thing that I want to say,
but also, I'm reclaiming it. There's an act of reclamation
in turning the D into an O, because you're signaling
something else about the IDF that you want to communicate.
Speaker 4 (27:25):
Here.
Speaker 2 (27:26):
When we say something like seggs or SA or unalive
or all these things, we're always doing so
with an implicit acknowledgment that there is an algorithm governing
our speech, that it's always present when we say these
words, and there's always some level of acknowledgment for that.
Speaker 1 (27:44):
Watching people say something, for example, watermelon, at a protest,
you know, oh, I want to support watermelon. I've heard
people say that. You don't have to say that, you're
at a protest. Everybody here is on the same side
as you are. But you're saying, oh, you know, well,
the watermelon cause and things like that, is implying somebody could
be watching us. Nobody is.
Speaker 2 (28:04):
Nobody's listening, but the signal that they could be is
also impactful in its own way.
Speaker 1 (28:09):
It's signaling to an in-group. But the extra thing
here that I'm seeing is something... I can't think
of a time a part of English has done
this in the past. Which is to say, for example,
you know, in Chinese you have the completion marker, you know,
the le. In Japanese, you have certain linguistic features that
don't exist in English. In Black English, you have certain
(28:32):
constructions that don't exist in, you know, standard
English, like, you know, "I been gone to the store already,"
or, you know, things like that, "I been hungry," things
like that. Yeah, I can't think of anything in any
language that I know where you use a word that
is also implying, I shouldn't be saying this, and somebody
is watching me say this, and you understand what I'm saying,
(28:54):
and there's a sense of forbiddenness built into the word.
Speaker 2 (28:59):
It's called avoidance speech. There are languages in Polynesia,
as a common example, that have taboos on certain words,
and you can't talk about things like menstruation sometimes, okay,
or you can't mention certain relatives that have passed away,
and there are ways of circumventing that with euphemisms: avoidance speech.
(29:21):
That's not super new. I will say what you said
about this awareness of surveillance is really interesting. I don't
think we've ever had as much of a perception of
being watched and surveilled, and the fact that we're in
this digital surveillance panopticon.
Speaker 1 (29:37):
I feel like we almost need a grammatical term for this,
because it seems like this is something that we're going
to see more of and not less of.
Speaker 2 (29:46):
Well, we should coin one. What about algorithmic performativity? There
we go, let's do that one. Algorithmic performativity here is
speech with the knowledge that you're being watched by an algorithm.
And I also want to be careful when I say
"watched" here, because no one's actually sitting in a room
looking at you. Like, honestly, it's eerier than that,
that there is no person behind the control room. It's,
(30:09):
it's all automated. Our lives are being controlled by something
where not even the engineers know what's happening. So I
find that more terrifying than if I actually had an
FBI agent behind his computer looking at everything I said.
Speaker 1 (30:22):
Given how language clearly is being changed by the algorithm,
by which we mean, you know, the three companies that
control the algorithms that we use to distribute stuff, are
we cooked?
Speaker 2 (30:32):
I like to think optimistically. I think, tentatively, no, especially
in regards to language. I hope I've drilled home the
point that with language, we're fine. With other stuff, culturally,
are we cooked? I don't know. We're still humans doing
human things, right? We're humans using tools to communicate with
each other. That feels fine. At the end of the day,
(30:53):
we should just do what makes us feel good. Life's
too short to not just try to, like, vibe when
we can, and if algorithms help us vibe, then great.
But, you know, we should be aware of
how they affect our lives and piece together what makes
you feel good.
Speaker 1 (31:17):
And that is it for this one. Thank you so
much to Adam Aleksic for talking with me. Adam's new book,
Algospeak, is out now, and if you look
in the show notes there's a link to that, and
of course everywhere else you can find Adam online. And
thank you to you for listening to Kill Switch, and
let us know what you think. I know we've been
kind of all over the place with different topics, but
(31:38):
the world of technology obviously is pretty wide. So if
it's something you want us to cover or something you're
curious about, hit us up on email at kill switch
at kaleidoscope dot NYC. Or you can find us on
Instagram at kill switch pod, or I'm dexdigi, that's
d e x d i g i, on Bluesky
(31:58):
or Instagram. And wherever you're listening to this right now,
whatever podcast service you use, leave us a review. It
helps other people find the show, which in turn helps
us keep doing our thing. Kill Switch is hosted by
me, Dexter Thomas. It's produced by Shena Ozaki, Darla Potts,
and Kate Osborne. Our theme song is by me and
(32:19):
Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope,
our executive producers are Oz Woloshyn, Mangesh Hattikudur, and
Kate Osborne. From iHeart, our executive producers are Katrina Norvell
and Nikki Ettore. Catch you on the next one.