
May 4, 2025 59 mins

The weekly round-up of the best moments from DZ's season 386 (4/28/25-5/2/25)



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello the Internet, and welcome to this episode of The
Weekly Zeitgeist. These are some of our favorite segments from
this week, all edited together into one NonStop infotainment laugh stravaganza. Yeah, So,
without further ado, here is the Weekly Zeitgeist. Miles. We're

(00:26):
thrilled to be joined once again in our third seat
by a research associate at the Leverhulme Centre for the
Future of Intelligence, where she researches AI from the perspective
of gender studies. She's the co-host of the great
podcast The Good Robot. Please welcome back the brilliant doctor
Kerry McInerney.

Speaker 2 (00:44):
Doctor kay, Hi, thanks so much for having me back.

Speaker 3 (00:48):
Oh awful, thank you, thank you for accepting our offer
to return, you know, to class up the joint as
I say.

Speaker 1 (00:55):
International you know, diplomacy going on. We had yeah, you know,
work it out, but we're thrilled to have you back.

Speaker 2 (01:04):
How have you been, I mean, how have we all been?
I don't answer that question anymore so I'm just like,
I guess in the big scheme of things, I'm personally
doing really well.

Speaker 1 (01:14):
But yeah, yeah, thriving, Yeah yeah. What's the energy like? Answer?
What's the energy like? In Europe?

Speaker 3 (01:22):
Looking at the cesspit that is aflame, also known
as the United States. Is it like, is it
like Iraq War era anger, like I experienced
going to Europe, like, what the fuck are y'all
doing over there? Or is it like how we felt
after Brexit, where people are like, oh, sorry, they're fucking
themselves over?

Speaker 2 (01:42):
I feel like, yeah, a lot of confusion and fear.
I don't know if it's necessarily even just anger as
much as it's just people being like what is happening?
Like what are you doing? But I think there is
a degree of like shell shock to it, where just
so much keeps happening that I think you start to
like become slightly numb, which is very bad but also
very understandable. I got a bit of a refresh from
that because I was back home in New Zealand in

(02:02):
March and like, there's nothing quite like being in a
country when you watch the news every day and like
nothing happens. It's like a tree fell near the motorway,
not even on the motorway, or you know that kind
of level of me right right right, And then it
came back here and I was like, oh my goodness,
this is a very different level of news.

Speaker 3 (02:20):
Yeah yeah, yeah, yeah, well we're not getting numb to it. Unfortunately,
every day it hurts, every day. Damn, you got anything
for that hurts every day?

Speaker 1 (02:32):
Yeah?

Speaker 3 (02:33):
Yeah, I mean, but that is the point. I think
they want as many people to turn off and sort
of ignore what's happening. But I think I think as
things ramp up here, people more and more realize how
how much the norms, as flawed as they were, were
things that were worth keeping and trying to improve rather
than be like fuck everything, get rid of it all
and whatever.

Speaker 1 (02:52):
This is, Yeah, what is something from your search history
that's revealing about who you are?

Speaker 4 (02:59):
So I am working on a show called How to
Live Forever right now, which is you know, I'm just
so enamored with like how how these billionaires are spending
all their lives like trying believing that they're gonna live
to one hundred and sixty and and but they're also
like crazy things like cats that like you know, I
think Rick Perry's cousin is helping them.

Speaker 1 (03:20):
Live to their thirties. They're like cats live to their thirties.

Speaker 4 (03:25):
Yeah, with like tiny thimbles of wine and like there's
a whole exercise course for them.

Speaker 1 (03:30):
Wait, is the wine thing real, because like I always
assumed that was just wishful thinking. Is it real? Well,
for cats it is. I don't know about us with
a little bit of wine. Well wow, okay, I mean
it takes the edge off, I guess.

Speaker 4 (03:46):
But you know, they're all these like things and and
they're you know, miracles of science right now. Like they're
these dancing molecules that in a lab in Northwestern they
have severed mice's spines to make them paraplegic and then
give them this injection and within a month they're like
walking normally again. And like, like the advances in science

(04:08):
are unbelievable. But also there's all this like snake oil,
and like what happens when like billionaires stop like giving
their money to society and taking like billionaire's pledge and
just like conserving it for you know, when they're living
to two hundred or whatever. But in all this research
I stumbled into like this rabbit hole on the enhanced games.
Have you been paying attention to this? Like it's basically

(04:30):
an all drug Olympics.

Speaker 1 (04:31):
And they're doing it.

Speaker 4 (04:33):
I know that it was a one in it, but
it is that it's supposed to happen next year, and
and there's all this craziness of like trying to figure
out where is it going to be because they're like,
you know, you can, I think, dope up as much
as you want.

Speaker 1 (04:49):
For these Olympics.

Speaker 4 (04:50):
They're getting people who are like retired Olympians and like
people who have sort of like failed out of the
you know, they were like a fifth place, like swimmer
or whatever, and they're getting roided up, and other
drugs obviously, and and uh and then they're getting these
huge cash prizes to like beat the like Michael Phelps
record or like you.

Speaker 1 (05:11):
Know, and and uh.

Speaker 4 (05:13):
It's just the whole world is fascinating to me, like
the fact that like, you know, the Olympics themselves are
are pretty corrupt. It's not like you know, they're there
are things where like Ben Johnson and and.

Speaker 1 (05:24):
Uh, Carl Lewis and Ben Johnson.

Speaker 4 (05:28):
Yeah, I mean they took the same drugs right, like
and and and one of them is like a hero
and the others like you know this pariah and and like.

Speaker 1 (05:36):
Carl Lewis took those drugs too.

Speaker 3 (05:38):
Yeah, yeah, come on, Barcelona Olympics.

Speaker 1 (05:46):
So that's been like documented since.

Speaker 4 (05:50):
No, but there's a documentary about it,
Steroids, that talks about it.

Speaker 1 (05:56):
Ben Johnson just like didn't hide it as well, because
he definitely looked more on steroids than I think anyone
I've ever seen.

Speaker 4 (06:03):
Well, I think also, I think some countries are better
at hiding it than other countries, right, like certain doctors
are better at hiding it. So like, you know, whether
it's like Russia or China or various places. East Germany,
right, was very, very good at hiding it and not
hiding it. But anyway, like the fact that, like, one,

(06:23):
like the countries are trying to bid on whether they
can like sports wash through like an all drug Olympics
is like kind of.

Speaker 1 (06:30):
Stunning to me.

Speaker 4 (06:31):
And and and also just the people are invested in
this because they feel like people like Peter Thiel are
invested in because they hope that all these longevity drugs
will come out of an Olympics. And it's hard to tell,
you know, is it going to be like a fire
festival of sport or is it going to be like
something that makes money and becomes like you know.

Speaker 1 (06:52):
Regularly watched on TV. Or where's it located? They haven't announced
it yet. Oh, international waters.

Speaker 3 (07:00):
Right, Yeah, we're going to see it like the first
televised like race where people are having like a heart
attack during the butterfly stroke of a swim competition and
you're like.

Speaker 1 (07:11):
Oh boy, gasping for air while you can literally see his
heart pumping out of his chest. I feel like it
either needs to be Vegas or Detroit in honor of RoboCop.
I'm not sure, yeah, right, like or like or to
just be like Saudi Arabia. Yeah, that makes a lot

(07:34):
of sense. Yeah, and and it just like plays it
to this whole like weird.

Speaker 4 (07:39):
There's a GQ article about like everyone you know is
on steroids, you know, and it feels.

Speaker 1 (07:44):
Like this, no, not Jack, I told you I'm not.

Speaker 3 (07:52):
Yeah.

Speaker 1 (07:52):
I had to ask my mom twice. She's everyone I
know that she lifted the couch with one hand. It
was crazy, but got terrible back acne too, I noticed.

Speaker 4 (08:04):
Yeah, but it is I mean this whole world of
like biohacking and human augmentation where like people showed up
two feet taller or like six inches taller after the
pandemic because they'd had those like you know, like like
this whole world is so bizarre and it's fascinating to
see what'll stick.

Speaker 1 (08:23):
I have some suspicions that there was one time where,
like some I ran into someone I hadn't seen it
in a while, and they were definitely taller than they
used to be, and I just like wouldn't because I
hadn't really thought of that as being a real thing.
I wouldn't let it rest. I'd be like, are you
so damn tall? It's crazy.

Speaker 5 (08:43):
It's like you had a goddamn growth spurt in your thirties
and they were like, yeah, man, just like stop, just like, no,
I've always been this tall.

Speaker 1 (08:50):
I don't know what you're talking about.

Speaker 3 (08:51):
Oh man, bro, yeah I had my bones extended.

Speaker 1 (08:58):
You're like, oh yeah, exactly. And you were saying before
we recorded, Bryan Johnson is hosting that show with you.
That's your co host, the guy, his son's bonus. So
that's cool. No, that's not true. That's uh, that's that
is super interesting. I can't wait to have them have

(09:19):
these games and they can't even get close to the
real Olympian records because they were all on steroids too.
You know, we'll see what is something you think is underrated?
Mashed potatoes underrated? Okay, that's just you think they're more versatile,

(09:42):
just more more delicious than people give them credit for.

Speaker 6 (09:45):
Yeah, I think they're top potato.

Speaker 1 (09:50):
So you're saying underrated because we're emphasizing what the fry
too much, I think so.

Speaker 6 (09:56):
I think we're getting too fancy with potatoes. Being in
Idaho brought me closer to the potato, and you can
just throw butter and salt and that's good shit. It
doesn't need anything else. People gussy it up too much.
So mashed potatoes underrated a.

Speaker 3 (10:12):
Lot of butter, you know, if you really want it
real good, a fucking lot of butter, like enough
that people

Speaker 1 (10:18):
Are like, are you, are you okay? And I'm like, yeah,
are you okay? Because this shit's delicious, why don't you worry about yourself. Yeah.

Speaker 6 (10:26):
No, one just makes a bowl of mashed potatoes as
they should.

Speaker 1 (10:30):
Right, Yeah.

Speaker 7 (10:31):
Yeah, I did it down with the movie.

Speaker 1 (10:33):
Three Nights ago.

Speaker 3 (10:33):
I had a bag of potatoes that like one started
to sprout a little, you know, a little eye out
of it, and I.

Speaker 1 (10:38):
Was like, all right, I got to cook these straight, yeah,
just right away. And I was like, maybe I can
roast them.

Speaker 3 (10:44):
I'm like, no, dude, I want to eat a big
ass bowl of mashed potatoes, and I did it, and
my life is better exactly.

Speaker 6 (10:50):
You don't even need gravy. It's just comfort filling easy. Yeah,
mashed potatoes are underrated.

Speaker 1 (10:56):
Where are we on the KFC big bowl,
where, like, that's sort of the base that
you're working off of? I loved it.

Speaker 6 (11:07):
I've been suicidal at one point in my life, but
never that sad.

Speaker 1 (11:15):
KFC big bowl. A dark experience that is also like
kind of worth it.

Speaker 6 (11:21):
That's the thing. I know, it's great. It's just I
can't look at myself in the eye, you know.

Speaker 3 (11:27):
Right, It almost feels it's like, yeah, like probably the
reason I never did Heroin.

Speaker 1 (11:32):
I was like, I've gotten.

Speaker 3 (11:33):
Close, but part of it is just like it ain't.
It's just ain't calling me like that. And I feel
like if it did, it would be all bad. It
would be all bad.

Speaker 1 (11:41):
I did feel like, like William Burroughs talking about heroin,
I was like, it's dark, it's a dark experience, but
it's actually worth it, a road that, like, you
had to go down at least once in your life. Yeah,
Naked Lunch indeed. Yeah, I.

Speaker 6 (11:58):
Haven't tried that monstrosity where it's the fried chicken, cheese, bacon,
fried chicken.

Speaker 7 (12:04):
What was it?

Speaker 6 (12:06):
Yeah?

Speaker 7 (12:06):
That thing?

Speaker 2 (12:08):
Have you guys tried it?

Speaker 6 (12:09):
I can't bring myself.

Speaker 1 (12:10):
That one just seems gimmicky to me. But the mashed
potato bowl like always made sense to me. I was like, yeah, no,
this is something I would make at home. You know,
It's basically what I did with Thanksgiving leftovers for.

Speaker 2 (12:29):
It makes total sense.

Speaker 6 (12:30):
And I think because my family Southern KFC is just
like an abomination of fried chicken, so that I do
have a bias, so that probably goes into it.

Speaker 1 (12:41):
Yeah, wow. What's your favorite fried chicken?

Speaker 6 (12:44):
Honestly it is Ralph's.

Speaker 1 (12:47):
Ralph's? Really?

Speaker 6 (12:52):
Yeah. So a California grocery chain has probably some of the
best fried chicken I've had that's not my memaw's, right? Yeah,
bad day: fried chicken, mashed potatoes, couch.

Speaker 1 (13:06):
Ralphs is Kroger. I'm wondering if Kroger elsewhere has good fried
chicken or if it's just something about Ralphs. Oh yeah,
how's it up by you? Anywhere else?

Speaker 6 (13:17):
We don't have Ralphs. Like, I'm on the border of
California and Nevada, so we have a Raley's, and it
looks just like the Ralphs logo, but it's different. I think
it's privately owned.

Speaker 1 (13:32):
Yeah, that's not even a name. They just made that
shit up. Yeah, Raley's.

Speaker 6 (13:39):
It's just as expensive as Ralph's, but different. They don't
have Kroger brands. I don't know what their brand name is,
their store brand that gang.

Speaker 3 (13:50):
Let us know what is the best grocery store chicken?
Because I do agree? I mean that and Pavilions. Pavilions
get a fresh ship, fresh batch of Pavilions fried chicken too.

Speaker 6 (13:59):
Yeah that in their donuts. Holy fuck. When I lived
in Orange County, our Pavilions made donuts every morning. They
were, like, still hot and a little greasy from the fryer.

Speaker 7 (14:11):
Oh my god.

Speaker 6 (14:12):
Maybe they fry the chicken in the donuts.

Speaker 1 (14:17):
Maybe they fry the donuts in the chicken.

Speaker 6 (14:22):
KFC are you listening.

Speaker 1 (14:25):
KFC Krispy Kreme collab.

Speaker 3 (14:26):
It's like, yeah, we brought over the fryer oil from
a KFC to fuck up these donuts over here.

Speaker 6 (14:32):
Bring in a weed brand and I'll never get anything
done again.

Speaker 1 (14:36):
Wow, Brian, what is something you think is overrated? Oh?

Speaker 8 (14:41):
Yeah, so that, that my Pitt thing was, anti-heroes.
I think they're overrated. We're done with it. Leave that
shit in when we weren't in the Age of Aquarius,
because we are in the Age of Aquarius.

Speaker 7 (14:54):
It's currently as of a few months ago.

Speaker 1 (14:57):
That's great news for me. Is that good? That's from hair?

Speaker 7 (15:00):
It's great.

Speaker 8 (15:01):
It's when all the it's like historically, I don't know
anything about what I'm talking about that, but.

Speaker 7 (15:08):
Historically it's when like huge like revolutions and shit happened.

Speaker 1 (15:13):
All right, Oh, hell yeah, I think we kind of
need that right now, guys low key, I think we
need that right now. All right, So that was your
under overrated? Underrated is The Pitt? Overrated is anti-heroes? No?

Speaker 7 (15:27):
Underrated? Was uh?

Speaker 1 (15:30):
Fuck?

Speaker 7 (15:31):
I actually forget what I just said.

Speaker 1 (15:33):
No, did you say the pit?

Speaker 7 (15:36):
No, The Pitt was me leading into anti-heroes, I
think, and underrated is something else.

Speaker 1 (15:43):
Is the pit? Wait?

Speaker 7 (15:45):
Oh oh wait, I fuck this up? Processed food?

Speaker 1 (15:49):
Oh, that's okay, so underrated, let's do that. Okay, okay.
So I think we flipped your overrated and underrated. What,
what's something you think is underrated? Okay, processed, which one?
What kind? What's the most underrated processed food? I think
like cheese, processed cheese, American cheese.

Speaker 7 (16:10):
Yeah, I'm just kind of like over healthy.

Speaker 1 (16:13):
It's like, yeah, American is just oil and water in the
shape of cheese. I fucking, I am fully in agreement
that I love American cheese. Like the melt, the melting
point on that shit is so crucial. Like, yes, I guess,
and it was made in a lab.

Speaker 3 (16:30):
I used to rake my teeth over the edges of
the plastic to make sure I was getting all that
cheese off when I was a kid, you know what
I mean, Like.

Speaker 1 (16:37):
Like room temps, like melts at room temp is a
wild It is a wild quality for a substance to have.

Speaker 7 (16:48):
Watching the Kraft Singles get squirted into the plastic and
then the plastic kind of adhering it around it is
pretty fascinating.

Speaker 1 (16:55):
Poetry poetry.

Speaker 7 (16:56):
Yeah, that's the Hopecore edit that I should be watching.

Speaker 1 (17:00):
Yeah, I guess. I mean we shouldn't be surprised that
processed food as you're underrated because you said that your
favorite cookie is white chocolate macadamia nut cookies from somewhere. Yeah, yeah,
that's fine.

Speaker 7 (17:15):
They're always moist.

Speaker 1 (17:17):
They are they are, and you know why, because they
are sprayed down with chemicals that aren't available to anybody
except Fortune 500 corporations and the US military.
Brian just he trusts the process. That's right.

Speaker 8 (17:33):
You gotta trust the process. I think we need to
stop having trying to have so much control over everything.

Speaker 1 (17:37):
That's trust the process. Control works in the short term.
Long term, it's a bad strategy. It's gonna it's gonna
fuck you up trying to control everything. Just to let
it go. Eat the fucking processed food. Trust the process.
Trust the processed food. And we know how well
that worked out for the Seventy-Sixers, so it'll

(17:58):
probably work out for all of us as well. That's
my basketball team, bro. You need a Hope Core Sixers
edit, man. Really? I do not. I'm going over here.
I'm dead, man. I'll think about that shit for a
little while. Yeah, all right, great to have you back, Brian.
We're gonna take a quick break. We're gonna come back.
We're gonna talk about some of this news. You heard

(18:20):
about this stuff the news. We'll be right back and
we're back. We're back and we on this podcast, we
asked the hard hitting questions such as where are we
at with AI generally just like generally, what's the vibes,

(18:44):
doctor McNerney, So every because I remember when we first
had you on, we were.

Speaker 3 (18:50):
Like in the midst of like e AI EI, mommy
E I yeah, AI, researchers giving us like the warning
could be the next just's gonna end the world. I'm
I'm taking cyanide now to preemptively excuse myself from this apocalypse.
And after a while, I think that's as we started

(19:12):
to speak to more and more experts, we all became
rightly unconvinced by its capabilities and they're like, oh, it's
a fancy computer program that's like a toy basically, but
they're saying you can do so much more. Then we
see more and more articles about how people using AI
are like fucking up in like legal proceedings like the
MyPillow guy, his lawyers used ChatGPT recently and the
judge is like, there are like thirty critical errors in your, like,

(19:36):
your response here, Like I don't even know what this is.
But now I'm like, are we now? Sort of I
think because we're sort of past the thing of like
this is that world ender game changer. But still these
companies have to make the line go up. Are we
like now seeing more of a strategy to just kind
of normalize AI, use to get people to slowly warm

(19:56):
up to it than sort of like the big explosive
sort of debut with like chat GPT and things that
we saw, because I feel like now I hear more
people talk about AI when they are like miyazakifying themselves
or getting a recipe recommendation, rather than like this is
going to replace the entire medical field.

Speaker 1 (20:14):
Yeah.

Speaker 2 (20:14):
I mean, I think normalization is the exact right word.
And I was just thinking this today as I went
on to Google Search, and of course they now have
that AI overview summary and even some of my fee Yeah.

Speaker 1 (20:26):
You're talking about my friend Gemini. Yeah, we're pretty tight actually, so.

Speaker 2 (20:31):
They're yeah, some answers, you know, even things like that.
I think people like me who say probably wouldn't have
gone to chat GPT just to do like basic information searches,
I have to admit I do read the AI summaries
quite a lot, and you know, and I think that
it's a way of just kind of like making it
really frictionless and really really easy to immediately use an
AI tool. And I think we see that with a

(20:52):
lot of like AI roll out and look, and on
the one hand, like, I'm not completely against the use
of these AI tools if you're really finding it useful
and if it's reliable, and you're kind of getting the
full picture behind the tool and what it's for. But
on the other hand, I think something I do worry
about is something that I think has to be integrated
into all our ethical tech use is like a meaningful
sense of the opt out or a meaningful sense of

(21:13):
the opt in. So like, do I have the right
in the ability to say, like, nah, I don't want
to use this, and you know, am I being like
actively consulted and being able to consent to using particular
tools or programs and so on and so forth. And yeah,
I think the current rollout of AI tools is not
really complying with that particularly.

Speaker 1 (21:31):
Well, mm hmm, right. Yeah, they're so excited they have
a new toy that they want to, that they want
to use, that they're going to make a, you know,
Super Bowl commercial about Gouda cheese. Did you
see that, Doctor McInerney, that Google advertised their AI product
with a Super Bowl ad that had incorrect information in it?

(21:54):
The premise was, this is a cheese farmer, and he
is a whiz when it comes to making Gouda cheese,
but he's all thumbs when it comes to writing marketing material.
And so it showed Google AI, like, telling him
facts that he could put on his cheese. And one
of the facts was that gouda cheese is the most
popular type of cheese, making up sixty percent of the world's

(22:19):
cheese sales. What? Which is just, like, on its surface,
like, obviously wrong. And they still, like, put it in
a, in a Super Bowl commercial, like somebody caught it
once the ad had like been up and so it
didn't make it to the Super Bowl, but it made
it like online and was viewed millions of times. And
it's just that seems to be like it's if if

(22:41):
it's not going to be one hundred percent, if it's
not going to be right one hundred percent of the time,
it's kind of useless because it's just like that. I mean, yeah,
I feel like everybody should be like would be opting
out of it if they knew, like and just so
you know, like one out of like every I don't know,
twenty of the things that you search is going to

(23:02):
be like blatantly wrong in a way that is going
to be humiliating to you. Yeah. Yeah, we're not going
to tell you which one enjoy the product.

Speaker 5 (23:12):
Yeah.

Speaker 2 (23:12):
I mean, first, I love the idea that it's not
even an error, but it's just Big Gouda out there
trying to spread some misinformation about the popularity of Gouda
cheese. But second, like, yeah, I feel like one
way I've heard people describe it as this idea of like, yeah,
something like chat GPT or the AI overview can be
useful for getting like a sense of a topic. But yeah,
I guess the issue for me is like that often
requires quite a lot of expertise to be able to

(23:34):
know whether something is right or not. And so for example,
like my husband is from North Carolina, is a basketball fan,
like I am sort of forcibly inducted into the NBA
enough that, yeah, I could probably tell you eighty percent.
Maybe actually that might be overestimating my knowledge, but yeah,
that twenty percent would be totally out of my knowledge.
And so I think that's my fears. If you're relying

(23:54):
on these tools, it's not that people can't tell that
something's wrong, but you know, when we're using it for
like a really wide range of applications. It does actually
require a lot of expertise in all those different things.
And I think, certainly speaking for myself, like that's not
something I would be able to do or to discern.

Speaker 3 (24:11):
Yeah, because I mean sure, like you think of it like, well,
if I get a B on my test. But if
you're asking something to, like, explain the Civil War
you get all the generals and the battles and dates right,
but the reason for the American Civil War wrong, where
it's like and they all fought over you know, economic rights,
and then that's now it doesn't matter that eighty percent

(24:32):
is completely negated by this other piece of information that
has completely colored the you know, the description of something.

Speaker 1 (24:39):
Now.

Speaker 3 (24:39):
That's why I'm like, even every time I see those summaries,
I'm like, no, Like I'll click on the links that
are saying like we're using these to tell us, and
I'll look at them like, this is not really even
exactly what's happening here, to the point that I feel
like it's causing more harm than good because I've at
least learned to try and just research myself. You know,
That's how I got my theories on the Earth shape

(25:02):
and things like that, you know, research.

Speaker 1 (25:03):
I have some interesting ideas on that. I don't know
if you have a couple hours, Doctor McInerney.

Speaker 3 (25:08):
Well, Doctor, every time I've emailed Doctor Kerry, she's respectfully
declined to entertain the conversation, which, as I say, she has a

Speaker 1 (25:13):
lot going on. She has a lot going on, but
I mean it is she could be a useful source
to you though you're looking for somebody who's been in
an airplane. You know, I looked out. Are there cool
uses of AI that aren't getting attention or I guess
even like tech breakthroughs that you know, we asked this
the last time you were on but like we're still

(25:34):
in the early stages of AI. Are there directions that
have like popped out to you that you know the
future of technology and this technology in particular could take
that are promising for like the bettering of the world
for more than like twelve rich guys, which I feel
like is the current the current model. It's like these
twelve guys are like, yeah, this would be amazing if

(25:56):
we could replace all the people. To see me with
Totoro. That was Totoro from the Miyazaki
movie. But where are you seeing hope? Where are you
seeing hope?

Speaker 2 (26:09):
I guess I'm genuinely terrified I'll, like, say something and it
would be like, that's the exact same answer you
said two years ago, and, like, destroy any hope. Yeah,
I mean I think like I'm always like quite excited
by more creative uses of AI or people who are
like really trying to think about like instead of saying,
like how can we make like one product that works

(26:29):
for the whole world, like, which tends to be the approach
of things like ChatGPT and so on.
They obviously don't work for the whole world in all
these different cultural contexts and all these different linguistic contexts,
because I don't think a single product can. But you know,
I do think like there are really interesting examples of
groups like Te Hiku Media in New Zealand where I'm from, which
have been like using AI machine learning techniques to try

(26:51):
and focus on te reo Māori, or the indigenous language of New
Zealand, language preservation. And so because of like long histories
of kind of the state suppression of te reo, there was
like a period where like there weren't that many native
te reo Māori speakers, it was like really aggressively suppressed, and
now as a result, people are kind of trying to
really reinvest and support the kind of revitalization of te reo

(27:12):
Māori. And Peter-Lucas Jones, who is the CEO, I believe, of
Te Hiku Media, and like their whole team have been really
intentional and sort of really world leading, I think, in
how they've been trying to use AI machine learning for this.
I think a big part of their project though, is
that they're very insistent on like indigenous data sovereignty or
making sure that like their platforms aren't sold to big

(27:33):
tech or reliant on big tech. And I think I
imagine that's like a really challenging project because like this
sort of small handful of like big tech firms like
are incredibly dominant in this space, but a lot of
that has been around like no, you know, we really
want to make sure that this remains like technology by
and for Māori people and for our organization. And so
I think projects like that I find like really really

(27:55):
inspiring and really important. But I think they're also just
like a great example of like AI development being done
super well, which is like you have a clear problem,
and you have clear ideas and ways that you think
that AI machine learning can help us address in part
some of this problem without like positioning it as the solution.
Because I think if anyone comes out of the gate
and it's like AI is going to solve this problem,
that's when I think you should always be a bit like, oh,

(28:18):
I don't think it is, especially if the problem is
something really, really massive, like discrimination, it's kind of like, well, look,
we just have to reject that one sort of straight
out the gate and kind of think a bit more
specifically about this.

Speaker 3 (28:30):
And I'm sure those companies are like and when we
made that claim, that wasn't actually meant for you to
be the receiver of that message. It was for Wall
Street and investors when we said this thing will solve everything,
because like, yeah, I mean like that application feels like
the kind of thing that like, you know, in the
US right now, Trump is very focused on eliminating any
semblance of equity or diversity inclusion.

Speaker 1 (28:53):
Obviously as we've seen like the woke.

Speaker 3 (28:55):
DEI initiative crusade against all those things, and that sounds
like exactly the kind of thing that Trump would be. Like,
that's not useful. It just has to be this other
thing that's a money making endeavor. Because right now his people,
like the people within the administration, like the head of
Science and Technology, have said things like Biden's AI policies

(29:15):
were like divisive and it's all been about all been
about the redistribution, like redistribution in the name of equity.
And naturally Trump has fired many of the AI experts
Biden hired because it was clear like obviously Biden hired them.
He's like, there's a huge bias problem with any of
this stuff, and if it's even going to be a
product people use, like that's probably a thing worth tackling.

(29:37):
But now like it's become sort of like normalized within
this administration to say this is all harmful now because
it's trying to advance equity, when like when we've spoken
to people like you and other experts, it's like, no,
you have to get rid of those inequities or else
it doesn't even fucking work, Like like when you're talking
about things like being able to someone who has like

(29:59):
a darker complexion, and how does, like, an automated
car even identify that pedestrian as an object to
avoid because again, these biases affect all these different systems.
But it feels like now, like at least from the
American conservative side of things, or just generally the tech
conservative movement.

Speaker 1 (30:17):
Is like, yeah, maybe it's just like fine if it you.

Speaker 3 (30:20):
Know, misidentifies like black people or doing these other things
that just kind of show that at the end of
the day, we only it only needs to work in
the way that we want it to work, and all
the other applications whatever be damned.

Speaker 1 (30:32):
Keeps identifying Black people as traffic cones. Do we think,
do we need, can we go to market with this still?
Or is that, are we good here?

Speaker 3 (30:40):
Well that's yeah, And I'm just I'm amazed at how
you know, I think, how much like objectively, this is
the thing. If you want a product to work, it
has to be able to be used around the world.
So how useful is that in a place that doesn't
have like an ethnic majority that's all white people in
the same way like that? You know, again, these all

(31:03):
just feel very counterintuitive, But that seems to be the
name of this year, this year's theme generally.

Speaker 2 (31:08):
Yeah, yeah, I mean the idea that this is the
year of counterintuitive things really resonates. Yeah, and I think
it's like not only disappointing, because it's like I do
think that it shouldn't be super hard to buy into
the idea that like, yeah, AI that is like more
equitable or less biased and more fair, like, genuinely is
actually in the long run good for everyone. Although I
know there's like a reactionary group that feels like any

(31:29):
kind of equity or equality is you know, kind of
impinging on their own kind of share of the pie.
But you know, I really think like AI ethics and
safety is for everyone. But yeah, I also think it's
very sad because like this has huge knock on effects
for the rest of the world, because like the US
is a world leader in AI and tech production, and
so yeah, if you have an environment that is kind
of saying, let's throw ethics out the window, then that

(31:52):
does have knock on effects for the rest of the
world that buys and uses still a lot of this technology.
So yeah, I think like these rollbacks obviously massively affect
the US domestic context, but they certainly don't stop.

Speaker 3 (32:03):
There, right. Is there any, do you see any, but
like, this is as stupid on its face as it
sounds, right, to be like, we have to, we have
to stop with these inclusive efforts to address biases within
AI, like, models.

Speaker 1 (32:16):
There's no.

Speaker 3 (32:18):
Like it's as dumb as that sounds, right, because in
my mind and everything I've read, people are like, no, no, no,
like it it works worse when it has all these
like inbuilt biases, like it will not work as good
therefore is not viable. So it is that is bad, right,
There's no like secret things like well, you.

Speaker 1 (32:36):
Know some bias is good for these things.

Speaker 2 (32:39):
I mean, if someone, if that is, that is the
secret sauce then please tell me, that's definitely not what I
would think. I mean, yeah, I mean I think it
just comes down to again, like I think there's a
fundamental irony of saying, like, you know, the power of
these AI enabled tools and products is that you know,
we can perform all these like massive tasks at scale
in this idea, and again like a product that can
be sold to the world, a product that can be used

(33:01):
at scale with the kind of knowledge that this only
really works for like a very very narrow base. I
mean my assumption to be fair though, is like I
think that a lot of people who make products that
have you know, these kinds of exclusions or biases aren't
necessarily going in being like, I know my product is
really biased, and I actually just like don't care, Like
I don't think that usually is it? I think often
to me, it's just this kind of mindset of either

(33:23):
we just like haven't really thought about it. I think
this is particularly common with accessibility, which is that accessibility
has to be integrated from the very beginning of the design process.
the design process. It can't be slapped on at the end.
And too often I think that's how it gets approached,
and so people just haven't even begun to think about,
you know, oh, like will my language, I mean, sorry,
will my model work for disabled people? My product work

(33:44):
for people with these different kinds of like physical disabilities, Like,
you know, I think it's just that the whole groups
of populations just get ignored, or you know, maybe they've
realized and they think, oh, it's actually a really bad
thing that this product doesn't work for this particular group.
But I think I'm just going to make the trade
off and decide, like, I think that's a small enough
consumer base that I can still sell my product, and like,
I don't think I'll get too much into pushback. I

(34:06):
think it'll be fine. And I think this, you know,
might often be the case for populations that are perceived
as being like very very small. So I'm thinking of
say like trans or gender diverse people who often get
erased from certain data sets because they're like, oh, well,
this is like statistically a very small percentage. It's like, yeah,
but those people's exclusion like really really matters. It has
a huge impact on their life. If they go through

(34:28):
a scanner and their body is not recognized or seen
as non normative, or if they're excluded from different databases
like these do have huge knock on effects for people.
So yeah, so I guess I would say that, you know,
the kinds of people maybe who are not seeing AI
ethics as a priority, aren't doing it because they're just like,
you know, oh, you know, debiasing whatever. That's fine. I

(34:49):
think it's Yeah, it's probably more just to do with
a lot of different blinkers or like a certain kind
of narrow mindset about who your consumer is and who's
actually using these products?

Speaker 3 (34:58):
Right, yeah, it does feel that way, probably. But for the Trump administration,
I'd say they're probably very much focused on the fact
that because there it doesn't matter it's like, I don't know,
even if it makes everything unsafe, I just have to
say the words I don't like equity, and that's without
any consideration for what that means.

Speaker 1 (35:13):
And if the whole world breaks, like the better for
him to consolidate power, you know, like that feels yeah,
I mean, did you see this Semafor article about, like,
the group, the Marc Andreessen group chats, that, like, it.
So he's been having like these signal chats since the
days of the pandemic, and it's like he has like

(35:35):
gone out of his way to bring in all of
these tech leaders and then like fucking super far right
wing, like, people like Richard Hanania, like the guy who's
like an outright white supremacist, and like put these people
in group chats together. Like at one point he like

(35:56):
brought in some progressive people and then they wrote a
New York Times op ed criticizing laws that were banning
critical race theory, and he just like had a meltdown
and was like you betrayed me by writing this thing,
and like kicked them out of the group chat, and
like since then, it's just all this like extreme right

(36:18):
wing propaganda that's being like kind of vomited back and
forth between these people who are like the biggest, most
powerful oligarchs and like the people who are in charge
of the direction that tech takes. And so I yeah,
I mean it feels like this whole thing is developing
in a way that feels particularly like non optimal and

(36:42):
like stupid and narrow minded and racist and white supremacist
and all those things, and like that this was very
helpful for a context. I just wonder, like all of
like we're seeing a model that's being developed, not in
the best way possible. It seems like like to say

(37:02):
the least, and we're probably going to like discount a
lot of these possibilities because they're so shitty at what
they're doing. But do you do you see that sort
of white supremacist mindset kind of pervading in the tech mainstream?
What a question? I know, it's a big one. Yeah.

Speaker 2 (37:25):
No, I mean I feel like, I guess, like what
I do think this gestures towards is this like
very public repositioning of Silicon Valley, which I think always
had this like relatively liberal veneer, even if it's not
clear how deep those roots actually went. Kind of very
explicitly aligning themselves with the right and with Trump. And
it's a little bit hard to tell, like how much
of this is like political expediency people saying, well, clearly,

(37:47):
you know, I'm going to do anything to avoid heavier regulation,
We do anything to avoid this kind of like sort
of punitive regime that Trump's exacting on many different institutions.
So we're going to you know, put ourselves in a camp.
And how much of this is like an expression of
like deeper ideologies and deeper kind of political beliefs that
have maybe sat dormant or sort of have kind of

(38:08):
been cultivated within Silicon Valley and tech firms and now
kind of finally bursting onto the scene now that they
feel a bit more empowered as there's kind of been
this like global shift towards the right. So yeah, I
don't know, honestly, like how much you know, we can
discern between sort of like the real deep feeling and
the kind of political expediency argument. But I think it's

(38:29):
just undeniable that you have people like particularly Mark Zuckerberg,
who would have been like normally more kind of center
possibly center left, on a lot of issues, even though he
was obviously running like a massively exploitative, gigantic, quite dangerous
tech company now sort of really aggressively rebranding as a
kind of wooing Trump, but also sort of big into

(38:51):
a lot of these like hypermasculine tech bro sort
of languages and ideologies that I don't think certainly five
years ago we would have seen him sort of pointing
in the same way. So yeah, it feels like there's
been this sort of very visible shift in Silicon Valley.
But again, as someone who's not based in Silicon Valley,
like I probably couldn't tell you like how much of
that feels like this this is actually the real face

(39:12):
of the beast versus this is actually just like what
people are doing for the moment.

Speaker 3 (39:16):
Yeah, because I mean you see how much, like, Andreessen
got Silicon Valley money together to get Trump into office,
along with a bunch of other crypto people, like seventy
eight million dollars from, like, the Andreessen side of things,
and you know, like these group chats they do, it's
like our modern day smoke filled lounges where you get
to see these very powerful people sort of debate these

(39:39):
topics and get their takes out there that are like
wacky as hell, and that's why I think, like reading
this article, it was a little bit more like damn.
I mean, I don't know if everybody in this
group chat thinks that, but there are definitely some vocal
people in this group chat that definitely think they are
the ones who are going to solve these very complex issues,

(40:00):
but in the most like inelegant, one size fits all
kind of way.

Speaker 1 (40:04):
That's just all about power. And that's when I'm like, oh,
I think it's really these are.

Speaker 3 (40:10):
Starting to blend together, especially as we see how much
how people's like media diets and the information they receive
are informed by what these people who are running these
companies how they believe and like how information should move
and how people get siloed into information bubbles and things
like that. That's when I started to begin to be like, hmmm,
this feels a little bit more like a smoking gun

(40:31):
when you hear like, you know, these ideas are being
like exchanged and knowing that, like there's this guy Curtis
Yarvin that we talked about a few weeks ago who's
like basically like a tech monarchist who has a lot
of ideas that people like Elon Musk and Peter Thiel
like are really subscribed to and yeah, and like in
these group chats, it's is where these a lot of
these very influential people start getting introduced to these.

Speaker 1 (40:54):
Kinds of ideas.

Speaker 3 (40:55):
So that's when I'm like, hmm, this feels very sinister
at best. I mean, like when you see this and
then thinking that these are the people that control the
levers that you know, just normal people who are using
the Internet and stuff get affected by their policies, it
feels like a little bit like they figured out this
is how they exercise like their immense control over people

(41:18):
in these sort of you know, less sort of not
like in the ways that we think in terms of
governance or whatever, but through the spread of information and ideas.

Speaker 2 (41:26):
Yeah, and I guess I feel like the signal chats
like raise two things about this kind of like pivot
to the right, which I think are both really important,
and like, I think the one is what you've pointed out,
is this like small scale influencing. Like I think often
when we talk about disinformation or misinformation, we're thinking about
that in terms of like, oh, someone makes like an
AI deepfake and they like release it onto the
Internet and somehow that deep fake like convinces like many,

(41:48):
many many people that this event occurred or didn't occur,
and it has like seismic effects, and like, you know, again,
I don't think that's necessarily how disinformation works. I think that, yeah,
these kinds of, like, small cultivated circles of trust may be
the things that we should be really really concerned about,
which is like what actually causes people to change their
mind or change behaviors, Like maybe these are the levers
that can like be significantly more effective. But yet the

(42:11):
second is kind of a point you were raising, and
I think does again like tie into what happens when
you build these bonds of trust, which is just like
following the money. And like you mentioned sort of the
massive amount of tech funding that got Trump into office.
And I had a colleague who's now at The Guardian, and
he and the team kind of just like tracked all
the funding behind the Trump and Harris campaigns. And what

(42:31):
was just really astounding was just like the extraordinary amount
of money that specifically just came from tech, and like
that's been a really big change in the last five
years or so. It's just like how much money sort
of Silicon Valley has been putting into political lobbying, and
so I think, you know, yeah, it's even regardless of
like your political orientation, but like particularly right now with

(42:52):
the Trump administration, like I think that's something we should
all be really concerned about.

Speaker 1 (42:56):
M h.

Speaker 3 (42:57):
Yeah, I mean, because it feels like this is the
best, at some level, the tech industry is sort
of like, well, we're actually now the ones that are
really able to manufacture consent through social media and misinformation,
and you marry that with, like, a, you know,
presidential administration that would really benefit from that kind of
like full court press propaganda kind of making, and it

(43:20):
feels like, oh, everyone kind of wins in their own way,
although there are clearly some people in tech have their
regrets after all the tariff chaos and they're like no no, no,
no no no no not actually also, don't just fuck
fuck that one's actually really valuable. I just wanted less
regulations and for people to say the N word on
Twitter more. That was my whole thing, and now I

(43:41):
have no money or less money. But yeah, it's it
feels like it's just like the most clown show version
of all this. I think, which is also very frustrating
to watch or just have to sit idly by as it's
all happening to.

Speaker 1 (43:55):
Us. Very intentional. Very. Also, it's just like really pathetic,
just like seeing how easy, easily influenced these people are.
I mean, it makes sense because it feels like they're
from a church that believes that whoever, wherever, the most
money is equals the right thing. And so you just

(44:16):
like get them on a group chat with a billionaire
and they're like, well, I mean that guy must be
the smartest guy in the world, and so right his
ideas I have to listen to.

Speaker 2 (44:25):
Yeah, and I do think as well, Like, you know,
to some extent, I'm like, yeah, to that point, much
less interested in what people like quote unquote really think
and much more interested in just what they do. And
so to some extent, I'm like, yeah, you know, does
it really matter whether Mark Zuckerberg is personally invested
in these like you know, quite misogynist ideologies about like
what it means to be a man or doesn't matter

(44:47):
that he like has gone on Joe Rogan and now
like propagates a lot of these ideas and has you know,
publicly shifted Meta to kind of align with or support
Trump's administration. Like that, to me is like what really
matters right now, which is that we have kind of
these active, kind of shifts of power in the tech
industry that go beyond kind of like politically signaling in
certain ways, like they're really really tangible. And I think

(45:07):
Musk is like the flagship of that, in him being
someone who's just been like very very visible in the administration.
But you know, he's certainly not the only one.

Speaker 1 (45:15):
Yeah, it feels like there's no shortage of these tech
billionaires who are dead set on a horrifying ideology. All right,
let's take a quick break and we'll come back and
keep talking about AI will be right back, and we're back.

(45:40):
We're back, and I do just want to flag the Floam reference, Miles.
I don't want that to go by without being remarked on.

Speaker 3 (45:49):
It's okay, Like I said, check my bio encyclopedic knowledge
of the nineties.

Speaker 7 (45:53):
Is that the one that would like that would almost
like crack when you molded it.

Speaker 1 (45:56):
Yeah, it was like a bunch of little beads almost, yeah,
like held together. It was really, what if Rice Krispie
Treats were a toy. You know what, if Rice Krispie Treats
gave you bowel problems, if you even touched your mouth
after handling it, if you ate them, they were in
your body for the rest of your life and also
your children's lives, your progeny.

Speaker 8 (46:18):
Also, now every third person is like gooning to asmr
shit and we have we have Floam to thank for that.

Speaker 3 (46:24):
I know right exactly. We were just out here just
trying to figure it out on our own. With Nickelodeon Floam.
I gave it the full brand name because I felt
like people would forget if I didn't properly say Nickelodeon Floam.

Speaker 1 (46:36):
Nickelodeon Floam, a larger and larger part of their body
is just made out of floam. It's like instead of
a left knee, I have floam down there. I can
just mold it and remold it as I walk. Anyways,
old debate and this is not this is not the
story that I thought I was going to be talking
about today. And yet but like I said, we report

(46:58):
on the zeitgeist, we don't make it. Yesterday, I had
the experience of checking social media and it just being
wall to wall one hundred men versus one gorilla, you know,
commentary videos, and so this is an old debate. I
mean this sort of debate. You know. There was one
of my favorite articles from the early days of Cracked,

(47:20):
like two thousand and eight Cracked, I think, was by
this writer Chris Bucholz that was just like how to
fight twenty children like as one person, And it was
just like really well thought out and good advice that
I've that has shaped me as a person.

Speaker 7 (47:37):
That you've used many times.

Speaker 1 (47:38):
Yeah, yeah, I'll link off to that as my work
in media. But it is kind of one of those
questions like once you start thinking about it, it's kind
of hard to stop. I will say the original question,
as originally posed on Reddit five years ago, I think
has a fairly simple answer because they really stacked the

(48:01):
deck against the humans. Oh yeah, yeah. The original question
was like, all right, first round, you can only go
two at a time, two men at a time, yeah, against
a gorilla, and they're in a gorilla exhibit. The second
one is at a construction site,
and you can go ten at a time. They also

(48:21):
say that it just has to be like one hundred
average people. And then the third one, I guess you
can go all at once, but you're not gonna have
many people left after those first two rounds to go
all at once, which is like, really your only hope.

Speaker 3 (48:35):
I think all of the sort of modifiers are meaningless
because really the question is, how can, what would it take,
just in a fair fight?

Speaker 1 (48:45):
Okay?

Speaker 3 (48:45):
Yeah, no tools, no weapons, not because this guy's basically
bringing up like Rainbow Road type shit where like the
gorilla would fall off the edge or some shit.

Speaker 1 (48:54):
No, what is it going.

Speaker 3 (48:55):
To take for one hundred human beings to beat the
fuck out of a gorilla?

Speaker 9 (49:01):
Yeah, like a cage match or something like that. Yeah,
you know, Yeah, And my first reaction is it's not happening.
It's not happening. Ever, my big thing with all of
this is human beings don't have like fangs. They cannot
draw blood, So you're you're you're counting on purely overpowering
a gorilla. I mean, we do have fangs. Moms, can

(49:24):
you, okay, can you, can you draw blood
through that gorilla's fur?

Speaker 1 (49:28):
Probably if I had to, if he.

Speaker 3 (49:32):
If he stayed still, if I hit the finger real
hard like Charlie, yeah, I probably could. I can bite
through a lot I got, but I'm just saying that's
how I see a gorilla somehow, like if it fought
a tiger. It's because the tiger slashes the gorilla and
it's it's like it's leaking in a.

Speaker 1 (49:49):
Straight up hold the gorilla? A fucking hundred people
aren't doing that.

Speaker 3 (49:54):
The first guy that gets ripped in fucking half by
this thing, all ninety.

Speaker 1 (49:59):
Nine other people are gonna go, no, right, you need
These are the things that I think you need. I
think first of all, we need to we need a
draft, like, and not, not a, like, military draft where
we're going by birthday. I think we need to, MacGruber-style,
get to put together a team. MacGruber, of the film,
not the SNL sketch, Oh thank you, which the film

(50:24):
the film sketch. That the sketch is a completely different thing.
The film is a work of art. The you know,
I think we need to be able to intentionally assemble
a team. I think there needs to be some sort
of a squid game style motivation that is going to
prevent what you're talking about. Whether they see a person

(50:45):
get ripped in half and everybody's like, this is not
worth it. It needs to be worth like it needs
to be one hundred men and the gorilla in the
cage and like it's not over until.

Speaker 3 (50:57):
It would be sick in Squid Game, like in the actual,
like, a new season of Squid Game, like when there's
like four hundred people left, it's like all four hundred
y'all versus gorilla right now.

Speaker 1 (51:06):
Right, Yeah, y'all are going to die. A lot of
y'all going to die.

Speaker 3 (51:11):
But four hundred, four hundred feels like maybe four hundred,
So how are you?

Speaker 1 (51:17):
I don't even know if four hundred is different than a hundred,
other than, like, I feel like a gorilla killing three
hundred people with its bare hands may tire the gorilla
out get tired, and then the other ones are just
like coming up and like kind of yeah, I feel
like that would probably be true after like fifty right,
Like that's that's a lot of people to kill.

Speaker 8 (51:37):
I could see that. I almost think now, I think
you could almost it can be. It can almost be
a random assortment of one hundred people. It doesn't they don't.
You don't even need to do a draft. You just
need a coach. That's what, like like a maestro, a conductor, Yes.

Speaker 7 (51:54):
A maestro who's running plays and basically the goal of
every play is some he goes for the eyes and
once you get the eyes.

Speaker 1 (52:03):
Brian, Okay, that is right, It's got to be the eyes.
Although all right, so this is what I'm saying, you
gotta be the eyes. Like that convinced me in a
split second. At first, I was like, we need a
like legendary football coach. I think the coach. No, I know, well,

(52:24):
now I'm thinking we need still still I want Mike Tomlin.
Mike Tomlin from the Steelers, and was like, just every
year they have a terrible team. Everyone's like, these guys
are gonna win two games this year, and every year
they are like a two seed, right, and he's just
has energy. He like yells at everybody, cut your eyelids off,

(52:46):
cut your eyelids off when they go out to play,
which is just like so I don't know. And and
also a great tactician, although we might not need one
with Brian. No, no, Brian, I was just thinking a primatologist would
be helpful, like somebody who sure can really speak to it.
But like a primatologist where you've, like, you know, kidnapped

(53:06):
their kid, because otherwise they're going to be like
we should let the gorilla kill us.

Speaker 3 (53:11):
Yeah, because I don't want to introduce like, you know,
there could be something like gorillas hate like reflective material
or something like maybe like a non weapon object that
opens it up. But I don't even like that idea
because now we're not talking. We're talking about one hundred people. Yes,
one hundred people with tools can beat a gorilla. With just,
if we're going to brass tacks, bare hands

(53:32):
and feet. I think, Brian, yes, you you take out
the vision. Now you're dealing with an animal that has no
situational awareness, and now you can kind of get a
little like but even then you're.

Speaker 1 (53:44):
Going to be teasing it. Yeah. Yeah, but either way
terrified all of a sudden.

Speaker 3 (53:56):
But you still have to solve the part that you
haven't negated, the absolute, dude, exponentially higher strength of the gorilla,
because the second you grab it, the gorilla will just
put hands on you.

Speaker 1 (54:06):
Rip you a half, even if it can't see.

Speaker 3 (54:08):
And I feel you're gonna, you're just gonna tick the
gorilla off even more with his eyes being all picked.
Gorilla is pissed.

Speaker 1 (54:14):
Yeah, I so I was. I was fully on the
you just overwhelm it with one hundred at once. And
then this Guardian article about this debate pointed out that
somebody did a, three professional soccer players, three professional football
players versus one hundred children, oh, I say, in Japan,
in Japan, Yeah, and the professional soccer players won. Oh yeah,

(54:37):
they beat the shit out of them. They won. I mean,
they won one nothing, But like, I don't know that
maybe well they had if.

Speaker 3 (54:43):
You see the video, they had like sixty kids in goal,
like literally sixty, Like there were like sixty kids dressing
goalkeepers in the goal and they just had to kind
of headed in over them. But like the way they
just the three of them just spread the field out
and were just hitting long balls that they can like
like just had the kids chasing right uh. And it
was pretty it was pretty impressive if it's the one

(55:05):
I'm thinking of, because I've definitely seen that one.

Speaker 1 (55:07):
I was like, yeah, their skills are just way too
high for the kids to handle that. But
I think if it's one hundred random people, it depends.
I do think you're probably gonna get one person who
screams go for the eyes in that group, you know
what I mean, And then and then you might have
a chance if we're allowed to build a team. I

(55:28):
do like the humans chance, like if we're allowed to
draft on even if we're allowed to draft ten, there's
no answer. There's no answer, Like there's no conceivable answer
for how you'd do it. What are you gonna do? Like
have ten guys try and hold the arms right like
a gorilla will I'm talking Mike Tomlin and the foremost
primatologist and then like Lawrence Taylor or whoever the closest

(55:52):
person to Lawrence Taylor currently is.

Speaker 3 (55:54):
Actually I like the eye thing, but I keep coming
back to the thing, like you can't negate the strength
is the thing.

Speaker 7 (56:01):
Even in the eye thing, I do think ten people
will die for sure, Oh yeah, yeah, oh yeah, Ultimately
the people will win, I do.

Speaker 1 (56:11):
I do think that's right. Like everyone everyone's pushing vibes
right now.

Speaker 7 (56:15):
No one's giving me plays, and then you you show
them and give me.

Speaker 1 (56:20):
An example of the play. What happens? How are you
gonna, like, you can't, can't damage a gorilla's Achilles
tendon with your teeth, you know what I mean.

Speaker 7 (56:27):
I'm trying to think the first ten are absolutely going
to get fucked up.

Speaker 8 (56:31):
That eleventh person goes for the eyes and then the
ninety rush.

Speaker 7 (56:37):
The fuck out of the gorilla.

Speaker 1 (56:38):
I think a big difference is going to be if
it's in a ring where no, you can't pick up
something that's close by, because like our access to tools
or like our ability to use whatever is at hand
as a tool was like the crucial thing that allowed
us to like get out of the food chain. So
like if, if it's a gorilla enclosure at a zoo even

(57:02):
or like you know, I feel like that is almost
an advantage because then I do feel like you can
like you have rocks, you.

Speaker 3 (57:09):
Know, Okay, how about this not see but that's the thing, Yeah,
I get that, but I'm I think that the thing
that is in the appealing about this problem is how
could the pure just strength.

Speaker 1 (57:20):
Of the human body overcome a gorilla?

Speaker 7 (57:22):
Obviously powerful.

Speaker 5 (57:27):
If you've been watching these sports, these Hope Core
videos have run uh in a place where he believes
I'm with him.

Speaker 3 (57:40):
Yeah, he's just like over that, like like track, you
just hear this song playing that's like thee and all
the Hope Core You're like, oh man, and then the
hundred guys ran after the gorilla.

Speaker 7 (57:52):
It is very graphic and poking.

Speaker 1 (57:54):
Guy, I'm saying one like access to one rock. I
don't know. So there's this story I like that guy
Brice versus a guy named the Lion.

Speaker 7 (58:08):
This is what I'm talking about.

Speaker 1 (58:09):
Thank you ahead Biblically, the original sports Hope Core. Way
to bring it back, Way to bring it back. All right,
that's gonna do it for this week's weekly Zeitgeist. Please
like and review the show if you like. The show
means the world to Miles. He needs your validation, folks.

(58:34):
I hope you're having a great weekend and I will
talk to you Monday. Bye.
