
August 19, 2025 • 46 mins
Buckle up. Educators and activists Forrest Valkai and Erika (Gutsick Gibbon) showed up for a chat about AI, fossils, "the Monad," and more!

Become a supporter of this podcast: https://www.spreaker.com/podcast/thethinkingatheist--3270347/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Understand: The Thinking Atheist. It's not a person, it's a symbol,
an idea.

Speaker 2 (00:10):
The population of atheists in this country is going through.

Speaker 1 (00:13):
The roof. Rejecting faith, pursuing knowledge, challenging the sacred. If
I tell the truth, it's because I tell the truth,
not because.

Speaker 2 (00:22):
I put my hand on a book and made a.

Speaker 1 (00:24):
Wish and working together for a more rational world.

Speaker 2 (00:29):
Take the risk of thinking for yourself. Much more happiness,
truth, beauty, and wisdom will come to you that way.

Speaker 1 (00:35):
Assume nothing, question everything, and start thinking. This is the
Thinking Atheist podcast hosted by Seth Andrews.

Speaker 3 (00:56):
So anytime I see Forrest Valkai and Erica, host of the
Gutsick Gibbon channel, together in the same physical space, I
have to put microphones on them and we have to talk,
which is why I do this about every ten minutes
on my channel. And this was the case just recently
at the BAHAcon event, the Bluewater Atheist Humanist conference

(01:18):
in Sarnia, Ontario. So Erica had presented that morning. She
did her thing on human evolution, which was amazing, and
then we were there early afternoon. Forrest Valkai was up
at four and then I was following Forest right after
that at five o'clock, so we had some time to
kill and I said, hey, let's talk for a bit

(01:40):
and just roll and see what happened. So we went
up to my suite. I didn't take them to my
hotel room. We had a suite with a living space.
We all kind of hung out on the couches and
chairs and we just started talking. Now, one of the
prompts for this was that a presentation earlier in the
day was getting into AI and I just wanted to

(02:00):
get their take on it because there are AI religions
that are popping up.

Speaker 2 (02:06):
All around the world. We knew that was going to happen.
There are people who are falling.

Speaker 3 (02:10):
In love with AI chat bots, so instead of having
human intimacy, they are now dating their chat GPT companion,
which has a name and an avatar. And what's that
look like in this strange new world? And you know,
is it healthy, is it terrifying? Is it dysfunctional? I

(02:31):
wanted to find out about that, and then I wanted
to get into how AI is affecting education and how
do we know what's true in this perhaps post truth culture.
So that was the catalyst for the conversation. But you know,
we weren't going to stick to that. We just went
all over the place. The chat runs about forty five minutes.

(02:52):
I hope you enjoy. Here we go. So I'm at
a convention. I've got Erica, I've got Forest. I've got
one word. I want to start with, What the hell
is monad?

Speaker 2 (03:07):
Now? Monad? Monad? Wait, what's monad? It was more
than one word, all that. By the way, there was
a whole-ass question in there. I don't know, what,
you know? What is? What is monad?

Speaker 3 (03:18):
And my listeners who haven't been watching The Line, is
this where this originated?

Speaker 2 (03:23):
Monad? Yeah? Well clearly it originated eons ago in the
most brilliant minds that humankind has ever produced, and happened
to never write down in succinct ways, but only in
poetic ecstasy.

Speaker 3 (03:37):
Set it up, Forrest. Someone called the show and said there.

Speaker 2 (03:44):
Is a god.

Speaker 4 (03:45):
No, no.

Speaker 5 (03:46):
What they said was how will the host I'm paraphrasing here,
but they said, how are the hosts going to deal
with the monad?

Speaker 4 (03:52):
Like?

Speaker 5 (03:53):
How will we cope with it? And you know we're
taking this call in It was a call that we
took at the last minute. You know, we were aiming for
a shorter show of only.

Speaker 4 (04:00):
About six hours.

Speaker 2 (04:01):
I'm sorry, did you just say shorter show?

Speaker 5 (04:05):
We're trying to keep things concise, but, you know, Arden
twisted our arm and we're like, I guess we'll
have to take one more call.

Speaker 2 (04:11):
So I blame her, by the way, because if we
had taken one more call, we would have beaten our
previous record of eight nineteen, and it would have been
really... and we ended at eight hours fifteen minutes.

Speaker 3 (04:18):
It was a telethon, not a show. It
was... it was a show.

Speaker 4 (04:22):
It was a show for sure.

Speaker 5 (04:24):
But the call-in says, you know, how will the hosts deal
with this, and we're taking this as
a theist call, so we're thinking, I'm thinking to myself,
this is a Catholic thing, and I don't know what
it is, but Forrest probably knows. At first, the call
opens and Forrest was like, what the hell is monad?
And I was like, oh, thank god, because I also don't.

Speaker 4 (04:40):
Know what monad is.

Speaker 5 (04:41):
And then we had an hour-and-thirty, maybe
forty-five, minute-long conversation, and we did not get
a definition for Monad until maybe the last thirty minutes.

Speaker 2 (04:50):
And even then it was worse than when we didn't
know what it was.

Speaker 5 (04:53):
It made things worse because it made them stupider. Yes,
it made the whole thing. And you said at the beginning,
You were like, if you were to put this monad in
a few words, and it's like, well, monad, because it
is everything and also nothing and also one, and also
infinity and also zero. I kind of think you can
just... he did use one word. It is that all
of existence is monad.

Speaker 2 (05:13):
Yeah, exactly, it was that. And then also it was
electromagnetic radiation, which is also the same thing as energy,
which is also the same thing as your mind. And
you can't separate your mind from the rest of the
universe intellectually, and therefore you can't separate the rest of
the universe from light. And therefore everything is something is

(05:34):
something is monad. And just for like ninety minutes, we're
just beating our heads against the wall trying to understand
this stupid word.

Speaker 3 (05:42):
I posted a photo from Erica's speech this morning at
BAHAcon, which was remarkable, and immediately in the comments
section someone wanted to honor her monad or whatever, and
I'm like, I need help with this. Is that a
typical call for The Line? And I co-host there.

Speaker 2 (06:03):
You co host there.

Speaker 4 (06:04):
I've had a lot of weird calls.

Speaker 3 (06:06):
They dial in, and we take... we prioritize theists
on the Sunday shows. How about SkepTalk? Is
that theist callers? All of it is?

Speaker 2 (06:16):
Yeah, SkepTalk is more supposed to be like you
have at least one expert in an area that's taking
calls about that specific thing, But it always defaults to
theist calls because that's who watches our shows,
trying to find a way to get us, you know
what I mean.

Speaker 3 (06:30):
And the thumbnails are always prove God, help us understand God?
Why should we believe in God? Why are we wrong
for not believing in God? And people will call and
defend God, and the defenses are sometimes just a
ramen noodle soup of what-the-hell, right?

Speaker 5 (06:46):
It's that. And it's also just like, it's somebody's
first rodeo half the time where they call in and
they think that they're going to get you with like
a baby's first you know, argument for God that they
heard on Creation Today or from, you know, a
TikTok short or whatever, Redeemed Zoomer or someone like that,
and it's like, you know, we've been doing that. I

(07:07):
don't even know if I can say that I've been
doing this for a long time. It feels like a
monad of time. I've been working on this a
long long time, but it's been a long time since
I've heard something that like totally.

Speaker 4 (07:17):
Was brand new. But I have to say, the Monad
call was new to me.

Speaker 2 (07:21):
It was it was a new thing. And it's funny.
There have been a couple of times like that where
it's like, Okay, this is actually a new argument, and
I haven't heard a new argument in months. And you
know what, I fucking wish I hadn't. I wish you
didn't call in.

Speaker 3 (07:34):
I want to start substituting it for other terms, Like
I just want to like throw it into a sentence
whatever I'm talking about it.

Speaker 4 (07:40):
I've been doing it since the call.

Speaker 2 (07:41):
I've been doing in my house.

Speaker 4 (07:43):
I've been doing it online.

Speaker 2 (07:44):
I get tons of comments like, how the monad are
you today, and like, praise monad, I got to see
your video, and like... Just also, that dude is
still arguing on my Instagram, by the way. He is
still in my comments section, arguing, to this... This morning
I saw him in there, still arguing.

Speaker 5 (07:57):
I don't know if Jimmy has seen that clip yet,
but there is no way that there's not going to
be some capitalization, probably merchwise on the monad, which is
going to infuriate this person that we're making dozens and
dozens of dollars off of them.

Speaker 3 (08:11):
You will make tens of dollars. I brought you
the concept of the monad, right, and what
you have now created is an idol.

Speaker 2 (08:21):
I need to pay this guy's stupid royalties, and I
don't know how to do that.

Speaker 3 (08:25):
All right, So I'm going to do a subject change
because we were talking about this right before I hit record.
We were talking about Drew McCoy had just given a
presentation about AI in relation to religion, which is wacky.
First of all, he was talking about people
using ChatGPT as a kind of a life partner,

(08:47):
even a romantic partner. So you type in things, it
gives you answers that sound really human and speak to whatever.

Speaker 2 (08:55):
I don't know.

Speaker 3 (08:56):
It's picking up on you as you type, right, it's
bouncing back reflections of your personality. But it feels sometimes
like a human interaction. And then it becomes the movie Her,
where your life partner is now AI, or Ella, is
it, or whatever they called it. And I think
to myself, are we now entering that age where people

(09:18):
are leaving their spouses and running away with a laptop.

Speaker 2 (09:23):
Yeah, yeah. It sounds flip, dude, it sounds funny,
but people are already leaving reality and running away with
their laptop. And like, we see that a lot. There's
a ton of... So it's tricky, because, like, on
the one hand, there's all this information coming out about
how it just freaking rots your brain. You are sacrificing
your critical faculty or your imaginative faculty, like all these

(09:45):
parts of you in order to let a machine do
thinking for you. There are some really worrying usage charts
of ChatGPT that show it peaks during weekdays and
drops off on the weekends. Wonder why that would be.
And then the second school lets out, it just plummets.
And so this is what people are using to do
their homework, to do all their fact checking, to do

(10:06):
their thinking for them. I can tell you several times
I've gotten in arguments online with somebody and they'll say,
I asked ChatGPT and it said I'm right, and
that was their source. And so there are these worrying
statistics about that. But on the other hand of that,
it's also a burgeoning technology that isn't going away. And
literally every argument that's made about it was also made
about books, that it's going to ruin people's minds, that

(10:27):
they're going to be thinking for them, or the internet, exactly, books,
the internet, you know, writing in general. It was just
these are things that are going to ruin people's minds,
and oh, these kids have their brain in a fucking
book every day.

Speaker 3 (10:39):
I mean, I didn't open a book, and the book
was... I don't know. First of all, a lot
of this is plagiarism software, yes, plagiarizing racism, okay, a
lot of it's also...

Speaker 5 (10:49):
Just wrong as well. Like, you know, the more
esoteric the subject, the worse the results tend to be.
Like, I can sit there and ask it. For instance,
when I was putting my slides together
for my talk today, you know, I was
googling some pictures of, like, different hominins, different fossils,
whatever, for getting images, you know, but as we all know,

(11:10):
when you search, Google puts the AI result at the
very top, and it's giving information about this stuff that
you teach in an introductory bioanth class, which is offered
at the very basic freshman level at many large universities.
It's just outright incorrect information, right? Like, I'm not sure.
I like, there's the brain rot aspect of it, and
then there's also the it's not even correct in as

(11:33):
a replacement, it's not even... it's not even mining
wholly correct information in the first place. But it's easy, yeah,
and there are hazards there.

Speaker 4 (11:42):
It's funny, you know.

Speaker 5 (11:43):
When I TA'd last semester, we ran exams
through ChatGPT after they were created, to see what kind
of answers ChatGPT could get, so we know what to
flag. There's a whole other side of the
coin to this, where there's a, you know, I'm sure
it always comes.

Speaker 4 (11:57):
Back to this. But there's a capitalistic.

Speaker 5 (11:58):
Incentive for universities, in a sense, to let AI run wild,
because then they buy companies that are specifically ChatGPT checkers,
and then, by FERPA rulings, you're only
allowed to run students' assignments through certain AI checkers, right,
like, you can't use external sources. And they're

(12:19):
paying god knows how much money to license that software
for the entirety of the university. And now there's a wonderful,
horrible feedback loop. An arms race really is what I
would call it, between the AI and the AI checking
and how students use that, and so you know, I
mean to derail this in the direction that I wanted
it to go in.

Speaker 4 (12:38):
The first place.

Speaker 5 (12:40):
I don't know what we're gonna do, education-wise, because
you're just talking about how ChatGPT usage plummets right when
school gets out. I suspect we're going to have to
go back to oral presentations and pen and paper, and
that's going to have to be all that's allowed.

Speaker 3 (12:56):
I did wonder if we would end up moving into
communities, like... It's almost like those dystopian future sci-fi
books and films, where you have those who are,
like, enslaved to technology, and then, you know, the holistic,
natural, we-love-sun-and-light-and-water Elements group,
they're all wearing, you know, beige and earth tones, and

(13:17):
they go off and live in a forest.

Speaker 2 (13:19):
Way. I feel like it's the exact opposite. I feel
like the Eloi are these people who just live in
this happy wonderland where everything's fine and normal, and we
Morlocks are down here in the lab trying to tell
them how horrible the world is and how scary it is,
how they need to believe in vaccines, please. It just sucks.
And like, that's... With ChatGPT, I try to remain

(13:41):
optimistic about it. I really do, because like I said,
it could be really helpful. It really could. As of
right now, it's hugely environmentally destructive, systemically racist, very
stupid, wrong all the time. And I think about it
the same way, like, when Google first came out,
you mentioned the Google AI. When Google first came out,
all the boomers reminded us that anybody can post anything

(14:02):
on the Internet, and that you can't trust it. If you google something, yes,
it's easy, you're getting results faster, but it's nothing compared
to going to the library and getting a book that's
edited by an expert.

Speaker 4 (14:10):
Wikipedia is not a source exactly.

Speaker 2 (14:12):
And now look where we are, you know, And so
I have this optimism where it could be better than
it is. And then I also work on the Internet
for a living and I see how people actually use it,
and it makes me want to die. Speaking of how it
can be wrong...

Speaker 3 (14:28):
I was reading an article, I forget where it was,
about how, okay, AI brings this information, it is now
eighty percent correct, eighty percent accurate. Someone uses it as
a source and it is posted, and then a future
AI search plagiarizes from that, and it then dilutes it further.
So now you've got something that's a source of a

(14:50):
source that is seventy-one percent accurate, and before you
know it, how do we know what's real? And that
makes my brain explode. How do we know what's real,
if everything can look like everything?
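What Seth is describing is compounding error. A back-of-the-envelope sketch in Python makes the spiral concrete; his eighty and seventy-one percent figures come from the conversation, while the per-generation retention rate below is invented purely to make the arithmetic land near them:

```python
# Toy model of accuracy dilution across AI "generations":
# each time AI output is scraped and re-synthesized, assume
# only a fixed fraction of its factual accuracy survives.
# The retention rate is a made-up illustrative number.

def diluted_accuracy(initial: float, retention: float, generations: int) -> float:
    """Accuracy left after `generations` rounds of re-ingestion."""
    accuracy = initial
    for _ in range(generations):
        accuracy *= retention
    return accuracy

print(diluted_accuracy(0.80, 0.89, 1))  # one re-scrape: ~0.71
print(diluted_accuracy(0.80, 0.89, 5))  # five generations: ~0.45
```

Under that toy assumption, content that starts eighty percent accurate falls below half after a handful of re-ingestions, which is the "source of a source" problem he is pointing at.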

Speaker 5 (15:03):
I think when it comes to AI, there's AI is
a tool, and if it's treated like a tool, it
could be really wonderful for us, Like it's going to
automate so many processes, especially in like genomics and biomedicine,
if a single dollar ends up remaining in that pot
to spend on grant-scored projects. But assuming that that happens,
like there's wonderful potential here for this is a tool.

(15:25):
The problem is it's not being used as a tool.
It's being used as a crutch, right, So if you
want to find sources, and I've done this before, it's like,
you have a question, you type it into Google, AI
will give you an answer that sounds like the answer
to your question, and it quote-unquote provides its sources,
and, I don't know, maybe twenty, thirty percent of the
time you can go check that source and it's
not saying what ChatGPT is saying it's saying. But sometimes

(15:47):
it is, and in those cases it becomes useful because
then it's like, oh, okay, like I didn't have to
comb through a million hyper specific papers or blog posts.
This at least pointed me in the right general direction,
and that can be... that can be a good thing.
But then other times it hallucinates and it'll tell you
that someone said something over and over and over again,
and you're like, I don't understand, where are you getting
this quote from? And then eventually it's like, well, I
actually don't think that person said that. You can go

(16:09):
back and forth with chat GPT on this kind of thing,
but at the same time, the technology is only going
to get better, So I think what you're going to
have to do is divorce education and critical thinking and
teaching people how to do that in sort of primary
learning environments, from this technology, and only allow it to come
in after those skills have been developed.

Speaker 4 (16:29):
I just don't see that ever happening.

Speaker 2 (16:31):
That's impossible, honestly. Like, that's what's crazy, is even...
Like, so in grad school they had us use ChatGPT
and NotebookLM quite a bit to fact check
our shit. Like, don't have ChatGPT write your essay,
but plug your essay into ChatGPT and have it
proofread it and say, what are some good arguments
against what I said here? So that now you can

(16:52):
think about those things and make your essay better. Plug
it in there and ask it, is this
clear and concise and well written, at least, you know, whatever.
And then NotebookLM is great, because you should do
your primary literature review and then plug those papers
into NotebookLM. Have it scan all those papers and
say, am I right to think this? And it would say,
well, this paper says this, and this paper says this,
so here's some things, and then you can do

(17:13):
the follow-up. The critical problem is when you just say,
what do I need to think? Tell me what to
think about this. Is this right? Is this what I
should believe? If that's your first step, if
ChatGPT and NotebookLM are your first step, you
fucked up, because it has a proven track record of
saying what it thinks you want to hear, not what
it thinks is right, telling you that all your stupidest

(17:33):
ideas are the best ideas in the entire world. And
it's really extraordinary, compelling. You should really be in movies
and write a book about it. Oh my god, it's
an incredible contribution to humanity. And just like Erica is
pointing out, making shit up. And when you believe the
dumb shit that it says not only about reality but
about you, you become the health secretary and you publish
a fucking report that's full of citations that don't exist

(17:57):
and is clearly written by AI in order to tell
people that the best doctors and scientists in the world
are all wrong. And if you just have a worm
eat up your brain, you'll understand some things. And if you
don't know how to read a science paper, you can work
for a president who doesn't know how to read anything. And
then you can change the world. It drives me crazy.

Speaker 3 (18:14):
There is so much more remaining in my conversation with
Forrest and Erica. Let me take a real short break
and I'll be right back. I am doing a Texas
speaking run last weekend of this month, so join me

(18:36):
for August twenty-ninth, thirtieth, and thirty-first. I'm going
to be there Friday night, the twenty-ninth, in Austin. Drew McCoy,
the Genetically Modified Skeptic channel host, with his partner Taylor,
they're going to be there, and that's a tag team
event Friday night in Austin. Saturday I'm presenting in San
Antonio, Sunday morning for Houston Oasis. So it's three Texas

(18:59):
cities in three days. All the details are at the
speaking tab on my personal website. Just go to Seth
Andrews dot com. And this is the last half of
my chat with science educators, activists, and friends Forrest Valkai
and Erica, host of the Gutsick Gibbon channel. So I've

(19:22):
got two scientists here. Define your area of expertise quickly, Erica.

Speaker 5 (19:27):
Uh yeah, I mean gosh, I do biological anthropology, so
I study human evolution.

Speaker 4 (19:31):
Is that broad enough.

Speaker 2 (19:32):
That's totally fine, And I guess human biology is the
best way you could say it.

Speaker 3 (19:35):
Give me an example of how AI or whatever we're
calling it has improved what you do, like if you're
getting into human fossils and whatnot. Has it improved the
search algorithm to link up different types of fossils? I
don't know.

Speaker 2 (19:49):
Give me an example. Is there anything that strikes you?

Speaker 5 (19:52):
You know, so much of paleontology, and I wonder if
this is going to map for you at all... Like,
so much of it is relying on your experience
and intuition and following your gut, and then taking that down
an analytical rabbit hole. So it's like, in those ways, ChatGPT,
at least at present, it's very hard for it to mimic.
It's very hard for ChatGPT to look at an aerial

(20:13):
overlay of, you know, the Afar region and tell you
which area is going to be best likely to support
fossilization or something along those lines. So it's not really
being used in that aspect. I'm going to give you
a way that I do think it is extremely helpful
to my field in particular, but also to science in general,
which is that it's really good at checking your code.
So one thing that people don't understand is that all

(20:34):
science involves statistics. Unless you are just describing something, like
a new species or, you know, a new methodology or
something like that, there are stats involved, and that sucks, and
they don't actually tell you that when you get into science,
and then it turns out statistics is like forty percent
of it. But statistics has to be done in programs
like R or MATLAB. And so what you have
to do is you have to write code to tell

(20:55):
ChatGPT, you know... or, forget ChatGPT, you
have to write your code to tell R that, okay,
you know, I want to create a linear model and
specify all your, you know, data and all that. And
it's like those are very finicky programs, and if you
don't have a background in coding, you can spend I
kid you not like cumulative days fighting with this program
just getting it to run something that is, you know,

(21:18):
it is conceptually simple, but it's hard to code it
because you're not a coder and you're really an
amateur in that field. So what ChatGPT is really useful
for is, if I'm trying to run some code and
my code isn't working, I can put it in ChatGPT
and say, where's the typo, like, what's causing

Speaker 4 (21:33):
This code to fail?

Speaker 5 (21:34):
And it's useful in that sense, that it helps
streamline the experiment that I designed, the method that I designed,
the hypothesis that I designed, or the methodology that
I'm using, right? But it's a glorified, like, scut
monkey in that sense, right? Like, it's not actually doing
any of the work. It's just taking the time that

(21:56):
I would previously take to comb for mistakes and streamlining that.
And I think that's, I mean, I'm biased,
of course, but I think that's the ideal way in
which it should be used, as a tool by people
who do themselves also know what's going

Speaker 3 (22:12):
On. A proper research monad. Research, yes, is very good
for me.
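The task Erica describes, telling R to fit a linear model and then using ChatGPT only to find the typo that keeps the script from running, really is one conceptual line. A rough Python stand-in (not her actual R code; the data below are invented) looks like:

```python
# Least-squares fit of y = slope*x + intercept, the kind of
# "conceptually simple but finicky to code" statistics task
# Erica describes running in R. The data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)  # noisy line

# Design matrix with an intercept column; np.linalg.lstsq plays
# the role of R's lm(y ~ x).
X = np.column_stack([x, np.ones_like(x)])
(slope, intercept), *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"slope ~ {slope:.2f}, intercept ~ {intercept:.2f}")
```

The point of her anecdote is that the model itself is a couple of lines; the lost days go to syntax, which is why "where's the typo" is the one job she trusts the chatbot with.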

Speaker 2 (22:19):
My second master's is in BioMed and I can't tell
you how many talks I sat through of people saying, like,
you know, we thought we knew the end of
what we knew about this. We plugged all this huge
population data into ChatGPT, and it found patterns we humans
didn't notice, and now we are running new experiments on
this disease spreading in this super niche population, this

(22:42):
this particular gene that we never would have thought to
look at before, because it's able to look at all
of this, just bonkers loads of data, about demographics, about psychographics,
about genomics, about, you know, epidemiology, about... like,
there's data in everything else, and treat every single line

(23:05):
of that data as if it's super important and see
which one actually is in a way that a human
could never do, and it is able to do it
in an hour, which would take a million humans a
million years to do. And that is a huge thing.
But then, as Erica pointed out, in the end, the critical
next step is someone, a human who knows what they're
doing, then going up and verifying that and following up

(23:26):
with it. So, like, that's a big thing, that I
see it as a huge opportunity to go back and
re-review the data of generations and come
up with new information and new models and new
track records and new patterns that we would never have
thought to look for, that could advance science so much,
as long as there are scientists to interpret that at the

Speaker 3 (23:46):
End, instead of TikTok influencers who are falling in love
romantically with AI.

Speaker 2 (23:52):
I don't know.

Speaker 4 (23:52):
I don't forget.

Speaker 5 (23:53):
They're also using it to commune with the spirits, with
the monad if you.

Speaker 2 (23:56):
Will love No.

Speaker 3 (23:57):
I mean, there was a video they played earlier
this morning of someone who said that AI was not God,
but AI now is apparently a funnel or a conduit
through which God can now speak.

Speaker 2 (24:10):
It's, I don't know, it's what is old is new again.

Speaker 3 (24:12):
I guess people have been looking for conduits for spiritual
revelation since whenever, and what this is just the next
big shiny thing.

Speaker 2 (24:20):
So AI.

Speaker 5 (24:24):
It's weird though, right? Because it's like, what, did
he just wait around twiddling his thumbs? Like, Ouija boards
came out and he was like, this is good, but
I'm going to wait another fifty years until I really
get this shit going. And then AI comes out and
it's like, finally I can commune with my people. It's like,
I don't know that that's the most efficient way of
doing things, if he's playing through ChatGPT.

Speaker 3 (24:45):
We were terrified of Ouija boards. If someone had tried
to bring one into our home, they would have been
locked out.

Speaker 2 (24:51):
We would not have.

Speaker 3 (24:52):
And of course now on the other side of the
looking glass, I'm like, it's cardboard and ink, for God's sake.

Speaker 2 (24:57):
I mean it is actually literally licensed, my husband.

Speaker 4 (25:00):
It is a Hasbro game, it's a toy.

Speaker 2 (25:02):
It's a children's game. Yeah, that's my favorite thing.
There's this great clip, it was, you know, on
TikTok and whatnot, this guy's doorbell camera, and he's
getting DoorDash delivered, and this woman comes in
who has, I swear to God, her shirt just has
the word Jesus on it fifty times, like it's just
Jesus in all different styles of letters. And he
has a Ouija board doormat for Halloween, and guess who

(25:24):
has a problem with it. And so she's dropping off
his McDonald's and she's like, you know this is evil, right?
And he's like, it's a Hasbro game. And she's like, no, no, no,
it's a communication with the spirits. It's like, no, no,
it is literally a Hasbro game. I need you to
look at it. It is a Hasbro game. That stuff's
not real. And he talks to her like she's five,
like, no, no, no, sweetie, that stuff's not real.
That's not real. I'm sorry.

Speaker 4 (25:44):
You've seen the Labubu stuff too, as well.

Speaker 2 (25:46):
I've heard so much.

Speaker 4 (25:49):
Do you know about Labubu?

Speaker 2 (25:51):
No, it's the biggest monad of the century.

Speaker 4 (25:53):
It's the biggest one of the century. So, Labubu...
I can't believe that

Speaker 5 (25:57):
I get to explain Labubu to Seth. Labubu, the weirdest...
It is like a little plush toy, and it looks
like a little, like, monster baby, kind of, like, looks
like a little baby monster.

Speaker 4 (26:08):
Basically, some K pop stars, if.

Speaker 5 (26:10):
Memory serves, started like wearing them, like clipping them on
their belts or on their designer bags or whatever, and
they got really popular here in the West. People are
buying Labubus everywhere, and the Labubu, sort of the
game, the gamification of Labubu collecting,
is such that you buy a box and you don't
know which Labubu is contained within. So it's another fun

(26:30):
way to sell gambling.

Speaker 4 (26:32):
To kids, which is great. I love it.

Speaker 5 (26:34):
It's so, so amazing that they're getting inundated
with this absolutely everywhere.

Speaker 2 (26:38):
Back in my day... It is, it's all the worst
parts of, like, Furby, Beanie Babies, and, like, Pokémon cards,
all wrapped into one horrible new trend, and you
have, like, adults, and again, I know this
is not a new thing, but it is the thing
now, where you have videos of adults, like, sobbing in

(26:59):
their car because they didn't get the pink Labubu.
They got another goddamn green Labubu, and they
have enough green Labubus, and they're just ripping
its head off and screaming at the camera that they
need the pink one. And it's like, it is everything
that we have seen, every trend, over and over and over.

Speaker 4 (27:13):
And participated in arguably when we were younger.

Speaker 5 (27:16):
But I mean it's like you're like, God, you're right,
people are losing their shit because they didn't get glim
glamor or something like that.

Speaker 3 (27:21):
Put that Labubu in your... If there is a major
religion in this country, it's probably consumerism, isn't it? That excess?

Speaker 2 (27:27):
Right?

Speaker 3 (27:27):
That's why cartoons exist these days, mostly just to sell
toys in the first place. But Labubu is
fun to say.

Speaker 5 (27:34):
The reason I have sort of brought this to
your attention, this pressing matter, is because there's a non-zero
portion of evangelicals, but I'm sure...

Speaker 4 (27:44):
I'm sure it extends outside of.

Speaker 5 (27:45):
Them, who think that Labubu, much like Harry
Potter or Pokémon before it, is actually demonic, because it's
a little monster baby and it has sharp teeth, and,
you know, they do all this stuff where they're

Speaker 4 (27:56):
Like, you're gonna die at this. I saw one the
other day. It was like, the.

Speaker 5 (28:01):
Labubu sounds like the Pazuzu demon, and it's like, dude,
I don't... A lot of stuff sounds like a lot
of other funny stuff.

Speaker 4 (28:11):
I don't know what the time, you're a.

Speaker 2 (28:12):
Demon named Pazuzu.

Speaker 3 (28:13):
I'm not taking you seriously anyway, right, it's just Pazuzu.

Speaker 2 (28:18):
If I ever had like a pet pair, no, but
I mean.

Speaker 3 (28:21):
When I was growing up, they were playing
rock albums backwards, and you know, they were looking...

Speaker 2 (28:25):
We're pattern seekers, right. So anyone who is looking for...

Speaker 3 (28:28):
Something that has a pointy end on it or is
somewhat dissonant, you know... they're going to see agency,
probably dark agency, everywhere they look. We've evolved to see
those patterns. See, I set you up, Erica.

Speaker 4 (28:42):
Right, I know. This is a softball, though, it really is.
You were like, how can I... can I get...

Speaker 2 (28:48):
How do I read?

Speaker 3 (28:48):
How do I bring us back to paleontology, et cetera?

Speaker 5 (28:53):
Yeah, I mean, I mean pattern seeking is enormously helpful,
whether you're trying to problem solve and figure out where
the best place to forage or hunt is, or whether
you're trying to figure out how to navigate a social relationship,
which I think is something that we appreciate less when
it comes to pattern seeking. What's the most important pattern.

Speaker 4 (29:08):
Seeking that we do. It's facial recognition.

Speaker 5 (29:10):
I mean, we are This is exactly why if you
see two dots and a line, you see it as
a face. It doesn't matter where you see it.

Speaker 4 (29:17):
It looks like a.

Speaker 2 (29:17):
Thing, like the face on Mars.

Speaker 5 (29:19):
Yeah, the face on Mars. I look at that outlet
in the corner of the room and it looks like
a little guy, you know.

Speaker 4 (29:23):
I mean this is this.

Speaker 2 (29:24):
There's a Facebook page, it's called Faces in Things.

Speaker 5 (29:27):
Yeah, or like wood grain, or, you know, a funny car,
like, the way with the headlights and the bumper in
the front. I mean, this is what we do, right?
We pattern-seek, and we tend to seek patterns, unfortunately,
that confirm things that we want to be true or
things that we already know, and we tend to reject
things and actually put the blinders on, like, psychologically
we're less likely to see information or patterns that we don't

(29:49):
want to see.

Speaker 4 (29:50):
So I also said Labubu... Oh my god,
it's true. There barely is a relationship.

Speaker 2 (29:57):
I admit it.

Speaker 5 (29:58):
I'm actually a Labubu agent out here. But,
you know, I mean, you see something scary, and you
see something that kids enjoy, and you see something that
could in any shape, form, or fashion act as
an idol, and all of a sudden it's demonic. I
mean, they do it with absolutely everything.
I don't know why they still think people are taking
them seriously, but then again, I know about their beef

(30:20):
with it, so obviously it's doing something right.

Speaker 3 (30:22):
For the record, everyone in this room says they aren't a
Labubu-ist. I believe you.

Speaker 2 (30:25):
You don't believe. You don't know shit about me.

Speaker 3 (30:30):
So I'm going to come back to the fossils and whatnot.
I've totally taken another turn, but that's what we do.
Tell me about the technology used to find that stuff.
I know in the movies they're shooting, like, sonar pings
into the ground and there's a digital readout that says
"dinosaur skull" or, you know, "human ulna" or something.
Is that bogus?

Speaker 4 (30:51):
Do you want to know what it's really like?

Speaker 5 (30:53):
Most of the time it is sweltering, and I'm crawling on
my hands and knees up a spiky hillside in the
middle of an exposed section of dirt and rocks in
rainforest in western Kenya. I can hear the thunder in
the distance. I know it's about to storm. Baboons have
stolen my lunch two times this week already, and I

(31:13):
am looking for rocks that vaguely have the sheen of
fossilized enamel on them, amongst millions of other shiny rocks
that are just shiny rocks. That's what most of paleontology is,
and a lot of it is very frustrating, and a
lot of it is the best thing you've ever experienced.
The high that you get when you spot a perfect

(31:35):
Rangwapithecus molar with the cusps all still intact. It's not
even worn, clearly from an adult in the prime of
its life, and it's from a tooth position that nobody's
ever found yet. And I get to find it, and
I get to hold it up, and everybody's here.

Speaker 4 (31:48):
It's an amazing experience.

Speaker 3 (31:49):
I mean, like, what's the strike ratio on something like that?
I mean, I know it depends on the region and
the depth and blah blah blah. But you're out there
and you're looking at, what, thousands of tiny stones until, hey,
look, a tooth.

Speaker 5 (31:59):
For my fieldwork, yes, that's what it is. But
my fieldwork is also really unique because of the area
that it's in, and for whatever reason, the taphonomic bias,
which just means, like, the types of preservation that
favor big things or small things or jumbled things or
articulated things, is, like, biased towards jumbled, itty-bitty things
like teeth or little bits of long bone, which you

(32:22):
would think aren't very helpful, but in actuality...
micromammals make up the baseline of the ecosystem, so it's
really important stuff. There are other places that I've been
where it's a little bit more straightforward, where you actually
get to stand instead of crawl, and you're walking and
you're looking for exposed shafts of femora or humeri or
something, this, that, or the other thing, and you can
see them strikingly. Once you've got... what did we used

(32:45):
to call it? There was a phrase that we used
in the field. It's sort of like you have to
train your brain like you would an LLM.

Speaker 4 (32:51):
Look at this, I'm bringing it. I'm bringing it full circle.

Speaker 5 (32:54):
Yeah, you pattern-seek, right? So it's funny, you start
with this kind of thing. And I went
my first week, probably, and didn't find a single fossil.
And I'm talking, I go over a spot and
the experienced Kenyan fossil collector behind me finds eight things
right where I just passed over.

Speaker 3 (33:09):
Do you have, like, a little brush like they do
in the movies and all that stuff, or are you just looking?

Speaker 5 (33:12):
I pick with my hands. Okay, I mean, people do
use brushes, or, like... this one of our guys has
a screwdriver that he uses to turn stuff

Speaker 2 (33:18):
Over like that today everybody tool.

Speaker 4 (33:20):
Yeah, that's just what they tend to use.

Speaker 5 (33:22):
I just like using my fingers. Like, I just like
to turn rocks over and, you know, see if I
can get the dirt off them and see if it's an
animal, or, you know, you can see the spongy
bone, and it's nice. That's how I like to
do it personally. But once I started finding them, my
brain starts saying, that's what it looks like, and so on.
And what's cool is that the fossils are different colors
in different places. So we have sites where they're kind
of this beautiful pale pink, and others where it's this

(33:43):
obsidian-like shiny black color, and then another place still
where it's sort of this sulphuric yellow, and your brain
starts to be able to be like, okay, you know,
I'm at Chamtwara, here's what the fossils look like. And
by, you know, the third week I'm finding... I mean,
I'm finding fossils, probably not with the best of them,
but I was certainly one of the best Western fossil
collectors out there. The technology in paleontology is just wonderfully

(34:06):
low-tech when you're finding the fossils, and then once
they're found, now we're talking about taking a thousand slices
in a micro-CT scanner of a bone to get
the perfect curvature to compare it to the curvature that
you see in a chimpanzee thigh bone, or femur, versus
a human. And then it's like, we're doing geometric morphometrics
and we're using every statistical test in the book, and

(34:27):
it is truly cutting-edge, but until that point, it's
crawling around in the dirt.

Speaker 2 (34:31):
Then Erica comes back to the stage.

Speaker 3 (34:33):
She's got bruises and slices all over, calluses, and she's
all sunburned.

Speaker 4 (34:38):
And were you at the airports?

Speaker 2 (34:41):
What do I look like? Some of the worst sunburns I've
ever gotten were from fossil prospecting and stuff like that.
We were out in West Oklahoma last year, way
out, like, right in the corner of the Panhandle,
in these areas where it's, like, the oil fields and stuff,
these massive, wide-open spaces of nothing, and
the hills there are just muddy, gross, awful, and you just kind of

(35:04):
look to where the water channels are, where the water's
cutting down through the hill, and you just look on
the ground right around there, and there's some vaguely
white rocks, and there's some vaguely white rocks, but
that's a particularly interesting white rock, and
you turn that over and, sure enough, that's a piece of bone.
And then you walk up the channel there, where the water,
just kind of scanning around until you find the rest
of that little bone fragment sticking out of the ground.

(35:24):
And then you start picking away right there and maybe
you get some more bone. Maybe it's just bone mush
that's been squished into mud for a few million years,
and it doesn't matter anymore. Sometimes it's something interesting. You
can say, I see a piece of a skull in there.
And you come away from there and your back is
just like a lobster, and your hands are also red,
but from the red dirt of Oklahoma, not from the sun,

(35:46):
and you're just miserable and sweaty. But you have a
really interesting, like, shaft of a bone. Now, that means
almost nothing because it's not the end of a bone;
you can tell it's a mammal, but that's about it.
And you go home happy but not happy enough, and
you go back to that area later on to try
to find either a bone end or a tooth, because
those are the diagnostic things that can really tell you something.

Speaker 5 (36:05):
Let me tell you the most fun fossil I ever found.
Actually, my favorite fossil I ever found was
the Rangwapithecus with the biggest molar. And I found a
couple of Miocene ape teeth, which, since that's what I study,
that's what I like. And I have at least two
canines to my name, which is amazing, because I'm hoping
to include them in my dissertation, and then I can
be like, I'm citing myself because I found them,
which is amazing, although that's not quite how it works.

(36:26):
But there's one... we were, I think, at Legetet
Eleven. So we're at this site in West Kenya where...
it's where you find... I don't know why I'm looking
at you like, you're like...

Speaker 4 (36:33):
Oh, yes, of course, I love my I guess.

Speaker 5 (36:35):
And... my favorite. Yeah, yeah. So,
to explain the situation: we were prospecting the site because
it was about to be destroyed by the local town
for limestone mining, as tends to be the case, so
they've got the excavators out there. I mean, it's like
the cartoon where, you know, you're chaining yourself to
the tree and, like, you can hear the excavator in
the background getting ready to come and destroy it. So

(36:55):
we were doing this last-minute search, hoping to find
anything to justify telling these people... and it's, you know,
fair, right? Please...

Speaker 4 (37:04):
Don't bulldoze this fossil site.

Speaker 5 (37:05):
But also, this is where they live and they should
be able to use the resources around them, so it's
like a double-edged sword. So the question was whether we were
gonna find anything. And this is... I'm very much
resonating with the spiky hills, because this is, like, dark,
really jagged... probably, I would say, forty feet
of, like, sort of debris heading up the side

(37:26):
of this gigantic cliff, sort of like a cliff, and
there are these hideous nettles growing everywhere. They get, you
know, in your skin, and there's biting ants, and it
sucks and it's horrible. And we found nothing all day,
and everybody was having a bad time. So
I'm standing with one of my friends and I was like,
you know, I want to be like...

Speaker 2 (37:43):
Am I allowed to swear on this? Yeah, please.

Speaker 4 (37:47):
I'm like, fuck this place. I hate it here.

Speaker 5 (37:50):
And I kicked the side of the wall, and some
of this debris crumbles, and, like a movie...

Speaker 4 (37:55):
I kid you not. I look up and I'm like,
what is that?

Speaker 5 (37:57):
Because I see the shine of the enamel, and I
reach in and I pull it out, and it's a partial mandible,
so a partial jawbone that's about six or seven inches long,
in full dentition.

Speaker 4 (38:07):
All the teeth are in place.

Speaker 5 (38:08):
And it belongs to a member of Bathyergoides, which
is a gigantic naked mole rat species that lived during
this time. So imagine a naked mole rat, but
the size of it is such that its mandible, half its
jaw, is seven inches long.

Speaker 4 (38:23):
I mean, we're talking about... we're talking about a
naked mole rat?

Speaker 5 (38:26):
Yeah, it's a naked mole rat that is the size
of, like, a Jack Russell terrier.

Speaker 4 (38:30):
They say, there you go, and I pull it out,
and I was like, what is this?

Speaker 5 (38:35):
Because we found a million of these little horrible rodents
called Diamantomys, and you find them everywhere and they suck
and they're not useful at all. And I was like,
I know this is a rodent because it's got the
incisors in the front, but...

Speaker 4 (38:44):
What kind of rodent is it.

Speaker 5 (38:45):
And my friend Abby walks over and she was like,
holy shit, that is the biggest Bathyergoides mandible I've
ever seen.

Speaker 4 (38:49):
And I was like it was the first.

Speaker 5 (38:51):
Like I told you, these sites are biased towards little things,
so it's amazing to find, like, a beautiful hemimandible.

Speaker 4 (38:57):
Like, it was just... it was wonderful. The best fossil
I've ever found.

Speaker 2 (39:00):
We were out at that same site in West Oklahoma,
and we were out there for days and days and days,
and, like, the last year we'd been pulling up handfuls
of bone, no problem, because we're looking for horse
and camel fossils, because of course horses and camels
evolved in North America. And we're out there digging
in the same spot, we have a new group of
students coming out with us, and, like, we're just finding
dick-all, which sucks so much, and every now and then
(39:23):
they're like, hey, is this bone? We're like, yeah, that's
a piece of bone, and it's just a chunk the
size of a quarter. That doesn't mean anything. So throw
it on the ground; you can keep it as a
souvenir if you want, it doesn't mean anything, no one's gonna
look for it. And we're just kind of digging around,
and, like, on one of the last days we're there,
this one person, who wasn't even a paleontology person, wasn't
even a biology person... she was an English student who

(39:43):
just came along for the vibes and just wanted to
kind of hang out and see what that was all about.
And she comes down the mountain like, what's this? And
she's got this chunk of rock that's about twice the
size of her hand. It's a fucking gomphothere tooth. Amazing,
she found a gomphothere. It's a proboscidean...
imagine an elephant and you'll have the idea.
It's a fucking elephant relative from the goddamn Miocene that

(40:06):
evolved in Africa, then spread all around Eurasia and moved
its way to North America. And so there's this massive
Oklahoma elephant that has never been described in this region,
and now her name is going to be all over
this paper. We sat there for the rest of the
day while all the paleontologists who were there supervising were
like, hey, everybody, fuck off, we're gonna be here, we're
gonna dig this thing up.

Speaker 3 (40:25):
I'm still stuck on camels and elephants in Oklahoma.

Speaker 2 (40:28):
In the same story. And horses too, because you think
about horses already as something in Oklahoma, but the horses
that are in Oklahoma now are not Oklahoma horses. They're
European horses that were brought here. Yeah, and they are
largely feral, which is incredibly sad. And so, like,
you have horses and camels evolving here,
elephants migrating, or elephant-like things migrating here. And, you know,

(40:52):
go back to, you know, the Eocene, all
the way into even the Pleistocene... this is a
completely different landscape, completely different fauna crawling around the area.
And we murdered them, and then, you know,
the Holocene starts, the climate changes massively, the last
of them die off, and all you have left is
some of us. And nobody remembers the time when we

(41:13):
were going up against terror birds and giant wombats and
huge armadillo-like creatures and all sorts... the short-faced
bear, all the other incredible megafauna that were all over.

Speaker 3 (41:25):
My imagination is now loaded with images of these amazing
slash horrifying type things.

Speaker 2 (41:33):
They're so cool, They're so cool.

Speaker 3 (41:34):
I'm going to type that into ChatGPT and have
it generate one, so, yes.

Speaker 2 (41:38):
Please do. It will be wrong.

Speaker 3 (41:42):
Well, then it will become a reference point for somebody's
college paper, and then it will go on Wikipedia.

Speaker 2 (41:47):
What's... tell me the truth about all these ancient creatures!
I must consult the Monad, and it will tell me
everything I need to know.

Speaker 3 (41:53):
That must bring us full circle, because Forrest is on
stage giving a presentation today about... what? Biology in general?

Speaker 2 (42:01):
Just how fucking cool biology is. So essentially, your work,
this is what you do? Yeah, I study biology too much
and then I regurgitate it in front of strangers.

Speaker 3 (42:10):
That job... Erica knocked it out of the park this morning.
I'm sure you will as well. I'm going to let
Forrest run down and do a tech check before he speaks
at four o'clock. Any final thoughts? Do you have any
hope about this sort of mix of humanity, nature, technology,
or whatnot?

Speaker 5 (42:25):
Let me leave you with this, right? The creationists
at Answers in Genesis have their own AI. It is a safe
AI for creationists to use.

Speaker 2 (42:33):
That's exceedingly funny.

Speaker 3 (42:34):
I did not know that. Do you know what it sounds
like? Those Internet filters, where they're like, we're
going to go filter out anything that bothers us. Almost
like safe search.

Speaker 4 (42:44):
Yeah, it's a safe space for them.

Speaker 3 (42:46):
So what do you mean? Like, if I typed in
something that contradicts the Bible, their AI will refuse to
deal with that?

Speaker 5 (42:52):
If you said to their AI, if you were like,
when did dinosaurs go extinct, it would say forty-four
hundred years ago, approximately, when the global flood wiped most
of them off the planet, and then those that were
taken on Noah's Ark did not survive to repopulate. Why
Paraceratherium did and sauropods didn't...

Speaker 4 (43:12):
We don't know. It won't say that part, but it...

Speaker 2 (43:13):
Sure were you there. You were there to know. But
also this fucking Bronze Ade sex manual is there to know?
And so what are we gonna say? And the dude
I djusted a video about them, and like their dragon
shit has only gotten crazier.

Speaker 4 (43:26):
How is that possible? It was already absolutely bad.

Speaker 2 (43:30):
Wait wait wait wait, dragons in the Bible?

Speaker 4 (43:31):
Yeah, dragons everywhere. Everywhere, everywhere else.

Speaker 2 (43:34):
Yes, they believe that dinosaur and dragon are synonymous terms,
and that after the flood there were still some dinosaurs
left, and those were dragons, and that all the dragon-slaying
stories, knights in shining armor, those are all true
people slaying dragons. And also, unlike every other dinosaur, they
could fly and breathe fire. But fuck you for asking questions. Yeah,
and every single example of any story anywhere in the

(43:55):
world that mentions the word dragon is absolutely credible and real.
And therefore it was a dinosaur that was around at
that time. And the one that I was looking at
was it's actually the most viewed YouTube short from Answers
and Genesis.

Speaker 4 (44:07):
Is this the Canada one or the regular one?

Speaker 2 (44:09):
And it's just this guy talking about all
these dragon stories around the world. And he starts with
Saint George, who, you know, is this Christian... Baryonyx? Yes, yes, yes, yeah,
that one. And it's, Saint George killed this
dragon back in three-eighty, and that means
it must have been a real dragon. And
there's this dinosaur called Baryonyx, which is a massive spinosaur-

(44:31):
looking motherfucker that lived in southern England. And so clearly
this Englishman that killed the dragon... never mind
that the story takes place in fucking Libya. And so,
like, they just literally were like, here is English Christian.

Speaker 4 (44:43):
I didn't, and.

Speaker 2 (44:46):
It's like, here's this... it's a Christian English story,
so we need a Christian dinosaur to fill in the gap.
And they didn't even bother to read the goddamn story.
And it's just... I have to stop there,
and I hate that I have to.

Speaker 3 (44:59):
I could talk for another eight hours and fifteen minutes
about dragons, but I'm gonna let you grab your laptop
and do your thing. Erica, Forrest, I'm gonna put links
to your channels in the description box.

Speaker 2 (45:09):
So good to hang. Thanks for chatting with me, you
got it. Thank you for having us here in your
hotel room. It wasn't creepy at all.

Speaker 5 (45:14):
I'm so glad that Forrest has to go give his
presentation now and I'm done and he's not done, and you.

Speaker 2 (45:19):
Need to go. Oh shit, I gotta follow. We're gonna
be sorry, I have to follow. Oh that's good, I see,
you know.

Speaker 3 (45:25):
And mine's gonna be... he uses the music and he's out
there emoting.

Speaker 4 (45:30):
When you accidentally said that you had songs, I was like,
oh no. I'm not... I'm not down.

Speaker 2 (45:37):
He did Sagan.

Speaker 3 (45:39):
When I went up after him last time, he's quoting...
he's doing the Pale Blue Dot.

Speaker 2 (45:44):
I did that speech at the Secular Student Alliance conference,
and fucking Sasha Sagan was there in the audience
listening to me. Oh... So she liked it, and she
was very cool, and she came up and talked to
me afterwards. I was so mortified. Like, I just quoted
your dad to you. I'm sure that's not weird.

Speaker 3 (46:01):
No, no, but you're gonna kill it thanks to both
of you. Let's do it again. Okay Yeah.

Speaker 1 (46:05):
Follow the Thinking Atheist on Facebook and Twitter for a
complete archive of podcasts and videos, products like mugs and
t shirts featuring the Thinking Atheist logo, links to atheist
pages and resources, and details on upcoming free thought events
and conventions. Log onto our website, The Thinkingatheist dot com.