
August 26, 2025 • 23 mins

A thought-provoking conversation about something every tech leader needs to hear: the hidden biases around AI adoption in coding.

We dove deep into some uncomfortable truths:

  • Why women and older engineers face unfair judgment when using AI tools
  • How fear of "looking incompetent" is holding back entire teams
  • The real impact on job security fears among developers
  • Why cultural resistance to AI is creating workplace inequities


The most striking insight? The same AI tool that makes one developer "efficient and forward-thinking" can make another seem "less capable" - and those perceptions often break down along predictable demographic lines.

If you're leading engineering teams or navigating AI adoption in your organization, this conversation will challenge how you think about competence, bias, and the future of development work.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Well, that's really messed up. And what I find scary about this is that if I'm running a startup and these AI tools are available, we all know there's an AI arms race going on. The companies that can implement AI most effectively are gonna win. It's going to be a winner-take-all type of scenario, and here you've got these biases working against using the very tools that create productivity.

(00:22):
Very counterintuitive.
Let's get it rolling. Big ideas, money, hustle, smart dreams, turn that grind into a joyride. Let's get it rolling.

(00:44):
Big ideas, money, hustle.
Welcome to the program. Hey, Chris.
Hey. I love the theme music.
Yeah, it's much better when we sing it.
So what's going on? Alright, so today I want to talk about, OK, so think about AI, right? And I've kinda had this feeling

(01:04):
for a while in the back of my mind, and then I read an article recently from HBR and I was like, oh, they put their finger on it. OK, and it's like this: if I use AI, will everybody think I'm dumb? OK. And especially as a coder.
They already think you're dumb. Can it get any

(01:24):
worse? If they already think I'm at zero, can I go negative smart, like minus one?
Yeah, exactly, right. So, right, a really good article from HBR on this topic. They're talking about... I just wanna set it up a little bit. So people have this feeling, right? So people think that you're dumb for using AI?
Right. Like, let's say you're a weak

(01:44):
employee. OK, alright, let's set the stage. OK, we're a software company. OK.
Why don't we say, like, underperforming? Is that what you're trying to say?
Sure, like, generally you're new, or, OK, you're just not as, you know, confident, you don't have the skills.
Really? Sure. OK. You know, like, think about employees and their skill sets. It's a spectrum.

(02:05):
Some people are gonna be more proficient. Weird to think about, but yeah, like, obviously people are not as strong necessarily, like there are people who are better and worse at their jobs for sure, but I usually think of it as pretty tight. I don't think of people as really weak performers; if you're really weak performing, usually you're in a new seat or something, or you're fired.
Yeah, right. Yeah, absolutely. OK, so think like software, we'll use software development. I think it would work equally well with, like, a sales department.
Yeah, sure. OK, so you're in a

(02:28):
software company and you are writing code, you are putting in your pull requests, you're writing so many lines per week or whatever your sprints might be. And you're the person, it's no mystery, who is using AI heavily to help you. OK. And what they put their finger on was that there's a perception

(02:49):
that the people using AI aren't as intelligent or aren't as competent. So, and I think that is one of my hesitations sometimes when I'm gonna be coding stuff and I'm thinking, well, maybe I could use AI to help me, but it's almost like, oh, maybe I shouldn't, because I don't want to be one of those dumb coders who can't code, you know.

(03:11):
Actually, you know, on a related, semi-related point, so my most successful LinkedIn post, which I know everyone here is very closely following my LinkedIn, was on kind of, like, vibe coding. And obviously vibe coding being, like, you use a tool like, um, give me a, what's that one, Lovable,

(03:32):
Lovable, that you use? And it codes the thing for you, so to speak. It doesn't always work great, but it works. And a lot of the responses, like, it was a big debate on this, like, the comments were crazy back and forth, like really intense. I think there's some people who are defending their livelihoods, on the one hand, you know, I mean, obviously the people who are, like, software developers, and they made a lot of really good points too. But that was the most aggressive

(03:53):
thing that I've seen in response to one of my posts. And I think it was really just because there was definitely a looking-down-your-nose at the people who are vibe coding without a software background, right, from the people with the software background. Some of it warranted, potentially, some of it not. But I thought it was really fascinating. I think it kind of is what you're talking about here, this manifestation of, well, you're

(04:13):
dumber for using it, which I think is really counterintuitive.
Yeah, you know, so I just want to read a sentence from my notes here. So despite the tools' promise to boost productivity, only 41% of engineers had tried AI coding assistants within the past 12 months, with particularly low adoption among female engineers and engineers over 40. So you're female or you're over 40.

(04:38):
You don't want to use AI because people will think you are dumb.
Over 40, I get why. Well, because you feel more experienced, you feel like, I think, I mean, I'm not over 40, but I imagine when you're well over 40, as you are, that you should start to feel like, you know, you have a lot of experience to build on, and maybe you feel like this tool isn't really going to

(04:59):
help you. I think also those people are doing more complex coding things, like more complex problems, more challenges. Younger coders will benefit. It'll make up for the skills gap or experience gap that they have with their older counterparts. I definitely know when I was coding, this was pre-AI, like when I was doing heavy coding, AI wasn't really a thing that could help you.

(05:20):
There was code completion and stuff like that, but it was pretty weak. It wasn't anything near what we have now. And I know myself, I could get into a real flow where the code was just pouring out of my fingers and I didn't even have to think about it, right? And I could see that there are maybe some experienced developers that can just get themselves into that mode and, like, why would I...? Yeah, like, it actually might

(05:41):
hurt more. It mostly would get in the way of the process. So why, why the women piece? Yeah, that's what I'm kind of... male engineers... OK, so let me add another thing. Obviously we're not women, and there's probably some women listening saying, like, I know exactly why, but if you have data on that I'd be interested.
OK, so the output being the same, people using AI tools, people not using AI tools, OK, so compensated for

(06:03):
that, OK. The results showed that when reviewers believed that an engineer had used AI, they rated the engineer's competence 9% lower on average, despite reviewing identical work.
Oh, that's interesting. OK.
So there was, like, a penalty, an intelligence penalty almost, for using it. If that pull request is flagged that AI was used to help do it, versus a

(06:26):
pull request where there was no AI, the reviewer felt on average, excuse me, that the developer using AI was 9% less competent.
Huh. OK, that's interesting.
Right. I think that would probably also, for what it's worth, and I think you kind of alluded to this, but I think that this would be the same across other industries.

(06:47):
Like, I think about writing. I would imagine if an editor knew that you used AI, versus an editor who thinks you didn't use AI, they would rate your work a lot higher.
When you wrote your book?
No, I didn't, actually, no. It was kind of, a little bit, it was like a little too early maybe. I mean, I probably, I certainly could have, yeah. But you know, I never would have. I'll talk about, talk about a different piece about this

(07:08):
later, but I think it wasn't gonna be super helpful, considering that I already had, like, I think I had like nine editors for our book. Yeah. So it was kind of like, do I really need another layer? Also me doing a lot of the editing too, you, friends, whatever. So yeah.
So I guess you're 9% more competent as a...

(07:28):
9% more competent as a startup book writer, yeah.
Yeah, so, and then this other piece being this gender and age bias, OK, which amplifies the problem. So women using AI face nearly twice the reputational damage as men. Wow. Female engineers were rated 13% less competent when using AI assistance, compared to 6% for men, and male engineers who

(07:51):
don't use AI themselves rated female AI users as 26% less competent.
Dude, that just sounds like an amplification of already existing, like, systemic bias. Yeah, it's just, like, another reason to hate on someone who's different from you, in my opinion. Like, that's obviously not grounded in any fact at all.
Yeah, that's so

(08:14):
weird to me to think that women would be, like, A, an inferior coder on a one-to-one basis, but B, an inferior coder in the context where they're using this same AI tool. Do you have anything in the data, was there anything like, how was the output, is this the same output?
Yeah. This is absurd. This is absurd. So this is more a story, in my mind, about a systemic bias

(08:37):
against... Yeah, you know, my notes were like, it's a social problem, not a technical one.
Really, a social problem?
Yeah, if the output's the same, then it's just revealing deep bias against women coders.
Yeah, which is not where I expected this episode to go, but that is kind of, that's really messed up. And what I find scary about this is that if I'm running a startup and these AI tools are

(09:01):
available, yeah, we all know there's an AI arms race going on. The companies that can implement AI most effectively are gonna win. It's gonna be a winner-take-all type of scenario, and here you've got these biases working against using these tools which create productivity. Very counterintuitive.
Yeah, I think a lot of this, OK, I don't know, the sex bias there is very much, like, just

(09:27):
absurd, is where I'm going to land on that, like, definitely a societal issue that should be solved by society. But in terms of, like, productivity, that is kind of fascinating, that there's a general resistance. I think there's a resistance, though, in coding, going back to my LinkedIn post, on fear of losing your livelihood, right? I think that a lot of people feel like if they use that tool

(09:49):
and then that tool gets better at what they do, that it's coming for their job. And frankly, it is. I think about, no, we talked about this in one of our last episodes, the interview on our friend Lenny's podcast with the Anthropic cofounder, Ben, whatever his name is. And he was like, you know, it's like, Claude Code... terrible,

(10:11):
I just respect what they're trying to do, but it's really hard to say... it was already, like, 95% of their code.
Yeah. So, you know, if you hear that as a software developer, I think, you know, your ears perk up, I think your back shoots up, you're alert to: this is a problem.
It's one thing when you are an AI company, and of course you're going to believe in your own technology and use your own technology. So they're going to be

(10:32):
the most... they have to be forward-adopting. If they weren't, I'd be very skeptical.
Right. Yeah, OK. So this is interesting too, because, you know, what you'd think would be really important, I think one of the things that's really important going forward in our society, is that we need to be flagging stuff as AI-generated, or generated using assistance from AI, so this...
Flags on YouTube and stuff.

(10:54):
Yeah, we do. There's like this transparency piece about whether AI was used. It's optional, like, people have to flag it.
But isn't it interesting that it backfires in this scenario? Knowing that this pull request was generated with AI assistance, or, you know, entirely by AI, then penalizes, yeah, the person who did it.
You know, a segue here for a

(11:17):
second. One of the websites I like to go on, as I think you know, is Imgur. So, like, it was probably Reddit a million years ago, I think, and now it's owned by some other company and they're doing some terrible stuff. But anyway, what's interesting about Imgur is there's a really visceral hatred of AI.
OK, really?
And so there's two thoughts on this. One is, I think Imgur

(11:39):
just happens to be a very left-leaning website.
I don't really know why, but a lot of the community just seems
to be pretty lefty, which is fine, no issue with that.
It's interesting. So there's that, but then
there's also, I think, a big part of that community is around, like, creators. And I think creators also have a really profound fear of AI for the same reason, right? I remember, probably about six

(12:01):
months ago, when you were seeing a lot of posts that were suddenly AI-generated images, for instance, right? You would see the first comment would be this kind of, like, stamp thing that says '100% AI bullshit.' And people would basically, people just go around and do this.
Really?
And so I wonder if there's something like that playing into it there as well, that, like, you

(12:21):
don't like... On the one hand, we should signal for transparency, but it's definitely punished if it's identified as AI.
I wonder, if people put on their post, 'I made this with AI,' would that be better or worse?
I think they'd still get the stamp. I think people still get the stamp and they're still just downvoted. The way Imgur works is you upvote or downvote. OK, it used to be called the front page of the Internet, so whatever is on the front page was upvoted the most by the people in user sub.

(12:45):
And so anyway, I think they would just get buried, is sort of the nuts and bolts of it. So it's kind of, I think there's a lot of hype around AI, and part of it feels like the counterculture, this, like, visceral fear that this is gonna replace that job, what you do, even if you're just somebody who posts funny pictures on Imgur. They make money doing that somehow.

(13:06):
You don't want some, like, bot out there that's able to do way better stuff and just, like, out-post you. And, you know, easily, right? But that is kind of interesting, that in a... so, OK.
Tired of business authors and influencers who have never had a successful business? Me too. I sold my company for $40

(13:28):
million and I want the same success for you. Check out Startup Different on Amazon, Audible, and Kindle.
So let's put this into a situation that's relevant to people watching. So you have a startup or side hustle or whatever, maybe you've got, like, an employee or two. How do you think about these AI tools?

(13:50):
You know, actually, there's a wide range of, like, kinds of businesses, like different contexts. But to me, it would be really surprising if, especially as a small company founder, you're not incredibly motivated to have people use these AI tools as much as possible. And yet you have to be aware that there's kind of this negativity associated with that. So I would, as, like, a

(14:13):
software development manager, I would be very much in favor of these, because we definitely have a backlog that we cannot mow through fast enough, right? I would be encouraging this everywhere I could. I would have to be aware of this bias, and when I think about the personalities in our development team, and our former development team, this is a real thing.

(14:35):
I can think about which people would be on board to use the AI tools and which people would be, like, immediately resistant: 'you're an idiot.'
I think, I think we know exactly who this is.
Yeah, exactly. And so that would be interesting to see. So perhaps this comes down to a fundamental problem around, like, a comfort level, an

(14:55):
education, and, like, job security of middle management, because I wonder if that's a lot of this too. So the people who are being evaluated, and the code, that evaluation is basically internal. There's no customer who's going to look down on you for making a product more quickly. And you're not going to be like, oh, we did it all with AI, unless they ask and you want to be transparent, but you know
(15:16):
unless they ask and you want to be transparent, but you know
what I mean? Like, I wonder if to some extent, too, it's like, so you're a middle manager, you're a coder that's been... a software development manager at a big company, you're 45 years old, you've been doing it for a long time. You have a lot of respect for the process and, you know, paying your dues. Then you have a 25-year-old coder, who is perhaps a woman in

(15:39):
some cases, who generates really good code or really good finished product. You're going to hate on them because that is threatening.
Maybe, yeah. You know what I mean? Like, considering the incentives there, in a lot of these situations where that data is going to be pulled, those people don't want that.
That's... let me make a comment. Do you remember the first developer we hired to help with AppArmor? Navdy?
Oh yeah, I remember.

(16:00):
I remember having a conversation with Navdy, and he was explaining to me that he could either code it fast or he could code it with high quality, but you traded one for the other. A zero-sum game.
I remember that. OK.
And so I drew a chart, kind of doing like a two-by-two: high quality, low quality, high productivity, low productivity. And I said, yeah, so you're

(16:22):
arguing it's like this, right? Yeah, you gotta go along this linear, declining line, you give up, yeah, productivity for quality or vice versa. OK. And I said to him, I said, yeah, but as you develop things, you gain experience, and that line, like, shifts out, OK? So you can actually get higher productivity and higher quality

(16:42):
because you have experience, OK. And I think AI is the thing that actually can help you do that as well. Like, this is a second bump.
Or it compensates earlier on, like... So if you imagine the chart, instead of going straight down, it's kind of like a little dip, and then, kind of, like, they won't be as good as someone who's experienced right now. Maybe it is, I don't know. I haven't used Claude Code. But that's an interesting take

(17:06):
too. Yeah. But I do think it comes back to this incentive thing. It comes back to people just being...
I think as a manager you're gonna have to work to change those perceptions, because I want high quality and speed, I think.
They just don't want to lose their job.
I think it's, I think it's not, it's not just that.
I struggle with that, because I remember our product backlog at one point had like 5,000 items

(17:28):
on it, and we actually cleaned it up and got it down to 1,000, like, legit items that we had to do. And I think we were knocking down maybe like 30 a sprint, you know? So, like, we had weeks and weeks and weeks of work, and we were turning down so much stuff that we could have been doing. Like, so I don't understand the 'I'm going to lose

(17:50):
my job' piece. OK, at least not yet.
You're looking at it from the CTO, cofounder perspective, right? Right. Which is not wrong. Like, obviously productivity is good, and more efficiency, more spend efficiency, more profit per employee, all these important metrics. Yeah, but consider, like, I really think if you're in a

(18:13):
bigger company, 250 people, you're a middle manager, you're one of, like, ten middle managers of a big development team. You can see how this would get threatening quickly. I think that maybe, like, let's just change the metaphor for a second. So, like, let's say you were a student at a school, and just leave AI out for a second, let's pretend AI doesn't exist. You write a

(18:35):
freaking awesome essay. Yeah, the best essay on whatever. It's so good that it's threatening to the TA that's marking it, right? That TA will be hard on you. Do you know what I mean? Like, there's something about, like, fear of being one-upped. And I think in bigger organizations, like, even if it's not just 'I'm going to lose my job,' even if it's like, this

(18:56):
person's making code so well and being so productive that they could have my job, or be promoted faster than me, or be seen as a rising star. People are weird. Big organizations are political. Yeah. And I think that that's also a big part of this. And I think that definitely you've also identified a systemic problem around women in the workplace there too,

(19:20):
and certainly them being promoted over what is probably a man in a lot of these cases.
So let's give the listener some takeaways here. OK, we understand the issue. OK, you use AI tools, you're perceived as dumb. You know, there are issues around the bias, gender biases, or, you know, older people using AI are perceived as less

(19:43):
competent. This is a big issue that we need to be aware of, but, like, how do we action it? What are the takeaways?
As a manager, I think the first thing is you have to be aware of this. Share this, OK: hey, just so you know, data actually shows that we're gonna think people using AI are, like, 9% less competent. That's just, like, being

(20:04):
transparent. Let's just be aware of that, OK, that there's gonna be that built-in bias, OK? But let's actually now go back to the data and say, like, look, how can we use this responsibly and make ourselves more productive, where we are chewing through that backlog? And maybe part of it is what you talked about, which is eliminating any risk of anybody, like, losing their job.
You know what I mean? I think a lot of the time it's you gotta

(20:26):
identify the people, make sure they have the security that they desire, yeah, in order to create, effectively, like, a safe environment for this kind of stuff.
Yeah. The word sabotage, didn't it come from, I think I got this from a Star Trek movie or something like that, where sabotage is, like, workers

(20:48):
who didn't want machines to take their jobs, took their wooden shoes off, which were called sabots, and threw them into the machine. Right. Yeah, yeah.
It was, like, a really... wow, a Star Trek reference. But is it kind of that that's happening? Is this the sabotage that's going on? I'm going to sabotage by perceiving that this person is less...
There's probably also some element of, like, look, I paid my

(21:10):
dues, I'm 45 years old, I worked really hard at all this stuff. I didn't have AI and I got here, and then look at you, you're able to use an AI tool and be almost as functional as me in, like, one-fifth of the time.
I will say that I hate that as well. I've been in that category, right? For, like, some jackass who can't code goes on to Lovable or something like that, like, 'I can build apps,' and, like...

(21:32):
It's true, like, it's a thing. It's like, OK, well, actually do it. Like, actually do it, put it out there, support it, have it break and then try and fix the issue. Like, there's more to it than that. But I hate that these people, who are always, like, maybe fancy themselves as coders, always thought that maybe they could take up coding and be a really good coder, but then leverage, really, an eight-week course or AI, and then, like, 'I couldn't do it before, but with

(21:55):
AI I did it.' I'm like, yeah, I hear that, guy, right?
Yeah, I mean, it seems like you've got an issue with it, Chris.
I know, I know. I can see that piece of it, but I don't like that.
I think a lot of it is... I honestly think more of this is about job protection and security and paying dues and, like, social norms, societal norms, yeah, as opposed to an actual functional issue with AI.
So bottom line, it's not a

(22:16):
technology problem, it's a cultural one. And I think that leadership has to set the culture on this sort of stuff and be aware of these biases and get ahead of them and action them that way. Because, like, frankly, I want those productivity gains. I don't want to be the last organization, the last software development company of its size, to start using AI to help code.
For sure. Because you're out of business, you're

(22:38):
gone.
I think so too. I think so too. I remember, I remember when I was a kid, maybe just driving this home, seeing, like, an ad on TV for, like, IBM, and it was like, 'any business not on the Internet is not in business.' And this was like in 2004 or 2002 or something, probably right around the dot-com bust. And I was like, as a kid, I was like, whoa, if I ever start a business, I better make sure I'm on there. And so I thought about that, and, you know, I kind of, the AI

(23:01):
thing feels, I wasn't, like, working then, obviously, it was really small. But it feels kind of like that at times, where, at least for people who are closely watching this, the winners in any market are going to be the ones that are best at using these tools. And yeah, fighting these kinds of biases will be super important going forward.
So cool, man.

(23:22):
That's interesting. Yes. Alright, well, we'll see you next week, folks. We looked at the wrong camera there for a second. Good luck. God bless. See you next week, hey?
Let's get it rolling. Big ideas, money, hustle, smart dreams, turn that grind into a joyride.