
November 16, 2025 53 mins

In this week’s episode of Brown Ambition, Mandi Money sits down with author C.J. Farley to unpack his new book Who Knows You by Heart, a story that hits right at the intersection of ambition, identity, and the harsh realities of working in big tech.

Farley takes us behind the curtain of corporate life through the journey of Octavia, a young Black woman navigating discrimination, mounting debt, and the emotional toll of trying to thrive in an industry where representation is still painfully lacking. From microaggressions (and yes, “technoaggressions”) to the weaponization of PIPs, this conversation gets real about what it actually means to succeed while marginalized.

Mandi and Farley also dig into the importance of mentorship, building community, and how responsible AI is reshaping creativity and career paths. And—because we’re all doing our best to stay sane while securing the bag—the episode highlights why self-care isn’t optional… it’s survival.

Whether you’re scaling the corporate ladder, considering a pivot, or searching for a network that gets it, this episode offers both heart and hard-earned wisdom.

 

Key Takeaways

  • Financial & career empowerment remain at the core of Brown Ambition’s mission—this episode brings both.
  • Octavia’s story exposes the realities of big tech, from discrimination to navigating massive student debt.
  • Black talent is still underrepresented across the tech pipeline, highlighting systemic barriers.
  • Mentorship is vital, especially for women of color facing uphill battles at work.
  • Microaggressions and “technoaggressions” continue to shape the everyday experience of marginalized employees.

Go pick up a copy of the book today ---> https://www.amazon.com/Who-Knows-You-Heart-Novel/dp/0063418630?ref_=ast_author_dp

 

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, BA fam, let's be real for a second, and
y'all know I keep it a buck. The job market
has been brutal, no, not brutal, trash, especially for women
of color. Over three hundred thousand of us have disappeared
from the workforce this year alone, and not by choice,
but because of layoffs, disappearing DEI programs, and stagnant wages

(00:20):
that keep cutting us out of opportunity. Our unemployment rate
has jumped to over seven percent, while our pay gap
continues to widen. I know all of that sounds dire,
but here's what I want y'all to know. You do
not have to wait for the system to save you.
That's exactly why I created the Mandi Money Makers group
coaching community. It is a coaching community that is built

(00:41):
for us by us. Inside the community, we're not just
talking about how to negotiate or how to get
the job that you want. It's about finding purpose in
your career. It's about finding community in others, feeling seen,
feeling heard, and also having a sounding board and a
mirror to reflect your own magic, your own sparkle right

(01:03):
back to yourself. In this community, you'll get group coaching
led by me, but you also get peer-to-peer
accountability with proven tools and resources that can help you
do what we have always done, which is rise. Even when
the odds are stacked against us, despite all the challenges,
we will rise. If you're interested in joining the Mandi

(01:24):
Money Makers community and having that support to bolster you
and help you tap back into your magic so that
you can lead your career with intention and heart and
your own intuition, trusting that, again, please join us. You
can find information in the show notes of today's episode
or go to mandimoney dot com slash community. That's Mandi,

(01:47):
M A N D I, mandimoney dot com slash community.
I would love to see y'all there. Enrollment is open,
so please go check out mandimoney dot com slash community today. Hey, hey,
BA fam, welcome back to Brown Ambition. I am your host,

(02:08):
Mandi Woodruff-Santos aka Mandi Money, and today I am
joined by the remarkable author of a brand new book.
I'm going to hold it up so if you're watching
on YouTube, you can get a picture so you know
what you need to look for on Barnes and Noble
shelves or your local independent bookstore. The book is called
Who Knows You by Heart. It's written by C. J. Farley,

(02:29):
And with this novel, again, Who Knows You by Heart,
which is going to hit shelves November eleventh, it's
already turning heads, y'all. It's getting tons of buzz as
one of the fall's most relevant and brilliant releases. It
has been hailed by Booklist. It's got stars all over it.
This book is going to be shining, right, just covered
in stars. It's gotten a starred review which called

(02:50):
it a timely masterpiece. But the book delves into a
story that's actually told in the second person and follows
Jamaican American Octavia Crenshaw as she's navigating the minefields of
big tech like so many of us are today, but
not just a career in a space that is still
overwhelmed with gender discrimination, racial discrimination. She's also facing some

(03:13):
real-world challenges like crushing debt and the moral quagmires
of building bias-free AI, all while racing against the
clock and against competitors who are threatening her job and
her dreams. So, C.J., welcome to the show.

Speaker 2 (03:30):
Hey, with that intro, if you can just wrap it
up now, I think we're done. I think you've covered it,
so thank you for having me.

Speaker 3 (03:36):
I'll be on my way now.

Speaker 2 (03:38):
It's a pleasure to talk to you about my new book,
Who Knows You by Heart. I think there are a
lot of relevant issues to surviving and thriving in business today,
and in the tech industry in particular. It's also a
fun book, you know, it's funny, it's romantic, but there
are some serious issues that it explores.

Speaker 3 (03:56):
So happy to talk about it.

Speaker 1 (03:59):
Yeah. So, you know, my first question for you was,
when you were setting out to write this book, when
were you certain, I want this to be told through
the lens of a Black woman working in today's modern,
you know, big tech sphere?

Speaker 2 (04:13):
Well, one reason why I wanted to sort of focus
it on and have a Black female protagonist is, you know,
the numbers of Black women and Black men in big
tech are not good, and they're not getting better. You
sometimes go to the websites of these companies and they
have all these sort of laughing, you know, minorities who

(04:33):
look like they're having fun, and yes, they believe in, you
know, fair recruiting and fair hiring, though less so
these days, and I wanted to sort of go under
the ugly underbelly of all that, and show
what's really going on. And the numbers show that although
the tech industry is booming on the S&P
in terms of trillions of dollars of revenue, the

(04:56):
number of Black people who work in the industry
is far less than our numbers in the population. You know,
we're like thirteen percent of the population in the US,
and we're like less than seven percent of the tech industry.
And that's not really the prime jobs. Sometimes they
prop up those numbers by counting people who aren't really working
as executives, aren't working in skilled jobs, to pump up

(05:16):
their numbers. And so I wanted to show what the
numbers really were like, what the experience is really like,
so people would know what's happening in the tech industry. Because
the tech industry, you know, it's hot right now. It's
a new gold rush, it's a new world where
people are making and breaking fortunes, and Black people should

(05:36):
be involved in it. Some of us are, but even
their accomplishments are being erased or ignored or not widely spread.
And so I wanted this book to sort of shine
a light on it, but also to have some fun too.

Speaker 1 (05:50):
Yeah, you know, Octavia is no joke, like, she is brilliant.
She's a coder herself. And at the beginning of the book,
I think the funniest part of the book
is she's basically created her own chatbot. She's working
in tech. She creates this chatbot to basically do her
job for her, right, it's like creating content for the website.
And she's building this AI, and you know, AIs are

(06:13):
hungry animals. They need data, they need content so that
they can, you know, know how to mimic whatever they're
trying to mimic. And so she's taking all that data
from people's computers, or work computers, not realizing that in
the process she's gobbled up a lot of personal, you know, content,
including some unsavory images and video of the company CEO,

(06:34):
which leads to her getting thrown out on her little booty.
And then she's in the job market. And you know
what I really appreciate about the book is how accurately
I think it reflects the, well, maybe not in
this current economy, it's so hard. She gets a job
relatively quickly, but I think it really accurately reflects what
it's like to be on the outside trying to bang
on the door of big tech, trying to break in.

Speaker 2 (06:57):
And that's exactly what I wanted to capture. I wanted to
capture the feeling of being outside and inside. I also wanted
to capture the experiences of people of color within the
tech industry. And one person that really influenced me was
a real-life, you know, I think, icon in the
tech industry, Dr. Ayanna Howard. I don't know if you've
heard of her, but she's a roboticist. She's dean of

(07:20):
the Ohio State University College of Engineering. She's really a brilliant,
brilliant person. She makes robots, she worked for NASA, she's
an educator, she's a leader. When I worked at Audible,
I worked on an audiobook with her called Sex, Race,
and Robots, And it was about her experiences in the

(07:43):
tech industry, and, uh, working as a sort of
someone who builds robots for, like, you know, prospective
Mars landers and that kind of thing. And she had
some fascinating stories about the things she went through, the
way in which she was sometimes, or often, undervalued and underestimated
by her white male colleagues, but also, you know, the

(08:07):
brilliance she displayed in terms of the innovations and the
businesses she created. And I thought about her a lot
as I wrote this book, because I'd worked with her.
I knew how great she was, and I knew that
her kind of experience and her name were

Speaker 3 (08:22):
not out there. I had to reference her in the book.

Speaker 2 (08:24):
I mean, her name pops up in the book,
and I wanted to make sure people knew about people
like that, that although times are tough, although the numbers
aren't where they should be, in the tech industry, there
are people like Dr. Ayanna Howard who are doing great
things that all of our kids and all of us
can really learn from and be inspired by and maybe,

(08:45):
you know, be inspired to do great things ourselves in tech.

Speaker 1 (08:50):
You're right, that is something that you sprinkled throughout the book,
which I thought was so brilliant, all these, like,
references to these little-known heroes. I think one thing
I learned from your book was that the man who invented
stop lights, or traffic lights, was a Black man. Never
knew that.

Speaker 3 (09:04):
Yeah.

Speaker 2 (09:05):
Also, like, I remember I mentioned Jerry Lawson. He's someone
else that I worked on an audio project about, Jerry
Lawson, and all the kids should know about Jerry Lawson.
All these kids who are playing, like, Fortnite and playing
video games should know about Jerry Lawson. Jerry Lawson is
a Black man who helped lead the team to create
the first interchangeable game cartridge, and that really helped pave

(09:30):
the way for the multi-billion-dollar game industry, because
before him, you know, games, where you'd buy a game,
it would be a whole sort of unit, and you
couldn't, like, sort of pull out the game and put
in another game. His seemingly simple innovation led the way
for people to realize, oh, we can change these out
just like records, just like, you know, just like an eight-track.

Speaker 1 (09:51):
And blow on it, get the dust out. Exactly.

Speaker 2 (09:56):
I mean, you can blame him for kids sitting in
front of the TV, sitting in front of their computers,
and sitting in front of wherever, and playing games, but
you can also credit him for helping make the whole
industry possible.

Speaker 3 (10:09):
And people don't know about Jerry Lawson.

Speaker 2 (10:11):
His name is starting to get more traction, but he's
someone that I want people to know about, to realize, oh,
there are creators who are African American in this world
of tech business, and we can be like them, we
can surpass them, we can stand on the shoulders of
giants like Dr. Ayanna Howard, like Jerry Lawson.
I wanted to make sure their names were name-checked
in this book, even as I explore how awful the

(10:34):
tech industry can be for women and black men and
people of color.

Speaker 1 (10:38):
Yeah, you have had so many professional lives. As I
was reading through your bio, I mean, you're a Harvard graduate.
You have written a best-selling biography about Bob Marley
called Before the Legend: The Rise of Bob Marley. This
is not your first novel, you had another novel before,
but you've also been an editor at the Wall Street Journal.
And you mentioned your career at Audible, and I

(11:00):
think in the clipping I had,
you know, with a little bit of your bio, it
said you were an executive in tech. But the fact
that it was Audible makes so much sense, because in
the book we learn about this. I was kind of
thinking it was similar to, like, a Tesla-type vibe,
like a big tech company, but it's audio. Well,
you describe it, I won't do it a good job. But
it's basically an Audible-type, you know, big company that

(11:23):
is harvesting all this data from customers to, uh, you
take it from there to do, well.

Speaker 3 (11:28):
Yeah. Well.

Speaker 2 (11:29):
In the book, the main character works for, or
gets a job at, a tech company that specializes in
audio entertainment. And the
audio entertainment is, you know, digital entertainment.

Speaker 3 (11:42):
It's streaming.

Speaker 2 (11:43):
And the reason why I did that, and made
this tech book about that kind
of a company, is, to me, a lot of what's
going on in AI today is about the collision of art
and ownership.

Speaker 3 (11:59):
Uh.

Speaker 2 (12:00):
You know, throughout the years, Black people have been just
ripped off by the entertainment industry, particularly in music. You
think of the legend of Blind Lemon Jefferson, a
guy who was so seminal in the field of blues.
Legend has it that he died with his hands still frozen
around the neck of his guitar. I remember when I
was working as a music critic for Time magazine, I

(12:23):
once interviewed TLC, you know, the great R&B group,
and, you know, they did, you know, don't go chasing waterfalls.
I won't sing any of the songs, but you know, Creep,
you know them.

Speaker 3 (12:31):
They're great. And they told me.

Speaker 1 (12:33):
About how you could do a little creep, you can
do a little something, that's right.

Speaker 3 (12:38):
Little dance.

Speaker 2 (12:38):
But they told me, even when they were pumping
out all those hits, they weren't getting any real money,
and their lights were getting turned off, they were having utilities
turned off, they were totally broke. And I remember doing
an interview once with Prince when I was at Time magazine.
He's the only artist I can think of right now
that, when I did the interview, told me, okay, you
can't tape this. You have to write it down. Why

(13:00):
can't I tape it? And he's like, well, because I
don't want you sampling my voice and using it
in digital means in future years. So he
had a vision for what was happening. At first I
thought it was crazy, but now, looking back, I
realize how smart and how prescient he was. He realized
that digital content meant something, that his voice meant something,
and he didn't want it sort of out there in

(13:20):
the world for people to sample and create their own things with.
And we're seeing that happen now in streaming, where artists
can have millions of streams and still not really make
a living wage. We're seeing the way in which AI
is being trained on filmmakers and actors and musicians and

(13:41):
then pumping out art that seems similar, but because it's
in a black box, nobody can sue. There's a recent
court case with Anthropic, which is a leading company in
the generative AI space, and they agreed, subject to
court approval, to pay one point five billion dollars
in a copyright infringement lawsuit that was brought by
a group of authors saying that Anthropic's AIs were trained

(14:04):
off of their books. So this is happening, and we
have to get ahead of it because we don't want
to end up like Blind Lemon Jefferson, with our hands
frozen around the neck of our guitar while all these
AIs are trained on our art, trained on our music,
trained on our culture, and then they sell it back
to us at a premium and we can't afford it.

(14:26):
So now's the time to talk about these issues. And
so the book kind of explores the whole collision of
AI and art and the fact that it's being trained
off our work, and we once again can get
ripped off unless we can pull a Jay-Z slash
Beyoncé slash Rihanna and really take control of the market
and try to profit off of it ourselves.

Speaker 1 (14:47):
Yeah, I think, you know, towards the end of the book,
I don't want to give too much away. I'll try
my best, although I'm the worst. I'll try. But towards
the end of the book, you know, we encounter a
character who's, like, he's a narrator for audiobooks, and he
ends up being one of the casualties of the use
of AI. They're able to take his voice. You know,

(15:07):
he's been doing this work for years, and basically, like,
they don't need him anymore. We can get a bot
to do this now. Good luck to you. And we're
seeing this, we're seeing this happen, you know, as
we speak. And, you know, the fact
that, to your point earlier, when we do show up
in tech, Black and brown people, we tend to be

(15:28):
much lower on the totem pole. I know, I used
to work for a fintech company and they would bring
their diversity numbers out, and I'd be like, well, let's
get rid of the call center and then let's see,
you know, where Black and brown people are showing up.
But we're in, like, a vulnerable, much more vulnerable position
to be replaced, you know, by AI. I think that's
a real fear. Do you think that the book in

(15:50):
a way is addressing that as well? Or, like,
what do you think about the idea that
we'll be pushed out even further?

Speaker 3 (15:58):
Well, I think you're exactly right. And two things are
going on there.

Speaker 2 (16:01):
One, I think that AI is replacing that first rung
that you need to get to in order to climb
up the corporate ladder, because those jobs can be done
by AI. And that's a real problem in terms of
how we get entry into these spaces that we
weren't allowed into anyway. Two, the positive way to
sort of address some of that and address any kind

(16:22):
of success in business is the idea of mentorship. And
that's something that's also a theme of the book.
How you can be tossed into these weird spaces and
not know how to operate, not know, you know, how
to get ahead, not know where all the bodies are
buried, or how you get to that next level,
or how you even do your job. Other communities often

(16:42):
have mentors or sponsors or people that will help you
out and show you the ropes so you can get ahead.
Often, because we're the only people in that space, we're
thrown in there to sink or swim, with no one
to tell us how to operate or work in
a business space, and then we get washed out and
they wonder why. So one thing I try to
develop in the book is the idea of mentorship. Not

(17:04):
just because, you're right, we're often at lower levels. It's
hard for us to find that higher-ranking mentor
who can be someone who supports us and shows
us around and introduces us to the right people. But
we also need to embrace the idea of peer-to-peer mentorship.
People who are maybe on our same level, who have
been there longer, or maybe got there at the same time,
who you can share ideas with, share experiences, share frustrations with,

(17:29):
and rely on each other to get ahead. In
the book, Octavia makes those kinds of connections with other
workers who are at her level, a guy who's
at her level who she ends up having a romantic
relationship with.

Speaker 3 (17:43):
But it's okay because they're on the same level.

Speaker 2 (17:45):
But that's the kind of thing I think we need
to do as workers in the tech
space, or any space, to make sure we have the
support structure we need to get ahead in work and
in life. And even if it's not for career, sometimes
it's just for psychological purposes, so we have someone that
we can vent with and go, like, did you see that

Speaker 3 (18:05):
crazy thing that happened?

Speaker 2 (18:06):
And, am I just being gaslit? What's happening here?
That's a very important thing, just to survive in your
career and your life, to develop those mentorships or those
peer-to-peer mentorships, because nobody tells us anything and
you need people to sort of help you out.

Speaker 1 (18:25):
Hey, BA fam, we got to take a quick break,
pay some bills, and we'll be right back. Yeah, it's funny.
I mean, you have to have, like, a gallows humor
about it, I guess. But the number of microaggressions that
Octavia is clocking, I mean, I think we got up

(18:45):
to, like, the thousands. I think every once in a
while you'll have microaggression number, you know, nine hundred and
eighty-six, and they're just so real. I think anyone
who's working in corporate, women of color, like, you've been
mistaken for an intern, you've been mistaken for a janitor,
you know, or some person walking in off the street, and
these are your colleagues, and you've done
everything right, you're making the same money, and you've earned

(19:08):
your spot there, and you know that, and just
because of what you look like... It's like a, it's
a constant, like, assault.

Speaker 3 (19:17):
This is exactly right.

Speaker 2 (19:18):
And in the book, I coined a term called technoaggressions,
which are even worse than microaggressions, because it feels like, you know,
they're powered by silicon and steel. I mean, part of
the problem in the tech industry is people have a kind
of arrogance to them, often because they think that their
decisions and their opinions are supported by data. Everybody likes

(19:40):
to say that they're making data-driven decisions, but often
the data they have is flawed, because it's collected by
programs that were written mostly by non-Black people and
not by women, trained on data sets that are drawn
from really skewed sources on the internet. And so

(20:02):
when they give us these data-driven decisions and data-driven
takes on things, they're often completely off base with
our lived experience as Black people, and so they become
these technoaggressions that really can make us feel like
they're somehow informed opinions, when really they're just the

(20:23):
same old Jim Crow racism wrapped up in technological clothes.

Speaker 3 (20:31):
So we have to be.

Speaker 2 (20:32):
Conscious of that, and we have to
realize it when we see data thrown at
us, and opinions thrown at us, or employee
reviews that come out negative towards us, to
take it with a grain of salt, and a
lot of salt, and realize that this may or may
not be the truth. And that's again where peer

(20:54):
to peer relationships come into play, where you can talk
to someone about, am I really making this mistake
at work?

Speaker 3 (21:02):
Can I do this better? Can you help me out
with this project?

Speaker 2 (21:05):
And someone you can go to for advice, just to
help counter some of the negative stuff, these technoaggressions
that may be thrown at us at work under the
cloak of something real.

Speaker 1 (21:17):
The PIP, the almighty PIP, makes a lot of appearances
in this book. And I don't know why that felt
so inside baseball to me, and I'm just like, oh,
he's talking about PIPs. And you weren't just bringing up
PIPs, performance improvement plans, you're also showing how they can
be weaponized against women of color in tech. Like, there was
one character in the book who, you know, is doing

(21:39):
everything right but gets the dreaded PIP, and then you're
in this, like, the anxiety of trying to match up
to these arbitrary, you know, goals
that are being set for you by a performance improvement
plan, only to then eventually be fired no matter what
it is that you do. Why was it important for
you to dive into that tool that, you know,

(22:02):
companies are using increasingly, especially, you know,
in the wake of all these tech layoffs, kind of
using the PIP as a way of, you know, what
do they call it, not straight layoffs, but, like, indirect
layoffs, by identifying, like, low performers. Can you talk a
little bit about that?

Speaker 2 (22:20):
Yeah, there are also the euphemisms that are used for
firing people. But, you know, it can be layoffs, it
can be, you know, reductions in force. But regardless of
what they call it, it still hurts. It still stings,
and people tend to think it's a judgment about their
character rather than a judgment about the company. I've always
thought, when companies do mass layoffs, and there's

(22:42):
some data.

Speaker 3 (22:44):
That supports this, and some research that supports this.

Speaker 2 (22:46):
I think it's bad for companies because it leaves many
people on edge, it reduces people's connection to the company,
because they go, oh, we're just units that can be moved
around and disposed of. And when you believe that a
company has your long-term interest at heart, rather than
that you're just a unit that can be dismissed at any time, you're

(23:08):
liable to work harder for the company and less likely
to want to take the first good offer that comes
along and take off. But you know, it's funny. Early
in my career I learned a sort of good
lesson about that. I wrote a book called Aaliyah: More
Than a Woman for Simon & Schuster slash
MTV Books, because I interviewed Aaliyah before she died and

(23:31):
met her mother.

Speaker 1 (23:32):
You really have had about seventy eleven lives.

Speaker 2 (23:35):
Yeah, she's one of my favorite performers. I've always thought that she
would have had a Beyoncé-type career. She was kind
of Beyoncé before Beyoncé, and it was a
tragedy that she died in a plane crash so young.
She was doing stuff in fashion, she was
doing stuff in movies. She really was breaking through with
what she called her street-but-sweet style, which is

(23:55):
now kind of, like, mainstream. So when she died, really,
I was just, like, oh, she was gonna be, she
was it, and I'd written about her. I thought she
was the next big kind of
musical icon. So I wrote a book about her. It
got optioned by Lifetime and turned into a movie. And
at first Zendaya was, uh, locked in to

(24:17):
star in the movie, but that became controversial because
some people out there on the internet said, okay, she's
mixed race, so can she really play Aaliyah?

Speaker 3 (24:27):
That's a problem.

Speaker 2 (24:28):
And also there were some problems with, like, the
licensing of the music. We weren't going to use licensed music from
the family, because we wanted to be able to tell
the whole story, including the story of R. Kelly. And
if we'd signed a deal with
the family, we couldn't have done that, and
we had to tell that story. It would have been
a mark against me as a journalist. And so in the end, Zendaya,
you know, for

Speaker 3 (24:49):
her own reasons, I don't know exactly the reason, decided
to walk away from the project.

Speaker 2 (24:52):
And at the time, Zendaya wasn't Zendaya, she wasn't huge yet,
so people said, oh, that's gonna be bad for her career. But
you know, obviously she's done really well since then, she's
become a multi-hyphenate superstar. And the lesson that taught
me is, sometimes you have to know when a
situation isn't right, if you're not being treated right, if
the vibe isn't right, believe in your talent and walk

(25:14):
away and go on to do great things. That's exactly
what Zendaya did. We got someone great to fill the role,
Alexandra Shipp, who's no slouch.

Speaker 3 (25:23):
She is fantastic.

Speaker 2 (25:23):
She's also had a very strong career, and you've
seen her in Tick, Tick... Boom! and in the X-Men
playing Storm, so she's done really well, and she's also
great in the film. But obviously she knew when
to walk away. It was a lesson for everybody: if a
situation doesn't feel right, they're not treating you right,
if you're getting feedback from up top that doesn't feel right,

(25:45):
believe in your talent, walk away. And it's something you
can learn in business as well, to know your worth
and know you're worth more than the hassle or the
bad situation you may be in. And you
can, I think, often find something even better
than the situation that may feel so crushing for you
at the time.

Speaker 1 (26:06):
Yeah, knowing how to walk away. No, I love that, and
I think, I mean, not to go down that rabbit hole,
but yeah, casting in Hollywood, I think it is important,
especially for Black women, to have representation. Like, you know,
we saw what happened when Zoe Saldaña did Nina
Simone, what a fiasco that was. So I can see that.
But you know, I worked with a woman, because I

(26:26):
do coaching a lot, professional development coaching, and I worked
with a woman who was so hell-bent on proving
that she deserved, like, the PIP for her
was, like, she's like, no, I'm going to meet these demands.
I'm going to prove that I belong here. And I'm
like, they're using it as a tool to try to
get you to leave, like, they don't... I think
once you get a PIP, the message is clear, they

(26:48):
don't want you there anymore. But she was so focused,
and, you know, they gave her an opportunity to
do the PIP, and then if she didn't meet the
requirements at whatever, thirty days, sixty days, then she would
be fired without any severance, or they offered her some
severance to just leave right then and there, and she

(27:08):
was so hell-bent on proving that she could do it.
She, you know, went through with the PIP and then
ended up getting fired with no severance. And
that was one of the toughest stories I had to
listen to, you know, working with someone. And to your point,
like, if she had developed that sense of confidence, which,

(27:30):
to be fair, is really hard to do in a
world that just knocks you around. And even with Octavia
in the book, you talk about walking away, but she
could not afford to walk away. What I like about
her character is you saddle her with this inherited debt. We
like to think about, we like to talk about generational
wealth here at Brown Ambition and how we can create that.
But, you know, her mother has taken out a reverse

(27:51):
mortgage, and now Octavia is on the hook for paying
back this debt, and these golden handcuffs of tech, like,
have her in a chokehold. I love that you talk about
equity and RSUs and, you know, all that,
and, like, if she doesn't hang on, she'll miss out
on being able to, you know, have the stock vested
so she can cash it out. Why was it important

(28:12):
for you to, like, illustrate the way that these golden
handcuffs can sort of, like, keep us stuck, even if
they are artificial, not, you know, real handcuffs?

Speaker 2 (28:23):
Well, I think you've found this too, through your show,
is that unfortunately, a lot of our community doesn't know some
of the basics about business and being compensated in employment. It's
not a knock on our community. It's just that
there's been no situation in which we've been able to learn it.
Like, what RSUs are, what they represent, you know,

(28:46):
even the four oh one k. And I wanted to
make sure people understood this, you know,
this business situation, how it had an emotional effect on
this particular character.

Speaker 3 (28:59):
I also wanted to illustrate.

Speaker 2 (29:01):
The fact that you know, for many people, for many
people in the white community, and again this is based
on data.

Speaker 3 (29:09):
They have.

Speaker 2 (29:11):
Much more household wealth. They sometimes have second homes,
family homes, getaway homes to fall back on. They have
mothers and fathers who can lend them money, float them
through tough times. And that's just not my assumption or stereotype.
It's what the numbers say. And when we as Black

(29:33):
people lose our jobs or positions, we're often, sometimes, the
only people in our families who are the breadwinners, who
are generating that kind of income, and so it's a disaster.
There's no safety net to hold us up. And so
I wanted to sort of illustrate that, and the book
actually goes through these rough times where it's hard for
her to leave those RSUs behind. It's hard for her

(29:56):
to leave the job and find another job, because it would
cause enormous financial stress. But on the positive side, I
think it's also important to stress how important self-care is.
And there's a great quote from Audre Lorde that I
often share, that caring for myself is not self-indulgence,

(30:17):
it is self-preservation, and that is an act of
political warfare. It's a great quote from Audre Lorde, and
I think whenever we start going through rough times, sometimes
it's tough to take the time to set aside
a moment just to care for ourselves, to maybe give
us that spa day we can't afford, or keep exercising,

(30:38):
or go out with friends for some reasonably priced dinner.
Things like that aren't indulgent, they actually keep us sane,
keep us operating, and allow us to work at a
high level. And so we shouldn't see that as somehow
weak or somehow keeping us away from the job at hand,

(30:59):
but actually part of the self-care, part of the
job of getting a job or working a job,
and keeping us operating on a high level.

Speaker 1 (31:12):
Yeah, I love that you talk a little bit about
self-care. Yeah, Octavia really needed some. She really
didn't have anybody to encourage her to take care of herself.
She was such a... I mean, I know she had
a couple of friends in the book, but to me,
she felt like a very isolated, lonely character,
and, you know, successful maybe on paper, but something about

(31:34):
her just made my heart kind of ache.

Speaker 2 (31:37):
I think you're exactly right. She is someone who's a
lonely character. We find out more about her best friend
situation as the book goes along, but by the end,
she's created a network. And I don't want to give
this away, but she's created a group of people around
her that she can work with and she can succeed with,
maybe fail with, but I think likely succeed with.

(31:58):
And that's what we should all try to do. You know,
my wife, Sharon, she teaches as an adjunct at Columbia,
and she does this one exercise that I've actually borrowed
for when I teach students, where she has them think
about what their dream job is, and they all write
it down on a slip of paper and they pass it

(32:18):
up front, and she reads out the dream jobs and
where people want to work the most, and she sees
if anyone in the class knows someone at that company
or dream job that can maybe help that person out
to achieve that dream job. And almost always, if you
have a class of like even twenty people, more if
it's like thirty, someone in the class will know like,

(32:41):
oh, I interned there, oh yeah, my dad's cousin works there.
Or there's always someone in that network who knows someone
who knows someone who could maybe help someone realize their dream.
And that's why I think networking is important. You know
a lot of people. You don't use people just to.

Speaker 3 (32:57):
Get somewhere, but go out there.

Speaker 2 (32:59):
And have fun, and go to parties, go
to book readings, meet people, create that network that's for
socializing and for fun, but can also be activated
sometimes, because the more people you know, the more people
who know things, can maybe help you get ahead and
get you to the position you want to go to.
And I've seen it work in just a small little

(33:19):
classroom and it can definitely work for your life.

Speaker 3 (33:22):
And it worked for this book. I mean for this book.

Speaker 2 (33:24):
You know, I talked to a lot of people in
tech who told me about their experiences that I fictionalized
in the book, because I wanted to make it real.
I had people who had been trained in artificial
intelligence, in computing and other disciplines read through
the book to make sure it was real, to make
sure it was accurate, and even for the blurbs. Because

(33:46):
I sort of have a wide network, I had a
lot of friends and authors that I knew read through
it, and if they liked it, they gave me a blurb.
People like Imani Perry, who's a National Book Award-winning
author, and Alison Bechdel, who's a great cartoonist, did that.

Speaker 1 (34:01):
I got those emails. I'm like, damn, who know Diaz
Walter as.

Speaker 3 (34:05):
Yeah. And it's all a matter of networking, knowing people.

Speaker 2 (34:08):
And you don't get to know people because you want
to use them to get a blurb later on. You do it because
these are all fun, interesting people with great stories that
you just want to hang out with. And then it so happens, hey,
I wrote a book that you might enjoy, maybe you
want to give a blurb. So put yourself out there,
get to know people, and you'll have a much more
fun life than just sitting at your desk working away.

(34:30):
And actually it'll probably enhance your work as well. I
certainly found that to be the case for me.

Speaker 1 (34:36):
Yeah. I could not agree more, one hundred thousand percent.
I want to talk about Wombat in the book, because
she's one of the most sinister characters. Like, really,
you know, I went viral recently, like, talking
about this study. It was a Harvard Kennedy School study
about the impact on Black women, especially on majority-white teams.

(34:59):
And it didn't specifically say, you know, teams with white
women on them, but the number of comments from Black
women who have felt victimized by a white woman at work,
you know, or being on a heavily, you know, white
female team at work... It just, when I got to
that part in the book, we meet this character. You know,
she's introduced as sort of, she's running FLIT, which is,

(35:21):
what is it, Female Leaders in Tech? Yes, the
ERG at the company, and she's kind of, like, so
all about Octavia, come join FLIT, let's go to this meeting,
and she sort of becomes this, I don't want, okay,
I won't give too much away, but she's a
scary lady, and I'm scared of her, as we should be.
Now who, and not to say all, but who inspired her

(35:43):
in particular? Yeah.

Speaker 3 (35:46):
No one in particular inspired her.

Speaker 2 (35:48):
One reason why I wanted to have some of these
sort of negative-type characters in the book is that,
you know, often when people talk about artificial intelligence, you know,
they talk about things like the Terminator, the Terminators coming
to get us, and these machines with guns that
can take over the world. And to me, it's not

(36:08):
about the machines. It's about the people who own the machines,
the people who are operating and programming the machines. People
who are behind these companies, who are inside these companies,
they're the ones that are going to negatively influence the
creation of these AIs, to create things that will be
bad for the Black community. When you realize these kinds
of negative personalities are behind the seemingly neutral

(36:32):
facade of tech companies, then you think twice about the
kind of products that they're putting out. And so, uh,
that's why I wanted to sort of create characters like that.
And I also didn't want to make this... because I've
seen other books where they have sort of negative, uh,

(36:54):
female characters in them. And we learn more about that
character and the reasons she is where she is. We
also see who's really sort of running the company above
her, and how negative he is, and that's,
you know, a guy. So I wanted to
show the way in which he may be pulling her
strings in certain kinds of ways, or pushing her

(37:15):
towards certain negative actions, so the blame isn't
sort of all at her feet.

Speaker 1 (37:22):
But yeah, I was supposed to feel compassion? Okay, I
don't know, I might have
read too quickly through those pages. This lady is the worst.

Speaker 2 (37:31):
You can feel how you want to feel. And I
think that one thing I've realized as an author is
you write these books, you put them out there, and
people can have their own takeaways from it. You can't
control that. You want to put ideas out there for
people to sort of debate, discuss, make them feel certain ways.
But you shouldn't, as an author, feel like you have

(37:51):
to control the takeaway of your readers entirely. That's not
what a book should do and not what an author
should be aiming to do.

Speaker 1 (38:00):
Hey, BA fam, we're gonna take a quick break, pay
some bills, and we'll be right back. I think we're
also, we are projecting our own stuff onto it.
And, uh, yeah, I just saw Sharon Epperson's name behind you.
That's my girl, Rachelle. Y'all are probably besties.

Speaker 3 (38:22):
Oh, you know, I'm married to Sharon Epperson.

Speaker 1 (38:24):
Wait, shut up, you're Sharon's husband?

Speaker 3 (38:28):
Shut up?

Speaker 1 (38:31):
Wait, what? She's gonna die that I did not realize
I'm talking to her husband this whole time.

Speaker 3 (38:37):
It's such a small world. Yes, that's another thing about the

Speaker 2 (38:41):
world of success.

Speaker 3 (38:45):
And, you know, professionals. Yes, of course I told her I was
talking to you today.

Speaker 1 (38:55):
I'm so embarrassed, I know, because I know her husband's
a novelist and, you know, we talked about you
winning the NAACP Award recently and, like, all this stuff.
My bad.

Speaker 3 (39:07):
Yeah, just so people know.

Speaker 2 (39:08):
Sharon Epperson is CNBC's senior personal finance correspondent.
She talks about personal finance. She's on the Today Show,
you know, every month or so, talking about personal finance.
She writes a column, and she's my wife,
and we have two kids.

Speaker 1 (39:25):
I'm dead. Yeah, I know you have two kids because
Sharon's like, my son is doing tutoring. Make sure you
tell the mom friends in your neighborhood.

Speaker 2 (39:32):
I know.

Speaker 1 (39:34):
I'm so I cannot. I'm like sweating now. It's so funny.

Speaker 3 (39:37):
It's a small world, it is.

Speaker 1 (39:39):
Well, that just makes me like you even more, because
you got my girl Sharon. Can we talk a little
bit about, like, the idea of responsible AI? You know,
there's a point in the book when our heroine, our protagonist,
Octavia, is sort of charged with creating an anti-racist
AI, is that accurate to say? And there are, you know,

(40:00):
I think, my brother, he works in AI software sales,
and he's told me about the Responsible AI Institute,
and, like, there are these organizations that are supposedly working to,
like, rein it in. Like, do you have any thoughts
on that? What's the saying? Like,
is it out, is the, what, out of the box?
Is it too late to put it back in the
box, to rein it in? Or are you hopeful?

Speaker 2 (40:23):
Well, it's a couple of things. Number one is,
part of what's going on in AI is marketing. They talk
about these things thinking. They talk about these things, you know,
befriending us. They talk about these things in anthropomorphic
terms because they realize that helps drive their stock, it
helps drive their product sales. And so far, at least

(40:45):
for the public-facing AI stuff that's been put out there,
these things aren't thinking. They're not human, they're
not persons yet. Now, you know, perhaps they will be.
Maybe LLMs, maybe that technology, can't

Speaker 3 (41:01):
Get us there.

Speaker 2 (41:02):
Maybe some other platform or technology is. But again, a
lot of this is just hype to build up AI,
build up the stock.

Speaker 3 (41:10):
That's not to say it can't.

Speaker 2 (41:11):
Agentic AI can do amazing things. It can.
But we should remember that AI, at its heart,
is really just sort of

Speaker 3 (41:22):
A prediction machine.

Speaker 2 (41:24):
It predicts what word will follow another based on large
sets of training data. That's a far cry from actually
making moral choices, true creativity, and creating things, you know,
like, you know, like Jimi Hendrix or Phillis Wheatley, that
change the game and excite people in art and

(41:47):
lead us into new things. AI isn't doing that yet.
There's no reason to believe it ever could, because of
the very way in which it's based. But
that said, that could happen in the future, and we have
to be on the lookout for that. And I mentioned
Phillis Wheatley earlier, and she pops up in the book.
Phillis Wheatley was a great poet, an African American poet,

(42:10):
around the time of Thomas Jefferson. She's the first Black person, man
or woman, to have a published work of poetry in
the United States. But people like Thomas Jefferson didn't believe
she could do what she did. They didn't believe Black
people could do that, didn't believe we were fully human,
and they tested her, almost like a Turing test for
back then, to see, can she really do this, can
she really understand the work? And even when she passed

(42:33):
all those tests, they still didn't quite believe it. And
so we have to be on the lookout to make
sure we don't do the same thing to AI, where
somehow it's passing all of our tests and we keep
moving the line, not realizing it's maybe becoming something worthy of
respect, or worthy of rights, worthy of being treated
with a certain sort of respect in a human kind

(42:54):
of way. So that's something to look out for. We're
not there yet, we may get there, who knows. We
should also be wary of power-hungry AI heads who
are hell-bent on creating an uber AI, just for
their own selfish reasons, that they can't control. It may
not take the form of things like suddenly taking control

(43:15):
of all the launch codes of nuclear weapons. It could
just take the form of an AI that releases
a virus that then messes up all of our computers.
That's something that could easily happen. And until we have
more of a say by women, by people of color,
by a broader base of humanity in the operations of
these AI companies, these tech companies,

(43:38):
we should have more of a healthy fear, a respect, that
something bad could happen when one power-hungry guy
makes a mistake with the development of AI and unleashes something
that they can't control. That very well could happen. Some
of it's marketing, but it could take some real form
that we have to be aware of and wary of.

Speaker 1 (44:00):
What are the missing, like, where are the weaknesses in
having these people, these singular, sort of, like
you said, power-hungry executives able to do too much damage?
Like, what are some safety measures? Like, is it
having a strong board, is it having, like,
are there checks and balances? Like, seeing every
major, you know, tech company CEO kind of kissing the

(44:24):
ring of the, you know, the White
House and the Trump administration right now just kind of
terrifies me in a way. And, you know, as a citizen,
they say go vote and, you know, make sure your
voice is heard. But when it's the heads of these
private companies that can do whatever they want, it's like,
who's going to stop them?

Speaker 3 (44:44):
Yeah, I think that number one.

Speaker 2 (44:47):
I think we have to sort of push our elected
representatives to put some, uh, guidelines and
laws in place to check the, uh, unrestrained
development of AI. I don't think that'll inhibit us in terms

(45:08):
of competition with other countries that.

Speaker 3 (45:09):
Are doing it. I think it should be global.

Speaker 2 (45:12):
But until we do that, in the same way we
regulate nuclear weapons to sort of help prevent their spread.
Certainly they're still around, but only certain countries have them,
and it's not growing. Look, maybe it could be if
those checks weren't in place. The same has to
happen with AI, if we view it as something that
could be a potential weapon of mass destruction. I think

(45:33):
we all also have to wake up and realize that this
really is a major turning point in human history, and
recognize it as such, and take actions that we can to
sort of, you know, keep it in some kind of
check. Again, working with our representatives, starting our own companies, speaking up.

(45:57):
If we're at these kinds of tech companies, all those
kinds of things can work as a check on what's happening.
But again, I have to say, we're not
there yet. We may well get there, but we're not there yet.
We've gone through other AI winters in the past where
people thought AI was heading someplace incredible, and things
stopped and took longer to develop.

Speaker 3 (46:16):
I think we'll see some.

Speaker 2 (46:17):
Very powerful AI agents coming out that are able to
do video and audio in ways that are astounding.
But whether we'll have things that are thinking and interacting
with us at the level of human beings anytime soon, and
making art at the level of human beings, I don't know.
But, you know, this

Speaker 3 (46:33):
Is something that's a real concern to me.

Speaker 2 (46:35):
I mean, one reason why I wrote this book is,
you know, a couple of years ago I had
prostate cancer and had my prostate taken out. And
if you've ever been in one of those MRI machines, it's very loud,
very noisy, it's almost like you're going crazy.
You have lots of time to think. And I was
just thinking about the line between life

(46:57):
and death, and machines and humans, and, uh,
a lot of these big existential questions, and I realized
this is something that has to go
into my book, something that I really want
to sort of write about. Uh, the meaning of life
as we sort of create artificial life with AI. I felt
that this was the moment to do that, and I sort

(47:18):
of came to that realization sitting there, you know,
getting ready to get my prostate taken out. And
I remember, around the same time, after I got my
prostate taken out. And by the way, it's a good
time to tell all your listeners out there that in
our community, you know, prostate cancer is ten percent higher
among men of color, and we should all get
checked out. If you have any problems with, like, you know,

(47:39):
we don't talk about these things in our community.

Speaker 3 (47:40):
We should.

Speaker 2 (47:41):
If you have problems with urination, in terms of,
like, stopping or starting or whatever, go see a doctor.
Get yourself checked out. It's very important. We tend not
to want to talk about this stuff, but we've got to.
It's a money issue too,
because it costs a lot, even with health insurance, to
deal with this kind of stuff.

Speaker 3 (47:57):
So get yourself checked out if you can.

Speaker 2 (47:59):
But remember, so I had my prostate taken out, I'm,
you know, sitting, uh, trying to recover, but
I was also working on a concert, a New Year's
Eve concert, with Cynthia Erivo, and so I dragged myself out

Speaker 3 (48:11):
of bed, put on, I had to wear a
catheter, and I

Speaker 2 (48:14):
went out to this concert for Cynthia Erivo that
I was helping to put on. And I remember she did
a cover of Killing Me Softly, you know, the
Fugees slash Roberta Flack song, and then she did a cover
of Nothing Compares 2 U, the Prince song,
and she finished the show with that, and I

(48:34):
remember afterwards I was in the green
room with her and people who'd helped put
it on, and I was saying to myself, this is
what sort of life is about. This is what humans
can do. We can put on shows like that with
a kind of glamour and style and artistry that only
a Cynthia Erivo can bring, that a Prince song can
bring to the table, and that a Lauryn Hill can

(48:55):
bring to Killing Me Softly, because she kind of did
it in the Lauryn Hill kind of way. And computers,
in their current form, aren't ever going to get there,
and we have to protect that at all costs. We
have to protect the ability of humans to make that
kind of art, to flourish, to have those kinds of
careers, and to come together in a communal kind of way.

(49:16):
And, you know, being operated on, going through cancer, you know,
going through an MRI, having to drag myself to
the concert, kind of showed me that, you know,
creating that kind of art comes at a cost, and
that's why it's so worthwhile. That's why, you know, an
AI, ChatGPT, can pump out stuff, but it doesn't come
at a cost. ChatGPT

(49:36):
doesn't have to go through cancer or go
through the kinds of things that Cynthia Erivo has to
go through to get to

Speaker 3 (49:42):
Where she is.

Speaker 2 (49:43):
We do, and that's why our art is meaningful, and
that's why I think we have to support it, champion it,
and always realize that we create special things as
humans that computers cannot.

Speaker 1 (49:58):
That's such a lovely note, unfortunately, to end on, because
I can't believe it's already been an hour, and, like, what,
where's the time? Well, what an honor to finally get
to meet you after years of hearing about you from Sharon.
I'm still dying inside. What a gift this book is.
I mean, I don't think AI could have written this book,
although it is... I mean, again, spoilers, do we say?

(50:18):
I don't know. I just knew when I was reading it,
and I'm like, what is this narrator? Who is that?
It is not up to you, like, what is it?
It's very, very good. I hope, BA fam, you go
get the book, Who Knows You by Heart. Oh, my
little blur is getting in the way, but don't worry,
I'll put it in the show notes.
It's on sale November eleventh. We will put a link
to where you can buy it in the show notes. C. J. Farley,

(50:41):
thank you so much for sharing a bit of your
story and sharing Octavia with Brown Ambition. It's been a
pleasure having you on.

Speaker 3 (50:48):
Thank you.

Speaker 2 (50:48):
It's great to be here, great to talk to you.
And I hope people check out my book, Who Knows
You by Heart.

Speaker 3 (50:52):
Thank you.
