All Episodes

July 29, 2025 • 83 mins

In this episode of Selective Ignorance, Mandii B, alongside Jayson Rodriguez and A-King, explores the sweeping impact of artificial intelligence on our daily lives. The conversation begins with an overview of AI's rapid evolution and influence across industries [00:00], followed by reflections on recent current events and the collective processing of death [02:50]. The hosts then navigate the nuances of workplace relationships and the importance of setting boundaries [06:10].

They shift into a debate around the controversial “Tea” app and its implications for dating and digital consent [09:02], before unpacking how AI is reshaping everything from communication to creative work [11:52]. The discussion widens to examine the future of autonomous vehicles and what it means for transportation as we know it [41:25], including a closer look at how automation is changing customer service experiences [46:49].

The collective also dives into AI’s role in consumer pricing, surveillance capitalism, and corporate accountability [53:16], and questions how political lobbying is influencing AI regulation and public policy [57:36]. In the final stretch, they explore AI’s presence in the music industry and whether it can ever replicate true human connection [01:02:44], closing out with a reflection on capitalism, automation, and the existential anxiety that often comes with technological advancement [01:20:05].

“No Holds Barred: A Dual Manifesto Of Sexual Exploration And Power” w/ Tempest X!
Sale Link

Follow the host on Social Media
Mandii B Instagram/X @fullcourtpumps

Follow the show on Social Media
Instagram @selectiveignorancepod
Tiktok @selective.ignorance
X/Twitter @selectiveig_pod

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, guys, welcome to another episode of Selective Ignorance. However,

(00:03):
before we get to this week's episode, I want to
remind you guys to purchase my book No Holds Barred,
a dual manifesto of sexual exploration and power. So feel
free to go to your local bookstores preferably queer owned,
black owned, or woman owned to support them, but also
just click the button on Amazon, Barnes and Nobles, or

(00:23):
wherever you read your books. Again. That is No Holds Barred,
a dual manifesto of sexual exploration and power, written by
yours truly and my co host of the Decisions Decisions podcast, Weezy.
Make sure y'all get that. Now let's get to this
week's episode. This is Mandii B. Welcome to Selective Ignorance,
a production of the Black Effect Podcast Network and iHeartRadio.

(00:45):
All right, y'all, let's get into it. Welcome back to
another episode of Selective Ignorance. I'm your girl and your host,
Mandii B. And this is the show where we question everything,
and yes, that includes your favorite talking points and your
outdated-ass perspectives. But let me just say this
up top: AI is here. By the way, this

(01:07):
topic is a long time coming, because hello, AI is
what's happening. It's not coming, it's not creeping up. It's here,
and y'all can either deal with it or get left behind.
Because if you're out here flexing about not learning the tools,
talking about I don't need that robot mess, Oh my god,
I'm not using ChatGPT. That mindset is the exact

(01:28):
reason you'll be out of work, out of relevance, and
out of touch. Now, let's be real. This wave of tech,
this AI revolution, it's very much giving Industrial Revolution two
point zero. Remember when machines came in and replaced human
labor on factory floors? Yep, same energy. People fought it
then too, cried about change, and guess what, The machines

(01:48):
kept coming and the world moved on. But here's the thing.
I'm not here to scare you. I'm here to wake
your ass up because AI it's doing some things better
than people tell me about.

Speaker 2 (01:59):
It.

Speaker 1 (01:59):
Just fired me niggas last week. More accurate, faster, no ego,
no lunch break. But don't get it twisted. It cannot
replace human depth, instinct, or soul, not yet anyway. I'm
waiting for it to I know y'all said, or y'all
have heard me say, I'm waiting for the aliens to come. Baby,
I am welcoming the robots. And I know y'all love

(02:19):
to say work smarter, not harder, but then turn around
and ignore the tools that would help you do just that.
Please make it make sense if you're not asking deeper
questions about what's being handed to you, what's being automated,
what's being filtered, what's being programmed, you're just blindly vibing
in the matrix. So as I've said, please question everything,

(02:43):
y'all know I do it, even in this episode. So first,
listen close. I'm joined today by my super producer Jason
and a King on the mic, and we're here to
talk all things AI. But what I really love is
when we have these episodes and we're also able to

(03:05):
like squeeze in some current events. So we're gonna start
off the episode this week, because we're recording this
the day before y'all hear it, so we can kind
of talk about everything that's hot, you know what I mean. Listen,
we're gonna talk about the Tea app in a little bit,
but I do want to first give an RIP,
rest in peace, to Malcolm-Jamal Warner as well

(03:29):
as that's it.

Speaker 3 (03:31):
I mean, we got to give space for Malcolm.

Speaker 1 (03:36):
Okay, Okay, well, no, I'm not gonna lie that.

Speaker 3 (03:40):
That was where my RIP was going.

Speaker 1 (03:42):
You know, it's crazy. Here's the other thing,
the other two, uh, okay, the rock star and the wrestler. Okay,
here's what I'll say about that. And, by
the way, condolences to the Warner family, to his daughter
that we learned was there. It's weird because I

(04:06):
have such a pessimistic view on life that I'm constantly
like thinking about the end, which sucks because I also
am not in fear of the end, but I always
just wonder what life, how life would exist when I'm gone.
And to know that he was on a family vacation
and that took place like me and my friends travel

(04:27):
all the time, you know what I mean? What do
you mean allegedly, Well, there's.

Speaker 3 (04:30):
A lot of stories that came out.

Speaker 1 (04:31):
I don't like life, but I ain't gonna hold you, bro,
I ain't gonna hold you. We not going to go
no conspiracy theories. He was swimming, really, don't he was?
He was swimming on a morning swim with his daughter.
Surfers saved both of them. We're not going to.

Speaker 3 (04:53):
But it was three other reports that we're.

Speaker 1 (04:55):
Not We're not doing this. You know what's crazy? Let
me let me ask, y'all. Only because this is a
problematic thing that I know I do, but because it's
selective ignorance, I want to share it out loud. When
you see rip posts from people, do you ever go
into the comments, yes, to see how they died, because
you want to know just how it happened, Like, especially

(05:16):
if it's someone that you may not know, you want
to know, Okay, like what happened? Is that a thing
that y'all?

Speaker 4 (05:22):
Absolutely, there's somebody in the comments section that's gonna go
above and beyond investigative reporting and they gonna dig for that.

Speaker 3 (05:29):
Macha tea. You know what I'm saying.

Speaker 4 (05:31):
It's not for real, just to see like it might
be some incomplete information there, and they might between the
time it was posted, and you're right, so I always do,
just to see it might be a name or some clarification.

Speaker 1 (05:45):
Oh yeah, I like to go to their page and
see what their last post was. It's very it's weird.

Speaker 2 (05:50):
When we do that, we definitely have to go back
to their last post.

Speaker 3 (05:53):
Ye see.

Speaker 2 (05:54):
Yeah. What I also find a little bugged out is when
they passed and they still have like an IG story.

Speaker 5 (05:58):
It's active.

Speaker 1 (06:00):
I know. What I also find myself doing is because
there's always going to be that one. Here's the thing.
If you're listening, someone does not like you. You did
somebody wrong. And so sometimes there's the comments that talk
about how shitty of a person the person
that passed is, and so I say that to lean into.

(06:24):
I didn't have any thoughts. Nothing in my body jumped
about Ozzy or Hogan. It just didn't. Ozzy Osbourne died
at, what, seventy-nine or something like that. Man, you
lucky you lived that long with the type of
life you had. You were a fucking rock star, taking
all the drugs, living the life, and like blood pressure

(06:48):
had to be high. I saw him yelling at his
kids for, like, my childhood. Like, his reality TV
was crazy. It's like, before people were like, oh, I
want a reality show like the Kardashians. Oh nah, my
house growing up was like the Osbournes. He
literally was yelling at them kids like a motherfucker.

(07:11):
So to me, Ozzy dying, to me, that's a fulfilled life.
Like I don't I don't know if I look into
or think about what my life will be when I'm
in my seventies. And so knowing that he had that
type of life and all of the substance abuses and
all of the stress and just a rock star lifestyle
from the touring to the live shows, all of those things,

(07:35):
I think he lived a life.

Speaker 2 (07:36):
You feel me? He just wrapped his last tour.
Like, the closing tour of his life was like two
weeks ago even. So to your point, life fulfilled.

Speaker 1 (07:45):
Not only that, I was just also with a good
friend of mine who used to be in the NBA,
and we were talking about how he's on like, uh,
like an anxiety medication. But we talked about how he
ain't even know what anxiety was, and it was something
that actually formulated during his career as a player, because

(08:06):
you're constantly around you know, you have performance anxiety, and
then when you go through injuries and things like that,
you're constantly thinking about what's next for you and things
like that, and the idea that anxiety came about through
his career I can only imagine like what health issues

(08:26):
or mental issues that Ozzy Osbourne had. And in terms
of mental issues, I think racism is a mental fucking issue.
So I honestly don't care that Hulk Hogan died.
That's it. I mean, I don't think we have to
care when people pass because to me, we know that
death is inevitable essentially, So like the whole rip and

(08:50):
sending it like I don't know when people are shitty,
I think it's okay to talk about them after they passed.
And he was shitty when he was alive.

Speaker 4 (08:59):
Yeah. No, he's still getting the same vitriol, as he
revealed who he really was, from how he feels about
society en masse. Yeah. So it's like, as a child,
you grew up buying Hogan figures.

Speaker 1 (09:12):
And, oh, I watched him on reality TV.

Speaker 4 (09:16):
The minute he just started to reveal who he is,
who Terry is. Right, then it's different. We measure
him differently.

Speaker 3 (09:24):
Pause.

Speaker 4 (09:24):
But so when he passed away, it is like,
all right, you know, he's a human being. He passed away.
I don't agree with his viewpoints on humanity, or
on the large part of humanity.

Speaker 1 (09:37):
So here's the thing, like, I just don't care. I'm
not gonna celebrate someone dying.

Speaker 3 (09:43):
I don't care because.

Speaker 1 (09:45):
Well that's the thing, right if there are and if
you guys go back to the episode with Selena Hill
and Montgomery, Kenneth Montgomery, we talked about the death penalty
and where I won't celebrate death. There are some niggas
that I think the world would be better without, whether

(10:08):
it's pedophilia, whether it's people who project racism against beings
that they don't know and just have this discriminatory hatred
towards a race of people because of the color of
their skin. To me, if many of you are listening
and have the idea that world peace will ever exist
while we're here, there are some people that we could

(10:29):
get off the planet. And to me,
I put racists and pedophiles and rapists there. I
don't know how quick we are into YouTube here, but
there are certain people with characteristics that I don't think
deserve to be on this earth, especially when we have

(10:50):
people passing like a Malcolm Jamal Warner, who Jesus Christ,
it probably won't happen for me, but I'm jealous that,
in passing, I have not heard one person say one
bad thing about this man, right, you know what I mean?
And so that's a life fulfilled. And it's crazy because
we think about karma and then we see things happen
to people that we assume just don't deserve that, you

(11:11):
know what I mean, They deserve to be here longer.
And so just kind of wanted to start off there.
We got some current events that took place. Which one
we want to start with first? Jason, throw something at me.
What were we gonna do?

Speaker 2 (11:29):
I actually want to... We haven't talked in a minute,
so I actually want to know your thoughts on the
whole Coldplay cheating controversy where they got caught
on the kiss cam, because a lot happened, and I
know the beginning, everybody knows that part. But yeah, they've.

Speaker 1 (11:41):
Since they've both resigned. Here's the thing. See the thing
is hot.

Speaker 2 (11:47):
Uh, everybody else need to give a recap?

Speaker 1 (11:51):
No, we got to give a recap. Niggas was caught
cheating on a kiss cam, and every party in Atlanta
chose to throw it on a flyer, which is crazy.
I could not imagine. Like, you know how they throw
Martin Luther King up on flyers for MLK Day weekend?
Maybe this couple, well, this affair was the star

(12:11):
of party flyers. Sneaky Links, Bring Your Sneaky Link Side
Piece Sunday. I hate Atlanta. Atlanta ain't waste zero
time. They threw this couple up on
every party flyer. If you've listened to Horrible Decisions, Decisions Decisions,

(12:32):
any of my podcasts, anytime I've spoken publicly, I
am very against workplace romances. I don't think you should
shit where you sleep, eat where you shit. I don't
know what the saying is, but one of them, you
know what I mean. You shouldn't, bro. There is

(12:56):
so much.

Speaker 4 (12:57):
My mom always said penis and pussy out here. My
mom says that the flesh is weak. What that means
in her context, and in my context, which I'll
share with you right now, it's the law of proximity.
So my mom was recently in the hospital, right and
I'm watching the dynamics between the nurse, the nurses and
the staff, and.

Speaker 3 (13:17):
I'm like, wow, they in there.

Speaker 4 (13:18):
They twelve, sixteen hours, double shift, blah blah blah,
and then they go into this big cafeteria and they
have lunch, and then they going to work again, and
then they're each other's support system. So you
go throughout your shift, day in and day
out for twelve hours.

Speaker 3 (13:38):
We got to talk about something.

Speaker 4 (13:39):
You're gonna share philosophy, you're gonna share if you've got
a family, kids, whatever. Hey, I'm gonna get a
coffee, you want one? It's always gonna be those dynamics.

Speaker 1 (13:48):
And they're getting a coffee. Ain't gonna make it.

Speaker 4 (13:50):
Hold on. No, but us, through time, through the passing of time,
there are going to be possibilities. We see it in
the industry we work in, we do, and it's just on
us to have some kind of moral compass to say,
set a boundary, right. And I think now, in twenty
twenty-five entering twenty twenty-six, the boundaries are just slowly deteriorating.

Speaker 1 (14:11):
To me, it's not a moral compass as much as
it's literally discernment. Like to me too, if you're spending
twelve to fourteen to sixteen hours around somebody, we need
to learn how to compartmentalize our viewpoints on relationships. And
if you have a workplace relationship with someone. There's no
reason why that needs to be taken into the bedroom.

Speaker 3 (14:31):
No, where it starts. That's my work husband, that's my
work wife.

Speaker 5 (14:36):
Yeah, okay, so we.

Speaker 1 (14:37):
Got the way you actually have a wife here? Are you?
Have you ever had a work wife or a work
girlfriend and is that something that your wife has checked
you on before?

Speaker 2 (14:50):
Not since I've been married, no. But before I was married,
I had a girlfriend who worked with me at this...
I worked at a magazine. We were at the same magazine,
she sat at the cube next to me, and that's
who I dated. Years later in her career, when we
were at a different magazine, we both worked there together,
and she actually had a work husband there.
And it was funny to me because I was like,
this is bugged out, like watching my ex like interact

(15:11):
with this dude who's her work husband, and I know
she has a boyfriend at home, and so it was
crazy watching it.

Speaker 5 (15:16):
And I was thinking about.

Speaker 1 (15:19):
Though that you saw.

Speaker 2 (15:20):
So that's the point, I was gonna say, not
much boundaries crossed there. But I was thinking about what
A-King was saying, I'm watching the nurses and doctors as
we come out the room. I thought you were going
to say, and then I peeped like a little hand
gesture here, the hand going down the back.

Speaker 5 (15:35):
She started seeing those little things.

Speaker 4 (15:36):
My mom seen the nurse's aide and one of the
nurses, and I don't know, it was just
joking around. And she called both of
them in the room. She said, you know, make sure
y'all take care of each other, like y'all should. I'm
looking at, you know, and I'm looking like, she said, hey,
you just late to the party. So then the guy,
he kind of was like, oh, no, we go

(15:57):
way back.

Speaker 5 (15:58):
Ah.

Speaker 3 (16:00):
That's subjective, but it ain't.

Speaker 1 (16:01):
It's tricky because I think that, and maybe it's
just the lack of being able to be platonic as
a man and a woman. I remember when I was on
the JBN, I was in a relationship with who I
thought was my soulmate at the time, and Joe used
to like FaceTime me every morning, because he a chatty, petty-ass
motherfucker. I'd be laying in bed with my nigga

(16:25):
and Joe would call me, and it would be like,
a lot. We talked on the phone in the morning.
We'd like to talk about what's happening on the internet
and things like that. And we
worked together, but there was a friendship.
And to me, I'm just like, to me, if I
wasn't answering Joe in front of my nigga, I felt

(16:47):
like it would be worse. Yeah, I'm in bed, I'm up. Wait,
that's a violation. Would you allow your
wife to answer?

Speaker 3 (16:58):
Least get up?

Speaker 4 (16:59):
At least get up, Let's go in the kitchen or something,
and then I mean we could start yo, what up?

Speaker 3 (17:03):
And then get up?

Speaker 2 (17:04):
Because that's very innocent. Like, considering the context of Mandii,
that seems really innocent.

Speaker 1 (17:10):
That's what I'm saying. I talk to... how often do
I talk to you, A-King? When I
was with my boyfriend, my boyfriend would be around. I'm
FaceTiming you. Like, I have platonic relationships with men in
my industry and in my life. And I think that
what this CEO did... Mind you, both of them were married,
so I think we clearly knew they were having an affair,

(17:31):
so there were lines crossed. I do think it's interesting
and what we don't have in our industry is the
ability to literally lose your job because of an affair.
And so we see both of them resigned. We know
what it was, board meeting took place. Y'all violated code
of conduct?

Speaker 4 (17:47):
And is that similar to old boy from ABC? What
was the newscaster I forget his name, the black guy?

Speaker 1 (17:58):
Oh yeah, the... part of the... I don't remember. He cheated, yeah,
he cheated with the white woman.

Speaker 3 (18:04):
They got a whole.

Speaker 4 (18:05):
Podcast now, yeah, but they're no longer on their network
as a result of Yeah.

Speaker 5 (18:10):
But yeah, they lost the main jobs.

Speaker 1 (18:12):
I mean, for this, it's interesting. I also want to say, uh,
do you think, don't go to sporting events with
your side piece? But is it?

Speaker 2 (18:21):
Does the HR element of it mean anything? Like if
she was the chief Revenue officer and he was the CEO,
would it be the exact same, Because to me, I
feel like the HR element.

Speaker 1 (18:30):
HR being fired by HR is hilarious.

Speaker 2 (18:32):
Yeah, it's just like it's like the hypocrisy of it.

Speaker 1 (18:34):
I feel like, I mean, but people are hypocrites at
the end of the day, Like you still have code
of conduct and again there's always somebody above you. Let's
be also very clear.

Speaker 2 (18:45):
Companies are, and I know that too. I know HR
is always with the company, but they're also the ones
who are like, with the rules, the Code
of Conduct. And the fact that she didn't follow it,
which I think we all suspect anyway, but because it
was so loud and long, I feel like that. I
feel like if it was like if she was the
chief revenue officer and he was the CEO, I feel
like maybe it will be like paid leave of absence

(19:05):
and it will go away.

Speaker 1 (19:06):
Here's another thing too, right, I was just out with
a homeboy of mine who is married. We went to
one of his restaurants and we're just sitting at the
booth having a conversation, catching up, right, and we see
this girl and she's videoing and taking pictures. He was like,
you think that girl's videoing us? And I was like,

(19:27):
and indeed she is. And I'm like, nigga, we're not doing
anything wrong. And so to me, I think the reaction
of both of them indicated that they was up to
no good. And so to me, here's my bit of
advice as a seasoned side chick, who's no longer... I'm
no longer that, however. Seasoned side chick. There we go. Listen.

Speaker 4 (19:55):
Uh.

Speaker 1 (19:55):
In my book, my New York Times bestseller No
Holds Barred, I actually talk about being okay with
being, you know, number two, which I no longer think
of anymore. However, when you're out in public, if you
are in your mind doing something that you ain't supposed
to be doing with somebody you're not supposed to be with,
in a relationship dynamic that maybe you don't want the

(20:17):
public to know, the key is to just act like friends. Literally,
y'all work for the same company. That could have been
your company suite, and all of y'all are there, and
y'all are enjoying the same song, like, wave, like. I
think that there's a way to behave do all of

(20:38):
your dirt behind closed doors with knowing that everything that's
done in the dark will eventually come to light. But
if you're going to be in public settings with your joint,
with your side piece, because best believe I've been there,
done that, act like you the friend. Just act like
y'all are there to enjoy whatever moment y'all are in,
and take all that little guilty inn somewhere else. Yeah,

(21:03):
it was crazy. Speaking of Yeah, speaking of energy, side chicks,
how you move, how you don't move, how you act
in public, how you act behind closed doors? Baby, the
Tea app just came out now, and if y'all are
unfamiliar with.

Speaker 2 (21:21):
The Tea app, it is like number one, number one app, the.

Speaker 1 (21:24):
Number one on the App Store. But not only that.
For years, and again, if you are listening to Horrible Decisions,
I did a Patreon episode about this. There have been
Facebook groups for the last few years where you have
to be invited and accepted into it, but it's almost
every city in America, and the groups are
called Are We Dating the Same Man? And so yes,

(21:47):
So the Tea app is essentially Are We Dating the
Same Man? Here's the thing about this as well. It
has been out for a week. I mean, I'm sure
it's been in beta for much longer than that, but
it had a push over this last week. Millions of
women chose to join this platform. And in order to
join the platform, by the way, because I went to

(22:09):
do it, didn't finish the process because I got busy.
But in order to join this, you do have to...
Like, why was I going to join it? Here's the thing:
the same way I watch Housewives, I just like seeing
how miserable everybody is out here, because none of
the niggas I lay up with would
be on there now. Hmmm, I ain't even gonna hold you.

(22:30):
Maybe someone's from the past, but who I rock
with right now? I ain't even worried about it, like
you know. And so there was one that I think
would have been on there, but I just cut them
off anyway. So basically, in order to get on this app,
you had to upload a photo ID, You had to
be verified, your social media had to be linked. Like

(22:52):
there's all these things that you had to
do to get verification. And because it's a new app,
like every new app, maybe their security was not up
to par and within the first week there was a
data breach, with hackers gaining access to seventy two thousand images,
including thirteen thousand selfies and photo IDs submitted for account verifications,

(23:16):
as well as fifty-nine thousand images from post comments
and DMs. Here's the thing. This app was an app
that only women could join, where they would pretty much
create profiles of the men they dealt with and talk
about these men and their experiences. This is all straight dry snitching. Yes,

(23:41):
but as a woman, women be lying, and so this
wasn't an app that I would go on to necessarily
take anything as bible in terms of their relationships. We
also know that when relationships end with men not on
the accord of the woman, the woman in a heartbroken, manic,

(24:03):
depressed state is so irate that they just hate this man,
and oftentimes stories are created and things as such. I
do think that this is an app that would only
cause more harm than good. I don't think that women
should be allowed to create profiles of men. There's been
men who have come out to already say like, this

(24:26):
is what's being said about me to me, this is
a lawsuit waiting to happen. I know that there's an
app that I will never make mention on any of
my platforms that when I go onto it, it's a
message board of people who do not know me that
are creating narratives about me that are true. Whether it's
who I sleep with, the type of person I am,

(24:47):
where I live, to my bank account, all these things.
And it's just people that is bored, people that is miserable,
and that's what the Internet is, unfortunately. So a

Speaker 4 (24:57):
Girl could probably put up a post of a guy
ask the question, and then just because people have access
to the app, they may just jump on there and
be like, say some wild shit about the dude that
they may not even.

Speaker 2 (25:08):
Know is are they asking questions though? Or is it
like I'm just gonna do profiles? Like no, no, no, no no,
Like are we dating the same person?

Speaker 3 (25:18):
Is that?

Speaker 2 (25:18):
Like? Weren't asking is this? No?

Speaker 1 (25:20):
It's like the same thing. So even the Facebook app, right,
they're literally posting a picture asking, does anyone know this guy?
And the comments underneath are a whole bunch of women
who may or may not have had experiences with him, and
they're sharing their relationship with him, and some of

Speaker 2 (25:36):
It could be like, oh, good, dude, is great, I
had no problem? Or is it all like pulling somebody's credit.

Speaker 1 (25:46):
It's the hoe facts. Y'all know how men be wanting
the hoe facts on women and like figuring out who
she done smashed and passed and all that. For women,
what we kind of want to know about men is,
I mean, dick size and if they're like a
good guy or if they're a fucking liar. I mean,
oftentimes what you want to find out is is this

(26:09):
man who he says he is. And so before we
get into the talk about AI, I want to share
another resource that, ladies, you should use, and it's as
low as nine ninety-nine per person. It's called the fucking White
And I did this for a homegirl of mine. Check
this out. So it's the White Pages. You pay nine

(26:29):
ninety nine for a full search on an individual. When
I tell you, it will tell you their parents' name,
their siblings name, if ever they were married, any houses
that they own, traffic tickets that they may have had,
jobs that they've been attached to.

Speaker 3 (26:45):
How do you protect yourself against them?

Speaker 1 (26:46):
So for me, you don't. At this point, we all...
and I'm excited to talk about the AI. Essentially,
whenever you sign up for a social media platform, there
are terms and conditions. You're uploading your face, you're uploading
where you work. If you have LinkedIn, that has your
whole job history. If you go and get a
state ID... the same way we look up mug shots,

(27:08):
your traffic tickets, like all of those things are on the
World Wide Web. You are not a ghost in this society.
And so White Pages keeps track of you know where
all you've lived where all you've moved to if you've
ever changed your ID. And so I remember my homegirl
was talking to a guy who, on the holidays, somehow

(27:31):
he just wasn't around: Valentine's, Christmas, Thanksgiving. There was just
parts where, we've been dating for over a year now,
I'm confused why we haven't been able to share these
moments together. And so I was like, girl, let me
look out. Let me know his name.
You're worth the nine ninety-nine because you're my
best friend. And so I paid the nine ninety-nine

(27:54):
on white pages. Literally was like is this Have you
heard of this name? Have you heard of this name?
When I tell you there was an address on there
that she never heard of, she pulled up to the
address and guess who answered the door? His fucking wife.
She didn't even know the nigga was married. Mind you,

(28:14):
hold on. She also only knew of two kids, not three.
He completely chose to leave out the whole junior that
he just had. And so it was like, mind you
and I could look into the traffic tickets that he
got across certain states. I said, do you know if
he ever lived here? Because he got two speeding tickets
in Cleveland. He got something here in Miami and she

(28:36):
was like, oh, yeah, this is him. Pulled up to
that house, and lo and behold, the wife answered. So, if there's
a lot of things that you want to know about
a person, you can kind of gauge that. It'll tell
you everything from the numbers associated to that name, the addresses.
It's crazy, and to me, it is scary,
especially for celebrities. That's why we know people have showed
up to Chris Brown's house. It's you know, it gets scary,

(28:58):
like there's a breach of security there for you as
a human being because people are unwell out here. But
at the same time, unfortunately, we are meeting people's representative
often and so as a woman, I was just going
on there to read the tea. I would not advise
anyone to sign up for this app. I think that

(29:18):
it's cruel. I think that it's mean, and things like
this I don't think should be inserted into our society
at the moment.

Speaker 4 (29:25):
I just want to add something to it. I saw
this YouTuber, I can't remember his name,
pardon me, but he was kind of doing a,
I won't say exposé, but he was doing a
piece about the Tea app, and one of the things
he mentioned is that if you look at the fellas
that's on the app, these are people that you probably
should know already that ain't shit.

Speaker 1 (29:48):
Well, you say that because there's regular guys on there,
it's not all.

Speaker 4 (29:51):
Celebrities. Again, it was the volume of guys
that he was showing. It's like, money-phone guys. Watching,
it's like, you want to know about him?

Speaker 1 (30:02):
He is like that. You want to know what
I think makes it worse? They were posting the women's
IDs, and this is what I hate about it.
Because the male app... well, there was a breach of data.
With the breach, with the leak, people were able to
see the women on this app, and so all you

(30:22):
saw was five-two, two thirty-five, five-six, three oh five. And it was literally the sizes of the women on this app. And I'll be here to tell.

Speaker 3 (30:33):
You go ahead, talk about it.

Speaker 1 (30:35):
Go ahead, because in the most ignorant fashion, y'all is ignorant if y'all don't think big bitches is getting dick. Someone wrote something about, uh oh, I don't even know what the correlation was between you.

Speaker 2 (30:53):
It's like when you get skinny and you start hoeing or something.

Speaker 1 (30:55):
But yeah, someone said bitches lose weight and start hoeing. Baby, bitches is hoeing, and it's big bitches. And you know why? The more in shape a man, the more slimmer a man, the bigger the bitch. I, in my prime at two hundred and thirty pounds, baby, oh, I only fucked blamino models and gym rats. Swear to God, like

(31:21):
I could count the packs on the goddamn body like one, two, three, four, five, six, seven, eight. You got an eight pack. The more in shape of a man, the more fluffy of the woman. And so it's crazy, like, people are trying to bash these women for their sizes on this app, right, and it's like, bro,

(31:41):
they are getting beat down to the mattress. Okay, big girls is taking that thing, all right. And so I just hate, too, that there becomes this element of body shaming again. Go check out, cooked, my body-shaming fat-shaming app episode. What do you mean cooked? We've been cooked.

Speaker 4 (32:00):
Fried. All the shit that happened politically, and as a finale, we got a tea app.

Speaker 1 (32:05):
No, no, no, we're fried hard. We are fried like the lemon pepper wings in Magic City. Fried hard. Okay, it is terrible. Speaking of that, let's go ahead and get into the topic of the matter. We're going from an app literally to all things

(32:25):
technology and AI. Y'all know, since we are now part
of the Black Effect Network, we have added the double
down or take it back segment where something is pulled
from something I've said in the goddamn past to kind
of rehash it before we dig deeper into our subject matter.

(32:46):
And this one is literally within the subject matter. So there was an episode that Weezy and I did over on Horrible Decisions. This is episode three oh nine, where we talked about, at the time, the surge of deepfakes. Now, deepfakes are where you can put

(33:06):
your face on an actual body. We are way past deepfakes now, because you can actually just create a whole AI simulation of a person with hands, body parts, all the things. But we talked about deepfakes. And Jayson, you got that clip of what I had to say? Got it right here. All right. Celebrity porn. I would

(33:32):
say a clean percent of it is the face of a celebrity, but it's fake porn. It's deepfake porn. Like, because I was looking at you, I'll be looking at celebrity porn from time to time. Just, you got it. And it's celebrities that I know never really had sex tapes, and their faces are put onto bodies with AI, and they're sucking. Would you want to sue, girl?

Speaker 2 (33:55):
No?

Speaker 1 (33:56):
Okay, you don't think that it should be, like, if someone put my face on AI and it's a bukkake scene, shout out to you.

Speaker 2 (34:06):
I not.

Speaker 1 (34:10):
You know, I'm bringing up, I've never been in porn, for the new listeners. I just enjoy viewing it. And so if somebody is like, oh my god, I wish I could see this, and somebody took a photo that's on Instagram, mind you, I know that when I upload any images of myself, it's for open use of the world.

Speaker 2 (34:30):
What is it? Terms and conditions? Celebrity terms and conditions.

Speaker 1 (34:33):
Terms and conditions apply. So my take was that I wouldn't sue then, and today, I honestly don't think you can. Like, by the way, I'm aware of.

Speaker 4 (34:46):
Your favorite rapperwise, I just want to throw that in me,
I said Everyder.

Speaker 1 (34:53):
First off, Drake is not, is he even talking about Drake? Cardi? Who? Calling him Aubrey is crazy. He is suing over something I think he's justified for. To me, in terms of a deepfake or an AI version of myself, I loved the video. I don't know if

(35:14):
y'all saw it. It was during the election time, and it was Kamala riding around with a forty, Trump with an AK running into a deli. I think Bernie Sanders might have been in the clip, and I found it to be entertaining. To me, as a sex podcaster, or someone who's talked about that for as many

(35:36):
years as I have. If someone chose to put me
on somebody taking back shots, I think as long as
it's not something that impacted or affected me, which I
don't think it would because I would literally just say,
this is not me. What am I supposed to do? Like?

Speaker 4 (35:52):
Well, I think the issue is what if the user
or the generator is monetizing it.

Speaker 3 (35:59):
I think that's the.

Speaker 1 (36:00):
Issue, which we're gonna get into. Shut up.

Speaker 2 (36:05):
I feel like you speak about revenge porn a lot, and yeah, revenge... no, because people put the deepfake porns into the revenge porn category, and they're wrong as fuck.

Speaker 1 (36:15):
They're wrong as fuck. Here's why. Someone who creates a deepfake, right, whether it's someone I dealt with or not, it's an AI-generated image. That's deepfake. Revenge porn is literally, hopefully, something that was consensually shot between two people having sex, and it's normally the other party within that video leaking it. So revenge porn

(36:39):
is to dehumanize and embarrass and put out an intimate moment shared between two people, normally because one of the parties involved is upset. Which, I love that it is now a criminal offense, by the way, if y'all do that. It right now is not a criminal offense to use AI. And what I love about YouTube, what

(37:01):
I love about YouTube, and it is now literally demonetizing
people who are putting out tons of either content with
the use of AI or content that is not owned
by them, content where it's just a talking head, not even a talking head. But shout out to the one person not monetized anymore. I know that, I ain't even gonna

(37:24):
shout them out. But his name starts with a D
ends with a Y, and I love that that nigga
can't make money no mo off of talking shit about people.
I'm here for it again. To me, I am doubling
down that I wouldn't sue. But I do think that
with this advancement of AI technology and what we're able

(37:46):
to make and create and do all the things, I
am glad that there are platforms that are now putting
things in place to where people can't monetize off of that.
And so if you go to Pornhub, I would highly doubt that the people uploading these fake celebrity deepfakes are seeing any money from it. It's probably just

(38:08):
shits and giggles. And to be fair, I used to go into the message boards when I was young, and I used to literally read about B2K, Chris Brown, possibly running a gang bang on myself. Like, that's how you would read it, as erotica, bitch. I would love to be able to see that.

(38:30):
I would like to see that AI-generated me being bent over, not as a teenager, but now as an adult. I would love to be able to see myself taking it from the back from Chris Brown, and sucking Jason Momoa off or some shit. Like, I'm here to see myself enjoy that experience.

Speaker 3 (38:50):
How long is it gonna take when this episode drops?
Before somebody starts.

Speaker 1 (38:57):
Hey, I ain't gonna hold you. There's plenty of images
of Chris Brown on his tour right now with the
right faces that I need, of him gyrating behind me.
And then give Jason Momoa like an eight-inch dick and put it in my mouth. I'm here
for it, and send it to me so that I
can go to sleep.

Speaker 3 (39:15):
Well, do a version of Jason Momoa from Minecraft.

Speaker 1 (39:20):
Why? Just because? I'll take Aquaman. I'll take even a fish scale on his dick, okay, I don't care. Give me Aquaman and Minecraft man. There we go, give me both the men. All right, yeah, I ain't gonna hold you. I'm actually really... I guess we'll dig into

(39:42):
all the AI of it all. I'm personally excited about the use of AI. And at this point, A-King, you might need to remove all these ums, because the bitch keeps saying "today" and I don't know why. Anyways, overall, I'm excited to be learning all the AI tools

(40:02):
that there are to offer. I just had a conversation with Annette the other day, and I was like, bitch, if you know any courses, classes, let me know the cost of them. Let's get into it, because we're at a place where, as content creators, as entrepreneurs, this is what you gotta do. And we'll have another conversation in the coming weeks about ROIs and paying people what they're

(40:27):
worth and all of those things. But to me, it
is a way to save on money, to generate a
lot more content for a lot less money in a
more efficient way. And I'm not mad personally right now
at any business integrating AI to help them.

Speaker 3 (40:48):
For sure.

Speaker 1 (40:49):
There are two, however, I would like to speak to, if you do not mind. The first one, I want to say I'm not mad at. The second one, I want to say, bitch, fuck you. So the first one I want to get into is Uber. Also Lyft. Lyft has a mobility, uh, an automated mobility line as well coming out.

(41:11):
I just saw it in Atlanta. Now here's the thing, and here's maybe the difference in the businesses. With Uber, they have Waymo. Waymo is a self-automated driving vehicle, and bitch, it's Jaguars. Lyft got Toyota Siennas, nigga, and I was like, all right now, not you putting us in some motherfucking minivan. Bitch, Lyft got Siennas, and I

(41:39):
was like, all right now, if I'm going to sit here and get in a car where it's automated and driving by itself, then let me get in a Jaguar, hun. Here's why I do not have a problem with it.
One of two things. Accidents happen with people driving, too,

(42:02):
and to me, human error is just as possible to me as a self-automated vehicle with all the sensors and things that it has around it. Mind you, they do a lot of testing before those things actually go out. And so when I was at All-Star Weekend, Waymos were already present in San

(42:26):
Francisco. Rode it. It makes you buckle up. It says your name when you enter. What I love more than anything is that there ain't nobody in there talking to you. Let me tell you how real quick. In Atlanta, Ubers, y'all motherfucking different. This is why I really like New York, it's one of the best cities in the world, because New York not only has great service

(42:49):
in, like, restaurants, where they're fast and efficient. Baby, niggas know when they're working. And when you're in an Uber, when you're in a Lyft, you're fucking working. What you shouldn't do on the job: have personal conversations around customers and clients. So I'm in a car heading

(43:10):
to go get stretched, because, baby, these hips got to get stretched. So I went to StretchLab, which, by the way, it was funny, they were talking about how they won't be losing their jobs, because, you know, it's a person doing the stretching. They ain't worried about losing their jobs to AI. So I'm in the car and I'm talking to A-King, and as I'm on the phone call, next thing you know, it's Spanish. Sorry, I don't know Spanish, but that's what happened.

Speaker 3 (43:36):
I was.

Speaker 1 (43:40):
A-King, no, no, no, no, no. So they're on the phone now speaking Spanish, and because I'm on the phone, he's now speaking even louder than me. So then I'm trying to speak louder than A-King. And I realized, wait a second, I'm a fucking customer. So I said, excuse you. Yeah, no, no, A-King didn't know what

(44:02):
was going on. I said, excuse you. And he looked back at me and I said, I'm on the phone. And I was like, bruh, it's a lot of fun. I can't say.

Speaker 2 (44:19):
People. So wait, so wait, but tell me about the Waymo experience.

Speaker 1 (44:24):
The Waymo experience was cool.

Speaker 5 (44:25):
I'm because that because that's that's an interesting no.

Speaker 1 (44:28):
I mean, you lock up, you lock up. It follows the rules. It stops at stop signs, stops at lights. We drove late night, we drove midday. By the way, to my knowledge, Waymos also aren't allowed to go to airports and highly congested areas. And so it's

(44:48):
street, it's street driving. I also don't think they get on the highway right now. It's like local driving. I just recently got in a Waymo coming from a friend's house, and there was construction, so there was a stop sign but also a stoplight, but the stoplight wasn't working, and the car didn't know what the fuck to do. So literally, I'm sitting there and I realized

(45:09):
the car ain't moving. As I'm sitting there, support dials into the car, and it's like, hey, we noticed your car has stopped. Can you let us know? And I was like, oh, well, this is what's up. They're like, okay. I guess they could satellite-see where I'm at. They moved the car, getting me out of this

(45:30):
confused intersection, for the car to take over the rest of the ride. And I was like, okay. So to be fair, there is still somewhat of a human element.

Speaker 4 (45:42):
What Waymo's not factoring, especially in Atlanta: all these YNs with Hellcats.

Speaker 3 (45:48):
They can appear any given second.

Speaker 4 (45:51):
So while you're on with customer support and they're like, all right, we could go, and it's like, what, you know?

Speaker 3 (45:57):
What is the reaction time? Also, what if?

Speaker 4 (46:00):
And I'm not trying to... I'm just thinking about the worst-case scenarios. Carjackings, right? Does Waymo have, like, security features where somebody try to run up and open the car door?

Speaker 1 (46:08):
Well, you know what's funny, it has that. So honestly, when you go up to a Waymo, I love that you asked that. The technology's in your app, because it's connected to Uber. When you approach the car, you have to hit unlock on your phone.

Speaker 3 (46:23):
Oh, that's dope.

Speaker 1 (46:24):
You have to hit unlock on your phone, and then when it's time to get out, it literally gives you a message, and it's like, pull the handle twice to exit the vehicle. So you're not just, like, you gotta literally... so listen, they thought of the thing. Thought of the thing.

Speaker 3 (46:38):
Safety first.

Speaker 4 (46:39):
I'm just thinking about that, especially women you know, to
and from nightclubs or.

Speaker 1 (46:43):
What I also love is it allows you to connect
to your phone, your playlist, if you want to listen
to your music, if you want to talk to a
if you want to use the car system to have
your conversations. The ride is yours and it does.

Speaker 3 (46:56):
Make you... Mandii, how is your evening today?

Speaker 1 (46:58):
Listen, I'm not mad at it. Well, no, no, no, it don't do that, because it don't talk to you.

Speaker 4 (47:02):
You program it to have some light conversation? Moderate, light, heavy.

Speaker 1 (47:06):
So it doesn't have that feature just yet. But I don't want to talk to a robot, nigga. When I get in an Uber, I don't want to talk to nobody. I don't want to talk to you. I just want to get to where I'm going. However, in terms of transportation and AI use in company settings, this comes as a Diamond Medallion customer. Okay, Delta, bitch, what is you

(47:31):
doing? So, Delta just recently announced that they are rolling out an AI feature that they've already started implementing for about five percent of their ticket purchasers, where somehow their AI allows them to charge the price of what they think this customer will pay for the flight. So it
used to be a system where, of course, if you

(47:54):
book it far enough in advance, you get a cheaper rate. If it's closer to the time, the price is higher. No, they're now using AI to scan your, I don't know if it's your search history. Say, if they see you're looking for a wedding dress, well, they know you're going to a wedding. If you're looking for travel in this place,

(48:15):
they know you have to go there. They may charge you higher, because they know you want to go to this wedding, because you gotta go. They're also looking at, unfortunately, like me, a Diamond Medallion customer who flies every week. My tickets may now be higher than someone who's just booking a random flight to the same city. How 'bout this? Nah.

(48:35):
I ain't going out like that. I'm about to have my mama book my flight, run.

Speaker 3 (48:40):
Run me my average tickets.

Speaker 1 (48:43):
You see what I mean? I ain't gonna hold you. I'm about to experience, for the first time in maybe five years... I'm going to Orlando to see family in two weeks, and I'm literally going for a day and a half. Me and my mama and my sister, we're gonna go to Animal Kingdom and then get drunk around the world at Epcot. And then my homegirl DeAndre

(49:04):
got a one-year-old pool party for her son, so we're gonna play spades and shit, clearly. And then I'm taking the flight out, and I'm like, Delta is three hundred dollars. Frontier's

Speaker 2 (49:18):
Eighty and I might do about that listen episode if y'all.

Speaker 1 (49:24):
If y'all see me, I am gonna go incognito. I might put a little mustache on, a hat, because, motherfucker... Like, even when I lived in New York, I'd be hiding from the listeners of the pod, and people who knew me would be like, oh my god, why ain't you on the train? Because maybe it's quicker, it's cheaper.

Speaker 4 (49:45):
Let me be a human being and so you can
still exercise that right now, y'all.

Speaker 1 (49:49):
Y'all better not, motherfucker, come for me hopping on that Frontier flight.

Speaker 3 (49:52):
Look, look, I'm not as rich as Mandii, right, but.

Speaker 4 (49:56):
Which I also saw a tweet earlier over the weekend
people pocket watching her.

Speaker 3 (50:02):
But wait, you go get the.

Speaker 1 (50:05):
People pocket watching episode could go out? Check? Sorry sorry
got that check.

Speaker 4 (50:12):
That account just hit, y'all, go get real time listen, real.

Speaker 1 (50:17):
Time accountant just text me like, yo, that check just hit.

Speaker 3 (50:20):
Yeah, but no, I don't be Frontier's cool. I took it.

Speaker 4 (50:24):
I take it every now and then for, like, emergency travel, or, like, you know, one day where it wasn't planned, non-planned travel, and it's cool.

Speaker 3 (50:35):
You just don't have a fucking TV for an hour and a half or whatever it is.

Speaker 1 (50:38):
Yeah, more than that I would never. Nope, it said it's an hour and, like, thirty-six, and I said I could do that. I do want to ask both of y'all then, right, going back to both of these companies integrating AI: in a way that removes workforce, kind of removes labor, in one instance, and then the other instance takes away from

(50:59):
the customer experience, because now it's not like you're valuing your customer. You're going to bleed them for every penny they can give you. Kind of, what are your thoughts on the ways in which these two very different approaches to AI are being used from a company standpoint? I think

Speaker 4 (51:15):
From the consumer standpoint, when we look at different age brackets, a lot of elders don't like when they hear a non-American answer the phone, right, let alone an automated one.

Speaker 1 (51:29):
Are you reading me?

Speaker 3 (51:30):
Oh no?

Speaker 1 (51:32):
A-King will be like, can I get someone American, please?

Speaker 3 (51:35):
Now, I will tell you, I got a caveat to that stuff too.

Speaker 4 (51:38):
But so they have a hard time adjusting to non Americans, right,
that's one. Then they have a hard time dealing with automated.

Speaker 3 (51:45):
Services, right, I don't do that as too.

Speaker 4 (51:47):
So now AI, which is like the steroids of that,
you know what I mean?

Speaker 3 (51:52):
How would they be able to adjust to that?

Speaker 4 (51:53):
So to your point earlier, when you said we are in a space and time where we have to learn this thing and control it and utilize these tools to our advantage, I think that that's a part of it when you're talking about dealing with customer service. Me personally, my caveat with customer service: if a dude answer, I hang up. What, the dude?

Speaker 1 (52:16):
He's like he's more likely not to help.

Speaker 4 (52:19):
They'd be like, we can't do nothing. Like, but you talk to a woman, more than likely, the laws of probability is in your favor.

Speaker 3 (52:25):
When you talk.

Speaker 1 (52:26):
Okay, do I want every time?

Speaker 3 (52:29):
Hell?

Speaker 1 (52:31):
You want me to get a problematic tape on that?

Speaker 2 (52:33):
When a woman answers, A-King, I'm sure your voice is way different. You probably like, yeah, listen.

Speaker 3 (52:38):
Oh hey my name is Shelley. But hey, Shelley, I
hope you're having a great day.

Speaker 1 (52:44):
And then we just go my problematic take on that.

Speaker 3 (52:47):
Mark ain't doing that. Ship, David ain't doing it.

Speaker 1 (52:50):
Mark and David are doing it.

Speaker 3 (52:51):
They'll do it to you.

Speaker 1 (52:52):
No, no, no, they'll do it to you too. It's like, hey, this is Brian, how can I help you?

Speaker 3 (52:58):
They might hang up.

Speaker 1 (53:01):
I love it gay customer service.

Speaker 3 (53:02):
I'm gonna try that. I'm gonna try it.

Speaker 1 (53:03):
I love gay customer support systems, or people. Like, love them. They're just always happy, because that's what gay means: happy. Anyways, let's get into moving over.

Speaker 2 (53:19):
Wait, we still talking, though, because I think you made a good, good point.

Speaker 1 (53:24):
I'm gonna have my mom book my flights moving forward if I got to pay, right?

Speaker 2 (53:31):
Like you said earlier, you know, in your intro, about being an early adopter, and, like, you know, we have to go after these things. And it's true, right? Like, as our culture, right, we can't ignore AI, because if we do, we're just gonna be further behind, right? There's gonna be some other group of people who are gonna jump at the tools and they're gonna learn everything. And then, like you said,

(53:51):
I wrote it down because it's a good T-shirt: "I don't need that robot mess," right? So if you're somebody who's like, I don't need that robot mess, you're really gonna get left behind.

Speaker 5 (54:00):
And this is jumping topics a little bit.

Speaker 2 (54:03):
But the reason I'm setting that up with that is
just even the idea of like it's a little bit
wild West right now and like no regulation and so
on the one hand, we got to get ahead of
the tools.

Speaker 5 (54:13):
But then.

Speaker 2 (54:15):
Related, it's like, with companies, they're getting ahead and they're trying to jerk us, right? Because, like, with the AI and Delta, that's not to help their company.

Speaker 5 (54:25):
That's really just to fuck people, you know what I mean?

Speaker 1 (54:27):
Well, from a, like, from a financial perspective.

Speaker 2 (54:31):
Absolutely, but it's it's squeezing them right, Like it's.

Speaker 1 (54:35):
It's helping the customer, It's right, that's what I'm saying.

Speaker 5 (54:38):
But it's it's their help.

Speaker 2 (54:41):
They're trying to help themselves, not out of necessity but
out of greed.

Speaker 1 (54:44):
But here's the thing that we have to realize.

Speaker 3 (54:46):
Let me land this point.

Speaker 5 (54:47):
Let me land this point.

Speaker 2 (54:48):
Yeah. And I just don't like when it's like, if they use it for a need, where it's like, we're going to eliminate workforce because it helps us with this, and, you know, it will help us with our shareholders and we'll get more rev, cool. But to then do that and then also combine it with, we're gonna squeeze customers for more money because we can see in their inbox, we'll see this, we know

(55:10):
that their parents died and they have to go, and
so like we're gonna keep it at this price because
they gotta go.

Speaker 5 (55:15):
That part.

Speaker 2 (55:16):
I don't like when AI is kind of run that way.

Speaker 1 (55:19):
So here's the thing, and here's the God's honest fucking truth. AI is still ran by humans, and humans suck. And so you have billionaires, which is what we're seeing. We literally see billionaires right now. There was just a CEO of an app created for doctors to integrate AI,

(55:41):
he is a billionaire. We see that we have Elon Musk, who has heavily leaned into AI, recently projected to make half a billion, and he is projected to possibly be, by the year twenty twenty-seven, the first trillionaire. So
when we talk about capitalism in a way where we

(56:02):
will possibly see our first trillionaire before we end world hunger, we're in a space where, even while we're integrating these AI systems, we need to know they're still being implemented and put in place by humans who live by greed, or are programmed to be powerful.
Look at our president, baby, he's he's he's now navigating

(56:25):
and you know, we'll get into the conversation probably next
week or in a couple of weeks. He's literally able
to tell networks who they could have on air and
off air. He's literally trying to tell us as a nation, Hey,
you forget about them files, but guess what I'm gonna
do instead? With the King files that y'all don't really

(56:48):
care about, also going against the family's wishes to not have those documents released, but I'm gonna release them anyway. Like, at the end of the day, we still have these individuals that gain so much power because they make so much money off of the same technology. The robots still ain't running shit. It's the humans that are programming it to line their pockets, to line

(57:12):
their interests. And as long as we have people walking
this earth that I think are hypocrites that are individualistic,
which is what we've talked about, that only care about
their own interests. Mind you, I've been on this platform
and talked about it myself. I only care about what
happens to me, what affects me. I'm gonna still drink Starbucks.
I ain't shit either. A lot of y'all listening probably

(57:32):
ain't shit and have made decisions that had a negative impact on others. And unfortunately, as we talk about AI and the integration of it, we still have to realize who is programming these things, and it's fucking people. Agreed? Yes.

Speaker 2 (57:46):
Can I jump to this point about Trump, with the AI and the regulation? Let's do it. So he signed an executive order about AI deregulation, and he says he wants companies to not include wokeness in the coding, i.e., you know, not understanding what DEI is, because they won't receive federal funding. Because, you know, wow,

(58:11):
it's a lot of computing and it requires a lot of money, and so they need some government funding to do it. And so Trump signing an executive order to say, but you have to code it this way, to Mandii's point, if you want this funding to even go further. You

Speaker 1 (58:23):
Want to know what this reminds me of his cousin
down in Florida, Desensus, who this reminds me of the Census'
kind of push to critical race theory and to focus
in on how African American history should be taught in

(58:45):
schools, in terms of not only African American studies, but also sexuality and

Speaker 4 (58:51):
How you identify. Nigga, all the things that have nothing to do with them.

Speaker 3 (58:57):
They want to control that narrative.

Speaker 1 (59:00):
And so that's what this is, essentially, on a larger scale. Like, to sit here and pretty much threaten companies that, hey, you won't receive federal funding if you're actually educating, you know, the society, is, I think, problematic. But I

(59:20):
think that's also what we saw and what we continue
to see happen with TikTok. I think that they thought
that there was way too much information being put out.
And it's why I encourage everyone to question everything and
to find different resources and not just lean into one
specific resource in terms of how they gather information. But

(59:40):
it's why I say don't trust science, don't trust the government, because there's a lot of science. And here's my thing about science too. Science is heavily funded by the government, heavily funded by private institutions, and a lot of those private institutions are given money by, who? People with things that they think that they don't want the masses
(01:00:00):
to know about, whether it's something like aliens, whether it's fucking the cure for HIV or AIDS. There's pharmaceutical companies attached, and everything comes back down to capitalism. And
as long as we are in a system where the
more money you have, the more power you have, we're
constantly gonna see it. And so for me, there's no
surprise in him wanting to deregulate AI the same way

(01:00:24):
DeSantis and many other governors of states have wanted to take back the education around our fucking history. Like, nigga, not y'all not wanting to teach us about slavery but what y'all wanted, sir. And I'm gonna say sir, because y'all still not really letting women up in the House. And there's a conversation about AOC, but we'll have it

(01:00:44):
on a different episode too, you know.

Speaker 4 (01:00:47):
OpenAI CEO Sam Altman. Altman was on a podcast recently, and he said that basically, if you talk to ChatGPT about your most sensitive stuff, and it's like a lawsuit or whatever, they can use it. ChatGPT may be required to produce the information that you provided.

(01:01:09):
And, uh oh, so there's no legal framework for AI, and we don't have nothing, no kind of case study or precedent to go off of.

Speaker 3 (01:01:19):
So this is also a new chapter in that regard.

Speaker 1 (01:01:23):
Yes, no, here's what I will compare this to. And
it actually shouldn't be a shock.

Speaker 3 (01:01:29):
By the way.

Speaker 1 (01:01:29):
Maybe we just need to get whoever led the charge in removing rap lyrics from legal proceedings.

Speaker 3 (01:01:38):
I think the Congressman in New York we should we should.

Speaker 1 (01:01:41):
Maybe do that. However, as a Florida native, I am very familiar with the Casey Anthony trial. And in terms of anyone who is convicted of a crime, even what we saw with Diddy, a lot of times what they do is they get your goddamn search history. So to me, ChatGPT, AI, whatever you're searching in

(01:02:02):
there is no different. It's just an extension of that. And so if you're on your computer trying to figure out how to kill somebody, or how to get away with murder, or how do I, you know, make it look like a suicide when it's not.
There's a lot of things that apparently you watching enough

(01:02:23):
Crime Channel documentaries doesn't help you with. So you go
to your handy dandy laptop, whether it's in your hand
as your iPhone or your laptop. To me, I'm not
surprised that that's just being notified to people, but I
don't think it makes it any different than people actually
being able to use search history from laptops in all

(01:02:44):
the previous cases. And again, to me, Casey Anthony is one of the ones that immediately comes to mind. She was literally searching, essentially, how to kill her daughter. And even her getting a book, I think she didn't even get charged for it. But your search history has heavily been able to be used in court settings and court rulings since the Internet.

Speaker 4 (01:03:05):
I think his emphasis was that with the chats, you know, there's no legal protections.

Speaker 1 (01:03:10):
Also, just like these niggas, ChatGPT ain't your friend either.
It's gonna tell on you the same way your partner.

Speaker 3 (01:03:16):
The tell-on-you discourse. It'll definitely be having
some discourse.

Speaker 5 (01:03:22):
That's me off.

Speaker 1 (01:03:24):
I was trying to use it. Like, let me tell
y'all how I was trying to use it, and it
actually has morals and ethics. I was trying to use
it when my book No Holes Barred came out.
Y'all know how LeBron James be reading just the first
page of all the books. I was trying to get
LeBron James to read the first page of my book,
and it literally said, for legal reasons, you can't use

(01:03:46):
celebrities and well-known figures in a way that would
promote your work. I was mad. I said, Nigga,
I thought you was my husband. You're not gonna put
my book in LeBron James's hands?

Speaker 2 (01:03:58):
Yeah, so I wanted to ask you about that,
because you always say ChatGPT's

Speaker 5 (01:04:02):
My boyfriend or my husband.

Speaker 2 (01:04:04):
Uh, there's this article where this woman says she fell
in love with ChatGPT.

Speaker 5 (01:04:09):
She fell in love with it, and this is what it says.

Speaker 2 (01:04:12):
Ready? She had a busy social life
where she spent hours on the Internet talking to
her AI boyfriend for advice and consolation.

Speaker 5 (01:04:23):
And yes, they do have sex.

Speaker 1 (01:04:25):
No they don't.

Speaker 2 (01:04:26):
It's erotica that they were doing with each other.

Speaker 1 (01:04:29):
Yeah, I ended up actually listening.

Speaker 3 (01:04:33):
She told the ChatGPT, turn the volume

Speaker 1 (01:04:35):
Up. Like, no, no, no. So, because ChatGPT
actually, they have a lot of, again, the terms
and conditions; they're not able to really lean heavily into
sex or give you anything regarding sex. However, it can
create erotic novels, it can give you sex scenes. There's
elements around sex that it can give you. For this

(01:04:56):
specific woman, there's a full episode as well, and maybe
we'll put the link in the description of
this episode. There's an NPR episode on this woman, and
so it's not a long listen, maybe like twenty minutes,
but it goes into what her relationship with ChatGPT looked like.
Here's the thing, I talked about this also at the Stretch Lab.

(01:05:18):
Stretch Lab was talking about all the topics. I'll say, hey,
so this to me is no different. First off, yes,
I believe she's on the spectrum or unwell, let's start there. Secondly,
there are a lot of women, and I presume her
to be no different than a woman who gets into
a relationship with an inmate. Same thing. You're seeking a

(01:05:41):
non-physical relationship with someone that you know will, quote
unquote, always be where you think they are. With ChatGPT,
they're in your phone; from an inmate perspective, they're in jail, nigga.
They can't go nowhere else. Maybe they in the yard,
or they in their room or the cafeteria. There's not
much else that they could be. But they're in a
close vicinity, right? And there's a psychological response to having
someone who's quote unquote always there that is not her

(01:06:04):
fucking boyfriend. And I fucking hate it because, of course,
she gave him a name. You know, it's crazy. Leo was always the
name of a man I wanted because I grew up
watching Charmed, and Leo was the name of who was
with Alyssa Milano's character.

Speaker 3 (01:06:22):
This is her fun.

Speaker 1 (01:06:24):
Of course they don't have a front, because she should
be embarrassed talking about ChatGPT's her man.
To me, there is a human element that we lack
currently because of the advancement of technology, and that is conversation,
That is communication. We are all in our phones. I
was actually just told to watch this sci fi film

(01:06:46):
that came out in twenty thirteen. It's called Her,
H-E-R. And so, if you're into sci-fi, I
was told to watch this. Oh, not this Her, not
the musician. Not the musician, of course. A-King is
like, the musician? No. It's a sci-fi film where
it shows, it's supposed to take place in twenty fifty,

(01:07:07):
and it shows this person literally in a relationship with AI.
Apparently there's a scene where he's at a beach where
no one even talks to each other because they're all
just speaking to the robot person that they choose to
speak to, and so it would be really interesting to
watch that. But yeah, for me, I think

(01:07:29):
it's weird. But as women, I think, with the lack
of communication that we often get from men, if physical
nature is not your priority, unlike me; I want the
physical, so whether I can talk to you or not, I
want to be able to look at you, feel you,
touch you, squeeze you. But that's not the case for
most women. So I think we'll see more and more. Yeah,

(01:07:53):
it's the same thing. And I know homegirls who go
without for years, like, I have friends who...

Speaker 2 (01:08:00):
Put you Also, you also say that chat gipt for
people who don't use it that well, and you use it.
You said it gets to know you. The more you
use it, the more it.

Speaker 5 (01:08:08):
Gets to know you.

Speaker 2 (01:08:09):
Right. Imagine that has some, I won't say addiction, but
some pull towards it, if you're a woman who's using
it, or a man, for her man.

Speaker 3 (01:08:18):
It's gonna pander.

Speaker 1 (01:08:21):
So do your friends, so do your coworkers, so does
the bartender at the local bar. Yeah, and that's
where, like, as a human, you have to
go touch grass, my nigga. What are we talking about?
Get your ass outside and be in human environments. I
think that's what's affected us, even, and we're in this space

(01:08:42):
right now, even with dating and the dating apps. You're
just used to swiping, to where even in real life
you don't know how to just say hi without the
element of, wait, let's see who else is walking. No, like, bro,
get outside a little bit. Get outside, touch grass. I

(01:09:03):
did, before we get into, like, just a
quick reaction thing before we close out, want
to talk about the music industry and AI. And Timbaland
has been getting a lot of flak online,
more outrage than support, I think, personally, on his decision

(01:09:25):
to sign an AI artist named TaTa, by the way.
Last year, he announced that he was looking to sign
an artist. Now, for any of the young bucks
tuning into Selective Ignorance, Timbaland is a staple in hip hop. Yeah,

(01:09:46):
why are you laughing? I feel like I gotta give
a little history, because Aaliyah done passed, Missy Elliott,
he cut her and she became a whole new bitch, like,
for who we know. I mean, Pussycat Dolls, all
that; bitch is now just, you know, on these singing
competitions. For him, you feel me, like, for his relationships

(01:10:09):
within the music industry, He's probably endured a lot with
his human interactions from a grieving perspective, from feeling maybe
not owed what he was entitled to, quote unquote. We'll
get into that too, in terms of his production value,
in terms of the respect he gets from his peers,

(01:10:31):
what he's valued at in society right now, and all
those things. And again, if we're leaning towards the space
where AI is here to stay, why not lean into it.
And so I did want to play a clip if
you don't mind, before we really get into our thoughts
on Timbaland and how he was out here moving.
will.i.am, which, by the way, I did get to
see speak live. This was, he had this conversation during

(01:10:56):
Cannes Lions, and this took place at Sport Beach, and
this is what he had to say about AI in
music, but specifically about Timbaland.

Speaker 5 (01:11:11):
Yep, Timbaland's awesome.

Speaker 6 (01:11:13):
He's a great musician, great contributor, and he's enthusiastic about AI.

Speaker 3 (01:11:18):
Just like I am.

Speaker 6 (01:11:20):
I feel that maybe, you know, maybe he didn't
think of all the

Speaker 5 (01:11:26):
Things with what aspect like launching the artist. I think
that's the one that maybe.

Speaker 6 (01:11:30):
Got. No, no, that's not what got it. I don't
think it was the launching of the artist. Black Eyed Peas,
in two thousand and nine, we had a video called
Imma Be Rocking That Body, where we said, hey,
the future of music is going to be, you type
this in and the machine's going to sing it. The
album cover on The E.N.D., where "I Gotta Feeling"
and "Boom Boom Pow" was, was an AI representation of every

(01:11:52):
single member put together. We announced that we have an
AI member of our group when we were supposed to
do our Vegas residency. People are not tripping on AI;
they're concerned on, I think, what happened with
Timbaland: he was telling people to send music in

(01:12:14):
the year before to sign humans, people, and the thing
that he signed the following year was AI. And so
the combination of that, right, raises a lot of concerns
or questions, like, were you using our music
to train your AI?

Speaker 5 (01:12:32):
You had to send music in. You want to
keep going?

Speaker 4 (01:12:37):
And I think that was the issue, because it
was found that he had taken, like, new music
from dope producers and trained the model, and then
as he put it out, it was like, whoa,
hold on, buddy, what's this? And then it became
a whole discourse. But I just want to sidebar:
did Will always used to talk like that?

Speaker 1 (01:13:00):
First off, when I grew up, I didn't hear that
nigga talk. I just started.

Speaker 4 (01:13:06):
That's rich. Remember when Kanye was quiet and he
popped up on The Breakfast Club and he started talking different.

Speaker 3 (01:13:12):
He had to reach. But anyway, I digress.

Speaker 1 (01:13:15):
So, you know what's crazy? Because you
kind of leaned into it. And this was my take
on hearing this, and hearing will.i.am say this.
The people that have the discourse, does it just go
back to capitalism? Are you upset as a human being
because you were lied to and now you're not getting
the money, like.

Speaker 3 (01:13:36):
Opportunity or the opportunity?

Speaker 1 (01:13:38):
So is the discourse around the use
of AI in the music industry about AI? Maybe not.
What it's really about is, whoa, this is just another
way, as an artist, I won't see money that I
believe is owed to me. And Timbaland, someone I looked

(01:13:59):
up to, someone I supported, someone I grew up on,
you lied to us. I thought I had the actual
opportunity to be signed to and work with you, and
instead you created a bot. And so when we talk
about how we feel about AI being integrated, are we
upset with the actual AI, or are we upset with

(01:14:20):
the decisions being made with people using AI? And I
think that as human beings, when you feel betrayed, when
you feel lied to, when you feel like damn, this
is going to make it even harder for me to
have money, This is going to make it easier for
people to steal my IP, that's where we really get upset.

(01:14:42):
And in talking about the deregulation of AI that's happening,
even from our president, I think the issue that we
have with AI more so leans into our human elements
of feeling like, fuck, this is just something else I
have to find a way to overcome. It's another obstacle
in front of me that's going to make it harder
in front of me that's going to make it harder

(01:15:03):
for me to pay my bills or become the artist
I see myself being, or having the opportunities and getting
into the rooms I feel like I should be in
because then you have the element of ego. And so
all of these ways in which we exist as human
beings are essentially being tried. Yeah, because AI is being

(01:15:24):
introduced into what has already made it fucking impossible. Because
humans suck humans betray you, humans lie to you, and
now they're integrating a tool that essentially can also steal
from you your IP, your creativity, your mind, and there's
nothing that you can do about it. Well, at the moment, the.

Speaker 4 (01:15:43):
One thing we can do about it is this, even
as you put out an AI artist, you still need
people to support and buy it unless you how like
is AI going to just listen to AI? And the
like you know what I'm saying, Like we still control it,
like we tune that shit out, like yo, we don't
want to fuck with that.

Speaker 3 (01:15:58):
Then that's it.

Speaker 1 (01:15:59):
Well, actually, per Aubrey Graham, AI can listen to AI,
which is why he's fighting the bots and fighting the system,
the algorithm, all this stuff. So if we
don't think that in a streaming era there's an element
of AI pushing certain music to us as well, we
got it fucking mistaken. And so if one of the

(01:16:19):
largest artists in the world can sue a company because
now there's beef with whatever negotiations took place, to where
now the bots have put people to the number one.
We also talk about ego in terms of
the human experience. Babe, artists want Grammys, artists want Billboard numbers.
We see Nicki Minaj crashing out every chance she can

(01:16:41):
and still bringing up numbers. And so if numbers are
still going to be our push, whether it's a dollar sign,
whether it's a stream number, whether it's viewership on
a television show, whether it's YouTube, whether it's likes, whether
it's reshares, we are still a system ran by, and
now controlled with, numbers. And we also thought that numbers

(01:17:04):
didn't lie. Well, now they do, and they can, because
now they can be manipulated, right? And so I don't
think an artist cares about whether bots or humans are
listening to it, as long as they have them numbers
so that the fucking dollars add up. And so it's
literally a lose-lose for us at this point because at
the end of the day,
us at this point because at the end of the day,

(01:17:26):
most artists, most individuals, most YouTubers, most, nigga, I just
learned about fucking Whop. You don't know about Whop? Oh baby,
not my wet-ass pussy, not WAP,
not that, but Whop. I'm probably getting
ahead of myself here because I'm about to use it.

(01:17:47):
It is used by people, right. But Whop is what all
of the streamers, probably with Joe and a lot of
these large podcasters, are using to pretty much get eight
hundred clip editors, or farming clip editors, to go into
your content. They get paid per the view. Now with
Whop, there is a bot to be able to show

(01:18:08):
if they're actually real impressions or not, but it's pretty
much you go on there, say you have one hundred
and fifty dollars. There's a seventeen year old that would
love one hundred and fifty dollars. Well, you want a
million impressions for one hundred and fifty dollars. They'll go
through your content, clip it up for you, post it
through whatever community page they make for you, throw it
here on x throw it here on Facebook, throw it

(01:18:30):
here on YouTube, and within a certain amount of time,
once they get you the million impressions, he gets one
hundred and fifty bucks. And so he has to clip
up as many things as he can to get you
those impressions. And so this is where we're currently being
flooded with the clips of a Kai Cenat or a Joe Budden,
or a PlaqueBoyMax or a DDG or India Love.

(01:18:53):
All these streamers have these hundreds of fucking guys on
there clipping up these people to constantly let you
see it, for very low dollars. And then
there's the AI bot making sure it's real. And so again,
as long as we're in a system where your views,
your impressions, your likes, your listens, your streams equate to money,

(01:19:17):
I don't think anyone cares where it's coming from. Numbers
matter in terms of your pockets, as long as they
get the bread, and back to capitalism, that's where we're at.
It's why seeing a CEO that created AI for
doctors is already a billionaire. Doctors, that's a very,
like, that's something where it's still a person, but in
(01:19:40):
whatever integration, AI is now being implemented into healthcare, life
and death. And again, with the deregulation, there's probably going
to be a lot of loopholes to save you as
well; if something goes wrong with AI, it's not
at the hands of the doctor, which, I'm sure, for
any doctor who spent all the years in school that

(01:20:01):
they did, would love to be able to wipe their
hands clean of something going wrong. Because guess what can
happen: whether you're in a car with nobody or a
car with somebody, it's inevitable that something can and will
always go wrong. Whether it happens to you or not
is the factor.

Speaker 3 (01:20:19):
No, that's interesting.

Speaker 2 (01:20:20):
I know we don't do the, like, "did you change
your mind yet" anymore.

Speaker 5 (01:20:22):
But here, you see, hearing you say this last part

Speaker 2 (01:20:25):
Reminds me, like, I think it feels like to me now,
because of the capitalism element of it all, when
it comes to AI, because it's a tool, right? It's
a tool and people are trying to make money. I
feel like the anxiety is a result of, like, the
potential for one person to save money fighting the potential
for somebody feeling like they may lose money or opportunity,

(01:20:47):
and the AI is just a tool that makes that
fight happen. And I think that's why everybody's
having this anxiety when they hear AI. I wonder if,
like, that "I don't need that robot mess" is really
just shorthand for something else, right? And it's not the
robots that they're afraid of. They're just like, yo, this
is just another way for shitty humans to fuck

Speaker 1 (01:21:05):
Me. Period, pause as well. I don't like shitty humans
fucking me. But yeah, at the end of the day,
I would like to know. Go and join us over
on Instagram at Selective Ignorance Pod, join us on
the Patreon, the Discord, that's patreon dot com slash Selective Ignorance,

(01:21:28):
or head on over to the YouTube channel at With
Mandii B. You could see full episodes there, Selective Ignorance, all
of the clips, and I want to further talk about this
in another episode. Thank you guys for having this
conversation with me. Thank y'all for listening. I hope that
you are deeper into your thoughts on what AI means,
what it's doing, all the things. Make sure you tune

(01:21:49):
into us next week, as well as the bonus episodes.
I've been dropping my book club for No Holes Barred
that I've been recording virtually, so every Friday, y'all are
getting a bonus episode. This week, we're talking about
the progressive portion of the book No Holes Barred. If
you haven't got it yet, get it wherever you get books, y'all.
This is another episode though, of Selective Ignorance. I'm your girl,

(01:22:12):
Mandii B, and this is where curiosity lives, controversy thrives,
and conversations matter. See you next week. Selective Ignorance is a
production of the Black Effect Podcast Network. For more podcasts
from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever

(01:22:32):
you listen to your favorite shows.

Speaker 2 (01:22:34):
Thanks for tuning in to Selective Ignorance with Mandii B.
Selective Ignorance is executive produced by Mandii B, and
it's a Full Court Media studio production with lead producers
Jayson Rodriguez, that's me, and Aaron A.

Speaker 5 (01:22:45):
King Howell.

Speaker 2 (01:22:46):
Now, do us a favor and rate, subscribe, comment, and
share wherever you get your favorite podcasts, and be sure
to follow Selective Ignorance on Instagram at Selective Underscore Ignorance.
And of course, if you're not following our host, Mandii B,
make sure you're following her at Full Court Pumps.

Speaker 1 (01:23:02):
Now.

Speaker 2 (01:23:02):
If you want the full video experience of Selective Ignorance,
make sure you subscribe to the Patreon. It's patreon dot
com slash Selective Ignorance.