
June 26, 2025 • 63 mins
Black faces still can’t get seen by AI?! In 2025, tech still fails us—from facial recognition bias to voice assistants that ignore AAVE. Are we being erased in the digital future? Meanwhile, Black Sudanese immigrants face deadly deportations, Diddy’s wild hotel “freak-offs” leave luxury chaos behind, and Pusha T’s surprise label drop stirs major hip-hop drama. This episode breaks down the bias, the backlash, and the headlines shaking Black culture right now. Tap in!

SUBSCRIBE TO OUR CHANNEL NOW:  YOUTUBE.COM/@TRUTHTALKS-LIVE

Become a supporter of this podcast: https://www.spreaker.com/podcast/truth-talks-live--6611166/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Truth Talks, a show where four totally different people get together and argue about what's going on in the culture. Kind of like friends meeting in a bar for a drink, but we don't always like each other, like all the way. Like I don't even like Dimitri at all right now. That's just the Bible.

Speaker 2 (00:18):
I'm on.

Speaker 1 (00:19):
I didn't even talk to him today. That's just how I came out of bed this morning. So we're gonna see how that portends for the show. Good luck to him anyway. Today we're gonna be talking about Diddy's dirty ass hotel rooms, Pusha T firing at Drake and that whole funk, and if AI doesn't work right, you know

(00:39):
what could happen to you?

Speaker 3 (00:40):
Your black ass might end up in jail. Truth Talk
starts right now.

Speaker 2 (00:57):
Are y'all ready to roll?

Speaker 3 (01:02):
Welcome, Truth Tellers. It's another episode of Truth Talks.

Speaker 1 (01:05):
So we're gonna get all the way into Pusha T dissing Drake, because I love stories where somebody is dissing Drake. If Drake has one million haters, I am one of them. If he has one hater, it is me. And if he has zero haters, it is because I am dead. That's how it is. So we're gonna talk

(01:26):
about Pusha dissing Drake again. We're gonna have a deep talk about AI. That's artificial intelligence, Dimitri, not Allen Iverson. Try to keep up. Please welcome my co-host, Doctor Cheyenne Bryant.

Speaker 4 (01:41):
What's going on, Truth Tellers? You guys know I'm bringing the truth, the whole truth, and nothing but the truth, because a lot of times my other co-hosts, they're a little more on the naive side. So I bring the reality of what these topics really need a voice to. So anyways, yeah, I'm back. You can continue.

Speaker 1 (01:58):
Welcome, friend, to the show, Areva Martin, a legal expert you've seen on CNN and ABC Live and Harvard Law and everywhere else important.

Speaker 5 (02:06):
I love a show that starts by calling the co-host naive. Hmm, the word, sis. But we'll see who's naive by the time we get to the end of this show.

Speaker 1 (02:19):
All right, sister, I appreciate that. And let me tell
you something. Late last night, the Make a Wish Foundation
called me and said, we really need a favor.

Speaker 3 (02:27):
There's a kid who wants to do a TV show
with you.

Speaker 1 (02:29):
Just one time. So please welcome, by way of Make-A-Wish,

Speaker 6 (02:33):
kid, Dimitri Wiley. I just can't catch a break with this show. Jesus Christ. I'm just here to say hi to my grandma. How you guys doing?

Speaker 3 (02:43):
What's grandma's name?

Speaker 2 (02:44):
Her name is Rose Webb. That's her name.

Speaker 3 (02:47):
Oh wow, my grandmother was named Rose. That is crazy.

Speaker 2 (02:50):
We have more in common than you think, Touré. I know, I know. All right.

Speaker 6 (02:55):
Uh.

Speaker 1 (02:56):
Let's move into Trending Truth, brought to you by our friends at Hair Maniacs. Our first story is a bit of an odd aspect of the Diddy case. We heard Diddy's former assistant, Mia (not her real name), recount the many humiliating

Speaker 3 (03:11):
Aspects of her job with Diddy.

Speaker 1 (03:14):
One of the tasks: cleaning up hotel rooms after the freak-offs, when baby oil was everywhere.

Speaker 2 (03:24):
It's disgusting.

Speaker 1 (03:25):
They also had a Beverly Hills Hotel employee testify that the rooms were destroyed after Diddy left. Sometimes they charged Diddy upwards of sixty-seven hundred dollars for deep cleaning the room. They tended to put his rooms into deep cleaning after he left anyway, and they often charged him one thousand dollars

(03:46):
extra for cleaning the room before he even entered the room, Doctor B.

Speaker 3 (03:52):
Sixty-seven hundred dollars.

Speaker 1 (03:53):
That's a lot of money to clean a hotel room.
Is that what it costs to clean your house?

Speaker 4 (04:00):
No, that's what it costs to clean up evidence like this. You see, this is called pay to play. So when you're talking about cleaning up drugs and things of that sort, my question is, and Areva, you're an attorney on here, I know, with civil rights, but you may be privy to this: from last I checked, it is illegal to

(04:22):
not report when you find drugs to this level in a hotel room, and that it is the staff's, like, you know, duty, kind of like we have a duty to report certain aspects. Now maybe not baby oil, maybe not condoms or things of that sort, but when you got drugs and things like that, I was always told that

(04:42):
those things are supposed to be reported when they are found. And so my question is, again, you know, why was the staff not reporting these things? But then again, they were getting paid, in my opinion, not to clean up the room for the next check-in person, but to clean up the evidence from Diddy's freak-off scene. That's how I look at it.

Speaker 1 (05:01):
But she's absolutely right that the employees were being used to clean up these crime scenes. But the other part of it, when the doc says, you know, they're supposed to call the police, the people in Diddy's orbit were not calling the police, because they were more afraid of Diddy, and they didn't think that the police could actually help them, which is an insane situation.

Speaker 5 (05:24):
Yeah, let's just use the word nasty. Let's just be real honest about what this is. Anyone that does to a hotel room what Diddy did to these hotel rooms is nasty. And you're right, Doctor B, this is not just cleaning up. This is covering up. This was a cover-up by these employees. And we heard testimony from

(05:44):
one of the security guards that worked at the hotel, particularly the hotel in LA where that horrific video was shot of Diddy dragging and beating Cassie Ventura Fine.

Speaker 3 (05:56):
He said he was.

Speaker 5 (05:57):
given one hundred thousand dollars to take that video and make it disappear, to give it to Diddy so that no one could ever see that violence that was perpetrated against Cassie Ventura. So everything about this case has just been horrific. It's the way the prosecution is building

(06:18):
its conspiracy and racketeering case against Diddy, showing the jury these hotel rooms, the photos, the pictures, and the video, so that those jurors could feel like they were right there in real time, being a witness to the crimes the prosecution has alleged in this case.

Speaker 3 (06:34):
Staff at the hotel told Diddy, it'll cost you fifty thousand dollars to take the tape of you beating up Cassie, right? So that's their price for a bribe.

Speaker 2 (06:44):
Right.

Speaker 3 (06:45):
He got there and he gave them one hundred thousand dollars.

Speaker 1 (06:48):
So he's like, well, I'll bribe the other people in
the situation too.

Speaker 3 (06:51):
They didn't ask for it, but

Speaker 1 (06:53):
I'm gonna bribe everybody in the room now, So he
doubled the price, doubled the bribe.

Speaker 3 (06:58):
Dimitri.

Speaker 1 (06:59):
When they arrested him, they went into his room and found he was planning to do another freak-off. Is it insane to you that when the ship was sinking and he knew he was close to arrest, he was gearing up to do it again?

Speaker 2 (07:12):
See the thing about it is it's you're comfortable.

Speaker 6 (07:16):
This is what you've been doing for years. So no matter if you're under fire or

Speaker 2 (07:20):
not, you still feel like, I am so powerful. This is a ship they cannot sink.

Speaker 6 (07:25):
And the thing about it is the laundry employee stated
there was blood, urine, vomit, all.

Speaker 2 (07:30):
Of these things on the sheets, Like what are we doing?

Speaker 6 (07:34):
Like for me, the first thing that ran through my mind when we start speaking the numbers and everything.

Speaker 2 (07:39):
Sixty seven hundred dollars.

Speaker 6 (07:41):
If a hotel charges me sixty seven dollars, I'm furious.

Speaker 2 (07:45):
If you charge me for the little bottle of water, I'm furious.

Speaker 6 (07:48):
I just think the idea of this is, this is a man so comfortable. Like Doctor Bryant said, pay to play. I'm cool with how this looks because I'm not the one who's got to clean it up. My money can take care of all of this.

Speaker 1 (07:58):
But Areva, you know, both Cassie and Jane, who was also a freak-off victim, both testified that they were expected to do freak-offs during their monthly, which is part of what we're talking about when we're reporting that blood was found in these rooms. I mean, like, that is

(08:19):
a level to me that shows a level of depravity, of not caring about these women as human beings.

Speaker 3 (08:25):
That is unbelievable.

Speaker 5 (08:27):
Yeah, Touré, it's a menstrual cycle. We're okay with that. We're grown-ups. Let's destigmatize what happens to women. Once a month, we go on our menstrual cycle.

Speaker 5 (08:38):
And yes, these women were on their cycles and they
were bleeding menstrel blood and they were still forced, they
were still coerced to participate in these sex acts with
male sex workers while did he masturbate it, and while
he videotaped them, while he orchestrated every aspect of the

(08:58):
sex act. That's been the testimony, of course, testimony from these witnesses that have come into this trial. We've not seen anything as depraved as this. I've been following trials for a minute. I've been involved in a lot of cases, and I have to tell you, what we have heard from these witnesses, I don't think this could even be written by a Hollywood producer, the level

(09:22):
of depravity that we saw and are witnessing as the witnesses continue to testify. And it's not over. Let's be clear, this trial is nowhere near being over. I think we're going to hear about even more acts of depravity than what we've already heard.

Speaker 1 (09:39):
Doctor, one of the freak-off victims testified that she repeatedly asked, can we please at least use a condom? And he forcefully said no, I don't want that.

Speaker 4 (09:52):
Yeah. And then to add to that, Touré, you know, the victims said that he was still willing to have sex with them with untreated, uncured STDs. I mean, come on, this is not just a freak-off. This is like putting someone's life at risk. There are certain STDs that are not curable, that are deadly,

(10:14):
and so.

Speaker 1 (10:15):
When you start to look at this also, and expected...

Speaker 4 (10:20):
And so again, right, completely dehumanizing these women. This is past sex. This is past being freaky. This is like dehumanizing. And you know what this brought to my mind? Now, we've been talking about this narcissistic behavior he has, right? But there's a difference between a narcissist and someone who is
(10:42):
actually psychotic. Two different things, right? But narcissists, believe it or not, they actually do care about what you think about them. Okay? But someone who is psychotic, they don't care. They don't have compassion, they don't need your validation, nor do they care about your thoughts or what they're doing whatsoever. This is giving me

(11:03):
more of a psychotic type of behavior, where it's like, listen, I could care less what you think. I could care less about your health, I could care less about what the end results are. I want what I want in the moment, and if that means your life is ended, that means you're at high risk, oh freaking well. That's just, I mean, it's just the epitome of weaponizing and demonizing these women

(11:26):
to the root and core. And we would have to take a whole two episodes of talking about the psychological impacts of something like this. This is just bananas. Dimitri.

Speaker 1 (11:34):
This is a complicated issue, because, you know, these women are alleging something very, very, very serious, and you know, nothing is proven yet. But like, what we're talking about here feels almost like systematized rape.

Speaker 6 (11:56):
Yeah, I think that's exactly what it is. And I don't know, you guys brought up menstrual cycles, and a lot of men may not understand. But I don't think this was necessarily menstrual blood that may have been on these sheets. I think if there's vomit, if there's urine, this is some other act we may not be in the realm of thinking of, because our minds may not be able

(12:17):
to go there. What I know for sure about human interaction is, once you go far enough, there's always something more you tend to chase after you've sat at that certain height for so long. It seems a lot crazier than we could think. And again, I'm with Doctor Bryant. This doesn't seem like narcissism. This seems like pure psychotic,

(12:38):
a pure psychotic act.

Speaker 2 (12:39):
Honestly, Yeah, I agree, Dimitri.

Speaker 5 (12:43):
This may not have just been all mistrul Blood could
have been blood from injuries, could have been blood from
any number of things. But let's keep in mind the
defense has been putting on a very strong case that
these women were strong, independent women that had agency, and
that they liked these sexual encounters, these free bloss nights,
and that they purposely went back for more. So we've

(13:04):
heard a lot from the defense trying to show that these women weren't coerced, they weren't forced, that they did it because they loved Diddy. They did it in one case because they wanted the financial benefit that comes from dating or being involved with a celebrity.

Speaker 3 (13:21):
They are trying to make that case. It is a tough uphill

Speaker 1 (13:25):
battle. You know as well as anyone, the federal government wins somewhere upwards of ninety-five to ninety-eight percent of their cases. And they are showing in a very detailed way the way these women were controlled: financially, their living situations, violence. Cassie's afraid her album's not going to come out, afraid that the tapes are going to
(13:46):
be sent to the media and ruin her life and
her career. Jane is her whole life wrapped up in
DDY in different ways, So he's controlling Devis. It's not
the defense would love for us to believe why didn't
they just leave? But it's much doctor b It's much
more complicated than you could have just been.

Speaker 3 (14:04):
You're an adult, you could have just walked out the door.
There's all these different ways that they were controlled.

Speaker 4 (14:07):
Yeah, and I love what you and Areva said, Touré, because I was looking and studying the case yesterday to prep for our shows today. And I'm listening and I'm listening, and I'm listening, and I'm like, Diddy's counsel is attempting to get the jury to believe that there was no sex trafficking. It was these women who were

(14:30):
in love with him. This is why they keep getting on the stand, Jane and, so far, Cassie, and you'll see the rest of them come on, that are going to say they were in a relationship with him. They were totally in love with him. Which means, if they were in a relationship with him, they were just doing what, having swinger parties, having freak-offs, just having these sexual parties that they liked, but he just took it

(14:51):
out of control. And they're trying to get the jury to say, wait a minute, so this man, you were in love with this man. This is a real relationship. This wasn't a transactional relationship. There was no transaction here. There was love involved. And how many men or women, especially women, do things for the man that they love? And I'm listening, and I'm going, Diddy's counsel is trying to get the jury to believe that these women were so
(15:14):
in love with him that this was a relationship. This wasn't Diddy taking advantage of these women. They were actually in love, and these women just allowed him to do these things, because love makes you do crazy things. Love and insanity are the same emotion, just experienced differently. And I thought, as much as people may not agree with it, to me, that's genius for his counsel to say.
If there's only one way out of this, this might
be the way to get folks to think these women
were in love. There was a relationship. There was no
sex trafficking, there was no pay to play per se.
I was just sitting up here with with relationships and things.

Speaker 3 (15:48):
Not too far well.

Speaker 1 (15:50):
I mean, yes, that is part of what the defense is trying to argue. Jane talked about a love contract, whereby she was paid ten thousand dollars a month for

Speaker 3 (16:06):
Multiple years including now.

Speaker 1 (16:09):
She said he's still paying her rent to this day. So she gets ten K a month, and she understood, they both understood, that her job, her responsibilities in return, were to be available for freak-offs whenever he wanted, right? And Cassie, the same thing, had that same sort of financial relationship. Cassie said on

(16:31):
the stand, I was basically a sex worker, right? I mean, like, so Areva, there's all these different ways that they were controlled, and we keep calling them former girlfriends, but I think they were really live-in sex workers.

Speaker 5 (16:44):
Well, thank God for experts like Doctor B and those that work in her profession, because as skilled as the defense lawyers have been at putting forth this theory that this was all about love and women willing to do anything to please a man that they love, we've had testimony from an expert psychologist that has testified about

(17:05):
what happens to women when you're in a relationship like this. Keep in mind, Cassie was nineteen years old, Diddy was thirty-seven when they started that relationship, and what that imbalance of power does, and how men like Diddy, powerful men, we've seen it before, Harvey Weinstein, Bill Cosby, he's not the first one on trial for something like this, how they manipulate, they brainwash, and they cause these women

(17:29):
to get so confused about what love is. And it's not love. They have been brainwashed and they are coerced, and the law has a very broad definition of coercion, and it does include withdrawing necessities like your rent, it does include using physical threats and using physical violence. So as skilled as the defense has been, I think ultimately

(17:52):
jurors are way smarter, and they're going to see through that narrative that this was love, and they're going to see it for what it was. It was coercion and it wasn't consent. It was control.

Speaker 3 (18:03):
Dimitri, what do you think? Does it sound like coercive control?

Speaker 1 (18:06):
Are you still, like a lot of people on the streets, like, why didn't they just leave?

Speaker 6 (18:11):
You know, it's a fine line between loving someone, and thinking you're in love with somebody, and loving what a person is doing for you. And the thing about it is, that's at a base level where it's very normal for people to get confused. So when you put it in the multitude of millions of dollars,

(18:31):
and a person funding your livelihood and your career and your reputation and your family and everything under you, them doing for you, regardless of what it is they got you doing, you could believe is love.

Speaker 2 (18:42):
But at the end of the day, it's still coercion.

Speaker 6 (18:47):
It's still him controlling the narrative.

Speaker 3 (18:52):
Yeah, for sure. This is an extraordinary story.

Speaker 1 (18:55):
I can't recall ever hearing about an entertainment executive controlling people in the way that Diddy is alleged to have controlled people, and having them basically as live-in sex workers.

Speaker 3 (19:08):
It's an insane story.

Speaker 1 (19:11):
Moving on, story number two: Pusha T pushes back. I love that we learned this last week, how much influence Drake has on the entire music business. When Pusha T revealed that his unreleased album has a song with Kendrick, and because Pusha is on Def Jam, and Def Jam's

(19:31):
parent label is UMG, the parent label of Drake's label, so they're ultimately on the same label, and Drake

Speaker 3 (19:37):
Is suing UMG because of the Kendrick beef.

Speaker 1 (19:40):
When UMG heard that Pusha T had a song with Kendrick on his album, they said, no, the optics of that, even if you guys aren't actually talking about Drake, even if there are those subliminals about Drake, the optics of us supporting that song, the two of you, with this lawsuit over it, it's too much. And of course Pusha said, no, I'm not changing the song and I'm not dropping the song.

(20:02):
So the label said, fine, we'll just drop you. And actually, though, they made Pusha pay a million and a half to get off the label, because they're like, we're just not gonna put out your album. And he's like, all right, we've got to give you a million and a half to be free of this. And now they're on Jay-Z's label. It's a whole different situation, but we see how Drake's situation

Speaker 3 (20:20):
had an impact on Pusha T's situation.

Speaker 1 (20:25):
Dimitri, I continue to think that the lawsuit that Drake has filed, and all the fallout around it, including something like this, is far worse for Drake than the battle.

Speaker 6 (20:35):
Losing the battle ever was. Yeah, I think he lost the battle, and then this pushed him into a realm of also losing the war.

Speaker 2 (20:45):
This was a very very.

Speaker 6 (20:47):
Weak move to the hip hop culture. Hip Hop used
to be a distrack. You got lyrical warfare. Now it's
about paperwork and pushed through at corporations and companies. And
the thing about it is it's taking us out of
the realm of just seeing you as an artist.

Speaker 2 (21:03):
Now. Honestly, it's it's it's.

Speaker 6 (21:06):
Pushing back towards the exact reason Kendrick.

Speaker 2 (21:10):
Kendrick mentioned before, you.

Speaker 6 (21:12):
only have this spot in music because of your Jewish descent, because of your... And that's how I look at it. That's the only reason I care. It's boring me. Like, it's taking me out of the mindset of loving the music for what the music is.

Speaker 2 (21:27):
This is more than us hip hop fans bargained for.

Speaker 1 (21:30):
But here's part of why I care about this: because as a real hip hop fan, I cared about Kendrick's message, which was ultimately, he's not real hip hop, right? So fuck him, get out of here.

Speaker 3 (21:42):
I'm with that. But now Drake's non-hip-hop response meant

Speaker 1 (21:46):
that I have to wait even longer, a year, year and a half, to hear an amazing album from one of hip hop's great groups, right, the Clipse. So Drake's adult temper tantrum continues to affect hip hop in general, right, like overall. Like I can't even get away from it, even if I'm not paying attention to Drake. But please talk from your experience as a lawyer about

(22:11):
watching Drake sue UMG, his label, for defamation of character, when it was Kendrick who dissed him.

Speaker 3 (22:22):
Well, first, let me be clear. I am a big Drake fan, obviously.

Speaker 5 (22:26):
I mean, oh no, I like Drake and I like Kendrick. And I think, taking off my legal hat...

Speaker 3 (22:37):
I'm taking my legal hat off for a minute.

Speaker 5 (22:40):
I don't care about any of this, because it just looks like all of these rap artists are doing what they are so brilliant at doing, skilled at doing: just making money doing diss tracks, keeping themselves in the limelight, keeping the attention on them, and selling more music and selling out more concert stadiums. So, Touré, are we getting

(23:00):
wrapped up in this and taking it far more seriously than what this really is about, which is these guys figuring out how to stay rich?

Speaker 1 (23:07):
So I think this is a really important moment for the culture, Drake and Kendrick battling, and what they mean as far as authenticity and a lack of authenticity, African Americanness, being loyal to the culture, being real and being not.

Speaker 3 (23:22):
I mean, my question is, why do you like Drake, 'Reva?

Speaker 5 (23:26):
I think you're taking him way too seriously. I think these are some skilled artists. They're both rappers, they're artists, but they're business people at their core, and they've figured out a way to keep their names in the press, to keep their stadiums full.

Speaker 1 (23:41):
I think Kendrick comes to this battle in good faith,
not from a product or marketing standpoint or like here's
how I'm gonna.

Speaker 3 (23:49):
Get my name out there.

Speaker 1 (23:51):
Like, I think he genuinely sees Drake as a bad figure within hip hop culture, as somebody who is not part of the norms, the traditions, and the ways of hip hop. And as an elder statesman of hip hop, and somebody who's like, I am the authentic, I am the embodiment of hip hop, I am calling you out because you do not belong here. So, I mean, the

(24:12):
very nature of what the culture is about and should be about, right, I think, Dimitri, really is part of what we're talking about here with Kendrick and Drake. It's not a cynical marketing move. This is about the purity of the culture that he cares about.

Speaker 3 (24:27):
Dimitri, I agree.

Speaker 6 (24:29):
I think the idea of it is exactly what he
said was exactly what he was supposed to do. Whatever
Kendrick said on those records was exactly what he's supposed
to do because it won him the war.

Speaker 2 (24:38):
You don't get into a fight and hold back your punches.
You throw them. But Drake's too worried about why y'all.

Speaker 6 (24:43):
Let him put this music out and not what he said,
not coming out with better music. Yeah, it's taking us
away from hip hop.

Speaker 3 (24:50):
Doc.

Speaker 4 (24:51):
I think they both belong in hip hop. Hip hop
has different representations. They both belong here. It's not about
who doesn't belong. And when you're talking about, like Ariva said,
than being businessman, Kendrick is he's getting endorsements, he's getting money,
he's getting deals, he's doing super Bowl and he's all
right with this back and forth. This Drake is the
one who's being a spoiled brat saying I have a
problem with it. Pull this out, don't do this, don't

(25:12):
do that, now I'm going to sue you. What happened to hip hop? I thought hip hop originated from people be-bopping, you know, somewhere in the park, on the street, and dissing each other, and that it was supposed to just be competitive: be-bop, diss, do your thing and continue this thing. And now they're taking it to being commercialized, and you got two folks who are doing what Areva said, you know, not be-bopping, but

(25:33):
they're doing it in studio, you know, literally in-studio records. But they're still dissing each other. So what I'm saying is, this is a business. I think Kendrick understands the business aspect of it. He's ready to go toe to toe. He's ready to keep doing these records and make his money. Drake is crying and being more of a little, I don't want to say that word, a brat, a crybaby. And it's like, bruh, you're part

(25:57):
of the culture. You're part of hip hop. What's the problem? Make another diss track. Make your money. Like Areva said, why are y'all not making money off this? Create generational wealth. Drake, you're a father. Do something with that. But don't sit up here and cry about who's on a track when you could be making money in the studio, using them damn crocodile tears. Get over it, Drake. Push through.

Speaker 1 (26:17):
And I know that beat boxing wasn't big in Cali culture,
but it's.

Speaker 3 (26:21):
A very important part of hip hop.

Speaker 1 (26:23):
I don't know what be-bopping is, Grandma, but like, whatever, you can have that with your doo-wop or whatever.

Speaker 3 (26:30):
Okay. Oh, my poor ears.

Speaker 1 (26:33):
All right, moving on to Politicking, our politics segment. The Trump administration mistakenly deported another immigrant, this one a Congolese immigrant. They sent him to South Sudan, because they do not know that Africa is split into many countries and they are not the same. Africa is not one big place. It is many, many, many countries.

(26:53):
Glad we got that clear. And then the Trump administration said to the Supreme Court, can we just deport people to wherever we want to send them? It doesn't matter where they are originally from. We just want to send them to whatever country we want to send them to. Which is kind of insane. Roll the clip about that.

Speaker 7 (27:10):
The takeaway from this bizarre diplomatic dispute appears to be: if the Trump administration decides that you're from South Sudan, you will be deported there, even if you're not from that country. Officials in Juba say the United States deported a Congolese citizen to South Sudan, and when they sent him back to the US, the State Department reacted by canceling all existing

(27:31):
visas for all South Sudanese people and blocking the issuance of any new visas.

Speaker 3 (27:37):
So what happened is South Sudan.

Speaker 7 (27:39):
did issue travel authorization for somebody named Nimri Ghran, but the US sent them a Congolese man named Makaula Kintu.
South Sudanese officials believe that the US made a mistake
and sent them the wrong man.

Speaker 3 (27:51):
Dimitri.

Speaker 1 (27:52):
This is really a sad story, and again, the inhumanity that the Trump administration practices toward immigrants in this country.

Speaker 3 (28:04):
But you think about, you know.

Speaker 1 (28:06):
If the Supreme Court says yes, you can do this, what does an African country do? We would love to see them say, no, America. But these are much smaller, much less powerful countries, for the most part, in Africa, compared to the United States. So politically, militarily, economically, it's hard for them to just be like, no.

Speaker 6 (28:27):
They turn deportation and passports and visas into a roulette wheel.

Speaker 2 (28:35):
And that's like a lack

Speaker 6 (28:36):
Of humanity for basic people, not just to mention the
black race, like black lives in general.

Speaker 2 (28:42):
It's poor care.

Speaker 6 (28:44):
But it's what America's always done in every other regard.

Speaker 2 (28:48):
This is what America's done. This is nothing new.

Speaker 6 (28:51):
This is just taken out in South taking place in
South Africa at this point, completely inhumane.

Speaker 2 (28:57):
I hate everything it stands for.

Speaker 5 (29:01):
Seventy-six million people voted for Donald Trump, and one of his biggest promises was to, quote unquote, you know, rid the country of immigrants that he called rapists and drug dealers and criminals. But what he really meant was to make sure that white people in this country, white supremacy, were, you know, uplifted, while he continued to

(29:26):
dehumanize people of color, including those immigrants that come from black and brown countries.

Speaker 3 (29:31):
No surprises here.

Speaker 5 (29:33):
Just over the weekend, Donald Trump ordered two thousand National Guard troops into downtown Los Angeles after trying to raid some, you know, workplaces in the Los Angeles area. He has been champing at the bit to do that, because he wanted to send a force, a military force, into a Democratic city like Los Angeles. So

(29:55):
I'd say, Touré, where are the people that voted for Donald Trump? If nothing angers you or makes you mad, hopefully looking at the images of people being kidnapped off the streets, college students being sent to other countries, can make you mad.

Speaker 3 (30:12):
I think some of them want that.

Speaker 1 (30:13):
But Doc, the African side of this, these African countries
may want to repel the US. They may want to
give us the middle finger and say, no, you can't
do this sending immigrants not from this country back here.

Speaker 3 (30:28):
But that's a complicated road for them.

Speaker 4 (30:31):
He said it loud and fucking clear: Make America Great Again. What do people not understand? His representation, his ideology of Make America Great Again, is modern-day slavery. We're living in it right now. What are you talking about? Grabbing people, taking them out of their country and putting them in any

(30:51):
country or any place that you want them to be. That is exactly what they did with us in the slavery days: boxed us up, put us on ships, and said you're gonna go where we drop you off at, and you just figure it out. When you get there, we'll tell you what you're gonna do. This is modern-day slavery. And then, to go further, he's taking advantage of a very, very vulnerable country, a very vulnerable continent. Was that not

(31:12):
what they did back in slavery? So all I'm saying is, this is modern-day slavery, where he is trying to turn his leadership into dictatorship. And I go back to what I said a while ago. He is definitely trying to create martial law, so that he can show that we are out of control and he needs to come in and dictate and take on this power of superiority over people who have lost their mind. We

(31:34):
were here in Los Angeles last night. The guards came in. They came into our courts. You have people protesting. They literally are throwing people in cages. This is not... listen, listen, for the folks who ain't seen it, who are just watching it on TV: I was at a Democratic dinner last night, left the dinner early. Look, we had to leave early to make sure we could get into

(31:56):
our homes because the National Guard was here to support
and back.

Speaker 3 (32:01):
When you have.

Speaker 4 (32:01):
people who are fighting in the street, who are getting thrown in cages, and then they're arresting folks, I mean, it is, Touré.

Speaker 1 (32:07):
It was literally but no, no iis instead of ice
is the best ballid propismself.

Speaker 4 (32:18):
It's called ices. You know why, because that's exactly the
ignorance of what these people are doing. I did that
on purpose. That's the ignorance behind this. They have kids
who are there throwing tear grap gas at they have
bit hold on This is I'm gonna land on this.
This is why it's called ices. They are arresting. They're
arresting citizens, putting citizens in the back of cages for protesting.

(32:40):
You're here for immigration. Why are you arresting US citizens? Because they look like immigrants. This is the racism I'm talking about. You can't say that immigration has a look. I mean, it's sick. It's modern-day slavery. It's sad, and it's happening to the majority of our brown people right here in LA. It's mind...

Speaker 1 (33:00):
But when you talk about the potential for martial law,
that is real. And when Trump talks about wanting to
federalize the National Guard to go against American citizens in
America on our streets, that is an extraordinarily frightening step
toward dictatorship.

Speaker 3 (33:21):
We're going to keep talking about that.

Speaker 1 (33:23):
I want to expand the conversation a little bit, away from that, into our main topic, because AI ain't really for us. Today, AI is not properly programmed to recognize African American facial structures, our nuances, our vocal intonation. Studies show that AI training data contains biases in skin

(33:47):
tones and in stereotypes, and this can have an impact even on policing.

Speaker 3 (33:51):
Look at this clip.

Speaker 8 (33:53):
AI systems are becoming increasingly integrated into our society and
most of the time they make our lives. But sometimes
these systems failed to live up to expectations, and when
they do, they're not failing everyone equally.

Speaker 1 (34:08):
It made me look paler and gave me blue eyes, so the general impression was that, you know, it had changed my race.

Speaker 8 (34:16):
We tested out Playground AI. It's a generative AI tool,
meaning it can generate new images for the user. I
asked the software to make me look more professional, and
this was the result.

Speaker 3 (34:33):
It clearly whitewashed these people. Dimitri.

Speaker 1 (34:36):
We also have a whole uproar going on online, because your boy Timbaland, the legendary producer, has released plans to put out an AI-generated recording artist, and a lot of people are very upset about that. How do you feel about an AI rapper?

Speaker 6 (34:57):
You know, I think it's absolutely ass. For me, as much as I love Timbaland, and I understand him trying to get with the changes and culture and identity of what we're doing, moving in this direction implies that one day AI is going to drive everything. If AI can speak... because let's speak about what hip

(35:19):
hop is, let's speak about what music is. Music has been a language for black culture, for hip hop and black culture, forever. If you're telling me you could come in and make a robot give me my culture

Speaker 2 (35:32):
Better than a man can.

Speaker 6 (35:34):
This is insanity. I hate the direction this is pushing toward. As much as he wants to be innovative, I think he does not understand how destructive this is.

Speaker 3 (35:46):
I agree with that.

Speaker 1 (35:47):
I want to I want to go back to what
you were saying about AI and the criminal justice system.

Speaker 5 (35:52):
Yeah, you know, I was supposed to make the system there.
It was supposed to address out of the question that
outcomes that we see particularly around black and brown people.
For example, using AI was supposed to give judges a
way to take to make risk assessments about whether somebody
was likely to reoffend if they were let out of

(36:13):
a jail sentence. And what we are finding now is that the programs have the same biases, if not worse biases, than what people have. And that's because the AI systems, the algorithms, are using the same biased data and information that's been generated by humans.

Speaker 4 (36:31):
So junk in and jug out.

Speaker 5 (36:33):
It's very, very scary. As much as I like criminal
justice reform, as much as I like technology, as much
as i'd like to see AI use, we're not there
yet to reread and right now, it's a dangerous tool
to be used by law enforcement because basically it predicts
that black people are more likely to offend. So that
means if you are trying to meet your case to

(36:54):
get out of jail, you're more likely to stay because
this algorithm predicts that black folks are more likely to reaffit,
and the data about that is absolutely positively wrong.
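To make the "junk in, junk out" point concrete, here is a minimal illustrative sketch in Python. The numbers and groups are entirely made up (this is not any real risk-assessment product): both groups reoffend at exactly the same rate, but one group is re-arrested more often, so a model trained on the re-arrest records scores that group as higher risk.

import random

random.seed(0)

def make_record(group):
    # True underlying behavior: both groups reoffend at the same 30% rate.
    reoffends = random.random() < 0.30
    # The historical label is *re-arrest*, and group "B" is policed more heavily,
    # so some non-reoffenders in group B still end up with a re-arrest record.
    if group == "B":
        rearrested = reoffends or random.random() < 0.25
    else:
        rearrested = reoffends and random.random() < 0.90
    return group, rearrested

history = [make_record(random.choice("AB")) for _ in range(100_000)]

def predicted_risk(group):
    # A naive "model": score each person with their group's historical re-arrest rate.
    labels = [rearrested for g, rearrested in history if g == group]
    return sum(labels) / len(labels)

for g in "AB":
    print(f"group {g}: predicted risk {predicted_risk(g):.2f} (true reoffense rate is 0.30 for both)")

Run it and group B comes out with a much higher "risk" score even though, by construction, the underlying behavior is identical; the bias lives entirely in the data the model was trained on.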

Speaker 3 (37:04):
Doc, what do you think about this AI and the
criminal justice system?

Speaker 4 (37:08):
Well, AI and the criminal justice system. I think that there are some pros and I think there are some cons. I think Areva made some really good points, that AI is found to have a lot of discriminatory software glitches. It is similar to the mental health field: it is programmed more to,
(37:32):
how can I say it, be more for a Caucasian type of audience, which means that we are, like most things that they create, on the back burner of who they're creating it for. So a lot of the tests that they've done to create AI are based on folks that don't look like us. And so when they're using it for us, or against us, when it comes to the law or the criminal justice system, it's

(37:53):
not going to work for us at all whatsoever. It's going to work against us, and it's gonna exacerbate and create more of a marginalization and a disparity within the black and brown community than we already have in the judicial system. And I think that that is going to contribute to the systemic racism that we are already experiencing now. And not cool. Dimitri.

Speaker 1 (38:14):
We see people making mistakes, like we see with the South Sudan story. How do we use AI to stop racial bias, and stop humans from making mistakes, and get us on the right track?

Speaker 6 (38:29):
Well, there's ways to do it, but I think it tends to lean against what AI stands for, because what I want people to understand is that an intelligence that erases who we are in existence, black people, black culture,

(38:49):
this isn't artificial. This is very intentional. Because the thing about it is, and if you look, that's all American history has ever done. You can pick up a textbook, the exact same thing is happening. This isn't new. This isn't new. This is just a different way of erasing us. This is a different way of making us seem invisible.

Speaker 2 (39:09):
That's all it is. It's not that we're invisible. We're right here, they see us.

Speaker 6 (39:13):
But the thing about it is, this is a system we weren't designed to be in, clearly.

Speaker 1 (39:18):
The dark part of this is that, the way AI is set up now, it quite often doesn't recognize our features, doesn't recognize that it's talking to or looking at a black person.

Speaker 4 (39:30):
Touré, you are so right, because my team has actually been, for the past literally six months, attempting to create a Doctor Bryant AI, just keeping it real, where folks can go on there and, when I'm not available in session, they can have an actual session with Doctor Bryant. But guess what? It's not identifying my tone as a black woman. It's not identifying my way of speaking, and it's not

(39:52):
identifying, again, me aesthetically and how I show up. So they have literally been taking this long to try to get this AI to identify with everything that's Doctor B, and not have it sound robotic, like some white woman or someone who's not me. And that has been our biggest, biggest... They have uploaded and downloaded every book I have,

(40:12):
every post I posted, because you have to upload all that to the software for it to become you. And when I tell you, it is still not coming out as the Doctor Bryant. It is very, very commercialized. It is very, very, very Caucasian. It is very, very not me. So if anybody went on there to try to have a session with me through our artificial intelligence, I'm gonna be like, this is not the feel or the culture

(40:34):
of Doctor Bryant. This is not it.

Speaker 3 (40:37):
Dimitri.

Speaker 1 (40:38):
Do we need to have more black STEM students, more black engineers, to help get AI to recognize us, our tones, our way of talking, our culture?

Speaker 6 (40:49):
Absolutely. We need more black engineers. We need more black techs, we need more black examples, we need more black samples, people to actually pull from. Because I think in this case, fuck it, we need BlackGPT.

Speaker 2 (41:00):
I'm gonna keep it on this with you.

Speaker 6 (41:01):
We need BlackGPT, we need, goddamn, like, let's just start making them up. That's exactly what we need, because we don't have representation at all.

Speaker 3 (41:11):
Uh, Areva.

Speaker 1 (41:13):
This is a really complicated issue, because if they're not recognizing us facially, but then they're using this within the criminal justice system, you could obviously see the potential for a massive problem.

Speaker 3 (41:27):
Let's look at this video real quick.

Speaker 9 (41:30):
So when we get to the interview room, the first
thing they had me do was read my rights to
myself and then sign off that I read and understand
my rights. A detective turns over a picture of a guy inside Shinola, and he's like, so that's not you.

Speaker 2 (41:49):
I look.

Speaker 9 (41:50):
I said, no, that's not me. He turns another paper over. He says, I guess that's not you either. I picked that paper up and held it next to my face. I said, this is not me. Like, I hope y'all don't think all black people look alike. And then he says, the computer says it's you.

Speaker 1 (42:08):
Look at that, and then we're going to have that
be part of this man's life.

Speaker 3 (42:12):
He's trying to fight these charges.

Speaker 1 (42:14):
Go ahead, Doc. You know, this issue is complicated. Like, this brother is trying to live his life, and the criminal justice system is acting like he's guilty when he's not, because it can't properly recognize black and brown faces.

Speaker 4 (42:28):
But this is the bigger umbrella. When are they going to start fucking making things that actually cater to anything about us? Again, the field of psychology was made for the Caucasian white man, so we are over- and under-diagnosed. You got AI that was not made for us, to serve us. So again, if you have everything
(42:49):
that's working for one group of people, that is called systemic racism, that is called marginalization, that is called disparity. You're not taking into account human rights; you're taking into account a superior race. That means that the inferior race has no chance to begin with, no chance. Listen, we already don't get a lot of love

(43:09):
or a lot of respect when we're in a courtroom to begin with. You think we need an AI that's gonna go against us? So we got humans that go against us, and now we've got some damn robots that go against us. You tell me, where are we gonna go from here? Seriously, where do we go from here, when we have nothing that's working for us but our own people? And when we try to do it, guess what? If we are an employee in the courtroom,

(43:31):
or we're an employee at the DMV, guess what? We also have our hands tied, because if I speak up on behalf of Dimitri or Touré, guess what happens? I may lose my job, my job that pays my bills, that keeps my lights on, that gives my kids health insurance. So we are tied at the hands almost everywhere we go.

Speaker 1 (43:48):
AI is increasingly being used in mainstream America, to broaden this whole issue, via employment practices, college applications,

Speaker 3 (43:55):
places where we get discriminated against a lot.

Speaker 1 (43:59):
So what does this mean in terms of hiring practices? Since DEI is being removed, it sort of suggests, Areva, that AI represents a future where America is more racist and black people have an even harder time getting ahead.

Speaker 3 (44:15):
Yeah, unfortunately it looks that way.

Speaker 5 (44:18):
With these HR programs that are being used, basically, you upload your resume, but the problem is that the resumes the algorithm is looking at are the resumes of the people that already work at a particular company. So if the company already has a predominantly white male workforce, the AI, the algorithm, looks and says, well, gee, this

(44:39):
company only likes white men. So if you're a black man or a black woman applying for a job at this company, the AI throws out your resume, because you don't look like the white men that already work at that company. So if we want to use AI, we got some steps to take. We've got to step back from the program and look at, what is it training on? What is it pulling from? How do we make

(45:02):
sure that it's not just looking at the white men that are at this company, but looking at the diversity we want to have at this company? Until we do that, we're going to continue to get the discrimination and the racist outcomes of these algorithms.
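As an illustration of the point being made here, a small hypothetical sketch in Python (the resumes, names, and scoring are invented for this example, not any vendor's actual product): a screener that ranks applicants by how much their resume overlaps with the resumes of people already on staff will simply reproduce whatever the current workforce looks like.

def tokens(text):
    return set(text.lower().split())

def similarity(a, b):
    # Jaccard overlap between the two resumes' word sets.
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb)

# Resumes of people already at the (hypothetical) company, all from one narrow background.
current_employees = [
    "ivy league finance club lacrosse golf investment banking analyst",
    "ivy league rowing finance club private equity golf analyst",
]

applicants = {
    "applicant 1": "ivy league golf finance club investment analyst",
    "applicant 2": "state school first generation honors cfa charterholder analyst",
}

def screen(resume):
    # Score an applicant by their best match against anyone already on staff.
    return max(similarity(resume, employee) for employee in current_employees)

for name, resume in applicants.items():
    print(name, round(screen(resume), 2))

# applicant 2 may be just as qualified, but scores much lower simply because
# their background doesn't look like the people the company already hired.

The sketch shows why "step back and look at what it's training on" matters: the bias isn't in any single line of code, it's in using the existing workforce as the yardstick.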

Speaker 3 (45:17):
Areva.

Speaker 1 (45:18):
We got a brother, about forty years old, Derek Mobley, who's suing Workday, because he says their AI-driven application system disadvantaged him and other older job seekers, penalizing older, grown, mature... stop looking at me like that, Dimitri... job seekers, right? We deserve to work too, not just
(45:38):
the young like Dimitri. But he submitted one hundred applications and he's getting rejected over and over, and he's suing, suggesting AI is really the problem. And this sort of seems like the law, the legal system, starting to come in to try to adjudicate what's right and wrong in this area.

Speaker 5 (45:56):
Yeah, I'm so happy about that lawsuit, because that guy says that he submitted one hundred applications or resumes, and in some cases he was turned down within an hour. So there is something inherently wrong with this HR software. It is not, again, taking into account that even though the company may be predominantly white and male, it is

(46:18):
discrimination to weed out candidates based on their race, based on their age, before you even look at their qualifications. And that's what's happening with this HR software. I hope this lawsuit is successful. I hope these AI companies have to go back to the drawing board and reconfigure the AI, taking into account discrimination laws.

Speaker 3 (46:38):
To be treated.

Speaker 1 (46:39):
I mean, I'm still stuck on, like, if the criminal justice system is using AI to determine who to pick up and who not, who to let go and who not, and this system is not really smart enough and sharp enough and nuanced enough to understand us,

Speaker 3 (46:58):
isn't that extremely dangerous for a person like you?

Speaker 6 (47:02):
Oh, absolutely. Because the thing about it is, a wrongful arrest is a constitutional violation, and that's exactly what they're allowing. If the police can't identify me, neither can your AI system.

Speaker 2 (47:16):
You gotta you gotta.

Speaker 6 (47:17):
believe me at my word in what I'm doing. So the thing about it is, criminal justice has already been scary for Black America.

Speaker 2 (47:24):
This is this is even worse that this is.

Speaker 6 (47:27):
This is this makes it even worse, even harder to
get because you're going to have so many false incriminations
like this is absolutely inserted you in the video. The
man said, are you sure that's not you? Like I
just told you this wasn't me, Like like what are
we saying? Like why are you asking me this question
two times? Like it's gonna change my answer? It's not me.

Speaker 2 (47:49):
But now you and AI are gonna tell me this is me? No, I don't like that, not one bit.

Speaker 1 (47:53):
I mean, Doctor, technology leaves us in a really dangerous spot as black people.

Speaker 4 (48:00):
This is the thing. This AI is a robot. You know, it creates what you feed it, right? There is no information in the software unless you feed it information. So let's go deeper than that. Let's talk about the developers. Who's developing the software and feeding it this racial injustice? Who is feeding it this racially

(48:23):
biased information that's telling us, the humans, that black folks are the ones who have a higher rate of criminalization, that we are the ones committing the most crimes? That is a systemically, racially fed robot. Robots only do what we create in them. So the developer of the software,

(48:46):
the folks who are feeding and creating the robot, are the ones who are to be held responsible. Period. That's just what it is. And then, secondary, you have the folks who are in the judicial system, who should just say, hold on, this is racially biased, it cannot stand in a court of law. We can't enforce law with a robot that was fed by a developer who fed it false allegations.

(49:10):
If we want to talk legal terms, this is false allegations. This is false, this is not true, not at all. So we got to push back on all of this. We got to get back to who's developing this software, because again, the Doctor Bryant AI is being developed by who? My team. So if it's racially biased, it's because my team vetted the racially biased information for it to show up in that way.

Speaker 5 (49:28):
But can I say this? We're all making some assumptions, that there are people out there that want the criminal justice system to be unbiased, that they want the criminal justice system to be fair, and that they want black and brown people to have a fair opportunity when they get involved with the criminal justice system. And I think that is a false assumption for us

(49:50):
to make. I think there are people that want to see black folks disproportionately arrested and incarcerated, and to serve longer sentences, and to have their lives wrecked by the criminal justice system. We know that by the recent election, where black folks were, and are, often pitted against other nationalities.

(50:10):
We're often painted as the most violent group of people. We're often painted as the most criminal group of people. And that's purposeful, that's intentional. So I think we got to check our assumptions about this fairness, because folks ain't trying to check for us, ain't trying to make

Speaker 3 (50:25):
this criminal justice system fair.

Speaker 5 (50:26):
And when you have prosecutors, and you've seen this, Touré, that are progressive prosecutors,

Speaker 4 (50:31):
What happens?

Speaker 3 (50:31):
They get voted out of office.

Speaker 5 (50:33):
Oftentimes they get recalled because the system doesn't want to
be fair for black folks.

Speaker 1 (50:39):
No, I think that's a really important point, that the expectation that the system itself wants to be fair to us is wrong. That we should assume the system will calibrate itself to be unfair to us. Now, that's just the system in general, and the way that white supremacy will always work and rebound to be unfair to us.

(51:02):
But AI, Doc... we've been having a lot of fun with AI, just as a community, as a culture. And like, you know, you see a lot of funny things making fun of us, and I don't know, it's a whole new art form, a whole new way to

Speaker 4 (51:20):
Yeah, you know, my social... I have social media posts that are AI now. All my AIs are black, with a lot of melanin. The men and the women in my AIs are all black, and I have a multiracial following of almost three million people. My AIs are black.

Speaker 1 (51:35):
But I'm sorry, so you said your AIs have a lot of melanin, so they don't look like you?

Speaker 4 (51:42):
They're brown, they're darker than me. Oh heck, I'll stand with my AIs.

Speaker 3 (51:46):
Don't look like you.

Speaker 4 (51:48):
Listen, they don't look like this camera on here, God, which got me looking like a Lite-Brite. Can we borrow some melanin? Because I am not this, right? Okay.

Speaker 1 (51:57):
The black spirit is there, but the blackness is here, one hundred percent.

Speaker 3 (52:06):
But it's not.

Speaker 4 (52:07):
See colorism, colorism.

Speaker 3 (52:13):
I'm here for you.

Speaker 5 (52:14):
This is colorism, exactly. This is what we're fighting, Cheyenne. This is what the culture needs to rid itself of. And look at you, whipping it up.

Speaker 3 (52:24):
Look, I think the sister is beautiful. I think you're beautiful.

Speaker 1 (52:27):
I think Dimitri is also here with us, so you know, everybody's beautiful.

Speaker 4 (52:32):
So my AI, my AIs can be black. My AIs are black, melanin, chocolate, all skin colors of our color spectrum, and I use them on my social media. People love it. They even DM or ask me, is that AI, Doc? And I say, yes, it is. But we're able to drive messages through it. So that's why I say there are some really good perks in it. But you

(52:53):
know, when it comes to people using celebrities and putting a different message to them, like they have 50 Cent as an AI, and they have his mouth speaking a whole different message that he doesn't speak. So in my mind, I thought, to me, that's disinformation. I mean, this brother didn't say that, these words aren't his. And that's when it becomes dangerous. And then you look at privacy, and it's like, wait a minute, so how private... how much

(53:16):
privacy do I have?

Speaker 5 (53:18):
Is created?

Speaker 1 (53:20):
I definitely am at a place where, if I see a clip online and somebody says something that seems like, wow, out of pocket, I immediately am like, I don't believe it, it's probably AI. I need to see it, you know what I mean, in four other places. Before that, I'm assuming it's AI if I see something out

Speaker 2 (53:36):
of the norm.

Speaker 6 (53:37):
Absolutely. Nowadays, in this day and age, you have to question everything you see on social media, from pictures to, honestly... I hate to put a personal story out there, but if anybody was to go look on TikTok and search my name, there's eighty Dimitri Wiley accounts, like eighty, literally. And someone sent me an Instagram

(53:58):
video where they're using my voice as AI to scam women out of money, and I'm talking thousands of dollars, thousands of dollars.

Speaker 2 (54:09):
When you play.

Speaker 6 (54:10):
It, it sounds like there's no inflections. It doesn't go up, it doesn't go down, it's very monotone. The thing about it is, as a man who is on social media and cares about what he pushes forward, this is terribly disturbing. Not only is it fraud, it's catfishing,

(54:30):
it's manipulative, and the people behind these acts, they've spoken egregiously to people, I'm talking cussing their kids out, cussing them out, their whole life, and I hate everything about it.

Speaker 3 (54:46):
I think that.

Speaker 1 (54:49):
Too much, and that he's actually doing this stuff and coming on here claiming that somebody else did it.

Speaker 3 (54:54):
How do I know that he ain't doing it?

Speaker 6 (54:56):
And Torrey, right, I'm talking to you, I ain't got that much time to be doing all of that.

Speaker 3 (55:02):
How easy is it to steal someone's ID?

Speaker 6 (55:06):
I mean, the thing about it is, all they have to do is take a video or a podcast or anything where my voice is, because it's a very distinctive voice. I don't know about you, Torrey. Some people would consider this a little sexy. Some people say it's giving, like, whispers in the dark. You know what I'm saying? So don't be a hater.

Speaker 2 (55:22):
That's one.

Speaker 3 (55:22):
Hey, you have a fantastic voice.

Speaker 2 (55:24):
Thank you, thank you. It's about time I got some acknowledgment of that.

Speaker 1 (55:28):
The hairline we're going to talk about, but the voice
is fantastic.

Speaker 4 (55:31):
His voice is so good that Waze stole his voice. You know he's on Waze? Yes, his voice is AI, but he has a contract with Waze. They use Dimitri as the turn left here and turn

Speaker 5 (55:48):
Around, Torrey.

Speaker 2 (55:49):
How does that sound?

Speaker 1 (55:52):
I can't wait to never use that. But I am so happy that that exists in the world.

Speaker 4 (55:57):
Such a hater.

Speaker 6 (55:58):
But all of that was done through AI, and it has its perks and it has its detriments too, with what's going on online. And I just want to publicly make this known, if I can, here on Truth Talks Live: I have one social media account on every platform.

Speaker 3 (56:13):
I am not TheDimitri

Speaker 6 (56:16):
Wiley, not Dimitri Wiley 02, not the DimitriWiley burner account. I am terribly sorry for the people who've been getting scammed and finessed.

Speaker 2 (56:24):
But I promise you, it's AI. It's not me. In the

Speaker 6 (56:27):
Year of twenty twenty five, if you ain't FaceTiming that man, that ain't that man.

Speaker 4 (56:31):
I have a truth, I have a real story. My niece and I went to go see Coco Jones perform, and my niece, my niece is twenty, she was gonna be twenty-one in two months, so y'all don't come for me, because I clap back. And we were trying to get into the Coco Jones concert and she wasn't twenty-one yet. And literally my little sister, who is a

(56:53):
little older but younger than me, was sitting there. I had no idea my niece was doing this. She's on her phone on an app, honey, and I go, she goes, okay, we're good. She literally used AI to swap out her photo for my sister's photo to get in. Now, mind you, we did not get in. That was God,

(57:15):
God was having our back. God was like, no, no, we're not gonna do this. But what I'm trying to say is, that's how quick, that's how easy AI is used to swap identifications.

Speaker 3 (57:24):
That is crazy.

Speaker 1 (57:26):
That's a fantastic story about Doctor Bryant helping to ruin the youth of America. Thank you so much for that. I'm so glad that it didn't work out that night, but there's still tonight. That is our show tonight. Like, comment, and subscribe on our YouTube channel at Truth Talks dash Live. Continue to watch us simulcasting right here on the Blackstar Network right after Roland Martin Unfiltered. Support Black

(57:50):
media at Cash App Truth Talks Live, and support us by advertising on the show, as our friends at Hair Maniacs did tonight.

Speaker 3 (57:58):
Thank you so much to them.

Speaker 1 (57:59):
Truth Talks has an over-ten-million-person viewing audience. If you advertise on the show, they will fall in love with you. Small businesses and influencers, please partner with us by visiting truthtalkslive dot com, become a VIP member of the Truth Lounge, and click on

Speaker 3 (58:16):
Any co-host, even mine.

Speaker 1 (58:18):
Black media is under attack, so support us by telling a friend to tell a friend to check us out, every night at eight pm EST and eight pm PST. Those are two different times, Doctor Bryant. Thank you, Reva, for being here all week. We love you so much. We appreciate you, especially when you disagree with Doctor Bryant. Next week we will have Judge Lauren Lake here with us.

(58:41):
We are here for you every weeknight at eight pm. We will see you.

Speaker 4 (58:46):
Wait. Can I please let people know that I actually have my Hair Maniacs in right now, honey, and I will be doing a review on how much I love it. I don't have to wait to do a review next week on how much I love it, because can you guys tell how nice it is? Anyways, I have long, beautiful, luxurious hair. But I did put in my hair, my

(59:07):
Hair Maniacs, leave me alone, my Hair Maniacs extensions. And let me tell you one thing for sure: it matches the heck out of my hair. Two things for sure: I can go wash this thing and get it right back the way it is. Three things for sure: I worked out in it and just did

Speaker 3 (59:21):
My wand curl, honey.

Speaker 4 (59:22):
Hold on, hold on, y'all, hold on, hold on. I'm not gonna play with you, I'm not gonna hold you. Anyways, I'm back. Okay, Torrey, you can compliment me. Go ahead, I'm waiting.

Speaker 3 (59:29):
It looks great, it looks fantastic, it really does.

Speaker 4 (59:31):
Thank you, thank you. But I do want to say this, y'all. I really, I've always been into natural hair, okay, natural hair. But I want to say this: this is literally my first or second time that I have put in extensions, and this is the first time, and I'm not playing, you can look for yourself, this is the first time where the hair has matched my hair absolutely to a T. Let

(59:54):
me show you one more time. Y'all, I'm not playing with you, because I want you to really get into the texture, the color. But look, hold on, look at this.

Speaker 5 (01:00:01):
This is the hair.

Speaker 4 (01:00:02):
I'm not playing. This is my hair, and the hair, the color and the texture, it's all blending in. You cannot make this up. I can't make this up. Okay, look, look how pretty this is. I didn't have to dye it, I didn't have to do anything. Hair Maniacs, this is the point. Let's see if we can, literally, hold on. I'm not playing.

Speaker 2 (01:00:19):
Hold on.

Speaker 4 (01:00:20):
Okay, I'm going to try to show you, see if you can see a track, and good luck, but there's nothing in there. Do you see this? So this is my hair that I'm pulling, okay, and this is the Hair Maniacs, this is the Hair Maniacs. Can you tell the difference?

Speaker 5 (01:00:32):
No, my point.

Speaker 4 (01:00:34):
And guess what, when I go on dates, because I am seeing people, neither can they. Thank you. Neither can they. You don't mind the grays? Where you seeing gray at, honey? I'm kidding, because I'm used to grays in my hair. And you don't have any grays?

Speaker 3 (01:00:49):
I do.

Speaker 2 (01:00:52):
Hair.

Speaker 4 (01:00:52):
Maniacs has gray extensions as well. So for the grays out there, you can put it in. And last but not least, literally, you can dye it. So I have some more of their hair that my hairstylist has, but they have dyed it blonde, so y'all may see me back on Truth Talks with some blonde streaks. Don't you tell nobody that that is my hair extensions, okay? That is my hair.

(01:01:13):
It's gonna blend in and it's gonna be absolutely gorgeous. So, Dimitri, leave me alone, because

Speaker 2 (01:01:18):
You feel me, Dimitri.

Speaker 4 (01:01:20):
You saw me at my, at our show in Atlanta? Was the hair not popping? I had it big, beautiful, and curly, and when I tell you, I got so many compliments on it. Full, beautiful, curly hair, and the hair is soft, and I can't, I'm not playing, y'all, because I'm not an extension

Speaker 2 (01:01:38):
Girl.

Speaker 4 (01:01:39):
I am now sold. I will be wearing my extensions and everyone's gonna compliment it, and you're gonna leave me alone about my, about my hair. Now, let me go a little further. I even wore this to the BET Awards, the whole experience, everything. And when I tell you, even on the red carpet, people were like, Doc, your hair is so beautiful. Not, did you get extensions?

(01:01:59):
Not, like, literally. And, you know, I get compliments, but for hair to be the main compliment, that's a big deal, because a lot of people don't have great hair. I don't know, Reva, if you can, you know, confirm that it is hard to find hair texture that matches your hair. Oh, I've just been sitting back, like, you

Speaker 5 (01:02:16):
Know, quietly going go girl with these hair extensions because
so many people have like trauma around hair extensions, because
people have made us feel bad about wearing hair extensions.
So thank you for breaking down that stigma in saying, ladies,
if you want to wear hair extensions that are long
that a curl and you or a bond that or whatever,
wear your hair extensions.

Speaker 2 (01:02:35):
Do you? So?

Speaker 5 (01:02:37):
I do. Gorgeous. I couldn't

Speaker 3 (01:02:39):
Wait to see the blonde.

Speaker 5 (01:02:40):
I can't wait to see you in that summer, beautiful blonde hair.

Speaker 4 (01:02:44):
And one more thing, I know I'm going on about my hair, just one more thing. This hair is like a twenty-eight, thirty-two, it's long. Hold on, I have gotten their shorter hair, put a bob in, this hair braided up, my whole hair, and that bob was bobbing. So I want, I just want the girly girls to

Speaker 6 (01:03:01):
It was.

Speaker 4 (01:03:01):
It was be-bopping and bobbing. But I want the girls to know it's not just about length, because I have long now, but I have used their hair for a full head, what they call a full-head quick weave, with me doing a short bob, which I'm gonna post pictures of on my social media so you can see the bob and you can see the long, both extensions. Hands down, I cannot make this up. Bomb, straight up amazing, and bomb,

(01:03:23):
the textures, bomb. Y'all, listen, if you guys go and order, tell them that the Doc sent you. Tell them that the Doc sent you. You want to talk about your hair now? Huh? Listen.

Speaker 3 (01:03:36):
You don't have enough time. I don't have enough time for this hair, for hair extensions.

Speaker 4 (01:03:41):
Oh, he needs gray hair extensions. They got that too, all right? Tell them the Doc sent you.

Speaker 2 (01:03:45):
I'm gonna get him some.

Speaker 3 (01:03:47):
Oh y'all