Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
[Theme music]
(01:55):
good morning, good morning, and welcome into WDIA, The Bev Johnson Show. It is indeed a pleasure to have you with us once again on this Thursday, May ninth, twenty twenty four. Enjoy this fabulous day today. Let's get ready to talk on this day. Oh lord,
(02:17):
why, let's get talking this morning. We will be talking with our brother, our psychologist, mental health specialist, race man, Doctor Warren Harper, who will be back in the house to talk to us. When it's your turn to talk, you know you can: eight three three, five three five, nine three four two; eight
(02:38):
hundred, five zero three, nine three four two; or nine zero one, five three five, nine three four two, and that will get you in. Yeah, you can get in. You can get in. And if this day, this
(02:59):
day, Thursday, May ninth, twenty twenty four, is your birthday, like my brother Floyd Hunter. What's up, Floyd? Uncle Floyd, as we know him, Uncle Floyd. Happy birthday, Uncle Floyd. From your wife,
(03:21):
your lovely wife Ernestine, and your brother Abe, and your nieces and your nephews, and all your family who love you, brother. Happy birthday, you know what we say. And also to my little sister. What's up, Karen Ollie? Happy birthday to Karen Ollie this day. What's up, Karen? Happy birthday,
(03:46):
sister. I hope you have a fabulous day today. And all of you all out there who may be celebrating a birthday on this day, you know what we say: go out and celebrate your life. Yeah, you better, you better. When we come back, we'll talk to our psychologist, mental
(04:11):
health specialist, doctor Warren Harper, and me, Bev Johnson, on The Bev Johnson Show, only on WDIA. Good morning, and welcome back to WDIA,
(05:06):
The Bev Johnson Show. It is indeed a pleasure to be with you once again on this Thursday, May ninth, twenty twenty four. Enjoy this fabulous day today. We will get into session in just a few with our brother, our race man, psychologist, mental health specialist. Doctor Warren Harper is
(05:27):
in the house. We'll be talking with him on our topic of conversation. And let me tell you, before I start: Unmasking Artificial Intelligence, My Mission to Protect What Is Human in a World of Machines, by doctor Joy Buolamwini. Doctor Harper will get that name out for me. Well, doctor Harper,
(05:50):
before we start, let me say happy belated birthday to you, brother. We made it another year. We sure did. And happy birthday to you for tomorrow. Yeah, okay, thank you, doctor Harper, thank you, thank you. Let me tell y'all, I had a vacation, I guess y'all know. Yeah, right, and a wonderful vacation. Needed it. Went
(06:14):
to Aruba. Bev Johnson's Grown Folks again. Yeah, Bev Johnson's Grown Folks. They were in the house, and I'm telling you, there were some grown folks in the house. Let me say thank you, thank you, thank you to all of my Bev Johnson's Grown Folks travelers. Y'all showed up, y'all showed out, Doctor Harper. Them grown folks cut up,
(06:36):
Doctor Harper, and enjoyed yourselves. Hey, yeah, hey, we had a good time. Thank you, grown folks. But let me say a big, big thank you to my travel agents. They work hard. Regina Johnson. Sister, you a bad badass. I'm just gonna say it. You're a badass. And Verdale, you a badass, brother. Those are
(07:00):
my travel agents. If you want to book a trip, for a group or by yourself, you need Hodgepodge Travel. They are the bomb. Thank you, Regina. Thank you, Verdale. Verdale is also now my travel agent, my other bodyguard for when my bodyguard can't go, the man. Thank you, thank you. We had a great time. Regina knows how to put on
(07:25):
the ritz. And those grown folks, they did, Doc Harper, grown folks who know how to act and who know how to have a good time. All right, they were good, they were good. But thank you, grown folks, for traveling with me. I keep saying one last time, and like I'm saying, Doc, they ask, Bev, where you going next year? Where are you going next year? I haven't decided, but we'll see if we'll have
(07:48):
a grown folks travel again. But also, Doctor Harper, in Aruba, Memphis folks know how to go. Memphis was in the house. Let me say this. I told them I was gonna say this. The cougar was hanging with the cougars. Who are the cougars? It was the nineteen eighty nine
(08:09):
class of North Side High School, Cougars. Doctor Harper, the North Side Cougars of the class of nineteen eighty nine. They cut up. Because when we were getting on the plane, I saw they had the little T-shirts. They had cougars on them. I'm going like, who are these folks? When they got there: we're from, we're from North Side. What? So we all had a good time,
(08:30):
partied. But that's how you do a reunion, Doctor Harper, exactly what I mean. That's how you do a class reunion. Now you do something different. Travel. And they told me, they said, we decided we were gonna go, and we had, we chose three places and we voted. And the young lady told me, she said, I can't remember her name, she said, and Aruba won out. But the nineteen eighty nine,
(08:54):
I was gonna give y'all a shout. I'm Cougar, y'all some bad... That's, that's Stan Beale's, high schools, North Side, but Memphis, and then with some other folks there from Memphis. That's how you live your life, Doctor Harper. Life is too short. That's what mental health is all about. And thank you, Doctor Harper. You know, when somebody said that, said, you know, you're the physical, but this is good mental health. We were away from the
(09:16):
stress, from the crime. They don't have crime in Aruba, Doc Harper. They don't have no crime. You ain't doing no crime in the Aruba area. People enjoyed themselves. So, Doctor Harper, we were at the Riu Antillas. It's right on the beach. You go to the beach. We hung out at the pool. Oh, and let me have to say this, because I'm celebrating my birthday: I had an all-white soirée birthday dinner.
(09:41):
They laid it out for us. Then, Doctor Harper, we painted the town red. Wait a minute, we got on the party bus. We had a party bus, and all of the folks, we wore red, and on the party bus, oh, we jammed, and the party bus would stop at different little clubs, the little hole-in-the-walls. Ah, so we painted the town red. We had a cocktail party, and we all had on the Bev
(10:05):
Johnson Grown Folks shirts. But it was, we just had a good time. That's all right. It was a good time. So thank you, Regina Johnson and Verdale, my travel agents. Thank you to all the grown folks who decided, because you didn't have to do it, decided to go with me one more time. But I had to tell you, doctor, thank you for letting
(10:28):
me do that. But because people knew where we were vacationing, and what was I thinking? I should have been off today in the morning. But that's okay. But you know, you and I, we work, don't we? We work on our birthdays. We work on our birthdays. He had no problem. When, when you're serving the people, there is, as they say, no such thing as a day off. You got that right, brother. No day off. But we, it was the bomb. Y'all better start traveling, better start
(10:52):
enjoying your life, sitting there at the same old place. Hey, all right, Doctor Harper, this topic, wow: Unmasking Artificial Intelligence, My Mission to Protect What Is Human in a World of Machines. By doctor, you pronounce her last name, Joy Buolamwini. Buolamwini. Okay, all right, go on, doctor
(11:16):
Harper. Okay, Bev. It is my pleasure to introduce to the WDIA worldwide audience a compassionate, dedicated, brilliant, fully credentialed protector of humankind all over the globe. Her name is doctor Joy Buolamwini, and the title of her book is Unmasking Artificial Intelligence, quote, My Mission to Protect What Is Human in a World of Machines, unquote. Doctor Buolamwini was born
(11:41):
in Canada, and her parents are Ghanaian. Doctor Buolamwini's mother is an artist, and her father is a scientist. She currently lives in Cambridge, Mass. However, she grew up in Oxford, Mississippi, and Memphis, Tennessee. What? She went to Cordova High School. Get out of here. Hello, hello, somebody. Memphis. Doctor Joy Buolamwini is a Canadian-American computer scientist and digital activist.
(12:07):
A digital activist formerly based at the Massachusetts Institute of Technology Media Lab, she is founder of the Algorithmic Justice League, which is an organization that works to challenge bias in decision-making software and to highlight the social implications and harms of artificial intelligence. Doctor Buolamwini received her bachelor's degree from Georgia Institute of
(12:28):
Technology and a master's from Oxford University. She received a second master's degree from the Massachusetts Institute of Technology and her PhD from MIT as well. While at MIT, she was a resident tutor at Harvard University. She has traveled to Zambia as a Fulbright Fellow to teach youth how to give instructions to computers, which is called coding, and to increase the number of women and persons of
(12:52):
color in computer science. She is also a Rhodes Scholar, an Astronaut Scholar, and an Anita Borg Institute Scholar. The sister's heavy. Yeah. To start off, let me define artificial intelligence, which is the theory and development of computer systems able to perform tasks that normally require human intelligence, such as seeing, recognizing speech, decision making, and translation between languages. Artificial intelligence is
(13:18):
about machines that can analyze large amounts of information, recognize patterns, solve problems, and make decisions based on that information. So, in summary, artificial intelligence is the science of making machines that can think like human beings and can do things that are considered smart. And the goal of artificial intelligence is for the machine to be able to make decisions and judgments just like humans do.
(13:43):
So Doctor Buolamwini, as a computer scientist, works on making machines think like humans do, and she does this by embedding instructions and information, called data, into the machine, and the machine processes the data and is able to think like a human being thinks. Which means the machine will be able, after receiving the information and instructions put into it by a person who is called a coder,
(14:05):
The machine will be able to speak, think, analyze information, and make
decisions like people do now. Thepeople who embed or put the information and
instructions into the machine again are calledcoders, and again they input the data,
which is large amounts of information andinstructions into the machine which the machine
will use to analyze the information andmake decisions. Doctor Boulawhani's work with what
(14:31):
is called facial recognition, which is the process by which people's faces captured in video footage or photographs are compared to a database of known individuals, like persons who have committed a crime, to find a likely match and identify an unknown person in the footage or the photograph. In addition, facial recognition technology is the kind that scans,
(14:56):
identifies, and profiles people in large numbers, and it is becoming more common in both the private and public sectors of this country. Grocery stores use it to track customer shopping habits. Many people use it to unlock their cell phones. I use facial and thumb recognition, fingerprint recognition, when I go to the bank to put things in my safe deposit box, and law enforcement uses it to match the
(15:18):
faces of some unsuspecting person, too often black people, with the mugshot of a suspected criminal. Now, in order for facial recognition to work effectively, it requires that one must have a good quality image of an unknown person. For example, say I might be jogging through Shelby Park, and unbeknownst to me, there is a facial recognition machine that has been trained on a
(15:41):
wide variety of human faces, including mine. And so the machine, without getting my approval, is going to use my photo and compare it to pictures in a database of bad men who have done something illegal, inappropriate, or criminal in the park. And if there is a match, then law enforcement will be contacted, and I will be detained without my consent and without knowing why I was accosted by them. The problem is that if the machine works at
(16:03):
its best, which it has not been proven to do, then I am the person in the mugshot. But the facts are that facial recognition misidentifies people who have dark skin, people of color in general, and women. So if the machine is incorrect, because the picture of me taken by the surveillance camera in Shelby Park is not clear enough, not good quality, as research has proven,
(16:27):
then the machine will make a false match. But that is after I have been stopped by the police, probably manhandled, not told as to why I'm being accosted, told I have to comply with whatever questions they are interrogating me about, searched, possibly forced to give my fingerprints, and having me, against my will, sit in the squad car until the machine figures out that I
(16:49):
am not the crook that it identified me to be, or who they thought I was. This can be very alarming and can easily lead to an altercation between me, a peaceful man, and the police who have accosted me.
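The matching process the doctor walks through, comparing a captured face against a database and flagging the closest entry within some tolerance, can be sketched in a few lines of Python. The vectors, names, and threshold below are hypothetical (real systems use learned embeddings with many dimensions), but the failure mode is the same: an innocent face that happens to fall within the match threshold gets flagged.

```python
import math

def distance(a, b):
    """Euclidean distance between two face vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(capture, database, threshold=2.0):
    """Return the name of the closest database face within threshold, else None."""
    name, best = None, threshold
    for who, face in database.items():
        d = distance(capture, face)
        if d < best:
            name, best = who, d
    return name

# Hypothetical mugshot database: each face reduced to a 2-number vector.
mugshots = {"suspect_a": (1.0, 4.0), "suspect_b": (6.0, 2.0)}

# My face is NOT in the database, but its vector lands near suspect_b's.
my_face = (5.0, 1.0)

print(identify(my_face, mugshots))                 # "suspect_b" -- a false match
print(identify(my_face, mugshots, threshold=1.0))  # None -- a stricter threshold refuses to guess
```

A blurry or poorly lit capture shifts the vector further, making this kind of false match more likely, which is precisely the risk the doctor describes for dark-skinned faces that the system was not trained well on.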
Now imagine this is you, in our listening audience, because it could be you, the same as I watched a black teenager being physically overtaken by his arm
(17:11):
and forcibly led away from his friends while they were coming home from school. The teenager was interrogated on a busy street in London, in full view of hundreds of passing people, searched, fingerprinted, without being informed as to why he was being accosted, and yes, he was black. And then, evidently, the surveillance camera could not positively identify the accosted teen with the mugshot of the criminal that they were looking for. And then he was told he could leave
(17:36):
now. Again, what if this happens to you, Common Man; to you, Meredith; to you, David; to you, Prince Charles; to you, Brother Molee; to you, Nurse Beverly; to you, Bev; to you, Lady D; to you, Lady P; to you, Big Hebrew; to you, Marcus; and to you, King Harry O. What would you do? Because, as
(17:57):
they say, to resist is futile. There are benefits and risks for government or law enforcement entities to use facial recognition. A benefit is that facial recognition systems create quick investigative leads towards identifying suspects with fewer policing resources. However, facial recognition systems also reflect racial, gender, and age bias in
(18:22):
the data or information which it has been fed by the coder, who puts the data or information and instructions into the machine in the first place. And if that coder is biased regarding black people, Hispanic people, LGBTQ plus people, older people, or disabled people, then misidentifying people from information generated by a facial recognition system can have real-life negative consequences, like the risk of
(18:47):
physical confrontation or violent escalation between a black person and the police. And since blacks have a disproportionate number of encounters with police, then they will likely be accosted and interrogated more often than whites, due to the fact that we blacks are overrepresented in mugshot databases, which means that facial recognition technology is more
(19:08):
likely to identify a person as a suspect in the US if the person is black. Doctor Buolamwini has done astonishing research focused on what she calls the coded gaze. Coded gaze. The coded gaze refers to the biased practice of embedding or putting information and instructions into machines that are totally unfavorable or biased against black people,
(19:32):
people of color, and others that I've already mentioned. In Doctor Buolamwini's new book, entitled Unmasking Artificial Intelligence, My Mission to Protect What Is Human in a World of Machines, she describes how the information or instructions or codes that are put into the machine come from people who knowingly or unknowingly put or embed information which can be either biased or prejudiced or unfavorable views of people, and in
(19:55):
particular blacks and other people of color. She says, quote, everyone has unconscious biases, and people embed their own biases into technology, unquote. These coders, therefore, have the power to write instructions and put large amounts of information into artificial intelligence machines based on what they know or what is out there at large in
(20:15):
the world population, out there in terms of knowledge: these large amounts of information about all sorts of things, which in this case would be instructions and information regarding ethnic groups, or, to be frank, instructions and information about blacks, whites, Asians, Latinos, people of color in general, also LGBTQ people, people with different levels of ability, criminals, world leaders, terrorists. However,
(20:40):
these coders, or information-and-instruction input gurus, can also unknowingly leave out or ignore finding out about what they don't know or what they don't think is important for the machine to know and to reason with. So if the coder is white, biased, prejudiced, discriminatory, or racist in nature, and does not value or does not know of the heroes and heroines of
(21:03):
other ethnic groups, and does not program or put into the machine computer that black people are beautiful, intelligent, cultured; enslaved in America and throughout the world for over four hundred years, but also having been historically the first humans God made on Earth and the builders of the pyramids; so if they leave out the enormous amounts of information, called data, on African peoples and people of African descent,
(21:26):
then the machine will only use the data or information that has been programmed into it, resulting in mischaracterizations of black people and people of color. So one can expect that the information that is included or embedded into the artificial intelligence machine will only produce views, characterizations, decisions, and recommendations that can be blatantly biased, discriminatory, or racist. The point is that if you have largely
(21:51):
unfair, distorted, and biased information that is being put into the machine, and the machine uses it to categorize people, then the result that is produced will be characterizations of people that are distorted, unfair, and biased. For instance, let's say there is a white male who is a coder, and his job is to embed or place large amounts of information into a machine about various
(22:14):
ethnic groups of people. The machine is going to analyze the information and make a decision regarding, quote, whether you are a good candidate to be granted a bank loan for ten thousand dollars, and you're a black person. Now, due to the bias that the coder has about black women, he has programmed the machine to see people like you as less than honest or unwilling to pay back the borrowed money, so you will automatically be denied the loan
(22:37):
because the machine said so, based on people who look like you and other biased and distorted information that the machine had been programmed to use. The same coder, a white male, also embeds or places large amounts of information into the machine about European or white people, Spanish-speaking people, Asian people, or Arab people. This information regarding all these ethnic groups is coming from
(23:02):
a large number of sources, including driving history, credit card use, credit scores, banking history, job history, salary history, marital relationship history, police involvement or criminal history, the neighborhood you live in, the education history that you have, the same with your children, their spending habits, your political
(23:22):
affiliation, athletic ability, possible sexual interests and tastes, and their findings, good and bad, to name a few of the things that are easy for them to get a hold of. The machine takes all this information, analyzes it, and comes up with its characterization of you, and decides that you are not a good candidate for paying back the loan. Now,
(23:45):
when all the massive amounts of information are put into the artificial intelligence machine by a white nationalist, or a white person who enjoys his or her privileged status, and this information is compared with that of other males or females, what do you think will be the characterization of blacks in comparison to whites? I'll tell you: we will probably be classified as a spiritual people,
(24:07):
for the most part hard working, who were victims of historical but not present racial oppression. And then the characterizations will take a turn towards stereotypes, such as not being inclined to work, particularly young adult blacks; being overly aggressive, loud, oversexed; underdeveloped thinking and reasoning abilities; prone to engage in violent crimes; prone to be unfaithful in marriage; a high risk for not
(24:30):
paying back a bank loan; a high risk for not measuring up to being a CEO of a major company; and a high risk for failure in any capacity. These characterizations will be the case because the person who embeds information and instructions into the machine has literally set the machine up, by the information that was put into it, to mischaracterize blacks. And this person may not even be aware of it,
(24:53):
or they might be. So when that machine is asked a question about black people, it can only provide information that is negative, since that is the extent of the information and instructions that were placed into it, therefore describing and judging blacks and persons of color as less intelligent, less attractive, less employable, less trustworthy, less ethical, less business-minded, less professional,
(25:18):
and less qualified to get a high-paying job, and more than likely to be viewed, from a law enforcement standpoint, as more criminally inclined than their Caucasian counterparts. Why? Because the machine replicates and reflects the world as it exists, and this world is dominated by racist views regarding blacks and other people of color.
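The pattern the doctor describes, a machine replaying the bias baked into its training data, can be sketched in a few lines of Python. The groups, approval rates, and records below are invented for illustration only; the point is simply that a model fit to a skewed history reproduces that skew when judging new applicants.

```python
from collections import defaultdict

def train(records):
    """Learn each group's historical approval rate -- bias and all."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in records:
        totals[group] += 1
        approvals[group] += approved
    return {g: approvals[g] / totals[g] for g in totals}

def decide(rates, group):
    """The machine 'judges' a new applicant purely by that group's history."""
    return rates[group] >= 0.5

# Invented, skewed history: group_x was mostly denied, group_y mostly approved.
history = ([("group_x", 0)] * 8 + [("group_x", 1)] * 2
           + [("group_y", 1)] * 8 + [("group_y", 0)] * 2)

rates = train(history)            # {"group_x": 0.2, "group_y": 0.8}
print(decide(rates, "group_x"))   # False -- the old denial pattern is replayed
print(decide(rates, "group_y"))   # True
```

Nothing in the code is "racist" on its face; the unfairness rides in entirely on the data, which is exactly the coded gaze the doctor has been describing.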
Let's stop there for a bit. Wow. AI, artificial intelligence. Wow, wow, wow.
(25:47):
If you've just tuned in this day, how are you doing? We are talking with doctor Warren Harper, psychologist, mental health specialist, race man, as he likes to say. Yeah, we're talking, talking about the Unmasking Artificial Intelligence. Unmasking Artificial Intelligence, My Mission to Protect What Is Human in
(26:10):
a World of Machines, by doctor Joy Buolamwini. If you have a question or two, or have a comment, we invite you to call: eight three three, five three five, nine three four two; eight hundred, five zero three, nine three four two; or nine zero one, five three five, nine three four two will get you in
(26:36):
to us. You're listening to WDIA. You're listening to the
(27:03):
Queen of Talk on WDIA.
(27:40):
[Station jingle] Yes, good morning, and welcome back. We are talking
(28:07):
about Unmasking Artificial Intelligence, My Mission to Protect What Is Human in a World of Machines, by doctor Joy Buolamwini. Doctor Warren Harper is here. Doctor Harper, we're going to our phone lines and talk with you. Hi, Marcus. Ah, yes, doctor Harper. Greetings. Marcus, it is a pleasure to
(28:33):
hear you, doctor Harper. You bring some sunny to the madness, you know. Well, I'm looking at the cover of the book, and the lady you spoke about, the Joy, doctor Joy, and it resembles so much
(28:56):
Frantz Fanon's book. I said that, yes: Black Skin, White Masks, you know. And it says, that's, that's what they've been programmed into. Black folks, doing this artificial intelligence, putting, putting this in for me,
(29:17):
you know, our Marina. You know, that's what they've been doing. But I tell you something new. On the back end of everything, and yes, there are negatives, but on the back end, you see, it's a machine, and it breaks down. You say it's prone to break down. And I tell you, if our kids could learn basic algebra, just basic
(29:42):
algebra, you don't even need calculus, just basic algebra, you could get, you could get really good jobs, you know, to repair those machines, because they're gonna break. And right there at that school there that you teach at, Bev, that Southwest, the community college, you know, that college should be filled up.
(30:03):
Yeah, that science department should be filled up with our kids learning, you know what I'm saying? Because that's an institution that is underutilized, you know. Yes, but you know, that's all I got. You know, we, we need to learn, beginning with algebra. We can fix them. But, she, yes, you know, we know that there's negative, but there is
(30:29):
positive too, because, I mean, there's forty-dollar or fifty-dollar-an-hour jobs, you know. That's, that's decent money. Anyway, that's all I got. Take care. Thanks, Marcus. WDIA, hi,
Donnell. Hey, uh, doctor Harper, doctor, Big P. Well, I
(30:51):
wanted to make sure I got the name of the book. I guess when you were naming the name of the book, I was in and out of the car and whatnot, and, uh, and I knew I was gonna miss it, but I had to take care of something. And I just need you to repeat that and the name of the author. And it made me think about this M.I.T. student. I saw that she was being interviewed. She, she
(31:18):
basically came up with a formula. Well, I'm not saying a formula, but her, her thing was basically describing what you were talking about. It could be that they followed her, you know, to make this book, or she's the author of the book. I'm not sure, but I know she, she basically showed the bias in, uh, her thesis and whatnot. And I think
(31:44):
the name of the documentary was Coded Bias. You got it. It's on Netflix, Coded Bias. Okay, okay. And, and, and it's a funny thing. I just saw one of the guys on the show where they, you know, they take these people that have these ideas and they finance, uh,
(32:08):
the, I guess, whatever product they have. Anyway, this guy, one of, one of the members, he was on some kind of, uh, I guess, interview, basically stating he's gonna start using this, this technology, which made me think initially of the Coded Bias. He's going to use it to basically weed out
(32:30):
people that are out there protesting. That's what, that's what they're doing with it. Wow. Okay. I wanted to make sure you repeat the name of the book and the author of the book so I can follow up on it.
(32:51):
Okay, the name of the book. The name of the book is Unmasking AI. Unmasking AI, and then, My Mission to Protect What Is Human in a World of Machines. That's a bold, that's a bold statement. It is, I say to you. She's deep, and she does not play, because she went up
(33:13):
against all the big boys, the IBMs and everybody. And she says in the book that she was intimidated by that, but she had to stand up for her people. So it's Unmasking AI, My Mission to Protect What Is Human in a World of Machines, by doctor Joy Buolamwini, B-U-O-L-A-M-W-I-N-I. And check out her Netflix special, and she's also on
(33:37):
TED Talks too. Okay, great. Thank you, thank you, Darnell. WDIA, hi, Dave. Good, good morning, my most, we got one more left. Afternoon. Good morning, my most wonderful. Good morning, my David. How are you, brother? Good to hear your voice.
(34:00):
Sister, it is great to hear your melodious voice today. And let me first say something to Brother Doc. How you doing as well? Happy belated birthday to you. And since I'm not going to be on the call tomorrow, I wanted to make sure I got a very, very happy birthday to my
(34:22):
favorite, that Hall of Fame radio host, Beverly Lane Johnson. Now, Miss Johnson, I did not come with a song today, but I want you to know two things: I got your fish plate in the mail, delivered United States
(34:45):
Postal Service. And with every letter in your name, it's a song in my heart. And, well, you know, I could say, look, I'm gonna try and make it home from Pycs tomorrow, for... But hey, David, hey, David, you don't have to seduce me. I'm
(35:07):
already yours, I'm already. And I saw, and I saw the picture, somebody sent the giant pictures, when your boy was in town too. And said, oh, yeah, he kicked off my birthday month. Kim, he did. But I'm glad that you had time to take a break, because it rejuvenates you and me and others when we do take a break. And I'm
(35:29):
certainly, I'm enjoying mine as well, although I certainly miss you and talking to you. Let me say, Brother Doc and Bev, and I don't know whether you had made it back in town yet, Bev, but 60 Minutes had a piece, and I already knew about the two sisters, two brilliant African American high school students at the time, that came up with a solution that had
(35:52):
not been solved in two thousand years, a math solution. And so, if you get a chance, please go by 60 Minutes and check it out. One of the, one of the, one of the sisters, since they're in college now, is attending an HBCU, Xavier, and she's a pharmacy student.
(36:12):
And the other one is going to LSU, a PWI, and she's majoring in environmental engineering. Now, the basis of computer science, of course, is math. That's what it is. So these two were asked by the brother, are y'all going to go further with math? Both of them said, no, we're done. Because how do you, how
(36:36):
do you go farther than what they've gone? They solved something that could not be solved in two thousand years, Brother Doc. This is, see, this is, this is the brilliance of our people, and they go to it. They went to an all-girls school, at which they were encouraged that there was nothing, there was nothing that they could not accomplish. And we know, Brother Doc and Bev, if our babies are surrounded in a circle by our people
(37:01):
and encouraged that they can accomplish even greater things than that. The great tragedy of white supremacy and racism is that what you do is that you lock out solutions of major things in this world that could have been solved, only if you didn't have that racism in it. Now, what they have done, Doc. And
(37:24):
I got a call from a brother about, about a year ago, who wanted, who wanted me to look at business planning through AI. And I said, yeah, I'm kind of familiar with it, but I hadn't been keeping up with it, to be honest with you. And I started asking the questions that you ask of new technology, particularly new technology that's being implemented. And he got kind of
(37:45):
impatient with me, and I said, okay, well, fine, I got some other things I'm doing anyway. Let me say that, Doc: I've been around tech people a long time. To the tech world out there, I'll say it straight: they lie. They gonna always lie. They gonna always say that it's perfect, because when you're selling something, that's what you do. Now,
(38:05):
Do they care about the buyers?Hell no, they care about the
ROI sixty minutes just says in thevideo of CEO who their hardware is used
for this stuff and it takes verypowerful processes, very brilliant guy. And
it was clear to me that man, that's an asterthought. The whole idea
(38:27):
of how this thing is going to change the world is an afterthought. Think of that for a second. Because this is what I was thinking only last week. I said, you know what, with AI, we wouldn't even know whether George Floyd was actually killed or not, because they're going to say it's fake, it didn't happen. See, so they haven't looked at all the implications of this. Why? Because there's always a profit motive. I
(38:50):
will say that Bill Gates, one of my heroes in technology, has come out with his own concerns about this stuff. You know, I can't remember whether Musk has or not. But the fact of the matter is, once you let it out of the bag, it's over. Because one thing that I've always done, I've had my go-to people in technology, because I asked them, people
(39:12):
people who have actually done some serious hacking, I said, all-night hacking, I said, man, how can I stop you from doing this? Right? You know? And so they give me some things. And most of the time, Doc and Bev, it's not one person sitting in a room, it's a group of people, maybe a nation. Right? Think about what's going to happen now: if you got AI in hacking and they use AI to hack, right, the only way to stop
(39:36):
it is, on the other side, you got to have AI. That's right. See, this is like the movie, I can't remember if it was honestly 2001 or the next one they had, but I saw both of them so long ago. But in that movie, folks, this is what happened: the computer was asked to do
(39:58):
something, and it said, I'm sorry, I can't do that. I'm sorry, I can't do that. Right. So the point is that it's always a profit motive. They're not thinking of the implications. And if it's a profit motive, black folks and bias, they ain't gonna give two shakes of a rat's behind
(40:19):
about that. To God, all that is is them saying, well, you're going to have some things that happen that's going to be negative, but, you know, the fact that people are arrested and put in jail in handcuffs and have an experience that changes their life, well, that just happens with new technologies, yada, yada, yada. But, David, yes, let me, let me read something, okay? Doctor Buolamwini
(40:43):
received an invitation from the House Committee on Oversight and Reform to testify at a congressional hearing. The hearing was chaired by Elijah Cummings from Maryland. A question from Ocasio-Cortez, who stated: Doctor Buolamwini, I heard your opening statement. We saw that these algorithms are effective to different degrees. Are they most effective on women? She said no. Are they most effective on people of color?
(41:07):
Absolutely not. Are they most effective on people of different gender expressions? She said no. In fact, it excludes different gender expressions. Which demographic is it most effective on? Her answer: white men. And who are the primary engineers and designers of these algorithms? The sister said definitely white men. Ocasio-Cortez: So we have a technology that was created
(41:30):
and designed by white males that is most effective on white males, and they're trying to sell it and impose it on the entirety of the country. So we have a white data set being used as something that's universal, when that isn't actually the case when it comes to representing the full scope of humanity. And do you think it could exacerbate the already egregious inequities in
(41:55):
our community, in the criminal justice system? The answer was: it already is.
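The disparity the congresswoman is drawing out is, at bottom, a per-group accuracy audit: instead of reporting one blended number, you break the error rate out by demographic. A minimal sketch of that idea, where the function name and the toy records are invented purely for illustration, not data from Doctor Buolamwini's research:

```python
# Sketch of a per-demographic accuracy audit for a classifier.
# Each record is (group, predicted_label, true_label); the records
# below are made up solely to show the calculation.

def accuracy_by_group(records):
    totals, correct = {}, {}
    for group, predicted, actual in records:
        totals[group] = totals.get(group, 0) + 1
        if predicted == actual:
            correct[group] = correct.get(group, 0) + 1
    # Report accuracy per group, not one overall number for everyone.
    return {g: correct.get(g, 0) / totals[g] for g in totals}

records = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "female", "female"),
    ("darker-skinned female", "male", "female"),  # misclassified
]

print(accuracy_by_group(records))
```

An overall accuracy of 75 percent on these toy records would hide that one group is classified perfectly while the other is wrong half the time, which is exactly the kind of gap the testimony describes.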
Yeah. Well, one other thing, it's the same in medicine. I mean, how could you be in a first-world country and black women have the type of issues they have with childbirth? Just think about that. So, but anyway, I enjoyed both of y'all, my sister, my brother, I
(42:17):
hug y'all, be safe, and thank you. Thanks for sharing. And I had not ever heard of her; this is the first time I heard of it. I look forward to getting that book. What you think, my brother over there, would he have it? Brother Norman, you know, I stopped over there and prepared him. He's going to have it. Yeah, okay, my brother. I appreciate that. I'll give him a call. Thank you, bye-bye. Brother Bernard. Yes, ma'am,
(42:44):
Miss Johnson, Doctor Harper, how are you? I'm well, I'm well, thank God. As I was listening to the commentary about this unmasking of artificial intelligence, my thought was that, you know, a lot of companies and these organizations are replacing the hands of human beings with technology,
(43:12):
and I know technology can be used for the better good, the greater good per se, you know, for medical purposes and so on, reconstructive surgeries or whatnot. But when you notice just what's going on, immediately, you know, you have Kroger, for instance, you
(43:36):
have cashiers being replaced with machines. And when the Bible says that the love of money is the root of all evil, that's a large statement, to say that the love of money is the root of all evil. And
(43:58):
when you, you know, you just glance at the gas stations, they're starting to do the self-service and so on. It's almost as though we're getting out of the business of serving one another, and technology serves us, which replaces us in the labor force. Brother, now let me ask you this
(44:22):
question. Do you believe that the love of money is the root of all evil? Do you think so? I'm just curious. I do, because when you start to love money instead of loving on people,
(44:42):
you start to see people as an object, and not as a sensible human being. You see, people are simply a number or an item on a shelf, and it has been that way for a good while.
(45:04):
I can remember at the University of Memphis, I was studying economics, and an Indian brother was teaching the course. And when he started to kind of dissect the economy, about, you know, quantity supplied, quantity demanded, and so on, I started viewing it,
(45:28):
and I said, you know, this almost seems somewhat like modern slavery, because the business organizations are concerned with maximizing output per unit, and the units are the workers. And so it's like, let's maximize what we can get out of one person in a
(45:49):
given amount of time. Okay. And the thing about it is that we're all human beings. We're all human beings, and it's like there's a lot of greed, where everyone is focused on trying to have something without necessarily working, and whoever can win in that race is able to just sit
(46:13):
down and not labor, you know, not work or anything of that nature, just kind of sit down and watch numbers or whatnot. But I do believe that the love for money is the root of all evil, because of everything that goes on around us. I believe it was the Wu-Tang Clan, Method Man and them, that says, cash rules everything around me,
(46:37):
C.R.E.A.M., get the money. I mean, everything. When you see the lights are on, if you see a car moving, that's money, because they're charging for gas, they're charging for that car note, someone bought that car, someone's paying taxpayer dollars to pave the street. And so it's just, you know, it's getting out of hand. And I
(46:59):
think this is where we are with crime. You know, people are murdering people, murdering mothers, murdering fathers, murdering children, and people are putting money before people. And so basically this humane society has become inhumane. You know, we are not sensible to
(47:22):
human beings. We're sensible to objects. The objects and technology have taken over our sensibility. So everything is about strictly business and money. And if they say it's strictly business, that means it's strictly about money. And
(47:43):
so, yeah, I believe the love for money. But you know what, I don't think we love money. I think we're obsessed with it, and we've become addicted. So there is an addiction to money that has taken over. And if you're addicted to something, you're willing to do anything to get it. Yeah, okay, okay. And so,
(48:07):
Doctor, I appreciate you for weighing in on that. I kind of googled the book, the reading per se, just to kind of get a glimpse of it. But yeah, technology is replacing everybody. People are exing people out, you know, for the love of money
(48:28):
and so on, and jobs and everything. And, you know, the companies don't say, well, what's going to happen to the families of the households that we eliminate when we put these machines in, these computers in, to replace their jobs? Now everybody just has to be shifted elsewhere, and at some point,
(48:52):
you know, it has a certain effect on the community. So I just think we need to get back to love the way Doctor Martin Luther King preached on, and things of that nature, because, you know, people have to take care of people. And I'm not gonna go further on that, but thank you, Miss Johnson, for taking my call. You're welcome. Thank you,
(49:14):
Brother Bernard. Father, Father, hold on, let me get the record of what you're doing. Father. Oh, I'll sit up here in heavy prayer, and I'm glad I came up. Okay, okay, go on, Father. Look, well, I'm glad. It's lucky I even heard you with
(49:36):
all that. I got that right, got that right. So look, what I was saying, to interject, is the human intelligence, I mean, artificial intelligence. And I just think about how, when you go and buy beer, you've got ID, and then they use your ID, just scan it, for the beer?
(50:00):
And then with that information, man, what in the world is that all about? So they would know, you know, what times that you buy, how many brands you have, whether you done changed up over the years. Yeah, you know, man, they can go on and on and on, and I don't like giving out that information and, you know,
(50:21):
nothing like that. It's like me knowing, you know, what kind of wine you got in your wine cellar. What am I gonna do with that? Anyway, for me that wouldn't be nothing threatening, but with them, it's a whole different story. It's just none of your business, period. And Bev, I missed you while you were gone. Yeah, well, the brother, when
(50:42):
he stood in on your show, Bev. Now, I'm a person like this, I'm very electrical, and the brother is from back in the day. And really, it was like racist, and I'm sure if they found out you black, they ain't want to
(51:06):
talk. And so he don't really like me here. I like him, you know what he's saying, but I don't like his game, what he's doing. How you doing it like that? Because that oppression is racism. And so, you know, I like reality. He want to jump way back in the day. What's going on now? Brother, I'm talking about what's going on now. So anyway. All right, Father. Well, thank you, Father, for that report here. Yeah, yeah,
(51:30):
uh-huh. And another thing, too, Bev, a little shout-out, you know I always shout out to all of them. Missus Jennings, put love on it. I'll be glad when she come in and do a show, if she got to tell the folks, you know, hey, I got sick, and then exactly when I came in. All right, Father. She can't mess up her job down there.
(51:52):
But you know what, that's a good idea. I'll see if she's available next time. We would love to have her. All right, Father. Thank you, Father Figure. We are talking this day. Hold on, callers, we are going to get with you. Doctor Harper is in the house. The topic of conversation, you know about AI, do
(52:15):
you? Artificial intelligence. We're talking about the book Unmasking AI: My Mission to Protect What Is Human in a World of Machines by Doctor Joy Buolamwini. Hold on, callers, we'll get to you. Doctor Harper will continue his session. Eight three three, five three five, nine three four two; eight three three,
(52:39):
five three five, nine three four two; eight hundred, five zero three, nine three four two; nine zero one, five three five, nine three four two will get you in to us. We're going to the other side of the Bev Johnson Show on WDIA, whether you're in Arkansas, Tennessee, or Mississippi, on Facebook, Twitter,
(53:14):
or Instagram. Thank you for listening to the Bev Johnson Show on WDIA Memphis. Listen on the free iHeartRadio app for all your music, radio, and podcasts, free. Never sounded so good. With the heart and soul of
(53:39):
Memphis, ten seventy, WDIA.
(54:01):
[theme music]
(54:21):
Good afternoon, and welcome back to the second half of the Bev Johnson Show. We are discussing the book Unmasking AI: My
(54:42):
Mission to Protect What Is Human in a World of Machines by Doctor Joy Buolamwini. Doctor Warren Harper is here, psychologist, mental health specialist, Raceman Doctor Harper. We're going back to our phone lines to talk with Earlene. Thank you for waiting. Earlene. Hi, Bev, how are you? Doing well today, and yourself? I'm good, I thank you. Thank you
(55:06):
for taking my call. Doctor Harper, I just finished reading the book Blacks in Science: Ancient and Modern, and it is an awesome book. Awesome book. I'm
(55:27):
taking a series of classes in my community called Racial Disparity, and this book has really helped me to shed light on some of the inventions and things that I just hadn't known about for a long period of time. So I thank you for coming on this show, sharing all your knowledge and your wisdom with us. I need to know, I live in Illinois, I need to know the address of the bookstore in Memphis that I could possibly get some
(55:52):
of these books from. Okay, we will. If you keep listening, Earlene, I'll get that address and I'll say it on the air. I know it's on Vollintine, but I'll get the address, okay, because I'm driving right now. Right. Okay, good. Well, I'll give you a chance to, yeah, do that, and I will give the address for the African Village Institute. Yeah, okay. And about the
(56:19):
AI, I am interested in AI because I'm a photographer, yes, and I would like to learn to use the skills to, you know, manipulate my pictures and stuff like that. So yeah, that's why I'm interested in it. And I know we have to evolve as things continue to progress in this country and other countries, so we have to be ready for it. We
(56:42):
do. Yeah. So anyway, well, thank you for taking my call. Glad you had a vacation. I did, I did, thank you, Earlene, and I'll make sure I give that address before we go off the air. All right, thank you, guys. Be blessed. Don't let nobody steal your joy. Got that right, sister, you too. Bye-bye.
(57:07):
Hi, Tiffany. Hey, Miss Bev, how you doing? I'm doing well today, Tiffany, how are you? I'm fair, I'm fair. Okay. So as far as AI go, it's pros and cons to everything, so, good. I'm more on the con side than the pro side, because I still, oh, hi, Doctor, I'm sorry. I feel like we've
(57:30):
been warned or forewarned about the AI and the changes of the world, since, well, they put everything in a movie before they let it happen, just like Enemy of the State, just like I, Robot. They put that out there before it even happened. So, as far as AI go, with me, with the droids and the falsifying of pictures, how they falsify pictures, and just
(57:54):
anything that's negative about it. It's sad that I'm focused on that part, because I want to be an optimistic person, but I have to pay attention to everything that's negative about it. It's too much going on in the world today on God's green earth for us not to focus on how AI is kind of destructive. It's getting into all our business, it's falsifying pictures, and we're able
(58:21):
to do criminal activities with it. It's just not as good as they portray it to be. It's helpful in some ways, yes, and we have to grow, every generation, we must grow. But, you know, each time I see these gurus on TV selling AI, they always talk about how it's going to create all these jobs for
(58:45):
people, and people in the audience usually say, well, is it going to take away jobs? They say, no, no, no, it's going to create jobs. Believe me, it is going to absorb all jobs at some point. You know, already there are over one hundred and seventeen million people in this country that have their face in a facial recognition network that's used by the
(59:06):
police department. I'm pretty sure mine's in there. Bev's might be in there, you know, anyone that they think might be of influence, okay, or might be suspected of something. They're gonna have our face in that facial recognition for the police to search. Believe me, it's scary. But keep God first. I'm trying to keep God first. Well,
(59:30):
I do keep God first in everything, because I know this world is what it is, what it is, but we've been warned. They put it in our face, they hid it in plain view. They knew what they were gonna do. And at the Walmart, the lines are extra long now because the machines do all the work. They want you to wait in the line
(59:50):
and serve yourself. If I had a choice, if we had a choice to keep it or dismiss it... Well, I tell you, in terms of artificial intelligence, you know, what they found is that in these self-driving cars, yeah, the artificial intelligence, the
(01:00:15):
recognition does not pick up black faces very well. So therefore we may get run over. Yeah, we could get run over, because the machinery is not advanced at that level, so they're still trying to perfect it. Okay. So it's problematic for us, really. Okay, okay,
(01:00:36):
all right, thank you for your call. Thank you for listening, Tiffany. WDIA. Hi. Yes, I appreciate you letting me speak. I do want to have something to say about it, it's not gonna be long, the AI stuff you're talking about. I want to ask the doctor, don't
(01:00:59):
you think it's something in what we do to each other? Mostly what's happening is what we are doing. You look at what you're talking about on paper and how to identify it, but we're identifying ourselves as being criminals to each other, that's what I'm talking about. See, the main thing, if it's deep enough to see, is the man trained us to hate and hurt each other. Anything they want to know about
(01:01:20):
us, they ask us. We do it, and they don't even have to ask, we do it to each other anyway. We are dogmatic, and our men, the male, and I'm a male myself, an old male, we're dogmatic to our race. We're dogmatic to ourselves, period, with drugs and drinking and smoking, and we ask for help. And you can read the books all you want to understand, I'm thinking about your book.
(01:01:43):
What we need to realize is what that man has taught us and put in us. We need to try to take some of that out of some of us some kind of way so we can get along. We're killing each other. And I was looking at the young lady yesterday on the news that's supposed to have killed the man who supposed to have raped her. See, and I don't think that was it, when she was brought to snitch in front of
(01:02:04):
her boyfriend. I think what happened, she didn't do that. I believe, I don't know, her mother had a work vest on, she been working in a yellow vest, and she turned her daughter in. And I believe what happened, she went to do that act with that guy, and that guy walked back there to where they were, and the man resisted, he shot him. I don't believe the girl even did that. Our black males,
(01:02:25):
and I'm a black male, we're doing a lot to our race, I promise you. Look at Haiti over there, they're black. We have a thing for killing each other. We love to kill each other. The man ain't got the key, we just love it, like to kill one another, that's good for little or nothing. And I just wanted to say that, Doctor. But it kind of gets me when people, we read it, they put it in
(01:02:49):
a book, that's good, but most of us don't even read the book, you see what I'm saying. But it's good for you to come on the radio like you were doing and tell us what you feel, and that's good, because most of the ones you reach... Oh, you're welcome. Go ahead, Doctor Harper. But a lot of people do read. And, you know, Doctor Bobby Wright said something that was very, very,
(01:03:13):
very profound. He's an ancestor now. He was a psychologist, and I would hope to be like him at some point. Doctor Bobby Wright said blacks kill blacks because they were never trained to kill whites. Now let me unpack that, because people think, oh, that's racist. No, no, no. What he's really saying is that we have taken in, we have internalized, all this racial hatred, and we have been rewarded
(01:03:37):
almost for killing each other. You know, kill a white person, you're going to prison. Kill a black person? Hell, kill a black policeman, you're out on bond the next night. I mean, what the hell, I can't understand that, right? So that's what he's referring to. We have not been trained to kill any other group except ourselves, because we are literally rewarded for doing
(01:03:59):
such. You're right, Doctor Harper. WDIA. Hi, Mister James. Hello, Bev Johnson, welcome back. Hello, Doctor Harper, welcome back. I just want to jump into it. That's a lot to cover, but I'm gonna try to condense it and get it out of the way. Number one, I want to say to black people, we got
(01:04:20):
to vote. We just got to vote. We can't sit on the sidelines and just talk, talk, talk. We got to get out and vote, we got to get our people out to vote. It is so important for us to vote. Now, you mentioned Congresswoman Ocasio-Cortez talking about AI. That lady always asks the right questions. She always asks the right questions. That's the
(01:04:43):
reason Trump owes half a billion dollars to New York right now, because she asks the right questions. I can't hear you. I'll let that rest, because right now in Baton Rouge, they are trying to get their own city where they don't want black people to be in
(01:05:06):
it. They don't want black people around it. But to me, that's great, that's great. I love that, because we have to do the same thing. We have to get back to our communities. We've got to shop black, buy black, support black, hire
(01:05:30):
black. We got to take back over our community. It seems like everybody else want their own communities, but us, we want to integrate. Everybody else is segregated. We got to get segregated, we got to get back into our own community. And when you was talking about AI and the coders, it's no different from a polygraph test: it depends on who's giving it,
(01:05:55):
a racist white person. A job interview: it depends on who's giving it, a racist white person. Interrogations: it depends on who's doing it, a racist policeman. Arrests: it depends on who's arresting, white arresting officers. So we have to really get out and vote for Congresspeople and politicians who want to kind of put a halt to this AI, and I don't have any problems with that, but we definitely got to get out and vote. But I read where, when AI takes pictures of individuals, the
(01:06:20):
only way you can tell is by the eyes. They said the eyes are dead. Have you heard that? And I guess the eyes are the light to the soul, or something like that, they said that. But have you heard anything like that, where if you take a photograph with AI? And that's why a lot of those actors, that's why they
(01:06:43):
went on strike, because of AI. But have you heard anything like that? And I just want to thank you so much for all the information, James. Yeah, I haven't heard of that. Haven't heard of that. WDIA. Hi, caller. Hello, Bev. Hey, Doctor J from the Bay. Yes. Good, good to hear. You have a nice time on your vacation? I did, I did, I had a wonderful
(01:07:08):
time. Yeah. Good, go enjoy yourself, girl. Anyway, look here, have you heard of Claud Anderson, Doctor Claud Anderson, PowerNomics? Oh yeah, yes, Claud Anderson, yes, sir. Yeah, okay. You know, he said,
(01:07:30):
they say money is the root of all evil. He said, well, I don't have any money and my people don't have any money, am I evil? They said money can't buy love, but if you got no finance, there's no romance. Then this lady that was in the church, though, he does a lot of lectures in their churches, she said that money can't get you to heaven. He said, I would go as far as money
(01:07:53):
takes me, and I'd walk the rest of the way. Well, you remember the old O'Jays song, For the Love of Money. I do, yes. Yes, sir, I know they're listening. That's all right, yes. But a person that would do anything for money would do anything to us. Most of your money is a Federal Reserve note. The Federal Reserve Bank
(01:08:17):
can cancel it tomorrow, anytime they want to. Artificial intelligence is self-explanatory: artificial means it ain't real. Right. Technology eliminates social skills; it takes away your ability to think. Thank you, Bev, for taking my call.
(01:08:40):
You are so welcome, Doctor J. Have a good, good day today, be safe. You enjoy your birthday tomorrow, baby. I will, thank you so much. All right, thank you, bye-bye. Hi, Common Man. I'm doing well, Common Man, and yourself? I'm doing okay. How you doing, Doctor Harper? Doing fine, brother. All right,
(01:09:00):
all right. I missed the majority of your show. I heard you asking a question somewhere, I heard you call out names, but I was on the phone and didn't get a chance to understand what you were talking about. Yeah, but I put the question out there: what would you do if you were accosted by the police because your
(01:09:23):
picture showed up as a match to some criminal, and you're accosted, you're pulled over, you're forced to give fingerprints, you're subjected to what, they won't tell you exactly why they're doing it. And all of a sudden you're sitting in the squad car fifteen minutes later, fuming, and then the machine says, it's not a match, let him go. What would you do? Right. With my mindset now, I would hire an attorney
(01:09:47):
and file a lawsuit. Thank you, thank you. Yeah. And I've actually had that happen to me once before, something very similar to that. I was on my way to the store one night, I was in my early twenties, I remember, and I was on my way out on a date with a young lady friend of mine, and we stopped at
(01:10:08):
the store on the way. In the store I saw somebody that I knew, so we stood there a moment, the two of us had a short conversation. While doing that, a police car rides by and he turns around. So I go on in the store. He comes in and takes me out of the store, puts me in the back seat of the car, takes me around the block, and the lady say, yeah, that's him.
(01:10:30):
The lady pointed me out and said I had just carjacked her. Wow. Wow. I was yelling and screaming, my car is in front of the store with my lady friend in it, running, and she said, that's him. They took me on their little ride so she could look and see, and she said, that's him, you know, and I was yelling and screaming. I didn't know what was going on, didn't know what the accusation was at the moment, but come to find out the lady was saying that I had
(01:10:53):
carjacked her. Coincidentally, just as soon as she said that, they got a radio call saying that they had caught the suspects with the vehicle. Is that something? Mm. That was the way I was taken back to my car then, but it was really a terrible moment for me. And at that time, I was not wise enough to get an attorney, you know. Yeah. But anyway, I wanted to speak on the technology
(01:11:16):
and the money being the root of all evil, you know. Back around about two thousand and three, I took this little course in Texas, and it was a course to learn to solder, you know, which is somewhat similar to welding. And in this plant there were robotic machines. So that board that you punch those numbers in on your microwave, and
(01:11:41):
that board is just a little thing with a lot of pieces of metal soldered together that basically has a brain. We punch those numbers and it activates to build. It would take me probably about an hour to solder all those little bitty pieces on the back of the board. With that robotic machine, it was done in like five seconds.
(01:12:01):
That's right, that's right. Who needs you? That's right. The only thing they needed me for was to program the machine. But once I do, which takes about ten seconds, the machine puts that whole board together in less than five seconds. So of course it takes away jobs for human beings. Normally it would take, you know, forty to fifty people to run that
(01:12:25):
warehouse; once they put the robotic machines in, three or four people could run the entire warehouse. Also, Walmart, I think, if I'm not mistaken, Walmart plans to go completely self-checkout, I think it's Walmart with those self-checkout lines. That's another scenario of the technology, you know. And
(01:12:47):
then a lot of things with these phones. A lot of times we don't study and read anymore, because we look up everything on the phone. All the information is there, you know. So there's a lot of damaging things being done to the human being when it comes to this technology, just as well as, you know, the opportunity to obtain this information. But you
(01:13:08):
may be cheating yourself, because you're not learning for yourself when it comes to, you know, knowledge. That's exactly right. And believe me, we don't need to be no dumber and dumber. We have to get smart, right. We have to, uh-huh. And I'm gonna tell you real quick, and I'll let y'all go, I know it's getting short, Bev. You asked the brother about, you know, you asked the question, do you think money is the root of all evil, you believe that, you know?
(01:13:35):
There's a small word in there, 'all,' you know. But I guess when we think about, you know, what people do for money, what some people do for money. Some people do some terrible things for money. Yes. You know, some people do some terrible things, I'm talking about from robbery, you know, murders, selling drugs, prostitution, all these kinds
(01:14:00):
of things. And the caller a couple of calls before me said, a person that would, what he said, however he said it, they'll do anything for money. In most cases, most people that do those types of things will do just about anything for money, you know. And so I would be reluctant to say it's the root of all evil, but
(01:14:23):
I will say that money does bring a lot of evil along. People do a lot of evil things for money. You remember, Common Man, what Reverend Ike used to say: the lack of money is the root of all evil. That was what
(01:14:45):
he would say, the lack of money was the root of all evil. Mm-hmm. But, you know, I think about it. In some cases it's sad, some of the things that people will do for money, really. Yeah, whether it be man or woman. You're right, you're right. All right, bye-bye. Unforgettable. Hey, I
(01:15:12):
just want to say, Doctor Harper, welcome back, man. Y'all got a great show going on over here, I'm telling you, man, dealing with this AI situation. Doctor Harper, it's one thing that really made me mad the other day, and I just want to throw it out to the public. Do you remember the incident that happened with, what's the lady named, Kim Kardashian?
(01:15:32):
What about it? Say what, what incident? Well, Kim Kardashian, they was at that, you know what they call it, roasting somebody. Oh, she was at the roast of Brady, the ex-football player Tom Brady. Yeah, and she was talking about Kevin Hart,
(01:15:56):
how short he is, okay, so she's up there joking and stuff, and there was a lot of booing and all that kind of stuff going on in that episode while she was talking and stuff. And remember back in the day, Doctor Harper, with Martin Luther King, when they had his speech with nobody out there? They
(01:16:23):
edited it, where they filmed it in a way, and they showed it on television. I think it was for a commercial, and, you know, they made it seem like he was just standing there by hisself with nobody out there with that speech. I think it might have been the I Have a Dream speech, something like that. But the point that I'm trying to get to: this AI situation cut
(01:16:46):
all that out, only showed him speaking from a podium with nobody out there,
and I was so pissed off. That was back in the eighties,
so this technology has been out there for a long time. It's just got
more advanced. You know, I'm gonna tell you, Doctor Harper, we're
gonna end up being in danger in a way. You know,
(01:17:10):
look, people talking about AI. AI is good, but it's not
good for the public. You know, it's good for maybe the local and
state and federal government, but out here for the public it's no good, because
look at what's going on with all this hacking and all kinds of stuff.
You know, people, you know, that's a danger. You know, that's
(01:17:30):
what I'm looking at. So I think it shouldn't be out here, period, for
the public. Okay, you know, it just ain't, because, Doctor Harper, it's
a danger, man, and people don't even want to see it. I mean,
it's systematically, psychologically assimilating people, too, man, in society. And just
(01:17:50):
think about when you hear these callers talking about money, think about that.
I mean, money rules. Money is more important than God now. They
don't look at the humanity side of being a human being. Just think about
that, people, when you listen to people talk about money. I don't want it,
(01:18:12):
but I got to have it, because it's the only thing I can survive
on out here in this so-called world. I wish it didn't
exist, to be honest with you, in my opinion, because it's a
danger. People don't look at you. Well, money's been around since there was
a world. So is there money, you know, in Heaven? All right? Hey,
(01:18:38):
prove it to me. It might have been around since there was the world.
You said something. You said something, Bev Johnson, now prove it. Tell
me if it might have been around since there was the world. Well, Doctor Harper talked
about the richest man in the world, remember that? Mansa Musa? They always had
(01:19:02):
some money. Okay, think about that. Okay, thank you, thank
you for that, Miss Johnson. Okay, I'm talking to Miss Johnson
now. Tell me what you just said, Miss Johnson. I'm listening to you on the
radio. Okay, bye. I said money's been around since the world, right,
Doctor Harper. Okay, hold on, y'all, take this break, and
(01:19:28):
we're coming back to you as we talk about AI. The book is called,
hold on, callers, Unmasking AI: My Mission to Protect What Is Human
in a World of Machines, by Doctor Joy Buolamwini. Doctor Warren
Harper is in the house, and me. You're listening to the Bev Johnson Show
(01:19:50):
on WDIA. Ladies and gentlemen, you're listening to the Queen
of Talk, Bev Johnson, on WDIA. You're listening to the
(01:20:45):
Bev Johnson Show. Here's Bev Johnson. And we're gonna get back to AI
in just a few minutes. But you know, I've been missing, I've been
missing the Rocking Chair. Y'all know I've been missing the Rocking Chair. Well,
let me tell you about the Rocking Chair. If you don't know about
the Rocking Chair, the Rocking Chair of Memphis, fifteen forty two Elvis
(01:21:08):
Presley, has the best Southern soul food around. Yes it does. Let me
tell you, it's in South Memphis, fifteen forty two Elvis Presley, and Wednesday
through Sunday, they're serving up the best Southern soul food around, eleven a m to
five p m. And let me tell you, they have the tastiest fried chicken,
(01:21:30):
smothered pork chops or baked pork chops. Today it's Thursday. Hey,
catfish, baked and fried chicken, pot roast, buffalo fish, smothered turkey
necks, Miss Ann's chitlins. Yeah, yams, fresh picked greens,
macaroni and cheese, cabbage, green beans, northern beans, spaghetti, pinto
(01:21:51):
beans, corn bread. Yeah,if you are hungry, you can dine
in or take out. Let megive you that number to call. They
will have your plate waiting for you. Nine zero one four two five five
two six four. Nine zero one four two five five two six four. At the
(01:22:13):
Rocking Chair, we rock with entertainment, but we rock with the best Southern
soul food around, Wednesday through Sunday, eleven a m till five p m.
And we're getting ready tomorrow for my birthday celebration at the Rocking Chair. They're
giving me a birthday party tomorrow. Yeah, they are. So, uh, take out,
(01:22:33):
the Rocking Chair. But if you're hungry for this evening and tomorrow, stop
by there, the Rocking Chair, fifteen forty two Elvis Presley, where they are serving
up the best Southern soul food around. Nine zero one four two five five
two six four. Nine zero one four two five five two six four is
(01:22:55):
the number to call, dine in or take out. And when you go, tell
them Bev Johnson sent you, to the Rocking Chair, fifteen forty two Elvis Presley Boulevard.
(01:23:19):
We are going back to our phone lines to talk with you. Thank
you for waiting. Hi, Lily. Hello there, how are you? I'm
doing well today, and yourself? I'm great. Hi, Doctor Harper, can
you hear me? Yeah. Turn your radio down. Yeah? Oh yeah,
(01:23:40):
hey. Look, I'm the lady that talked with you last. I
want to flash back with you about the sterilizers, about the community, the environment,
the chemical that was being exposed to our area in Riverside. Can you kind
of remember that one? I do remember you. We won. Oh okay,
look, you see now, they were telling us that this was no problem.
Now, they're at one of the places that we were complaining about, the
(01:24:02):
sterilization place. If you notice the news, they have closed that place down
and they are moving away. Good. But they have been interviewing people whose
family died of cancer, or who are dying of cancer, and whatever, right there at Florida and
Mallory. It's been a long time coming, but they're still getting it
together. But they were telling us at some meeting that that was no
(01:24:23):
problem, and we knew it was a problem, because I've seen so many people
in each household on the street that I grew up on die, four and five
people in a household. So I knew it had to be something. And
that was twenty, thirty years ago. But what I wanted to say, too, about
our young people that's acting out. Do you remember the experiment that they
were doing with the Black men with the syphilis? They wouldn't treat them.
(01:24:46):
Yes, yeah, yeah. Okay. I'm kind of figuring that this is
probably still happening with our kids in our neighborhood at these clinics. We don't
know what they're injecting our young Black kids with when they're born, because
there's no way that these kids should be acting like this, you know.
I think about the kids. The people say, well, it's the, it's
(01:25:09):
the, it's the drugs, it's the crack cocaine, it's the food, it's
the guns. But if you think about it, other races of people, they
do drugs, crack cocaine. Other races of people eat certain food. And number
one, guns. Guns have been in white people's homes forever. They have guns
(01:25:30):
all in their safes. I mean, a bunch of guns. They teach their
kids how to shoot guns. They take them out at four or five years
old and teach them how to shoot. But you don't ever hear about
them harming each other like we do our kids. So I mean,
I say, it's got to be something they're injecting into our kids at the
early age that's doing it. Like, you say, a study on what's gonna
(01:25:53):
happen to them, like they did the men with the syphilis.
And I really believe that's what is happening. And I will tell you another
thing that we were talking about, a friend and I. You remember, Bev,
you might remember this, when they brought Church's Chicken into our neighborhoods, and how
they were saying that they had been using embalming fluid in those chickens to
(01:26:17):
make them so big, and they were all in our neighborhood. They were
injecting them with the chemicals to make them bigger. But all of this was
put into our neighborhood. So that's what I'm saying. I really believe that
this is what is going on, and they're just studying us, using our kids.
One lady worked at the pharmacy. She said she asked one of
the ladies, well, why do you keep giving our young, our people these generic
(01:26:40):
brands? But the lady kind of shooed her off, you know, like,
oh okay, you know, leave it alone. But I think that
this is what is happening to us and our kids. Okay. All right,
Lily. Thank you, Lily. Okay, thank you. So thank you, Doctor
Harper. Good topic, Doctor Harper. Any last words you'd like to say today?
And we want to continue this? Sure. Well, I think we
have to be vigilant. I think that we have to, again, go out and
(01:27:04):
take a look and buy books like this, and read, read about this lady's journey.
And believe me, I have not even talked about her research yet and
how she got started on this journey. But you know, these are one of
those situations whereby you're out there alone, and either
you are out there to help your people, or you're out there to help
(01:27:25):
yourself in terms of making money. She's dedicated towards helping African American people and
people of African descent, because she's got the knowledge and she has the know-how.
And she's gotten over any fear that she's had about confronting the IBMs
and all the big tech companies. They have to bend towards her now, because she
(01:27:45):
is something that has to be reckoned with. And I think the
next time we come on, which would be next Thursday, I'll talk about
her research, and I'll probably talk about some other things too. But I think
her research is fascinating, because it says that AI technology misidentifies Black people, people
of color, and women, and you cannot use a system like that, that
(01:28:06):
cannot identify you correctly, because the consequences of getting misidentified could be the taking away
of your life. Right, Doctor Warren Harper. And for the sister who
wanted the address of the Afrikan Village Institute, and it's spelled Afrikan, A f
r i k a n, Afrikan Village Institute. It's twelve twenty five Vollintine, Memphis,
(01:28:33):
Tennessee, three eight one zero seven. That's the Afrikan Village Institute, twelve twenty
five Vollintine, V O L L I N T I N E, Memphis,
Tennessee, three eight one zero seven, where you all can get all the books
(01:28:54):
that Doctor Harper talks about on the show. Thank you again, Doctor
Harper. Have a fabulous weekend. Be safe, brother. We appreciate you.
Well, thank you, brother. Thank you, brother, thank you. And thank you,
callers, thank you, listeners, for joining us this day on the Bev Johnson Show.
We do, we really do appreciate you. So until tomorrow, please
(01:29:19):
be safe, keep a cool head, y'all, don't let anyone steal your
joy. Until tomorrow, I'm Bev Johnson, and y'all keep the faith.
(01:29:42):
The views and opinions discussed on The Bev Johnson Show are those of the hosts
and callers, and not those of the staff and sponsors of WDIA.