
March 14, 2024 26 mins

Host Vanessa Tyler discusses the impact and the evolution of A.I. on the Black Community.  



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You have created a monster and it will destroy you.

Speaker 2 (00:03):
Is AI the monster out to destroy Black people?

Speaker 3 (00:08):
You know, to answer the question in a more nuanced way,
I think Black people should be aware of the potential risks.

Speaker 2 (00:17):
That's next on Blackland. And now, as a brown person,
you just feel so invisible.

Speaker 4 (00:24):
And where we're from, brothers and sisters.

Speaker 2 (00:28):
I welcome you to this joyful day.

Speaker 1 (00:30):
And we celebrate freedom.

Speaker 5 (00:31):
Where we are, I know someone's heard something.

Speaker 2 (00:37):
And where we're going.

Speaker 6 (00:38):
We the people means all the people.

Speaker 1 (00:40):
The Black Information Network presents Blackland with your host, Vanessa Tyler.

Speaker 7 (00:46):
With AI, fraudsters can take a three-second, and you all
know this, a three-second recording of your voice. I've watched
one of me a couple of times. I said, when
the hell did I say that?

Speaker 2 (01:00):
Well, you didn't, Mister President. He's talking about those deepfakes. Okay,
Afua and Nicole, you guys ready? With me, the most
brilliant minds on the topic. I especially wanted to talk
to these two sisters about AI and what it means
for us. Nicole Jackson is a computer designer and one

(01:20):
of the leading women in STEM, and Afua Bruce is
an engineer and data executive deeply committed to how AI can
be used for social good. Nicole, we've been hearing
bits and pieces about how AI will hurt Black people:
loss of low-wage jobs, totally racist facial recognition, and

(01:42):
election misinformation targeting us. Should Black people fear AI for
dragging racism into the future? But I think we do
need to be aware of what these systems are being
used to do, how that affects us as a community,
how that affects us long term from a systems view:
employment, healthcare, all the way through education and exposure.

(02:06):
But I also think we need to lean into it
and say, okay, now that we know that these
are risks, what are we going to do about it?
What are we going to do? I like that: data
in, data out. AI is up and running. Who's feeding it,
who's programming it? And is it already too late?

Speaker 3 (02:25):
Yeah?

Speaker 1 (02:25):
I think that.

Speaker 6 (02:26):
I think that is the question of the day and
the question of the coming days, because for all
of the efforts that have gone into creating AI systems,
while AI has been around for decades, it has only
been part of the public conversation for the last year or so. So actually,
let me step back just for a second and sort
of define what AI is, because I think people hear
the term but may not understand what it means. So

(02:48):
I think of AI as algorithms that process data
to predict or to produce information. Traditional AI uses
computer algorithms to process data to do things such as
analysis and predictions. Generative AI, which is ChatGPT and the
like, uses algorithms to generate content. And so all

(03:10):
of this is really powered by the underlying data systems,
and the data is created by, well, us.

Speaker 2 (03:18):
So this is where it gets tricky: human input, human bias,
human assumptions.

Speaker 6 (03:23):
So we do see in a lot of data systems
the masking, really, or overlooking or overgeneralizing of information,
especially as it relates to people of color, and
Black people specifically. If your AI system
is built on top of data that has bias

(03:43):
in it, the predictions, the processing, the analysis are going
to be somewhat biased.
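Afua's point, that a system built on biased data reproduces that bias, can be made concrete with a small sketch. The groups, numbers, and "model" here are all invented for illustration; real systems are far more complex, but the mechanism is the same:

```python
from collections import Counter

# Invented historical data: applicants from two hypothetical groups,
# where group "B" was approved far less often. The bias is in the data.
history = [("A", "approved")] * 90 + [("A", "denied")] * 10 \
        + [("B", "approved")] * 30 + [("B", "denied")] * 70

def train(data):
    by_group = {}
    for group, outcome in data:
        by_group.setdefault(group, Counter())[outcome] += 1
    # Toy "model": predict the majority historical outcome per group.
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

model = train(history)
print(model)  # {'A': 'approved', 'B': 'denied'}
```

The model never "decided" to discriminate; it simply learned the pattern it was given, which is exactly why the data going in matters so much.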

Speaker 2 (03:50):
The tragedy for us is when that biased data is
used against us. Afua Bruce, who was the
executive director of the White House's National Science
and Technology Council under President Obama, now advises government agencies
on how to incorporate AI into their work.

Speaker 6 (04:08):
AI is a new type of programming. So the first
thing I tell people to keep in mind is to
really understand your values, because that's going to be
reflected in what you develop. I encourage agencies to use
AI tools in ways that include, not ways that
automatically exclude. So we want to think about using AI

(04:29):
to help maybe identify instances of fraud, or identify
who might be at greatest need of resources, as opposed
to saying this AI tool is going to automatically remove
people from access to healthcare, or remove people from access
to housing, or automatically determine who stays in jail for how long.

Speaker 2 (04:53):
There is good to AI, and Nicole Jackson, the
computer designer, says this country is going to need the
flip side of AI.

Speaker 3 (05:03):
How can AI enable you to scale? How
can it help you reimagine some of the monotony, the
things that we've just gotten used to doing because
it's how we did it? And how are we going
to reimagine systems? The promise of AI is it can actually
do that, or at least some of it. It just
shouldn't do everything. And the things that are higher
risk, the things Afua was talking about, the things where

(05:24):
people can be harmed, that's where we start to apply
that ethical boundary. That's where we start to apply the
risk pool, which, by the way, we've seen Canada and
we've seen the UK really apply pressure in this space
and really, really question. And I don't think anyone's gotten
it right or perfect, but at least it's in the conversation,
and it's an opportunity for Black people in particular to

(05:45):
do exactly what we stated: how will you be participating?
If you are in this space, in software or data
science or any of the things that we're doing, how
are you going to engage?

Speaker 6 (05:56):
So what we can do about that is a couple
of things. One, from a research perspective, for all the
scientists and engineers out there, we can work on how
to correct that data. Two, we can teach people
how to identify bias in data, and then how to
update the decisions that are made based on that bias

(06:17):
and sort of correct for that. And three,
we can actually decide in some cases that we don't
want to use AI, because the bias is
so great that the way the systems would work
would inevitably lead to even more biased outcomes.

Speaker 2 (06:36):
We've all heard about the horrors when AI is used
incorrectly by law enforcement, specifically facial recognition when it's the
only investigative source. Something Nijeer Parks will never forget. His
is the face of AI gone wrong for Black people,
he tells CNN. Back in twenty nineteen, police from Woodbridge,

(06:56):
New Jersey, came looking for him for a theft at
a local Hampton Inn, resulting in a list of serious
crimes: aggravated assault, unlawful possession of weapons, shoplifting, hitting a
police car while trying to escape, almost running over an officer,
and using a fake ID. Because at the scene of

(07:17):
the crime, police found a fake license with a picture. Bingo,
cops relied on AI to find their man. A facial
recognition match said it was Nijeer Parks.

Speaker 1 (07:29):
There wasn't any evidence. It was just a picture, a
computer saying we look alike.

Speaker 2 (07:35):
The picture: a Black man about the same age, early thirties,
same shape face. But looking at both Nijeer and the suspect,
his mother could tell you the two men don't even
come close.

Speaker 4 (07:46):
It is nothing like him, looks nothing like him, which
goes back to, they think, people think, you know,
there's a saying that all Black people look the same. That's
the first thing that came to my mind. The worst part
for me was when we went to court to see
my son shackled and handcuffed in court, fighting for his life.

(08:09):
He didn't do it, and it's like nobody believed him.

Speaker 2 (08:13):
On top of that, Nijeer Parks was arrested, spent eleven
days and ten nights in jail, and was facing years in
prison over a photo. And like his mom said, no
one believed him.

Speaker 6 (08:26):
Oh no, y'all, I didn't do this. That's not me.

Speaker 1 (08:28):
He's like, everybody in here says he's innocent. You've got to
prove that you're innocent. But I shouldn't have to prove
that I'm innocent. You should have to prove that I'm guilty.

Speaker 2 (08:34):
As good as AI is, and
it could help police quickly close the case, it is
terrible in many instances at getting dark faces right. Nijeer,
by the way, is suing over his false arrest and
the trauma he endured. Back to my conversation with Nicole

(08:55):
and Afua. Nicole tells a story about a project she
worked on, only reinforcing the shortfalls of AI for Black people.

Speaker 3 (09:04):
I remember working with a robot called Jibo.
It was a product from a Kickstarter, I think; MIT
helped back it. I loved this little robot thing, okay,
and it was incredible. We were working
on a project and I started to play with
it, and I noticed that it could never see me
when I went to go and activate it, because it

(09:26):
used facial recognition. But if any of my peers in
my office went up to it, it sure would unlock
and begin to engage, and it was, you know, interactive and
would talk back and respond. And, you know, the point
there is it took quite a few, you know, conversations
back and forth with the Kickstarter team. They made some updates.
It eventually got fixed over time.

Speaker 2 (09:48):
And despite the concerns, we need AI. Talk about why.

Speaker 6 (09:53):
I think there are many great applications of AI, to
be honest. And, you know, I think that sometimes we
fall victim to a lot of how the media, or
movies especially, like to portray technology and AI: that
it is evil and it will always ultimately take over
the world.

Speaker 2 (10:11):
But, they say, our world needs AI. It will help
us and, surprisingly, fill in the population gaps.

Speaker 3 (10:19):
You know, the mass retirement of the baby boomer population,
and, you know, the shrinking birth rates, and how all
of those things play into dynamics in the workforce. And
what we'll find is we're going to have to
reimagine how people work, meaning we used to apply
a lot of human capital to places where there won't
be a lot of human capital anymore.

Speaker 6 (10:40):
I used to help lead a nonprofit that did AI
for mission-driven organizations around the world, and we created
solutions such as helping frontline health workers in Nigeria digitize
their records, taking a process that used to
be more than thirty days down to processing it

(11:01):
in just eight hours. That means that the doctors on
the ground can respond to health crises a lot more quickly.
We worked with AI on well systems in areas
that were drought prone, and so we were
able to help the water districts better anticipate where the
drought was going to be and to redirect water supply

(11:22):
in different places. I know there are organizations today that
are actively using AI to help people better file for benefits:
to apply for SNAP benefits, to use SNAP benefits, healthcare benefits,
and more. There are a lot of positive uses of AI,
and as Nicole said, it's really time for people, for

(11:44):
all people, Black people especially, to sign up to help
on this. You can sign up and you can contribute
as an engineer, as a coder, as a developer. But
also, especially with AI, because it relies so much on
the interpretation of data, we

Speaker 3 (11:59):
Need way more people applying pressure. We need way more
people really looking at what's being built, questioning and applying pressure.
And as that happens, you start to see the right
course of action.

Speaker 6 (12:11):
As Black people, we are not new to what racism
looks like and how it can show up in every
aspect of our lives, and so we have to remember
that AI reflects what the humans who design it put
into it.

Speaker 2 (12:25):
There are a number of Black people making sure they
are on the other side of the keyboard, inputting
the correct data. Let's take a quick trip down to Jackson, Mississippi.

Speaker 5 (12:42):
Hello, thank you so much for having me.

Speaker 2 (12:44):
You call yourself a tech evangelist. So what is the
gospel of tech?

Speaker 5 (12:50):
Well, the gospel of tech is that it's very inclusive and accessible,
now more than we think.

Speaker 2 (12:57):
Meet Dr. Nashlie Sephus, tech evangelist. Normally we think Silicon Valley,
not Jackson, Mississippi, a majority Black city mostly known for
years of bad water issues. But Nashlie has a vision.
Her company, the Bean Path, is planting seeds that are sprouting
tech leaders.

Speaker 5 (13:17):
Yet still, a lot of times our communities go without.
And so I wanted to make sure that at least
for this one community, the one I was
born and raised in, Jackson, Mississippi, and Central Mississippi,
really Mississippi overall, had access to emerging technologies. Youth were
exposed to these technologies, small business owners and startups could

(13:40):
be encouraged and empowered to use these technologies to their advantage,
and that we could be more informed overall about the
limitations as well as the benefits. Because the more we
fear technology, the further behind we get, and that
only hurts us in the long run.

Speaker 2 (13:57):
She says we've got to get in.

Speaker 5 (14:00):
A lot of technology is based on the computer. A
lot of that is based on the Internet, and a
lot of times we don't have access to that or
very efficient access, or even have access to the education
around how to use technology.

Speaker 2 (14:18):
That's where the Bean Path comes in. She teaches Black people
how to code, how to program their experiences into AI.

Speaker 5 (14:25):
We incorporate a hands-on aspect with AI and robotics,
3D printing, laser cutting, drafting and design, even sewing
and fashion tech. And so there are so many ways to
apply technology to our culture, to our way of art,
our way of cooking, our way of making, and we

(14:47):
just haven't really seen a lot of people make that
connection, one, because there are not a lot of us in
this space.

Speaker 2 (14:53):
And it's lucrative. She should know. She was part of
a team that made millions.

Speaker 5 (14:58):
You don't always have to go to work for someone.
You can create your own business. And I actually did that,
being the CTO of startup company part Pick based in Atlanta,
and we actually were acquired by Amazon. We sold that
company for multimillions of dollars. That was something that I've
never done, never heard of, never even thought it was
possible because they'd never been shared with me. But this

(15:21):
is exactly how people do it in Silicon Valley, etc.

Speaker 2 (15:24):
When we look at some of the people who are
the billionaires, the wonder boys of tech, many of them
never even went to college.

Speaker 5 (15:32):
If you put in some work, some due diligence, even
self-study over the course of maybe
three to six months, get a certification or learn a new skill,
you can get into a six-figure salary pretty quickly.
We're not asking everybody to get a degree in computer engineering.
We're not even asking everyone to learn how to code.

(15:53):
We're just saying, hey, you need to be aware of
what this tool can do for you and how you
can use it to create and innovate.

Speaker 2 (16:00):
But first, Dr. Sephus had to sell her community on this.

Speaker 5 (16:05):
We started changing the way we were marketing to people.
So we went to the churches, of course, because there's
a huge community there of people that need tech help.
We also went to the schools. We went to community
organizations like the Boys and Girls Club and Jackson Public
Schools and the libraries, and even partnering universities
like Jackson State University, and we started getting more people

(16:28):
to come. And before we knew it, people were lined
up at the library door before the library even opened,
to come and talk to us and get tech help.
Sometimes parents would bring their kids just to see what
a Black engineer looked like, because they didn't know what
to tell them.

Speaker 2 (16:44):
It got so big she had to get a building that
is now the centerpiece of town, where Black people learn tech.

Speaker 5 (16:50):
It's literally a seventeen-thousand-square-foot barn located in
the middle of downtown, and I said, okay, I want
to purchase the building. I found that the same guy owned
the majority of the surrounding property. He gave me
an awesome deal, and fast forward, we now have twenty-one
acres of land and eight buildings in downtown Jackson.

Speaker 2 (17:13):
And now it's so huge it's morphed into providing affordable
housing on the grounds. How is this changing the city?

Speaker 5 (17:20):
Well, I think it's changing it in a few ways. One,
we want people to realize that there is talent here
in Jackson and the surrounding areas, and we do have
a tech ecosystem. We may not be as collaborative or
aware of each other, but we just celebrated our five
year anniversary, and we do have a tech ecosystem from

(17:44):
youth all the way to the business owners and the
adults who are seeking help and they want access to
somewhere where they can learn more skills because they're not
associated with the university, they're not associated with a job
that allows them to learn these things, and so where
else do they go? And so they come here. We're
also trying to fight what we call the brain

(18:06):
drain problem with this broader real estate development. If we
bring more tech startups and tech companies to Jackson, if
this helps attract them, then more people like myself, and
I'm a product of the brain drain, people that get
educated in the state and leave in order to get
the jobs that they want or the salary
that they want, we can also

(18:29):
entice those people to stay in the state and help
the economy that way. We've helped over three thousand people
to date, and that number keeps growing, with tech skills,
tech exposure. We've put on at least one hundred and
fifty programs in that time, for youth, for senior citizens,
for churches, etc. So I've invested over a million dollars

(18:53):
of my own personal capital into the broader project. We've
also raised over a million dollars on the philanthropy side,
and we're continuing to move.

Speaker 2 (19:04):
Bottom line: we can't sit back and let AI program us.
We must program it, or at least be aware of what's
happening. From Mississippi we go to the Bronx. We are
in the house, the Knowledge House. That's the name of
the place in New York teaching people about AI. Here

(19:26):
the passion of founder Jerelyn Rodriguez, also joining in with her,
Daniel Adeyanju. First, Jerelyn, how are you making sure Black
people are aware and involved in AI?

Speaker 8 (19:39):
So, what's really important about the Knowledge House's work is
that we are raising awareness about opportunities to enter and
thrive in the tech sector, and now AI is part
of that. The future of AI is here, and so
the same way that we want the general public to
have digital literacy skills, the same way that we want

(20:01):
job seekers to have the basic tech foundations, right now
your average technologist needs to know how AI works, what
are the risks and innovative opportunities that we have because
of AI. And so for the Knowledge House, it's about
exposing our participants to what AI looks like in the

(20:21):
day to day and what's coming ahead. So at the
Knowledge House we provide technology education programs and job training
to make sure that folks from low-income communities have
the opportunity to thrive in tech.

Speaker 2 (20:36):
Well, Daniel, what will happen to us if we are
not involved in this revolution?

Speaker 1 (20:42):
Vanessa, that's such a great question. It's something that we
think about all of the time, and there
are two things that I want to focus on. One,
the economic opportunity landscape. If we are left out of that,
it will continue to increase the gap between the rich

(21:03):
and the poor, between Black and brown folks and other
counterparts who simply just have more access to this information,
have more access to the economic opportunity. In Web 2.0,
essentially digitizing things online, food ordering, cars, all these
things, we've seen a large amount of wealth
generated, and we were left out of that. So that's

(21:25):
why, when we were founded ten years ago and Jerelyn made
this happen, it was to ensure that we wouldn't be
left out. So we're not going to be caught unawares
this time, because of what we're doing at the Knowledge House.
The second part we should be thinking about is ensuring
that we are part of building this new order. We
need to make sure that our communities are included in

(21:48):
making decisions. We can't have AIs policing us that were
designed by forces that simply weren't considering our dynamics.
We need people ensuring that biases are not encoded. That is,
the past decisions that have been made against our communities

(22:08):
are not essentially repeated. So if you think about what
code does: you write code one time and it's repeated
over and over and over again, without a human
saying, no, this isn't right, something is wrong here, this
is actually impacting my community. Some people can go to
jail who really should have been given an alternative pathway,

(22:33):
an opportunity to redeem themselves. And the knowledge house ensures
that our future is brighter by making sure that folks
from our underestimated communities are, for example, going into our
data track. They're learning the fundamentals of databases, how to
clean data, how to find data, how to analyze data,

(22:56):
how to use coding, as you mentioned, to create these
AI models and the underpinnings of these systems. So our
fellows are going to be leading the future of work.
They're going to be getting AI jobs, they are going
to be getting data scientist roles. They're also going to
be designing experiences. We have a user experience track. When

(23:18):
you talk about things like ChatGPT, how do you
design an experience that is relevant for communities that we're from,
solving problems that are native to us? So we see
our fellows as problem solvers who will solve problems in
their communities, not just problems for companies looking to profit.

Speaker 2 (23:37):
It sounds as if you're saying that all the things
that we've been worried about, that may be wrong, can
be corrected.

Speaker 1 (23:44):
There's hope, absolutely. I consider myself a techno-optimist, in
that with the work that the Knowledge House is doing,
and organizations like us, and partnering companies that invest in
our work, community organizations, politicians, governments, if
we kind of take this full-court press, all hands

(24:07):
on deck, the mothers, grandmamas, aunties, and uncles in the community,
we can make sure that our young people are enrolled
in a program like our Karim Kharbouch Coding Fellowship while
in high school. They can learn these skills from a
young age, and post-eighteen, they can enroll in a

(24:28):
program like ours, where they can learn how to code,
how to design, how to protect cybersecurity systems, how to
build with AI, leveraging problem sets. So I think it's
really important that our communities are part of this work,
that they're getting these skills, and that they're solving problems

(24:48):
in their own communities.

Speaker 8 (24:50):
When we started the Knowledge House nine years ago, tech
was trending, especially in New York City, but we were
just seeing that the Bronx was being left out of
those opportunities, and so that's why we started the Knowledge House.
And so as we grow and expand the program into
other cities across the country, it's so important that we

(25:12):
stay grounded on increasing access, right because increased access leads
to economic mobility, right. And so when it comes to AI,
you know, my understanding is that not everyone's going to
become an AI technologist, right, but it's about accessing AI

(25:33):
tools and knowing how to use them responsibly.

Speaker 2 (25:36):
So now we know. In addition to the Bronx, the
Knowledge House has expanded, with places to learn tech in Westchester, Atlanta, LA,
and Newark, New Jersey, and it's coming to other cities. Check
their website, theknowledgehouse.org. The takeaway: AI is
part of our lives. Make it work for you, not
against you.

Speaker 1 (25:56):
We must be problem solvers, and we need to make
sure that our communities are getting these skills. We need
to make sure that we are seeing ourselves as builders
and not just consumers.

Speaker 2 (26:10):
Next time on Blackland, we're going to the streets with
the story of a woman who has lived that life
with all the horror, abuse, exploitation, human trafficking.

Speaker 9 (26:21):
It really is the new slavery. It really, truly is.
Because I was sold, I did not receive a dime
of any money that was made from me. I was humiliated,
I was raped, I was tortured, and this was like
a daily thing.

Speaker 2 (26:38):
Join me on Blackland with Vanessa Tyler
