Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
The following show is brought to you in partnership with the Institute of Politics,
Policy and History, Blue Star Strategies, Bright Road Incorporated, Make It Plain
Podcast, and RPC Media from the campus of the University of the District of
(00:45):
Columbia. This is State of Play. Welcome to State of Play. I'm
Sharon Pratt. With me, I have Karen Tramontano and the Reverend Mark Thompson.
Our topic today: the technology gap, especially for communities of color,
whether it's jobs, businesses, or the adverse impact in getting credit and in the
(01:07):
criminal justice system. Fortunately for us, we have a leading expert, Doctor
Nicol Turner Lee. She is the director of the Center for Technology Innovation at
the Brookings Institution. Before that, she was with the Multicultural Media, Telecom and Internet Council,
and even before that the Joint Center for Political and Economic Studies. What
an honor to have you with us. Oh, thanks for having me, Mayor.
(01:30):
I just so appreciate being here today. I talked about, I highlighted, the
areas where it has an adverse impact on people of color. Which of these
concern you the most? Oh, my goodness, I think they all do.
I mean, when we think about the depth of technology, it's almost as
if technology is sort of trailing alongside systemic inequalities. At all of these stops
(01:53):
we're actually seeing people of color, particularly people of color who are low income,
live in rural areas, you know, maybe are older, affected by these technologies
in ways that technology was never designed to do, right? It was always supposed
to be a game changer to solve social problems. So I would say all
of them have become equally important. So do you think the federal government understands
the depth of the problem? Well, you know, I think it's interesting,
(02:15):
that question, because I think the federal government has a role to play on a
couple of fronts, and allow me to sort of break it down. First and
foremost, we need the federal government to understand that they have to invest in
us, and what that means is, a lot of times, if we see these intersectionalities
between racism and discrimination and tech, it's because nobody like us is sitting at
the table at the beginning when we're developing these ideas. And it's important that
(02:38):
we have workforce diversity, particularly in companies that have less than two to three
percent representation among decision makers, engineers, as well as data scientists, so
let's start there. There are ways that Congress can start pumping that money back
into computer science careers for young people, or figuring out ways to create more
(02:58):
inclusive data sets for the scientists that are building these products. But then,
the second thing is, we need to distinguish what Congress is looking at,
which is the digital divide, from these other things, Mayor, that you talked about.
The digital divide is about who is online, who is not, who
has a device, who does not. Last year, when fifty million school-age
kids were sent home from school, we found out that fifteen to sixteen
(03:20):
million of them did not have a device or broadband, and nine million had neither,
and the majority of those were kids of color: black,
brown, and those on tribal lands. We're now suffering from this,
because those young people are now one, two, three, ten months to
almost a year behind when it comes to schooling, because they lack the materials necessary
(03:43):
to actually engage in distance learning. Sounds to me like a Brown versus Board
of Education situation. But I'll come back to that in my writing.
When we start to piece apart the digital divide, that obviously requires the government to
look at adoption and infrastructure, as well as a range of other ways that ought
to get people involved. And then the final thing, I'll just stay there
on the criminal justice side, it's really important that we begin to look at
(04:06):
the decisions that these computerized systems are making. Did you know, not too
long before the insurrection on January sixth, a black man in Detroit was misidentified
by facial recognition technology and sat at the station for six hours, only for it
to be found that he hadn't even committed a crime. We need to get
better at these things, because the very technology that was supposed to solve problems should
(04:30):
not be creating new ones. We've already marched, we've already fought for civil rights,
and so we need to make sure the technology catches up with where we
are in terms of our freedoms. So the pandemic has clearly taken the covers
off of these systemic issues. How do we keep the pressure on so that
it's not forgotten a month from now, two months from now? You know,
(04:55):
Karen, that is such a great question. I'll go back to my example of
the schools here. We now have the ability to bring our kids back to
school in the fall. But I would actually say to any educator listening that
we cannot abandon what we did to get kids connected. We should be having,
much like we did with No Child Left Behind, an initiative of No
Child Left Offline. The fact that we had young people who didn't have a
(05:16):
tablet next to their textbook, a Wi-Fi hotspot with a pencil, or
any type of provision or guidance on how to get online, is something we
need to work on. We should not allow these lessons of twelve months
during this pandemic to actually push those kids back into a vacuum where they cannot
have the same twenty-first century tools to be productive. So I'm saying, out
(05:38):
there, there's a lot of stimulus money that actually went to good programming. We
need to make those programs more permanent, much like the Emergency Broadband Benefit for affordable
broadband and the monies that went to schools to be able to provide for these
hot spots and tablets. And also, isn't having or not having broadband going
to impact some workers' ability to get a job and keep a job? Oh
(06:01):
my goodness, Reverend, you are so right on that. I think this is where
it becomes important as the administration thinks about the infrastructure money. Let's start thinking how
FDR thought about the New Deal. What's our tech New Deal that moves
us away from just passive consumption to production? Where are we actually putting people
to work in these jobs that are now going to be taking over our communities?
(06:24):
There were thousands, hundreds of thousands, of businesses owned by people of color
and women that were lost. They're not coming back to our communities. And
yet it takes an idea to build a startup, a black startup. It takes
the ability to train people on how to string wire or fiber-optic cable.
Those are livable, wage-scale jobs. And so we need to be pushing
towards a tech New Deal that combines not just closing the digital divide, but
(06:47):
making sure we get people back to work, particularly people of color. Well,
in the debate that we're having on the Hill now around the infrastructure initiative pushed
by Biden, there seems to be pushback by Republicans against any investment in retrofitting
human beings to participate in the new economy, as against physical infrastructure. Isn't it
(07:10):
about the dollars, or is it about where you spend your money? So,
when I was growing up, my mama used to say, it's the dollars and it's
the cents. Let me tell you something. You can't ride on a new
road without a car, the same way that you cannot be a student K
through twelve without a tablet or a hotspot. That soft infrastructure, as I call it,
is local infrastructure. You know, can I tell you all one thing, in ten
(07:30):
seconds? We should have every federally assisted housing development in the United States, particularly in
Washington, DC, with broadband already built into it. We have to start looking at
ways to have a connective mindset if we're going to make sure that our connected
communities are economically prospering in the new economy. We're very fortunate to have you
(07:53):
here. We're very fortunate in our country to have you. Thank you for
being on State of Play. Welcome back to State of Play. We're
today talking about the digital divide, that huge chasm where the digital world is
growing and it appears our numbers in terms of participation may be shrinking. It
(08:16):
is a very daunting circumstance that people of color are not at the table in
any way, and people of color are being negatively impacted in profound ways,
and it's very, very alarming. What this brings up for me is once again
how far behind black and brown people are in so many places. So we
(08:41):
look at what's happening in neighborhoods with gentrification, we're being priced out of neighborhoods,
we're being locked out of cyberspace, really. And the more you hear
about it, that there are certain jobs, especially post-pandemic, that are going
to depend on employees being connected to broadband, being online, then we have
(09:03):
a real challenge. And so if you look at it, so many black
and brown workers are behind in terms of access to jobs, and then you
add on top of that the booming broadband economy. It's something that really
has to be taken more seriously. Now, the thing that concerns me the
(09:28):
most, and perhaps it's because I didn't understand the depth of the problem,
is the design bias. Now, you know, I know the pipeline is
very thin for black and brown people in technology, and I understand, and
certainly the pandemic has made us all aware of the access issues. But the
(09:52):
design bias in data, data that we believe is neutral, is not. It is
far from neutral. It has decades and decades and decades of systemic
bias built in, and it is this data that policymakers are using to make
(10:13):
decisions. To do what? To correct, you know, the historical technology gap.
Well, that can't happen if we're building decisions on this data. Now, we've
talked about how it impacts employment and business opportunities, but our next guest is
going to explain to us how it impacts every decision, every aspect of our
(10:37):
lives. Welcome back to State of Play. Now we're going to talk about
how this digital world we live in has negative implications for so many aspects
of our lives, especially for people of color. We have the perfect expert,
Doctor Ruha Benjamin. She's a professor in the African American Studies Department at
(11:00):
Princeton University. She's the founder of the Ida B. Wells Just Data Lab and
the author of Race After Technology. Oh, it's sort of transposed there.
Thank you so very much for being here, such an honor. Thank you
for having me. So you talk about this Jim Code, so to speak,
(11:20):
of how this technology is often used in a way, or how it's organized
in a way, that has negative implications for people of color. Can you sort
of elaborate on what the Jim Code means? Absolutely. The New Jim Code is
a combination of coded inequity and imagined objectivity, the fact that we imagine technology
(11:43):
to be more objective and neutral than its human counterparts. So if we acknowledge
that there's racism and discrimination in our courts, or in our hospitals, or in
our schools, a lot of people assume that taking those decisions that human beings
would normally make and allowing technology to make them will get us around the bias,
will kind of fix the problem, without recognizing that technology is created by human
(12:07):
beings. The assumptions, the values, the desires, the interests that shape
our society become baked in and encoded into these technical systems. And the real
danger is that, unlike their human counterparts, where you can point to a
racist judge or doctor or teacher, when it's within a technical system,
we assume that it's neutral, so it's harder to hold accountable. And so
(12:30):
what the New Jim Code does is name this problem, so that we can start
to shine a light on it and deal with it and try to address it
in really productive ways. So could you take us through an example? Let's
say you're trying to be paroled, or you're trying to get a loan,
or trying to get a job. Could you show us how all of
these... Yes. All of these automated decision systems have to be taught how to
(12:56):
make decisions. They don't just grow on trees. And so the question is,
how do we teach them? We teach them by feeding in past data,
past human decisions, whether it's who gets loans, who gets
paroled, you know, who gets the job. So we take that historic
data and we train these systems how to make future predictions and decisions. So
(13:18):
if in a certain industry black folks have been discriminated against for generations, or
certain neighborhoods have been, you know, profiled for generations, that
data becomes the starting point that we use to teach these systems how to make
future decisions. And then we assume that it's neutral because it's being spit out
by a software system. But we have to question the source of that data
(13:41):
and the source of the human decisions, and the assumption of neutrality when it
comes to these software systems. In so many areas of our lives...
is there an aspect of life where it's more alarming? I think we have to be
concerned about it everywhere. So I do think, when it comes to, let's say,
do-good professions like healthcare, people assume, because it has this ethos
(14:03):
of wanting to heal and help, that we can put our guard down and
assume that, actually, you know, just because people are well-meaning,
the outcomes will be good. And I think we do have to stay on
high alert even in those arenas. What's so interesting to think about is that
when an AI system in the context of healthcare is trained using data from doctors'
(14:24):
reports, let's say, doctors' reports of pain. We know that doctors and
healthcare professionals routinely underestimate the pain of black patients. That's in their reports,
and so if you train an AI based on that, the AI system is
going to continue that process of underestimating the pain of black patients. But recently
(14:45):
there was a study that trained an AI based on patients' self-reported pain, and
it was much more accurate, because it was actually going to the source of the
pain and was not being filtered through the discriminatory lens of the experts, let's
say, in that industry. So there are ways to think critically and create
(15:05):
these systems with equity and justice in mind. So you want to lead,
or to some extent inspire others to lead, an abolitionist movement against this New
Jim Code. Can you give us some of the tools to do that?
So, first and foremost, we need a name. We need a way
to name what's happening to us, because if we can't name it, we
(15:26):
can't talk about it, we can't organize against it. And that's why I've
developed this idea of the New Jim Code, to remind us that history is
in our present. It's being put into all these shiny new features, shiny
new systems, but historic data is actually driving these emerging technologies. And so
to organize, we have community organizations that are working on this, organizations like
(15:48):
Data for Black Lives, and, in different cities, different regional and city-based
organizations that I call tech justice organizations. There's legislation that's being drafted around
algorithmic accountability, because this can't just be a kind of patchwork thing where you leave
it to different states and cities to deal with. And we need a national,
(16:10):
even international, way of thinking about accountability when it comes to technology. And
of course my home turf is education. We have to reimagine how we're training
future technologists so that, rather than just being reactive to harmful technologies, we can
have people at the source that are building them with these values in mind.
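Dr. Benjamin's point here, that a system trained on past human decisions inherits whatever bias those decisions contained, can be sketched in a few lines of Python. Everything in this sketch (the loan scenario, the two groups, the rates, the toy frequency "model") is a hypothetical illustration, not a description of any real system:

```python
import random

random.seed(0)

def historical_decisions(n=10_000):
    """Simulate past loan decisions: both groups are generated with
    the SAME qualification rate, but group B applicants were approved
    far less often by the (biased) human decision makers."""
    data = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        qualified = random.random() < 0.7              # identical for both groups
        approval_odds = 0.9 if group == "A" else 0.5   # the human bias
        approved = qualified and random.random() < approval_odds
        data.append((group, approved))
    return data

def train(data):
    """A deliberately simple 'model': learn each group's historical
    approval rate and reuse it to predict future approvals."""
    rates = {}
    for g in ("A", "B"):
        outcomes = [approved for group, approved in data if group == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return rates

model = train(historical_decisions())
print(model)
```

Although qualifications were generated identically for both groups, the learned approval rate for group B comes out far below group A's: the model faithfully absorbs the bias in its training data, which is exactly the dynamic described in this conversation.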
(16:30):
And so the pedagogy, from pre-K through twelve and higher education, in computer science and
engineering and STEM across the board, has to be infused with an equity lens and
an equity approach. Where do you think you can have the greater impact?
Is it education, is it regulation? We need to be working on multiple
fronts. Education is where we seed what we want to see fifty years from
(16:52):
now. We start seeding that in education. But the algorithmic harms are already
happening now. People are being misidentified by facial recognition, people are being excluded
from opportunities based on targeted advertisements and automated systems. So legislation, and also legal
mechanisms to actually hold these harms accountable, have to happen yesterday, and so
(17:14):
we need people working on multiple fronts rather than saying one goes before the
other. But in order to get where you think we need to go,
and I agree with you, we've got to sort of purge all of our
data banks, do we not? You know, I think it's really hard
to simply think about what we might do as individuals, because, you know, the common
parlance is that we're users. Right? We're users of technology. But if
(17:38):
you're a user, you're going to get used. So another way to think
about technology and technology access is that the technology we have access to has access
to us. And as you're noting, the data that we are producing is
actually the lifeblood of these technological systems. So something that we can all do
right away is to start to think of ourselves more as stewards of
(18:03):
technology, as people who don't just have an obligation to protect our personal privacy,
but have to think about what are the legal, what are the policy frameworks,
what's the ecosystem in which technology is being developed, and push for that
ecosystem to reflect public values rather than private interests. And so, to do
this, though, we all have to feel like we have a stake.
(18:26):
We can't just leave this to the people with the fancy degrees, the people
with the technological know-how, because often they don't have the social and historical
know-how. And so black communities, black scholars, black engineers, black
students: we all have to rise up and understand that we have a stake
in this, and we have to shape the future that we want to see exist.
(18:48):
Well, I think, obviously, you are getting the attention of a great
many people, and you're winning a lot of praise, which is very encouraging.
At the end of the day, it's still a daunting task to get
people exercised around something when few of us know a lot about it.
And so I guess maybe you tune into and work with other activist groups,
(19:12):
like, you know, Black Lives Matter and that sort of thing. Yeah.
Absolutely. If you go to the resources page of my personal website, there's
a whole host of ways to plug in. We don't have to just sit
at home and feel paranoid about surveillance or paranoid about, you know, Big Brother
looking over our shoulder. We can actually plug into community groups, to organizations,
(19:34):
and there's a whole host of them around the country to actually begin to
empower us to change that paranoia into power. And so I would encourage those
who are listening, who feel concerned, to realize that now is the time
to get involved. Now is the time to shine a light on systems that
would rather stay in the dark. So many of these systems would rather hide
(19:55):
behind the cloak of neutrality and make all kinds of decisions about our lives without
us having a say. And now is the time for us to speak up.
Well, good for you. You're doing great work and we need you
desperately, so thank you for being on State of Play. Pleasure. Welcome
(20:19):
back to State of Play. We're today talking about the digital divide, that
huge chasm where the digital world is growing, and it appears our numbers in
terms of participation may be shrinking. But we now have an expert, someone
who's managed to bridge that digital divide, who actually has strategies for increasing the
(20:42):
numbers in the pipeline. We have Doctor Nashlie Sephus, who is an applied
scientist with Amazon Web Services, and she's on their machine learning team. Thank you
so much for being here. Thank you all so much for having me. Your
path, your journey to becoming where you are today: your mother, I gathered,
(21:04):
encouraged you, and you had teachers who encouraged you, because it's so rare to
have women, much less an African American woman, in the kind of position you're
in today. Yes, yes, so I was encouraged. I grew up
in a house full of women, a very strong matriarchy, and I believe that
that was what instilled in me that, as a woman, you could be and
(21:26):
do whatever it is that you wanted to do. We worked a lot on
projects at home, DIY projects, everything from hanging ceiling fans and putting down floors
to more in the yard, and so we learned how everything worked and were
encouraged to do so. It really wasn't until I got into the world and
realized some of the challenges it was to be a woman in this field,
(21:48):
especially a black woman from the South. And so I was very, you know,
challenged at first, but I found the help that I needed through mentors.
I was fortunate enough to attend several summer camps as a child, in middle
school and high school, on computer engineering and other types of engineering at nearby
universities, Jackson State University as well as Mississippi State University, where I
(22:12):
ended up getting my undergrad. And I thought that was extremely helpful as well, to
at least be exposed to that technology, which is something that I hope to
do for our community and continue to do for our community as well. Your
mother got you started by giving you a computer when you were around eight or nine.
Yes. So my mother bought a computer for my sister and me, and I believe
(22:36):
it was called a Packard Bell computer, and it had speakers on the side.
This was the age of dial-up, very slow internet, and so we had
to get off the computer anytime we were downloading music, when my grandmother wanted
to use the phone. But it was very fascinating how
you could code, you could play games,
(22:56):
you could pretty much do your homework, you could listen to music.
And I just really wanted to delve into it even more. Doctor
Sephus, tell us a bit more about your professional career; that's also exceptional.
I know you're with Amazon, you're working with artificial intelligence and facial recognition.
Also tell us, if you would, how the algorithms for artificial intelligence are
(23:21):
not so intelligent when it comes to discerning the facial recognition of those with features
like yours and mine. Right. So we have a team at Amazon that
I work on focused on fairness and mitigating biases in AI technologies, and it's quite
(23:42):
a unique, you know, and taboo concept, because, as my math teacher used
to tell me, whenever you use a calculator: garbage into
the calculator, garbage out. It doesn't guarantee the right answer. You can
think about these machine learning tools and these AI tools, same type of concept.
(24:02):
We feed it data. We feed these algorithms and
these computer programs data about what has happened in the past. It learns from that
data and it creates models to predict things in the future, the same way
we do with predicting weather forecasts, learning from past history and humidity, temperature,
the location, etc. So now, with machine learning
(24:26):
applied in terms of artificial intelligence, we're now able to predict, you know,
what a person said, or have a conversation, a computer
can have a conversation with someone, or we can, like
you say, predict different things and different attributes about a person's face. Now,
the nuance comes in when the data that this algorithm was trained on is
(24:49):
not balanced, or it may not be representative of all the people that are using
it. And so that is when you come into issues where, when you're
testing, you'll have some discrepancies or disparities in the results across different ethnicities; you
may have disparities across different races, genders, ages, etc. And so
(25:11):
it really depends on what that algorithm was trained on, how you're testing it,
and, quite frankly, for me, it all goes back to who's at
the table when you're designing your products. Are we creating fair pipelines and equity
in access for everyone that works on this technology to come and have a seat
at the table, whether that means hiring them, which I hope it does,
or you can get that same input through focus groups. And that's something
that we, as a large company that I work at at Amazon, as well as
across the tech industry, have to take a hard look at, including
working with other entities like government policymakers and things like that. So,
(25:56):
Doctor Sephus, was this your inspiration, this lack of diversity, lack of
people of color and other voices around the table? Is this your inspiration for
the Bean Path? Yes, absolutely. I started the Bean Path nonprofit back
in twenty eighteen in my hometown of Jackson, Mississippi, a majority-Black
(26:19):
city. Mississippi as a whole is known to be under the poverty
line, and so I wanted to be a part of the solution to the
problem. And I just, quite frankly, got tired of it. Every time, I would go
spend a lot of time on the West Coast, in New York City,
even in Atlanta, and when I would go back home, topics like the Internet
(26:40):
of Things or AI or cybersecurity or cryptocurrency were just, for the most part, foreign concepts in the
places where I grew up, and I wanted to make sure that I changed
that and provided some exposure and some light to those people, young and old, so
that they can be a part of this conversation. Well, you're an inspiration
to us all. I mean, we really appreciate your leadership on this and
(27:03):
your journey alone is an inspiration to everyone. So we're going to
get behind you in any way we can. We want you back on State
of Play anytime you can. And we were honored and enriched by your conversation
here on State of Play. Thank you so very much for participating. Thank
you all for having me. Welcome back to State of Play. Our topic
(28:02):
today: the digital divide, and how that divide, that chasm, is daunting; how
the digital world is expanding while the numbers of people of color seem to be
shrinking. But we have with us now an expert, someone who's trying to
address this very issue, Doctor Allison Scott. She is the CEO of the Kapor
Foundation. Thank you so very much for being here. Thank you, Mayor.
(28:27):
It's a pleasure to be here with you. So you're a graduate of Hampton University.
You have a PhD in education from the University of California at Berkeley.
You've always had a focus, seemingly, on people of color in the space of
STEM, the world of STEM. Why is that? What prompted you to focus on
(28:48):
that space? Yes. So my background in social science research has really
led me to want to focus on understanding and examining systems of inequality, and
specifically in STEM education and computer science education, and in the tech ecosystem
more broadly. As we've seen over the past decade, there's been such an
increased focus on technology, and it's played an increasing role in our society, and
(29:11):
so it's just so important to understand how people of color are being excluded and then
what the potential solutions are to those challenges. Well, you say people of color are
excluded; I think we all kind of grasp that. But what are
the numbers? You know, what is the percentage of blacks, let's say,
in the tech space, the percentage of blacks who are executives in the
tech space, and the like? Yeah, so we know the tech sector is
(29:33):
increasingly driving our economy. So we're talking large numbers: twelve million people and almost
two million new jobs just in the last decade, but only eight percent of
the tech workforce is black, and less than one percent are black women. And
we've seen over the past five years, across the large Silicon Valley tech companies,
very, very minimal progress in representation. We actually did a study that found only
(30:00):
a one percentage point increase, despite all of the efforts over the past five years, in
the representation of black folks in the tech workforce. What about venture capital?
To what extent is that available? I'm so glad you asked that question,
because it's such a critical space as we think about innovation and the creation of new
jobs. So in twenty nineteen, there was one hundred and thirty-seven billion dollars
invested in tech startups, and just one percent of that went to startups founded by black entrepreneurs.
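Some quick arithmetic makes the scale of that venture capital gap concrete, using only the two figures cited in this conversation:

```python
# Figures as cited in the conversation: about one hundred and
# thirty-seven billion dollars invested in tech startups in 2019,
# with one percent of it going to startups with Black founders.
total_invested = 137_000_000_000
black_share = 0.01

to_black_founders = total_invested * black_share
to_everyone_else = total_invested - to_black_founders

print(f"${to_black_founders / 1e9:.2f} billion to Black founders")
print(f"${to_everyone_else / 1e9:.2f} billion to everyone else")
```

That is roughly 1.37 billion dollars against roughly 135.6 billion, about a ninety-nine-to-one split.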
(30:25):
So there's still a significant amount of work that's needed in the space of
venture capital and investment. We need more capital flowing into entrepreneurs. We need
to develop and support entrepreneurs as they create new innovations in the technology space and
provide them the opportunity and the room to flourish and grow. Well, you
know, in the report that Ithink I saw that your foundation did,
(30:47):
it suggested that a venture capital venturecapitalists provided a black entrepreneur but one hundred
and twenty five thousand dollars on averageas the investment, but before white entrepreneurs
like two point five million dollars alwaysunder capitalized. I mean, how do
you turn that around? Yeah,and that was a critical report done by
(31:07):
We cited data from Digital Undivided andthey've been looking at this data for the
past three or four years. Sowhat we're seeing is really very very incremental,
if any, progress in the amountof capital that's flowing in. I
think in the last six months wesaw pretty significant commitments from tech and from
larger private equity firms towards black entrepreneurship. But those have really just been a
(31:30):
drop in the bucket. A great place to start, but we need much
more capital flowing in. Well, you know what also fascinated me is how
your foundation put such a focus on voter education and voter registration. One would not
think that that was necessarily a strategy for a foundation focused on addressing presence in
(31:52):
the STEM space. Why? So, we are a racial justice focused organization.
We focus at the intersection of racial justice and technology, and we saw what
was happening over the last year or so and said, it is going to
be critical for us to empower communities of color to have the opportunity to turn
out to vote. And some of the solutions to some of these really complex
(32:15):
challenges, both in education and the workforce in tech, all require both public
and private solutions, and so who we elect, who our public officials are,
can also contribute to some of these changes. So we saw a direct
connection between the two, and we provided resources to grassroots organizations that have been
(32:35):
leading the charge for quite some time to turn out voters. So is there
a particular piece of legislation, federal or statewide, let's say in California, that you
think could make a difference, could have an impact? So we're
actually looking at a couple of different things. One, we think that the
government, at all levels, has a huge role to play in holding
companies accountable. So regular data collection, reporting and transparency, regulation and oversight
(33:00):
of things like antitrust and competition. The utilization of algorithms, especially in the
case of algorithmic bias and facial recognition software, is something getting a lot of attention,
and data protection and data privacy. So accountability is a huge portion.
And the second piece: we think that public policy can play a role in
(33:22):
being proactive in developing more equitable futures. So, broadband infrastructure investment to provide high-speed
internet to folks who have been left behind, and education and workforce development,
providing resources to institutions to think about things like upskilling, reskilling, how to
(33:44):
ensure that the existing black workforce can be prepared for tech jobs of the future.
And then also economic stimulation, as we were talking about: how to get
more capital flowing into communities of color, how to inspire more black entrepreneurs and
provide them the capital that they need. There's a lot of legislation that can
happen there. Well, do you think that these companies should be treated as
(34:05):
utilities? I mean, they're ubiquitous, and shouldn't they be regulated, since they impact so many facets of our lives? I think that's a really outstanding question. I know that there is some legislation. There are a lot of folks considering different angles on that piece, and we're following that very closely. But
we do think that there is a role to play around regulation around Section
(34:27):
230 and content moderation. Now that these companies have become so powerful as sources of information, and we saw what happened with the January sixth insurrection and the role of disinformation and misinformation, there is a role that the government should be playing, and
we are following that very closely. Your leadership, the Kapor Foundation, the leadership that you're providing in this space, is exemplary. What of
(34:51):
those big techs? I mean, they have foundations also. There's money that they could provide, matching dollars, let's say venture capital. And in terms of what you're doing, are you seeing that happening? Yeah, I think we've seen, I would say over the past four years, we've seen more momentum, some more chief diversity officers hired, more commitments to diversity. There's a lot
(35:12):
more work to be done inside of those workplaces to ensure inclusion and actually that the numbers move. As I mentioned, we only saw a one percentage point increase over the past four years. But there is also a role that they can play in terms of deploying resources, their CSR budgets, the very, very powerful lobbies that they have, as we think about things like broadband, STEM education,
(35:36):
they can leverage their power in ways I think that can benefit all of society. And the bad word that people don't like to talk about too much is taxes, and are these companies paying their fair share? As they continue to make tremendous amounts of profits, are they paying their fair share to ensure that this is the type of equitable society that we all want to live in? Well,
you deploy a lot of strategies, a lot of initiatives, and you know,
(36:00):
for the purpose of increasing participation in this space, which of the strategies, which of the initiatives have proven most effective and, you know, possibly could be followed by others? I would say two. Obviously, as a researcher, I'm biased, and I'll say I think producing consistent research and continuing to call out the problem and just shine a spotlight on where the disparities
(36:22):
exist, I think that has proven to be effective in just really getting people to understand and address the challenges. And then second, on our venture capital side, we invest seed-stage dollars in gap-closing startups that are led by entrepreneurs of color and diverse entrepreneurs. We have about one hundred and thirty companies
(36:43):
in the portfolio now who are doing amazing work. And I think once we see the ways that those entrepreneurs are contributing in really meaningful ways in spaces like health tech and clean tech, I think it just provides additional motivation for us to expand the work and continue to do amazing things. Well, what you're doing is very important, and I hope when you have subsequent reports you'll report
(37:07):
on what these other companies are doing, as well as their foundations. You know, as Cuba Gooding Jr. said, show me the money. Now, are they putting their money where their mouth is and making a difference in this mammoth industry, in this mammoth space? But thank you so very much for all you're doing, thank you for what the Kapor Foundation is doing, and above all, thank
(37:29):
you for being here on State of Play. Thank you so much for having
me. Welcome back to State of Play. Our topic today: the technology gap, especially for communities of color. We are very fortunate indeed to have with
(37:50):
us now Dr. Devdas Shetty. He is the Dean of Engineering and Applied Sciences at the University of the District of Columbia, the state university for the District of Columbia and also an HBCU. Thank you so very much, Dean Shetty. Thank you, Mayor. So I said there was a problem, a problem of people of color being involved in technology. To what extent is it a
(38:14):
real problem? It is a real problem because they are very few in number; very few people of color are in the area of technology, you know, and it has been that way, and we are trying to address it at the University of the District of Columbia. I mean, we have a number of federal grants, you know, our faculty apply for these research grants as well as educational grants.
(38:38):
One part of those grants is the outreach service. It gives an opportunity for our faculty to work with the high schools or organizations so that we can bring students to the campus, expose them to STEM and other types of activities, get them excited about computer science, get
(39:00):
them excited about engineering. There are a lot of such activities going on.
Actually, so I can give you some examples. Also, we have a training program. There is a program called an ambassador program, where we have our own students, the engineering students and computer science students, reach out to the community, go to the schools, and bring them and have them do a lot of fun
(39:22):
types of activities, and as a result, we increase their interest in this area of STEM. The University of the District of Columbia is one of, I think you told me at one point, fifteen HBCUs in the United States that offer engineering and applied sciences. Is that the sum total of the ones out there? Yes,
there are only fifteen HBCUs which are ABET accredited. These are the accredited programs,
(39:49):
the top standard of accreditation. So out of three hundred and seventeen engineering schools, that is only four percent, and these four percent of engineering schools produce twenty-nine percent of Black engineers and computer scientists. So that is really a figure which should be looked at. So if the resources to these HBCUs are
(40:09):
increased, it will result in more students; we'll be able to help more students and graduate a greater number of computer scientists and engineers. So that is a clear-cut advantage there. And do you think the federal government appreciates the challenge? You said the National Science Foundation has provided, for example,
(40:31):
grants to the University of the District of Columbia. Has the federal government weighed in so that you can expand the kind of programs you have reaching out to young people in secondary school? Yes, definitely, the federal government is very much interested. The National Science Foundation, the Department of Energy, NIST, the National Institute of
(40:52):
Standards and Technology, NASA, and there are several other organizations which are funding
us, almost all federal government. But each of those grants has a student component. So these grants go to the students, either as an undergraduate scholarship or as an opportunity for them to do research, or to reach out to high
(41:14):
schools or to community colleges so that we can bring in the students and get them excited about STEM, engineering, and computer science. It's very encouraging, all that you're doing, Dean Shetty, and your insights and contributions are much appreciated, and thank you so very much for being on State of Play. Thank you