
December 23, 2021 31 mins
The digital divide is just the beginning of the problems faced by black people because of the technology industry. Find out more about the consequences of and potential solutions for the technology gap. Interview guests include Dr. Nicol Turner Lee, Dr. Ruha Benjamin, Dr. Allison Scott, and Dean Devdas Shetty.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
The following show is brought to you in partnership with the Institute of Politics, Policy and History, Blue Star Strategies, Bright Road Incorporated, Make It Plain Podcast, and RPC Media, from the campus of the University of the District of

(00:46):
Columbia. This is State of Play. Welcome to State of Play. I'm Sharon Pratt. With me, I have Karen Tramontano and the Reverend Mark Thompson. Our topic today: the technology gap, especially for communities of color, whether it's jobs, businesses, or the adverse impact in getting credit or in the

(01:08):
criminal justice system. Fortunately for us, we have a leading expert, Dr. Nicol Turner Lee. She is the director of the Center for Technology Innovation at the Brookings Institution. Before that, she was with the Multicultural Media, Telecom and Internet Council, and even before that the Joint Center for Political and Economic Studies. What

(01:29):
an honor to have you with us. Oh, thanks for having me, Mayor. I just so appreciate being here today. I talked about, I highlighted, the areas where it has an adverse impact on people of color. Which of these concern you the most? Oh, my goodness, I think they all do. I mean, when we think about the data and technology, it's almost as if technology is sort of trailing alongside systemic inequalities, at all of these stops where we're

(01:55):
actually seeing people of color, particularly people of color who are low income, live in rural areas, or, you know, may be older, affected by these technologies in ways that technology was never designed to be, right? It was always supposed to be a game changer to solve social problems. So I would say all of them have become equally important. So do you think the federal government understands the depth of the problem? Well, you know, I think it's an interesting

(02:16):
question, because I think the federal government has a role to play on a couple of fronts, and allow me to sort of break it down. First and foremost, we need the federal government to understand that they have to invest in us. And what that means is, a lot of times, if we see these intersectionalities between racism and discrimination and tech, it's because nobody like us is sitting at the table at the beginning when we're developing these ideas. And it's important that we

(02:38):
have workforce diversity, particularly in companies that have less than two to three percent of representation among decision makers, engineers, as well as data scientists. So let's start there. There are ways that Congress can start pumping that money back into computer science careers for young people, or figuring out ways to create more

(03:00):
inclusive data sets for the scientists that are building these products. But then, as I said, the second thing is we need to distinguish what Congress is looking at, which is the digital divide, from these other things that you talked about. The digital divide is about who is online, who is not, who has a device, who does not. Last year, when fifty million school-age kids were sent home from school, we found out that fifteen to sixteen

(03:21):
million of them did not have either a device or broadband. Nine million didn't have either, and the majority of those were kids who were of color, Black, brown, and those on tribal lands. We're now suffering from this because those young people are now one, two, three, ten months to almost a year behind when it comes to schooling, because they lack the materials necessary

(03:44):
to actually engage in distance learning. Sounds to me like a Brown versus Board of Education situation. But I would come back to that in my writing. When we start to piece apart the digital divide, that obviously requires the government to look at adoption and infrastructure, as well as a range of other ways that we have to get people involved. And then the final thing, I have to stay there on the criminal justice side: it's really important that we begin to look

(04:08):
at the decisions that these computerized systems are making. Did you know, not too long before the insurrection on January sixth, a Black man in Detroit was misidentified by facial recognition technology, sat at the station for six hours, only for it to be found out that he didn't even commit a crime. We need to get better at these things, because the very technology that was supposed to solve problems

(04:30):
should not be creating new ones. We've already marched, we've already fought for civil rights, and so we need to make sure the technology catches up with where we are in terms of our freedoms. So the pandemic has clearly taken the covers off of these systemic issues. How do we keep the pressure on so

(04:50):
that it's not forgotten a month from now, two months from now? You know, Karen, that is such a great question. I'll go back to my example of the schools here. We now have the ability to bring our kids back to school in the fall. But I actually was saying to any educator listening that we cannot abandon what we did to get kids connected. We should be

(05:11):
having, much like we did with No Child Left Behind, an initiative of No Child Left Offline. The fact that we had young people who didn't have a tablet next to their textbook, a Wi-Fi hotspot with a pencil, or any type of provision or guidance on how to get online, that's something we need to work on. We should not allow the lessons of twelve months during this pandemic to push those kids back into a vacuum where they

(05:34):
cannot have the same twenty-first-century tools to be productive. So I'm saying out there, there's a lot of stimulus money that actually went to good programming. We need to make those programs more permanent, much like the Emergency Broadband Benefit for affordable broadband and the monies that went to schools to be able to provide for these hotspots and tablets. And also, isn't having or not having broadband

(05:57):
going to impact some workers' ability to get a job and keep a job? Oh my goodness, Reverend, you are so right on that. I think this is where it becomes important, as this administration thinks about the infrastructure money. Let's start thinking how FDR thought about the New Deal. What's our tech New Deal that moves us away from just passive consumption to production? Where are we actually

(06:19):
putting people to work in these jobs that are now going to be taking over our communities? There were thousands, hundreds of thousands, of businesses owned by people of color and women that were lost. They're not coming back to our communities. But yet it takes an idea through a startup, a Black startup. It takes the ability to train people on how to string wire or fiber optic

(06:41):
cable. Those are livable, wage-scale jobs. And so we need to be pushing towards a tech New Deal that combines not just closing the digital divide, but making sure we get people back to work, particularly people of color. Well, the debate that we're having on the Hill now around the infrastructure initiative pushed by Biden seems to be a pushback by Republicans against any investment

(07:03):
in retrofitting human beings to participate in the new economy as against physical infrastructure. Isn't it about the dollars, or is it about where you spend your money? So when I was growing up, my mommy used to say it's the dollars and it's the sense. Let me tell you something. You can't ride on a new road without a car, the same way that you cannot be

(07:23):
a student K through twelve without a tablet or a hotspot. That soft infrastructure, as I call it, is local infrastructure. You know, can I tell you all one thing, ten seconds? We should have every federally assisted housing development in the United States, particularly in Washington, D.C., with broadband already built into it. We have to start looking at ways to have a connective mindset if we're going to make sure

(07:46):
that our connected communities are economically prospering in the new economy. We're very fortunate to have you here. We're very fortunate in our country to have you. Thank you for being on State of Play. Thank you. Welcome back to State of Play. Now we're going to talk about how this digital world we

(08:09):
live in has negative implications for so many aspects of our lives, especially for people of color. We have the perfect expert, Dr. Ruha Benjamin. She's a professor in the African American Studies Department at Princeton University. She's the founder of the Ida B. Wells Just Data Lab and the author of Race

(08:30):
After Technology. Thank you so very much for being here. It's an honor. Thank you for having me. So you talk about this "Jim Code," so to speak, how this technology is often used or organized in a way that has negative implications for people of color. Can you sort of elaborate on what

(08:54):
the Jim Code means? Absolutely. The New Jim Code is a combination of coded inequity and imagined objectivity, the fact that we imagine technology to be more objective and neutral than its human counterparts. So if we acknowledge that there's racism and discrimination in our courts, or in our hospitals, or in our schools,

(09:15):
a lot of people assume that taking those decisions that human beings would normally make and allowing technology to make them will get us around the bias, will kind of fix the problem, without recognizing that technology is created by human beings. The assumptions, the values, the desires, the interests that shape our society become

(09:35):
baked in and encoded into these technical systems. And the real danger is that, unlike their human counterparts, where you can point to a racist judge or doctor or teacher, when it's within a technical system, we assume that it's neutral, so it's harder to hold accountable. And so what the New Jim Code does is name this problem so that we can start to shine a

(09:56):
light on it and deal with it and try to address it in really productive ways. So could you take us through an example? Let's say you're trying to be paroled, or you're trying to get a loan, or trying to get a job. Could you show us how all of these work? Yes. All of these automated decision systems have to be taught how to make decisions.
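As a toy, purely hypothetical illustration (not from the episode, and simpler than any real system), the dynamic Dr. Benjamin describes can be sketched in a few lines of Python: a "model" that is taught by replaying the majority historical outcome for each applicant profile will automate whatever discrimination is present in its training data.

```python
from collections import defaultdict

# Hypothetical historical loan decisions: (group, income, approved?).
# In this made-up data, group "B" applicants were approved less often
# than group "A" applicants at the same income level.
history = [
    ("A", "high", 1), ("A", "high", 1), ("A", "low", 1), ("A", "low", 0),
    ("B", "high", 0), ("B", "high", 0), ("B", "high", 1),
    ("B", "low", 0), ("B", "low", 0),
]

def train(records):
    """'Teach' a decision system by learning the majority past decision
    for each (group, income) profile from historic human decisions."""
    tally = defaultdict(list)
    for group, income, decision in records:
        tally[(group, income)].append(decision)
    # The model simply replays the most common historical outcome.
    return {k: round(sum(v) / len(v)) for k, v in tally.items()}

model = train(history)

# Two applicants identical except for group membership:
print(model[("A", "high")])  # 1: approved, matching past approvals
print(model[("B", "high")])  # 0: denied; past discrimination, now automated
```

The point of the sketch is exactly the one made in the interview: nothing in the code is "racist," yet because the historic data is the starting point, the system reproduces the bias of the human decisions it was fed.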

(10:18):
They don't just grow on trees. And so the question is, how do we teach them? We teach them by feeding them past data, past human decisions, whether it's who gets loans, who gets paroled, you know, who gets the job. So we take that historic data and we train these systems how to make future predictions and decisions. So if in

(10:41):
a certain industry Black folks have been discriminated against for generations, or certain neighborhoods have been, you know, profiled for generations, that data becomes the starting point that we use to teach these systems how to make future decisions, and then we assume that it's neutral because it's being spit out by a software

(11:01):
system. But we have to question the source of that data, and the source of the human decisions, and the assumption of neutrality when it comes to these software systems. It's in so many areas of our lives. Is there an aspect of life where it's more alarming? I think we have to be concerned about it everywhere. So I do think, when it comes to, let's say, do-good

(11:22):
professions like healthcare, people assume, because it has this ethos of wanting to heal and help, that we can put our guard down and assume that, actually, you know, just because people are well meaning, the outcomes will be good. And I think we do have to stay on high alert even in those arenas. What's so interesting to think about is that when an AI system in the context of healthcare is trained using data from doctors' reports,

(11:46):
let's say doctors' reports of pain: we know that doctors and healthcare professionals routinely underestimate the pain of Black patients in their reports, and so if you train an AI based on that, the AI system is going to continue that process of underestimating the pain of Black patients. But recently there was a

(12:07):
study that trained AI based on patients' self-reported pain, and it was much more accurate, because it was actually going to the source of the pain and was not being filtered through the discriminatory lens of the experts, let's say, in that industry. So there are ways to think critically and create these systems with

(12:28):
equity and justice in mind. So you want to lead, or to some extent inspire others to lead, an abolitionist movement against this New Jim Code. Can you give us some of the tools to do that? So first and foremost, we need a name. We need a way to name what's happening to us, because if we can't name it, we can't talk about it, we can't organize against it. And that's why I've developed this idea of

(12:52):
the New Jim Code, to remind us that history is in our present. It's being put into all these shiny new features, shiny new systems, but historic data is actually driving these emerging technologies. And so, to organize, we have community organizations that are working on this, organizations like Data for Black Lives, and in different cities there are different regional and city-based organizations that I

(13:16):
call tech justice organizations. There's legislation that's being drafted around algorithmic accountability, because this can't just be a kind of patchwork thing where you leave it to different states and cities to deal with. We need a national, even international, way of thinking about accountability when it comes to technology. And of course my home

(13:37):
turf is education. We have to reimagine how we're training future technologists so that, rather than just being reactive to harmful technologies, we can have people at the source that are building them with these values in mind. And so the pedagogy, from pre-K through twelve and higher education, in computer science and engineering and STEM across

(13:58):
the board, has to be infused with an equity lens and an equity approach. Where do you think you can have the greater impact? Is it education? Is it regulation? We need to be working on multiple fronts. Education is where we seed what we want to see fifty years from now; we start seeding that in education. But the algorithmic harms are already happening now. People are being misidentified by facial recognition, people are being excluded from opportunities based on

(14:24):
targeted advertisements and automated systems. So legislation, and also legal mechanisms to actually hold these harms accountable, they have to happen yesterday. And so we need people working on multiple fronts rather than saying one comes before the other. But in order to get where you think we need to go, and I agree
(14:45):
with you, we've got to sortof purge all of our data banks,
do we not? You know,I think it's really hard to simply think
what we might do as individuals becauseyou know, the common parlence is that
we're users, right, we're usersof technolog oology. But if you're a
user, you're going to get used. So another way to think about technology
and technology access is the technology thatwe have access to has access to us.

(15:09):
And as you're noting, the data that we are producing is actually the lifeblood of these technological systems. So something that we can all do right away is to start to think of ourselves more as stewards of technology, as people who don't just have an obligation to protect our personal privacy, but

(15:31):
have to think about what are the legal and policy frameworks, what's the ecosystem in which technology is being developed, and push for that ecosystem to reflect public values rather than private interests. To do this, though, we all have to feel like we have a stake. We can't just leave this to the people with the fancy degrees, the people with the technological

(15:52):
know-how, because often they don't have the social and historical know-how. And so Black communities, Black scholars, Black engineers, Black students, we all have to rise up and understand that we have a stake in this, and we have to shape the future that we want to see exist. Well, I think obviously you are getting the attention of a great many people, and

(16:15):
you're winning a lot of praise, which is very encouraging. But at the end of the day, it's still a daunting task to get people exercised around something where few of us know a lot about it. And so I guess maybe you tune into and work with other activist groups, like, you know,

(16:36):
Black Lives Matter and that sort of thing. Yeah, absolutely. If you go to the resources page of my personal website, there's a whole host of ways to plug in. We don't have to just sit at home and feel paranoid about surveillance, or paranoid about, you know, Big Brother looking over our shoulder. We can actually plug into community groups, into organizations, and there's

(16:56):
a whole host of them around the country, to actually begin to empower us to change that paranoia into power. And so I would encourage those who are listening, who feel concerned, to realize that now is the time to get involved. Now is the time to shine a light on systems that would rather stay in the dark. So many of these systems would rather hide behind the cloak

(17:18):
of neutrality and make all kinds of decisions about our lives without us having a say, and now is the time for us to speak up. Well, good for you. You're doing great work and we need you desperately, so thank you for being on State of Play. My pleasure. Welcome back to State

(17:42):
of Play. Our topic today: the digital divide, and how that divide, that chasm, is daunting; how the digital world is expanding while the numbers of people of color seem to be shrinking. But we have with us now an expert, someone who's trying to address this very issue, Dr. Allison Scott.

(18:03):
She is the CEO of the Kapor Foundation. Thank you so very much for being here. Thank you, Mayor Pratt. Pleasure to be here with you. So you're a graduate of Hampton University. You have a PhD in education from the University of California at Berkeley. You've always had a focus, seemingly, on people of color in this space of STEM, the world of STEM. Why is

(18:27):
that? What prompted you to focus on that space? Yes, so my background in social science research has really led me to want to focus on understanding and examining systems of inequality, specifically in STEM education and computer science education, and in the tech ecosystem more broadly. As we've seen over the past decade, there's been such an increased focus on technology, and that's played an

(18:52):
increasing role in our society, and so it's just so important to understand how people of color are being excluded and then what potential solutions are to those challenges. When you say people of color are excluded, I think we all kind of grasp that. Well, what are the numbers? You know, what is the percentage of Blacks, let's say, in the tech space, the percentage of Blacks who are executives in the tech space, and the like? Yeah,

(19:15):
so we know the tech sector is increasingly driving our economy. So we're talking large numbers, twelve million people and almost two million new jobs just in the last decade. But only eight percent of the tech workforce is Black, and less than one percent are Black women. And we've seen over the past five years, across the large Silicon Valley tech companies, very, very minimal progress in representation. We

(19:38):
actually did a study that found only a one-percentage-point increase, despite all of the efforts over the past five years, in the representation of Black folks in the tech workforce. By venture capital, to what extent is that available? And I'm so glad you asked that question, because it's such a critical space as we think about innovation and the creation of new jobs. So in twenty nineteen, there was one hundred

(20:00):
and thirty-seven billion invested in tech startups, and just one percent of those were founded by Black entrepreneurs. So there's still a significant amount of work that's needed in the space of venture capital and investment. We need more capital flowing to entrepreneurs. We need to develop and support entrepreneurs as they create new innovations in the technology space, and provide them the opportunity and the room to flourish and grow.

(20:25):
Well, you know, in the report that I think I saw that your foundation did, it suggested that venture capitalists provided a Black entrepreneur but one hundred and twenty-five thousand dollars on average as the investment, but for white entrepreneurs, like two point five million dollars. Always undercapitalized. I mean, how do you turn that around? Yeah, and that was a

(20:47):
critical report done by, we cited data from, Digital Undivided. They've been looking at this data for the past three or four years. So what we're seeing is really very, very incremental, if any, progress in the amount of capital that's flowing in. I think in the last six months we saw pretty significant commitments from tech and from larger private equity firms towards Black entrepreneurship. But those have

(21:11):
really just been a drop in the bucket. A great place to start, but we need much more capital flowing in. Well, you know what also fascinated me is how your foundation puts such a focus on voter education and voter registration. One would not think that that was necessarily a strategy for a foundation focused on addressing presence in the STEM space. Why? So, we are a racial

(21:37):
justice focused organization. We focus at the intersection of racial justice and, obviously, technology, and we saw what was happening over the last year or so and said, it is going to be critical for us to empower communities of color to have the opportunity to turn out to vote. And some of the solutions to some of these really complex challenges, both in education and the workforce in

(22:00):
tech, all require both public and private solutions, and so who we elect, who our public officials are, can also contribute to some of these changes. So we saw a direct connection between the two, and we provided resources to grassroots organizations that have been leading the charge for quite some time to turn out voters. So is there a particular piece of legislation, federal or statewide,

(22:25):
let's say in California, that you think could make a difference, that could have an impact? So we're actually looking at a couple of different things. One, we think that the government, at all levels, has a huge role to play in holding companies accountable: so regular data collection, reporting and transparency; regulation and oversight of things like antitrust and competition; the utilization

(22:47):
of algorithms, especially in the case of algorithmic bias, of which facial recognition software is something getting a lot of attention; and data protection and data privacy. So accountability is a huge portion. And the second piece, we think that public policy can play a role in being proactive in developing more equitable futures: so

(23:07):
broadband infrastructure investment to provide high-speed internet to folks who have been left behind, and education and workforce development, providing resources to institutions to think about things like upskilling and reskilling, how to ensure that the existing Black workforce can be prepared

(23:30):
for tech jobs of the future. And then also economic stimulation, as we were talking about: how to get more capital flowing into communities of color, how to inspire more Black entrepreneurs and provide them the capital that they need. There's a lot of legislation that can happen there. Do you think that these companies should be treated as utilities? I mean, they're ubiquitous. Shouldn't they be regulated, since they impact so many facets of our lives? I

(23:53):
think that's a really outstanding question. I know that there is some legislation; there are a lot of folks considering different angles on that piece, and we're following that very closely. But we do think that there is a role to play around regulation, around Section two thirty and content moderation, now that these companies have become so powerful as sources of information. And we saw what happened with

(24:15):
the January sixth insurrection and the role of disinformation and misinformation. There is a role that the government should be playing, and we are following that very closely. Your leadership, the Kapor Foundation's, the leadership that you're providing in this space, is exemplary. What of those big techs? I mean, they have foundations. Also, there's money that they could provide, matching dollars, but let's say

(24:38):
venture capital. In terms of what you're doing, are you seeing that happening? Yeah, I think we've seen, I would say over the past four years, we've seen more momentum and more chief diversity officers hired, more commitments to diversity. There's a lot more work to be done inside of those workplaces to ensure inclusion and, actually, that the numbers move; as I mentioned, we only

(25:00):
saw a one-percentage-point increase over the past four years. But there is also a role that they can play in terms of deploying resources, their CSR budgets, the very, very powerful lobbies that they have. As we think about things like broadband and STEM education, they can leverage their power in ways, I think, that can benefit

(25:21):
all of society. And the bad word that people don't like to talk about too much is taxes. Are these companies paying their fair share? As they continue to make tremendous amounts of profits, are they paying their fair share to ensure that this is the type of equitable society that we all want to live in? Well, you deploy a lot of strategies, a lot of initiatives, you know, for the purpose of increasing participation in this space. Which

(25:47):
of the strategies, which of the initiatives have proven most effective, and, you know, possibly could be followed by others? I would say two. Obviously, as a researcher, I'm biased, so I'll say I think producing consistent research and continuing to call out the problem, and just shining a spotlight on where the disparities exist, I think that has proven to be effective in just really

(26:10):
getting people to understand and address the challenges. And then, second, on our venture capital side, we invest seed-stage dollars in gap-closing startups that are led by entrepreneurs of color and diverse entrepreneurs. We have about one hundred and thirty companies in the portfolio now who are doing amazing work. And I

(26:32):
in spaces like health tech and cleantech, I think it just provides additional
motivation for us to expand the workand continue to do amazing things. Well,
what you're doing is very important,and I hope when you have subsequent
reports you'll report on what these othercompanies are doing, as well as companies
as foundations, you know, asKuba Gooding said, show me the money

(26:56):
Now, are they putting their money where their mouth is and making a difference in this mammoth industry, in this mammoth space? But thank you so very much for all you're doing, thank you for what the Kapor Foundation is doing, and above all, thank you for being here on State of Play. Thank you so much for having me. Welcome back to State of Play. Our topic

(27:26):
today: the technology gap, especially for communities of color. We are very fortunate indeed to have with us now Dr. Devdas Shetty. He is the Dean of Engineering and Applied Sciences at the University of the District of Columbia, the state university for the District of Columbia and also an HBCU. Thank you so very much, Dean Shetty. Thank you, ma'am. So I said there was

(27:49):
a problem, a problem of people of color being involved in technology. To what extent is it a real problem? It is a real problem, because there are very few in number; very few people of color are in the area of technology, you know. And it has been there, and we are trying to address it at the University of the District of Columbia. I mean, we have a number

(28:14):
of federal grants, you know. Our faculty apply for these research grants as well as educational grants. One part of those grants is the outreach service. It will give an opportunity for our faculty to work with the high schools or organizations so that we can bring students to the campus,

(28:36):
expose them to STEM and other types of activities, get them excited about computer science, get them excited about engineering. There are a lot of such activities going on, actually, so I can give you some examples. Also, we have a training program. There is a program called the Ambassador Program, where we have our own students, the engineering students and computer science students, reach out

(29:00):
to the community, go to the schools, and bring them in and offer them a lot of fun types of activities, and as a result, we increase their interest in this area, the STEM area. The University of the District of Columbia is one of, I think you told me at one point, fifteen HBCUs in the United States that offer engineering and applied sciences. Is that the sum total of the ones out

(29:22):
here? Yes, there are only fifteen HBCUs which are ABET accredited. These are the accredited programs, the top standard of accreditation. So that is out of three hundred and seventy engineering schools, which is only four percent, and these four percent of engineering schools produce twenty-nine percent of engineers and computer scientists. So

(29:45):
that is really a figure which should be looked at. So if the resources to these HBCUs are increased, you know, it will result in more students, help more students. We'll be able to graduate a greater number of computer scientists and engineers, you know. So that is a clear-cut advantage there.

(30:07):
And do you think the federal government appreciates the challenge? You said the National Science Foundation has provided, for example, grants to the University of the District of Columbia. Has the federal government weighed in so that you can expand the kind of programs you have, reaching out to young people in secondary school? Yes, definitely, the federal government is very much interested: the National Science Foundation, the Department of Energy,

(30:33):
NIST, the National Institute of Standards and Technology, NASA, and there are several other organizations which are funding us, mostly the federal government. But each of those grants has a student component. So these grants go to the students, either as an undergraduate scholarship, or as an opportunity for them to do research,

(30:55):
or to reach out to high schools or reach out to community colleges so that we can bring in the students and get them excited about STEM, engineering, and computer science. It's very encouraging, all that you're doing, Dean Shetty, and your insights and contributions are much appreciated. Thank you so very much for being on State of Play. Thank you.