Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
What if I said that we need Big Brother to monitor everything, literally everything? We should have surveillance going on all over the place? That would be equity at its core. Equity at its core. To be able to monitor everything, to track
(00:27):
everything you do. Maybe that's where equity actually lives. We need to be able to check everything. Leave all the decision-making to the robots, and then maybe we can have some equity. What if we allow the robots to determine what's fair?
(01:05):
I like to be educated, but I'mso frustrated.
Hello to my loneliness.
I guess that ignorance is bliss.
Take me back to before the noon, go away and take it out of
queue.
So when I was at the University of Florida, and I've said this on
(01:27):
this podcast before, I will admit I was not the most attentive student. While I was there, I had a lot of fun. I had a lot of fun playing basketball at UF. I pledged, and the crazy thing is, I'm not even that social like that. But yes, I had some fun while I was in college.
(01:48):
Sue me. But even back then we were active, especially through my fraternity. We took a stand when we needed to. We tried to help and give back when we needed to. We were very active in that sense on campus, and one of the things, or the issues I should say, that came up at that time
(02:12):
was around this concept of affirmative action. It was around this concept of including or not including race on applications to the University of Florida. And of course, you know, at its most basic level, most of the
(02:35):
folks, especially people of color, were arguing that we should have race on the application, period. That should be included in the decision-making process when we were talking about whether somebody should be accepted or not. And obviously, at the time, you had a lot of people opposed to that.
And so we, along with some other organizations on campus, the college NAACP, and the other fraternities and
(02:56):
sororities, I mean, we had a couple of protests. We did our kind of college thing in terms of resisting those decisions to remove race from the application.
But one of the arguments that I was trying to make back then, and I'm not sure if I did it successfully at that young age, was actually a little bit on the opposite side of
(03:19):
some of my peers at the time. I, too, agreed that we take race off of the application. I said it a little tongue in cheek, but I was advocating that we take race off the application. We should not include race on the college application.
It should not matter if somebody is Black or white or
(03:42):
whatever. Take race off the application. It should just strictly be on the merit of their GPA in high school, or their SAT or ACT score. That's all that really should matter. That's all that should matter, right? And people looked at me, especially folks in my circle, like I was crazy. But I guess my argument didn't stop there.
(04:05):
What I was saying was it should be on that merit. So take race off. But we should take everything off if we feel like it's not important, right? If we feel like race is not important, then I don't think we should include gender on the application. If your parents went to school at the University of Florida,
(04:28):
let's not include that on the application. For that matter, let's not even include any recommendation letters in the application. I don't even want to know if you are a top tennis player or quarterback or basketball player. I don't want to know any of that.
I don't want to know if somebody has a physical
(04:52):
disability. I don't want to know any of that information. What I was advocating for: let's just have a list of all the applicants. You have their GPA, you have their SAT or ACT score. You rank them based off their GPA and SAT score, and then, however many freshmen you're trying to accept,
(05:13):
you just draw the line, and whatever you get, whatever you get. I mean, if you get, I don't know, 80 percent women or 80 percent men, that's what comes in. If you get 95 percent white people or Black people, that's what you let in. If that quarterback doesn't make that cut, he doesn't get in. If the alumni's kids don't make that cut, they don't get in.
(05:37):
So I say, let's do it strictly. Yes, on full merit, full GPA.
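The "strict merit" thought experiment above can be sketched in a few lines of code. This is only an illustration, not a real admissions system; the names, the GPA and SAT normalization, and the equal weighting are all assumptions made up for the example.

```python
# Hypothetical sketch of the "strict merit" cutoff: rank every applicant by
# GPA and test score alone, draw the line at the class size, admit whoever
# is above it. No race, gender, legacy status, sports, or recommendations.
from typing import NamedTuple

class Applicant(NamedTuple):
    name: str
    gpa: float   # on a 0.0 to 4.0 scale
    sat: int     # on a 400 to 1600 scale

def admit(applicants: list[Applicant], class_size: int) -> list[Applicant]:
    # Normalize GPA and SAT to comparable 0-1 scales, rank by the sum.
    ranked = sorted(
        applicants,
        key=lambda a: a.gpa / 4.0 + a.sat / 1600,
        reverse=True,
    )
    # Draw the line: everyone above it gets in, everyone below doesn't.
    return ranked[:class_size]

pool = [
    Applicant("A", 3.9, 1450),
    Applicant("B", 3.2, 1550),
    Applicant("C", 4.0, 1100),
    Applicant("D", 2.8, 1000),
]
print([a.name for a in admit(pool, 2)])  # → ['A', 'B']
```

Whatever demographic mix falls above the line is the incoming class; the rule itself never sees anything but the two numbers.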
Now, clearly, when you get to that level, nobody that was advocating for taking race off of the application wants to take it that far. They didn't want to take it that far, right? They just wanted to take race out of the application.
(05:57):
But they know for a fact you can't have an incoming class that is 80% men, right? So they have to keep gender on there. You can't have the class come in and those alumni's or those boosters' kids not get in. You can't have that. You can't actually have the quarterback, the number one
(06:17):
ranked quarterback in the country, not be accepted. That's not reasonable. So of course, we're going to leave some of those things on the application. And if you're going to leave those on the application for consideration, then consider race too. That was the argument that we were essentially making at the time. How does this tie into this concept of surveillance?
(06:38):
Big Brother, robots making decisions? It's because, as marginalized communities, as communities that often are the ones on the short end of the stick when it comes to a human making a decision on you being accepted into the university, you getting that job, or you getting that speeding
(07:01):
ticket, you getting pulled over, you getting that certain decision handed down to you from the judge, any of those situations, we as marginalized communities collectively have been on the short end of the stick. And we traditionally think about, OK, well, how do you get around that?
(07:22):
Well, you get around that because you need to have maybe the right judge making that decision, the right cop stopping you, the right admissions director at the school, the right CEO of the company. So in some ways, we've been thinking that, yeah, this is how we get around it. But there might not be enough of the
(07:45):
right someones to make all of those decisions. Right? When it comes to AI, and this is where I am most scared, we will get closer to so-called allowing these systems to make these decisions for us. Right, allowing these systems to decide whether or not
(08:09):
somebody is going to be accepted to the University of Florida, or whether or not somebody is going to receive a speeding ticket, those types of things. But if we aren't careful, even those AI systems, and we all know this, can be biased.
What we should actually be advocating for is the creation
(08:33):
of those systems, but in a way that is truly diverse. If you had an admissions system at a university that was developed with community, by community, and in collaboration with others, right? If you had a system that was set up that way, well, now we
(08:54):
can leave it up to the robots to make equitable decisions on our behalf.
Even on the speeding ticket concept, when you think about somebody constantly monitoring you, constantly watching stuff. First of all, I'll make the argument right now: some people feel like they are off the grid. If you have a cell phone in your pocket, if you are listening to this podcast, you are officially on the grid.
(09:16):
You are not off the grid, you're on it. There is a company that knows exactly where you went today, at all times. They understand and know about all your conversations. They have a very good sense of who you are, what your health is, right on down the line. You are on the grid. If you listen to this podcast, I can confirm that, all right.
(09:37):
So if we are going to be on the grid, if we are going to be constantly monitored anyway, we should be leveraging that to our advantage. To our advantage. If you think about a speeding ticket, why is it left up to chance for police officers to randomly pull us over because we were speeding?
(09:57):
Let's just think about this for a second. We know that people in Black communities are pulled over far more often than any other community. Why not advocate for a system so that, hey look, forget the cop pulling you over for speeding?
(10:18):
We should have something at every corner in America, maybe, or in a city. You know the speed limit on this particular road is 45 miles per hour. If you are driving over that, we've got the technology. You just mail them a ticket, period. I don't care if you're white, Black, Indian, green, doesn't
(10:39):
matter. If you are in a car on this road and you go over 45 miles per hour, you know you're going to get a ticket in the mail.
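The camera rule described above is deliberately simple: one flat threshold applied to every driver, with no discretion about who gets stopped. A minimal sketch, assuming a made-up camera log format and a hypothetical 45 mph limit:

```python
# Hypothetical sketch of the automated speeding-ticket rule: the same check
# runs on every reading, regardless of who is driving. The CameraReading
# shape and the plate numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class CameraReading:
    plate: str
    speed_mph: float

SPEED_LIMIT_MPH = 45.0  # posted limit on this hypothetical road

def tickets(readings: list[CameraReading]) -> list[str]:
    # Same rule for everyone: over the limit means a ticket in the mail.
    return [r.plate for r in readings if r.speed_mph > SPEED_LIMIT_MPH]

log = [
    CameraReading("ABC-123", 44.0),
    CameraReading("XYZ-777", 58.5),
    CameraReading("JKL-456", 45.0),  # exactly at the limit: no ticket
]
print(tickets(log))  # → ['XYZ-777']
```

The point of the design is that the rule contains no field for who the driver is, so it cannot apply the threshold unevenly.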
That'll do a few things. Number one, I bet it will slow a lot of people down, because you're not even playing the game of, is the cop there? Are they there? Can I speed up here? Can I slow down now? No, no. You know, if you go too fast, you're
(11:02):
going to get a ticket. It's going to be in your mailbox tomorrow, right? That's one. It'll slow some stuff down.
But then, too, what it really does, what it really does actually: when we say, oh, we wouldn't want that, no, no, there's a lot of other people that wouldn't want that. No different than there's a lot of other folks that didn't want us to say, well, just take GPA and SAT scores and just cut the line
(11:26):
on the admissions. For sure, for sure, there's a lot more people that will be getting tickets under those conditions.
Under those conditions, we should be fighting for equity. Everybody gets a ticket, including us, including me. If we go too fast, give me a ticket. Matter of fact, give everybody a ticket.
(11:51):
We should lean into this moment in time for us, right now. With the different ways that we can monitor stuff, with technology, we should be leaning into those types of tools in order to bring about equity.
On our podcast a couple of weeks ago, I was talking to a good, good brother, a good friend of mine.
(12:18):
He's a prosecutor, and we had a very brief but colorful conversation around the use of AI in the decision to prosecute or convict criminals. Obviously, he was making the argument that that would never necessarily be the case. You couldn't do that, because there are so many different conditions and things that one would have to consider when it comes to the prosecution of a crime. And I don't doubt that.
(12:44):
Probably true. Matter of fact, we know that to be true. Right, all crimes are not the same. You can't just consider all burglaries to be the same, all whatever to be the same. We understand that. But that is the power of data in this moment. If anybody has opened up ChatGPT in a browser, if you've ever done anything in AI, you understand that there are so many scenarios.
(13:06):
That's the powerful nature of it. You can handle so many different scenarios, so many different scenarios. And will it always be right? No. But I tell you what, if we said, OK, whenever this particular scenario happens or doesn't happen, this is how you're supposed to prosecute.
(13:27):
This is what determines whether or not the person gets convicted. Period. Period, right? And we should not be scared of that, of any group. What are we going to get, more of? We already are prosecuted more, convicted more. That shouldn't be our worry.
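The idea being argued for here, explicit rules that map a case's circumstances to a charging decision and get applied the same way to everyone, can be sketched like this. Every offense category, threshold, and outcome label below is invented purely for illustration; real charging guidelines are far more involved.

```python
# Hypothetical sketch of rule-based charging: published rules decide the
# outcome from the facts of the case, not from who the defendant is.
from dataclasses import dataclass

@dataclass
class Case:
    offense: str           # e.g. "burglary"
    prior_convictions: int
    weapon_involved: bool

def charging_decision(case: Case) -> str:
    # Every matching scenario gets the same outcome, applied uniformly.
    if case.offense == "burglary":
        if case.weapon_involved:
            return "felony charge"
        if case.prior_convictions == 0:
            return "diversion program"
        return "misdemeanor charge"
    # Scenarios with no rule defined go to human review instead of a guess.
    return "review: no rule defined"

print(charging_decision(Case("burglary", 0, False)))  # → diversion program
```

Note the fallback branch: as the prosecutor's objection suggests, not every scenario can be enumerated in advance, so unhandled cases are routed to a human rather than decided by the system.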
(13:47):
We already are having trouble getting into the University of Florida. We already are getting pulled over more. We should be the ones to say, no, let's put some rules around that, and advocate for that. And yes, there's going to be some casualties along the way, 100%, 100%.
But in the grand scheme of things, we could use technology,
(14:11):
it's possible, to enforce equity, to make sure that things are fair. The key is, we have to make sure that we're creating that technology. We have to make sure that we are developing that. But if that can be done, that is how we scale equitable
(14:31):
situations.
That is how we scale fairness.
That is how we can ensure it in this new climate that we're in. We can't run from this. This is like the internet. You can't be in business in 2025 and tell somebody that you don't do internet things. Not even possible.
(14:53):
You're not allowed to be in business in 2025. You're not allowed to exist. You're not allowed to necessarily have a job, be productive, participate in this economy, if you're saying you don't do internet things, like email. You don't do email, you don't do texts and stuff like that, you
(15:14):
don't go on websites. You don't do that kind of stuff? No, that's impossible for you in 2025.
Very, very shortly, that same concept will apply when it comes to artificial intelligence. You don't have the option to say you don't do artificial intelligence things, that you don't participate in that. Oh no, you do participate in that, because if you've ever
(15:35):
shopped on Amazon, then you've participated in that. If you've ever received a phone call or a text, or you're trying to figure out, like, how in the world did it know to show me that commercial on Netflix, you're participating in the AI economy.
So this is not an option, whether or not we're going to participate in AI. It is a
(15:56):
fundamental reality. AI is going to be here, like the internet is going to be here. How might we use AI, how might we use data and those types of things to force equity in these environments and in our community?
I believe that the time is now for us to absolutely lean in and
(16:22):
drive equity using the tools that we have at our disposal.
This is the Scratch Word Podcast, where we don't fear the future. We create it. One thought, one idea, one dream at a time.
Thank you.