Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:01):
On the other side of the world there is a continent with
fifty-four countries, inhabited by the Balanta, Cueno, Carabalí, Lucumí,
Arará, Ashanti and many more peoples: fathers, warriors, thinkers, farmers,
(00:23):
miners, ranchers and fishermen who are present today in Colombia. If you
use words like chévere or ñame when you speak, you are rescuing the oral
tradition of Africanness. If you prefer salsa, vallenato or reggae, you
evoke the rhythms of Africanness. If as a child you heard the
(00:47):
myths and legends of the Madremonte, the Tunda, the Mother of Water and
La Llorona, you are remembering the stories of Africa. Rosario Radio, in alliance
with the National Conference of Afro-Colombian Organizations (CNOA), presents
Traces of Africanness: ethnic territories building peace, traces of Africa in Colombia.
(01:23):
Dear netizens, it is a pleasure to get back in tune. Thank you for
continuing to connect with Rosario Radio, and remember that we have new hours:
you can now listen to us on Tuesdays, every fortnight, from ten to eleven in
the morning, here on Rosario Radio and on Spreaker. Welcome to African
(01:47):
Footprints, a radio space made possible by the alliance of the Social Projection
Directorate of the Universidad del Rosario, the National Conference of Afro-Colombian
Organizations (CNOA) and Rosario Radio. From Bogotá, the Colombian capital, we will travel
through the Afro-descendant, Black, Palenquero and Raizal national territory, getting to know
(02:08):
the different ways of making peace, that is, traces of Africanness. Remember
that you can reach us through our social networks: on Twitter,
Facebook and Instagram as @urosarioradio. The CNOA can also be found
on Twitter, on Facebook, on Instagram and on YouTube as the National Conference
(02:31):
of Afro-Colombian Organizations, where we have a very interesting audiovisual offering.
If you tune in to the online signal late or miss the program, you can
(02:51):
find it on our website, www.convergenciacnoa.org, in the podcast section, or here
on Spotify, where you will be able to listen to the program, share it
on your networks and save it on your preferred digital device.
Special regards to our secretaries and operating teams of the seventeen
(03:15):
zonal teams of the CNOA in the different departments of the country; to our director,
Mario Castro; in the master control booth, Nelson Duarte;
and the one speaking, Mayo Rivas Molina. Remember to follow me on Facebook as Mayo
Rivas Molina. Hello, dear netizens. It
(03:53):
is a pleasure to get back in tune. Today we want to talk
about participation and representation in the development of artificial intelligence. Black representation in the
artificial intelligence industry already faces many very significant challenges. One of the main problems
is the underrepresentation of Black people in the technology industry and in fields related to
(04:16):
artificial intelligence. This translates into biases in the algorithms and systems of AI,
as they are often developed and trained with sets of data that do
not adequately reflect racial diversity. These biases can lead to discriminatory results and perpetuate
existing inequalities, negatively affecting Black communities around the world. Lack of representation also
(04:42):
means that the perspectives and needs of our communities are not adequately considered in the
design and development of these new technologies. However, there are also very important
opportunities to address these challenges and promote more equitable representation. Increasing diversity in AI
development teams and encouraging the inclusion of Black people in key roles can
(05:06):
significantly improve artificial intelligence systems. In addition, the creation of educational initiatives and
programs focused on attracting Black students into the fields of science, technology, engineering
and mathematics can help build a solid foundation of diverse talent. We also have
to think about collaboration between industry, academia and communities to generate innovative solutions
(05:30):
that address the biases in algorithms and ensure that artificial intelligence technologies can
benefit all people regardless of their ethnicity, social status or race. But to
talk about all these topics, which are so broad, we are joined today by
Chelsea Mena. She is a petroleum engineer from the National University of Colombia in Medellín
(05:56):
who became a data scientist in two thousand twenty. She is currently
a data analyst. She is passionate about technology and the
power it has to transform the world we live in. She is
an enthusiast of data visualization and analysis. In addition, she believes that artificial
(06:17):
intelligence can be used for cool projects that solve very real problems. On the
other hand, we are also joined by Alexander Paz García, communicator of the
association Casa Cultural El Chontaduro, an Afrofuturist, web developer and digital
marketer. Welcome to you both. Let us start from the importance of naming
(06:42):
ourselves. When the program began, I made a brief presentation of each and every
one of you. However, we would like to know who Chelsea is and
what motivated her to change from her petroleum engineering career to AI. I am
now twenty-six years old (I turn twenty-seven in
(07:03):
less than a month) and I studied petroleum engineering here in Medellín. You probably know
the kind of decision you make at sixteen that turns out to be monumental and
you don't realize it. Petroleum engineering for me was a very good
choice, because it's very interesting from a technical point of view,
that is, as mathematical science. The innovation required to get oil out of the
(07:28):
ground is enormous, and it's something you don't find online
if you don't know what you're looking for. So it
sparked my curiosity, and petroleum engineering kept me because I
liked it. Let's say I even worked in the industry for
a while, and in the pandemic, when we all went home, oil really went
(07:49):
into a crisis, since there weren't even people using cars. We went into
crisis, and it was a time when you could explore other things that had
always made me curious, at least what software is, what predictions are,
even how Spotify works and how Netflix works. I had
always had that curiosity too, and because of the rigor of the degree, I
(08:11):
had never explored it. In the pandemic I had that opportunity and I realized
that it's also engineering, that it's engineering even if
it's not as physical, and it has a touch of science
fiction, very nice, that looks like magic. And in two thousand twenty it wasn't
yet so magical; right now it's an awesome thing. I was very pleased to be part of
(08:35):
the section of the population that knows how it works. It seems
to me that data science is something multidisciplinary that oil was not, and it
seems to me that we are at a point where it is going to transform
how we do everything, absolutely everything, and having a grounding in how
it works, beyond being just a user, also lets you take out that
(08:58):
juice. So, it's like this: a chemical engineer,
an industrial engineer, an accountant or a lawyer can work in a company
in any field, and a data scientist can work in a company in any field, because data science
is applied to whatever industry you're working in. That's
something that some careers, among them petroleum engineering, don't allow.
So it opens up the field of work in that way. Obviously, it was an
(09:22):
economic decision, but it has also been a very wise decision in terms of
the projection I see for my life. Before, let's say, I saw myself
in an oil field all the time. However, my pivot
toward data science allows me to go back to oil if
I want to. Right now I work in manufacturing; I could end
(09:46):
up working in, who knows, medicine, which is something I have planned, because
it seems to me to be the most important application of data science that we
have right now, with things like medical diagnosis. I could end up working at
Facebook, at Spotify, at Netflix, at Google. Data science really
opens up the whole field of work for you. So,
that's why I made that decision. Thank you so much.
(10:07):
Let's note that little bit of data right now about data science. I
am very struck, Alexander, by how you integrate your passion for Afrofuturism with the technology
of your daily work as a web developer and digital marketer. Tell us
about it. That's
how I feel too. I like it a lot.
(10:30):
I really like the subject of marketing, but also from a communication perspective.
I am currently studying social communication and have discovered that there are many ways to
do many things, and that in our communities, for example in Afro-Colombian
terms, there are a lot of very important elements in the way we
(10:52):
communicate and tell each other things, the way we pass on information. For example,
I was very impressed by the fact that orality, which is something that is
very present in our communities, continues to demand space in current technologies, such
as voice notes. Most people already communicate half the time through voice notes.
So I was thinking about how we could use these tools, in which
(11:16):
Black people have been involved. For example, calling
over WhatsApp, Internet calls: that was created by a Black
woman who, I think, is now a research director at
Google. There is also the fact that there are, for
example, robotics teams from the Pacific that won a robotics championship in a
(11:37):
competition in Mexico, and other examples like these of how we
Black people have been involved in the creation of technology. So, Afrofuturism involves a
lot of narrative elements from one's own narratives, a lot of imaginaries,
a lot of aesthetic and philosophical positions. And I have thought, from there,
(12:01):
with this passion that I have for digital marketing, about how I
can help Black people recognize the potential of all the wonderful and innovative things we can do
and take them to another level, the level where they can create businesses,
create projects that solve problems. And besides, there is also the idea that
Black women are the ones who are now leading innovation, because
(12:22):
they also have some very particular ways of seeing reality and help solve problems very distant
from, as has been said, the patriarchy and the hegemonic ways of doing things
these days. For example, in the project of the Casa Cultural
El Chontaduro, the approach is popular education, like Paulo Freire. We talk
with the students, and several of the girls there want to study
(12:45):
software development and video game creation, around the idea that two
African-American girls showed various new ways to prove the Pythagorean theorem,
a problem that goes back two thousand years of research. They mix
a lot of elements from their own perspectives and generate these new possibilities that leave the
(13:07):
world of physics and mathematics amazed, like wow, and they are girls who didn't
even want to devote themselves to mathematics as a lifestyle, but they live it,
for example. So, from Afrofuturism and those narratives we
can bring Black youth closer to these areas of knowledge from which we have been
relegated, and there are also, very famously, the
(13:28):
women who participated in those missions at NASA, about whom there is a film
whose name escapes me right now. So I think Afrofuturism, digital marketing, from
narratives and ways of doing things differently, from our own experiences, I think
has a lot of potential that we can draw out.
After your presentations, let's focus on the matter at hand. Lately it's
(13:54):
gone crazy. Everyone talks about artificial intelligence: that AI, that ChatGPT has
conversations, that it does translation; that is, everything in this life already has
a chat, everything in this life has artificial intelligence, from the cell phone to
even the call center. Everything has become artificial intelligence. Some people
are very reluctant to talk about this and are very afraid of it,
(14:16):
and other people use it out of obligation, because artificial intelligence has come to
stay, and whether you want it or not it forces itself on you.
So what I want is for us to start by talking first
about what artificial intelligence is, because as a Black community we
need to start addressing this issue. So, can someone explain to me
what artificial intelligence is? It is basically
(14:39):
an equation in which you give it variables as numbers and it delivers a number. It sounds very
basic, but it's important to start from that definition, because
when you talk about artificial intelligence, a lot of people
think of ChatGPT, Copilot, all these call-center chats, and
(15:05):
yes, that's artificial intelligence, of course, but
it's not the only thing that is artificial intelligence. Artificial intelligence
has been in our computers since the nineties, practically, making little predictions. So, artificial intelligence:
these numbers that you give it are usually data; they are called data points,
which is like one point of data. For example, you
(15:26):
tell Google your location and where you're going, and with all the
people who gave it their location and where they were going, it already
knows more or less how long you're going to take, and it
throws a number at you: fifty-one minutes. That
is already artificial intelligence, for example, and we've been using
it for at least a decade now, in Google Maps.
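The Google Maps idea Chelsea describes, many past trips go in as numbers and one number comes out, can be sketched in a few lines of code. This is only an illustrative toy under invented data; real trip-time models are far more sophisticated than this nearest-trips average.

```python
# Toy sketch of "numbers in, number out" prediction, in the spirit of the
# Google Maps example from the conversation. All data here is invented.

def predict_duration(past_trips, distance_km, hour):
    """Estimate trip duration as the average of the 3 most similar past trips."""
    # Similarity: how close the trip is in distance and departure hour.
    scored = sorted(
        past_trips,
        key=lambda t: abs(t["km"] - distance_km) + abs(t["hour"] - hour),
    )
    nearest = scored[:3]
    return sum(t["minutes"] for t in nearest) / len(nearest)

past_trips = [
    {"km": 10, "hour": 8,  "minutes": 55},  # rush hour
    {"km": 10, "hour": 14, "minutes": 30},
    {"km": 12, "hour": 8,  "minutes": 60},
    {"km": 5,  "hour": 20, "minutes": 15},
    {"km": 11, "hour": 9,  "minutes": 50},
]

print(round(predict_duration(past_trips, distance_km=10, hour=8)))  # 55
```

Every input is a number, and the output is one number: an estimated travel time.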
(15:46):
And people were not talking about artificial intelligence then the way they are this year, now that
OpenAI gave us ChatGPT. People have a vision of
artificial intelligence as something that thinks like a human being and communicates with
me like a human being. And that's why these large
language models, as they are called, are the ones generating this boom
(16:10):
in artificial intelligence: when people see that they respond like a human
being, they think, wow, there's a brain parallel to mine
that's talking to me. But artificial intelligence is many other things, and
we have been living with it for a long time. It's been part
of our lives for a long while. Right now, people are interacting with it
as if it were a human being, and that's creating problems, let's
(16:33):
say, even relationship problems. That's why people are talking about that
artificial intelligence. That kind is called general artificial intelligence, and
it is the science-fiction concept we have of the robot that
exists among us and walks among us, and many say that
in the field of artificial intelligence we will never really arrive at a general
(16:56):
artificial intelligence, because it requires that the robot not only respond to you,
but understand what it is responding. And right now, what ChatGPT
does is that when you give it words, it turns them into numbers and
looks at all the texts we've written and delivered to it, and it
says, good: when people write this, it is statistically likely that they want this
(17:18):
answer phrase. When I say good morning, you answer good morning, and it
says: good morning. There is a hundred percent chance that the person would like
me to write back good morning, but it doesn't really know what
good morning is; it just throws out good morning. So there you have the
difference between that artificial intelligence and our intelligence, so to speak, real
(17:40):
intelligence.
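Chelsea's "statistically likely answer" explanation can be made concrete with a tiny counting model. This is a bare-bones caricature of what large language models do (they train on huge texts with neural networks, not simple counts); the training text below is invented.

```python
# Minimal sketch of "statistically likely next word", as described in the
# conversation: count which word follows which in some training text, then
# answer with the most frequent continuation. Invented toy data.
from collections import Counter, defaultdict

training_text = (
    "good morning . good morning . good morning . "
    "good night . good morning ."
).split()

follows = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the word that most often followed `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(predict_next("good"))  # "morning" (it followed "good" 4 times out of 5)
```

The model never knows what "good morning" means; it only knows which word tends to come next.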
So yes, the furor around artificial intelligence right now is that it's finally mimicking
language, and it's allowing users, who before were really just consuming the final
product of software, the last prediction, the fifty-one minutes of Google Maps,
to now talk to a chatbot that is
responding with statistically likely things. And that
(18:06):
has caused everyone to turn their eyes over there, because it's
a novelty and it's very interesting and it's very cool, really,
and the amount of computation you need is very large. But artificial intelligence
is simple predictions, to put it that way. We don't have
to be afraid of it, because they're very simple predictions; it's
just doing a lot of simple predictions, very often,
(18:30):
which is what makes it seem like more. Besides, it's
doing a lot of data consumption.
Well then, Alex. Right now Chelsea
told us a little about why it's in such furor, but we're
going to bring it down to earth a bit more. You also work with young people, and
Alex has also been a person who has been very concerned about this whole issue
(18:52):
when we've talked, whenever these themes come up. But what do you
think about why it's in furor and out there everywhere? And
say it with emotion, because I want you to tell us
what the fuss is with this issue of
artificial intelligence, because Chelsea also told us how this already comes from the nineties
and we've always had it with us. I was just thinking, ah,
of course, when I was making the comparison with Google Maps, or
when people get in and it says you will spend twenty minutes, and one says
no way, and literally you don't spend twenty minutes, since it
already knows whether there is traffic or not according to the city where one
is. But the data also starts to be skewed. So I was saying:
if we have artificial intelligence in everything, even when you're writing, because it
(19:36):
is data, it is numbers; but for normal, common humans,
we think of it as them stealing our information. So, Alex,
why is this topic in such bloom? Well, I would also
like to talk about the role communications have played, and the role of
capitalism too, in the furor over this issue. Because, for example,
(19:56):
as Chelsea rightly says, this was already coming along quietly, but right
now it's a very popular word for selling a product. So you
have device launches, and right now Apple comes along
with its bet on artificial intelligence, which is somehow
pooling with OpenAI, the makers of ChatGPT. So it is being
(20:21):
used like a fashionable word. Even at the last launch of Google tools, journalists
I follow counted the times they said the word AI or artificial intelligence. So,
of course, in terms of sales and marketing it was extremely exploited. In fact,
the other day in Santa Marta I listened to a man who now sells with an artificial intelligence
(20:44):
pitch. So people have also played with this concept, with this imaginary, to be
able to sell an eye-catching product, no? But then there's
also another issue: how the companies that make these tools have
a lot of access to information. For example, Microsoft owns, I think,
forty percent of OpenAI, and I may be citing a number that could be
(21:04):
off, because all those stakes grow and shrink. But Microsoft has a lot
of power in OpenAI, the makers of ChatGPT, and
thanks to that Copilot was created. Then imagine Microsoft, which is the owner of Office, where
we process spreadsheets and PowerPoints, and apart from that we use Windows a lot. So
(21:25):
imagine all the information it has collected from you as a giant user base.
Now let's talk about Google. Google is a giant that has virtually
the whole Internet organized on its servers. So, Google also has its artificial intelligence,
which is called Gemini; rather, its language model, or as the
words go in English, LLMs, large language models.
(21:48):
So Google, with Gemini, is also doing awesome things. But
then there also comes the topic of how these tools get into other spaces
and how, for example, mistakes are made by these technologies, because
they are not thinking technologies, they are not conscious technologies; they are technologies
repeating a pattern of information that they have acquired from the databases. There are
(22:08):
quite a lot of scandals, like, for example, that OpenAI was
feeding on the information found in YouTube videos and YouTube didn't know it,
but it looks like it did. Then there are others who were feeding on
a large platform called Reddit. It is a platform of giant forums with
too much information, as much garbage as very useful instructions for things.
(22:29):
So they' re also feeding around, so this, these mambula machines are
feeding on all this data that humansgenerate, and so, they also see
the issue of racism a lot withouttouching it in this momentistic. But it
' s the issue of mistakes.Example, Hogle, their new launch to
also ride the races to be onpar with everyone, made a lot of
boom on the media internet. Whenthey finished it was that their artificial interest
(22:53):
was giving false predictions, such asthat geologists recommend eating a daily rock or,
for example, such absurd things.But of course it' s this
model repeating things that you' vebeen learning and it turned out that that
recommendation that gos say eating is adaily rock for health. That comes from
a man he published many years ago. No, or this is one of
(23:17):
those databases that they consume on platforms. So what it is is to replicate
all this crap without going in totalk about racism that is also there for
now and all those issues of howblack and relying people are being represented in
this system. But basically it hasalso been a boom since marketing to increase
sales and to show a novelty factorin good products dear hibernauts and we continue
(23:41):
in Traces of Africanness: ethnic territories building peace. Today we are talking about
artificial intelligence and how Black people are in the midst of this artificial world,
in the midst of all that intelligence. Chelsea, in the first part,
told us about what artificial intelligence is, and Alex told us why it is
in such furor, how that is related, and why we are talking about this issue.
(24:04):
When we started, you both talked about something also important: the algorithm.
So I will tell you that Traces of Africanness is a space
in which we try to talk in a very colloquial way, but we also
understand that to talk about the whole issue of artificial intelligence is to
talk about data science, as Chelsea was saying, and to talk
(24:25):
about mathematical, numerical and technological forms. So now, yes, Chelsea: tell us about
the algorithm. What is that famous algorithm that is watching us lately and
is leading us to see what we want to consume and how we want to
consume things? The algorithm that people complain about so much is, in simple words, a
recommendation system. There are two types of recommendation system: one that looks at
(24:51):
what you consume and recommends similar things, and another that looks at what you consume,
classifies you as a person, and recommends things that other people like you consume.
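The two styles of recommendation Chelsea distinguishes can be sketched like this. The catalog, tags and viewing histories below are invented toy examples, not any real platform's algorithm.

```python
# Sketch of the two recommendation styles described in the conversation:
# (1) content-based: recommend items similar to what you consumed;
# (2) collaborative: recommend what similar people consumed. Invented data.

catalog = {
    "Doc A": {"music", "history"},
    "Doc B": {"music", "dance"},
    "Doc C": {"cooking"},
}

def content_based(watched, catalog):
    """Recommend the unwatched item sharing the most tags with what you watched."""
    seen_tags = set().union(*(catalog[t] for t in watched))
    candidates = [t for t in catalog if t not in watched]
    return max(candidates, key=lambda t: len(catalog[t] & seen_tags))

def collaborative(user, histories):
    """Recommend something the most similar other user watched that `user` has not."""
    mine = histories[user]
    others = {u: h for u, h in histories.items() if u != user}
    peer = max(others, key=lambda u: len(others[u] & mine))
    return sorted(histories[peer] - mine)[0]

histories = {
    "mayo": {"Doc A"},
    "alex": {"Doc A", "Doc B"},
    "sam":  {"Doc C"},
}

print(content_based({"Doc A"}, catalog))  # Doc B shares the "music" tag
print(collaborative("mayo", histories))   # alex also watched Doc A, so: Doc B
```

The first function only compares items; the second first classifies you by comparing you to other people.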
Those are the two basic types. As Alex said earlier, the companies
that set the standards have been swallowing up other companies and diversifying their portfolios, and they already
(25:18):
have too much information from too many sources with which to classify you as a person.
So, for example, Meta has access not only to your Instagram profile but to
your Facebook profile and your WhatsApp conversations, and, partnering with Google, Google has
access to your search history, your Gmail email, your Google Maps location.
(25:49):
And apart from that, ninety percent of the population casually uses Chrome, and what Chrome
does is collect cookies: those pages you enter that say hello,
accept our cookies, and if you don't accept they don't let you in.
It's like, okay, I'm going to give you my
cookies. When you give it the cookies, it records: the user
(26:10):
of computer number five in Bogotá, Colombia, entered this page and stayed
five minutes. Then Google builds a profile of who you are: a person who
spends twenty thousand minutes reading about antiracism, fifteen minutes on Amazon, then Instagram, and
gives likes to three things from an Afro collective. And then she gets into WhatsApp
(26:30):
and has a conversation with Alex about anti-racism issues. And all
that information goes to your profile. So it says: Mayo is a person who
is interested in antiracism. Netflix buys that, and when you open Netflix, what is
the first thing you get on the front row? It's Black
Voices, and you go, wow, how do you know? So this
(26:51):
is all the companies putting together the bits of information they each know, generating
a profile, and saying: other people who are like this person consume
this, so let's show it to them, no? Yeah. Okay. Thank you, very well.
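The profile-building Chelsea walks through, minutes spent per topic stitched together from different apps into one profile, can be sketched as a simple aggregation. All the events below are invented for illustration.

```python
# Toy interest profile: merge browsing "events" from different apps and rank
# topics by attention, as in the example above. All events are invented.
from collections import Counter

events = [
    {"source": "browser", "topic": "antiracism",  "minutes": 20},
    {"source": "shop",    "topic": "cycling",     "minutes": 15},
    {"source": "social",  "topic": "antiracism",  "minutes": 12},
    {"source": "maps",    "topic": "restaurants", "minutes": 5},
]

def build_profile(events):
    """Sum minutes per topic across every source into one profile."""
    profile = Counter()
    for e in events:
        profile[e["topic"]] += e["minutes"]
    return profile

profile = build_profile(events)
print(profile.most_common(1)[0][0])  # "antiracism" is the top interest
```

Each company holds only some of the events; the power comes from merging them into one ranking.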
Yes, it saves me searching on Netflix. But right now, with what you
say and, as Alex said, capitalism has led us to a point where
everything is already buy, buy, buy, and you can't
(27:15):
practically escape it: on an Android cell phone the operating system is Google's. They also
have access to your microphone, they have access to your camera. So even if
you're just casually talking about your bike being damaged, Google tells
Meta that her bike is hurting, and Meta fills all your ads, and
you start seeing bikes everywhere. That
(27:37):
is the algorithm, which is basically twenty different algorithms coming together so that
you buy something. Is there a way to dodge the algorithm?
I think so, but well. In fact, with Chelsea we have talked about
the issue that we are missing something: let's say creating many
more networks between Black people, no? And I say that because
(28:00):
all of that also allows an exchange of information, of ways of doing things, so
that we can then hack the matrix, for example. I was thinking
about this subject, about this question. I was thinking about the book
Demando mi libertad, where those women learned to know the system and hacked it;
they did a lot of interesting things to seek their freedom and that of
(28:22):
their relatives. So we too are in a moment,
I think a very particular moment, where you can still
do something very transcendental, and it's how you can turn these algorithms
around for our benefit and convenience. I also see it very much from marketing,
for example, and how Black people are so resourceful and innovative in creating
(28:44):
products and services. So, how could I understand the algorithms and then give them
a turn so that we have much more exposure to our own products and services and
are able to consume them as well? But there is also an issue with
the accumulation of information, and it is that, of course, all the time they are
investigating us, all the time they are profiling us, but in the end
(29:10):
they also continue to be based on prejudices, on racism, in many things. And
we could also have, say, tools that allow us, from that
collected information, to understand more about what the dynamics of Black people on the Internet
are. There's one issue: of course, these
things require money. But I think that, more than money, they require minds that have
(29:32):
the ability to generate systems and servers and tools that can allow collectivizing a lot
of information. From there, well, right now, for example, I find
very interesting what Europe does. Europe has a lot of progress in
terms of data regulation, and the European citizen, because they have
access to information, knows their rights in some way, and they have
(29:53):
won a lot of things: for example, that companies must ask you,
may I look at your data for such and such a purpose?
I mean, does it suit you to allow me to
use this data for such a thing? And it is up to you whether you believe
and accept. So those are dynamics that have taken place in Europe, and it
is interesting to learn from that to generate other ways, because well,
(30:17):
the African continent, which is the young continent, which is in furor
and which is going to be the continent with the most projection (in fact, it already is,
because it is a continent where there is going to be
quite a lot of consumption), means the companies are also looking there. So we need to have
a step ahead, and also to have regulation, some
(30:37):
rules, strategies to hack the system in that sense. Thank you so
much. When we started this Traces of Africanness episode with the concern to talk about
artificial intelligence, it is because we, as Black people, are talking about
this issue and about how we are situated in the whole issue of artificial intelligence.
Here's a question for Chelsea: how do you think the absence
(30:59):
of Black people in the development of AI affects the results
and ethics of certain technological products, including what we consume and the answers we get?
Because, to get you talking, I went into ChatGPT
for a bit and asked it about Black people, Black communities, not just in Colombia but in
(31:19):
the world. I asked many questions, and many answers had words
like marginal. They are very biased things, and I said, that's how
they see us, and when people use ChatGPT to do their research,
even great academics, they continue to perpetuate and repeat certain words that
(31:42):
this kind of artificial intelligence spits out at them. A maxim that is used in
artificial intelligence is garbage in, garbage out: garbage that
comes in, garbage that comes out. That means, to put it plainly, a
model of artificial intelligence receives numbers and delivers a number. If the
(32:04):
numbers go in biased, there is no way the number comes out unbiased.
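The garbage in, garbage out maxim can be seen in a toy model that just counts. If the training data pairs a community mostly with a loaded word, the model's most likely output reproduces that bias; nothing in the math corrects it. The corpus here is invented for illustration.

```python
# "Garbage in, garbage out": a toy majority-label model trained on biased,
# invented data. The model has no opinion; it replays the skew of its data.
from collections import Counter

biased_corpus = [
    ("community X", "marginal"),
    ("community X", "marginal"),
    ("community X", "resilient"),
]

def most_likely_word(topic, corpus):
    """Return the word most often paired with `topic` in the training data."""
    words = [w for t, w in corpus if t == topic]
    return Counter(words).most_common(1)[0][0]

# Two out of three training examples say "marginal", so that is the output.
print(most_likely_word("community X", biased_corpus))  # "marginal"
```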
And that's the problem with the models currently in existence, from ChatGPT to computer
vision models, which are models that receive images as input and deliver an image
to you. Absolutely all of them were trained with biased data, because all the data
(32:30):
that exists is biased. I mean, just as you can't talk
about a person who grows up in this society and isn't racist, because of
how society raises us (even Black people, we have an internalized racism),
likewise, all the data that was given to the model is going to have that
bias. So ChatGPT was trained with data scraped openly from the Internet;
(32:52):
they didn't even do an initial curation, and
that's giving them a problem: some of the data they fed it
was already generated by another artificial intelligence. So it's
generating a cycle of false information, and in the end it's delivering, as we
said, a word or phrase that's statistically likely to
(33:15):
make sense. It doesn't know if the phrase is good, if
the phrase is bad, or if the phrase is true. It only knows that when
a person reads the phrase it will make sense. So it's
not going to give you a mix of adverbs and weird adjectives, but
it can give you something that's categorically false, only that it sounds
real, or something that's racist or homophobic,
(33:37):
because it read it. Take the example that Alex mentioned just now.
It is an example that was talked about a lot, because Google
said, let's train with Reddit's data, and paid Reddit
a million dollars for information that all of us built for free, and all
the people on the network complained, but anyway the deal happened. And what about Reddit?
(34:00):
Reddit is a space that has historically been a space for white, heterosexual
men with certain social resentments. So there are answers that are very edgy, so
to speak. A very famous question in that case was something like:
I'm depressed, what do I do? And the answer was: throw
yourself off a balcony. So many people had said throw
(34:23):
yourself off a balcony on Reddit that it seemed statistically relevant to the model, and that
is what it delivered. So, how does that apply to Black people in language models? Like
you say: how we talk about ourselves, and how we have always
been talked about, is how the model is going to keep talking about us:
that we're marginal, that we're less capable, that
(34:44):
we're historically impoverished. That's what the model is
going to deliver, because that's what it found about Black people in
the texts they gave it to train on. But there are other cases of discrimination
that have also been widely publicized, and they are mainly cases of artificial vision. Artificial vision is a branch of artificial intelligence in which you give the model an image and it turns it into numbers. Each data point, each
(35:07):
pixel of an image, is actually a combination of three numbers: how much red, how much green, and how much blue it has. With those three numbers you can generate a pigment. All digital images are three numbers per pixel.
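A minimal sketch of that idea in Python, using plain lists rather than an image library; the sample pixel values are made up:

```python
# A 2x2 "image": each pixel is (red, green, blue), each channel 0-255.
image = [
    [(255, 0, 0), (0, 255, 0)],   # a red pixel, a green pixel
    [(0, 0, 255), (30, 26, 20)],  # a blue pixel, a dark brown skin tone
]

def brightness(pixel):
    """Average the three channel numbers into one brightness value."""
    r, g, b = pixel
    return (r + g + b) / 3

# A model only ever sees these numbers. If pixels like (30, 26, 20)
# never appear in its training data, it has no concept of them.
for row in image:
    for pixel in row:
        print(pixel, round(brightness(pixel), 1))
```

Everything the model "knows" about an image is derived from these triplets of numbers, which is why missing skin tones in the training set translate directly into blind spots.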
It turns images into numbers, and those numbers are used to make a prediction about something. So, a very publicized case was someone who built a
(35:31):
model to recognize the people in the office so they could enter without a badge, and in the entire development of the model they never used a photo of a Black person. Of course, when Black people went into the office, it told them they didn't work there. Then you say, no, it's a minimal
(35:52):
problem, blah blah blah, but it is something that speaks of an unconscious discrimination by the developer of the model, who never thought of Black people at the company, so the model never understood that Black skin exists. Or, for example, at Amazon they had models to do a screening of résumés, and those models were rejecting women and minorities. They said it must be the name, so they took the name off.
(36:15):
Let's say they changed what they gave the model from the résumé, and it kept rejecting women and minorities. In the end they realized that there were activities so correlated with women and minorities, nothing more than playing volleyball or saying that you studied at such-and-such a university, that the model would
(36:36):
reject you, because the people who already worked at Amazon were mostly neither women nor minorities, so the good examples of who should work there did not include them. The fact of not having diverse people, the fact that you don't have Black people on development teams, creates blind spots in what the developer and the development team believe is necessary. And there are some problems that,
(37:00):
let's say, will never affect mestizo people, so they are never going to think "this is a model we should make"; the idea doesn't even occur to you. So there are a lot of detection models for dermatological problems, and that has been a boom in medicine, because this used to be something that took three tests and drawing blood, and, let's say, taking pictures of
(37:22):
the lesions on the skin can help a lot in the diagnosis. And they also realized that this worked wonderfully for white people, but a Black person took a picture and the model couldn't say anything at all. The
model said no, because it was never trained with Black skin. So that is the kind of thing that having diverse development teams would
(37:45):
catch before the model goes to production. Right now they are delivering things to production and selling them for millions of dollars, and if there is not a Black person anywhere in the process, they are selling flawed products. Hearing all of this, I am left thinking about what is going on: why aren't Black people there? We have a structural racism, and we also have
(38:07):
barriers to accessing certain technological fields, because we have been steered toward other types of careers, or we have even been made to believe that we cannot be in mathematics, in the whole field of science, STEM, which is sometimes what we fear most. So that means there is a lot of bias and we are not there. But when I thought about it, one says, maybe it can
(38:28):
be in Colombia. But this issue of artificial intelligence goes beyond Colombia; it is a global thing. So it is at a global level that we are having problems of not being represented, because many datasets are taken from the United States, from African Americans, and even so there is little representation of African American people. Or maybe there is,
(38:50):
because, for example, for me, meeting Chelsea was like: wow, a person who talks about data science and is a woman and is Black; that is even a foundation for this program. So, Ale, how do you think the representation and inclusion of Black people can influence innovation and the development of more equitable technologies? I think the way in is not
(39:12):
somehow looking for what is good for us. I hate that word, inclusion, because inclusion also means being given access to a racist space where one does not want to be, and I believe inclusion leads one there. So I think the task is to begin to collectivize these conversations more among Black people, and also
(39:37):
to talk more about this with everyday Black people who are not, let's say, as connected to these trends, and to start normalizing a conversation that is already happening. Let's say, from the experience of the Casa Cultural El Chontaduro, there has been a very interesting development: the idea that technology is already here is being discussed much
(39:58):
more. We have to start applying it, and we are also adjusting it to the organization's own dynamics and adapting things step by step. There is even an interest among the new generations in knowing more about this kind of thing. So I think the task is to communicate more and to make ourselves known, to show ourselves, because representation begins with our community being recognized, and
(40:23):
then to create. I also think we need stronger groups and collectives that start to advocate on the subject. For example, of course, the influences we have are very African-American. I, for example, follow Ruha Benjamin. She founded an organization called the Just Data Lab, which is, translating it loosely, a
(40:44):
data lab, and she is in charge of talking about how algorithms incite racial discrimination and of pushing for policies around that, at the least. There is also, for example, Latanya Sweeney, who is at Harvard and works on, let's say, online
(41:07):
discrimination in algorithms, and is the director of the Data Privacy Lab, which is a data privacy laboratory. Also Joy Buolamwini, a woman who has been called the poet of code; she founded the Algorithmic Justice League and also advocates on the issue of racial discrimination on the Internet and in software development. And
(41:31):
there is the case of Timnit Gebru, who was a Google employee hired to work on ethics in artificial intelligence. She wrote a report saying that Google was being quite unethical, and they let her go. So, then, let's get to know these kinds of profiles, meet Chelsea, who is here and can talk to, say, the girl from the project who wants
(41:53):
to study software and video game development, and start making ourselves more visible and recognized. I think the main task is that, because, anyway, companies always apply band-aid solutions. I remember when people started criticizing the photography issue, and Google hired some organizations that work on the
(42:14):
skin tones of Black people so they could then build them into their new cameras with artificial intelligence, the Pixel 6 or 7, something like that, that Google phone, so that Google and its cameras could finally see the different skin tones of Black people. So how do they keep treating these things
(42:36):
as conjunctural fixes while capitalism moves its money along, but they are not solving the deep issue, which is also how they evaluate their employees, how familiar their employees are with the eradication of racism. We also understand that in the United States the conversations about racism, what they call critical race theory, are being banned; it is basically talking about racism and racial theory.
(43:00):
It is being banned in many parts of the United States; they have even banned books at colleges and universities. So I think the task is rather to do it from here, where I think we have a lot of potential and a lot of minds, because industry is not going to do it anyway. And the other thing is that these tools that are already very popular also already have too much power, too much capital, and
(43:22):
we are at a moment when, before they swallow everything, we could also start building and generating things: looking for funds, looking for cooperation with Black people who are already involved in certain powerful things, in order to generate artificial intelligence tools for communities. A curious little fact is that,
(43:42):
right now there is a lawsuit from that gentleman Musk against OpenAI, because OpenAI was supposed to be free; that is why it had the word "open," but OpenAI is making money, so you can see how it lost its philosophy. So I still think there is time to start evaluating a lot of things, talking about it, talking about it also in a language that people understand, to show the community what we need to show,
(44:07):
and to normalize that technology is here and that we have been part of it. How do we begin to take ownership of that technology, to begin to have an impact on all this artificial intelligence? Because just now Chelsea said
that, according to what we consume, some numbers get passed along. So we consume, and that becomes numbers; that is, it gets in there,
(44:28):
and it bounces back into what we are given back to consume. And since, while we are not inside this great sphere of the data world, we are all generators of data: what can be done as a Black community to generate data that gets in there, or how are we going to achieve that influence? There has to be some way, that is, some way to start contributing our grain of sand to change these systems. Well,
(44:51):
I think my answer to that question is very sad, and it is: are we going to get out of capitalism? Right. So I think what we could do as Black communities is to start creating and consuming Black products, because, well, at the end of the day
(45:15):
we already need Google to function. I mean, one no longer exists without a Gmail. There are many companies whose entire commercial activity happens on Instagram profiles; there are things you can no longer find anywhere but Instagram. I mean, I have Instagram just because one day I went to a restaurant and
(45:37):
the menu was on Instagram. It is impressive. So I think that, as with everything else, taking artificial intelligence as the occasion: if one can prioritize consuming products by Black people, things where one knows who made them, with local love, excellent. And if, as a
(45:57):
technology industry, we one day have the capacity to set up our own companies, set up our own social networks, and assemble our own digital products, let us hope we will have the community's welcome and support, over products where you never know where they come from or who is handling our data. That is the dream
(46:19):
and it is still a very capitalist dream. It is really hard for me to imagine how we would get out of that system, because right now almost everything is free, in the sense that one uses Instagram for free, Facebook for free, WhatsApp for free. Everything you sign up for is usually free. Maybe there is a paid version, but usually one uses it for free, and the cost is your data. The
(46:44):
cost is what you are giving them so they can feed it into other models. You are paying in data. And besides that, there is a tangible, real cost that people do not consider, and that is the energy consumption and the water consumption required by the data centers doing these calculations.
(47:04):
When ChatGPT appeared, it was being reported that for every question, and this is a number I am not very sure about, something like five bottles of water were spent. That was the number they calculated, so to speak. And that water is used in the cooling of the data centers. So people, because they have the product on the computer or on the
(47:28):
cell phone, never grasp everything that is required for the product to be working. So that is why I find it hard to imagine a cessation of consumption, because these are free products, and taking free things away from people is super difficult. I think what we can
(47:49):
do is, as Black people, start generating products that compete with them, try to do it ethically and sustainably, and hope that Black people, when they see that a product carries that social stamp, switch over to it. Yeah. I really like what Chelsea says, because I was thinking about how organizations, say the PCM, the Chontaduro, and so on, to name some that
(48:10):
already have agency, that already have influence and are known, could start in this way to make technology our own, and from there interesting projects could come, such as our own technology projects, robotics projects, software development projects, for example. I really like what the robotics school of Chocó has
(48:34):
been doing with these topics, those bootcamps for girls in software development. And there is also one interesting thing, which is that we have to keep talking about racism all the time, because racism is mutating all the time. And look, what we are
talking about is mutated racism, artificialintelligence, and we' re not understanding
(48:57):
that dimension. So continuing to talkabout racism is important to get to know
these new dynamics and so on,so also get into playing with it.
For example, I like what isbeing done by eoa children of African descent
with Dora and it is publishing imagegalleries with a smiling Afro child for example,
and of course don' t Googleblack boys, Afro girls something.
(49:21):
Thus, it will be based onthose roughest images shown by these international cooperation
companies. No, and as forexample, it unites that in the images
of porn, misery, and childhoodafror x that you say as seriously.
So, there is one thing called tags, right? When you upload an image to the Internet, to your Instagram, to your Facebook,
(49:45):
to your web page, look into the options of the image and add tags that are positive when the image is positive. For example: Afro-Colombian children from eastern Cali smiling in the parking lot, for example.
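On a web page, that kind of description goes in the image's `alt` attribute and in its tags, which is what search crawlers and image datasets actually read. A minimal sketch, with a hypothetical file name:

```html
<!-- The alt text is what crawlers and image datasets index. -->
<img src="ninos-cali.jpg"
     alt="Afro-Colombian children from eastern Cali smiling">
```

Every image uploaded with a positive, accurate description like this is one more data point pushing against the older, harsher imagery.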
That starts feeding these databases in a more positive way, toward other narratives. And we can also play the game in our own ways. For example, in these artificial intelligence applications there is an option
(50:08):
to give feedback: if you do not like the answer, mark that you do not like it and why you do not like it, hopefully in as much detail as possible, and likewise mark when you do like an answer. Those are also ways of pushing back on the system, of making these claims, of driving this evolution, and of course
(50:32):
it is painfully up to us to do the homework. But I also really like the subject of electronic reuse, because, to give examples, in Africa there are quite a few electronic waste recycling projects, and this has allowed people to have access to information technology, for example by collecting computers from companies that discard them because they are two years old.
(50:53):
Then they replace them. One could collect that equipment, recondition it, and set it up with free software like Linux, plus editing tools and design tools that are free, and start to understand what it means to use these free tools. As the compañera says, they are free; but then there is no capitalist issue here, there is a matter of cooperation. If I have
(51:16):
a group of software developers with whom we run some programming logic workshops, we are talking about reused equipment with Linux, and we know that, among the people who already know the topic, we can make a contribution to that tool that a community needs. I think we should also do it and present ourselves by saying where we come from: we are a community of Black
(51:37):
developers from Colombia, and we come to make this contribution, this tool that has made access to things easier. That, for example, is starting to collectivize activism in technology much more and to provide a bit of counterweight to all these dynamics whose effect is to divide us, destroy us, and confuse us. For example, I like this: right now I am
(52:01):
going to gather everything we have been talking about into one sentence, and it is collectivizing technology. As Black people we are very collective subjects, so technology has to be talked about, and we have to think and process collectively. And a while ago I was asking you how we hack it, how we intervene. That gives us
(52:23):
some ideas, even if, as was said, it is very difficult to do, and right now I will come back and pick that idea up. Speaking of having to speak positively: artificial intelligence and the algorithms and all that data are so used to speaking negatively about racialized people in the world, that is what they pull from. So the message you are giving is:
(52:46):
let us start talking about everything positive about Black people, let us start talking about the great contributions Black people have made to building the world, and about what those messages say. I think we are sometimes very lazy, because even though we are on the subject of how everything is
(53:06):
biased all the time, we are too lazy to give the feedback that is needed: I do not like your answer, and you give it an X, and it comes back and answers again, and again, and again, until it reaches the answer you will accept, but you never tell it why, and there you have a problem with that ethics issue. When we
started talking about artificial intelligence and I started looking around, in many podcasts and
(53:29):
so on, what people are most concerned about with artificial intelligence is ethics. In the end it is a tool, and how it is used depends on who is using it. That has always been the case: it can become a tool for racism, or for sexism, or
(53:50):
for discrimination, or a positive one. It is important that they regulate it. I mean, I think there are always going to be people who misuse it, so it is very important that there is regulation, even just to orient things. Yesterday Megan Thee Stallion, the famous American rapper, had a deepfake made of her as a sex video, and she was, let's say,
(54:14):
complaining, because since this is not regulated, there is nothing to stop it. That is sexual abuse, or at a minimum it is a sexual offense, and there is no one to stop it, because the crime uses her face, but no one knows who uploaded it to the Internet. So, on ethics: hopefully each of us, when interacting and doing things with these tools, stops and wonders: what am I
(54:35):
doing? Who am I going to hurt by doing this? Who am I misrepresenting? Who am I going to injure? And globally,
as I have read, Alex, the European Union is setting the pace on regulation. They have had good standards on mass surveillance and on regulation of data use, and generally, as usual, we play the game of what they decide. I
(55:00):
wish the Colombian government would stick to the good decisions; the U.S. government, let's say, is a little more influenced by what the corporations want, because they are based in that country and the money flows in to them. So we are waiting for the European Union to complete the regulation of these generative models, like
(55:21):
ChatGPT and Midjourney, the ones that generate images, and for those decisions someday to reach the rest of the countries, the rest of the planet, hopefully. Though in Colombia, as you say, which is already in the process, that regulation may arrive even faster than racial regulation. Let's see what happens with that. And you, Alexander? Well, I would also say, on
(55:45):
regulation, I do not remember the details very well, but
I noticed that there were some beginnings of talk about regulation, let's say at a global level. There were even people from the African continent, although in that case they represented those big corporations. But I
(56:05):
will say that at least they were already there. And yes, that is important. Regulation is also about being able to put limits on certain things. It is complex, but we are also not inside these circles of power where these decisions are made, so you have to stay attentive to the information the news shows. I think
(56:28):
that there is also the issue of ethics in the sense that we must be responsible and self-critical about the information we consume on the Internet; there is too much junk and so much false information, created even with artificial intelligence, that then feeds artificial intelligence. So I think it is also a responsibility, a
(56:49):
self-criticism, to question what we are consuming and to look for other sources as well. And I think there is also a very particular issue with the arts, with people in the artistic world who are somewhat affected, some seeing a threat and some seeing many opportunities. The ethical plane of artificial intelligence there is
(57:13):
very interesting and complex, for example, regarding voices. That came a
long time ago, too. For example, Amazon put celebrity voices on Alexa. But right now, for example, there is the case of Scarlett Johansson: she was asked to do the voice for the new OpenAI update, and she said no. However, they hired someone with a very similar voice and launched it anyway, and in the end it becomes a kind of plagiarism. But legally it was not her voice,
(57:36):
because it was not her. So then what? But yes, it is a very particular subject. And Microsoft, for example, right now with
its Copilot, which runs on OpenAI's GPT underneath, is doing a feature called Recall, which records your computer interaction every five seconds. So you can go back in time, or ask particular questions, like: remember that pizza promotion that looked good? And it
(58:00):
will tell you the date that pizza deal passed across your computer. But it could also capture passwords, sensitive phrases, private information. So how are they also going to handle that, when Windows is a massive operating system used by millions of users? So it is interesting, and I invite you to follow the media that make these observations sensibly; many, granted,
(58:22):
in English. So, well, dear cybernauts, we have reached the end of African Footprints, ethnic territories building peace. Today we were accompanied by Alexánder Paz García and Chelsea Ximena, two people who spoke to us about artificial intelligence, about algorithms, about data science, and about what is happening with
(58:43):
Black people, a topic of which this is barely the tip of the iceberg. Thank you very much. So we have reached the end of African Footprints, ethnic territories building peace. On the other side of the world there is a continent with fifty-four countries inhabited by the Balanta, Cueno, Carabalí,
(59:06):
Lucumí, Arará, Ashanti, and many more peoples: warriors, thinkers, farmers, miners, ranchers, and fishermen. They are present in Colombia today. If you use words like chévere or ñame to speak, you are rescuing the oral tradition
(59:30):
of Africanía. If you prefer salsa, vallenato, or reggae when you dance, you evoke the rhythms of Africanía. If as a child you heard the myths and legends of the Madremonte, the Tunda, the Madre de Agua, and the Llorona, you are remembering the stories of Africa. URosario Radio,
(59:50):
in association with the National Conference of Afro-Colombian Organizations, CNOA, presented African Footprints: ethnic territories building peace, traces of Africanía, traces of Africa in Colombia.