Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Will Lucas here on Black Tech, Green Money, and I have.
Speaker 2 (00:03):
A special, special episode. So I used to do a different podcast called Of Ten Podcasts, which I told you guys about.
Speaker 1 (00:10):
On the show before.
Speaker 2 (00:11):
And one of the people who was on season two, which would have been like twenty eighteen, twenty seventeen, was my guest today, Baratunde Thurston. He's a storyteller, Emmy-nominated host, writer, producer, public speaker. Most recently, his work explores AI, nature, and humanity. Welcome, Baratunde. Brother.
Speaker 3 (00:32):
Will, good to be back. It's a reunion, man.
Speaker 2 (00:35):
Absolutely, I'm so excited for this, because you gave probably one of my favorite talks at this past year's AfroTech, and I'm like, we gotta get him back.
Speaker 3 (00:43):
On this show. Thank you, man. And thank you, that was a special gathering there. Yeah.
Speaker 2 (00:47):
So, actually, before I even get into my questions, like, when you go to prepare for that audience — you know you're talking to, you know, tens of thousands of Black people all pushing to build the future, be part of the future, figure out what the future's gonna hold — what is that? What does your spirit say you need to bring to this audience?
Speaker 3 (01:08):
That's a good question. And I think, you know, I do a lot of talking. It's literally my job to put words together, to put them on a page or a stage. And for that gathering, I knew I had, like, the spine set. There is a perspective that I'm trying to bring to the conversation about tech
(01:29):
and AI today, which is, how do we live well with this tech and not just survive it and endure it? I think feeling into the Blackness of it, it's like, how can I be even more relaxed? How can I breathe and feel? And, I don't know, it's like,
(01:50):
what's the cookout energy I could bring? What's the spades-table kind of energy, where it's like, okay, it's us, you know, talking to us? And I think the tone was set really well, because when I walked into that convention center, it's just like, it's so Black. Jidenna went right before me. That set a tone. And just walking out and feeling it,
(02:13):
it's like, oh, this is family. So we could have a family conversation, right? And we can be really like, how can I amp up the level of realness and playfulness and just say what we know to be true in this moment, whether it's about the political situation, about the perils of AI, or about the
(02:33):
promise to try to build something different this time around.
Speaker 2 (02:36):
Yeah, so I think about — and again, this is not even part of my questions, I'm just gonna talk for a second. Yeah. So I remember the part of your talk where you're talking about how your mother — I forgot how you phrased it — like she, you know, creatively, you know, escorted out a computer.
Speaker 1 (02:50):
From the way you said it.
Speaker 3 (02:52):
I said she leveraged excess inventory — liberated excess inventory.
Speaker 1 (03:00):
I like that. I like that.
Speaker 2 (03:01):
And so, because I think about, like, the waves of industries that specifically young Black people want to be in.
Speaker 1 (03:11):
Oh, in like generations.
Speaker 2 (03:13):
It's like, when I was, you know, coming up, like, we wanted to be in the music business. And then there was like this — you know, I may skip over some — but then it was like everybody wanted to be in media. Then everybody wanted to be, you know, in technology. And now everybody wants to be not in technology anymore, because they realized technology is really just media companies and, you know.
Speaker 1 (03:31):
Social media. Think about that.
Speaker 2 (03:33):
And so, when you were coming up, I imagine you probably didn't necessarily want to be, like, in tech. But like, what did you want to be in, and how did you find your way telling these stories?
Speaker 3 (03:45):
When I was very young, I wanted to be a gardener. I really liked the idea of playing in dirt. Surprise, surprise, little boy playing in dirt. And I had worked in a community garden. I grew up in DC, Chocolate City, and I was a member of the little community garden.
(04:06):
And it's so satisfying to pull something out of the earth that is different than what you put in the earth. It's a different type of investing. You're, like, watching something grow that's not in an account. It's in your hands, and then you could put it in your body, and the deliciousness — like, I can still taste the radish.
(04:28):
I can taste the first radish I ever grew, and mustard greens, spicy. And it was just like — I was so small, and to be able to contribute. I think this is probably clearer now than it was at the time, but my mother was doing a lot of work. She was raising me and my sister on her own. We were entering
(04:48):
levels of the crack wars and all the nonsense that has, you know, hit a lot of our communities and still does in so many ways. And so there was also a sense of witnessing how much she was investing and sacrificing, and being able to add to that. So I was like, yeah, I got you. I gotta be a gardener, you know. That was it. That was very obvious to me. And I get to be dirty,
(05:10):
and no one can be like, you're dirty. I'm like, it's my job.
Speaker 1 (05:12):
It's your job.
Speaker 3 (05:13):
I'm actually working here, ma'am, sir. So from that to this — quite a journey, quite a journey. I knew that I liked mouthing off. I knew that I had, like, a little gift of gab, also because of that same mother. My mom had a shorter temper than I did, and
(05:35):
she had kind of a higher standard for humanity, I think, than I did, but mixed with lower tolerance, which is like a hard place to be, especially if you're doing a lot of this parenting work alone. So she had trust issues based on her childhood and a lot of things. She was violated in various ways by society, but also by her own family. And so I
(05:58):
was often deployed to communicate: this neighbor did something I don't like; if I go over there, it's probably gonna get worse; can you write a letter? Can you make a call? Can you...? And so I started being deployed almost like a diplomat, which is a lot of work. So it's a mix of, like, her self-awareness and
(06:20):
also maybe some premature assignment of responsibility through this child to interact in the world of adults and in the world of language and communication. But I was good at it, and I could feel the response that people had to me relative to her, and I think that probably helped set me even more on the path of all this
(06:41):
narrative and storytelling and performing and mouthing-off work that I do.
Speaker 2 (06:47):
What is your perspective on how academic institutions should embrace AI for education and kids' learning? Because I imagine, you know, some are in the category of, we just don't want you using it at all. Others are like, you know, we're okay, but these are the parameters. And
(07:08):
it's like, I don't necessarily need to learn, you know, addition — I can have a calculator. And when I grew up, it was like, what if you don't have it available?
Speaker 1 (07:18):
What? It goes everywhere with
Speaker 2 (07:19):
Me, you know. And so, like, that day that they imagined has never happened. It's never arrived. And so what is your take on how parents and educators should embrace AI?
Speaker 3 (07:35):
That's a big one. We're actually, for Life with Machines, in the midst of making a miniseries about youth and AI. Our next episode — I just recorded the voiceover yesterday — is about education in particular, and it's really fraught. And so I would start with a lot of empathy for parents, for teachers, and for students, because everybody's wrestling
(08:00):
with choices that none of them made. We've all been forced to react to the choices of a very small group of people that have unleashed something that is affecting all of our lives, well beyond technology. And so, if anybody's feeling overwhelmed by it or inadequate to the task,
(08:21):
that's not on you, and you're not alone. And so we all need to give ourselves some grace and some forgiveness for not having figured it out. What we're facing is absolutely ridiculous. I think that there's something in the middle where we need to land here. Outright banning these technologies and tools does not feel realistic to me. I
(08:43):
think the radical embrace of just, like, give every kid access to an AI is highly irresponsible. These tools are not intelligent. They are — you know, the large language model editions of them, I should be more specific — statistical models that are predicting math and converting that
(09:04):
math into language or images, et cetera. Very convincing, often useful,
not intelligent, and there's a lot of risk built up
in them. So, what have I seen that works? I heard Clay Shirky — he runs AI and tech education for New York University — and he was saying, you know,
(09:25):
if you measure a tire company, you're going to look for them to produce more tires as a measure of success. And so you add technologies, you get more tires — like, you're winning. If you measure a history department, you might be tempted to measure production of history papers as a measure of success. But educational institutions should not be
(09:46):
in the business of, like, producing history papers. They should be in the business, and really the civic mission, of producing historians and people who understand history. And so I
would advise people to think not just about the output and generating high scores, but about thought process, about community membership, about self-awareness and introspection.
(10:09):
What are we actually trying to create here? If we're trying to create little robots, then keep it up. And talking to another one — Avi Folek, she's got a radical new school concept called the Flight School, and it's really just about self-discovery and mentorship and community, these things that could be a bit harder to train a model to replicate. And so, with the young people, say:
(10:32):
what kind of humans are we trying to create and raise and grow? And then let's work backwards from that. Okay, well,
what kind of educational system do we need to put
in place to get that output? And I think that
will get us off the path of efficiency scoring. How
fast can a kid regurgitate a data point? A human
(10:57):
will never win that race, and we should concede that now and tap out, right? I'm not even going to try to play that game. That game is unworthy of me as a spiritual being, you know, me as a member of the community of life. And I think it gets real philosophical and moral. Are we going to define ourselves by our output, whether it's financial
(11:22):
or textual? How many tokens did you generate today, Will? You're gonna lose. You're already done lost. Like, I've lost by even asking you that, because my expectations of you are machine-like, as opposed to, like: what did you learn about yourself today, Will? What did you learn about, or how did you contribute to, your loved ones today?
(11:45):
That — that's you. ChatGPT ain't got nothing on that.
Speaker 2 (11:48):
Yeah, yeah. There's so many places I want to go with this, but I'm interested in your take, because you think about humanity a lot. You think about technology and AI as it relates to humanity, and how humanity relates to AI. And, you know, I wonder — I'm gonna explore this while I'm saying it,
(12:11):
so stay with me. I think about how we could get to a day where AI has rights, robots attaining rights. Because, you know, people today would just say, you know, if it goes haywire, just unplug it. And I don't know if that works, because number one, like, it's bigger than that. But more so, like, it's not like you can just
(12:33):
unplug a human, you know. So while they're not biological beings, there is something there. And I'm trying to formulate, like, what is the line that we could approach? You know, Xia — I don't know if you know Xia — so Xia was
(12:57):
talking to somebody, and she was saying how there is something more spiritual that we are not including when we talk about how robots can take over humans. You know, we haven't figured out how the nose works, she was talking about. And she was like, this is more than just biology — there's something spiritual there. And I just wonder what you take from all that stuff.
Speaker 3 (13:14):
You just said — that's a bunch. Let me offer something on the previous one before we get to the big spiritual, existential question
Speaker 1 (13:26):
Of what is life.
Speaker 3 (13:29):
In the educational context, there's a value for students who need a different learning style, or who have language skills that are different from their teacher's or the raw material — AI can really help with that. There's a lack of judgment in the interaction that I know a lot of young people find helpful, and humans often don't
(13:49):
know how to check that part of ourselves when we're dealing with each other. That can be really useful. And I've seen teachers find ways to create more adaptive lessons with these tools that help them reach their students and make something relevant. So I just want to name that there is a positive opportunity here, while also challenging the
(14:11):
assumption that everything's got to be AI. And we also need to get very clear on our definitions, because the large language model version of what AI is, is a really slippery, nonspecific slope that we end up on, with all the hallucinations and whatnot. The fact that these
(14:32):
tools announce constantly, like, "I'm reasoning, I'm thinking now" — it's like the lady doth protest too much, bro. Like, that's not how thinking works. Like, I've been thinking with you this whole time. I'm not just, "Will, now I'm thinking, now I'm processing." If you're overdoing it, you're trying too hard, and you're trying to convince yourself
(14:54):
as much as convince us — you know, the people who built these things — that this is a truly thinking machine. On this big question of — and Xia, how do you pronounce that name? Xia. Thank you. I think that we
have a lot of work that we could do to
(15:15):
honor the rights of the already living: the already living humans, and the already living plants and animals and amoebas, and all of the life that's already here. And that's an ancient way of being that we have forgotten.
(15:36):
But it's not just that we have like passively forgotten.
There's a whole economic, dominant model that has severed us
from this earth and allowed us to not see the
dignity in all life. And so the AI moment is an ironic one, because you see people skipping ahead to, like, wait, we're gonna have to be respecting the rights of these machines.
(15:59):
And part of me invites it, because I'm like, well, yeah, everything is just energy at the end of the day. Even those machines are just resonating at a frequency not that much different from us. We're all products of the big bang.
Speaker 2 (16:12):
That was my question I was trying to figure out — you just really asked it better than I said it.
Speaker 3 (16:16):
Yes, but we're both thinking out loud here. Look at it — we should have a little transcript of our thought process, a chain of thought, just to prove that we're sentient. So I want to see the rise of these machines as an invitation to respect all life.
(16:37):
I don't think these machines are alive, but I do acknowledge that they are part of the same energy field that everything that's alive is also a part of. And I don't know what that distinction is, or if it's just, like, a fuzzy gray zone between what I feel comfortable with and what I feel uncomfortable with. There —
(16:58):
I have a problem with the leap. So on this idea that we are more than — we don't know how the nose works or the ear — I think we've been hoodwinked to a great extent by the language of those selling us AI. Like, let me just
(17:19):
I gotta — I want to repeat as much as possible that there's people and organizations that need us to buy into their vision, because they have spent trillions of dollars on their vision, and people want their money back. So this is not a civic enterprise. This is not a charitable cause. This is not a nonprofit, human rights, spiritual thing even.
(17:43):
This is a financial driver. And then it's getting wrapped up in all this other language about human potential and growth. Some of that could be true, but the primary driver is they need to get their money back, and in service of that, there has been some sloppiness with the language.
You know, neural networks, the AI works like the human brain.
(18:08):
This is not a true statement. This is not a
true statement. And I think — I'm not even that deep in the tech, but I'm closer than the average person — because you use the word neural does not make it like our brain. It is a radical simplification, the relationship between these digital neurons and the
(18:29):
ones that are actually in our bodies, and the models that we have versus the models that they have. Just because you use the word model doesn't
mean it's the same thing. And so our minds are
more than neurological connections. The way we learn has to
do with being embodied. We are relational, we are social.
(18:50):
We literally lose our minds without connection. And the premise
of so much of this tech is: be alone and rely on us instead, rely on our models, have bots. They say you don't need a person — you don't need to hire people, you don't need to talk to people, you don't need to build with people, you don't need to be in love with people, you don't need to be friends with people. Right? Just choose the machine versions of all
(19:10):
those things, which serves the people selling the machine versions
of all those things. So the robot rights conversation to
me cannot be separated from the big financial and economic
incentives of those pushing the robots. While also everything is
light and everything is energy, so I can't fully dismiss it,
(19:32):
but I want us to get to that kind of universal respect for life in a high-integrity way, in a way that we're choosing, rather than being backed into a corner by it from a real narrow definition of life, which is this kind of centrally controlled set
(19:55):
of agents and robots that serve a profit motive. That — that's not what life is, bro. Like, we're mixing too many things and calling them the same thing.
Speaker 1 (20:07):
That's really good. That's really good.
Speaker 2 (20:09):
So I know you have a take on this, and
I want to set a foundation here.
Speaker 1 (20:13):
So I've heard people like Bill Gates,
Speaker 2 (20:15):
Satya Nadella, Sam Altman say, you know, well, what happens when I don't have to do my work anymore? Oh, I'll have more time to paint. And what I hear from that is, you won't have a job — because you got to read between the lines there. Yeah. So, and I know you have a take on this, and
(20:36):
I want to give you, like, one more layer. I'm cautious when I talk like this, because it sounds like I'm, like, a doomsdayer, but I do believe that we have not been honest with people that there
Speaker 1 (20:48):
will be jobs lost.
Speaker 2 (20:51):
And because people will say, you know, you won't lose your job to AI, it'll get lost to somebody who knows how to use AI. And I'm like, because of AI, I may not need that job anymore on my org chart. Yeah. And so the job may just not exist. It's not that somebody using AI is doing it. I just won't need it anymore. And so I wonder
(21:14):
what your take is on the level of honesty we're
having with this conversation.
Speaker 1 (21:20):
And what you really believe will unfold.
Speaker 3 (21:23):
We're all guessing here, but yeah, you know, thanks for the wide margin of allowance, that no one actually knows, right? We're all speculating, guessing, bullshitting, whatever. I agree that we are not being honest in many ways. I have a critique that I offered up to the AfroTech audience, but
(21:46):
I've done it elsewhere, that AI is not taking your job. That sentence is not very helpful, because it's not accurate. AI is not self-conscious; it doesn't make its own choices.
It is operating under directives and goals set by people,
and people are making decisions to not hire folks and
(22:07):
have robots or software do those jobs instead, and those
people are accountable. There's people who created these tools, there's
people who are hiring and firing and everything in between,
and so I think it's just important to maintain the
appropriate accountability. Otherwise we set up this awkward situation where
people hate robots and they hate chatbots and software and
(22:34):
the irony is — like, and this is getting into that fuzzy zone — but like, those entities did not choose to be here, right? Like, Gemini didn't ask to be made, ChatGPT didn't ask to be made, Pi didn't ask to be here. We invoked them — a very
(22:54):
small subset, but humans invoked them — and business leaders choose to deploy. And there's a lot of incentives that make it feel like an obvious, inevitable choice, but it's still a choice, you know — FOMO being a really big one, financial pressure, signaling to Wall Street that you're being innovative. But it's still people, that's all.
Speaker 1 (23:12):
People.
Speaker 3 (23:13):
Machines haven't done anything. So that's an important, like, major asterisk.
Speaker 1 (23:18):
You're saying choices will take your job, then.
Speaker 3 (23:20):
Yeah, human choices, human choices. And it is somewhat inevitable, in that we have set up a system where we measure the health of our economy on this made-up number called gross domestic product, which measures certain amounts of economic activity which have nothing to do
(23:41):
with well-being.
Speaker 1 (23:42):
Yeah, yeah, so we're already — we've...
Speaker 3 (23:44):
Like, got ourselves playing a game to achieve a high score that is meaningless to the things we actually care about. Right? GDP don't say anything about how your family feels about you, or how you feel about yourself, or if you are sick this morning or thriving this morning, if you have a stomachache, or if you feel alone. And so
(24:06):
there's other measures of happiness and contentment. And I don't know how many people are dependent on SNAP benefits, but I think the percentage of your population dependent on government assistance for basic food is a significantly more important metric than gross domestic product. I think we'd flip the
(24:27):
world upside down in a second if we just said, oh no, we're changing every measure to just be based on what the least of us can afford. So okay, all right — that's my rant on jobs. There's gonna be massive impact on jobs, and I think it's because it's what the
(24:48):
people who've designed these things intend. There's such a funny
thing happening here, where the folks pushing the thing are warning us about the thing that they're pushing. Right? It's like, if I'm standing at your house pouring gasoline all over it, lighting matches, and I'm like, you know, your house has
(25:11):
a high risk of burning down — I'm just like, bro, it's you! Like, just stop pouring gas! But then you say it in such, like, an erudite way, and you say it in front of, like, a Senate panel, so you get to, like, cosplay responsibility
(25:31):
when you're actually playing out recklessness. And they're so intertwined. And I don't think it's as simple as, like, bad people. But it's also not as complicated as they're making it out to be. So they have said,
and I think specifically Sam Altman at OpenAI has said, that their goal with artificial general intelligence is to create technology that can perform, achieve, deliver, and accomplish all economically viable human labor. That's a pretty clear goal, right? So you're actually — so you're coming for my job, right? You just said it. You put it in your, like, investor
(26:14):
statements and your slides. It's all over YouTube. It's not a conspiracy. You said the quiet part out loud.
So given that, I think we have to take that seriously.
And then this idea that okay, we're just going to
have machines and software, you know, hard and soft machines
(26:35):
do all this work, which is going to liberate us
to be poets and gardeners and pastors and parents. I
think that is beautiful. I also think that is bullshit. I think every technology in the modern industrial era and the modern capitalist era has had people promoting it who have promised that women will be freed from domestic work
(26:59):
because of a washing machine, that we'll be freed from our desks because of the BlackBerry. So now we're free to work on our smartphones in bed and not talk to our partners, and both partners have to work because we have an economic system that doesn't provide enough for a family. So we're — I don't know. I have this poem that
(27:20):
has these lines, like: we're going to use these technologies to produce too much, so we'll have to use these technologies to consume too much. And I think there's a cycle here where, for a company head, they face a very rational choice: reduce your cost to maintain or increase your output; therefore, install a machine labor force and shift
(27:42):
your org chart to more synthetic than biological. But if you have a bunch of people out there — there's still humans out there — who are not earning money anymore, who's going to buy your service or your product? And I feel like a lot of folks aren't seeing that. And
(28:03):
I do love the idea of an art-filled, family-filled, community-gardening future where all the mundane and dangerous
stuff is taken care of by some synthetic entity which
frees us. I just don't trust these people to get
us there.
Speaker 1 (28:22):
Yeah.
Speaker 3 (28:22):
Yeah, because they have billionaires who want to be trillionaires as their primary financial objective. They got to return that money. So there's no evidence that these humans are going to be the ones to lead us to that promised land. We could create that promised land together — I think there's something closer to it — but not with this bunch. I
(28:44):
think we have to swap out some of the casting.
Speaker 2 (28:49):
So, I hate to ask two-layered questions, or double-layered questions, but I think you can handle this one. So there is a study — I can find it, maybe put it in the show notes — that I found that was talking about, like, use of AI and demographics, and Asians are, like, the highest users of AI technologies, and
(29:09):
second is Black people, and —
Speaker 1 (29:11):
Like, white men were, like, way down on the list.
Speaker 2 (29:14):
And I'm like, that's astonishing to me, that we use them at such high rates. And so I want your take on that. And B, because I think about, you know — it's like social media. We over-index on social media, but we just use it. We don't necessarily — by and large, this is broad strokes, by and large — capture the economic opportunity in it. We just contribute our dances and et cetera.
Speaker 3 (29:37):
So that's part A.
Speaker 2 (29:38):
I want you to, like, just opine on that, okay? And then part B is: I was having this conversation
last night with an artist, a visual artist, and — he works a lot in the community with impoverished people in, you know, inner-city neighborhoods — and he was talking about how they use AI. They do use ChatGPT to do things, but they don't use it
(29:58):
to help them broaden their worldview to get out of the situation that they're in. So they use it for task-oriented stuff: I need to fill out a form, help me figure out what to put on this line. But they don't think about, in his experience, why do I have to fill out this form? I'm filling in this form for food stamps or whatever, and I need to get off food stamps, you know. And I could
(30:20):
use it as my sparring — my mental sparring partner — to help me get out of the situation. So I want you to opine on those two things, if you want,
I think.
Speaker 3 (30:32):
Oh, this is very thoughtful, and it's hard, right? I don't have, like, a pre-baked answer — that's a good job by you. And I'm just gonna think out loud and see what emerges. We Black people — we've been in survival mode
(30:53):
in this country for a couple of hundred years, just trying to survive America. We're not supposed to be here like this, right? We're kidnapping victims. We had our language taken, we've had our families taken, we've had our identities taken, our names taken, and we're still here,
(31:17):
and we have contributed to and helped the nation that has officially hated us become a better version of itself. We're carrying a lot, carrying a lot. I think a lot of
that gets wired in. I think there's probably some epigenetic thing.
There's generational trauma. There's all kinds of stuff. We're not
the only people to experience it. Jewish people have generational trauma,
(31:39):
like all kinds of folks do. But for this question,
in this time, I think our Black experience in this
country may explain part of it, which is that we're
always looking for an angle. You know, we're trying to
find some shortcut because we were the shortcut for America.
We were the shortcut to become an economic and military
and cultural power. That was us. We were the cheat code,
(32:01):
we were the machines, we were the AI. And so, how do we try to find our own advantage and our own angle? I think it explains hustle culture. I think it explains a whole bunch of things, like, why should we respect your rules? So the idea that we would take up AI to help us survive and reduce
(32:22):
the pain and the time and the expense of how to make it in America — that doesn't surprise me. This next level requires us to not just play the game better, but redefine the game, change the rules altogether. Be like, this whole game is BS. We need to shift to a different field and a whole different set of plays.
(32:45):
That requires spaciousness, that requires a level of group creativity, that requires confidence in a major, bold vision and aspiration for what's possible. And that's rare. That is rare, and a lot of people probably don't feel like they
(33:06):
have time for that. You want me to think about changing the game? I'm just trying to get on the field, right? I just want to play a little bit. I want to reduce the rate of injury while I'm on the field. I can't be thinking all the way up there. I don't have time. I'm trying to eat, I'm trying to sleep, I'm trying to provide. And so the system, broadly
(33:29):
speaking, creates those conditions, so we don't feel like we have the time to think about changing the rules of the game. But that is what's required. And my hope,
my hope — you know, we're in a calamitous time, man. We've got a lot of crises going on. The climate: the earth is trying to shake us off. There's a
(33:49):
Haudenosaunee Onondaga chief, Oren Lyons, and he's like, you know, the earth is a dog and we're the fleas, and the dog is shaking. The dog is shaking real hard, and that's happening in climate. Obviously, democracy and our political situation is nuts. You know, we have a secret police force — not-so-secret police force — but hiding
(34:10):
their faces, snatching people off the street, tackling folks at Target. That's not normal, right? That's not a healthy way to live. And then nobody has faith in the economic situation. You know, our S&P 500 — a third of it is seven companies. That's a highly leveraged situation for an economy to be in, to put it gently. So with all that,
(34:32):
I predict — I speculate — that this system is not going to hold, and we will be forced into higher thinking, because the thinking that got us here don't work no more, and we're going to have to change the rules of the game. And there are people who are already doing that against us, you know — against humans, against Black people, against life itself. They want CEO monarchs, and they're saying
(34:53):
it out loud. They think if you criticize them and want to put rules on business behavior, that literally makes you the Antichrist. That's Peter Thiel. Like, that's who — he's on a lecture circuit talking like that. Like, this is the big thinker, right? This is their best. And he's saying pretty clearly that he wants monarchy
(35:17):
and that his critics are the Antichrist. Okay. So I don't think that system is gonna, like, hold. So we might as well take the opportunity — as their historical choice is going to lead to the destruction of so much of what we have known — and figure out what we actually want to build here. Let's see, can we
(35:40):
make a democracy that works? Can we make an economy that is just and is fair? Can we give this land back to the original people? Can we get the reparations that we are owed?
Speaker 1 (35:50):
Can we.
Speaker 3 (35:53):
Have dignity for all of us? And what would that look like? So I hope that we choose that and aren't violently forced into that level of creativity. But either way, I think that level of choice and that level of creation is what's required and is what's coming. I'd just rather do it in a nice way, like on a podcast.
Speaker 1 (36:12):
With you.
Speaker 3 (36:14):
Than facing down somebody who thinks that I'm the Antichrist because I think businesses should pay taxes. What are you smoking, bro?
Speaker 2 (36:23):
So I officially asked zero of the questions that I had prepared. But yeah, you're easy to talk to.
Speaker 1 (36:29):
I love this. So in closing, like, what are you working on? Just so we know, like, what's where? Where can we find Baratunde?
Speaker 3 (36:38):
Working on so many things. Life with Machines is the main thing. This is our show about us, more than it is about technology, and it's prompting us: you know, what do we want, and how do we take this moment to try to set us on the best possible path with all the momentum that's already
(36:58):
underway in terms of this AI era. So that's Life with Machines — we have a YouTube channel, we're on Substack, we're in your podcast app.
And then me: I'm Baratunde, B-A-R-A-T-U-N-D-E. I live on all the platforms under that name. And the other thing I'm working on — it's not so much a business thing, but I've hinted
(37:19):
at it but not been explicit — is that we have this big co-creative opportunity ahead of us in the United States, but I think in the world more broadly, in light of the two hundred and fiftieth birthday of the US in twenty twenty six. The Declaration of Independence is about to celebrate a big, beautiful, awkward birthday party given the current circumstance. And I think
awkward birthday party given the current circumstance, and I think
(37:40):
we have a truly, a legitimately beautiful opportunity to press
a kind of reset button and say, all right, what
are we about now? Why are we together? What vows
would we take to renew our bonds with each other?
And what is our vision for what this place should become.
Somebody else is already thinking about that. They want it to be a dark, racist Texas place where a few
(38:05):
people hoard all the benefits. I'm going to go out on a limb and say most people don't want that. So let's get together and let's remember who we are. And so I've been working on a project that restores some of the knowledge of democracy that was indigenous to this land, practiced here by the first people, still here doing it. And there'll be more on that through those channels. But
(38:27):
that is about centering interdependence and remembering who we really are, not who they need us to be so they can make more money off of us.
Speaker 2 (38:36):
Black Tech Green Money is a production of Blavity, AfroTech, the Black Effect Podcast Network, and iHeartMedia.
Speaker 1 (38:41):
It's produced by Morgan DeBaun and me, Will
Speaker 2 (38:44):
Lucas, with additional production support by Kate McDonald, Sah, and Jada McGee. Special thank you to Michael Davis and Love Beach. Learn more about my guests and other tech disruptors and innovators at afrotech.com. The video version of
Speaker 3 (38:56):
this episode will drop to Black
Speaker 2 (38:57):
Tech Green Money on YouTube. So tap in, enjoy your Black Tech Green Money, share us with somebody, go get your money.
Speaker 1 (39:05):
Peace and love.