Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Mala.
Speaker 2 (00:00):
Have you listened to an AI podcast?
Speaker 1 (00:04):
Oh, an AI podcast? You mean like this podcast? Look out,
Locatora Radio has been AI hosted this whole time, you
might know nothing. And that's, and that's the plot of
the movie.
Speaker 2 (00:14):
Let's write it down.
Speaker 1 (00:15):
That's the big reveal. We were the original bots from day one.
Speaker 2 (00:20):
That's like AI meets Terminator, meets all of the
horrible things.
Speaker 1 (00:26):
But also meets, like, Real Women Have Curves and Sisterhood
of the Traveling Pants. It's all wrapped up into one.
Speaker 2 (00:31):
That's a movie I'd watch.
Speaker 1 (00:35):
Let's write it.
Speaker 2 (00:39):
Hey, Locamores, it's Diosa here. I'm so excited
to share that Locatora Radio and Señora Sex Ed are
finalists for the Signal Awards. We are also up
for the Listener's Choice Award, so we're asking for your help.
Please vote for us by heading to vote dot Signal
Award dot com or click the link in the show notes.
(01:01):
It would mean so much to us if you vote
for us and show that Latinx voices and our stories matter.
Thank you so much. Besitos. Hola hola, Locamores, I'm.
Speaker 1 (01:16):
Diosa, and I'm Mala.
Speaker 2 (01:18):
Today we're talking about the intersections of artificial intelligence, the
creative industry, and the environment. Oh, a lot to unpack here.
I hope we get through.
Speaker 1 (01:29):
All of it. This is a big, tangled, tangled mess.
Yes. Here, the world is on fire and AI is
taking over, just like every Hollywood movie for the past
thirty years has warned us.
Speaker 2 (01:43):
Like, wow, was nobody paying attention during Will Smith's AI
film? Like, hello.
Speaker 1 (01:48):
I, Robot. Yes, yeah, which was based on the books, yes,
I, Robot, which also predicted this very thing where
the artificial intelligence we create becomes so intelligent that it
destroys us in order to preserve itself. Yeah.
Speaker 2 (02:03):
I'm just like, everybody, like, assigned reading, go for it:
like Nineteen Eighty-Four, Fahrenheit four fifty one. Let's bring
it back, read them, study them, Parable of the
Speaker 1 (02:13):
Sower, like yeah, do it. Watch all the Terminator movies.
I mean it's right there.
Speaker 2 (02:18):
Yeah. Not to be an alarmist, right and like so
doom and gloom. But I do think, like, we on
this podcast, and the audio art form that it is,
we have to have this conversation about AI. We've, like,
dabbled here and there this past season, and over the
seasons we've talked about disappearing art and the way AI
(02:38):
is like stealing from existing art right that's out there
in the creative world, and then art that exists being
taken down from streamers and platforms. That's like the way
we've talked about it in the past. This season, we've
talked about it in terms of ChatGPT and it being
used as therapy, but today we're talking about it in
(02:59):
the creative field and how that also intersects with the
literal material world that we live in, that is the environment.
Speaker 1 (03:06):
Absolutely. So on a past episode earlier this season, we
discussed the crash out or crisis. Is it a crash
out online or is it a mental health crisis. We
also released our episode on AI psychosis and this phenomenon
of utilizing ChatGPT as therapy. And so we're just
kind of like building off of these ideas and we're
(03:26):
bringing it, like Diosa said, to the creative industries and to the environment, because these things overlap.
And I just want to say, you know, in general,
I think that right now we're talking about how AI
is like coming for creative jobs and for the creative
industry and like replacing artists and I think in general,
(03:48):
like the arts and education are just very easy targets,
like when it comes to defunding, when it comes to minimizing,
when it comes to removing, and it never
stops there. You may defund the arts and education, but
it's going to expand into other things like healthcare and infrastructure.
(04:08):
And I think there's a similar phenomenon going on here
where AI is starting by replacing artists and creatives and
by minimizing opportunity for human beings in the arts and creativity.
But that's just where it's beginning, and it's going to
branch out and expand into other areas. And when the
arts are targeted, we always need to be concerned, We
(04:29):
always need to be worried. It's just a testing ground
because if you can succeed here, you're going to succeed
in other places. So I kind of want to just
I've been thinking about that. I want to like pin
that at the top of our episode.
Speaker 2 (04:40):
I love that, And that's a really good analysis and opener.
Breaking news in our industry this past week is this
startup that is creating AI podcasts. They have been able
to create five thousand podcasts and three thousand episodes a
week, at a cost of one dollar per episode,
and this is reported by the Hollywood Reporter. One of
(05:03):
the reasons this was really striking, I think in our
digital community and our podcast creative circle is because this
AI startup called Inception Point AI is founded by a
former Wondery exec, and Wondery is a podcast studio
that has made incredible podcasts over the years,
(05:25):
and so it definitely is disheartening to see a former
exec championing the use of AI podcasts, and this is
not just AI podcasts that are using AI to script
their episodes and to possibly edit, but to also host
these podcasts, which is I think the opposite of what
(05:49):
a podcast is meant to be. A podcast is about
connection and familiarity and storytelling and that cannot be replaced
by an AI, by a robot. That is the point
of radio and podcasting: the connection with the host
and the way we connect with our listeners in turn.
It's mutual, it's a we give and we take in
(06:12):
that way.
Speaker 1 (06:13):
Yes, it's people telling stories, people sharing news, people
opening up and being vulnerable and creating art which is
meant to connect people and so people feel less alone,
people feel less isolated, they feel heard, they feel seen,
and I just don't know that artificial technology can do
(06:35):
that for human beings. I think that we're seeing more
and more of it, where people are seeking connection, They're
seeking a therapist through their chatbot, They're seeking a boyfriend
through their chatbot. I even heard this week at school
someone pitching a documentary about grief bots that are taking
like the voices of your deceased loved one and like
(06:56):
videos and photos to create like a grief bot so
you can continue to have like contemporary conversations with your
deceased loved one, and so the bots are being used as
a stand in for real human connection. And I think
that we've seen enough Black Mirror episodes. We've seen, like
we said, enough movies and TV shows that sort of
(07:20):
imagine for us where taking this road will lead us.
And it's this question of art imitating
life and life imitating art, and technology imitating life is,
I think, what we're really seeing here. And something we
talked about before we started recording is that my belief,
my understanding is that in order to have an AI host,
(07:44):
that voice needs to come from somewhere, and there's enough
podcasts and radio shows and music and YouTube videos and
TikTok videos. There's enough content out there with real
people's voices. That has to be where the AI bots
are sourcing their voices and their inflections and their accents from,
you know, like they have to farm real people's voices.
Speaker 2 (08:07):
Absolutely, And I think, you know, one of the really
beautiful things and why podcasts have been so successful is
that it podcasts did democratize what we considered like radio
host right when we as much as I love MPR,
it did in its heyday, it did have its moment
where it was a lot of white hosts and it
(08:28):
sounded a certain way. It was Ira Glass, right. It
was these podcast hosts that you know, sounded like professional
radio hosts, which is fantastic. But then that also meant that,
depending on where you lived, you probably didn't sound like
that either, you know, And so one of the beautiful
things about the podcast is that you could listen to
(08:49):
folks that sound like you, that have your accent, that
are from your community, and you can really see that
representation, and, you know, not to, like, minimize, you know,
podcast to representation, but that was I think one of
the ways folks felt really connected at the advent of podcasts,
(09:09):
diversifying and like this in a way like anyone can
do it. Yeah, And so I think that that has
been one of the many reasons that podcasting has been
so successful and why audiences connect with the hosts that
they connect with. And to your point, according to the
Hollywood Reporter, this team is in the midst of navigating
the ethics around creating AI personalities as the technology advances.
(09:34):
As of right now, each host identifies themselves as being
AI at the top of the episode, and they've stayed
away from having the host invent their quote backstories, but
that could come.
Speaker 1 (09:47):
Yeah, And the name of this company Inception Point AI.
They're like basically building a stable of AI talent, again
from the Hollywood Reporter, and not only to host podcasts,
but to become, like, influencers across social media platforms
and apparently literature and more. Hosting and producing podcasts, I guess,
(10:10):
is relatively expensive, not more expensive than making a movie
or a TV show, but you know, you're paying for
human labor and human talent, and I think that it's
a scary place where we say it's too expensive to
pay people for their work and for their art and
for their originality. So we're just going to replace the people.
(10:32):
And I think that it's really creepy. You know, we've
seen here and there, like, okay, there's an AI influencer.
There's even, like, yeah, AI profiles on dating apps,
and there's this intersection of like AI and romance with people.
That movie with Joaquin Phoenix, Her, you know, and it's
(10:55):
not the only one. And it's just very eerie. And
I also think it speaks to like something that has
been talked about out there, this like loneliness epidemic, like
people are very lonely, and so instead of somehow connecting
with other human beings, well, the thing that's most accessible
and closest and at our fingertips is the phone,
(11:16):
is the computer.
Speaker 2 (11:18):
Don't go anywhere, Locamores.
Speaker 1 (11:19):
We'll be right back, and we're back with more of
our episode.
Speaker 2 (11:27):
Since launching, you know, they are producing more than three
thousand episodes a week, and I think what we should
also consider is that it's then
flooding the media landscape. Yeah, and they've already seen ten
million downloads since September twenty twenty three, so that means
(11:50):
there is an audience for it, you know, and
so I'm super curious as to who is listening to
these AI generated podcasts and what it is that they
enjoy about it, because clearly, as of now, the host
is identifying themselves as AI, so it's not a trick, right,
the listener knows that it's an AI podcast. So I'm
(12:11):
really curious as to what about it makes it desirable
to listen to. That is definitely not something I would
seek out, But clearly there is an audience, and we
can say that about all of the AI that we're seeing.
Like, as much as we have our qualms, our reservations,
maybe our own personal ethics around using AI, it
is still being used. And I think in this case,
(12:33):
clearly we're seeing the podcasts are still being listened to.
And this startup is just to note, like the employees
are not yet salaried, they're not getting paid yet. So
the small team that's making these AI podcasts are still
not getting paid for their own labor. Interesting. And that,
to me, is incredibly ironic.
Speaker 1 (12:54):
Very ironic. So then eventually they're going to be AI'd
out of their own startup job.
Speaker 2 (13:00):
Yeah, the startup is currently seeking outside funding. That is
normal, that's what a startup does. That is
not outrageous. But I think the irony is still there
that the folks that are working are making these podcasts,
scripting it through an AI tool, however they're editing it,
and then they're not getting paid for it either.
Speaker 1 (13:20):
Hmm. So they have to be getting something out of it,
or they see, uh, some payoff in the future. And yeah, strange.
Speaker 2 (13:34):
Yeah, yeah. To me, based on everything I've read
in this Hollywood Reporter piece and
based on the chatter that I've seen online from my
creative circle, you know, it seems like the point of
this business model is just profit. It's not storytelling, it's
not connection, it's profit. And so I think that's also
(13:54):
something to keep in mind. Is that the kind of
thing you want to support?
Speaker 1 (13:58):
You know, it's also scary. You know, you're a fan
of someone, you're a follower, you're a listener, you're a subscriber,
whether it's a recording artist, whether it's a podcast host,
whether it's an influencer, a creative of some kind. Nobody
is perfect, right, and quite often the people that you
(14:21):
are listening to are going to disappoint you at one
time or another, or let you down, or they're not
as accessible as you want them to be. And I
think that like an AI host can be tailored to
be perfect for you, right, and to never disappoint you
and never let you down and always show up and
always give you what you're looking for, where like humans
just can't do that, you know. And I think that
(14:43):
we've talked about this on the show before, but there's
kind of a rage cycle, especially in the online space.
Audiences get mad at their favorite influencer. You know, People
fall off, people get called out, people get dragged. There's
this human tendency to falter or to fail and to disappoint.
(15:03):
It's just part of the human experience because nobody is perfect.
But something that is programmed like this could analyze so
much data that it can be tailored to be as
close to perfect for you as possible. That is humanly impossible, right,
And that is something that I think is maybe enticing
(15:25):
for people because it's always going to give you what
you want. It's never going to disappoint you because it
knows all your keystrokes. But it's like that I think
is very scary. That I think is very scary.
Speaker 2 (15:41):
I think that that is an excellent point, because it
shows us that what we're looking for is perfection, perhaps
from our podcast host, influencer, social media personality, therapist
even, yes. And, you know, to be human is
to be imperfect, not to be trite. But this is
(16:03):
how we learn and we grow and we get better
is by the human error and mistake, you know.
Speaker 1 (16:08):
And learning from it and changing. Change is, like, the
only constant in life. Time continues, things change, people change,
seasons change; change is the only constant. And AI might
study all of us, you know, and learn from us
to like feed us and give us exactly what it
(16:31):
thinks we want, you know. But then, like, if we're
not changing, then does the AI change?
And then where do we go?
Speaker 2 (16:40):
Yeah? Well, I think it just like in my mind,
AI just like teaches itself. It like continues to learn
from what you've fed it, so it will like evolve
beyond you, which I think is what every sci fi
film and book has told us. And I think, you know,
when we think about I'm thinking historically like novels, right,
where there has been some type of resistance to AI,
(17:03):
and yes there's resistance now, but there's also a lot
of opting in, and there's also a lot of forced
opt in. I think about now, when I use Google, yes,
you know, in order to not get
an AI generated summary, you have to put minus AI,
and so you don't even get the option to opt
out anymore. You have to manually opt out. And I'm
(17:25):
seeing that for a lot a lot more programs these days,
whether that be Canva, whether that be Adobe or Photoshop,
I'm seeing more and more AI plugins, you know, which,
you can opt into those, but some, like, you can't
opt out of, even, like, with Google and Gemini, you know,
even on your email. And so I feel like this
(17:45):
also ties really well the use of AI into the
surveillance state. We are so willing to opt in to
the surveillance state, whether that be through like self surveilling
and documenting everything we do, to training a large language
model like ChatGPT or an AI bot. We're training it
(18:06):
to think like us, like ourselves. And so I do
feel like that overlaps with, like, the surveillance state and
how this technology can be used against us.
Speaker 1 (18:17):
Oh absolutely. I mean, if you're using your phone and
you're using any app at all, the app always asks you, oh,
can it track your location at all times? And you
have to say yes or no. But then sometimes,
for the app to work, you have to say
yes, or else why do you have the app?
Speaker 2 (18:35):
Right?
Speaker 1 (18:36):
You know? And so there's somebody somewhere who always knows
where you are. Scary. It's very scary. It's very scary.
And you know, we were talking about like movies and
film that have sort of predicted these things. Isaac Asimov wrote
I, Robot in nineteen fifty, you know, like, cell phones
were not invented then, the Internet was not around, but
(18:57):
less than one hundred years later, like, here we are. Yeah,
and it tells us a lot about how quickly, how
quickly the technology can develop and then how exponentially it
can grow and then like impact us because that's a
very quick turnaround.
Speaker 2 (19:14):
Don't go anywhere, Locamores.
Speaker 1 (19:16):
We'll be right back, and we're back with more of
our episode.
Speaker 2 (19:24):
For me, my point of reference is always Fahrenheit
four fifty one. That book changed me in middle school,
you know, and I have it by my nightstand
now because I'm like going back to it here and there.
I'm like, this is a time to reread it. And
you know, there was this resistance right by the narrator
and wanting to uncover you know, wait, what's really going
on in this reality? Here in my reality? And I
(19:47):
feel that this is not to say there's no resistance
to AI. Of course that's not the case. But
it also feels like we are being forced to use AI,
whether we agree with it or not, in terms of
labor and jobs and like staying relevant where even if
we ethically don't agree, it's like, do you know, like
(20:08):
you mentioned the other day, "Do you know how to
use AI?" is being asked in your classes.
Speaker 1 (20:14):
Yeah, at the School of Cinematic Arts, there's
a directing AI class. And I've seen some of the
projects and it looks like it's shot on location in
Italy with human actors. You have to really, really
have a very trained eye to see that it's
AI generated. And that's the thing, the technology
(20:34):
is only going to get better, right, and there
is AI that has been incorporated into filmmaking for some
time now. This is just like a whole new level.
And the thing is, like the school would not be
incorporating this sort of class if there wasn't a very real,
uh reality in the job market post graduation that this
(20:58):
is a necessary tool, like a necessary skill that you
should learn. And I know that there's been different thoughts
about like, well, in the right hands, it can be
a democratizing tool, right, Like you can use an AI
tool to basically replace like a test shoot day, and
(21:20):
you can pre light things and figure out your location
and your blocking and your camera placement without leaving your desk,
whereas to do that in person, it might cost you
ten thousand dollars, you know, just to do a test.
And in some ways, like is that going to allow
people from underresourced backgrounds to like break into an industry
(21:44):
that is otherwise like really expensive to break into. And
one of the greatest barriers into filmmaking is money, you know,
That's really the thing that keeps people out. It's just
so expensive to make movies. So there are these like
competing schools of thought. But then at the same time,
(22:05):
if you can replace a whole lighting team, you know,
that means you're replacing like tons of skilled workers. And
that's also not good either.
Speaker 2 (22:16):
And this is at the tail of the strikes.
Speaker 1 (22:19):
At the tail of, yes, you know, like this is.
Speaker 2 (22:21):
I wonder if this is also, maybe not that
the strikes caused it, but a way to remove,
like, the human labor in a very horrible way, a retaliation
of it. I'm not basing this
off of anything. I'm just wondering, okay, if this is, you know,
(22:41):
a horrible retaliation because of you know, humans asking for
better wages and protection and you know, respect for their labor.
Speaker 1 (22:53):
Yeah. And this being, you know, union labor, unionized labor,
and so I can see how the execs and the
powers that be and the funders, instead of playing ball
with the unions, it's like, well, there's just not
going to be any more work for you, right, And
I think that the history of labor in this country
(23:16):
tells us that that's not too much of a reach, right,
that theory.
Speaker 2 (23:21):
Right, yeah. And it's horrible. I mean, it's a
horrible thing to consider that, you know, labor and humans
are so devalued, you know. I think about that even
in terms of you know, when I go to the
grocery store, and now there's like self checkout, and that
seems fine and dandy, but that means there's less cashiers
(23:41):
on the clock, you know, and the labor is then
put on us to check out ourselves. And it's not
that I'm above labor. I'm not above doing it myself.
But there's a person that could be getting paid,
a union worker at that, yeah, to be working this shift.
My mother was a union worker cashier at Kroger for
twenty plus years and she does not like the self
(24:04):
checkout right because she sees it as like this, this
labor is now on me. Even when we go to
a restaurant and we have to order ourselves on our
phone and, like, the QR code craze, I'm like, that
is now labor being put on me. And the person
who pointed that out to me is labor journalist and
my homegirl from USC siek Lali, because we went to
(24:26):
have a drink with our class and we were still
in grad school and she was the one that pointed
it out. She's like, now the labor's on you to
put in your order. And since she told that to
me years ago, I have not unseen it. Yes, And
this is again not that anyone is above doing labor.
But now there is a person who is not getting
paid to take your order.
Speaker 1 (24:46):
Yeah, that's an entire household. You know that, in a
previous version of America, like, a person could feed themselves
and pay their bills and, like, take care of their kids
with a union job like that, working as a cashier,
(25:07):
you know, a person could sustain themselves. And these jobs
still exist, but we're seeing them slowly being chipped
away at. There's not as many as there used to be,
you know. And then we wonder why people are struggling. Well,
because we are just eliminating jobs left and right and
not replacing them. It would be one thing if these
jobs went away but we were creating more jobs. But we're
(25:31):
not creating more jobs. We're simply hacking away at the jobs.
Speaker 2 (25:34):
This is an argument, not to be tangential, but this
is an argument for why we need universal income.
Speaker 1 (25:39):
Yes we do, universal basic income.
Speaker 2 (25:41):
Universal basic income is what we need.
Speaker 1 (25:43):
If you're not going to let people work, how are
people supposed to survive? Yes, there has to be something
else in place.
Speaker 2 (25:50):
I mean, and it ties to AI in some way.
I'm going to get us there. Because if you think
about it, if everyone had access to universal basic income, you
could probably be a working, living artist. Yeah, if you
could cover your rent, yeah, and there would probably be
more people making art.
Speaker 1 (26:05):
Yeah yeah, and even folks with jobs, Like we're learning
that folks with jobs, folks who are homeowners but who
live near these data centers are also struggling because in
order to house all of this data, because this is
like data heavy, there's a lot of information and it
has to go somewhere. There is a physical location where
(26:28):
all this AI code and information has to be stored
in big old computers, and there is a true environmental
cost to it. And something that we're learning is that
people who live next to data centers are running out
of water. They don't have running water anymore, their water
(26:49):
is visibly polluted, and there is, like, a very real
toll on again, just working people, even homeowners. And usually
we think about environmental degradation in, like, urban communities, communities
of color, right, who live near airports, who live near
(27:09):
oil manufacturing plants. This is a whole new wave.
Speaker 2 (27:13):
It adds another layer to environmental racism, because we
know that they are not going into affluent communities and
building data centers, right, we know that they're going to
go to either rural or low income neighborhoods and build their
data centers there, and so that is also something to consider.
And it's not just water that these AI data centers
(27:37):
are using, but it's also drinking water. So we are
quite literally sacrificing our drinking water for these AI data centers.
And it might feel like there's an endless supply of water,
but there's not. And we are not the only nation
that struggles with water and access to clean water.
Speaker 1 (27:58):
It's very scary. I know, it's a very scary thing.
You guys, I think about this all the time.
Speaker 2 (28:04):
Oh, Mala does think about running out of water. You
do, this is your fear. Yeah, I forgot about that
when I said it. I'm so sorry.
Speaker 1 (28:12):
No, it's okay, but it's true, especially in LA. My
partner asked me once, is there something, like,
that you think about that bothers you? And I said yes,
that there's going to be a natural disaster in Los Angeles.
We're going to be cut off from our water supply
and we're going to run out of water in like
twenty four hours, and all hell is going to break loose,
and it's going to be chaos. It's going to be
(28:32):
really hard to get out of here. It's going to
be impossible to find like life sustaining supplies like water.
Everything's going to go off the shelves immediately, and we
have no other source of clean water. We just don't.
Speaker 2 (28:45):
This is like your version of doomsday prepping. You're thinking
about the water.
Speaker 1 (28:48):
Yes, I'm thinking about the water. And if the water,
the drinking water, is being used in AI data centers and
not in, like, accessible facilities, for a catastrophe, you know, then
where do we go? Where do we get it?
Because there is a finite supply and we can't treat
water ourselves. We don't have the tools, we don't have
(29:11):
the mechanisms, we don't have the chemicals, we don't have
the facilities. Like it's a whole thing to treat water
so that it's potable and drinkable, and we just can't
do it ourselves.
Speaker 2 (29:19):
There are efforts to treat recycled water, especially in LA,
in the city of LA, but I don't think that
we're there yet, but there are efforts being made. But
it is a real anxiety, especially as we add AI
to the conversation and the way it's being used like
a search engine. Right, it's not just a tool, it's
(29:40):
a search engine, which is what we talked about in
a previous episode, you know. And so I think that
there's an overuse of AI when we're using it in
that way of it being a search engine versus hey,
can you I'm going to run this through and give
me some feedback. You know, however it works, you know,
it's when it's being used, I think, to be
your all-in-one everything, you know, your tutor,
(30:03):
your therapist, your friend, your partner. You know, that
means you're running it all the time. Yes, the way
we need connection all the time.
Speaker 1 (30:13):
All the time. And in the Washington Post there is
this quote about the resistance against the infrastructure advancing AI. Quote: Generally,
it's important to note that data centers increasingly are becoming
critical necessary infrastructure to meet the growing needs of our
connected digital world. Data centers also bring a multitude of
(30:35):
benefits to communities big and small. The data center industry
is in growth mode, and every place they try to
put one, there's probably going to be resistance. The more
places they put them, the more resistance will spread. And
you know, so, I just we have to hope that
people are going to resist, that people continue to resist
(30:57):
for our own self preservation as a society, as a society.
But what that also means is there has to be some regulation,
legislative regulation, some cap to how big this thing can get,
how much it can grow, because only if there's a
cap will there be a cap on the number of
data centers that are built.
Speaker 2 (31:18):
Absolutely so, when thinking about necessary regulation and infrastructure and
policy to protect us from overuse and overconstruction of data
centers and AI, we actually learned that the White House
(31:40):
and the Administration has a website called AI dot gov
where we learned there's actually plans to accelerate federal permitting
of data center infrastructure. So quite the opposite I think
of what we feel that we need as a society.
Speaker 1 (31:56):
Absolutely so. According to AI dot gov, the AI Action Plan,
and you guys can go online and type this into
your search engines AI dot gov, there's a big old
picture of Donald Trump right in the front. It's horrible.
The AI Action Plan: The United States is in a
race to achieve global dominance in artificial intelligence. Whoever has
the largest AI ecosystem will set the global standards and
(32:18):
reap broad economic and security benefits. Under President Trump, our
nation will win, ushering in a new golden age of innovation,
human flourishing, and technological achievement for the American people. America's
AI Action Plan has three policy pillars: accelerating innovation, building
AI infrastructure, and leading international diplomacy and security. So it
(32:39):
is definitely a goal and an initiative to expand and
advance AI across the country. According to AI dot gov,
So all the scary things that we were talking about
that we hope don't happen, it seems like they're definitely
going to happen. And it's yeah, it's all right there
(33:01):
in plain English.
Speaker 2 (33:02):
It sounds like he wrote it, is what my immediate
thought was. Our nation will win, golden age of innovation.
Speaker 1 (33:09):
Yeah, yeah, it's very Trumpian.
Speaker 2 (33:11):
Yeah he wrote it.
Speaker 1 (33:12):
It's very Trumpian, the whole thing.
Speaker 2 (33:14):
I bet you he copy edits it.
Speaker 1 (33:16):
He does his own copy edits, I bet you he does.
He's like, you have to add win. He's like, can
you add my picture? No, for sure.
Speaker 2 (33:25):
Ay?
Speaker 1 (33:25):
Yeah, ay, yes, y'all.
Speaker 2 (33:28):
Well, it's tough.
Speaker 1 (33:30):
It's tough.
Speaker 2 (33:31):
This, my sigh, is guttural, it's real, it's a deep sigh.
It's a deep sigh because there's just so much to consider,
and I think, you know, we never want our listeners
to feel overwhelmed after listening to an episode,
or feel like there's no hope and there's no you know,
nothing matters, or nihilistic even. I was, when we were
(33:52):
prepping for this episode. I was saying, like, it is
really easy to fall into nihilism and to forget that
art is a huge tool in our toolbox and one
that we need to really champion right now because it
can feel really scary. Times are changing. It feels like
it's changing so rapidly, I think, faster than before. And
(34:16):
I think that's part of the anxiety that many of
us are feeling. And it's I think just important to
remember when we're thinking about AI, like who really benefits
from all of these quickening changes? And I mean it
seems like the billionaires that are running the country, the
venture capitalists, the tech bros, you know, and so just
(34:37):
thinking about, you know, ways that we can divest from
AI, in creativity at that, you know. And I think
one of the beautiful things that I love about being
in this creative industry is the brainstorming, is sitting in
a meeting with our team and ideating and brainstorming and
getting those creative juices flowing. And AI cannot rob
(35:01):
that experience, you know. And so does that mean like
unplugging and like going to the library and doing your
own research? Probably, you know, reading a book. And I
don't know if it's just my algorithm, and it's, like,
again, another AI tool, the way I've trained it. But
I'm seeing like a lot of pushes for people to
(35:22):
like bring back analog and bring back the landline. And
I saw a video of someone who literally chained her
phone to the wall and so she could only use
it in one spot of her house, and so she
went on and did all of her errands without a phone.
And you know, there's obviously like I don't want to
(35:43):
be without a phone safety wise, you know, but there
are. The point is, I'm seeing, like, a real
push from people, from individuals, to, like, do digital
detoxes and, like, go make some crafts and be in
community with each other. And so that does give me hope,
(36:04):
and I hope that, for whoever's listening, I hope that
if you're listening, you feel inspired and hopeful to like
get out there with your community and do these like
analogue things. Yeah, you know.
Speaker 1 (36:17):
Yes, the technology is supposed to help us to live
like better human lives, not to replace our human lives.
And I think that we have to love people more
than we love the technology, and remember that it's really
(36:38):
about loving ourselves and loving each other and not loving
the tools, right, because the tools cannot love us back, right,
and they actually don't take care of us. No, we
take care of them, and hope, I think, is the
most active form of love, you know. So we have
to keep loving each other and putting each other above
(36:58):
the technology, so that means resisting, you know, and using
it to help us thrive, but not to like eliminate us. Yes,
it's a slippery.
Speaker 2 (37:09):
Slope, slippery slope at that. I think that's a really
beautiful note to end on. You've been really wholesome this season, right?
Her takeaways have been, like, so wholesome and lovely this season.
Thank you, Yeah, thank you.
Speaker 1 (37:22):
I'm in a very optimistic space, that's good, in spite
of everything.
Speaker 2 (37:26):
In spite of everything, that's a good place to be
in the hope and the optimism that things are going
to be okay, because if not, it can feel really
dark and scary. Yeah, and that is what they want
us to feel.
Speaker 1 (37:39):
They want us to feel hopeless and to tear ourselves
apart and each other.
Speaker 2 (37:45):
Yeah, and we're really good at doing it.
Speaker 1 (37:47):
We're really good at tearing each other apart and tearing
each other down. And that's how you weaken the resistance,
you know. But yeah, we just we gotta we have
to stay positive.
Speaker 2 (37:59):
We got to keep going, gotta keep going. Yes,
all right, y'all, thank you for listening to another capítulo
of Locatora Radio. We'll catch you next time. Besitos. Locatora
Radio is executive produced by Diosa Femme and Mala Muñoz.
Speaker 1 (38:13):
Stephanie Franco is our producer.
Speaker 2 (38:15):
Story editing by me, Diosa.
Speaker 1 (38:17):
Creative direction by me, Mala.
Speaker 2 (38:20):
Locatora Radio is a part of iHeartRadio's My Cultura
podcast network.
Speaker 1 (38:24):
You can listen to Locatora Radio on the iHeartRadio
app or wherever you get your podcasts.
Speaker 2 (38:29):
Leave us a review and share with your Prima or
share with your homegirl.
Speaker 1 (38:32):
And thank you to our Locamores, to our listeners,
for tuning in each and every week. Besitos
Speaker 2 (38:41):
Loca