Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Steve Taylor (00:00):
Welcome to
Breaking Green, a podcast by
Global Justice Ecology Project.
On Breaking Green, we will talk with activists and experts to examine the intertwined issues of social, ecological and economic injustice.
We will also explore some of the more outrageous proposals to address climate and environmental crises that are
(00:22):
falsely being sold as green.
I am your host, Steve Taylor.
This October, the 16th Conference of the Parties of the UN's Convention on Biological Diversity convened in Cali, Colombia.
Global Justice Ecology Project was there to mobilize against the release of genetically engineered trees.
At COP16, there was the establishment of the so-called
(00:47):
Cali Fund, which requires big tech companies working with artificial intelligence to compensate indigenous peoples and traditional communities for use of their genetic information in the development of generative biological programs.
But what is generative biology, and why are observers now saying
(01:08):
that we are at a tipping point where a lack of regulatory oversight and the use of artificial intelligence now threatens the release of novel organisms into the wild?
On this episode of Breaking Green, we will talk with Jim Thomas.
Jim Thomas is an activist, writer, researcher and poet with almost three decades of international experience
(01:29):
tracking emerging technologies, ecological change, biodiversity and food systems on behalf of movements and UN fora on which he serves.
Jim was a co-executive director and research director at the ETC Collective. Jim Thomas, welcome to Breaking Green.
Thanks very much.
(01:49):
So, Jim, you have an interesting bio.
You have been working on environmental, food sovereignty and social justice issues for, oh, I don't know, about three decades.
What drew you to these issues, and how did you get started in this work?
Jim Thomas (02:06):
How did I get
started in this work?
I think, like many people, love for the natural world, concern about the way in which industrial capitalism (although I might not have called it that at the beginning) is crushing the natural world, and particularly the role of technology and corporations in that.
And so, yeah, I've been lucky enough for about three decades
(02:27):
to be fighting against corporate power, particularly technological power, and working a lot against the effects of new and risky emerging technologies and the corporate strategies behind them.
Steve Taylor (02:39):
I noticed in your bio that you helped form something called the New Luddites, so could you tell us a little bit about that, what the Luddites are and how you think that may be relevant to the work that you do?
Jim Thomas (02:57):
Sure.
So the Luddites were working people in the north of England at the beginning of the 19th century who saw the onset of industrialization and how it was affecting their craft (they were mostly weavers) and their communities, and began to smash
(03:17):
some of the machines that were "hurting the commonality", which was the language they used.
They saw machinery that was hurtful to the commonality, and they organized as a movement that was actually so powerful that it required more soldiers than were sent to France to fight Napoleon.
It was a huge social movement at the time.
(03:38):
They were crushed, they were hanged, they were transported to Australia and, of course, the word Luddite has since become a word of contempt for anyone who's supposedly fearful of progress and technology.
But that's not what the Luddites originally were. They saw that the machines introduced by a particular industrial class, a brand new industrial class, were harming their communities, were harming their craft, and they pushed
(04:10):
back against it.
And I think the important thing (and this is certainly why I was involved with others around starting the New Luddites) was the fact that they recognized that technologies that were hurtful to the common good shouldn't be allowed to go ahead, and they literally would pull weaving frames into the
(04:32):
marketplace and put them on trial, and people would say what they thought, whether these weaving frames should exist or not exist, and then they would smash them or not smash them.
They were exercising judgment over these new technologies, and exercising judgment from the point of view of the social effects, the effects on community, and that's something that many of our movements have been trying to do as well more
(04:52):
recently.
Steve Taylor (04:56):
It is a fascinating history, and I think it's really cool that you guys started that.
And if the listener ever wants to read a little bit about it, they can just Google the New Luddites, and I think they'll find some news stories.
Jim Thomas (05:11):
I would really point people to this: there's a bit of a resurgence at the moment, particularly in the context of artificial intelligence, of people who are saying no, wait a minute, I'm a Luddite too.
There's a really wonderful story of the Luddites produced by a journalist called Brian Merchant.
He's produced a book called Blood in the Machine, which is the history of the Luddites, but it connects it directly to some
(05:32):
of the resistance movements now against artificial intelligence and digital technology and biotechnologies.
So there are more and more people who are critically considering technologies, calling themselves Luddites.
Again, it's a reclaimed word.
Steve Taylor (05:46):
You mentioned AI, and your work recently has been dealing a lot with that at COP (I believe it was COP16, the recent COP in Cali); we're going to get to that.
But let's start with just the concept of artificial intelligence as a new technology. As a new Luddite, your
(06:07):
thoughts?
Jim Thomas (06:08):
Yes, I mean, I think a place to start is to understand what so-called artificial intelligence, AI, is and what it isn't.
So the term artificial intelligence was originally a marketing term from the 1950s, and it was a very smart marketing term, but it doesn't really mean anything. What's
(06:31):
today called AI, artificial intelligence, the likes of ChatGPT and so forth, is not intelligent.
These aren't thinking machines.
We're not talking about smart robots or smart computers that can think unusual things.
That's an image that's been projected by this industry (and it really is an industry)
(06:52):
in order to sell themselves and to look impressive.
Really, all artificial intelligence is is very complicated sorting of data in order to make predictions.
So an artificial intelligence system will take large amounts of data.
(07:12):
It will use that to build a model, and on that model it will try to predict things.
It will try to predict what's the next word that it might spit out, and that's how ChatGPT works, for example, for those who've used it.
So it looks like it creates new text, but it's really just trying to predict what it is you want it to say.
(07:33):
Or these so-called artificial intelligence image machines, which take lots of images, scrape them off the internet, and then, when you ask for something, try to predict the image that you would like. In many ways, what artificial intelligence is (and it's been classified this way) is kind of a bullshit machine.
It's literally something that's trying to guess what you want
(07:57):
to hear, and that, of course, produces texts. It produces images, it produces things that can look very impressive, but they're just very complicated sorting and prediction, and some significant portion of the time those predictions are wrong, and certainly those predictions are not thinking.
They're literally just trying to predict what you want to hear.
(08:21):
That ability to predict words or images and produce what look like new texts or new pictures or new videos is, of course, something that this industry is now selling to the public, to militaries, to governments, as
(08:42):
if it was magic.
You know, many, many technologies try to sell themselves as magic, and AI is the latest in a long line.
Steve Taylor (08:50):
It's clear you're not saying that AI is some sort of malevolent intelligence or has any cognizance or anything like that.
What you're saying is that it's good at scraping data, and I think that's going to be relevant to our conversation.
It's good at scraping data and being predictive and sorting through just tons of data, like maybe DNA strands.
(09:12):
Right, there's a lot of data in DNA strands.
So tell us now.
I mean, we're not just going to talk about AI; we're going to talk about how AI may be given a new role when it comes to something called generative biology.
Jim Thomas (09:34):
Sure.
So many of us are familiar with so-called AI, artificial intelligence, at the moment, and many of us are using it, whether we choose to or not.
We're used to its use for text, for chatbots such as ChatGPT, or that little AI overview that now turns up at the front of Google, or for making images of ourselves, these kinds of things.
(10:01):
And there have been many hundreds of billions of dollars put into this industry by venture capital, by large banks and investors, and they're getting a little bit antsy because the amount of money that they're putting in (they're spending trillions on building new data centers) isn't really coming back with anything that is very impressive, and so they're looking for something that looks like it's a real breakthrough.
(10:21):
And one area where companies such as Google or Microsoft or OpenAI are trying to do that is in the area of genetic engineering.
The argument is: yeah, we have chatbots that can generate supposedly new text and new images, but we can also generate new viruses, we can generate new organisms, we can
(10:42):
generate new proteins, and we could generate new chemicals that could be drugs or new materials. And so this area of generative biology is sort of quietly exploding right now, and just as the sort of AI you have in ChatGPT is based on scraping all the text off of the internet, this
(11:06):
generative biology AI starts with scraping all the DNA sequences that they can get.
The G, T, C, A, the so-called language of genes, is all being sequenced and put up on databases, and all of that is now being used as the training data for AI models.
(11:27):
And then the companies, such as NVIDIA or Microsoft or Google, will say: make me a protein that is red and thermally stable, or make me a virus that does this or a bacteria that does that, and the AI will try to generate the code for building that
(11:47):
particular living organism or protein or virus.
So we're moving from generating pictures and text to generating living organisms, proteins and things that are biologically active in our environment.
Steve Taylor (12:01):
So I guess you would take this technology, the capability to be predictive when it comes to the genetic code, and combine that with, maybe, CRISPR.
That's an interesting thought.
It's a scary one.
What are your thoughts on this?
Jim Thomas (12:23):
Yeah, so we've been talking a little bit about artificial intelligence and digital things, but if you move over to the world of biology, the ability to genetically engineer biology has really shifted in the last 20 years.
Whereas 20 years ago we were talking about genetically modified soybeans and corn and things like that, where you
(12:44):
would cut out a piece of DNA and move it across, now, as you mentioned, you have technologies such as CRISPR, where you can just directly change the code of an organism at the genetic level, or you can just print out the DNA you want on a DNA printer.
This is very common now, and so this whole area of what's called
(13:05):
synthetic biology, the kind of upgrading of genetic engineering, opens up the opportunity to start changing the microbes that are in the soil, the microbes that are in your body, the insects, as you say, that are affecting how ecosystems work, the trees, everything.
(13:27):
Everything becomes kind of editable and changeable at the genetic level, and you have what's being called a convergence between, on the one hand, these very strong biotechnology tools and, on the other, artificial intelligence, which, through this generative biology, can say: here are the codes you
(13:48):
need to change, as if you're changing the code of a computer program, but you're supposedly changing the code of life, such that you can get any number of different outcomes, and that's a tremendous power.
Of course, governments are very concerned about other governments being able to develop new bioweapons, but also
(14:09):
they want to control that ability.
They want to be able to make new powerful weapons or toxins and so forth.
But in many ways it's more serious when you think about how you can now start to reproduce proteins and chemicals this way that would have been grown by farmers.
(14:30):
You can recreate vanilla or you can recreate saffron, and in doing so you can take over those areas of production and put them into vats, so you can start moving farmers off of the land.
You can start really significantly changing how things are produced in our economy, and that's the real power behind this: changing the basic production of our
(14:55):
everyday needs.
Steve Taylor (14:57):
So, Jim, when it comes to your knowledge about this, you actually have an official capacity at the UN, with a group that monitors such technology.
Could you tell us a little bit about that?
Jim Thomas (15:09):
Yeah, I mean, for the past decade I've been sitting on an expert group that tracks this area of synthetic biology, the new ways of doing genetic engineering.
So I sit on that at the Convention on Biological Diversity, and that expert group, which is made up of experts from different governments, has been particularly saying we need
(15:30):
to have better horizon scanning and assessment of these new technologies and what they mean for biodiversity, for their impact on communities, especially indigenous communities, and it has in fact been particularly waving a flag on artificial intelligence and generative biology.
That's the way I know about that.
So, yeah, along with others, I've been involved with
(15:54):
trying to bring these topics to the Convention on Biological Diversity and other UN spaces.
Steve Taylor (16:02):
It's a fascinating subject.
It's a very scary one, because what I'm hearing is that there's been somewhat of a fundamental shift when it comes to the COP.
Jim Thomas (16:14):
What I would say about that (and this goes to how these technologies are going to be employed) is that, increasingly, there is an industrial interest in trying to turn nature into a sort of financialized market.
So, on the one hand, can we genetically engineer and alter nature so that it will take up more carbon dioxide and
(16:36):
sequester more carbon dioxide? Can we change other natural processes such that we can financialize them as carbon credits or biodiversity credits? And so there are a lot of high-tech industries that are now swarming around the Biodiversity Convention looking for the opportunity to sell
(16:56):
artificial intelligence services, new genetic engineering technologies and also carbon credits and biodiversity credits. This is sort of seen as a new economy, what they're calling a new bioeconomy, and these are the technologies that will enable that, and enable them to take lots of money in the process of trying to build that different economy.
Steve Taylor (17:17):
Could you tell us a little bit about how maybe this notion of using artificial intelligence and genetic engineering to help biodiversity might be lining people's pockets more than helping or preserving biodiversity?
Jim Thomas (17:35):
Yeah, and I would say I think our movements, particularly climate justice movements, have become familiar with the way in which carbon credits and carbon finance, climate finance, are being used to drive these false solutions.
But one thing I heard somebody in this world say: well, climate is Ken, but biodiversity, that's Barbie.
(17:56):
The potential to develop new technologies and sell them, the potential to securitize not just carbon but also other parts of biodiversity, even the actual diversity of biological material in one area or different conservation areas, and water
(18:19):
cycles and nitrogen: it sort of becomes like a playground for those who want to financialize.
So, to be very specific, if you can create genetically engineered organisms that will change the nitrogen cycle, or that will supposedly sequester
(18:39):
more carbon, or you can genetically engineer insects that will remove invasive species by killing off certain invasive species, then all of that could first of all be sold, and so you can make money there.
But if you can put biodiversity credits alongside that, and you
(19:02):
can claim that you're increasing biodiversity and therefore your biodiversity credits are worth more, then you've got another market.
There's a third kind of market going on, which is in the DNA itself.
So it used to be that you had bioprospecting or biopiracy companies who would scoop up DNA and send it abroad.
But now they scoop up DNA from communities, from waters, from
(19:25):
soils, and they sequence it and they put it in these databanks, and that digital version of DNA, what's called digital sequence information, is now the raw material for building new genetically engineered things, and so that's actually what you're training these AI models on.
So DSI, this digital sequence information, is becoming a new
(19:47):
type of commodity, and this mirrors what's happening in the economy more generally, where data is the new oil, basically, and there's no bigger amount of data than genomic data.
There are literally trillions of letters of G, T, C and A out there that you could sell and trade in, if you can make an
(20:08):
economy around that.
Of course, it's a highly speculative economy, but that doesn't really matter if you're just trying to make some money in it.
Steve Taylor (20:15):
Well, what's interesting about this is that the carbon credits, the plans, the 30 by 30 often disproportionately impact the global south, the indigenous peoples, the traditional communities, who are going to be moved off the land, when they're the ones who have
(20:39):
a culture, a lifestyle, that promotes biodiversity, that nurtures biodiversity, and are not the leading cause of climate change.
So we put an undue burden on these communities and indigenous peoples.
Would that be any different with what's happening with this
(21:04):
generative biology?
Jim Thomas (21:08):
...and ocean communities, who have really stewarded biodiversity, who have developed the plants and the animals and the breeds and the seeds that are the basis of people's food security and their cultures.
(21:28):
And that's now what the companies, first of all the biotech companies and now the AI companies, want to plunder; they want to take that and use it to train their AIs in order to produce new products to sell.
And then they'll tell us that those new products that they're selling are the solutions, these kind of single-bullet
(21:50):
technical solutions.
But that ignores, first of all, the knowledge of the people who actually developed what's in place, and it will ultimately move them out of the way, because it will be putting in place monoculture agriculture or agribusiness and these kinds of private conservation models that you
(22:13):
mentioned, about 30 by 30.
Steve Taylor (22:34):
So there is definitely a continuum.
As always with these new technologies, they start by taking and extracting, and then, once they've extracted what they need, they move out of the way the people they actually extracted from.
So I think it's the same old story. It seems to me there was a fundamental shift during the Cali COP.
Jim Thomas (22:42):
Yeah, so there's been a long discussion within the Convention on Biological Diversity about the problem of biopiracy, where the global north and industrial companies try to take biological materials, whether that's genes or seeds or breeds, and then industrialize them and profit off them
(23:03):
without giving any benefit back to the communities who first held them.
Usually they just take and steal those resources.
And so within the convention there's been this idea of what's called access and benefit sharing: if you're going to access these genetic resources, then you're going to have to
(23:23):
make an agreement to share the benefits back with those who originally developed them.
And that was set up under something called the Nagoya Protocol, which basically required that if you're going to take DNA or seeds from one country and move it to another part of the world, then you need to have prior informed consent, you need to have agreed material transfer agreements and
(23:48):
a promise that you're going to give some benefit back.
And that was already problematic, because it didn't actually necessarily give communities the opportunity to say: no, you just can't take it.
This is going to hurt us.
This is going to hurt us.
But with the advent of digital methods of turning DNA into data, and now with things like CRISPR, and particularly this
(24:10):
artificial intelligence and generative biology, it all got circumvented.
Now you can scrape DNA from anywhere in the world, you can upload it to the cloud, so-called, and then it frees those companies of this obligation.
So something that actually happened in Cali was an agreement, which has been 10 years in the making, to force,
(24:33):
well, theoretically to force, companies who are taking digital sequences and industrializing them and getting a profit off of them to pay something back.
So they said: if you're a company of a certain size and you use this digital DNA information, then you should pay back to indigenous peoples and others from whom this was taken,
(24:56):
into a fund, what was called the Cali Fund, and so that's actually something that got established.
It's not clear yet how many companies are really going to pay into this.
It says they should do this, but it doesn't exactly force them.
The good news is, a decision was made that AI companies, artificial intelligence companies, have actually got to pay into this.
(25:16):
They should pay 1% of their profits or 0.1% of their sales into this fund, and that fund should give monies back to indigenous communities.
There are many devils in the details.
Theoretically, that means that there should be billions of
(25:39):
dollars coming from the likes of Microsoft and Google and NVIDIA, these big AI companies, back to indigenous peoples, but how that's going to be handled, and whether any of that will actually get to the communities whose DNA and data have been stolen, that's all got to be worked out.
And it kind of doesn't fully address the bigger problem,
(26:03):
which is that there is this different economy being put together, which might end up moving indigenous people and communities off the land in order to have biodiversity credits and carbon credits and closed conservation areas.
It would be terrible if what the Cali Fund allows is
(26:25):
basically to sort of legitimate this new digital bioeconomy.
So it's a double-edged sword.
Steve Taylor (26:32):
Yeah, it does sound like it.
It sounds like a voluntary enforcement mechanism.
Jim Thomas (26:40):
It would be a bit more than voluntary, because there are definitely southern countries, Brazil, India and many of the African countries, who are going to push hard to try and force those companies to pay.
How about the US?
So the US, of course, isn't a party to the Convention on Biodiversity, and most of these companies are based in the US,
(27:02):
and the US has said, especially under the incoming administration, we're not going to make them pay.
But those companies, whether it's Microsoft, Google and so on, also operate in every other part of the world, and the European Union, for example, has already indicated that they think big tech companies should pay into the Cali Fund, and they've got a bit of a history in making Microsoft, Google and
(27:26):
others pay antitrust monies.
So maybe there'll be some money.
Steve Taylor (27:31):
Isn't there some principle regarding the introduction of new organisms into the environment that is of concern to the UN body?
Jim Thomas (27:45):
So the Convention on Biological Diversity has long upheld the precautionary principle; there is even a protocol, the biosafety protocol, at the convention,
(28:12):
which is about trying to enforce the precautionary principle around genetically engineered organisms.
The problem we're seeing is that the biotech industry has more and more successfully convinced European, and obviously US and other, governments to clear away the precautionary principle.
They've really sold biotechnology and now artificial
(28:34):
intelligence as the sort of future for economies, and said that anything that stands in the way is going to keep countries behind in a sort of high-tech race for innovation.
And this kind of language is now very, very visible in what should be an environmental and ecological convention, but it's
(28:56):
looking more and more like a trade fair between governments.
So there's definitely a problem here with the sort of surging power of the biotech industry, and there's almost no industry that's more powerful than the AI and tech industry.
Steve Taylor (29:12):
You play around
with artificial programs, you
know.
Ask them to create a picture,you get all these errors.
Jim Thomas (29:19):
I mean, yeah, and that's a very good point: about a third of so-called generative AI output has these errors.
They're called hallucinations.
That's built into them.
That's just how they work.
So you end up with pictures with six fingers, or people who have kind of extra arms, and things like that, and that's funny when you're looking at a picture.
(29:39):
It's not so funny if you're creating a new organism, a bacteria, a virus, that's going to get out into the environment.
So there's a real urgent need for precaution when you're using these sort of bullshit production technologies.
Steve Taylor (29:58):
Almost feels like
we need to slow this down,
doesn't?
It Sounds like we need to slowthis down, but it's the opposite
.
I mean, the big tech tech brooligarchy is just running away
with things.
Billionaires who want to escapethe planet go to Mars when we
can't take care of this one.
I'm just not getting a goodfeeling about it.
Jim Thomas (30:21):
You're exactly right that we're now seeing that surging power of the sort of the tech bros, the brologarchy, as some people have called them, taking over Washington and so forth.
But I think there are places where people can resist.
For example, for artificial intelligence to work, or seem to
(30:42):
work, they need these massive data centers, these huge warehouses of computers, and the AI data industry is now putting a trillion dollars into trying to build out enough data centers to run all of their AI programs.
And where are they placing them?
They're placing them on the edge of cities.
(31:02):
They're placing them in the countryside around cities.
They're placing them in places where people can fight them.
To be honest, these data centers are using incredible amounts of water.
They're using incredible amounts of energy.
They're producing incredible amounts of pollution, because
(31:23):
they actually do often run on gas and other things, all of which communities can fight against.
Every time you put a question to ChatGPT (it's not really a search), it's the equivalent of pouring away a bottle of water.
In fact, it may be more than that.
(31:45):
The energy use is just through the roof. Currently something like two to three percent of all global electricity is going to data centers, but it's going to be about nine percent of all electricity by 2030 in North America, and that's blowing through all of our climate goals
(32:07):
and so forth, and so there's good reason for communities to say: no, you can't do this.
You can't be building these big AI clouds (they're not clouds, they're pretty heavy) right in the middle of our communities, taking our water away from our farms and from our biodiversity, taking our electricity and energy that should be for our hospitals and our schools, and causing
(32:27):
pollution.
This is actually a front line where communities can fight back, and they might win.
Honestly, I think that's a place we can fight.
Steve Taylor (32:41):
I don't hear a lot about AI when it comes to government oversight.
Am I missing it?
Are you seeing it in other countries?
In the United States, AI can do no wrong.
Jim Thomas (32:53):
Yeah, I think the problem is governments have been very much told that AI is the future for their economies and that they have to be locked into a race.
There's a race between China and the US and Europe and Russia and Brazil and India, and they're all trying to have a place in that AI future.
(33:13):
The European Union has an AI Act which is beginning to do some regulation.
The Biden administration brought in an executive order which had some oversight, not very much, and that's about to be ripped up by the Trump administration, or the Musk administration, as we should call it.
And I don't think it's going to be in the capitals that you're
(33:37):
going to get a pushback against AI.
I do think it's going to be in communities.
I think it's going to be in communities who are threatened by data centers.
It's going to be in communities who find that their essential services are being destroyed by AI algorithms, and that their economies are being destroyed by AI-driven commerce, and their jobs are being taken away by AI.
(33:57):
It's going to be communities, particularly in the south, or in places like Palestine, who are being bombed by AI.
There are many reasons why movements are going to increasingly come together and say: this is wrong.
This is not what we want.
Steve Taylor (34:15):
It's a kind of colonial occupying force, and I think there's a big fight coming on this, and it's not going to be governments who turn it around, it's going to be people.
And, of course, those who stand up and say maybe we ought to think about this AI, when it comes to employment, when it comes to the environment and all these social, political and
(34:35):
economic reasons, are going to be called Luddites.
You know it, you know it, right? So maybe...
Jim Thomas (34:43):
And when we go back to the original Luddites, maybe they are.
They're people who are asking: is this a technology for the common good?
Steve Taylor (34:54):
And if it's not a technology for the common good, then let's move it out of the way.
I don't know.
We're probably close to the same age, and I was reading your bio, and you go back into the 90s.
I go back into the 90s.
I started, you know, opposing below-cost timber sales in the United States and organizing protests and blockades and
(35:15):
things like that.
You probably did, or know of, similar things. But I remember back in the 90s it was sort of like: well, you know, there's this global warming, and if we don't do something... But it just feels like, now that it's here and at our doorstep and consequences are becoming more apparent, we're just moving to these carbon trading schemes,
(35:40):
and now our solution will be to build these large artificial intelligence machines, data centers, cloud, whatever.
I don't want to sound too old and out of it by using the term machine, but that's what we're doing.
We're building these AI complexes that are just
(36:02):
degrading the environment more, and saying our solution may be in creating a lot of this generative biology.
It just seems like a move in the wrong direction.
Jim Thomas (36:12):
I think what I'd point to is that the silver lining of all this is that many of our movements are already beginning to prepare to fight, and some are even winning.
You may have seen last year that the movie screenwriters and the movie actors and so forth all had a big strike, and it was
(36:34):
because of AI.
AI was going to put them out of a job, and they won.
They put all sorts of things in their contracts so that AI couldn't be used in acting and writing screenplays and so forth.
That's an example of fighting back against AI in a very niche industry, although it happens to be one that many of us see, and those sorts of fights can happen, particularly across different
(36:56):
parts of labor.
They're going to have to happen, because this is going to affect the environment and climate change.
It's going to happen in communities.
It's going to happen in agriculture, where AI is increasingly driving automation of agriculture and pushing farm workers off the land.
We can see these as separate fights, or we can bring them together and say this is not acceptable.
(37:18):
We know, as you were saying earlier on, we know how to live in ways that are just and ecological and in community, and AI probably isn't a part of that picture.
Steve Taylor (37:31):
So, Jim, I did want to ask you about this whole thing with the techno bros or whatever, these billionaires.
You've got Musk, who wants to go to Mars.
This is such an old fantasy.
Have you ever looked into that?
I mean, it just seems so unlikely that we're going to be able to live on Mars when we're facing, you know, the real possibility that we won't be able to live on Earth.
(37:53):
And there's even an international scientific body that says, you know, there's too much radiation because of a lack of a magnetic field around Mars.
We have the Van Allen belts here.
There's no liquid core in Mars.
You're just going to be irradiated.
It just seems like a childhood fantasy.
Jim Thomas (38:10):
Thoughts on that? Yeah, I mean, let's look at what Musk is really doing.
He's putting up satellites, the Starlink satellites, around the planet, which are then allowing him to control communications, including communications in war and battles.
So that's a very smart move if you want to gain sort of
(38:30):
strategic power.
He's making deals with NASA and others to take his rockets up into space, and so he's beginning to try and sort of put himself ahead in a commercial space race.
The thing that these tech bros or technology companies always do is they'll give some kind of aspirational dream: we're going
(38:54):
to have sustainable food, we're going to have net zero, we're going to go to Mars, or whatever it is, and all of that is always a sheen in order to move forward with what they really want to do, which is to build new ways to make money.
(39:15):
So I don't think anyone really wants to go to Mars, but they do want to build an industry that they can use for military purposes, industrial purposes and so forth.
So that's the near-term money that's being made behind these dreams.
I think there's a deeper thing as well, though.
I mean these fantasies of leaving the Earth and going to
(39:37):
outer space.
For example, Jeff Bezos very strongly believes this, and it's the complete focus of Blue Origin, his space company.
Actually, he wants to build the highway to space.
That's the thing that Bezos is interested in.
But you know, it's also what allows these sort of sci-fi-like
(39:59):
tech bros to legitimize things that ultimately destroy the Earth, you know, because they can imagine that the future lies somewhere else.
Jeff Bezos actually is someone who concerns me much more in that case.
You know, Jeff Bezos controls the Bezos Earth Fund, which has now become a major philanthropic player and is putting tens of
(40:22):
billions of dollars into biodiversity and climate, and yet he has this fundamental belief that he's going to move all of humanity off of the planet into orbiting space stations.
He then says, you know, Earth then becomes something like
(40:42):
Yellowstone Park, a kind of protected area with a few indigenous people left in it.
And that's the view of not just Jeff Bezos but the Bezos Earth Fund and a kind of increasingly fortress conservation movement that thinks, if the technology can just move humans off the planet, then we can treat the planet as a kind of tourism destination.
So there is a sense in which, whether or not they manage it,
(41:06):
these kinds of stories then manifest in really perverse approaches to how we deal with our planetary problems, whereas I think, as ecological movements, we're saying no, we don't want to go to space, we want to come to Earth, we want to come back to Earth, we want to be of Earth, and that's what we connect to,
(41:27):
not to technologies that put us behind screens or out in space.
So I think it's a useful thing to push back against.
Steve Taylor (41:35):
It is.
We've spent billions of years co-evolving, and now we want to just ignore the wisdom of that co-evolution and just use AI to create an abundance of genetically engineered organisms to release into the wild, while we have dreams of living off
(41:56):
Earth, terraforming a planet, ignoring how inhospitable that planet is and never talking about mitigating the radiation from the sun that's hitting the surface of that planet.
Then, you know, all this is ensconced in this Star Wars, Star Trek fantasy that there's all this
(42:16):
life out there, and maybe there is.
But space is a big, big place.
Space is vast, inhospitable, lonely even.
I think William Shatner took a ride on a Blue Origin craft, and he reported back that when he looked out there at the depths of space he just thought of death and his own mortality.
(42:37):
But we do need to... You know, I don't want to be too much of a downer, but it is just amazing to me how we have this mythology that we're going to expand into the galaxies and all of this, you know, one planet at a time, let's say.
I mean, we can't even handle this one.
Jim Thomas (42:58):
Yeah, I mean, you mentioned you were involved in resistance in the 90s behind the banner of Earth First.
And that's where we need to start.
Start with this planet.
Steve Taylor (43:09):
Well, maybe we ought to leave it there.
Jim Thomas, thank you so much for joining us on Breaking Green.
Jim Thomas (43:14):
Thanks, Steve, that was fun.
Steve Taylor (43:17):
You have been listening to Breaking Green, a Global Justice Ecology Project podcast.
To learn more about Global Justice Ecology Project, visit
globaljusticeecology.org.
Breaking Green is made possible by tax-deductible donations by people like you.
These help us lift up the voices of those working to
(43:38):
protect forests, defend human rights and expose false solutions.
Simply text GIVE, G-I-V-E, to 1-716-257-4187.
That's 1-716-257-4187.