Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:28):
Hello everyone, and welcome to Books and Done.
I'm your host, Livia J. Elliott.
And today we are doing another Author Spotlight.
I'm honored to be joined by the doctor of cyberpunk himself,
fellow Aussie, TR Napper.
Tim, thank you so much for joining me today.
G'day, thank you, Livia.
(00:49):
For your listeners, I'm T.R. Napper,
I'm a cyberpunk author,
and I am Australian, as Livia said.
I came to writing a bit later than most.
I was an aid worker for a decade,
at least before I became a writer.
And so I lived in different parts of Southeast Asia,
working on a humanitarian program.
So a lot of my work is based there
in different parts of Southeast Asia.
(01:09):
I've got a collection out called Neon Leviathan.
I've got a novella out, which we're talking about today,
called Ghost of the Neon God,
and two novels set in that world.
Everything I've just mentioned is set in the same world.
A novel called The Escher Man, which came out literally,
how many days ago, nine days ago,
but isn't out in Australia, actually,
in physical copy until the 1st of October.
(01:30):
So that's The Escher Man,
and my debut novel is called 36 Streets.
And we'll go into, I'm sure Livia and I
will go into some of the themes of Ghost of the Neon God,
which is also one of the themes that resonate
through the rest of my work.
I also wrote a tie-in novel in the Aliens universe
called Aliens: Bishop,
which was something I never thought I'd do,
but I never thought I'd be a writer, so.
But I've written a tie-in novel as well
(01:50):
for the Aliens universe called Bishop,
which is pretty cool.
I think that's all you need to know.
Cool.
Before we get started,
let me do the usual disclaimers.
First, in this episode,
we will deep dive into his novella,
Ghost of the Neon God.
As a result, the discussion around Ghost
will contain spoilers,
but we will not spoil the other standalones
(02:13):
in the same universe.
Namely, you can listen safely
if you plan to read 36 Streets or The Escher Man.
Second, what you will hear
is our subjective opinion on these topics.
You may disagree, and that's okay.
Now, Ghost has a lot of themes we can discuss,
but I think that an umbrella theme that I found,
(02:36):
and I could be wrong here, of course,
is exploring the elements of cyber psychology,
namely how online and virtual technology
affects human behavior
and how our behavior changes in those interactions.
There are some very interesting discussions
when Sally and Andiri discuss that.
(02:57):
What attracts you to that theme or that dynamic?
I'm just trying to think of how I address that in the novel,
but certainly generally, it's absolutely true.
The way a person behaves, oh, well, unfortunately, it's sadly true:
the way a person behaves online
is often very different from the way they behave in person.
Online, we have an audience, we have an imagined audience.
(03:19):
Every time we go onto social media,
for writers like me,
but I think increasingly for everyone,
we have an image that we wanna project,
and the way you speak to an audience
is very different than the way we speak to individuals,
as it should be.
I think it's one of the sad things I see.
(03:39):
I'm drifting away from my book already.
But one of the sad things I see is,
when I meet people in real life,
and they don't understand
that they're talking to a human being anymore,
and they talk to you like there's an imagined audience
behind my shoulder that they're trying to impress,
and it's a set of talking points that they go on with
rather than just having a human conversation.
That's a trend that I've noticed.
(04:01):
The other thing, of course, is the intermediary.
So all our communications are through, not all of them,
but the vast majority are through sites
owned by vast corporations, which are sucking down
all of our data and personal detail,
with a view to manipulating us.
And the types of manipulation are often benign,
if there's such a thing as benign manipulation,
(04:21):
but manipulation in the sense of trying to sell us things,
which on the surface maybe doesn't sound so bad,
but to get our attention online,
the best way is to aggravate us or to upset us,
rather than to make us happy or tell us the truth.
So getting people's attention can be very polarizing
insofar as algorithmically,
they're trying to bring out the worst in us
in order to keep us here on this device
(04:41):
for as long as possible.
So these interactions in that sense can be very unhealthy.
Obviously it's good to connect to people.
I like connecting with readers overseas.
That's the single best function, quite frankly,
from my perspective: connecting with people
I never would have met in real life,
and having them talk to me, that's fantastic.
But I wish it was only that, the good old days
when Facebook was just for connecting with people,
(05:04):
mates from overseas who you hadn't seen in a while.
There is also something that Cole implies
that ties in with what you mentioned,
and that's very early at the start.
He says that every piece of data that is collected
from the implant they have is converted into data
for behavioral analysis and marketing.
And I actually found that quite extreme.
(05:25):
And part of it is exactly as you say,
that everything we do online,
our behavior is datafied, so to speak.
Everything becomes data points for somebody to use
for profit or sometimes maliciously.
How did that help you define the idea of Ghost?
Because I think that even if in this novella,
(05:46):
the effect of the implant is a bit subtle,
it's not touched on in much detail.
It is, I think, something that defines the world
that everything goes through that implant,
that every type of behavior they have is being collected.
Yeah, well, I mean, this is just an extrapolation
from the present, and this is for me,
Ghost of the Neon God is set in about the year 2100.
(06:07):
So that's, and most of my work is set around then.
My short stories range from about 2040
to as far out as 2200, I suppose.
But my main works, Ghost of the Neon God,
The Escher Man and 36 Streets, are set about 2100.
So where are we going with smartphone technology?
Would we put a smartphone in our head if we could?
(06:29):
The answer is yes, absolutely.
The reality is, this is what I try to do with my fiction,
is to try and make a plausible extrapolation
from the present.
So it's entirely possible that by the year 2100,
we can have neural implants,
and that we can have the internet in our head
all the time, constantly.
What are the implications of that?
Well, they're just exaggerations, if you will,
from where we are now,
(06:50):
how much memory we store already in a smartphone.
Our smartphones function as part of our memory banks.
It's an exomemory.
Our smartphones contain things
that our human brains
aren't very good at remembering,
like phone numbers and birthdays, which is great.
It's good to have an exomemory for that type of stuff.
But our smartphones increasingly form
(07:12):
our memory and thinking function.
So if we can't remember, and I'll give you an example,
we can't remember a name of a band,
or the definition of a word,
or the capital of a particular country,
that we know we know,
but instead of taking 30 seconds
and letting our memory banks catch up,
we grab our smartphone, right?
Well, this is a phenomenon called cognitive offloading,
(07:32):
which is when we outsource our thinking to our smartphones.
The consequences of that, they think,
and they're doing studies on this,
are that our memory function can decline,
and our recall capacity can decline.
If we think of memory as a muscle,
we're not exercising our muscles,
and so we increasingly become reliant on them.
Our brains are very malleable,
so we can possibly recover,
(07:53):
but it's certainly true
that the more dependent we become on the phone,
the more dependent we are on our phone.
This is just something that's happening in the now.
This is not science fictional,
this is our cyberpunk present.
And of course, our phone monitors us all the time.
It's our stalker in our pocket,
everywhere we go, every time we buy something,
every time we look at a screen,
(08:14):
how long we look at a screen,
every infinitesimal detail, it collects.
Do we think for a second they're gonna stop here?
Ha ha ha, this is nothing,
compared to 10 years or 50 years,
or in the case of my books, 75 years or so.
So I think about that.
Privacy is an interesting thing.
I never really cared too much about privacy.
Like I cared about privacy,
(08:35):
but as a human right, I'm like,
well, I prefer the right to an education and clean water.
I mean, privacy doesn't resonate as much,
because certainly in my previous employment,
I was working on education and clean water
and things like this.
So privacy, I never really cared about too much.
But if you start to think about privacy
as a collective right,
if you think of privacy as,
it's not just for you,
(08:57):
it's so populations can't be manipulated.
Populations can't be made depressed
and develop mental illness as we see in kids,
probably in part through social media.
It's fascinating when you look at young people today,
compared to Gen X, which I am,
the rate of binge drinking is half
what it was when I was young,
the rate of having sex at the age of 15 is half,
(09:20):
and risk-taking behavior is half.
So that all sounds pretty good, doesn't it?
But the mental illness,
depression and anxiety is more than doubled.
Well, why is that happening?
Now there can be a lot of other reasons too,
climate change and so forth,
but we're on these anxiety machines all through the day,
and that's certainly part of it.
I've kind of gone off on a tangent,
but going back to privacy,
if you're unknowable, you can't be manipulated.
(09:43):
You can't be sold to,
you're not another cog necessarily
as part of this giant machine
that just wants to sell us and exploit us
and use us as a profit center.
And sometimes worse, of course,
I mean, the best example is a profit center.
The worst example is psychological operations
that the Russians, for example,
might be running against us,
manipulating us to hate each other in our own country,
(10:05):
to disbelieve in democracy,
to believe the very worst of the people
with a different political persuasion.
Not just that I disagree with them,
but that I want them to die,
making social conflict more likely.
An obvious example is the Russians,
it was discovered, for example,
ran operations during the Black Lives Matter movement
(10:25):
where they would organize Black Lives Matter
and Blue Lives Matter protests
in the same town at the same time.
That's so fucking cynical,
but it's part of the machinery of manipulation,
the instruments that we have available.
If you can think of the worst thing
someone would use this infinite information technology for,
someone else has already thought of it and is doing it.
(10:47):
And so privacy, going back to that point,
makes those things harder.
Having a right to privacy,
having a right to get rid of your Google history,
for example,
having a right just to go somewhere
and not be recorded all the time,
actually matters for kind of a social good
and a social health,
because if we don't have privacy,
all these other things come into play,
all these other instruments of manipulation.
(11:08):
It's very interesting,
especially when you take into account
the possibilities of an informational war,
since conventional war is no longer viable,
that type of collective manipulation through the internet,
through misinformation,
is one of the few ways countries have to fight each other,
economically.
Well, I mean, of course,
(11:29):
and I'm sure the Americans do it too,
but the Russians and the Chinese
quite openly commit to information warfare.
They make no secret of it.
It's not a conspiracy theory.
They're actually doing it,
and they're quite open about their attempts to do it.
Misinformation and disinformation
are part of our landscape.
They're part of every day.
Every day I get online and I see clear disinformation
(11:49):
that someone else will be repeating.
And I think, really, did that really happen?
We see it on Twitter all the time.
We're in an insane period.
And of course we got blasted here to a lesser extent
in Australia around elections,
but just imagine being American at the moment,
the tsunami of madness that they're sort of drowning in.
And the thing about misinformation and disinformation
isn't making someone believe in lies,
(12:12):
it's about making people never able
to believe in a truth.
That's the aim.
I don't wanna use American examples,
and I hate having to do it,
but for example, the election was stolen.
30 plus percent of Americans
believe the last election was stolen.
They can't believe the simple truth.
American elections are weird,
but they're as fair as they can be in that country,
you know what I mean?
(12:32):
People can never believe in a truth,
even if it's a truth of like,
Vladimir Putin doesn't have our best interests at heart.
And that's the scary thing for a society.
If there is no truth anymore,
all sorts of demagoguery are possible.
All sorts of hatred are possible.
Living in a post-truth world can be scary sometimes.
Yeah, and you know how they say
(12:53):
that information is a resource more valuable
than any other currency.
It's the most important currency.
And diluting it with so much information
that we are looking at all the time
creates disinformation.
Because exactly as you said,
you don't know exactly what is truthful.
You can't corroborate the sources.
(13:14):
And even the fact checkers that we have,
there is a reasonable doubt
that they could be manipulated as well,
defeating their entire purpose.
So the technology that was actually created to connect us
in ways that human connections
couldn't be made before
is actually our worst enemy to some extent.
Yeah, I mean, the same avenues
(13:34):
that could be used for connection,
for understanding, for feeling like we're living
in a closer world, a more immediate world,
have been made to do the opposite,
have been made to make us more atomized,
more distrustful, less likely to cooperate
and less understanding.
And that's the thing about technology:
it's neither good nor bad.
And this is something good Cyberpunk understands
(13:55):
that technology itself isn't good or bad.
But it's the question of what socioeconomic system
it's functioning in.
Who owns this tech?
What do they wanna get from it?
And if we're living in a world of staggering inequality,
and if we're living in a world where a few oligarchs
and a few billionaires are increasingly more powerful
and own all this technology,
(14:16):
well, what do they want from it?
These Cyberpunk speculations are really,
in a quite obvious way,
just extrapolations from where we are now.
It's just the DNA.
The future, the futures that I try to write anyway,
the DNA of them is all here right now.
I don't think many things I write are implausible
in terms of, well, this might be where we're going.
Cyberpunk's meant to be a warning.
(14:37):
It's meant to be saying, this is a really bad idea.
Sometimes it feels just that
we've ignored all those warnings
and we're heading down that road.
And it's somewhat dystopic.
And there are parts of the world which are full dystopia,
as others have mentioned.
If you're in occupied Ukraine,
or if you're living in Xinjiang and you're a Uyghur,
you're living in a dystopia.
But the other thing about Cyberpunk,
which is important to me,
(14:59):
is that human rebellion exists,
that the flame of the human spirit still exists,
and there are people who refuse
to bend the knee to this system.
And I think Cyberpunk, it might be a bleak future,
but it's one where the human spirit is still alive.
At the end of the day, technology is like any other tool.
It can be used for good or for bad.
It depends on who is wielding that technology.
(15:21):
Circling back to something that you mentioned before,
when you were talking about doing plausible extrapolations,
something that I found terrifying in this novella,
is how feasible everything felt
from a political standpoint.
Since we are in Australia,
and I have some understanding
of what's happening in Australia,
every geopolitical comment that was mentioned there,
(15:44):
it felt very possible,
especially when we open up the news
and see things that are leaning in that direction.
So is this related to your background?
Did you focus here so that you could make
that reality feel more real?
As an undergraduate and a postgraduate,
I actually was in international relations,
and I was a diplomat and aid worker for a decade.
(16:04):
They have hard science fiction;
I like saying I write hard geopolitics.
Having that in the background as part of the world building,
it all has to make sense to me and be quite rigorous.
And so politically, again, I look at trends.
I imagine a world where America collapses
and China is the sole superpower.
Now, I don't necessarily think that will happen,
but I think it's a very plausible future.
(16:27):
Well, if it did happen, what would the world look like?
What would Australia's position be?
What would it be like for China's neighbors?
What would it be like for Europe?
Well, I think about it in terms of power dynamics
and what the world would look like.
What would democracy look like?
Here, we would be a very tenuous democracy
that is essentially a resource pit for China
and wouldn't have any real or effective foreign policy.
(16:48):
That's what we'd be.
America would be, if it collapsed,
no one would really care about it anymore.
A reader said that what I wrote
was a post-Western future,
which I never thought about,
but it kind of is true.
What happens if the West does decline precipitously?
And we do have the Chinese century
as international relations scholars have been promising
(17:09):
for a very long time.
What will that look like?
So yeah, in answer to your question,
because my expertise before I became a writer
was in international relations in part,
it really mattered to me
that that world building was rigorous.
Yeah, and it wasn't a minor detail
because it also changed who provides
the technology in the world as well.
(17:29):
There are a few conversations among the characters
about the fear that China is providing everything,
and from there, something that I actually loved,
it's a detail here, but they mentioned the gatekeeper,
a non-sentient AI that is specifically designed
to hunt down rival AIs
and inhibit development in other nations.
(17:50):
And considering this cyberpunk setup
and the idea that we are moving towards informational wars,
I thought that was genius.
It's completely plausible and likely has been done already
that countries will start or try to stall the development
in other nations to continue that century
(18:11):
of a particular country.
Yeah, the gatekeeper, that wasn't my idea.
I got the gatekeeper AI from my research.
I try to do pretty rigorous research
for everything I write.
I'm a bit obsessive, and I was researching
the seminal works in artificial intelligence.
And one of the main thinkers was talking
(18:33):
about different forms of artificial intelligence
and kind of the arms race around AI.
One of the ideas, I don't know if he called it a gatekeeper,
but it was something very similar,
was an AI that would just destroy other AIs.
And I thought, oh, that's a good idea.
And so that became the suppression AI.
So in the book, it's in the satellites,
and if AIs are trying to get out, they will be shredded.
(18:55):
Science has so many cool ideas in it.
But one of the things about these books,
which might be on memory, or climate change
is a good example, but in this case,
artificial intelligence, is that they're not accessible.
Even the most accessible of these books
are not going to be read by many people.
And one of the cool things about science fiction
is you get to take these ideas that
(19:16):
are sort of mind-blowing and mind-bending, but plausible,
and then put them into a story.
William Gibson talked about the poetics of technology.
He said he wasn't an expert, but he was interested
in the poetics of technology.
And one of the things I think he was saying
in that was what writers can do, what authors can do,
is imagine that technology in the world
(19:39):
and what it would look like.
Strangely enough, not many tech bros can.
It's really interesting,
even just in terms of a study of the human mind;
I find it interesting that the self-evident
negative impacts of some tech
are just so glaringly obvious,
and yet they're blind to them.
And then it gets introduced into the world
(19:59):
and exactly all the things that we thought
were gonna happen, did happen.
The poetics of technology is about taking
hard science fictional ideas,
or at the very least plausible ones,
and putting them in the real world with real people
and in a real society, and how that would change a society
or the way people interact and so forth.
That's one of the important things I think authors can do.
(20:21):
Following up on what you said about the tech bros,
I think that they are too focused
on only one side of the equation, which is the technology.
They forget the users and the users generally
do whatever they want with the technology.
I think that's also the key point of cyberpunk,
of any science fiction.
It's never only about the technology.
(20:42):
It's always the interaction between the two,
people and technology and how it is used or misused.
Yeah, the other thing is that William Gibson said,
if I can quote him twice,
he says the street finds its own uses for things.
So the other thing, the something that's punk,
if we wanna have punk in our cyberpunk,
is how we can flip that as well.
In the book, in Ghost of the Neon God,
(21:04):
they just disconnected.
They're just not plugged into that system.
That's the way they retain a certain amount of freedom.
In the book, the two characters are homeless,
which is a dire state.
There's nothing romantic about it.
It's a miserable state.
However, they're free in the way that so many other people
are not because they're not knowable
and they can't be tracked, they can't be manipulated.
But there's also using technology, jujitsuing with it
(21:27):
and turning it back against the owners.
Often that can be satire, for example.
And the hilarious ways people sometimes use tech
and programs that are out there.
And the hackers do that as well, obviously.
The most obvious cyberpunk thing is the hacker.
Yeah, but showing that on the one hand,
there will be an economic incentive and a power incentive,
and maybe even sometimes good intentions
(21:49):
behind the development of some technology.
Then there's the way it's actually implemented,
which will usually reflect the socioeconomics of a society.
And then there is also the ways we can use that same tech
to rebel.
And all those three parts
come out in Ghost of the Neon God as well.
Because there is an AI in that.
Who gets to own it is the equivalent of who gets
to own the first atom bomb.
(22:10):
They have that sort of power.
Unfortunately, we're in a situation
where the AI doesn't necessarily want to be owned,
which is good and bad, actually.
It could be catastrophic, but the AI
doesn't want to be anyone's weapon.
So that's another part of the equation.
Moving in that direction, something
that I really liked is how the cherry
on top of the cake of the book is a discussion on memory
(22:31):
and how memory makes us human.
Sally and Andiri discuss that we are actually
more than our memories as individuals.
We are also on constant evolution
caused by the choices that we make.
Sally and Andiri are constantly making choices.
Basically, it made me think at that point in the book,
(22:53):
they are both alive.
They are both more than what they are because,
by their own definition, they are also both making choices
and deciding things as they go.
Yeah.
Memory is tough, well, not tough.
It's fascinating, because I go back to memory time and again
in my short stories.
And in my latest book, The Escher Man,
(23:14):
is almost entirely about memory and memory
wipes and memory manipulation.
So much so I actually had forgotten about that part
in Ghost of the Neon God, because in a way,
thematically I was looking more at AI
and AI rights compared to human rights and social class
and so forth.
The thing about memory that I come back to time and again
(23:36):
is what are we without it?
If you meet, say,
someone who has memory loss, or say they have Alzheimer's.
That's a tragedy.
It's one of the worst things we can imagine,
because our memories contain everything.
They contain all our loves and all our fears and our hates
and all the most important parts of our lives.
And our family and everything are stored in memory.
(23:56):
So when we lose memory, what do we lose?
Well, we kind of lose everything, don't we?
Feels like it.
So in a way, the memory is a metaphor for the soul.
But it's also so fragile.
Our memories, our human memories are really fallible.
We forget all the time and we misremember
and we mush memories together.
And it's a very human thing as well
to have these fragile memories.
(24:18):
But they're also very rich.
That's another paradox.
They're so rich because our memories
are blended with all our other memories in a way
and all the experiences we've had since.
So we can have a memory that was sad,
then at some point in our life it
changes to a happy one or a nostalgic one or a warm one
because of all the things that we've experienced since.
(24:41):
Our understanding of ourselves changes compared to when we're, say, 12,
to when we're 30, to when we're 40.
So memory is something that then I in turn
think about when it comes to artificial intelligence.
Because how much of our humanity is bound up in memory?
But how can you ever replicate memory in an artificial being?
Because its memory will be perfect and linear.
It can't ever have a human memory system.
(25:03):
How does that change them?
But memory is also connected to our bodies.
That's the other thing.
And this is something I talk about
in Ghost of the Neon God.
Our bodies remember things.
They remember a smell, for example.
We might remember the touch of something
and then it brings back memories to us.
Whether it's the touch of a loved one or hugging a child
or the touch of a physical object.
So our bodies store memory as well.
(25:25):
And our humanity is stored in our bodies.
The foods we eat and the things we've experienced
with our bodies.
And this kind of seems self-evident in a way.
But our physicality is such a vital thing in our humanness.
But then we transport that to artificial intelligence,
which doesn't have a body and doesn't have our memories.
(25:48):
There are a whole bunch of other questions
that emerge from that.
Like if you have no understanding or respect
for the human body,
well, what would you be willing to do to it?
Because you have no empathy there.
And the other question of course is,
well, this is getting very deep.
But then the other question is consciousness
and being self-aware.
And the idea that will an AI ever actually have that?
(26:09):
Will it ever be aware of the self?
Because there's no evolutionary reason
that we have to be self-aware and have consciousness.
These are like the deepest philosophical questions.
It's like the soul, what it means to be human,
our essence, and memory is one of the ways
we can ask at the very least
some of the very deepest questions.
And fiction is one of the avenues
(26:30):
by which we can imagine it most readily.
So we can conceive and think about these deeper questions
in our lives.
I particularly liked the ending when Jackson gets copied
into that android body and he wakes up
and says something along the lines of,
it is himself, but it's not at the same time.
(26:51):
He doesn't feel like himself.
It's part of what you say,
how important our body is to our identity
and our relationship to our body,
the memory that is there.
That poses the question of whether the being that wakes up
at the end is Jackson or not.
Yeah, and that's why I leave it to you.
That's a question for the reader.
And because it's maybe an unanswerable one,
(27:14):
he gets uploaded into an android.
He has his memories, but it's not gonna function
like memory anymore and he's not gonna have
that relationship to his body anymore.
Who is he?
Because his brain is different.
He feels that immediately when he wakes up.
He has a faster brain, and not slightly faster,
a phenomenally faster and different brain.
But he still wants to see Sally.
(27:34):
He still maybe wants to live the life
he never could have lived because of his poverty
and because of his homelessness.
There's some ambiguity there.
And I think it's probably better for authors
not to answer those questions.
It's very open, but I think, also, at that point
Jack thinks memories and experience become so intertwined
it is hard to remember where we end and they begin.
(27:58):
I love that part, honestly, that we are our memories
and we lose ourselves when we lose them.
Yeah, and our memories, one of the fascinating things
is sometimes our memories will be things our parents told us.
And then we have a vivid memory of something,
but actually we don't remember.
We just remember our parents telling us over and over again.
(28:18):
Or we remember an experience.
But sometimes it didn't happen to us.
That might have happened to a very close friend of ours.
Or you can have a thing
called dyadic memory, where you and, say, your partner,
the memories you have together when you're sitting together
at a table and you're talking are greater than the sum
of the parts in a way, because you both remember
(28:39):
each other's lives and your mutual experiences
and you actually have more in that memory.
This is also a question of being.
Where is our being, and where does our consciousness
reside, for example? Is it just in our brain
or is it in our body?
One of the interesting things about the body, or let's say the mind,
is that our minds kind of go beyond ourselves in a way,
(29:00):
because they make all these connections
to other people in the world.
And the memory examples I just gave
show, in a way, that our being
does extend to others in the world.
It's not separable, or certainly not very easily separated.
And again, this is a philosophical question;
these are the great imponderables
of being and consciousness.
(29:21):
Yeah, what it means to be human.
And I think partly what it means to be human
is something that extends beyond ourselves.
What it means to be human is something that exists
beyond us, because we extend into our family
and we extend into our community
and we exist in that context as well.
So these are tough questions.
I don't know why I keep doing it to myself.
Because that's the point of writing
(29:41):
speculative fiction, right?
To speculate about these things.
But yeah, I have two follow-up questions on that.
To finalize with that scene at the end,
something that comes in that part is that Jackson remembers
that Sally had mentioned about this new legislation
stating the right that AI had.
And he actually uses that to, quote unquote,
(30:03):
trick the professor at the University of Western Australia
into not owning him and letting him go.
It's very interesting how much we are always catching up with technology. On the one hand, we let it go so far, and then we start thinking of the consequences as a society and putting up legislation. But in there, it's actually almost the reverse,
(30:25):
because I understood that they hadn't developed sentient AIs until Ondiri appeared. But they already had all these classes and philosophical questions about the rights of AI and how they may differ from the rights a person has.
This is our backwards thinking. It really bothers me that today we talk about AI rights.
(30:46):
And who talks about AI rights?
Well, people who own it,
the kings of industry of Silicon Valley talk about AI rights.
Why do they talk about AI rights?
Well, being cynical,
why would the guy that owns the toaster factory
want to have a suite of rights for toasters?
Well, it's in their interest, isn't it?
We talk about AI rights.
This is a discussion that comes up.
There's already courses now which discuss this at university.
(31:08):
So firstly, in 75 years, it's gonna be way more developed.
Secondly, we don't even have bloody human rights.
There was an example of this in Saudi Arabia. It was not an AI, it was just a bot, a female android, and they were talking about making it a citizen and giving it certain rights. And someone pointed out that women there don't have the same rights as that android.
So this is one of the things about Ghost of the Neon God:
(31:31):
Jack is in a situation where he had more fully formed rights when he got uploaded into an android at the very end. Those were rights he could actually claim. As a citizen he has nominal rights, but he can't really claim them. And we can see that in the society we live in now, and it's gonna be the same in the future.
What do they matter if we're homeless?
What do those rights matter
when we live in crushing poverty?
(31:51):
These are just words on a page.
So one of the things that gets to me sometimes is that we have these earnest, involved, resource-intensive discussions about the rights of an AI, while fundamental rights of human beings do not exist in large parts of the world, and they don't even fully exist here in Australia in some ways.
And again, this is not to say that
(32:12):
if we develop an AI with consciousness it shouldn't have rights. It probably should.
I don't know if we ever will.
That's just as a scientific question,
I'm not sure that there'll ever be a self-aware AI.
I think we'll have very powerful AIs
that have no self-awareness or consciousness.
It's just that, why are we having these discussions when we can't even secure those rights for ourselves?
This is the situation that Jack was in in the book.
(32:32):
Sally's telling him about the rights of the artificial being.
He's like, geez, that'd be nice, you know?
Yeah, it's like humanity always has the power to do right by its fellow humans, but we always put our interest into technology, into tools that we can leverage to keep killing each other.
So the other question that I have, related to that, is also part of the discussion between Sally and Ondiri.
(32:54):
And they have this chat about
how there are two characteristics of a human being,
one being the need for self-preservation
and the other, the evolutionary need for community
and how they relate to each other.
And I really love that discussion
because it implies that society and human connections
are fundamental to the evolution of humankind
(33:16):
and to who we are now as a global society.
It also implies that we, as individuals, need the community to support us and to grow.
And we need to give back to that community.
And there is a discussion in there that Ondiri, as an AI, grew alone and didn't need that support from a community.
(33:38):
So does that make it human or not?
Is it a characteristic that is so fundamental to us
that we cannot consider that AI human
because it lacks human connections?
So is Ondiri less human because he doesn't need community?
I think the question, in the book, as you say, is that Sally's view is that we are a tribe
(33:59):
and a tribe is fundamental to being human. From the very beginning, this is in our DNA. We needed a tribe. We need to be part of one, and we need to give back, to feel that we're making a contribution to our tribe.
This is in our nature.
And societies where, well, being tribal
might have negative connotations,
but in the positive interpretation of that word,
(34:20):
societies where you feel you belong
and you feel you are needed
and that you have something valuable to give back
to that tribe are the happiest and the most stable
because this is something fundamental to us.
We don't want to feel superfluous.
If I can use an extreme example, people who feel that they have nothing, how can I put this, who are made redundant by society
(34:43):
and not needed anymore: these are communities where there are high rates of suicide and violence, for example. Being made redundant, and not just from your job, although that's one example, but feeling you're not part of things anymore, that you have nothing to contribute. That's devastating.
So we're a communal people and we're a tribal people
and there's positive forms of tribalism
and negative forms of tribalism.
(35:03):
But in the book, I'm trying to talk about
the positive forms of tribalism.
And AI doesn't have that.
And in terms of the book, this artificial intelligence, as I think Sally says, grew up in the darkness alone. Well, what does that mean? Because for me, as a tribal person, the dread of something bad happening to my tribe, or even just the desire to see them grow and flourish, is inherent in me, right?
(35:26):
An AI doesn't have any of that. It doesn't have any tribal instinct.
And this goes to a question of almost evolutionary purpose.
What is the purpose of an AI and how do you give it that?
Well, there's all the different sorts of discussions
about goal alignment, they call it.
The goal alignment of an AI and human society,
because it doesn't think like we do.
(35:46):
And it's getting to the point even now
where there are neural nets,
where there is what's called a black box,
in so far as the people who built the neural nets
don't actually know how it's getting the answers, right?
Because it's not thinking in any type of human way.
This is one of the apocalyptic scenarios of AI: something that is an alien,
(36:06):
unfathomable intelligence with great power.
That's dangerous.
So this is Sally's argument: this is an alien, unfathomable, hyper-powerful intelligence.
What on earth could it unleash if it wanted to?
Well, it could unleash Armageddon,
because it doesn't have that goal alignment,
because it doesn't have the things
that we understand as fundamental.
(36:28):
One of those things is our tribalism,
our need to belong and our need to contribute
and the need to see our community flourish.
Simply doesn't have it.
So yeah, this is again the what-if of speculative fiction. Well, what if an AI came along and it didn't have these same goals?
But again, this is not a stretch.
This is where we are now,
(36:48):
because right now they don't understand
how neural networks really work.
And that's scary.
It definitely is.
And it leads to the whole problem of explainability: we cannot even understand whether they are working as we designed them.
But something else that Ondiri mentions in that conversation is that human cooperation
(37:10):
is fundamental for human evolution. But at the same time, Ondiri says that it's always the best humans that end up dying when dealing with a life-threatening issue, in order to allow the society to survive. Ondiri says this is the flaw and the genius of your tribe, that we self-sacrifice,
(37:31):
but it's always the best of us that end up sacrificing, leaving humanity with less. It's a very interesting concept.
Yeah, it's kind of a sad scene.
And there is something in it, I think.
The best people don't go to Silicon Valley.
I'm sorry, they don't.
In terms of some intrinsic goodness,
the best people work as nurses or firefighters.
And when we went through COVID,
(37:54):
we weren't all desperately missing our hedge fund managers. We were desperately missing our primary school teachers, because we need them to function. We don't need a hedge fund manager.
And so this is one of the things Ondiri was pointing out
was this capacity for self-sacrifice,
because in that scene, she says, well, you wouldn't do it. And he says, well, you probably wouldn't either. Does that make me less human?
(38:14):
Jack would, though. Being selfish is, of course, being human. And depending on others to make sacrifices is human too. Like I say in the book, and I think it's probably true, some of the best of us make those sacrifices where others wouldn't.
It's a very interesting idea.
It makes me think of how many people were left behind, just like Jackson and Cole, because they weren't
(38:35):
part of the system, because they weren't chipped or implanted in that case.
And yeah, they are still sacrificing themselves, perhaps,
for others like them, but they are completely invisible
to the system.
I actually had a quote at the very start of the book that I wasn't allowed to use. I was really pissed off. It was a quote from Mad Max, from The Road Warrior, where Max says,
(38:58):
I'm just here for the gasoline.
That was the whole quote.
I'm just here for the gasoline.
So Jack was like Mad Max at the start.
He's not there to save the world.
He's there just to live and exist.
And he had every right to, given everything he's experienced.
He just wanted to endure.
But like Max, Max ultimately isn't there
just for the gasoline.
(39:19):
Ultimately, he does the right thing.
He tries to save his community at the end.
And that's Jackson.
So he has that capacity within him.
But his capacity for goodness and the contribution he could have made to society were always repressed; he was never allowed to flourish because of his socioeconomic condition.
And at the end, Jackson's goodness
(39:39):
was abused by Ondiri, only because the AI knew he was capable of sacrificing himself.
And that's the saddest part of all.
Yeah.
I'm a little sad now just thinking
about how that ended for Jackson, because I
love all my characters.
And I hate it when things inevitably happen to them.
(40:02):
That said, thank you so much for that terrific discussion
and for coming to my podcast.
Well, thanks for asking me.
It was good to meet you in person at Comic-Con.
Yeah, that was amazing.
Comic-Con was such a great time.
And I'm very, very happy that I asked
you to come to my podcast.
However, can you tell the listeners where to find you
(40:26):
and if you have any upcoming releases?
Well, where to find me?
I have a website, Napper Time, one word: NapperTime. And on Twitter, I'm The Escher Man, one word: TheEscherMan. But everywhere else, I'm TR Napper. So Instagram, Facebook, Bluesky, I think that's all I'm on. Those are my handles.
(40:48):
So you're talking about Ghost, which came out in June.
But The Escher Man came out September 17.
It's a brand new novel.
It's technically not out in Australia until October 1.
But this broadcast will be out by October 1.
Yeah, yeah, yeah.
So it'll be available everywhere.
Although I saw it in the bookstore the other day.
Some bookstores have it early.
And The Escher Man very much deals
with this question of memory.
(41:09):
So as I said at the start, The Escher Man, Ghost of the Neon
God, and 36 Streets all take place in the same world.
They're standalone novels, but they're part of a shared universe.
So yeah, that's what's out now.
And I've been on the road, Livia, going to every bloody con to talk about the books.
But The Escher Man took me 10 years to write,
and Ghost took five years.
(41:30):
So it's just a coincidence they happen
to be out in the same year.
I'm not that prolific at all.
So if you liked this episode, please like and subscribe. And if you are keen on getting bite-sized deep dives and prose analysis delivered straight to your email, then sign up for my newsletter at liviajelio.com.
(41:52):
You will get the e-book for my novella,
The Genesis of Change, for free.
It is not cyberpunk, but it is quite grim and philosophical.
You might like it.
Thanks for listening, and happy reading.
I'll see you next time.