August 26, 2016 47 mins

If you could make the world a better place by removing the ability for people to make selfish choices, would you do it? We continue the conversation.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Brought to you by Toyota. Let's go places. Welcome to
Forward Thinking. Hey there, and welcome to Forward Thinking, the
podcast that looks at the future and says, am I
right or am I wrong? I'm Jonathan Strickland and I'm

(00:21):
Joe McCormick, and today we're going to be continuing our
conversation on moral bio enhancements. This is going to be
part two of a two part series. So if you
haven't heard the first episode, you should go back listen
to that one first, so you know what we're talking
about in this one. And without further ado, here begins
episode the second. So the whole concept of moral bio

(00:43):
enhancement has lots and lots and lots of problems, some
of which we kind of touched on either through our
tone or content so far. Yeah. Well, and I
want to give a shout-out real quick to a
really great review of the literature on moral bio enhancements
that I was reading, by Jona Specker and colleagues, called

(01:04):
The Ethical Desirability of Moral Bioenhancement: A Review of
Reasons. And a lot of the
things that I'm saying are things that
Specker and colleagues, um, pulled together from
just tons and tons of amazing thinkers
on the subject. So I just wanted to put that out there.
Go read that paper if you'd like a very, very

(01:26):
thorough breakdown. But, big problems. Yeah, who decides what's moral?
That's a big one. Yeah, who's the moral authority?
I don't feel like I'm qualified. Well, I mean, I'll
step up. So I understand that as a concern. But
I also think that this is the same problem we
already face in our moral decisions today. So you've got

(01:51):
to make moral decisions in your life, and you're either
trying to think through ethical reasoning yourself and consider consequences
and trying to make moral decisions on your own, or
maybe in a lot of cases, you're sort of offloading
some of that thinking to people you would consider a
moral authority, or, like, a religion that
you ascribe to. Is ascribe a word? Let's say it

(02:13):
is. Um, or, like,
the theory of objectivism, some kind of theory
like that, sure. Or even just to a person. I mean,
there are people who live among us that we often
think of as kind of a moral genius. Mr. Rogers would
have been a moral authority. Yeah. You look at somebody
and say, I think that person knows what's up when
when it comes to how we should act, and I'll

(02:36):
follow their lead, because it sounds like they
know what they're talking about. So we already do this.
We either make moral decisions on our own or we
defer to a moral authority. So we're already faced
with the problem of who decides what is and isn't moral.
This would just be adding on another step to that.
Do you also have a device or therapy further guiding

(02:58):
you toward that conclusion? Yeah. And right now, even
if you have an authority who is proclaiming what is
and isn't moral, every individual has the freedom to agree
or disagree with that and to act upon that in
whatever way he or she sees fit. So in some
cases there might be like consequences. Absolutely, there could be.

(03:21):
There could be very severe consequences, and that would probably
prevent the vast majority of people from acting out on
those thoughts, even if they held them right. Uh, some
people might still act out on them and then suffer
those consequences. And history is filled with those stories. Every
story that involves the word martyr probably has some element

(03:42):
of that in it, right. Uh. In some cases you
might agree with the person who is labeled as a
martyr. In some cases you might not. But in either case,
it's someone whose stance is very different from
that of the authority figure. The case with moral bio
enhancements is you would have no choice but to agree
with whatever the authority figure had determined as moral, because

(04:05):
the bio enhancement is mandating it, making that decision for you. Right. Well,
so that's a different question entirely, I would say,
because there we're talking about whether we could or should
force other people to undergo moral enhancement. Is that not
inherently immoral? I don't know that's a good question. I mean,

(04:26):
so I was considering people who would willingly choose to
undergo a moral enhancement. So imagine you could
elect for a free surgery that would make you a
better person. Would you do it? I think the problem
is that the people who would elect for that are
the ones who least need to undergo the procedure. Well,
we could all be better. I mean, I think
a lot of us, we're familiar with this: we wish

(04:48):
we were better people in a way. I know I
wish I was a better person. There are times when I
think about how I could be a better guy than
I am, and I'm like, oh, man, like I let
my friends down, and I wish that I hadn't done that.
And I agree with that. And I think the people
that when when you start talking about moral bio enhancements
and you start envisioning the sort of problems that it's
meant to correct, the people who are perhaps the most

(05:12):
accountable for said problems seem to be the least likely to
elect to undergo a procedure. They might say, I don't
have a problem, I'm fine how I am. Exactly: you're...
everyone else has the problem, I'm sitting pretty. And, like, well,
the reason you're sitting pretty is because of the oppression
you're dealing out to everyone else. Well, I mean, see,

(05:33):
that's the thing, I would argue. First of all,
I don't think moral bio enhancements being mandated, being a
compelled thing everyone has to have... I don't
think that's a great idea. However, I think that's the
only way it would work. Like, I don't
think moral bio enhancements are a good idea, generally speaking,

(05:54):
in that I think there are too many problems that
outweigh the benefits. Uh, and the
benefits only exist if it works the way we want
it to work, as opposed to some perversion of that vision. Well,
I feel like we're getting there a little ahead of
ourselves here then, because we should talk more about
what these problems are. Sure. Uh, well, I mean, one,

(06:17):
we've kind of identified the idea that you have
to follow a moral authority's vision of what is
and isn't moral, and it may be that your own
view of morality doesn't match up to that person's morality.
It may be that our sense of morality changes over time.
Uh, yeah. Well, theories like relativism say that

(06:40):
morality is an inherently personal thing, that, you know,
my morality cannot be your morality, Joe,
and that neither of our moralities is going to
be Jonathan's. Um, I mean, and on
a society-wide level, that's definitely a question. You've
also got, uh, moral pluralism,

(07:02):
which says that some aspects of morality are
going to counter each other. That if you're acting completely
morally in one way, it's a trolley problem, essentially. Um,
that, because of the way that
the world works, you're never going to make a
perfect decision. Right. There's not a black and

(07:22):
white binary world out there where everything is either moral
or not moral. There are issues where you
might be faced with a complicated problem that has no
good solution, but you still have to make a decision.
And that becomes problematic in a world where you say, well,
we've got a procedure that's going to force people to

(07:45):
act in a quote-unquote moral way, because that means
someone has to make that, uh, decision
ahead of time about what the
moral outcome of those decisions should be, or whatever guiding factors push
you to one choice over another. And then, like you said,
you've got the problem of what if our

(08:05):
ideas about morality change over time, because they do, continually. Uh,
I'm sure, I'm positive, that what was considered
good and moral six years ago is not the exact
same thing that we consider good today. And that's
a long-term kind of concept: like, if
these treatments are irreversible, then

(08:29):
who makes the decision to start changing them
down the line as needed? Well, I mean, yes,
I think that's an interesting thing to consider, But I
also think that sort of falls into the same thing
I was talking about earlier, where this is already a
problem we're faced with just having moral brains. So we
have moral faculties that are informed by our sort of

(08:51):
natural predispositions with whatever genetic element there is, and then
also by our education and socialization which happened at certain
periods in time, and we get sort of moral rules
implanted in us. You can see this in changes between
generations where the older generation has been taught a certain
thing about what's moral, and then you know their kids

(09:11):
might not agree with them about that. Uh So, in
a way, I'd say this is also already a thing
that we face. We're talking again just about adding coercion. Well,
except that I would argue a person can also come
to a new conclusion, like, they can change their... right, they
can change their view. And so could a bio enhancement.

(09:34):
I mean, depending upon the implementation. Yeah, and
depending upon the desires of whatever entity is
actually administering them. Right. So it gets more complicated. It's
not like it's something that would be decided on an
individual basis, or at least not the individual who's actually
having the experience. It would be decided upon from an

(09:56):
authoritative perspective. Right. So could you have the country
vote on what everybody's brain should be forced to think
before the bio enhancements are in? Because once the bio enhancements are
in, then you have the question... it's almost like our
discussion on e-voting. How do you know that
the true desires of the person are being reflected

(10:19):
in the outcome? Yeah. Well, here's another problem. This
is hypothetical, but would it actually be possible? We sort
of talked about this earlier. But I mean, is it
possible from a technological point of view to have something
like this? And I think the answer could be both
yes and no. As in, we know there's a brain
basis for morality, and that you can tamper with it with

(10:41):
electrodes or drugs, things like that. But morality also appears
to be this complex cross brain region phenomenon, meaning we
can't yet foresee a way to control outcomes with precision,
and the question is could we ever do that? Uh?
For example, with serotonin. Serotonin is the 'let's all
chill out' brain chemical. But it also has a hand

(11:04):
in how we sleep, our memory encoding and recall,
our sexual behavior and performance, how we process pain, our appetite,
how we process visual information. It has a hand in
all of these. And so, uh, just tossing SSRIs
at the entire population wouldn't really be moral. It's
a nuke-it-from-orbit kind of option. Right. Yeah, and it's really too bad that

(11:26):
we aren't like the Krusty doll in that Treehouse of Horror.
Ah, here's your problem. Yeah. Um, but see, that
brings up another thing that we'll talk about in a minute.
What is the Krusty doll's life like after he gets
switched to nice? It's not good, is it? No. He lives
a life of humble subservience. But anyway, so we'll get

(11:50):
to that in a minute. But another practical problem I
want to point out is... okay, so we've got this
problem with precision in the brain. We don't know exactly
where to put the micro electrodes to make you stop
kicking small children. Um. Man, I hope they never do.
So we could study this to try to figure it out.
But scientific ethics make it difficult to conduct experiments like

(12:13):
this because, okay, so imagine you're trying to get institutional
approval to perform brain surgery or introduce psychoactive drug regimens
on people in order to see what makes them spend
less time leaving jerky YouTube comments. That just seems like
you're going to run into experimental ethics problems. That's what

(12:33):
college students are for. They just elect to go and be
a subject in one of those testing procedures, and they
get like twenty bucks at the end of it and
everything's fine. Or even if you took a population
of criminals, of people who had murdered people,
you would still have a really hard time,

(12:56):
rightfully so, getting permission from any kind of,
uh, good board of humans, to carry out these
kinds of experiments. Because any time that you
do something to someone against their will, when they
technically do not have a disease... like, a
lack of moral virtue is not a disease. Yeah,
this is another problem because then you're going to have

(13:17):
a conflict between the ethics of the experiment and scientific rigor.
Because ideally what you'd want is a randomized sample to
do your experiments on. You'd have a big problem if, okay,
you say: in order to do this most ethically,
we'd have to have people who volunteered, who want to
be a part of this experiment. But that would introduce

(13:38):
a self selection bias into the sample of people you're
performing it on, which is going to change the outcome.
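This self-selection problem can be sketched in a few lines of Python. This isn't from the episode; the population size, the "niceness" scores, and the volunteering rule are all invented purely to illustrate the bias being described: if nicer people volunteer more often, the volunteer sample no longer represents the population.

```python
import random

random.seed(42)

# A population of 10,000 people, each with a made-up
# "niceness" score between 0 and 1.
population = [random.random() for _ in range(10_000)]

# Self-selection: the nicer you already are, the more likely
# you are to volunteer for the moral-enhancement trial.
volunteers = [p for p in population if random.random() < p]

pop_mean = sum(population) / len(population)
vol_mean = sum(volunteers) / len(volunteers)

print(f"population mean niceness: {pop_mean:.2f}")
print(f"volunteer mean niceness:  {vol_mean:.2f}")
# The volunteer sample skews nicer than the population it came
# from, so results measured on it won't generalize.
```

Any effect you then measure on the volunteers is confounded with the fact that they were already a nicer-than-average group before treatment.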
Uh, so, yeah, there's all kinds of trouble in trying
to do experiments on this. And, bringing up the
issue of, you know, who do you
perform these experiments on? It's not just an academic question.
I mean history is filled with examples of some very

(14:02):
ethically questionable, if not downright unethical, experimental projects that subjected
people, without their knowledge, to pretty intense and extreme experiments,
you know, in the name of science, and justified in
some way or another at the time. But from today's perspective,

(14:23):
from our moral perspective today, we would say, yeah, that
is all kinds of wrong. So it is a very
tricky subject. Another complication I want to introduce:
what if moral cognition isn't as localized as it once looked,
or what if there is no such thing as moral cognition?

(14:45):
I'm not going to go into the whole argument, but
there's another paper in Social Neuroscience called "Where in
the Brain Is Morality? Everywhere and Maybe Nowhere," by Liane
Young and James Dungan. They answer the question in the title.
You don't even have to read the rest of the paper.
Uh, no. So, essentially, they ask the question:
is there really a uniquely moral part of the

(15:08):
brain or is this just a label we're applying to
aspects of the emotional brain and the social brain? Uh,
and so they look for it and they say, yeah,
there are some regions that have been implicated, like some
of the stuff we talked about earlier, the ventromedial
prefrontal cortex and stuff like that. But it's also just
a very complicated picture, and nobody's identified this moral brain

(15:32):
substrate. There's nothing there. Um, so there
may be some folly in our approach here. If we're
looking for where is the moral brain? There might not
be a moral brain. Morality might be more like an
emergent behavioral phenomenon that we're describing that comes from some

(15:53):
emotions and some social tendencies. And we don't know. If,
in fact, that turns out to be the case, that
makes it even more complicated to come up with a
moral bio enhancement that would actually be effective, right, because
you can't pinpoint morality in the brain, so what you'd
actually have to be

(16:15):
modifying is emotions. And now you're really, like, if you
weren't already getting people a little squeamish about the idea
of tweaking morality, tweaking emotions, then you're thinking, wow,
it sounds like, uh, you know, you're creating just
a little dial on me that has a very narrow
spectrum of the human experience, and everything else is

(16:37):
off limits. Yeah. And this, actually, I would say
that this conundrum is something we might expect from what
we've already seen with the overlap between moral behavior and,
like, SSRIs, which do mess with emotions. Uh. So
here's one more complication I want to introduce, technologically: the
plastic and adaptive adult brain. This is the thing we've

(16:58):
discovered, that the adult brain is more adaptable
than we thought, which is wonderful. Yeah. So there
are examples of the adult brain adapting to problems. Forms
of neural injury can, for example, be offset by ad
hoc adaptations using the rest of the brain. One example
is people with memory loss coming up with cognitive strategies

(17:19):
to offset memory deficit. Yeah. It's kind of like if
you work in a really small office and everyone has
a very specific job, and someone has to call out
sick at the last minute, and then everyone else has
to figure out, how can we continue to do our
work plus carry the load of this person who is
not there, even though they specialize in something that we
do not ourselves typically handle. Right, And you might not

(17:41):
be able to exactly cover that person's duties, but you
can sort of do it. Uh. And another one would
be sense loss: people who have lost one sense, like
sight, can sometimes compensate with adapted cognition based on different senses. Uh, Daredevil. Right,
I was just going to say the same thing, but
I decided I've been too geeky for this episode, so I dialed it back.

(18:03):
But here's the question I thought about. What if we
apply this to moral bio enhancement. So you go in
and you do the equivalent, the positive equivalent of applying
a brain lesion that introduces modified sociopathy. You put in
some kind of modification that makes people very nice to
each other. Other parts of your brain are still going

(18:24):
to want to be selfish. So what if your brain
adapts to the change and finds a way to circumvent
it and revert to baseline jerkiness? Yeah, life finds
a way. And in this... yeah, jerk faces find a way. Yes,
in this case, it's not even that the person
in question is making a conscious effort to be selfish.

(18:45):
It's the brain itself is adapting to these changes and
saying this is not the way things are supposed to work.
So other parts of the brain start behaving in a
slightly different way in order to get us as
close to that previous condition, before moral bio enhancement, as
you possibly can get. One of the other arguments that

(19:08):
I saw was, um, should we really be
concentrating on treating actual mental disorders before we go,
like, whole hog on something that isn't a disorder,
like moral shortcomings? Right. And then my response to this
is that the problem with that line of questioning is
it starts to argue for a zero-sum kind of

(19:30):
perspective on the subject, saying that if you focus on one,
you cannot by necessity focus on another. And my argument
would be that you could certainly have these areas of
research all working, perhaps even in parallel, cooperatively with one another.
But it is a good question, if

(19:51):
you could say, like, well, we have some very real
problems that we do not understand how to tackle in
a way that is beyond just treating some symptoms. Why
are we worried about something for people who have quote
unquote healthy brains. Uh? Shouldn't we focus on the people
who are really struggling with these diseases, injuries, disorders, and

(20:14):
worry about that first? Jonathan, I think I agree with
your first criticism. I mean, I am wary of this
type of question in general. That's just, like, well,
what about... you know, whataboutism. Like,
there's another problem, though! Uh, so, to address the first problem:

(20:35):
problems can exist at the same time. Yeah, yeah, yeah.
So I think that this is a good question if
you can point to factors that make it clear that
we really are going to need to choose between one
or the other. But if you can't point to factors
that make it clear that it's one or the other,
then I don't think this question, uh, necessarily matters. Well,

(20:59):
how about another one, then? Sure. Okay, in a world
where some people are artificially moral, would the regular
people take advantage of them? Yes, I think so, absolutely.
If you know that every time you come by your
neighbor's house and say, hey, can I borrow twenty bucks,
they're going to say yes. There are gonna be a

(21:22):
lot of people banging on that neighbor's door until that
neighbor doesn't have any twenty-dollar bills left. Which
brings us back to, uh, something that we
covered earlier in the show: could regular people
be allowed to exist in a world where everyone has
this altruistic treatment? So we wouldn't just be getting
rid of murderers, we'd be getting rid of people who

(21:42):
just, you know, want to have basic, uh, non-harmful,
pleasure-driven lives. Right. Yeah, let's say we've got
some middle-class hedonism. Yeah, hedonism.
We don't even need the hedonism part. But sure, let's
just say... let's take it. Uh, let's make
a hypothetical person. And this person leads, uh, you know,

(22:07):
typical life. They've got work, they've got some outside interests.
More or less, they keep to themselves. They don't really
socialize with their neighbors. They don't know their neighbors' names
or anything. They're not mean to their neighbors, they aren't
thoughtless towards them, but they don't go out of their way
to know them either. Sure, but they really love, uh,

(22:28):
spawn camping in an online shooter game. Sure, it's
just how they release their tension. Right,
it's a legitimate strategy, I'll remind you. But yes,
they will take advantage of something that is not
explicitly against the rules. So therefore they're not really doing

(22:49):
anything wrong, although their pleasure is coming at the expense
of other people's enjoyment. Or, even if you want
to take that away, let's just say that they happen
to notice that a neighbor is having a hard
time for whatever reason, but they don't feel any need
to help. They're not making it worse, they're not judging

(23:09):
the neighbor. They're just not helping. You could argue, well,
this approach would mean that people would feel better about
going out and helping others. It's not even that they
are taking advantage of other people, or that they are
ignoring them. They're not discounting someone else's
misfortune, or doing something, like in this spawn

(23:31):
camping example, that's honestly harmless in the grand
scheme of things. Right, it is irritating as all get out.
When I say it's a legitimate strategy, that's merely a quote
from a video series. I don't actually spawn
camp. Well, so I want to introduce a modification on this.
I think a lot of actually very

(23:52):
destructive behavior in the world, and I'm sure y'all would
probably agree, is not even explicitly criminal behavior, not like
violent behavior. For example, there are systemically destructive phenomena in
the world in which many people participate, but nobody personally

(24:14):
breaks a law, or does anything violent to anybody else.
One example I think might be some practices on Wall
Street that you could point to. Okay. So you
have thousands of people who collectively participate in a system
that you could probably predict is going to lead to
a bubble that's gonna burst, that's going to massively devalue the national,

(24:38):
probably global, economy, lead to millions of people being out
of work, you know. So there's a system that is
pretty much guaranteed to cause misery. People are participating in
the system without breaking any laws. And so
what do you do about this? I mean, this,
in one way, is regular behavior. It's not necessarily deviant.

(25:01):
They're not you know, beating people up in the street,
not breaking any laws. What would you think about this
kind of thing if you're just part of
a system that has perverse outcomes? Well, and this goes
back to those arguments the philosophers were making at the
top of the show, right, the idea that we focus
so much on short term as opposed to long term,

(25:24):
that they would argue, using moral bio enhancements, you would
start to think more long-term, and that you
could still have that system in place. You could completely
have an investment system in place, but that the behaviors
of everyone involved would be more about trying to invest
and go for a long term return on that investment. Sure.

(25:45):
Sure, to take it back to the online
shooter example. Uh, you know, you could
potentially argue that someone who has been spawn camped
just one too many times develops some
kind of heart problem, and eventually that type of
repetitive emotional stress leads to a heart attack. Huh. That's

(26:10):
a fair... I'm not sure if fair is the right
word for it, but thank you. I've taken so many bullets
in Overwatch, I'm just ready for someone
to feel my pain. No, but that
same kind of drop-in-the-ocean
sort of behavior, that you can
make an argument for in many different types of situations,

(26:32):
I think could certainly be applied, to say that, yes,
we need a mandatory universal
moral upgrade. On the flip side of that, though, is
there any evidence to suggest that selfish behavior can sometimes
be of benefit to a species? Hey, I mean, let's

(26:53):
let's go back to the
Wall Street analogy. Now, I think it's quite easy to
see how, if you know, you look at like what
has happened in various depressions and in financial markets throughout history,
it's obvious how certain behaviors in financial markets and
on Wall Street can lead to pain and suffering

(27:15):
around the world. Sure, every single major bubble burst,
you would argue... like, you just look at the
outcome of that, right? The bubble bursts, and
you think, wow, how could we have let this happen?
But then again you'll also get plenty of defenders who say, look, yeah,
you know, sometimes things go wrong, but this is what
drives our economy. We've got investment and even maybe sort

(27:37):
of like risky investment, it generates tons of wealth, it
creates jobs. Uh. And if there is something to what
people like that are saying, and this isn't the only
case where it would apply, you could say that, Okay,
sometimes people might engage in risky, selfish behavior that could
cause harm to others, and you might look at it
as morally dubious, but it also has lots of positive

(27:59):
outcomes that we depend on. Uh, yeah. Which
brings me to, uh, another question: could we
nice our society to death? Um, and allow me
to bring in an animal tale
about this one. Okay, not a fable, no, because this
is literal, actual biology that's going on out there in

(28:22):
the world. Um, okay. So some spiders actually live in colonies.
I'm sorry to break it to you, um, but most
spiders don't, because they're cannibals. Um. But, you know,
spider cities of up to like ten thousand spiders
live together in some parts of the world, hanging out,
not eating each other, taking care of each other's eggs,
making repairs to the entire web, and sharing

(28:44):
their food. But in this specific type of spider, um,
every generation that they go through, some of the colonies
just die out, just go completely extinct. Um. And so these
researchers from the University of British Columbia looked into it recently,
and they discovered that once the colonies reach a certain size,

(29:07):
they essentially share themselves to starvation because the overall web
isn't catching enough food to support the entire colony. And
they overshare and they all die. Huh. Yeah, yeah. So
there we have an argument saying, if
everyone's on the same page, we eventually just enthusiastically collapse.

(29:29):
Yeah, we... everyone ultimately goes extinct because we weren't
cold and callous enough to let parts of us go extinct.
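The spider-colony dynamic can be put into a toy model. This sketch isn't from the episode or the UBC study; every number in it is invented for illustration. The idea is just that prey capture grows sublinearly with colony size (the web's catching surface doesn't keep up with the population), while the catch is shared equally, so per-spider food eventually drops below what each spider needs.

```python
# Toy model: a shared web catches food sublinearly in colony size,
# but the food is split evenly among all spiders.

NEED = 1.0  # invented: food units each spider needs to survive

def per_spider_food(size: int) -> float:
    # Sublinear capture: doubling the colony does not double the prey.
    total_food = 10.0 * size ** 0.7
    return total_food / size  # equal sharing

size = 100
generation = 0
# The colony keeps doubling until equal shares fall below need.
while per_spider_food(size) >= NEED and generation < 50:
    size *= 2
    generation += 1

print(f"colony collapses at size {size} after {generation} generations")
```

Under these made-up parameters the colony grows comfortably for a few generations, then crosses the threshold all at once: everyone still gets an equal share, it's just that the equal share is no longer enough for anyone.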
Well, here's another thing. Uh, think about who are some
of the greatest political leaders in history? Just think of
some in your head here. You don't have to say
who they are. I bet whoever you've got in mind

(29:52):
had to do some really immoral stuff in order to
achieve goals that ultimately we now look back on
and say, I'm glad that happened. I mean, whoever it is,
a lot of people might pick Abraham Lincoln, but
it's not like Abraham Lincoln just governed with, like, a
squeaky-clean record, you know what I mean? Or whoever you

(30:14):
want to pick, leaders tend to have to do some
crappy stuff, even the really good ones. And mostly,
I mean... yeah, even if it's not
a conscious effort to do something crappy, maybe they
have to do something crappy in order to avoid an
even more crappy consequence. Exactly. So would we be hamstringing

(30:36):
ourselves and sort of preventing greatness and preventing change if
we say, well, everybody's always got to be a super
goody two shoes all the time. What if sometimes we
need people who tread into immoral waters in order to
ultimately take us to a better place. And then there's

(30:56):
the question that's asked in A Clockwork Orange,
which is, if you remove a person's ability to make
an immoral choice, are they no longer a person? Does it
matter? Yeah. I mean, A Clockwork Orange is just kind of
tricky, right? Like, the American version

(31:16):
was published without the twenty-first chapter. So in chapters one
through twenty, you have Alex, a sociopathic character who does unspeakably
awful things. He is not a redeemable character at all
through the vast majority of the novel, which is why
I think American editors demanded the twenty-first chapter be

(31:38):
left off. Spoiler alert: he gets redeemed in the twenty-first
chapter. So for twenty chapters he's a terrible person,
even after he's undergone the Ludovico treatment, which gives
him this aversion therapy, where he feels physically ill every
time he wants to, uh, perform an act of violence.
He still wants to do the ultra-violence. He just
can't enjoy it. Yeah, he can't think about

(31:59):
it because it makes him sick. But he still has
the desire to do it. He has not changed as
a person. He only changes in that twenty-first chapter,
as a result of maturing, of growing up. So some
people have argued that Burgess's approach to morality
was a little shortsighted, like, a little too naive and optimistic,

(32:20):
saying that you'll just grow out of being a sociopath
and then you'll be fine. Um. And people
say that the way the book ends, on chapter twenty,
where he is quote-unquote cured of the Ludovico treatment,
it no longer affects him, so he can go back
to the person he was at the beginning of the novel.
That raises a very tough question for the audience.

(32:43):
Is it better to remove the ability to make these
terrible decisions and have a quote unquote peaceful society, or
is it better to allow people to retain their humanity,
with the consequence that you've got this unease and unrest
and chaos in society? And Burgess's answer eventually was,

(33:03):
if you wait around long enough, people grow up and
then they stop being total jerk faces. Well, you know,
some people do grow up. Some people grow out of
certain types of behaviors. I think just rampaging
around murdering people all the time doesn't sound like something
people usually grow out of. I'm not sure. To be fair,
most people who engage in that behavior don't necessarily get

(33:26):
a chance to continue all that long. Well, they also
don't necessarily get started at age thirteen, and most of
them, if you look at the grand
scheme of criminal psychology, tend to start later in life.
But anyway, that's fiction versus reality. But no, it is
a really interesting moral question of whether or not our

(33:49):
personal choices and morality are a part of what makes
us human, and whether we would be removing an
integral part of the human experience by implementing some
kind of treatment like this. Would we just become
some kind of fleshy robot? A robot, at least
in theory, is only able to act within
the parameters of its programming. Yeah, so would we just

(34:10):
become robots because we would have the equivalent of programming.
It would just be here's a list of things you
are not allowed to do, and not just from laws:
you physically cannot do them because of these enhancements. So
here's the thing. Maybe we have, but I don't
know if we've talked yet in this
episode about what the internal experience of this is like.
We've talked about sort of the external behavioral outcomes, but

(34:35):
what does it feel like to have one of these
modifications performed to your brain? And since it's hypothetical, it's
hard to say. Exactly. I'm just trying to imagine it. Would
it be a case where you really want to do
something immoral and you feel the urge to do it,
but something prevents you, right? Like your cybernetic

(34:57):
implant makes sure that
you can't, you know, go outside and kick
that kid that's been irritating you all day. It's not
that you don't want to do it; it's that you
physically are unable to do it. Or does it change
your fundamental desire? Right? And that seems harder to imagine.
In that case, would you still be you? And you

(35:20):
could argue that if it was something that was done
from birth, then it is you. It's the only you
that you could have been. Right, yeah,
it's you as mandated. Well, and I think that,
based on the neurobiological view of
what we currently know about how we could implement this
kind of treatment, that's more

(35:44):
likely, actually, the latter thing, where
our very personalities would be different, more than having
like a button in your brain that prevents the
kicking mechanism from traveling through your nervous system. Well, I mean,
I guess here's one thing we could compare it to.
Imagine you're somebody who takes an SSRI
for depression or something like that. How does that change

(36:07):
you? Do you, like, feel depression coming on but
then something stops it? Or is it
just something that's no longer a feature of your brain
in the same way? So, sort of like, have
you guys ever had a really bad headache and
then taken some headache medication, and you can tell there's
still a headache, but you're not feeling the pain anymore?

(36:29):
Because that happens with me and migraines, where I'm aware
that there is something in my brain that is not right,
that would be causing me agony. But
the medication I am on prevents that pain sensation. But
there's still, like, almost a presence,
almost like it's a physical thing in my head.
I have, if I may share this experience with y'all,

(36:51):
taken SSRIs for anxiety and depression, and
for me, the way that it's worked is, it's
more like, I'll get an idea,
an anxious or a depressive idea,
and without the medication, I have sometimes
had the experience of not being able to escape from
the idea, kind of having that idea repeat and

(37:13):
get worse and kind of spiral in my head. Fixating?
Yeah, exactly. But with that kind of medication,
I've had the experience of just being able to
shut it off, just going, like, this is ridiculous, dude,
stop it, and having that work, as opposed
to other times when I have that thought
and my brain is just like, nope, forget

(37:36):
about being any sort of productive member of society. You
are going to be paralyzed with self doubt and fear.
This is just what you're doing for the rest of
your day. I'm familiar with it. And so, yeah, I
mean, I don't know. It's
a very strange thought to try to imagine that
kind of thing going over into moral territory,

(37:58):
like, I really want
to cut that dude off in traffic, but I guess
I can just calm down; I guess
everything's going to be fine. It would be interesting
to live in a world where road rage is just
a term of something that used to happen, right, Like,
that's a weird thought, especially here in Atlanta. So
let me ask you guys a question. I think we
can wrap up this discussion. It's gone on pretty long,

(38:20):
and we have another section that I think is
completely superfluous, so I cut it. But I do have
a question for both of you, and I'll be
happy to answer it too. Your own personal response
to the idea of moral bio enhancement: do you think, ultimately,
it sounds like something that we should absolutely pursue, or

(38:41):
do you think the negatives outweigh any positives? What's
your personal feeling, Joe? Well, I mean, I think it's
very complicated, for all the reasons we've talked about here
and probably some other ones we haven't even thought of.
I don't necessarily think that... I think there's a
tendency that a lot of us who have experienced dystopian

(39:04):
science fiction have, to want to say anything that sounds
kind of creepy is something that is ultimately, like, no, no, no,
you know, we shouldn't even look there. And
I don't feel like that. I don't feel like this
is something we shouldn't even look into. But I'm
certainly not ready to commit to moral bio enhancement,

(39:25):
especially not compulsory moral bio enhancement. I don't know about
voluntary moral bio enhancement. I'm trying to think of
what exactly would be the problem with that, and nothing's
coming to mind. It's hard to say without the actual
ability to do it, right, because we can't
observe the results. Yeah, I think that the

(39:46):
only real risk is having those people be taken advantage of. Yeah.
I think that the compulsory version is what would
be flat-out evil, that would be
completely, so completely immoral. I'm taking a stance right there.
And I don't know, like, I think actually

(40:08):
allowing the possibility of voluntary moral bio enhancement might
verge towards evil a little bit too. I feel pretty strongly,
kind of surprisingly, like, I'm kind of
pro a lot of other voluntary brain enhancements,
but this one, I don't know. I just

(40:28):
have real squicky feelings about. Okay, well, let
me test you on that a little, if that's all right. Yeah,
totally, all right. So we have a serial killer
who has been caught by police. And this is a
guy who's murdered twenty-seven people. He tells the
police that he will not stop. If he is set free,
he will do it again. Now, this guy has

(40:49):
several options: he can go to prison for the rest
of his life, or he can elect to take some
proven bio enhancement therapy, and we'll just assume optimal conditions
here. We've actually shown that it works in some way.
There's not a danger of this somehow going wrong, right?
And so we could put him in prison for the

(41:10):
rest of his life, or we could give him this
thing and release him and he could live out his
days and never harm anyone again. Do you still think that?
Like, in a perfect world,
where the treatment was not a lobotomy and
not the kind of aggressive drug therapy

(41:31):
that has sometimes been practiced, then yes, if
the treatment truly was changing just his desire to
go kill a whole bunch of people, then
of course that would be beautiful. But that's a big if.
That's such a hypothetical thing,
and I don't personally envision us figuring out
enough about the brain and enough about ourselves

(41:53):
to do that, certainly not in our lifetimes, maybe ever. Is
that too cynical? Well, I like to think
that we're capable of everything, except for the fact that
human beings also change over time. So as we gain
an understanding of how things are, we don't necessarily have
an understanding of how things will be. So that's another...
I mean, you would think that our scientific understanding

(42:14):
would start to outpace other factors, because it's not like
evolution happens super fast. But that is still a factor.
Lauren, I side with you mainly on this. I mean,
A Clockwork Orange is one of my favorite novels of
all time, and it's largely because I read the American
edition that left off that twenty-first chapter, which to

(42:36):
me makes it the reader's responsibility to answer the question:
which is the greater evil? Right, which of these
things that come out is the greater evil:
allowing people who have immoral thoughts and who act upon
them to exist, or removing the ability for a person

(42:58):
to make any choice other than the ones that have
been mandated by an authority figure? Even
if that's a range of behaviors, is that better? And
more often than not, I side with the idea that
you shouldn't mess with that, at least not too much.
I do think, to Joe your point, the idea of

(43:21):
pursuing it in a way that is a treatment for
pathological issues makes sense, in that it's treating a person
who otherwise doesn't have that capability. It's literally a
pathological issue, whether it's from an injury or an illness,
whatever it may be. And it's a danger
to yourself or others kind of situation. Yes, that kind

(43:44):
of situation, I would say that makes sense. I think
anything beyond that is at best problematic. I've also seen
some criticisms. I didn't go into them in the notes
because I didn't have time to really read and digest
all the information, but I've seen critics of the idea
of moral bio enhancement say, just based upon the way

(44:05):
scientific and technical progress happen, we would
likely be able to address this only in a
very piecemeal kind of way, which could end up having
disastrous consequences. It's like the idea of having that
superhumanly intelligent machine, and you say, hey, I want you
to find a solution for world peace, and its solution
is to kill everybody, because now there's no way you

(44:27):
could have conflict. That kind of idea, that if
you were to address one part of morality without being
able to affect all of it, you could have some
unintended catastrophic consequences. And I think the capacity for things
to go wrong is so great that it outweighs the
capacity for it to be a benefit to society.

(44:48):
That being said, I would love to see some really progressive,
effective means of having people kind of come to
a moral enhancement that doesn't involve bio enhancement, whether
it's education or some form of outreach, things
like that, so that people have that experience and are

(45:10):
able to expand that small social group to a
larger group of people, and also expand that short-term
gain perspective into a long-term one. Yeah. Another complication
on top of all of this is that, in the
example we gave about, you know, somebody who's already been
convicted of many violent crimes, you'd also have the

(45:34):
problem that people often don't just view prison as, like, a preventative.
It's a punishment, right? And so that would also
be a complication. So imagine somebody has done a bunch
of very harmful evil, and now we're saying, well, what
we could do with this person is give them a
treatment that would mean they never do anything like that again.

(45:54):
a lot of people would not be satisfied with that. They'd say, you
know what, what needs to happen is this person must suffer in
response to what they did. Yeah, this is
where you start to ask, you know, how do you
view the purpose of prisons? Is it meant to be
a punishment? Is it meant to be a center for rehabilitation?
And the concept of the penitentiary being about

(46:15):
making someone become penitent, right, it's in the name exactly. Well,
that was an amazing conversation about concepts that are pretty heavy.
I appreciate that you guys took the time to have
that conversation with me, because, again, I love this
whole arena of thought. It's fascinating to me. And

(46:37):
it's one that I think about a lot. And I
would argue that in a lot of ways, forward thinking
being kind of an optimistic view of what the
future might be, a lot of that optimism depends upon
the concept of compassion. And if you were to argue
that compassion has a less relevant place, maybe your
solution to that issue would be, hey, how about moral bio

(46:58):
enhancements? But as we've discussed here, we're not so
convinced that would be the best approach, especially not compulsory
moral bio enhancements. But I'm curious to hear what our
listeners think. Yeah, you guys, get in touch with us. Yeah,
you can send us an email; our address is FW
Thinking at How Stuff Works dot com, or you can
drop us a line on Facebook or Twitter. On Twitter,

(47:19):
we're FW Thinking. Over on Facebook, you can just search
FW Thinking, our profile will pop up. Leave us
a message there, and we will talk to you again
really soon. For more on this topic and the future
of technology, visit Forward Thinking dot com. Brought to you

(47:50):
by Toyota. Let's Go Places,

Hosts and Creators: Jonathan Strickland, Joe McCormick, Lauren Vogelbaum
