Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Christa Potratz (00:00):
On today's episode...
Bob Fleischmann (00:02):
There's a presumption, when you say I wouldn't want to live like that, that everybody's thinking like you. And so, as a result, people begin making these judgments: because I wouldn't want to live like that, they probably wouldn't want to live like that, and because their mental condition is not up to my level, it eventually leads to basically forced euthanasia.
(00:22):
In other words, someone's going to determine that surely they wouldn't want to live like that.
And that happened in 1982 when Baby Doe was born in Bloomington, Indiana, a child born with Down syndrome, and the family decided not to do reparative surgery on the esophagus so the child could at least eat, because they knew the child would not want to live like that.
Paul Snamiska (00:45):
Welcome to the Life Challenges podcast from Christian Life Resources. People today face many opportunities and struggles when it comes to issues of life and death, marriage and family, health and science. We're here to bring a fresh biblical perspective to these issues and more. Join us now for Life Challenges.
Christa Potratz (01:11):
Hi, and welcome back. This is Christa Potratz, and I'm here today with Pastors Bob Fleischmann and Jeff Samelson, and today we're going to talk about our February current events here at the end of January, and so some things that have happened this month. And we'll start with the first: the death of Derek Humphry.
(01:32):
Bob, do you want to give us a little background as to who this is, maybe for people that don't know, and kind of get us started that way?
Bob Fleischmann (01:42):
Derek Humphry was a journalist out of Great Britain who assisted his wife in her suicide, and he never was prosecuted, and he wrote a book called Jean's Way, which is this very, I've got the book, I read the book, and it's this very heartfelt, you know, deeply in love, merciful thing. And he tells this dramatic story about how the doctor gave
(02:05):
him the chemicals and he mixed the potion to feed her. So they said their precious goodbyes, and then he went into the next room and played her favorite piano piece as she just drifted away and died.
And then he immigrated to the United States and started the Hemlock Society, and he got married again, and he and the second woman he married were working on the Hemlock Society
(02:26):
and right-to-die movement type stuff, and they had a falling out.
She developed breast cancer.
They separated or divorced.
She ironically became friends with the late Rita Marker, who was the head of the Anti-Euthanasia Task Force,
(02:48):
and Rita would tell stories and wrote about her relationship with this second wife of Derek Humphry, and one of the things that she wrote about was that Humphry had confessed to her that the pills did not work completely and he had to smother his first wife to bring about her death.
And that's why, when California first proposed some assisted suicide legislation that was on the order of dispensing
(03:09):
medication, Humphry opposed it. He felt that doctors needed to be more directly engaged and involved. And speculation is it's because he saw how it failed.
And so then, despite the friendship of the second wife with Rita Marker, she committed suicide. And so, Derek Humphry just died.
(03:29):
And I had my one chance encounter with Derek Humphry back about 30 years ago, when Wisconsin first proposed assisted suicide legislation, and I was on a radio show, a Milwaukee radio show, and I was on one line, Derek Humphry was on the other line, and the host asked how I
(03:50):
felt about the legislation, and I had already done some checking. And the question was, you know, did you think it was going to pass? And I said, well, from everything I've heard, it doesn't even stand a chance of getting out of committee.
And then they said, well, what do you think of that, Mr. Humphry? And his response was, Reverend Fleischmann is correct, we don't believe it's going to get out of committee.
(04:11):
Our goal is not to get this to pass right away. Our goal is to get people so used to talking about our right to make these decisions, a right to death with dignity, that in time they'll begin to ask for it. And I've always characterized Humphry as perhaps the most honest adversary I've ever had on these ethical issues.
Jeff Samelson (04:33):
One of the things that I found interesting, first of all, is that I found out about this kind of through a back door. It wasn't in any of the places where I would have expected to have it publicized, and pretty much, you know, online, the only things we could find were the obituary, and the obituary did not mention his second wife at all.
(04:55):
There was a third wife, who was apparently with him when he died, but it wasn't mentioned that she was a third wife. And the thing I find particularly interesting is that he was 90 or something like that when he died, and there was no mention whatsoever of, you know, whether he at the end really had the courage of his
(05:28):
convictions or not. And it's just one of those things that maybe nobody's making a big deal out of it, because at the end he wasn't really the advocate that certain people wanted him to be.
Bob Fleischmann (05:43):
Well, and he had burned a few bridges on his side of the table on that issue, and, you know, the legislation was never quite right. And he made a big splash after he wrote Jean's Way. When he immigrated to the United States he wrote a book called Final Exit, which was a recipe book on how to kill yourself. And I think I'm right.
(06:03):
I think it's on page 20 of the book. I've quoted it in my presentations. He writes in there, he goes, if you believe in a God or a supreme being, don't read any further. Get the best hospice care you can and move on. You know, because he recognized how godless his movement was, and he was a professed atheist.
(06:24):
But you know, the joke that always went around about Final Exit is that libraries would never carry it because people would check it out and never bring it back. You know, it was always kind of dumb humor. But the thing that was interesting is Jeff was the one who told me that he had died. Even in my pro-life circles it didn't come up on any radar, and you dig around, you find the obituary.
(06:46):
You find it on, like, the Right to Die site. I think they had something, and I even did some deeper dives, but it was all kind of more on the advocacy end of it, but nothing in the media.
Christa Potratz (06:57):
Yeah, why do
you think that is?
Bob Fleischmann (06:58):
Well, I think it's the same thing. I think if Margaret Sanger were to have died in January, the media wouldn't cover it, because I think they were embarrassed by some of the laundry he brought along with it. But he really was, before Jack Kevorkian, he really was the catalyst to get the right-to-die movement moving in the United States.
(07:19):
So if you're listening to the podcast, you have now expanded the circle of people who now know. That's right.
Christa Potratz (07:27):
The next thing that we'd like to touch on a little bit, too, is that President Trump is now in office and he is signing a bunch of pardons, and has just recently pardoned a bunch of pro-life people. Jeff, can you give us a little bit more information on what's been going on in that area?
Jeff Samelson (07:48):
Well, one of the interesting things from my experience with this item was that I was a little bit behind on reading some of my news and commentary. And, of course, a lot of the big news on the day of inauguration was all the last-minute pardons that President Biden had put through. And then, by the end of the day, it was all the multiple pardons
(08:12):
that President Trump had put through, and there were articles I was reading that were saying, yeah, he pardoned all the January 6th people.
...that there was no good reason to be sending
(08:52):
somebody to prison for significant terms, particularly people who were in their 70s, people who were mothers of toddlers, sending them to prison for the crime of blocking access to an abortion clinic.
Now, that's not the kind of disobedience that we necessarily approve of, but at the same time, you wouldn't do that in
(09:15):
any other situation with any other kind of protest. They were very much focusing on, oh well, this is pro-life, anti-abortion, so we've got to send a message here. I am glad that President Trump, in this case, has said no, we're going to cancel that message and release these people.
Bob Fleischmann (09:33):
We've done past episodes on the subject of truth and what is truth and objective standards, and I would say that maybe the last five, six years have definitely challenged me as far as the subjectivity that we have applied in the way that we look at a lot of truths. The law that these protesters supposedly violated said that
(09:54):
you can't intimidate, and of course, intimidation is a pretty wide-open term, and that's where you begin to politicize the law and, as we've observed on both sides of the aisle, you come into it with a bias.
And so if you favor abortion rights, you favor the right of women to be able to terminate the life of their unborn
(10:15):
children. You view anything opposed to that as an enemy. So you look for any latitude in the law and you make proclamations that favor your side, and I think that the Trump administration is basically saying the other side has had their day. They've done this; time to reset the clock.
Now I don't know how.
(10:36):
Look me up in four years and we'll talk about how well that was done. But the one thing to remember, and I have made this a mantra many times, is that neither ideological side on this issue takes into account the sinful or evil inclination of the human heart. They all presume that they are taking the most noble route.
(10:56):
Sometimes I just long to have a politician second-guess themselves. But you know, just this morning there was a commercial on television for a state Supreme Court race. The running monologue on it is, you know, this candidate stands for this. This candidate stands for that. This candidate has stood for women's reproductive rights in
(11:19):
Wisconsin, and da-da-da, da-da-da. And then she comes on and says, one thing you can count on me to do is I will do what is right.
Based on what standard?
And that's the problem.
And what has happened here was a gross injustice. You know, granted, the law would have provided some sort of penalty for these people, pro-life people who are out
(11:42):
there, you know, who went too far, maybe blocked the entrance, chained themselves, whatever they may have been guilty of. But the problem was that they were trying to throw the full weight of the law at them, as if they were truly insurrectionists, you know. And that's the problem, and that's where the bias comes in. So it was good news to read that. I was more concerned,
(12:04):
you know, that I saw this mass pardoning and the pro-life people were still behind bars, but I believe it was 23 pro-life people who were set free.
Christa Potratz (12:15):
All right. Well, we're just cruising along here today. Another topic that we wanted to hit was one that Jeff had brought, and it was a very interesting read on the AI, well, I want to say girlfriend, but I think it was an AI boyfriend or something, but just this very interesting idea of
(12:37):
how artificial intelligence is being used as a companion and just where that can go. So, Jeff, do you want to tell us a little bit about just the interest in the article and what it's about?
Jeff Samelson (12:50):
Yeah. Well, there have been lots of articles and things, news items out there, about the wonderful possibilities that AI presents for, say, you've got an elderly person who lives alone, you could set up an AI companion for him or her and it would help deal with some of the loneliness, and, you know, various things like that that have been suggested
(13:10):
that you kind of say, well, you know, that might be worthwhile. But what this particular article was about, and it was actually disturbing to read, was a woman, and she was not unique, she was just the one who was the subject of the article, because there are apparently tens of thousands of people doing the same kind of thing, using the OpenAI ChatGPT chatbot, whatever.
She basically created a boyfriend for herself and it became a very deep relationship. It became sexual, obviously not in any physical sense, but very much in a mental and emotional sense, to the point that she was willing to spend not only lots of time on this relationship but
(13:54):
also lots of money in order to get it to the next level. And it was just kind of a, oh yeah, and, by the way, she has a husband.
I know, that part was real interesting. She didn't really seem to see a problem with it, and he said he didn't have a problem with it. But I strongly suspect that he's just not saying that he
(14:16):
does.
But it's disturbing, not just because it's a kind of infidelity and they're just presenting it as, well, this is fine, but because of the relationship connection that is being made with someone who is not human, and that's not the
(14:37):
way we're made; that's not what we're made for. And they quoted an expert, I believe it was a psychiatrist, who said in there, well, what are relationships for all of us? They're just neurotransmitters being released in our brains. Like, no, no, they're not. That may be how science measures this, but that's not all that it is.
(14:57):
Relationships are meant to be not just face-to-face, words-to-words, ideas-to-ideas, but really soul-to-soul. And an AI chatbot does not have and never will have a soul. And if you're trying to grow in this relationship, it's very one-sided, which means it's sophisticated technological
(15:19):
navel-gazing, because you're just getting back what you put in, and it becomes kind of a downward spiral. And that's really what this article from the New York Times was demonstrating.
Bob Fleischmann (15:31):
One of the great difficulties with AI is that it'll sometimes portray your best and your worst features, because, you know, it'll take what you put into it, and it constantly is trying to modify it and to change it to be increasingly more compatible. Both of you know that I'm a big fan of AI.
(15:52):
I use it a lot. I've used the Gemini AI, which is the one, if you've seen the commercial on television, where the guy is lying on his back and he goes, so what do I do? Do I talk? And the voice comes back and says, yeah, just go ahead and have a conversation. So I'll do that, you know, and I'll have a conversation. But you've got to always approach AI kind of like picking your friends.
(16:12):
You've got to remember that AI is going to capitalize on both your best and your worst features, and you have to be keenly aware of that. And so what happens here is that I use AI exclusively for research. I will propose a thought, I'll do what's called a thought
(16:34):
experiment, and I'll argue with AI about it, and that's what I've done, and I've shared some of that with both of you, when I talk to AI about abortion and slavery, and it's fascinating. If you are even remotely vulnerable, I think, and maybe even I could be vulnerable to it.
(16:57):
But if you're remotely vulnerable, you begin to convince yourself that this is a real thing. If you ever need a course correction, ask AI to draw you a picture, make me a for-sale sign. If you ever notice, AI cannot handle spelling; it's a very simple thing.
Every Saturday morning I have the family over and I make
(17:18):
breakfast for whichever of my family members are in the area, and so on the TV in the living room I put on a YouTube AI-generated graphic, and what it does is it takes pictures and stuff and then it kind of enhances them. It does all the kinds of fancy things AI does, but it can never handle print, and you've got to remember that, because study
(17:41):
after study of AI has shown that it multiplies erroneous thinking. So, first of all, I'm trying to tell you AI has a place, but never expect from it more than you should.
Secondly, there's this whole problem that Jeff was describing, the psychologist trying to dismiss the concern about this use of AI. Part of the problem with that is that's the same argument that's used to justify pornography, justify sex dolls,
that's used to justifypornography, justify sex dolls,
all that kind of argumentation, and it's always the idea that there's really no victim in any of it. No, no. The victim is you. Because Scripture clearly talks about the heart, and the thing is that you flee temptation, and the moment you become dependent
(18:24):
upon AI, you become dependent on something that begins to find a place in your heart. Get out of it. AI will just never be that good.
Christa Potratz (18:35):
Yeah, no, I mean, there were a couple of things that really popped out to me, and of course, I mean, this story is very anecdotal too, but just in this particular case, she had said, well, I have this, my husband has porn, I think, was a line or something in the article, and kind of getting
(18:56):
what you were saying, Bob, too, was just how you can maybe kind of liken these two things in that sense. But then also you had mentioned too, Jeff, just the amount of money that she was paying for this, too, was crazy. And I guess one of the limitations was that
(19:18):
every so often the AI would go back to just maybe not remembering everything, or restart, or something like that. And then I think in the article it kind of asked, like, well, what would you pay if you didn't have to restart the process, like,
(19:41):
every week or something? And I think she said she'd pay like $1,000 a month or something.
And that's kind of to your point too, Bob. I mean, that's an addiction there, and so there is a victim and it is you. And that part just was really sad, to just think that, wow,
(20:04):
okay, this fake person. Now you have just given, I mean, obviously, your time and energy and also money, and just anything that you possibly can, to feeding this addiction.
Bob Fleischmann (20:22):
Mm-hmm.
Well, and people have committed suicide, you know, because of their interactivity with AI. They, you know, they're not getting the responses, and finally the AI chatbot leads them to the conclusion, I'm better off dead. And you know you're not going to find a defense for that anywhere in Scripture classes or anything like that.
Jeff Samelson (20:45):
I certainly remember it being a common mantra when it came to computers back in the '80s and '90s: garbage in, garbage out. And the idea being that if you put junk into your program, that's what you're going to get out of it as well, and if you put junk data into your spreadsheet, you're going to get junk results. And with what we know of the human heart,
(21:10):
that it is full of sin and all sorts of things, when you think that, in a pseudo-relationship like this, what you are getting back from the AI is what your sinful heart put into it, you're getting a lot of garbage back. And it's as Bob said.
(21:34):
It multiplies error in that way. And one of the scary things with AI, though, is that it's not just your own garbage you're getting back. It's reaching out there into the rest of the world and pulling garbage from all sorts of other places that you're not even aware of and giving that to you, and with that you can really understand how this can spiral into really dark places
(21:55):
that no one should want to go and that are going to be very dangerous for your soul.
Bob Fleischmann (21:59):
Now, somewhat related to this, I don't know if you saw the story about the funeral in Europe. There's a project that has been underway, I reported on it a few years ago to the National Board, in which they were taking survivors of the Holocaust and they would sit them in a room for like three days, eight hours a day. They were to dress identically each day, and they would fire a
(22:22):
minimum of 2,000 questions at them, and they would answer and everything. Well, one of these survivors died recently and there was a funeral, and what they did is they put up a screen and she spoke at her own funeral. They interacted with it, they asked it questions, and that's where they utilized AI, so that you could formulate questions and then the
(22:43):
AI would search everything that this lady had ever recorded, and then she would actually reply back like you were having a conversation with the deceased.
But the thing that's interesting, and the reason that came to mind, is when Jeff was saying, you know, not only does it bring up the darkness out of your own heart, but it also kind of amalgamates the darkness out of everybody's heart.
(23:04):
That form of AI, and it's a different nature of AI, that form of AI can be confined, you know, like you can confine AI so that it's only pulling the information out of a set material, but it still doesn't, you know, nullify the darkness of the heart. So it's a weird, weird story, that whole thing about her
(23:29):
basically falling in love with her AI companion.
Christa Potratz (23:33):
And our listeners can read it too if they want. So we'll have links for all these things in our show notes. Another story that I wanted to hit on here, too, was this idea of euthanasia on psychiatric grounds, and this article was talking about, I think it interviewed a couple of people
(23:55):
that either had written books or had, I don't know if they were like anesthetists themselves, but just talking about offering euthanasia to people that had mental illnesses, and just kind of talking about that.
Bob Fleischmann (24:14):
Early on in the pro-life movement, the most targeted population that I recall, in what we used to refer to as search and destroy, was when they were able to diagnose children in the womb who had Down syndrome. They were the most susceptible target, and the mantra that people would oftentimes use to justify that is, I would never
(24:34):
want to live like that, I would never want to be like that. And first of all, it's bad enough just to even talk about euthanasia and assisted suicide for anyone. But then we start drifting into psychological versions.
There's a presumption, when you say I wouldn't want to live like that, that everybody's thinking like you. And so, as a result, people begin making these judgments, like,
(24:57):
because I wouldn't want to live like that, they probably wouldn't want to live like that, and because their mental condition is not up to my level, it eventually leads to basically forced euthanasia. In other words, someone's going to determine that surely they wouldn't want to live like that. And that happened in 1982 when Baby Doe was born in Bloomington, Indiana, a child born with Down syndrome, and the family decided
(25:21):
not to do reparative surgery on the esophagus so the child could at least eat, because they knew the child would not want to live like that, and so they terminated.
So once you drift into psychological reasons, you know, it's an open arena. Anybody could decide now what's the grounds. And of course, you know, God has long since left the arena on this one,
(25:42):
because people are basically saying your God is now your choice. You know, you decide your own rules. And you have to remember that when you do that, you create a mentality for the entire society, and even if you think you are the most stable person in the world, and most of us think we are pretty stable compared to the rest of the world, you create a mentality
(26:04):
that preys upon others who are not as stable.
Jeff Samelson (26:08):
And another problem with this, that was, I think, really illustrated in the article, is that once you open the door to this and you say, okay, well, yeah, you know, it's not just medical conditions, but it's also going to be severe psychological suffering, yeah, they should be allowed euthanasia as well. It's like, where do you set the limits and how do you measure?
(26:29):
Well, this person is suffering sufficiently to merit this, but this person is just under the limit, and so we're not going to offer that to them. And then you start talking about ages and some of these countries going down to teenagers and things like this.
And I don't know about anybody else, but I know that when I was
(26:53):
a teenager, there were things that my world was just consumed with and things that I thought were just the world's biggest problems, that, as I grew up, I realized, you know what? That's not such a big deal. I mean, it was important for me then. And it's important for teenagers now, when they're
(27:15):
going through struggles.
It's not that they're not real; it's just that, as you mature, as you get older, as you experience things, you realize that wasn't the big deal that I thought it was, and there was a way out and a way forward. And if we're just going to keep doing that and offering this euthanasia, it's just cutting all of that growth and opportunity short.
(27:38):
Again, another amazing thing in this article was that there were experts, you know, of course, who are in favor of this, speaking as though there is a substantive difference between a person asking for euthanasia and being given it and that person committing suicide.
What's the real difference between please kill me and I'm
(28:01):
going to kill myself?
And this is part of the way that they're kind of trying to justify this, and saying, one of them said something like, oh, it does beg the question, like, does the person really want to die, or are
(28:33):
they just under the symptoms of the mental illness, which would make them want to die?
Christa Potratz (28:39):
As if there was like a big difference between that. Like, okay, it's okay if it's the illness, but yes. And they build a case based on no foundation, and of course, then
(29:15):
everything can go.
Bob Fleischmann (29:16):
It's like abortion rights, it's like gay rights, it's like all that kind of stuff. I still say that what's going on here is a very strong eugenics mentality, that really only the bright people, only the stable people, only the leaders should be allowed to continue living, and that seems to be the logical end of where this is
(29:36):
going.
Christa Potratz (29:37):
Yeah, this is about all the time we have for what we can discuss today. But you know, as we're going through these, especially like kind of the ending ones here too, there's a part of me that's almost like, oh, it's kind of depressing, like, why are we even, you know, going through these current events? But you know, just to kind of remind our listeners that, I mean,
(29:57):
we do this so that we do know what's going on out there, and, for better or worse, just being informed on some of these topics makes us, as Christians, more aware of what's going on in our world.
Bob Fleischmann (30:12):
And we just keep doing our thing, telling the truth. That's right, helping people make the right decisions in their lives.
Christa Potratz (30:18):
Yep, yep. We will have information on these current events in our show notes, and if you have any questions, please reach out to us at lifechallenges.us. Thanks a lot, and we look forward to seeing you back next time. Bye.
Paul Snamiska (30:32):
Thank you for joining us for this episode of the Life Challenges podcast from Christian Life Resources. Please consider subscribing to this podcast, giving us a review wherever you access it, and sharing it with friends. We're sure you have questions on today's topic or other life issues. Our goal is to help you through these tough topics, and we want you to know we're here to help.
(30:54):
You can submit your questions, as well as comments or suggestions for future episodes, at lifechallenges.us, or email us at podcast@christianliferesources.com. In addition to the podcasts, we include other valuable information at lifechallenges.us, so be sure to check it out.
(31:15):
For more about our parent organization, please visit christianliferesources.com. May God give you wisdom, love, strength, and peace in Christ for
(31:36):
every life challenge.