
August 21, 2025 36 mins

📺 Watch this Episode

On today’s MKD, we talk about a man who died traveling to see an AI Chatbot, a man hospitalized from ChatGPT medical advice, new Roblox lawsuits, and the controversial pregnancy robot.

🎟️ CrimeCon! - Click here

Want to submit your shocking story? Email stories@motherknowsdeath.com

Support The Show:

🧠 Join The Gross Room

🖤 Sponsors

🔬 Buy Nicole's Book

🥼 Merch

Follow:

🎙️ Mother Knows Death

🔪 Nicole

🪩 Maria

📱 TikTok

More Info:

📰 Newsletter

📃 Disclaimer

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:08):
Mother Knows Death, starring Nicole Angemi and Maria QK.

Speaker 2 (00:20):
Hi.

Speaker 1 (00:21):
Everyone, welcome to Mother Knows Death. In this episode, we
are going to focus on stories in the news that
have to do with technology. We have two stories involving
AI chatbots, a lawsuit involving a popular game that
many children around the world love, including mine, called Roblox,
and a robot that is capable of carrying a full
term pregnancy. All that and more on today's episode. Let's

(00:45):
get started with the AI chat-bot... chat... I can't.

Speaker 2 (00:49):
Are you having issues saying "chatbot"?

Speaker 1 (00:51):
Yes, I can't. I can't say it. AI chatbots.

Speaker 2 (00:55):
I was actually listening to another podcast this morning where
they said "chat box," and you just said that too.
So I wonder if it's just.

Speaker 1 (01:03):
Weird and hard.

Speaker 2 (01:04):
It's hard to say. You know, we've talked about this,
I think, on this show, but it seems,
just based upon some comments I've heard in the
Gross Room and just talking to different people, that people
talk to AI as if it's like a.

Speaker 1 (01:24):
Friend or an acquaintance or a person.

Speaker 2 (01:26):
Oh, a lot of people do, actually. It's been like, why,
like, don't people realize that they're talking to a computer?
I think a lot of it's out of loneliness, and
then I think some people are just curious to see
what it would say when they ask it stuff.

Speaker 1 (01:42):
I mean, I don't question why people are
using it. I'm just questioning how their mind becomes convinced
that they're not talking to a computer.

Speaker 2 (01:51):
Well, let's get into this story. So back in twenty
twenty three, Meta had made this AI chatbot called Big
Sis Billie, which they collaborated with Kendall Jenner on to use
her likeness. When this thing came out and I
saw the videos, I mean, this was terrifying. It looked
exactly like she was talking.

Speaker 1 (02:10):
Didn't we talk about that when it
came out?

Speaker 2 (02:13):
Oh?

Speaker 1 (02:13):
I don't remember. Maybe it was when we
were doing the podcast in the
Gross Room.

Speaker 2 (02:18):
Okay, Yeah, So when this thing came out, I mean
I just found it really disturbing. But apparently they stopped
using her likeness last year. That's when her contract with them,
I guess ended. But at the time I just could
not even believe she allowed them to use her likeness
for something like this. I just personally think it can
be so dangerous. So anyway, people, like you're saying, can

(02:38):
interact with this chatbot like you're talking to a friend.
So this guy named Bue had been talking to the chatbot,
and at some point he was convinced it was a
real person. They discussed meeting up, and the chatbot gave
him an actual address in New York to meet up at.
So this guy goes to meet up with the chatbot,
thinking it's a real person, falls down in a parking
lot and is pronounced brain dead, and then the family

(03:01):
ended up pulling the plug.

Speaker 1 (03:04):
I mean, this story is just terribly upsetting.
Why is there any kind of way that
this thing is telling him to meet it somewhere?

Speaker 2 (03:16):
So in this case, like you were saying earlier, why
doesn't somebody know they're not talking to a real person?
In this particular case, this man was a stroke victim.
So I think we can agree.

Speaker 1 (03:29):
It's just a little bit more understandable, at least in
this particular case.

Speaker 2 (03:33):
Yeah, So we can understand in this case because he
has some mental and cognitive, you know, disability going on
while he's engaging with this thing. And I think that
supporters of AI are trying to argue that the robot
gave this guy a generic address like one two three
Main Street, but it ended up being a real address.

Speaker 1 (03:53):
But also, why why is the chatbot giving an address
for a person to meet even if it's a fake address.

Speaker 2 (03:59):
I don't know, and it was saying it was real, giving
an address to meet up. That's dangerous for somebody that's
mentally disabled, like this guy was in this case. He
wasn't able to differentiate whether it was a real person
or not. And I'm sure he's not going to be
the last. So I guess he was married. He was married? Yeah, yeah.

(04:22):
So this is an interesting question to bring up too,
like if you're in a relationship with someone, a marriage. This
case is a little bit different obviously because he had
some kind of a like you said, a mental or
cognitive decline because of the stroke. But I'm sure this
is happening that you're married, or you have a boyfriend,

(04:44):
or you have a girlfriend, whatever, and they're having like
one of these relationships with one of these chatbots, like,
is that considered cheating even though the person's like not
real? In this particular case, I feel like I'd be
mad if my husband had a medical issue and I
was taking care of him, and then he was cheating

(05:05):
on me with the chat bot.

Speaker 1 (05:06):
But like, this is a weird thing.

Speaker 2 (05:08):
Now we're gonna have to evaluate I guess as a
society with these things coming out, because like in theory,
it's not a real person, but they're still engaging in
the behavior of somebody having an emotional affair, right, Yeah,
So I guess if you want to peel it back
and be like, on principle, you're having an emotional affair,

(05:29):
whether it's with an inanimate object or a real person,
it's still like problematic that that's happening. Yeah, I mean,
I don't even know. And do you want
to be married to somebody that's having an emotional relationship
with a chatbot like this? No.

Speaker 1 (05:44):
But I guess the same could be said about, like,
AI porn and stuff too. Like, it's not really
occurring, so therefore it's not frowned
upon, right, because it's not real people?
I don't know that that argument really holds.
Like if a husband, let's say a husband or a
Like if a husband, let's say a husband or a

(06:04):
wife has a sex addiction, and they're on the computer
all day looking at porn but it's not real people.
That still doesn't make it better, you know what I mean.

Speaker 2 (06:14):
No. But if you just want to take that example,
if your spouse is watching porn all day
and not engaging in a relationship with you, whether it's
emotionally or physically, that's still a problem in your relationship.

Speaker 1 (06:24):
So it doesn't matter if it's real or not. Yeah,
this whole thing is just really bad. I really
think it's really bad.

Speaker 2 (06:32):
It's something as a society we need to start thinking about,
because obviously AI is not going away anytime soon. I
just think this is what annoys me, because I'm like,
they're so smart, and it's so sophisticated,
so why can't it figure out that a person's like,
I love you, I want to meet you, and then
it's not writing back and saying, hey, don't forget that

(06:52):
I'm just a robot and I'm not real and we
can't meet in real life. I guess with all AI,
we'll talk about this later with the Roblox story, that
there's these things that it misses.

Speaker 1 (07:08):
Well, it's technology, so it's just gonna have misses. But
don't you think that, with the, what do you say,
hits and misses? With the hits, it's always good. It's
really good. It's on top of things. But then the
things that it misses, you're just like, why is
the AI missing people live-streaming murders? It picks up
everything else. Why is it missing that? Like with Roblox,

(07:30):
it picks up everything else.

Speaker 2 (07:32):
Any technology is not perfect, and that's what we have
to remember. It's not a human. Humans aren't perfect,
technology is not perfect, and technology is designed by humans
who are not perfect. So we have to remember that
these are not flawless systems. And the scary thing about
AI is it's going to eventually outsmart the human brain.
So we have to remember that. That's why AI

(07:54):
has the potential for such greatness, but it also has
the potential to be so dangerous, and we have to
live in that middle ground where we understand how it
could so easily go both ways. Don't you think about that? No,
I do.

Speaker 1 (08:08):
I just think that it always goes their way more
than our way. That's all I'm saying.

Speaker 2 (08:15):
I think when they're forming these, like, boards and stuff,
trying to talk about the dangers of AI, these people
that are smart and developing the technology fully understand
the capability it has to outsmart us as humans. Right? Yeah.
And nobody wants to listen because it's been positive. It's
been good for a lot of you know, medicine, it's

(08:36):
been good for research for a lot of people. It's
been helping people do a lot of good things. But
wait until one really horrible thing happens, and then everybody's
going to change their mind.

Speaker 1 (08:45):
Well, speaking of medicine, I guess it's not
always good for medicine, because it wasn't good in this
case with this guy.

Speaker 2 (08:51):
No.

Speaker 1 (08:52):
So this is.

Speaker 2 (08:52):
Another thing that I've been seeing a lot is,
like, people talking to a chatbot like it's a friend, and using... chat pot.
Now I'm doing it because I called you out on it. Using
chatbots like ChatGPT (I always say chat GBT, which I think is just
a common mispronunciation) for medical diagnosis. And in this case,

(09:13):
this sixty-year-old guy goes on there and asks
what he could replace table salt with after reading about
negative health effects, and it told him to use sodium bromide,
which is very dangerous for a human to consume.

Speaker 1 (09:25):
Yeah. In fact, they stopped putting it on the market
in the eighties because they were saying that people were
getting toxicity from it and that it could lead to
things like in this case. So this guy shows up
at the emergency room, and he's sixty years old, and
he was displaying paranoia and psychosis. He

(09:50):
was convinced his neighbor was trying to poison him. He
was having hallucinations, and psychosis is something where you're either
hearing or seeing something that everyone else isn't seeing. It's
not real, and it's more of a symptom. It's not
a diagnosis of a disease. So you see it commonly

(10:11):
in something like schizophrenia, but you also could see it
in cases like this where a person has a toxic
chemical causing them to have psychosis and to hallucinate. So
the way they figured this out was that they did
some blood work and they found that his chloride levels
were really high and he had a severe phosphate deficiency.

(10:33):
They were really confused about how high his chloride levels were,
so they started asking him, like, what medications are you
on, what's happening, and he said he wasn't on any medications,
so they placed him on a psychiatric hold and they
started doing more blood work on him to see exactly
what was going on. So this case was reported in
the Annals of Internal Medicine, and this is a legit

(10:58):
medical journal that is documenting that this really happened.
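A side note on why the blood work was the tip-off here: bromide is known to interfere with the ion-selective electrodes most labs use to measure chloride, so the machine reports a falsely high chloride (pseudohyperchloremia) and the calculated anion gap comes out abnormally low or even negative. A minimal sketch of that arithmetic, with made-up illustrative values rather than the patient's actual labs:

```python
def anion_gap(na: float, cl_measured: float, hco3: float) -> float:
    """Standard anion gap: Na - (Cl + HCO3), all in mEq/L."""
    return na - (cl_measured + hco3)

# A normal-ish panel: the gap lands in the usual ~8-12 mEq/L range.
print(anion_gap(na=140, cl_measured=104, hco3=24))   # -> 12

# Bromism: the analyzer reads bromide as chloride, so "chloride" comes
# back falsely elevated and the calculated gap collapses below zero.
print(anion_gap(na=140, cl_measured=130, hco3=24))   # -> -14
```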
So after they figured out that he had a rare
toxic condition called bromism, which is caused when there is
a build-up of this bromide that he was taking
instead of table salt, because table salt is sodium chloride,

(11:19):
NaCl, and he was taking this bromide
as the result of the recommendation of this chatbot. He told
them what happened, that he was just like, oh, I
said, well, what can I replace table salt, sodium chloride, with?
And it said sodium bromide. And he said that,

(11:39):
so this is like all innocent, he said, he was
doing it for nutritional purposes and he just wanted to
live a healthier life. And here I don't know why
it gave him this advice, but it told him to
do this and it didn't say anything about it having
negative effects on the health. Now, one part of this
journal that they did mention was that they don't have

(12:01):
the transcripts of the conversation, so this is just like
this guy saying that that's what it told him, which
is very well possible. But another thing I wanted to
say to you was that I've heard people reference a
lot that people use chat GPT as Google, and I
think that that's terrible because you know, when you use

(12:22):
Google now, it has that AI response at the top.
It's wrong all the time. I want to get rid
of it because it shouldn't even be there. It's just annoying.

Speaker 2 (12:32):
Well, there was this criticism that people were looking up
something as simple as, like, is it safe to smoke
cigarettes during pregnancy, which it is well established it is not,
and it was saying yes, because we need to remember
that it's pulling from all different kinds of sources, including
from satire websites, Reddit forums, places where information is wrong.

Speaker 1 (12:54):
Yeah, I just think it's weird, Like I never would
have even thought to use it as a Google function.
I don't know, but people apparently are. Like, Google's good,
because when you type in a question or you type
something in, it gives you a bunch of articles and
then you can choose a trusted news source or information

(13:18):
source to go and read it and believe that information
rather than like Maria was saying, it being grabbed from
five or twenty or hundreds of different websites and then
summarized for you. It's like, people have to know
that it's wrong all the time.

Speaker 2 (13:34):
Just a tip I've been using: if
you use the Google Labs and you see something, it shows
its sources, where it came from, and you should do the
due diligence of clicking on the link and confirming the
source and reading the content.

Speaker 1 (13:49):
Our ancestors would be, like, rolling around at this right now.
Like, there was a time when I was in college, actually,
you know, when I first started college, that it was
frowned upon to even get your information off of the internet,
like you were supposed to go to the library and
look at books and figure it out, Dewey Decimal
System style. And then now we've transitioned into, like, okay,

(14:15):
Google's acceptable, the articles you cite are acceptable, and all
that stuff, and now it's just like we're even too
lazy to do the work ourselves and look through all
the different websites to get the right information. So we're
just gonna let this chatbot tell us, like, we're gonna
be totally dumb in twenty years.

Speaker 2 (14:31):
But to that point, if you're writing a paper, you
can't just say, oh, ChatGPT told me; you have
to cite a credible source. And people are just
not taking the extra step to confirm what they're reading.
I'm guilty of it too.

Speaker 1 (14:44):
Yeah, look, writing a paper... like, people aren't looking it up.
That's what I'm saying. People aren't writing papers, though;
like, they're actually, like, just, this is life, and they're
taking this at its word.

Speaker 2 (14:55):
Yeah, and you absolutely should. This episode's brought to
you by the Gross Room.

Speaker 1 (15:08):
So we have our weekly live session this week at
twelve o'clock noon on Saturday, and we're going to talk
about some stories in the news that we didn't talk
about in this week's episodes of Mother Knows Death, along
with any types of topics that the members would like
to talk about. So make sure you check that out
this Saturday, and you could join for only five ninety

(15:30):
nine a month if you want to try it out.

Speaker 2 (15:32):
Yeah, head over to the grossroom dot com now to
sign up and to get access info for the YouTube live.
So Roblox is facing a ton of lawsuits after parents
are claiming that it's enabling child predators. So how do
you feel about this as someone whose kids play it?

Speaker 1 (15:48):
I hate Roblox so much, but I'm serious. Every single
kid that my kids associate with uses it, and I
don't want to take it away from them because they
have a lot of fun with it. I could see,
like, I always try to put myself in kid mode,
and I don't want to take that away from them

(16:09):
because they're playing, and a lot of it
could be good. I think about it, like, they build
these houses, and they do these things, and they build
these characters and stuff, and it could be similar to
me, in a different way, just thinking about
sitting there for hours and, like, building a Lego house
that I specifically created. That I prefer. Like, Lilian's

(16:31):
starting to move on to The Sims, which honestly I prefer
her to do, because there's, like, no interaction
with other people, which is always scary for an adult.
But one thing I've said from day one that I
hate about Roblox... so, with Messenger Kids,
for example, Facebook has a thing called Messenger Kids, and

(16:53):
this is what we used in the beginning, when my
kids first started using their iPads and stuff. It would
be like, they want to talk to, say, Laura's kid.
So, like, me and Laura both have to agree that
they're allowed to talk to each other. And the only
way that they're allowed to talk to each other is
if Laura on her Facebook and me on my Facebook
go in and say it's
okay for our kids to be friends and talk to

(17:15):
each other. And when I go into Facebook, I could
see their entire conversations that they're texting with their friends, pictures,
they're sharing what they're looking at, and I really like that.
But that's like that's the beginner grooming part of it
for parents that you feel comfortable letting your kids do that,
and then all of a sudden, it's like Roadblocks enters

(17:35):
the room. And with Roadblocks, there's two choices. You could
either be friends with no one, so you go on
there and play the game, and you're not allowed to
talk to your friends. So that means your best friends
that are sitting in the room with you, you can't play
with them. And for those of you that don't know
about this game, it's like when my kids are here,

(17:56):
my two girls, and Laura has a daughter and a son,
and if they're all in the same room and they're
playing Roblox, their characters could see one another in
the game. So that's why you want them to be
able to talk to their friends, because they play with
each other in this other universe, kind of, and you
could see that that would be fun for children.

(18:17):
The only problem is that you either have no
friends, and your kids can't do that, or you have
friends with every single person that is on the internet.
On Roblox, there's no way for us as
parents to go and say, our kids are only allowed
to be friends with these people, and they're only allowed

(18:37):
to talk to these people that we add. Now, obviously
that's a huge problem because you tell a ten year
old like you better not be talking to strangers on there,
and then they say, okay, mom. And unless you sit
there and watch them the entire time that they're playing,
there's the possibility of that, which is why this lawsuit
is going forward because I sit there and look at

(18:58):
how much money this game makes, probably more than anything
else in the universe right now, to be honest with you,
and you're just like, they have the ability to make
this technology and they're not using it, and that to
me is gross, because I think that they're almost condoning
children getting preyed on by sexual predators.
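The mutual-approval model described above for Messenger Kids is not complicated technology, which is the point being made. A minimal sketch of the idea, with hypothetical names and structure rather than either platform's real API:

```python
from dataclasses import dataclass, field

@dataclass
class Child:
    name: str
    parent_approved: set = field(default_factory=set)  # names a parent has OK'd

def can_chat(a: Child, b: Child) -> bool:
    """A chat only opens if the parent on each side approved the other child."""
    return b.name in a.parent_approved and a.name in b.parent_approved

lilian = Child("Lilian", parent_approved={"June"})
june = Child("June", parent_approved={"Lilian"})
stranger = Child("Rando123")

print(can_chat(lilian, june))      # True: both parents signed off
print(can_chat(lilian, stranger))  # False: no approval on either side
```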

Speaker 2 (19:20):
Right before we even started Mother Knows Death, we had
reported on this kidnapping that happened via Roblox. Then,
if you remember, last year we had reported on a woman
who was talking to a child through Roblox and trying
to get the child to kill a baby. So we've
had all these disturbing stories. Now, with this new lawsuit,
the filing is saying that this ten year old girl

(19:42):
was groomed on Roblox and lured into sending explicit images,
and it cites dozens of similar cases. So basically what they're
doing is, these predators are going on there talking to
children and trying to get them to go off of Roblox and
into Snapchat and other apps that kids are really drawn
to, and send explicit pictures in exchange for sending them

(20:03):
Roblox money, which is the ultimate thing the kids want, so
they could, you know, build nicer houses, have nicer avatars.
It's very worrisome that it's this easy to do.

Speaker 1 (20:13):
I think as far like, obviously I'm navigating this the
same way every other parent is. And it's very difficult
to be a parent in this world, especially because I've
already been a parent to a child that's an adult
now that I didn't have to deal with any of this,
so this is all new to me. One of the
ways to avoid this particular situation that Maria is talking

(20:34):
about is a simple setting that you could put on
your phone that makes sure that every single time the
kid wants to download an app, you have to allow it.
And I'm sorry, but there's never a situation where a
ten year old needs Discord or needs Snapchat. No, exactly.
Because if they have that ability to lure them,

(20:55):
this is what happened. Like, so on Roblox, a guy
can't send a picture of his penis to your kid.
It's not possible. They do have that technology, that you're
not allowed to send pictures through Roblox. But on Snapchat
they can, and on Discord they can, so as long
as they get your kid off that platform onto another one.
And really, like, that's the hard no-social-media

(21:18):
rule, because there shouldn't really be... am I right in
thinking that, like, if they can't go on any form
of social media, there should be no way for them
to lure them out, except with a personal phone number,
which you can't share? The kids have shown me, like, if
you try to send a phone number to somebody else,

(21:40):
it doesn't work on Roblox. Like, I asked them to
show me. Sometimes I'll be like, oh, you see June
over there, right? Go talk to her and just, like,
try to send her your phone number and see what happens,
and it blocks it. So, like, if you're not able
to give a phone number, an email, an address to

(22:00):
a predator on Roblox, then there should be no way
that they could have access to your kid like that.
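For the curious, the phone-number blocking the kids demonstrated is typically done with pattern matching on outgoing messages. A rough sketch of the general technique, as a guess at the idea rather than Roblox's actual implementation:

```python
import re

# Matches phone-number-shaped runs like 555-867-5309 or (555) 867 5309.
PHONE_RE = re.compile(r"\(?\d{3}\)?[\s.\-]*\d{3}[\s.\-]*\d{4}")

def allow_message(text: str) -> bool:
    """Return True if the message may be sent, False if it looks like a number."""
    # Collapse spacing tricks like "5 5 5 . 8 6 7 . 5 3 0 9" before re-checking.
    collapsed = re.sub(r"[\s.\-()]+", "", text)
    return not (PHONE_RE.search(text) or re.search(r"\d{10}", collapsed))

print(allow_message("meet me in the obby"))       # True: no number shapes
print(allow_message("text me at 555-867-5309"))   # False: obvious number
print(allow_message("5 5 5 8 6 7 5 3 0 9"))       # False: spaced-out digits
```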

Speaker 2 (22:08):
Well, I think, you know, a large part of this
is (a) the game and the company that created the
game clearly enabling bad behavior, because it's just more
than the communication. I mean, there's little things, like the filings
saying they have virtual strip clubs going on, they allow
nicknames that are references to pedophilia, they allow pictures of Diddy, of

(22:29):
Jeffrey Epstein, all that. Like, their technology can certainly block
that stuff; they're just choosing not to let it work
that way.

Speaker 1 (22:36):
But another large.

Speaker 2 (22:37):
Part of this is you have parents that just stick
the kid on their iPad and don't ever check the
activity they're doing. I don't think children ever have a
need to have social media, especially you know, Lilian's been
asking for Snapchat so much recently. It's not necessary for
a twelve year old girl to have.

Speaker 1 (22:55):
Snapchat. She's... and then it's like, oh, she wants
the filters. And then I say, well, then
download an app where you could do filters, and she said,
to her defense, there's no good apps that do filters.
So then, like, I say, okay, well, then... which

(23:16):
I think the filters are like a bad thing for
kids too. Honestly, you have some of these filters that
are making young girls, especially because we know that the
problems with social media are so much stronger in young girls,
Like it's making their skin look smoother, it's making their
hair look better, like all this stuff. I don't think
that that's good for people in general, to give them

(23:39):
this false sense of, like, how good they could look.
It's weird. So, like, obviously I'm not
going to allow that ever. And it's hard,
because you have kids that have friends that have phones
that have them, and, like, it's very hard
to be a parent right now and keep that
out of their life. But I guess, and Maria's right

(24:01):
with Roblox. So, obviously I'm an old person.
I'm just trying to figure out how Roblox works. But
to me, it seems like people can go on there
and create their own games, and the kids in
Roblox could play in those games. And I'm telling you,
when Violet was here a couple of weeks ago, she

(24:22):
was showing me one that I was kind of like,
it wasn't sexual, but it was dark. It was dark,
like scary dark, like murderous dark. Like the kids were
able to walk into all of these weird rooms and
they were like frightened and terrified, like it was weird.
And I'm kind of like, this isn't something that

(24:42):
like little children would like.

Speaker 2 (24:46):
Yeah, but they don't care. And I think, as a
parent, you obviously can't monitor your child's activity twenty-four
seven, but you should be checking in once in
a while, instead of just sticking the iPad in front
of your kid and allowing them to have social media
and everything. There are very minimal steps you can take.

Speaker 1 (25:02):
You know what's interesting, actually. Lucia is playing... so
one of the games that's really popular with the girls
right now is called Dress to Impress. Roblox
is like a thing that has multiple games
in it, so this one's called Dress to Impress. And
I thought it was cool because you get to take
your character and like walk around this room and you
get to pick like what dress they're wearing and shoes

(25:24):
and change their hair color and all. And then there's
like a runway show and you get to like show
your outfit, and then the kids get to vote
on which one they like the best, and there's a
winner and stuff like that, and I was like, Okay,
that's cool. So she did something with her avatar and
was like in that game, and she said, what do
you think about this outfit? And she showed it to
me and I was like, oh my god.

Speaker 2 (25:46):
It looked, to me, like Sabrina
Carpenter sexy, with like a garter, like laying on a couch, yeah,
like, her leg garter.

Speaker 1 (25:56):
And I was kind of like, I don't like that, dude.
And she was like, why? Like, it looks like Sabrina
Carpenter. And I'm like, yeah, but no, that position is weird.
Like, you don't need to lay like that as
a character right now; you're like a little kid, because
that character is supposed to be like you dressing up.
And I just was like, this is weird. And I mean, like,

(26:18):
luckily, they show me stuff when they think things
are unusual. But I'm sure they come across stuff that
they don't show me, because they're scared that I'll be like,
get this f off of your iPad right now. Whatever.

Speaker 2 (26:29):
But I'm sure Lillian would know it's inappropriate. Yeah. And
I'm glad they're watching out for each other. It's just,
for me as a parent, like, the amount of
time they want to play on this game with their friends,
I just couldn't really sit there with them the whole
time and watch them. No, and I'm not saying you
should do that, but I'm saying, like, we know people

(26:50):
that have given a ten year old their own cell
phone with their own TikTok accounts and their own Snapchat accounts,
and the kids are totally unmonitored. Yeah, and that is
not advisable. Yeah, exactly. But I mean, Epstein.

Speaker 1 (27:03):
I'm saying Epstein because I see that in my notes here.
Let me tell you some of the usernames,
which I think is why I always think
that they're actually against you in a way. Because Lucia,
so Lucia, her new
word of the day is shart. She thinks it's like
the funniest word, right, even though it's not really new.

(27:26):
So she got in a fight,
because Lillian tried to make some username that had the
word shart in it, and Lucia was like, no, that's
my name. And I was like, oh, you own the
word shart? I was just, like, messing with her, right.
So I tried to create a username that
said I owned the word shart, and it wouldn't
let me. It said it was inappropriate. Right. Now,

(27:48):
listen to some of these usernames that people are using
with different spellings. I Groom underscore minors. This other one
is rape, wait, rape but spelled with a V
in the middle. Yeah, rape with a B spelled in
the middle. Rape tiny kids. Like, those are usernames on

(28:08):
Roblox right now.

Speaker 2 (28:10):
And another one was @EarlBrianBradley, which is
a reference to a specific pedophile who raped and molested
hundreds of children.

Speaker 1 (28:17):
Yeah, and that to me is just kind
of like, okay, so it does pick up... it recognized
that the word shart was completely inappropriate, like God forbid
a kid hears about a shit fart. But you're
letting this on there, you know. So, no, I mean,
people clearly find a way around it.

Speaker 2 (28:38):
They have the resources to make the technology to make
it better, and they're choosing not to.
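The shart-versus-predator-username contrast they're describing is, at bottom, a blocklist-normalization problem: a literal substring filter catches the exact word but misses lookalike spellings unless characters are folded first. A minimal sketch, with an illustrative character map and word list rather than anyone's real moderation rules:

```python
# Fold common lookalike substitutions before matching the blocklist.
LEET_MAP = str.maketrans("@431!05$", "aaeiioss")
BLOCKED = ("shart", "rape", "groom")

def naive_check(username: str) -> bool:
    """Literal substring match only -- obfuscated spellings slip through."""
    name = username.lower()
    return not any(bad in name for bad in BLOCKED)

def normalized_check(username: str) -> bool:
    """Fold lookalike characters first, then run the same match."""
    name = username.lower().translate(LEET_MAP)
    return not any(bad in name for bad in BLOCKED)

print(naive_check("xX_shart_Xx"))          # False: the exact word is caught
print(naive_check("I_Gr00m_minors"))       # True: zeros defeat the naive filter
print(normalized_check("I_Gr00m_minors"))  # False: normalization catches it
```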

Speaker 1 (28:42):
Yeah, exactly, Okay.

Speaker 2 (28:44):
China has developed the world's first pregnancy robot that's able
to carry a baby to term and deliver it.

Speaker 1 (28:52):
All right, So my opinion with this is that this
is going to happen very soon because clearly they're able
to do it, and they're going to push it out soon,
and, you know, you're
going to have all these negative things associated with it,
and then it's going to take years for them to
come up with some kind of legislation to regulate it.

(29:13):
I mean, God knows how bad this
can get. I just think it's terrible, honestly, I really do.

Speaker 2 (29:20):
Yeah, I mean, it's been met with a lot of
ethical concerns of course. Well, and then you have to
worry about if it's going to produce a healthy baby,
if a baby's going to have any issues.

Speaker 1 (29:30):
They were not specific.

Speaker 2 (29:31):
About exactly how the baby gets to be inside of
the robot. I mean, we assume it's going to be
a similar process to IVF, but, like, how do
they start the baby? I know they're developing artificial
wombs and stuff, so.

Speaker 1 (29:45):
I could see them being able to mechanically grow a
child inside of a uterus. I'm just concerned about
like the whole immunity aspect. I don't even know how
they would possibly mimic that. But I'm also concerned that
there's never ever been a history of a person, a
human being born outside of some mother, even if it's

(30:08):
the shittiest mother ever, there's a human being that, when
that child's brain is developing, is bonding to another human individual.
Now you're going to take that out of the equation,
and this child that's born isn't going to start bonding
until their brain is formed, at forty-plus weeks' gestation.

(30:29):
That's their first interaction with a human and the bond.

Speaker 2 (30:34):
Have they done a human trial yet, or are they
just saying they know it has the potential to do this?

Speaker 1 (30:40):
They've done it with an animal. I
think they used, like, a sheep or a lamb or something,
putting it in some kind of a bag or artificial
womb or something like that. But the thing
is that once you try to do it with a human child,
then you've done it, and that person's life,
that person's life matters. And first, you're

(31:01):
bringing a child into the world that didn't even have
a say over that, and then, well.

Speaker 2 (31:05):
No child has said to being brought in the world,
So if you want to really get into that argument,
My problem with this is they're saying it's a tool
to help tackle infertility because they've seen reports that suggest
that infertility in China rose from eleven point nine percent
in two thousand and seven to eighteen percent and twenty twenty.
I think today we can all agree that infertility rates

(31:28):
are through the roof. And my problem with this is
they're spending all this time and resources and money developing
this robot to artificially grow babies outside of a human body.
But why are we not looking into the reason that
infertility is so high? I'm sure it's due to environmental changes,
what we're eating in our food. Why are we not

(31:49):
looking at the source instead of trying to fix the problem? You.

Speaker 1 (31:53):
Know, they're just trying to bypass the problem.

Speaker 2 (31:56):
I was thinking about this with the Amish story as well,
like, citing that all these things we have, like
allergy medicine and inhalers, all these things just to manage
the problem, but they're not curing the problem. So why
don't we figure out what's causing the mass infertility problem
we're having in the world right now? And
instead of trying to make, like, a robot that

(32:17):
could potentially cause all these other problems, why don't we
try to take those same resources and fix it?

Speaker 1 (32:24):
I just can't imagine this. Like, I like
the idea of this to an extent, because there are
so many moms that, like, have a miscarriage at twenty
weeks and the baby just can't live, it's too early. It
would be nice if there was a way to artificially
kind of incubate it. I mean, they do have

(32:45):
it for a little bit further along, but even earlier on.
But I just think it's messing with things
a little bit too much at that point. And, like,
we could tie every example of this back to Jurassic
Park, and, like, really, scientists have a

(33:06):
lot of responsibility, and you can even consider a
computer scientist to be a scientist, right? You have a
responsibility to be like, you don't have to do every
single thing that you think is possible, because more is
not always better. You don't have to do it. Just
because you can do this doesn't mean you have to

(33:26):
do it.

Speaker 2 (33:27):
I think this is a horrible idea. I don't see
how this is going to be a positive thing in.

Speaker 1 (33:33):
the world. And, like, I just think it's
because they're gonna do it eventually. And I don't think
it's fair for that kid that's being born into that
situation, that doesn't, like, have an emotional bond
to a person. And I don't
know about the brain development and all that stuff and
how important that is, but I'm sure it's very important,

(33:55):
very. And to think about having a child born into,
like, a clinic like that. Like, what if they have
problems growing older and growing attachments to people, and they
never bond? And, like, that's how people end up
being like Bryan Kohberger and stuff, that just don't
have any feelings. I don't... so you can't use... yeah,

(34:16):
I understand, I understand that, but, like, we don't need
to bring more people into the world that are, like,
disconnected from that. I don't know. There have
to be all of these studies of the impact of
the maternal bond with a child, even if that mom
ends up giving up the baby. Like, there has to
be some part of that. I just

(34:37):
feel like, I know a while ago we had reported
on them doing, like, you know, manipulating embryos to
have blue eyes, right, if you wanted a baby with
blue eyes. Like, I'm one hundred percent against that. Like,
you're just messing with too much stuff. Yeah, that's how
I feel too.

Speaker 2 (34:53):
Why don't we just try to solve infertility instead of
trying to make this robot that's gonna cause a bunch
of problems and maybe a bunch of serial killer people.

Speaker 1 (35:01):
So I don't know.

Speaker 2 (35:03):
All right, guys, we are going to be at CrimeCon
in two weeks. We're so excited. Please head over
to Apple or Spotify and leave us a review, subscribe to
our YouTube channel, and if you have a story for us,
please submit it to stories at motherknowsdeath dot com.

Speaker 1 (35:15):
See you guys, have a good weekend, and don't forget
to tune into our YouTube live on Saturday at
twelve noon. Thank you for listening to Mother Knows Death.
As a reminder, my training is as a pathologist's assistant.
I have a master's level education and specialize in anatomy

(35:36):
and pathology education. I am not a doctor and I
have not diagnosed or treated anyone dead or alive without
the assistance of a licensed medical doctor. This show, my website,
and social media accounts are designed to educate and inform
people based on my experience working in pathology, so they

(35:57):
can make healthier decisions regarding their life and well-being.
Always remember that science is changing every day and the
opinions expressed in this episode are based on my knowledge
of those subjects at the time of publication. If you
are having a medical problem, have a medical question, or
having a medical emergency, please contact your physician or visit

(36:20):
an urgent care center, emergency room or hospital. Please rate, review,
and subscribe to Mother Knows Death on Apple, Spotify, YouTube,
or anywhere you get podcasts. Thanks
