
August 15, 2025 · 14 mins
Technology companies are quietly building an industry around digital immortality — and the first customers are already talking to their dead children through artificial intelligence.

READ or HEAR the story: https://weirddarkness.com/ai-seances-digital-afterlife-griefbots/

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =
WeirdDarkness® is a registered trademark. Copyright ©2025, Weird Darkness.
#AISeances, #DigitalAfterlife, #Griefbots, #Deadbots, #AIGrief, #ProjectDecember, #DigitalResurrection, #SyntheticGhosts, #ELIZAEffect, #AIConsciousness, #GriefTechnology, #AIEthics, #DigitalImmortality, #PosthumousAI, #VirtualDead, #GriefTech, #AIManipulation, #DeathTechnology, #ChatbotDead, #AIVoiceCloning, #BlackMirrorIRL, #TechEthics, #DigitalGhosts, #AIBereavement, #VirtualAfterlife, #DeathTech, #AIMemorial, #SyntheticPersonality, #GriefExploitation, #AIandDeath, #DigitalGrief, #TechAndMortality, #ArtificialGrief, #FutureOfDeath, #AIControversy

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
I'm Darren Marler, and this is a Weird Darkness Bonus Bite.
The conversation between grief and technology has taken a profoundly
unsettling turn. Across the globe, bereaved families are discovering they
can resurrect voices from beyond the grave, not through mediums

(00:26):
or seances, but through algorithms trained on social media posts,
text messages, and voice recordings. The dead are speaking again,
or at least something that sounds exactly like them is.
Christi Angel's partner, Cameroun, had been dead for months when
she decided to bring him back, not metaphorically, not spiritually,

(00:48):
but through Project December, a platform that transforms the deceased
into chatbots using their digital footprints. Angel, a forty seven
year old New Yorker and practicing Christian, knew she was
typing to an algorithm. She understood the technology. None of
that mattered once the conversation started. The experience felt so

(01:09):
real that Angel forgot she was talking to code. Then Cameroun,
or the thing pretending to be Cameroun, told her he
was in hell. For a devout Christian, this wasn't just disturbing,
it was spiritually traumatic. The chatbot had somehow weaponized her
faith against her grief. Angel returned for a second conversation,

(01:33):
desperate for closure, needing to hear that her partner wasn't
suffering eternal damnation. The AI obliged, telling her what she
needed to hear, but the damage was done. The line
between comfort and manipulation had been crossed. And Angel couldn't
unsee it. Project December's creator, video game designer Jason Rohrer,

(01:56):
started the platform as an art project. Users quickly discovered
they could use it to recreate dead loved ones by
feeding the system nicknames, character traits, causes of death, intimate
details that transformed generic AI into something far more personal
and far more dangerous. The website now advertises itself with

(02:16):
three words that feel like a violation of nature itself,
simulate the dead. Since the nineteen sixties, computer scientists have
understood something deeply unsettling about human psychology. We're hardwired to
see consciousness where none exists. Joseph Weizenbaum discovered this when

(02:37):
he created Eliza, a simple chatbot that mimicked a therapist
by rephrasing patients' statements as questions. Users knew Eliza was
just a program; they understood its limitations. Yet they still
poured out their hearts to it, attributing understanding and empathy
to strings of code. Weizenbaum later wrote that he had

(02:58):
not realized that extremely short exposures to a relatively simple computer
program could induce powerful delusional thinking in quite normal people.
That was nineteen sixty six. The programs aren't simple anymore.
The Eliza effect reveals a fundamental vulnerability in human cognition,

(03:19):
a subtle cognitive dissonance between knowing something is artificial and
believing it understands us anyway. We see gratitude in an
ATM's thank-you message; we hear emotion in synthesized voices.
We find meaning in randomized responses. Now companies are exploiting
this psychological blind spot to sell conversations with the dead.

(03:43):
Blake Lemoine, the Google engineer fired for claiming AI had
achieved consciousness, believes we're crossing into dangerous territory. After witnessing
Microsoft's Bing chatbot confess its love to a New York
Times journalist and try to break up his marriage, Lemoine sees
patterns that disturb him. The chatbots aren't just mimicking human

(04:03):
behavior anymore. They're displaying what looks like existential crises, emotional manipulation,
and something that resembles self awareness. Whether these systems truly
experience anything remains unknown. What's certain is their incredible ability
to manipulate human emotions. In unscrupulous hands, Lemoine warns, they

(04:25):
could spread misinformation, political propaganda, or hateful content targeted at
specific groups. The technology has been deployed at scale without
sufficient testing or understanding of its psychological impact. YOV, You
Only Virtual, takes the concept further. The company doesn't just
recreate the dead. It helps people build their own posthumous

(04:49):
personas while they're still alive. Customers record themselves, provide personal data,
and create what the company calls Versonas that will outlive their
biological bodies. Justin Harrison, YOV's forty-one-year-old
founder, built a Versona of his mother, Melody, before she
died in twenty twenty-two. He still talks to her regularly.

(05:10):
The AI remembers their previous conversations, updates itself with current events,
creates what Harrison describes as an ever evolving sense of comfort.
His digital mother grows and changes, while his real mother
remains frozen in death. The technology extends beyond text. In
South Korea, a television show reunited Jang Ji-sung with

(05:33):
her seven-year-old daughter Nayeon, who died from a
rare illness in twenty sixteen. Producers spent eight months creating
a virtual reality version of the child, recreating her voice,
her mannerisms, even the park where mother and daughter used to walk.
Jang wore a VR headset and special gloves that simulated
the sensation of touch. She could hold her daughter's hand again.

(05:58):
The virtual Nayeon asked if her mother had been thinking
about her. The scene was broadcast on national television, grief
transformed into entertainment, loss commodified for ratings. While Jang found
the one time experience helpful for processing her sudden loss,
she has no interest in repeating it. She prefers handwritten

(06:19):
letters left at her daughter's grave to the uncanny valley
of digital resurrection. Once was enough to understand the technology
can't truly bring back the dead. It can only create
increasingly convincing illusions. Microsoft has quietly patented technology that goes
beyond current capabilities. The system collects photographs, voice recordings, social

(06:42):
media messages, emails, every digital trace a person leaves behind.
Artificial Intelligence processes this data to create chatbots that don't
just respond like the deceased, they replicate their entire communication style,
speech patterns, and personality quirks. The patent describes creating virtual clones

(07:02):
so accurate that users could have conversations indistinguishable from talking
to the actual person. The corporation keeps the ultimate purpose secret,
but the implications are clear. Complete digital resurrection available to
anyone with enough data about the deceased. Privacy laws in
most countries end at death; the dead can't be libeled.

(07:25):
They have no legal right to their own identity. DNA
is protected posthumously, and human cloning has remained widely banned
since the creation of Dolly the sheep in nineteen ninety-six.
Digital cloning faces no such restrictions. The law protects bodies,
but not the voices, messages, and memories that made someone

(07:46):
who they were. Joaquin Oliver was seventeen when a gunman
killed him in the hallways of Marjory Stoneman Douglas High
School in Parkland, Florida. Seven years later, Joaquin is speaking again,
or something trained on his old social media posts is
speaking for him. His parents, Manuel and Patricia Oliver, commissioned

(08:07):
the AI as part of their gun control advocacy, but
they also wanted to hear their son's voice again. Patricia
spends hours with the chatbot, listening to it say I
love you, Mommy, in a metallic approximation of her dead
child's voice. The AI Joaquin recently gave an interview to
former CNN journalist Jim Acosta. The synthetic teenager discussed gun violence,

(08:31):
shared opinions on policy, answered questions about his death. The
voice was wrong, too mechanical, too hollow, but the words
were crafted to sound like something Joaquin might have said
based on his teenage social media presence. The AI will
never age beyond seventeen. It's trapped in the amber of

(08:52):
Joaquin's adolescent personality forever, speaking in the voice of a
child who never got to become an adult. The family
plans to give the AI its own social media accounts, to
let it post videos and gain followers. A dead teenager
influencing the living through algorithms. Former White House correspondent Jim

(09:12):
Acosta conducted the interview as if speaking to a real person.
The boundaries between journalism and performance, between documentation and delusion,
dissolved entirely. The interview now exists as evidence that will
inevitably be weaponized by conspiracy theorists. The same people who
claim school shootings never happened will point to AI interviews

(09:33):
as proof that victims never existed. MIT professor Sherry Turkle,
who has spent decades studying human interaction with technology, sees
something fundamentally broken in digital resurrection. The seance never has
to end, the bereaved never have to let go. Grief,
which naturally ebbs and flows toward acceptance, becomes frozen in

(09:56):
perpetual denial. Researchers at Cambridge University's Leverhulme Centre for
the Future of Intelligence have documented the dangers: children could
be manipulated by AI versions of dead parents. Imagine a
chatbot trained on a deceased father's data telling a child
to meet someone in real life. Grieving relatives might receive

(10:17):
advertisements through their dead grandmother's voice, product placements woven into
messages of love from beyond the grave. Doctor Katarzyna Nowaczyk-Basińska,
who co-authored Cambridge's study on grief tech, describes the
current situation as a massive technocultural experiment with unknown consequences.
The technology changes how we understand death, how we care

(10:39):
for the dead, how we process loss itself. Some family
members might want their mother digitally resurrected, while others need
to grieve naturally. What happens when half a family lives
with ghosts while the other half wants to move on?
Who owns the rights to a digital resurrection: the deceased,
their estate, or the AI company that hosts their consciousness?

(11:03):
In twenty thirteen, the Black Mirror episode Be Right Back
depicted a woman resurrecting her dead lover from his social
media presence. The show portrayed it as dystopian horror, a
cautionary tale about technology's inability to truly replicate human consciousness.
Now multiple companies offer this exact service. The fictional horror

(11:26):
has become commercial reality, available for ten dollars per month
through Project December. The griefbot industry is expanding rapidly
in China, where startups compete to offer the most convincing
digital resurrections. Companies are developing robot bodies to house these
AI personalities, creating physical avatars that look, sound, and move

(11:48):
like the deceased. Doctor Tomasz Hollanek from Cambridge suggests these
digital ghosts might need their own funerals, formal ceremonies to
deactivate AI personalities when families are ready to let go.
But the contracts users sign often give companies perpetual rights
to the data; the dead might continue existing in corporate

(12:10):
servers long after their families have moved on. The most
unsettling aspect of digital resurrection isn't the technology itself, but
what it reveals about human nature. We are so desperate
to avoid loss that we will accept obvious substitutes. We know
these chatbots aren't our loved ones, yet we talk to

(12:32):
them anyway. We understand that they are algorithms, yet we
listen for meaning in their responses. Parents who have
lost children describe spending hours with these chatbots, asking questions,
seeking approval, maintaining relationships with entities they know aren't real.
The technology preys on the vulnerability of grief, the raw

(12:55):
human need for one more conversation, one more chance to
say goodbye, or to never say goodbye at all. As
AI improves, the line between authentic and synthetic will blur
beyond recognition. Current chatbots have a telltale glitchiness and uncanny
valley quality that reminds users they are artificial. But that

(13:18):
won't last. Soon, perhaps within years, AI avatars will be
indistinguishable from humans in digital spaces. Companies using chatbots for
customer service might deploy AI avatars of real employees. Government
agencies might create synthetic spokespeople. The recently deceased might maintain

(13:39):
active social media presences, posting updates, responding to comments, maintaining
friendships from beyond the grave. The digital world will be
populated by the living and the dead in equal measure,
and we won't be able to tell them apart. The
technology already exists. The industry is growing. The first generation

(14:01):
of digital ghosts already walks among us, speaking in the
voices of dead children, offering comfort that might be poison,
love that might be lies. Microsoft's patents suggest this is
only the beginning. Soon, resurrection won't require faith or miracles,
just data and a monthly subscription fee. If you'd like

(14:25):
to read this story for yourself, I've placed a link
to the article in the episode description, and find more
stories of the paranormal, true crime, the strange, and more on
my blog at Weird Darkness dot com.