Episode Transcript

September 8, 2025 24 mins

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
All right, Paulina is off doing I don't know Paulina things.

(00:02):
I'm not sure if she has a power seat on again,
so I have no idea. It's gotta be something. But anyway,
hit the intro. We'll get to Paulina when we get
to Paulina. It's The Tangent, giving you all the shit
we couldn't talk about on the air. All right, now,
she'll come back when she wants to and then we'll
do the thing. So how about this? I was
gonna talk about this on the air, but we'll talk
about it here instead. Apparently we're going to be able

(00:26):
to speak to our dead relatives via AI within five years,
so we would not have to go to a cemetery,
I guess, or we wouldn't even, I guess, visit somehow.
This is going to replace visiting your deceased loved one
in the ground somewhere, because you would be able to
speak to them via AI, so it's like they never died.

(00:46):
A University of Cambridge professor predicted that in a few years' time,
we'll stop going to cemeteries and simply speak to our
dead relatives on AI-generated apps, although she's worried about what
this might mean. She thinks by twenty thirty,
we will have dead loved ones in our pockets and

(01:07):
we'll be able to talk to them twenty-four
seven thanks to AI. She forecasts that these technologies will
allow us to talk with AI-generated versions of deceased
loved ones and it will be very popular, although this
person warns that we don't yet know what the consequences
of these developments might be.

Speaker 2 (01:23):
Now here's the thing.

Speaker 1 (01:25):
We're not really talking to them, right? They're dead, and
so we're talking to a simulation of them based on
what, I don't know. And so, I mean, would this
give you any comfort, to pretend like you were talking
to someone that had passed away? Like, if it
looked like them, if AI can make it
look like it's a FaceTime with this person that died,
I mean, would you be able to convince yourself

(01:46):
that somehow it's embodying the person that's gone.

Speaker 3 (01:50):
Are they going to respond?

Speaker 1 (01:51):
Yeah, No, that's what I mean, that's what talking to
them would be. Yeah.

Speaker 2 (01:55):
Oh, I get it, I get it.

Speaker 3 (01:56):
I'm like, I just talk to them now.

Speaker 1 (01:59):
No, I mean, the AI portion of it is
them talking back to you. I think
that's what that means.

Speaker 3 (02:04):
You know, I'm not doing it.

Speaker 4 (02:05):
I feel like it makes it hurt more. Yeah, like
my heart hurts just thinking about it, because it's
not going to get you the same emotion that talking
to that actual person is going to convey, you know,
like the responses aren't going to be the same.

Speaker 1 (02:18):
That's the thing. Like, I know what they would say.
Like, what if the AI version of your
grandpa is, like, racist or something, you know? And, like,
I don't know. Well, think about this though:
it's talking on behalf of your loved one without
any knowledge of your loved one. So, like, what
if AI grandpa is someone you don't

(02:39):
even like that much after you get to talking to them,
Like how would they know?

Speaker 4 (02:42):
I'm assuming you have to feed it information, right? Because that's the
whole thing about AI: it only works with as much as
you tell it. So I'm assuming you have to load
up a profile of videos of the person and, yeah, their voice
and whatnot, and then it spits it back at you.
But it's still not.

Speaker 5 (02:56):
Going to be... No, it's generic. So I think that
people are doing this already with their businesses. I
have a theory, because I was emailing somebody on a Sunday
yesterday. The response I got was extremely quick
and it was very curated to what I was saying.

Speaker 2 (03:09):
And this person, what was her name?

Speaker 5 (03:12):
Was Phoenix, but I'm like, I think Phoenix is AI
and I could read them to you.

Speaker 2 (03:15):
It was very specific to what I was saying. Phoenix, Phoenix was AI.

Speaker 3 (03:19):
I know Phoenix, but.

Speaker 5 (03:19):
I'm like, this ain't my Phoenix.

Speaker 2 (03:20):
Like I just like somebody else's on a Sunday.

Speaker 5 (03:24):
You're gonna tell me you have your assistant or whatever
responding to you on a Sunday, and literally within thirty
seconds, and you're curating what I'm saying to you,
like you're kind of responding back like, "love the excitement."

Speaker 6 (03:35):
It's like, yeah, Phoenix, yeah, it off.

Speaker 2 (03:38):
No, Phoenix ain't real. No, Phoenix is not real.

Speaker 5 (03:40):
So I'm saying, I think this is probably the same thing.
If I'm like, "oh, I miss you, Grandma," it's probably
gonna be very simple, like, "I miss you too."

Speaker 1 (03:45):
You know. Yeah, what if you find out that, like,
you know, your dead uncle's, like, a perv
in AI Land? Or what if something's uncovered, because it's
using the internet, that you find out about this person
that you never knew, because you never knew to look
for it?

Speaker 2 (04:00):
I found out, my granddaughter, I had a secret child,
you know, that I put...

Speaker 1 (04:07):
I was never supposed to live within two hundred feet
of the school. That's what I mean. Like, it's not
the real person. This is like going to a fake
Michael Jackson hologram concert.

Speaker 2 (04:17):
Like it's not they're not there. They're not.

Speaker 6 (04:19):
If it brings anyone comfort, I have no judgment, but
I am aligned with you guys like that is not
for me, and it would make it hurt way more.
And also I'd be like, you're not my grandma. You're
not drinking whiskey, Like what do you know? You're not
drunk off whiskey.

Speaker 2 (04:32):
It's not real.

Speaker 5 (04:33):
I would feel funny. And I think it would probably
put the psychic mediums out of business, because then, like,
who would need Susan?

Speaker 2 (04:39):
You know and I need her? We neither need.

Speaker 5 (04:44):
Yes, I think that's I mean.

Speaker 6 (04:53):
I don't know. Like, I've had psychics tell me things
that they could have only known if they had
a gift. So I really doubt AI is going to
come back with, like, something from the internet, but like...

Speaker 1 (05:01):
Which talks about how AI therapy is dangerous. But
that is, at least, that's based on, like, diagnostic
manuals, and, you know, it's using the internet
to find therapeutic techniques that are scientifically proven, in theory, right?
So, like, it's doing what a therapist is supposed to do,
I guess, technically. But this is like speaking on behalf

(05:26):
of someone who's gone, who you had a relationship with.
And as these models grow, my understanding is they
grow kind of a brain of their own, which I
just think the whole thing's dangerous, because it's like they're
gonna morph into what? They don't know. It's a fucking computer.
They're gonna morph into whatever they morph into, and you
may not like that, and it could be weird, you know.
Or, again, it could uncover something, or

(05:49):
lean into some trait or mannerism it found somehow or
was taught, that... I don't know, it's not real, so
I guess I don't...

Speaker 2 (05:56):
I don't know what the value is.

Speaker 1 (05:57):
I mean, I could see, like, if you're asking ChatGPT
for advice about something, I can see why that's
dangerous too, especially if you're following the advice, because again,
it's not tailored to you really, like, not in an
interpersonal way. But at least it's going to be based
on, you know, textbooks and stuff, you know what
I mean, because it knows all that shit. Like, if

(06:18):
I'm asking it if I'm sick with something, it doesn't
mean it's right or that I should trust it, but
it's likely going on medical websites and using the
things I tell it to find factual information. But we're
talking about someone's character here that's trying to be recreated,
and I don't think I like that.

Speaker 3 (06:34):
No, where's the... Is there some type of regulation on
this ChatGPT stuff?

Speaker 2 (06:40):
I think it's kind of a lawless land. I think
it probably needs to be.

Speaker 3 (06:43):
It needs to be regulated, because it's getting out of hand.
It's getting out of control, the way the jobs
are just disappearing. Like, I don't want to sound like
an old lady, but you know, it's scary, and
then what are we all going to do when we
replace ourselves?

Speaker 2 (06:54):
That's what I just want. I just want to ploy them.

Speaker 3 (06:56):
Down for them to take it, you know what I'm saying?
Like, you can have a chat, but what am
I doing after? That's what I think we need to
plan for, because it's happening. Yeah, it's happening quick, and
they're in every industry. It's not like it's just one
industry being affected, like, it's here and it's not stopping.

Speaker 2 (07:12):
Well I read this morning.

Speaker 1 (07:13):
I think it's Salesforce, some major CEO, talking
about how AI is going to eliminate starter jobs across
the board, because, you know, a starting job could be
easily replicated by a computer now, by AI. But then,
if you can't get a starter job, then how are
you supposed to get the second kind of job? Right?
You know, like, how are you supposed to get the elevated position?

(07:34):
We're having this problem in radio right now, and
in broadcast all around, but it's really bad in radio,
where we don't have a farm system anymore, like the
little markets where you might have gone to, you
know, to, like, get your chops and learn and
make mistakes and get better, and then you move on
to the next. We've been using other bigger-market
talent for years, but now it's even worse.

(07:57):
So, like, people ask me all the time, well, if
I want to be you, like, don't be me,
first of all, but if you want to sit in
this chair and host a morning show, you're like, how
do I get there? And I don't know, I don't
know what to tell you. I guess
try and get a job on a morning
show that's established, which is going to be in a
big market, which is going to be really competitive, and
you don't really have any experience, which is not your fault,

(08:18):
but you don't have a whole lot to add except
for your character and your personality, which is maybe good enough.
In the case of a lot of people in this room,
it was good enough, and then everyone learned the rest.
But like, I do not know how to tell people
to get into this business, because I don't know where
to tell them to go to learn, and it's
already happening here. What happens when every other industry

(08:38):
decides there are no starter positions because they're too rudimentary,
and then you don't have any place to go learn?
And then what? You'd better be the one making
AI or working for AI.

Speaker 3 (08:50):
Is that... okay, maybe that's the plan. I just need
a plan.

Speaker 2 (08:54):
Are you sure we don't? We might already work for AI.

Speaker 3 (08:58):
We're taking ourselves out. Yes. It's like, why? Hello? Yeah,
I don't see what's happening.

Speaker 2 (09:02):
To stop.

Speaker 3 (09:02):
I want them to stop.

Speaker 2 (09:03):
I'll knock it off.

Speaker 6 (09:04):
Yeah, you know how I feel about it.

Speaker 1 (09:06):
I just don't want them simulating real people who lived
real lives and had actual personalities, because, again, like,
what does that morph into as it learns and evolves?
Because again, like, I keep using my grandfather, but he's
not alive, right? And he hasn't been for years.
So anything that AI picks up that has to do
with today's terms or technology, he never thought that. That's

(09:28):
not an original thought, right? So, like, he may have
opinions on things right now, AI opinions on things
that he wasn't alive to have opinions on, and maybe
I don't want to know what those would be.

Speaker 2 (09:40):
And it's also not him. How is it legal?

Speaker 3 (09:42):
Like, how is that legal? Because can you imagine if
your loved one was, let's just say, murdered, and then
you talk to them on ChatGPT or whatever, and
the person says, you know, "actually, Sheila is the
one that took me out." Now I go seek revenge
on Sheila because ChatGPT told me to.

Speaker 2 (09:57):
Well, maybe we shouldn't seek revenge on Sheila. You know, maybe.

Speaker 3 (10:02):
If your loved one was killed and they can tell you.
Now AI has told them to tell you who actually
did it, and now we're killing each other.

Speaker 6 (10:11):
You took it, Yeah, you took it somewhere.

Speaker 1 (10:17):
But this is what I'm saying. We're saying the same thing.
It's like, and I think that's what this person is saying:
what are the consequences?

Speaker 2 (10:24):
Right? Like, what? Yeah. I mean, this... Yeah, now
she's dead, and maybe she did it, maybe she
didn't do it. I will never know.

Speaker 4 (10:35):
We're taking justice into our own hands.

Speaker 2 (10:37):
Then Sheila is dead.

Speaker 6 (10:38):
She comes back via AI and she's like, I don't
know...

Speaker 3 (10:43):
It's really Timmy, because his people came and killed me.
And then we're killing Timmy.

Speaker 2 (10:48):
And his family.

Speaker 1 (10:49):
Maybe if there were a way, and I don't even know
if I'd want this, but, like, think about this: maybe
if there were a way that, to Jason's point, like, before
she was dead, they could download everything. So again, this
is dangerous, but, like, okay, for example, where I'm going
with this is... okay, my other grandfather, right?

(11:11):
I was young when he passed away, and he was
a Navy war hero, fighter ace pilot, all this crazy shit,
and I was too young and he didn't really talk
about it with me. And I was too young and
didn't have enough perspective to ask him questions about that experience.
Now that I'm a pilot and I'm older, and I
understand what war is, and I understand what he did,
and I understand, you know, how significant it was, now

(11:32):
I have a thousand questions for him, but he's not
here to answer them, right? And I feel the same
way about my grandfather, all these, a lot of relatives
who are gone. Like, if they die when you're young,
or even if they die at any point, and then
you emerge, like, you evolve in your life and you think, man,
I really wish I could ask so-and-so about
this part of their upbringing, or this part of their childhood,
or this part of their medical history, or how did

(11:54):
you know this, or whatever. If there was a way
that AI could have all that stuff, and it was
really based on who that person was, that could be
valuable, because then you could revisit, you know... From the
standpoint of... because think about how many, like, family memories
and things just die with people. But on the flip side,
think of how many family memories and things that you

(12:15):
never knew about die with them, and that's okay, because
you never needed to know that. So that would be
the other thing: if you could decompress or
download my brain and put it on a computer, and
then I die, my nieces could later ask me questions
about whatever they want to know, my sister growing up
or, you know, whatever. But how do you then eliminate the
shit that's in my brain that I don't want anyone

(12:36):
to know, right? The shit that I hide from everyone
that's in there, you know? So how would you? So
there could be some value, but then it could also
be super dangerous, because there's stuff that, you know, I
don't... look, I don't have that many secrets, but there's
stuff that...

Speaker 2 (12:49):
It's giving... Well, but we all have, trust me, we're...

Speaker 1 (12:51):
All full of shit. We all have secrets that we
don't want people to know. We all have them, and
I don't care, you can't tell me that you don't,
because you do. There is something in your brain, everybody
has it, that they would prefer no one ever
find out about, and maybe it's a lie you told,
or an event that took place, or you cheated
or something. But imagine if that information were part of

(13:12):
this sort of catalog of things that would be valuable
to know, you know? Because then it's like, I'm dead
and, I don't know, my wife could be like,
"did he ever cheat on me?" It's like, "yeah, I
did, fifty times," or whatever, and it's like, but you
never knew, so it was, you know, whatever. You know,
that's not the memory you have to have of me.
You don't have to suffer, because the information's gone. But

(13:32):
it might be valuable to know, to tell your kids
about experiences that you had growing up, or, you know...
Look, like, I have grandparents, and I don't really know their
full medical history. Now I'm experiencing things in my life
and I wonder, I wonder if they had that too,
and maybe never knew, and it's hereditary or, you know, whatever.
So I could see value if it's based on real information,

(13:54):
But that could also be very dangerous, because it could
entirely... it could fundamentally change the way you feel about
someone who's no longer around to defend themselves.

Speaker 3 (14:02):
Exactly. And how do we know it's actually real? There's
no way to, for real for real, prove that.

Speaker 2 (14:08):
So, and why is Sheila dead? Why did you kill Sheila?

Speaker 3 (14:12):
This is crazy. I'm telling y'all, AI is on our
ass, and we better get it together. I know, Caitlyn,
you've been saying it, screaming it.

Speaker 2 (14:21):
Just like the movies.

Speaker 3 (14:21):
It's the one friend that knew all along, has been
yelling it, and now the friend that was last to
figure it out has figured it out. And now I'm sitting there enjoying it.

Speaker 2 (14:31):
This is crazy. It is just discover that.

Speaker 3 (14:39):
When it comes back to get us from years.

Speaker 2 (14:41):
And I want you to.

Speaker 3 (14:45):
I put this on the sheet four weeks ago and
said AI is getting dangerous and you need a plan, and
nobody looked at my topic.

Speaker 2 (14:59):
It was right next to "your dog had a
pup cup," but I don't know what the fuck

Speaker 1 (15:02):
To do with that, And you're not the only one.

Speaker 2 (15:09):
I had pizza for lunch. Well, thank you. Good morning,
it's the Fred Show.

Speaker 1 (15:15):
They had pizza for lunch, right? So no offense, no offense.
But you know, I love you and I love Lux,
but, what the... where's the fucking topic? You know, it's like, hey,
good morning, it's the Fred Show, Kiki's got something to tell us.

Speaker 2 (15:33):
Tell him? Can you tell him?

Speaker 1 (15:34):
Yeah. So Lux had a pup cup for the first time. Okay, cool,
here's Alex Warren, award winning. I mean, you know what,
so, you know, sometimes I gotta weed through that
shit to find the award winners. But yes, call in,
tell me the first time your dog got a pup cup.
That sounds like our competition. Okay, if you want that

(15:56):
kind of bullshit, go listen to somebody else.

Speaker 2 (15:59):
Don't worry. Whatever we do, they'll do tomorrow. So it's fine.

Speaker 3 (16:02):
I'm terrified, y'all.

Speaker 1 (16:03):
Every day I get a text from Paulina with an image
of something our competition is doing that we did last
year at the exact same time, six years

Speaker 2 (16:09):
Ago, or six years to drive every day? Been there?

Speaker 4 (16:13):
Do they have a sleep study.

Speaker 2 (16:15):
Tomorrow morning in their hair?

Speaker 1 (16:18):
Let me tell you, your hair at three a.m.? Don't
worry, tomorrow morning.

Speaker 2 (16:24):
You're gonna hear that. You guys aren't gonna believe it.

Speaker 6 (16:26):
Grindr.

Speaker 1 (16:28):
It's almost like a Grindr experience. Wow, we're making eggs,
look at me, like, sing a song about it. Now you know,
I'm just a fucking asshole. I really am, because, you
know what, I know, it's fine. I mean, it
does sound like that.

Speaker 2 (16:47):
But there's no... like, hey, you know, bless them.
It's all good. You know, it's funny.

Speaker 1 (16:53):
Some of these people I know, and they're really
nice people, and I actually like them as humans. I just...
it's just, like, why am I getting up?

Speaker 2 (17:00):
And like just just I get up before.

Speaker 1 (17:04):
And we all try, and we all get up at
four, and we all just try and be real bugging
people, and, like, it's wacky bullshit, you know, and somehow
it's like, I don't know who the joke's on. Do
we suck or do they...

Speaker 2 (17:15):
Suck?

Speaker 1 (17:15):
I don't know which one it is. Yeah, I think maybe
that's fine. And all I know is, somewhere, in
some laboratory somewhere, they're coming up with a way to
do an AI version of this show, so they don't have to
pay us. I promise you that. I promise you. I
promise you.

Speaker 3 (17:31):
They have AI singers now.

Speaker 2 (17:34):
But here's the thing.

Speaker 1 (17:35):
If it was, if it wasn't based on a real...
Did you just learn what AI is? And apparently you just
learned what a pup cup is too? Okay, she's outraged.
She's like, Starbucks just puts pure cream in
a cup and they'll give it to your dog.

Speaker 2 (17:55):
And nobody told me about this.

Speaker 3 (17:59):
She's just figuring it out. Yes, AI, it's scary. And
pup cups.

Speaker 2 (18:09):
Now this, I will say. No, here's the thing.

Speaker 1 (18:10):
Imagine if it's not based on a real person. Now
that could be interesting, right? Now, would you... no, no,
think about this though. Would you watch, like, a reality
show that had fake characters that reacted to
each other, like, inside of a sim? Like, okay, imagine,

(18:33):
like what I'm saying, hold on, like, hold on though
for one second. Imagine we put five different people,
like Real Housewives of Salt Lake City or whatever, and
we say, here you go, guys, like, make a
show, because it's not based on anybody real. Like, we're
not pretending to be anybody, we're not impersonating anybody, we're
not stating anyone's opinions.

Speaker 2 (18:53):
These are five fresh characters.

Speaker 1 (18:55):
And you would watch that because the truth is people
watch it right now with humans. So why wouldn't you
watch it if it were fake and it were unscripted
and you had no idea what these people were gonna.

Speaker 2 (19:03):
Do. Because AI could never... oh my. You think that,
but I don't know.

Speaker 6 (19:07):
The FBI took Jen Shah to prison live on our screens.
AI couldn't ever pull that show off. One woman is married
to her grandpa. You could not make that up.

Speaker 4 (19:17):
Okay, I feel like, yeah, given enough time, there would
be enough drama after it, like, learned.

Speaker 2 (19:22):
Honestly, I think that's where we're headed. I think that's
where we're headed with the movies. There are gonna be AI actors.

Speaker 3 (19:30):
We don't even know that they're fake.

Speaker 4 (19:31):
But cartoons, it's the same thing.

Speaker 2 (19:33):
No, that's what I mean, except.

Speaker 1 (19:37):
The interesting thing here is, all the things that we're
talking about that we wouldn't like if it were based
on real people who aren't here anymore, those things could
happen with fake characters, and it could be an infinite
script. Because, you know, again, like, I don't...
I just don't want them impersonating actual people that I
care about and coming up with these quote-unquote
original thoughts that they're not here to make. I don't

(19:59):
want that, you know. I don't want people to be misquoted.
I want... but, I mean, look, fucking Caitlyn, you
played that video game with the fake people fucking
during COVID.

Speaker 2 (20:08):
First of all, you're part of the problem. I like that.

Speaker 6 (20:17):
I like to take their clothes off and make them house.

Speaker 2 (20:22):
My hot girlfriend had huge knockers. It was COVID. I
couldn't get outside.

Speaker 1 (20:28):
I'm just saying, like, you already manipulated a
fake universe. Yeah, so why wouldn't you just let the
sim sim, without any manipulation, and just see what happens?

Speaker 6 (20:39):
Because I like to go ha, like take the shower
doors away and see them naked Rosebud.

Speaker 1 (20:45):
Yes, but you know what I mean? Like, let the
simulation simulate, and then you sit back and watch it,
and, like, you know, I mean, I feel like
that's where we're headed with entertainment. I really do.

Speaker 2 (20:57):
If we're not there already. We need our thespians.

Speaker 3 (20:59):
Come on, what am I gonna do?

Speaker 4 (21:00):
Yeah?

Speaker 1 (21:01):
I'll tell you something else. They already have, and it's
only going to get more involved. They already have, like,
AI, like, sex stuff, where you, you know,
you can interact. You haven't seen the ads for this stuff?
It's like AI porn, except you're interacting with this
fake character, right? Right, right. But here's the thing: the

(21:24):
same way that, like, in today's era, the overabundance of porn,
I think, leads to people not necessarily needing physical companionship,
because of it, or, like, some of these fake, you know,
OnlyFans or whatever. People are spending money for relationships,
quote unquote, on their terms, or masturbation or whatever.
They get their fill because there's all this content out there,

(21:46):
and then they don't have to connect with real people
to get that stuff. That's where we're headed with AI.
People are already having full-on relationships with AI. What
do I need a human for when I can create
something that's fake, that appeases me in
every way, because it's fake? So it's just about me,
you know what I mean? Like, at what point are
we doing that instead of actually going out and finding

(22:07):
real companionship, because real companionship could be.

Speaker 2 (22:09):
Kind of hard, right? Right, well, to find and, you
know, maintain. Yeah. So that's where we're headed.

Speaker 1 (22:15):
We're headed to a world where people don't even date
anymore, because they can just make what they want
online, and that's only going to become more advanced. It's giving
Black Mirror. And I kind of want to know what
the website is, because I'd start building my lady, you know
what I mean. It'd be a whole lot cheaper than dating.

Speaker 6 (22:36):
Do we have time for you to tell us what
you would build?

Speaker 1 (22:39):
Oh god, no. I don't know, that's actually a
deep conversation.

Speaker 2 (22:45):
Maybe we should.

Speaker 1 (22:45):
Maybe that can be the next tangent, where we talk
about what we would build in a character, in
a person, because I don't know that it would
necessarily be what you would think. Like, it's not that simple.
Like, I would want to...

Speaker 2 (22:55):
Be challenged. Oh, you would? I don't...

Speaker 1 (23:00):
I would be bored if... trust me, I'm bored
in real life if I date someone who's just, "you're
the greatest thing ever." Like, I don't want to be mistreated,
which I usually am. You know, I get treated like a
piece of shit. But no, I really love those situations
where I'm, like, basically inhuman and don't matter. That's what
I really... I love a relationship like that. But then
the ones that are too easy, where the person's, like,

(23:20):
actually healthy and kind, I'm like, eh, I don't know,
it's kind of boring.

Speaker 2 (23:26):
So I don't know.

Speaker 1 (23:27):
I think you might be surprised, I guess. Yeah. Well, okay,
that'll be the next tangent then. We should... I think
it would be very interesting, actually, to do the exercise,
and then later in the week we can talk about it,
because I would be curious to know, unabridged, if
you're being really honest with yourself, if you could build
your perfect partner, what would you build? And by the way,

(23:47):
there's no judgment if you were to say, "I would
make it all about me all the time." Kiki, I...

Speaker 2 (23:52):
Pray for that. But we'll talk about it.

Speaker 1 (23:55):
Okay, good. Wow, a part two tangent coming soon. It's
like we almost thought about this ahead of time, kinda.
All right, well, that's part two on Wednesday.
Host

Christopher "Fred" Frederick