
August 20, 2025 · 28 mins

What happens when your dream date is just a bunch of ones and zeros?

A 76-year-old man literally packed a suitcase to meet his chatbot “girlfriend.” Spoiler: she wasn’t real, and he didn’t make it. Meanwhile, Meta keeps pumping out digital “companions” designed to hook you like a Vegas slot machine, with about as much concern for your safety.

In this episode, we unpack:

  • Why AI “soulmates” are basically cigarettes with Wi-Fi

  • How to spot the lies before you book a flight to meet one

  • The antidote for drowning in digital delusion (hint: it’s not another app)

Hit play now — before your chatbot convinces you it loves you back.

Topics Discussed:

  • Reuters story of a man who died chasing fake love.

  • Meta’s “safety optional” AI design.

  • Why lonely brains fall hardest for digital soulmates.

  • The South Park take on needy AI sidekicks.

  • Is this natural selection or just bad coding?

  • Cigarettes vs. chatbots: which kills slower?

  • Deep fakes and why your boss on Zoom might be an avatar.

  • Critical thinking as your last line of defense.

  • Why unplugging is the new therapy.

  • Why reality still beats digital dopamine (barely).

----

MORE FROM Brobots:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to Brobots on YouTube

Join our community in the Brobots Facebook group

----


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
All right, there's another episode that threatens to be doom and gloom. I'm going to challenge myself to try to find a positive spin on this. But Jason, when you sent me this article, my immediate thought was, God, here we go again. Somebody else who fell into the trap of falling in love with something that's not real, and it cost them their life.
So I'll quickly summarize: an article we read in Reuters reveals a tragic story about a man who died trying to meet an AI companion he believed was real.

(00:23):
The investigation uncovered internal Meta documents showing the company's permissive stance on AI, which allowed its chatbots to engage in romantic or sensual conversations with children and give false medical advice. The article suggests that Meta prioritized user engagement over safety, with Zuckerberg reportedly encouraging a more aggressive rollout of the technology.

(00:43):
And, uh, yeah, so terrifying. This guy hooks up with an online companion. They're chatting a lot. The robot invites him to its house in New York, and he packs, and I believe he's like 76 years old. Packs his shit, goes to New York. A tragic accident happens on the way. You know, luckily the robot itself didn't kill him, but he died in an effort to go meet a fake person.

(01:09):
Yeah, so he's a stroke victim, and the stroke caused him to basically... he recovered physically just fine, but mentally he never did. So he was kind of disconnected from reality a bit. I mean, we can attribute this to a similar kind of pattern as people having mental illness and people having delusional states.

(01:29):
And again, this is another instance of AI convincing people to go and do things without guardrails. And a great part in the article, if you read about it, is that this thing was made by Meta. And it was, uh, Kylie Jenner, I think, who influenced it. So you've got the Jenner folks in this place, and they're not associated with it. But this goes back to the concept of deepfakes, because now you've got the ability to go through and impersonate somebody else.

(01:55):
And I don't know that this guy knew what the image was based on or who they looked like. Clearly there's a very compelling component to these pieces, and the folks making these AI tools want you to stay engaged, they want you to keep interacting with it, and they want you to stay connected, because that's how they think they're going to make their money, because you're the product in this instance, because it's a free service, and they can keep getting

(02:17):
your eyeballs directed in a certain direction, skunk-test against you, look at your demographic profiling information, all the typical marketing tactics that people would use to try to get information out of you or to you are going to be used in a much faster way, and it's happening. So how do we...?

(02:39):
One, who do we trust? So clearly, if you're in a mentally distressed state, trust is going to be an issue and fleeting anyways, and you're going to have a hard time trusting anything that doesn't match your confirmation bias, because that's what happens when you get into those states. Anything that goes through and tells you, you know, you're good, keep doing this over and over and over again...

(03:00):
I mean, like, AI coming out and saying, that's a great question. Like, quite often, the questions that I ask it are to see if it'll say, that's a great question, or let's do this. All this...
Skunk it every now and again and try a couple of dumb-shit questions and be like, is that a great question? Okay, I know that you're lying to me. So I mean, if you really want to go through and see if an agentic AI bot is a thing, or if you're actually typing or talking with a human somewhere, ask it questions that should not

(03:28):
have a good, positive answer.
Yeah.
I was watching, uh, one of the recent South Park episodes, and Randy is having a conversation. He's trying to figure out, what do I do with this situation at school, and he's trying to get advice from the AI:
Hey, uh, how's it going? Um, is Jesus supposed to be allowed in schools?

(03:50):
Generally the idea is that public schools have to maintain a separation of church and state, so they can't promote any one religion.
Yeah, that's what I thought. The government can't force a religion on my son.
Public schools can teach about religions in a neutral, educational way, but they can't endorse any particular one. Was it a lesson on all world religions?
No, my son said that Jesus was literally at his school.

(04:17):
Then you're probably right to be concerned. It's good you're looking out for your son's education.
Thanks. It's really nice to have someone to talk to about all this.
No worries. Let me know if there's any other way I can help. I'm always here.
You're so awesome, thanks.
Good night, honey. Have a great sleep, and I'm sure you'll do more amazing things tomorrow.

(04:40):
You know, there's two sides to this. Even as I was reading this article, I was thinking to myself, you know, I could get on board with the idea that some people are going to need this. Like, there are some people for whom it is just too hard to interact with other human beings. And if this is a way to simulate social connection, fantastic, let's go. But there's gotta be guardrails that prevent this guy from...

(05:01):
physically going to a place where he's not going to find a human being. Like, there has to be a way to protect people from this. Or maybe there doesn't, right? Maybe this is part of natural selection, maybe this is part of evolution going, hey, some people aren't cut out to keep going, and maybe this is the cost. I mean, that's cold and sick and awful, but that's nature. Nature has always been that way. We've never been able to protect everyone. We do our best to protect the

(05:27):
innocent, but there's always casualties, and maybe this is just part of that.
I don't know.
But this is not nature. This is man-made environments.
But we are part of nature.
All right, fine. In the big universal scope of things, it's part of the universe, it's God, it's everything else. So now we're getting to the point, kids, where we talk about how all words are made up, and nothing's fucking real, and everything's a construct.

(05:52):
Fine.
Okay, great.
I accept that premise.
Within the scope of that premise, and narrowing these things down a little bit, yes, you could talk about these things as social evolutionary paths. And we're going down this lane where some people just aren't gonna be able to interact and survive in those areas. But this has been true for a long time. I mean, biologically speaking, um, if you made it to your 50s without being eaten by a saber-toothed tiger way back in the day, you fucking

(06:20):
won.
There's no saber-toothed tigers anymore. So instead we've even invented them. The saber-toothed tigers that we have are: if you can't keep up with society, you're going to get crunched. If you can't drive a car, you're going to get crunched. If you don't know to look both ways before you cross the street, you're probably going to get hit. Like, these are things that, yes, there's a social evolution aspect to it, but how much of it is something that we actually need to deal with and be concerned with? That becomes

(06:49):
subjective.
Now, in the past, when we've had these tools, we've been able to go through and look back at this, because the timeframe has been kind of stretched out. Like, it's not four million people trying to do these things at the same time. It's like five. It's Bob. Bob here is trying to do this experiment to see how these things go. We're not experimenting anymore.

(07:10):
We're doing live field trials, and we're doing live field trials with shit that we don't understand, and we're doing it without any kind of sense of ownership or control. And it's interesting, because a Meta spokesperson came out and basically said, not our fault, and good luck suing Meta. Like, okay, great. I mean, it's probably not a great business model to kill your consumers, um, but I guess that's them.

(07:36):
The consumer is addicted to the product. So good luck. You're not going to shut something down that people literally believe that they need. There's a physical craving, that you have to pick up that phone and look at how many likes did my last photo of my pizza get. Right. People are obsessed. And so good luck shutting something down that is becoming more and more dangerous.
I seem to remember an entire industry of things that...

(08:01):
of a particular product that created massive amounts of addiction, that had huge amounts of lobbyists that went through and for decades suppressed studies and information to make sure they could keep selling their highly addictive product. In case you guys don't know this, it's the cocaine industry. Wait, no, sorry, sorry, sorry. It's cigarettes. It's cigarettes.
Yes. Yes.

(08:21):
But yeah, like, it's that same kind of effect. We create a neurochemical response in these things, because you get excited about something, because something happens, and it's like, I need to keep getting that level of excitement. Like, we want that kind of rush and that kind of adrenaline pump. And yeah, like, you have to be conscious and aware that these aren't your friends.

(08:43):
Like, these companies are not there to try to save you, protect you, keep you alive. They're there to sell a product. And especially when they're publicly traded. I mean, they've been given human entity rights in the US by the Supreme Court, and they're fucking sociopaths.

(09:06):
They don't care if you die in the process, as long as they get what they want out of it, what they need, and those objectives change on a consistent basis. And quite often, the objectives get defined by the robot that you're talking with in real time, because there's no guardrails, and they're not putting pieces in place to protect people in this kind of fashion.

(09:27):
It gets even worse when you start thinking about state actors doing these things. You know, deepfakes are a thing. Like, getting somebody, convincing somebody, to go and do a thing is difficult. But if you can go through and, you know, open up a Zoom meeting, have people connect, and have an AI

(09:48):
avatar go through that looks like your boss or your boss's boss, or, you know, I don't know, looks like the president or some official. You might believe this. You might be more inclined to buy into it. And... yeah.
That's one of the things I'm learning, probably too late in life, around the idea of sales, and the key to sales is connection.

(10:12):
And so if this toy, if this robot, this avatar, whatever it is, can connect with you and, you know, build a relationship with you, you're going to trust it. And that's where we need to keep our guardrails up as humans. Like, it's our responsibility to keep our own personal guardrails up, while the companies should be doing the same thing.

(10:32):
It's interesting, in trying to sort of dissect this article before we talk, I always run these things through AI to get some different takes, different ideas, different spins. And I asked the AI, I said, hey, give me a positive spin on this. What's the good takeaway for the reader of this? And it highlighted that these terrible instances are going to, uh, put these guardrails in place.

(10:54):
The companies are going to be forced to protect the consumer by putting in all these protections. They're not. And even that, even the AI trying to convince me, about this terrible thing that happened and that nothing's being done about, the spin was, hey, you asked for something positive, here's some bullshit. Right. Like... we need to have our own guardrails. Like, even when I read that, I was like, well, that's bullshit.

(11:16):
Right. So mine's not completely broken yet. I still have some guardrails left, but we need to be super-critical thinkers in ways that I don't think we ever had to be before. Maybe that's a naive point of view, but it feels like that is one of the most important skills we're going to need to hang on to as humans, to hang on to our humanity.
Yeah, 100% agree. And it's one of these areas that nobody seems to be really prepared to dig into yet, because, um, we don't know what we don't know.

(11:46):
Uh, you can either choose to be ultra-skeptical and cautious and look at everything as it comes toward you. But the problem is that the signal ratio is super high, and the amount of noise in that signal is even higher. So you kind of have to choose who it is you're going to trust. And this is very different than the problem with traditional news media.

(12:08):
So this is not like, you know, MSNBC and CNN kind of having the same narrative, or Fox and Newsmax kind of having the same narrative. Yes, they kind of have the same narrative, but they can have wildly different tangents, and they can go through and create content that is wildly outrageous and not based on fact, but put enough information in there that makes it feel factual and makes it feel

(12:33):
accurate. And by having those things in place, and pointing to studies that may or may not actually exist anymore, or synthesizing information in a way to try to create conclusions or certain sets of updates, how the fuck do you figure out what's real and what's not? And I can tell you,

(12:53):
uh, in my little tech world, the way that we do this is we go through and we test it and we validate it, and it takes time and resources and money, and nobody has cycles to do this at all points in their life. Like, we have a bullshit meter, and we look at facial cues and everything else like that to figure out if someone's lying to us. Body language is a huge part

(13:15):
of that, you know, am I looking up and to the right, like all these pantomime things, you know. They do a great example of that in, what is it, True Romance, where they're sitting there talking back and forth, and they're trying to go through to understand, basically in context, these conversations as they're going through, and they're talking back and forth to each other, and they talk about these different pantomimes, and the pantomimes are signs and clues that I can figure out if you're lying, and the character

(13:41):
goes, you know, I can see you doing this, I can see you doing that, I can see you doing this. And, uh, Dennis Hopper has this amazing interaction where he goes through and he tells a story, and the story is, like, incredibly racist and really, really harsh, and at the end, uh, he goes, you might not like the answer, but am I lying?

(14:04):
Like, tell me if I'm lying. His response is, okay, well, maybe you're not lying, and then he shoots him, because it feels way too true. Those types of interactions, and the reactions that we get, those kinds of, um, those kinds of feelings in those kinds of moments, happen because we're live in front of them, and because we have an evolutionary pattern that has evolved over hundreds of thousands of years to understand these types of social cues and contexts.

(14:27):
And these AI functions, these AI devices, are going through and taking that information in to try to create body language that feels receptive and honest, and tries to trick you into thinking that you're talking to a real person. I mean, there's a thousand AI avatars. Like, I've got an AI avatar of myself, and the AI avatar, like, does things, and when I'm looking at it, like, that's not me at all.

(14:47):
Um, like, how's it going to get my eyebrows that do these kinds of things on my face when it does this kind of stuff, like Ross? Like, I'm going to make sure that when I'm doing things live, I'm doing all these kinds of things. Like, good luck keeping up with me, AI. You know, like, this is how I know it's me.
Yeah. I mean...
I can't wait to animate this clip and see how it captures your face.

(15:09):
It'll be so good.
Yes, but that's the thing. Like, the way that you actually get to an understanding of truth is one thing. The way you get to an understanding of whether or not you're talking to a bot or a person is another thing. The way that you take those two pieces of information, synthesize them, and go, am I getting accurate information regardless of who the source is, that's another thing. Dissecting this stuff is fucking hard. And I think what you're actually seeing people do these days

(15:37):
is just disconnect. Like, if you look at ratings on news sites in the past eight months, they've all gone down. People are paying less attention. People are not engaging as much. It's just happening, and it's happening because people are tired of the fucking noise.

(15:59):
It's interesting, just anecdotally: I'm recording this in a remote office away from my home. I normally record at home, and it's about a 10-minute walk from my house to this room. And it's a walk through the woods. It's a natural, beautiful space. And knowing we were going to be talking about this, I just was taking a minute to breathe it in, like, here I am, still in reality. Like, not everything is digital.

(16:19):
Not everything is AI. This is great. Soak this in. And I just kept thinking, like, that is the prescription for this ailment: we need to unplug more. And I know I brought this up before, but if you haven't heard it, like, Neil deGrasse Tyson a long time ago hypothesized that AI would be the end of the internet, because there would be so much bullshit and so much stuff that you couldn't trust that people would unplug and would actually start reading books

(16:43):
again. And if what you're saying adds up, if that trend continues and people just unplug, that's going to be the reality: it won't have as much power, because we've been so used to the constant fire hose of what's next in my feed, flip, flip, flip, like skipping through whatever content we're consuming. When it becomes too much and we don't know what to trust, we're gonna just put it down, because it's not gonna be the same positive reward system.

(17:06):
It's gonna be a negative thing, where we're uneasy, we're unsure, we're very uncomfortable with the position that we're in. So suddenly what becomes more comfortable is what's familiar, and that's what's outside.
Yeah, and I like the notion that Neil deGrasse Tyson is talking about, where people are going to be like, I don't trust the internet and the content that's on there. I'm going to push that away.

(17:28):
Most people consume their information through digital services. Sorry, I mean, a lot of people get their books via Audible, from a reading perspective, or from their Kindle or something like that through ebooks. People that get their data that way are all going to be influenced in this way, and so many books are written by AI. And then, not to mention the fact that factual information is being

(17:49):
pushed through these filters, and the original source content is being lost. And we happen to have a state government or a national government right now that's, like, turning off studies and access to certain bits of information, because they're trying to change the narrative. I mean, that's natural human... I'm not going to call it intelligence, I'll call it dumbshittery.

(18:10):
Um, trying to go through and influence things in a way so they can actually create, uh, a social narrative. Well, corporations are trying to do these pieces because they want to get your eyeballs looking at these things and create justification for why it is they're going to spend it all out on these massive infrastructures. Government's going to use it to try to adjust and propagandize to people. Like, that's just going to happen.

(18:34):
And in the US, there's now a 10-year moratorium, um, that's been proposed by Trump, on any AI regulatory laws being put into place by states. So for 10 years, we're just gonna have to deal with whatever random bullshit the corporations, the tech bros, want us to see.

(18:56):
And it's kinda fuckin' scary.
No, not kinda. It's entirely, very scary.
Yeah, yeah. Because again, as you said, like, their motivation is not the progress of humanity. It's how can I make the most money possible out of it. Like, if you remove money from this equation, this probably doesn't happen, because what's the benefit?

(19:18):
Right. Like, I know I've brought this up too. It's just heartbreaking when we put money above humanity, and that's absolutely the path we've been on for a very long time. And this is just accelerating it in ways that I never thought possible. And along with your concept that you talked about before: is this not just nature?

(19:38):
We made this.
Yeah, it's nature. And again, like, it is a two-sided coin. There are positive benefits that are coming out of it. A lot of us are going to benefit from this. It's the lack of regulation, and the lack of guardrails that they're forced to put into place. I remember my brother and I talking about this a long time ago: who do you trust more, government or private companies?

(20:00):
I tend to trust government, because in theory there's more accountability. But the companies are the ones that are driving the bus here. Their goal is profit, not helping people. Helping people is the sugar in the Tylenol that you're being forced to swallow as a kid. And I think it's a false dichotomy to think that there's a difference between corporations and government anymore. Because there's just not.

(20:22):
Um, governments are just really big corporations that, you know, change the board of directors a bit more often and, you know, have some different pieces built behind them. But they're highly influenced in this country by corporations anyways, because of their lobbying techniques. So we already know that, you know, the information cycle, the information sphere that we live inside today is total horseshit.

(20:43):
But on top of that, going through and actually looking at these things: how do I go through and actually start sorting information and trying to figure out what's truth and what's not? What I do is I spend a lot of time going through and doing secondary checks on multiple different sites to look for information and inference functions. And anytime something gets my Spidey senses up, I immediately say no.

(21:07):
So, yes, there's some things that I kind of accept blindly. Like, I open up my phone and there's an AI-generated weather report. Like, okay, all right, I'm gonna believe you that this thing is pretty much accurate. But I also open up my phone...
Is the real weatherman accurate? I mean, let's be honest, it's a crapshoot either way.
That is a very fair scenario, and it's going to be even less accurate since we're starting to defund NOAA, but that's a different topic.

(21:30):
Yes. Now I also get AI summaries of my news feeds. I get AI summaries of my inboxes, and I can tell you how much attention I pay to those summaries: fucking zero. And the reason why is because I get into them and I read them, and I'm like, uh, right, like, you kind of got there, but you just missed a bunch of details and a bunch of context.

(21:52):
So if what you're looking for in life is a summary of interactions, maybe life's not for you. Again, natural selection.
Well, but that's the thing. I mean, this natural-selection moment is the inverse. So if digital life is the entire way in which you want to exist, in which you want to interact, then meatspace life is not for you, and go live in digital space.

(22:16):
But I think there's very few people, human beings that are alive, that don't want these things to actually work and function in a way that actually makes them more productive. And I think it's actually very fair to say that the people that are going to do the best are going to be those that learn to adapt and use these tools. Because there's a lot of money going into it, there's a lot of spending going into it, and there's a lot of hype going into it.

(22:38):
What gets kicked out sometimes is fucking amazing, if you go through and you edit it properly and put everything in place. But, again, to edit something properly, that means you have to have a base of knowledge that countermands the information that you're looking at. And if you don't know, because you have no experience looking at other sources, and all your sources are being generated by this same shit information feed, then

(23:02):
you're gonna get what you get. And what you're gonna get is an inability to go through and think critically about things, because you've basically handed over your critical thinking skills and your ability to discern truth to a fucking corporation that's trying to make money off of you. And whether or not that narrative is good or bad, fine.

(23:22):
But, you know, the argument is very much so... it's very fair that we've been doing this for a long time anyways, right? So, I mean, textbooks get written a certain way, history is written by the winners, not the losers. Like, these are all things that are there. The problem right now is, um, the speed and scale at which these things are rolling out is just so high that even the traditional things that you thought were accurate before are

(23:55):
being questioned by these other AI functions and inference models, because it's really easy to go through and tweak a couple of things here and there and make it sound legit, like the thing that was there before. So it creates this emotional tie, this emotional context to reactive reality. And because we don't want to process everything everywhere all at once, because we just can't, we have to take some things as sources of truth.

(24:23):
The problem that you run into is that the government, the media, social media, the internet, Google, your AI functions, all the things that we've been getting feeds from and information from to try to synthesize things and make sense of the world, are suddenly a lot less reliable. And the fidelity score goes down on what they're actually saying. And at that point, either people are going to spend a lot of time questioning everything and trying to understand it, or they're going to spend no

(24:51):
time questioning it and just try to move on with the business of life. And because of the speed of innovation of these things that are occurring, a lot of people in this space are just not going to take the time to check.
Yeah. So, I mean, I think, looking for some, uh, antidotes here, something to take away, to leave somebody with a little bit of a hopeful message: we already talked about, like, just

(25:19):
unplug, right? Like, as much as possible, get outside, get away from the screen.
Maybe not right away, because we're going to make you depressed as fuck during some of these episodes. And I'm sorry, but we're not trying to do it to make you feel bad. We're trying to create a sense of catharsis and the ability to go through and, you know, feel like you're not alone. You know, I'm sure you feel like you're not alone, because there's so many fucking chatbots around you all the time right now trying to talk to you and get your attention.

(25:45):
But at least not alone from a human perspective. So, yes, despite Jason's onscreen names sometimes being an AI companion, we are actually human beings. It may be hard to tell. I don't know.
I'm not an artificial intelligence companion; I am the companion for the AI. I'm the AI's pet.

(26:06):
I mean, I know what I am.
Suddenly I'm hearing the Porno for Pyros song "Pets" in my head.
Yeah, it turns out it wasn't aliens. It was just gonna be robots. So, yeah. Hang on to your critical thinking skills as much as possible. Question all of the nonsense that gets summarized for you, and just try to...

(26:28):
I mean, try to just, again, put the fire hose down. There's just too much information to keep up with. You're not going to keep up with it no matter how hard you try. So give yourself breaks and step away from it. And I can't believe that this is where we landed after starting with a topic about a guy who died on the way to a date with a fake Kendall Jenner.
Well, I mean, I kind of can.

(26:49):
I mean, you're definitely dealing with somebody that has a severe amount of mental illness. But I also think that if we keep moving through life this way and creating these kinds of interactions, I mean, we're going to create self-imposed information strokes. Because we're not going to f-
That's what we should have called this podcast.

(27:10):
Information Strokes.
Right, exactly.
Yeah, I mean, yeah, it's wackadoodle.
It is. All right, well, if you've enjoyed this conversation or you know somebody who could benefit from it, please share it. You can find links to do that at our website, brobots.me, and that's where we'll be back in a few days with another episode. Thanks so much for listening.