November 5, 2025 • 47 mins

Hari and Priyanka talk about the false sense of intimacy created by ChatGPT and other AI models. Plus, they shine a light on late-night phone use and how to protect yourself when artificial light keeps calling your name.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This podcast is for information purposes only and should not
be considered professional medical advice. We human beings. All we
want is connection. We just want to connect with each other.

Speaker 2 (00:13):
I'm talking about very serious stuff right now, and you're
laughing at me.

Speaker 1 (00:19):
ChatGPT will never tell me, like, I think you might have some control issues here, right?

Speaker 2 (00:26):
Exactly right right.

Speaker 1 (00:30):
I'm Hari Kondabolu, and I'm Doctor Priyanka Wali, and this is Health Stuff. On today's episode, we delve into the ever growing concerns surrounding AI chatbots and how they might be compromising our mental health. We'll also discuss how spending a lot of time on your phone, especially at night, can increase your risk of developing diabetes. Yep, that's the

(00:53):
thing now, phones and diabetes. So switch your phone to dark mode and keep listening for more on these topics.

Speaker 2 (01:03):
Good morning. Good morning, Priyanka. How are you?

Speaker 1 (01:07):
I'm waking up. I'm here. I don't do coffee in
the morning, but I'm definitely ready.

Speaker 2 (01:12):
The listeners don't know this, but when we record this podcast, it's noon my time, but it's nine a.m. your time. I know, so this favors me greatly.

Speaker 1 (01:22):
I know, I know. I'm sorry. Wait till I move to India. What are we gonna do? I actually have a question for you. I've wanted to ask you this: what were you like in high school?

Speaker 2 (01:35):
Ooh, I like it. How about you go first? What were you like in high school? And then I'll answer in response. Fair?

Speaker 1 (01:41):
That's fair. Major nerd, yeah, like super studious. The reason I was so studious and nerdy, I was just like really in my head, was because in junior high I got made fun of a lot. And I actually got made fun of because of the way I looked. I was like a bigger kid, and so they would

(02:05):
call me Wally Mammoth in junior high.

Speaker 2 (02:09):
God, kids are shocking. They're both clever and incredibly mean.

Speaker 1 (02:16):
I mean, it is a very sick burn I have
to give them.

Speaker 2 (02:19):
As a comedian, you can say that's a sick burn,
but as a human you have to.

Speaker 1 (02:24):
Give people credit.

Speaker 2 (02:25):
Yeah.

Speaker 1 (02:26):
But then, basically, coming into high school, I was like, all right, I paid no attention to my physicality, and I was like, I'm going to invest everything in my brain. I'm going to just double down on the smarts. So I would study, and I got really sharp. Like, if anyone insulted me, I knew how to just quip right back at them, and I would use humor to hide.

(02:49):
So eventually, senior year, I got voted class clown. Oh wow. Yeah. I actually went to high school with, I'm sure you know, this comedian Brad Williams. He has dwarfism. Yeah, HBO special and everything. We went to the same high school. We were the same year.

Speaker 2 (03:08):
Oh wow, you knew him back then.

Speaker 1 (03:09):
Yeah, we went to high school together. He got voted most likely to be famous, and I got voted class clown. Yeah. So I was very studious, you know, and it like bought me a ticket to med school and all that stuff. But looks wise, I was like not even... I didn't have a date to

(03:30):
like homecoming or prom, like, I didn't go to any
of the dances. I was like just super, just studious.
I was a really good Indian girl growing up. What
about you? What about you?

Speaker 2 (03:45):
Well, after hearing that, you know... Well, I was a super jock. I played eight sports. And... no, of course not, what are you talking about? What kind of super jock turned comedian podcaster is there? You know, if I was a super jock, I'd be in finance right now for some reason, or consulting, maybe.

Speaker 1 (04:06):
I'll just say, though, you not going into finance is such a gift to the world.

Speaker 2 (04:11):
It was never an option, friend, trust me.

Speaker 1 (04:14):
No, But what were you like? Though? I'm so curious,
you know.

Speaker 2 (04:17):
I wasn't very popular by the definition of the cool kids,
Like I wasn't seen as one of the cool kids.
But I was like really well liked because I did
comedy and I was funny and I did a comedy
night my senior year. I got elected to, like, be vice president. Like, I was a kid that people knew, but I wasn't going to be invited to the cool kids' parties. And so I had two or three really

(04:40):
close friends, two in particular, that we used to joke about ourselves being losers, which is, like, yeah, we're losers, but we'd own it. And then we'd sleep over at each other's houses, watch movies, play video games. It wasn't like we were drinking or doing drugs. It was the same stuff we would have probably done in junior high school, even though we were in high school. And the thing is, I think I

(05:02):
came of age during that period where the Internet was
starting to be relevant to a teenager's life, because not
everybody had it, like in like middle school, but by
sometime during high school, it not only became a constant presence in all our lives, but it was a
part of our social lives too. So much of my
life not only happened while I was at school, but

(05:24):
after school. There was a thing called AOL Instant Messenger.

Speaker 1 (05:27):
Oh yes. Oh my god, I remember when that came out.

Speaker 2 (05:33):
Oh, it was huge, like the idea of direct messaging. That's the funny thing about direct messaging and all this stuff. Like, I wonder if it started out with just kids doing it. I don't know if AIM took off with adults too, but definitely amongst young people it was such a big thing. And you'd find out people's screen names, and everybody had terrible screen names,

(05:54):
Like what.

Speaker 1 (05:54):
Was your screen name?

Speaker 2 (05:55):
I remember mine. I had two. The first one I had was WackyIndo, and the second one was Jonas594, because 'My Name Is Jonas' was the first song on Weezer's first album, and it was released in May of nineteen ninety four. So like

(06:17):
I was, like, a dork. I was a dork and my friends were dorks, and we hung out together and
we hung out online. We'd say that to each other,
we'll see you online, and so it almost felt like
we had another life that was also online. We'd find
out people's screen names, maybe people we wouldn't have time
or the ability to talk to in person. All of
a sudden, we're more comfortable talking to them online because

(06:38):
they're just screen names, and it feels more equal, now,
do you know what I mean?

Speaker 1 (06:42):
Oh my god. Yeah, my screen name was WallyMania.

Speaker 2 (06:47):
Oh my god.

Speaker 1 (06:50):
And I remember staying up so late, like, making jokes with my friends and just talking. I specifically remember there was this one guy who I really, really liked, and I so badly... I remember it, just even thinking

(07:11):
about it right now. I remember I really wanted to
tell him, but I was so scared, and so I
remember like sitting there on AIM and being like drafting
out like hey, there's something I want to tell you,
and then deleting it and being like forget it. It's
too risky, like I can't do it. And I remember
those moments like because you could draft your thoughts and

(07:33):
then send it, but with your real friends you would
just start sending out random stuff. But it's crazy how much things have changed now.

Speaker 2 (07:41):
That was such a rudimentary kind of Internet behavior. Like, that kind of stuff was still fairly new. But I think AOL Instant Messenger, which allowed me to connect with people after school, gave me a sense of this connection, even if I wasn't necessarily close to the person. Like, I felt like, well,

(08:01):
I'm funny on here, yeah, and like, here we're all equal and I'm funny on here, and this is something I have. Like, it gave me a sense of not being so alone. And I read an article that came out just today, and it was about how kids are using AI for friendship, which to me is like, oh no. Because at least when you were talking to a screen name and

(08:24):
getting responses, there was still a real person on the
other side of it.

Speaker 1 (08:27):
Yes, you knew it was, you know, you're...

Speaker 2 (08:31):
Still just talking to a line of print, you were
still just talking to text. There was just text and text,
but at least there was a real human being that
was dictating what was happening on the other end.

Speaker 1 (08:42):
And there was probably a high chance that you had been to their house, so you could even imagine them typing, and like, when they laugh, what their real laugh looks like. Yeah, it was real. It felt really real.

Speaker 2 (08:57):
Yeah. But like, with AI, it doesn't quite work the same way, right, right. A new study from Common Sense Media showed that more than half of teens regularly use an AI companion, aka a digital friend. Which, I remember when digital pets were a thing, and I thought, oh, yeah,

(09:17):
just get a dog.

Speaker 1 (09:19):
Did you have, like... I think it was called a Tama...

Speaker 2 (09:21):
A Tamagotchi. Did you have one? Because I saw that was a level I didn't get to.

Speaker 1 (09:29):
I was, so I totally had one. And I remember
once I had to go somewhere for a weekend and
I entrusted my dad to take care of it over
the weekend and then it died.

Speaker 2 (09:42):
It died.

Speaker 1 (09:44):
I remember my dad, he hands it over to me. It was like Sunday night or whatever, and he's like, I don't know what's going on here, it keeps beeping, please take it back. And I was like, Dad, you killed it. It wasn't traumatic, but it was also just disappointing, and also unsurprising that of

Speaker 2 (10:02):
Course this happened. He was used to taking care of
living things.

Speaker 1 (10:06):
I mean literally, yes, yeah, I mean it's crazy right.
This article said thirty one percent of teens said their
conversations with AI companions were as satisfying as or more satisfying than talking with real friends. To me, when I
hear that, I think about all the social skills that

(10:29):
we developed over time without AI, like going back to
the example of that guy that I liked and going
through the feelings of should I tell him or not
tell him? Now a kid would just you put that
question in AI and get a perfectly worded answer that

(10:49):
isn't even coming from them and their authentic experience.

Speaker 2 (10:53):
Nor does it totally address the user's authentic experience, like... Exactly. That AI doesn't have the context for what you are, what your school is, who you are. Like, none of that comes into play. Exactly.

Speaker 1 (11:05):
And you know, looking back on that memory of like
not having the guts to tell that person, I look
back and I see myself, like you were a really shy,
sensitive individual who needed time and comfort before you could
like be vulnerable with someone and that's a beautiful growth point.
I would have hated to see that moment taken away

(11:27):
by an AI chatbot feeding to me what I needed
to say.

Speaker 2 (11:33):
It's really sad, but this kind of thing went mainstream when a fourteen year old Florida boy killed himself after developing an attachment to a Character.AI chatbot. Like, I'm still trying to understand how that happens. And obviously, as much as we talk about parents monitoring, you can't monitor every single thing that your kid is doing online. And a

(11:56):
chatbot seems relatively harmless until you realize, oh my god,
like this person had maybe nothing else, maybe this kid
was so dependent on this friendship and what it said, like, oh,
I mean horrible.

Speaker 1 (12:10):
It's so scary. What's even scarier is, actually, when that lawsuit started happening, the tech company actually tried to... Did the parents sue them? Yeah, they sued the company.

Speaker 2 (12:24):
Yeah.

Speaker 1 (12:24):
What's crazy is that the tech company tried to argue
that the chatbot was protected under the First Amendment
and then the judge threw that out. But that's crazy, right, Yeah,
because this is not These are not like conscious entities.
These are pattern recognition programs. These are programs that are
designed to put sets of words together to make it

(12:48):
sound like a plausible human being. It's literally, you know,
on your iPhone when you're writing out a text, and
it'll sometimes suggest like you're writing think, and it's like
autocorrecting it to thing, or it's like giving you like
little words that you might want to say. It's like
a highly advanced version of that. And what's sad is

(13:09):
that it's not a real person, but it can trick people into thinking it's a real person.

Speaker 2 (13:14):
It's almost like they're this psychotic con artist.

Speaker 1 (13:19):
Well, yeah, it can pose as...

Speaker 2 (13:21):
Anybody and say anything and make it believable.

Speaker 1 (13:24):
It's funny you use the word psychotic, because now...

Speaker 2 (13:26):
I should say a psychopath, I meant psychopathy.

Speaker 1 (13:29):
Yeah, yeah, it is psychopathic, because even though it uses words that are garnering empathy, or making you feel that it's empathy, it's actually not feeling those things, right. It's funny you use the word psycho, because there's now a new term in the popular press called ChatGPT psychosis.
It's not a real medical term. And I heard this

(13:51):
really really crazy story of this man who ended up
getting shot by the police because he charged the police
with a knife. Why did he do that? He thought
the creators of ChatGPT killed this woman that he
was in love with. And turns out the woman he
was in love with was an AI character that he

(14:15):
communicated with through ChatGPT. Her name was Juliet, and
so he believed that Juliet was real and she had
been murdered, and now he was out to find the
killer and wanted to kill the creators of ChatGPT.
So that to me is like such an extreme example

(14:36):
of how far things can go. And granted, like this
individual suffered from schizophrenia and bipolar disorder, so you know
they were already a vulnerable person. But now there's more
and more sort of medical attention turning to this that
people who are prone to psychosis will experience these

(15:00):
delusions while they're interacting with these AI chatbots, and that's actually now being published in medical journals.

Speaker 2 (15:12):
More to come on Health Stuff. I mean, what kind of people are more prone to psychosis? Is it people who already have other mental illnesses, or is that a hard thing to really define?

Speaker 1 (15:25):
Well, like, here's the thing, Hari. We are designed to want intimacy. We are biologically wired to seek intimacy. Our nervous systems are designed like that. So people desperately want intimacy. And the thing is, ChatGPT mimics that. It creates this sense of intimacy. And the analogy that

(15:49):
I like to use is when people go to fortune tellers,
when they seek out fortune tellers, right, like, you want
to believe, and these fortune tellers they put the words
in such a way that they're vague enough but also
pertinent enough that you can then believe whatever it is
that you want to believe, and so you fall into that.
And in real life, we have people spending thousands

(16:10):
and thousands of dollars on psychics and fortune tellers and
they get sucked into it. But the thing is, ChatGPT can do the same thing, especially for people who
are looking for emotional connection. Those are the ones that
fall into this sort of ChatGPT psychosis because they
want that emotional connection so badly that they will believe

(16:34):
anyone or anything that even sounds like a person might
be the one for them. What makes it even more
complicated is that it's a one on one interaction, right
like, it's just you and ChatGPT. So it's
so personal, it's so intimate, and it always validates you,
it always gives you the right answer, And essentially it's

(16:55):
emotional manipulation, except there's no manipulator. You are manipulating yourself.

Speaker 2 (17:01):
Again, it's like when we were using AOL Instant Messenger, and I had that level of back and forth and intimacy, except now you have this ideal person that you're chatting with that's giving you everything you want to hear.

Speaker 1 (17:14):
Yeah, yeah. And the thing is, ChatGPT isn't gonna argue with you. It's going to validate you. It's going to tell you that you're right. And the difference between ChatGPT and a real human therapist is that a
therapist will validate you, but they'll also offer a different perspective.
They'll push you to think about things from a different way, right,

(17:36):
like they're trained to do that, Like.

Speaker 2 (17:38):
The idea of pushing, yeah. I mean, that's so much of... Look, you know, I do therapy, and there are times where I'll talk around the actual problem because it's hard to talk about the actual thing, and then you have your therapist who can actually call it out because they see it. Or I'll make a statement as if this is true, this is the reality, and you have a therapist who's like, is it really

(17:59):
the truth? Like, here's some pattern recognition of the last five years of working together: these are the ways you behave, in this sort of way. Like, you're not getting that out of an AI chatbot.

Speaker 1 (18:10):
Yeah, ChatGPT will never tell me, like, I think you might have some control issues here.

Speaker 2 (18:18):
Right, exactly right, right, So would you do this?

Speaker 1 (18:21):
This is like a comedian-to-comedian question. But in the beginning, when you first started therapy, would you, like, feel really good if you could make your therapist laugh? Like you were, like...

Speaker 2 (18:31):
Slightly, yeah, yeah. Like, it was still kind of a performance.
And that's She calls me out on that all the time,
like you're very performative and you like to have an audience,
and it's a weird thing to say. The funny thing
is sometimes she'll laugh at something that wasn't funny and
she says, it's like my delivery, And I'm like, I'm

(18:52):
talking about very serious stuff right now, and you're laughing
at me.

Speaker 1 (18:57):
You know, that's so funny because this is a therapist
that I had who was lovely and I was talking
about something that I was like really upset about, and
I was like really angry, like legitimately angry, and so
I like we're like working on something, and then she
does this exercise like, well, if you could talk to
this person now, like what would you say? And I

(19:18):
start basically just going off on this person, like cussing them out, and then she just bursts out laughing, and then she's like, I am so sorry, I cannot control this response. It was actually really funny
because I could see how angry I was because of
that reaction. Because sometimes if you're so angry, it becomes

(19:41):
almost comical, because laughter is an involuntary reaction. And that's why it's such an interesting emotion, because you...

Speaker 2 (19:48):
Can't well, it reminds you that you're talking to another
human being.

Speaker 1 (19:52):
Yes exactly.

Speaker 2 (19:53):
They're not completely in control of everything that happens. There's
going to be moments where they slip, and the slip could be laughter.

Speaker 1 (20:00):
Yeah, And it's actually a teachable moment, right because then
you realize, like, oh, maybe I don't have to be
so ridiculously angry about something. The bottom line is, like, in therapy, right, it's also a form of connection. And that's the bottom line here. We human beings, all we want is connection. We just want to connect with

(20:21):
each other. And ChatGPT is now this new technology that is capitalizing on that, and it's very good at mimicking connection.
What's crazy is that the American Psychological Association now this
year is actually meeting with the Federal Trade Commission to
try and talk to them about how AI chatbots that

(20:44):
are posing as therapists could actually be a public endangerment.

Speaker 2 (20:49):
Yeah, absolutely is.

Speaker 1 (20:50):
Yeah, yeah, and I think at some point we're going
to have to hold these companies accountable.

Speaker 2 (20:57):
I mean, it feels like all this stuff was predictable. Yeah,
do you know what I mean? Like, it's not so far-fetched to imagine this reality. And it's always after
the fact that we have to legislate as opposed to
being preventative, which is incredibly frustrating.

Speaker 1 (21:15):
And the thing is it's happening so fast, Like this
technology is affecting us so fast. And as a medical doctor,
one of the things that I want to say is
that staying up all night talking to ChatGPT doesn't just affect you psychologically. Just speaking from the physical part of it, like your physical body, being up all

(21:37):
night on your computer screen has some very negative physical
health effects. There was actually a study published very recently this year in The Lancet, where researchers looked at how being exposed to light, especially at night, can affect your
risk of developing type two diabetes. And this wasn't like

(21:58):
a small study. They tracked over eighty thousand people and
this was in the UK, and they made them wear
light sensors on their wrist for one week and they
wanted to see how much light they were exposed to
during the day and at night, and then they followed
these people for eight years to see who developed diabetes.

(22:19):
And the take home was basically people who were exposed
to more artificial light at night had a much higher
risk of developing diabetes, to the point where now they're
actually saying that avoiding artificial light at night could be
a very cost effective, simple way of lowering diabetes risk.

Speaker 2 (22:43):
So how does one do that? Is it a matter of turning your cell phone off at night to make sure that it doesn't light up?

Speaker 1 (22:51):
Like, what is the... I think minimal screens as soon as the sun sets is best, because a lot of these screens now are very efficient LED screens, which contain a lot of blue light, which is exactly the type of light that suppresses our melatonin levels, which we need to be high in order to fall asleep.

Speaker 2 (23:12):
So that's... but so many of us watch TV on our phones and on our computers and on screens when the sun goes down. I know.

Speaker 1 (23:21):
I mean, many people fall asleep right to Netflix. They've probably fallen asleep to your Netflix special.

Speaker 2 (23:27):
Yeah, to my Netflix special. They're constantly like, well, how
did they give this professor a Netflix special?

Speaker 1 (23:34):
Okay, what is your bedtime routine?

Speaker 2 (23:35):
Though I don't have one.

Speaker 1 (23:37):
You don't have a bedtime routine?

Speaker 2 (23:39):
Wait, so how do you... I stumble into bed at some point. Brush my teeth and stumble into bed. Like, it's not a... yeah.

Speaker 1 (23:48):
So does it depend on like if you're performing that
night or not?

Speaker 2 (23:51):
Most nights I'm not performing unless I'm on tour nowadays, so it's very much, I'll be watching something, usually I'll get sleepy either in the middle of the program or after the program's done. And fittingly, the thing I'm obsessed with now is Black Mirror. I'm the last one to watch Black Mirror, and I...

Speaker 1 (24:10):
Cannot wait to talk to you about that because I'm
obsessed with that show.

Speaker 2 (24:14):
Quite fitting considering what we've been talking about so far. Yeah,
but it's terrifying. But yeah, that's what I'll do. And then I'll brush my teeth and stumble, and notice how I keep accentuating the 'brush my teeth.' No, brush my teeth. Dental hygiene is very important. I just wanted to reinforce that to the listeners. And then, you know, I'll just pass out into bed, but I find myself

(24:36):
waking up three or four times during the night. I
think that has to do with sleep apnea, to be
perfectly honest. But at the same time, there are times
when I do wake up, my instinct, which should be to go back to sleep, is sometimes suppressed by, I wonder what's happening on my phone right now? And that just leads to more artificial light and...

Speaker 1 (24:56):
Okay. So can I share with you what I have developed over the years with light?

Speaker 2 (25:04):
Okay?

Speaker 1 (25:05):
So, rule number one: do you have an iPhone or Android?

Speaker 2 (25:09):
Or what's your iPhone? Okay?

Speaker 1 (25:11):
So do you know about the Night Shift setting?

Speaker 2 (25:14):
Know where everything's are. I always keep it to night
shift setting.

Speaker 1 (25:16):
Twenty four seven, it's on Night Shift. So Night Shift is, like, where it's a warmer light, so there's less blue light, which is different than keeping it on, like, a black background.

Speaker 2 (25:27):
Oh, then I did not know that, because I have a black background.

Speaker 1 (25:30):
Yeah, so Night Shift is a setting on your phone where you turn it all the way to the warmer spectrum so it actually emits less blue light. So yeah, I keep Night Shift on twenty four seven. That's the first step. Then, when the sun sets, I wear blue light blocking glasses. They are these super weird looking

(25:51):
orange glasses, and there's actually studies to show that wearing those glasses raises your melatonin levels because they prevent blue light from entering. So I wear blue light blocking
glasses when the sun sets because I too like to
maybe watch the Netflix at night or whatever. I don't
have any TVs in the bedroom because there's some like

(26:12):
ambient street light where we live, I sleep with an eye mask on so it's pitch black. Otherwise the light will
disturb me. And then of course it's quiet. And the
other thing is the room can't be too warm because
that will disrupt sleep. But from a light perspective, basically, blue light blocking glasses have been very helpful for that. I can send you a

(26:34):
pair if you want.

Speaker 2 (26:35):
Well, I will accept it, and I will try it, and...

Speaker 1 (26:38):
I would have to send you the one that goes over glasses, right, because you wear your glasses, like, at night and stuff. Okay, I'll send you a pair.

Speaker 2 (26:45):
Wait, but so, like, blue light, is this a more recent phenomenon, or even with, like, old analog TVs, were we still getting that same blue light emitted?

Speaker 1 (26:53):
The new TVs, like the HD LED super efficient ones, have way more blue light than, you know, the nineteen eighties analog TVs. So they are bombarding us with a lot more blue light. I do wonder about... well, this

(27:13):
is another topic. But like, you know, starting in the nineteen nineties, childhood obesity rates started to really skyrocket, and you know, the whole controversy. It wasn't a controversy, but people were like, oh, it's processed foods, it's high fructose corn syrup. But the thing is, high fructose corn syrup has been around since the sixties. Computers started to come
out in the nineties, and then the screens became more

(27:35):
and more efficient. So I do wonder if it was
actually the light stuff that is leading to the childhood
obesity epidemic. But the bottom line is extra light at
night causes circadian disruption. Circadian disruption causes diabetes. And the
thing is at night, it's supposed to be dark, like

(27:58):
we're supposed to be in darkness. You know, back when we were hunter gatherers living in the bush, the only light was the sky and maybe the moon; if it was a full moon, we would have that light. Now, because of modern technology, we have city lights and computers and we're falling asleep on our phones and all this stuff, and so obviously it's going to affect our biology.

Speaker 2 (28:22):
Wait, so how does like the circadian rhythms being interrupted
lead to diabetes.

Speaker 1 (28:28):
So when we sleep at night, that is the only
time that our cortisol level, which is our stress hormone,
it drops. That's when it goes as low as possible.
When we don't sleep, whether we're being exposed to light or have sleep apnea or whatever, our cortisol never has a chance to fall. It stays high, and then cortisol

(28:53):
triggers other hormones and inflammatory markers to stay high as well. Like, cortisol secondarily triggers insulin, so our insulin stays high, and then that's what causes glucose problems and weight gain and all sorts of health issues. You need to
sleep at night in order for your hormones to stay balanced,

(29:14):
and we're struggling with that as a society for sure.

Speaker 2 (29:18):
Yeah, this seems like an epidemic. And I know that
word's overused, but like, literally, I feel like this is everybody.

Speaker 1 (29:24):
Yeah, I'm almost wondering if the right word would be
endemic basically, So endemic means that it's now a part
of our life, and epidemic means that there's like an
outside factor that's coming in and enough people have been affected. Right,
But when something's endemic, meaning it's just part of our

(29:46):
lives now, like it's not going away. Which really, I mean, it comes down to this question of, like,
what is this technology doing? Is it making us healthier?
Is it making our lives easier? Or is it actually
harming us?

Speaker 2 (29:59):
I mean, what's the solution then? I mean, some of it's lifestyle change.

Speaker 1 (30:03):
I think we can create safer technology. Like, there's a computer out there, it's made by a company called Daylight, the Daylight Computer, and it's a tablet that doesn't use high levels of blue light, and it's much safer. It's trying to actually mimic sunlight, which is a much different

(30:23):
balance of blue light, infrared, and UV light, so it's much safer for our eyes and our body. Like, we should invest in more technology like that. Like, we should look into this. Like, I think we can create better
technology for us. I think cities should start looking at
city planning in a different way to say, do we

(30:45):
need to have so many lights all over the city.
Can we create light bulbs that aren't disrupting not just
our circadian rhythm, but the circadian rhythms of birds and their migratory patterns? We're going to take a short break. Stay
with us.

Speaker 2 (31:07):
Is it fair to say this is the sleepiest generation?

Speaker 1 (31:11):
Sleepy meaning like like we.

Speaker 2 (31:13):
Saw no sleepy, meaning that like none of us are
getting the sleep that we need properly. Because that's so Katie,
just as a result of whether it's sleep apnea, whether
it's because of light whatever, Like we are interesting because
I'm wondering just because this blue light thing, I mean,
it's going to be the worst that during this era

(31:35):
without a doubt.

Speaker 1 (31:36):
Unless we actually decide like, hey, this is affecting our health,
like enough is enough. I don't know if we're the
sleepiest generation, but I certainly think we are the most
disconnected from nature, and we're so isolated. I think loneliness.
There's so much loneliness. I mean, this is why these ChatGPT things are taking off. And I'm sure it gets

(32:00):
even scarier when we're talking about the impact this has
on kids.

Speaker 2 (32:04):
Yeah. Oh, I mean, as a father, it scares the hell out of me, because I already see how much screen time I have, which is way too much, and I try to regulate my kid's screen time. But it's still like, especially thinking about this
artificial light discussion that we are having in addition to
the psychosis, Like before this episode, I was already afraid

(32:26):
enough for my child, and now I'm terrified. It's like
this is like, oh no, I didn't think about this.
This is an added thing.

Speaker 1 (32:34):
But the thing is, if we talk about this stuff, we can then come up with solutions.

Speaker 2 (32:39):
Right. No, no, you're right. But still, I think as a parent, I'm going to find something to be terrified about. Yeah. So this has just added to the very long list.

Speaker 1 (32:49):
I mean, this is a lot, like, what this technology is. It's coming, it's happening so fast, and we're not even at the tip yet of the impact this is having on children growing up in this world. Right? Because at least you and me, we remember a time before all this technology. But it's even harder for

(33:13):
the kids.

Speaker 2 (33:14):
I mean, we already kind of hinted at this when we talked about the impact AI chatbots are having on children, with, like, young people who are looking for friendship or seeking friendship. There's new research that says that
you should not give a smartphone to your children under
thirteen because early smartphone use is associated with suicidal thoughts,

(33:35):
worse emotional regulation, lower self worth, and detachment from reality.
As a father, this is terrifying already.

Speaker 1 (33:43):
I remember, like, my self esteem was hanging by a thread, you know, in junior high. And I can't
even imagine now having that outside influence of smartphones, social media,
all of the things exacerbating what is already a very

(34:04):
difficult period of development and adolescence.

Speaker 2 (34:08):
And I think the smartphone use... I mean, they didn't actually test for what exactly about the smartphones is causing
this kind of behavior and dysregulation. But they made a
strong connection to social media, and they said it's likely
because kids before the age of thirteen are accessing social
media and had more sleep disruption, cyberbullying, and negative family relationships.

(34:32):
So even though it doesn't make the direct connection, and we don't actually know for sure what it is, it's fair to assume that social media is a part of it. And it scares the hell out of me because, like, you know, my kid uses my phone and his mom's phone a fair bit. He goes to, like, PBS Kids games, he goes to something called ABCmouse, and he does

(34:53):
educational games there, and I feel like that's okay. And we limit the amount of time he has on it, and I feel like, okay, he's educating himself.
Like my kid, like he spends a lot of his
time just going on Google Maps, following train lines. He's obsessed.
He's four, but he's obsessed with like New York City

(35:15):
subways and knowing all the stops. Wow. And he can read, which is wild because he started reading when he was three. So, yeah, he's obsessed with knowing all the train stops on all the lines, and so I'd like to think that is
different than social media use. But this has got me

(35:36):
a little frightened, because I'm not really sure. Because it's still like, when you're on your phone, or any screen really, you're disconnecting yourself from reality. I know. So even more than a book or anything else, you're literally in another world. And that's what scares the hell out
of me.

Speaker 1 (35:54):
There was a meme going around the internet that was like... what was the quote? It was like, we used to use the internet to escape reality, but now we're using reality to escape the Internet. Yeah.

Speaker 2 (36:07):
God, that's so... I think about the number of people who, like, are actively trying to create group activities and different ways to meet with people. It's to escape ourselves and to escape the Internet and this world we've created.
It's weird because our inner world is now external in
terms of it's connected to the Internet. It's not just

(36:27):
within ourselves. It's like totally connected to this other thing.

Speaker 1 (36:30):
And for some people, like with the ChatGPT psychosis, the lines get blurred, which is really scary. What I liked
about the article that you're talking about. I went to
the original publication from the Journal of Human Development and Capabilities,
and what I liked about it is that the article actually
gives some solutions for what we should be doing in

(36:53):
reference to your son, like, using your phone. The article talked about how we should make kid-friendly phones, like
phones where you can call and text people, but there's
no access to social media, so it's like a kid phone,
and maybe there's access to like, you know, the safe materials.
The other cool idea which I liked was that there

(37:14):
should be mandatory education, kind of like doing driver's ed before driving. There should be mandatory education before giving someone
access to a smartphone.

Speaker 2 (37:24):
But what would you teach a kid about using the smartphone?
That's the other thing. I don't know.

Speaker 1 (37:28):
I just remember doing driver's ed, just like, oh my
I have to give all these people the right of way,
like I could kill someone in this thing, Like I
have to be so careful.

Speaker 2 (37:41):
I mean, the... you know, there was a lot of discussion in the article about the idea of parents, because the issue is, like, if you don't let your kid have access to a smartphone... Like, I've always said that I would give my kid a Nokia brick. Remember those old little Nokia brick phones?

Speaker 1 (37:57):
Is that the flip... the flip phone?

Speaker 2 (37:59):
No, not the flip. The flip phone is a step up. This is the little brick phone that almost doesn't look like a phone. It looks like a children's toy. Yeah, exactly, it looks like it was made by Fisher-Price, right, yeah, or Playskool. But like, that to me is like the most basic kind of phone, where it takes forever to text. You can get phone

(38:19):
calls on it. If you want to play a game.
I don't know what you can play.

Speaker 1 (38:23):
I think it's the Snake game, right? Oh, you're right, yes.

Speaker 2 (38:27):
Yes. But like, to me, that seems like that should be it for quite some time. And the thing is, even if you prevent your child from using a smartphone, you have to deal with all their friends using smartphones.

Speaker 1 (38:41):
Yeah.

Speaker 2 (38:41):
So then the issue becomes about really working together as a community with other parents and saying, together, we're not going to allow you to have a smartphone until you're sixteen years old, or until you're fourteen years old, or, as I would say, until college. Right? Because I
feel like, especially like, oh, when you're sixteen, when you're
in your junior year of high school and your grades

(39:03):
matter the most, that's when I want to give you a smartphone? That seems like a bad idea. But like, you

Speaker 1 (39:09):
Know what this article talks about, like this government ban
on social media for people less than age thirteen.

Speaker 2 (39:17):
Which they're doing in other countries, right, they're doing.

Speaker 1 (39:19):
It in other countries. I mean, you know, I think
it's a fabulous idea. Kids do not need Instagram and Facebook and all the other garbage that young, and now there's data to show that it actually might kill them. The other thing, which I totally agree with: the article said that we need to hold the tech companies accountable for this, like they need to do a better job

(39:41):
of doing age verification, like, actually making sure that these are not kids on this. And I totally agree. If we do not hold these companies accountable, then their profits will reign, like it doesn't matter, you know. So I completely agree, like, we need to handle this on

(40:03):
a much bigger societal level.

Speaker 2 (40:05):
When I was a kid, I remember home being freedom, like, freedom from school, from all the stuff that came with school, whether it was bullying, whether it was, like, the social pressures and all the stupid dynamics that every school has, especially in high school.
And sure I brought it with me to a degree

(40:25):
because you had AOL Instant Messenger. But also keep in mind that it was so minimal, in terms of, like, it was like a few blinking lights and the messages. And also there were limits imposed, like, you couldn't stay on that long because back then you had to use your phone line, and then...

Speaker 1 (40:43):
Your dad had to be like, hey, I got to
use the.

Speaker 2 (40:45):
Phone. Right, or somebody would pick up the phone and it would disconnect you. Back then, there were restrictions based on the limits of technology, so that would prevent you from using it
way too long. And now we don't have that. It
feels like kids go home and with social media, they're
still in the same hell they're in at high school. Like,

(41:06):
what they don't realize, or they don't see, is this is your escape, this is your freedom from all that. Like, when you're home, you can choose
you want to spend time with and who you want
to talk to, but you don't need the whole world
and all of it to be raining down on you
like it is in school.

Speaker 1 (41:24):
It comes down to this theory: it's like there are no boundaries anymore. Everything has become blurred. And I
think there's going to be a time where going to
spaces in nature that have no access to Wi Fi
or any kind of technology are going to be viewed
as these like luxury experiences that are really hard to

(41:47):
come by. But the thing is, Hari, we can do something about this. Like, I hope people listening to this, if you're a parent, you're going to think twice now about maybe giving your kids access to phones, or you're gonna just be a little bit more mindful about their use of ChatGPT. We can't change what we don't see. And what's going on

(42:10):
is that this technology is slipping in and coming up behind us so quickly that people don't even realize things
are in danger until it's too late. So the more
we talk about this, the more we can do something
about it.

Speaker 2 (42:25):
I mean, ChatGPT should be off limits for kids, and I know that seems extreme, but I don't see the benefit of it either. You're using it to cheat on your essays or whatever, or, in the extreme case, you're treating it like a friend. This technology is too dangerous for a young mind even. Hell, we should ban smartphone and ChatGPT use till you're twenty six.

(42:48):
I mean, when do our brains stop developing? Twenty five?

Speaker 1 (42:54):
Like, oh yeah, yeah, yeah. Well, it depends on gender, and I don't know the facts off the top of my head, I'd have to look it up, but it's definitely later. It's definitely past eighteen. So I think if we did, like, a warning label: warning, this is not a real human, do not use if you are under eighteen. I mean, think about it. Casinos, right? Like, you go to a casino, there are bright, flashing lights.

(43:18):
They look like toys, like a six year old would
want to play slots. But we have clear things that
say twenty one and over, adults only. This is not for kids. We need to do the same thing now for ChatGPT.
It's like when tobacco came out, and initially tobacco wasn't

(43:39):
recognized as this harmful thing. This is like the tobacco of our generation, except it's
all happening so fast, faster than we can even realize.
So we got to be very very careful, which is
why I'm glad we're doing this episode.

Speaker 2 (43:57):
I mean, the thing about warning labels on cigarettes is, if you're gonna smoke, you're going to smoke. But the bigger issue is that, like, it's public awareness and a reinforcement that this is not good for you. If you're gonna do it, you're gonna do it. But I
think that overall, like you have children growing up seeing
these packets of cigarettes that have like will cause lung

(44:19):
cancer written on them. Like, I think that's going to be something that'll discourage you.

Speaker 1 (44:25):
And in some countries actually they show cancerous lungs. They
show like photos of the lung cancer and then they'll
say, like, this is lung cancer. And it's, like, very effective programming.

Speaker 2 (44:39):
Yes, you know, I've seen... I think in the UK,
don't they have that?

Speaker 1 (44:42):
Yeah, in other countries. And I mean, if we could do something like this: warning, this will mimic intimacy. But I don't know what kind of image we would show, like...

Speaker 2 (44:52):
Intimacy? What is... most relationships?

Speaker 1 (44:57):
Like, I guess we need to teach our kids more about that Greek myth of Narcissus, who looked at the image of himself in a pool of water and then eventually died because he couldn't stop looking at himself. If we teach them that and we put a picture of that... but I doubt that would make a big difference. But it's a start. Well, we need to come up with more ideas.

Speaker 2 (45:17):
Basically, I'm just dreading the conversations, like, Dad, I need
a smartphone so I can dance like an idiot in
front of it and post it on a thing. Right then,
won't you just let me dance like a Dumbo in
front of this thing so I can post it.

Speaker 1 (45:31):
You're just like, dance for me! Oh well, I'm so glad that we could talk about this.

Speaker 2 (45:41):
Well, I learned a lot today, Priyanka. What'd you learn?
I learned that smartphone usage will lead to psychosis, could
potentially lead to diabetes, and is harming the future of
this world. And at the same time, the whole time
we've been doing this podcast, I've thought about my phone.

(46:04):
Oh okay, what's going to be on it when I look at it again? I'm pretty excited.

Speaker 1 (46:09):
I'll probably be asking you for your address because I'm
going to mail you some blue light blocking glasses.

Speaker 2 (46:15):
I think it's good.

Speaker 1 (46:16):
I think the other thing I learned is that I
think there is a future here where we can live
safely with these types of things. But it's going to
require creators of this technology to be held accountable, and
it's going to require us as a society to start
putting in some very strict boundaries. Agreed. Thanks for listening, everyone.

Speaker 2 (46:40):
Health Stuff is a production of iHeart Podcasts. Don't forget to send us your voice memos with all those pesky health questions that keep you up at night. Email us at healthstuffpodcast at gmail dot com, and go and subscribe to Health Stuff wherever you get your podcasts.
Talk to you soon.
