All Episodes

January 28, 2025 36 mins

As TikTok’s American legal limbo continues, Australia became the first country in the world to ban minors under age 16 from social media. Australia’s eSafety Commissioner Julie Inman Grant warned the world that tech companies aren’t doing enough to protect kids online. Really, no Really!

While the power of social media apps is undeniable, the harms they cause are more… enigmatic. Even if the harmful effects of social media are established, are bans an effective response? As this international debate rages on, millions of parents and young people are looking down at their smartphones wondering what, if anything, they should do differently.

To break down the pros and cons of social media, we’ve invited Dr. Jacqueline Nesi to make sense of it all. She’s an Assistant Professor of Psychiatry and Human Behavior at Brown University, where she studies how technology use affects kids and how parents can help.

She’s published over 50 peer-reviewed publications related to youth and technology use, and her work has been funded by organizations like the NIMH, NICHD, and NSF. She’s also testified before legislative committees at both the national and state level on issues surrounding tech and mental health.

***

IN THIS EPISODE:

  • How do we define social media and would a ban even work?
  • So… how does social media affect us?
  • Should kids’ online activities be treated like getting a driver’s license?
  • Default privacy settings.
  • Teaching media literacy is an important facet of modern parenting.
  • What age should kids be allowed to use social media?
  • The social media changes kids are actually asking for.
  • Speaking to kids about dangers like bullying and pornography.
  • Deleting, taking a break, downloading again… deleting again… and again.
  • How Jason altered his social media algorithm in 6 weeks!
  • REAL or FAKE: Jason stumps the panel on potentially actual websites.
  • Google-HEIM: Mistakes!

***

FOLLOW JACQUELINE:

Website: jacquelinenesi.com

X: @jacquelinenesi

Substack: Techno Sapiens

***

FOLLOW REALLY NO REALLY:

www.reallynoreally.com

Instagram

YouTube

TikTok

Facebook

Threads

X


See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Now really.

Speaker 2 (00:06):
Really now, really. Hello and welcome to Really No Really
with Jason Alexander and Peter Tilden, who kindly ask you
to subscribe to our show and then tell everyone you
know and don't know to do the same thing via
your multitude of social media apps. And speaking of social
media apps, have you noticed how they've come to pretty
much dominate our daily lives? And suddenly government is seeking

(00:29):
to intervene. Currently, the Supreme Court has upheld the right
to ban the popular TikTok app in the United States.
Australia has issued a ban on social media for minors,
and there is a growing trend of trying to limit
social media in order to make people more social. Really, no, really,
and who even knows if bans are the best way
of dealing with the situation. Well, one person who knows

(00:52):
quite a bit is Dr. Jacqueline Nesi. She's an assistant
professor of psychiatry and human behavior at Brown University, where
she specifically studies how technology affects kids and how
parents can help. And now here are two guys who
are still waiting for their verification check marks from Elon, Jason
and Peter.

Speaker 3 (01:10):
Really, So today we're gonna we're gonna be talking about
social media.

Speaker 1 (01:15):
Would you describe yourself as someone who is addicted? It's
a big word, but it is highly participatory in your
social media.

Speaker 3 (01:25):
Yeah. Well, and how many social media accounts do you
personally have? I don't mean our show, I don't mean any,
just the ones that you don't.

Speaker 4 (01:31):
I don't. I don't... it's funny, because I don't
enter into... I have them all, but I do them
for research. I want to see what's trending, what people
are talking about. But I don't post, and I don't
really look at comments. I don't know.

Speaker 3 (01:42):
You know, I have three. I
had four. I got off of a big one. And
the thing that makes me crazy is at some point
every week, my iPad will say to me how
much screen time I've had this week and whether it's
up or down from the week before. And I, first of all,

(02:04):
it makes me crazy that it's watching me. What? Hello?
And I guess it's not making a judgment about it,
but it feels very judgmental.

Speaker 1 (02:15):
It's telling me I'm up or down in my use
from the week before.

Speaker 4 (02:19):
What I love about that, because I get that on
all my social media and on my phone, how much
use and how much... yeah, with the subtext of, watch out,
just make sure you're not doing too much. And at
the same time, it's going, we got to get this up,
we got to get this up again.

Speaker 3 (02:35):
So, but here's why I bring it up,
because I don't think I'm overly engaged with social media.

Speaker 1 (02:41):
I don't really think.

Speaker 3 (02:42):
And yet, you know, it gets to just shy
of nine hours a week, is basically what it's telling me.

Speaker 1 (02:50):
I go, I'm on that thing nine hours a week.

Speaker 3 (02:53):
So if I, who am not plugged in, as a
sixty five year old man, I'm not plugged into this stuff.
I don't really care about this stuff. I get my
news from traditional media. And imagine being a teenager whose
entire world.

Speaker 4 (03:09):
Well, that's the other... that's the point. And just how
damaging or not damaging the social media is, because it's their community,
it's their engagement, it's their entertainment, it's their music, it's
their news sources, it's their everything. And why wouldn't it
be, right? And then the big question for me with
this specialist is, isn't there a spectrum? Like, aren't there

(03:30):
kids who are just destroyed by social media? Devastated by it.
We know it all the time, we hear about it
all the time, and some are even suicidal. Then there's some,
maybe, that are neutral about it, and then maybe there's
some who are honor students who are using it
to find content, to learn stuff. So all of a sudden,
the ban across the board would seem to be counterintuitive

(03:51):
and a bludgeon for something. There
has to be, and this is the hard part, what
are the solutions to make it better for those kids
who are devastated by it? So I'm fascinated by this.
I'm fascinated that Australia went screw it, and that they.

Speaker 3 (04:07):
Think, I said to you, when you, you know, talked
to me about this topic, I said, oh, we're back
to the Chemtrail episode, where you go, okay, they're banned
in Arizona.

Speaker 1 (04:15):
Okay, so.

Speaker 3 (04:18):
The sky, I mean. So how do you
enforce this? How do you... you know, what
is the mechanism that allows you to make sure that
you can enforce a.

Speaker 4 (04:31):
Ban, especially, by the way, when the people who are
pushing back on the ban are the wealthiest, most powerful
people on the planet, who own these tech companies. The thing

Speaker 1 (04:43):
Is they're going to do.

Speaker 3 (04:44):
I can't remember who this comedian is, and I just
saw the bit the other day, but he does a
bit about I guess the porn sites now have a
thing if you if you bring up the if you
type in whatever it is, the first thing that comes
up is the thing that says, I am eighteen years
of age or older.

Speaker 1 (05:02):
Or I am not eighteen years. But any schmo:
I'm eighteen, what do you want? How is that
a big preventative wall that no one goes past?

Speaker 4 (05:14):
We made a reasonable effort.

Speaker 1 (05:17):
Does that sven?

Speaker 3 (05:18):
Thirty four million dollars? We put up a thing; if
the kid hit I'm eighteen...

Speaker 1 (05:22):
Who are we to know?

Speaker 4 (05:23):
You're a liar?

Speaker 1 (05:24):
Right? Yeah, you're a liar, and I should pay thirty
four million dollars.

Speaker 4 (05:27):
But the good thing is nobody's watching porn, so it
doesn't matter.

Speaker 1 (05:30):
That's right, that's right. And with that, with that, we
have a fascinating.

Speaker 3 (05:37):
Guest today, Professor Jacqueline, is it Nessy? Is that, I
would assume... She's assistant professor at Brown University who studies
technology use and how it affects kids and how parents
can help. She writes for a popular weekly newsletter called
Techno Sapiens and co created Tech Without Stress, which is
a course to help parents raising kids in the digital age.

(05:57):
She's published over fifty peer reviewed publications related to youth
and technology, and her writing has been featured
in The New York Times, the Journal, the Washington Post.

Speaker 1 (06:06):
She's a person. She's a real person testifying for a.

Speaker 4 (06:09):
Congress. In addition, she's testified in front of Congress.
But she also just got funded to do research specifically
on kids using this tech and how
it does affect them, and is this causing the depression
in the youth? We'll find this out.

Speaker 3 (06:26):
Let's get schooled and let's welcome professor Jacqueline.

Speaker 1 (06:29):
Is it Nessy? Am I saying it right?

Speaker 5 (06:31):
It's actually Nesi.

Speaker 1 (06:32):
Welcome Professor Jacqueline. Nissy.

Speaker 4 (06:35):
You need to add another S.

Speaker 5 (06:36):
Need it's it's it's confusing.

Speaker 6 (06:40):
I would say about half of my my friends also
don't know how to pronounce it.

Speaker 4 (06:44):
So it's okay, because your name is obviously sounding correct.

Speaker 1 (06:48):
Exactly. Well, welcome, welcome to our little program. So we
began by chatting about.

Speaker 3 (06:57):
And we'll get into many things, I'm sure, but I
want to start with as I'm sure you're aware, the
country of Australia has stated that they are going to
ban social media for people under the age of sixteen
starting next year, and we wanted to get your thoughts
about whether that's a net positive or a net neutral

(07:19):
or negative. What's your take on whether this is even
a good thing to be pursuing.

Speaker 6 (07:25):
Yeah, yeah, I mean, so just backing
up a little bit, I'll say that the research on
the link between social media and mental health, which is
the thing that's kind of driving a lot of these
bans and.

Speaker 5 (07:41):
Movements to change things, the research is a little bit mixed.

Speaker 6 (07:45):
Actually, it's not as clear cut as we would think.
That said, there's generally pretty widespread agreement that these platforms
do have problems, right, like they're not designed optimally for
young people and young people's well-being, and so I think
there's two general schools.

Speaker 5 (08:04):
Of thought about how to address that.

Speaker 6 (08:06):
One is, let's make changes to the platforms themselves to
make them safer.

Speaker 5 (08:12):
Healthier, better for kids.

Speaker 6 (08:14):
The other school of thought is let's keep them off altogether, right,
and so obviously that is what Australia has started taking
steps to do.

Speaker 5 (08:23):
You could argue that you have.

Speaker 6 (08:25):
To set the age somewhere, right, like you have to
set the age at which kids can access it at some point.
And in the US, based on the laws
that we have, it's effectively thirteen.

Speaker 5 (08:36):
But there are a lot of issues that this brings up.

Speaker 6 (08:40):
One is how do you even define social media right,
like what platforms would even be considered social media for
this type of ban. The second is how do you
enforce it? Like you said, so there's some talk of
age verification, like, how are you going to know whether
kids are actually sixteen?

Speaker 4 (08:57):
Together?

Speaker 3 (08:58):
I was going to say, because I was joking about
you know, the porn sites have a thing that says
I am eighteen or older, or I am not.

Speaker 6 (09:04):
It's yeah, yeah, I think it's just a it's a
really tricky issue and I don't think that we have
a good solution in terms of how to do age verification,
not that it couldn't be developed, but how to do
it in a way that protects privacy and that people
feel comfortable with. I think the biggest issue, though, actually,
is that by putting a ban in place, it

(09:25):
kind of replaces the need to actually make changes to
these platforms, right, So, you know, so we know that
there are changes that could be made, maybe should be
made to make.

Speaker 5 (09:35):
Them safer and better for kids.

Speaker 6 (09:37):
And if there's a ban in place, then the platforms
can just say, oh, well, the kids shouldn't be on
there anyway. So, you know, what do we have to
worry about? You know,
we don't need to be making these changes.

Speaker 5 (09:49):
And then they actually become less safe for kids as
a result.

Speaker 4 (09:51):
Also, when you're studying teens, not every teen is destroyed
by TikTok. Some flourish because of it, and you would
push some kids into more isolation. So there's potential negatives
for a whole group of kids that we don't talk
about, who benefit from it. Some are neutral. We always
talk about the kids who are damaged by it.

Speaker 6 (10:12):
Yeah, yeah, it's a really good question. I think it's
kind of the question that's at the heart of a
lot of these debates, you know, because the narrative around
social media and kids' mental health is obviously really negative.
Like what we hear about it is that this is
bad and that the science is settled on this, Like,
we know it's bad, but I think the research we
do have actually doesn't really support that kind of simple story. Like,

(10:34):
as you said, social media is a lot of different things,
first of all, so there's a lot of different ways
that it can be used. Some may be beneficial, some
certainly harmful. And then of course kids are very different
from each other, right, so you know, kids are using
these platforms in very different ways.

Speaker 5 (10:51):
Some kids, as you mentioned, maybe.

Speaker 6 (10:53):
Those who might be more marginalized somehow in their you know,
offline lives, may actually be getting some benefit from the platforms,
finding community on there. So kids are going to be
affected in different ways by the platforms, and that makes
it hard, I think, to determine, you know, what is
the best way to regulate these platforms, to make them safer,
to make them better, knowing that the impacts really different

(11:16):
for different kids, and that there may be good that's
coming out of the platforms for some kids.

Speaker 3 (11:21):
Yeah, well, it seems to me, you know, we have
some of the same issues in my own business, where we
have just gotten around to trying to figure out
practices within my industry to make a safer environment for
the children that work in it.

Speaker 1 (11:36):
Yeah, and that's when.

Speaker 3 (11:37):
We can see them in their hands on and it's
very limited exposure, and we can sort of navigate what
they do and don't do during their work day. I mean,
are you aware of any of the things they're thinking
about that might be considered protective for young people?

Speaker 5 (12:17):
Yeah, it's a really good question. It's a really good parallel.

Speaker 6 (12:19):
I think there are many different proposals out there, some
at the federal level, many at the state level. Some
of the things that are in consideration are things like
default privacy settings, so making sure that kids are not
being automatically put into public profiles where anyone can contact them
or see their information, right, putting more protections in place

(12:43):
for parental controls, so letting parents and kids themselves customize
their experience a little bit more on the platforms. And
then you know, generally kind of creating a duty for
these platforms to protect kids from harms, so things like
really problematic content, like content that promotes self harm, or

(13:03):
eating disorders or things that we just obviously would not
want kids seeing. Right, there's some argument that there should
be changes made to the platforms for everybody, just that
they should generally be you know, more safe and places
where there's more transparency about how your data is being
collected and used, where you have more options to customize,
you know, the algorithms that are feeding you content and

(13:25):
those kinds of things. And maybe there should be changes
made for everyone.

Speaker 3 (13:29):
But is there any danger that you are
effectively putting blinders on, because they're only exposing themselves to
things that are giving them what they want as opposed
to offering an opposing opinion or a different idea, a
different perspective.

Speaker 1 (13:45):
Would there not be a downside to that kind of Yeah?

Speaker 6 (13:48):
So, yeah, certainly, I think there could be a downside.
That's why I think that with any change or
solution that we talk about here, I do think that
some amount of media literacy training is necessary, and that
just means we need to be teaching kids about healthy
and safe ways to use these platforms.

Speaker 5 (14:09):
Right now, I think there is not a lot of
that happening.

Speaker 6 (14:12):
Mostly, I think what kids are hearing is social media
is bad for you, it's bad for your mental health,
don't use it, which is not really realistic.

Speaker 5 (14:21):
And that's it.

Speaker 6 (14:22):
And they're not getting a lot of education and training
around how can they actually use this in ways that
work for them, that make them feel good or make
them feel better, where it's not interfering with other aspects
of their lives.

Speaker 5 (14:36):
So I think it's something that we need to be
teaching kids.

Speaker 3 (14:38):
And part of, as I was reading out your resume,
part of what you do is you are trying to
help parents learn how to help their kids navigate this.
What are some of the tips, I mean, if
you can sort of winnow them down, what
are some of the things you're trying to offer parents
as guidelines for navigating this stuff?

Speaker 6 (15:00):
Yeah, I mean, I think this is a really challenging
landscape for parents. As you can imagine, it's changing so fast.
It's very different than what many parents dealt with growing up.

Speaker 5 (15:12):
When they were younger.

Speaker 6 (15:14):
And yeah, so in the past few years, since becoming
a parent myself, I have young kids, I think I've
become a lot more interested
in just how can we help parents figure out how
to navigate this, how to deal with these
new technologies. And so yeah, I think there's a couple
of general tips that I think are good to

(15:34):
start with for parents. So one, which parents don't
love to hear but is important, is modeling, which is
thinking about your own use of social media, actually,
your own use of technology. And we know that plays
a pretty major role in how our kids use technology,
and so starting there, reflecting on your own use.

Speaker 5 (15:56):
And then the other two.

Speaker 6 (15:57):
Tips I would say to start with are communication, So
open communication, conversations about this, asking questions is really important,
getting curious about how kids are using these technologies, why, when,
why it's important to them, what they like about it.

Speaker 5 (16:15):
I think we often come.

Speaker 6 (16:16):
In with our own agenda, which of course makes sense
we're concerned about this as parents, but getting kids perspective
to is important. And then the last piece is setting
boundaries around it. You know, So the tech companies, of course,
I think have a role to play in terms of
making the platforms safer for kids, but I also think

(16:37):
that parents can play a major role in terms of
setting limits, figuring out what the right boundaries are around this,
and that might mean waiting a certain amount of time
to get them a smartphone, for example, or waiting until
a certain age to give them access to social media.
I think it can help parents to think about what
is kind of the minimum level of technology that

(17:01):
is necessary.

Speaker 5 (17:02):
For your child to meet the needs they have.

Speaker 6 (17:04):
So if they want to be communicating with friends, staying
in touch with their peers, maybe they can do that
with a you know, a flip phone, or maybe they
can do that with like a smart watch that's made
specifically for kids.

Speaker 4 (17:18):
Or, by the way, that's what I'm getting my kid,
to be popular in school. Your research, because you've got
a grant to do research, so we can actually have
a jumping off point and look at it and go,
here's what I found. This is not an opinion, here's
what I found. When your research is done, who do
you present it to that may actually look at it and go, oh, wow,
thank you for this. We have something tangible now where

(17:40):
we can do something actionable.

Speaker 6 (17:41):
Maybe. Yeah, I mean, I think there's a few different
ways that, you know, after you do a study and have findings,
you can share the findings.

Speaker 1 (17:51):
You know.

Speaker 6 (17:52):
One of the things that I'm trying to do now,
as you mentioned, is trying to get the information directly
to people who can use it quickly, like parents,
educators in schools. But obviously there's some amount that can
get to policymakers, like we've talked about. Some of this
is happening over social media.

Speaker 4 (18:10):
Yeah, well, congratulations on the grant, you know. And should
we have panels of, like, twelve year olds who are
actually trying to get to the solutions rather than adults who
didn't grow up with social media and don't understand it.

Speaker 5 (18:21):
Yeah, yeah, it's funny.

Speaker 6 (18:23):
Actually, I think that we do definitely forget sometimes to
actually involve kids in these conversations, the ones who actually
know the most about how they're using these platforms and
what's working.

Speaker 4 (18:34):
Right.

Speaker 6 (18:35):
I did this survey last year with Common Sense Media
where we went out and we surveyed about fifteen hundred
girls ages eleven to fifteen across the US, and one
of the things we asked them was what changes would
you make to social media to make it better? And
the things they came up with were really kind

(18:56):
of common.

Speaker 5 (18:56):
Sense, right.

Speaker 6 (18:57):
It was like, you know, stop letting adults contact me
who I don't want to be contacting me, or like
give me more age appropriate content to look at instead
of things that I don't want to be looking at,
you know. Like, they came up with things that
really make a lot of sense, and I think most
people would agree with. But how to actually then

(19:17):
put that into practice at the policy level is hard.

Speaker 1 (19:21):
Professor.

Speaker 3 (19:22):
If I may, this may not be something that
you can absolutely speak to, but on the
hope that you can, it would be helpful, I think,
to a lot of parents. I remember, my boys are
fully grown, but I remember when social media sort of
started to come into its real heyday as they

(19:46):
were becoming teenagers, and I had a very realistic talk
with my boys about pornography online, where I said
to them, look, guys, when two adult people
agree on something, there's nothing that's bad. Everything's
fine as long as those are the circumstances. But until you
understand sexuality for yourself, you might bump into some things

(20:09):
online that are really disturbing or really shocking or really
kind of blow your mind, that are not appropriate for
you at this point. So I'm going to beg you,
since I cannot follow your every move, try to avoid.

Speaker 1 (20:24):
Going to some of these things.

Speaker 3 (20:25):
I understand, I mean, I had Playboys as a teenage boy.
I understand you want to see what a naked girl
looks like, but there are some really very frightening things
out there. Same thing with this online bullying thing,
which really hadn't become, to my knowledge, much of a
thing when my boys were growing up, but seems to
be so prevalent now in those two specific areas of pornography.

(20:48):
When parents sit down to discuss how a child is
going to use their social media when the parent cannot
see what's going on. Is there a way to talk
to kids about pornography specifically, or ways that they might
open themselves to online bullying and what those kids can
do to help themselves.

Speaker 1 (21:09):
Are there? Do you have any insight on what a parent.

Speaker 3 (21:12):
Can say that have a real conversation with their child
about stuff.

Speaker 6 (21:16):
Yeah, it's a great question, and I think
you were one step ahead of the game by just.

Speaker 5 (21:22):
Having the conversation.

Speaker 1 (21:23):
Yeah.

Speaker 6 (21:24):
That's the first thing is that a lot of parents
are just nervous to have these conversations. They don't know
what to say, they don't want to get it wrong,
so they don't say anything. And we really do need
to be talking to kids about some of these challenges
and some of the dangers that are online.

Speaker 5 (21:39):
So, if there's one message.

Speaker 6 (21:40):
That I think is important to convey, I would say
it's one of non judgment, right, So trying to convey
the message that if you have questions, if you're concerned
about something, if you see something you feel like you
shouldn't have, I want you to feel like you can
come talk to me without me taking your

(22:01):
phone away, or without me jumping into something punitive. And
I think it's really common for parents, very understandably, where
their kid says they saw something or they bring something
up to kind of test the waters. Parents freak out
and get a little nervous about that, and then they
put in more restrictions, right, And what that shows the
kids is that they can't do that in the future, Right,

(22:21):
They can't. They can't come to you and talk about
it because it's going to have negative consequences. And so
I think as much as possible, even though it's hard,
really trying to approach these conversations with, you know, thanks
for coming to me.

Speaker 5 (22:35):
I'm glad you.

Speaker 6 (22:36):
I'm glad you brought this to me. Let's have a
conversation about what happened. What can you do next time?

Speaker 4 (22:42):
You know?

Speaker 5 (22:42):
What did you not understand? Can we can?

Speaker 6 (22:44):
Can we look it up together and figure it out
rather than, you know, having a more punitive response, right? Right.

Speaker 4 (22:52):
Wow, that makes sense, because we do go right
to punitive when we're panicking, right? Well, yeah.

Speaker 6 (22:57):
I mean, I think the other thing for
parents is knowing you don't have to, you know, quote,
get it right in every conversation, right? Like,
it's not going to be perfect.

Speaker 3 (23:06):
Where were you when I was raising children? I don't
have to always get it right. That would have been
a very good tip.

Speaker 4 (23:13):
I started the episode by saying, you know, these tech
companies can target kids with ads. I'm sure they know.
How is that possible that you can target a ten
year old with an ad of twenty things they're going
to buy and you know everything about them? However, you
can't block content from that same kid, when you know
who they are and what their age is. That to
me tells me if they wanted to, if there was

(23:36):
an incentive that they could.

Speaker 6 (23:39):
Yeah, there's also what the algorithm is solving for, right like,
right now we don't know, but it seems like what's
probably mostly being solved for.

Speaker 5 (23:47):
Is engagement and time.

Speaker 6 (23:49):
So whatever is making a kid or an adult stay
on there longer, engage more with the content, that's the
thing that's going to show up first in their algorithm,
and show up most frequently, and do the best
in terms of how popular it is on the platform.
But there are other things you could solve for,
right? Like, there are other things you could be prioritizing.

Speaker 4 (24:08):
Man, it'd be wonderful if they could. We had a guest
on whose suggestion was, you know, replacing hate and fear;
there's got to be another more powerful emotion that can release
dopamine, that we can replace them with.

Speaker 3 (24:20):
I told you I changed my Instagram algorithm in six weeks.

Speaker 1 (24:24):
It was, and it was not hard to do. Well,
it was very studious.

Speaker 3 (24:28):
So as I was scrolling, if there was anything in
the first second it came up, if it looked like
it was trying to give me the kind of news
and information that would anger me or frighten me,
I would flip by.

Speaker 1 (24:46):
So there was no two seconds of eyeballing it.

Speaker 3 (24:49):
Anything that was uplifting, entertaining, humane, charitable, beautiful, I started
hearting it, I just started liking it, and within six
weeks, I would say, the vast majority of my Instagram
feed is really kind of lovely things.

Speaker 4 (25:06):
Well, thanks for coming on, Dr. Nesi, a pleasure.

Speaker 1 (25:10):
Good luck with your work. We look forward to.

Speaker 4 (25:13):
Why do we need it? Well, there you go, there
you go.

Speaker 2 (25:26):
What?

Speaker 1 (25:27):
Well?

Speaker 3 (25:28):
Yeah, all right. So what we found out is, yes,
there is a problem. Yes, companies may or may not
be working on it. I don't know. No one has
a definitive answer on how Australia is going to pull
off this ban, or if it's there, or is it
just, this could be a device to make the companies we're.

Speaker 4 (25:45):
Talking, I thought. No, they have to be at
least as smart as we are and know, how are
we going to do this? So maybe it's just, we're
not going to sit back and do this, we're gonna
start fining you guys. So if you don't incentivize and
give us a reason, we're going to do this dumb ban.
We don't know how to do it yet, but we'll figure it out.
We'll figure it out. We're going to threaten you exactly

(26:05):
the year we're going.

Speaker 1 (26:06):
To threaten you.

Speaker 3 (26:07):
So yeah, it's interesting. Well, listen, I think this is
one of the great challenges of our time. Seriously, I
think the Internet and social media, and we didn't even
get into misinformation and disinformation.

Speaker 4 (26:19):
Which is so dangerous, or how do you do that?
How do you do that if it's my opinion and
it's wrong, and how do you know when it's opinion?

Speaker 3 (26:24):
I have a game for you, but I want to
be able to include David. So, David Google, come
on and tell us

Speaker 1 (26:28):
What we learned, what we missed, what we should know,
what we don't.

Speaker 7 (26:31):
There's actually a lot to get into.

Speaker 4 (26:33):
A couple of mistakes.

Speaker 8 (26:34):
Chemtrails are not banned in Arizona, they're banned in Tennessee.
That's what I meant to say, right. And actually,
one clarification: when we were speaking about pornography and that
button that comes up saying I am eighteen, there actually
are a few states that have requirements in place that

(26:56):
you have.

Speaker 4 (26:57):
To do some rigor too, like what what's additional?

Speaker 8 (27:01):
The states where it currently exists are Tennessee, Louisiana, Texas, Utah, Indiana,
and Montana. Now, where I live in Florida, there actually
is new legislation that has

Speaker 9 (27:12):
Passed going to go into effect in January of twenty
twenty five, where they perhaps might be requiring you to
give over your driver's license number.

Speaker 3 (27:24):
Really? Indeed. So that'll cut way down on users.

Speaker 8 (27:28):
There's a lot of legislation fighting back against this over
free speech concerns, and there is actually a Supreme Court
case that's supposed to be heard in twenty twenty five
on this very subject.

Speaker 1 (27:39):
You know, that's interesting.

Speaker 3 (27:40):
I don't know how I feel about that, because I
do understand the free speech part, but I also think
protecting minors is vital.

Speaker 4 (27:47):
You need a certain age to drive, you need a certain
age to buy liquor, you need a certain age to
access certain things. So requiring certain
information, that wouldn't be bad.

Speaker 3 (27:56):
Could I give them my license and say it's for research
purposes only?

Speaker 4 (28:00):
By the way, just as
a recall, when you said to our guest, as long
as it's two consensual people, there's nothing bad. Oh yeah,
even with two consensual people, there's stuff going on, you
know what you're... well, that's... oh, absolutely, even... yeah, forget
two consensual people.

Speaker 1 (28:18):
There are just things I don't want my... there are
things I don't want to... The Aristocrats?

Speaker 4 (28:24):
Yes, what else, David?

Speaker 1 (28:26):
Anything?

Speaker 3 (28:28):
Yeah?

Speaker 8 (28:28):
Another thing I wanted to mention is what phone should
people give to their kids, whether it should be a flip.

Speaker 1 (28:33):
Phone and not a smartphone.

Speaker 8 (28:35):
And obviously your child will be mocked, right? So
who wants... well, being able to flip it open is fun.
But there are phones on the market that use smartphone technology,
so your child is not going
to be outed as having a weird-looking phone. It
looks like your traditional smartphone, but has built into the

(28:56):
core system safety features that block many things and then
give parents automatic alerts on certain activity.
That's great.

Speaker 4 (29:06):
I bought my kids, what, the AARP phone for seniors.
It's got a big screen, "I've fallen"...

Speaker 3 (29:13):
Big numbers, "I've fallen and I can't get up," and a direct line
to walk-in tubs, exactly.

Speaker 2 (29:24):
Wow.

Speaker 3 (29:24):
All right. Wow. Well, here's a little game for
both of you, because when we go
on social media, we're going to websites. And I have
researched some pretty crazy websites and what they do. So
we're going to play a game. I'm going to name
a website and what it does; you tell me if it's real.

Speaker 1 (29:44):
Ready.

Speaker 3 (29:47):
There's a website called Pointer Pointer which shows photos of
people pointing, and they will always be pointing at where
your cursor pointer is on the screen.

Speaker 4 (29:56):
That's got to be real.

Speaker 1 (29:57):
David? Real. That it is, absolutely real. Pointer Pointer,
go on there now. Eel Slap allows

Speaker 3 (30:04):
You to post a picture of someone on the screen
and digitally slap them with an eel by moving the cursor.

Speaker 1 (30:10):
It is absolutely real. It is absolutely real.

Speaker 3 (30:14):
Bodily Fluidily features a close-up photo of some sort of
liquid stain and invites the user to guess if
it is a bodily fluid or some other kind.

Speaker 8 (30:26):
David? I would say not only is it real,
it's sponsored by Doctor Pimple Popper.

Speaker 1 (30:31):
Wow. I made that one up. That is false.

Speaker 4 (30:33):
Really? Well, we should make that, don't you think, David? David,
make a note.

Speaker 3 (30:37):
Do Nothing for Two Minutes offers static photos and video
and challenges you to do nothing. Don't touch your cursor,
don't look away, don't click on or off. If you
move the mouse, it kicks you off automatically. David? I'm
gonna say it's false.

Speaker 1 (30:52):
It is real. Do Nothing for Two Minutes is a
real website.

Speaker 3 (30:55):
G String: a solid vertical line appears on your screen.
If you run your cursor over it, it produces the
note G on a different

Speaker 1 (31:03):
Instrument each time.

Speaker 3 (31:06):
That has to be real. David? That's strange... real. I
made it up.

Speaker 1 (31:11):
That's completely... wow. On the Other Hand is the name
of the site.

Speaker 3 (31:15):
On the Other Hand challenges you to draw simple figures
with your mouse using the wrong hand.

Speaker 1 (31:21):
Points are scored for accuracy.

Speaker 4 (31:23):
I mean, it's got... he's very creative.

Speaker 1 (31:28):
False. I made that one up.

Speaker 4 (31:29):
Really? Yeah. Hold on, I hate
to say this. I love Seinfeld, and I love that you got a Tony,
but I think we just found your niche.

Speaker 3 (31:40):
We made a thing! I got four more
for you; we'll make a fortune with these. The Most
Seconds is the name of the website. Everyone who logs
on adds to a timer that registers the total amount
of time that people have stayed

Speaker 1 (31:54):
On the site.

Speaker 3 (31:54):
So you log on, and however much time you watch it,
it adds to the counter.

Speaker 1 (32:00):
True or false.

Speaker 3 (32:00):
True. True, that is absolutely true, and currently the aggregate
is close to five hundred years' worth of seconds that
people have stood and watched that thing. Here's the website:
Has the Large Hadron Collider Destroyed the World Yet? You
click on it, and all you see is the word "nope."

Speaker 4 (32:24):
Well that has to be real too, because I don't
think you would have made that one up.

Speaker 1 (32:27):
That is absolutely true, absolutely true. That's a website. Doppelganger.

Speaker 3 (32:32):
Upload a picture of your face, and the app immediately
pulls up a celebrity it thinks looks like you, then splits
the faces in half for a side-by-side.

Speaker 1 (32:41):
I've used that.

Speaker 3 (32:41):
I think I've used that. Well, if you have, I
am owed royalties, because I just made it up.

Speaker 1 (32:46):
Yeah.

Speaker 3 (32:47):
And the last one, Rock Paper Scissors, allows you to
practice your skill at the game.

Speaker 1 (32:55):
Absolutely real, Rock Paper Scissors. Zero for zero.

Speaker 3 (32:59):
I've got to tell you, it's fun coming up with
things that are so crazy they could be real.

Speaker 4 (33:04):
So let's say, as we say goodbye, if there's a
developer out there who wants to help us with G String...
Bodily Fluid, I think, is the one.

Speaker 1 (33:14):
And, uh, Doppelganger.

Speaker 3 (33:16):
I am willing to... pretty good, pretty good. All right, so,
ladies and gentlemen, that is our program. For those of
you listening in Australia, g'day, mate, and good luck
with your ban. Thank you. Oh, I'm on to you,
I get what you're doing. Yeah, right, you and your
bowie knives. Yeah, thank you, Professor Jacqueline Nesi, and, uh...

Speaker 4 (33:34):
Thank you, Laurie, thank you. Uh, look, can we go
out with Laurie's music?

Speaker 2 (33:42):
You're wrong.

Speaker 4 (33:44):
Let's have one thing understood.

Speaker 8 (33:46):
Whatever it is, I am against it, and even when
you've changed it or condensed it, I'm

Speaker 1 (33:52):
Against it. Thank you. Whatever it is, I'm against it.

Speaker 6 (34:01):
No really.

Speaker 7 (34:03):
As another episode.

Speaker 2 (34:04):
Of Really No Really comes to a close, I
know you're asking yourself: what is one of the most popular social
media apps out there today? Well, I'll close the book
on that topic in a moment, but first let's thank
our guest, Doctor Jacqueline Nesi. You can follow the doctor on
X, where she is at jacquelinenesi, or on her website,
jacquelinenesi dot com.

Speaker 7 (34:22):
Her substack is Techno Sapiens. Find all pertinent links.

Speaker 2 (34:26):
In our show notes. Now, our little show hangs out
on Instagram, TikTok, YouTube, and Threads at Really No Really Podcast.
And of course, you can share your thoughts and feedback
with us online at reallynoreally dot com. If you have
a really, some amazing fact or story that boggles your mind,
share it with us, and if we use it, we
will send you a little gift.

Speaker 7 (34:47):
Nothing life changing, obviously, but it's the thought that counts.

Speaker 2 (34:51):
Check out our full episodes on YouTube. Hit that subscribe
button and tap that bell so you're updated when we
release new videos and episodes, which we

Speaker 7 (34:58):
Do each Tuesday.

Speaker 2 (35:00):
So listen and follow us on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.

Speaker 7 (35:05):
And now the answer to the question.

Speaker 2 (35:07):
One of the most popular social media apps out there today? Well,
at the end of twenty twenty four, the top thirty-five
were listed, and a lot of familiar and less
familiar names were among them. Among the less familiar are
several things I'm about to mispronounce: number thirty-four, VK,
or VKontakte; number thirty, Josh; number twenty-six,
Xiaohongshu, which translates to red note; number twenty, Bilibili;

(35:30):
number sixteen, Sina Weibo; number twelve, Baidu; and number
eleven, Douyin.

Speaker 7 (35:34):
Among that group, the least popular, VK at number thirty

Speaker 2 (35:37):
Four, still has eighty million monthly active users, and the
most popular, Douyin at number eleven, has seven hundred
and fifty-two million monthly users. Of the more familiar,
we have Rumble at number thirty-five, Threads at twenty-nine,
Vimeo at twenty-four, LinkedIn at twenty-two, X comes
in at number fifteen, TikTok is number five, Instagram is four,
WhatsApp is three, YouTube is two, and the number

(36:00):
one

Speaker 7 (36:00):
Is good old Facebook.

Speaker 2 (36:01):
With three point zero six billion active monthly users. That
means if one half of one percent of Facebook
users listened to our show, I could probably afford to
buy some stock in Rumble, or maybe even Xiaohongshu,
and then I could use that money to learn how
to pronounce Xiaohongshu.

Speaker 7 (36:17):
Well, a guy can dream.

Speaker 2 (36:19):
No really! Really No Really is a production of iHeartRadio and
Blase Entertainment.

Hosts And Creators

Peter Tilden

Jason Alexander
