
October 14, 2024 72 mins

Travis Hawley, with a formidable background in military intelligence and social media marketing, brings an unparalleled perspective to our latest episode, challenging both our reliance on and our understanding of social media's pervasive influence. We promise you'll gain valuable insights into the dual nature of these platforms, as we unpack the empowerment they offer alongside the hurdles they create. From managing our children's screen time to the staggering mental health impacts on teens and women, our conversation brings to light the significant role algorithms and advertising play in crafting this digital landscape.

Our discussion takes us into the thorny realm of media bias and the ever-evolving perception of conflict, spotlighting the complex dynamics of the ongoing situation in Israel and the power of misinformation. With Travis's unique insights, we delve into personal stories of navigating this noise, drawing parallels to his experiences in military intelligence, and contemplating the profound implications on public opinion. We examine the necessity of engaging with diverse news sources to uncover the truth, echoing the significance of military all-source intelligence in our digital age.

As we wrap up, we lighten the mood with personal pursuits for balance and connection, underlining the importance of mental wellness amidst the digital chaos. Humor and escapism become focal points, with tales of Jiu Jitsu, cultural matchmaking, and the search for new hobbies to offset the stress of a world saturated with news. Our episode closes with a blend of gratitude and humor, offering a candid glimpse into our lives. Join us for a thought-provoking journey that beckons you to reassess your digital habits and worldview.

Your hosts: @lynnhazan_ and @tonydoesknow

follow us on social @ltkpod!


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Hey, welcome to the Lynn and Tony Know podcast.
I'm your host, Lynn.

Speaker 2 (00:04):
And I'm Tony.
We are both wellness coaches and married with kids.

Speaker 1 (00:07):
Join us as we talk about all things health, wellness, relationships, life hacks, parenting and everything in between, unfiltered.
Thanks for listening, and let's get into it.
Welcome back, welcome back.
How's it going, babe?

Speaker 2 (00:22):
Well, we're all depressed.
We all follow the wrong leads on the internet and, um, we have an expert here to help us along with why we're all... no wait, I'm not depressed. You're not depressed.
Okay, because there's good news. Well, what's the...?

Speaker 1 (00:37):
Season two of Nobody Wants This is gonna... they announced it today.
They're gonna release it. Wonderful.
So we're gonna get to see that romance play out, and we'll probably just pivot our entire platform to that.
We're gonna just... I think we're gonna change our whole podcast to just us, like, live watching the

(00:57):
episodes.

Speaker 2 (00:58):
Okay, I think... I think, yeah, I think that's what's gonna happen.
Until then, we'll keep providing as much value as we possibly can, and today, I feel like... we have a very special guest.

Speaker 1 (01:08):
The past 10 years of my life have been spent on social media, and I have a very interesting relationship with it.
It's like a love-hate, because obviously I love it in the sense that it's allowed me to work from home and cultivate a marketing agency and a business that allows me to be there for my kids and also be creative.

(01:29):
But also it's very toxic, and there are days that I want to throw my phones (yes, I have multiple phones), I want to throw them in the ocean and live a trad wife life and make sourdough bread and be barefoot and never be on Instagram ever again.
So I'm really excited for our next guest.

(01:49):
We're going to dive into this topic.
You want to?

Speaker 2 (01:53):
Yeah, listen, we have a seasoned professional with a unique career spanning both military intelligence and social media.
He moved back and forth several times between military intelligence and social media marketing, where he achieved remarkable success, amassing over 1 million followers across various accounts and collaborating with top-tier influencers, celebrities and brands globally.
Despite his accomplishments, he has grown increasingly critical

(02:16):
of social media's role in society.
His insights into the platforms' use as a tool for misinformation and their potential to disrupt societal norms stem from his deep understanding of information warfare tactics.
On the podcast, we're going to talk a little bit about his journey and explore the profound effects of social media from a mental health and national security perspective, emphasizing its double-edged impact on the modern world.

(02:39):
With that, welcome Mr. Travis Hawley.

Speaker 3 (02:43):
Hey, thanks guys.
Well, that's heavy.
Hopefully we can live up to that ChatGPT-generated boost to the resume.
But yeah, thanks for having me.
I think this is going to be an important conversation.

(03:06):
I feel that, like, everyone is having this conversation at this point now; it's kind of already sunken into, you know, a societal-level type of conversation as it pertains to mental health and the intersection of technology in general.

Speaker 1 (03:17):
Isn't it funny that we're social media people and we kind of poo-poo on social media at the same time, you know?

Speaker 3 (03:23):
It's like... I think it's the best people to do it.
Yeah, we know intimately the value and the dark side.
I think everyone does.
But we know the ins and outs, the front end, the back end.
We understand the value propositions, the mechanisms, gamification, the addictive technology, the casino features.

(03:44):
We get... we get it.
In fact, that's probably what makes us... maybe, I don't know, I haven't seen any studies... maybe it makes us less happy. 100%.

Speaker 2 (03:55):
Yeah, as somebody that has personally benefited very little from social media myself... it's simply, you know, it's a tool, I guess, for me.
I've never monetized myself on social media, I've never amassed a following, I've never relied on it for any part of my life other than just, like, oh, I wonder what's happening.
So it's interesting, it is interesting to me, as somebody

(04:18):
from that perspective, to hear people like you (I don't know who I'm pointing at), like Travis, like Lynn, saying, we've built entire careers around this thing, and here's also why it sucks a lot.

Speaker 1 (04:33):
And it's something that I'm struggling with now with my teen.
I have a... we have a 12-year-old, and she talks about how her friends are on social media, and I'm like, absolutely not, you cannot have social media until you're 16.
And it's a point of contention.
And I try to explain to her that it's unhealthy.
And I already noticed that she is addicted to her phone.
She is addicted to her screen, and recently she got a bad grade

(04:55):
and I was like, okay, no more phones.
And it seems like, in the past couple of days (I don't know if there's a correlation), she's in a much lighter mood.
She's just happier.
She's finding different ways to entertain herself, she's using her imagination. Like, on Rosh Hashanah we didn't use screens, and she was, you know, putting makeup on and just

(05:16):
doing other things that I have never seen her do, and I was like, this fucking device is killing... is hurting our kids, you know. And she calls me a hypocrite because I'm always on my phone. Yeah, she's very clever, and also I think part of the demeanor shift is the fact that she's in trouble and she's trying to get out of trouble.
That's also true.

Speaker 2 (05:36):
But... but it's about... the point is, it's a battle before we've even gotten to the point where she has social media. It's already a problem. So, Travis, I want to know, like, your story. Like, how did you get into the social media space?

Speaker 1 (05:46):
Like where, where did that happen?

Speaker 3 (05:49):
Oh man, um, so it was by accident.
You know, um, for sure, I can give you kind of the pivotal time.
So I went to Afghanistan in 2012, and I've always been, like, a power social media user.
I would say, kind of, you know, an introverted person who loves

(06:11):
technology.
So I've been using what I would call even, like, pre-social media, or, you know, kind of the initial type of websites: AOL, AIM, Messenger, MySpace, you know, before that.
So I've always been very drawn to it.
And then in the military, I used social media a lot because I

(06:32):
was away from home.
But in 2012, on my way to Afghanistan, I had to do pre-deployment training at a National Guard base in, like, Indiana or something, and we had, like, a couple of weeks of just a lot of downtime, by the way, and I just started a, at the time, a meme page on Facebook, and that was, like, my

(06:57):
first real start in being what we now would call, like, a creator or something, or an aggregator, um, influencer, or whatever.
Um, to me they're all dirty words now. But, um, the long story short with that is, I started curating and creating content on Facebook initially, and then in 2013, even more on

(07:19):
Instagram, after the acquisition of Instagram by Facebook, and I started building what you might call meme pages.
Now I have three of those, and that's where most of my following actually is: on kind of meme pages built around communities, really, or topical issues, and that's how I started

(07:40):
in 2012, 13.
And the long story short with that is, I had a friend (I'm from Silicon Valley, by the way, like, born and raised in the Bay Area), and one of my friends is a recruiter for all the tech companies.
Um, this is 2014, and, you know, I had only a few thousand followers or something, I don't even know.
I barely had anything.
And he basically was like, hey, it seems like you have a knack

(08:03):
for this, and I was out of the military at that time.
And he's like, hey, put together a resume.
You know, maybe I can get you a job in social media.
And I was truly so naive at that time, I was like, a job? Like, I truly didn't even think that people were doing social media for a living.

(08:23):
And he's like, oh yeah, I recruit people to do social media all the time.
And I was like, wow, I never even considered that.
And since... you can think, this is now 10 years ago... it was still fairly novel and fun and exciting and innovative.
I think those days are gone.
However, I got the job.
Don't ask me how, but I got an entry-level job at what is now

(08:46):
known as the highest-grossing mobile video game company in the world.
We actually moved into Facebook's old headquarters in Palo Alto.
So that was my start.
I got a lucky break, sorta, as an entry-level social media manager at a gaming company, and then that just skyrocketed my career, and I got to work with celebrities and

(09:07):
influencers and big brands, and it just went out of control at that point, which is amazing.
So I just kind of fell into it, stumbled into it, didn't even try, and, you know, right time, right place, had some of the skills to get an entry-level job, and then from there, you know.

Speaker 2 (09:29):
Did you work for Candy Crush?

Speaker 3 (09:32):
No, but they were one of our greatest competitors at the time, actually, yes.
So our competitors depended on the demographic; Candy Crush was more on the female side, so they weren't direct competitors.
They were competitors on the highest-grossing charts.
Our direct competitor was Supercell, a European company that does Clash of Clans, so it was a war game, so that was more

(09:55):
of our competitor.
They were the casual version of our war game, literally called Game of War.
So you probably saw those commercials everywhere.
We were just absolutely drowning the world in commercials at the time.
So, um, yeah. Fascinating.

Speaker 1 (10:13):
And at what point, when you were starting in the social media business, did you kind of realize, there's an issue here?

Speaker 3 (10:21):
Like, this is not healthy?
That's a great question.
I think it was gradual; I'm not sure if there was a point, but certainly I would say mid-2010s, right, '15, '16.
I think if we were to point to one area that a lot of people could agree on as well, where we felt a shift, it was the election,

(10:46):
the first election of Donald Trump versus Hillary Clinton.
So 2015 would be a fairly fair area to start when it comes to just the toxicity of traditional media content now being distributed on social media, and then the political nature of that, the divisive nature of that, and then also the

(11:11):
weaponization of social media, I would say, really was proven around that time as well, as it pertains to foreign actors using information warfare, misinformation, disinformation, malinformation to divide and to destabilize the

(11:34):
Western world.
And at the same time, though, this was about 10 years into social media, right? You could say around 2005, MySpace, to 2007, Facebook, so about maybe 10 years later, um, this is when the real weaponization started.
Um, and so, also, after 10 years of hardcore social media use, you could also...

(11:55):
there's plenty of studies on this... you could already start to see the trends of mental health declining, particularly with teens and women or girls. Interesting.

Speaker 2 (12:13):
Do those studies address why?
I mean, I can make some guesses, but what do the studies point to as to why women and girls are more susceptible?

Speaker 1 (12:16):
I mean, I could tell you. You're comparing... you're seeing women with six-pack abs two weeks after giving birth, like, what the fuck, you know? It's seeing people with more money and more stuff and Chanel bags, and doing, like, you know... you have influencers doing, like, shopping hauls, and meanwhile people can't afford the groceries.

(12:37):
It makes you feel like shit.
It makes you feel horrible, absolutely, and especially as girls in the high school age.
I can't even imagine. Like, I had a tough time being a girl in high school with no social media.
Imagine being a girl in high school with social media, with, you know, people making fun of you and, uh, not having enough followers, or this, you know, culture of comparison, of,

(13:00):
like numbers.
Yeah, and, you know, I'm a 41-year-old woman and, like... I don't like the word influencer anymore, but at a point in my career I was an influencer, and I would go to these, like, influencer-type events, and I would generally go to, like, meet people and network, and at some of these events, these girls would just compare numbers: oh, like,

(13:22):
how many followers do you have?
And, like, it just... it puts you in a box, and, like, nobody talked to each other and everybody was on their phones.
And yeah, I understood that it was a network, it was a, you know, influencer thing where we had to take pictures and take videos, and that was expected of us.
But there was like, no... these girls were not capable of having any sort of social interaction. Like, socially dumb.

(13:46):
And I was just like, ah, this is not for me, you know.
So that's right.
I mean, I think... yeah, those are.

Speaker 3 (13:50):
Those are definitely some of the reasons.
I think the last one is one that seems to be, at least from my research and watching documentaries on this, and other people who study this explicitly, also a big factor, at least for women and for teens: the cyberbullying.
So, you know, when we all grew up, right, you didn't take

(14:14):
school home with you.
Right? If anything, home could be some kind of refuge from bullying.
And now it follows.
It follows you home if you have social media.
And then, like you said, look, there are many dimensions to this, whether it's the comparison culture,

(14:35):
like you said, or the social currency culture, whether, you know, it's about followers and comments, likes, shares, all the different metrics, views, or it's actually bullying people as well.
Right, putting certain emojis in, you know, messages or comments.
The Social Network, that documentary that was

(15:00):
licensed by Netflix, is an incredible bird's-eye view of this phenomenon, and, you know, it touches upon not just what we're talking about as it relates to mental health and stuff, but it also gets into what Lynn and I know very, uh, very dearly.
I'm not sure if you do, Tony, how much you're into the marketing as much as the social side, but the mechanisms behind

(15:24):
advertising, and how advertising creates the incentives for social networks, and those incentives drive algorithms, and those algorithms drive the type of content we're going to be exposed to at scale, which influences our social media.
It is all connected, and it all comes from the incentives, and it

(15:46):
all stems from how these companies make money, and that's another can of worms we can get into if you want or don't want to, but really that's what is driving it.
If you look at the actual root cause, it is the business model.

Speaker 2 (16:00):
Yeah, I mean, I watched the Social Network documentary, and I have a marketing degree, so a lot of what was brought up in that, I mean, it made sense. Like, in my marketing brain, part of me was like, that's genius.
And then, obviously, the reality of what takes place after all that is like, that's horrible.

(16:21):
But I understand that, at a point when all of this was evolving, simply, people were trying to build something that kept people's attention, because that's what's being traded on social media, right. And at some point... what I'm curious about is, now that there's at least some

(16:45):
awareness around how damaging it is, there's enough information out there and studies that have been put forward, at what point do the people that are incentivizing this behavior go, oh, I'm not just, like, building business.
Now what I'm actually doing is quite damaging.

(17:06):
I'm wondering where that switch starts to flip, when the reality is we have enough data at this point to understand that what they're doing is quite dangerous.
And how do we get over the hump of that?

Speaker 3 (17:18):
Well, first of all, it's not in their best interest to admit that, to believe that or to act on that, because they're going against the interest of a publicly traded company that is held to a standard of infinite growth.

Speaker 1 (17:32):
Yeah, they don't care.
They don't care about us.

Speaker 3 (17:35):
Even if they did care... if we remove our intent or morality or anything, just remove that.
Even if they did, it is not in the best interest of the stakeholders of the company, because it would drive revenue down, guaranteed, by reducing time on app, time on site, therefore advertising revenue.
So it's against all interests.

(17:55):
So that brings us to reality, a sober reality, where there's only a couple other options to address this, and they're not the greatest options.
So, for example, first of all, there's education.
I'm less enthusiastic about that, not that it's not important, but education is less strong than our instincts, and

(18:19):
our instincts are going to be driven by addictive gamification methods.
Right, so you can educate someone all you want.
I mean, I know all about health and diet.
I'm still overweight.
Like, education is not enough at scale.
Then there's regulation, which you always want to be careful with, because you certainly don't want

(18:47):
to create an even more bureaucratic, bloated government, and giving government power isn't always the solution either.
Regulators can also be co-opted and captured.
However, if you look at other things, like cigarettes, look at things that are proven to be harmful if used at scale or

(19:12):
habitually: cigarettes, certain kinds of foods, even dangerous things like hazmat, oil, water, like water contamination. Doesn't matter what it is, they're all regulated.
So why isn't social media? Like, even to a libertarian, or someone who doesn't want big government

(19:32):
, you would admit there is a time and a place for regulation when it comes to public safety and the health and well-being of the population.
So regulation, I think, is a big one.
We need policies, we need laws, and those have started.
I like that.
Of course, it's a slippery slope, to use that expression, because if it's not enforced, then it doesn't matter.

(19:55):
It can be overdone, it can unintentionally cross the line on our constitutional rights, like free speech and things.
So it's a messy, it's a very messy area.
I happen to think, well, we got to try something.
Even if we're just slowly chipping away, we got to do

(20:16):
something.
I think it's... I mean, to use a word actually accurately, the mental health-related aspects of technology, the internet, and social media as a subset, it is a pandemic.

Speaker 1 (20:31):
Yeah, wow, okay.
I want to talk about it a little bit, because it's a lot to process, but I want to talk about the last year, and... I know that you're a member of the tribe.

Speaker 3 (20:44):
Yeah.

Speaker 1 (20:45):
Um, you were in Israel on October 7th.

Speaker 3 (20:48):
I was.

Speaker 1 (20:50):
How... what was that like? What happened when you were there?

Speaker 3 (20:54):
Oh man, you know what's wild.
If you look at my feed, my Instagram feed, like anyone's, let's say, pre-October 7th, it changed.
But if you look at mine pre-October 7th... I was in Israel, I was going to stay there for three months, and this was like week five or something.
I don't know.
The couple weeks before, all of my videos are talking about

(21:17):
exactly what we're talking about right now: mental health on social media, gamification, propaganda, misinformation.
You know, social media as it pertains to mental health.
That was all my content, like, legitimately.
And then October 7th happens, and I basically dive into the darkest depths of the internet.

(21:41):
I mean, we all did, right, but not just digitally.
But, you know, being in Israel was... it was super scary.
I'm not going to lie, like, I don't want to... I'm not here to sound tough.
I didn't grow up in Israel, where some or many people are

(22:05):
sort of used to it, um, which is sad, a sad reality.
I'm not used to that. Like, being under, you know, the Tzeva Adom, like the red alerts, and, um, the air-raid sirens and things like that.
On October 7th, I actually stayed up that night till like 3 am... or October 6th.
I stayed up till like 3 am or something, I don't know.
I went out, or... I don't even remember, but I was up late, so

(22:28):
I slept, and I woke up, you know, when the sirens went off.
Whenever that was... it's like 7 am, 6:30, I don't... you know, I don't know exactly.
I was in South Tel Aviv, and it was actually so faint that I just went back to sleep.
I didn't even, like... it wasn't loud. Like, for whatever reason,

(22:49):
it wasn't loud enough that, like, I was questioning it.
And is that because I'm so not used to being there during those air-raid sirens?
I honestly didn't even think it was possible.
So I went back to sleep.
I don't know how long I slept.

(23:09):
Then I wake up in a couple hours, and then it's like chaos.
I'm just hearing everyone outdoors, like... pick up my phone and turn on the TV, and I see what's going on, and, um, yeah, I'll be honest, I was, um... it was kind of a little PTSD, because

(23:30):
while I never experienced that in Israel, I did experience that in Afghanistan.
So in Afghanistan, we have this on base, we have the air-raid sirens, and you run into a concrete shelter, or if you're indoors, you stay indoors, and, you know...
So I actually have some of that PTSD from doing that in Afghanistan.

(23:50):
But this was more scary, because, now that I've reflected on it, and even a few weeks after October 7th I reflected on it, I was in more fear, because I didn't sign up to go to war like I did in Afghanistan and be in a war zone and be on a military

(24:11):
base.
I'm in an apartment in South Tel Aviv, so your sense of safety is actually... it's so flipped upside down that the fear level I had was much higher than even in Afghanistan, which is crazy.
Now, given I wasn't infantry in Afghanistan, I can't speak to that.
I'm sure that's different.

(24:32):
So, yeah, and then it became not just an existential threat physically, but then, you know, you go online and you're seeing all of the death, right, and the reality, and it's nonstop, of course, and it's your entire... what I call your digital diet.

(24:58):
So then everything I'm consuming digitally is a nightmare, and then, when I'm not on my phone, I'm truly on, like, eggshells, or whatever you want to call it, pins and needles, at any given time, just expecting there's going to be an air-raid siren. Yeah, that's intense.

Speaker 1 (25:21):
That sounds intense.
Did you come back to the States, like, a few days after, or did you stay there?

Speaker 3 (25:30):
Yeah, I did, um.
I intended to, um, but some people convinced me not to. Um, so I left like five days later or something like that.
Um, and partly it was... I started making all this content.

(25:52):
I wasn't some influencer or something, I still am not, but I started, because of my background, also, in military intelligence, and my audience at the time was not Israeli and Jewish, okay? So I'm talking to, like, people who don't really have anything... they don't know anything about this conflict, really.

(26:14):
So I started making educational content, talking about, you know, Hamas and Hezbollah and Iran and just all of the nuances of it from a national security perspective.
Just started making educational content.
And I'm not Israeli, but there's a Hebrew word, they call that hasbara, which is like an advocacy type of person. So, like,

(26:36):
kind of unintentionally, organically and entirely unofficially, kind of educating people on social media.
Some people were like, thank you, but you could... you can do that at home.
You don't need to be in Israel to do that.
So, you're an American citizen, like, why don't you just go home?
You could do this content maybe better, as you're not having to always think about going to the bomb

(26:58):
shelter.
So that, and also, speaking with some people in the security apparatus in Israel... you know, security people were actually more fearful of Hezbollah at the time than we were of Hamas.
I explicitly put that in all my videos too.

(27:20):
Like, yeah, what Hamas did was obviously a nightmare of all nightmares, but after October 7th, the fear was more about, if Hezbollah decides to go, push all the chips in and go all out, this is going to... I hate to say it, like, no offense to anyone, all respect... it would make October 7th possibly be

(27:44):
minuscule in comparison, had Hezbollah tried or surprised us with all its might.
So, with that looming threat, and thinking about... I can do, you know, this digital education content and reporting probably better when I'm not under duress or whatever.
So I decided, okay, ultimately, to go home a few days later.

Speaker 1 (28:09):
What was your reaction going online? I think a lot of Jews expected support, and people... you know, the world being horrified. And what was your reaction seeing the pro-Palestinian protests, like, literally people on the streets?
You know, people screaming "gas the Jews," and just, like, this, you

(28:32):
know, all over social media, people celebrating October 7th.
What was your reaction?
Were you surprised?

Speaker 3 (28:40):
Yeah, that's actually a great question, because when I got home I made some... I was back in Israel, and I felt more safe in Israel under the rockets than I do now

(29:06):
at home in the US, considering what I was seeing in the streets, and I mean that, that's how I felt at the time.
So, yeah, I was shocked.
I mean, how could you not be shocked?
Certainly there are Jews and Israelis and allies, if you want to call it that, who have been much more involved in

(29:28):
countering the BDS movement and all this anti-Israel stuff for a long time.
They were probably less surprised than me, but I was definitely surprised, um.
I certainly didn't, um, expect widespread support over the long term.

(29:48):
But, again, a lot of things are not so shocking with hindsight, the benefit of hindsight, because, if we think about the climate that was created in part by Kanye West, the year or two leading up to October 7th, anti-Semitism had

(30:09):
already started bubbling up and becoming more accepted on social media.
So, from an intelligence perspective, now, with the benefit of hindsight, you could see this threat was looming, and this threat would be in the information space, social media and the internet, where seeds were being planted for quite a

(30:32):
long time, of doubt and insult and hate and accusations and misinformation and disinformation, so that, when there was such a flashpoint event that could ignite all of those, I suppose in retrospect it's not entirely surprising.
Yet we're still baffled to this day, I think, no matter what, by the

(30:55):
widespread acceptance of anti-Semitic viewpoints and the normalization of that hate speech online and in person,

(31:26):
and then, of course, the protesting, and I don't even know if I'd call it protesting, but protesting at Jewish businesses, to this day, and synagogues.
So yeah, it definitely made things so much worse.
And again, I can keep going forever.
Feel free to interrupt anytime, but one of the things... a

(31:51):
really interesting point that someone made recently, which I happen to agree with, and it's okay if no one else agrees, but all of Israel's wars previously have been very short.
We're talking Six-Day War, couple weeks, few weeks, all of

(32:14):
them.
And this person made... it was really Naftali Bennett who made it. We listened to this podcast too.

Speaker 1 (32:24):
I was like, I feel like I heard this recently. Yeah, and it's like a slow drip.

Speaker 3 (32:30):
Yeah, so, not to recap the entire podcast with him and Jordan, but I'm not surprised that, with a protracted, as he said, war, we're in this really, really terrible dark place, because there's just... there's been no point for any of us to breathe, so we've been under this dark shadow of just

(32:53):
war and hate speech for so long.
I'm still, absolutely... I don't think any of us are really out of it.
You can't be.
As long as the war is going on, anti-Semitism will be at its peak, but I also believe, and I'm certain, that it will significantly drop whenever the war does end.

Speaker 1 (33:15):
I hope so, because it's been a very long year.

Speaker 2 (33:20):
Yeah, go ahead. Two things stuck out to me, and one was: do you think that, based on Israel's track record of war and how short it is, Hamas was also counting on it being shorter?
Because I feel like they weren't prepared for a year-long

(33:41):
war either.
Even the way that the propaganda has been put out there and adopted, it wasn't prepared for a year-long... like, propping up that story for a year either.

Speaker 1 (33:56):
And I'm wondering if you think... I don't know if I agree with that, actually.
I think it's about... it's not about land. It's an extremist, you know, religious terrorist group, and they want to expand the ideology, and the longer it goes, the more people are continuing to... You know, like, people in America are turning to, like, white,

(34:18):
blonde... shit, it's like crazy.
They're, like, wearing keffiyehs, and, you know, going to these protests, and they just want to keep themselves, like, you know, relevant.
It's like J.Lo just trying to stay relevant.

Speaker 3 (34:33):
Well, I'm not sure anyone has that
answer.
They may or may not.
If anyone, it might be the IDF,
but I would argue that Hamas
didn't expect a long war, but
that they're prepared for one.
And the evidence I would

(34:55):
give you is, I also think, and I think a lot of people think this,
that October 7th was probably more spectacular than they would
have even dreamed of as well.
But the tunnels are built for a reason: to hide out for long
periods of time, to have a city within a city.

(35:19):
Hamas is definitely playing the long game.
In fact, all of our adversaries play the long game.
"Our" meaning, I would say, the West, us, Israel, Europe,
doesn't matter.
They play the long game,
whether it's the Islamist
terrorists, whether it's Russia, Iran, North Korea,
China.
I also would say this, though: Sinwar, if you know some things

(35:43):
about him, was in prison in Israel, learned Hebrew,
understands the system, understands the politics,
understands the power of propaganda, disinformation and
all this stuff.

(36:06):
I think to some degree he knows that any type of
conflict wears over the long term, and this is a
multi-decade preparation, what we call the preparation of the
battlefield: propaganda, misinformation, BDS movements,

(36:28):
organizations getting into academia, getting into
government.
This has been going on for decades.
So they're prepared for, let's say, public opinion in the
long run to turn on Israel.
And that's why I happen to agree with Bennett. Had this
been a couple weeks or something, whatever... and I'm not

(36:48):
criticizing anyone, but the reality is, the longer it goes
on, the worse it is for Jews in Israel now, in the last
year.

Speaker 1 (36:59):
Let's talk specifically about the internet
and social media.
Um, I think for me the biggest surprise... it wasn't surprising
that, you know, people hate us. It's not new.
But to me what's surprising was mainstream media just
completely denying facts and just spinning everything.
And you know, I never thought of myself as, like,

(37:23):
the type of person who would say I don't trust the media, or I
can't read the New York Times, or, like, oh, I'm gonna watch Fox.
Like, I find myself watching Fox News and I'm like, what?
Was this also a shock to you in, like, the last year?
And how can somebody who's listening,
you know, how do they know what to trust?
Right?
How do they know where to get their information?

(37:45):
How do they know if it's accurate or not?
Like, how do you determine that?

Speaker 3 (37:50):
I knew this was going to come up.
Yeah, this one's hard. Um, so, uh, I guess it's not entirely
shocking, because this is actually where there's some
irony in the antisemitism, or, let's say, some holes.
Because they say Jews run the media.
So why isn't the media disproportionately in favor of

(38:13):
Jews, including the Jewish state?
You can't square that circle.
So that's just one example of something that is absolutely,
demonstrably hypocritical for them to say.
But the media in general?
Again, I try to stay away from politics, but the facts are, the

(38:35):
mainstream media tends to lean
left and progressive, especially in the last 10, 15 years.
The progressive standpoint on a lot of different topics,
including Israel, is this whole victim versus victimizer,

(38:58):
occupied versus occupier, all this.
This is their dichotomy of the world.
So when Israelis are painted as white and they're painted as a
colonial entity displacing darker-skinned people, of course,
that shows their lack of education about darker-skinned

Speaker 1 (39:20):
Jewish people.
But that's the thing.
These are journalists.
How do they not?
You know what I mean.
How do they not know?

Speaker 3 (39:28):
It's no different than social media.
Once you have an audience, you're going to pander to that
audience, and that goes for social too, even people we all know
and, to a degree, maybe even ourselves.
There is a muscle memory, when you develop an audience, to

(39:50):
make sure they get what they want.
And what they want is confirmation. Confirmation
bias.
So it doesn't matter that people are professional.
I hate to say it.
The problem is the need to feed the beast that is their
followers and their revenue generation.

(40:11):
If you run counter to your audience, what happens?
You lose your audience and then you lose your money.
And I'm being optimistic here:
I don't think most people in media, or influencers, or
creators, all this stuff, are evil or greedy.
I think a lot of this is done unconsciously, and so the left

(40:32):
media certainly isn't going to entirely take the opposite
point of view of their established audience.
That's not good for business.

Speaker 1 (40:45):
So I think that's one point. Somebody, like an
editor and a writer, is sitting in a fucking room and they're
like, no, that sounds too good for the Jews, you know, or too
good for the...
That's what I can picture: like, OK,
what do you want to title that?
What do you want to put as the headline?

Speaker 3 (40:58):
No, I don't think that's... again,
I could be optimistic, but some of the headlines are wild.

Speaker 1 (41:05):
I wish I had some examples, but some of them
are, like, nuts.

Speaker 3 (41:09):
Yeah, uh, me and Stella talk about this a lot, um,
and when I send her stuff, she sends me stuff as well, and
especially from AP there's certain...
So to answer your second question, by the way, of how we
can try to find truth and things of that nature: there are
established media outlets that, in general, historically, are

(41:32):
credible for the most part.
There's no such thing as a perfect outlet. Credible for the
most part, honest for the most part, respected.
On Israel there's an exception, because this is where I try to

(41:55):
contend with people we all know who kind of like to say, don't
listen to mainstream media anymore.
Well, here's the thing: throwing the baby out with the bathwater,
you know, using that expression, isn't the solution
either, because there are certain pockets where you can
see massive bias, whether it's intentional or not.

(42:16):
So here are some, I don't know, approaches to help decide
this.
So, first of all, actually, another great person.
I will take some learnings from someone who I may not agree with
politically, but philosophically: Yuval Harari.

(42:38):
He talks about how truth is... he didn't say this word, but
it's laborious. It takes work to find the truth, right?
That takes effort, takes research, it takes education, it
takes strategy.
It takes self-awareness to, at least to the extent possible,

(43:02):
set aside bias or identify bias.
So here's the thing: truth takes effort.
People need to understand what biases are.
Like, I can't tell someone how to know whether a media outlet is
true or not, or whether they can trust it, if they don't even
understand what a bias is, or how to identify one.

(43:24):
What are the most common ones?
Secondly, look at track record.
You can look these things up.
Are they, historically at least, accurate or trusted?
That certainly doesn't mean they will always be, as we know
in this case.

(43:46):
Are they credible?
What is their methodology?
What are their sources?
What are their political leanings?
Right, there's left, right and center.
One of the things I try to tell people, and this is what I do a
lot, and I want to shout out a new website or app that
is sponsoring everyone now. I'm not sponsored by them,
by the way, but Ground News,
have you guys seen this thing everywhere?

(44:06):
Wow, it's such a smart idea.
I wish I thought of it.
Ground News is a news aggregator website and app, but
it aggregates any story and it shows you all the outlets and
what their biases are.
Is it left-leaning, right-leaning?
How many?
Quantity versus quality.
So you can see the blind spots.
You can see the disproportionate reporting.
It can give you context to the news, and I love that.

(44:29):
I'm like, wow, that is such a great idea.
It doesn't fix everything, but that's what's up.

Speaker 1 (44:36):
I said, I'm influenced.
Now they should sponsor you.

Speaker 3 (44:39):
They should.
I think it's a great idea, and now others are copying it.
But that shows you the quantity and quality.
You can see how many outlets are reporting on something, and
then also where they are on the political spectrum, and that can
say something.
That can say something about their values: what is important
to them, what do they prioritize?
It can show their blind spots and it can show their bias, but,

(45:01):
again, that doesn't tell you what's true or not.
Right, because just because there's a left-wing bias on a
story or a right-wing bias on a story, that doesn't mean it's
not true.
They have different values and different
business models, so they are monetizing that worldview.
So this is why it's complicated.

(45:21):
So I think the best way to find truth is to educate oneself on
these types of mechanisms: the business models, where they
politically lean, the volume of the reporting, credibility.
And another one I'll add is kind of recency, though there is a
recency bias as well. It's understanding, like, has this been...

(45:43):
Is this a new story?
Is this an old story?
Does this story have a history
you want to look into, to understand it more in depth?
It's a mess and, as you can tell, everything I just ranted
about is heavy.
It's like, wait, I have to do all of that to find truth?
Or at least, let's just say, get closer to truth.

(46:04):
It's like, yeah. And as we look back on what we're talking about,
social media and algorithms, right, how we get our content
and our news, it's in opposition to that.
The culture we have and the technology we have is completely

(46:24):
in opposition to the methodologies we need to
actually find truth, and that's why we're in a crisis of truth.

Speaker 2 (46:36):
Yeah, I mean, this is the reason that app
is so brilliant: because,
when you describe the process that goes into actually
coming close, at least as close as possible, to the truth, the
biggest cost in all that is time, and nobody wants to spend the
time to do it.
That, in a way, is what makes that app brilliant.
The other part of...

Speaker 1 (46:56):
Wait, I want to know, though: who do you trust?
Who do you look to for news?

Speaker 3 (47:02):
Who do I look to for news?
No one.
So I think it's not about trust. For me, what I try to do, and
I'll give you sort of an analogy why:
you know, in the military and outside, there's something called
all-source intelligence, okay? And in intelligence there's many

(47:26):
sources of intelligence: human intelligence, signals
intelligence, measurement intelligence.
There's communications intelligence, there's
open-source intelligence. There's all these disciplines,
okay?
Well, an all-source intelligence analyst isn't a
person who... well, they may have a discipline.
What they do is they aggregate. They get from as many sources

(47:49):
as they can.
So someone who works in one of those disciplines makes
intelligence reports because they're specialists in that, and
that's great.
Let's talk about what's happening in Gaza, or something.
Well, to really understand, let's just say, what Hamas's

(48:10):
intentions are, let's say,
their capabilities and intentions:
would you just believe one source of intelligence?
Well, you probably shouldn't, right?
So I take that same framework for news.
They all have strengths and weaknesses.

(48:30):
Some are completely BS, and hopefully people know what the
parody or satire places are, or the ones that are extremely
clickbaity.
And others, again, even the ones that we were not happy with
recently, CBS and AP and stuff, on average are fairly credible,

(48:51):
though that's eroding, I think we can argue.
Multiple sources, what we call all-source
intelligence.
Take the same mindset.
It's not about truth or who you trust anymore.
Get as many sources as you can, and understand
where the truth overlaps across those sources.
And if you're even going to take away

(49:12):
something from the news, a belief, try to take away where
there's overlap among stories from across the political
spectrum.

Speaker 1 (49:25):
Sorry, I cut you off.
What was your question?

Speaker 2 (49:28):
I don't remember the follow-up.
Oh, I do actually.

Speaker 1 (49:32):
So sorry, I'm like word vomit. Like, questions just
fly out of my mouth.

Speaker 2 (49:36):
That's fine, I don't remember.

Speaker 1 (49:40):
So in the last year... because, you know, we're in like a
group chat together, and I think one of the reasons we
wanted you here is because you're very, you know, like, you
have your boundaries, and when people just say, like,
oh, this happened or this happened, you're like, where's
your source?

Speaker 3 (50:00):
Yeah, and that doesn't earn you any friends,
because, let's be honest, we're also in an echo chamber, and
that's okay.
Communities are echo chambers, so I don't think there's
anything... there's no solution to that exactly, other than to be in a
few other echo chambers, hopefully.

(50:20):
But yes, people don't want to be challenged.
That's the problem, right? And part of human nature and how our
brains work is that holding on to beliefs is way more efficient.
So if you tell someone something that's contrary to a belief, a position
or information that they already have, it's very

(50:43):
uncomfortable. And, in all fairness, if someone challenges
you publicly, which a group chat can feel like, including for
myself, you may feel attacked and get defensive.
I certainly can feel that way too.
It's problematic.

(51:06):
But there needs to be some kind of mechanism, a process, before we
just spread stuff.
Because if we're going to complain about propaganda,
disinformation and hate speech and bias and all this stuff, but
then we're just going to do the same thing when we see a story
that resonates with us emotionally, triggers us, or that we
want to believe, or that's outrageous, and we're just going

(51:29):
to spread it too, then we're just contributing to the problem.
And maybe because of my background,
I don't want to do that, at least to the extent possible.
Because for me, publicly speaking out against propaganda,
disinformation, misinformation and all the like, I'm not
prepared to be a hypocrite, to the extent possible.

(51:51):
So I think what's important is, people just slow down before
you push something out, at least widely.
I mean, sharing something in small circles is one thing, but
in a group chat, let's be honest, things are flying out, I'm sure,
um, if it confirms a bias or something, or it's gonna, I hate

(52:12):
to say it, it doesn't matter what group you're in, if it's going to
get views and clicks and rage bait and all this type of
stuff.
So, um, yeah, I don't mind being that person because,
frankly, I'd rather be respected than liked.
And again, that doesn't make me the most popular person, but I
understand. I'm very self-aware of my contribution to the world

(52:35):
and it's not to be popular.
I realize that.
But I do want to feel valuable, and I do appreciate when people
come to me and ask me, hey, check this, do you think this makes
sense or not?
Like, tons of people do that.
It's actually out of control now. I can't even keep up, but I
actually prefer that.
I'd rather someone come to me to, you know, layer on a

(52:58):
methodology of truth, to the extent that's possible, than
come to me with, let's spread these rumors, and whatnot.

Speaker 1 (53:11):
So that's kind of how I see it. I mean, that's the
right way of going about it.
Like, when somebody posts something, I'm like, wait till Travis
answers, we'll see if it's right or not.
You know, like, we wait for your response.

Speaker 3 (53:24):
Well, I don't want it to be that either. I'm
not the litmus test on truth, but what I want to be
known as is something like a filter.
I don't mind being a filter, something that things can be
pushed through, but I'm not the arbiter of truth, without a
doubt, because I also have my own biases and faults and things

(53:45):
of that nature.
But yeah, I think that's what's important.
Like, let's get nerdy for a second here.
Think about the scientific method.
It is fallible.
It's not infallible, but the process gives us confidence.
At least, if you push something through the scientific method, the
degree to which we can trust it and bet on it is exponentially

(54:08):
higher than not putting it through, right?
So, again, the same thing we learned during COVID and all
this stuff too. It's not that science can't also be corrupted or
biased.
It certainly can be.
So I think what's important is to at least create
imperfect systems that you can push information and
experience through,
so at least the output is of quality of some nature.

(54:30):
It's not about being infallible or being the truth, but it's
about having confidence, a higher degree of confidence, in
it.

Speaker 1 (54:42):
Let's talk now a little bit about the mental
health aspect of everything.
Right, kind of tie it all in.
It's been a really hard year.
You know, social media has always been hard and it has its
challenges.
Now you tack on the, like, existential threat, and it's just
seeing violence all day long. I see maybe 10% good

(55:05):
stuff, and the other 90 is, like, craziness.
And then sometimes Tony sends me, like, funny married-people
memes, which I really love, like the reels and stuff.
Those are super.
Or toddler... I love toddler slander. Like, if the internet was
just toddler slander and, like, animals, we'd be good. We'd be
real good.
So it's been intense. Like, how do you take care

(55:28):
of your mental health?
And how can our listeners take care of their mental health
during this time, especially if they're Jews and this war is
affecting them?

Speaker 3 (55:38):
Yeah, I mean, this is something I'm very passionate
about too, and I've even mentioned this in our
group, and it doesn't seem to go over well, but I tell people
privately: we're addicted to doom.
We are addicted to doom.
It sucks that this is the nature now of social media,

(55:59):
which initially was traditional media: the most negative,
salacious, emotional content rises to the top, not the
healthiest, with the most levity and the most utility.
And again, we spoke about some reasons why that is,

(56:20):
algorithmically and in business models.
There's a few things we can do here.
One, I think, is the most important, which is the hardest
to do, because I don't want to speak for everyone, but in some
ways we're all addicted.
You have to shut them off, set timers, set reminders, set

(56:42):
limits, have a social life that's outside of the digital
world, have hobbies, have a family, go out, like, go into the
real world.
Now, I understand that's very hard to do for some people,
including me sometimes.
So that's just one piece.
Another one is actually leaving toxic groups

(57:08):
and things of that nature as well, even if they're unintended
and people are well-intended.
What I say is, we need less doom-scrolling, and if something
doesn't have utility or levity to help your mental health, you

(57:29):
should probably not use it that much.
And if you cannot help yourself, which a lot of us can't, again,
I'm always accusing myself too, sometimes it's better to remove
yourself, to abstain from that feed or whatever it is.
So I think what's really important is to be

(57:50):
self-aware of, again, your digital diet.
Where is this stuff coming from?
Like, you've got to reduce it. Let's just use
Instagram and group chats and stuff, for instance: turn off
notifications, for one.
That will help reduce it. And... as a social media manager? Well,
see, we can get into that too, but that's why I've left

(58:12):
social a couple times now.
But um, oh man.
So that's another thing.
We're not talking about social media management or people who
are in that career.
That's tough. But for Jews, look,
I think here's the other thing too.
And I said this, I swear, I said this.
I have a video on my profile from, like, a few days after, I swear, a
few days after October 7th. I might have still been in Israel.
I said, we got it.

(58:34):
Or maybe I was in Greece and I was doing a TV interview, because
I was on my way home, and I was like, look, I swear, five, six
days in,
I was like, we've already seen a lifetime of death, we've got
our fill, you don't need any more.
Like, you don't.
And look, I'll even say it on the other side, as you might put

(58:57):
it: what's there to gain by seeing more dead
people, more violence, more harassment?
Like, we get it, we get it.
We're in a war. People are dying.

Speaker 1 (59:08):
But it's hard to ignore. It's hard to, like, not...
I need to know. Every day, every morning, I'm like, what happened
last night?

Speaker 2 (59:18):
Yeah, but I do. I have family in Israel. Like, it's
intense for me. I want to make sure they're okay,
and, like, it's just, you know...

Speaker 3 (59:24):
How does that help you know they're okay?

Speaker 1 (59:27):
I know, but like it's just, I need to know, like
what's going on.
I don't know.

Speaker 3 (59:31):
Look, and I'll respect that. I don't have family there.
You're saying we should all be delulu and just, like...
No, we need to reduce it significantly, because ignorance
isn't the solution either.
Right, and unfortunately, it's one of those examples
where it's like, where's the balance?

(59:51):
And everyone has to figure out their own balance between
information overload, doomscrolling, and what is the actual
amount of content that I need to be up to date.

Speaker 1 (01:00:03):
What do you think, on average, is, like, a good amount...
like, a limit to set for yourself?

Speaker 3 (01:00:09):
Like time on social.

Speaker 1 (01:00:11):
Yeah.

Speaker 3 (01:00:12):
Again, that depends what your diet is.
What are you watching, dogs?

Speaker 1 (01:00:18):
45 minutes.

Speaker 3 (01:00:19):
If you're watching doom stuff, then, yeah, you can
catch up on the news and the latest antisemitic attack in
10-15 minutes, let's be honest, in a day.
But here's the thing.
We're fighting nature and we're fighting these algorithms, which
promote the most emotional stuff.

(01:00:40):
So it's not actually easy to do.
I don't even want to overstate how simple this is, but one
thing I've actually done, um, that's been helping me, um,
at least on my Instagram... um, I don't use TikTok, but, um, my feed itself, like the
feed I actually created, is, you know, lots of Israelis and Jews

(01:01:01):
and news and all that stuff.
But the For You, or whatever the hell the thing's
called, I never really used it, believe it or not.
I'm a weirdo, maybe, in that sense.
I usually just use the feed of, like, people I'm following.
The last, like, couple weeks, I've been spending a lot more
time in that For You page, which can be dangerous depending on
what your algorithm is, but I found in there, luckily, somehow,

(01:01:25):
mine's just all funny shit.
It's, like, just crazy funny shit, and some of it is maybe... I have
some political leanings.

Speaker 1 (01:01:32):
I have, like, emo quotes.
I don't know why. I have... I wish I had my phone.
I could read some of the funny, like, emo quotes.
And then there's, like, quick quotes in Hebrew.
It's, like, kind of weird.

Speaker 2 (01:01:48):
I'm like, I don't... this is a weird For You page.
What's on your For You page?
What's on your for you page?

Speaker 1 (01:01:50):
I've been on my for you page Literally only the
times you've asked me should Ilook, make sure there's no naked
bitches.

Speaker 2 (01:01:54):
I haven't been on my for you.
No, yeah, one other thing.

Speaker 3 (01:02:08):
What's the word I want to use?
Um, go through your following, like, you know, your
list.
Go make some cuts. Like, who needs...?
Like, be real, be more selective about the things you are
following, because the things you're following and interact
with will influence the algorithmic content that you

(01:02:31):
didn't ask to see.
So go make some cuts, like, get some stuff out that you don't
need. And what you can do... the algorithm is a very
interesting thing. To use, like, a physical analogy,
it's like a large ship.
You can steer it.
You can change the type of content you get, but it's a slow,
slow process to steer a ship, I mean, like an aircraft carrier.

(01:02:54):
So one way is to change who you're following, and then change
who you're interacting with. And content that may be doomy,
don't interact with it.
Actually, scroll past it quickly, right? You know this as
marketers as well.
Like, part of it is engagement, it's positive reinforcement, but
part of it is time that it's on the screen.
So get past it quickly.

(01:03:16):
It will slowly steer the ship.
But also go out of your way to go follow pages with the animals
and crafting, whatever your hobbies and interests are, and
then intentionally over-engage with those pages to slowly train
the algorithm that you want a little bit more of that.

Speaker 1 (01:03:34):
That sounds so relaxing. Like, a little hobby
crafting, doggies.
Wow, oh my God, imagine, like, cute stuff, babies. Everything I
share with my family is that stuff.
Speaker 3 (01:03:46):
It's dogs, cats, memes, uh, travel, like...

Speaker 1 (01:03:51):
The stuff I share with my family is all, like,
wholesome shit: like, nice, uplifting, cool, cute, funny
stuff. That's what we need, though, right? Not dark, doomy shit.
They should do a setting on Instagram where it's like, you turn it off
and your feed completely changes. Like, if you
want the news, it separates it out. Oh, yeah, yeah,

(01:04:11):
that'd be a cool thing. Like, I'm in, like, happy
mode, like, give me the happy stuff, and, like, here's 10 minutes
of news of what I'm doing.
You know what I mean?
Like, yeah, I feel like that'd be genius.

Speaker 3 (01:04:24):
Well, they would never do that because it's against their
business interests.
Actually, one little... um, damn it. Weird.
There's one little, um, Easter egg people don't know
about on Instagram.
You can actually get rid of the algorithm.

(01:04:48):
How do you do that?
There's an Instagram logo. I don't want to show you
everything.
Speaker 1 (01:04:55):
Click it. Can you see?

Speaker 3 (01:04:55):
Yeah, click the logo and there's a dropdown menu.
Okay, Following, and now you won't see anything from accounts you're
not following, and it's in chronological order.

Speaker 1 (01:05:11):
It's just chronological order? I want...
Oh, I miss those days.

Speaker 3 (01:05:12):
It's chronological order and it's only people you
follow.
But they don't want you to know that.
The UI/UX team makes it almost impossible to know
that that's even a dropdown menu, because it's not in their
interest.
They added it just to say you could do that, and of course...
I would bet my life on it.
Speaker 1 (01:05:31):
I'm changing it tomorrow.
It sounds amazing. Clean it up. But I follow, like...
I feel like in the last year I just followed
every news outlet, like, Israeli news outlets, left, right.
I want to see the perspectives. I want to know what's going on.
It's too much, though.
It's way too much. My mom will call me.

(01:05:53):
She's like, did you hear?
I'm like, Mom, I heard it three days ago.
She'll send me a YouTube video of Bill Maher.
I'm like, I saw it, like, a month ago.
Like, yeah, it's great.
But she's like, when do you have time?
How do you know everything?
You know, like...

Speaker 3 (01:06:09):
Yeah, yeah. Well, look, I
think this is super important.
As much as I may sound educated on this, I still don't do most
of these things, because it's that hard.
We're really up against biology and we're up against technology.
It seems like a losing battle, but we have to really, really try

(01:06:31):
.
Everyone needs to really, really protect their mental health.
We cannot be in a perpetual state of depression and, maybe
even more common, anxiety.
It has literally permanent effects on us.
So I think this is, um, like I said, it's no different
than your diet.
This is your digital diet, and your digital diet

(01:06:52):
is the gateway to your mental health.

Speaker 1 (01:06:56):
Wow, this was an awesome conversation.
So, okay, let's switch it up real quick before we, like, conclude
everything.
What do you do for fun? Like,
what do you do, like, on the weekends and stuff?

Speaker 3 (01:07:13):
Not much.
Jiu-Jitsu is big in my life.
It's been hard to go to that for me.
The irony is, it's hard for me to go to that when I'm not in a
good place, even though that is the best place for me to be at
that time.
So I typically do something less social, which isn't healthy,
and go to the gym.
I like to work out and go on the treadmill, and my

(01:07:35):
escapism is education.
I think I was telling Tony this, sort of where I alluded to, like,
I don't know shit about, like, pop culture and movies.
Like, I don't know that world.
I just, like, watch YouTube videos and books and
podcasts, and, like, I'm just addicted to education.
So that's kind of how I spend a lot of my time outside of work

(01:08:00):
and social media.
But I've actually started thinking, or realizing, that I
need more escapism, and education isn't escapism, for me
at least.
I think I take it very seriously, so it's not escapism.
So now I'm trying to find some new hobbies, actually, right now,

(01:08:21):
and find maybe new communities based around hobbies and
interests as well, because I myself am not spending enough
time outside of this doomscrolly world.

Speaker 1 (01:08:34):
Um, so yeah, and I hear you're also looking for an
Israeli wife.
Oh my God.

Speaker 3 (01:08:41):
You just outed me.
Hey, I'm single.
Look, he's single, ladies.
I'm only getting older.
What's your type?

Speaker 1 (01:08:50):
Let's do a shidduch.
Let's do a shidduch.

Speaker 3 (01:08:52):
Oh my God, he's open for hobbies.
Let's shidduch you.

Speaker 1 (01:08:56):
Come on, you know I'm a Jewish mom.
Like, it's the biggest mitzvah.
Are you kidding?
Okay.

Speaker 3 (01:09:01):
Okay, what's your type?

Speaker 1 (01:09:02):
what kind of what's your like celebrity crush.

Speaker 3 (01:09:06):
Oh, a celebrity crush. Again,
I don't even watch TV, but, um, okay, I've never been put on
the spot... but I mean, if we're talking, like, what are we
talking, like, background, physically?
Yeah, just give me a little type.

(01:09:27):
Well, I would actually say, on paper, someone who understands both the American culture and Israeli culture is important to me, because I grew up with Middle Eastern people from at least teenager on, believe it or not.
So I grew up with just lots of people from the Middle East.
So, I actually... and I speak Hebrew.

(01:09:48):
So for me, like, I'll give you, again, I'm generalizing, but like, an American Jew can be tough for me, because I feel like I need someone who really understands the Israeli culture, Israel, the language, Middle Eastern culture. That can be tough for me.
But on the other hand, someone from Israel who's super Israeli, or, you know, whatever, that's not,

(01:10:10):
that's not an insult, who may not understand some of the nuances of American culture and things of that nature, would also be tough for me.
So I think someone who is either American-Israeli or Israeli-American, someone who lived in both countries, I tend to vibe with.

Speaker 1 (01:10:29):
That's very, very specific. It is.

Speaker 3 (01:10:31):
That's a problem, isn't it?
He got one.

Speaker 1 (01:10:37):
He got one, I can attest. They're pretty, they're pretty amazing.
Yeah, the chetzi-chetzis, they're half.

Speaker 3 (01:10:44):
They're chetzi chetzi, they're, like, half, yeah.

Speaker 1 (01:10:46):
Half Israeli half.

Speaker 3 (01:10:47):
American. But I understand, reality is very humbling. So I think what's most important, rather than demographics and features, is someone that you just feel yourself around, regardless of whether they're Israeli, American, or what have you.

Speaker 1 (01:11:10):
So, ladies, slide into his DMs. And do you have a job? Do you live with your mom? You don't? Wow. Guys, guys, he's a catch, he's a catch.
Where can people slide into your DMs? What's your Insta?

Speaker 3 (01:11:29):
Talk to Trav. T-A-L-K, the number two, Trav.
Awesome.

Speaker 1 (01:11:36):
Well, thank you so much for your time.
This was fun.

Speaker 3 (01:11:39):
Yes, thank you. I learned a lot.

Speaker 1 (01:11:41):
Now I'm going to go to bed.

Speaker 2 (01:11:43):
Yeah, I'm going to go take...

Speaker 1 (01:11:44):
Yeah, I'm gonna go take care of my mental health. I won't scroll, I promise. I won't check who watched my stories or... that's... it's not possible. It's not possible. Do you have anything else?

Speaker 2 (01:11:58):
Yeah, I'm convinced that Kanye is an IRGC plant now. He said long game. I'm starting to put the pieces together.

Speaker 1 (01:12:08):
Oh babe, this is not a QAnon podcast.

Speaker 2 (01:12:12):
It is now. All right.

Speaker 1 (01:12:16):
Thanks, Travis.

Speaker 2 (01:12:18):
Yeah, thanks guys.
Thank you man.