April 12, 2025 26 mins
Joining us once again for this thought-provoking conversation is Emily Pabst, the founder of Remake the Rules. Emily brings a human-centered lens to the way we approach modern systems.

In this episode, Emily helps us explore the profound impact of data on our lives, the overwhelming flood of information we face daily, and how this overload affects our ability to make clear, grounded decisions. Emily also shares critical insights on the growing role of AI, the risks of outsourcing decision-making to machines, and the importance of slowing down to evaluate situations with intention and awareness.

Whether you’re a tech enthusiast, a skeptic, or simply curious about where we're headed, this episode invites you to pause, reflect, and reconsider our relationship with data and technology.

www.mentalwealthpod.com
www.pedalmyway.com

remaketherules.com

DISCLAIMER: The views, thoughts, and opinions expressed are the speaker’s own and do not constitute legal, medical, or other forms of professional advice. The material and information presented here are for general information and entertainment purposes only. The "Mental Wealth Podcast" and "Pedal My Way" names and all forms and abbreviations are the property of their owners, and their use does not imply endorsement of or opposition to any specific organization, product, or service.
Transcript

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
How can we make authentic connections and stay grounded in
an artificially intelligent world? This week on the Mental Wealth Podcast,
we were delighted to welcome back Emily Pabst, founder of
Remake the Rules. We discuss how technology can interfere with
our decisions and the tangible steps we can take today
to regroup and refocus on our long term plans. We

(00:28):
think you're really going to like this one. Emily, thank
you so much for joining us yet again. It's always
a pleasure to have your company.

Speaker 2 (00:34):
Thank you so much for having me. I'm really looking
forward to another conversation.

Speaker 1 (00:38):
Last time we had a fascinating discussion about decision making
and dating. How do people approach dating in the modern
world with applications and all the decisions that they have
to make in their romantic lives. Today, I want to
talk about one of my favorite topics: technology and
decision making. The quick pace of technology is to me

(00:58):
something that is both frightening and exciting, and I think
it's something that a lot of people are thinking about. I
can speak for at least people in my friend group:
a lot of people are asking me about things like
AI and how to cope: how do I approach business,
how do I approach employment opportunities with the decisions that
they're making. The first question is how do you deal

(01:20):
with decision making fatigue? There are so many decisions now that
people have to make each and every day in their day
to day lives, whether it's something as simple as ordering food.
There's like fifty thousand food apps you can choose from now.
How do you respond to certain emails? How do you
approach decision making fatigue as an individual? What would you
say to one of your clients?

Speaker 2 (01:39):
I think one of the original conceits of Remake the
Rules and why I really got pulled into doing the
work that I do and having the conversations that we're
having right now, is this idea that decision fatigue is
now sort of omnipresent in our lives, just as you describe.
So this experience that so many people are having where

(02:00):
you are just out of energy, have overextended the
energy that you have to make good decisions, and that
overexertion is going to impact the decisions you make,
right, in a negative way. So it's going to get
harder and harder to make decisions. So, yeah, it is
a very tough situation that a lot of folks are
finding themselves in.

Speaker 1 (02:20):
I'm thinking a lot about information architecture. The vast
majority of people, like us right now, are staring
at a flashing screen in front of them and trying
to comprehend the information. Is there sort of general
advice you would give to someone that is trying to
make sense of new information that they're getting, making sense
of the reliability of the information that they're getting. How

(02:40):
does one make sense of this new sort of AI-focused
world where we're getting information, possibly from a source
that is not human?

Speaker 2 (02:48):
As you say, we're looking at screens right now. This
is how we're ingesting a lot of this new information,
This incredibly increased quantity of information that we're all able
to access now, right? That is because of the information age,
that is because of the digital age. And I think
that the really kind of insidious aspect of living in
the information age is that on one hand, we are

(03:11):
given these incredibly powerful tools with tons of potential and
told that now that you have these, everything should be easier, right?
But those tools themselves are actually making things way harder.
They are destabilizing the ways that we used to be

(03:31):
able to navigate the world, navigate information, make decisions. So
I believe that there is another side to this: that
eventually we are going to build the tools and the skills
to better incorporate this novel technology into the way that
we navigate, into the way that we make decisions. That
is precisely what I help people do. But until we
get there, it is like a real quicksand situation.

Speaker 1 (03:55):
Yeah, I'm wondering what that bridge is, like, how do
we get there? How do we do that in the
world which is changing so quickly. I think about I'm
just going to talk about my personal journey. I'm in
digital marketing and I see people searching for information
on Google. You've probably seen it, the AI overviews. When
you're searching for something, it used to be that Wikipedia
would show up, and then maybe there'd be some other

(04:15):
trusted media source, whatever. But now it's an AI overview. Now
it's just Google, technically Google's information, and they own
that space. How does someone engage with that information, and
how do you trust it? That is something that I think
about quite often, because you really don't know where that
data is coming from.

Speaker 3 (04:30):
Now.

Speaker 2 (04:31):
The first thing that I absolutely recommend that folks do
is give themselves grace and give themselves space. So start
with acknowledging that this is not a reflection of personal failure.
This is a truly unprecedented situation that we are finding
ourselves in, and we are doing our best to navigate
it and to figure it out. So we can all

(04:54):
pause and be like, okay, it's not just me. It's
not just me.

Speaker 1 (04:56):
That's a great point. I'm pausing and thinking about
what the heck is going on. What am I actually looking
at, right? What am I reading? Where is this coming from?
Something that I reflect on all the time is: is
this information designed to elicit a certain emotion? Is it
a marketing thing? The stories that tend to be more
tragic tend to get the highest clicks. What emotion do
they want to elicit, whether it's anxiety or some sort

(05:18):
of emotion that they're feeling after, you know, using social
media for a while, for example. That's a great point
about community that you made, the idea of
getting outside and taking a pause, looking around
for a while, right?

Speaker 2 (05:28):
There are lots of different ways of making that space. First
of all, we've already kind of forgiven ourselves for feeling
so lost in this world, because we're all lost in
this world, right, and now we're trying to figure out
how to reorient, right? Are we reorienting to our lived
experience in the physical world, which is a very important
place to be oriented to, that is where we exist

(05:49):
most of the time, or are we reorienting to how
did my brain and body end up feeling these feelings
and thinking these thoughts? And this is something I do
with all kinds of clients. Did this thought, feeling, or problem
originate from inside of my own body, from my own
lived experience or was it brought to me on this

(06:11):
super highway of information and feelings?

Speaker 1 (06:13):
Right?

Speaker 2 (06:14):
And now it's inside of me, And now I'm scrambling
to figure out what's going on, to figure out a
solution to a problem that I literally didn't know existed
ten seconds ago. So just kind of figuring out what
is actually happening here. And to that end too, we
are going to have to prioritize what really matters to us.

(06:34):
We cannot address every experience and every question as if
it is the most important moment of our lives. We
are going to burn out so fast. The fatigue is
going to get worse. But people have priorities, they have
things that they really want to invest time and energy in,
and you're just gonna have to let the other stuff
go and treat it maybe with more of this benign curiosity.

(06:58):
It's happening, it's around. Maybe learn something from it. But
this is not a critical problem that I need to
solve right now.

Speaker 1 (07:06):
Right It's almost as if people take ownership of the
problem that they're reading about and it becomes their own stress.
It's not a personal failing when you don't know how
to connect with the information that you're seeing. But you
also don't have to take ownership of a negative news story.
You don't have to let that impact your emotional stake.
Start with the things that are impacting you the most
and sort of go from there. From my personal experience,

(07:28):
I can just speak to that. In terms of Facebook,
I realized about ten years ago. You have friends that
are using it all the time, and you're getting updates
at like four am when they're just back from the bar,
and I don't really care about any of this. It
doesn't have any positive impact on my life. It's almost
like if others are using this technology, right, you want
to feel like you belong, whether it's a friend group
or your company or whomever is using the social media.

(07:50):
You want to feel a part of something, so you
use it because everyone else is using it. You're saying,
build a community based on that separation, build a community
outside of the thing that everyone else is using.

Speaker 2 (08:00):
I think that that is such a good example, this
sort of like potential for constant updates on people that
we maybe do love and care about and want to
build relationships with and spend more time with, right?
Now, that being said, as you or I or whoever are
building meaningful relationships and having these connections and the conversations

(08:21):
never would I maybe be like, you know, what I
really want to do to really level up our relationship
is to know what you're doing at four am every
day, right? That is actually not a very important
part of our relationship, but potentially it is being brought to us,
and because it is being brought to us, we are
kind of normalizing it into that relationship, because that is

(08:45):
just kind of how our brains work, right? As
our environment changes, we kind of
assume that that is the way the environment is supposed
to be. But that isn't necessarily true, right? And then
there are opportunities to again change our environment back or
modify it in a way that works well for us.

Speaker 1 (09:01):
It almost feels like there's a separation going on,
where it's almost the opposite of what the
original intent of the Internet was, where everyone's going to
their information silos and we don't really know where that
information is coming from. Do you see signs of hope
for how the information architecture is being built or are
you one of those people who says we need to

(09:22):
get rid of this technology? Is there a societal value,
do you think, to some of these gadgets and tools?

Speaker 2 (09:27):
I think there can be, and it just depends on
what we're gonna do.

Speaker 1 (09:30):
You know.

Speaker 2 (09:31):
I think what I'll start with is that we are
already on untested ground and we are creating these tools
with incredible power that are also untested, right? Everything
that we're kind of doing now is totally untested and
we do not know what is going to happen. We

(09:52):
certainly have agency in it on an individual scale, but
more than that, these are going to be questions on
institutional scales and societal scales. So as AI continues to
get better, one of the impacts that we're going to
see is a greater increase in not knowing what is
real and what isn't real. That is going to become
a really huge problem for us. It already is a

(10:14):
very big problem for us. And what we are going
to need to do culturally, socially, institutionally, is promote structures
that disincentivize false information, fraud, these types of nefarious
uses of this technology. That is going to be very important,

(10:34):
and also on a smaller scale, we're going to need
to really start focusing on our local communities, our physical spaces,
in-person communities and relationships, so that we have strong foundations
and can be responsible for helping to confirm some of the
information that is being shared about our local communities. That
is the sort of labor that I can do

(10:57):
as an individual to help my community. And we
are each going to need to do that, and we're
each going to need to build the skills to be
able to do that right, to confirm this information and
to have these conversations. So building those skills, building that community,
and then also building those institutional structures is going to

(11:18):
be paramount.

Speaker 1 (11:19):
I would imagine that societal change like that would
have to start with seniors, who are perhaps more vulnerable
to the criminal element of that kind of technology.
It would have to start with some sort of structure,
like an educational structure, maybe from a government, where
people get educated about the risks of the technology more
so than anything else.

Speaker 2 (11:41):
Yeah, I mean, safety is a very large concern, right,
both societally and individually. You're talking about the senior-most folks,
and education for these novel technologies is a big component:
being able to navigate them and understand them and
kind of see what they are and what is actually
happening and what's working. But also there is a sort of
seniority in the consolidation of power and decision making, and

(12:03):
we have some leaders in the community who are able
to have greater impacts on a larger population of people
just because of the position that they hold. And in
the past I've worked with folks who are at
high levels of leadership. It's definitely interesting to see the
struggle of catching up because they are not immune to

(12:26):
this like lightning speed change in technology and in collaborative
thinking and decision making with these technology tools. One of
the other principal concepts that I work under is that
our information technology tools have outpaced our ability to use
them and incorporate them in our systems and decision making right,
and they are continuing to develop. So I am attempting

(12:49):
to create and make that space and have these conversations
with folks, to kind of catch up, to do that
work and do that labor, to figure out how to
better think about these problems, think about these tools, and
use them better and more responsibly.

Speaker 3 (13:04):
For senior citizens, how do you help them differentiate
between what's real and what's AI?

Speaker 2 (13:10):
That is certainly something that I've experienced personally, and
that lots of folks in my cohort are experiencing personally too.
And I think that, more so than these sorts of
individual skills or tells, which I actually think are going
to become obsolete so quickly that they would not be useful,
it would be completely rebuilding the system through which people

(13:33):
are ingesting and accessing information. So, right, again, are we
just becoming these entities where there are screens everywhere and
information coming at us and we are sort
of passively absorbing it? I do not think that that
is a good way to learn. I do not think
that is a good way to ingest information and to

(13:57):
grow and to make better decisions. So refiguring it, starting
with a question: I have a problem. I have a question.
I have information that I need, or would prefer,
or would enjoy learning about and growing from. And now I'm going
to seek out information. And that's seeking out, right? So

(14:17):
first, we're starting with seeking out, because I have
an important question that I want to answer. Then that
seeking also will have some guardrails around a community that
is known or understood, that has been built: experts,
trusted individuals, right, depending on what the problem is that
we're looking at. Known scientists, right. This is where

(14:41):
I'm going to be acquiring that information. And so, right,
it isn't these little, you know, look for the weird
hands in the AI images, right? It's really more about
totally changing our relationship with how we access information.

Speaker 1 (14:56):
Yeah, I was just thinking about people that might be
vulnerable to issues with fraud, and whatever it is you
have to say to them. You know, if I call you,
just think of it as you're not speaking to me.
Just don't give any information; if I ask you anything,
any personal information, don't provide it. It's almost like you
have to be so precautionary now in your day to
day experience. We all sort of know that we have

(15:18):
a sort of a tacit understanding that we don't share
banking information via email. That is sort of the most
basic element that we now have to transfer to people
that are more vulnerable in society. The ability to communicate
is now changing very rapidly, and the ability to produce
media that is very realistic is a little terrifying.

Speaker 2 (15:39):
Yeah, yeah, absolutely. And have you guys listened to some
of that AI podcast technology?

Speaker 1 (15:44):
You can still make out that it's AI, but it's getting
so realistic. They've got the pauses, they've got the cadence,
they've got the voices down. It could be a month
or two before it's able to create podcasts
like this.

Speaker 2 (15:57):
And I think then there's kind of the interesting question
of what is the point. Because I think that a
lot of people get into podcasting, either hosting or guesting,
in order to have interesting conversations that they enjoy having
with other people who also want to have them. So
then if I want to educate people, how are
they going to find it? There can maybe be

(16:18):
a pretty narrow window of problems that we're solving versus
problems that we're creating.

Speaker 3 (16:24):
There's no emotion with that, though. There's kind of an
emotion going on, but it's driven. There's no real emotion.

Speaker 2 (16:29):
Everything is just content, and not just content, but content that's
usually built on some sort of idealized average. So kind
of like you're saying, we're not going to get maybe
some of these more meaningful or rare or interesting or
variable interactions.

Speaker 1 (16:47):
If you're a business owner dealing with these technology problems
right now, are there steps you can take to improve
your decision making for your business while also adapting to
this new sort of information architecture that we're dealing with?

Speaker 2 (17:01):
For sure. So one example I'll give is kind
of similar to some of the problems that we've already
been talking about, and solutions. And it's amazing how frequently
these problems show up over and over and over again
in our lives. Specific to small business owners is the
extent to which salespeople are showing up on their doorstep,

(17:23):
in their inbox, on their phones with a technology solution
to a problem that they didn't know that they had.
Right. And one of the interesting things that I find
with this problem is the extent to which
it can really hit business owners. So I've worked with
some people who really struggle with this sort of outreach

(17:47):
and calls and these sales pitches because they feel like
someone is telling them something about their business that they
don't already know, and that makes them feel like a
really bad business owner and that they shouldn't be doing
what they're doing and they don't know what they're doing,
and it really shifts this sort of power dynamic about
who knows what about what this person does for eight

(18:08):
hours minimum every single day. And so the first thing
that we talk about is what they do and don't
know about their own business, right, and the fact that
they are the expert, period, in what they do. For
these small business owners, this idea: is it possible
that somebody can bring them a piece of information that
they don't know? Of course. Does that mean that this

(18:30):
person understands the ins and outs of their business better
than they do? Absolutely it does not. So here in
this dynamic, it's really not necessarily about understanding the technology
problem that this person is attempting to discuss with the
business owner. It is about giving that grace and giving
that space. And here is what I usually do with folks.

(18:52):
We build a social script about how to say this
is something I'm going to research, or no, thank you,
or what have you, so that they can step back, reassess,
do whatever research that they need to do, or just decide,
you know what, again, this was not a problem that
I knew existed yesterday. I do not think I actually
have this problem. That is okay. I believe in myself,

(19:13):
I trust myself, and I'm going to move on with
my day. So definitely a very common problem that I encounter.

Speaker 1 (19:19):
It's almost like they've got to sort of figure out:
what was my objective in starting this business? It must
be difficult to remain true to that continuing objective when
you have thousands of emails now coming through from various
different directions and sort of make decisions for your business
in that space. It's a great point you make about
taking pause and just separating yourself from all the information

(19:39):
coming at you and sort of thinking for a second:
what is it that you want from the information? It's
interesting. I really like that idea.

Speaker 2 (19:46):
As well as separating yourself from the expectation that you should
understand every problem that you ever might encounter, including the
ones that are based on novel technology that didn't exist
when you started your business. Of the conversations that I've had,
some of the things that come to mind: someone
who was asked to choose what middle school to send their

(20:07):
kid to. Here in Colorado we
have school choice, and there's all these sorts
of bells and whistles that the schools do to
try to get folks to choose them. But talk about
weighing heavily on your mind. You're making a decision for
your child about where they're going to go to school,
and is it important that they have a dance studio?

(20:28):
Is it important that they have a maker space? What
am I supposed to do as a parent when I'm
being asked what school to send my child to? And
for that question and for so many other questions that
feel very impactful and indeed are and can be very
impactful to folks, what I recommend that people do is
just really get very serious and very limited about what

(20:52):
is going to be most important in this outcome. Does
it matter that there's a maker space? If this young
person is already fully invested in doing this sort of activity,
is on this great educational and experiential curve to do that,
then yeah, that might actually matter. But if that is
not a part of their lives already, then no, it
probably doesn't matter. So kind of similar to dating is

(21:16):
that we just got to pick two or three things
that really are likely to impact the development of this
young person, and then to figure out what information will
tell me as the parent, which of these schools is
likely to have the most positive impact on that one
or two things. When we were talking earlier, Rob, one

(21:37):
of the really interesting things you touched on that I'd
love to ask you more about is this fear of
increasing reliance on technology that you don't understand or is new.
So I'd just be really curious about if you want
to share the specifics of which technologies kind of give
you that angst, and what that angst is.

Speaker 1 (21:58):
The angst is mostly based on the way the Internet
is going and business reliance, personal reliance, relationship reliance that
we've all sort of touched upon in this conversation. Billions
of people are now reliant on an architecture that we
don't really know where the information is coming from. Right, So,

(22:19):
someone like myself who reads a lot of newspapers and magazines,
I see conversations that people are having based on a
news story that I know for a fact is
just nonsense, wherever they've got the information from. I'm
sick of always correcting people when I have to do so.
I guess my biggest fear is that we're heading in
a direction where certain powerful people are controlling the flow

(22:43):
of information, and it is no longer a democratized information source.
It's increasingly coming from fewer and fewer sources: more
billionaires owning newspapers, more billionaires owning media companies.
I guess my concern is what is going to happen
with the world. What is going to happen with all
these changes taking place? That is my major concern.

Speaker 2 (23:03):
I generally do consider myself an optimist. That being said, humans
do not have a good track record of getting things
right the first time. We just don't. And that
is what we are living through right now. We are
being asked to do something for the first time every
day and it's rocky. It's going real rocky. And I
think the question is how quickly we're going to be

(23:24):
able to positively adjust. The way I kind of conceive
of this, the big underlying, overarching problem here is the
extent to which we are choosing to outsource our thinking,
our knowledge, our decision making, and our trust in all
of those systems. So as we sort of expand our

(23:46):
own cognition and decision making to these external tools, these
computational software tools, the pattern that I am seeing is
not a balancing of that where essentially we say, hey,
very powerful tool that has a specialized utility, why don't
you help me with this one specialized task? And then
you're going to hand it

(24:06):
back to me, and then I'm going to spend some
time evaluating it, thinking about it, doing some of my
specialized tasks, right, which is verifying whether or not it
feels right compared to other sources, compared to my own experience,
et cetera, et cetera. But we're not doing the trade-off
right. It isn't collaborative enough. We are just
fully outsourcing, and I think

Speaker 1 (24:28):
That is a huge problem.

Speaker 2 (24:31):
Yeah, So when I work with folks and when I
talk about what to do about this, it is building
that better collaboration where I and other humans are at
the center of it, and that I am using these
tools to bring information back to me, back to other experts,
right? That we are doing it collaboratively in a way

(24:52):
that systematically will have a higher likelihood of being accurate,
of actually helping me, of actually building these pro-social
systems. And figuring out how to do that is a skill,
right, that we don't know how to do yet because
all this technology is new. But we need to, we've
got to figure it out.

Speaker 1 (25:10):
Yeah, it's almost like there has to be a secure
passkey between the human side and the AI side
so that you know who is using this tool and
where you're getting the information from. That sort of
system hasn't been built yet, but we need something that
can sort of trust and verify this data that we're
getting and help us sort of deal with this
overload of information. It's a different issue,
but the trust side of things is becoming ever more crucial.

Speaker 2 (25:34):
Hopefully, we are learning a very critical and important lesson
about collective action with each other and collective action with
our tools too, where we also need to be very
strong participants in that collaboration.

Speaker 1 (25:48):
Collaboration. That's a great way of phrasing it. Emily,
I always appreciate your time. Thank you so much for
both relieving a lot of my concerns and also just
placating me and sort of listening to some of
my rambling about this technology. But I appreciate your expertise
on this, and thank you so much.

Speaker 2 (26:06):
Thank you so much for having me. It's been great.

Speaker 3 (26:08):
Yeah, thank you, Emily, thank you for being back. I'm
sure we will bother you again with a couple
more episodes.

Speaker 2 (26:13):
That'd be great.

Speaker 1 (26:14):
Thank you Rob, thank you Mchanda. It's great to see
you both. And thank you all for listening. We appreciate it.