
September 26, 2025 | 37 mins

How will AI change your job — or even replace it? In this episode, Tim Staton sits down with futurist Dr. Mark van Rijmenam to explore the future of artificial intelligence, disruptive technologies, and the coming wave of digital transformation. Discover the Wave Framework for navigating change, learn which industries face the greatest disruption, and get practical strategies for staying ahead in a world where technology evolves faster than ever. If you’ve ever wondered how many jobs will be lost to AI — and what you can do to prepare — this conversation will give you the insights you need.

Connect With Mark van Rijmenam

Book: Now What? How to Ride the Tsunami of Change

Link: https://mailtrack.io/l/caf5a91ec9b71085dc49cf19ffd9d199f7568f78?u=8129205

LinkedIn: https://www.linkedin.com/in/markvanrijmenam/

Website: https://www.futurwise.com/

 

Connect with Tim

Website: timstatingtheobvious.com

Facebook: https://www.facebook.com/timstatingtheobvious

YouTube: https://www.youtube.com/channel/UCHfDcITKUdniO8R3RP0lvdw

Instagram: @TimStating

TikTok: @timstatingtheobvious

 

 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Tim Staton (00:02):
This is Tim Staton with Tim Stating the Obvious.
What is this podcast about? It's simple.
You are entitled to great leadership.
Everywhere you go, whether it's to church, whether it's to work, whether it's at your house,
you are entitled to great leadership.
And so in this podcast, we take leadership principles and theories and turn them into everyday, relatable and usable advice.

Disclaimer (00:25):
And a quick disclaimer: any reference in this show to a product, process, or service by trade name, trademark, or manufacturer,
or otherwise, does not necessarily constitute or imply the endorsement of anyone I am employed
by or represent.
The views expressed in this show are my own and do not necessarily state or
reflect those of any employer.

Tim Staton (00:36):
Hey, and welcome to another episode.
I'm super excited about our next guest, Dr. Mark van Rijmenam.
We live in times that are fast-paced and ever-changing, and change is happening now more than ever.
If you want help navigating our rapidly changing world where emerging technologies are converging
and reshaping society at breakneck speed, Dr.

(00:58):
Mark van Rijmenam, a futurist and innovation strategist, argues that we are already living in a tsunami of change.
And the real question is not if change will happen, but how we respond to it.
That makes all the difference.
So, from the future, Dr.
Mark van Rijmenam, welcome to the show.

Mark van Rijmenam (01:19):
Thanks for having me, Tim. Great to be here.

Tim Staton (01:21):
I'm glad that you're here.
And welcome again from the future as well.
And speaking of the future, can you kind of help us get on the same page: what is a strategic futurist?

Mark van Rijmenam (01:33):
Sure. So as a strategic futurist, I help organizations around the world design and build a better future.
And I do that from a strategic perspective.
So I'm not looking into, you know, these are the different trends that are coming or these are
the fancy tools that we can expect five years from now, but more from a very high level perspective
of what is happening in the world.

(01:53):
How are these shifts and changes driven by disruptions such as AI or other technological breakthroughs
going to affect our personal lives, our business lives, our society?
So very much from a high level perspective, trying to change people's mindset in a way that
creates a better future for all of us.

Tim Staton (02:14):
Awesome. And so when you were doing that and you started writing your book, you start the
book off with your cycling journey across Australia.
So what from that experience, you know, could you relate to resiliency and leadership in the face of that uncertainty?

Mark van Rijmenam (02:31):
Sure. So yeah, in the book I talk about how I circumnavigated Australia. As a Dutchie I went on
a little bike ride, in fact, 100 days around an entire continent, to raise money for charity,
which was a fascinating experience.
But when you are cycling through the outback and, you know, the nearest town is 500 km away,
and you have to bring your own food, your own water, your own gear on the bike, you have to

(02:54):
have a lot of resilience and perseverance to, you know, keep going when the going gets tough.
But what I didn't know at that time, but what I know now, is how that cycling was such a mirror
for being a futurist and trying to embrace the future, where we also require a lot of resilience,
a lot of perseverance when the going gets tough.
Because there is so much disruption happening at the moment.

(03:14):
At the same time, it's not only technological, it's also ecological, it's geopolitical.
And how we deal with that change as individuals or as organizations or even as society, that
requires a lot of resilience because it's not easy.
The coming years, I think there will be a lot of chaos in society with all the disruption going on.

(03:34):
And we have to persevere.
We have to have a mindset to the future, a long term purpose of where we're going so that we
can get there and then we can build a better future for all of us. But that requires resilience.
And a lot of the resilience that I bring to the table myself is what I learned on the
bike when we were cycling almost 10,000 miles in a hundred days.

Tim Staton (03:57):
That's a lot of cycling.

Mark van Rijmenam (03:59):
It was, but it was such a cool experience.
I can recommend it to anyone if you have a few months to spare.

Tim Staton (04:05):
Okay, yeah. If you've got a few months to spare and a lot of calories to burn. Absolutely.
So you mentioned disruptive technologies and building resiliency for that.
So what do you see coming up on the horizon with some disruptive technologies that are starting
to be more relevant today?

Mark van Rijmenam (04:25):
Sure. So there are so many different technologies.
Everyone is talking about artificial intelligence at the moment.
Of course, it's a big hype.
Three years ago, we were all talking about the metaverse, and that was the big hype.
And I think a lot of people are very happy that they think that the metaverse is gone, but the
metaverse is actually back and is bigger and bolder than ever before, thanks to new technologies

(04:47):
such as spatial intelligence, as well as advancements in miniaturization
for augmented reality and mixed reality headsets, etc.
So there you can already see that it's not just AI, it's also the metaverse and spatial intelligence,
but it's also, like, brain computer interfaces that are coming, where we can literally think to
our computers and get an answer from Google just by using our thoughts, or where we can communicate

(05:09):
almost like telepathy with each other, where we translate our brains' thoughts and then
send them across the Internet.
But it's also blockchain technology, which for a lot of people sort of
has a reputation problem because it's been linked to, like, crazy monkeys selling for millions of dollars.
But in fact, the technology itself is really, really powerful and actually, I would argue, very

(05:31):
much necessary to create a trustworthy Internet,
where we know that the data that we use has not been tampered with.
But at the same time, we have quantum computing coming, which will help us solve some of the world's wicked problems.
We have synthetic biology coming, which will allow us to literally create life from scratch.
We have 3D printing coming, which will completely upend geopolitics by moving global production

(05:55):
from the east to your neighborhood.
We have robotics coming, where the estimates are that by 2040, we will have 10 billion humanoids operating in our society.
And all of these technologies are driven by artificial intelligence.
So that's the thing: what we see happening at the moment is this convergence of technologies
creating a perfect storm to disrupt society in unimaginable ways.

(06:19):
And I think that's both, you know, fascinating and terrifying.

Tim Staton (06:24):
No, absolutely. I myself have worked with some of the 3D printing in, like, the logistics sphere, right?
You can print car parts now, so instead of waiting for parts to come in, you can print
them and have them available in the meantime.
So I love the fact that you brought that up. And, you know, some of the scarier things like telepathy

(06:46):
and brain computing stuff sounds super scary.
And you developed a framework to kind of help us get through this, and it's the WAVE framework.
So can you explain that a little bit and talk about it?

Mark van Rijmenam (06:59):
Sure. So what I try to do in my book is to provide an answer of how do we ride a tsunami of change?
Because it is literally a tsunami.
It feels like a tsunami.
So much disruption, but at the same time, it's only temporary.
It's not that we'll be in this disruption for the next 50 years; at least, I hope not.
But it is temporary.

(07:21):
The next five to 10 years it will cause a lot of chaos, and it will allow us to rebuild society in a new way.
And I think there are two ways to ride the tsunami of change successfully.
One is a long term approach, which is through education, lifelong learning, changing the education
system for the next generation.
But the other one is a short term framework, which is indeed the wave framework.

(07:44):
And the WAVE framework stands for: we have to Watch for signals, which means we have to look
at the horizon, just like riding a bike.
What is coming over the horizon, and how can we anticipate it?
And we use structured and unstructured data to understand what's happening, in a very simple form.
Then, once we understand how our environment is changing, whether that's for you in your personal life,

(08:08):
or as a fresh graduate, or as a job seeker, or as an organization or society, or even as humanity, we need to Adapt.
Because otherwise, if you don't do something with the signals, what's the point of watching in the first place?
So we need to Adapt, but we need to adapt with a long-term purpose in mind.
Because the world is unfortunately often driven by short-term shareholder value: the next
shareholders' meeting in three months, or the next school term, but not so much looking

(08:32):
five, 10 or 25 years ahead to see how our actions today will affect the future generations.
And I think it's especially relevant when it comes to artificial intelligence.
Because the foundational models that we build today, they will become the basis and the foundation
of the models of the future.
So any bias that we incorporate in those models will persist for the long run.

(08:53):
And that's something we need to be aware of.
So we Watch for signals, we Adapt with a long-term purpose, but then we need to Verify, because
we unfortunately live in a trustless and truthless era, thanks to, among others, deepfakes, where
it becomes really difficult to trust that the person I'm dealing with, whether by text, audio or
video, is indeed the person that I think he or she is.
And at the same time, if we use data to power our most, you know, crucial systems, for example,

(09:19):
a self-driving car, we want to make sure that the data has not been tampered with, and if it
has been tampered with, that we know.
So that's where of course, blockchain comes into play.
And then the final step is: after we Watch for signals, Adapt with a long-term purpose, and Verify
everything that we do, then we have to Empower all our stakeholders to collectively design and build a better future.

(09:39):
And we do that through education, we do that through awareness.
Because only if you are aware of what's happening within your environment can you prepare and make better decisions.
So that's the WAVE framework: a very simple, four-step approach to help you prepare for the future.
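
For readers who think in code, here is the WAVE loop as a minimal sketch in Python. This is purely illustrative: the data structure, function names and example feeds are assumptions made for this sketch, not code from the book or from Futurwise.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    source: str            # where the signal came from, e.g. a niche journal
    claim: str             # what the signal says is changing
    verified: bool = False

def watch(feeds):
    """Watch: scan structured and unstructured sources for weak signals."""
    return [Signal(source=f["source"], claim=f["claim"]) for f in feeds]

def adapt(signals, long_term_purpose):
    """Adapt: keep the signals that matter for a long-term purpose,
    not just the next quarterly meeting."""
    return [s for s in signals if long_term_purpose.lower() in s.claim.lower()]

def verify(signals, trusted_sources):
    """Verify: in a deepfake era, check provenance before acting on a signal."""
    for s in signals:
        s.verified = s.source in trusted_sources
    return [s for s in signals if s.verified]

def empower(signals):
    """Empower: share verified insight so stakeholders can prepare and decide."""
    for s in signals:
        print(f"Brief stakeholders: {s.claim} (source: {s.source})")

# Example run with made-up feeds.
feeds = [
    {"source": "niche-ai-journal", "claim": "AI agents reshape junior hiring"},
    {"source": "anonymous-viral-post", "claim": "AI agents reshape junior hiring"},
]
empower(verify(adapt(watch(feeds), "AI agents"), {"niche-ai-journal"}))
```

Only the signal from the trusted source survives the Verify step, which mirrors the point about not acting on information you can't trust.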

Tim Staton (09:56):
So you say it's really simple.
However, most things I find that are simple are very hard to do.
So of those four things, which one do you see people struggling with more now at the strategic leadership level?
What are the things that you're seeing where you're like, you know, people need to get a

(10:17):
better handle on this?

Mark van Rijmenam (10:20):
I think it's a combination.
So I think it's a combination of the Watch part and the Adapt part, because I would
argue that there's a very small number of people in the world who actually understand what is
currently happening, and I consider myself one of them; because of the work that I do,

(10:40):
I have a good pulse on how the world is changing.
But very few people do.
And that's just because, you know, they're busy leading their lives, they're
busy taking care of their family, they're busy with their job.
So there's no blame here whatsoever.
That's just the way the world works.
But if we want to be ready for the future, we do need to become aware.
And there we run into an information overload problem, and I'm building a company called

(11:03):
Futurwise to solve that issue.
At the same time, we need to adapt and people are very reluctant to change.
And that's an issue that has nothing to do with the technology, but is a human component.
We do need to change, because if you don't change, whether as an individual, an organization,
or a country, you stand to lose in the long term.

(11:24):
That's why I say that I believe that by 2030 there will be two types of companies:
those that use AI and the latest technologies, and those that no longer exist.
Because if you don't adapt, you will lose out.
So I think those are the two components that are very, very difficult.
But at the same time, there are also a lot of scams happening.
So the Verify part is difficult too, because it has to do with the new technologies.

(11:48):
And if we're not aware of these technologies, then we can't prepare for them either.
And the Empower part is linked to all three of these.
So yes, it's a simple framework, but yes, it's also difficult to implement, simply because
of the nature of who we are as humans.

Tim Staton (12:04):
Yeah, because I was thinking about what you said, the Watch and the Adapt parts specifically.
So I could see how, because of all the information coming in, all the different signals, and
things are changing so fast and so rapidly, how do we know that we're watching the right things?

Mark van Rijmenam (12:22):
That's a very good question.
And I think you're correct to say that, you know, there's so much information
overload coming our way, so we need to be able to pinpoint and use the right data sources to
create a living dashboard that will tell us what is happening that is relevant to me.

(12:42):
And what's relevant depends on the type of organization, the industry that
I'm in, the country that I'm in, the work that I do: what kind of data sources are relevant, what
kind of news sources are relevant, so that I can prepare for that and then take action accordingly.
Now, that is difficult because the true insights are generally happening on the edges of the Internet.

(13:05):
And what I mean by that is the moment it hits mainstream news, you're already too late, because
if mainstream picks it up, you're too late.
And so with Futurwise, what we're building is exactly that.
So it allows you to find the relevant information based on your location, your company or your
industry, your preferences, et cetera, from the edges of the Internet, from the thought leaders,

(13:29):
the best journalists, the best academics, to understand what is happening.
So we can help you pinpoint the signals that are relevant to you.

Tim Staton (13:39):
I love how you put that, because there's a lot of noise out there, and weeding through
which signal to lock onto can be incredibly difficult.
Now, for the human aspect, with the change and the adaptation, you have a really good argument
that you're either going to exist or you're not going to exist in the coming future.
So I think that's a good motivation.

(14:00):
But for other people who are like, well, you know, I can hold off a little bit on jumping onto
AI, or I can hold off before jumping onto the fringes of the Internet.
What do you say to those people?

Mark van Rijmenam (14:12):
I would say you can't hold off because the world is changing so incredibly fast.
And like we say in Australia, you snooze, you lose.
So if you don't pay attention, someone who is paying attention will outpace you really, really
quickly and leave you behind in the dust.
So you have to adapt, and if you don't, that's fine.

(14:36):
But then you will notice that quite quickly the world has moved on.
And by the time you start paying attention, it becomes a lot more difficult to catch up.

Tim Staton (14:46):
No, absolutely. And so you mentioned also a very key part about data integrity and blockchain.
So where do you see that coming together in the future?
Because I know a lot of people are tying that strictly to crypto or NFTs or whatever.
But when you talk about data verification and making sure it's not tampered with, where is that

(15:07):
going in the future that you foresee?

Mark van Rijmenam (15:09):
Well, I hope that blockchain is going to be everywhere.
But blockchain is not sexy. It shouldn't be sexy.
Just like MySQL databases are not sexy, blockchain should not be sexy either.
It's very important, crucial technology for an Internet and a world where we can trust our data.
But it shouldn't be sexy.

(15:30):
It shouldn't be related to crazy monkeys selling for millions; it should power the whole Internet.
Let me give you an example.
So we have a system of self-driving cars in a city, where the cars talk to
each other, the cars talk to the traffic lights, the cars are talking to weather systems, are

(15:50):
talking to satellites. And all that data that goes through the system is owned by different
owners, different entities, and the machines, not only the humans but also the machines, need to
make sure that they can trust the data that they get from the traffic light, that the traffic
light has not been tampered with, or from the other car that's coming around the corner at very high speed.

(16:11):
That, as a self-driving car, I need to pay attention to.
So we want to make sure that the data that goes around in such a system has not been tampered
with, and the moment it has been tampered with, that we know, so that emergency protocols can step in
automatically and, for example, the cars stop driving, because there's a risk of collision,
because we can't trust the data anymore.

(16:31):
That is going to be an absolutely crucial component in a world where data is going to define everything that we do.
And if we can't trust the data because it sits in centralized systems that are easy to tamper
with, I think as a society we have a problem. So we want to use blockchain.
And over the years, a lot of development has been happening in this
space, and the technology is getting so much better that it is starting to become suitable

(16:57):
for these kinds of use cases.
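
As a rough illustration of the tamper-evidence Mark describes, here is a minimal hash-chained log in Python. It is a toy sketch, not a real blockchain and not any specific system mentioned in the episode; the sensor messages are made up for the example.

```python
import hashlib
import json

def block_hash(payload: dict, prev_hash: str) -> str:
    """Hash each message together with the previous block's hash, so
    changing any earlier message invalidates every hash after it."""
    data = json.dumps(payload, sort_keys=True) + prev_hash
    return hashlib.sha256(data.encode()).hexdigest()

def append_block(chain: list, payload: dict) -> None:
    prev = chain[-1]["hash"] if chain else "genesis"
    chain.append({"payload": payload, "hash": block_hash(payload, prev)})

def verify_chain(chain: list) -> bool:
    """Recompute every hash; any tampering shows up as a mismatch,
    which is where an emergency protocol could step in."""
    prev = "genesis"
    for block in chain:
        if block["hash"] != block_hash(block["payload"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, {"source": "traffic-light-42", "state": "red"})
append_block(chain, {"source": "car-7", "speed_kmh": 38})
print(verify_chain(chain))   # True: the readings can be trusted

chain[0]["payload"]["state"] = "green"   # tamper with an earlier reading
print(verify_chain(chain))   # False: stop trusting the data
```

A real deployment would distribute the ledger across the different owners Mark mentions, so no single party can rewrite it; the hash chain above is just the core tamper-evidence idea.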

Tim Staton (17:01):
You bring up some really good points, and some pretty scary points too, which reminds me of ethical
guardrails. So as we move forward, leaders are thinking, okay, we need to keep
up with technology, we need to integrate AI, we need to be fast, we need to be first,
or we're going to be irrelevant.
What are the ethical guardrails that people need to start thinking about as they implement all of that into their plans?

Mark van Rijmenam (17:28):
Well, I think the ethical guideline that I like to pay attention to, sort of as the
main ethical guideline, because then the rest will flow from that, is: am I taking into consideration all stakeholders?
So am I not just focusing on one group of people or just my shareholders, but am I focusing also
on my stakeholders, on my customers, my employees? And within those groups, am I focusing on all customers?

(17:52):
Am I not excluding one group because I've incorrectly built my machine learning models, because
the data that I've used to train the models was biased?
So that's one component. As part of those stakeholders, I think we should also include the future
generations, those not born yet, even from the long-term perspective, because they are going

(18:15):
to inherit whatever we are creating.
Whether it's a mess that we create, they will inherit the mess; or if we build a beautiful
world, then they will inherit a beautiful world.
And who doesn't want to give something beautiful to their children, to the next generation?
So I think as humanity, we have a moral obligation to take into consideration the future generations

(18:35):
that are going to inherit whatever we are creating.
And I think if we take that approach, when we build technology, we can create more responsible
technology, more ethically guarded technology that will benefit all stakeholders, not just a select few.

Tim Staton (18:53):
So that goes into your concept where you have, you know, the architects of tomorrow. And when
we are starting to think with that mindset, right, we're designing the future, we're designing tomorrow.
From an ethical standpoint, what are some of the educational requirements or the mindset shifts
that people are going to have to make, that you think might be a bit too far for some people to make?

Mark van Rijmenam (19:19):
Well, I think that's twofold.
One is we really need to help the next generation, Generation Alpha or Generation Beta, for which
this is the first year that they are being born, to understand what's going on.
Because I do believe that we have sort of sleepwalked into this digital age, and we don't really
grasp what is happening or how it's going to affect us.

(19:42):
Yes, we know how our smartphone works, yes, we know how TikTok works.
But do we truly grasp the implications of a large language model?
And my answer to that is no, we don't.
And that's a responsibility, on the one hand, for society: to focus on education, to see education
as the most important pillar within society to educate the next generation.
And as part of that, we educate children to understand how these technologies work. What does that mean?

(20:08):
It means that I'm a big proponent of banning social media or AI agents for use by kids, because
they don't understand them and their brains are not ready for that.
Especially with a large language model that's so happy to answer all your questions in an affirmative way.
We need to have a different approach, and we need to educate them on what these technologies mean,

(20:31):
how they're going to affect their minds, and how they can use them responsibly, and not give an
AI agent to a 13-year-old who doesn't understand it, who believes everything the AI agent says
and consequently commits self-harm or worse, of which there are unfortunately too many examples
already. And the fact that the companies who are building these technologies don't really seem

(20:55):
to care about that, but only care about their EBITDA, their bottom line, I think is extremely
worrisome and a failure of our society: that we can't
step up there and that we can't protect the next generation.
So to do that I think we should have a different mindset, a different perspective of how we

(21:16):
bring technology to people and into our societies, in a way that cares less about the bottom
line of those who are creating the technology, and more about designing and
creating a better future for all of us.

Tim Staton (21:32):
Yeah, I was going to ask you about that when you were talking about children and AI and everything else.
There are interesting studies coming out that the younger generations
are more anxious because of the technology.
And so how do we develop that balance between educating them on the basics of it, the background
of it, how it works, without making them more anxious, or making people more anxious?

(22:00):
Because I agree with you on the whole social media thing, that it's completely destroying kids
and teenagers and everything else from a human interaction standpoint.
But I was just curious about your thoughts on that.

Mark van Rijmenam (22:12):
Just like we don't allow kids to drink alcohol until they are 18 or 21, depending on the country,
or we don't allow kids to drive cars until they're 16 or 18, again depending on the country,
we shouldn't allow kids to just use these tools without any guardrails.
You know, they will just dive in, they will play with it, they will use it.

(22:32):
And as a consequence, you know, they might misuse it, not intentionally, but
simply because nobody taught them how to use it.
You know, if you just give kids access to alcohol, they will drink it and they will ruin their brains.
We don't allow that to happen.
But at the same time we give them access to extremely addictive technology that ruins their brains as well.

(22:53):
And we just allow that to happen as a society.
It blows my mind that we do that.
And it's not that I say we should never give them access to these technologies; ideally not before 16.
But that doesn't mean that during school we should not give them access to this technology in a

(23:16):
guarded, guided fashion, where we explain to them how social media works: that yes, there is a risk
of cyberbullying; that yes, there's a lot of misinformation and crap on social media; that yes,
there are a lot of bots and trolls that don't care about you but only want to push a message.
Nobody is telling the kids this kind of information and that's where the problem resides.

(23:37):
So if we can first educate the educators, so that they can then educate the kids, and we
do that in a way that benefits first and foremost the kids and not the companies building the technology,
I think we are moving in the right direction.

Tim Staton (23:53):
No, absolutely. I think sometimes some of these businesses kind of take, like, drug cartel models,
where they're like, oh, if I can make everybody addicted to it, I have a, you know,
continuous consumer supply, instead of making society better.
So I completely agree with you on that.
So I appreciate your standpoints.

(24:14):
So when we come to the education piece, we've talked about the ethical guardrails and the safe
places in there. What other aspects do you feel have a disruption coming
that people may not be prepared for?

Mark van Rijmenam (24:32):
Well, I think one important area is jobs and the future of work.
You know, we are currently in this strange position that pretty much across the world there's
record low unemployment, like record low unemployment.
But at the same time we have these powerful technologies being built that are going to upend

(24:56):
that, that are going to have a big impact on that.
And the reason why is because we live in a capitalist society, and AI and automation and robotics,
together with capitalism, are like a perfect storm to completely disrupt the job market. And why is that?
Let's say you have a company and a competitor and your competitor is integrating AI and is integrating

(25:20):
automation, and therefore can produce faster, cheaper, better, more efficiently,
and can therefore reduce the cost of the products, so more people will buy them.
So the company becomes richer and richer and bigger and bigger.
If you don't do that as a company, then you're going to lose out, guaranteed.
So you start to implement AI and robotics and automation as well, to keep up with the capitalist drive within society.

(25:46):
As a result, I do believe that by the end of this decade a billion jobs will be lost, and that's
roughly 20% global unemployment.
And if you remember the Arab Spring in the early 2010s, that was predominantly because there
was record-high unemployment of roughly 20 to 25% in the region.
So you get a bit of a feeling of what's about to happen.

(26:07):
And you know, a lot of companies, especially big tech, now say, oh, you don't have to worry, there
will always be new jobs.
And while that might be the case, I do think that the number of jobs that will be lost will vastly
outnumber the number of new jobs that will be created.
And we can already see this happening.
There's a gap happening between the junior hires and the senior hires, where the people

(26:31):
who have more understanding, more education, and have been longer in their careers, there will be more jobs for them.
But the junior jobs are going down because AI can do a lot of the work.
Let's say in the legal space.
You can either hire a junior analyst for your law firm who works eight hours a day, might call

(26:52):
in sick, needs a holiday; or you can just hire an AI agent that does the work for you 24 hours a day,
seven days a week, and never needs a holiday.
The problem with that is that if you start doing that, and a lot of organizations are doing that,
it's a very short-term approach, because at some point in the near future, your mid-level and your
senior-level employees need to move up, and there's no one to replace them to fill that gap.

(27:16):
To finish my point: you might have seen research that says 95% of
organizations that apply AI are failing.
That might be the case because the technology is still evolving, still developing.
But fast-forward two to three years, when these AI agents become ever more powerful, much
easier to integrate, and much more capable of connecting the data autonomously.

(27:39):
Then we'll see a lot more people, a lot more organizations, start to implement this, and we'll
see a big uptick in job losses across the globe.
So disruption happens in two phases: first slowly and then abruptly.
And that's exactly what's going to happen.

Tim Staton (27:55):
I think, excuse me, you alluded to the Arab Spring and I agree with you.
Idle hands do the devil's work.
So how can people stay ahead of this curve?
And what industries do you think are going to be more impacted than others?
Because also, from talking to different people in different industries, there's a shortage
in, like, welders or electrical engineers.

(28:17):
So it's kind of like there are some things that AI and machines or robotics can't do that humans can do.
So what industries do you think are going to be more impacted than others?

Mark van Rijmenam (28:29):
So a lot of the repetitive tasks can be automated.
A lot of the knowledge work can be automated.
You know, marketing, we can clearly see that: you know, image generation and copywriting, creating content.
And that's something that is already being disrupted as we speak.
But to your point, like the welders, you know, the plumbers, where we need, you know, our hands

(28:50):
to do difficult work that a robot cannot yet do. That doesn't mean they will never be able to,
but at this point it's still very, very hard for a robot.
And because it's hard, it's very, very expensive.
So yes, stuff where we use our hands, I think, is something that can still be safe for the medium
term, as well as where we need social interaction. Healthcare.

(29:14):
The nurse next to your bed, you know.
Yes, you might have a robot surgeon working together with a human, and that's fine.
But I would not want to have a humanoid at my bedside when I'm in hospital.
You know, I want to talk to a real person. Or, like, you know, in hospitality.
Yes, you can have a humanoid serving your food, but I also go out for dinner to have the conversation

(29:37):
with the people in the restaurant and, you know, go from there.
I do think there will be a division in society where the lower priced products or services will be automated.
So if you buy something from a very cheap clothing brand and you want to return it, you are
going to talk to an AI agent, I can guarantee you that.

(29:57):
But if you buy, you know, an exclusive $1,000 bag and you have a question
about it, you are going to talk to a human. I can guarantee it.
And that's a problem that we will have that division.
But I do think that, you know, when we work with our hands or when we work with

(30:18):
our brains for social interaction with humans, I think those areas will remain safe for a long time.
You know, as a speaker, I'm a keynote speaker as well.

Tim Staton (30:26):
Yeah.

Mark van Rijmenam (30:26):
I don't think a humanoid will replace me anytime soon on the global stages.
But again, I'm a luxury product from that perspective.
So, you know, when I talk to students and fresh graduates, I always
tell them: educate yourself on the technologies and find a way where you can leverage your empathy,

(30:48):
your emotions, your creativity, stuff that AI is not yet very capable of, or stuff where we
value the human interaction more than we would value the AI.

Tim Staton (30:59):
I think those are amazing points. And as you mentioned those for the future,
I'm also thinking about what things I
can do today that can impact tomorrow, the short term and the long term.

(31:22):
Because even though I do talk about, you know, what you can do today to make a change, I'm also
thinking about what I can do today that's also going to have a longer-term impact, that's going
to make my business better, my personal life better.
So on this topic, what is something that we could do today to help us out?

Mark van Rijmenam (31:39):
Well, apart from learning what's happening, learning these new technologies and staying up to date
and becoming more aware of what's happening, I think the most important thing is that we need
to achieve a change in perspective on how we look at the world.
And in the book I call it achieving a gestalt shift.
We need to see the world through a different lens because we need to start to understand as

(32:03):
individuals as well as humanity that my reality is not the reality, but that there's a multitude
of realities that are all equally valuable.
And why do I say that?
So in the book I discuss another book, which is called An Immense World by Ed Yong, which
is a fascinating book where he describes that every organism has its own so-called umwelt.

(32:29):
So it sees the world through its own perspective.
And the best example is a bat.
And a bat uses echolocation.
A bat can never experience a sunset.
But for a bat, its own world, its perspective of how it sees the world is really valid and is the reality.
But if we zoom out a little bit, then the bat's reality is completely different from our reality,

(32:53):
your reality or my reality.
And I believe that we need to start to understand that, and we need to start to appreciate
that there is a multitude of realities that are equally valuable, and that our reality
as humanity is not the reality; we need to look at it from a wider perspective. Does that make sense?

Tim Staton (33:14):
No, that makes sense. I always have a saying:
I always say that perception can be reality, right?
So just because somebody else perceives something differently than I do doesn't mean that they
don't feel the way they feel about the same thing that we're interacting with.
And so when we talk about feelings and perceptions and interaction, you know, spot on with that one, too.

(33:36):
So my last question for you then is: is there anything that we didn't talk about, or was there
a question where you were like, you know, I really wish you had asked me this, but you didn't?

Mark van Rijmenam (33:50):
Well, I think, you know, maybe: I consider myself an optimistic dystopian, which is a bit of
an oxymoron, but it helps me see the good, the bad and the ugly.
And maybe the question would be, you know, should we be optimistic or should we be dystopic about the future?
Because I get that question a lot where, you know, there's so much disruption happening and
there's going to be a lot more coming, I can guarantee you that.

(34:13):
But in the long run, should we be optimistic?
And then my answer would be yes. I think, you know, the power of these technologies, of how
they can influence society, how they can improve healthcare, how they can prevent us from becoming
sick, how they can allow us to live longer, how they can allow us to spend more quality time with
our loved ones regardless of where we are in the world, how we can uplift entire populations

(34:36):
that don't have access to the Internet at the moment through personalized, trustworthy education.
I think these kind of things are fascinating.
But yes, the next five to 10 years, chaos will increase.
There will be more disruption.
So we need to have a lot of resilience to push through that, a lot of perseverance to see that
at the end the glass is half full and not half empty, and that we need to push through.

(35:01):
But if we do that, I do believe that there will be abundance on the other side.
And I don't think that Terminator will roam the streets anytime soon.
But we need to be aware of what is happening.
And I do believe in the positive mindset of humanity.
And we all, most of us, want a better life for the next generation.

(35:23):
And if we take that perspective, I think we will do very well in the end.
But it's going to be hard.
It's going to be hard work.
It's going to require resilience and perseverance and a focus on that positive outcome.
But if we do, I think we'll live in a world full of abundance.

Tim Staton (35:42):
Well, I appreciate your final thoughts on that.
And like you said, you know, you're an optimistic dystopian. So, well, as we talked about earlier,
I was very excited about having this conversation with you.
I was like, oh, man, this is so awesome. Because I love technology.
I love the futuristic stuff.
I love where we're going. At the same time, it's kind of scary, you know, how we're going to
get through it and all the ethical guardrails that go into it.

(36:04):
And you just summed it up so nicely.
So, Mark, I really appreciate it. For everyone listening to this or watching this on whatever
platform you're doing it on, in the description of this episode
I'm gonna have the link to the book and the LinkedIn and all the socials for him, so that way
you can reach out to him, get in contact, see his stuff, and
see where we're going in the future, because there are not too many people out there really

(36:27):
talking realistically about where we're going to go, instead of, you know, just "we're all AI
and Terminator is going to come get us at the end."
So I really appreciate your perspective and thank you for being on the show.

Mark van Rijmenam (36:40):
Thanks for having me.

Tim Staton (36:42):
Absolutely.
As always, thank you for stopping by and checking out this episode and listening to it.
I really hope that you enjoyed it.
Before we go, I'd like to ask a favor of you, if I could.
Please share this episode with one or two people who you think might like this
topic. If you haven't followed or subscribed on the platform that you're listening to, hit
all the bells and icons and all the whistles, so that when we post another episode, you'll be alerted.

(37:06):
Please go ahead and do all that before you go.
If you got some value out of this episode, please leave a review or a comment so we can help
spread the show to other people who might be interested in the topics that we've talked about
here today, but may not have found our show yet.
Again, thanks for stopping by. I'm Tim Staton, stating the obvious.
