Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jesper Conrad (00:20):
Today we're together with Nicklas Bergman.
And first of all, Nicklas, welcome.
It's good to see you in your little spaceship here.
Yes.
Thank you.
Nicklas Bergman (00:30):
The ISS.
Sorry if the downlink is not that good, but we'll do what we can here, right?
Jesper Conrad (00:35):
Yeah, yeah, yeah.
So, first of all, a greeting to one of our listeners, Mark, who's also a personal friend, who has listened to a lot of our podcast and reached out to me and said, I think you will be able to have an interesting conversation with Nicklas.
He has read a lot of your books, and I took him up on it.
So here we are. The first thing that I'm curious about, and it might
(00:59):
be simple for many, but not me: you are a so-called futurist, Nicklas.
What does that mean?
Nicklas Bergman (01:08):
Actually, lately I've tried to get away from that description and labeling, partly because it's being overused, and partly because there are so many well-educated, really smart people working with futurology and forecasting.
And I don't want to piggyback on their expertise in actually
(01:31):
building future scenarios and so on.
So lately I've started to call myself more of a tech explorer, eternally curious about where we're heading.
I always live in the future.
I'm never in the present, more or less.
I'm always looking forward to the next thing, for good and bad.
Meaning that, yes, I work with preparing myself and my audiences,
(01:55):
whatever audience it is, for the future.
But I don't want to pretend to be something that I'm not.
A futurist is not like a lawyer or an auditor, where you have a strict scheme, or a doctor, where you have to go through a specific education to get a degree. But still, out of respect for real futurists, I'm maybe a hobby futurist and more
(02:17):
of a tech explorer, if that makes sense.
Cecilie Conrad (02:19):
So looking into how technology and the new developments in technology are affecting the future.
Nicklas Bergman (02:32):
Well, affecting us and how that will play out in the future.
I mean, my day job is as a venture capitalist, very early stage, what people now call deep tech: science-based innovations, spin-outs from universities, startups with a high technology component, usually with a high
(02:53):
technological risk.
That's my day job.
In that sense, being an explorer or a futurist or whatever you call it, for me, is to try to understand where technology is heading so that we can create the best opportunities for a company that is going in a specific direction in a specific technology field.
But it can also be working with governments and trying to
(03:16):
understand how changes in technology, how changes in innovation, how new innovations will change society.
Or with a company, a large established company, looking at how do we prepare for what's to come?
Because we've seen now with all these digital technologies, and now AI is everywhere for good and bad, that companies and
(03:40):
organizations and governments are really, really struggling to understand: what will this mean?
How can we prepare, and in what direction should we go to prepare the best we can for this uncertainty, so to speak?
Jesper Conrad (03:58):
And it is a big
uncertainty.
Cecilie Conrad (04:01):
So can you just
give us a short version of that?
Nicklas Bergman (04:04):
What I find interesting when looking at these things is that it's really, really interesting to look back.
And I just read this great quote: the four most expensive words in English are "this time is different."
And that was coined by this hedge fund investor in 2000,
(04:25):
right after the dot-com boom and bust and all that.
Meaning that rarely or hardly ever do we see occurrences or instances that have not happened before.
I mean, now it's AI, but if you look back and understand the history, what we've gone through since the beginning of
(04:46):
industrialization, we can see that the introduction of the steam engine or electricity or the computer goes roughly the same way.
Meaning that we go through a time of experimentation and turbulence, we don't really know how to use these things, and then that eventually stabilizes, we start seeing interesting use
(05:08):
cases, and then we get a huge change in society.
With some technologies, this takes a long time, and with others it goes a lot faster.
And of course, today, if we're distributing a new AI tool, we can do that in minutes or seconds, and we're global in an instant.
(05:28):
Of course, it's different from building steam engines or laying down railway tracks, right?
So the societal change in some aspects is faster now, especially if it doesn't have to do with infrastructure build-ups.
But still, our fear or amazement, or whatever emotions we
(05:48):
have around a new innovation, we've seen that before.
Cecilie Conrad (06:17):
Yesterday we had some friends over for dinner and talked about ChatGPT. One of them, a teacher, has to do a recap of the lessons she's giving and send in a little summary every day after work.
And she thought, that's just such a waste of time to do, because the core of what I'm doing is teaching, not proving that I'm teaching.
So she started using ChatGPT to just record the whole session
(06:42):
and then do a recap, review it, and send that in.
And we talked about: is that cheating?
It's not cheating, it's just using technology in a way that's faster.
And then we compared it: is it cheating to use a phone instead of shouting?
It's just a new piece of technology, and I think maybe
(07:05):
with phones, people were frustrated by the fact that we could talk to each other over long distances, that we didn't have to travel, and that it would ruin everything.
Did it really?
Did it really?
Nicklas Bergman (07:15):
I mean, it's really interesting.
When does it turn into cheating?
I mean, somewhere along the line it does, right?
I don't think this is an example of cheating, definitely not.
But two things come to mind.
One is the Amish people in the US.
They are traditionalists, they don't use that much technology, but as I understand it, they also have to decide how
(07:39):
they should use everything that is coming.
Meaning that it's based on how much this innovation, whatever it is, will change their way of life.
Then we can discuss whether it's the right decision or the wrong decision.
That's up to each and every one to decide, right?
But at least they are thoughtful about how they use
(08:00):
these things.
That's one thing.
And then another is, I think it's what is called the Dubai Future Foundation that just launched a set of labels indicating whether a video or a text or an image is anything from 100% human generated to 100% AI generated.
(08:21):
So they have a set of labels, and we can go anywhere between these two extremes, to collaborating with AI or whatever it is.
So I think there are different ways of looking at this.
And for your friend as a teacher, my cynical guess is that that report she files, no one ever reads, meaning
(08:43):
that it doesn't really matter what you write in it, and I think that's an excellent use of ChatGPT or whatever software it is, because it's summarizing something that you know about.
Yeah, I use these tools all the time, and I use them as brainstorming tools when writing something.
(09:05):
I use it not for fact-checking, but for fact-finding, probably, just to get a lot of information in, condense it, and try to understand it.
So for summarizing and that, it's interesting, although I never trust it fully, of course.
But what's interesting is that the instances where I use it, where I get the best use out of it, are in areas that I'm at
(09:31):
least somewhat knowledgeable about.
Where I am, for example, evaluating an investment case.
I've done that for the last 25 years, off and on.
Meaning that using an AI, a generative AI model, to look at a case can help me find different angles.
I don't do it all the time, but sometimes when I do, I get a
(09:53):
whole set of questions or summaries, of the market, whatever it is, and I can almost always see that, okay, they got this thing wrong and this thing wrong.
This is totally way far out, so far away from the actual
(10:13):
case that I can immediately see that this is useless.
But every once in a while I get an input, an in-road, a question, something that fuels my curiosity, meaning that I can go in a direction that I didn't really think of.
Meaning that these tools are very useful for me if they're in
(11:12):
an area that I know of, meaning that I would never use them in
(11:57):
totally new areas.
Unless I'm very well aware of the fact that they're hallucinating and making things up.
So I think the teacher example is good.
Cecilie Conrad (12:08):
It reminds me of some 30, maybe not 30.
How old am I?
25 years ago, when I was done with university.
I was studying at the University of Copenhagen, which is a traditional, conservative university. I'm a psychologist, and I was speaking with a new student who
(12:29):
also studied psychology, but at an experimental university doing only case studies, and we discussed the value of these conservative values: knowing the history of science, the history of philosophy, the history of psychology.
I had been reading so many books just to know things, just to have a lot of facts in my mind before we started doing more
(12:50):
case-specific things.
And she was way younger than me, and she said, I don't need all that base knowledge because I can just look it up.
Now we have the internet.
And I thought, no, you can't, because you don't know what you're looking for.
You don't know what you don't know.
Sometimes I talk about it like this: when we need to think, we have a
(13:13):
lot of dots, those are our knowledge points, and we have to draw new lines between them.
If you don't have the dots, there are no lines to draw.
It's a little bit the same: you can't use AI to explore something you know nothing about.
Nicklas Bergman (13:27):
No, and in the same way, you can't start thinking about things you don't know anything about, and people do, and I think that's the problem with this technology: people use it in the wrong circumstances, and that is definitely a challenge.
And I agree, in my studies, from first grade up through university, and my studies after university, there's a lot
(13:49):
of that that I never use, of course.
Some of it is useless information that I had to learn.
I remember I studied German, and I had to learn, like a riddle, 23 German verbs that govern something, and it was so
(14:09):
useless.
I still know it.
I'm not going to do it today, but I still know these 23 German verbs, and it's totally useless.
So what they should have done is let me spend six years trying to learn some conversational German so that I could order food or have a conversation with someone.
But that's the flip side of it, where you learn good
(14:31):
things you never use and useless things that you never use.
But as you say, that's sort of my knowledge base, and then I build on that, meaning that subconsciously, somewhere, I think all this information is useful, even though I never think of it.
But maybe subconsciously I can connect dots between my
(14:54):
German studies and something else.
I don't know.
I can still basically read a German newspaper if I force myself to do it, and maybe that's good.
I don't know.
But what I also find interesting is that information is so easily accessible today, meaning that
(15:16):
for the next generation, for our kids, and I can see this with my kids, they learn in a different way, which I think is good.
There's a lot more reflection, a lot more stating your arguments, making sure that they hold, that they are solid,
(15:37):
checking your sources, being critical of your sources.
All these things: when I was a kid, I never discussed in school the value or the quality of my sources, because the sources were just there.
There were the encyclopedias, for example, and we knew that
(15:58):
this was the truth, or at least it was the truth as we saw it, as everyone understood it.
So I think they approach knowledge in a different way that is a lot more useful today, with the reflection, with the critique, with a lot more argumentation.
And I think these things are really important.
But I agree, you need a basic understanding of
(16:23):
the world to be able to have these discussions.
Absolutely.
Jesper Conrad (16:27):
Nicklas, I would like to hear your thoughts on how all this new technology, AI, social media, which have been here for a long time now, is interfering with social life.
What are your thoughts on this going into the future, or some years down the road?
I will start by referring to an episode we had with a young
(16:52):
man in the start of his 30s who had decided to turn off the internet.
Not the whole thing, it's still here for the rest of us, but he only accessed it something like three hours a week.
And one of the things he found was that it had been a social appetite suppressant for him.
(17:13):
That was how he presented it.
That when he turned it off, he saw that he went out more and became more social.
Then, reflecting more on it, I work with a Canadian psychologist, Gordon Neufeld.
If I put his lens on, then he is more or less saying, make sure you cover the attachment approach first.
(17:37):
That yes, computers can be scary, social media can interrupt stuff, but not if the base of human interaction and attachment is solid, founded with your parents, for example.
And I am weighing up these two options. Turning it off
(17:57):
completely, I find the guy lovely, but it's also too wild for me.
And then when I look at our young adult daughter, who is 26, they have evenings where they play card games and stuff.
So part of me is looking at all this technology and social
(18:17):
media thinking that maybe we who are 50-plus are the ones who have had difficulty learning it.
And the younger generation are handling it in a way better way, where they're actually more in reality.
They're doing more social stuff than we do.
Maybe it's just part of being young and not being caught up in
(18:38):
having a family and work and all that.
I'm not sure.
So it's quite a broad field I'm trying to fold out, but where do you see AI, technology, and social media affecting human connection?
Maybe that's what I'm asking about.
Nicklas Bergman (18:58):
Technology creates this social connection between us today.
I don't think you would have traveled to Stockholm to do this interview, right?
So I mean, that's one perspective on it.
What I find interesting is that our relationships to these tools, these gadgets or these apps, change over time.
I remember my first emails in the mid-1990s, or how I used email
(19:23):
in the 1990s.
It was the only time I was employed, for a couple of years; otherwise, I've been self-employed my whole life.
And the way that I used it then is completely different from how I use it today.
In which instances I use it and how I write my emails is very different today from what I did then.
Meaning that with emails, at least for me, I know that people
(19:46):
struggle with getting hundreds of emails each day if you're in a large organization.
I don't get that.
For me, it would be horrible, but at least I feel that I have some control over the way I use my email, for example.
Then you have Slack channels, which I try to avoid as well.
You have WhatsApp, which works really well for me, for example.
(20:07):
So there are all these different types of communication that you use under different circumstances for different purposes.
So I think that whatever it is, we develop a way of using these things that hopefully works for us, at least better than it did at the beginning.
Take social media then, Facebook, for example.
(20:29):
I think I was an early adopter, really curious about all these new things.
I think I was quite early on Facebook and LinkedIn and Instagram and the like.
The last time I was on Facebook was when I had a birthday four years ago, and I thanked people who congratulated me and haven't been there since.
(20:49):
And before that, it was maybe once a year, when it was my birthday.
So I don't use Facebook anymore at all.
I rarely log into Instagram, and I don't use X these days, for a number of reasons.
I appreciate Threads sometimes, if I'm able to curate my feed in a good way, or LinkedIn, where
(21:11):
I get information, news, and interesting content.
But I also find myself doom scrolling. For me, it was Instagram.
My whole feed suddenly was small houses for some reason.
Because I stopped at one of those, or clicked on one of those, and the houses got smaller and
(21:35):
smaller each time I went back in.
So I don't use these things that much.
What I'm trying to get at is that the way we use these things changes over time.
For our generation, which is pre-internet, I think it's a constant struggle, but finally you start using them in the right way.
Or in a way that potentially works for you.
What I find fascinating is when I look at my kids, for example.
(21:59):
I've got three kids: 20, 18, and 14.
My 20-year-old logged out of more or less all social media.
He's not interested.
And he's gone full circle, reading a lot of physical books, being very much in the present, working in the outdoors
(22:19):
right now, in between high school and, probably, university.
So he's very much away from that.
And then look at my 14-year-old, who was born after we had smartphones and has lived her whole life online.
And for her, it's constantly available, and she talks to her
(22:43):
friends in the morning; they have group chats or video chats when they get ready for school.
They constantly interact, and in some way, I find that very social: engaging, interacting in a good way.
But then, of course, she probably spends what I think is too
(23:06):
much time doom scrolling on TikTok, for example, because it's so addictive.
So that's the other side of it.
But I can definitely see that part of the way she uses these things is positive, meaning that in one way she can probably handle it better than I can.
But then, of course, I really force myself not to use these
(23:28):
things.
I don't think she has that same perspective or ability as of now.
And then we have the third example, my 18-year-old.
She's very creative; she's studying animation and game design today.
She really loves that creative side of it.
She draws a lot, both with pen and paper and
(23:51):
with digital tools like iPads.
And she's also diagnosed with autism type one, which is high-functioning autism.
And in our discussions with her, and also with her teachers and people in psychology, for example, because we've discussed this with screen time and that,
(24:16):
for her it's sometimes a way of comfort and disconnecting.
She disconnects from her physical life because she's overwhelmed with impressions from large crowds or noisy environments or whatever it is.
Then she can put on a pair of headphones and really watch a
(24:37):
movie or whatever it is.
So for her, sometimes it's a very good thing to use these tools.
So I have very mixed emotions, because I'm well aware that at least my two youngest kids are using some of these things way too much.
But then suddenly I see something that they create
online (24:55):
a short movie with friends.
They spend the whole weekend doing a short movie about something, or creating an edit on something else and posting it online, and they get a lot of feedback on that.
And people are cheering and thinking, Oh, this is good.
When will you put out the next one, etc.?
So again, I think it is complicated, and I sometimes see
(25:20):
that our kids can handle this and take out the good parts of these tools better than we can.
But of course, with Facebook, Instagram, and TikTok, I'm very well aware that their whole business idea is: the more time we spend in front of these screens, the better it is for
(25:41):
them, the more money they make.
And unfortunately, TikTok especially is really good at getting us hooked on this.
So there's definitely a flip side.
But I'm just trying sometimes to see the positive things in these, because I also realize that it's very hard.
You mentioned this guy who disconnects apart from three
(26:02):
hours a week.
I think that's impressive.
It all depends on what you do, what kind of work and what kind of life you have.
I think it's really impressive, and maybe I would like to try that.
But for me, the internet and these generative AI systems and all this fuels my curiosity in a very, very
(26:28):
interesting way, meaning that I sometimes go down these strange rabbit holes, too far down some of them.
And it's made possible by this vast trove of information that we have online.
So for me, I don't think I would like to disconnect, but I want to handle it in the best way.
(26:48):
And on the other side of that, it's about trying to understand and learn how to use these tools in the way that works best for you.
As I said earlier: being curious and a bit skeptical, making it work for you, rather than someone else pushing it onto you.
Ten years ago, I went fully digital.
I was only reading on Kindles and online and all that.
(27:11):
And I didn't really enjoy that, I realized.
Meaning that now I'm back to buying a lot of physical books.
So whether it's fiction or non-fiction, I try to read it in physical books.
If it's research reports or whatever it is, then of course it's still PDFs online.
I don't print them out.
(27:33):
But when it's more reading for pleasure, or getting information in a different setting, in a different way, I very much prefer physical books today, which for me swung the pendulum back to actually being in the physical world a lot more than I was 10 years ago.
Cecilie Conrad (28:35):
One perspective in this conversation could be that we're talking about how we use, or how we relate to, the technologies available and the way they play out.
You mentioned how your children are using technology and social
(28:56):
media, and we all worry about that.
We see how it's unfolding in our society, and we think about how we relate to that, how we act, what the plan is.
So that's one thing we're talking about.
And what I hear is something along the lines of: we have to stop and think.
You talked about the Amish people, and basically maybe we
(29:18):
don't want to live exactly like them, but what we can admire is this: we have a lifestyle, we have a way that we like, and
(30:09):
we're not changing it just because a new shiny object is flying by.
Nicklas Bergman (30:15):
No, exactly.
They're questioning the need for it, whatever it is, yeah.
Cecilie Conrad (30:19):
So maybe the stop and think is the centerpiece, and that's what we're doing with a conversation like this.
And I'm thinking we're talking about how we can protect ourselves from the negatives, and also how we protect our children from the negatives, and what are they really?
(30:41):
That's one angle of what we're talking about, right?
You say they spend too much time, but then you also mention game design and rabbit holes and all these things that we all know both exist.
Nicklas Bergman (30:55):
Yeah, I think we agree that it's not the technology per se, it's the way we use it.
Just another example, my son.
Now he's traveling, he's on the other side of the world today, but normally he's meeting up with his friends, doing something, going to the movies or playing soccer together on a Friday night, and then they split up, everyone goes back to
(31:20):
their own homes, and they log on and play games together, meaning that they continue to be together, but in a different way.
And that's not necessarily a bad thing, as long as they go outside and actually do something physical, whatever it is, right?
As a parent or as an adult, you think, oh, he's in front of his
(31:42):
screen again, and then you realize that he's chatting with people he spent the whole day with, so they continue this.
Or my 18-year-old daughter: her best friend is not living in Sweden.
She's been living abroad for many, many years, and their only way to connect is online, which they do all the time, which is
(32:03):
really, really good for her.
So again, use it in the best way possible, but at least for yourself, question how you should use it so that you're in charge of these things, so that they don't take over and decide how you should use them.
My youngest daughter is totally online all the time, more or less.
(32:24):
And I think even they realize that they want to be physically interacting with each other.
I mean, meeting, socializing like this.
And the result of that is, for example, that for the last couple of months or so, she's been meeting with friends on Friday
(32:47):
nights, playing board games.
Which is really, really cool, that they actually do that.
They probably sit with their phones anyway, but at least there are five, seven, ten people in the same room around the table doing something together in the physical world, and also listening to a lot of music, watching YouTube
(33:08):
and TikTok with dances and whatever it is.
But then last night she went to a live concert in Stockholm and was just thrilled, loved it, had so much fun with her friends.
So I can actually also see the shift where she matures in the same way that we mature, and you start being in charge of how
(33:34):
you use these things, and that's something very positive.
And I think it's about realizing that you mature, that I mature and start using these things in different ways, that my kids mature and start using these things in different ways, playing board games or logging off or whatever it is.
Cecilie Conrad (33:49):
If the check question is, are we living the life we want to live right now?
Then doom scrolling: no one wants to do that a lot.
Maybe you do want to do it to, as you said, just disconnect, chill.
Maybe some people like it; I personally don't, because I notice how I feel afterward if I spend some time looking at
(34:09):
stupid videos. I very rarely do that, because it affects me for a long time after.
Other people don't have that.
So maybe the check question is: are we living the life we want to live?
Maybe 10 years ago, we would use the word reality about everything you'd call the physical world, and our
(34:30):
kids got really annoyed with us.
It's not reality when I'm playing an online game with my friends?
This is part of my reality, it's real, I'm not hallucinating.
There's actually a game, and we took that critique and learned: wait a minute, this is part of reality.
Nicklas Bergman (34:47):
It is, and it's
so easy to be critical about
these things with your kids.
So there's quite a few thingsthat I remember now.
One one is the Apple, they'rereally good at making
commercials, they are soemotional sometimes.
I think I've even cried oncewhen I saw an Apple Advert, and
I think it was that one wherethe family is going to the
(35:07):
grandparents over Christmas, andthe kids are only looking at
their screens.
And after three, four days, thethe this sort of eruption, and
then the parents and thegrandparents are so upset about
this, and then the kids showthat they did this video about
Christmas, spending Christmaswith the grandparents, and then
(35:27):
you suddenly use the technologyin a good way, which is what
Apple does.
I mean, they want you to buythese things, of course.
But I've also seen that with mykids where you think they spend
too much time on the screens,and suddenly they come back with
a short movie, or they did onething, what is that last year
for Father's Day?
To me, really great short filmabout a nerdy thing that I like,
(35:53):
and then you realize okay, theyspend time online and with
these screens, and they'reactually being created.
I was at this huge techconference, what is that, 10
years ago.
My son was 10, 12, something,played a lot of Minecraft with
his friends, and I came backfrom this tech conference and
had this Minecraft sword, likerubber, pixelated by like a
(36:16):
rubber sword, and I gave it tohim.
And I was so happy because Ifound this great gift for him
when I was abroad.
Something that connected to himand we were interested in, and
also easy to pack and didn'tweigh too much when you were
flying back home, right?
And then I gave it to him andhe said exactly the same
reaction as your kids, and say,Thank you, Dad.
(36:37):
I already have 100 of these inthe real Minecraft, so you just
it just collected dustsomewhere, never used it.
And that's exactly what you'resaying.
For him, the reality was onlinein this game.
The reality in Minecraft wasnot the physical sword, it was
(36:59):
the actual 100 swords that hehad in Minecraft.
Then I did a Minecraft zombiepig costume for him once for
Halloween, which was then backin what I see the physical world
and outside of Minecraft.
So, well, I think sometimesthey teach us a lot about how to
relate to these things.
(37:19):
Definitely.
Cecilie Conrad (37:21):
Sometimes I've noticed I have to be a helping hand for them growing up, right?
So we are some sort of, I don't know, guide, or we have to be someone they can lean on with many things.
And sometimes I don't have the answer, but at least I have the question, and we explore together.
(37:42):
Is this playing the role that it should play in our life?
Is it getting in the way of our goals, or, I mean, are we basically happy?
Do we feel content when we wake up in the morning?
Are we looking forward to our days?
Those kinds of questions I can ask with them.
And I have to be open and listen if they say, but what I
(38:03):
want to do right now is to explore this game nine hours a day.
I'm having fun with it, and a walk in the afternoon to get some sunshine is enough.
I'm not going to do this for 10 years, but right now this is what I want to do.
Fair enough.
And sometimes, when it really makes me feel uncomfortable, because it does sometimes, I've noticed that if I let go and let it
(38:25):
run its course, then maybe it looks like doom scrolling, but after 10 days, I realize it's actually a rabbit hole.
They're exploring something and they're figuring something out, and sometimes they're figuring out doom scrolling, realizing, oh, TikTok is not healthy for me, or, okay, people are really stupid on the internet.
Let me find different sources, because this is clearly fake.
(38:50):
This is not. And they learn things that I don't think I can learn in this lifetime.
The way they can spot a fake AI picture: they spot it before I've even seen the picture, they can tell me it's fake.
Be there with them, but at the same time be open to it playing a different role in their life.
Nicklas Bergman (39:10):
I can see that
as well.
As I said, I'm not spending anytime on Facebook, for example,
anymore.
But I love to ski, I love thewinter, and I really, really
enjoy that.
And I'm also struggling withsome injuries, back injury, no
back injury from skiing.
My physiotherapist had two fullpages of different injuries
(39:30):
that I had once and asked, Can Iwrite a PhD thesis on you?
But that's a different story.
Meaning that I really love the outdoors, I love skiing, but I'm not as good a skier anymore, and I'm not in the right physical shape anymore. I still remember how it was when I was younger, and mentally I'm still
(39:52):
there, thinking that I can do all these things, but I probably can't. So that's the backstory. I remember this very clearly: online I can watch all these skiing films and the great backcountry things that people do, and you can get immersed and inspired by all of it, going on YouTube, watching, being inspired by all these things you can see, and then going
(40:13):
on Facebook in the winter, like January, February, March.
And since I share this interest with a lot of my friends, I see their postings on what they're doing for the winter holidays, and it's the most amazing mountains and the most snow, the powder, the sunny days, the heli skiing,
(40:34):
the cat skiing, whatever it is. And then I'm here with a bad back and can't really do these things, and you think that everyone else is doing this all the time, and you get sort of depressed, because their life is a lot better than mine, right?
Which is the bad side of seeing all these things. But then you
(40:57):
realize that you have 10 examples of ski trips from 10 different people, meaning that they each spent one or two days on this, accumulated. It feels like everyone else is doing it all the time, but these are their snapshots of something they really enjoy. So on the one hand, you're inspired by other people doing amazing adventures and skiing or whatever it is online; on the
(41:21):
other hand, you feel depressed because everyone else is doing all these things that you can't do, and it accumulates. The impression, the feeling, is that everyone else has this great life where they can ski all the time, whatever it is, right?
And that's just an example from me and from skiing,
(41:43):
but I think people feel that, and that's the social pressure that comes from social media sometimes. In this case it was ski holidays, but it can be having a nice home, or gardening, or cooking, or doing this amazing 10-hour cooking session on a Wednesday, and you think that everyone else is
(42:07):
doing that, when it's just a snapshot of something very specific for a specific day or whatever it is. I think many of these things make you feel it's competitive in a really, really bad way.
Cecilie Conrad (42:20):
Talking about how it makes you feel and what role it's playing in your life. Is it supporting the trends that you want supported in your life and in your children's lives? I find it highly relevant to stop and think about how any activity is affecting, small scale and big scale, the
(42:40):
life that we're living. The choices we make, we have to make them. So are we doing Facebook or not? It's a choice, and if you choose to do it, it has some mechanisms that will make you do it all the time; very few people go on Facebook every Monday for one hour. Looking at how does this choice affect my life?
(43:02):
Am I choosing to exercise for 10 minutes every morning? And is that making me happy, or is it just putting me in distress and pain? Should I rather do something else? That kind of reflection is relevant for everything we do, and maybe especially these technologies. Yeah, of course.
Nicklas Bergman (43:21):
And as I said in the beginning here, you get a new technology coming into your life, and you embrace it or you start using it, and then you use it. Probably all of us use it in the wrong way in the beginning. And then it evolves into something that fits in your life.
(43:41):
And I think for me, with emails and texting and these things, I'm there. With Facebook, I'm there because I don't use it anymore. I'm still somewhat struggling with my relationship to LinkedIn, for example. It's very useful for me, but it's also extremely frustrating
(44:03):
with the algorithms, and it can be time consuming, with doom scrolling there as well, right? But also, if I struggle with these things, if I feel, depressed is a strong word, but if I feel sad or sort of mildly depressed when I see all my friends doing all
(44:23):
these fun things online, right? Then how can my kids, how can a teenager react to that? That must be really, really difficult for them. Because I have lived my whole life, I've got my experiences, my background, my knowledge about these things. And even I have problems relating to it.
(44:45):
I have problems logging off, I have problems seeing through my feed and realizing that it's not everyone doing this all the time. They each do one day, like with the skiing, for example. So for me, it's very challenging trying to understand how I can help my kids relate to this in the best
(45:10):
way. Because on the one hand, I struggle with it. On the other hand, they are probably quite often more experienced in this than I am.
Cecilie Conrad (45:21):
We talked on the podcast, I don't know, maybe six months ago, with an American woman who is living in Copenhagen and wrote a book; The Danish Way of Parenting is the title. And she had this very nice phrasing of it, where she said that having conversations with your teenagers about this is about
(45:41):
acknowledging that there are things we know that they don't know, but there are also things they know that we don't know, exactly about this technology and how it's unfolding. So it has to be, and that's her thing, what she likes about Scandinavian parenting, that we're level; it's not a very authoritarian way of relating to our children.
(46:03):
So we can have these conversations respectfully and open-mindedly, thinking, okay, this affects me in this way. And when I look at you, it looks to me as if it affects you in a similar way. But tell me about your experience. Sometimes they just have a different way around it, or a
(46:26):
different way it's unfolding, or they're okay with it affecting them that way, because that depression, or lower sort of base emotion, that you can have coming out of an interaction with, let's say, Facebook, they can shake off faster. Yeah, probably. Yeah.
(46:51):
So I think it's about passing that message, so we don't react with control and use the word screen time and start making up rules and telling them how this works. And it's like heroin for your brain and all this mainstream, I don't know, people parroting each other, saying
(47:12):
things that actually make no sense. But have a conversation where both ends know that we both know something the other one doesn't know about how this works.
Jesper Conrad (47:24):
It makes me reflect on a conversation we had with Darcia Narvaez, who has written a book called The
(48:04):
Evolved Nest. And she told this story about an anthropologist visiting a tribe somewhere and asking, hey, why are you letting the kids play with bow and arrow? Isn't that dangerous? And the story goes that the elders there said, but these are the tools for their future. They need to know how to work with them, but of course we don't give them the poisonous arrows to play with.
(48:27):
We're not stupid. And here we are in a fast-paced technology world where things come so fast that the question is, how can we know which arrows are poisonous before we have cut ourselves on them?
Nicklas Bergman (48:43):
No, no, no, exactly. And when you talk about the arrows, there's also listening to your kids, because sometimes they know these things better than I do. And that's perfect. Nowadays, when my dad, he's 82, has a problem with his laptop or iPad or phone, I send him to my youngest, because he knows everything about phones, right? Which is good. But I just want to comment on this with the arrows.
(49:06):
Yes, it's a tribe somewhere where this is natural to them, right? But we also tend to take away so many of these potentially dangerous things from our kids, which turns into a problem down the road, right? So I once listened to this American guy who had taken up on
(49:27):
this, and he had a class for young kids, like eight, ten, twelve, something like that, where he taught them how to use a
(50:10):
knife. Not a knife in the kitchen, but an outdoor knife, the bigger ones. And someone said, oh, but those knives are dangerous. Well, yes, if you don't know how to use them. So his whole take on this was that he taught the kids how to use knives. They got a certificate, more or less for fun, meaning
(50:32):
that they now were good at using knives, so that when they needed to use a knife, they knew how to do it and wouldn't cut themselves. And I think there's a connection from that perspective to how we use social media or computers or whatever it is. As you say, Cecilie, about screen time:
(50:55):
it's not dangerous per se. It all comes down to how you make these things usable for you, and also how you prepare for the poisonous arrows or the sharp knives or whatever it is, meaning that you yourself, together with your kids or your
(51:16):
friends, realize that there are dark corners of the internet, there are bad ways of using these things, and just make everyone aware of that.
Jesper Conrad (51:27):
Do you think we're too scared of the shiny new choice? So instead of a shiny new choice, it's a scary new choice, where it's like, I don't know what it is, so you can't touch it, and therefore we miss having those conversations.
Nicklas Bergman (51:40):
Sometimes, yes. I mean, we have these old clichés, more or less, about videos, like VHS and all the violence we had there. We had video games, we have social media, we have smartphones, and all these things where, yes, used in the wrong way, they can be dangerous, they can get people
(52:03):
going down the wrong rabbit holes, and that is of course a challenge. But sometimes, I think, you have people like me who fully embrace these things with a sort of skeptical view. I really try everything, more or less, and when I realize something is not useful for me, I really put
(52:23):
it to the side. So for me, the natural reaction is curiosity. Then you have people, and there's nothing right or wrong in this, whose natural reaction is fear, and I think those two perspectives mean that in society you start a conversation around it, and generative AI is
(52:46):
a very, very good example of that right now, where you have the doomsayers saying this will take all our jobs, etc. And other people say no, it will help us, it will create new meaningful work, and blah blah blah. Whatever it is, I think through that conversation we will take this forward and maybe come to some kind of, not a
(53:07):
conclusion, but a way of relating to these things that actually works.
And that goes for myself, in my family, in my community. It trickles down into the companies I work with, my investments, me speaking on stage, into politics,
(53:27):
into lawmaking, into society as a whole. Hopefully. As I've said several times now, I always try to be curious and a bit skeptical about these things. And when I'm in front of people on stage, for example, I always try to make people understand that you can't just
(53:48):
log out and let someone else decide for you. Not scientists, not corporate people, not lawmakers, not politicians, no one. Because each and every one of these people, most of them, not everyone, is 100% convinced that they are doing this for the greater good. I do this research on AI or on quantum computing or on nuclear,
(54:13):
whatever it is, because I think it will create a better world, right? Most people think that way.
I start this company providing this product because it will help people. We discuss this in parliament because it will make life better for Danes or Swedes or Europeans or whatever, right? But they all have their perspective and their reasons
(54:37):
for doing this that you don't have. A researcher, yes, they do this research and they publish it because they think it's relevant, but they also do it because they get funding, they get grants for their next research project, meaning that it supports their way of living. And that does not necessarily align with your views on this.
(54:59):
Politicians in Sweden have a perspective of at most four years, up until the next election. Actually, it's probably less than that, probably somewhere between two and three years, because it takes some time to get settled once you win the election, and then a year before the next election you have to start preparing for that. So the time perspective there is very short.
(55:22):
Meaning that I think each and every one of us needs to think for ourselves about how we should relate to these things, but we also need to be part of the conversation. We need to be online, offline, in reality, whatever kind of reality we think we are in; we need to be vocal about these
(55:42):
things and talk about them and discuss them, so that we stay on top of them, so that we are still in charge of all these new innovations, whatever it is that's coming our way.
And then you see that people overreact sometimes. We see, for example, that yes, there's quite a lot of
(56:03):
violence sometimes in video games or in VHS movie rentals, but there's also a lot of creativity around that. Indie movies coming out, great experiences, great documentaries, people socializing online in gaming, for example, etc. etc. And in lawmaking, you can see that in the EU, for
(56:26):
example, where GDPR and some of these regulations around social media, online privacy, and AI have maybe in some instances been taken a bit too far.
I definitely agree with, and I have 100% respect for, the
(56:48):
perspective that you should protect the consumer, protect the individual. But when it comes to GDPR, all of us now, we just click. It's not that we read these things; we don't. So in theory, I think it was a really good idea. In practical terms, we just click, and it's not useful
(57:10):
anymore, right?
And the same with the AI Act that the EU has put out to regulate AI. There are now discussions about stepping back a bit to create a bit more freedom to build these things and to use them, because we also see a challenge from
(57:31):
the US and China specifically, where they have totally different perspectives, especially in China, on privacy and how you can use these things. And then the trade-off is, okay, if we regulate a lot more, then we will fall behind, meaning that we will not be able to be part of actually discussing how these things should be
(57:53):
developed and used, right? So, long story short, I think sometimes there's a bit too much fear. To answer your question from, what is it, 10 minutes ago: sometimes we take these things a bit far, we're a bit too afraid of them. But through discussion and debate around these things, maybe eventually we find the good balance in this.
(58:16):
At least that's my hope.
Cecilie Conrad (58:18):
And it sounds like this is exactly the same on an international scale as in a family. Balance is about being curious and skeptical at the same time, having conversations, and looking at how this works, but also, and I noted it actually, the greater
(58:39):
good, the better world, helping people, making life better. What is the good life? Is it serving me? So not just exploring it and running with it, doing what it can do, but thinking about what kind of life we want to live and how this tool helps or doesn't help
(59:00):
with that. We are back to AI, which is nice. We started with AI, and we're back to AI now.
Jesper Conrad (59:07):
Yeah, and to try to round up the conversation with AI, I wanted to share a wonderful experience I had, which is learning languages as a traveler coming from the Nordics. We have a very small language, Danish. And when you start learning the Latin languages like Spanish, Italian, French, and then start to look at the
(59:29):
English language, you begin to ponder some of the words. For example, I love dessert, like many of us, and I looked at deserve and dessert. There's a sound that is similar. Do they have a connected root? And with my good friend AI, I just asked, and it explained to
(59:51):
me that no, it only sounds similar; dessert comes from the old French, from before 1600, desservir, if I pronounce it correctly, which means to clear the table. So dessert is something you have after the table has been cleared, and then around 1600 it changed. Just to be able to get
(01:00:13):
that knowledge in your pocket, oh my god, it's good.
Nicklas Bergman (01:00:17):
The problem now is that you will have live translation through your headphones, so you won't need to learn different languages.
Cecilie Conrad (01:00:25):
Is that true, though? And do we have time to go into that discussion? We've already talked for a while. I just heard another person I know, who is quite knowledgeable about tech and where it's all going, and she said the same thing: it will be obsolete to learn languages, because we will get an
(01:00:46):
earplug and it will just translate. And I speak a few languages, and I feel like learning a language is learning a culture, and the nuances and how you express yourself in different languages. I learned different languages at different times in my life, so I can even feel how nuances in my personality that have grown
(01:01:08):
since I learned a specific language have different options, like a different playground to play at.
Nicklas Bergman (01:01:57):
Then it's
really useful.
But I don't think we'll stoplearning languages, at least not
in the foreseeable future.
Cecilie Conrad (01:02:04):
I always think that these things are evolving and that they might be way better 10 years from now than they are now.
Nicklas Bergman (01:02:10):
Probably, probably. But I mean, I'm just not impressed when I see them.
Cecilie Conrad (01:02:15):
You're also from Scandinavia. Swedish is way bigger than Danish, but it's still a relatively small language.
Nicklas Bergman (01:02:20):
Well, with 10 million instead of five, right? Or something. So, yeah, of course.
Cecilie Conrad (01:02:27):
Yeah, but I mean, it's still... We just had some problems with an Airbnb host. And one of the things I asked in the beginning was, are you using auto-translation? In order to find a common language that we could both speak. And I realized, oh, so he speaks French and he's using auto-translation. That is not going to work. I actually speak French, so why
(01:02:50):
don't we just communicate in French? And maybe my French is not great, but maybe at least the misunderstandings could be from the question. Anyway, I just had it, and it created a lot of problems, this auto-translation. Yeah, yeah. French is a quite big language, yeah. But French could not be auto-translated into English in
(01:03:12):
the communication.
Nicklas Bergman (01:03:14):
But not to go down a very deep rabbit hole, where I spend a lot of time since I work with venture capital: the AI bubble. And one part of the AI bubble is the overpromising around some of these large language models. My stand on this is that yes, AI will be tremendously
(01:03:36):
useful. It will change the way we live, the way we work, the way we build our societies, as the steam engine or electricity did. But where we are now with generative AI, I don't see that we can scale the large language models, for several reasons. Physical reasons, with the energy and data center buildout, for example.
(01:03:57):
I can't see that generative AI will take us into some kind of artificial general intelligence or superintelligence, because it's not scalable. There will be other kinds of AI, in the nearer or more distant future, that will potentially create those kinds of intelligences, because generative AI has no
(01:04:19):
understanding of the outside world. It's only trained on the images or the text that is put into it, right? So that creates a problem when loads of investment, loads of human capacity, loads of human brain power or actual
(01:04:43):
power, are being put into developing these systems, which for me are useful if you use them in the right way, but it's a dead end; it's not the way towards a greater kind of AI, a bigger, more intelligent kind of AI. Meaning that today, in my opinion, we spend way too much time,
(01:05:03):
energy, and money on building these things, when they will not give us the perfect translator or the perfect research scientist or whatever it is, right? But that's maybe a topic for another discussion. But just a couple of weeks ago I wrote an essay, The AI Bubbles, and maybe you want to share that with the audience as well.
(01:05:24):
It's probably obsolete. If you take more than 10 minutes to edit this one, my essay will probably be obsolete before that.
Jesper Conrad (01:05:33):
So that's how fast things go. It was wonderful talking with you, Nicholas. If people want to read more about your work and dig into some of your books, etc., where is the best place to read about all that?
Nicklas Bergman (01:05:51):
Well, I do the occasional essay on Substack. Of course, I have a website. I've published a couple of books. I want to write more, but I just haven't had the time as of now. But I really want to do that. Maybe just subscribe on Substack, or connect with me on LinkedIn, where I post the occasional thing.
Jesper Conrad (01:06:09):
Thanks a lot for
your time.
It was an interesting conversation.
Nicklas Bergman (01:06:13):
All good.
Happy to be part of it.
Thank you.