Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
This is an iHeartRadio New Zealand podcast.
Speaker 2 (00:29):
Hi everyone, and welcome back to episode twenty four of
Leaders Getting Coffee. My name is Bruce Cotterill.
Speaker 1 (00:35):
It's great to have you with us.
Speaker 2 (00:36):
As many of you will know by now, we put
this podcast together with the support of the great team
at NZME. So thank you to the people of
the New Zealand Herald and Newstalk ZB for
helping us get the amazing stories from our leaders out
to you all. And speaking of the Herald, in my
latest article, I've discussed the Green Party. Most of us believe,
(00:58):
or would like to believe, that those who put their
hands up for public office do so with the intention
of seeking something better for the people they represent. But
we're currently seeing an episode play out almost daily which
suggests that such lofty ideals do not necessarily ring true.
And I'm talking about yes, the Green Party, a political
(01:18):
party that's become incredibly distracted as they stumble from one
public relations disaster to another. My own view is that
they no longer represent the true values of what we
expect of a Green Party, the values that were first
brought into the mainstream of political debate in this country
by people like Jeanette Fitzsimons and Rod Donald, and I
think it's time that the current Green Party had a
(01:40):
rethink and maybe even a rebrand.
Speaker 1 (01:43):
So go and have a look at the column.
Speaker 2 (01:45):
You can find it at nzherald.co.nz under the headline
Green Party's identity crisis should be cause for a rebrand.
As always, there's been plenty of feedback, the theme of which is best
captured by this note from David. He says, it's a
very sad day for the good people that believed in
(02:05):
the old values of the Green Party. Whilst I've never
voted Green, I had respect for the likes of Jeanette
Fitzsimons, Rod Donald and more recently James Shaw, who
gave their time and energy for the good of the
environment and believed totally in the merits of sustainability, which
is what they fought for. Today's so called Green Party
do nothing but provoke and incite disharmony, pushing agendas that
(02:29):
have nothing whatsoever in common with the values
that the Greens used to stand for. Socialists should not
be confused for environmentalists. So that's from David. There's plenty
more on the Herald website and in my email inbox.
Thanks to all of you who have responded, and as
I said, check out the article if you haven't seen
it at nzherald.co.nz, or you can
(02:51):
look it up on my website at brucecotterill.com/blog.
Speaker 1 (02:58):
Thanks again for being with us.
Speaker 2 (02:59):
We're talking technology this week and I'll be back shortly
with our latest guest on Leaders Getting Coffee. Welcome back
to Leaders getting Coffee. This week's guest is someone that
I've known for six years or so, and I remember
(03:22):
when we first met very clearly and how intrigued I
was about his story. He's young, he's intensely bright, he's
well educated and well read, and all up, he's
Speaker 1 (03:32):
A very impressive young man.
Speaker 2 (03:34):
His name is Danu Abeysuriya, and he brings to our
conversation today two decades of experience as a technical innovator,
entrepreneur and technology leader.
Speaker 1 (03:44):
He's the founder and.
Speaker 2 (03:45):
Chief Technology Officer of Rush Digital Interactive, a software company
that he founded fourteen years ago, and a company which
went quickly from a garage startup to developing world-first
solutions for companies such as UNICEF, Downer, the Ministry of Health,
Z Energy and Google, to name just a few of them.
(04:05):
Rush specialises in really complicated business problems and the software
to resolve them. And it's the company behind the likes
of the number plate recognition technology that we see at
the car park or the service station, and here in
New Zealand, Rush were the developers of the government's COVID
tracing app, a solution that was so much part of
our lives just a few short years ago. He has
(04:27):
a fascinating backstory. He's Sri Lankan. He was born in
Sri Lanka and his parents are both educators, both teachers,
and he spent his early childhood in both Sri Lanka
and then Zimbabwe before moving to New Zealand about the
age of eight or nine, I think from memory, and
he completed his education here and despite the opportunities to
work anywhere in the world, he set about building a
(04:48):
career and a reputation in New Zealand, where he's done everything,
including gracing magazine covers and a regular technology commentary spot
on Breakfast Television. And with the world having been through
a number of technology moments in our lifetime, such as
the personal computer, the Internet, and the iPhone, we're
now on the verge of another one, and I just
(05:09):
felt he was the right guy to talk to about
the much hyped artificial intelligence, or AI as we're calling it.
He's at the front line of the major changes we're
about to see in the way that technology impacts our
lives and the way we live them. Not content with
changing the world through technology, he has a strong social
conscience and is passionate about social equity, poverty alleviation, and education.
(05:32):
And one of his many mantras is that we have
a responsibility to use our developer privilege to change the
world for the better. We have a responsibility to use
our developer privilege to change the world for the better.
Speaker 1 (05:45):
Wise words.
Speaker 2 (05:45):
So Danu Abeysuriya, welcome to Leaders Getting Coffee.
Speaker 3 (05:50):
Thanks Bruce. That was a very very humbling introduction. Thank
you for that.
Speaker 1 (05:56):
It's been quite a journey, hasn't it.
Speaker 3 (05:59):
Yeah, it really is, especially when you're working at
full pace. Sometimes you don't get a chance to
stop and think about it. So it's nice to hear
it played back to you and be reminded that, yeah, definitely,
that colorful background has led to something, right?
Speaker 2 (06:14):
Well, it's led to a lot, hasn't it? But I
mentioned your upbringing, it was certainly different. Born in Sri Lanka,
but your parents at the time were living in Zimbabwe.
I think you traveled, or your mother traveled back to
Sri Lanka to have you. Have I got that right?
Speaker 3 (06:27):
Correct. Yeah. My parents actually met as teachers in
Nigeria and then immigrated to Zimbabwe,
which, you know, Africa at the time was doing
sort of this educated teachers program where they were trying
to get degree-qualified individuals from around the world to
stimulate their education system. Yeah, my mom ended up flying
(06:50):
back to Sri Lanka, not quite, you know, bending the rules,
I would say, to get on a plane to make
sure that I had a Sri Lankan birthright, because they
didn't know if I'd want, you know, the option
to get a Sri Lankan passport when I was a
bit older. So yeah, I went there for less than
a year, was, you know, born in Sri Lanka, and then
moved back to Zimbabwe, where I spent a lot of
(07:12):
my early childhood in Harare, the capital city, before we
then moved to New Zealand.
Speaker 2 (07:19):
So what was that like, you know, growing up in
Harare at that time? Because that's a country, like Sri
Lanka, that's changed a heck of a lot over the
course of your lifetime. What
was it like growing up there as a little boy?
Speaker 3 (07:34):
Yeah, yeah, very interesting. I think we sort of were
in that transition period where, you know, through the eighties,
I think Zimbabwe did really well from the communications boom
around the world because one of the main exports was
copper, and there were quite fertile soils as well, so
Zimbabwe was well known for producing and being strong in
agriculture. And really, you know, Rhodesia, as it was
(08:01):
previously known, you know, the indigenous people really didn't appreciate
the control systems and you know, all of the power
dynamics that were still at play so many decades after independence,
and that really kicked off what we saw, with
Mugabe, the leader at the time, going from
(08:22):
someone who the world community saw as a moderate to
someone who was clearly at the more extreme end
of things towards the end of his career there, and
we sort of got out maybe a year or two
before everything really kicked off. I think we left in
the mid nineties, just in the nick of time, and we
(08:42):
sort of had this old joke in the family that
if my dad decides to leave somewhere, you shouldn't ask questions,
you should just follow. So we sort of kid,
because he has a knack for getting out at
just the right time. Yeah, he
left Sri Lanka just before a civil war kicked off.
He left Nigeria just before another civil war kicked off,
(09:05):
and he left Zimbabwe just before all of this sort
of change in power happened. So you tell me he's
either the problem or has a lot of foresight. Yeah,
we joke if you ever leave New Zealand, don't ask questions,
just pack up.
Speaker 2 (09:19):
Yeah, can you let me know
as well?
Speaker 3 (09:22):
Please? Yeah, will do. We actually have a WhatsApp group.
I'm kidding.
Speaker 2 (09:27):
So have you kept in touch with those countries at all?
Speaker 1 (09:30):
Do you?
Speaker 2 (09:31):
You see what they're like today, Sri Lanka and Zimbabwe
and how they compare.
Speaker 3 (09:37):
Yeah, look, Sri Lanka obviously a lot more directly. I've
still got direct family, you know, sort of
in Sri Lanka, so much more vested interest. With Zimbabwe,
a lot of our friends immigrated at the same time,
so our family circle, sort of with our extended
group of friends, sort of moved to New Zealand
at the same time, so there was always that continuity.
(10:00):
And we were both young enough that we didn't have
a lot of relationships invested in the country at the time.
Obviously kept an eye on it from a news and
sort of genuine intrigue perspective. And actually there's a lot of
Zimbabweans in New Zealand, so, you know,
(10:21):
whenever I'm playing football or socializing, you always run into
one or two and, yeah, just get news about how
things are going back in those countries.
Speaker 2 (10:31):
Yeah, so other than your father's timing, what made the
family come to New Zealand?
Speaker 3 (10:38):
Look, I did ask my dad. Actually, Sir Stephen Tindall
prompted me to ask that question of
my dad at a dinner that we had many years ago.
He sort of asked me the question and
I didn't have an answer, and he gave me
the wise words to say, maybe you should
(10:59):
ask your dad then. So I did, and the answer
was really simple. He sort of saw the world
becoming what a friend, Roger Dennis, might refer to as
a bit of a polycrisis. You know, there's a
lot of complexity in the world now.
It used to be arguably a lot simpler. And now
(11:20):
with technology, with geopolitics, globalization, free trade, you know, the
Reaganomic money system, those complexities are so intricate that
stability is actually kind of hard to predict. New Zealand
has always had that reputation for stability, and my dad
was sort of looking at where the best place to
(11:41):
make a bet would be, and he believed little Old
New Zealand seemed to be the right place because of
all the things we love about the country. A democratic
system that seems to work, a small population group,
low crime, high quality food, lots of space, all of
those wonderful things.
Speaker 2 (11:59):
Right, Well, I think your father got it pretty right.
Speaker 3 (12:04):
Yeah, I think he did. I do love New Zealand,
and I'm obviously a New Zealand
citizen now, but when you travel, you're
always comparing what home is like compared to where you are,
and you're always looking at the positives and the negatives,
and New Zealand always seems to have quite a long
list of positives that can sometimes be forgotten. Yeah.
Speaker 2 (12:25):
Absolutely, I think we tend to forget about them until
we get on a plane and go somewhere and that reminds us.
So once here, you obviously settled into New Zealand. Where
did you go to school, and thereafter university?
Speaker 3 (12:42):
Yeah, we chose to settle in South Auckland, so grew
up in Papatoetoe and sort of the Manukau area, and
my parents actually taught in South Auckland schools their entire
career right through till retirement. Otara and Pakuranga and Mangere
as well. So quite a colorful community which I love dearly.
(13:06):
And it was responsible, definitely responsible for a lot of
my personality traits, being able to sort of deal with
lots of different people with lots of different backgrounds. I
think it's, yeah, a big part of who
I am.
Speaker 2 (13:19):
Yeah, sure. And so did you find it
easy to settle?
Speaker 3 (13:27):
I think it was quite tough. My parents
did a good job of shielding the kids, I guess,
from some of those difficulties, but I do remember distinctly
the first decade of being here just having to
build up those networks. You know, Auckland
has grown massively since
(13:51):
we came here, and that diversity has changed the dynamic
on how people view immigrants and even in the country.
I mean generally speaking, New Zealand's reasonably welcoming, but it
is a shift, I guess, for any society.
And then yeah, just building up all of the resources
that you need when you're living in a country with
(14:12):
no sort of background or lineage or heritage in that country,
there's always a lot of hard work. And then schooling.
There was a big difference in the nature of schooling
in countries like Zimbabwe in Africa and Sri Lanka. Shall
we say corporal punishment was permitted in those other countries,
whereas New Zealand abolished those systems. So, you know, different dynamic,
(14:36):
but obviously less strict and a very different approach on
how to educate. And my parents obviously have successfully adapted
to that. But as you can imagine, if you're used
to kids where you say jump and they say
how high, and if they don't, the parents take a
real interest in that, to a much more, shall we say,
center-aligned approach to schooling. Yeah, that can be quite
(14:57):
a shift. So that caused a bit of stress and frustration.
Speaker 1 (14:59):
But in terms of your own schooling, what were the
highlights? Do you remember much about it?
Speaker 3 (15:06):
Yeah, I went to Papatoetoe Primary School and then
intermediate and then high school, and then I went to
the University of Auckland to do my bachelor's in software engineering,
which is a Bachelor of Engineering with a specialty in software.
I think we were only the second batch through on
that specialization, so quite early. Yeah, sure, yeah. And you know,
(15:29):
I always had a pretty good group of
friends who also followed, and I followed, through engineering and
sort of the sciences, and we maintained sort of that
network, and that was an inherent support system that
was there. So I do have very fond memories of
schooling and university. And I spent a lot of time
(15:52):
on a computer, you know, from an early age, sort
of high school onwards, probably primary school even. That was
actually the late nineties, so yeah, it would have been
like standard one to standard four, whatever they call
it. From then on, just tapping away on computers and
trying to figure out mostly how to get games running.
But you know, that quickly evolved into how to make games,
(16:16):
and that evolved into how to make software.
Speaker 2 (16:19):
So did you have interests away from the keyboard,
or were you just that sort of geeky
kid who sat by his computer all day?
Speaker 3 (16:29):
Yeah, I would love to say that I was this multifaceted,
you know, musician extraordinaire, but anyone who's been to karaoke
with me knows that I'm solidly tone deaf, unable to
achieve even the most basic melodies or rhythm,
but that doesn't stop me from trying. And I think that
probably speaks to my personality the most.
Speaker 1 (16:51):
But you do, you are a bit of a soccer fan.
Speaker 2 (16:53):
You do kick a ball around every now
and then. And I presume that's something that came
from your school days.
Speaker 3 (16:59):
Yeah, I play a lot of football, you know,
even to this day, probably three, four times a
week, social football. And to be honest, it's the thing
that keeps me level when you've had a pretty stressful day.
And I'm sure you can relate to this with your
swimming and cycling. Yeah, and yeah, it just means, you know,
(17:20):
when you're playing a sport like that, it's
a form of meditation, the way I see it, because, you know,
you read about how mindfulness and meditation work, and it's
really about taking your mind off what it's already ruminating
on and focusing on an open area. And in football
and on the pitch, that open area is the football
(17:40):
because you have to just concentrate on that thing for
an hour and that kind of creates the space to
sort of wind your brain down and allow you that
space to meditate effectively.
Speaker 2 (17:52):
Absolutely. So you were always going to be a tech guy,
weren't you?
Speaker 3 (17:58):
Yeah. It was written quite early on. My parents used
to work until about five pm most days, even though
school finished around three. So it meant that I wasn't
picked up from school for a few hours after school
finished, so obviously not a lot to do. I was told to
stay on the school grounds; in South Auckland, you probably
(18:19):
should stay on the school grounds, and that left me
with the library and the library computers, so there was
a lot of time to sort of
just toy around, for lack of a better word,
and really tweak and sort of experiment. And do you
remember those Acorn computers, those really old computers? Yeah,
(18:43):
there's a really funny story about that. That's kind of full circle.
They were built by a company in Cambridge, and that
company went out to build these personal computers, and they
were reasonably successful in schools, but they eventually failed commercially.
But the technology in them was the ARM microprocessor, and
(19:05):
this company went on to become the mainstay of all
portable devices, including the iPhone. So one of
the very first computers I learned to program on ended
up being the computer that I programmed on in business
with the iPhone. So pretty funny little story there.
Speaker 1 (19:22):
Yeah. Yeah, and the and made in Cambridge.
Speaker 3 (19:26):
Yep, designed in Cambridge, in the United Kingdom.
Speaker 2 (19:31):
Right, okay, so not Cambridge, New Zealand; Cambridge, UK.
Speaker 3 (19:34):
No, I think a few people would have noticed the
multi-billion dollar buyout of something like that.
Speaker 2 (19:40):
That's what I was wondering. But the brain was ticking over
saying, I wonder where that happened. Hey, folks, we're with
Danu Abeysuriya, the founder of Rush Digital.
Speaker 1 (19:48):
We'll be back in a moment. Back with Danu
Speaker 2 (19:59):
Abeysuriya, the founder and Chief Technology Officer of Rush. What
made you start your own business?
Speaker 3 (20:08):
Ah, yeah, great question. I think there's obviously an inherent
drive to make the most of the opportunity. You know,
I've always believed in I think one of the really
interesting things about growing up in New Zealand when I
had family still in Sri Lanka is we used to
go back every couple of years, even as kids, and
we would see how our life was progressing compared to,
(20:29):
you know, our cousins and close relatives and
family friends in Sri Lanka. And you could really
see how opportunity and, sort of, for
lack of a better word, privilege allows opportunities rather than,
you know, it being specifically because of an individual's talent. Yep.
So it became fairly evident that we were given a
(20:51):
lot of opportunities simply by being here. You know, a
decision that neither my sister nor myself made; it was
made for us, right. So there comes, I wouldn't call
it pressure, but there's definitely a massive drive,
a motivation, to make the most of it. You know,
New Zealand has so many positive aspects to doing business
(21:12):
in it, and it was sort of a degree of
patriotism to say, okay, this is the country that we
managed to find stability in and create some opportunities
and get a higher education. So a certain amount of
it was, well, I'd like to be able to start
a business in New Zealand and give back in some way.
(21:35):
And then yeah, the second most important part was probably
just the drive to make the most of the opportunity, yep.
And then the sort of layer that focused in
on what the business would be was growing up in
a generation that was also handed, in the same way
we were handed the opportunities, the problems.
Speaker 1 (21:54):
Yeah.
Speaker 3 (21:55):
So you know, we were born into a world that
had ocean plastics and climate change, and all of that
stuff was already set on its course, right? Yep. Sure.
And I think the one common thing being in technology.
I sort of noticed this common thread of well, actually
technology is the real cause of a lot of this stuff,
(22:16):
you know, and humans wanting to use technology to you know,
conduct business or do things, you know, even if it's healthcare.
So there was sort of this belief that actually we
had sort of just been blindly wielding technology and that's
why the problems existed. Not that technology was bad, just
(22:37):
that it was quite a utility. So the intention was, well, actually,
if you were going to change something, how do you
help organizations that are going to use technology use it
in a better way, one that serves, you know, humanity
in a more equitable way, you know, but also just
works really hard to not use technology without really understanding
(22:58):
its side effects or making proactive decisions about what impact
it's going to have. Sure, and that sort of led
to the mission statement of RUSH, which was to design
and build technology to better serve humankind. And I think
that works as a real magnet for the type
of people that we attracted to the business. I think
it changes how we think about how we can impact
(23:21):
the world. And I think it also invites a certain
type of customer that's also willing to engage in that
sort of line of thinking. And I think we were
doing it at the right time. You know, there's a
bit of zeitgeist behind us. If I had tried
to do this ten years prior to when I did, it
probably wouldn't have really landed. If I had done it
(23:42):
ten years after, we probably wouldn't be seen as
a front runner in doing it. So timing is everything,
as they say.
Speaker 2:
Where did the name come from?
Speaker 3:
Yeah, this is a fun one. A lot of people think
Rush means do it quickly, and you know, we have
been known to do it quickly. The COVID tracer app
definitely speaks to that. We got that up and running
(24:04):
in less than, I can't even remember, I think like
two months or three months, something like that. But
Rush is actually the exhilarating feeling, so the rush
that you get from solving a problem. Yeah, and that's the
engineer in me. You know, that aha moment, that eureka moment.
We call that living for the rush, which does sound
(24:27):
like some sort of drug reference, but I promise you
it's not.
Speaker 2 (24:32):
Danu, you did just touch on your people and
the type of people you attract, and I had a
look at your website and you describe your organization as
a team of strategic thinkers, empathetic designers, and technical mavericks.
Speaker 1 (24:47):
Tell us about those people.
Speaker 3 (24:51):
Yeah, I always love that description, because you
give anyone any opportunity to use the word maverick in
a day-to-day sentence, and I'm pretty sure they'll
take it. Yeah. I think in hiring, the
ethos is really, you know, Rush is really
well known for its employee culture, its employee value proposition.
(25:13):
Even in these sort of economic times, you know, we've
been able to conduct ourselves pretty well because customers are
looking at us as a sort of a trusted brand
and they love working with our people. And you know,
we get testimonials, and we just had our Rush Awards
last week, and it was a good reminder that, yeah,
there's a lot of enjoyment for
(25:34):
the work in our organization. And I think that would
probably be one of the hiring pillars that we look for.
You know, we say at Rush, if we're not excited
to hire the person, it's probably a no. And so
those are the kind of things that we look for
when we're interviewing people. And we also work really hard
to try and take care of our people. When it
(25:55):
was pretty obvious that the healthcare system was going
to go through a lot of change, we proactively implemented
health insurance for all our employees, and
we wear those costs as part of our sort of
operating expense. And for a company our size, you know,
around about one hundred people any given Sunday, that's probably
(26:18):
not that common, even in technology. Sure, and
think one of the biggest things that we look for
is people who are up for a challenge, because we
try and do this sort of outsized influence approach where
we sort of say, we're a tugboat out at sea
and what we're trying to do is help a giant
(26:39):
oil tanker, which is another big organization, you know, like
those ones you rattled off, like Z or the Ministry
of Health. So we want to be the small, effective
team that works hard to change the course of a giant,
and by changing the course of a giant, we have
a real material impact on an economy, customer segment, or
you know, some other sort of impactful outcome.
Speaker 1 (27:02):
Where do you find them?
Speaker 2 (27:03):
I mean, you've obviously got talented people at the very
top of the tree in terms of software engineers and
the like. Where do you find them in a market
the size of New Zealand?
Speaker 3 (27:14):
Yeah, look, I would say a lot of
our hiring over the years has been mostly domestic, you know,
so people that have landed either with work visas or
have lived in New Zealand their whole lives and
been educated here, and I think we look really hard
for them. So when I was starting out,
(27:38):
you know, it would not be uncommon for one hire
for me to go through about four or five hundred
CVs and, you know, do upwards
of thirty to thirty-five interviews for one hire. And, you know,
we would specifically test. We used
to have this old Star Trek reference. There's this scene
(27:58):
in a Star Trek movie called the Kobayashi
Maru. It's basically this made-up test for Captain Kirk
that he's supposed to fail, that there's no solution for, but he,
you know, being Kirk, figures his way out of it, right?
And so we used to have this test that was
designed so that people would fail. And that wasn't to
be mean. It was to acknowledge the fact that actually,
(28:20):
when you're sitting next to someone or a colleague, you
don't really care how they perform as long as they're
meeting their goals when things are easy, right, what you
really care about is when things go bad, how good
are they? How are they going to behave? What kind
of questions are they going to ask? What kind of communication,
you know, what's their approach, how accountable are there. So
(28:40):
we sort of created these tests that were really artificially
supposed to put pressure on you without you really knowing,
when you're interviewing with us, and then we would
spend time analyzing how they behaved in an impossible situation.
And so those early hires for us were all about
people who were able to, you know, get it done when
(29:04):
things are tough, and I think that permeated the culture
and that's that's what's allowed us to deliver successfully and
build a reputation that we have with customers like you
know, Z and the Ministry of Health, and being
literally in a situation with COVID Tracer where you're getting the
Prime Minister delivering requirements on live TV and then getting
a call from a panicked product manager a couple of
(29:24):
minutes later from the ministry side. You know, like that
was the kind of environment that I think a lot
of other companies would have really fumbled the ball. And
that was demonstrated around the world where you know, the
United Kingdom, for example, they spent upwards of forty million
pounds on a failed attempt to do the same thing
that New Zealand did for a fraction of the cost
using us and other partners.
Speaker 1 (29:46):
So is our talent. You know, we talk a lot
about the brain drain.
Speaker 2 (29:50):
We talk a lot about, you know, tech being
an opportunity in New Zealand, with companies like Xero and others,
you know, Sir Ian Taylor's business, Animation Research, and so on.
You know, we've got these wonderful tech organizations.
But we do talk about the difficulty in getting talent
here. Is the talent drain as big
(30:14):
a problem in the tech industry as we talk about
it being?
Speaker 3 (30:19):
I think it is. I think it is, but not
in the simplest terms, you know. Like fundamentally, especially
more lately, you know, in New Zealand, if you put
a job ad up in New Zealand and you're supportive
of immigrating or migrating someone as part of that
application process, I think you can be reasonably successful. Actually,
I think the brain drain is literally caused by a
(30:43):
lack of focus by most of our institutions. You know,
I would say our universities are unfocused. I would say
that they focus on revenue rather than what they should
be focused on, which is producing great candidates. And you know,
I've personally seen that shift. You know, I graduated from
the University of Auckland when it was ranked top fifty
(31:03):
in the world among engineering schools at least, and that has
slipped markedly. And that's because we've been focused on generating
revenue from, I believe, you know, this is my opinion,
international students. You know, we seem to have a lack
(31:24):
of focus in policy. You look at something
as simple as extending what we gave movie studios to
the video game companies domestically, and we didn't move on
that for ten years, easily. And while the iPhone was
booming and went from zero dollars of revenue per month
(31:46):
or per year to literally billions. While those New Zealand
game companies were being built. All we had to do
was extend what we were already giving to the movie studios.
And the only reason we moved recently is because Australia
actually put in some policy that meant that a couple
of studio heads in New Zealand said, hey, if Australia
does this, we have no justifiable business reason to stay.
(32:10):
So really, a gun had to be held to policymakers'
heads to actually get that change. And then I think
businesses themselves also have this unwillingness to invest in people
and their capability. I think it is a business culture thing,
and I'm not sure if it's an unintended side effect
(32:30):
of New Zealand being predominantly small businesses. But this idea
of continuously improving and educating your people so that they
can scale your company, not just come into it and
climb the ranks, but actually work hard for it is
something that is different here, right, Like if you go
to the United States, I'm pretty sure you could find
(32:51):
people that have been in organizations for like twenty thirty years.
And even in technology. You know the recent passing of
a senior Google VP. She was I think fifty years old,
passed away from lung cancer. She was one of the
very earliest employees in Google and she ended up being
the head of YouTube. That's right. You know that you're
(33:13):
looking at literally going with Google from zero all the
way to the top, right, Yeah, and I would say
that that's pretty rare in New Zealand. I don't know
how you think about that, but yeah.
Speaker 1 (33:24):
No, I agree with you.
Speaker 2 (33:25):
They're few and far between, and it is about
investing in your people. You're absolutely right. I think the
small business nature of the country is a part.
Speaker 1 (33:36):
Of the issue.
Speaker 2 (33:38):
But I also have this view, and you'll have a
view on this, particularly with the tech sector,
and a weak currency, which is what we
have at the moment. You know, our tech sector could
be a service provider to the world. We've got good people,
we've got, you know, a relatively low cost
of employing people relative to people who are paid in
(34:01):
US dollars, and to me that's an opportunity. If we
combined that talent, if the universities got together and thought
about the types of programs we run, if we marketed
ourselves to the world, we could be a provider to
the world. And you've sort of proved that with the
work you've done yourselves for some American companies, right.
Speaker 3 (34:25):
Yeah. I think that's a really interesting thought, and I
think it's along the right lines of thinking. The really
interesting thing is if you take that one step further
and you think about what unique value propositions New Zealand
could export to the world. Then in technology, one of
the biggest issues that's coming up in the last ten
to fifteen years, especially with the rise of social media
(34:45):
is trust. Yep. And you know it's something that our
FMCG and agriculture businesses have traded on for decades, if
not hundreds of years. Right. You can trust the quality
that comes out of New Zealand and Manuka honey. You
can trust the quality of our dairy. You can trust
the quality of our meats and our produce. And why
(35:08):
couldn't you take that to something like cybersecurity or artificial
intelligence or accounting software. I think I think if you
were to, for example, talk about any kind of software
that came from New Zealand versus a company that was
maybe based in other countries around the world, say America,
just for argument's sake, I would say that most people
(35:28):
anywhere around the world would have a very high regard
for New Zealand without knowing much about New Zealand. And
your ability to stand on that and say, hey, we're
a company that specializes in X, Y and Z. We
handle all your data and your data processing. And you
know what, because we're New Zealand based and all of
those good things, we're a highly trustworthy company.
(35:51):
I think that's definitely a missed opportunity, and I think
we could be doing more in that sector, in that
vein to achieve what you were talking about as well.
Speaker 1 (35:59):
Yeah, absolutely.
Speaker 2 (36:00):
I just challenge the phrase missed opportunity.
Speaker 1 (36:05):
It's never too late.
Speaker 2 (36:06):
We're with Danu Abeysuriya, the founder of Rush Digital.
Speaker 1 (36:10):
And we're going to be right back in a moment.
We're back with Danu Abeysuriya.
Speaker 2 (36:23):
Danu, you've touched on the COVID tracing app a couple
of times, and I know it came a bit out
of left field for for your company.
Speaker 1 (36:31):
How did it come about?
Speaker 3 (36:33):
Yeah, we went through a standard sort of government tender process.
I think one of the things that was interesting was
we obviously had to respond quickly, so the government,
you know, probably got a shortlist of
top app developers in the country,
(36:56):
and obviously with my work with Z Energy, running
their digital programs, and a few others, our name made
it onto that list, and we came in with pretty
strong references from business leaders and sort of customers, and
we were also able to demonstrate that we had a
team ready to go and I think that was really
(37:18):
what clinched it for us. Our availability, obviously meeting all
the other criteria and winning on equal terms there, but
having a team that was available and ready to go
definitely didn't hurt the situation. And one of our advisors,
Roger Dennis, had just come back from China December twenty
nineteen I believe, if not November, and he sort of
(37:40):
said to our CEO at the time, Hey, mate,
there's something going down in China. It looks like
a SARS-type flu thing. Something like this has never really
happened on this scale in China, and people are genuinely concerned. We
should probably start thinking about it affecting the rest of
(38:02):
the world. Because Roger particularly specializes in sort of crisis
identification and mapping out how these things play out.
So he sort of tipped us off, and the
provocation was what digital tools would you apply to a
modern pandemic. So we developed a couple of prototype ideas
(38:27):
we called them. One of them was Plan C, which
was a quick assessment for businesses to see where risks
in their supply chain exist. But it became quickly evident
that digital contact tracing was going to be a thing.
I think Singapore had started to kick off things in
the in the UK had started discussing trials and Isle
of Wight and we were obviously specialists in mobile technology
(38:51):
and building mobile apps, so we started to also ideate
and prototype. And so when we went through
this government procurement process, we were able to demonstrate working
examples and a lot of thinking and a lot of
risk management strategies which played out really well. One
of the ones that I can think of was our
advice around you know, how Bluetooth should be approached that
(39:15):
a lot of other countries decided to make their
own implementation, and our recommendation was that that's risky. It's
really risky. You're better off keeping it simple and easy
to use for everybody and then waiting for a more
comprehensive solution from the likes of Apple and Google. And
that did eventuate, to Apple and Google's credit, and that
solution was much more straightforward and reliable to implement.
(39:38):
So yeah, I think there was.
Speaker 2 (39:41):
So you've got a pandemic raging, and
you are appointed by the government to build an app
in record time. What were the challenges?
Speaker 3 (39:57):
Yeah, I think one of the biggest challenges. You know, normally,
when you're building something like this, the regulatory environment is
very black and white, very clear. Yeah, whereas in this
situation someone had to go dust off the regulation because
you know, the pandemic and quarantine rules are probably you know,
first authored in the sixteen hundreds or something for dealing
(40:20):
with ships that come to your ports with sick sailors.
So yeah, so I think there's the modernization. You know,
this is something that technology struggles with all regulation, right,
so when you when you look at social media and
all of the regulation that relates to that, we still
aren't caught up really. So if you imagine that that
(40:41):
scenario for a pandemic or a health event, there's a
lot of thinking that has to go into at what
point does it become a health event, what do people
need to consent to, and where does their data need
to live? And you also then need to make judgment
calls. You know, it was very clear that no one
in the government was, and rightly so, willing to push
the risk out on the privacy side of things too
far, right, so early
(41:04):
decisions were made that the data had to sit on
the user's phone fully encrypted, so that until it became
a health event, until you were included in a
contact tracing chain, the government didn't have any right to
that data and there was no secret place where it
was stashed in the cloud somewhere, and those decisions had
(41:24):
real technical implications. So every single device that was out there,
and I think we had three million active users at
some point, that means we had three million different encrypted
bits of data sitting on people's phones that we just
could not lose. So every time we had to roll
an update, we would be testing across thousands of devices
(41:45):
to make sure that you didn't lose those things, right,
And then you've then also got this ever changing rule
set about lockdowns and when we you know, color light
systems and all kinds of stuff that you need to
participate in, and so you know, you need to be
on your toes continuously and the requirements are never quite settled,
(42:06):
which is quite normal in software, but the pace that
we had to deliver those changes at was very breakneck.
And the importance, absolutely.
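[Editor's illustration] The on-device privacy decision described here, diary data stays on the phone and is released only with the user's consent, can be sketched as a simple consent gate. This is purely an illustration of the pattern, not the actual NZ COVID Tracer code; the class and field names are invented for the example, and real encryption at rest is omitted.

```python
import json
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DiaryStore:
    """Illustrative on-device diary: entries stay on the phone until consent is given."""
    consented: bool = False  # set True only once the user joins a tracing chain
    _entries: List[dict] = field(default_factory=list)

    def record(self, location: str, timestamp: str) -> None:
        # In the real app, entries would also be encrypted at rest on the device.
        self._entries.append({"location": location, "timestamp": timestamp})

    def export_for_tracing(self) -> Optional[str]:
        # The health authority receives nothing unless the user has consented.
        if not self.consented:
            return None
        return json.dumps(self._entries)

store = DiaryStore()
store.record("Cafe on Queen St", "2020-05-01T09:30")
print(store.export_for_tracing())  # None: no consent, no data leaves the phone
```

The design choice being illustrated is that the export path, not the storage path, carries the consent check, so there is no central copy to protect in the first place.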
Speaker 1 (42:16):
It's life and death stuff in that environment.
Speaker 3 (42:19):
Yeah, absolutely.
Speaker 2 (42:21):
What were the lessons from that experience for you,
for a software company? You're growing rapidly, you've got a
whole lot of stuff going on in your business. This
comes in from left field. You know,
the country's locked down, so doing business is hard anyway.
And then something that's a huge project, huge pressure around
(42:46):
timing and, as you say, constantly changing. The lessons from
a project management point of view must have been huge.
Speaker 3 (42:55):
Yeah, definitely. If I sort of had to quantify,
I think the biggest thing is in those situations,
people are just at the heart of it. And it
sounds a little cliche as an answer, but to be honest,
(43:16):
they're writing the code, they're doing the testing, they're thinking
through all of the processes of the maps and stuff
of how this thing can go and where it might fail,
and you know all of those things, so you know
when you're in that crisis situation, that's where my head
goes immediately, you know. Like I
was asked for comment on this recent CrowdStrike failure,
and BusinessDesk sort of quoted one of my first words,
(43:40):
which is the first thing I would be doing is
getting on the phone to your head of people and
making sure that your people are safe and sort of
that's the permanent lesson that I've learned. It's like, no
matter what the crisis is, the first thing you need
to do is put yourself in your employees' shoes or
your partners' shoes and then see what risks sit around them,
and as a management team, figure out how to deal
(44:03):
with those first. Because if those people don't feel safe
or rested or able to do their jobs, then the
whole program's at risk because that's when things get missed.
That's when people don't speak up or you know, don't
vocalize the right things, and so you know, a big
cause of failed delivery, I think now more than ever
(44:25):
is how well are you looking after your people? And
the really interesting thing about that, Bruce, is, not
to sound too much like a soft touch. If you
do look after your people, it also gives you the
ability to apply a bit more pressure, yeah, and allows
your people to take more pressure.
Speaker 1 (44:42):
Yeah.
Speaker 3 (44:43):
And I think that's probably a
very subtle but important difference. And you know, if you
look at something like the Steve Jobs school of business,
where he's a renowned such and such, shall we say,
but very, very capable.
Speaker 1 (44:57):
He was a tough taskmaster. Can we say that?
Speaker 3 (45:01):
He was, yes. So I think there's a
real art form in balancing off really supporting your
people to make sure that you've removed everything in the
way that might cause them to trip up, but then
using that as an opportunity to put them into more
comprehensive and more complicated situations because you've kind of created
(45:21):
the environment that will allow them to do their best work.
Speaker 2 (45:23):
Right. You mentioned the CrowdStrike failure. Will we see
more of those events?
Speaker 1 (45:32):
Do you think?
Speaker 3 (45:34):
Yeah, it's definitely. Yeah, it's one of these really
interesting areas because, you know, cybersecurity is much
like combating terrorism, right, and it's
an attacker versus a defender situation. If you
(45:54):
think of it in a terrorist plot type approach, the
attacker just has to get in once,
but the defender has to stop it a thousand times.
Speaker 1 (46:06):
It has to be there all the time.
Speaker 3 (46:07):
Yeah, exactly. So there could be a thousand attempts. The
attacker can be unsuccessful nine hundred and ninety nine times
out of the thousand, but if the defender fails once
out of the thousand attempts, what we see as the
general population is an absolute disaster, right. We see a
complete massive outage, you know, like the
Waikato DHB breach. Now, in the case of CrowdStrike, it
seems to be something related to quality and culture. And
the founder of CrowdStrike was the previous CEO of McAfee.
He left McAfee under very similar circumstances where McAfee had
a similar hastily rolled out update. Now, the update
(46:51):
was only out in the world for about sixteen minutes apparently,
but it knocked out hospitals and emergency services and all
kinds of stuff, and it tanked McAfee's stock price by
I think like forty percent, which allowed Intel to launch
a hostile takeover. So quite a disastrous outcome to a
lack of quality control. But I think CrowdStrike really
(47:12):
exemplifies reliance on a single provider for a lot of
these security related matters, and it also exemplifies the additional
complexity in the world. I don't know if you've noticed,
but I constantly complain about this. My iPhone doesn't just
work like it used to anymore. Yeah, so say about
(47:34):
five or six years ago, maybe up to about three
years ago, when you tapped on anything on an iPhone
or you did anything on an iPhone, it just worked, right, Yeah,
like no questions asked, it just worked. And more recently,
in the last couple of operating system updates in the
last couple of years, whenever I tap on something, it's
a little glitchy, and it's a little you know, it
doesn't quite follow through. And I think that's a representation
(47:58):
of everything that seems to be happening as technology iterates.
Even cars. You know, I used to be able to
work on them. Yeah, you pull up the hood on
a car from the seventies or eighties, you know exactly
what everything is, and you know which tools you need.
Probably need about three tools total, and you can fix
about eighty percent of the problems that that car might have.
(48:18):
I don't even want to begin with modern ones. Yeah,
they're basically laptops on wheels.
Speaker 1 (48:24):
Now, Yeah, you don't know where to start, right.
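[Editor's illustration] The attacker-versus-defender asymmetry described above can be put in rough numbers. A minimal sketch, using illustrative probabilities rather than real threat data:

```python
# Probability the defender survives n independent attempts when each
# individual attack succeeds with probability p.
def defender_survival(p: float, n: int) -> float:
    return (1.0 - p) ** n

# Even a defence that stops 999 attacks in 1000 is more likely than not
# to be breached at least once over a thousand attempts.
print(round(defender_survival(0.001, 1000), 3))  # 0.368
```

So the defender's odds of a perfect record fall to roughly one in three, which is why the single failure is what the public ends up seeing.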
Speaker 2 (48:27):
Yeah, we're with Danu Abeysuriya. We're going to come back
in a moment and talk about artificial intelligence. Back with
the founder and chief technology officer of Rush Digital, Danu
(48:49):
Abeysuriya. Danu, I mentioned in the introduction, I think, artificial intelligence.
Speaker 1 (48:56):
It's early days.
Speaker 2 (48:59):
There's a lot of hype around it, the software companies
that are involved in it. The share prices are going nuts.
But if we sort of pull back to
where it's at, it's, as I said, early days.
People are starting to use it. How are
people using AI? Just in terms of the general
(49:19):
population at present.
Speaker 3 (49:23):
Yeah, AI is really one of those, you know, I
believe BCG, sorry, JP Morgan Chase, I believe, refers
to it as a milestone technology, and they're sort of
predicting that it can affect global GDP by sort of
(49:44):
seven percent in the next decade or so, which is
seven trillion dollars of net productivity, which is quite you know,
it's quite comprehensive. The category of milestone technology includes the
light bulb and AC current, AC electricity, sorry. So yeah,
it's really in that category. And when
(50:05):
you look at the cost of training and the
cost and complexity of developing these models, it's fully integrated.
You need the power systems to power the computers.
You then need the computers to be reliable enough. You
then need the computers to have enough processing power
to simply do the training. You need people and computers
to do the data labeling. Then you need the ability
(50:28):
to have that same infrastructure deliver all of that capability
at scale. I mean, you've got Microsoft and some of
these larger AI companies buying power directly from nuclear power
plants, completely bypassing the grid, and actually moving, you know,
potentially CO2 emissions, which is the main driver for
(50:50):
buying something like nuclear power, by orders of magnitude, you know.
And really, that's what we're dealing with here: probably the
sum of all human knowledge and the jet engine of
compute to deliver some of these capabilities. The really
disappointing thing is we're using it for things like art generation,
which you know, is very valuable, and we can see,
(51:11):
you know, parts of society being affected by that. But
I think one of the most interesting things
about AI is it's so widely applicable. People are actually
struggling to be creative enough to apply it. And I
see this below the line in internal operations. You know,
(51:32):
Rush is developing, you know, our AI tooling,
so we're applying AI tooling across our business for internal operations,
and I see this when I'm dealing even with our people.
You know, some of our brightest engineers are still
only just figuring out how to you know, get the
most out of it. And I still have these really
(51:52):
interesting aha moments where people go, oh, it can do that.
You know, that phrase is bandied around our office,
you know, pretty frequently, And I'm not embarrassed to say
it because when you when you look at how people
are using it out out in the wild, most people
are using it to rewrite an email, check grammar, or
form a skeleton plan. But its ability to solve so many
(52:17):
different problems really requires a good comprehensive understanding of
how it works and how to think laterally about solving
your problem. So with AI, like chat based systems, large
language models, you know, they have a huge, huge
(52:38):
potential in real automation. So in enterprise businesses, you know,
getting systems to talk to each other is typically very challenging. Sure,
And one example is if you've got an ERP or
something that produces a bill as a PDF that you
send to a customer, the work required to get that
system talking to a totally new system. You know, that
(52:59):
could be in the millions of dollars, and it requires
tons of licensing of software. The really interesting thing is
these LLMs or these chat based interfaces, they don't require
a lot of structure. They don't even care. So you
could throw that PDF document across the fence into an
AI model and have it fully extract all of that
data and do it in a you know, a millisecond
(53:20):
or a second twenty four to seven, three sixty five. Yeah,
and you could do that, you know, orders of magnitude
quicker and more cost effectively than you know, the traditional
way of doing that sort of task.
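[Editor's illustration] The throw-the-PDF-over-the-fence idea can be sketched like this. The model call is stubbed with a canned response so the example is self-contained and runnable; the field names and the `llm` callable are invented for illustration, not a real API.

```python
import json

def build_extraction_prompt(document_text: str) -> str:
    # Ask the model for structured fields from unstructured invoice text.
    return (
        "Extract invoice_number, total, and due_date from the document below.\n"
        "Respond with JSON only.\n\n" + document_text
    )

def extract_invoice_fields(document_text: str, llm=None) -> dict:
    prompt = build_extraction_prompt(document_text)
    if llm is None:
        # Stand-in for a real LLM API call, returning a canned answer.
        llm = lambda _p: '{"invoice_number": "INV-001", "total": 1250.0, "due_date": "2024-07-01"}'
    return json.loads(llm(prompt))

fields = extract_invoice_fields("ACME Ltd, Invoice INV-001, Total $1,250.00, due 1 July 2024")
print(fields["total"])  # 1250.0
```

The point of the pattern is that neither system needs a shared schema or integration middleware; the unstructured document itself is the interface.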
Speaker 1 (53:36):
Yeah.
Speaker 3 (53:36):
Sure. And then if you look at this
lateral thinking approach, Tesla is a
good example of this, where the full self drive system,
the autopilot system that Tesla has been working on forever
and will probably be working on forever. But that's a
different story. They saw a massive shift, a massive change
(54:00):
between their version eleven and version twelve. Version eleven,
which was a fully self driving system, still made
a lot of errors that put the drivers in pretty
hairy situations reasonably frequently, right. The new system
actually drives the vehicle. It turns the steering wheel,
applies the brakes, It decides what to do with traffic lights,
(54:22):
It navigates its way around street cones and pedestrians and
people falling off their bikes and cars crashing, and all
kinds of really interesting scenarios that a human would deal with,
some better than others. Its new system is actually based
off the same techniques as ChatGPT. So what they've
done is they've translated the video footage into a context
(54:45):
that a large language model could understand, so effectively words.
And then they've asked this large language model, given this
description of the situation that I'm in, what do you
recommend I should do to the car? Apply the brakes,
turn the steering wheel by how much? And if you
look at some of the way that these systems work,
(55:06):
it's really transposing something that you would think of as an
image processing problem into, you know, a question and
answer problem. So the challenge to most people for productivity
is like, how can I reframe the problem into more
of a question and answer type solution and have
an extended conversation with these systems to get the result
(55:28):
that I want. It's all about refining
what you're asking the system to do.
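[Editor's illustration] The reframing described above, turning a perception problem into a question-and-answer problem, can be shown with a toy sketch. This is not how Tesla's system actually works; the scene fields, the rules, and the stubbed model are all invented for the example.

```python
# Toy illustration of reframing perception as question-and-answer:
# describe the scene in words, then ask a (stubbed) language model what to do.
def describe_scene(pedestrian_ahead: bool, light: str, speed_kmh: int) -> str:
    parts = [f"Travelling at {speed_kmh} km/h.", f"Traffic light is {light}."]
    if pedestrian_ahead:
        parts.append("A pedestrian is crossing ahead.")
    return " ".join(parts)

def recommend_action(description: str) -> str:
    # Stand-in for the model: a real system would send the description to an LLM.
    if "pedestrian" in description or "red" in description:
        return "apply brakes"
    return "maintain speed"

scene = describe_scene(pedestrian_ahead=True, light="green", speed_kmh=50)
print(recommend_action(scene))  # apply brakes
```

The structure is the point: once the situation is expressed as text, the "what should I do" step becomes a conversation you can refine, exactly as described.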
Speaker 2 (55:35):
You asked the question. So if people are sitting
in their businesses or their offices right now looking
at AI or ChatGPT or whatever they're looking at,
saying I want some real productivity improvements, I'm sitting here
working on my own, what are the things that people
could be doing differently today?
Speaker 3 (55:55):
Yeah, sure, yeah. There's two ways to sort of approach this,
I think. One is look for a point solution.
So a point solution is obviously very specialized at
that one task, and it means that you apply it somewhere. So
in your accounting, for example, there might be a point
solution that automatically files your receipts for expenses, you know.
(56:18):
Another example is RUSH. We built a system
we call our Vision, which does computer vision analysis on CCTV,
and we do health and safety observations on factory floors
and in sort of logistics centers and things like that, and
we can sort of spot things like people lifting things
with bad posture. We can sort of count the number
(56:41):
of times people get within, shall we say, too close
to a forklift or other form of heavy equipment
that's got wheels, and that information is really important, and
that point solution is outcome focused, right. So any
buyer of the solution doesn't really need to know how
the AI system works. They just need to know that
the data coming out of it is reliable and actionable, right, Sure,
(57:04):
So looking for point solutions, add the word AI to
the end of it and see what comes up. And
then sort of below the line, I think using systems
like ChatGPT and Claude and Perplexity, those deliver a lot
of flexible and readily available productivity to most jobs.
Speaker 2 (57:28):
Yea.
Speaker 3 (57:29):
You know, at a minimum, I would say you're probably
saving ten percent of your time, especially in administrative type tasks,
so you know, akin to rewriting emails or forming plans
or things like that. And if you take that thinking
a little bit further and use those systems as oracles
of knowledge, and systems that allow your people to heavily
(57:54):
interact with as if they had a peer that knew
everything and could answer any question, you start to think about, okay, well,
where in my business can I improve things? Can I
improve decision making? For example, if you get
an MBA, that information is now freely
available through something like ChatGPT. So you can be applying
decision making models, you can be applying return on investment calculations,
(58:17):
all of those things are now freely available if your
team are willing to and know how to engage your
system like that to get that productivity. And you know,
if you are looking to develop plans to execute in
an organization, that's another great way of sort of gaining
some of that additional productivity. Other areas are your
(58:44):
technology teams will obviously benefit from this. You know, there
are a lot of code generation capabilities, so if you do
Speaker 4 (58:50):
have a technology team, being able to improve your testing
and your code quality and your security, those are all
areas where these systems can add a lot of value.
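[Editor's illustration] The back-of-the-envelope return-on-investment calculation mentioned above is the kind of decision model anyone can sanity-check with an AI assistant. The figures below are made up purely for illustration.

```python
def roi(gain: float, cost: float) -> float:
    """Return on investment as a fraction of the cost."""
    return (gain - cost) / cost

annual_saving = 40_000.0  # hypothetical productivity saving from new tooling
tool_cost = 25_000.0      # hypothetical cost of rolling the tooling out
print(f"ROI: {roi(annual_saving, tool_cost):.0%}")  # ROI: 60%
```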
Speaker 2 (59:01):
If you look beyond all the hype, what does the
future hold for AI technology?
Speaker 1 (59:07):
Where does it? Where does it take us?
Speaker 2 (59:09):
If we say, you know, three years, five years, what
are we using artificial intelligence for them?
Speaker 3 (59:17):
Yeah, another introspective question, Bruce. I think, yeah, there's
definitely a lot of hype around this technology. I think
even in the markets now, you're starting to see a
bit of cooling on Nvidia and some of the
AI stocks that we've seen rapidly rise. I think the
cost of inference, the cost of running the AI models
(59:40):
as well as the cost of training is the continuous
barrier. So, artificial intelligence. The first time that phrase was
used was at a Dartmouth College conference in nineteen fifty six,
and yeah, and the field of AI has really
been in existence since then and actively pursued. There was
(01:00:02):
another one. There's actually this term called an AI winter.
It's much like a crypto winter. It's actually where the
phrase crypto winter came from; it was originally AI winter. So
there were these sort of boom and
leveling cycles that happened in AI all through AI's history.
In the nineteen seventies, there was one where you know,
(01:00:25):
there's a lot of excitement about AI, and then we
hit this sort of ceiling and it calmed down, and
then it took off again, and then there was another
one in the early nineties, and you saw movies like
Terminator 2 and Terminator talking about that in
science fiction. Those are closely related to these AI
winters actually, because there's a lot of hype and then
it plateaus out, and I'll tell you why it plateaus. It's
(01:00:49):
because we've reached the maximum available computing power, yes, for
that era. Yeah, and Gordon Moore's law. If you map
Gordon Moore's doubling of compute every eighteen months to two years, yes,
it maps perfectly to these AI winters, right. And it's
really exemplified when you think about the for example, the
(01:01:10):
computer that was used to land on the Moon, the
Raytheon Apollo guidance computer. That was a computer that had
two megahertz, and you don't need to know how much
a megahertz is, just remember it's two. And then you look
at a Furby, which was a toy from the nineties;
it had more power than that computer. And if you
compare it to an iPhone, iPhone is nearly eight thousand
(01:01:30):
times more powerful. A single iPhone is nearly eight thousand
times more powerful than the computer that landed on the moon.
There were only seventy-six of those computers built, and
they were at a cost of about one hundred
billion dollars, and an iPhone, a single iPhone, of which,
you know, there's one point five billion of them
floating around, only costs about five hundred dollars. Yeah, so if
(01:01:52):
you compare them next to each other, the scale of
computing that we've been developing as a society has been
doing this exponential growth.
Speaker 2 (01:02:00):
So are you saying that the limitation to how fast
AI moves is the microchips, the computers, the talent
that uses them, and the imagination, I suppose, and
the power supply.
Speaker 3 (01:02:21):
Yeah, definitely. Yeah, that's a great summation. I
think in the next three years, it's
really about us as users adapting to it, much like
how we had to adapt to instead of looking through
the Yellow Pages, we had to start using Google, and
much like you know, moving from typewriters to PCs and
(01:02:42):
moving from desktop to smartphones. I think the next three
years is about how do we transition from doing things
without everything being AI assisted to doing our most basic
productivity tasks and administrative tasks heavily AI assisted and looking
for only things that are you know, AI enabled as
(01:03:03):
the point solutions that we buy over the next years.
And then the more difficult use cases, you know, like
true human interaction with robots, autonomous vehicles. You know, we
may see that pushed out even further. If I was
a betting man, I'd be saying around
twenty twenty six, if you map it out on that
exponential growth curve, the cost of inference and the cost
(01:03:24):
of doing everything becomes very difficult around that point. And
really there's a lot of development in quantum computers,
which are a real step change in how computer processing works.
So I think until we get those kinds of shifts
in our processing power, we might not see the AI
of sci fi. But I don't think that that is
(01:03:47):
one hundred years away. I think ten to twenty years
at most.
Speaker 2 (01:03:51):
So the risks around AI. You know, you
talk about sci fi and Terminator movies and robots taking
Speaker 1 (01:03:58):
Over the world.
Speaker 2 (01:03:59):
If you spend five minutes on a
YouTube channel nowadays, you can see that people
have already got robots doing some spectacularly cool things, but
also some spectacularly destructive things. What are your thoughts about
the commentary around the risks of AI and how do
we protect society against those?
Speaker 3 (01:04:21):
Yeah, it's a very, very challenging potential problem, and
I think that's what gets lost in the noise.
It's a potential problem. Yeah, but putting the cat back
in the bag is so much harder than just planning
to never let it out. Yeah. And I think everybody
(01:04:46):
needs to move to a conversation of Hey, this technology
can be used for so much good, you know, like
the recent Med-PaLM 2 model, a chat model from
Google, was able to beat the average score of a
doctor passing the medical bar in the United States. So
you think about a key problem in New Zealand, like
(01:05:07):
our healthcare system and the workforce and that healthcare system,
could you redesign the healthcare system to have lower risk
activities AI assisted with a less trained staff member. You know,
So if you look at a marine medic, a
US marine medic, they get basically pushed through medical school,
(01:05:28):
but in about a year or two and they remove
things like cancer treatment and anything that's not related to
a gunshot wound on the field. Right, So, if you
had an AI system that was able to offer half
decent advice that still needed some mechanism of being checked
by a qualified human, could you redesign and revolutionize a
healthcare system to be able to triage at a
(01:05:49):
much more effective scale. So those are the potential benefits
that we could be seeing, and education is one of
those things. But the real problem is you can't ignore
the potential on every side of the fence, right, so
people are inherently emotive, and when you see crises
like war and, you know, all of those kinds of things,
(01:06:11):
you immediately jump to how can this system be used
to defend myself? And if I'm going to defend myself,
I have to assume that my enemy is not going
to be so kind and they're going to figure out
a way to use it as a weapon, as an offense.
I think that's fundamental. One of the main risks around
AI is it's very, very hard to control
(01:06:34):
because there's no physical limiting barrier to how you control it.
You know, with nuclear weapons you have to get the uranium,
so you've got this physical limit which allows you to
put regulation in place and allows you to track where everything is going.
But with AI, you know, every single laptop
on Earth, potentially every cell phone on Earth, could run
a pretty comprehensive AI model, and in
(01:06:58):
ten years probably will be able to do that. So
some of the risks are what happens when you see
one of those robots with a gun. Yeah, that's definitely
not something you want to meet on the battlefield, right?
And, you know, taking it down a notch, who hasn't
(01:07:18):
received a spam call these days? You know, that's either
a fully automated call or something more nefarious, like
someone trying to scam you out of your bank balance,
or worse yet, a loved one who's not so
tech-savvy. You know, AI is able to
scale human-like conversations. So cyber crime, you know,
(01:07:43):
day-to-day cyber crime. These are topics that are
really relevant, that are happening. We've probably lost billions of dollars
in invoice fraud and telephone scams and banking
scams and, you know, investment scams. We haven't really sat
down and said, okay, let's not just disregard
(01:08:05):
this conversation because you sound crazy when you bring it up.
Let's go through the list of problems and map
out how AI could turn bad on these real problems
that affect us right now, and let's do something about
that instead. What I've seen from most policymakers, the EU
included, who are the most advanced on a lot of this
stuff, is fairly abstract things, right? Like, thou
(01:08:30):
shalt keep data private. It's like, great, yes, yes,
that's really important. But what are we doing against the scammers?
And so those are the kinds of areas where I
think we could be spending a little bit more time
thinking about the impacts on society and how we
are going to manage systems like this. And we also
(01:08:51):
need to pick an ethos to live by. You know,
one thing I don't love doing is pitting man against machine.
So if I make that a more relatable example, I
would say that the machine, if it's well built
and it operates twenty-four seven, is never
going to miss a thing, right? So when you apply enforcement mechanisms,
(01:09:13):
let's say red light cameras or parking fines, if
you're pitting a person against the machine,
is that fair? And I would say
that's not fair. And many people would argue, well,
just follow the rules, which is a reasonable request, right,
(01:09:35):
But it completely disregards the reality of society and life.
So you have to pick as a society how you're
going to implement these technologies. Otherwise, what you'll probably find
is you'll sleepwalk into a completely authoritarian, autonomous society where
everything you do is monitored by a camera or a signal,
you know, an IoT device or something like that, and
(01:09:58):
every single mistake you make is absolutely hammered. And I
think that's a really, really important thing to be
talking about right now. You don't have to make any
decisions around it, but we should probably be thinking about, well, okay,
if we are going to make the most
out of these systems, what's our
(01:10:19):
ethos behind them? How are we going to treat this?
Are we going to be more assistive?
And you know, a simple example might be: okay, the
system's detected that you've parked illegally, you have to pay
the fine, but the fine goes towards a credit against
your next offense, right? That's a softer approach that
doesn't risk the revenue that might be generated from that
(01:10:39):
and also doesn't minimize the return on value on that
AI system. What you don't want is a massive backlash
and people reacting badly, which is what's probably going to happen.
Speaker 2 (01:10:50):
Yeah. Well, because you're starting to challenge the fundamentals of society, right,
and one of those fundamentals is fairness. And there
are countless others. Boy, we could talk about this all
day and most of the night, I suspect. I've got
so many more questions and so little time.
Speaker 3 (01:11:12):
That's right, But you're right, those.
Speaker 2 (01:11:16):
Those fundamental aspects of fairness and other values are
all the things that we're going to have to work
out as we deal with AI and the many
Speaker 1 (01:11:29):
Forms that it throws at us.
Speaker 2 (01:11:30):
As I said, Danu, I've got all sorts of other
questions I'd love to ask you, but the one that
I absolutely have to ask you is that if you
were the New Zealand Prime Minister for a day, what's
the one thing you'd like to do?
Speaker 3 (01:11:43):
Now, this is a great question. I'd be genuinely
interested to hear what others would say. I'm definitely asking it at barbecues now,
that's what you need to know.
Speaker 1 (01:11:54):
You're asking it at barbecues.
Speaker 3 (01:11:55):
Okay, yes. So I thought hard about this, and I
think what I would love is for New Zealand
to develop more of a domestic economy. So I would
try to find a way for us to get to
about ten million people. I think that's a real tipping
point in a society's ability to be self sufficient in
(01:12:16):
terms of domestic commerce. And I think what that does
is it really forms a bridge for businesses to be better,
for us to raise middle-management talent, to go past sort
of the small-business twenty-person headcount domestically before we
then have to take this gigantic leap. You know, right
now in New Zealand, what you end up doing is
(01:12:36):
you start a business, you get to
about twenty headcount, and then you need to find a
way to scale it domestically, and if you can't do that,
you then need to figure out how to export it.
That's a really big leap, from "I've just started a
business and I have twenty people working for me" to
"now I need to get it overseas", with regulations and
all kinds of stuff, right? And I think
(01:12:58):
a domestic economy would really start to facilitate and solve
a lot of problems. I think the side effects of
developing a domestic economy would probably produce more tax revenue, which
would justify infrastructure builds, all kinds of stuff. And I
think a really targeted immigration policy could really help with
that as well, to bring in sort of those specialized
(01:13:18):
skills to allow that economy to flourish.
Speaker 1 (01:13:23):
Well.
Speaker 2 (01:13:23):
Danu Abeysuriya, I don't mind having immigrants like you in
the country. You add a lot of value to the
people around you, the clients you work
Speaker 1 (01:13:31):
With, and society as a whole.
Speaker 2 (01:13:33):
It's been a real pleasure having you on leaders Getting Coffee.
Speaker 1 (01:13:37):
Thank you for joining us.
Speaker 2 (01:13:38):
As I said, I wish we had more time, and
that might mean we have to get you back again
one day. You've shared some fascinating thoughts about the world
of technology and the AI inspired future that awaits all
of us. So thanks for joining us, Thanks for sharing
your story with us, and it's.
Speaker 1 (01:13:56):
Been a lot of fun.
Speaker 2 (01:13:57):
Thanks for being part of leaders Getting Coffee.
Speaker 3 (01:14:00):
Thanks, Bruce. Really appreciate it.
Speaker 1 (01:14:02):
Finally, folks, my leadership tip of the week.
Speaker 2 (01:14:05):
It's not so much a leadership tip as an aspiration,
because it comes from our guest and it's the mission
statement that he mentioned earlier. We design and build technology
to better serve humankind. We design and build technology to
better serve humankind. Perhaps that's something for all of us
to consider and aspire to in our daily actions. That
(01:14:25):
last part of that sentence, to better serve humankind. Thanks
again for joining us on leaders Getting Coffee Episode twenty
four with Rush founder Danu Abeysuriya. If you have any feedback,
please get in touch at info at leaders Getting Coffee
dot com. Remember that our favorite charity is Bike for
Blokes dot co dot Nz. And we'll see you soon
with another New Zealander and another fantastic leadership story. Until then,
(01:14:49):
have a great couple of weeks and we'll catch you soon.