Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to episode one oh one of the Live with the
Maverick podcast. The theme of today's discussion is Actuaries in
Software, and we are very excited to have with us
our guest, Marcela Granados. Marcela is Principal, Global Head
of Insurance at Databricks. So welcome, Marcela.
Speaker 2 (00:19):
Thank you Dominique for having me. I'm a big fan
of your podcast and I'm excited to be talking to
you today.
Speaker 1 (00:27):
It's a pleasure to have you as well, and thanks
for the compliment. And apologies, I hope I pronounced
everything right. I took Spanish for five years, but sometimes
I don't always roll my R's and do my tildes
right, so hopefully I got your name right enough.
Speaker 3 (00:40):
Yeah, yeah, that's good enough. Thank you. Great.
Speaker 1 (00:43):
So, before we get into it, I'd just love to
give you an opportunity to introduce yourself. I was excited
about today's episode because we're in the same space, and
I feel like I'm going to be able to relate to
a lot of the stuff that you're going to talk about,
so I think it'll be fun. It's always fun, but
that's something I noticed: I think this one is going
to be especially fun. So please, introduce yourself.
Speaker 2 (01:02):
Yeah, sure. So, Marcela Granados. I am Principal and Global
Head of Insurance here.
Speaker 3 (01:08):
At Databricks.
Speaker 2 (01:09):
I've been an actuary since, I guess, seventeen years old, when
I started to pursue the career back in my hometown
in Mexico, and now, here at Databricks, I'm just doing
everything I can to bring technology to solve business problems.
Speaker 1 (01:27):
Excellent. Now, I mentioned one thing that we have in
common is that we're both actuaries working in the software space.
So what was your motivation for going into the software space?
Speaker 3 (01:39):
Yeah?
Speaker 2 (01:39):
So I think that, as you said, anybody that goes into insurance
normally does it to just fulfill the promise to
the policyholders, to just be there when they need
you the most. I think that what has changed, and
you probably can relate given your experience as well,
is that a lot of the information that you need
(02:01):
to answer questions, to pay losses on time, to be
able to price adequately, still resides in a lot of
legacy systems, and people are frustrated. I think that what
we saw in the insurance industry when Lemonade, you know,
became a very innovative company, around a lot of the
insurtechs, was that the expectations of
(02:26):
the policyholders were to just be able to have personalized experiences.
Meaning, you know, a lot of people compare it to,
like, I want to be the Amazon of insurance, I
want to be the Netflix of insurance. You want your
claim to be paid on time, and you want to
be able to be underwritten within minutes. And the technology
(02:50):
that you would need to be able to do things faster, better,
cheaper cannot really happen without software, and without people being able
to utilize that software to fulfill the promises to policyholders
on time and accurately.
Speaker 1 (03:08):
I love that answer, and I typically don't do this,
but I'm going to just quickly insert what my reason
was, because I think, I mean, one of the benefits
of this show is for actuaries to know the
range of what they can do. But first: what year
did you go into software? When did you start at
Databricks?
Speaker 3 (03:25):
Yeah, it was two and a half years ago, so,
like, right after the pandemic.
Speaker 1 (03:32):
Okay, so we're right on par, because I was just
over three years, so not too far from that. But for me,
it was about helping companies to modernize. I'd
done a TED talk around, what, July or so of twenty
twenty-one, and one of the things I talked about
was actuaries being smarter in terms of how they use
technology, so they can maximize insights and minimize mechanics, and
(03:55):
so it was about modernization. I'd been on teams where
I'd seen a lot of manual work done, and with
manual work, actuaries get discouraged, because they're not
providing some of those value-added insights that they're trained
to provide. There were issues with governance and compliance because of things
like inconsistent coding and inaccuracies, and that had,
(04:19):
like I said, downstream implications for retention. So for me,
it was about similar themes; that's just how I
personally thought of it. For anyone who's listening:
it's obviously a slightly smaller space, but there's plenty of opportunity
within this software space, and I think that
(04:41):
will only continue to increase with time as, you know,
technology and AI become more mainstream.
Speaker 2 (04:49):
So yeah, I think that there's, I would say, few
of us right now just, like, really jumping
into technology. But there's a lot of interest, especially among
the young actuaries; they want to know what else
an actuary can be doing other than insurance or consulting.
And I see, and I'm sure you see it as well,
more and more of us providing value,
(05:13):
especially around the modernization of the actuarial function.
Speaker 1 (05:16):
For sure, I fully agree; that's what I'm seeing
as well. Now, something that one might ask is: why
would a software company need an actuary? Actuaries, of course,
traditionally work in insurance companies or, like you said, insurance-based
consulting. One might be thinking: of course, software companies
have developers and engineers and people like that, but
why does a software company need an actuary?
Speaker 3 (05:38):
So I think that's a very good question.
Speaker 2 (05:42):
For me, I was in consulting and
I was happy. But the more and more I see
these software companies, a lot of times they've
been seen as software vendors, right? I mean, you're providing technology,
whereas in consulting you're, you know, promoting your people.
(06:03):
In the insurance industry, you're doing your business as usual,
but in technology, you know, you're really just doing
software as a service. And what happens a lot
of the time is that, in order for you to, like,
understand where your software fits into, you know, whatever
your client is, you need to understand things like: what
(06:24):
are their strategic priorities, right? Like, what are the goals,
how do they get paid, how do they get measured
and monitored, and what are the pain points with the
current technology that they have? And I definitely think that
when a software company, when somebody who works for a software company,
is trying to have those strategic conversations, you know, the client
(06:48):
on the other side is just like, why would I
be telling you what my priorities are?
Speaker 3 (06:51):
Like, you're only a software vendor, right? But we
Speaker 2 (06:55):
Are seeing a lot of changes to that where these
software companies want to be trusted advisors, and to be
trusted advisors, you need to speak the language of the customer.
So going back to what was my experience, I was
approached by Data Bricks. They were in the process of
subverticalizing because you know that we have a lot of
(07:18):
and I'm sure you see the same with your company,
right with sus that there's products and then there's features
of the product. So the way that you are able
to show your technical capabilities is very horizontal. Right. So
I love your comment about governance because everybody's trying to understand,
you know, like.
Speaker 3 (07:37):
Where is the data coming from, who use the data,
who modify it?
Speaker 2 (07:42):
Access control. So, you know, that governance component is something
that's applicable to all industries: retail, manufacturing, media and entertainment, banks.
But then, what happens is that when you're providing that
point of view, you know, what I was telling you,
you're trying to have this very strategic conversation with somebody,
(08:03):
and you're asking them about their priorities or their use cases.
You really cannot do that without understanding the vertical piece
of it. So, taking it back to insurance: there's the
insurance value chain, where you have to do things like quoting, onboarding, underwriting,
(08:24):
pricing, claims, marketing, distribution, you name it. And I feel
like people like you and I, or, you know, even,
like, anybody that has taken actuarial exams,
Speaker 3 (08:35):
You understand the insurance value chain.
Speaker 2 (08:37):
You understand the use cases, and you're able to not
only speak the language of the customer, but also quantify
the benefits. I think that the quantification, being able to
put numbers behind the value that any solution can provide,
is absolutely critical.
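The quantification mentioned here can be as simple as the standard underwriting ratios. A minimal sketch in Python; the figures are invented for illustration:

```python
def loss_ratio(incurred_losses, earned_premium):
    # Incurred losses as a share of earned premium.
    return incurred_losses / earned_premium

def expense_ratio(expenses, written_premium):
    # Underwriting expenses as a share of written premium.
    return expenses / written_premium

def combined_ratio(incurred_losses, earned_premium, expenses, written_premium):
    # Below 1.0 means an underwriting profit; above 1.0, an underwriting loss.
    return (loss_ratio(incurred_losses, earned_premium)
            + expense_ratio(expenses, written_premium))

# Hypothetical book of business (figures in millions)
cr = combined_ratio(incurred_losses=65.0, earned_premium=100.0,
                    expenses=30.0, written_premium=100.0)
print(f"combined ratio: {cr:.0%}")  # prints "combined ratio: 95%"
```

Being able to decompose these figures is what lets you put a number on the value a proposed solution would add.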
Speaker 1 (08:54):
Yes, the loss ratios, expense ratios, combined ratios, ROI,
all those things. And what you said certainly resonated with
me in terms of understanding the client's challenges; we
call those discovery calls, and I'm sure you have those
as well. And what never ceases to amaze me, and
I'll say it again to anyone who's considering this particular
space, is this: I
(09:19):
came from a larger company, a much larger company, where we
had hundreds of actuaries. So if you happen to be
at a larger company, because some people are at a
small company, you may feel like you're just one of
many, and maybe your expertise is somewhere in the middle,
on par. But when you go to a smaller company
where no one understands insurance, there's a lot of reliance
on you. Has that been your experience? It certainly
(09:41):
has been my experience.
Speaker 3 (09:43):
Yeah, no, totally.
Speaker 2 (09:45):
I think that something that I do here at Databricks
is enablement, enablement and training. And that's why, again,
the actuarial exams, at least, you know, I'm very, very proud.
I think I told you, when we were preparing for this,
that I was able to pass the lower-level exams
relatively quickly, but then struggled, uh,
(10:08):
you know, to pass the last couple of exams.
And at first I was just like, you know, do
I really believe in the rigor of the actuarial exams?
And now, going back to your point about being
one of the few people, at least here at Databricks,
that know the industry really well, I give a
lot of credit to the actuarial exam syllabus as well,
(10:28):
because, yeah, I mean, like, I think that, you know,
my role here at Databricks is global, so I do
have the privilege of traveling all over the world. But
at the same time, I have a family, you know,
I have a young daughter, Sofia. So if you're
watching, your mom says hi. And
(10:49):
there's no way that I could be traveling
around the world, uh, and scaling and taking our solutions,
you know, without the right enablement. So, to your point,
and this is actually an interesting point, because the
way that we were thinking about enablement before was just,
like, what I was telling you on those discovery calls:
(11:10):
having, you know, the trends around insurance, connecting those horizontal
product features to the vertical functions of insurance, the quantification of
use cases, business outcomes. But something that has really accelerated
(11:31):
the development of enablement plans is generative AI. Right? Like,
I feel like nowadays you can go to ChatGPT, you
can go to Perplexity AI and at least, like, have
the basics of the structure on just how to have
that almost, like, next-best-action or advisor, based on
(11:53):
what you're hearing from the customer. But of course, you know,
there's nothing that really compares to having the actual experience,
like, you know, like somebody like you and I would
have that.
Speaker 3 (12:02):
Now we're working in software.
Speaker 1 (12:05):
Sure, and we'll talk a bit about generative AI
a bit later on; looking forward to that. You mentioned
Databricks a few times. Databricks, from my understanding, is a
fairly young company, and your current role is Principal, Global
Head of Insurance at Databricks.
So how would you describe your company and current role?
Speaker 2 (12:23):
Yeah, so I'll start with the company. So we're a
data and AI platform. And it's funny, because when we
started as a company, we were the creators of a
lot of open-source products. So some of you may
have used Apache Spark as a processing engine, you may
(12:44):
have used Delta for storage, and MLflow for a lot
of the MLOps, like putting a lot of predictive models
into production. And, to be honest with you, we almost
went bankrupt. There's an article by Forbes magazine
called Accidental Billionaires that talks about the origins of Databricks,
(13:06):
and how, you know, back in the day, in
twenty thirteen, people were really not bought into the whole
idea of open source and open standards. Because, like, why would
you actually take something that is available for free if
there are a lot of companies that offer, you know,
licenses and products that are going to be validated, that,
(13:28):
you know, are going to be responsible for the liability?
So, long story short, what happened with Databricks is
that even when we were calling ourselves, like, yes, we're
the data and AI platform,
Speaker 3 (13:38):
Back in the day, people did not believe in AI.
Speaker 2 (13:41):
I mean, not everybody, but a lot, like, a lot
of times you would say, we're the data and
AI platform. And now, obviously, everything changed, and basically what
we do is anything from just data ingestion and transformation. So
think about all of the times that you need to
just clean your data, reconcile the information, you build your models,
(14:04):
you fine-tune your models, and then ultimately you
get to insights. So anybody who needs to just visualize
their data: Databricks does that end-to-end
processing of the data, to take it to insights
and ultimately to actionable items. So, you know, I can
probably, as a follow-up, just give you links
(14:25):
on, just, you know, without even thinking about Databricks,
how Delta and a lot of these products work
and how the data engineers, data scientists, and even the
actuaries can collaborate on one platform. But the secret
sauce with Databricks was really the compute. Because, you
know, a lot of these companies, when they were
(14:46):
trying to do some of the things that you were
outlining around, like, the modernization: uh, you have a
lot of data you're storing, but when it comes
time to process that data, you know, it
would take a long time. Just putting it back
to, like, something that probably everybody would relate to:
you know, you have Excel, and at some point, you
(15:07):
know, Excel may be okay if you're doing, like, a what-if
analysis. But if you're doing simulations, if, you know,
your data is increasing in terms of volume,
Excel at some point breaks. So what Databricks was
able to figure out, and it started as a Netflix contest, actually,
was just, like, how can you do, like, recommendation engines?
How can you have an engine that would
(15:28):
separate the compute versus the storage, to just, like, process data
Speaker 3 (15:32):
Really really fast.
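The separation of storage and compute described here can be illustrated in miniature: the data sits in partitions, a small stateless summarizing function is fanned out over them, and the partial results are merged. This is a pure-Python sketch of the idea, not the Spark API; the claim amounts and partitioning are invented:

```python
from concurrent.futures import ThreadPoolExecutor

# "Storage": claim amounts split across partitions, as a data lake would hold them
partitions = [
    [1200.0, 340.0, 980.0],
    [410.0, 2200.0],
    [55.0, 770.0, 1500.0, 90.0],
]

def partial_sum_count(partition):
    # "Compute": runs independently against each partition
    return sum(partition), len(partition)

def mean_claim(parts):
    # Fan the work out over partitions, then merge the small partial results
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(partial_sum_count, parts))
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count

print(round(mean_claim(partitions), 2))  # prints 838.33
```

Because each partition summary is independent, adding more partitions (or machines) scales the same code; that independence is what lets a distributed engine chew through data that a single spreadsheet cannot hold.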
Speaker 2 (15:33):
So we started as an ETL company, meaning extract, transform,
and then load back the data. But now we've been
moving into just more of, like, you know, the business-outcomes
conversation. So that's my company. Well, I mean, I
work for this company; I mean, it's kind
of my company, because we're still pre-IPO, uh, and
(15:55):
we've grown very fast. As I was telling you, two
and a half years ago I was employee three thousand one
hundred and fifty-two, and now, two and a half
years later, we have more than eight thousand employees, so
it's crazy how much we've grown. Now, in terms
of what I do here at Databricks, my role
is twofold. So on the one hand, I qualify a
(16:21):
lot of these opportunities with clients. So we have
something that we call the land-and-expand strategy.
So we're very lucky to have two hundred and forty-one
insurance customers, and for the ones that are already customers,
sometimes I participate in executive briefings, so just, you know,
one-on-one connections with executives. Sometimes it's a Chief Actuary.
(16:43):
Sometimes it's the Chief Technology Officer. If we are running
sales plays around underwriting, sometimes it will be
the Chief Underwriting Officer. And that's where I
think the actuarial background really shines, because I can
tell you, Dominique, there hasn't been one time that I've
been in a room talking to either a business or
(17:04):
a technology person where, within my intro, I say I'm
an actuary and I don't get a certain level of respect that,
you know, feels good. It feels good to have that credibility.
So there are the executive briefings, where I'm able to share the
point of view that we have as a company, but
I also get to learn a lot about the customers.
So I'll give you an example. When I
(17:25):
joined Databricks, of course, the question became: can
actuaries use Databricks? To be honest with you, the
answer was no, because our platform was very geared towards
people that coded. So you would log in to a
Databricks workspace, and it wasn't low-code or no-code;
you literally had to code in a notebook,
(17:48):
like a Python notebook, right, or, like, a notebook in any language.
And at that point, you know, there are actuaries that code,
and I know we're going to get into that later
in the podcast, but the reality is just that it
wasn't necessarily a tool that would fit actuaries. So what
happened is that, during the conversations I was having with clients,
(18:09):
and I was trying to understand the use cases that
they wanted to activate, it dawned on me that we
needed to make Databricks simpler. So the beauty of
working for a smaller company, as you were alluding to earlier,
is that, you know, our CEO, everybody here
Speaker 3 (18:25):
Is like equal.
Speaker 2 (18:26):
I mean, I'm sure you can see from, like, this
glass office, that there are really no offices, right? Like,
everybody sits in a cubicle; there's little to no hierarchical structure.
So I did talk to our CEO, Ali Ghodsi, and I said, listen,
we need to make the product, you know, simpler and
more appealing to the non-technical user.
Speaker 3 (18:48):
And I wouldn't claim that I was the.
Speaker 2 (18:50):
One that really motivated that, but I was able to
somehow influence a product team. So they started launching, you know,
a version of data bricks. You know, it's called gene
like Genie or like data Room, where somebody can ask
questions in natural language, and the UI, the user interface
is very very simple, like you just like type. It's
(19:10):
kind of like a chatbot. Do you type a question
like what you were saying, like I want to know
what my last ratio has been over the past five
years for personal Auto, and it gives you the answer.
Speaker 3 (19:19):
And you can click on a visualization.
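A question like "what has my loss ratio been over the past five years for personal auto" ultimately resolves into SQL against the underlying tables. A hedged sketch of that translation, with sqlite3 standing in for the warehouse; the table, column names, and figures are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE policy_results
                (year INTEGER, line TEXT, earned_premium REAL, incurred_losses REAL)""")
conn.executemany(
    "INSERT INTO policy_results VALUES (?, ?, ?, ?)",
    [(2020, "personal_auto", 100.0, 62.0),
     (2021, "personal_auto", 110.0, 70.0),
     (2021, "homeowners",     90.0, 50.0),  # a different line, filtered out below
     (2022, "personal_auto", 120.0, 80.0)],
)

# The kind of SQL a natural-language question would be translated into
rows = conn.execute("""
    SELECT year, SUM(incurred_losses) / SUM(earned_premium) AS loss_ratio
    FROM policy_results
    WHERE line = 'personal_auto'
    GROUP BY year
    ORDER BY year
""").fetchall()
for year, lr in rows:
    print(year, round(lr, 3))
```

The natural-language layer removes the need for the user to write that SQL, but the answer it clicks through to is the same aggregation.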
Speaker 2 (19:21):
And I think that's very powerful, because then the impact
that I can have, not only for my customers but
also influencing the product that we're building, can be huge.
And so that's one side, right? Like, just talking
to customers.
Speaker 3 (19:36):
Doing a lot of public speaking.
Speaker 2 (19:39):
I do get to travel around the world and just
keynote conferences around Databricks. Sometimes I do partner with
other technology companies. So I was in Switzerland, and I
actually got to present with your colleagues from SAS, yeah,
and ServiceNow, and.
Speaker 1 (20:00):
Which ones? Which ones?
Speaker 3 (20:02):
I had to look at like I had to look
at their names.
Speaker 2 (20:06):
But it felt really good, because it was an event
that was hosted by EY in Switzerland, and they wanted
to just, like, bring best-in-class technology that influences
a lot of the ecosystem and insurance companies. So it
just felt good to not only provide my point of
view, but also learn from each other.
(20:27):
So that's one aspect of my job. The other
part, which I think is probably one of the most
fulfilling roles I've ever had, is the collaboration with data engineers.
So I always think that when you are, as you said,
having these discovery calls and trying to think about,
(20:47):
like, high-value use cases, we as actuaries have to
go beyond a PowerPoint. Anybody can come with a PowerPoint, right?
Like, again going back to generative AI, you can probably
ask it to put together a point of view around
insurance, for claims, for underwriting, and that's not the
hard part. I think that the hard part is to
(21:08):
demo how that would work in real life.
And that's where what we build with the architects
and the engineers is what we call solution
accelerators, and we demo the platform. I mean, it's basically a
combination of data and blocks of code to just do the
(21:28):
ETL piece.
Speaker 3 (21:29):
I told you, right: the ingestion,
Speaker 2 (21:31):
The transformation, and then the visualization, and just, like, you know,
going back and forth on those solutions. And I
think that is so much more powerful than a
PowerPoint, because, again, it's the innovative way that an actuary
thinks about the implications of what you're showing.
Speaker 3 (21:50):
And it's way more than code; it's literally solving,
you know,
Speaker 2 (21:54):
A problem, or providing a perspective on how to
solve a problem, from the data all the way
Speaker 3 (21:59):
To the insights and the visualization.
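A solution accelerator in the sense described strings those stages together. A toy end-to-end sketch in plain Python; the stage names, fields, and data are invented, and on the platform itself these would be notebook cells writing to tables rather than in-memory functions:

```python
import csv
import io

# Raw data as it might land from a source system
RAW = """claim_id,state,paid
C1,NY,1200
C2,CA,0
C3,NY,450
"""

def ingest(raw_text):
    # Ingestion: land the raw records exactly as received
    return list(csv.DictReader(io.StringIO(raw_text)))

def transform(records):
    # Transformation: type the fields and drop zero-paid rows
    typed = [{**r, "paid": float(r["paid"])} for r in records]
    return [r for r in typed if r["paid"] > 0]

def summarize(records):
    # Presentation: paid losses by state, ready for a visualization
    totals = {}
    for r in records:
        totals[r["state"]] = totals.get(r["state"], 0.0) + r["paid"]
    return totals

print(summarize(transform(ingest(RAW))))  # prints {'NY': 1650.0}
```

Walking a client through each stage, with their own use case in the data, is what makes this kind of demo more persuasive than a slide.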
Speaker 2 (22:00):
So I spend a lot of time, probably more than
half of my time, building those solutions and showing them to clients.
And the reality is that they're not necessarily plug and play;
we're not in the business of, like, building products, per se.
But it shows them the art of the possible, it shows
them different ways that they can use the platform, and
it gets them, let's say, seventy percent
Speaker 3 (22:20):
Of the way. So we're pretty happy.
Speaker 2 (22:23):
And you know, that also keeps my coding skills at
a decent level, because I'm not the person that's
Speaker 3 (22:30):
Doing the coding, but I do review the code.
Speaker 2 (22:33):
I do show, you know, what each piece of the
code is doing, and I get asked, like, lots of questions,
which, you know, in that case is just helping me
a lot to think about the problem end to end.
Speaker 1 (22:44):
I can certainly relate to the product demos, because I
do a lot of those; we do a lot of those.
And I'll tell anyone: like you said, anyone can give
a PowerPoint, but when you're doing an actual demo, it's
a real test of nerves, I think, because you're doing
the presentation piece, but you also have to manage
showing the functionality, and you want to demonstrate the value
(23:05):
add as well, so there are multiple pieces that you have
to manage. I don't know what you were like, but,
funny enough, I mean, I'm a public speaker, I've done
keynotes, and even then it just took me a while to
get used to that specific type of presentation.
Because when I'm giving a motivational speech or a PowerPoint,
I can make it fun and entertaining. But when it comes
to doing an actual product demo, there are a few moving
parts that you have to manage. So,
(23:28):
I think it's a worthwhile challenge
for anyone who might be interested. I just had one
quick follow-up on Databricks. I know you mentioned that
it's global, so you get to travel and everything.
So it's a global company, and of course you're a data and
AI platform for insurance companies. In terms of the
profile of the customers, you know, are
(23:48):
you going after, like, the broader insurance market? Do you
go after P&C versus life more? Is it smaller
versus large? Just share anything you're willing to share.
Speaker 3 (23:57):
Yeah, So.
Speaker 2 (24:01):
Historically, a lot of the growth in insurance came from
property and casualty, and I'm sure you can relate to that.
I always think that, you know, take telematics, right? Like,
everybody was doing telematics. Back in the
seventies or eighties, people were coming up with: what
can be a behavioral trait that would influence premium? Use credit score.
(24:25):
And nowadays, you know, people realize that it really doesn't
make sense to have credit
Speaker 3 (24:31):
Score because that can be.
Speaker 2 (24:34):
you know, an implicit way to just get into even discrimination,
or disparate impact. I'll give you the example that,
as a Mexican, I didn't have a credit score when I came
to this country, and I wasn't able to rent
an apartment in New York because I didn't have a
credit card, right? So if you are charging premium for
(24:59):
renters insurance or homeowners insurance, you know, based on credit,
you are effectively discriminating against people from a lot of other
countries, where, you know, they don't use
credit cards as often as they do here in the US.
So, with the rise of a lot of the semi-structured
data, like telematics and IoT, the P&C industry has
(25:22):
been leveraging that for quite a while. So when I
joined Databricks, the majority of our customers were P&C.
I think something that we've seen now is that,
when I look at the growth of life
insurance and annuities versus P&C, life is growing
(25:44):
at a faster rate than P&C.
Speaker 3 (25:46):
I think it's also because the denominator.
Speaker 2 (25:48):
Of P&C is larger, right? Because, as I
told you, like, historically a lot of our first insurance
customers were P&C.
Speaker 3 (25:55):
But definitely, you.
Speaker 2 (25:57):
Know, considering even the regulation, right, like take I for seventeen,
where a lot of times you need to do what
they call time travel, go back in time a year ago,
two years ago. We run an analysis, do a lot
of scenario testing that. Going back to the conversation we're
having around Excel, I'm telling you your Excel is probably
(26:19):
going to break at some point, especially if you're a
large company. So in that case, the growth in life
and annuity is definitely accelerating. P and C still represents
about sixty percent of our premium and life annuity around
you know, forty, but I think that at some point,
you know, in maybe fifty to fifty.
Speaker 3 (26:40):
In terms of.
Speaker 2 (26:41):
The types of customers that we target: at the beginning,
it was the tier-one companies. So we currently have,
as an example, eighty-five percent of the top
twenty P&Cs. But if you think about it,
everybody needs a data platform, right? And sometimes I
(27:03):
do get, you know, when I travel to the Midwest,
as an example, to just, like, see clients that are
maybe, like, tier two. So think about, like, insurance
company number twenty-one, if you rank them by market
share, all the way to, like, fifty. They're very appreciative,
you know, they're very appreciative of just having somebody going
(27:23):
and spending time with them, understanding, you know, what I
was saying earlier on, like, what are the pain points,
what technology they're using. So I'm spending more and more
time with the tier two as well, like, again going back
to the land-and-expand concept. So with the land,
I spend time just, like, you know, qualifying the platform,
(27:43):
thinking about how our platform can play well with the
other technologies in our ecosystem.
Speaker 3 (27:49):
So we do see a lot of companies.
Speaker 2 (27:50):
Of course, using SAS, using Microsoft Power BI, you know,
whatever visualization tool of their choice. But then we have to
paint a picture of an architecture, like
how Databricks can be used alongside that technology.
So again, you know, it's more the tier-two companies
where we're spending time, and then, with the tier ones,
(28:12):
we're just maintaining the consumption and the relationship, and, like,
educating them a little bit about how they can be
using our platform. And then, lastly, in terms of geographic footprint,
we have what we call top-ten regions. So it
is Canada, the US, Brazil, the UK, France, Australia, I'm trying
(28:41):
to think, Amsterdam, Japan, and I'm forgetting two more. So
I got eight out of the ten. But, you know,
just as an example, last week I was in Mexico,
and Mexico is not a top ten, but I was
able to talk to my boss and say, listen,
it is kind of like the chicken
(29:02):
and the egg, right? Like, of course these regions
are top ten because they're large, right, and they have,
at least for insurance, most of the
insurance companies. But that doesn't mean that, in other emerging regions,
we shouldn't be investing. I mean, of course we should
be investing, right? Because a lot of these companies also
have local operations. So, you know, I had the privilege
(29:25):
of just going back to my own country, and hopefully,
well, we just actually announced that we're opening an office
in Mexico City, so I'm hoping that at some point
Mexico can be part of the top ten. So that's
kind of the segmentation on just how
we view the coverage.
Speaker 3 (29:43):
But it's a great opportunity.
Speaker 2 (29:45):
I think, like, anybody that's an actuary that's starting in insurance, you know,
at some point can actually work their way, if there's
interest, of course, you know, to get to roles like
the one that you and I have, which is insurance
for technology, bridging our technical and communication skills.
Speaker 1 (30:04):
Yeah, Mexico. I talked about it on a previous episode;
I had an episode with Alejandro on actuaries in Latin America.
Also, there's a page on Instagram called, I
think it's Actuaries X Mexico; actuaries from
Mexico, with lots of good memes and content. So lots
of stuff happening in Mexico. Now, let's talk about the
(30:24):
cloud, because I like to talk about kind of the
baseline for some of the, you know, the platform
and the analysis that we've done through Databricks and
through similar software platforms. And we can't talk about modernization,
digital transformation, and unstructured data without talking about the cloud.
So for those of us who may not be familiar
(30:44):
with what the cloud means: what is it, and why is it relevant?
Speaker 3 (30:50):
Yeah. So, you know, the cloud, in
Speaker 2 (30:56):
Just very general terms, right, it's just how do you
get your data out of legacy system and help with
computer Right? So I told you a little bit about how,
at least for our company we have there's a place
where you store your data and then there's a place
where you actually do the computation. And we made the
decision very early on that we would not exist on premises.
(31:20):
We would only exist on the cloud. And we made
that decision because when you think about the rise of
big data, right, like I mean, there's been a lot
of books published by the American Academy of Factories and
both the CS and the SOA.
Speaker 3 (31:37):
The three V's.
Speaker 2 (31:38):
Right, the velocity of the data, right, like the frequency. Taking it back to the IoT example, or even Fitbits, the data comes almost in real time. You cannot process real-time data on premises. It has to be on the cloud, right? Like, I mean, it's just like the data centers that you see from Google, from, like, Amazon,
(32:00):
from Microsoft. You know, that's the technology that enables you to keep up with the velocity of data. The other piece that I think is undeniable right now in this world is the variety of data. So what I mean by that is just the whole spectrum of unstructured data.
Think about the treasure trove of all of the data
(32:22):
that you would have. Going back to insurance again, with text: claims adjuster notes, they're normally text; policy forms from underwriting, it's text, it's pages, it's PDFs, right? Like, there's still, even from a distribution perspective, a lot of the information that comes in, or just even, like, your
(32:42):
adjustments, your, like, applications.
Speaker 3 (32:45):
Would all be part of an email or a PDF.
So I feel like the.
Speaker 2 (32:49):
insurance industry has been leveraging a lot of the text piece of unstructured data.
Speaker 3 (32:54):
But it shouldn't stop there.
Speaker 2 (32:55):
You have video, you have images, you have audio, and there's so much information in there that without a platform, or without the cloud, you wouldn't be able to not only access the data that is sitting in all of those unstructured forms, but also process that data and ultimately feed it into the pipeline. So that's variety,
(33:19):
and then,
Speaker 3 (33:22):
Volume.
Speaker 2 (33:23):
Right. So I talked about velocity, I talked about variety, and then the volume of data that nowadays everybody's processing is becoming bigger and bigger, right? So, like, again, you know, there are so many insurance companies that are part of the, you
Speaker 3 (33:40):
Know, whole portfolio.
Speaker 2 (33:41):
But if you think about it, whether you grow organically or through acquisition, the data is only going to increase, right?
And a lot of the technology you would need to process that information, you know, would not work unless you're on the cloud. The other piece that I think is important, and you probably can relate to given your
(34:03):
background also in predictive modeling: you know, back in the day when you were running a generalized linear model for pricing and you were coming up with the coefficients and everything that we learned in the exams in the past, right, and I'm talking about probably when I was in the industry back in two thousand twelve, two thousand thirteen, you
(34:26):
would have the coefficients of a GLM. You would document that, give it back to IT, and then they would put it into production. What we're seeing is that anytime you're trying to test different algorithms and actually deploy those models live, and just, like, you know, being able to have the policyholders receiving their
(34:48):
premium almost instantly, right, like when the renewal comes, or, you know, incorporating a lot of these behavioral traits, you cannot deploy models on-prem. Like, you have to do it on the cloud if you want to go from batch to streaming.
Speaker 3 (35:03):
So I think it's a combination.
Speaker 2 (35:05):
of that, like the three V's, the need to deploy models effectively, as well as just considering, you know, everything else that goes into how a legacy system would no longer work. And having data on-prem was sometimes thought to be safe, but if you look at a
(35:26):
lot of the data and privacy breaches, they actually didn't occur in the cloud; they occurred on-prem. So we're definitely seeing a shift towards the cloud. Not everybody is on the cloud, and not all of the workloads are on the cloud, but there's definitely a trend towards, you know, moving from on-prem to cloud.
Speaker 1 (35:47):
Certainly I'm seeing the same thing. And one example you had used for the more limited cases where companies are not moving to the cloud, I guess, is low frequency, high severity. If you're doing a low-frequency, high-severity line of business, then perhaps it may not be as pragmatic. Well, certainly on my end, I'm seeing, you know, very similar: a lot of companies have, you know, very clear technology roadmaps
(36:10):
that do involve some degree of cloud migration.
Speaker 2 (36:14):
So yeah, exactly, and like I you know, just even
thinking about operational systems.
Speaker 3 (36:19):
Like Guidewire, right, Guidewire, we have a,
Speaker 2 (36:23):
We're part of their marketplace, and, you know, they do offer the on-prem version, but they also offer the cloud version, which is cloud data access. And you can see that probably with every technology. And again, it's not easy. I think that even right now a lot of companies are considering cloud economics, and, just like
(36:45):
before any change, they would have to consider change management. But it's definitely where you can do a lot of, like, you know, testing of use cases. I mean, I can tell you, even right now, right, like, generative AI, whatever form of generative AI you're doing, like, it wouldn't
(37:06):
exist without the cloud. So whether you like it or not, that's where it's going to be in the future. But it's not a once-and-for-all change. I think it's obviously, you know, incremental, and it has to be very strategic.
Speaker 1 (37:18):
So you took the words right out of my mouth. I was actually going to say that we talked about the platform and what enables us to do these analyses, and the next thing I was going to ask about was generative AI. Now, since large language models such as ChatGPT became mainstream, the term generative AI has become a
(37:38):
very common or very popular buzzword. Now, how would you describe generative AI?
Speaker 2 (37:45):
So I would say that it is a form of deep learning. So, you know, normally the way that people depict generative AI, it starts with, you know, the big circle, which is AI. Then you have machine learning,
(38:05):
that's a form of AI, and all of that. You know, like, probably the simplest way to put it is just how to emulate the human brain, right? That's what's called artificial intelligence. Then there's machine learning, where you teach the machine to think like a human. But after that you have deep learning, and then you have neural networks. And within
(38:26):
the deep learning space, you know, you're basically looking at not only the relationship that certain features would have with your predicted variable. So let's take premium for homeowner's insurance. You know that it is related to maybe square footage and neighborhood, right? Like, that's like a simple equation that
(38:46):
we learn in our actuarial exams.
Speaker 3 (38:48):
You know, it's just like a linear equation.
Speaker 2 (38:51):
But what neural networks allow you to do is not only look at the relationship that the predictor variables would have with premium, so, you know, how square footage affects your premium, how neighborhood affects premium, but they also have what they call hidden layers, which is why it's called a neural network, because it kind of, like, simulates how your brain is thinking. So then, in this case, it
(39:11):
would consider the interrelationships between the predictor variables. So it would also consider how your premium varies by square footage correlated with the neighborhood you're in, right? Of course, if you're in a fancier neighborhood, you have to normalize by the fact that, like, you know, the houses are bigger, right? And the
(39:33):
bigger the house, the bigger the problems. So all of a sudden you start thinking about a lot of different correlations, not only between the X's
Speaker 3 (39:40):
And the Y, but like all of the different predictor variables.
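The interaction idea Marcella describes can be sketched in a few lines of Python. This is purely illustrative: the coefficients and the "neighborhood score" are made-up numbers, not from any real rating plan.

```python
# Toy premium model: two main effects plus an interaction term, the kind of
# cross-effect between predictors that a neural network's hidden layers can
# learn automatically. All coefficients here are hypothetical.
def premium(sqft, nbhd_score, b0=200.0, b_sqft=0.10, b_nbhd=150.0, b_inter=0.05):
    return b0 + b_sqft * sqft + b_nbhd * nbhd_score + b_inter * sqft * nbhd_score

# With the interaction term, the marginal effect of an extra 100 square feet
# depends on the neighborhood score:
plain_slope = premium(2100, 0) - premium(2000, 0)   # 10.0 in a score-0 neighborhood
fancy_slope = premium(2100, 2) - premium(2000, 2)   # 20.0 in a score-2 neighborhood
```

Without the interaction term the two slopes would be identical; hidden layers can pick up cross-effects like this without anyone specifying them by hand.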
Speaker 2 (39:43):
So within that concept, you know, what changed with generative AI is the G piece, like, what you're actually generating, in comparison to the traditional technologies that you would use. Right, going back to the claims adjuster notes: if you want to extract information from text, you don't need generative AI.
(40:04):
It's just, like, OCR, right, like, optical character recognition, to digitize that information, and then you just run natural language processing to just, like, know what to look for in a document. With generative AI, what you're generating is, let's say, a summary of that information. So, you know, I'll give you a concrete example. Think about Google two years ago,
(40:29):
when you would, like, ask something like, I want to know who the Maverick Actuary is, right? So you type something like that and
Speaker 3 (40:39):
It will give you a source. Maybe it would like
link exactly to.
Speaker 2 (40:44):
Your website and it would show you everything that you've done. Now,
Google search engine is powered by Google Gemini. So now
if I were to ask the same question, and I say,
I want to understand the highlights of the interview that
the Maverick actually did with Alendrod around Latin America. It
would not only point to the source, but it would
(41:07):
give a summary of your interview with Alejandro.
Speaker 1 (41:09):
Right.
Speaker 2 (41:10):
So it's generating something new, it's summarizing information, it's helping you create code. So if you want to say, I want to do a cluster analysis based on, again, you know, going back to the homeowners example, square footage and neighborhood, on this data, in Python, it generates the code.
(41:31):
It completes, you know, the sentence, right? Like, it creates something new. So I think that's very, very innovative, because you're really not just pointing to a source, but
you're giving something that has not been created. But the challenge with that is, you know, what's called hallucinations:
(41:51):
since it's actually being produced by technology, it's very hard for you to even question what you're getting. So a hallucination is giving you a wrong answer, but, you know, it's given to you in a very confident way, right? So sometimes you just, like, take whatever you're getting for granted.
(42:12):
And, you know, an example we sometimes give is General Motors had launched a generative AI solution to just help customers with product recommendations, and somebody, you know, one
Speaker 3 (42:25):
Of the customers.
Speaker 2 (42:26):
It's a very famous use case, where somebody asked the question, like, what is the best pickup truck? And the chatbot, powered by generative AI, because of course chatbots have been around for a while, but now, you know, it's the fact that it's more intelligent, generating something new, recommended the Ford F-150, right? Because it didn't understand the context that this was a chatbot for
(42:48):
General Motors, right? So you shouldn't really point to the competitors. So that's just an example of how powerful this new technology can be, how it differs compared to traditional, you know, AI and machine learning, and also what are the risks that come with generating that new content.
Speaker 1 (43:06):
Yes, and for anyone looking to understand large language models better, I had the privilege of interviewing Eric Siegel on episode forty-three, so we talked about some of the foundations of large language models such as GPT, how they work, and some of the shortcomings that you mentioned in terms of accuracy. So certainly check that out if you want to understand those foundations better. So we talked about what
(43:28):
generative AI is. I'll actually add on one example, because recently I was using GPT and I was just playing around with it, quite frankly, and asking it to generate a business plan for this idea I had. And it came up with a very concrete business plan. But not only did it generate a business plan, it then said, would you like me to look at streaming options and technology? And it just kept asking me questions like we're having
(43:50):
a conversation. It wasn't just retrieving or extracting the insights that I asked it to; it was continuing the conversation and thinking of, you know, what I may ask next. So that I thought was an interesting example. But, you know, we talked about the foundations of generative AI and what it is. So, how can generative AI help to yield value for insurance companies?
(44:14):
Because we're seeing it now more commonly, in the insurance space at least; it's still fairly new in terms of its application, perhaps, in the insurance space. But how are you seeing that happen?
Speaker 2 (44:25):
Yeah, so I would say, like, three different ways. The first one is, going back to that unstructured data piece that we were talking about, something very, very cool that I've seen being developed is using AI functions to create descriptions of an accident.
Speaker 3 (44:47):
Right. So you know, of course.
Speaker 2 (44:49):
If you're a claim adjuster, you can look at the
picture of a car and just determine whether or not
is total loss. But just like any function, an AI function,
you can put parameters. So as an example, you can
look at what is the average severity of the accident,
what were the weather conditions that were present at the
time of the accident, and incorporate as semi knowledge like
(45:13):
you know, skid marks.
Speaker 1 (45:15):
Right.
Speaker 2 (45:15):
So I think that's very, very clever. And then, I'm sure some of you may have heard, just like there's a good side of using GenAI, there's also a bad side. So there are a lot of bad actors that are creating fake images, you know, of people, like the deepfakes, right, or, you know, fake images of a car. So then, what do you do with that? So within
(45:37):
the same AI functions, you know, we have something that's called AI query that compares the human-generated description of this car accident to the AI-generated description, and it's able to just identify where the discrepancy is and the root cause. And I think that's very, very powerful. Because, again,
(45:59):
you know, like, the technology became mainstream and really democratized, as you said, right, like, with ChatGPT. But if you actually look at a lot of the algorithms, just like even vector databases and how you do embeddings to do retrieval-augmented generation, that's actually in our actuarial exams. So I always try to think about, you know, when I'm
(46:20):
reading something very, very technical, taking it back to concepts that I understood before. And that's something that, again, in insurance, just like images, right, is still around the same claims and underwriting. And then within marketing, you know, you pointed out correctly that a lot of,
(46:40):
you know, just like, new content can be generated. If you think about the concept of customer lifetime value and having product recommendations, a lot of insurance companies are using generative AI. When somebody calls your insurance company, you may be calling to report a claim, you may be calling to understand your coverage,
(47:02):
you want to know what your deductibles and limits may be. So, you know, that's not new, right? Like, everybody can call their insurance company. But what generative AI allows you to do is not only the speech-to-text, but you can do things like measure intent, why are you calling; number two, you can do sentiment analysis,
(47:23):
all in one platform, right? And then ultimately you can guide and give recommendations around, again, you know, the cross-sell, upsell of the product, the coverage, triaging the person, you know, whenever maybe you got into an accident, right, and you don't want to talk to an IVR, you want to talk to an actual person. All of that can be done through the transformer architecture, which is
(47:45):
the T piece of ChatGPT, to just, like, really understand, you know, what action you should be doing next. So that's some of the external-facing use cases. The main thing that companies are doing is the internal use cases.
So what I mean by that is every single employee
(48:06):
of a company can probably use generative AI to be more productive. So think about having your own executive assistant. And again, the technology is already there where now you go into a call and you automatically are able to summarize the call, know what the next steps are, and even look at very complex structures, maybe like the org chart,
(48:30):
so that when you look at how you should be prioritizing responding to emails, the technology can help you do that, and then even identify the way that you would respond to an email, right? So it learns from your behavior, it learns from your patterns, and then ultimately it can help you have huge productivity gains. So internally, of course,
(48:53):
it's easier to just put to the test a lot of your GenAI use cases so you can learn, and then you can take it externally, so that you don't have the problem of, like, the hallucination example I gave you earlier from manufacturing.
Speaker 1 (49:06):
Yeah, that last example, where we talked about the meeting transcriptions, sounds a lot like Copilot, I think, from what I've seen there. So, you know, we've spent some time, and just to recap, because I know we've talked about a lot of good stuff: we've talked about actuaries in software, why and how you got into software, the motivation for software companies
(49:27):
hiring actuaries, we talked about the cloud, and then we spoke a little bit about generative AI, and of course Databricks, you know, we spent some time talking about Databricks. So let's pivot slightly to a related subject, related to the broader theme, which is programming. Programming and coding, those are relevant skills for the modern actuary, especially if you're
(49:49):
looking to work in software or in tech. Now, my understanding is that, after you finished the actuarial exams, you did spend some time developing skills in R and Python. So how did you get experience in that? And, you know, a question I get all the time from my followers, and one I've always
(50:10):
kind of struggled with at times, there are certainly people more senior than me, is: what level of proficiency do actuaries need in programming and coding today? You know, of course, coding being a subset of programming, which is more of a broader algorithmic design. But how much proficiency? So how did you get involved, and what are your thoughts on the proficiency needed for actuaries?
Speaker 2 (50:32):
I think that it depends on the role, and it goes back to the example that you gave on, just like, low frequency, high severity. So if you're dealing with, you know, a small company that's doing excess and surplus lines and your analysis is going into Excel,
Speaker 3 (50:48):
You know that's fine.
Speaker 2 (50:50):
I think that sometimes Excel gets a bad rap just because people are like, oh, you know, Excel is like a legacy system. I actually think Microsoft has done an incredible job making Excel relevant.
Speaker 3 (51:00):
You can do great things there. But if you are.
Speaker 2 (51:06):
Doing anything where you're dealing with high volume, as you
and I talk right, like, think about your personal lines,
if you are doing anything.
Speaker 3 (51:16):
That involves.
Speaker 2 (51:20):
having multiple people working on the same analysis, which happens all the time, you know, that collaboration cannot be done in an Excel spreadsheet. I mean, it can, but then you have all of these tabs and you're at the mercy of whoever created the spreadsheet and who documented a lot of these things. So I think that's where coding can be, you know, incredibly
(51:43):
helpful because you know, you still have to document things,
but you can do things faster, better, cheaper. Right, So
the example that I gave you, like, you know, anybody
that's an actually that does pricing, even if you want
to code you know, generalized in our models, it's probably
harder to do in Excel. There's a lot of packages
(52:05):
that would allow you, you know, as an example for
Python there they have non pie and pandas to do
a lot of the data transformation. Obviously, R was the
first programming language I learned, and there's a huge academic community.
There's brilliant actuaries working and contributing to packages in our
(52:27):
Marcus Gusman who built the chain ladder package, I mean,
it's just it's not only the chain ladder package to
allow you to do loss of a non factorund reserving,
but he actually wrote a chapter in a book and
predicted modeling using R that shows how the chain ladder
is a it's a special case of like the Brothers
scastic reserving. So I think that the I would advise
(52:53):
you there's if there's one thing that you can take
out of this podcast is don't be afraid to try things.
The version of R or Python that you may have run into five years ago, they've actually developed it so it's a lot more user-friendly. It's color-coded, and I think that, you know, all you need to
(53:14):
do is just, like, get started, right? Like, think about how you can combine the most common programming languages with the power of a copilot, as you were mentioning, and just, like, you know, ask a question: do a clustering analysis in R or Python, give me the code. And then from there you start iterating.
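That prompt, "do a clustering analysis, give me the code," might come back with something like this minimal k-means sketch in Python. The sample points and the two-cluster setup are invented purely for illustration, and the naive initialisation is only for keeping the example deterministic:

```python
def kmeans(points, k, iters=10):
    """Minimal k-means: assign each point to its nearest centroid, then move
    each centroid to the mean of its assigned points. Deterministic (naive)
    initialisation from the first k points, for illustration only."""
    centroids = list(points[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Nearest centroid by squared Euclidean distance.
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups of made-up (exposure, frequency) points.
pts = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1), (8.0, 8.0), (8.2, 7.9), (7.8, 8.1)]
centers = kmeans(pts, 2)  # one centroid near (1, 1), one near (8, 8)
```

From a starting point like this, you iterate: ask the copilot to scale the features, pick k, or swap in a library implementation.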
Speaker 3 (53:34):
I think it's incredibly powerful.
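And the chain-ladder method behind the ChainLadder package that came up a moment ago can itself be sketched in a few lines. This is a toy Python version with an invented cumulative loss triangle; the real package does far more, including the stochastic extensions Marcella mentions:

```python
def chain_ladder_ultimates(triangle):
    """Tiny chain-ladder sketch: volume-weighted age-to-age development
    factors, then each origin year's latest value developed to ultimate."""
    n = len(triangle)
    factors = []
    for j in range(n - 1):
        # Sum of the next column over sum of the current column,
        # using only origin years that have both entries.
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    ultimates = []
    for row in triangle:
        ult = row[-1]
        for f in factors[len(row) - 1:]:  # apply the remaining factors
            ult *= f
        ultimates.append(ult)
    return factors, ultimates

# Made-up cumulative triangle: rows are origin years, columns are ages.
tri = [
    [100, 150, 165],   # oldest origin year, fully developed in this toy
    [110, 160],
    [120],
]
factors, ults = chain_ladder_ultimates(tri)
```

Here the first factor is 310/210 and the second is 1.1, so the newest origin year develops from 120 to roughly 194.9.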
Speaker 2 (53:36):
The other piece, you know, the other line that I would draw on whether or not you need to code: sometimes when you move up in your career and you're managing people, you may think that you don't need to code anymore, because probably the younger person can code in a more efficient manner. But you still need to, just like anything
(53:56):
you do as a manager, right, you still need to review that what they're doing is actually sound, and be able to run the code yourself. So I think that, to answer your question, I don't think actuaries need to code as well as a data scientist or a data engineer, but you have to have a minimum level of proficiency, again, depending on the role, depending on your line of business, depending on your company.
(54:18):
And that skill set is something that can make you a unicorn, right? Like, actuaries, by definition, we're very, very technical. We are good at measuring, managing, and monitoring risk, which can be applicable to anything. And as you advance in your
Speaker 3 (54:36):
Career, you become a better communicator.
Speaker 2 (54:38):
And if on top of that you add the coding component, you become the unicorn, and, you know, with your skill set you have more choices, at least when it comes to, like, finding your dream job.
Speaker 1 (54:51):
That's true. Very few people have both, in terms of that unicorn concept. Now, you just mentioned data engineers, and I had a question to ask on collaboration. So actuaries, data scientists, and data engineers have, you know, different skill sets, and maybe, for instance, with actuaries and data scientists, perhaps some type of overlap.
(55:13):
But, you know, in spite of their different skill sets, they are working within the context of insurance, at least, so working, you know, for the benefit of the same client or the same company. So how can the three work together for the benefit of the company or the client? And I remember when we spoke before, you had a very nice illustration where you broke it down into what actuaries do, and what data scientists and data engineers do.
(55:34):
But the key is, how can we make this collaborative and not adversarial? I know, certainly, specifically with actuaries versus data scientists, a lot of people talk about how much ground actuaries are losing to data scientists, and, like I mentioned in the preamble, you know, I don't see it that way. I see it as some things with minimal overlap, but definitely things that are separate and distinct.
(55:54):
So, thoughts on that?
Speaker 2 (55:56):
Yeah, yeah. So I'll give you an example, like, back in consulting. I can probably tell you that there's never been a time that I felt more proud of being an
(56:17):
actuary, because every single time that we needed to build a predictive model and put it into production, you know, everything that we've been talking about, it all starts, well, I feel like the mistake that sometimes people make is thinking it starts with the data, right? You start with the data, and you crunch your data. Your data engineers are the ones that are building these pipelines; they're robust and they don't fail.
Speaker 3 (56:38):
Right.
Speaker 2 (56:39):
The data scientists would be the one that's like, find
you in the model, right, and then they actually will
be the one percent analysis. But the mistake lies on
the fact that before you start pulling the data, right,
you cannot pull all of the data that you have,
so you always start the problem with business hypothesis. So
(57:00):
if you're trying to determine even how to segment your
business to do reserving, if you're trying to look at
like what is the right marketing approach or like you know,
pricing or your insurance, you always need it actually to
start thinking about what do I think is correlated with
adverse loss ratio? And then with those business hypotheses, you
(57:24):
put your data requests and then you give it to
your data engineers. They're the ones that are pulling the data.
I mean, think about age something simple, right, You're not
going to put every.
Speaker 3 (57:34):
Single age range right, because you.
Speaker 2 (57:37):
Have like every single age of like driver times, territory times,
you know, like a number of accidents. All of a sudden,
the number of combination and permutations becomes unmanageable. So right
off the bed, you have an actuary that would do
things that make sense, like Okay, well, if I'm a driver,
I probably not going to have anybody, you know, younger
(58:01):
than twenty one years old, right, and maybe after sixty
five that's when like your visions started terrorating. So you
would always have actually is helping with not only the
binning of the variables, but also the correlation again, you know,
like things that we learn in work exams. So that
significantly runs more efficient, makes even the data engineering job
(58:25):
more efficient because you're not pulling the whole data and
just like throwing everything into a predictive model. Right, You're
guiding how you should be thinking about the data, the
correlation between the variables, how you should be grouping and
segmenting things of course within the So that's data engineers
with actuaries. I think that, as you point out correctly,
(58:45):
you know, there is a lot of overlap between data
scientists and actuarists. But again, the data scientists would be
able to just like test algorithms that maybe neural networks
that we're discussing, or the gradient boosting machines right, or
generative additive models. But the actual is going to be
the one that would be able to say, like, hey,
(59:07):
I know who my regulator, who my department of insurance.
I know how likely they are to accept this technique
versus this other technique. I know how to provide justification
and have that balance between Yeah, maybe a more accurate
a more sophisticated algorithm would give me more accuracy, but
(59:27):
at the same time it's not as transparent of a
generalizing a model because you know the formula by heart. Right,
So I think it's like that type of trade off
and the collaboration that you would have that like an
actually is like more of like the horizontal right, it's
involving the business hypothesis in pulling the data and transforming
the data, in thinking about like what's going to be
(59:49):
acceptable or not acceptable by the regulator and ultimately presenting
the results back to just you know, your chief on
the writing officer, your CFO, catost effect, your general led
there and what is the implication that ultimately would have
in your policy holders. So you know, from my perspective,
the fact that we as actuaries, you know, have credentials,
(01:00:12):
right, whether it's, like, an ACAS or an FCAS, or, you know, an FSA, or from the other actuarial institutions, it's the rigor. It's, going back to what we were talking about earlier, right, speaking the same language: a lot of the people that are working in the Department of Insurance are actuaries themselves, right? So, like, we know what's acceptable,
(01:00:35):
what's not acceptable. And I think that as long as there's the rigor of having the actuarial exams and having the certification and, you know, putting your name when you're signing an actuarial opinion,
Speaker 3 (01:00:46):
The career is not going anywhere. But we do need
to collaborate.
Speaker 2 (01:00:50):
We need to know where the strengths and weaknesses are of the other disciplines, so ultimately we can all work together.
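The age-binning step Marcella described a moment ago can be sketched like this; the break points and the minimum driver age are hypothetical, purely for illustration:

```python
from bisect import bisect_right

# Hypothetical rating bands -- a real plan's break points would differ.
BREAKS = [25, 35, 50, 65]
LABELS = ["21-24", "25-34", "35-49", "50-64", "65+"]

def age_band(age):
    """Map a driver's age to a rating band instead of modelling every single
    age, cutting down the combinations the data engineers have to pull."""
    if age < 21:
        raise ValueError("below the minimum insured driver age in this toy plan")
    # bisect_right finds which band the age falls into.
    return LABELS[bisect_right(BREAKS, age)]
```

For example, `age_band(30)` returns `"25-34"` and `age_band(70)` returns `"65+"`, so the model sees five levels rather than one per age.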
Speaker 1 (01:00:58):
That's a great response. And I couldn't help but think, when you started going through the different roles, that for some companies the data engineer and the data scientist are the same person. But that's for a different episode; we won't get into that. But I like the way that you compartmentalized that and structured that. Now,
(01:01:18):
in closing, something I noticed is that, you know, you were one of the co-founders of the Organization of Latino Actuaries, and you're actually the current president. Great organization, by the way. You know, lots of people there, Ali, Danny Fernandez, everyone. So why was it important for you to get involved with this organization?
Speaker 2 (01:01:40):
Two reasons. As I was saying, I'm a very proud actuary, but I'm also a very proud Latina. And I was actually inspired by the IABA. I was at one of the CAS conferences getting my ACAS, and I saw an IABA champion getting on the stage, talking about how, when you look
(01:02:03):
at the population of the world and you look at Black people, right, you know, it's a certain percentage, but then when you look at the population of Black people within the actuarial community, it's very, very low, at least at the time, yeah, exactly, one to two percent.
Speaker 3 (01:02:21):
And then I thought, well, what about Latinos? And I
could not.
Speaker 2 (01:02:25):
help but look around the room full of actuaries, and I did not see enough Latinos. At that point I ran into really, you know, great individuals like Adelaida Campos, Lala, right, Alejandro Pega, and I thought, like,
(01:02:47):
from my perspective, I couldn't be the only one thinking about this problem, right? So I was very lucky that I ran into them, and, you know, we just decided that, you know, we should start an organization. We first asked, like, is there anything out there? We quickly discovered that there was nothing out there,
Speaker 3 (01:03:04):
To our knowledge.
Speaker 2 (01:03:05):
So, you know, we just founded this organization. So that's the one reason why, you know, it all started: because we saw that there was a need. But then, if you think about the theme of this podcast, which is innovation, there have been many, many studies, and probably conversations for another topic
(01:03:25):
for another day: fostering innovation comes from diversity. And it's not only diversity of race; it's diversity of thought, or diversity of gender, diversity of, like, anything, right? And when you're in a room where everybody thinks the same way, right, everybody went
(01:03:48):
to the same school, everybody, you know, like, was educated in the same country, you can't help it; there's just not enough diversity, there's not enough disagreement.
And I truly believe that being in that environment when
people it doesn't need to be that people are disagreeing.
It's just like people may think differently, but they're open
(01:04:09):
to hearing the other perspectives, that breeds innovation. And, you know,
now, even in the age of, like, technology and AI,
we have to have that in all of those roles we've
been talking about: the data engineers, the actuaries, the data scientists,
the business. I really think that diversity of race
leads to diversity of thought, and that leads to innovation.
(01:04:32):
And I think that in order to do it right,
to do it responsibly, you know, we need
organizations like OLA, like IABA, like any
diversity organization. I think it is propelling the whole
growth of innovation, especially in the insurance industry.
Speaker 1 (01:04:51):
Well, shout out to Adelaida, by the way, for introducing
me to you; you know, I think that helped
enable this episode. And, you know, just to reflect on the conversation,
we talked a lot about software, and I'm greatly appreciative
of this discussion, because, as somebody who's in the software space,
I genuinely think that there's opportunity in the future,
and those opportunities will increase for actuaries who want to
(01:05:13):
try something a little bit different. And it's always good
to have a substantive discussion when we talk about, you know,
Gen AI and large language models, the cloud. Sometimes these buzzwords are
thrown around, but to be able to give them substance
and context and depth and dimension is, I think,
helpful for the community. So I just want to thank you
(01:05:35):
for your time, Marcella. And, you know, looking forward, are
you going to the CAS meeting by chance?
Speaker 3 (01:05:41):
I don't know.
Speaker 2 (01:05:42):
I'll have to check. But as I was
saying before we started recording, I'm very impressed
by your level of professionalism. I told you that part
of the motivation to do the podcast is the incredible
influence you have on younger people. Adelaida first asked me,
(01:06:02):
he's like, do you know the Maverick Actuary? I'm like,
of course, you know, I really know the Maverick Actuary.
And people are like, oh my god, he's got like
nineteen thousand followers on LinkedIn, right? Like, I think I
have five thousand, you know what I mean. So thank
you for everything you do, not only for the profession,
just bringing awareness of, you know, what an actuary
(01:06:22):
can do. But, like, I have to say, I truly
enjoyed not only doing this podcast, but preparing for and having
this discussion. And, you know, I actually think that, on top
of you being an influencer, a very good influencer on
all of the social media channels, I think, you know,
you're probably a big inspiration for a lot of
actuaries that are trying to get a little bit
(01:06:44):
outside the insurance world, the consulting world. And thanks for
everything you do for the actuarial profession and technology, and
kudos on your job as well.
Speaker 1 (01:06:54):
Deeply humbled. You know, just thank you, and I really
appreciate your time. I was going to say something,
and it's totally just slipped my mind. But maybe it's
time to wrap up.
Speaker 2 (01:07:04):
I know, like, I'm looking at the clock too, and
I could probably keep talking to you for another
forty-five minutes, but we probably need.
Speaker 3 (01:07:12):
To keep it short.
Speaker 1 (01:07:13):
Yeah, well, definitely keep in touch, and, you know, I
look forward to meeting you in person. Oh, I
know, I remember what I was gonna say.
I was gonna say, remember you can use this for
continuing ed. I found that out at some point during the series,
so be sure to leverage it for
continuing ed.
Speaker 3 (01:07:29):
So thank you so much. Thank you.
Speaker 1 (01:07:34):
Bye.