All Episodes

August 31, 2025 · 84 mins

What if we could revolutionize healthcare and other industries by using the untapped computing power that sits idle in hospitals every night? Meet Rani Yadav-Ranjan, Chief AI Officer at Greycloud AI, who's pioneering a breakthrough approach to AI that respects both data privacy and urgency.

After watching her mother's medical journey, Rani had a revelation: doctors function as "databases" of knowledge, but these repositories remain isolated within individual hospitals. Her solution? A patented system that transforms existing computers in closed networks into distributed training devices, operating during off-hours to build sophisticated AI models without compromising sensitive patient data. "From six o'clock till six in the morning, that's twelve hours of computing power we could use," she explains, describing how her technology chunks and chains data through multiple nodes, each with a unique blockchain identifier to ensure validity.

This innovative approach reflects Rani's decades-long career at the intersection of emerging technologies. As one of the most influential women in tech, with over 20 patents, she pioneered mobile payments in the early 2000s, navigated cultural resistance to e-commerce in Japan, and continues to advocate for the ethical development of AI. Through personal anecdotes about balancing leadership with motherhood and observations about gender bias in corporate cultures, Rani offers a master class in resilience and strategic thinking.

Rani's perspective on AI's future is refreshingly nuanced. Rather than fearing job displacement, she sees AI as enhancing human capabilities while emphasizing the irreplaceable value of experience. "What I use AI for is code reviews," she notes. "It's awesome to say, 'here's 3,000 lines of code, tell me where the missing semicolon is.'" As she prepares to publish her book "Constitutional Democracy in the Algorithmic Age," Rani continues to champion responsible innovation that serves humanity's needs while respecting our rights and privacy.

Connect with Rani on LinkedIn to follow her groundbreaking work at the intersection of AI, healthcare, and ethical technology development.

https://www.linkedin.com/in/ryadavranjan/

https://yadav-ranjan.com/


LEORÊVER COMPRESSION AND ACTIVEWEAR
Get 10% off Leorêver Balanced Compression and Activewear to elevate your confidence and performance

EIGHT SLEEP
Save $200 on Eight Sleep and get better quality and deeper sleep with automatic temperature adjustment

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

This content is also available in a video version on YouTube.

If you enjoyed this episode, please share it with someone who may enjoy it as well, and consider leaving a review on Apple Podcasts or Spotify. You can also submit your feedback directly on my website.

Follow @GrandSlamJourney on Instagram, Facebook, Twitter, and join the LinkedIn community.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Rani Yadav-Ranjan (00:00):
The idea was: okay, if we are confined by law, and maybe propriety in society, to have already closed data loops, wouldn't it behoove us to try to solve this faster?
So in the case of this thing with my mom, because I'm intimate with that knowledge, what if I could create a model

(00:20):
that uses all public information?
So everything from the Mayo Clinic, from the CDC, from all the other public availability.
Train it, and now, again, the car with one gallon of gas.
Now I'm giving it to you.
What if now you take all your cases and you train it to further refine the data?
So, using federated learning, we can refine the data and we

(00:43):
can move forward.
It's exorbitantly expensive.
So my solution, which we have patented, was to break it up into nodes.
So every computer, which is always hardwired in a hospital or a bank, by the way, even laptops, I mean, they're in a closed network.
So I said, why don't we turn each of them into a training

(01:03):
device?
Everybody has CPU, GPU.
We could put a load balancer in there.
We could use it when you're not working.
So from, let's just say, six o'clock till six in the morning, that's 12 hours of compute power that we could use.
So let's do that.
We can use chunking, so we can chunk up the data into bite-sized pieces.
We can use chaining, so we can chain it up the stream and then,

(01:26):
if you will, a reverse pyramid, get it up many nodes to a single node, and then each node has a unique Web3 blockchain identifier on it.
So we know the validity and the metrics that were used, and we move forward.
It's actually worked out pretty well so far in trials.
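The chunk-and-chain scheme described above can be sketched in a few lines. This is an illustrative toy, not Greycloud's patented system: data is split into bite-sized pieces, each "node" computes a partial result (a plain sum stands in for a model update), and results are chained up to a single aggregate, with each hop tagged by a hash-chain identifier standing in for the Web3 blockchain ID she mentions. All names here are hypothetical.

```python
import hashlib

def chunk(data, n_nodes):
    # Split the data into roughly equal bite-sized pieces, one per node.
    k = max(1, len(data) // n_nodes)
    return [data[i:i + k] for i in range(0, len(data), k)]

def node_process(piece, parent_id):
    # Each node computes a partial result (a sum stands in for a model
    # update) plus an identifier chained to its parent's, so the lineage
    # of every contribution can be verified later.
    result = sum(piece)
    node_id = hashlib.sha256((parent_id + repr(piece)).encode()).hexdigest()
    return result, node_id

def reverse_pyramid(data, n_nodes):
    # Chain many nodes down to a single aggregated result, keeping the
    # per-node identifiers as an audit trail.
    results, lineage = [], []
    parent = ""
    for piece in chunk(data, n_nodes):
        r, node_id = node_process(piece, parent)
        results.append(r)
        lineage.append(node_id)
        parent = node_id  # chain each node to the previous one
    return sum(results), lineage

total, lineage = reverse_pyramid(list(range(100)), n_nodes=4)
# total aggregates the partial results; lineage records one ID per hop.
```

A real deployment would distribute `node_process` across machines during the idle 6 pm to 6 am window and ship model updates rather than sums, but the chunk/chain/verify shape is the same.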

Klara Jagosova (01:44):
Hello, ladies and gentlemen, and welcome to the Grand Slam Journey podcast, where we explore the intersection of sports, business, technology and leadership.
Today's guest is Rani Yadav-Ranjan.
Rani is the Chief AI Officer at Greycloud AI and former Head of Technology and Innovation at Ericsson Analytics.

(02:08):
Insight picked Rani as one of the 10 most influential women in technology, with over 20 patents and recognition from the US House of Representatives.
Rani offers an inspiring perspective on resilience, leadership and innovation.
Our conversation covers the future of AI and healthcare,

(02:28):
data privacy, e-commerce, mobile payments, and the opportunities and challenges women face in tech.
We also discuss the importance of mentorship, resilience and balancing career with family.
This podcast is fully self-produced, and so I highly appreciate anyone who gives it a listen.
If you enjoyed this conversation, please consider

(02:50):
subscribing, leaving a review, or sharing it with someone who you believe would love it as well.
This is your host, Klara Jagosova.
Thank you for tuning in, and enjoy the listen.
Hello, Rani, welcome to the Grand Slam Journey podcast.
How are you?

Rani Yadav-Ranjan (03:08):
Good, thanks for having me.
I really appreciate this, Klara.

Klara Jagosova (03:11):
I am thrilled.
You're one of the women I am, for sure, blessed to have in my circle, with all that you have achieved and are still building.
And we have many topics to dive into at the intersection of leading innovation, technology, entrepreneurship, leadership, you name it.

(03:31):
So I'm curious where this Grand Slam journey takes us, talking about your journey of life and what you're building now.
But let me pause before we dive into many topics.
How are things lately?
We haven't talked in a while.

Rani Yadav-Ranjan (03:46):
I know, I want to hear all about it.
Of course, you and I want to catch up, but I love the idea of Grand Slam Journey.
I think it's awesome.
You and I are both tennis players, you at a whole different level than I, but there are a lot of analogies we can make with life and tennis, I think, but almost any sport, you know, whether it's the challenge, the

(04:07):
training that's required to reach certain levels, the experiences, the coaching, the mentoring.
Those all align with, to me, leadership and, of course, everything done with grace.
I think that is really important.
I don't know about you, but I have fallen so many times on the court.
My knees were so scabbed up at some point that, you know, now when that happens to me in my journey of entrepreneurship or

(04:30):
even enterprise life, it's like, yeah, I've had scabbed knees before, and it's time to get up and put some salve on there and move on.

Klara Jagosova (04:39):
I love that.
So true.
And fun fact: I don't know how I didn't know you play tennis, Rani.
Have we ever talked about it?
Tell me about your tennis passion.

Rani Yadav-Ranjan (04:48):
Tennis was something I used to bond with my dad.
As immigrants, you're still figuring out the system, and my dad would take my brother, my sister and I out on the tennis court to play, and we had the old Chris Evert wood rackets, you know, that were like 20 bucks maybe back then, and we would go

(05:08):
out on the public courts and we would play and we would hit, and we were horrible.
We had no formal instruction at that time, but I really took to it.
I loved the strategy, I loved the competitiveness, but yet I was still playing by myself, right?
Then the middle school had tennis teams, a great public infrastructure in Minnesota where I went to school, and I joined the tennis teams, and I realized that the girls weren't hitting as hard as the boys, and so I joined the boys' tennis team.

Klara Jagosova (05:34):
Of course you did.
I wouldn't expect anything else from you.

Rani Yadav-Ranjan (05:37):
But unfortunately, some of the parents objected, and so they moved the practice to 6 am.
First they moved it to 7, and I still showed up.
Then they moved it to 6 am, and it was just too much for me to hit at 6 am for two hours, then go to school all day, and then hit with the girls' tennis team in the evenings.
It was just too much, and so I ended up dropping out.

(06:00):
But that was also my first memory of how rules can change to favor certain things.
But I got what I needed out of it.
I learned how to hit really hard.

Klara Jagosova (06:12):
I love that, and I love that we're diving straight into the sports, but I also want to dive in a little bit to some of the things you are doing now and that you have been doing.
You're currently the Chief AI Officer at Greycloud, with a passion for using AI for good.
You have spent decades helping companies and clients navigate a

(06:34):
complex intersection of emerging technologies, governance and regulation, over 25 years pioneering this complex technology at the intersection of AI, machine learning and blockchain.
We met at Ericsson, where I have been really privileged to have a leader like you in the Silicon Valley office to look up

(06:54):
to.
That has been a very male-dominated industry, obviously, and even the leadership team overall.
So I've always appreciated your fearlessness and just leading with courage and charge.
So I want to dive into those topics, and I know you have been a big pioneer for women in tech overall, women now in AI.

(07:14):
I know it's a topic that you're curious about, and obviously I am as well.
You've also worked at Airbnb and have been specializing in developing innovative AI strategies and governance frameworks that drive business transformation.
So I definitely want to dive into the technology and the AI topic that I've been trying to read up on, and I know for sure

(07:37):
that I'm not going to know as much as you, so I'm going to be learning here, hopefully as well as most of the audience.
But I want to give you an opportunity: anything else you want to add as an intro that you want listeners to know about you?

Rani Yadav-Ranjan (07:52):
Well, you're the first one to know outside my husband, but I just signed a contract to publish a book.

Klara Jagosova (07:59):
Congratulations! And we have to do another podcast when it comes out.
Yes, that'll be December, it'll be out in December.
Fantastic, what is it about?

Rani Yadav-Ranjan (08:11):
The convergence of exactly what you said: AI, governance, law, and also our rights as citizens.
You know, my entire career has been about understanding the power of tech, because once you get in it, you understand the power, and the dark side and the light side, of AI.
I've always said that AI has been around since 01.

(08:32):
It's not new.
It's how fast we can do it, and that is thanks to the hardware technology.
People don't appreciate that as much as they should.
You know, folks, we've been able to do facial recognition since about the 70s, but we could never do it fast enough, because the chips were so slow, the hardware was so slow, and so

(08:55):
even just to see my face accurately, it would probably take six hours in the old IBM days, in the VAX 360 days.
But fast forward to today, and you could do it almost instantly.
It's encouraging where we are.
What Greycloud is doing is actually solving a whole different problem.
Just to pause, my background is, of course, dealing with public information.

(09:15):
Navigator was a company I founded that had a private acquisition for just the core components of it, and that is searching all 42,000 US, and a little bit of European and Canadian, government databases, because nobody has more information on anyone than a government.
They have you, as I always say, from cradle to death.

(09:37):
You know, they know when you're born, and they know your whole life in the middle.
They know what you've bought, they know what school you've gone to, if you've taken out a student loan.
They know what profession you're in, if you've had to be licensed, and they know how much you've earned because of your taxes.
So, if you look at the composite, nobody has more data on you, an individual or a business, than a government

(09:59):
entity.
In 2000-ish, I decided that, you know, I need to figure this out, we need to tackle this.
And back then, all the US government databases were open.
You could ask for anything.
You know, I started with contractors and licensing boards.
People were sending me data, not only for free, but in short,

(10:22):
small amounts, and sometimes I would get social security numbers, and I thought, Jiminy Christmas, this is scary.
So there was my first foray into privacy, PII.
How do we protect the individual?
Primarily myself, you know, it's very core to that, and I think all innovation starts with you, an individual.
What is my problem?
And then, how many other people have the same problem I do?

(10:45):
And then, of course, because I can solve my problem, I just assume I've now solved it for everybody, false or not, but it's a good approach to innovation.
So I started doing that, and it's evolving.
I remember talking to members of Congress, in probably the 2010s, and saying: look, give me four hours and I could be you.

(11:07):
Identity theft is a big deal.
And they did not think it was an issue because, you know, social security numbers are unique.
And I said, actually, it's not your social security numbers you need to track, it's your cell phone numbers.
And they looked at me, and most venture capitalists looked at me with shock, when I said that is my unique identifier.
Because I said, trust me, connectivity, it's our whole

(11:28):
thing, it's our being, it's who we are, it's who we trust.
And this is before, you know, back when there were brick phones, when Ericsson was making those huge phones.
And I said, you know, trust me, you can have multiple phones.
And I was just envisioning out loud: you know, you could have one for your family, one for your

(11:51):
work, one for your social network, back when there was a social network.
Fast forward, some of that is very true, and along the way I filed patents on a lot of this stuff.
Then I was thinking, how do you secure all of this?
You know, it's okay, you can't have it out there in the public domain, and trust is one thing, but is it verified?
You know, we cannot openly trust.

(12:11):
I think the world has changed too much.
It's become too small, actually, where anybody can look at anything, you know.
Gone are the anonymity days, where you could remake yourself and maybe hide at times, but AI won't let you do that.
So, public databases.
So my point is, okay, now, thanks to the big four, I would say, with ChatGPT, with

(12:35):
Claude, with Anthropic, with Mistral, and now with what Oracle is trying to do with their AI, people have gotten savvy, and most of the data is now locked up.
And yet there's a whole nuance in healthcare and banking and so many other fields, law, that require access to the benefits

(12:56):
of an LLM.
How do you train that?
So Greycloud is actually able to, in a closed network, use all the nodes that are available and help you train a model on super-sensitive data.
And, of course, the genesis was Ericsson, where, we know, I started a company inside called Snapcode with three other co-founders, who are still at Ericsson, and the idea was to

(13:19):
use the data from our networks for customer benefit, and how to monetize.
Because, again, companies like Verizon, T-Mobile, Vodafone, they all know where you are because of your cell phone tether.
They know where you're going because of your cell phone tether, they know what texts are happening and, for that

(13:40):
matter, they even have your voice.
They have every conversation, if they choose to.
Most are very responsible, and actually all are responsible.
We have government laws to protect our rights, but we don't have rights.
We don't have certain rights in AI, like the ability to be forgotten.
We don't have that.

Klara Jagosova (13:59):
It's interesting how your passion is clear, and I love even the initial thing you mentioned, the problem statement: starting from, what is it that I'm struggling with, what is it that I'm passionate about, and what do I truly feel like I should solve, for that you cannot find

(14:19):
anywhere else, and especially if you cannot find other people that are solving for it, then it's for sure a problem that is worth tackling.
And obviously it's terrifying to know that you were able to even get sample data of people with social security numbers.
That's very scary.
And I want to go a little bit deeper into

(14:41):
obviously Greycloud AI and your journey.
But I always like to know about my guests: where they grew up, what was their upbringing like, and what led them to the passions that they have today.
So it seems like we already started, maybe spoiler alert, a little bit with the tennis.
It seems like sport was part of your background.

(15:03):
Obviously, you have a deep passion at the intersection of technology and using different types of technologies for the greater good of society, whether you mention data, how you handle data, communications, blockchain, AI.
It all seems to be kind of coming together in some ways,

(15:23):
utilizing, especially, some of the platforms being built on the blockchain to make it more data-privacy centric.
But yeah, let me just stop there and go back, sorry, to the original question.
What was your upbringing like, Rani, and if you look back, how have you uncovered this passion and the journey you have been on?

Rani Yadav-Ranjan (15:42):
My parents immigrated to the United States and we went to Minnesota.
My dad was a computer programmer back then, and my mom was a homemaker, and then they got into the import-export business and, through misfortune at times, it was hard.
We had retail stores, and so, as with all immigrant children, we had to help support the family, in the sense that we
(16:03):
have to support, help supportthe family, in the sense that we
worked in the stores and we did this.
Working in retail really gave me an idea of how to talk to people and how to communicate.
People always think that I'm an extrovert, but I'm actually the happiest with a book and a hot cup of tea and just being at home in my cave, as you can see behind me.

(16:24):
I think that we can look at everything in life in two ways.
We can look at it as, my gosh, you know, we had to work and we had to do this.
Or we can look at it and say, wow, what great training.
My siblings and I can talk to anybody.
When you grow up in retail, you can talk to anybody, and you have to learn patience.
And then I went to school.
I mean, most of my life was obviously in Minnesota.

(16:46):
I went to school there, met myhusband, moved to California
where he was at a startup.
So that was the first forayinto startups.
And so then we had our familyand we made a deal You'll love
this, clara.
So I was working, he wasworking, and so we made a deal
that whoever would make the mostmoney in one year, the other
one would stay home and raisethe kids.
You know, because I believe inequality, and so does my husband

(17:09):
Actually, he is Indian, he is one of six and he has four sisters, and his father in India decided he would never take a dowry.
In fact, the dowry was his daughters' education, and they're almost all PhDs.
So, you know, he said, I'm not going to take anything from my daughter-in-law, even though it's very common even in the US Indian community.
So I said, you know, this is cool, so why don't we make a bet

(17:32):
?
He said, okay.
So, of course, but then I had to stop at nine months.
I'm sure he was like, yeah, you don't know.
Okay, fine.
But by that time, by the time we had met, I had already worked for a consulting house, because, you know, my dad was very traditional, and girls didn't go out and get jobs and get apartments back then.
So I got a job at a consulting house.
Always a workaround to

(17:56):
every problem, Klara.
So my job was to consult all over, including Texas, where I worked to help automate the grocery supply system.
It was a conversion job from IBM to VAX.
Simple.
They had an Oracle database.
But every month everybody, from the CEO to all of us, would put on our jeans and go into the warehouse to do inventory.

(18:18):
And I just thought that was the silliest way, you know, to do inventory, because so much waste, fraud and abuse happens at grocery stores, and people don't realize that grocery store warehouses are end-to-end, so from frozen to produce, and there is waste, fraud and abuse because it's manual inventory.
But yet, with the Oracle databases, we could pull the

(18:41):
register.
So you knew how much you were making every night.
And I thought, well, if you have that much technology in there, how hard would it be to put a scanner on there?
So we were experimenting.
So I took two two-by-fours, duct-taped it upside down so it's always on, because of the handle, and I used the UPC code, which we were using to buy stock, and I coded it.
And just for fun I

(19:03):
thought, well, I'll do some simple math on it, so you know: add up the 14 items, how much are you putting in, here's your change.
Just for fun.
I thought we would do this for our warehouse, and the technology was rolled out.
Unfortunately, I never filed a patent on it.
Had I done that?
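The "simple math" she layered on top of the scanner is just a running total plus change. A minimal sketch with hypothetical prices, in integer cents to avoid floating-point rounding (not her original COBOL/BASIC/C code):

```python
def checkout(prices_cents, tendered_cents):
    # Sum the scanned items and compute the change owed.
    total = sum(prices_cents)
    if tendered_cents < total:
        raise ValueError("insufficient payment")
    return total, tendered_cents - total

# A hypothetical basket of 14 scanned items, prices in cents.
items = [99, 249, 125] * 4 + [310, 500]
total, change = checkout(items, tendered_cents=3000)
# total == 2702 ($27.02), change == 298 ($2.98)
```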

Klara Jagosova (19:18):
Yeah, so you eventually invented warehouse logistics automation for supply chains, Rani.

Rani Yadav-Ranjan (19:26):
And it actually rolled out to all the grocery store scanners and everywhere now, right?
The code, so I wrote it in four languages.
I wrote it in COBOL, in BASIC, in B-Tree for the filing system and automation of the scanners, and C.

Klara Jagosova (19:46):
That's a fantastic example of just you diving in and, again, solving a problem that you found you had yourself.
How do I make it easier and better?
There must be a faster way to do this.

Rani Yadav-Ranjan (19:52):
Makes you wonder if we're just lazy, and so we're solving it with tech.

Klara Jagosova (19:55):
Well, yeah, I mean, I love tech for many reasons, and one: I truly believe that it allows us to solve some of the hardest challenges in the world.
Yeah, I think so too.
I think Jensen would for sure agree with it.
Obviously, that's a premise of AI and his slogan, the more you buy, the more you save, because his is the most advanced computing platform.

Rani Yadav-Ranjan (20:16):
Now, Jensen also is in Silicon Valley.
I don't know him, but a friend of mine does, and she was one of his first investors, and she said to me, you know, and that's the whole thing, overnight success is usually 10 or 20 years in the making.
When they were looking for revenue models, their whole thing was: okay, gaming is not picking up, it's not as widely adopted as we thought, but the chips are fast,

(20:38):
the GPUs are awesome, I mean, like I said, imagery.
So they actually put it all into crypto.
They were putting all their chips into the crypto mining trades on the East Coast.
And then, when AI took over, the crypto miners, who also were still finite, because, as you can tell, there's only so much you can mine, right, they actually ended up converting to

(21:00):
be AI hubs then, so you could train models.
That is what the conversion was.
And the next thing you know, it's a trillion-dollar company.

Klara Jagosova (21:08):
Yeah, it's pretty cool.
It's fantastic what he has done and built and continues to build, obviously.
And actually, so are you.
So back to the warehouse automation invention.
That seems like it was one of your first, that you didn't patent back then.
But I know you now have, was it, over 20 patents related to data,

(21:29):
technology and AI.
So there was perhaps a learning.
I was like, why haven't I done this?
What was the journey from there?

Rani Yadav-Ranjan (21:36):
From there, like I said, I focused on my social life, well, my private life a little bit, primarily, with my mom and dad begging: okay, it's time.
And sometimes I think we need our parents' encouragement for that, because it is scary to come out of something you are used to doing and, whether that's work or whatever, you know, it is scary to all of a sudden have another life in your life.

(21:56):
And you know this.
I just thought I'd have a dog, a German Shepherd, forever, yes.
But instead I met this wonderful man, and we have three children.
So my husband, side note, has over 350 patents.
So when you talk impressive, he is impressive.
He actually is the father of the one-inch hard drive, which made Apple what it is today.

(22:17):
But what's interesting, and the reason I mentioned that, is that during the process he was going to a conference.
He's a PhD.
He often gave lots of lectures and conferences, and we had made a pact that he would only go for as long as was needed.
None of these extra weekend trips, you know, and extra days to acclimate.
So he was going up to the city in San Francisco to give a

(22:41):
lecture, and it was a Sunday night and he was leaving, it was evening and he was driving up, and he was pulled over and he got a ticket, because the tag on his license plate had fallen off, you know, the validation tag had fallen off, and so he got a fix-it ticket.
Then he went on, did his conference.
Well, a month later he's on the East Coast and he calls me and

(23:01):
says, hey, a month ago I got a ticket, can you go in my car?
I think I threw it, like, on the side, can you take care of it?
And I said, oh my gosh, you know, preschool pick-up, drop-off.
But I called San Francisco County and I said, hey, I'm so sorry, my husband got this ticket.
We need to pay it today.
She goes, yeah, yeah, no problem, just come up here and pay it.

(23:22):
And I thought, wait, what?
And she goes, yeah, yeah, honey, just come on up.
I said, look, I've got three kids; by the time I pick them up from preschool and school, and drive up to the city, which is an hour-plus drive, and then park, and then get them all in a stroller, and then get to you to make this $10 payment, it's going to be a production for me.
And that, you

(23:46):
know, I don't know who could watch the kids, because my husband's out of town, and I'm explaining the facts to her.
And she was, yeah, sure, just come on up.
So then I called him back and I yelled at him, and called this woman back, who was so kind, and I said, okay, I've got this problem.
What can we do to solve it?
So then I went through, because I am a software programmer, and I said, okay, what if I go to, back

(24:08):
then we had Western Union.
So what if I go to Western Union and wire you the money?
She said, sure, but it'll get here tomorrow.
Like, okay, what if I go to my bank and wire you the money?
And she's like, sure, it'll get here in two days.
And I said, okay, what if I gave you my credit card number and you ran it?
And she's like, no, we physically need to have it here.

(24:28):
I said, okay, what if you run it and I pay you $20?
Like, honey, you're bribing a government official.
I'm like, oh, that's my point, that's my point.
So, as I went through all these challenges that we could not solve, I then called back my husband.
I was just so angry.
I said, look, I know I could write the code to solve this problem.
And I think, just to get off the phone with me, he's like, you

(24:49):
know, why don't you write it up?
Write up a schedule, write up a model, write up the process on how you would do this, and we'll patent it.
And then, you know, someday later you can write it.
I'm like, okay.
So then I put my head down, as now you know this about me, and for two months nobody saw me, and I wrote the patent for mobile payments.
Filed

(25:10):
it, started a company on it, raised money.
The US government asked me to actually go to Nagasaki, Japan, and talk at a conference there about what e-commerce could do for Nagasaki.
Oh my God, Klara, I mean, we could have a whole episode on this.
So I get to Nagasaki with the US delegation.
There's a press conference, and they said, you know, it's a sleepy

(25:32):
conference.
A couple of faculty members from the university will come, some of the local press, no big deal.
So I'm sitting at this table.
They've got maybe 50 chairs in this room.
Everybody will get one or two questions, and if there's no questions for you, our team will step up and we'll ask you the question, just to make sure everybody's participating.
So by this time we've had one day of conference in the booth,

(25:55):
and I thought, gosh, you know, all our material is in English.
So I went to the KFC and I said, can you translate this for me?
Do you speak Japanese and English?
And they're like, yeah, and I'm like, can you translate this for me?
So this guy wrote down, exactly underneath my English, what is said in Japanese.
I had a whole bunch printed off, went back to the booth, and my

(26:17):
position was very simple, even today, which is: e-commerce will change the world.
And I said, if there is a bill due at midnight, you can make it at 11:55 pm.
It'll be instant.
It's a cost saving for any entity that uses it.
It's a benefit to you, and the bank can now make money on the float, because they can just lock up the funds in your account while a credit has been issued, if you will.
(26:38):
while a credit has been issued,if you will.
The funds do not have to distribute for almost two days, 48 hours, so they can make money on the float.
It's a win-win for everybody, and it's free.
And at the press conference, now, the next day, the room was full.
I was the only person to be asked a question, and one of the members from the university literally stood up and said, you will single-handedly destroy the Japanese culture if we do what
(27:00):
single-handedly destroy theJapanese culture if we do what
you're saying.
So I call my husband, I don't know what to do or say, and he's like, what are you saying?
I said, watch the news.
So that was quite interesting.
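The float she describes is simple interest arithmetic; a hedged sketch with hypothetical numbers (the interest rate is illustrative, not from the episode; the 48-hour settlement window is hers):

```python
def float_earnings(amount_usd, annual_rate, days_held):
    # Simple interest a bank can earn on funds locked up between the
    # instant credit to the payee and the actual settlement of the funds.
    return amount_usd * annual_rate * days_held / 365

# $1,000,000 of bill payments held for the 2-day (48-hour) settlement
# window at a hypothetical 5% annual rate:
earned = float_earnings(1_000_000, 0.05, 2)
# roughly $274 earned on that batch, for doing nothing but waiting
```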
So from there, I came back, Y2K hit, unfortunately, and the crash.
While we were able to get credit with Wells Fargo, we had relationships built,

(27:20):
we couldn't continue, so we gave back to investors their money.
The patent went dormant.
The reason I mentioned that is because in it I had also put down a secure block, I had put down bio crystals, I had put down all these ways you could secure not only data but how payments could be made, and started another company, like I

(27:41):
said, with big data.
And then this patent emerged 10 years later.
We'd all forgotten about it, we had literally forgotten about it.
I had other patents and technologies I was focused on.
And then, in 2009, this emerged and it was, ironically, a submarine patent.
The world attacked it again, because mobile payments were very ubiquitous to how we do things.

(28:02):
But in 2015, a bunch of banks got together, because it, literally, was worth a ridiculous amount.
But they were like, well, no, this is for the public domain, this is for the public good.
They finally took it out, but by this time it had been in for a long time.

Klara Jagosova (28:20):
I'm curious, why did the Japanese say it would destroy their culture?
Is it because everything was so personal?
They didn't see the electronic side?
Or what was the reason behind

Rani Yadav-Ranjan (28:30):
it.
So what was explained?
So the way I said is I usedagain for me it was a ticket.
So I said like a utility bill,water bill, electric bill.
So in Japan back then, what they used to do is a man from the electric company would come by your house.
He would knock on the door.
The wife would open the door, he would give her a ticket and

(28:51):
he would leave.
He would come back a short time later.
She would give him back the ticket and the amount that was on the ticket in cash.
He would then take that ticket along with all the other tickets that he had collected.
He would go to the utility company, he would take out his portion and he would give it to the clerk.
The clerk would then take out his portion and then he would

(29:12):
give it to the people who actually make the payments.
The payments would then happen manually, and the whole process started a month later.
The point was, I would be removing those two jobs, and you don't have to have as many clerks, because I was saying you can have one person do the job of 10.
They're like, so we can get rid of 10 clerks, all the middlemen,

(29:34):
all the collectors, and that was for everything, that was for utility, all utilities, so water, electric, telephone.
Everybody had their own group of people that did this.

Klara Jagosova (29:51):
What year was this?
And it is so interesting because you can literally draw parallels to what people are afraid of now with AI, right? Everybody's saying it's going to replace jobs, for example.
We're not going to need people like the Uber drivers.

Rani Yadav-Ranjan (30:00):
This was in 2000, 2001.
I don't think AI will replace people.
I think AI allows you to do more in a shorter time.
And I've used AI.
I use Gemini, I use Claude from Anthropic, I use ChatGPT.
I've played with Mistral a little bit.
They have a long way to go.
It allows you to do more.

(30:22):
But I mean, certainly nobody can sit down and say, develop a neural net-based software system for me, if you don't know what the architectures would look like, if you don't know what the components need to do, if you don't know how it should be structured, what are the key lines of code that you need, if you can't just quickly read it and

(30:47):
fix it, it won't do that for you.
It will do, write me a simple website for my tennis program or my tennis class or my cooking class, very generic instructions.
But what I do see is, for example, prompt engineering is a thing now, right, and that's just how to ask questions succinctly.
But what I often say to people when they ask me this is, look, I've been hearing this my entire career.

(31:08):
The next evolution of tech will replace the people, and it doesn't.
But what we are doing is generating software programmers who actually cannot code.
All they do is ask questions.
Yes, somebody like me who can code in how many languages?
Everything from Fortran to Python now, and React, COBOL,

(31:30):
BASIC, all the other languages that exist.
You know, all the different databases that exist.
I've used them and I've coded with them.
I know we're using FFmpeg and how to really sync the voice and the audio together.
I know what language is being used.
I could fix the code.
How many other people today who are software developers can actually tell you that they know how to write raw, basic code?
And that's going to be less and less.
So I don't know if that's going to be a dying breed or AI will replace it.
I don't think so.
I think that for me, I use it and my team uses it to do code reviews.
It's awesome to be able to say, here's 3,000 lines of code,

(32:16):
somehow there's a missing comma somewhere, or a semicolon.
Tell me where it is, and it can do that for you.
But the other problem is security.
Do I really want to put my proprietary code into an Anthropic or a ChatGPT or Gemini?
And the answer is no, I don't.
I don't want to put my health records into it and say, tell me

(32:39):
what I think is wrong.
However, I would be very comfortable putting all of that information in a closed network that I knew nobody else could see, and I see this happening.
I see it becoming even more of a personalized therapist, but it's one-on-one.
Sure, you have this baseline information, everything about

(33:03):
psychotherapy or family therapy that you could put into a database that is now used to train a model.
Perfect.
It's like everything else I've said.
It's like buying a car with only one gallon in it.
Great, at least you have a car, you've got the basics, you've got a gallon that gets you going, and then after that you put in your own,

(33:23):
right, and that's where I see a Greycloud solution.
You know, creating a software-based neural network is not easy.
I've been working on this, really on and off, for about a year with this team and we've made headway, and a lot of it is the genesis of Ericsson and the Snapcode, and you know, Ericsson did put in almost a million, so it's kind of a shame that we

(33:44):
don't continue it and run with it and now raise more money.

Klara Jagosova (33:48):
Yeah, so there are so many angles we can dive into, but obviously AI is at the intersection now of all your focus and what you do.
You're writing a book about it, so maybe let's just continue to dive straight in.
So, Greycloud AI.
What was the premise, or even starting the company?
It seems like, again, just following up on what you had

(34:09):
mentioned before, you saw this problem for yourself that nobody else was solving.
How do you now apply it to some of the client challenges?
And what do you see as, maybe, just to help listeners, the key focus?

Rani Yadav-Ranjan (34:24):
Two things happened that were pivotal and critical.
First, I was at Ericsson.
We had a lot of super sensitive data on people that we couldn't even share with providers, either Verizon or T-Mobile.
Nobody had the base layer data.
But what do you do with it?
That was the initial problem, if you will.
It's like an apple, very basic.
My mom actually got very sick and she passed, and as I was

(34:47):
standing talking to the doctors, I was asking, how did you approach her case management?
What did you do?
Because technically, you are a database.
You are the repository of all this information and experiences that you have had, and maybe your colleagues have had, but you are the repository of that.
So how did you develop this strategy to help her?
And I said, how do you approach the next person who is like her?

(35:14):
I mean, what did you learn from her that you can use there?
And I was trying to just really make peace with everything that had happened, and just understand why certain things had happened and certain things they had used for her case management.
So, as she was explaining to me, I realized that it's a very problematic situation.
They're scientists, they're exploratory.
They're no different in some ways than I am

(35:35):
with the software system and developing it, you know.
Oh well, this is not working.
Maybe if I change the color, will that look more appealing?
Right, they're not doing anything different, but the human body is so complex that there's so many random variables.
So look at all the code now I'm writing.
You know, it's complex, random variables, no one solution for

(35:59):
any one person.
However, the common thing is, it's a human body with blood and the same organs, different sizes, all in the same place.
Many commonalities, but the variables are so exponentially huge that it's trial and error, and a solution that might work for one doesn't work for somebody else.
So you're really rolling the dice in some ways on the luck of the training, the school of the doctor, the

(36:21):
training mechanisms and the experiences they've had.
And I asked her, what if there was software?
What if there was, you know, a way for you to type in her metrics?
And by this I mean, like, her basics, you know, woman, age, ethnicity, overall health.
I mean, you know, what if you could

(36:42):
get all this data in a system and it gave you possible ways to approach her issues?
I'm just talking about speeding it up, right, how would this help?
And he goes, yeah, we could, but you know, we're limited to what we have here in this closed hospital setting, and I thought, okay, that's fair.
So when I went back, I said to Ericsson, why don't we look at healthcare?

(37:02):
And we were talking to Baylor Hospital about breast cancer.
I said, okay, that's a very limited group.
Women, well, actually, men have it too, but a specific body part, and there's a lot of imagery out there of people trying to diagnose it.
But there wasn't a solution out there, and there still is not, where anybody can use it.
There isn't something that's just for doctors. Right now we're

(37:24):
limited to every hospital.
The same thing with banks.
We're limited to each, like it's Citibank California, it's not Citibank United States or Citibank Global.
So the idea was, okay, if we are confined by law, and maybe proprietary concerns and society, to have already-closed data loops,

(37:45):
wouldn't it behoove us to try to solve this faster?
So in the case of this thing with my mom, because I'm intimate with that knowledge, what if I could create a model that uses all public information, so everything from the Mayo, from the CDC, from all the other publicly available sources?
Train it, and now, again, the car with one gallon of gas.

(38:09):
Now I'm giving it to you.
What if now you take all your cases and you train it to further refine the data?
So, using federated learning, we can refine the data and we can move forward.
It's exorbitantly expensive.
So my solution, that we have patented, was to break it up into nodes.
So every computer, which is always hardwired in a hospital

(38:33):
or a bank, by the way, even laptops, I mean, they're in a closed network.
So I said, why don't we turn each of them into a training device?
Everybody has a CPU, a GPU.
We could put a load balancer in there.
We could use it when you're not working.
So from, let's just say, six o'clock till six in the morning, that's 12 hours of compute power that we could use.
(38:54):
Let's do that.
We can use chunking, so we can chunk up the data into bite-sized pieces.
We can use chaining, so we can chain it up the stream, and then, if you will, a reverse pyramid, you know, get it from many nodes up to a single node, and then each node has a unique Web3 blockchain identifier on it.
So we know the validity and the metrics that were used, and we

(39:18):
move forward, and it's actually worked out pretty well so far in trial.
You know, it needs more work, as with everything, it needs gas.
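The scheme Rani outlines, idle machines as training nodes, data chunked and chained, each chunk carrying a hash-based identifier, can be sketched roughly in Python. Everything below is illustrative, not Greycloud's implementation: the node names, the SHA-256 hash-chained ledger (standing in for the Web3 identifiers she mentions), and the averaged "update" (standing in for a real local training step) are all invented for the sketch.

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Node:
    """An idle workstation in the closed network (hypothetical model)."""
    name: str
    ledger: list = field(default_factory=list)  # chain of chunk identifiers

    def train_on(self, chunk, prev_hash):
        """'Train' on one chunk and record a chained identifier.

        Hashing the chunk together with the previous hash links the
        steps into a chain, so provenance can be audited later.
        """
        payload = repr(chunk).encode() + prev_hash.encode()
        chunk_id = hashlib.sha256(payload).hexdigest()
        self.ledger.append(chunk_id)
        update = sum(chunk) / len(chunk)  # stand-in for a real local update
        return update, chunk_id

def chunk_data(data, size):
    """Chunking: break the dataset into bite-sized pieces."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def overnight_round(data, nodes, chunk_size=4):
    """One 6pm-to-6am round: fan chunks out across the idle nodes,
    chain their identifiers, then aggregate the per-chunk updates
    back to a single value (the 'reverse pyramid' to one node)."""
    updates = []
    prev_hash = "genesis"
    for i, chunk in enumerate(chunk_data(data, chunk_size)):
        node = nodes[i % len(nodes)]
        update, prev_hash = node.train_on(chunk, prev_hash)
        updates.append(update)
    return sum(updates) / len(updates)  # aggregated global update

nodes = [Node("radiology-pc"), Node("front-desk-pc"), Node("lab-pc")]
global_update = overnight_round(list(range(24)), nodes)
```

In a real deployment the per-chunk "update" would be model gradients or weights exchanged under a federated-learning protocol, but the chunk/chain/aggregate shape of the loop is the same.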

Klara Jagosova (39:28):
Yeah, I mean, I can imagine this work will never be sort of done.
There's always going to be a way to make it more accurate and iterate and chunk and slice the data in a different way, especially in this use case.
Right, I always compare it to, at least for me, creating PowerPoint pages or creating a deck.
I created one yesterday.
I look at it half an hour later and there are still

(39:49):
improvements.
Like there's always a way you can make something better, clearer, visually appealing, and I think, as a simple way to say it, that applies to all technology.
There's always a way to probably evolve, write the code more elegantly, but it's just such a brilliant idea.
So, just to summarize, kind of, what are you doing now?
You're eventually using all the, at least, closed-loop technology

(40:12):
available in that setting.
In this specific example, it can be a hospital.
I don't know if you know the statistics, but I can imagine there are several times, especially in weird hours, where a lot of the compute power is not being utilized in a hospital, right?
I don't know, is it maybe up to 50%? There are shifts, so there might be computers, when people go in and

(40:32):
out of the office, that nobody's doing really anything with,
and so you're using that compute power available in the closed-loop setting, with privacy and security pretty much around it, to create the smarter model in a way that eventually doesn't cost billions of dollars for a hospital, or any entity for that

(40:53):
matter, to use.

Rani Yadav-Ranjan (40:54):
Exactly.
Not only that, I mean, all you have to do is leave your computer on.
You don't even have to do anything else.
Our orchestrators do everything, and you know, it's built in a really sophisticated, intuitive way.
All you have to do is turn it on, and my vision was, I want to be able to do everything in three clicks, which is: upload your model, identify a data source and go.

Klara Jagosova (41:20):
I love that. And can you share a little bit more?
You mentioned the trial, and it's going pretty well.
Obviously I don't want to go into privacy, but kind of, what are you finding now, or anything you can share around how it's working and the impact?

Rani Yadav-Ranjan (41:34):
When I say it's going well, it's a smaller trial, it's only 25 nodes right now, but we're building it for more.
I mean, you can imagine, gosh, Klara, I was thinking about this, and just to go off topic for a sec, can you imagine if, on the space shuttle, you could do a model right there using all the compute power in a space shuttle?
It's a closed loop, right?

(41:55):
Look at the military, I mean military bases, in the sense of healthcare, you know.
Look at all our healthcare bases, our hospitals, and not just the US military, but any military or anywhere.
You have a closed-loop system that you need to further refine, and it's super secure data.
I think the first big thing is, it's not just generic

(42:16):
information.
That does not help.
If you're going to do public source information, then any model will do.
Any chatbot; there's so many chatbots out there now that say, just upload your data into this chatbot and you can refine it, have the basics of a chatbot.
That's great.
Except that even chatbots, you can get them to, you know, admit

(42:36):
that, yeah, sure, maybe you should put a camera in the bathroom if you want to be secure about your premises.
But you know, those are generic, right, that's just generic.
I'm talking about something sophisticated that you need, that is maybe even mission critical to what you're doing, whether it is security and how do we encrypt things, or whether it's even in-house training and you want to train somebody to do

(42:58):
something specific, but this is proprietary knowledge, and so you don't want to have it in there.
I find all too often, while every company says don't use the LLMs even to build a PowerPoint, it's so easy and so fast now, right, you can say, build me a PowerPoint on how to develop a neural network for super secure data, and you know what, it'll do

(43:20):
that.
But is it right? As my husband says, because he's been testing it for material science and he's a semiconductor guy, he says, yeah, it's kind of like an undergraduate.
You know, it's not quite even a master's level candidate, and I have to snap back, I'm like, you are such a snob.

Klara Jagosova (43:37):
I mean, that goes a little bit into hallucination, right?
So, I mean, everybody talks about that, let's dive into it, and I have many different views actually on it that I've gone back and forth on, because I totally agree with hallucination.
I struggle with it myself, even when I use some of these GPT models, you name them, I think I've tried them all.
At the same time, I also ponder: we humans hallucinate too.

(44:02):
There's a human bias.
Everybody who's lived the corporate world has gone through the human bias, and whether we can truly remove it or not, that's a question I'm still pondering, because we all have created a view in our mind of the world and our life and how things may or may not be, through our own

(44:24):
experiences.
And I know I'm biased in some ways.
Like, there are specific things that I have lived through, and when A or B happens I get triggered, and I know where the triggers come from.
I've learned to limit them a little bit and learned how to step away and control them.
But I know exactly, there are a few triggers, when I am in those situations,

(44:45):
they're very dangerous for me and I need to step away.
So I feel, to that sense, AI again, and even models, are created by humans.
Absolutely, they're trained on data that are produced by humans, and the data, again, the cleanness of data is another topic.
I know it's a broad question, I guess. Whichever way, Rani, you want to take it, how do you look at this complex world of

(45:07):
data analytics, hallucinations, between AIs and humans?

Rani Yadav-Ranjan (45:12):
I think you've hit it spot on.
It's a human factor, and we can never remove the human factor.
We really can't.
There was a study done by a Princeton professor.
He used the data from Amazon.
So Amazon had a metric.
They said, when we hire, we want to have the best winning person.
And so they profiled who this individual would be. Turned out

(45:33):
it was a white man, age 30 to 40.
And the Princeton professor showed the bias.
That is because it's like, okay, how many do we interview?
We interview X amount of people.
How many resumes do we get?
Now, all companies, I don't know if you know this, stop at 3,000 resumes.
At 3,000 resumes they shut it off.
No human can review 3,000 resumes, so they use a bot to

(46:00):
review based on the keywords that the hiring manager has asked for.
Well, if you don't use the right word, you might be the ideal candidate for the job, but it's gone.
So then from there it goes down to 10.
From 10, they interview five.
From five, they hire the one.
Then, if that one has moved up the ranks and gotten great reviews,

(46:21):
that was the profile.
So that's what they wanted to hire.
So it came out white male, 40 to 50.
So what happens now in our world?
Well, we have gone to relationships.
We've said, hey, I believe I can do this role.
I may not have all the keywords, so I'll call my friend Klara.
I'm like, hey, Klara, you're at HSEC, you know, I think I'd be great for this.
You're like, yeah, maybe you would, let me see.

(46:42):
Oh, yeah, no, you're right, you would be.
So then you push up to the hiring manager and say, hey, I think my friend Rani would be great for this job.
The hiring manager interviews me, and maybe they hire me, maybe they don't, but we are trying to circumvent the AI, if you will, right, through human relationships.
So data is biased, clearly, especially how you weight it,

(47:04):
for sure.
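The screening funnel from the Amazon study can be mocked up in a few lines. The cap, the keywords, and the candidates below are all invented for illustration; the point is how an exact-keyword bot silently drops a qualified candidate who merely phrased their experience differently.

```python
# Toy version of the resume funnel: a hard cap at 3,000 resumes,
# a keyword bot, then ever-smaller human rounds. All names and
# keywords here are hypothetical.
RESUME_CAP = 3000
REQUIRED_KEYWORDS = {"python", "kubernetes"}  # hiring manager's exact terms

def keyword_screen(resumes, required):
    """Keep only resumes containing every required keyword verbatim."""
    return [r for r in resumes
            if required <= set(r["text"].lower().split())]

resumes = [{"name": f"candidate-{i}",
            "text": "python kubernetes ten years experience"}
           for i in range(9)]
# A strong candidate who wrote "k8s" instead of "kubernetes":
resumes.append({"name": "ideal-but-filtered",
                "text": "python k8s fifteen years experience"})

pool = resumes[:RESUME_CAP]        # hard cap: later applicants are never seen
shortlist = keyword_screen(pool, REQUIRED_KEYWORDS)  # bot pass
interviews = shortlist[:5]         # humans interview five
hire = interviews[:1]              # one offer
```

The "ideal-but-filtered" candidate never reaches a human reviewer, which is exactly the failure mode Rani describes: the bias is baked in before any interview happens.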
But the bigger question is, then, are we forcing ourselves to change?
Are we saying, hey, look, there's biases?
We're showing it. The same thing with the Google picture, with, you know, the identification, another huge study.
If there's so much bias in data, that means there's bias in leadership, that means there's bias in the teams, there's bias in the company culture, and I think that's what needs to be

(47:27):
addressed.
So I'm also part of the Committee of 200.
It's called c200.org, and that is 450 women, global women, in the C-suite.
I was invited to join it when I had my last startup.
It's a rigorous process, you have to submit like 10 pages.
I'm also a co-head of their technology and innovation chair,

(47:49):
and so we help advise, or just share information with, all the other C-suite members, who are CEOs of very large companies actually, on the latest emerging trends in technology.
And my big thing is, again, this bias in data, bias in their companies, bias in their company cultures that they have to look out for, and understanding that they can't hire enough HR people

(48:10):
to go through 3,000 resumes.
So maybe we need to change our thinking and how we hire.
There's the follow-the-sun mechanism that Oracle started, which is, you know, you have it here, then the development goes to India or China or somewhere, then it comes back.
Follow the sun, right. Maybe that works, but how well does all this stuff work?

(48:31):
You know, we need to look at the bias in our own cultures, and I think that's very, very important.
With hallucinations, it can be solved.
The current LLMs are not analytical.
They're designed to give you an answer.
If I say, I am a Sagittarius, tell me my horoscope today, it'll tell me my horoscope today.

(48:52):
That is what I've asked it to do.
Give me a recipe for how to make the best peach pie, it'll give me that.
If I say which peaches are the best, it can probably give me that.
But if I get more into the analytical, it gets very difficult, and until we break that, it's going to be hard.
Now, we mentioned NVIDIA, and I know they have NeMo, which is

(49:14):
built on Mistral, as their bot.
But the problem is, how much of our data do we really want in there?
So when my book gets published, am I going to want all these LLMs to be trained on this knowledge?
Or am I going to say, look, these are my thoughts, right or wrong, they're my thoughts.
And do I want you to say, give me, in Rani's voice, do this?
I don't.
But hallucinations can be solved.

(49:36):
It's a matter of ranking and weighting the words, for sure, but it's also the consciousness of having observability into the bias of the data.
It all starts with data, and all data is messy.
It's got fat fingers.
Navigator was acquired for the mapping algorithms of

(49:59):
cleaning our data.
It was really nothing else.
It's the ability to look at the data and sanitize it, clean it and then put it in a labeled manner, and those are huge companies today that do nothing but just that, because it's so complex of a system.
But there's a lot of growth.

(50:19):
I see a lot of industry coming up.
I see a lot of changes coming up in how we think.
I would like to believe that these very weighted questions that you have asked will help society evolve and change.
I really hope so.
I hope that the gender bias that exists today will erode, but I don't think so.

(50:39):
I think it's too ingrained into us as humans.
Maybe another hundred years, but it's not going to happen before that.
We're having a hard enough time having women in leadership hold their positions.
You know, I have a friend, and I won't mention the names of these companies because they are in the news every day.
There's a thinking among the hedge funds and some of the

(51:00):
other leaderships that they only invite a woman in to be CEO because it's going to fail and they want to blame her.
They give her two years to do something, and then, it's sad.
You know, personally speaking, I will say to you that women are brought in as a warming.
You know, the role, you're warming it up for a guy to come in.
You're creating a foundation, creating an outline, creating a

(51:21):
structure that the role will need.
You've done all the heavy lifting, you've mapped it out.
You know, you've carved out what you need to carve out.
You've laid in a plan, a five-year plan, which everybody asks us, you know, what's the five-year plan?
You've carved it out, you've put the infrastructure in place for that plan, you've built the relationships for that plan.
And then, you know, a lot of times they're like, hey, thanks,

(51:42):
because the same problem that is in the venture community is also in the enterprise community, in the corporate cultures, which is, you trust the person across the table from you.
If they don't look like you, it's very hard.
If they're coming as a reference from somebody who is

(52:03):
across the table from you, that you trust, it's about 50% in the door, but it's not 100% in the door, and that's hard.

Klara Jagosova (52:10):
Yeah, I mean, trust is a big aspect in the corporate world for sure, and I think it's even more important as you go up.
You know a lot about it, more so than me, but I do want to dive a little bit, maybe we'll just go in sequence into the women topic and women leadership, because I know you're really passionate about

(52:31):
it.
I have observed you being the cheerleader, even at Ericsson, sometimes perhaps too vocal for some of the cultures, which I loved and appreciate.
Obviously, I'm right behind you, Rani.
Thank you for doing the heavy lifting and courageous work.
But going through the different cultures, even myself recently,

(52:51):
from Ericsson to Apple, now Accenture, that obviously has Julie as the power woman, probably the closest percentage-wise men-to-women ratio when it comes to leadership.
I think the latest numbers I saw were like 49 to 51%.
So really making an effort, closing the gap.

(53:12):
And now coming back to HDAC, full loop. My view is that leadership is not about sex.
There can be amazing leaders that are men and amazing leaders that are women, and I have to admit, myself, I think I had a little bit of a statistical bias perhaps in the past.

(53:33):
I was just telling somebody actually this morning, throughout my career of 15-plus years I only had one female manager in my whole life.
All of the managers I've had have been men, and through that, you can imagine, you kind of blame, oh, you know, men are this way or that way, but I think it's a rule of statistics too.

(53:54):
Like, we've lived around, especially in technology, what just seems to be more of a male-dominated world, which, you know, you can make assumptions of why; there's like a whole sort of thing going back down through, how do you inspire kids even early in age to be part of STEM and be part of technology?
But yeah, let me just pause there.

(54:15):
Turn it back to you.
What is your view on it?
Closing the gap, or inspiring more women to continue to be courageous, step up, have the leadership, no matter what the word on the street is?

Rani Yadav-Ranjan (54:27):
Again, spot on and a brilliant question.
So when I was on the board of trustees of the University of California, it's a 10-year span, I served my 10 years, made as many changes as I could, but there was a study done that asked, why do women engineers drop out, or women overall, after sophomore year?
They drop out of their engineering classes or math

(54:49):
classes, any science classes actually, and the results were that universities inherently have graduate students teaching some of these classes, and the graduate students, most of them, were foreign-born, and they have very little patience for our education system and how some of these women would approach a question.

(55:11):
Sorry, inherently it was cultural; so they are merit-based people.
They're here as graduate students, foreign-born, and their culture and perception of women is very different than what they were seeing in front of them.
And it was very shocking for me that they dropped out. If they had a foreign-born graduate student teaching their undergraduate class, they never went further.

(55:32):
That was it.
That was the last science class they took, and that was very sad and shocking.
So of course, my husband being the benefit of that, I asked him.
I said, you taught these classes.
What did you do?
And, you know, along with all his friends, I'm like, hey, when you guys all taught these women that would come in, what did you do?
Did you take time?
Did you give patience?
Did you not make them feel like that was a dumb question?

(55:55):
When they were asking. I said, certainly, I was one of those women.
To go back to the first thing you said, you're right, I also can profile right away.
I know when I go in for an interview whether I'm going to get hired or not, just by the person, in the first five seconds of that screen coming on, or of me sitting down in a meeting.
I already know there's a profile of people that will hire

(56:15):
somebody like me, and there's a profile of people that will not.
And it is not always gender, but it is there.
So that was an interesting study.
It made us change a lot of things for the University of California, for the good.
So now undergraduate classes are taught by faculty, and it isn't until you get to your junior and senior year that, you know, we let any of the graduate students really come in and

(56:38):
teach classes.
That's important.
I think women are our own hardest critics, and that's not fair.
I think the idea of a sisterhood is illusory at times.
There's no such thing.
You know, you mentioned your bosses were always men.
But I have found, especially in the venture community, when I

(57:00):
have gone for venture, you know, rounds and such, men can play golf.
Then they go into the men's room and they continue in the men's locker room.
They continue their conversations and I'm locked outside, and that is a great analogy as to where the world needs to go.
It has been proven that women who are in leadership roles help lift the bottom line of companies, in many ways.

(57:22):
It's just the way we are also wired.
But I think for women to be lifted up, you need a little bit of that glass ceiling to be cracked even further.
I think with one or two tokens, and I call them tokens, there's too much weight on their shoulders.
You know, all of a sudden you have all this responsibility, like, hey, look, we lifted you up to be head of this country, or

(57:43):
we did this.
There's a lot of expectations there, and that's what I mean by women in the C-suite.
And I was talking to you about my friend.
She was a VP of a very large company here and wanted to be CEO desperately, because she felt she had the qualifications and the experience.
And she did.
But she made the strategic jump.

(58:04):
She went from SVP here, to a VP, to an SVP, to a CEO role, got crushed, and started her own company after seven years.
Because that should not be the default.
It should be that the value is that, okay, your approach is

(58:29):
different, but it doesn't have to be bad, and we should not continue our conversations into the men's locker rooms, or the rooms where women are excluded.
So that comes down to, as a society, do we ask too much, or are we really asking other cultures to rise up to let that happen?
Are we asking for a cultural change at the atomic level, or subatomic level, if you will?
You know, at such a nuanced level that it's not possible, right?
You look at India, which has the largest workforce under 30.
Most of the educated women.
Women have a higher illiteracy rate in certain states than men do in India, and that is because it is cultural.
But given an opportunity, women do much better than men in the

(59:13):
classes.
But yet when they come on, there's that perception that, oh, you will stay home and have a family, and so that brings you to a bigger question of healthcare and childcare.
And, you know, what are we doing to support these women?
I mean, there was a time, before Jack Welch was at GE, where corporate cultures did have the worker as a central

(59:34):
figure, one of the things that Silicon Valley was so successful at, everything they could do to help employees stay at the office and work.
It was cheaper for us than letting you go out for lunch for an hour.
Even so, supply lunch. If you don't have to leave to pick up

(59:54):
your child at school, let's have daycare in-house, let's have tutoring in-house, so you don't have to worry, you can focus on the job that we have hired you for.
Until some of those benefits, if you will, come back into play, it'll be very hard. But of course, yes, it does cut into the bottom line.

Klara Jagosova (01:00:16):
But again, it's the long game, and not everybody thinks of the long game, and I think you really should.
I love that.
What comes to mind is a little bit of the balance, and I don't know if that's the best word, but you can't be a leader and still have the expectations, I think, to be the primary person, to bring up the family, to take care of the household, kind of do everything, right?

(01:00:37):
So some of the really amazing leaders that come to our mind have been really great in finding partners that take on some of that load, or share it equally, to kind of help support them in their career growth.
I also had one conversation with Abby Davison, who actually wrote a

(01:00:58):
book about this, and on a previous podcast we talked quite a bit about that perception, even as women, and you mentioned we sometimes are our hardest judge, and she navigated that through even her role in the organization at Gap Inc.
And how some of the women, even there, where it was still quite female-heavy leadership, had challenges with that,

(01:01:21):
doing the raising family andhaving successful careers.
And how do you feel that you'redoing both things equally right
, I guess, or sort of balanced Idon't know if equally right, if
you can ever do it right, butsort of to, in a loose term, any
view on that, because I dothink it also feels really
lonely when you get up top andyou have a view in Rani like any

(01:01:50):
sort of tips or maybe even aswomen, could implement ourselves
when it comes to losing some ofthat judgment about how we are
doing in our own lives.

Rani Yadav-Ranjan (01:01:55):
You have to remember you can't have it all.
It's a peace I made a long time ago, but I can have it in that
moment.
We have three children, as I mentioned, and my husband and I
decided that he would do the breakfast.
So if he made breakfast, we packed lunches together, and then
we dropped off the kids at school.
We actually put them in private school, because our public

(01:02:16):
school system is a little broken in the United States.
In Minnesota, even, it was better, but that was an older time.
We had nurses, we had librarians in the schools.
Hot lunch was literally a hot lunch.
The accommodations I made, certainly not everybody can
afford to make, either.
I have to preface that, you know: if you can afford a charter

(01:02:37):
school or private school, you should, because you don't have
to worry about certain things in an education system.
Plus, they had daycare afterwards, so they could stay
there, and they called it study hall, which literally meant they
could get their homework done and they had a snack.
So we picked them up at 6.
We both took turns cooking dinner, and then we had family
time, and then, when they went to sleep, we worked until about 11.

(01:02:58):
And we started again the next day.
Now, as the kids got older, it got harder.
Our careers were climbing, so my mom moved in with us to help
buffer some of that.
It does not change the fact that you have to be there for
your kids.
You know, my husband and I, we still laugh that when they had a,
you know, Halloween parade or whatever, we would literally put

(01:03:20):
it in our calendar, and we would block out the time to be there
for that, because it doesn't matter if you're there for some
of the bigger stuff if you're not there for the little stuff,
and that's what they remember, that feeling of embarrassment.
You know, as I said, I'm a child of an immigrant, and my
parents were never there, and I know that feeling, and I didn't
want my kids to have that feeling.
So it's remembering that. It's also great having a boss.

(01:03:42):
Unfortunately, all the men bosses
I had never understood why I would have to take off time for
this.
So I just started blocking it as doctor visits, because you
can't question it: dentist visit, doctor visit, physical
therapist visit.
But my boss I was honest with.
I would tell him, like, look, my child's got a pumpkin thing and,
you know, I got to go.

(01:04:03):
Or they've got a Halloween parade, or they've got, you know,
an Easter parade, or whatever.
I would always be honest with my boss, but in the calendar,
where everybody else could look, it was not blocked as
personal time; it was blocked as a doctor or some visit.
So, you know, you have to have that relationship with your boss.
That, I think, is important.
Most people are very supportive now, and I did this at Airbnb

(01:04:23):
and I did this at Ericsson, which is: look, get your work
done.
I don't care if you work regular hours or not; you just
have to get the work done.
And as long as nothing slipped, it was cool if you took four
hours to do it versus eight hours, because everybody works
differently.
From a woman's perspective, right now, coming into the age, I
would say to you: don't buy into the notion that you can push off

(01:04:47):
certain biological tasks, because that's not factual.
Like I said, our biologies are different.
Some people, sure, can choose to have children later; some people
cannot.
So if that's a priority in your life, share it with your boss
and tell them you're going to be doing it. That's it.
But then you have to figure out the daycare, and that's where I
think corporations have to step in, as part of the success of

(01:05:10):
this individual.
Do share the task; don't sacrifice that.
I took off five years.
It cost me 15 years to get back to where I should have been,
but it's a choice I made.
I'm happy with my choice.
My kids had to understand that there were evenings that, you
know, I had work events that I had to go to, or, you know, they

(01:05:32):
came with me.
I mean, we had a lot of functions at our home that we
had to host, because it was my turn to host, so we did.
It afforded them an opportunity also to walk into any world and
have conversations at any level.
So a friend of mine, Dan Gordon, who is CEO of Gordon Biersch,
we met, and, you know, you're chatting like this: I have

(01:05:53):
three kids.
Yeah, he has three kids. And we're like, what do you do for
New Year's?
I said, I have three kids; we do nothing for New Year's.
And so we decided, when his youngest was five, no, sorry, his
youngest was younger than five, we decided that every year on
New Year's Eve we would host each other's families.
So one year their home, one year our home.

(01:06:13):
We would cook dinner, we would play games, and we did that for
years, for years and years, until the kids, our youngest ones, were
in high school.
Now, you know, it's ironic that the three kids have followed
almost the same career trajectories.
You know where both our youngest are attorneys, both our middle
ones were in education, they're public school teachers, and both our eldest

(01:06:34):
kids are in the corporate culture.

Klara Jagosova (01:06:37):
So I love diving into these courageous topics
with you, because I know you are courageous, and one of the other
topics I saw recently on your LinkedIn is actually from your
99-year-old uncle, who posted a study that's related to
aging and how, as we age, we actually continue to grow

(01:06:59):
smarter in many ways, through reflection and meaning and
purpose, and continue to develop reasoning, obviously, if we
continue to invest our time in problem solving.
So I saw that post, and I know several people who are actually
struggling with that, and there's also the intersection of AI now.
Some people are stating that, oh, now, if we have the young

(01:07:21):
kids and we have AI, you know, will we need to start retiring
sooner?
Or, you know, what might be happening, although there's
still the gap when it comes to the amount of kids that are
being born.
So I think overall in the world we still have a gap when it
comes to kind of the workforce overall.
So perhaps AI is a natural contributor to that productivity.
But I want to throw this topic in, because I know many people

(01:07:45):
maybe would not have the courage to talk about it.
You do. What's your view?

Rani Yadav-Ranjan (01:07:49):
His name is Dr. Sid Soti.
He's in Halifax, Nova Scotia.
His grandson is a Formula One driver. Can you imagine that?
But look at this: 99, and he sends me this thing,
you know, about how God changes your mind. Because we believe,
and it's false, that AI will replace theology or replace the

(01:08:13):
concept of a divine being, and that's absolutely false.
What I took out of his thing, and I read the paper that this was
based on (I haven't read the book yet), is that
it goes back to the faith.
I believe you need to have faith in something, in theology.
In this paper he cites the Judean culture, right, where they

(01:08:33):
value the elders.
And that's what I was saying here: that at my age, and, like I
said, I mean, my children are adults and married,
I'm still learning.
If I can write a book at my age, and so my book is going to be
called Constitutional Democracy in the Algorithmic Age,
if I can write that, doesn't it behoove us to put our knowledge

(01:08:54):
down so other people can take two sentences out of it and
build something fabulous?
Or what happened to the 60-,
70-year-olds and even 80-year-olds? You know, just
because our machines break down, and, I mean, my hearing or
my eyesight or my knees, or I don't hit as hard as I used to,
you know, because of my shoulders,

(01:09:17):
just because the machine wears down doesn't mean the CPU or GPU
wears down.
In fact, it retains it.
It's all retained forever, and it compounds.
It's doing exactly what you're saying with hallucinations.
The more knowledge I get, the finer my ideas formulate, and
they're articulated much more clearly.

(01:09:38):
Now, you know, I'll give you an example.
So we have a grandson, and my daughter was saying to me, he has
a diaper rash, it's a really bad diaper rash, and my son-in-law
mentioned that he'd had a diaper rash for, like, two, three weeks,
and my instant thing was, oh, have you been feeding him cantaloupe or
peaches?
And he's like, peaches.
I said, ah, well, his mom would get diaper rashes from having

(01:10:05):
cantaloupe and peaches, like, instantly.
There you go.
So there's the data.
Now, they could have continued, or they could have just made
that passing comment, and the wisdom or the knowledge that
I've had could be passed on.
And now the best cure for that is another whole thing, but, you
know, that's there.
The other thing people have to remember, that I think is really
important, and it's really important to emphasize, Klara, is

(01:10:25):
all these apps.
Everything is based on human knowledge; everything is human
knowledge.
If you're looking at a parenting app, it is based on
aggregate human knowledge, based on a bunch of moms like this.
You know, the data that I have, which is my
three kids, this would be actual data that
would give you that knowledge,

(01:10:49):
and you need to remember and take that with the intent that
it's given to you.
This is advice.
60- and 70-year-olds can look back and say, no, maybe you're
right, but we're learning, because we have the time.
We're building on a career that is already done.
We don't have those stresses anymore.

(01:11:09):
You know, we don't have to worry about making the plays.
We don't have to worry about, if I want to work 12-hour days, I
can work 12-hour days.
We have those freedoms.
The question is, can society accept that?
Can you have somebody come in?
And, you know, in India the forced retirement is 55.
Now, that's a different culture.

(01:11:30):
They have a massive 100-plus million people that are under 30.
I get it, but yet even there, like I mentioned with Japan,
there's a mandatory thing, but yet you can't.
I have a neighbor who is a nuclear scientist, well,
close by; he's 89, and they keep asking him to come back because

(01:11:50):
nobody can do the job based on the experience that he has, and
he's 89 years old.
So if you're given an opportunity, what you will get
is an individual who has got decades of knowledge and
experience, who can handle very complex tasks easily,
if you will.
If you're worried about the salary, because you don't want to

(01:12:13):
pay them these salaries that you think this person
should make, offer them something.
Offer them something that works.
A lot of times, my colleagues and I, or my cohort, we just want
to work, because we're not done yet.
Yes, we would like to be paid what we're worth, but there are
always creative ways to hire people.
You can give options.
You can do other things.
You can do things that impact your bottom line, or you can

(01:12:36):
bring them in to shadow and train somebody who's just coming
up, if from nothing else than societal good and ethical
development.
And, like I said, I learned to program back when you would
mount a tape.
You know, in COBOL you have to mount the tape, run the code,
then dismount it.

(01:12:56):
These kids don't even understand what that means.
So imagine having to program something raw.
That's why these scientists are valuable, the people who
understand how to code and can code devices.
They're the ones who are valuable.
Prompt engineering is great.
That's just asking smart questions.
Any 60-, 70-, 80-year-old can be a prompt engineer in my world,
because they know how to ask questions.

Klara Jagosova (01:13:17):
Yes, yeah, I love that.
Obviously I'm biased towards that, because I always think my
grandparents were the best thing that ever happened to me when I
was a kid, like, growing up. Just even with their passion and
patience, like, there's both of these things: inspiring your
passion and giving you the freedom to sort of do things you
want, and having the right patience with you when I was

(01:13:38):
doing my homework and kind of helping me solve through the
problems. Exactly.
And I hope one day I will be as smart as my grandma was, and I
actually just see it also, like, my mom, and she's, like, a fantastic
human, just everything she has gone through in her life.
I one day hope I will be as courageous and brave as she is.

(01:14:01):
I have just a line of strong women to look up to. Impressive.
But I feel like something happened when my grandma passed
away.
My mom is continuing to switch even more into this wealth of
wisdom and just a very different temperament, and I wonder if it's
kind of what you mentioned, this lived experience, as you

(01:14:21):
continue to go through life and you see that full evolution, and
even just the history, too, of technology, or the world, for that
matter.
I think that's something that often now is being forgotten,
like, what can history teach us?
And if you kind of track back to some of the human choices or
technology choices that went well or poorly, there's a lot of

(01:14:45):
lessons that we can lean on and apply. And, to your point,
like, what are the deep questions that we should ask as we think
about deploying this new technology that's coming up,
that may be grounded in the learnings from deploying
past technology in the right or wrong way?

Rani Yadav-Ranjan (01:15:02):
You're absolutely right.
People, I think, inherently understand when there is a shift
that occurs in our society.
You know, when my mom and dad both passed away, I said to my
siblings, I said, now we're the elders.
Yikes. It's time for us to grow up and be elders now, and that's
important.
You know, we need those.
Society needs those.
AI needs those.

(01:15:22):
Actually, if you want to get rid of hallucinations, you get all
these seniors who have any kind of common sense and you have
them start labeling, yes or no.
You have them start validating, because, based on everything we
know, the stories we've heard.
And if you haven't heard stories, ask your parents, ask
your aunts and uncles, ask them questions.

(01:15:44):
Because even some of the questions that I'm asking
now, I wish I'd asked back then, but I was so busy.
What people don't realize, and if I can give you one piece of
wisdom: life is very fast.
It's short, but it's very fast.
From the time you're zero to 20, you're just waiting to grow up,
you're just waiting to do that, and, guess what, it happens.

(01:16:05):
Then it's 20 to 40.
You're just building your career, you're having your
family, you're setting your foundation, and I'm telling you,
it happens like this.
40 to 60, you're, like, coasting.
You're still getting those kids out the door or whatever.
You've got a balancing act.
You're still being sandwiched between parents, siblings,

(01:16:31):
children, life.
You're still sandwiched, but you'll hit your 60s and 70s and
you're no longer sandwiched, and you have nothing but time and
opportunity to learn and be curious.
But if you get blessed, like it sounds like your grandparents
worked hard with you and your sister, you are allowed to share
that back, because you're not in a hurry anymore.
You know, now you're in a hurry to share the knowledge that

(01:16:55):
you've accumulated all this time.
I love to read, I love to code still; there's a lot of stuff I
love to do. But, like your grandparents, that kindness and
gentleness and love and nurturing they gave you, they're
forever in your heart, and so are they not immortal?
Is that not what immortality really means?

(01:17:16):
And isn't that how we feel we should be remembered? With
kindness and grace, not the humans that we were?
And so, to me, you know, folks, if you want to be immortal, give
back.
That's it.

Klara Jagosova (01:17:29):
I love that.
It beautifully ties into a few closing questions that I have,
so, if you have time. But I want to insert one more in,
because it's again this age of AI that you continue to shape.
Before we go to closing, anything else that you want to
share with the listeners, the audience, about GreyCloud AI,
the book, what you are launching, or even just things to keep

(01:17:53):
top of mind as we continue to explore and navigate this new
technology?

Rani Yadav-Ranjan (01:17:57):
I think what you have to realize is that we
are going to become more siloed.
Companies are going to start locking up their data.
They're going to finally realize that it is gold, and it
has been gold.
I'm more concerned with the vast amount of data that every
government in the world has access to, whether it's still in
paper or it's been digitized.
At this point, it can be used for good.

(01:18:20):
It can also be used not, so I think guardrails are critical to
have in AI.
The EU, with the GDPR and their EU AI Act, did a lot of
great work that needs to be built upon yet, and if we can
start creating maybe a world organization for AI, that would
be great governance. Whether it's AI governance with guardrails,

(01:18:41):
you know, then you can do your sub-stacks.
I was a part of NIST AI here. In fact, through C200, I helped
with the White House, back in, I think it was the Biden or maybe
the Obama administration, helped write their AI policies, some of
them, but certainly moving forward.
AI can enhance, if it's done correctly,

(01:19:02):
autonomous vehicles
and things like that.
They need AI.
You cannot drive.
If you can have sensors and somebody can be a second pair of
eyes, why not?
If we can do surgery, robotic surgery, across the world,
because the expert happens to live in Ukraine or the United
States, why not do that, share that knowledge and do the
surgery?
However, I don't believe that every role will be applied that

(01:19:27):
way.
I don't believe that's possible.
Certainly, I cannot get a robot to come and do my plumbing.
It would be great, but I can't, so that's never going to happen.
With GreyCloud, hopefully our solution solves a critical thing
of super-sensitive data.
If not us, it'll be somebody else, because you have silos now,

(01:19:48):
and data silos are really bad.
But if we can evolve around that, we should.
We should do it. As society needs to evolve,
we need to do that.
There used to be the arms race, then there was a technology
race; now it's an AI race.
What you have to remember is NVIDIA is not the only company
that has GPUs and CPUs out there.
Theirs are fast, very fast, but they're not the only ones, and

(01:20:10):
so our solution is, if you don't have millions to drop in a,
what's it? Blackwell, yeah.

Klara Jagosova (01:20:17):
The new chip, yeah. If you don't have $10 million to drop into
that, you don't have to.

Rani Yadav-Ranjan (01:20:19):
If you don't have $10 million to drop into
that, you don't have to.
You can use GreyCloud, and it'll manage and do the load
balancing for you.
And, you know, your computer's on anyways.
Just plug it in, leave it on for... or still the wars that's
going on.
You can take it whichever way.

Klara Jagosova (01:20:52):
Always curious: what do you think people can be
doing more of, or less of, as we navigate?
These seem like turbulent times; I realize I have now been saying
that for several years, and it's just not getting
any calmer.
Really, maybe that's just the new reality.
What's your view or inspiration for the listeners?

Rani Yadav-Ranjan (01:21:10):
I would say, don't get triggered by fear.
That's the biggest thing.
Fear, it just kills your mind.
Like I tell my kids, you gotta face it, you gotta walk through
it, and then it becomes a nothing burger.
Don't wait to have difficult conversations, because they
become complicated conversations from a nothing conversation.
AI is going to be here, because we're showing how easy it is to

(01:21:33):
do things.
Corporations are gravitating towards it because they believe
they can replace a workforce with computer knowledge.
What they don't realize is that for every server rack, every
cloud storage company, there's a super drain on the power grid.
So resources are going to be expended, and that includes
water.
Water is a resource that is going to be in short supply very

(01:21:55):
fast, very soon.
I remember, 20 years ago, telling our mayor here in San Jose that
if you really want to cripple society, just kill the water
grid and the electric grid.
Well, then you have to remember that all these technologies
we depend on depend on power, and so you can kill anything;
just unplug it.
Yeah.

Klara Jagosova (01:22:15):
That's it.
Turn the grid off, and we'll learn really quickly.

Rani Yadav-Ranjan (01:22:18):
Yes, yeah, you're back to pencils. So keep
your knowledge.
Don't stop learning about why and how these evolve.

Klara Jagosova (01:22:26):
I love that.
It reminds me, I now and then do these trips to remote places
without any connectivity, and I feel like they give you so much
more clarity, and you also realize how insignificant we are,
because Mother Nature is just so powerful that, if you
actually get yourself into Mother Nature, if it gets angry

(01:22:47):
at us, we're just going to be like ants rolling down.
So, yeah, how do we keep it healthy, to not make it angry?
That should actually be our focus as well.
And then, with that, I know your book is coming out, so maybe we
can have a round two when it does.
I'm really excited to read it, and thank you for that
announcement.
I know you're very active on LinkedIn, which, with your

(01:23:09):
permission, I'll add to the podcast so
people can easily find you.
But anything else? What's the best way to keep in touch, for
listeners who want to reach out, have a conversation about what
you're building, how to scale it, or any other topics?

Rani Yadav-Ranjan (01:23:29):
Absolutely, no, please. I mean, definitely
through you.
Thank you for having me on this podcast.
This is just brilliant.
I'm so proud of you.
I'm so excited to know you. Like, oh my God, I have a friend
who's a podcaster, which is cool.
Thank you. Please, yeah, contact me through LinkedIn;
always available.
You can DM me any which way you like.
I'm always available to converse.

Klara Jagosova (01:23:46):
Excellent, and I'll add that so people can
easily click and find you.
And thank you so much, Rani.
I know we could be going for hours, but maybe that would be
our series two.
I'm sure when your book comes out, it'll be full of
wealth and interesting thoughts and information, so I look
forward to that.
Thank you, Klara, for everything.
I really appreciate it.

(01:24:06):
Talk to you soon. Thank you, have a good day.
Bye-bye. You too. Bye.
If you enjoyed this episode, I want to ask you to please do two
things that would help me greatly.
One, please consider leaving a review on Apple Podcasts,
Spotify, or any other podcasting platform that you use to listen
to this episode.
Two, please share this podcast with a friend who you believe

(01:24:32):
might enjoy it as well.
It is a great way to remind someone you care about them by
sharing a conversation they might be interested in.
Thank you for listening.
Host

Klara Jagosova