Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Kenny (00:33):
Welcome, welcome everybody to another great episode of The Beyond Normal Podcast, where every founder story is a journey worth exploring. I'm your host, Kenny Groom. Today I have a very special guest in studio with me. Hello. Yes, Rachel Cash. She is the... you go by CEO or founder?
Rachel (00:51):
Um, founder and CEO.
Kenny (00:54):
of Elroy. What Elroy does is, it's a first-party data platform with AI-driven insights and seamless compliance for organizations and for consumers. We're gonna dive into that in a little bit. But without further ado, I just want to thank you for coming on the platform today.

Rachel (01:14):
I appreciate it. It's awesome.
Kenny (01:16):
Yes, yes. Uh, have you been on podcasts before?

Rachel (01:19):
I have.

Kenny (01:19):
Alright. So you know the drill, right? We're gonna leave no stone unturned. We're gonna dig into your story. So without further ado, let's hop into the first question.
Mm-hmm.
We're gonna take things back, right? In terms of what you were doing before Elroy, and what led you to start thinking about data privacy as a business
(01:42):
that you would wanna take a gamble on yourself with by being a founder?
Rachel (01:43):
Yeah, that's a good question, and it's a long answer, so I'll try to truncate it down for us. Um, so I went to North Carolina Central School of Law, so I'm a legal eagle. Shout out Central. Um, and when I went to law school, I always knew I was gonna go into business. So I always knew that I wanted to go the business
(02:05):
route, but a lot of the classes I took were constitutional law driven, and I really fell in love with how does one own their persona, or themselves, their likeness, in an ecosystem. And, so I graduated a little bit ago, so... was it 2013? I went to, uh...
(02:25):
2014, I went to law school, so it's been a minute. And so then I was focused on trademark law. Um, so think Michael Jordan trademarking his signature, the Jordan emblem. Can individual consumers do that, or individual people? It was right at the onset of
(02:47):
Um, NFTs, mm-hmm, and more people putting themselves on YouTube and into the digital economies. Um, coupled with that, I was working the front desk in law school, so I had no technology experience, but I was working the front desk and just had everybody coming back and forth. So I was kind of like this central mechanism, while very interested in the constitutionality of
(03:08):
individuals.
Right.
Um, so I start studying for the bar exam, and all these thoughts are swirling around: how can people own their likeness under the Constitution? How can we integrate technology into this? And then there was an Experian data breach while I was sitting at the front desk.
(03:29):
So I took a whole side trip into security. Mm-hmm. Um, and things just kind of snowballed from there. I did SOC 2 compliance right after law school, just because I got really hyperfocused on it. It was just by happenstance, and at my first job there was an opportunity for
(03:51):
privacy, and it was like all the stars aligned. Um, the California Consumer Privacy Act had come out. Mm-hmm. And they basically needed someone to help with this program that they were running. So I was working in insurance in Chicago, and everything aligned perfectly, and privacy, I found, was where
(04:14):
the law kept trying to weave in the constitutionality of how do we protect consumers. It was technology. It was moving fast, and that's something you rarely see in the legal field. And from there I started building data orchestration systems, so the strategy around how organizations deal with data.
(04:34):
How do we protect consumers? And so I was really excited about this 'cause, one, I was making money. Mm-hmm. And two, I was doing something I liked, and I was living in Chicago. But then COVID hit, and I had a lot of time to sit, and I was like, ah, I'm doing something, but I'm still not solving the constitutional problem I believe in. That's what started me on Elroy's path.
(04:57):
So, long story short, basically all these things kept coming to me, and I found a problem that I wanted to solve.
Kenny (05:06):
I love that. So that aha moment for you, when you knew you wanted to start... what was it? It sounded like it was quite a few experiences, so was there a particular moment where you were like, I wanna start?

Rachel (05:22):
I think the moment...
So I've had a couple experiences in that path. I was a relatively new attorney, so my thoughts weren't always escalated to the top, or into the decisions that were made. Um, and the decisions that were made were strategic decisions that were best for the companies I was working for, and not
(05:44):
best for the consumers that were gonna be impacted. And all those nos started to accumulate. Like, hey, I think this is gonna be best for consumers. But I was getting red-taped, like, no, we're not gonna do that, 'cause we have to protect the business. And that was a trigger for me. Like, oh, we are not solving what I really wanna
(06:05):
solve.
And so I just kept growing and growing and growing. And then I think the final step was at one of the organizations. We were one of the first big companies to use OneTrust, which is a data privacy company. Um, and they became the fastest growing startup that year.
(06:27):
And I was fundamentally opposed to their approach to privacy. And while we implemented the tool in the way that we thought was best at the time, um, I said, if this team can do it, I can do it. And I think it was just seeing that visibility, and seeing a
(06:47):
team who was doing something new and people giving them chances, while knowing there's a problem that I knew how to solve, or had an idea for. That was kind of the jump off the edge. Mm-hmm. And COVID gave me the opportunity. You didn't look back since? Yeah, I mean, I have looked back, 'cause like I said, I was making money.
Kenny (07:05):
Oh, got you. So, okay, but you having that strong background in privacy law, how do you think that background prepared you to be a founder?
Rachel (07:20):
Um, so there's a lot of things about being a founder, or just a business owner in general, even if you're not in tech, that I think are hard to jump over. For example, filing your LLC, doing your C Corp. Even knowing your best privacy strategies is now becoming more and more core to what founders need to think
(07:40):
about, 'cause your investors are gonna ask you about your compliance. Even if you don't have everything set up, what's your approach to it? So a lot of those hurdles... I think a founder will have a couple hurdles they'll have to be able to overcome. Some of them will be legal, some of them will be based on sales, and some of them on marketing.
(08:01):
The legal hurdle was not something I worried about; legal stuff comes naturally to me, so it just helped me not feel scared about things that we had to do. Setting up and being strategic about that was fine. The parts that needed more alignment, that forced me to spend more or work on more, were marketing and sales, and how do we communicate a story. Um, on the other side of that, I'm very, very detail oriented and think like a lawyer, so everything has a risk profile to me. While being a founder I'm taking a lot of risks, but most of the steps I think through... we probably
(08:23):
build technology that we don't necessarily need, because I understand our risks from a security and privacy perspective. So I wanna be at the bleeding edge of that. Okay. Not necessarily where, hey, we have an MVP and we can do this, and, you know, if something happens, then we make the next jump.
Kenny (09:03):
Yeah. Okay. So on that side of things, you feel like you go above and beyond.

Rachel (09:09):
Right.

Kenny (09:10):
Got you. Okay.

Rachel (09:12):
Which I'm sure my tech teams hate. Yeah. I'm so sorry for anybody who's worked with me from a technology perspective. I may not understand everything, but I'm always like, hey, here's an issue.

Kenny (09:23):
How do you know? Like, how do you know? Because a lot of the guests that we've had on are founders that I've talked to, and they, mm-hmm, they have that mentality that, you know, if it's like 70% done, mm-hmm, I'm gonna release it, and then I'll figure out the rest. So how do you overcome that?
Rachel (09:44):
Yeah. That's a really good question. So, yeah, the ability to not get so caught up in, like, it has to be perfect... my perfection is more on the backend, the things that you don't see. So, how our data is being stored in our systems. I really believe in decentralizing data storage. Um, consumers being able to delete their data on demand,
(10:08):
having segmentation of our data in our, um, environments. Who we're using as a cloud provider at a time is really important. There's big debates. I know a lot of US startup founders do
(10:29):
cloud or credit jumping, which, you know, I like credits as well, but, you know, DigitalOcean sometimes has a little bit more risk than maybe you're using Amazon or IBM, and it may be more expensive to you. So a lot of the stuff that I have lower risk tolerance on is more on the back end, or at least having the documentation that we can prove out. While on the front end, like, yeah,
(10:50):
hey, you're logging into the platform and it may do something funky and you still release it? Yeah, that's fine. I have a lot of bandwidth for that. But some things, like from our data storage perspective, I don't have a lot of tolerance.
Kenny (11:04):
Okay.
Rachel (11:05):
So, now I hope this doesn't come back to bite me. Like, no, no, no... let's test her system.
Kenny (11:12):
Oh, got it, got it, got it. So I wanna talk a little bit about... a lot of startups now, a lot of companies... now, you're in the privacy space. Mm-hmm. I think of the word trust, like

Rachel (11:23):
Yeah.

Kenny (11:24):
You're trying to build trust with your end user, things like that. When you're a startup founder, you have to focus on your early customers. So I'm curious, how did you go about building trust with some of those early customers when it comes to privacy and data? Which is not a sexy topic.
Rachel (11:43):
Yeah, that's... yeah. No, it's true. It's not sexy, and most of the time consumers don't care about it at all. Mm-hmm. Like when we talked to consumers, or when we first launched and were like, hey, we're a privacy product, privacy is a right... no one cared. It did not speak to the core. So what we did, we had to go back and really interview our
(12:04):
consumers and understand what they wanna see in the next digital economy or ecosystem. Is it really important for you to own your data, or is it really important for you to have action with that data that you now own? And once we discovered it was important for them to be able to action their data and have that level of control, that's when we started building out more features based on that.
(12:26):
What I mean by that is, we used to automate you submitting your DSAR, your data subject access request, so you can ask any company for deletion or return of data, and based on the regulations, the company has to do that for you. Historically, nobody does that. So yeah, nobody goes back and asks; you're just
(12:47):
not interested. So what we've found is, okay, maybe you don't care to get access to all your data. But even right now... this is a podcast platform. Having access to your audience and their data would be really significant to you as a business owner. Mm-hmm. Do you have all that access? And where are all the places that you want it? Mm-hmm.
(13:08):
You being able to action that means more to you. And that's just us learning to build trust through understanding what the need is and how we can meet the need. Um, and second to that, we go to business owners and bring them onto our platform, and based on that institutional relationship, consumers typically trust us.
(13:29):
Got it. So, we know consumer marketing is expensive, so anybody who does it well, kudos to you.
Yeah,
Kenny (13:35):
Yeah, yeah. I like that. And I think it's interesting, because, like you said, we kind of give away a lot of... we don't give it away, but we just scroll through. Like when we download the app, we just, as consumers, scroll through all the letters at the beginning,
(13:56):
like, hey... the fine print. Yeah. We just want to get to the action. Yeah. But then there is power in me as a consumer being able to tell somebody how much they can do with my data.

Rachel (14:08):
Right. Which... I have a whole other long tangent I could go on: I believe having to accept terms and conditions are like contracts of coercion. You don't have any power in that contract, so I don't think those should be legally binding. But, like, I'm not taking on that fight. It just seems like you can't force someone to accept
(14:30):
everything. That doesn't seem fair.
Kenny (14:33):
Yeah. That seems like a... that seems like a podcast.

Rachel (14:37):
Yeah, but we digress. You brought it up and I was like, oh yeah, that doesn't seem right either, but...
Kenny (14:43):
And I think, where we're going now, um, I'm curious your thoughts. Like, when I think about data privacy...

Rachel (14:56):
Mm-hmm.

Kenny (14:58):
I feel like, me personally, I don't feel like we have a lot of that.

Rachel (15:03):
Yeah.

Kenny (15:03):
Right. And so I feel like it's increasing. And then alongside that we have something like AI, which is increasing as well. Yeah. So I'm curious, are you thinking that you can use tools like AI to
(15:26):
enhance the way someone like me, as a consumer or business, can own their privacy a little bit more?
Rachel (15:29):
Yeah. So, and this is a great point, as we enter into the intelligence era, right, where we are more reliant on automations and AI, which require more real-time data, I think we're right at the peak, like almost the dot-com boom. The same thing. Mm-hmm. And us as humans, as we engage with
(15:52):
technology, we have a very interesting position to take, and we have to make a choice. And that's kind of where Elroy is really trying to be. You as a consumer, you use ChatGPT, for example. You're just feeding it information. Eventually, and this is just a trend, we'll start using our traditional internet searches
(16:13):
less, and you'll start going to ChatGPT instead. Mm-hmm. And they'll have a profile of you and then do its own version of, and I forgot what the AI version is called, but it's like AI optimization or something, the same as SEO, but pulling the right websites, the right answers, specific to you.
(16:34):
And so if we don't create ownership, or silos, for individual consumers in this process, the existing big tech players will continue to move the machine forward, mm-hmm, without you being involved. So like, you're a small business, but you're highly reliant on other companies
(16:54):
helping you. Extremely reliant, extremely reliant. Then you yourself, as a small business... that reliance will continue. As a consumer, you are then in contracts that are forcing you to give stuff without any control. So we will eventually get to places where
(17:15):
you have no control and you're being influenced. So, even right now, I think we both... I'm gonna say we're around the same age, so we grew up in the internet when the internet truly was the internet, right? Yeah, yeah. Like right now, there's algorithms; it tells you what you like, and that will get more and more integrated into everything that we engage with if we don't manage controls. Right.
(17:37):
And the only way I'm thinking we can, my viewpoint, is to manage controls: who has access to that data, who is giving access, and who's revoking access. Mm-hmm. That's one of the ways I think we can put these controls in. So hopefully I answered your question.
Kenny (17:49):
Oh, you did, you did. You definitely did. You got me thinking too: like, what exactly am I even giving away? Because with everything, I'm quick to just be like, oh, this is the AI something, lemme

Rachel (17:59):
try something new. Yeah.

Kenny (18:00):
Lemme just try something new. And it's like, all right, fine print, all right, whatever. Yeah. Like, lemme get to the action.

Rachel (18:06):
That's because anything free costs us something. Yeah. I think that's what we have to remember. It costs you something; you just don't always know what it is.
Kenny (18:13):
So let's pivot a little bit. Let's talk about, um, what kind of people are you leaning on right now to build out Elroy? Like, talk about how you're building your team and what you're looking for there.
Rachel (18:30):
Yeah, so I think right now I've mostly been building my team based on recommendations, and I'm at another crossroads as part of this journey. Um, when you're beginning, and I think this is probably similar for a lot of founders, there are very specific people that are interested in being in the startup world.
(18:52):
You have to have a high risk tolerance. You have to have the ability to know that some days you don't know when that startup's going out of business. Mm-hmm. You just are willing to do that. And then, two, you have to kind of believe in the mission. So just recently, I would say, I'm starting to, mm-hmm, be more focused on individuals who are looking to solve a really big, complex problem and understand
(19:15):
the nuances of that problem. And I think before, I was speaking about my problem, the problems that we were trying to solve, and it was going through recommendations or people around me. Now I'm more so looking in areas where people are already solving, or looking at, or addressing the problem, because we are in a complex problem state. And so, again, the pivot now
(19:37):
is not so much recommendations as more people who are mission aligned, and I have more tolerance for that... I mean, I'll take more risk on 'em. Mm-hmm. Um, so that's kind of where I've been looking for teams. So, more in privacy organizations, starting to build up cohorts in the different industries we're in. We have a couple beachheads: we're in healthcare,
(19:59):
we're in, um, education. So people who really understand the nuance and the insights that need to drive change in those industries. So I'm being a lot more strategic about who I bring on the team, as opposed to people who are just like, oh, you're doing something cool, lemme join. Yeah. And then you're frustrated, 'cause you're wasting... yeah, you're wasting your time a little bit, because I can be
(20:20):
really frustrated. Yeah. So, yeah. You've been through that. You ain't gotta say no more, for everybody.
Kenny (20:29):
No more. So, transitioning a little bit, because the way that you have laid out the privacy-focused market in terms of opportunity, it seems like there's space for a whole bunch of players in this space. So what sort of advice would you give to
(20:49):
someone who's thinking about starting a privacy-focused tech company? Like, where should they be thinking?
Rachel (20:54):
Yeah, so I think there's two paths. I think you could be a privacy-focused tech company, and I'll get to that, or you're a tech company who's gonna be dealing with data. And I think more startups should be thinking through their data plan. Especially since everybody's, like, AI powered. Most startups are AI powered, data powered, doing something as part
(21:15):
of that thought process, or in how you position yourself, and you should be able to communicate your privacy by design. How are you dealing with your consumers' data as you collect it, and how is that gonna drive your business? So that's one path. I think that should be integrated in every startup and every business, honestly, but mostly with startups. And two, if you're privacy focused:
(21:36):
I do think solving the data ecosystem issues requires a lot of different technology to come together and reshape the data ecosystem. So there's a lot of space here for how we leverage data, how we're managing consent for data, how you may be thinking about, um, data privacy differently than me. Especially,
(22:00):
we have a huge generation, um, at Central we're working with. Um, Central has a privacy course now, and they have backing from large organizations. So we have younger lawyers coming in who are right at the beginning of their careers. So they have nothing limiting their thoughts of what could
(22:20):
happen. So I think there's space for new ideas in privacy, and if you get a chance to get into the nitty gritty of the issues, it's a good industry to get into.
Kenny (22:33):
It's safe. Yeah, I believe, based off what you're telling me, it seems like there's plenty of room. I'm a big proponent right now of B2B businesses, right? Like, us having more of those. And so it seems like there's some room for somebody who's coming in new to the space, a new attorney or something like that, to put on that entrepreneur hat
(22:54):
and create something new.
Rachel (22:56):
Right. Perfect. Yeah.

Kenny (22:57):
So, I'm gonna ask this question a little bit selfishly. Mm-hmm. As a consumer, are there things we as consumers should be considering more when it comes to privacy and how we engage with companies?

Rachel (23:11):
Yeah. So, really, and I know it's nuanced and it's complicated, but when you have a technology that you're using often, right, you need to make sure, just at the baseline, you have some level of control over it. So if we're leveraging ChatGPT often... we are, yeah... there's a setting in ChatGPT that says turn off
(23:35):
model training. So if you think about it, if we are using it for our businesses, it's proprietary, really, right? Mm-hmm. But if you're training this model, now... there are no guarantees. I'm not saying this is a guarantee that ChatGPT won't go out and train on the data, but they do have a toggle that says, hey, don't train this model on the data I
(23:55):
provide. So really it should just be very local to your computer and what you provided. Mm-hmm. So in each technology, I think that's what you need to be able to do: find, even if it's very nichey or very quick, content on how I can be safer. So, we use Meta a lot, so Instagram: there are big policy shifts, and
(24:16):
making sure we understand the setting features, mm-hmm, in those policies. You probably leverage YouTube and Instagram and different things. Mm-hmm. There are settings that let you be public, and then there are settings like, yeah, I wanna save my searches. That's how you get your For You pages, but some of those you need to restart. Got it. So just really understanding the technology and where they're pulling your data from,
(24:37):
and making the controls that make sense for you. 'Cause I do think controls are on a spectrum. For some people, it will never matter how they own their data, what they're doing. If you wanna contribute to the greater good of ChatGPT's model, great. Mm-hmm. But for some people, if you're putting in your proprietary ideas and trying to solve something, you may not want to do that.
Kenny (24:58):
Yeah. That's fair. That is definitely... so those are my tips, especially when it becomes business IP and you're putting in some ideas. Interesting. You got me thinking. Yeah, I use ChatGPT a lot, so as soon as this is over, I'll go toggle that... like, do I have that toggle on? I'll definitely be checking that. But I appreciate you giving us that tip.
(25:20):
Um, and it definitely seems like there's just a blurring of the lines around what we should or shouldn't be putting into the AI tools. Mm-hmm. And I don't feel like there's a clear answer now. And a lot of business owners who are wearing multiple hats, you're putting everything in there,
(25:42):
like you're putting the essence of your business in. So it's important to have some...
Rachel (25:46):
It's funny, because... I think YouTube, on July 15th, will say you can't have AI creators anymore.

Kenny (25:53):
Yeah, I saw that.

Rachel (25:54):
And so they're leading the industry. But we're gonna start seeing this across industries. So I think us as individuals and creators should also be thinking through what are the rules that may impact us, and how do we live with that, even if it's micro steps. And so a micro step would be: hey, you can't train on me. But for YouTube, they're like, oh yeah, we're not gonna let your platform compete with someone who's just an
(26:17):
AI version, like something that's not human. Um, well, a human will make money off of them, but, you know, it's trying to limit that, and they're at this forefront. But I think we also should be at the front end of the micro steps that we each should be doing.

Kenny (26:32):
Yeah, that's a good point. And I did see that YouTube announcement. It is really interesting, because then it plays into monetization. Yeah. And all these things. And it's like, who's supposed to be getting paid, mm-hmm, off of... you know, there are probably pages with a hundred million views, and it's either AI generated or
(26:56):
it's really basic stuff, like sleep music.
Rachel (26:59):
Yeah.
Kenny (27:00):
And it's like,
Rachel (27:01):
And then, what's blurring the line? What's going too far? Like, what is the difference between me having an automated voice or AI voice say something... and we have these quote videos, yeah, where no one's engaged, it's just the quote video, versus, mm-hmm, us saying, hey, what should I talk about in this session? Like, where's the line? So again, it's on a spectrum, but I think we are at a unique
(27:23):
place, even in our own businesses, to decide what's the line for our communities, the cultures that we're gonna be... the culture shifts that we're driving, and it's our responsibility to be cognizant of that. And so...
Kenny (27:35):
We're in the moment now when it comes to, um, specifically data privacy, where the change is rapid, um, regulations or lack thereof, mm-hmm, where conversations are being had. So since we're in the moment now, I feel like the end has gotta be on the horizon.
(27:56):
Yeah. So I guess, what do you see in the next five years? Like, how does this play out, in your view?
Rachel (28:04):
Yeah, so I hope I'm wrong, but I do think AI and the access to data will drive a reduction in the human workforce. Mm-hmm. I hope that we replace those jobs with other jobs, but I don't right now see that as the trend. Um, so the vision for us at Elroy is that if a human can own
(28:26):
all their data, instead of these companies, that individual consumer can get paid from their data and bridge that gap, the lack of money that we will see in the reduction of the workforce. That's really where we're looking. Gig economies will shift, actual workforce economies will shift
(28:48):
as well, our traditional workforce economies. If we can replace that... your only renewable resource as a human, current state, is your data. And so if we can shift that to say, okay, you own it and now you can share it, mm-hmm, and feed this digital economy going forward, then that's what we're trying to do. But I do think the foundation of that is based on
(29:13):
privacy. Like, your right to do this is based on privacy, and if we don't focus on the privacy part of it, then we won't get that right. So again, I'm hoping that, like, 50% of the workforce won't get replaced by a robot, but inevitably, I think that's the path we're on from an innovation perspective.
(29:33):
Yeah. We don't do the things we used to do anymore as we automate, and that's just industry. Mm. So that's what we're supposed to do.
Kenny (29:43):
That's really... I don't know which side I'm on yet. Mm-hmm. I feel like, naturally, if there's all this automation and efficiency, we should be working less. Mm-hmm. But then, to your point, the extreme of that is, hey, you can work less, you can work zero hours, because
(30:03):
we don't need you.
Rachel (30:04):
Yeah. So, like, less goes to zero, because... I can't pay you.

Kenny (30:10):
Yeah. That's the part where, um, I don't...
Rachel (30:15):
Now, everybody says that this isn't happening, that all the studies say they're not gonna do this, we can't. But if we look at the trends: McKinsey has laid off, mm-hmm, their BAs and their junior, mm-hmm, consultants. They say those jobs can be replaced. That is something that's agnostic to industry.
(30:37):
Lots of things could be automated, or we have enough information to let someone...
Kenny (30:42):
Yeah. And then what remains afterwards, I think, is a question, because those could be the jobs that aren't necessarily where people want to go. Like, there's no creativity in the job. Mm-hmm. Maybe certain aspects are like...

Rachel (30:56):
Mm-hmm.

Kenny (30:57):
It's a heavy, heavy topic. So I wanna segue, right? So when you're not building Elroy, right, mm-hmm, and having these really nuanced, deep conversations on data privacy, yeah, like, what do you do to recharge?
Like what do you, like, what doyou do to recharge?
Rachel (31:15):
Um, I learn new skills.
Kenny (31:17):
Okay.
Rachel (31:18):
So, I think, uh... and I don't know if this is... this is probably a bad path for startup founders, but this is my path. I know that being a founder is hard, and we are climbing up a mountain which half the time we're sliding down. Like, you climb up three steps, you're gonna slide down four, usually. So right now I'm an activity girly, so I'm in a
(31:42):
private pilot's license, uh, learning to surf, golf lessons, anything.

Kenny (31:49):
You can't sit still?

Rachel (31:50):
Basically, no.
Okay.
If I, if I'm sitting too long,I'm like, uh, I have a lot of,
of space in like, time is myenemy.
Like, it, it just is too muchtime.
So I will do physicalactivities.
To, to make myself feel better.
So that's what sews into me.
Mm.
Um, like, learning, that, like, it's like, I feel like they're
(32:10):
like patches, and that's how I take life. Like, everything is like a new little Girl Scout patch that I get.
Got it.
So I'm just trying to collect them all.
Kenny (32:19):
You know, there's nothing wrong with that, learning new skills, being active, being outside.
Yeah.
Um.
Especially in the time we're in, we just have so many devices, and it's, like, easy to not go outside.
Yeah.
But then there's, there is value still in, like, going outside, and like, you know, our elders used to say, like, go outside and touch grass.
Yeah.
Like, just go out, be outside of the devices, be with
(32:44):
your thoughts.
Mm-hmm.
Um, there, there's, there's power in that.
Yeah.
So that, that's cool that you're, you have the, you have the capacity or the flexibility in this moment to be building your passion, and then when you unplug, go outside. I think that's, I think that's pretty cool.
Mm-hmm.
So, you know, I appreciate you for sharing so much with us in
(33:07):
this interview. I learned a bunch. And then a big thing for me, and I'm pretty sure some of our listeners, we're all gonna go to all those apps we're on now and be like, okay, at least question, what are some of the data privacy practices that they're, that they're, um, implementing now? Because, again, folks, like I said, I pretty much, if there's
(33:28):
an app I want, I'm just signing up for it and I'm clicking through the, through the sheets in order for me to do the final, uh, accept of those terms. But you've definitely got me thinking, and I'm excited, actually, for this space now, after having this conversation with you. 'Cause there are layers there, and where those layers lie,
(33:48):
I think, again, there's opportunities for folks in the B2B space specifically. There's nothing wrong with consumer, but B2B, I think, is where there's gonna be a lot of growth. So tell our listeners, how can they connect, keeping, keeping in the loop on Elroy and all the things that you're doing?
Rachel (34:07):
Yeah, so we're at Elroy, E-L-R-O-I dot ai is our website. You can connect with us on, um, Instagram, it's Elroy ai, um, on Instagram. So we're, we're pretty, we're easy to find.
Um, just as a little tidbit, Elroy is named after the Jetsons.
Oh.
(34:28):
So, you know, I was about to ask you where that, yeah.
Oh.
Um,
Kenny (34:33):
even though that's dating both of us, but I got excited when you, when you said that.
Yeah.
Like, that's the name
Rachel (34:40):
that is Yeah.
I used to watch the Jetsons on Boomerang.
So that's where it
Kenny (34:44):
came from.
Don't, don't say you, I used to. We used to watch, we used to watch the Jetsons.
I'm
Rachel (34:49):
sorry.
Oh, that's dope.
Yeah.
So that's where it came from.
Kenny (34:52):
That's amazing.
Oh, well I appreciate you, um,for giving our viewers that and
they know where to go.
Um, lasting thoughts.
I always give it to our guests, like, what do you want our listeners to take away as the final thought from this conversation?
Rachel (35:08):
Um, I think just, like, uh, one of the things I think of often, and talk to my friends about a lot, is that we as humans share ideas. And a lot of us have collective ideas, and the people who have the most resilience and the vision are the ones that get their ideas forward. So if you have an idea, you're looking to solve a problem, I
(35:29):
think it's okay to talk to everybody about it.
Mm-hmm.
You'll probably get a lot of people, some, you'll meet somebody who had the same idea. What's the difference here is, like, either can you partner, or how long can you work on it to, to get it done? Because that's just it, who gets it to the finish line. And I think we get discouraged, specifically in our community, about our ideas, and we think people are taking them, and it's,
(35:49):
I think, like, God probably gives lots of people these ideas. It's just whoever has the resilience.
Kenny (35:55):
Yeah, I like that.
I like that.
Founders gotta be resilient nowadays. So what a way to, uh, close us out. Thank you again, Rachel, for, um, coming on and, uh, sharing your story and giving us some, some lessons on data privacy, uh, for all the listeners out there. Thanks for tuning into another great episode of The Beyond Normal Podcast.
(36:15):
Peace.