
February 6, 2025 41 mins


Kasara Weinrich, Director of Sales Technology and AI Solutions Designer at ADP, joins us this week to discuss how organizations weigh generative AI adoption. As a PhD candidate in anthropology and social change, Kasara offers a unique historical perspective on how workers have adapted to technological disruption. 


[0:00] Introduction

  • Welcome, Kasara!
  • Today’s Topic: The Anthropology of Generative AI in the Workplace

[5:47] What’s wrong with the world of work today?

  • Historical patterns of AI innovation and adoption
  • The impact of end-user involvement in generative AI development

[18:46] What are the implications of unregulated use of people data by generative AI?

  • The importance of partnering with established global organizations
  • Leveraging emerging technologies with a goal and strategy

[28:07] How does an organization prepare for generative AI adoption?

  • Aligning expectations and involvement between leadership and employees
  • Technology adoption beyond workforce reduction

[38:46] Closing

  • Thanks for listening!


Quick Quote

“For certain work, generative AI can [augment and automate tasks] while bringing the human skills to the top, but only if it’s done strategically and not just because it can be done.”

Contact:
Kasara's LinkedIn
David's LinkedIn
Dwight's LinkedIn
Podcast Manager: Karissa Harris
Email us!

Production by Affogato Media

To schedule a meeting with us: https://salary.com/hrdlconsulting

For more HR Data Labs®, enjoy the HR Data Labs Brown Bag Lunch Hours every Friday at 2:00PM-2:30PM EST. Check it out here: https://hrdatalabs.com/brown-bag-lunch/



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Announcer (00:01):
The world of business is more complex than ever. The world of human resources and compensation is also getting more complex. Welcome to the HR Data Labs podcast, your direct source for the latest trends from experts inside and outside the world of human resources. Listen as we explore the impact that compensation strategy, data, and people analytics can have on your organization. This podcast

(00:24):
is sponsored by salary.com, your source for data, technology, and consulting for compensation and beyond. Now here are your hosts, David Turetsky and Dwight Brown.

David Turetsky (00:39):
Hello and welcome to the HR Data Labs podcast. I'm your host, David Turetsky, alongside co-host, best friend, genius, partner in crime, lots of different adjectives, my friend from Salary.com, Dwight Brown.

Dwight Brown (00:52):
I was looking around to see who you were talking about there.

David Turetsky (00:56):
When I lost your best friend? Wait, what?

Dwight Brown (00:58):
Yeah. Well, when I heard genius, I'm like, yeah, gotta be talking about the guy behind me.

David Turetsky (01:02):
Should I say brilliant? Hmm, we're gonna have to go back to the transcript on that one.

Dwight Brown (01:06):
Yeah, exactly. We may have to re-record this.

David Turetsky (01:09):
But we do have a genius with us, because in the world of ADP, there are very smart people. This person stands head and shoulders above most of those other folks that work at ADP. And I know this because other people have told me this too, and I've had a chance to hear this person speak at conferences and on webinars, and this person exudes brilliance.

(01:33):
I'm talking about none other than Kasara Weinrich. Kasara, how are you?

Kasara Weinrich (01:38):
I'm wonderful, especially, I mean, I've peaked. It's, it's early in the day to have peaked. But thank you.

David Turetsky (01:46):
Well, come on. I mean, you know, when, when you look at the body of work, from what I've seen, from what you've done at ADP, I think I've downplayed it a little bit.

Kasara Weinrich (01:57):
No, no way. I think, I think ADP has given me a beautiful opportunity to mix ambition and determination with a lot of right place, right time. There's a lot of open doors here, and I've had a lot of opportunities to walk through the right one. So I'm lucky.

David Turetsky (02:12):
Well, why don't you tell us about those doors? I mean, tell us about what you're doing at ADP, and tell us a little bit about you.

Kasara Weinrich (02:17):
Right now, I have a very awesome opportunity to be working under worldwide commercial operations, designing and developing AI- and generative-AI-driven technologies for our entire sales force. And so, outside of designing and creating that and working, of course, cross-functionally with so many brilliant people,

(02:37):
I am also able to take all of these passions and, and speak to the market about them.

David Turetsky (02:46):
That's so cool. You're, you're literally on the leading-slash-bleeding edge within a technology company that prides itself on being on the leading edge.

Kasara Weinrich (02:56):
It's true.

David Turetsky (02:56):
And by the way, I'm not a shareholder anymore, so I'm allowed to talk about it. Okay, but one thing we ask all of our guests, Kasara: what's one fun thing that no one knows about you?

Kasara Weinrich (03:11):
I guess I would say, and this ties back directly to the fact that I am able to speak publicly now. So early on, I was born with very limited hearing, and I started speaking incredibly young, which, much to my, you know, parents' dismay, I haven't stopped since. But for the

(03:32):
first 13 years of my life, I had a lisp that was like, like, like this. And my name is Kasara; my name has an S in it. So that was, you know, almost a cruel turn of fate. So for the first 13 years of my life, I had a very difficult lisp and speech impediment, and now I get up on stages and I speak, and I hear people talk about how my voice is soothing, and it

(03:55):
just makes me giggle to think of, of how I started.

David Turetsky (03:58):
Well, I'm very happy to hear that, because I think a lot of us have come from places where we've overcome those kinds of issues. I wouldn't necessarily call them issues; I'd call them challenges, because they are, and a lot of us overcome them. In fact, lately, Dwight and I have been talking to

(04:19):
people that have talked about us overcoming some of the things from our past too, and they've gotten really emotional.

Dwight Brown (04:23):
Yeah, we don't want to go too far down that road, yeah, but

David Turetsky (04:26):
We're gonna need tissues again! But one of the things that we, we love talking about here on the program is that everybody's human, and we all have our foibles, and we all overcome, and it's wonderful to know that such an amazing person like yourself, with what you've accomplished, has been able to do that with that as one of your challenges.

(04:49):
And again, you're one of their best public speakers, and you've definitely done a great job of overcoming something like that. So kudos to you. It's wonderful.

Kasara Weinrich (04:58):
Thank you so much. And, yeah, no, no one really knows that about me, with the exception of the people who knew me at 13 and earlier.

David Turetsky (05:05):
So now, now the people who listen, hopefully all the people in the world, will hear

Kasara Weinrich (05:12):
everyone,

David Turetsky (05:13):
and they'll appreciate it even more now that, when they see you speak, it's, it's not there.

Kasara Weinrich (05:19):
Thank you.

David Turetsky (05:20):
So the topic for today is going to be one that has graced this podcast quite often, which is Gen AI. But we're going to be talking about a really fascinating look at Gen AI through the lens of anthropology, looking at how it affects the world of work.

(05:47):
So the first question is: what's wrong with the world of work today? Why does Gen AI come up? What, what's wrong with the world of work?

Kasara Weinrich (05:55):
I think Gen AI is, it's been an opportunity for organizations to realize, you know, it's so reliant upon data. It's incredibly reliant upon having an organization with, you know, learning agility at the core, having an organization that has been ready, that is historically

(06:17):
adaptable and curious. And so I think with the onset of this emerging technology, as well as the combination of organizational culture even, and I should probably say, I don't know that this was mentioned, but the reason that we're talking about this from an anthropological perspective is

(06:38):
that's the program that I'm currently studying within for my PhD. And so I am studying human-centered AI: not just the influence and impact of AI and generative AI from an organizational culture perspective, but also human beings' culture and their propensity to adapt. So

(07:02):
organizations are really facing, they've been kind of hit with a mirror in generative AI. And then I'll also say very quickly that due to the exponential growth in this space, there was a race to basically be first, in some ways. Many organizations wanted to be the quickest and the fastest to adopt and to

(07:23):
innovate. And, and so now we're seeing that, you know, I like to call those folks minimal-risk adopters, the ones who are like, "We stood up a closed instance of this GPT platform, so we did it, we innovated." And they didn't actually have a good understanding of the broader benefits and applications, and

(07:44):
so we're seeing a lot of these projects failing, and it's because of that. But I think the great news is that while that has kind of shown some of the things that might be wrong with the world of work, there is a massive opportunity to make a lot of things go incredibly well and to turn this into something that's right.

David Turetsky (08:03):
So let me ask you a question about that.

Kasara Weinrich (08:04):
Yeah

David Turetsky (08:05):
And we'll get to the "right" part. But my question before that is: what the heck's wrong? What are we solving for? And has the push to those early adopters been to just be an adopter, or was there a goal that they were chasing that's not just "save cost and eliminate people and eliminate positions"? What was wrong? And I guess this is where I was going

(08:28):
with the original question: what's wrong with the world at work, or world of work, that we needed to bring the robots in?

Kasara Weinrich (08:37):
Yeah. So I guess if we look at it from that perspective, it all comes down to human beings and the work that they enjoy doing versus the work that they've been made to do for an incredibly long time. And so we're seeing that gone are the days of the nine-to-five. You know, I was talking to somebody this week at a conference, and he said, "Would you believe that somebody asked

(08:58):
me when I was retiring?" He's like, "I cannot believe I'm at the point in my career where other people think that it's time for me, that they thought that's what's on my mind. But I don't want to retire! I love my work. I love the things that I am doing. I have no desire to retire." And so I think what we're seeing here is that in

(09:19):
certain industries, for certain, for certain work, generative AI can come in and take the things that can be augmented and automated away, while bringing the humanity and the human skills to the top. But again, only if it's done strategically, and not just done because it can be done.

Dwight Brown (09:41):
Is there a little bit of the bright, shiny
thingness to this whole thing?

Kasara Weinrich (09:47):
110%

David Turetsky (09:49):
Squirrel! Yeah, sorry,

Dwight Brown (09:52):
Story of my life.

Kasara Weinrich (09:53):
Yeah. And, and, I think, I think, here's the thing. So, you know, being an anthropologist, it makes me somewhat of a historian in, in many capacities, right? Not just studying current culture, but prior cultures, how we got here. And, and when I'm researching AI and generative AI, I'm also looking at the history of these

(10:14):
technologies. And at every point in AI's history, we've seen massive increases in capabilities, then massive increases in expectations. And if those things weren't met, we saw a steep decline, right? A complete disinterest, lack of funding, and it's gone into what's called an AI winter. And

(10:35):
we're at the point right now where we can absolutely prevent that from happening, if we do not allow the shiny-object syndrome to happen and we do implement this in the most, you know, adequate and strategic ways.

David Turetsky (10:52):
But I think there's also, though, and again, correct me if I'm wrong, because you said this before, a little bit, there's a laziness about some of the key challenges to AI, like data, and like past decisions, and like history, and also the crap up on the internet. They're all facts, right? Those things need

(11:15):
to get, if not assumptions put around them, corrected or adjusted for, because if we're looking to the AI, Gen AI, to look at all that stuff to be able to help us with these processes that you're talking about, they need to get a hell

(11:39):
of a lot better, because they suck. I mean, I'm just being honest here. I mean, a lot of the data we're talking about, it's not good enough, right?

Kasara Weinrich (11:44):
Yeah, and I think, I have, you know, the organization that I work for allows me to have an incredibly high standard for data, right? Not just data quality, but data governance, ethical use, privacy, security. So all of those things need to be top of mind for organizations that are implementing the technology, but also they need to be prepared to ask the right questions, you

(12:07):
know? So I put, I put organizations into one of three buckets right now. They've either decided to take minimal or no risk at all, and they're just, you know, they're risk averse, and they're just observing what's happening. They're saying, "Okay, if it gets to me, if a partner I already work with brings it to me, then I will think about it. But in the meantime, I'm just gonna let this play out and see what

(12:30):
happens." Then there's those folks we've already talked about. They took some risk, they stood up a closed instance of something, but that was the beginning and the end of their journey. And then there are these folks that my organization falls into, where, you know, we're calculated risk takers. So we know this is a path we want to go down. We know the broader

(12:50):
benefits and the, and the wide applications of these technologies. They see it as a horizontal, so it's not just "this is our emerging technology silo." Emerging technologies, AI, and generative AI need to be seen as a thread across everything that we're focused on. Where are the use cases? Where are the implications? And, too, who you've been

(13:13):
doesn't need to be who you'll always be. So if you have just been observing, aim to be more strategic. If you have stood up this closed instance and you thought that you were done, you're not; you can do better and be better. So I think if we see organizations striving to take this calculated risk and to find the use cases, prioritize the right ones, this could be exceptional and, like, a really cool point in our history.

Dwight Brown (13:43):
So when you look at the history of AI, you talk about the AI winters that have happened along the way. And, yeah, I think a lot of times when people think of AI, they think it's a brand-new kind of thing, when in actuality, AI has been going on since the 60s, basically. But is the fact that

(14:07):
this is the first time that it's really been put in the hands of end users, is that having a different impact? And let me give you a little context of why I ask that. You know, I think, for a while, the buzzword was machine learning. Everything was machine learning, machine learning, blah, blah, blah, and then it was all about data science. And, you know,

(14:32):
everybody wanted to be a data scientist, and everything was data science. That was the buzzword. Now we're at AI, of which machine learning and data science are actually all a component. But back when it was data science, not everybody could do it. You had to have specialized learning, you had to have specialized knowledge, you had

(14:52):
to be able to program and code and all that kind of stuff. Whereas now this is in the hands of end users, and they're able to do something impactful with it. How has that really influenced where this sits at this moment in history, and that impact on the world of work, and what's wrong with the world of work?

Kasara Weinrich (15:12):
Ah, Dwight, that's, that's such a powerful question and perspective. And, and I think, depending on your organization, it's seen in one of two ways. So for organizations that are incredibly risk averse, having this type of tool in the hands of the end user, or their

(15:34):
employees and associates, can be correlated to a very high risk, right? So while they might be more productive, they might, you know, be able to use this tool to accomplish more, there's still a lot of risk involved with these emerging technologies. And so it's a difficult balance of: how do we

(15:54):
educate? How do we make sure that they know responsible and ethical use? How do we make sure that they understand? I mean, I saw an image, and, and I haven't been able to find it again, although I could probably ask Gen AI to just recreate it for me! But I saw an image where it's a bunch of people walking,

(16:14):
and they're on their cell phones, and they're, you know, doing this thing. And then there's two robots, like, kind of knelt by a bench with, you know, a coloring sheet, and they're, and they're coloring. So, so what that image said to me is: are we giving the technology too many of the tasks that actually make us feel good? We're allowing it to create now; we're allowing it to write

(16:36):
and to create images and to create videos, right? Are we actually giving away the things that do bring us joy and that make us feel a sense of ownership over content and creativity? So, too, how do we ensure that balance? How do we ensure that, that we don't give too much away, and, and that we continue doing the things that make us feel good, that make us

(17:00):
feel human? And then I think on the other side of that, though, very briefly: the technology is exceptional, and, and if it's done in the right way... You know, as, as a woman, as a mother, very early on, one of the first things I did was I said, "Oh my gosh, I can, I can have this thing make my meal plan for my

(17:22):
kids for the week, and get the shopping list, the recipes. I don't have to spend three hours a week doing that now?" You know, there are so many use cases, both personally and professionally, that can elevate us, that can relieve some of these things we don't actually enjoy, you know? There's so much here.

David Turetsky (17:41):
And the applicability to the world of work there is that people hate administration. People hate rules. They hate all the BS, and by the way, a lot of that falls into HR too, obviously. But that's where I come back to the codification of all of HR, as well as the, the

(18:04):
data from HR. And let's just take the codification of the rules and the policies and things like that. There will be instances where we may be able to give the AI overlords the ability to make these, if not recommendations, decisions: "Yes, we can make that promotion," or "Yes, we can do

(18:28):
that increase." That might be fine, because we can set up the rules and all that other stuff.

Announcer (18:35):
Like what you hear so far? Make sure you never miss a show by clicking subscribe. This podcast is made possible by Salary.com. Now back to the show.

David Turetsky (18:44):
The question, I think, goes back to one of your earlier points about closed systems versus open systems. None of this data can ever get out, right? So it's got to be a closed system. And I guess where I'm going with this is: there's a lot of nuances to the technologies we're not talking about, which makes it very complicated. But in the world of

(19:05):
HR, those complications come with other problems, like regulation, laws, and people. And so how these things get adopted is going to have to be really careful, because, I don't know if you're familiar with the FCRA, the Fair Credit Reporting Act, where, when you start creating algorithms on people, then you

(19:28):
actually have to report to the federal government what those activities are and what those algorithms are. And I'm not Experian, and I don't want to be, but some of those things might actually fall under that. So it starts getting much more complicated when we start bringing people into the equation. I guess where I was going with this: how does that fit in a world where our regulations really haven't caught up yet with where we are today in technology, much less where we're going?

Kasara Weinrich (19:53):
I think, I think there's a couple things. And so I think this answer is twofold. First, you're not going to be able to think about the regulatory environment if you haven't even done the foundational step of knowing your organizational readiness for all of this, right? And so we can dive in, maybe in a little bit, to some of those

(20:14):
readiness indicators. I know we've talked about data and some of this, but there's a lot there. But on the other hand, the responsibility is huge at this point. And it's not just the responsibility of the organization, the C-suite at an organization; it's also very much their responsibility to

(20:35):
partner with the appropriate people, to partner with organizations that have this as a central theme of their development. And it feels a lot less risky if you know that you're partnered with folks that have been either doing this for a long time, or...

David Turetsky (20:53):
I think I know where you're going with that.

Kasara Weinrich (20:55):
Yeah, maybe 75 years or so, yeah. Or really organizations that operate even globally. So if you're, if you are, if you are compliant in the most restricted areas of the world, then you're going to be compliant everywhere else. And so I think partnerships are key here. I think, eventually,

(21:18):
technology, and, and especially in the United States, technology has always outpaced the regulatory environment. But, but I think it's also going to come back to, whether it's corporate social responsibility, or if you call it ESG, whichever we want to focus on here, it's almost like prioritizing your stakeholders and your shareholders and your consumers

(21:40):
needs to be number one on your ESG checklist. And if you're doing that, then all of the things that compliance and regulations would try to achieve, you'll have already been doing.

David Turetsky (21:52):
I'm thinking, though, that there have sometimes been gaps in that. And I'll go back to trading systems, for example, which have been on the leading edge of these things, right? They've been on the leading edge of using data to look for signals, to look for opportunities. And sometimes they go bad and they go rogue. And I'm not gonna, I don't want to talk about some of the areas

(22:13):
where those things have gotten away from certain companies, and it's brought the company down. But to your point, when you're designing for the eventuality, or you're designing for the goal of being risk mitigating, and you're taking, to your point, that's a great point, you're

(22:35):
taking the right partners, and you're taking the right technologies, and you're taking the right goal here, you frame your goal appropriately from the beginning, which is: what are we trying to do? Where are we trying to go? Not just cost savings. So, you know, to me, you have to, we have to create the

(22:55):
goal with the end in mind, and then use those technologies to drive that, just like we always have. But now this is a new, new foundational tool for all that.

Kasara Weinrich (23:06):
Yeah, and I think this, you know, maybe it's, I do, I've often been told that I view the world through rose-colored glasses, and I've certainly been okay with that. But I do see this as a, maybe, maybe not once-in-a-lifetime anymore, because exponential growth means that we're going to see things at a faster pace. But this is a very unique

(23:28):
opportunity for HR, and it's unique in so many ways. But the way that I see it is, HR has been on this journey of becoming more and more strategic, more and more relied upon by the business to provide, whether it's people data or strategic guidance, advice. Long gone are the days where HR is purely seen

(23:49):
as an administrative function. And so this next phase, this technological revolution, can mean that now the CHRO, or HR leadership, is partnering with the CIO, the CTO, the CISO. If we're getting together this cross-functional group, and we're reimagining our work, we're reimagining our workforce, we're reimagining the way that

(24:13):
work is done, and, and using this as an opportunity to, you know, completely deconstruct, maybe, the box, and, and look at things and say: do we need to keep doing it this way? Why are we still doing it this way? And, and I think for a long time, digital transformation was simply taking a manual process

(24:34):
and finding a technology that could just do that same manual process on a computer. That's not transformation; that's, that's copy-paste, right? You're just taking it down. And so, so now there's the opportunity to truly transform, right? And, and to take a deep, hard look at what can be augmented and automated,

(24:54):
and what, what human skills can now rise to the top, and, and really help your people to be engaged and empowered in their work. This, it can be exceptional!

David Turetsky (25:06):
But that means preparing your people and the organization to take on that tack, because you might have your leadership team agreeing to it, but then, are you going to need to get rid of all your staff and hire a totally new one? First of all, it's never gonna happen. But we've been talking a lot on this program about reskilling and upskilling,

(25:30):
so that's the opportunity, isn't it? To take what you have, change them by degrees, and be able to give them those skills to be able to get that done, right?

Kasara Weinrich (25:41):
Yes, and, and also not just reskilling and upskilling just to do it. I mean, I attended a seminar with a Harvard, a Harvard professor, and she was discussing this very long study that they had done on reskilling and upskilling and the outcomes, and this was prior to the release of generative AI.

(26:01):
But she was discussing, you know, who was upskilled and why. And at the end, I asked her, I said: did the organization sit down prior to the reskilling and upskilling initiatives and identify exactly which skills they would need, why they were upskilling and reskilling in certain areas? And she said it wasn't at all strategic. Many of the

(26:22):
individuals in the study were simply looking at "low skilled" on their Excel file and saying, "Okay, we need to make them a little bit more skilled." And maybe it was soft skills or computer skills, tech skills, but they just kind of threw darts at the wall to determine what those skills should be. And, and so I think, yes, reskilling, upskilling, 100%, but along with

(26:46):
your strategy! And the thing that I was so lucky to be able to speak on, prior to the release of all of this technology, was data literacy. And, and it's one of those skills that often is reserved for our data team or for analysts, or now, in some

(27:06):
cases, leadership, but it's not often brought all the way down through every layer of the organization. And, and that's one of those things that will be critical moving forward: having a data-literate organization, evaluating your employees, determining their ability to read data, work with data, right, make decisions with it.

(27:32):
And that's just one of many indicators of org readiness for this next, you know, phase of our history.

David Turetsky (27:42):
Hey, are you listening to this and thinking to yourself, "Man, I wish I could talk to David about this"? Well, you're in luck. We have a special offer for listeners of the HR Data Labs podcast: a free half-hour call with me about any of the topics we cover on the podcast, or whatever is on your mind. Go to salary.com/hrdlconsulting to

(28:04):
schedule your free 30-minute call today.
So that kind of sets us up for: how does the organization get ready? Does it mean that we now need to create the right job architecture and job descriptions, with skills that talk to data literacy? Does it mean we need to start assessing

(28:25):
people? Where do we go? What do we do? How do we take the next steps to get there?

Kasara Weinrich (28:30):
So I think there are plenty of different frameworks around readiness, but from my perspective, what I've seen over and over and over again in the market and in conversations has been: yes, data literacy is something that can be measured and can be trained. So that's a beautiful

(28:51):
thing, and it's a really great place to start. And I think a lot of these things do need to run in parallel. So, you know, having employees and leaders that understand data, that have access to the appropriate data, and that know how to work with it, that is one function. But then even doing an

(29:11):
analysis of your tech infrastructure: do you have the foundation required to support these tools and systems? Do you have a data taxonomy in place? Do you know where all of your data lives? Is it in a bunch of different systems that don't speak to each other? You know, do a, do a deep analysis and, and get some insights as to what your tech infrastructure looks like, organization-wide. And

(29:34):
then, I would also say, kind of in three parts, but, but all kind of pointing back to the same outcome, would be cultural adaptability. You know, I wouldn't be the resident anthropologist in this conversation if I didn't point back to this. So, their ability to adapt to and to really embrace change: is your culture ready? Is it supported?

(29:57):
And so again, you know, something as simple as: do you operate in Silicon Valley, where they're used to innovation and risk and, and, you know, the, the cycle of try, iterate, fail, right? Or are you operating globally, where you might have a team in, in Asia that is prioritizing harmony and, and hierarchy in

(30:21):
some instances, and the implementation of this technology could be far too disruptive for them to have the, the desire to adopt? Do you have those insights? It's another thing that you can measure and look at and, and assess. And then, finally, would be leadership support and employee engagement.

(30:42):
So leaders are the ones that are going to drive the perceptions of these emerging technologies. Is, is the perception fear-based, that it is going to steal your job? Or is the perception excitement, and, and the desire to try, and

(31:02):
curiosity and learning agility, right? Your leaders are the ones that are driving that. So do they have a clear understanding of your outcomes and your plan? And then, in turn, are your employees ready for it? Right? Beyond the data literacy, you know, they need to be bought in to these initiatives.

David Turetsky (31:21):
And a lot of times, they're the last group to not only be bought into it, but also to really just fundamentally understand it exists. And we've even seen where projects have been rolled out, you know, 100% done, communicated, you know, to all the leadership team, and then they start talking to their employees, and the employees go,

(31:42):
"Wait a minute. What? Where'd this come from?" So, yeah. So your point is: don't make it an afterthought to talk to employees. Bring them in early.

Kasara Weinrich (31:50):
Yeah. Get, get a cross-functional group of stakeholders.

Dwight Brown (31:54):
Or sometimes it's the other way around! The
employees end up knowing moreabout the technology and the and
then what the leaders do. Andyou, you know, you kind of end
up with this backwards cycle ofhow to roll it out, implement
it. You know, leaders are goingone way. The employees are
looking at going, what are youguys thinking? This is not even,

(32:16):
you have no clue what this technology is and what it does! And it probably depends on the organization which of those scenarios plays out.

Kasara Weinrich (32:25):
Yeah, and back to your earlier, your very first question about what's wrong with the world of work: there's often a deep misalignment between senior leadership's view of what the problem is versus the employees' knowledge of what the problem is. And so if you're sourcing use cases, and you go to your employees, and you first educate them, if they don't

(32:48):
already know, on what this tech is and what it does and how it works, and then say, okay, in your function, how could this technology be used to improve the world of work for you? They are going to have the very best use cases and a prioritization of what can completely change their function versus what might

(33:08):
have, you know, minuscule impacts and effects. And so, yes, early buy-in and going to them first could be immensely impactful.

David Turetsky (33:16):
As long as the culture is about trust and not about, hey, our goal is to cut costs, and we're going to talk to you about AI doing your job.

Kasara Weinrich (33:17):
Yeah,

Dwight Brown (33:17):
Will you give us use cases? Nah, there are no use cases.

David Turetsky (33:32):
Yeah, tell us what you do. You want to document everything you do. Give us a task-by-task list of everything.

Kasara Weinrich (33:39):
Everything here is manual. I don't

David Turetsky (33:41):
You need to be involved in every decision.

Dwight Brown (33:46):
Nothing to automate to be seen here. Just keep right on moving.

David Turetsky (33:49):
Keep on going.
Actually, I think that guy down there, he's in a job, but we can definitely outsource to the robots! But, but I think self-preservation is going to be deep in this, because people watch too many freaking movies where they go, yeah, I mean, look at the Cylons. Look what they did! Sorry, Battlestar Galactica. No. But seriously, they're going to be like, you know, the robots

(34:11):
are going to just keep takingover. We're if we bring them in,
unless they're educated. Look,we all. I'm sorry, I shouldn't
say we all. I lived in a worldwhere there weren't computers at
work. Then there were, andpeople were fearful for their
jobs. Now there were some peoplewho used to be receptionists,
they used to be secretaries,they used to be administrative

(34:33):
assistants, and they've had to move on and do other things, because everybody does their calendaring themselves and all that other stuff that was extremely valuable at that point. And I think we have to go through that transition, don't we? I mean, as an anthropologist, don't you see that there's going to be this kind of maturity and migration

(34:54):
from certain tasks and certain roles and certain things to others?

Kasara Weinrich (35:00):
Oh, of course. I think, I actually believe I said this week, that I'm not convinced that there's any current skill set that can't also be applied in some capacity. So as an example, if an organization is looking either to outsource to tech or to make more efficient through tech,

(35:20):
and so therefore, maybe cutting down on head count, a data analyst role,

David Turetsky (35:24):
right

Kasara Weinrich (35:25):
the skills and the functions of a data analyst, they're not just specific to doing data analysis. You can then move that person into data governance. You can move that person into being a key component or a consultant for partner and vendor relationships when it comes to these technologies, if they have a deep understanding of the data.

(35:46):
Oh, and by the way, data is the goal, right? That's the thing that's fueling all of this. So, yes, I think there's going to be a lot of fear, potentially, but it is really up to the organization to mitigate that and then to be very transparent in their plans and, you know, take on

(36:09):
the learning and development. I think, you know, ADP Research Institute recently released some of the sentiment, right? And 80% of people know that their jobs can be influenced by AI in some way. And, you know, the jury's still out on whether it's primarily positive or negative, and it's going to evolve, it's

(36:32):
going to change, but it reallyis up to the organization to be
transparent.

David Turetsky (36:37):
I think what we should do, Kasara, is get you back on the program two years from now?

Kasara Weinrich (36:42):
Hmm

David Turetsky (36:44):
to say, hmm. Well, I mean, even next year, yeah. And we've actually been talking, Dwight and I have been talking about, you know, how things change and how quickly they do, and come back and see, well, did we get it right, or were we wrong? Was there a backlash, or did it

(37:04):
surprise all of us, and it got adopted much, much more quickly than we actually thought it would? Now, some of the AI is actually going to be built into solutions like your company is pioneering, but there are others that are using it in the back and not bringing it forth, not, you know, bringing it to each of the consumers, but keeping it

(37:25):
in the background, doing a lot of work, and then having the productivity of the other people around it, or the technologies around it, be better.

Kasara Weinrich (37:32):
And I think, you know, when ATMs were released, there was this massive panic and uproar: I will never be able to go into a bank and talk to a person again; I'm only going to be able to work with this robot, right? And now we know that that is not true. And while we could probably look, especially your organization, we could probably look at the

(37:53):
over-under on how that specific role, like, whether a teller has changed, you know, in quantity, right? But you can still walk into a bank and see a person.

David Turetsky (38:03):
If you want to!

Kasara Weinrich (38:05):
If you want to, right?

David Turetsky (38:06):
No, seriously, because you can deposit checks
and, you know, cash.

Dwight Brown (38:10):
Yeah, you got a choice!

Kasara Weinrich (38:11):
Yeah, and that's exactly it. Like, at every point in human history, human beings with tools have eventually replaced human beings without, right? Like, the rock was replaced by the hammer, the satchel was replaced by the wheelbarrow. This is just, it's another tool, and those who are willing to try and do their work with it are going to,

(38:33):
eventually, you know, still be here.

David Turetsky (38:37):
Beautifully said.
And I think we'll leave it at that, because, as I said, I think what we should do next time we have you on is to see: where did this go, and how fast was it adopted? So

Kasara Weinrich (38:56):
I would love that.

David Turetsky (38:57):
Kasara, thank you. We appreciate it.

Kasara Weinrich (38:59):
Thank you both so much. Thank you for the time and for the conversation. This was wonderful.

David Turetsky (39:04):
My pleasure.
Dwight, thank you.

Dwight Brown (39:06):
And thank you! Thanks for being with us today, Kasara.

David Turetsky (39:09):
And thank you all for listening. Take care and
stay safe.

Announcer (39:13):
That was the HR Data Labs podcast. If you liked the episode, please subscribe. And if you know anyone that might like to hear it, please send it their way. Thank you for joining us this week, and stay tuned for our next episode. Stay safe.