Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Sam Gerdt (00:06):
Welcome everybody to episode two of Roadwork Ahead, a podcast that explores the unmapped future of business and technology. My name is Sam Gerdt and I am your host. Today, I give you an interview with Mitch Shue, the executive director of AI RISE at Clemson University and a professor of practice in Clemson's School of Computing.
(00:27):
Mitch came to Clemson after a long career as a successful technologist, both at the startup level with companies like Webs.com and HelloWallet, and at the enterprise level with Morningstar, where he served for several years as chief technology officer. Our discussion focuses mainly on the present challenges that companies of different sizes face, from AI adoption to data
(00:51):
privacy and security, but we also talked about the next generation of technologists being trained now and some of the professional and ethical challenges confronting them as they enter the workforce. I found Mitch to be a down-to-earth thinker when faced with an unknown technology landscape, and his emphasis on balancing progress in all areas of technology is excellent advice
(01:14):
for us all. I'm grateful for Mitch's time and his expertise, and I hope you benefit from it as much as I did.
Mitch, thank you for talking to me. You come from a really interesting background in that you've got a really, really impressive resume on the startup
(01:35):
side, and then you've got an equally impressive resume on the enterprise side, and now you're teaching. So you have your foot in three very different, very unique arenas, and I think you have some really interesting perspective
(01:56):
when it comes to topics like the future of business, with disruptors like AI, data and privacy becoming more and more of an issue, cybersecurity becoming more and more of an issue. So I want to just pick your brain on a few topics, and I want to hear more from the different perspectives about how we
(02:19):
should be thinking about some of these topics, maybe approaching them as business leaders, and then maybe just get some ideas about how you're thinking about them on a more personal level, what you see and what some of your gut feelings are. So I think my first, most obvious question is: you come
(02:39):
from the CTO position at Morningstar. You had something like 1,500 technologists working under you. That's right. You're not there anymore. But if you were, or if you had a new opportunity and you were dropped into a similar position, what would be some of your
(03:01):
first directives to make sure that that organization was getting up to speed with some of the newer issues that we're talking about: AI, data and privacy, and cybersecurity?
Mitch Shue (03:14):
Yeah, that's a good question. Actually, when I was serving as the CTO, the advances in technology just never stopped, and so one of the challenges we were faced with was cloud computing, and so what we did was we went through our entire workforce to make sure that
(03:37):
everybody understood and could have an intelligent conversation about the technology, and then we would make decisions based on what we learned about the technology and how to improve the business. And I think we're still in that same climate with the emergence of AI, which has certainly been in development for decades and then, all
(04:02):
of a sudden, burst on the scene. The tendency for a lot of technologists is to just find problems and try to apply new technology to solve them. Sometimes that works, sometimes it doesn't, and so if I was
(04:25):
still in a role like that, I would again encourage the development of conversational knowledge about the technology and then, in a rational way, figure out if there's a way to apply it in order to improve the business.
Sam Gerdt (04:47):
Now you're coming from Morningstar, which is obviously going to be a more conservative setting for a technologist. Is that the kind of mindset that you're applying to this, a more conservative mindset? Is there a more aggressive mindset where you can afford to take risks, or is this something that we should automatically skew conservative with?
Mitch Shue (05:09):
I think that answer was from my role at Morningstar, which is a decidedly well-established company, and you obviously have to have a different approach than you would if you were a startup. Actually, most of my career has been growing venture-backed
(05:33):
startup companies into sustainable, successful businesses, and the approach has some common elements. But clearly, as a startup company, one of the challenges you have is limited runway, and you want to make the most of your resources.
(05:55):
It's a race against insolvency. So if there are advantages to be had by employing technologies like AI in a way that's aggressive and that can sort of increase your mass as a startup, I would certainly look at those.
(06:15):
So it's kind of a different approach depending on whether you're a well-established business or you're a startup that is racing against insolvency.
Sam Gerdt (06:30):
When you moved from that startup culture to your position at Morningstar, did you find that you had some catching up to do in terms of shifting mindset?
Mitch Shue (06:40):
The interesting thing is this sense of urgency. The sense of urgency is quite different. When you're at a startup company, you wear a lot of hats, and that sense of urgency is very, very real. When you're at a more established company, it's not
(07:01):
like Morningstar is going to run out of money in the next few months or anything like that. So that sense of urgency is somewhat different, and you have to instill a sense of urgency in your culture, but it has to be
(07:21):
authentic because, again, clearly Morningstar is not going to run out of money anytime soon, but you want the teams involved in driving the business to have this sense of urgency. It's tough, it really is, but if you want to be a
(07:44):
well-established, long-standing company that is relevant for generations, like a Morningstar, you do have to sort of think about what is going to keep you relevant and get people excited.
Sam Gerdt (08:00):
Do you feel like there's going to be a lot of disruption, then, from these startups who are moving fast and have that sense of urgency? Is there a hesitancy at the enterprise level that's going to come back and bite them? Is there potential for that?
Mitch Shue (08:15):
You know, I think one of the challenges is that this has proven out in history. Sometimes the most innovative companies aren't the ones that win, right, because no one knows about them or they're too small. There's too much risk associated with them.
(08:35):
So sometimes, as a startup company, that's what you're faced with, right? In the financial services space, Morningstar is a trusted brand. There are lots of trusted brands in all industries that we look to, right? We look to see what they're doing and then we have
(09:02):
this sort of inherent trust in them. So I think for innovative startups, one of the challenges they have is to rise above the noise of every other quote-unquote innovative startup, right, and that's why we always see, we're
(09:23):
seeing this now with AI. Everybody doing anything with AI is going to get some sort of attention, right? It's always happened. Just every few years there's something that everybody latches on to, and it's tough certainly for the layperson to discern
(09:48):
what's going on. It's tough for the investor to discern what's going on. So it just needs to shake out. But there are lots of companies who are really trying to take advantage of just the buzziness of whatever is relevant today.
Sam Gerdt (10:07):
There's definitely that hype that we're almost fighting against, because for people who are inside certain industries, especially people who are well versed in what's going on with AI, I think it's really easy to start looking at that hype and almost wishing it weren't there, because there is this disconnect that eventually is going to come back to bite
(10:31):
everyone, because the general public is going to figure it out. We're not dumb people. We understand when you're overpromising something that's just not going to happen. But one of the things that I think is really interesting from your answer just then is you mentioned the value of brand, and
(10:52):
it sounds like what you're saying is one of the biggest hurdles for a successful startup is they have to overcome the fact that they have no established brand, and so, for that startup who has such limited runway, they have to balance innovation with establishing some kind of name
(11:14):
for themselves.
Mitch Shue (11:16):
That's absolutely right. And then they have to land a client that has a well-established or recognized brand, because then people can relate to that and say, hey, these people believe in them and I believe in this brand. So, by extension, you must be doing something really good, and
(11:39):
I've seen that several times in my career. I'm like, there's no way this startup is going to overcome all this brand awareness in their industry, and they have. They've managed to do that. They've managed to do it with superior technology, superior
(12:00):
user experience, and just landing the right influential client to push other clients over the edge.
Sam Gerdt (12:13):
And I suppose ChatGPT and OpenAI are probably the poster child for this in this particular cycle, because you have a company that was co-founded by Elon Musk, who is something of a hype machine himself, and you have Microsoft coming and pouring something like $8 billion into the company,
(12:35):
and now you have what is probably one of the more well-established brands in that AI startup culture. The GPT logo is almost instantly recognizable by many people, and for that to happen so fast is pretty incredible.
Mitch Shue (12:53):
It is, and I think part of it is because the layperson has access to this technology and can see for themselves what it can do, and so they're able to just sort of spread the word to people who look to them and trust them,
(13:17):
whether it's their family or their friends or people in their startup, what have you. It's easy for everybody to sort of sell.
Sam Gerdt (13:30):
Yeah, and what's interesting too is the first business tool that's almost exclusively AI. We've had AI advancements in all of our business tools for quite some time, but the first business tool that's almost exclusively AI is ChatGPT. We see this flooding into workplaces and it's not coming
(13:51):
in from leadership. It's coming in from the average worker who's using it to augment their tasks. That's right. Is that something that you would be a fan of or averse to in a CTO role? And maybe speak to the startup versus the enterprise level there as well.
Mitch Shue (14:10):
Yeah, I think, as I mentioned earlier, there are continuous advances in technology, and you see sort of gut reactions to these kinds of advancements. Now that I'm in academia, I see sort of the gut reactions to
(14:32):
things like ChatGPT. Personally, I think that, especially as a technologist, you really need to learn about these things and understand them and make decisions about whether or not you want to invest in learning more about them, whether you want to apply
(14:57):
them to the business or what have you. I think that somebody who is an expert, who uses AI, is so much more powerful than somebody who knows nothing and then just
(15:19):
relies on AI. And so what I see a lot, and I see it in my students, is that the best students use tools like ChatGPT to make themselves better, make themselves better writers, make themselves better
(15:39):
readers, but they don't completely outsource their brain to ChatGPT. So they're actually using it as a tool, not just relinquishing all of their thinking to it, and so that, I think, is the key for technologists and students: to
(16:03):
use the technology in ways that are going to help improve you as an individual and then, by extension, improve the business.
Sam Gerdt (16:18):
It's really good to hear you say that. I 100% agree that these tools, especially as they exist now, are certainly not capable of replacing humans in work, but boy, are they capable of augmenting humans in work and
(16:41):
magnifying the value that a person is bringing to the table. And so what we've seen, even in our own workplace, is the people who take the time to learn the tool use the tool to do better work than they were doing, and people in general who
(17:03):
are already very good at their jobs tend to do much better with these tools in terms of producing value. There's almost an exponential magnification of the output. So a highly skilled writer, for example, can produce a much
(17:26):
greater percentage increase in efficiency and quality with these tools than a lower-skilled writer.
Mitch Shue (17:33):
I completely agree. And you know, I say this to people all the time, because everybody's worried about AI is gonna replace my job, or AI is gonna take over the workspace. And I tell people all the time, you know, AI is not gonna replace you, but somebody who is really good at their job, who uses AI,
(17:55):
is gonna replace you.
Sam Gerdt (17:58):
Yeah, that's right. And yeah, if you have a job where your tasks are repetitive and mindless, then you need to figure out a way to maybe break through to the next level, where the tasks are a little bit more thoughtful and a little bit more nuanced, because
(18:19):
that's where you're gonna have good success. You're gonna be able to differentiate yourself with AI tools augmenting what you're doing.
Mitch Shue (18:28):
Yeah, that's exactly what I think. It can definitely free you up for higher-value tasks, and your job as an individual is to create value. You know, it's to create value for your business, and if you're in a leadership position, your goal is to develop more leaders, and you can do that by employing any kind of tooling,
(18:53):
you know, in a way that advances that cause.
Sam Gerdt (19:01):
So this should really be a wake-up call, then, for those leaders, for those decision-makers, as they approach AI adoption in their organization: instead of focusing on the efficiencies of the tool or the bottom line, they should focus on the efficiencies of their people,
(19:23):
they should focus on upskilling, they should focus on education opportunities and building something of a culture that says we love AI, we love our people, we want to put the two together.
Mitch Shue (19:37):
Yeah, I think one of the challenges that all companies face today is talent, right, hiring the right talent and then, beyond that, retaining the talent. And the way you retain talent is you have a mission that they believe in, you make the work fun, you're investing in their
(19:59):
professional development, right, and you have their back, and that's how you retain your workforce. And certainly, with advances in AI, with tools like ChatGPT and other tools, encouraging that curiosity in your culture
(20:19):
is going to really help with retention and actually create new ideas for the business, and there's tons of pressure on business these days, right, regardless of what industry you're in. If you're a well-established company, you have to remain relevant for years to come.
(20:41):
If you're a startup, you know, you want to try new things, you want to find your place, and you can do it if you look at this new technology and learn about it and understand its strengths and weaknesses and how to apply it to your mission.
Sam Gerdt (21:05):
What's interesting, too, is it seems like it's the younger generation just coming up that's doing that, that's taking this seriously, learning about it. And so it's not going to be very surprising to me, I don't think, when companies discover their greatest talent for the next wave of business is coming from those entry-level positions,
(21:28):
those lower-level positions, and not from established positions. That's why I think culture is so important, because otherwise those employees are going to figure out really fast that they have value that their company's not tapping into, and they can go somewhere else, yeah, and be used more
(21:49):
efficiently.
Mitch Shue (21:51):
Yeah, yeah, you know, I think that the challenge for leaders is really to understand that they need to learn about this technology as well, because, especially if you're a technology leader, your goal is to make sure that your technology supports the business,
(22:14):
and you need to understand what is available out there. And if you don't understand it, don't react by saying you can't use it, don't do this, it's not allowed. That doesn't support the business.
(22:35):
You'd do better to say, let's figure out what's going on here, let's figure out how we can remove some friction from our development process, how we can be innovative, and maybe there are opportunities to employ AI. However, having said that, it's like a lot of technologies that
(22:58):
we've seen throughout the years. Some of the technology doesn't really solve a problem, and yet technologists are trying to solve problems with the technology just to use the technology.
Sam Gerdt (23:14):
Yeah, yeah, we have a lot of waste, I think, in that arena. So how are you approaching this, then, with students who are looking to get into the workforce? They're coming up in an age of LLMs and in an age of AI
(23:35):
augmenting business on an individual level. How are you telling them to approach this? Is there any caution, or is it just, no, go and learn as much as you can? What are the caveats that are probably more important for them to keep in their minds?
Mitch Shue (23:56):
Well, most of my students are computer science majors, and one of the things I talk to them about is how to have a long career in technology. And I sort of reflect upon my own career, I reflect upon my
(24:18):
undergraduate program compared to their undergraduate program, and I tell them about a world where there's no internet, there's no personal computer, there's no GPS, there's no smartphone, and they look at me like, you know, I'm a caveman, right. And I tell them that the way you have a long career in
(24:41):
technology is you learn about technologies constantly and you have this sense of curiosity that you are always trying to satisfy. It's just important for you to be curious about these things. So, as a student, I want you to be curious about these things,
(25:02):
but at the same time, as I mentioned earlier, I don't want you to outsource your brain to something that is not that great at everything, right? I mean, I want to know what you're thinking. I don't want to know what the AI is thinking, right? So when you're asked to write a paper,
(25:24):
I want your thoughts on this subject and this subject. I want your thoughts, and I want you to go through the thinking process rather than just trying to come up with the coolest prompt to generate something that's going to get you a grade,
(25:45):
right? So I tell my students, don't use ChatGPT or any tool to get a grade. Use the tool to learn, because at the end of the day, if you know something and are able to use AI, you're going to amplify your
(26:07):
skills and your ability to do really cool and helpful things for the world. And you know, it seems to resonate with students. Some of them have completely outsourced their assignment to ChatGPT, and I asked them why. You know, I'm like, why'd you do this? And, first of all, they're kind of startled.
(26:29):
They're like, well, how did you know? And I'm just like, you know, I'm not a complete idiot, I can tell. But tell me why, and a lot of it comes down to not being well organized, not
(26:49):
managing time, those kinds of things. And I talk to them like these are important skills to learn, right? If you can do everything with ChatGPT, why would I hire you? Because I could do your job. Why would I hire you when I have ChatGPT? Just like Google, I could just Google
(27:09):
everything. If you're going to Google everything, I can Google it. And they understand. And I guess I have the advantage in that I'm not a career educator. So I tell them that I interview people like them, and I know within the first minute. Did you cheat
(27:31):
your way through school? Did you outsource your brain to some AI? I can tell. And one of my students asked me, how can you tell? And I said, because I ask a question, and then I ask a follow-up question, and they're like, oh. I'm just like, you cannot cheat your way in your career.
(27:57):
You know, you're going to be discovered and it's going to be too late, right? So I tell them, don't cheat your future self. Right, it's kind of cliche, but I tell them, don't cheat your future self.
Sam Gerdt (28:11):
No, but it's so true, and even in my own experience, I use ChatGPT daily, probably in some cases hourly. It's a huge part of my workflows at this point. It's something that I'm training myself on, and I also use other AI tools to manage portions of my day. I happen to be one of those people that's extremely
(28:33):
disorganized. I have terrible time management, and I'm actually very thankful for AI, because there are some really good AI scheduling and time management tools out there, and I use one to manage my calendar. All it needs to know is what's on the list, roughly how long it takes, and when it's due, and it
(28:54):
handles the rest. It gives me a time-blocked calendar. I can stick with it. If I miss something, it adjusts for me. If somebody schedules a meeting in the middle of a block, I don't have to spend 30 minutes trying to figure out how do I recover from this. I can just go to the meeting and it takes care of itself.
(29:14):
My calendar adjusts for me. I think that's an example of what you're talking about, where AI can be an assistant, but you're not outsourcing your brain, right? Even in writing projects that I've done where I've used AI, ChatGPT, to help, it's not, will you please write this article for me? It's,
(29:34):
will you please assume this personality and let me bounce some ideas off of you? Help me to develop this outline in a more cohesive way. And, at the end of the day, you still have to be a very talented editor in order to produce good content.
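The scheduling assistant Sam describes works, at its core, like an automatic time-blocker: it takes a task list with rough durations and due dates, lays the tasks out on the calendar, and re-plans when something slips. A minimal sketch of that core idea, assuming hypothetical Task fields and a block_calendar helper (real tools also account for existing meetings, working hours, and rescheduling after misses):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Task:
    name: str
    duration: timedelta  # rough estimate of how long the task takes
    due: datetime        # when the task is due

def block_calendar(tasks, start, end_of_day):
    """Greedy earliest-due-first time blocking.

    Returns (task name, block start, block end) tuples. Toy logic only:
    it ignores existing meetings and simply stops when the day is full.
    """
    schedule = []
    cursor = start
    for task in sorted(tasks, key=lambda t: t.due):
        block_end = cursor + task.duration
        if block_end > end_of_day:
            break  # no room left today; a real tool would roll over to tomorrow
        schedule.append((task.name, cursor, block_end))
        cursor = block_end
    return schedule

if __name__ == "__main__":
    day_start = datetime(2023, 9, 1, 9, 0)
    day_end = datetime(2023, 9, 1, 17, 0)
    tasks = [
        Task("Draft blog post", timedelta(hours=2), datetime(2023, 9, 1, 16, 0)),
        Task("Review analytics", timedelta(minutes=45), datetime(2023, 9, 1, 12, 0)),
    ]
    for name, s, e in block_calendar(tasks, day_start, day_end):
        print(f"{s:%H:%M}-{e:%H:%M}  {name}")
```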
Mitch Shue (29:53):
Yeah, I definitely found that is true for me too. When I use ChatGPT, it's almost like a springboard for ideas. You know, it's like, oh yes, I didn't really think of that, but now that I've seen this response, it gets me to think about things that I wasn't thinking about, which I
(30:14):
think is great, because it sort of teases out parts of you that perhaps need to be teased out.
Sam Gerdt (30:22):
Absolutely, absolutely. And you know, we can't necessarily go through those exercises as quickly as it can. It doesn't mean we can't go through those exercises, but that's where the efficiency comes in. It's allowing you to do things that you already know how to do, if you're a good writer, if you're a good editor, but it's allowing you to do them much faster, much more
(30:45):
efficiently. Yep, and I think that's why it really does belong on everybody's desk. Quickly, I want to change subjects. Sure. One of the reasons why I see more enterprise-level organizations hesitating on the brink of artificial intelligence is because of how important and
(31:07):
critical cybersecurity and data privacy have become. I think there's this sense that giving data to an LLM is a risky idea. Would you agree with that?
Mitch Shue (31:23):
Yeah, I think people often don't think about security first, right? They're just sort of enamored with the response they get, and they're not really thinking about security. But we've seen that in a lot of cases, with advances in
(31:44):
technology, security sort of comes later, unfortunately. But now we're employing technologies where we can't afford that. We actually have to think about security first, as a first thought, not an afterthought. And I think that's definitely true with AI tools, especially
(32:08):
when it comes to code. You know, I've heard of teams that use ChatGPT, and I understand why they're using it this way, but they're not thinking about the fact that they are basically exposing their intellectual property in the form of source
(32:29):
code, and they're just feeding it into ChatGPT. You know, like, help me debug this or help me improve this piece of code, and it's like, this is your source code, you know, for some important product or service that you're offering, and you're just kind of throwing it out there. So I think engineering leaders especially,
(32:52):
while they have to encourage this curiosity, they need to have some guidance, you know, for their teams, like, don't give away our intellectual property in whatever form. Right, be smart about that. But you know, it's amazing. I've talked to several engineering leaders who are like,
(33:12):
oh, I haven't really thought about that, but yeah, we need to have some guidance there.
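One concrete form that guidance can take is a pre-flight check that refuses to send a prompt to an external chatbot when it looks like it contains secrets. A minimal, hypothetical sketch (the patterns and the check_before_sending helper are illustrative, not a complete data-loss-prevention policy):

```python
import re

# Hypothetical patterns for obvious secrets; a real guardrail would use a
# dedicated secrets scanner plus a policy on what source code may leave
# the company at all.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),                        # AWS access key ID format
    re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),  # PEM private keys
    re.compile(r"(?i)(api[_-]?key|password|secret)\s*[:=]\s*\S+"),
]

def check_before_sending(prompt: str) -> list:
    """Return the patterns that matched; only send the prompt if this is empty."""
    return [p.pattern for p in SECRET_PATTERNS if p.search(prompt)]

snippet = 'db_password = "hunter2"  # TODO remove before commit'
findings = check_before_sending(snippet)
if findings:
    print("Blocked: prompt appears to contain secrets:", findings)
else:
    print("OK to send")
```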
Sam Gerdt (33:20):
Now do you think that there's much incentive in the short term, then, to build out LLMs on private infrastructure? It could be cloud-based, it could be privately held, but the idea is that you're creating a gap between your data and your inputs and the rest of the world. I mean, some of these
(33:42):
open-source LLMs are already pretty powerful. I mean, Llama 2 is very powerful. So is that something that businesses, maybe more at the enterprise level, should be considering at this point?
Mitch Shue (33:54):
Yeah, probably, but not all enterprises, big or small, actually have those kinds of resources. They don't have the skill sets, they don't have the overall capacity to do those kinds of things. They can barely keep up with their own roadmaps and things
(34:17):
like that. So some of it is, it'll become easier. But again, there are ways to use ChatGPT safely without having to sort of invest in your own infrastructure or expertise. You just have to think a little bit about, what are my guard
(34:41):
rails?
Sam Gerdt (34:43):
Yeah.
Mitch Shue (34:43):
I would imagine that, you know, most companies and most industries will be able to use something like ChatGPT out of the box without having to do something more specialized or sort of isolate their own LLMs.
(35:05):
But you know.
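For teams that do have the capacity, the open-weight route Sam mentioned, models like Llama 2, can be run entirely on hardware you control, so prompts and code never leave your infrastructure. A rough sketch using the Hugging Face transformers library, assuming you have been granted access to the gated meta-llama/Llama-2-7b-chat-hf weights and have a GPU with enough memory:

```python
# Run an open-weight model locally so prompts and code stay in-house.
# Assumes you have accepted Meta's license for the gated Llama 2 weights
# on Hugging Face and have hardware capable of holding the model.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-7b-chat-hf",
    device_map="auto",  # place the model across available GPUs/CPU
)

prompt = "Summarize the risks of pasting proprietary source code into public chatbots."
result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```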
Sam Gerdt (35:07):
Yeah, it seems to me, and maybe you've got a different opinion on this, but the direction that we're headed, especially with the way OpenAI seems to be conducting business, is we have the original training set, the data set that was used to train GPT-4 or any of these other models, and it
(35:29):
seems like the inputs into the interfaces like ChatGPT, it seems like they intend to use those inputs for future training, and it also seems like they intend to continue scraping the internet for human-generated content, specifically for future training.
(35:51):
I think they recently came out with the web crawler, and I think they signaled this a little bit when they said you can disallow it, because they got into some trouble because it was scraping content behind paywalls, and the response to that was, well, you didn't tell us not to. That's the way I'm reading it.
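The opt-out Sam is describing works through robots.txt: OpenAI documented a crawler user-agent, GPTBot, that publishers can disallow. A small sketch using only the Python standard library to check whether a site's robots.txt currently permits GPTBot (example.com is a placeholder for any real site):

```python
# Check whether a site's robots.txt allows OpenAI's GPTBot crawler.
# Publishers opt out by adding "User-agent: GPTBot" / "Disallow: /" to robots.txt.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder site
rp.read()

page = "https://example.com/articles/some-story"
if rp.can_fetch("GPTBot", page):
    print("robots.txt does not block GPTBot for this URL")
else:
    print("GPTBot is disallowed for this URL")
```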
Mitch Shue (36:13):
Yeah, it's a standard privacy issue, this notion of opt-in versus opt-out. Opt-out is so easy for businesses. They can just say, as you just said, well, you didn't tell us we couldn't do it. It's kind of easy for them. But for us as individuals, a lot of us
(36:39):
don't even know that we can opt out. But if we went to an opt-in solution, it's a lot of friction for a lot of businesses. So privacy advocates always want opt-in, but businesses, they want it opt-out.
Sam Gerdt (37:01):
Being on both sides of it, I can see, for a business especially, opt-out is certainly easier. But it seems like we've got a bit of tension here, because regulation is tending to favor opt-in models and consumer privacy, whereas business models are still more opt-out and we
(37:27):
need your data, give us your data.
Mitch Shue (37:29):
But the other thing is, even opting in, anything involving you having to digest some terms of use, it's just nobody reads them.
Sam Gerdt (37:50):
Yeah. So with regards to that opting in, opting out, we need more data for training in order to improve these models. To me, it would seem unwise for a company to willingly hand over data through prompting to a public model like GPT. What I'm hopeful for is we get some SaaS and infrastructure-as-a-
(38:15):
service offerings that allow us to adopt larger-scale models for companies, where your data can be secured by you, with some assurances, some compliance, and that would enable, I think, a lot more exploration in this area.
Mitch Shue (38:34):
I think that's going to happen. Just look at all the public cloud infrastructure. It used to be so focused on infrastructure, whether you're AWS or Azure or GCP or Alibaba Cloud or whatever, but now
(38:55):
they've all gone to higher-value services so that you can explore and experiment with these kinds of things. I mean, they all have AI offerings of some sort. Yeah.
Sam Gerdt (39:07):
Well, there's going to be a huge demand for it. I mean, to get this off the ground on your own would be crazy expensive, right? I think, with ChatGPT, you know, OpenAI just said that ChatGPT by itself costs them $700,000 a day to run.
(39:27):
I mean, granted, they've got over a billion users, but yeah, these models are incredibly hungry when it comes to energy, and it'll be interesting to see how that scales. I know NVIDIA just announced a new, more efficient chip.
(39:47):
I'm sure more of that's coming down the pipeline, but it'll be interesting to see whether hardware can keep up.
Mitch Shue (39:54):
Right. Well, that's why we're at this tipping point now: hardware and compute infrastructure has reached the point where we can do these kinds of things, and combining that with this proliferation of data and these advances in AI techniques put us here. What seems like it happened all of a sudden
(40:14):
has been happening for decades, and I think it's going to keep going. And what's going to happen is that people are starting to think about this. Like you mentioned, there's also the energy costs. How do you measure the energy costs of this?
(40:35):
It's tough, but it's enormous.
Sam Gerdt (40:39):
This isn't necessarily the same topic, but I've listened to a lot of people who are in energy, and they believe that we're headed towards lower-cost energy in terms of generating power, but also greater efficiencies in running some of these vast, vast computing machines that we've
(41:00):
set up. What I think is interesting is you come from a background where you were very heavy in cloud computing, and it doesn't seem to me, maybe you can correct me on this, it doesn't seem to be that AI was really any kind of a focus in the migration to cloud computing for so many enterprises. I think it was generally cost.
(41:21):
Is that correct?
Mitch Shue (41:22):
It was cost and also removing friction, because traditional computing environments would require a developer to have to provision a server, whether internally or in the cloud, to do some exploration or experimentation.
(41:44):
There was a lot of friction there, so by going to the cloud we removed that friction. It's like, I need to try this out, I'll just spin up a server for half an hour, try this out, tear it down, no friction. But the other driving force is really whether running our own
(42:08):
infrastructure is a business differentiator, and for most companies running your own data center is not a business differentiator. I mean, no one buys Morningstar products and services because of its data centers at all.
(42:28):
So I think a lot of companies realized that they need to get out of the data center business, and then the cloud providers reached a point where they could provide the infrastructure necessary for companies to do it in what they feel is a scalable, secure way, and more and more companies moved to the
(42:53):
cloud, and just because of that, you're able to have companies that can grow very fast and can respond very quickly. Case in point: Zoom during the pandemic. March 2020, everybody
(43:21):
goes online, and if you have your own traditional infrastructure, it's like, I gotta buy thousands and thousands of servers, I have to rack and stack them, I have to secure them. That's just not gonna happen. But with public cloud infrastructure, I read one account that said, during the first few months of
(43:45):
the pandemic, Zoom was adding something like 5,000 servers a day, and you can't do that on your own. That was only possible with the cloud infrastructure. So I think the same is gonna be true with these higher-value services related to AI. Why am I
(44:08):
going to build that infrastructure? I'm just gonna use it in sort of a utility fashion and pay for what I use, for as long as I use it, and get on with my business.
Sam Gerdt (44:23):
Sure. So we're painting a picture now of data collection increasing and increasing and increasing, infrastructure getting more and more and more centralized, and all of that data is obviously going to that infrastructure.
(44:44):
So, as we see a centralization of all data, all infrastructure, what does that do for cybersecurity? Is that a good thing or is that a bad thing?
Mitch Shue (44:57):
Well, there are lots of different ways to kind of look at this, but I think, again, at the root of my answer is the fact that advances in technology have far outpaced our ability to safeguard the people most affected by these advances
(45:18):
in technology, and we see that still with this proliferation of data, just the velocity of data collected for everything. In a lot of cases, I think Europe is far ahead of where we are here in the United States on what needs to happen, but
(45:40):
still it's woefully inadequate for the time we're in now. There's just so much data collected from so many different places, and, honestly, a lot of companies have no idea or cannot differentiate the sensitivity of the data they collect,
(46:03):
because not all data needs to be treated the same way. And it's just a remarkable situation. There's quote-unquote anonymous data that's really not anonymous. You look at it and you triangulate and you can re-identify pretty much anything.
(46:26):
So I think our ability to sort of regulate this is far behind. From a technology perspective, on how to secure it, I think we can secure the data, but in order to secure anything, you have to actually know what it is
(46:47):
that you're securing. How does this data need to be secured? Like I said, not all data is equally sensitive.
Sam Gerdt (46:58):
We talk about personal information and then sensitive personal information, and the difficulty with that is a lot of it has to do with the context. On its own it doesn't necessarily seem that sensitive, but then you put it into a certain context and all of a sudden now it's sensitive. We actually have to think about that a lot. We're in digital marketing.
(47:19):
That's what I do, and the example that I give is a name and an email address is not necessarily sensitive, but a name and an email address on an Alcoholics Anonymous list all of a sudden becomes very sensitive, and being able to understand
(47:41):
those contexts and protect that information is super critical, absolutely super critical. You touched on it a little bit before, but I'm actually somewhat disappointed, and I don't say this very often about regulation. Europe did GDPR several years ago, and then it seemed like
(48:02):
there was this push to have something like that here. But then, I feel like maybe, I don't know, maybe COVID happened, maybe Trump happened, maybe I don't know what happened, but it completely derailed. And again, usually I don't have a lot of positive things to say about lots and lots of regulation, but GDPR was, in my
(48:26):
mind, a very thoughtfully put together piece of legislation that really and truly put the user in the center, and the result of it was actually strikingly good. Companies were headed on a trajectory that was not healthy,
(48:48):
not good, and larger companies, anybody who does business in Europe, really had to change everything, even to the degree that you look at products like Google Analytics now, and they're completely different in terms of anonymized data, the way that
(49:11):
you can target audiences, the way that you can collect information on users. Completely different, and we have GDPR primarily to thank for that. I was hoping that we would get some guidance like that in the United States that would maybe not necessarily go further, but at
(49:32):
least make those principles that you find in GDPR a little bit more universal: the right to be forgotten as an established right of the consumer.
Mitch Shue (49:46):
Yeah, what's interesting is we have these sort of national boundaries and things like that, but you look at something like the internet and everything associated with it, and there are no national boundaries. So it's tough. We don't have a world government, but things like GDPR
(50:08):
actually gain traction because of globalization. Countries want to do business with other countries, so they have to make a business decision, so I think a lot of regulation is going to be influenced and driven by business. It's interesting, it's troubling. But honestly, also as
(50:37):
individuals, I guess there are some individuals who are very aware, or try to be aware, of all the information disclosed about them, but honestly, most of us don't know how much information is out there available about us.
(50:57):
There's an enormous amount of information, and somebody can find enough pixels about us to form a very detailed picture of us. That's why we have this very, very targeted marketing, and it's
(51:18):
kind of scary, but as a technologist, when I see it happening, I understand what's happening.
Sam Gerdt (51:25):
Yeah, well, and I think that's what makes artificial intelligence such a scary prospect for so many, is because when you have those diffused pixels of information out there, it's not necessarily possible for a person or a group of people to put together an accurate picture. But you throw the computing power of artificial intelligence at it and all of a sudden you can produce that clear image.
Mitch Shue (51:51):
Absolutely, you can produce that clear snapshot of a person's life. I mean, even before things like ChatGPT and things like that, just your typical basic unsupervised learning, it's just finding patterns in this data, patterns that I can't see: find them for me. And you look at these emerging patterns and you're like, wow,
(52:16):
that's something I didn't know, and I can exploit that.
Sam Gerdt (52:20):
Yeah, so AI is going to be a nightmare for privacy. We've established that pretty thoroughly. Oh yeah. What's it going to do to intellectual property?
Mitch Shue (52:31):
Yeah, that's a really interesting question, especially with the rise of, you know, all sorts of generative platforms, right, for generating music in the style of this person or, you know, actually using this person's voice,
(52:53):
generating just sort of visual art, you know, in the style of some artist. Yeah, it is just a crazy time, because, you know, when you think about traditional intellectual property, it has its roots in rewarding creators, right, and it
(53:18):
encourages people to create. Because, you know, I reap the rewards of these things I create, whether it's music or some wood carving that I make or something like that. It's just like, my work, my blood, sweat and tears have gone into this, and, as a creator,
(53:39):
because I created this thing, I should reap some rewards, at least for some period of time. Right, that's kind of what's at the root of intellectual property. Yeah, but now it's like everything is derivative. Not everything, that's kind of an exaggeration, but so much is derivative. You know, and we saw this years ago with sampling, right, music,
(54:04):
just sampling this song and that song. And, you know, again, people are like, well, shouldn't that person that you sampled get some credit for this? And then some people are like, well, you know, it's not really the thing they created, I just mixed in my work with it. So, you know, maybe not, I don't know.
(54:26):
So now it's in our face, right. It's just like, okay, how do we reward the creators?
Sam Gerdt (54:39):
I think some of the absurdity of the more recent IP models is coming to light. In the original model you had a benefactor. In the original model you had a patron, someone who was saying, I want this, I'll support you in this, and then usually only after that was the art created, not before.
(55:00):
Now we've shifted to this model where art is created almost more on spec, where you're saying, I'm going to create this piece and then I'm going to go find a supporter of it, and we've gotten so accustomed to that model. I feel like, and this is something of a personal opinion, I feel like there's this sense of entitlement that has grown
(55:22):
out of that, where people feel entitled to say, I made art, therefore I must be paid. And the reality of it is, without having that benefactor, without having that patron, there's really nobody who's going to pay you. And asking the universe for it, or asking the government for it,
(55:45):
or whoever, it seems to me like you're asking a lot. And I think, then, the other absurdity that I'm seeing is, you know, Google just announced that they're going to license, there's a mechanism now for licensing AI-created, AI-generated music. I don't even know what that means, like licensing from whom? Who
(56:10):
has the authority to license it, who has the authority to own it to the degree that they can license it? It almost sounds to me like Google is just saying, I'm going to pay somebody something so that I can say I paid for it. And that's almost where we're at right now with intellectual property: if you have intellectual property and you
(56:31):
paid for it, you're good. But all of the behind the scenes just seems tangled right now.
Mitch Shue (56:38):
Yeah, I don't know what the answer is, but, you know, again, you want to encourage creativity, absolutely, and not everybody who creates music is great, obviously, but
(57:02):
you want those wonderful artists, in whatever field they're in, to create. And the concern is, how do you? I mean, some people will create just because they love to create, but people need to eat, you know, they need to
(57:24):
make a living somehow. How do you reward people for what they do? Right, and everything is kind of messed up for me, because I see these influencers on various social
(57:44):
media platforms and I'm thinking, you know, what they're doing is nonsensical, but they make bank, right, yeah. And what's sad, I think, is I run across
(58:04):
people for whom that is their aspiration: I want to amass, you know, a million followers so that I can kind of say things or do weird things and get paid for it.
Sam Gerdt (58:19):
Yeah, well, it's money and fame. It's a hard thing to turn down. The problem is going to be when generative AI learns how to do that.
Mitch Shue (58:28):
Right.
Sam Gerdt (58:28):
They'll know our psychology, they'll be able to do it. I mean, a movie is a big ask, but a 30-second clip that lives and dies in 12 hours, that's not as big of an ask, even today, for generative AI. So yeah, we're getting to the point where there'll be accounts
(58:49):
out there where you don't see the person who's actually behind it, you just see the result of their prompt.
Mitch Shue (58:56):
Yeah, I think what's sad for me, especially in the area of the arts, is it seems like so much is forgettable these days. You know, some of the stuff is like, no one's going to remember this tomorrow even. And then part of it is, maybe this is kind of a boomer
(59:20):
thing, but I still listen to a lot of music from the 60s and 70s, and even before that, the 40s and 50s, depending on what genre of music it is. But there is a lot of memorable music, you know, that maybe will never get made again, and people listen to that
(59:47):
music and it's just lasted for decades and decades and decades. And then there's music that is more contemporary, and it's hard to say if it's going to be remembered. But you know, even I listen to some music that's maybe 20 years old and it's just not memorable, you know.
(01:00:10):
It's just like, what is this song? I don't even know when it was made.
Sam Gerdt (01:00:17):
I think we all tend to favor the music of our own growing-up years, our own generation, to a certain degree. But you are right, it does seem like there's this degeneration of art into being far more transient, far more forgettable.
Mitch Shue (01:00:38):
Well, movies too, if you look at movies, TV shows, things like that.
Sam Gerdt (01:00:44):
I think motivation has a lot to do with it, though. It goes back to that idea of a patron or a benefactor. If you don't have someone who's saying, I value this, I want it, I'll pay you for it, then you just tend to be making art to, well, these days, a lot of those people make art just to sell ads. That's where their revenue comes from.
(01:01:04):
I just need to keep somebody's eyes glued on the screen just long enough so that the next reel will play.
Mitch Shue (01:01:14):
Right.
Sam Gerdt (01:01:14):
I don't need to, they don't need to think about me after they're done watching me. They just need to watch me long enough that they'll watch the next one, and then the next one, and then the next one, and it's not a sustainable model. It's not good for humanity.
Mitch Shue (01:01:30):
No, it's not, and it's certainly not good for mental health either. You know, I had a student tell me that she was self-aware enough to know this. She said one week she spent like 28 hours watching Reels and
(01:01:51):
TikToks. And you know, she's like, that's 28 hours this week that I don't get back. Yeah, you know, it is that sort of addiction. It's just the AI understanding what it is that you keep clicking on, right, and just feeding you this,
(01:02:13):
this drug.
Sam Gerdt (01:02:16):
And the people who produce that content, to call it art, I don't want to be hypercritical, but to call it art when the model says we're going to feed one person 28 hours worth of one-minute-or-less videos. I don't know, I can't do the math, but that's thousands and thousands of pieces of so-called art, and she probably doesn't
(01:02:40):
remember much of it at all now. And so to say, I'm a content creator, I make art, those two seem to be disconnected concepts in my mind. You can be a content creator or you can make art. Now, that's not to say that there's not some art in there. There absolutely is. And usually those people go back to either having a
(01:03:01):
benefactor, or, you know, they have a Patreon account and they have patrons, and so those models aren't completely dead, they just get lost in that shuffle. Yeah, yeah. So do you think that IP has a rough, rough future ahead, then?
Mitch Shue (01:03:17):
Yeah, I mean, you see it already. It's just like, we don't know what to do, honestly, with generative AI. You look at these beautiful images that get created, you know, and you're looking at it going, okay, what did
(01:03:37):
you do other than come up with an interesting prompt? So what is protected, the result or your prompt? I don't know. So I just think that a lot of our thinking, again, is behind,
(01:04:01):
and we're at this point now where advances in technology used to outpace our safeguards by a little bit, and now it's just like a slingshot. It's just like prehistoric times when it comes to intellectual property protections. Now it's just like, I don't know what to do.
Sam Gerdt (01:04:21):
Yeah.
Mitch Shue (01:04:21):
Don't know what to
do.
Sam Gerdt (01:04:24):
Mitch, I don't want to end the conversation without talking about AI RISE and what you're doing at Clemson, so I want to give you a minute just to tell us what it is you're working on, the importance of the program, and what you're doing in our area, particularly, I think, with manufacturing.
Mitch Shue (01:04:41):
Yeah. So AI RISE stands for the AI Research Institute for Science and Engineering. It is an officially recognized institute at Clemson, in that it was approved by the Board of Trustees at Clemson University. It serves as the umbrella for AI research, AI education, and
(01:05:06):
workforce development at Clemson. Clearly, before AI RISE, there was a lot of AI work going on all over the university, across all seven colleges, now eight colleges. Yeah, so many different departments. But AI RISE, the institute, was created to be this umbrella.
(01:05:29):
It's great for researchers because when they submit proposals for funding, they can submit those proposals through AI RISE. AI RISE gives their proposal a little bit more mass, a little bit more credibility, if you will. It's helpful. So we have about 120 affiliate faculty from across all the
(01:05:55):
colleges, probably 35 different departments involved with AI RISE. So you have people from art and religion as well as all the technology fields involved with AI RISE. So it's really about, again, being the umbrella for AI
(01:06:17):
research, education, and workforce development at Clemson and beyond. So our mission is really to take that education component and take it to the Upstate, across the entire state, across the region. We want to focus on the areas that Clemson is known for:
(01:06:45):
advanced manufacturing, biomedical informatics, materials science, cybersecurity and cyberinfrastructure, and intelligent transportation. Those are five areas that Clemson is known for, so we
(01:07:06):
focus on those kinds of things as well.
Sam Gerdt (01:07:09):
That's cool. So you mentioned art and religion. That means you're taking a philosophical approach as well as a technological approach.
Mitch Shue (01:07:19):
Yeah, it's interesting. My historians, for example, are trying to figure out how to apply AI to history. There are so many applications. Interestingly, at first you're thinking, well, what's that all about?
(01:07:40):
But when you think about being able to take an AI system and determine the provenance of some historical document, it's like, well, we've always thought that this person wrote this, but given what we know about all the writing in that time period and
(01:08:01):
everything this person wrote, perhaps an AI can help us determine its true provenance. Is it somebody who was near this person and wrote in the style of this person? So it's fascinating, and the nice thing about it is that
(01:08:23):
historical data is not increasing rapidly like data that's being collected. It's a data set that can be analyzed in a lot of different ways using new AI. It's definitely more static, yeah. So it's important to be able to look through historical
(01:08:44):
documents and determine things like provenance, because so much of what we know and do is based on historical documents. Right, it's just like, did this person? Have we been attributing this to the wrong person or the wrong civilization, even?
(01:09:06):
So it's fascinating what researchers across the university are interested in applying AI to.
Sam Gerdt (01:09:18):
That's fascinating, and it looks like you've had opportunities to do a lot of stuff. I saw a demo just recently on, I think it's called Deep Orange. I guess that's under that umbrella as well, an autonomous vehicle.
Mitch Shue (01:09:30):
Right. The latest version of Deep Orange, Deep Orange 14, is an autonomous tracked vehicle, almost like a tank, but before that, Deep Orange 12 was about autonomous driving at raceway speed. So basically a Formula One car self-driving.
(01:09:54):
And the nice thing about that is Clemson was responsible for the common car used for that competition, so all the cameras and sensors, the compute infrastructure, Clemson was responsible for, and then a couple dozen organizations used
(01:10:19):
this common car and basically raced autonomously on the Indianapolis Motor Speedway, and since then that has turned into a sort of an international competition. So a lot of times when you see these Formula One cars
(01:10:42):
in these self-driving situations, you'll see a Clemson logo on the nose of the car. That's got to feel good. It's very cool.
Sam Gerdt (01:10:53):
Yeah, that's pretty
sweet.
Mitch Shue (01:10:54):
And the idea is, if you can self-drive at 200 miles an hour, you can do a lot at regular neighborhood speed or highway speed. Yeah, so that's the idea there.
Sam Gerdt (01:11:09):
One more thing before
we go.
Mitch Shue (01:11:10):
Sure.
Sam Gerdt (01:11:11):
I just want to know, generally, are you optimistic or pessimistic about the future of technology?
Mitch Shue (01:11:15):
You know, I'm optimistic. I think that, in general, cooler heads will prevail. People are very curious about new technologies, and things will settle out, right. We've seen it in the past: things that are super hyped sometimes are exposed
(01:11:36):
as having been super hyped.
Sam Gerdt (01:11:39):
And that's not an excuse to write off these tools. It's an opportunity to level up while everybody else is sleeping.
Mitch Shue (01:11:46):
Yeah, and I tell people, you know, you just want to be curious about things, maybe outside of your normal focus, right, and there are things happening all around you. The world's changing very quickly, and as citizens of the world, I think it's important for us to just understand the
(01:12:09):
implications of these kinds of things.
Sam Gerdt (01:12:13):
Excellent, Mitch. Thank you. It's a pleasure talking to you. Your resume is incredibly impressive. You have certainly covered the gamut of technology, and it's been really interesting to hear your perspective on all this new stuff coming down the pipeline.
Mitch Shue (01:12:32):
Thanks so much for
having me.
Sam Gerdt (01:12:34):
I'm incredibly thankful that you were willing to talk, and I'll look forward to catching up with you again, maybe in the future, and we can see how we sounded, whether or not we were right or wrong about some of this stuff.
Mitch Shue (01:12:43):
Yeah, very good. Thanks again for having me. I really enjoyed our conversation today.