Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jeff (00:00):
Dude, you're old, I just want to point that out.
Tom (00:01):
Say in KCKC Welcome back to the Basic AF show with Tom Anderson and Jeff Battersby. Jeff, it is, as always, good to see you again. Don't lie. Tom, we know it's not good to see... It is.
Jeff (00:23):
We're 100 percent certain of that. Yep, you say it, but just because you want me to stick around, I get it.
Tom (00:28):
Thank you. Manipulator. Great.
Jeff (00:33):
As if I don't already have
a few of those people in my
life.
We're also happy to welcome two guests. Tom, we have two guests on the show. We do. First of all, we have Benjamin Young-Savage, who will introduce himself in just a brief moment. Then we also have Dr. Devin Taylor. Sorry if I put too much emphasis on the doctor.
(00:55):
There's one smart guy in the room and it's you, and we want to make sure everybody knows that that's the case. I wouldn't go that far. What we'd like you guys to do is introduce yourselves. Benjamin, why don't you tell us a little bit about you, and then we'll talk about what our subject is for the day?
Benjamin Jancewicz (01:14):
Sure, I am tuning in from a tiny village called Kawawachikamach, which is where I grew up in Northern Quebec. Say that again, please. Kawawachikamach, it means by the bend in the lake. All right, it's a tiny Native American reservation up in the subarctic of Northern Quebec.
I come up here.
(01:36):
I try to come up here at least once a year just to reconnect with people I grew up with in the tribe. I'm stranded up here because of the forest fires. I'm sure most of you down south, as we say (upstate New York and everywhere else), have seen the orange skies. That is because things are burning in western Quebec.
(01:59):
There are also small fires all over the place, and it has stopped the train line, which is the main way in and out of here. We've got fiber optic, and so I can do podcasts.
Jeff (02:13):
Yeah, look at you.
Benjamin Jancewicz (02:15):
I've been a graphic artist for the past 24 years. I run my own small business doing that. We have a whole handful of employees. The name of the company is Zerflin. We specialize in branding and web design, and we did a lot of
(02:40):
pioneering work just as a small boutique agency in the 90s, making adaptive websites that fit to your browser or phone or whatever you're viewing it on.
Tom (02:51):
They're cool. Yeah.
Benjamin Jancewicz (02:53):
All
right, good to be aboard.
Tom (02:55):
Yeah, thanks for joining us.
Jeff (02:57):
Appreciate it.
And Tom, you want to introduce your friend?
Tom (02:59):
Yes, as if you had one. That's the only one and a half, next to you, Jeff. You guys are friends. Well, it's a perfect thing. It's tumultuous. We're off the rails a minute in. Good. So, joining us today: Dr. Devin Taylor from Shenandoah University, a colleague of mine.
(03:23):
Devin, welcome to the show.
Dr. Devon Taylor, Ed.D. (03:25):
Yeah, thank you. Very nice to be here today to talk about this subject. I'm excited to talk with you folks and see what we can come up with.
Jeff (03:35):
Yeah, we'll come up with something. We usually do.
Benjamin Jancewicz (03:38):
We're
going to solve all AI related
problems here on the show.
Absolutely correct.
Jeff (03:44):
And actually, Benjamin, you brought up the big point. So what we're talking about today is artificial intelligence, which, personally, I would put the emphasis on artificial. Some others in this room might put it on intelligence, or some combination in between. So what we want to do is we want to talk about this. Benjamin, you, as you said, work in the graphic
(04:07):
arts, and there's obviously some potential problems for artists with regard to this. And I just want to point out that our beautiful new artwork that we currently have came about because Benjamin poked me after having created some initial
(04:29):
artwork that we had for a while using Diffusion Bee. And he asked me whether or not I had created that using any kind of AI, and I had, and I wonder what gave it away.
Benjamin Jancewicz (04:42):
Oh,
it's real easy to spot if you
know what to look for.
Jeff (04:47):
The people with 10 fingers
on one hand.
Benjamin Jancewicz (04:50):
I just think it's ironic that you chose a human to draw a couple of robots as your look.
Jeff (04:56):
Is that? Well, that's closer to the reality, I think, is the story, as we are. We did that. So we did end up... we tried to hook up with Benjamin to get him to take care of it, but time and other things didn't work out. So we had, as everybody knows, Randall Martin Designs.
Benjamin Jancewicz (05:13):
Beautiful
design.
Jeff (05:14):
Yeah, really, I'm happy; we are really happy with it. And, Devin, you are working in education and trying to find ways to lean into AI as a means of being able to either support students... I don't want to put any opinions in anybody's mouths. What I'm interested to hear is kind of what your thoughts are
(05:38):
on it, and how. And we'll start with you, Devin, or Doctor, whatever you want me to call you.
Dr. Devon Taylor, Ed.D. (05:43):
Devin, please.
Jeff (05:49):
There was a guy once that used to make us all say his name was Doctor. He was an HR guy at the company I used to work for. The only time I would refer to him as Doctor is if I used Evil at the end of it. So he was... he was a... I won't say the word here because we don't want to drag down our rating, but he was not a kind fellow.
(06:12):
Anyway, Devin, what we want you to do is... why don't you lead in, and let's talk a little bit about, first of all, your experience with AI, how you're using it, and kind of, you know, where you hope to go or how you hope to control it in your particular environment?
Dr. Devon Taylor, Ed.D. (06:32):
Yeah. So, yeah, I think I'm thinking about it in two different ways when it comes to AI and higher education, and the first is students using it to help them create content, right? You know, these large language models are really good at creating content, whether that be images, text, and soon to be
(06:54):
video and audio, right? So, thinking through ways that students can use that as a starting point for some of the things that they're going to be working on in their classes. But, on the other side, also thinking about how we need to start training our students in these tools to be prepared for entering the workforce, where
(07:15):
they're likely going to be using these tools in some fashion. So, thinking about it in two different aspects there, right? Preparing our students for the workforce while, at the same time, thinking about how we're going to use it to have them complete assignments and work on different aspects of higher education.
Jeff (07:34):
Interesting. So let me ask you a question about that, and this is one of the things that comes up for me with regard to, particularly, education. You know, obviously one of the things about higher education is training people, hopefully, how to think, you know, teaching them how to be critical thinkers and work in that kind of way.
(07:56):
I know Dr. Holland, who's the professor that I spoke of a few minutes ago when we were just talking in the background (I don't think we recorded that), but one of her concerns, and the one that I would want to express to you, is she's concerned that you're going to have issues where the
(08:20):
thinking piece is kind of thrown out the window. The process: say, if you're writing a paper where you're coming up with ideas and, you know, your initial points to put together some kind of project, if you're having the AI start out that thinking for you, are you doing the groundwork that needs
(08:40):
to be done? So I'm curious to know how you balance that, you know, in academia.
Dr. Devon Taylor, Ed.D. (08:48):
Yeah, I think, you know, it's a starting point, right? It should always, I think, be taken as: this is a place to start, or even a place to critique. So we have a history professor that started every class this last semester by prompting ChatGPT with whatever the topic of the day was, and then they would spend the first 10 minutes
(09:09):
seeing what it got right, what it got wrong, and kind of critiquing the large language model, so that then they could learn from, you know, what it's doing well or what it's completely off the rails on. And I think that's a learning experience for the students as well. They see that it's not always correct, and it should never be taken at face value for the things that it spews out.
(09:31):
Right, because it's just doing math. It's not knowledgeable, right, it's not intelligent; it's just doing math to predict what the next few words or sentences are going to be. So I thought that was a really interesting use case, you know, for kind of getting students used to how these things work, as well as kind of showing them that they're not always the end-all be-all when it comes to content.
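As an aside for readers: Devin's "just doing math to predict the next few words" can be made concrete with a deliberately tiny sketch. The corpus, counts, and output below are invented for illustration; real large language models use neural networks over tokens rather than word counts, but the generate-one-word-at-a-time loop has the same shape.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count what follows what in a small corpus,
# then keep choosing the most frequent continuation. This is the
# spirit of "predicting the next few words," minus the neural network.
corpus = (
    "students use these tools to write papers and "
    "students use these tools to study and "
    "teachers use these tools to grade papers"
).split()

next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def generate(start: str, length: int = 6) -> list[str]:
    words = [start]
    for _ in range(length):
        candidates = next_word_counts.get(words[-1])
        if not candidates:
            break  # nothing ever followed this word in the toy corpus
        words.append(candidates.most_common(1)[0][0])  # greedy pick
    return words

print(" ".join(generate("students")))
# prints something like: "students use these tools to write papers"
```

Nothing in that loop knows whether the continuation it picks is true; it only knows what was statistically likely in the text it counted, which is exactly the caveat raised above.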
Jeff (09:53):
Yeah, I appreciate that, actually. And I appreciate that you say that it's not intelligent, it's iterative. You know, that is definitely, to my mind, what it is. It's iterating on, you know, what the next possible word is in a sequence, whether or not there's truth to that. So that makes a lot of sense to me.
(10:13):
Benjamin, you know, your thoughts about this have to do with art and the creation of those kinds of images. I guess the question I have about that... I mean, you said that our original art definitely looked like, you know, looked
(10:36):
like bad album covers from the 70s is what it really looked like.
Benjamin Jancewicz (10:40):
But I
mean, it wasn't.
It wasn't great.
No, it wasn't great.
What you have now is great.
Jeff (10:47):
Yeah.
So why don't you talk a little bit about the difference between what is great, what isn't great, what you see. I mean, I look at some of the stuff that I see. In fact, Tom sent me a link last night of a picture, it was four different pictures, all AI-generated, of supposedly a Native
(11:07):
American from Alaska, and, you know, at first glance it looked like a legit photograph. I mean, I would have said it looked like a legit photograph of a real human, you know. No crooked eyes. No, you know, seven fingers and three toes, kind of thing.
Benjamin Jancewicz (11:28):
Yeah, in a matter of months it's gotten better at that.
Jeff (11:31):
Yeah. So what is it? Do you see any benefits to using AI for the creation of art, or is it all downside to you?
Benjamin Jancewicz (11:45):
Well, fundamentally, I agree with what Devin was saying: it's not intelligent. So I think we need to establish that as a baseline. It's not intelligent yet. There are people who are working on intelligence, but what most people are talking about when they're talking about AI-generated things, it's not intelligent.
(12:05):
What it's doing is using repositories of information to generate things based on other people's work, and I think that's important to establish, because, you know, the stuff that you're viewing is based on something somebody else
(12:29):
did. A person. Ultimately. Now, it's very, very, very, very finely remixed, I would say, to the nth degree, where at least the creators of these programs are not interested in attributing, so they don't actually do the tracing to tell you what percentage of an image is,
(12:50):
you know, from what certain pieces, what certain bits of artwork.
But, yeah, the biggest issue that I have is that, especially in the past year or so, this AI generation has taken leaps and bounds in directions of things that
(13:14):
people love doing. Creating artwork, photography, designing things is something that people love to do. I switched majors because of it. I was originally in engineering and I switched because I love graphic design so much. I would honestly do it for free, but I'm fortunate enough to
(13:37):
have people pay me for it. And it seems a shame that these tools are being used to replace people who honestly love doing this stuff. I do think that, you know,
(13:59):
there are places and areas where it can be used as a tool, and I think that Devin is on the right track of making sure that, you know, these younger people are learning how to use it. But the album cover that you guys put up before was pretty
(14:20):
obviously AI-generated. To me, there were a lot of different tells in that old version. I bet you, if you did the same experiment today and tried to recreate it and put in the same exact thing, you would get a much more realistic and much more defined result today, just because of how much it's grown and changed since then.
(14:41):
But yeah, I'm firmly in the camp of, you know, let's use these tools to do things people don't want to do, rather than do things people do.
The other point is that, especially given the subject matter, where you were talking about images of northern
(15:03):
native people: I am here on a reservation where I recently redesigned the website for the tribe, because they are underrepresented, because they don't have as loud of a voice as they should have. And people are using things like AI-generated imagery and texts to be lazy and not to actually go
(15:28):
and get those photographs and stories on their own, which is what they should be doing, because these cultures are disappearing.
Jeff (15:35):
So yeah, interesting, very
interesting.
Dr. Devon Taylor, Ed.D. (15:39):
So you're in First Nations country, right?
Benjamin Jancewicz:
Yes, yes, I am in a Naskapi First Nations village right now.
Jeff (15:47):
Yeah.
So this raises for me an interesting question, and one of the things that's kind of challenging, and, Devin, I'm curious to know what you think about this. One of the problems for me with any generation is it's currently only coming from data sets that are publicly
(16:09):
available. In other words, for academia, every single article, maybe a research article (and there's arguments to be had about this as well), but typically any, you know, journal papers, anything like that, are typically paywalled. You know, behind some type of paywall that you have to pay to get actual research data that's going to be live.
(16:31):
What it is feeding off of, so far as we know (and that's another problem with this for me, is we don't know exactly what data sets are being fed into these, you know, into these AIs), but one of the things that we do know is that most of what it's getting is coming from the broader internet, which we all
(16:52):
know as the frigging Wild Wild West. Maybe worse than the Wild Wild West; it might be, you know, very nearly post-Weimar Germany, you know, in some aspects, or in some, you know, places on the web. So I really love, Devin, what that history professor did,
(17:13):
which was, you know, feed ChatGPT and then see the BS that gets propagated by that, or some truth. And that's, I think, the problem, or, from my perspective, one of the issues with it: you get, you know, half the information is correct, half the information is a lie, or, you know, made up, and then, you know, being able to think about and
(17:37):
figure out, you know, parse where the lie is, where the truth is, especially when there is no access to academic data. So what do you think about that? How do you resolve the issues, at least, that we currently have with the data sets only coming from
(17:58):
publicly available sources?
Dr. Devon Taylor, Ed.D. (18:01):
Yeah, it's a tough problem, right, because we don't know, at least for most of the language models. We don't know where the data is from, and most of them are probably using a lot of the same data sets. Right, it's everything that's publicly available on the internet, and so smart companies are locking down their data and saying, you don't get this anymore.
(18:23):
Right, this is now proprietary. We're gonna start rolling our own, right? For instance, Reddit has not allowed anybody to scrape their data. They have a huge data set, right, that probably would be extremely beneficial when it comes to questions and answers and other things. Right, Reddit's really good at product reviews, things
(18:43):
of that nature. So, I mean, I think we're still very early days, right? We've only had ChatGPT since November. GPT has been around in beta for maybe a year and a half, right? People have been playing around with it. So I think it's still really early days with this technology, and, like Benjamin said, just in the past, you know, couple of
(19:05):
months, these things have gotten so much better on the image side, but also on the, you know, on the text side. You know, there's a really interesting infographic that shows the advances of ChatGPT, which was, you know, GPT-3.5 to GPT-4, and the leaps and bounds in knowledge of what it's able
(19:29):
to do when it comes to passing some tests. Right, it's in the top, you know, 20 percent, or 80th percentile, on all of these different scores: the GRE, the bar exam, the medical exams. So they're advancing at a pretty fast pace. So I think eventually we'll get to the point where some of this
(19:49):
gibberish and, you know, nonsense that it's spewing out will become not the norm. Right, it's gonna be kind of the outlier out there that will then be fixed, right, by the human-reinforced learning that these training systems go through.
Jeff (20:05):
Yeah, so that's interesting. We do have humans behind this looking at correcting, hopefully, the information, you know, which adds some, perhaps, subjectivity to the training of these models. And that subjectivity is challenging, right? You know, you don't know who's deciding what's the truth in the language
(20:27):
models that are being used, and that brings a healthy dose of bias, I'm sure, as well. Oh yeah, well, you know, if I was training it... You know, there's also these ethics questions.
Benjamin Jancewicz (20:40):
I know, I know, Jeff. I sent you the article about the 150 Nairobi workers unionizing because they weren't paid enough, who were on the back end reviewing all the ChatGPT stuff. So, you know, it's being used actively to exploit Black workers right now,
(21:02):
which is not a great look.
Jeff (21:05):
Yeah, no. And I think that's another, you know, that's another interesting question. You know, Tom sent me an article last night about a radio station. Where was that radio station? I believe that was in Portland again, who has a DJ that is now an AI DJ. Sounds just like the woman who is, you know,
(21:29):
actually DJing the show live. Their thoughts are, well, you know, she can't do it 24 hours a day. So if we have a big event come up, there's a traffic issue, or there's, you know, a big accident on some highway in Portland or something like that, we just flip on the AI. But we're never, ever, ever gonna, you know, use this for real. Well, you know, that's BS, right?
(21:51):
First of all, I will say for the 50 millionth time, you know, I don't like algorithm... algorithmically... man, I'm in tech and I can't say that word. You know, algorithms creating playlists, for me, they never satisfy. You know, I listen to Radio Paradise, which is a human-DJed
(22:15):
radio station based out of California, or was based out of California. But I'm never satisfied that the thing that we're getting (and I think this goes to your point, Ben) is gonna be as beautiful or as real as something a human created, because there is no, there's no, first of all,
(22:40):
there's no soul, correct, right? Which maybe is tech in general.
Benjamin Jancewicz (22:45):
I agree with you, especially on the music front, but I also think that there is room for AI in that. I am not a Spotify user anymore. I jumped ship when they wouldn't drop Rogan, but, as we're talking on a podcast that Spotify doesn't carry (nobody carries it), I really loved Spotify's new music
(23:10):
algorithm, which did a great job of paying attention to what I was listening to and giving me suggestions of other artists that I might like. It wasn't a Gestapo, and it didn't add them to my playlist, but it gave me the option, said, hey, do you want to listen to these? You know,
(23:31):
you've been listening to classical for the past month. Would you like to try, you know, these artists? Or would you like to, you know, listen to this techno or dance house music, because you've been, you know, working out to this? And I do think that that algorithm is really well based. And this isn't new either. I don't know if you guys are familiar with the franchise Jack FM, but the radio station
(23:57):
has been around for the past 10 years or so and it doesn't have a DJ. It plays music, the occasional commercial, but it is mostly just this one guy with pre-recorded announcements, and they're region-specific.
(24:18):
He's got stuff that talks about Baltimore or New York or wherever you're in, but they just play the hits. They play what they know people want to listen to, based on feedback from other areas, and that's all an algorithm, and it's fun to listen to. It's not bad radio, it's just music, but there still is a kind
(24:43):
of human behind the scenes running the thing, which I think is important.
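For readers who want to see the "it paid attention to what I was listening to" idea in concrete terms, here is a minimal sketch under made-up assumptions; it is not Spotify's actual system. The idea is simply to count which artists co-occur with the ones you already play in other listeners' histories and suggest the strongest matches you haven't played yet. The histories and artist names below are invented.

```python
from collections import Counter

# Toy "listeners who played X also played Y" recommender: score an
# artist by how often it appears in histories that overlap with the
# user's, then suggest the top-scoring artists the user hasn't played.
histories = [
    ["bach", "beethoven", "glass"],
    ["bach", "beethoven", "satie"],
    ["daft punk", "justice", "moderat"],
    ["bach", "satie", "glass"],
]

def recommend(user_history: list[str], top_n: int = 3) -> list[str]:
    scores = Counter()
    for history in histories:
        if set(history) & set(user_history):       # shares taste with the user
            for artist in history:
                if artist not in user_history:      # only suggest new artists
                    scores[artist] += 1
    return [artist for artist, _ in scores.most_common(top_n)]

print(recommend(["bach", "beethoven"]))
# e.g. ['glass', 'satie']
```

A production recommender layers much more on top (play counts, skips, audio features, large-scale collaborative filtering), but the co-occurrence counting above is the simplest version of "people who listened to this also listened to that."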
Jeff (24:46):
Yeah, interesting. I think that, for me at least, one of the bigger parts of it is that human element, whether or not it's in academia. And you've got someone who's writing a paper. Again, Professor
(25:11):
Holland (and I'll make sure she listens to this, or she knows she's been shouted out several times in this) will tell you that it's pretty obvious when it is that someone has used one of these algorithms to write a paper.
(25:33):
I think the other problem, and this might be an interesting question for you, is that when these things have potentially come before academic boards, technically that would be plagiarism. If I'm not writing it, and it's being written, or mostly written, or mostly even outlined, by something like ChatGPT, how do
(25:57):
you address that? How do you address that from an academic perspective? Do you say, yep, everybody has the right to at least outline it this way and use ChatGPT to at least write the skeleton around what it is that you're going to put together as a paper?
(26:19):
How do you, Devin, at Shenandoah, and maybe at other universities, because you've probably spoken with other people in the same seat as you in these places, how do you differentiate? How do you either...
(26:40):
I don't want to say punish, but maybe that's it. How do you handle these particular issues? Ben just put a student on the chopping block... Benjamin, sorry about that. How do you handle that kind of stuff? Some might argue that academia has lost its rigor, but how do
(27:02):
you make sure that you're still being rigorous, that you're thinking? The one example that you gave about history is good, but what about me, the student who goes home and spits a couple of questions into ChatGPT and makes that my response for what it is that I'm doing?
Dr. Devon Taylor, Ed.D. (27:25):
These are the questions that we're having this summer and this past semester that all schools have been battling with. How do we approach this? Because you've got some professors that are gung-ho. They want to use this in their class, they want to integrate it, they want their students to become proficient in how to use these. Then you have other professors that are 100 percent, completely
(27:49):
against. They don't want any of their students to do anything with ChatGPT. Balancing syllabi: what does the syllabus say for the student, per course, rather than having something at the institutional level that either says yes or no? I think we're leaving it up to the individual faculty member to make that determination for
(28:12):
their students and their own individual course as to how they want to approach this particular technology. Of course, we've got support folks that are there to help. If they want to think about ways to integrate it into that English class, then we've got some folks that are there to help walk them through ways that they could use it.
(28:33):
If it's a creative writing course, I can think of ample ways that we could come up with. Okay, here's a list of 10 ideas for a particular story in this genre. Now, get rid of all of them, right, because that's what the AI came up with. Let's think outside of what the AI is coming up with. Or coming up with screenplays or other things. There's ways to integrate it into the writing process that
(28:57):
aren't "write me a five-paragraph essay on a topic." So, trying to make faculty think of ways that it could be used to expand what their students are capable of, versus just replacing what their students are doing, I think is the challenge that we have now. Because everybody, I think, just thinks of, all right, I'm going to type into ChatGPT, write me
(29:20):
a five-paragraph essay on, you know, Shakespeare, right, Shakespeare's life, and then students are just going to submit that. But I think if you're giving assignments like that, then that's probably not a very good assignment anyways, right? So it's kind of thinking through how we assess students.
(29:41):
And I think it is causing us to rethink how we assess students in this age, because I think it's going to have to change a little bit.
Benjamin Jancewicz (29:53):
I think, honestly, academia is a little bit behind the ball, because this, kind of the lazy student, isn't anything new. You tell me, people buy papers.
Jeff (30:06):
Is that what you're doing?
Benjamin Jancewicz (30:07):
Yes, they buy papers already and change them and adapt them to their own writing. I mean, people will download a book review on To Kill a Mockingbird and spend 20 minutes putting it in their own words and submitting it as a paper and getting a great mark on it. So, I mean, that's existed since I was a kid.
(30:28):
There were papers that I looked up because I was stumped and didn't understand something, and found a paper that was written by somebody else, and it helped me understand it. I didn't pass it off as my own work, but I did look up stuff because I just didn't understand the material and I didn't have a very good teacher. But I don't think that...
(30:49):
I think that, in large part, academia has been kind of ignoring that problem for a very long time. I'm glad that you're actually taking the bull by the horns and actually facing it front on and dealing with it and recognizing that the students that you are teaching are going to be exposed
(31:10):
to, and be using, these tools, and that they might as well learn how to use them in effective ways. But yeah, again, I think that it's all about the critical... learning how to critically think, which is what I think Jeff was
(31:31):
kind of alluding to, is that that's what the purpose of education is. And I think that, you know, even when I was a young student, the main reason why any of us would try to buy a paper online was because we were overworked and overwhelmed by classes, and we actually did want to get a good grade,
(31:53):
and we were trying to figure out all of the tools that would help us do so. But, you know, it wasn't recognized or acknowledged by any of the instructors. I wish instructors had gotten up in front of the class and said, hey, listen, I know you can buy essays online. The point is not the essay,
(32:14):
the point is to think about the material. That's what I'm trying to get you to do. And if writing an essay is not the most effective way to get students to think about the material, then maybe the essay is the problem, and there should be other ways to complete those tasks.
Jeff (32:34):
Yeah, yeah. I'm sorry, go ahead, Devin.
Dr. Devon Taylor, Ed.D. (32:36):
No, I think, you know, we've been talking about the flipped classroom for a long time, where, you know, students ingest content outside of the class. They do their readings, and then when they come into the class, it's like a Socratic seminar, where the professor's job is to really just foster a discussion about the particular topic. And
(32:57):
I think it's those types of teaching modalities that are going to have to become more prevalent in this, you know, age of AI, because that's how you're going to be able to assess those critical thinking skills and what the students are really learning, is by having those conversations, you know, one-on-one with the student or in a group context.
(33:18):
So I think, you know, we've been working towards that type of classroom, right, the flipped classroom, for a while, but I think it's going to become even more important now.
Benjamin Jancewicz (33:29):
It's just harder for teachers. It's very difficult to assess a student based on class participation, especially when you have dozens of different personalities. It's much easier to grade a student based on a written essay. And so, you know, the teacher is often trying to do the easy
(33:51):
thing for them by having people write essays, and the student is trying to do the easy thing for them by, you know, getting the essay from somewhere else. So it's really about, you know, time saving and trying to get through the thing rather than actually absorb the thing, which
(34:13):
I think is a huge issue, and I think it's at the root of how AI is being misused.
Jeff (34:21):
Yeah, so two thoughts about this. First of all, Devin, you said a few minutes ago that these AIs are able to pass things like the LSAT pretty easily, or other tests that we currently use to get people into various forms of education, whether it's law school or medical school.
(34:43):
As someone who sucks at taking tests, always have... You know, I have always been challenged that way, although, you know, I think that I was smarter than I ever thought I was because I sucked at tests. Well, toot my own horn here, or trying to toot a horn that
(35:07):
maybe doesn't need to be tooted. But I think this speaks to, and to your point, Devin, this speaks to the way that we view education. I know students will always come in to professors, and I
(35:28):
don't think this is as true in secondary education... oh, sorry, in like high school education, but in college, people are expecting to get an A. The A is the thing, right? It doesn't matter how you get to the A, what the A is, or what that is. And I will say, again talking up Professor Holland, I only had
(35:57):
this professor in graduate school. In graduate school, all of her classes were wrangling about ideas, you know, and then learning to take the things that you wrangled with and put them down, you know, on paper. Cheaters gonna cheat. You know, there's no getting around that, if you're not in
(36:18):
school. And I think, you know, the way that we think about education is typically: I'm gonna get a job, then I'm gonna go make a million dollars. You know, that's kind of the idea, rather than: I'm coming to this place to play with ideas and learn something I don't know and get out of my comfort,
(36:39):
you know, whatever my comfort space is, to be able to hear things and learn things and think about things. So there's something to be said for that, I think. Anyway, you're not gonna get around the people that wanna cheat, you know. You're not gonna get around the people taking a shortcut to get to something good, like me using Diffusion Bee to create a
(37:01):
piece of artwork. Granted, we needed a piece of artwork yesterday when we were, you know, doing this the first time. That was the whole idea behind it. But people who are really interested in thinking, I think, are always gonna put their nose down. They're gonna be the ones that wanna play with the ideas.
(37:23):
They're not gonna use things like ChatGPT as a starting point. So maybe it offers an opportunity for us to rethink how we think about education and what it is and what its purpose
(37:45):
is.
Benjamin Jancewicz (37:46):
But then again, the users of these tools... I think we've so far been talking about these tools as a very individual experience: the one student choosing to or not to use these tools in order to cut a corner. The bigger danger with these tools is when they are used by
(38:09):
large companies and industries to replace people. So, for example, you know, the radio show that you talked about. A lot of large companies, instead of hiring a team of
(38:32):
people to be a creative department, are hiring one person and telling that one person to do all the things. That's been true for a while. I have had to become a master of many different things because of the industry that I'm working in, that being a
(38:55):
creative writer, being an illustrator, a photographer, a videographer, a graphic designer, a typographer. All of these things I am quite good at, but that's because the art departments at various companies have gotten smaller and smaller and smaller and smaller. So, essentially, the point that I'm trying to make here is that
(39:17):
the biggest danger with these AI tools is where they are mixed with rampant capitalism, this society that we're living in, and used to squeeze people out and to replace people. And not only just replace people, but also pay them less, because you can
(39:39):
use ChatGPT to write your blog post for free right now. Using a starter account means that you are less likely to hire a person to do that, who would provide that soul. And you experienced that with this podcast.
(39:59):
You guys needed an album cover because it needed to be done, and so the quickest way to get that done was not to hire someone and have them invoice you and talk about your creative ideas and do this iterative process which ultimately gets you this thing that you're gonna love. It's easier to just dump that into an AI image generator and
(40:24):
just throw up whatever it spits out and have that as your album cover. And I think that's really the crux of the issue that I have, is that, in a society where people were taken care of and
(40:44):
you didn't have to scrape by in order to pay your bills, I think that AI would be a fantastic creative companion for being able to make anything that your heart could desire. But, as it stands right now, it's being used as a tool to replace people who really love what they do, which is terrible.
(41:08):
I wish AI was being used to actually solve some of the bigger issues that we have. AI used to solve not just, like, energy crises, but global warming, or even super complicated things like racism. I actually think that AI could be used in that direction, to
(41:30):
work on those issues, and instead they're using it to create sketches for a storyboard and replacing the person who loves making storyboard sketches.
Jeff (41:42):
Yeah, which is part of what's going on with SAG-AFTRA right now, and the writers, and, to some extent, the actors. You've got those things being seen as real issues, where you've got real human beings that can create something, and that there's value in it.
(42:02):
At least we'd like to believe that there's value in that creative process. I do. That's the heart of where I live and breathe. And yet, AI, for guys like Tom and me, when we're working our day jobs... I'm mostly working a day job being able to set up a
(42:25):
network and secure that network. Whereas it would be good to have a human doing that, it's also good to have something in the background creating those kinds of things.
Dr. Devon Taylor, Ed.D. (42:38):
So I do think there's maybe a flip side to that, too, where, I mean, these tools are democratizing content creation, right? I mean, they're making it so that me, who can barely draw a stick figure, right, could create something for my personal website, right? An image, or a starting paragraph, a bio about
(43:04):
some particular thing. So I do think that it's enabling folks to have more creative expression than they were able to have before. It's making it easier for them to be expressive. So it is, and there's no doubt it's going to impact jobs. But I do think it is also going to lead to an increase in
(43:25):
creative expression for those folks that maybe aren't artists, or aren't good writers, or aren't good creators of music, right? I think it's going to lead to an explosion in that as well.
Benjamin Jancewicz (43:39):
I wouldn't call that democratization of anything. I would call that the replacement of things. You're just choosing a different partner. Rather than partnering with an artist to help you create the thing that you want to create, you are partnering with a machine to create the thing you want to create. It's not like you are doing the creation at all.
(44:02):
You are just partnering with a different source, and when you...
Tom (44:07):
But is it part of the same thing that they said when Photoshop came in, and the old-school film photographers said you're not a real photographer, because real photographers use film?
Benjamin Jancewicz (44:19):
No, I think that what that's getting at is more about the difference between having a tool and having a solution. A tool is something that you can learn to use and learn to work with, and I am actually pro using AI as a tool to do things
(44:43):
that are things that nobody wants to do, but...
Jeff (44:50):
Take out the garbage.
Benjamin Jancewicz (44:51):
Yeah, take out the garbage. But in terms of having it replace people who really do want to do it, or using it as an excuse to not pay people who are really good at what they do, that's terrible. I mean, Devin, I doubt you would disagree that if the institution that you're working for said, you know what, your
(45:15):
teaching methods, they're good, but they're easily repeatable by an AI. We could do a computer-generated version of you and just put out a bunch of videos, and we understand that you've got tenure and everything, but we're good, we'll just use the teaching methods that you
(45:35):
have and auto-generate them, and we're gonna send you out to pasture.
Tom (45:42):
Right. I wanna pick up a little bit on what Devin said there a couple of minutes ago, where he could barely draw a stick figure. I kind of fall into that too. And so, Benjamin, you mentioned there, with things you replace, using AI to replace things that you don't want to do. For me, that would be some type of a graphic image. Now, I wouldn't expect it to be the quality of work that you do
(46:04):
or that Randall's done for us, but in a pinch where I need just a simple header image, maybe for a webpage, where it would take me 20 hours in Affinity Designer or something to put something together that just looks atrocious, because I'm not...
Benjamin Jancewicz (46:24):
I'm not arguing that you should do it. I'm just advocating that you choose your partners wisely, right? Could you talk to a designer who you really like and who you could have a closer connection with, or could you have a machine, which does not...? There's other intrinsic values. So, for example, the person that you hired to create the
(46:47):
album cover that you have now: you now have a closer connection with that person. That person now has more money that they're able to sustain themselves with. You're able to talk about it. You've talked about it nearly every episode since you put it up.
Tom (47:02):
Yeah, true.
Benjamin Jancewicz (47:03):
And so, like, it's providing all kinds of... We do like it. We have stickers. It's providing all kinds of value that you were not able to actually have with the previous AI-generated album cover, you know.
Jeff (47:18):
Yeah, I don't think we would have created stickers out of that. You know, to be frank, that was shitty. You know, it was enough to get us a picture.
Benjamin Jancewicz (47:31):
Sure, but the partnership is what I'm getting at. It was important.
Dr. Devon Taylor, Ed.D. (47:33):
Yeah, no
idea.
Dr. Devon Taylor, Ed.D. (47:35):
The
friendship that you have with
this graphic designer and thatyear fostering that and and
honestly talking about him.
every show is great for him too.
Jeff (47:46):
Yeah, and that's the intent of that as well, is to hopefully, you know, hopefully draw some people to Randall, to get them to use him for their artwork. I think one of the things that I correlate what you're saying to, Benjamin, is, as at least two of you know, I don't know if you
(48:06):
know this, Devin, but I do a lot of theater. I was, up until Tuesday, the president of the local community theater company that I'm a part of. I'm happy to say I'm retired, at least briefly, from the board for a year. Thank the little baby Jesus for that. But my favorite part of theater is the collab, is the
(48:33):
actual creation of whatever it is that we're working on. Do I love to be in a show, up on stage and for an audience? Sure, but all those steps in between, from the sound design and lighting design, to, you know, the rehearsal process, to, you know, costuming, all those little pieces, which, you know,
(48:54):
theater still may be one of the last outposts (live music, maybe, the same thing), you know, where you have a completely interactive process that involves a bunch of people each thinking creatively to create one unique experience. And then you add an audience to that, and every single show is a unique experience, you know,
(49:15):
depending on how the audience responds, how the actors are responding on stage. I feel that, Benjamin, I really do. And I will say you're right with regard to Randall's creating of our show artwork. That was, that's an important piece of this puzzle.
(49:37):
We are, I think, Tom (I won't speak for you) really proud of this, you know, and proud of, you know... sorry, Tom, your droopy eyes and my balding head. It's why we're not models, Jeff.
Benjamin Jancewicz (49:57):
This isn't the first time that we, as humanity, have cut corners in terms of labor costs. We have a very storied history of doing that, and it does not usually turn out well. We, you know, captured Africans and dragged them over to the Americas and had them pick crops because we didn't want to do it.
(50:18):
So, you know, there's countless... The whole reason why we have the capitalist system that we do is because people want to cut corners and would rather sit back and collect money rather than go out and make money. So, I mean, it's not hard to draw parallels, to see that, you know, us being lazy and not paying people properly for
(50:43):
really good work is not great for society. And so, you know, it's not terribly difficult to see that the money that we're spending, that's going into the pockets of the developers of ChatGPT and the AI-generated artwork pieces, is
(51:03):
not going into the pockets of the people whom they sourced the material from, which is kind of a problem.
Jeff (51:10):
So, I mean, yeah, sourcing material is a key feature there. And, Tom, yeah, politics, baby. All right, we want to get close to wrapping this up. I know Tom's favorite subject is not talking about politics, but I'm down with you, I feel, I feel what
(51:30):
you say.
Tom (51:31):
We, we try to sit here on $2,000 MacBooks, by the way. Yes, I mean, it's a risky thing to bring up. We're, we're...
Benjamin Jancewicz (51:42):
We're all, you know, all about Mac products, and we review them and use them. I am one of those users. I am a self-defined techie. But I do think that it's important to also think about the ethics of what we're doing, and also how we're using these tools, and whether or not, you know, we are furthering society towards moral degradation.
Jeff (52:07):
Yeah, by the way, yeah, these, these products that are built, you know, in other countries with, yes, much, much lower prices for labor... battery components are hard to get.
Benjamin Jancewicz (52:25):
Yeah, yeah, they are. Yeah, dude, they are mined up here in the subarctic where I'm currently. So yeah.
Tom (52:29):
There's good in lots of things, there's bad in lots of things. But you've got to pay attention to both. I mean, you do. Yeah, it's not all good, it's not all bad. I mean, when we look at the march of this stuff... You know, we talked about the radio station, you know, 20, 30 minutes ago there, and I've got some history with that, going back to my dumb teenage years.
(52:49):
But even back then, like, the trend even then was, you know, syndication was a thing, right?
Jeff (52:57):
Casey Kasem's American Top 40, right. And so there was a local person...
Tom (53:01):
You're old, I just want to point that out. Saying Casey Kasem, no shit, come on... Yeah, but, um, you know, and with that, and so if you're playing that... here's one, maybe you guys will remember: Dr. Demento. Oh yeah, of course.
Benjamin Jancewicz (53:17):
Yeah, yeah, see, Jeff's old too. Shut up. But I mean, I mean, you're absolutely right. Casey Kasem and the syndication put a lot of radio DJs out of business. Yeah, a lot of it.
Tom (53:28):
Especially, they used to do the top 10, and they were replaced by him, yeah, right. And as soon as they could deliver that digitally, it was even more of an impact, because at a time they would send it to you, like King Biscuit Flower Hour and all those things. They would send it to you, which was weird. I don't know how they pressed so many albums, and I still have
(53:48):
some in my garage. Yeah, I want to check them out. But they would send you the album and you would play it. So there had to be somebody there for that, and then you play your commercials when they had their breaks. Then you start the record again and it would start playing the next segment. But once, kind of, for sure, with satellite initially... once that kicked in, that took out the need for the person to
(54:08):
sit there and do it, because then it was digital, and then you could have a system play the commercials when you needed it. So there's more automation, taking people out of the picture. Radio got deregulated, and so a lot of the mom and pops were picked up by Citadel and Clear Channel and all those, and then you would have one guy in Boston who would do the voice
(54:30):
tracks for, kind of like what you're talking about, Benjamin: Jack, right? Yep, he'd do the voice tracks and send them out to Albuquerque, Santa Fe, Spokane, wherever, and then they would just dump those into the digital system that they have, right? So this stuff's been going on, and, you know, you guys have probably seen the meme, like for Excel. They have the picture of... looks like it's in the 40s or 50s.
(54:53):
For all I know, it's an AI-generated picture, who knows, but it's a bunch of people working in, like, the accounting department, and they're all head down on the ledger, right, and they're all doing... and they're like, Excel replaced this whole wing of the building, right. But the question is whether or
(55:14):
not those people wanted to have those jobs or not.
Benjamin Jancewicz (55:18):
You know, right? Did they love it, or was it hell? Which is hard to answer. But I don't think it is very hard to answer for creatives. I think creatives typically love the work that they're doing. That's why they get into it. It's a risky field. Being a creative is very, very difficult to sustain yourself with, much more than, you know, being a doctor or lawyer or anything else that, you know, you try being, although that's
(55:42):
changing. You know, those fields are already risky, and it's almost unanimous. All the creators are saying, no, we love doing this. Don't take this away from us.
Tom (55:54):
Are there any pieces of that work you don't like? Oh, sure. Then you might consider using it for that, to, like, say, I hate this part.
Benjamin Jancewicz (56:01):
Let it do it, absolutely. And there are ways that I use those tools to do those things. So, like, for example, I used to be a pretty avid wedding photographer, and I used to do a lot of that work, and I would use Adobe Lightroom to edit the first photo. And then I would take all
(56:26):
the settings from that editing and apply them to all of the rest of the photos, all of them. There were 50, 50 photos all taken with the exact same lighting in the exact same place, and I'd be able to just edit them all at once. And, you know, that's a form of AI, it's an algorithm, but I
(56:49):
don't know a whole lot of people that want to sit there and edit a photo one by one and do all of the individual sliders, you know, until it's absolutely perfect, when it's just the same as the previous photo. That's my point. Don't take away stuff that people actually want to do.
(57:09):
You know, if, out of that hundred-some number of accountants that were sitting in that room, one or two of them really loved what they were doing, and the rest of them were like, no, this is hell, give those one or two the Excel document and let them run it from there. And that's what we have done.
(57:32):
You know, the people in accounting departments now are people who love Excel and love what it does, and they're total nerds about it. And more power to them. I'm not one of those people, but, you know, it's a way to let people continue to do what they're passionate about and also pay them properly. You know, that's an important part of this, is to not cut
(57:56):
corners where you're just doing it to save a buck. Yeah.
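The batch-editing workflow Benjamin describes (edit one frame, then apply the same settings to the whole shoot) is easy to sketch in code. This is not Lightroom's API; the Photo fields, file names, and preset values below are illustrative only.

```python
from dataclasses import dataclass, replace

# Toy sketch of "edit the first photo, then apply those settings to
# the rest": capture one set of adjustments and reuse it across a shoot.
@dataclass(frozen=True)
class Photo:
    name: str
    exposure: float = 0.0       # stops
    white_balance: int = 5500   # Kelvin
    contrast: int = 0

def apply_settings(photo: Photo, settings: dict) -> Photo:
    """Return a copy of the photo with the saved adjustments applied."""
    return replace(photo, **settings)

shoot = [Photo(f"ceremony_{i:03d}.raw") for i in range(1, 51)]

# Hand-edit the first frame, then save its adjustments as a preset.
preset = {"exposure": 0.7, "white_balance": 5200, "contrast": 12}

edited = [apply_settings(p, preset) for p in shoot]  # one pass over the whole shoot

print(edited[0])
# Photo(name='ceremony_001.raw', exposure=0.7, white_balance=5200, contrast=12)
```

The design point matches his argument: the tedious, repetitive pass is automated, while the creative judgment, choosing the adjustments on the first frame, stays with the person.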
Tom (58:00):
And I heard... and, Jeff, I think I brought this up when we talked a little bit about this with Kirk McElhearn on the classical show.
Jeff (58:10):
Yeah.
Tom (58:10):
Yeah, Cal Newport's done some pretty good thinking on this stuff, and he's written some books, and he's a computer science professor at Georgetown. And I think it was on the Focused podcast that I heard him talk about this, like, what's it all going to mean for jobs, right? And his take, at least
(58:32):
then, and it's paraphrased, so there's some nuance around it I'm probably missing, but he said that he does expect there will definitely be some contraction, which I think the four of us probably kind of agree with. But he also thought that's an immediate thing.
(58:53):
But then, as we kind of dip out of that, it's going to bring with it some type of a re-creation that will bounce things back.
Jeff (59:06):
I would encourage you to check out his stuff.
Tom (59:08):
Yeah.
Just to get the details, because I didn't do it justice. But, you know, it's just like we were talking, though, with that room full of the ledger workers and everything, and with Excel. And I think that's a good example, though, because, you know, I know, Devin, we use it a lot at the office, more, I think, Google Sheets for collaborative stuff,
(59:28):
but same principles. That has brought a lot of that to just, you know, ordinary people, because you don't really need the accounting, I guess, maybe hardcore accounting, background to put some stuff into a sheet and run some formulas, and then look up some formulas on YouTube, how to use them, and then drop them in. So I like to be cautiously optimistic, but at the same time,
(59:55):
I think that needs to be balanced with a healthy dose of skepticism, because, you know, there are people who will take advantage of other people.
Jeff (01:00:05):
Always.
Yeah, I think that's... that's nothing new, but it's just...
Tom (01:00:08):
You know, it's just the way
the world goes.
Jeff (01:00:11):
Cool, cool.
All right, we've burned another hour of your time, we're happy to say. But, Benjamin, I think you gave us a good last word. Devin, want to get a last word from you, last thoughts? No pressure. I saw your eyes go wide. Okay.
Dr. Devon Taylor, Ed.D. (01:00:28):
This is not your Mac. As I hold up my iPhone, my AirPods, my Apple Watch, right?
Benjamin Jancewicz (01:00:36):
I think that's the other side of this, too, right. I think that it should be mentioned that one of the reasons why we buy these machines is because they last a long time. You know, the Apple devices that we purchase are incredibly long-lasting, and I don't know anybody who has a 10-year-old Dell, but I do have a 10-year-old Mac sitting in my
(01:00:59):
basement, which runs my house and does all of the home automation stuff, you know. And so I think that's also an important part of being an Apple user, is that we don't want to throw things away every two years.
Jeff (01:01:13):
So... Except for your phone.
Benjamin Jancewicz (01:01:16):
What, a new phone every two years? Not me, I still got an older phone. Only upgrade when the camera does.
Yeah.
Jeff (01:01:25):
All right, Devin, final, final thoughts?
Dr. Devon Taylor, Ed.D. (01:01:30):
Now, this was a great conversation. I'm really looking forward to what this brings. It's going to be a really interesting, I think, year, with the 2024 elections coming up. We're going to see some really interesting, controversial use cases of this technology, whether that be deepfakes;
(01:01:50):
we're already seeing imagery and videos from some campaigns. And I think, you know, we are just starting this conversation. So, you know, we live in exciting times, and this is certainly going to be a tumultuous technology. And so, yeah, I'm somewhat excited, a little hesitant, right, a little scared,
(01:02:12):
a little nervous, all the emotions about this thing. But, yeah, great talking with you guys today about it.
Jeff (01:02:19):
Likewise. Thank you, Devin. Thank you, Benjamin. Devin, if people want to find you on the internet, send you a message someplace, where can they find you?
Dr. Devon Taylor, Ed.D. (01:02:27):
Yeah, I am Devin Taylor WV on Twitter and LinkedIn, so those are my two main social medias of choice.
Jeff (01:02:34):
Awesome. And Benjamin?
Benjamin Jancewicz (01:02:36):
So, like I said, I run a small company called Zerflin, Z-E-R-F-L-I-N. You will find it almost everywhere. You can also find me. My last name is pronounced Young Savage, but it's not spelled that way. But if you search for Benjamin Young Savage, you will ultimately find me and find out that I am on every social media
(01:02:58):
network known to man.
Jeff (01:03:01):
Excellent.
And, Tom, where can we find you?
Tom (01:03:06):
I'm going to go to bed.
Benjamin Jancewicz (01:03:09):
Great,
you want people to find you
there, tom.
Jeff (01:03:13):
No, leave me alone. I'm on anti-social media.
Tom (01:03:18):
Yeah, Tom F Anderson on Twitter, TomFAnderson.com on the website. And guys, thanks for coming on. We appreciate it. It's been a good discussion, truly. And if possible, maybe in another six to 12 months, as this thing kind of keeps chugging along, we can get you back on and we can reevaluate and see what a total shit show it's become, or if it's balanced out a little
(01:03:39):
bit, or what.
Benjamin Jancewicz (01:03:41):
Maybe after the election, so we can talk about your favorite topic, Tom.
Tom (01:03:46):
I think I got some vacation
coming up.
It's going to be my show.
Jeff (01:03:50):
I'll be in my room. With a blanket over his head, his eyes closed.
Tom (01:03:58):
Rock baby.
Jesus, call me home No.
Benjamin Jancewicz (01:04:02):
I had a blast coming on here. I would love to be back on again.
Jeff (01:04:06):
Really appreciate it.
Tom (01:04:07):
We'd love to have you back.
Jeff (01:04:08):
I appreciate that. Thank you. Be sure to tell all your friends about it. As usual, you can get to us at feedback@basicafshow.com. We do have stickers, we do have magnets. Have we given them all away, Tom?
Tom (01:04:21):
No.
Jeff (01:04:23):
Come on, people, you know we're giving away free stuff. Take it.
Tom (01:04:27):
Also, Jeff, we've maintained a streak of one rating in Apple Podcasts per show. So if somebody wants to give us a rating, that'll keep that streak going.
Benjamin Jancewicz (01:04:38):
So
we've done 11 shows.
Jeff (01:04:41):
We don't care, we just want you to notice us. We're all about the noticing. If you call us a bunch of dirtbags, we don't care, because we already know that that's true. The website: basicafshow.com. You can find me at Reyes Point pretty much everywhere.
(01:05:01):
I am on LinkedIn these days, looking for work. If anybody wants to pass any along, I'm working, but happy to have more.
Benjamin Jancewicz (01:05:09):
What do you do again? You've got to let us know, so we can replace you with AI.
Jeff (01:05:13):
Oh, good. Yeah, that's something that's most likely going to be replaced with AI. What I do is IT consulting. So, a wide variety of things: project management, you name it, basically anything from project management down. I'll be your director of IT if you want me to; I'd be good at that as well.
(01:05:34):
Theme music, as always, by Psychokinetics, and you'll find links to Psychokinetics in the show notes. As I said before, Celsius 7, one of the frontmen for Psychokinetics, has a new album out, which we'll have a link to as well. Podcast artwork by Randall Martin Design, which, yes, we are very pleased that we have real artwork by a real human
(01:05:57):
being. You should check out Randall's work, and we will also have links to Benjamin's work, the Zerflin website, and also his personal links in the show notes as well. This particular set of show notes is going to have a zillion AI-related articles that we think are valuable, way more than we
(01:06:17):
could have gotten into in any great detail, although I think we covered it well here. So thanks for listening. And see ya. Right, Tom? Right.