Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Alex Kotran (aiEDU) (00:04):
Oh is it?
It's Daniel.
Dan'l Lewin (00:06):
Daniel is my given name, yeah.
Alex Kotran (aiEDU) (00:08):
Yeah, yeah, an apostrophe, L, that's right. You're the only Dan'l.
Dan'l Lewin (00:11):
I know. My mother's gift to me is a talking stick. I like to say that I'm vowel-challenged. That's through the eye of the ear. It's really just a contraction for Daniel, and a family nickname that my father earned at one point.
Alex Kotran (aiEDU) (00:26):
I'm not going to do a very good job giving your bio. You've accomplished and done a lot. What really sparked my curiosity to have a longer conversation with you was hearing some of your war stories from your time at Apple, where you, as I understand it, were leading their strategy to bring computers into the education sector.
The time where there were no computers in schools, and I
(00:49):
remember my school, the first time we had computers, it was iMacs. That was the initial impetus for this conversation. But before we dive into even that story and anything else that you're willing to share with us, maybe you can give whatever your version of the Dan'l elevator pitch is. Oh, just my background or my interesting story, sure.
Dan'l Lewin (01:12):
So yeah, long story short.
For me, I grew up in very western upstate New York and, with hindsight, I look back very fondly on the educational system that I got to work through, the public school system, and in particular a second grade teacher who, when I was seven, taught
(01:33):
us binary, and it was easy for me, it just happened.
When I came to Silicon Valley, there was a lot that occurred. I had a very atypical family life. I attended Princeton and found out very quickly that I was not a mathematician, like within a matter of minutes of arriving on
(01:53):
the campus, and ended up studying and spending my time in the politics department, based upon some professors that I met who believed that politics, as they called it, as opposed to political science, was best defined as relationships between or among people, and that you could scale it structurally to
(02:15):
organizations, and that organizations have power as individuals give up certain things to participate, but the collective, comparative advantage, if you will, of large clusters of people can accomplish great things.
So I was moved by the social issues of the day, and thought in
(02:36):
the end that I would be a civil liberties lawyer. I focused on organizing principles. The 18-year-old vote was clicking in when I was in college. That dates me, if you want to look that one up. It was women's rights, civil rights and rock and roll that organized people towards social and, as far
(02:56):
as I was concerned, positive change. Anyway. So through that educational system, in the end I became a very good student and decided to take time before I went to law school, lost a family bet and came to California, moved to Palo Alto in 1976.
The people that I met as I arrived were all clustered in
(03:20):
this large house on the top of a hill in Los Altos Hills overlooking the Stanford Research Park, and they were doing various and different things, eight or nine unrelated adults living in this large place. One was designing integrated circuits for Watkins-Johnson for guided missiles. One was in med school, the owners of the house.
(03:40):
One was running the Kestrel Institute, which does a lot of crypto work for the NSA, among others, and a lot of math work. And the other one was a woman named Peggy Karp, who was working on what became DNS. She worked with Larry Roberts. So, like, aha, where am I?
And then, the long story short is, through a roommate in college I
(04:01):
ended up going to work for Sony in Cupertino, and that happened to be a 600-square-foot office where the next-door neighbors, the very week that I started, became the people who left Apple's garage. Those Steves, Steve and Steve, and a small scattering of others, Chris Espinoza and the rest of them that were just hanging around in the
(04:21):
garage, showed up in that office space. I spent a couple of years there. I got to know them. I spent those couple of years working for Sony in the storage area. They were bringing out three-and-a-half-inch floppies and storage kinds of things, and that eventually found its way into the Macintosh, as most people know, as I brought that stuff to Apple.
But I became keenly interested in the broader implications of
(04:44):
what was going on with microprocessors and computing, and in particular graphics. And at one point I interviewed to go to work for a company that had these three-quarter-of-a-million, half-million-dollar CAD workstations, because I was really interested in graphics and simulation and understood a little bit about microprocessors
(05:05):
and all the rest. Anyway, the long story short there is, I turned down that opportunity, for behavioral reasons on my part. They told me that I was the most qualified person that they interviewed, but I only had four years of experience and they needed someone with five years' experience. And I smiled and said, well, if that's the case, then I'm not
(05:27):
interested in working for you or your company, because that makes no sense to me whatsoever. And they said, well, but, but. And I said, no, no, sorry. The good news is I went to work for Apple soon thereafter, and it was because you could kind of get your arms around what was happening. These personal devices were these worlds within which individuals were programming and doing interesting things.
Alex Kotran (aiEDU) (05:50):
At this time, the vast majority of Americans didn't have a computer. Oh no, no, didn't even have access, and maybe had never even seen one.
Dan'l Lewin (05:56):
No, this is exactly the case. I mean, you've got to realize, and I know that was a long-winded thing, there were 50 or so companies in Santa Clara County, more or less, that were doing microprocessor-based personal computers. Cromemco, and then Commodore and Atari, all the rest of them started to emerge. Apple was the one that was being run by real adults, if you
(06:19):
will, in the sense that the board, the investors in Apple, brought in highly talented people from HP and other companies who knew how to build the infrastructure to scale that business. And Steve was the, you know, the goose that laid the golden egg. He was this entrepreneur with all this energy that needed to be harnessed, and so they had adults harnessing him and doing
(06:42):
all those things.
And Apple became the golden child with the IPO in December of 1980. I went to work soon thereafter. I missed the IPO. So anyway, when I went to work at Apple, it was the Lisa system, which was the precursor to the Mac. Mac wouldn't exist without Lisa and all the work that was done by that team. That product schedule ran out, I don't know, another 12 or 15
(07:07):
months, because they made some fundamental changes after I started, and so I took my own time, with permission, and brought in people from the university community.
I was posing the question to the executive staff of the company: why try and sell these systems to the Fortune 500 when
(07:28):
it was all IBM mainframes and terminal emulation and color? Because that was a new, new thing, these color terminals and all that other kind of stuff, and all FORTRAN programming, and Apple's all about Pascal and the rest. And I said, you know, the university and research community will buy these products. You won't have to sell them, because you're building what they
(07:50):
invented into real products. But, you know, the company was not interested in that. The sneak preview program that I ran, we brought 90 corporations through. I wore a suit and tie and held a full-day briefing with corporate CEOs. It had to be the CEO and other officers, they all had to be corporate officers. We ran about 90 of those programs over the course of a year.
(08:11):
So I was in front of those people, interacting with them and then absorbing all that. In that window of time, Mary Kay Cosmetics came in. They were an early adopter of the Xerox Star, which is some of the first stuff that was kind of like a Macintosh and like modern-day computing, and Richard Rogers, who was Mary Kay's son, was the
(08:31):
CEO. Steve Jobs came to that sneak preview and watched what I was doing and called me afterwards, and he hired me as well at Apple. So that was pretty straightforward.
But he said, you should come to work for us. He had just taken over the Macintosh project, and he said, we have a passion about what you're interested in, because I'd been writing these reports about, you know, the Fortune 500 is
(08:54):
not interested, we should be thinking deeply about these other markets, and for all these obvious reasons. The long story short there was, I went and looked at what they were doing. There were about 10 people in the group in this little Texaco Towers office space that had been set up, and I looked at what they were doing, and they had a little blueprint to consider the
(09:15):
higher education market. Joanna Hoffman had done that work. She deserves all the credit for thinking it through. But the work needed to be retooled and redone, because I was really smart about channels and distribution and how to figure those things out. Back in those days you could not mail-order a computer to an individual. You had to go to a retailer.
(09:35):
It was against the law to mail-order. I mean, when you're saying, were there computers? No, there were no computers. There were very few in homes. There were hobbyists.
Alex Kotran (aiEDU) (09:44):
They were kits, and that was basically it. I think right now about the exuberance around artificial intelligence. You know, because I was sort of working in AI like 10 years ago, and there was a similar level of exuberance, but it was very narrow, like here in Boston, Cambridge actually, not Boston. And, you know, ChatGPT kind of, like,
(10:09):
massively expanded that, and almost everybody now is sort of thinking and talking about AI, and there's almost like a preconceived notion that the future is here. Like, if you could put your finger on the moment where people start to realize, oh, computers are, like, this is the future, this is what we need to be focusing on. It wasn't in 1980, it sounds like.
Dan'l Lewin (10:26):
Oh, it definitely wasn't. My interpretation of the evolution of personal computing to the mainstream, my view of that, comes really from desktop publishing and the notion of a graphics interface that human beings could reach into, and the idea that what you see on the screen
(10:48):
you could print. So laser printing, if you will. So that's where Macintosh broke through and broke out. We had already left, Steve and I and others, Steve first, obviously, but then we left to start NeXT. But that was a moment. I think you went to NeXT, that's so cool. Yeah, and so that to me was really the beginning of it.
(11:10):
The fundamental difference between what happened, what I would say, in the first 50 years of microprocessor-based computing and where we are today in this window, this 1975 and that 50 years total, so the first 25 years and the second 25 years, is kind of the following. Computers are really good at
(11:31):
optimizing rational tasks, calculating numbers, putting characters on a screen, manipulating an image over time, those kinds of things. And so it was all, you put things into the system, and then they were stored, and you could manipulate them. And there was no networking, there was no wireless, there was
(11:54):
no way to connect. So communications were really nascent, all things considered. It wasn't until, you know, Macintosh kind of broke things out with desktop publishing. Then you had Microsoft and all the channel play through 1995, with the launch of Windows 95 and Office 95. That was a breakthrough.
There was a cultural moment, all this stuff going on.
(12:16):
Cultural again, back to my earlier comments about women's rights, rock and roll. There were cultural movements that really occurred. 1997 is when XML started to become a topic of conversation, with browsers, going back to the NeXT machine on which Tim
(12:39):
Berners-Lee wrote the browser, right, and the web, if you will, on top of NeXT. It was '97 until the early 2000s, and that was where Yahoo and others started to take off. And then Google. You started to see the breakdown of the image of a page into these little components, XML components, and
(13:01):
then you had these web standards to move those payloads around over the network that was basically ever-present. It started to turn the radio networks, the cell networks, into ways for people to connect. Then you started to get social media and those kinds of things to connect people.
So for 25 years, from more or less 1975 to about the year 2000. Because you had the Y2K run-up with all the enterprise
(13:22):
infrastructure and all the money flowing in, because it was, you know, the last time you could sell people things on a promise that, you know, the world would fail if not. So that 25 years was one thing. Then '97 with XML, rolling up into the, you know, 2015, 2020,
(13:43):
and today, that whole period has been about markets reducing to a unit of one, where I could market to you. But the way I marketed to you, the trade was your identity and your data in exchange for free access, other than your connectivity charge, basically. And it's the
(14:08):
organization and the mining of all that data and turning it around into a system using the techniques that you and others have been pioneering for 20, 25 years.
But you now had cloud-based scale infrastructure. Storage was effectively free, communications was ever-present, and everyone was walking around with a device, and everyone had
(14:30):
access to a device. You even had people like Warren Buffett buying into Apple. Why was it? See's Candies, Apple, you know, it's like, what do people buy when times are good and when times are bad? They buy chocolate. Do they have a cell phone? You betcha. So the cultural and social connections came about with the inverse of the business model.
(14:51):
The challenge right now, in my estimation, is that, as has often been the case, the hobbyists are the early access point. Right now, that's the free-for-all of people running around on the web with AI and all the goofy stuff that's happening. The next phase, I think, will be behind the corporate firewall. There will be a lot of very focused and highly tuned and
(15:16):
highly valuable uses of these AI techniques, associated with enterprise value and efficiencies. I don't know what that's going to do to the job market and to people and all of that, but there's going to be a ton of that activity in the next five years or so.
How that all turns into a use case for education and learning is an interesting challenge, because back in the day, the
(15:40):
benefit of a Macintosh in the university setting, which was the initial market entry, and then it went everywhere after that, it was the vet professor. I mean, vet school is harder to get into than med school, and he was an evangelist
(16:01):
for the company, because his point was, when he was in that school there was a small chapter in a large textbook about some feline distemper thing in cats, and here he was, 10 years later, he's practicing, and there's three textbooks on the topic, two journals and this,
(16:22):
that and the other thing. So the question is, how do you gain access to the information that you really need? So part of it was organizing information, right, and then obviously you had the web, and then you have access and all the rest. So breaking that down into, you know, what you trust and how individuals will find their way, is going to be a big
(16:44):
challenge when you start to think about the use of AI in education.
Alex Kotran (aiEDU) (16:50):
Because it's interesting that you talk about this sort of, like, digital transformation within these sort of big bureaucracies. One of the questions that I sometimes get is, well, if AI is going to displace so many jobs, and we have all this AI, I mean, the unemployment rate's at like 4%. It seems very clear to me that there's a disconnect between the capabilities of the
(17:11):
technology and institutions' ability to deploy those technologies, because the bottleneck is no longer what can be done, what can AI do? It's more like, do we have the organizational structures, the people, knowledge and capacity, and also just the political capital? Yeah, and the regulatory framing and all the rest.
(17:33):
And my instinct is that the big push will come whenever there's a recession. I'm not going to predict when that is, but whenever there's a recession, you know, suddenly it becomes a top priority to figure out how do we do more with less, because we just RIF'd, you know, 10% of our staff. And I'm curious,
(17:54):
what do you see as sort of the critical inflection points for computing and maybe the internet, where it sort of pushed past all of that institutional morass in the past?
Dan'l Lewin (18:10):
Well, Y2K was, you know, one interesting moment for that. Because people had sold systems into the enterprise, but there were competing stacks for the solutions, and so the interoperability of data became the big challenge. And so that was a recession that occurred after the Y2K sort
(18:31):
of glut, where everybody sold everything in to save the day. Then the question was, the enterprises turned around and looked at the vendors and said, now you need to make this stuff work together. And so the systems integrators did particularly well at organizing and bringing the data so that it was interoperable. You didn't have the ISO stack.
(18:52):
People were still fleshing that out, you know, in terms of the layers, and, you know, what Amazon was doing and what Microsoft was doing. And even in the PC industry in that period, there was what the EISA bus crew was doing, with Compaq in the lead and everyone else, all the clones, and BIOSes that would allow sort of
(19:14):
Windows to run on a PC stack. And then IBM split and went with Micro Channel, a different architecture. That was crazy, and so they got left out in the cold for a little while, and that's when we started partnering with them, when I was at NeXT, because they were in left field. So these stacks, they just didn't work together, and so that
(19:34):
was a big challenge. The tools are much more sophisticated, and the problems, I think, are more human-oriented today. Yeah, today exactly.
They're more associated with individuals learning new and different ways to do work, and in most cases, using these tools,
(19:56):
they're going to be highly optimized in what they can actually do. Just even back in the beginning of the PC industry, there were the number of product managers required to bring a product to market, and the time it took, and the lead time for publications that were going to print on paper. I mean, you had magazines embargoed for six months because
(20:20):
they had a 90-day print cycle to even get the magazines to market. So I mean, just the pace and scale of things right now, it's lightning speed.
And I think the bigger problem right now, at the broad level for society, is how will people learn how to live in those
(20:40):
systems and use those systems and deploy those systems? And it's always generational. So the question is, if the technology takes a decade, and back in the day it would take a decade, now you can see the same thing occurring in 24 months or less sometimes, how will people adapt? And yeah, unemployment is 4%, but the number of people who are
(21:05):
unemployed who are mid-level managers in the tech sector right now, it's huge, like the qualitative data from being in the Valley and in San Francisco.
Alex Kotran (aiEDU) (21:16):
You know, like two years ago, someone would get laid off from Meta and they would have two jobs lined up, and it's like, wait, I'm getting severance, and then I'm getting a second job right now. And the question was, like, oh, how much funemployment do I get? And now I have friends who are eight months out, great resume, blue-chip,
(21:37):
Big Tech. And that seems to actually be, I wouldn't say it's the rule, but it is sort of this thing that's percolating, and it isn't necessarily reflected in the data. And I don't know that it's AI. I mean, I think that was just, like, interest rates were really low for a while, and there was a lot of, like, maybe overhiring. But there is this odd confluence of companies saying, oh well,
(22:00):
like, you know, we're now AI-first, and a quarter of the code that's written at our company is written by AI. So I think a lot of that is actually just trying to signal to investors who are chasing anything AI. But, I mean, my instinct is there's something to it. Yeah, it's really good at coding. And then there's, like, how much harder it is to get a job as a
(22:20):
software engineer. Like, there's something there.
Dan'l Lewin (22:22):
I totally agree. I have personal experience within my family, knowing highly talented kids and how long it's taken them in between jobs. And it isn't like I don't have access to people who will say, sure, I'll take their email and I'll hand it off to the most
(22:44):
senior recruiters that we have. And they're highly capable, but they're not hiring. And some of it is just a retooling, and some of it is, like you said, the run-up for a period of time, people overhired, right.
Alex Kotran (aiEDU) (23:01):
One of the common refrains is that AI is going to create all these new jobs. And look back at past industrial, you know, technology revolutions, industrial revolutions, they all coincided with increases in employment. I mean, surely computers displaced a lot of the tasks that people were doing. If I think about, like, before Microsoft Excel, I mean, the fact
(23:23):
that people were handwriting spreadsheets is wild to me. I think there's also a challenge now where companies are trying to figure out at what point do we start to raise the bar of expectation, given that we know that employees can now do more with less, but it's uneven in terms of which employees actually have access or have the training or even the
(23:43):
capacity to learn. And I'm curious, with computers, how that was navigated, where, you know, as these tools were starting to become more common, they weren't necessarily in front of every single employee. Like, at what point did companies say, well, you know, people are bookkeeping, stuff that used to take a week now
(24:03):
takes a day, and that's just the expectation? Was there, like, a moment? Or did it just kind of happen gradually?
Dan'l Lewin (24:10):
I think it just happened gradually. I, you know, I remember the cover of a BusinessWeek magazine saying, you know, the paperless office of the future. You know, that was 1975. It was on the verge of not using paper anymore, because you'd do things electronically. I think a lot of this was just, again, generational.
(24:35):
I think it was a slow and gradual and systemic process change, and then before you know it, you've got the next generation coming in. Spreadsheets and Wall Street. You had that whole run-up with Mike Milken. It's like, without spreadsheets they wouldn't have been able to do all that work. It wouldn't have been possible.
(24:55):
So structural change, and the economy radically evolved. I mean, the financial services industry, go back to the '70s, was certainly sub-10%. It's 20% of the economy now. So the world's just a bigger place now. The marketplace, Thomas Friedman, The World Is Flat.
(25:17):
All of that allowed for structural change to occur, efficiency to occur, corporations to rise up to scale, and what we've witnessed, obviously, is the tech sector turning into this. As I said, you know, in my time at the museum, life doesn't exist without computing, period.
(25:38):
When did that happen? Slowly and suddenly. And what were the motivating factors for that? Communications, the beginning of cell phones. I mean, I was on the board with a guy who was at Motorola, who was in the executive suite, who, when they brought in the first phone that had a camera in it, laughed and said, who would ever want a camera in their phone? He said, I was that guy.
(26:00):
So everything that built up as a result of these communicators, right, this goes back to Star Trek in the very beginning. Many of the people who built the industry as we know it today were Star Trek fans, right. And it was Alan Kay in the very beginning of it all, right, saying, okay, here's the way we'll likely
(26:21):
evolve, and it did. I forgot about the education component, and the notion that when we left Apple, the Apple reaction was the creation of the Knowledge Navigator video, to show that the company had foresight for the way that we were going to evolve, and that's
(26:41):
pretty much what we have now.
I think we're in for a very different place. I think the credentialing associated with higher education, just even the structural stuff that's going on in the world today and some of the pressures, US politics aside, but just higher education and credentialing and learning, those kinds of things. We're seeing more around badging and credentialing, and I
(27:04):
can go learn these things and I can prove that I have these skills, and that's very, very doable. Some of the very early people in the computer industry that I knew didn't have college degrees, but they were really good at what they did, because they dove in. I think this is just going to happen a lot faster.
Alex Kotran (aiEDU) (27:22):
That's right, a lot faster, because the past industrial revolutions, past technology revolutions, had these very slow physical bottlenecks. You had to literally make sure that you were buying computers, and then you had to hire these system administrators, and you had to get the interoperability.
(27:42):
And I think it's sometimes a bit of a vanity metric when people talk about ChatGPT reaching, I think it was, like, 100 million users in a month, or some crazy number. But I think it's also instructive that we have basically been building to this moment, and the fact that we now have
(28:06):
widespread access to supercomputing over 5G broadband speeds. I think the key question is, is education able to retool at pace, or ahead of industry? Based on what you saw in terms of the education system's ability to adapt, surely there was a lot of change that came
(28:28):
with computing.
Dan'l Lewin (28:35):
What are going to be the big challenges? Because the educational system has been set up, like most regulatory systems, to resist change for the most part, and so has the generational evolution of how and what instruction is
(28:59):
delivered and what information is presented. What AI presents is a personal GPS for every learner. Right? You make a wrong turn in your car, what happens? Reroute. Right, because you want to get to a destination. If you get 90 on your algebra test, that's an A, but that last
(29:23):
10%, why weren't you rerouted? And why didn't you get a hundred percent? Because the system could easily reroute you to find out what those little things were that you should know, which maybe would make a difference five years from now, or some period of time in your life, you know, or your credentialing for some opportunity of work of some sort.
(29:44):
So that's where the tools can come in. How the system will deploy them, and whether the teachers and the structure of the teachers' unions would allow for those things, that's where I'm concerned. I just think that's my big worry.
So you do see alternative approaches in some of these
(30:04):
different types of schools, and I'm hopeful for some of those, but at a wholesale level, when you break it down to the state level and how our systems are administered, it's hard to be really optimistic. Because, I mean, the last major structural change that strikes me goes back to 100 years ago, which took us to the
(30:27):
microprocessor era. And what was that? Well, human beings traveled at one horsepower. A percentage of the agricultural production in the United States went into feeding horses. People cleaned up after the horses, right. And then we had structural steel, right, because of Ford, and then eventually high-rises, and then aerospace and all the things
(30:48):
associated with that. So that took a long time, but that was transportation, moving people around, and that fed industry and war and all the rest, and all the military stuff. And obviously Silicon Valley exists because of the military stuff, because the Russians had better rockets and we needed to reduce the weight of our payloads, and so semiconductors got fed, and Shockley lived here because his mother was in Palo
(31:10):
Alto. So there you go.
So we're seeing now a very different change, because it's ever-present. People's lives don't exist without it. If the computers go off, we all die,
(31:30):
and the indigenous people somewhere survive and there's a new world. So it's a very different change now than what we've ever experienced before, because we have this ubiquitous access and requirement that these systems function. And on top of that, we're being fed the data that we've given away in exchange for free access. We're being fed that back.
(31:53):
So who's jurying, filtering, organizing and presenting that, through the structural systems that society has created for itself? In a world of agriculture, which goes back to that hundred-year-old thing, it was farming and horses, and that's where the educational system was set up, to support that educational means and end game. So that's where I seriously do worry as a society, especially
(32:13):
in an affluent society, as opposed to one where people are hungry. And so there's the opportunity for Africa, perhaps, right, and people. You will see more and more of that over time, and we're seeing it in other countries as well. Yeah, that's an interesting, almost contrarian take.
Alex Kotran (aiEDU) (32:32):
I'm not sure if we disagree on the potential for the intersection of AI and education. I think what you described is sort of the, I mean, I'm going to say utopian, because I think it's actually really plausible. If you had the right expertise and systems to be able to implement it, no doubt AI could significantly enhance teachers' ability to personalize learning.
(32:54):
You know, there's questions of, like, you know, in Harry Potter there's the Sorting Hat, and there's, like, you know, other sort of dystopian novels where your job is sort of dictated for you when you're born. I think of the world that you described, where, let's say, the education system has enough of your data to be able to personalize learning in that way.
(33:15):
It seems at some point we're going to say, well, it doesn't really make sense for us to have students apply to college, because, first of all, they're using AI to write the applications, and we have all this data. In fact, AI probably can predict, to a far higher degree of accuracy than some human reviewer, not just which college you should go to but, frankly, which career you should
(33:36):
pursue.
You know, perhaps we put our foot down and say, well, that's a bridge too far. But the way slippery slopes work is, you eventually do get to this place, and maybe that's just sort of the inevitability of this. And I think this is hard, because I don't know what your take on AGI is, but sometimes I feel like there's a rhetorical
(33:57):
challenge.
If somebody's assumption about AGI is that we're going to achieve AGI in five to ten years, then a lot of these conversations are a bit moot. I try to assume that if it comes, it comes, but you can't plan around what might actually be a really
(34:17):
difficult gate, you know, for us to achieve a breakthrough technology. So let's just say we don't achieve AGI. But am I missing something there in terms of, you know, once you start relying on these tools, it just feels very hard to go back. It's like imagining going back and using a paper spreadsheet.
Dan'l Lewin (34:36):
Yeah, I think the
human component. Answering that question is complicated because of the cultural differences and societal norms. Who's it tuned for? In that sense, that's the one thing that I kind of struggle with. It's the self-driving car routine.
Alex Kotran (aiEDU) (35:00):
You know,
can you explain that?
Dan'l Lewin (35:03):
Because I think,
well, yeah, in certain societies, if the choice is go off the road and kill yourself, or the choice is to kill the baby and the mom walking across the street, where do you go? And I don't know, was the car programmed in some country
(35:24):
like Korea, where it's different than if it was programmed here? So what are the choices? So I do think that, and it gets into the kinds of things that I studied and that I've been reaching back and reading more again. So cybernetics stuff, Norbert Wiener, right, so it's like
(35:48):
human interaction with machines. Starting to look at, you know, the sense of Western society and humans.
If you go back to one of my first experiences, I had a professor and advisor in college who studied, he was a politics professor, but he was in the State
(36:10):
Department, and he was working on Iranian affairs at the same time that the US government was working with the Shah of Iran and aiming to implement a social security system in Iran, because EDS had done that in the United States, and you're giving a number to every individual in a society that has
(36:32):
no sense of I or self, because of the nature of their culture and the emanational nature of their belief systems. So, as these systems grow up around the world, my curiosity is around how they will be tuned and housed and guided in a world where Internet protocol takes information
(36:53):
everywhere in real time.
So that's the hard part.
So, even in the United States, down to the 50 states and the different schools and the districts and all that kind of stuff, how will they be deployed? Who will be the judge?
(37:16):
That's the stuff that, you know, you pose a question about: the next five to ten years, what is that going to look like? I see smart people trying to wrestle with these questions, I get that, but I don't have a crystal ball.
Alex Kotran (aiEDU) (37:30):
This is the
trouble of when you talk to people who are really informed: they don't actually make predictions. I've actually learned you can basically discount somebody's expertise if they're too sure of what comes next. It used to be you might say, like, the next ten years; I think five years is actually unknowable. Yeah, exactly, I think you're totally
(37:51):
right.
Dan'l Lewin (37:52):
I think you might
be able to extrapolate. I mean, AI has changed the equation, I would say not a little bit but a lot. It used to be that, inside of a large corporation, having spent 17 years as an officer of Microsoft in the end, and having access essentially to all information, a thousand PhDs
(38:14):
doing research, and the way the modeling gets done and everything, you sort of look at it and go, I might be able to look out 18 to 30 months. But then something comes in and it adjusts, and the big machines take a while to realign. But the pace of change, it's
(38:40):
stunning right now, because again we've reached the stage where every individual functions as a result of computing. Everything they do all day long requires it.
Alex Kotran (aiEDU) (39:00):
So to me,
the real question is, what does it look like once you have capable agents and agentic AI?
And the challenge is, I think, simultaneously, people are underestimating the scale and scope of the change; they're sort of so distracted by the individual widgets, and yet I think there is also a lot of overhype, of
(39:24):
course.
So there's that balancing act. I mean, going back to personal computers and the internet, there was a hype cycle, and it would still have been correct to predict that this was going to be the future. The goal might not necessarily have been, let's go
(39:44):
build a website right now.
Dan'l Lewin (39:45):
Maybe that's
what you needed to do. But in those days, and this is some of the earlier comments, we were still struggling with the stack and the communications infrastructure, the cost of storage and the cost of this, and it was a researcher, sorry.
Alex Kotran (aiEDU) (40:06):
in addition
to all of the institutional bureaucracies that had to be changed, you still had all the same challenges you have with AI, but on top of that the physical challenge.
Dan'l Lewin (40:15):
Yeah, the logistical
challenges of the industry, and the competing stacks and the competing approaches and the way those things were going to work, and the lack of interoperability. And the abundance of data is the thing: the data is out of the bag, you know, the genie's.
(40:53):
How it gets restructured and placed into society is the big challenge. And there weren't any real regulatory issues; I mean, IBM was held accountable as a monopoly, and Microsoft as well, in those periods of time, but that was very small scale compared to what the world is facing right
(41:15):
now.
And the structures that need to be considered for the social implications of these devices, and what they can do and will do, whether we like it or not.
Alex Kotran (aiEDU) (41:27):
Can you
give me an example? Because, going back to this, there's a lot of, well, we have to harness the upsides of AI and minimize the downsides, and I think sometimes there's almost a generalization, even with AI ethics. You can go as far as to say, okay, algorithmic bias, sure. But can you help paint a slightly more, you
(41:51):
know, higher-fidelity picture of the type of things where, if it's not regulation, at least having standards and systems in place is going to be important? Yeah, you asked me a question in one of our exchanges about some of the readings and things that I find are informing some of my gut reactions to these.
Dan'l Lewin (42:10):
There's a book that
Verity Harding wrote called AI Needs You, which is a really good assessment of three different technologies, the internet being one, IVF, in vitro, being another, that took 20-year arcs for society to absorb them and for there to be regulatory oversight in some way
(42:33):
, shape or form. One could argue about neutrality, etc. But she looks at these three things that took societal change and structural change and courage for them to materialize and become part of the fabric of modern culture, at least in the West. There's another book: Jamie Susskind wrote a book called The Digital Republic.
(42:55):
It's very well organized into bite-sized chunks about structural and political change, and how the technologies can fit into that, and what kind of courage we need to deploy new structures and new regulatory frameworks. That is, again, generational change in the governance and the
(43:25):
people's skills in government to be able to appreciate that and put those things to work. And that's because, back in the day, the motivator was life-threatening, it was world war, and so those were the motivators where we actually had science advisors and things like that.
Alex Kotran (aiEDU) (43:41):
That people
really organized to save society as we know it, right. I love your emphasis on history, as someone who studied history in college. And you know, it's funny, my background is not technical at all. I have never written a program.
(44:01):
My background was in political science, arts and politics. I fell into AI by accident. I was working for an AI company doing ML in the pharma tech space, and did policy and comms work for that, and, there's a long story, but basically landed at this AI
(44:22):
company, like the first company to basically build language models and linguistics and predictive coding for the legal sector. And the CEO was this, Nicholas O'Connor, he was sort of this visionary, almost like a philosopher king of internet history.
Alex Kotran (aiEDU) (44:43):
He was sort
of intuiting this risk that, I see the technology now being experimented with and deployed, including by our company, and there's no sense of what competence or accuracy or quality is. There are no guidelines as to who should even be equipped to ask the right questions, and certainly judges weren't. And so I started actually building AI literacy for judges, and I then discovered that our schools
(45:03):
weren't teaching about AI, and I was like, well, okay, the future of work is probably also... No, I didn't know that about you. I really appreciate where your questions are coming from now as well.
Dan'l Lewin (45:12):
When we rolled out
the Macintosh, I had a consortium of 24 institutions participating in receiving the systems under this centralized pricing and give-and-take relationship, and one of the requirements was that they encourage faculty to do interesting things with the
(45:33):
computers and then to share those little mini programs and things among themselves. And it was a guy from Boston College who organized a book of stuff, and we printed it all. But for the distribution of that software, we cut a deal with Kinko's, because Kinko's distribution
(45:55):
strategy was to set up shop next to the major research institutions, of which there are 200. And they would build these books, it was called Professor's Press, of chapters from various different publications, and snap together the book for this professor's syllabus for his or her course, manufacturing the floppy disks
(46:21):
with these little quant-based things for the humanities, and all these little programs, and programming languages, like one a physics professor at Reed College built called Rascal, Rascal because of Pascal, all these things. And it all went back to using the existing distribution infrastructure.
The challenge now is the distribution infrastructure for sharing information is ubiquitous, and
(46:44):
everyone is a unit of market to attack, and that's the difference. There were filters by which these things were delivered before, and now there aren't any. So how we harness those filters, and what kind of leadership and structures are being proposed, that's going to be the trick,
(47:07):
and it isn't that there won't be these capabilities where you can look at the bright side. There will always be the downside, and anything that can be used at scale can be weaponized, right? And this can be weaponized by a person or a small group of people in a very different way than, obviously, other types of weapons that require, you know, nation-state action and things
(47:28):
like that.
So that's the worry that I have. And the good news is there's a lot of smart people who are putting energy into trying to solve, or at least point out, the areas that need work and some of the structures that could be put in place, and
(47:49):
I sort of try and hunt down that type of reading where I can. Yeah, I think that's the.
Alex Kotran (aiEDU) (47:56):
That is the
glass-half-full argument. As for putting language models into the public zeitgeist, I've heard it sort of talked about as reckless. But prior to ChatGPT, I founded AIEDU in 2019; I was doing the work in 2018 even before.
(48:16):
It was a very small world, and a lot of people did not take meetings with me who are now banging on our door. But I have no issues getting meetings; it's more about where do I spend my time, right?
Dan'l Lewin (48:36):
What do you
exchange? What questions are you asking? What are you looking for?
Alex Kotran (aiEDU) (48:40):
And so
there's a tremendous power to it. I mean, if you think about presidential campaigns, yes, it's hard to mobilize a really disparate, decentralized system like the US education system, but we do it every four years, twice every four years, with presidential campaigns.
(49:00):
You're mobilizing, you know, about half of the electorate is actually turning out, and you have a very short timeline to do it. And it is not just about hiring a bunch of field organizers; that's part of it, and the Obama campaign did that really well, the Trump campaign really didn't, they outsourced it. Where I think there are common threads is, there is this
(49:25):
sort of centrality and simplicity to the message. And sometimes I think people confuse message and policy. It's not always policy; it's, what is the reason that it's getting you to pay attention to this, to take
Dan'l Lewin (49:42):
an action.
Alex Kotran (aiEDU) (49:44):
And
obviously Obama had that, obviously Trump has that, and I think AI actually poses this. Like the fact that now, you know, teachers are filling conference centers; hundreds of teachers, hundreds of superintendents are taking time out of their day. And we did one big AI summit in Cincinnati, and the Ohio
(50:06):
Department of Education was there, and they were like, we've never seen this level of excitement. And so the question is, how do we channel that attention to what's actually important, as opposed to just trying to sell?
Dan'l Lewin (50:21):
You're on it, and
that's good to hear, what you just said. It's important. In my last role at Microsoft, I did campaign technology for both sides of the aisle for the 2016 election cycle. So I started in 2012, and I had a red team and a blue team, and
(50:41):
understood and studied everything that the Obama folks did. And then, because it was all Microsoft underlying technology but no Microsoft data, we watched all of that happen, including what happened with Cambridge Analytica and Ted Cruz and all that stuff. We watched all of that, and I think the harder part is, it's a
(51:09):
great analogy, that registered voters is a known list and they're all technically adults. So it's the children, the kids, who will have access, whether we like it or not, and that behavior pattern. So I love the idea that you're spending time on that regulatory, statewide education conversation to get people to be
(51:34):
thinking about this, and there will be good models that emerge, and then people will share them, and ideally they'll share them easily and quickly because of the infrastructure that exists. The challenge will always be, and this is what I experienced in my personal life because my kids grew up with it: my oldest was 18 months old when I had a Macintosh prototype
(51:55):
at home in 1982. And so he grew up like a duck with that menu system imprinted in his brain, and when he got into, you know, junior high or middle school, whatever, where they had a little lab, he was controlling the systems and was getting in trouble. But he wasn't really getting in trouble, he was just doing
(52:17):
the things that he knew how to do, and the schools weren't ready for that. So the one question I will have is, what will the schools do when the kids come in more empowered? What will they do?
Alex Kotran (aiEDU) (52:35):
I'm
obsessed with this. By far the number one question that we get, and request for help, is: help us deal with cheating, all the kids are using ChatGPT. And I've heard some people in the space whose response goes something like, well, it's not cheating, these are tools, and if they don't know how to use the tools, they're going to be left behind. So it's not cheating, you just need to change what you're
(53:03):
doing. I actually don't agree with that, because I think teachers can't quite put their finger on it, but what they intuitively understand is that kids today are already running laps around us. The idea that teachers are going to teach students about
(53:26):
how to use AI is ridiculous. Let's be very clear. I actually just talked to someone from Stanford who has an AI makerspace, and the students at Stanford are the mentors for the faculty. That will be the model, to the extent that we're going to be trying to figure out how to use the tools. But there's something sort of fundamental about part of school. And, like, I didn't do,
(53:48):
I really didn't do very much in alignment with what I studied. I mean, I studied politics, but I really just chased the interesting professors. So I studied the history of Brazilian politics, part two, right, and did the same thing. The one thing I learned in school was just the persistence.
(54:09):
I wrote a lot, and so I spent a lot of time, and often the night before a big essay was due, I'd have to sit down and sort of push through the writer's block and get something written. And I really just wonder what it's going to be like in a world where, replace AI with
(54:29):
, like, a really helpful parent, and let's say the parent is really good about not giving you the answer, not writing the essay. It'd still be like something would be off if somebody went through high school, went through college, and they always had their parent sitting next to them, like, oh, are you having trouble with that? Can I help you? Let's talk about it.
Dan'l Lewin (54:48):
No, I totally agree,
you need to be by yourself sometimes. I totally agree with the notion of deep immersion in reading and deep thinking, sort of deep structure as opposed to surface structure. Because I think, if you learn how to ask good questions, what you get back is interesting information, but it's surface level, it's not
(55:09):
the deep structure underlying it, sort of the Chomsky Language and Mind stuff, right? It's just not the same, and that's the one thing that I worry about. You're pointing out the same thing, and I had the same thing, the professors, the same exact example.
(55:31):
I got a degree from the politics department; I took five courses out of the department. Everything else was cognates that were tied into things I was focused on, sociology of the family, the politics of the relationship between men and women, all this other kind of stuff, because I was trying to figure out, how do you organize, what are the organizing principles for driving change in society? So I got to apply them to the computer industry in the early,
(55:53):
early phases. And so that's what I look back on now and say, okay, what are those structural things that we need to be thinking about? And the reality is that, like you, I took one test in college, and other than that I wrote papers, and it caused me to think.
And the last piece of work that I did for this professor, which
(56:17):
was the last thing I did in my quote-unquote college career, I can't call it an academic career, but I got my degree, was: ask yourself three questions and answer them in no more than five pages. We won't judge what you ask, or how you answer, against anything that you read or discussed in the precept, which was a small
(56:40):
gathering, or from the lectures.
It took me a month to think that through, because I knew that was the end game. It's like, what do I ask and why, and how am I going to express that in a concise way, rather than 50 pages of really hard problems?
(57:02):
And so that's the thing, the time, the contemplative time, that's the one thing, and I don't know what that will turn into. It will change the nature of the human being, and this is generational stuff. I mean, it's just going to change us.
Alex Kotran (aiEDU) (57:21):
So I worry
about the digital divide. I was just talking to Tony Wan at Reach Capital, who, interestingly, is in venture capital now; he spent 10 years building EdSurge, with a background as a journalist, and so he has a very unique take as a venture capitalist that you don't always hear.
(57:41):
But basically, the wondering I have is whether the digital divide will actually look something like: the poor kids have all the AI, and you're in a private school, you're reading, you're writing pen to paper, you have a teacher in your classroom, and
(58:02):
sure, I think there's still AI there. I think the teachers are still using AI to maybe personalize. I think personalized learning, as you alluded to, is such an obvious one, especially if you think about, like, students with special needs, certain types of learning and certain types of content.
Dan'l Lewin (58:16):
Right, subject
matter.
Alex Kotran (aiEDU) (58:17):
But I, and
the reason I'm focused on that is because, to the point about what it looks like to have a very clear and crisp message, what I have been really pushing and fighting for is AI readiness, not AI literacy. Literacy is how you use a technology; readiness is how
(58:39):
you are ready for the world, and that might mean math and reading and writing. It might not actually include that much AI literacy. And I go back to 2007, which is not even nearly as far back as you've taken us, but it would feel quite silly in 2007 to say the thing that schools need to focus on is mobile phone literacy. It's also correct to say that mobile phones changed the world
(59:02):
and you couldn't do a job without a phone, but it's, like, necessary but insufficient, right? No, I'm with you. It's a question.
Dan'l Lewin (59:15):
The book Writing to
Learn. I mean, we as humans have learned from taking a stick and putting it in the dirt, you know what I mean. And whether there's a divide, whether you have an economic divide,
(59:43):
whether you have access to someone who can help you, with the nature of humanity as it's currently embodied in most people. And yet, you get the email from the school these days; my fiancée's got a son in high
(01:00:03):
school, and there's going to be a policy about cell phones, that they can't be on. It's like, all right, we know this, they shouldn't be on in school, but they are. I don't know. We are at an inflection point that is unique. And, you know
(01:00:29):
, on the global scale, again, I look at it at a macro level, just the level of cultural differences and societal norms, the rituals, the symbols and the way in which the world will evolve. And then we'll know when we know, but it's
(01:00:50):
happening faster than ever. Do you feel like, going
back to
Alex Kotran (aiEDU) (01:00:58):
your
conversation about access to information, I feel like we're actually sort of past the peak. If I had to guess, it would be around the 2012 timeframe, when there was a really rich universe of high-quality content; the publishing and news industry
(01:01:20):
hadn't collapsed yet, social media hadn't quite pivoted to information, it was still friendships and social networks and connections, right. And now, today, I read some mainstream publications, but really the time I spend, it's on YouTube, and maybe a little bit
(01:01:44):
of Twitter, or X, but everybody I talk to, their information silos are staggering.
Dan'l Lewin (01:01:52):
They're real, yeah.
Alex Kotran (aiEDU) (01:01:54):
And so I
always wonder, like, do we know? I feel like some people will know, and for other people it will just happen, and they'll be oblivious to it having happened.
Dan'l Lewin (01:02:11):
I don't
disagree with what you're saying, and I think that timeframe that you point out is very rational, because it's in that period that I started asking questions of people in the group that I was looking after inside of Microsoft, and they were of another generation than mine,
(01:02:32):
and I was asking them about Facebook versus LinkedIn. I mean, I went to the first public developer event that Facebook had; I was at Microsoft and I was here, and we had a little programming tool to help. And obviously, back in that day, it was "your enemy is your friend," and so we made the investment in Facebook and all the
(01:02:54):
kind of stuff relative to Google and all these things. And Facebook was where you'd communicate with people, only those people that you would friend. This is what people said to me: you're at the beach and you're with your family and you're in your swimsuit. You would send those photos to the people that you would friend
(01:03:16):
on Facebook.
And that was the use case, that level of sharing and community and some level of intimacy and filter. And then LinkedIn was a professional framework, and that was it.
Alex Kotran (aiEDU) (01:03:29):
And I ask
you, as someone who is informed and thinking deeply about this, and I don't want to make it sound like other people are sleepwalking and I'm sort of smarter than them, because I'll be the first to admit, and I'm curious if this is the case for you, I have started to become more and more reliant on the Google search synthesis
(01:03:49):
, where, I mean, it gets it wrong a fair amount of the time, but most of the time it's right. And even when there's potential for a hallucination, I'm sometimes just like, well, let me just try it out, like if I'm troubleshooting something. And so even for someone like me who is really attuned to this, my behavior has pretty much changed. And I'm curious
(01:04:12):
if you've, I mean, are there any sort of meaningful or intentional things that you've done to try to ensure that you don't, you know, sort of haphazardly fall into it? This is a rabbit hole.
Dan'l Lewin (01:04:24):
You mean,
inadvertently placing too much trust or reliance on AI? I may be more of a Luddite in that sense, I mean, whether that's the right phrase or not, I don't know. I tend to read more than most people that
(01:04:48):
I know, and I just find it, sort of, I had a list of books, you know. I picked one out the other day, on the revolution of the self, social change and the emergence of the modern individual from 1770 to 1800. Like, what was going on back then, and how did the
(01:05:12):
modern human of that era emerge, and what was propaganda back then? And what's propaganda now? It's the systems we have right now; it's basically propaganda. So I try, like you.
(01:05:33):
I think what you just described is, if I'm searching for something, I'll try Google, I'll try Bing.
Alex Kotran (aiEDU) (01:05:42):
You're not
using, like, deep research.
Dan'l Lewin (01:05:44):
No, I'm not, and
it's kind of like, maybe it's just me, and maybe I'm stuck, because, I don't know, I've got about 500, maybe 700 old albums that I have digitally, and I like that music. You know, there's a certain, well, it isn't that I don't want to explore other things and things
(01:06:05):
like that, but the majority of my time, I find, that's where it goes. And then when I'm with my kids and my grandkids and stuff like that, that's when I sort of listen and learn and I follow their leads.
Alex Kotran (aiEDU) (01:06:19):
And your
music, so you don't use streaming services? You actually have albums, maybe they're on iTunes? Yeah, I mean, I have them digitally.
Dan'l Lewin (01:06:29):
I was an early
Sonos customer, I've got a whole bunch of zones and all that kind of stuff, and so I got a little server and I put all of it up there. And there are streaming services that Laura will put them on as well, but I'll just, I like jazz, and I got a bunch of old stuff that I really like and things like that.
(01:06:50):
And, you know, probably Bob Dylan, and classical. Charismatic leadership theory, and the new movie is an interesting one as well, about that.
Alex Kotran (aiEDU) (01:07:02):
So,
anyway, what I'm getting at is, I also collected CDs. I mean, the best gift I ever got was my cousin basically gave me this giant binder of CDs and let me rip them. So I had all the CDs in iTunes, and I sort of immediately teleported into,
(01:07:22):
you know, indie rock was sort of the general theme, and so I by far had the coolest music taste of all my friends. I was listening to Radiohead and Flaming Lips and John Coltrane, a lot of jazz, a lot of avant-garde, all of that. Looking back, that was by far the golden age of, and still
(01:07:45):
sort of informs, my music taste today. Today, my ability to conjure up a song or an artist that I like has diminished so much, because the unit of measure is no longer an album, it's a radio playlist that an algorithm creates for you. And it was sold to us under the premise of,
(01:08:10):
like, this is going to help you with discovering new music. And I've talked to enough people that have experienced the same thing, where you're just tapping into a sound stream.
Dan'l Lewin (01:08:21):
Yeah, it's not
the same.
Alex Kotran (aiEDU) (01:08:25):
You're not
invested in it, because you haven't... It's not even about buying the album,
Dan'l Lewin (01:08:29):
it's like listening
to an album all the way through. It's the surface structure and the deep structure, the same thing. That was, again, a takeaway.
These are a couple reads from school: Language and Mind, you know, Chomsky's stuff, was really fundamental; The Structure of Scientific Revolutions, sort of structural change, which is Kuhn's book. Are you familiar with that?
Are you familiar with that?
I'm sure most people, and it'sthat they were.
I think he wrote that in 72 orsomething like that, and I was
in school in 73 to you know.
So it's like they were freshand Wiener stuff was older.
But information theory, some ofthose things became, you know,
(01:09:12):
were really fascinating, and soyou know, we were what you're.
What you're getting at is justthe notion that you actually
invested enough time to go deepand to, and, and so you as a
human, something locked,something locked in, you learn
something.
And that's a structure fromwhich, in a filter by which you
(01:09:34):
see the world and that's what'schanging in a really
unpredictable way.
And so the question, and that'swhy I I err on the side, and so
I want to learn more, a lotmore about what you're doing as
well, over time.
Just how are the societalstructures going to filter and
what will we be trusting?
And so your point about, well,is it an economic divide, where
(01:09:58):
there will be some, wherethere's a human engagement,
where there's a level ofhumanity engagement, where
there's a level of humanity, uh,or will?
Will it be more dystopian, orwill just be slitting around
surface like a water bug?
Alex Kotran (aiEDU) (01:10:13):
Yeah, I don't know. I spent enough time in, you know, the World Economic Forum, the UN, the World Bank, all these sorts of organizations, and there was a lot of thought leadership. There's a lot of thought leadership today, and there was a lot of thought leadership about AI when the World Economic Forum coined "the Fourth Industrial Revolution" back in 2016.
Um, and at a certain point I realized that there's
(01:10:37):
probably not a pathway to policy leaders and the thought-leadership class actually solving this. And I will give myself credit for this, and by the way, I don't think this has actually been put in these terms yet, but I really think that AI is, right now, the protagonist in the story, maybe an anti-hero, you know, depending on where you are. Most of the
(01:10:58):
conversation about AI is about the opportunity it's going to bring, and yes, there's going to be changes and downsides. Um, I think it will be the next recession. Once job displacement gets to a certain point where it becomes unequivocal, I think
(01:11:20):
public perception shifts. My worry is that if we don't build some foundational knowledge about what this is, it's very unpredictable how the negative sentiment and the backlash will be channeled. It's possible that
(01:11:41):
there's a rejuvenation of, let's say, union membership, and people realize that we need to sort of retake power, to have more agency over our data.
And I think there's also, especially in a world where there is so much AI that can manipulate, the one use case of AI
(01:12:02):
where it performs the best against humans is persuasion, more so than coding or writing or anything like that. So if you think about, okay, how do you educate the public to a place where they are at least interacting with the changes? They may not have
(01:12:23):
total agency, but you have more agency if you can at least make decisions. And so for me it was like, well, where do you start?
Well, we actually started out as the American AI Forum, because the idea was we need to educate everybody. And then I quickly realized that schools weren't teaching kids, and I was like, okay, that's an obvious place to start: the future workers. They're at least making a very clear decision,
(01:12:44):
but that's going to be impacted. Even if you say, I really want to be a truck driver, or, you know, drive Uber, you should know that there are companies that literally are planning to displace those jobs. Um, and it's not the entire solution.
And so, you know, when you talk about regulation, we really don't advocate for regulation. I mean, we have an opinion about what guardrails should be in place, but I think that's the hardest part.
(01:13:04):
Yeah, you know, regulation is going to require a lot of thought and strategy. But reaching real people, especially people who've never heard of NeXT, yeah, there's a
(01:13:29):
lot of those, but people who just don't have that technological curiosity, which is completely justified. Sure, no, it's relatively narrow, but in this moment it feels quite urgent and large in scope.
Dan'l Lewin (01:13:49):
I would agree. I think people like Jaron, you know, it's important that they speak, and danah boyd does some other work in that area. You've met danah? No, I actually haven't. I like her work as well. She's also at Microsoft.
(01:14:09):
Uh, she set up a thing in New York called the Data & Society Research Institute, and when the Obama administration did the AI reports, if you will, they had three centers, one at MIT, one at NYU, one at Stanford, I think, and she helped facilitate the one at NYU. She's in Colorado now, but the
(01:14:33):
entity, I don't know exactly where it is right now. But there are a few other researchers, Crawford as well, who have good voices.
Alex Kotran (aiEDU) (01:14:49):
Yeah.
Dan'l Lewin (01:14:49):
Anyway, I agree with this.
Alex Kotran (aiEDU) (01:14:50):
I mean, yeah, part of my hope for this podcast, or this YouTube channel, whatever we're going to call it, is to connect some of these thinkers to educators, to funders, who are right now, I think, understandably very focused on the ball in front of them.
Dan'l Lewin (01:15:08):
Yeah, they have to be, in some senses, because it's like a fire you have to attend to.
Alex Kotran (aiEDU) (01:15:16):
Well, it's like a house that was already on fire.
Yes, you know, yeah, already on fire, right.
Dan'l Lewin (01:15:20):
So now the question is, how do you house it? Yeah, anyway. This has been interesting for me. I appreciate the invitation. Thank you so much for coming. Glad to do it.