December 10, 2025 45 mins

Chris and Daniel talk with returning guest Ramin Mohammadi about how those seeking AI engineering and data science jobs are now expected to come in as mid-level engineers, not entry-level. They explore this growing gap, along with what should (or could) be done in academia to focus on real-world skills versus purely theoretical knowledge.

Sponsors:

  • Shopify – The commerce platform trusted by millions. From idea to checkout, Shopify gives you everything you need to launch and scale your business—no matter your level of experience. Build beautiful storefronts, market with built-in AI tools, and tap into the platform powering 10% of all U.S. eCommerce. Start your one-dollar trial at shopify.com/practicalai


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Jerod (00:04):
Welcome to the Practical AI Podcast, where we break down the real-world applications of artificial intelligence and how it's shaping the way we live, work, and create. Our goal is to help make AI technology practical, productive, and accessible to everyone. Whether you're a developer, business leader, or just curious about the tech behind the buzz, you're

(00:24):
in the right place. Be sure to connect with us on X or Bluesky to stay up to date with episode drops, behind-the-scenes content, and AI insights. You can learn more at practicalai.fm.
Now onto the show.

Daniel (00:48):
Welcome to another episode of the Practical AI Podcast. This is Daniel Whitenack. I am CEO at Prediction Guard, and I'm joined as always by my cohost, Chris Benson, who is a principal AI research engineer at Lockheed Martin. How are you doing, Chris?

Chris (01:02):
Hey. Doing very well today, Daniel. How's it going?

Daniel (01:05):
It's going really well because I have a close friend and previous guest joining us on the podcast today. We went through the Intel Ignite accelerator program together, in different companies. And, yeah, just really excited to have Ramin Mohammadi with us, who is an adjunct professor at Northeastern University and also lead

(01:27):
principal AI engineer at iBaseT. Welcome, Ramin. It's good to see you again.

Ramin (01:33):
Yeah. Thanks, Daniel, Chris. It's always great to be back.

Daniel (01:36):
Yeah. Yeah. I've been excited to talk through these things. And even before the show, obviously, you're kind of living in two worlds. You're living in the industry world and you're living in the academic world. And you've kind of been living in those two worlds for quite some time, which is interesting because you have a perspective

(01:59):
on how, for example, data scientists or AI people or machine learning people are being trained and what those people are actually doing in industry, which I find really intriguing, especially because so much has changed. I guess maybe that's a good initial question: is my perception

(02:23):
right that the role of an AI person or a data scientist or a machine learning person in industry, the day-to-day life of that person, has really changed dramatically over even the past few years? And I'm curious if the academic side has kept up

(02:47):
with that.

Ramin (02:48):
Yeah. So I think that's an interesting question. I think we need to break it down into multiple sections. Yeah. I mean, let's just start first and do a quick review of what has happened, you know, because we're talking about the complete transformation of the AI and data science job market, you know? I mean, if you remember, it was about, like, a decade

(03:11):
ago, back in 2012, when Harvard Business Review called data scientist the sexiest job of the twenty-first century.

Daniel (03:18):
Yeah. That's why I got into it because obviously that
describes what I wanted to be.

Ramin (03:24):
That's right. And if you think about it, that one phrase kicked off a massive gold rush. Everyone wanted it. Universities were spinning up new master's programs overnight, and the promise was pretty simple: get a degree and learn a little bit of machine learning, and you're instantly employable. That promise feels almost like a myth now, you

(03:46):
know?
I mean, if you talk with any new graduate today, especially someone looking for their first role, the feeling is totally different. It's brutal. The market is absolutely brutal. We see job postings for entry-level roles, you know, that require about three years of experience. The demand has changed. It's shifted fundamentally. It's not about what you know about

(04:08):
it from the textbook anymore. It's about what can you build? Can you deploy and maintain a real, like, a scalable AI system? It's kind of like that's the new currency of hiring.

Daniel (04:20):
I think one time, Chris, I don't know if this was us that came up with this discussion, but I remember quite a while ago we talked about full-stack data scientists or something like that. The idea being, like, you could figure out what kind of modeling you needed to do. You could do the prototyping and

(04:44):
POC, but you could also, like, deploy something to actual cloud environments or something like that. I mean, that seems like quite a tall order, Ramin, because you're basically saying be a proficient software engineer, but also be an infrastructure person, and also, I don't know, I've heard a

(05:09):
lot of people say a full-stack engineer doesn't really exist. So yeah, I guess from that perspective, how much of what a data scientist or machine learning or AI person does fits into those different buckets at this point, whether it's software engineering or infrastructure work or actual

(05:30):
knowledge of differential equations or statistics or something.

Ramin (05:34):
I think that's also a great point. So if you think back to the data science job, the idea was that your job was kind of done once you got a good score in the notebook. The classic, my model has 95% accuracy on the test data, you're good, you pass it to someone else. And then, if you remember, I think it was around 2020, with some resources like

(05:59):
Google Cloud's rules of MLOps, it laid out this new reality: that successful ML needs a whole suite of real engineering skills. Things like containerization with Docker, CI/CD pipeline automation, monitoring, and, you know, you have to know that your model actually works in real life,

(06:20):
and then you need to monitor it.
And after you deploy it, you need to basically look for drift, you know? So industry made it really clear that the job wasn't just build the model anymore. It's kind of like you need to own the pipeline. And then, if you think about it, all of a sudden analysts or data scientists went from just being simple analysts to being engineers who build and maintain

(06:43):
the intelligent systems. And just as that engineering bar was being raised by MLOps, along comes the second, maybe even bigger tidal wave: generative AI. And it becomes, like, around 2023, the explosion that you can see in the Stanford AI Index. Basically, they mentioned that

(07:03):
this was not just a cool new tool. This was an automation event. It immediately attacked the entry point in the field, those jobs that it could do, basically. This shift was drastic: from data scientist to MLOps engineer, and all of a sudden, AI engineer, basically.

Chris (07:22):
In addition to that, there's so much more diversity in, you know, as we were talking a moment ago about the notion of the full-stack engineer, especially at the entry level trying to fit into this. And the notion of, like, what is full stack is changing fairly rapidly. There are a lot of different options out there. And not only does that entry-level student have to try to fit into the

(07:44):
notion of what, you know, an organization is looking for, but there's all these variations on that. And if they're not in the right variation of what that organization is looking for, in terms of this abundance of skills that are required for that given position, they're still out of luck. I mean, it's really a crapshoot for students today in

(08:05):
terms of trying to find the right fit and represent, sell their own ability to fit the organization that's looking to hire. I'm really glad that I'm not out there in the job market in that way right now. It would be brutal.

Ramin (08:19):
Yeah. So I think that's true. It's like, if you think about it, as this AI wave comes in with this series of automation tasks, basically this AI made certain things simpler. And those are the types of tasks, like the boilerplate tasks, that you always used to give to the new hire, basically. It's kind of like the groundwork.

(08:40):
And for someone who's an early hire, a recent graduate, those types of jobs were kind of like the first step on the ladder. How to, for example, write a complex SQL query to get the data, write simple Python, and get your hands dirty with the company's data. You learn about it, and also you show your skills, you know? But now it's no longer like that, so you need

(09:03):
to basically find the correct fit, what they exactly want, what they want to build, so I show that I can build that. And there was this study from OpenAI and the University of Pennsylvania where they looked at task exposure to large language models, and the takeaway they had was pretty simple: any repeatable task that used to be given to

(09:25):
juniors is highly vulnerable to AI and automation, basically. So if a junior analyst used to take a whole afternoon to write the SQL queries and make the dashboard, now AI can just write it with a great prompt, right? So basically, the economic case for hiring a big group of trainees and having them do the work has

(09:48):
evaporated. There's kind of like a change. For example, I used to hire lots of interns to basically help with development and speed up the process. And since the AI shift, to be honest, I just use AI for all of those tasks, you know. So this has been this big change. And of course, you

(10:09):
know, we are seeing this shift in hiring strategy kind of everywhere, in big tech or even in startups. They just stopped hiring for potential and they are starting to hire for proven capabilities. It's kind of like that. The paradigm has changed. Companies these days basically can't afford to bring in 50 juniors or spend a couple of years to train them. They would rather

(10:31):
hire five or maybe 10 people that have already built or developed some complete system from day one. So it's kind of like, if you think about it, the new entry-level job is technically what we would call mid-level engineer a couple of years back. This shift is really bad. With this new bar, it's not that

(10:53):
you don't need knowledge. All this deep statistical knowledge, Python skills, they're all essential, but at this point they are kind of prerequisites. They are the ticket to the game. They are not how you win it. You need to prove that you can build, that the company wants what you built, and then, you know, you go for

(11:13):
hiring.
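
As a rough sketch of the kind of junior task Ramin describes being absorbed by AI (nothing shown on the show; the schema, model choice, and prompt below are hypothetical), here is how one might ask an LLM to draft that SQL with the OpenAI Python client:

```python
# Minimal sketch: have an LLM draft the kind of SQL a junior analyst used to write.
# The schema, model name, and question below are hypothetical examples.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

schema = """
orders(order_id, customer_id, order_date, total_amount)
customers(customer_id, signup_date, region)
"""
question = "Monthly revenue and order count by region for 2025."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "system", "content": "You write correct, readable SQL. Return only SQL."},
        {"role": "user", "content": f"Schema:\n{schema}\nWrite a query for: {question}"},
    ],
)

print(response.choices[0].message.content)  # review before running against real data
```

The mechanical part is cheap now, which is exactly why, as Ramin says, the hiring bar has moved toward owning the surrounding system.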

Daniel (11:14):
I'm wondering, because that bar has been raised, like you say, the kind of mid-level positions that we used to call mid level, or maybe the entry-level ones now, how does that change? Because, I mean, maybe this is a negative view that I'm about to give, and I'm very pro higher education. But I also

(11:36):
think, like, even whether you look at computer science or data science education, a lot of that, even before the recent shift that you talk about, it didn't always connect to what you were actually going to do in your day-to-day work, right? So now, not only does it not connect to that entry-level

(11:57):
kind of day-to-day work, but does it now even increase that divide, where, like, how could we possibly train people to come in as mid-level data science folks? Because I think, if I'm interpreting what you're saying correctly, it's not that

(12:18):
AI is making data scientists no longer relevant, or AI or machine learning people no longer relevant. It's still very relevant. It's just the stuff that entry-level data scientists or machine learning people used to do and kind of level up on, that's no longer available. So where are they going to do that? And is it even reasonable for us to think

(12:41):
that universities could help get them up to that level, I guess?

Ramin (12:46):
Yeah, I think... so I would answer that question in two sections. I think one part is about where academia stands right now. And then the second part would be talking about industry versus academia right now. So let's just start with where academia stands. If you think about it, I kind of call this, and I don't want to be negative, I call it an educational

(13:09):
bottleneck.
And to be clear, the first thing is that the faculty that we have in CS, ML, and data science departments, they are all brilliant. They are world class at teaching the fundamentals: the math, theory, history, the research. That foundation is non-negotiable. You need it. But the curriculums often just stop there.
And it used to also be kind of like that. It covers some of the

(13:32):
theory and leaves basically this huge gap between what the student learns and what employers actually need them to do on the first day. As an example, a student might spend the whole semester learning about the math and all sorts of optimization, backpropagation, and stuff like that, which is necessary. But as soon as they graduate, they see this job market

(13:54):
that wants them to deploy on Kubernetes or know how to work with all the different cloud resources. So they know exactly how the engine works, but they've actually never tried to drive a car in traffic.
And, you know, there was this new post by Andrew Ng recently where he argued for this urgent shift in education. I'm

(14:15):
going to paraphrase what he said. He said, knowledge is great, but skills are greater. Meaning that in a field that's moving this fast, you have to teach the practical skills to get the work done. You need to give people the capacity to get meaningful work done by having proper knowledge and proper training.

(14:36):
So this is exactly what the job market is selecting for now. So that's the view that I have on education at the moment. And the second part that we can basically talk about is a comparison of where the risk is, industry versus academia. And there is a really good, basically, study by MIT, a

(14:58):
recent study, basically, where the stats are staggering. Basically, they say that right now about 70% of AI PhDs are just skipping academia and going to the job market, basically industry, directly. And that's a huge brain drain for the universities, you know? And the second thing, which is the real killer risk, and

(15:20):
probably, I'm sure you know all this, like 96% of the major state-of-the-art systems come from industry labs, not from universities anymore. So your university is already falling behind, and then companies like Google, Meta, OpenAI, they are the ones defining the frontier now. They are building the tools. They are setting the standards.

(15:40):
And that's the absolute core of the bottleneck. Academic curriculums move on a cycle of years. Getting a new course approved, updating a textbook, it's slow. By the time a university approves one new course, let's say, for example, an LLM application course, to be added to the curriculum, the tools have already changed three times, you

(16:03):
know? So the entire framework is really different because, you know, it took a while. And that has happened to me also. Like, I developed a course and it took years to get approval to teach that course. And then you need to go back and update everything that you were planning to teach because, you know, the industry has changed already.

Sponsors (16:26):
Well, friends, when you're building and shipping AI products at scale, there's one constant: complexity. Yes. You're bringing the models, data pipelines, deployment infrastructure, and then someone says, let's turn this into a business. Cue the chaos. That's where Shopify steps in, whether you're spinning up a storefront for your AI-powered app or

(16:47):
launching a brand around the tools you built. Shopify is the commerce platform trusted by millions of businesses and 10% of all US ecommerce, from names like Mattel and Gymshark to founders just like you. With literally hundreds of ready-to-use templates, powerful built-in marketing tools, and AI that writes product descriptions for

(17:07):
you, headlines, even polishes your product photography, Shopify doesn't just get you selling, it makes you look good doing it. And we love it. We use it here at Changelog. Check us out: merch.changelog.com. That's our storefront, and it handles the heavy lifting too. Payments, inventory, returns, shipping, even global logistics. It's like

(17:29):
having an ops team built into your stack to help you sell. So if you're ready to sell, you are ready for Shopify. Sign up now for your $1-per-month trial and start selling today at shopify.com/practicalai. Again, that is shopify.com/practicalai.

Daniel (17:54):
So Ramin, I love how you highlighted this kind of divide between academia and industry, like what that is in reality. Anecdotally, I remember, actually, I think it was last year or maybe a year and a half ago, I lived by Purdue University. I was walking through campus and they were just finishing

(18:16):
their... they had this new building, right? And so this was '20-whatever, 2024, right? And it said, like, Hall of Data Science, right? And I thought, my immediate thought in my mind was, like, in 2017 you could have created a Hall of Data Science. Now you

(18:38):
need a Hall of AI. You're building the wrong hall. To their credit, I think they actually, I just looked this up while we were talking, they did rename it Hall of Data Science and AI. To their credit, they at least caught up with the name. Yeah, I guess, obviously you're an educator, and so you see that

(19:02):
there is value in trying to have these, that formal education serves a purpose and is different from maybe on-the-job training. What do you think, or have you seen examples, where these sorts of practical skills are built up in an academic environment rather

(19:28):
than just the theory or the knowledge, as you were drawing the distinction there?

Ramin (19:35):
Yeah. Actually, that's something that we have been doing for almost the last three years. So I basically developed this course, this MLOps course at Northeastern University, almost two years ago, and it has been ongoing. So the idea was this. About three, four years ago, I was a hiring manager, and I used to do lots of interviews for our

(19:57):
team.
And I always basically interviewed smart, motivated candidates who went to good schools, but most of them struggled with the same thing. They understood the theory, but they couldn't build anything, they couldn't ship anything, you know. And that's when it clicked for me that, okay, if the industry, and I personally, as someone who was in industry

(20:19):
and academia, expect these students, these candidates basically, to build a real system from day one, and I know that in the industry we don't teach them that, could we do something about it? So I started working on this course. I built this MLOps course where every semester right now we have about 150 to 170 students in one class, like a huge classroom. And instead of

(20:42):
just learning the concepts, they start by choosing a domain that they actually care about: healthcare, finance, sports, robotics, and whatnot. Then, as a team, they spend the entire semester building one real product. And this real product, it's not just a homework assignment. It's not a toy example. It's a real working system with

(21:04):
deadlines, milestones, deliverables, just like a real, an actual ML and software team. And the best part is how we wrap up the semester. You know, the way that we wrap up the semester is that the students basically present their product at our MLOps Expo, which is a full industry partner event we have been holding for the last, I think, two years now. This

(21:27):
year, for example, we partnered with Google. So we are hosting it in two weeks, December 12, at the Google main campus in Boston, where our students are pretty excited to come. What they will basically do is show, demo the actual product that they have built. And so the whole course is simple. You don't just learn ML

(21:51):
anymore. We teach you how to build with it, you know? And the idea for me was to give students this hands-on experience that companies are looking for right now. And honestly, watching the students go from, I have never deployed anything before, to, me and my team, we built a real product this semester, that's kind of like the best part for me.

Daniel (22:10):
At least one hypothesis that I have here, which I would love your opinion on, Ramin, is, on one side you have highlighted how this kind of gap is widening, even, like, between the theory and where you need to come into a job, at a mid level. At the same time, this revolution of gen AI has

(22:36):
been happening, which in some ways, to your point, some of those things are the things that are being automated by AI, but it's also enabling maybe this, like, younger generation of software engineers, AI people, to actually perform at a higher

(22:56):
level out of the gate, but in a different way. So not like... there's kind of a burden on maybe us as the prior generation of data scientists and machine learning people to understand that students and new hires need to, from the start, be doing their

(23:17):
data science work differently. So just by way of anecdote, we were talking about this a little bit before the show, that my wife owns an e-commerce business. Black Friday, Cyber Monday just happened. Day to day in my company, I'm not doing as much

(23:37):
kind of hands-on work on the product as I was, given my role as CEO, but it was nice to go back. So for, like, four days I helped them during the sale, and I just sat in a room doing customer lifetime modeling and updating forecasts for 2026 and looking at churn and analyzing customer

(23:58):
journey and all this stuff. And number one, it was a ton of fun, but I was kind of coming at it from that perspective and kind of re-entering some of those things that maybe I hadn't done as much for a little while, or even, you know, maybe since the previous year when I helped them with forecasting. Like, I was able to get tons of that done so quickly because I was having AI

(24:26):
honestly write most of the code for me. The thing, though, was I still had to play the data scientist to get from point A to point B. There was no way that I could have just said to any AI system, hey, I want... write a three-sentence prompt and get out all of the lifetime modeling and forecasting and all of this

(24:50):
stuff. I still had to play that kind of data science orchestrator and know what the things were, know what modeling techniques were relevant, know maybe what the trade-offs were, and other things. So do you think, on the one hand, it's maybe depressing that the academic-industry gap is widening, but on the other hand,

(25:14):
maybe there's... am I right that there's an opportunity to actually lean in for these students, in terms of different ways of working, to get to a higher level faster?
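
As a loose illustration of the "data science orchestrator" role Daniel describes (not his actual code; the column names and the 180-day churn rule below are hypothetical choices), the framing of the problem still has to come from the human even if an AI assistant writes most of it:

```python
# Minimal sketch: summarize churn and per-customer value from an orders table.
# Column names and the 180-day inactivity rule are hypothetical choices.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])  # order_id, customer_id, order_date, total

as_of = orders["order_date"].max()
per_customer = orders.groupby("customer_id").agg(
    first_order=("order_date", "min"),
    last_order=("order_date", "max"),
    n_orders=("order_id", "count"),
    revenue=("total", "sum"),
)

# Treat customers with no purchase in the last 180 days as churned -- a modeling choice,
# not a fact about the business, which is exactly the judgment call the human still makes.
per_customer["churned"] = (as_of - per_customer["last_order"]).dt.days > 180
per_customer["avg_order_value"] = per_customer["revenue"] / per_customer["n_orders"]

print(f"Churn rate: {per_customer['churned'].mean():.1%}")
print(per_customer.sort_values("revenue", ascending=False).head())
```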

Ramin (25:27):
I'm not sure about the getting-to-a-higher-level-faster part, but I saw a new talk recently by Neil Hoyne over at Google, and he made a great point about this data science job. He basically was saying that the data science job is not gone, but AI is just forcing it to

(25:47):
change dramatically. It's no longer about analyzing the data or building certain, you know, sorts of dashboards or stuff like that. As we said, with the knowledge you can just prompt it properly, and just having the data, you can build that quickly, you know? So there are certain types of tasks that you used to do to try to climb the ladder, to learn

(26:11):
more and more, but they are not the same anymore. And the expectation is not for you to do the same tasks either, because, you know, if a company is hiring at this stage, they probably want more. But I think it is a really great point that for hiring managers, or for someone, when you hire someone on your team or have new juniors on your

(26:32):
team, you need to also account for helping them, like, mentoring them properly, to be sure that they can evolve and learn. Otherwise, we basically take this cognitive ability from them, because if you just ask everyone to just build, build, and they just use AI, they're never going to learn how to build. So we take that cognitive

(26:52):
ability away from them just to build new, faster products.

Chris (26:57):
Yeah, I think you're really on to something there. In terms of... one of the things that I have done for the last few years is, I'm a capstone sponsor for capstone projects at Georgia Tech, in the College of Computing. And I'm doing that from my nonprofit role, as opposed to my day job.

(27:18):
When I work with different teams there, I think one of the challenges is they're kind of bringing what they know. Certainly, gen AI capabilities have helped them, you know, step up a little bit along the way in terms of figuring things out. I think the area where I've noticed that they're still struggling, the students, is, you know, going back to Dan being a

(27:40):
data scientist over the weekend instead of a CEO in that moment: he's bringing all that business knowledge, you know, years and years and years of business knowledge and understanding about what's really needed in that. And I think that's, you know, one of those things that is part of the struggle at the junior level. There's the kind of concept of, I've learned tools in university, and

(28:05):
I'm trying to bring them to bear, and they're not always the right tools for the organization they've joined. And they don't necessarily know how to combine that with all the other tie-ins that that organization may need, which were not necessarily something accommodated in their academic development. And so, you know, that's kind of exacerbated by the fact that now, with gen AI kind of replacing a

(28:28):
lot of those junior roles coming in, you know, how do you ramp up? It does seem, to your point, like, even though we have new amazing tools in the form of gen AI capabilities, it seems like things are getting harder in terms of bridging that gap. And I'm not sure how you do that. Because it's a combination of both kind of the experience

(28:51):
of being in the real world, along with a fast-moving, you know, a fast-moving technical landscape to navigate. Are you seeing that from your side with students? And how are you tackling some of those subtleties that are there?

Ramin (29:06):
Yeah, actually, definitely. So two weeks ago, I sent out a survey to my students and asked them basically to answer a couple of questions. And I specifically did this for our talk. And so as part of the survey, basically, there were some questions. And in one question, 60%, basically, of the students say that they are taking online

(29:27):
courses on top of what they are taking in school. In another question, 82% of the students say that they're participating in hackathons in order to learn how to quickly build. And about 46% of the time, they are attending workshops, you know? So they are building their own parallel curriculums through side projects, open source

(29:50):
contributions, or certifications through AWS, Google, you know? And that's exactly it. You know, the portfolio has kind of become the new credential. It's no longer about your grades. It's about what you have as a portfolio. And this is also important for us, it's kind of like a dose of reality, that this self-learning path isn't easy

(30:10):
and isn't equitable. You know, it takes tons of time and costs lots of money. And if you want to practice building a real production-grade system, working with a cloud service, that always costs money, you know, those are commercial services. And how many students, like, if you think about it, students are already paying thousands in tuition; they cannot also afford hundreds of

(30:31):
dollars per month for cloud computing, you know, to practice. So it's kind of like a huge change. It creates this resource divide. And at this point, I think the bar isn't just higher. It's kind of also financially more expensive for the students to learn. And right now, for example, shout out to our friends at

(30:52):
Google. They give us lots of credits for our MLOps course every semester, because our students, they can't otherwise build anything in the real world. I personally reach out to lots of providers in the industry and say, hey, you know what? We train these students to use your tools. Give us some cloud credits so they can basically learn and build

(31:14):
on them.
Yeah, that's my take on that.

Daniel (31:19):
Well, Ramin, I am kind of intrigued, because, well, on the one side, you're thinking in a very innovative way about how to bring this kind of skill, or reduce the skill gap, being creative in the academic setting to get people these skills. But also, you're a practicing AI engineer. What have you seen

(31:43):
kind of personally? Because you're already operating at a higher level. Are there also changes, any, like, significant changes, that you've noticed in your day-to-day work over the past few years that have caused you to think about your

(32:03):
day-to-day tasks differently? Like, more so than the entry-level type of folks, but actually ways that you're fundamentally thinking about your workflows, or how you're doing that kind of maybe higher-skill or higher-level data science and AI stuff. I'm wondering if anything stands

(32:25):
out for you.

Ramin (32:26):
Yeah, definitely. I mean, I personally have been part of
this shift. I started my career as a data scientist. Then in 2018, I started as an ML engineer, and it just went up. Then last year, I started as an AI engineer. So I also have been part of this chain myself.

Daniel (32:44):
Data, ML, AI.

Ramin (32:46):
Exactly, the same pattern. And for me, when I look at them, they are kind of similar. If we put data science aside, because that was kind of like... there was no production, there was lots of research especially around it. But when you go to ML and AI, just the terminology is different. They're technically kind of similar. I think the main

(33:06):
difference that I personally felt is that in my day-to-day work I need to work a lot with LLMs, because it's a requirement for certain things, and work a lot with larger models, which requires you to have a better understanding of, you know, like, GPU optimization, how to break up your models and basically ensure that they're optimal. And

(33:30):
those changes, you know, that wasn't something you did maybe a couple of years ago. So I ended up personally trying to read a lot, you know, spending summertime just reading different books to learn, to advance my own career. And I always talk about this with my students. When I learn something new, I bring it to the class. I'm like, okay, I was recently basically reading about this,

(33:52):
and this was really interesting, this is the link, and maybe I sometimes also give them a small lecture on it. But I think, yeah. So the change is there for everyone, not just for juniors. It's like it doesn't matter if you are a principal or a junior, technically. But who's getting impacted more, I think that's the part that's kind of, like, unfair, you know,

(34:13):
to the juniors, technically, or recent graduates.
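
By way of illustration only (nothing Ramin showed on the air; the model name below is a placeholder), the "work with larger models" tasks he mentions often start with loading a model in reduced precision so it fits on the available GPU. A minimal sketch with the Hugging Face transformers library might look like:

```python
# Minimal sketch: load a causal LM in half precision to reduce GPU memory use.
# The model name is a hypothetical placeholder; substitute one you have access to.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "some-org/some-7b-model"  # placeholder, not a real checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision: roughly halves memory vs float32
    device_map="auto",          # let accelerate place layers on available devices
)

inputs = tokenizer("Summarize why model monitoring matters:", return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Quantization, tensor parallelism, and the deeper GPU-level optimization he alludes to go beyond this sketch.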

Chris (34:17):
I'm curious to extend this out a little bit, you know, as we kind of went from the challenge of juniors, and Dan introduced, you know, the challenge of kind of us, you know, as people who are past that point in their lives. But, like, we have fast changes coming, even more, in the sense of, like, we're hitting that point where physical AI is really on the rise now, you know, not just

(34:40):
in certain industries as it has been historically, but in many industries, it's, you know, exploding outward at this point. And we all have challenges in terms of incorporating these new realities into what we're doing and how we're going to learn about it. What does that imply at the university level? When you're getting back to students, and they're already, you

(35:02):
know, you're already trying to bridge the gap into the corporate world or the startup world or wherever they're going to be productive. But you also have this explosion in terms of the places that AI is touching in new and different ways. What are the implications on the curriculum, and on the burden that professors have, to try to get their students ready for that next thing, which is steamrolling over

(35:24):
us already?

Ramin (35:25):
I think it depends. So let me just... I know some other schools are doing this, but I'm going to speak with respect to Northeastern. For example, Northeastern's Khoury College of Computer Sciences, as of this year, basically 2026, they're updating their curriculums, finally. Not everything; it's going to be a small shift, gradual, basically. So they are introducing some more practical

(35:48):
courses into the curriculums.
And they also, for example, they are weaving ethics directly into the coding part of the curriculum. But this is going to be kind of a slower shift on the curriculum side. But on the other end, from the teaching perspective, AI is kind of like a double-edged sword at

(36:09):
this point, because students, they all use AI. They are using generative AI, which is great. I would tell my students: use it, but don't lose it. You know, kind of like, you need to use it, don't lose it. So it's kind of like you need to be sure that you can learn and move faster with this type of thing, not just give away all the autonomy

(36:32):
and just use them for everything. And then, further, from the other end, from the teacher's perspective, it's kind of difficult, because when you give, for example, homeworks or labs to students, especially on coding, and I'm not talking about writing an essay, from a coding perspective, you don't know. You can't even tell if they wrote the code or not. Everyone

(36:53):
returns great code these days, and then there's the homework. There's no way for you to just say whether it's written by AI or not. They're really smart about how to change the temperature to ensure that the result is not detected. So again, this is like a double-edged sword, but also from the

(37:15):
other end, it's like, because there's lots of information, lots of changes in the market, in the industry, in the domain every day. Like, every day you read the news, there's a new article, there's something coming out, and it's hard for academia to basically keep up with that. You know, it's like academia is falling far behind the industry, and this gap is going to just keep expanding the way it is.

(37:40):
And I think at some point, industry needs to help academia. It shouldn't just be academia needing to keep up with the industry. If the industry needs new talent to come in later, you need to step forward and say, okay, let me also help them. Let me start some program. Let me participate in some of the courses that they

(38:01):
have. Otherwise, it's kind of like chasing a ball, academia just constantly trying to keep up, and that's not going to win.

Chris (38:10):
That's fair. And I think that's a good notion that industry really needs to consider as an investment back. I agree. I think it's been largely a one-way street there. I would like to flip the timeline around a little bit, to the students that are coming in. So, and I'm asking this selfishly, I have a 13-year-old

(38:33):
daughter in eighth grade. We've been applying to magnet schools and things like that and getting her ready for her high school experience. And she has never been someone interested in AI; that was dad's thing and all that. But as she has started looking at what she wants to do, she's starting to recognize that whatever that is, AI will impact her in a

(38:53):
significant way going forward. So it's not just the kids that are focused on technology at this point, but all of the kids. And as she does that and they're entering into high school, what advice do you have for what high schools need to do before they come to you? Before you're getting those students and you're trying to prepare them for industry and a career and

(39:16):
moving through their lives. You have students coming to you. What would you like to see from high schools in terms of how they prepare these kids to be better or more ready to come into your care as a professor, so that you can do the thing that you do?

Ramin (39:33):
Yeah, so I think that's a great point. And there are already shifts. I have been asked by neighbors a similar question: hey, my kids, should they go to college for computer science anymore? Should they study this anymore? And I think the answer is yes, you know, but there will be shifts in the market.

(39:54):
And it's not just computer science. It's not just AI. AI is going to impact so many things. Some areas, like, slower, but some areas much faster. And at some point, all of us, basically, somehow we need to learn how to work with AI. And I think it's really good if, from high school, you understand

(40:14):
the concepts, not maybe mastering the theory behind AI, but just learning, okay, in general, how does AI work? There are lots of AI capabilities where you don't technically need the math behind them. You can just build a system just by knowing how to put the components together. So if they could, like, from high school, be part of workshops or participate in some sort of,

(40:36):
like, a training, and build something simple, you know, that automatically opens lots of doors, like a way of thinking for you for the future. Then after high school, you want to go, basically, to universities, and you learn different courses, different concepts, and you're like, oh, I know, maybe I can build something around this. I always think that everyone can be an

(40:57):
entrepreneur, kind of, as long as they have the correct mindset and the energy for it. So if they have already been trained from high school, and not trained in a big way, just in this easier way of training, of teaching, they could potentially advance more in university compared to

(41:20):
students that just want to learn it during university.

Daniel (41:23):
Well, I know that we've talked through a lot of perspectives, both from the industry side and from the academic side. I think all of us on the call, though, are generally excited about certain parts of the ecosystem and the way that they're developing. From that side of things, as we get closer

(41:45):
to the end here, Ramin, as you look at the ecosystem, because, again, you have multiple views of this ecosystem, from the industry side, from the academic side, what's most exciting for you as you're kind of entering into this next year? And maybe it's something like, oh, I can't wait personally to

(42:06):
have the time on a weekend to explore this, or maybe it's something you're already getting into.

Ramin (42:14):
Definitely. Actually, I recently purchased the Reachy Mini by

Chris (42:20):
Oh, yeah.

Daniel (42:21):
Yeah, yeah. The robot, right? The little... it's kind of a desk-type robot, yeah.

Ramin (42:29):
So I'm pretty excited and waiting for that to be delivered. I think the delivery is going to be early January, hopefully, fingers crossed. And I'm pretty excited to work with that and build some capabilities I have in mind. And when I think about all these changes, like, oh, if you had put me back a couple of years ago, I would have never gone for robotics. Oh

(42:52):
my god.
No. You know what? It's not my thing. But now, with this AI change, and I already went through the, you know, content on Hugging Face, which, these guys are great, reading through the documentation, I was like, wow, that's pretty straightforward. So think about how much AI changed it. I feel that I can easily go buy a robot, like a small robot, and I'm planning

(43:13):
already ahead of time. You also have this simulator, so you don't need to wait for it to be delivered. You build the apps ahead of time and simulate that they will work on the robot, and when the robot comes, you deploy it. So that's my go-to, like, what I'm excited for in 2026.

Daniel (43:30):
Yeah, it's kind of crazy. I feel like when we started in this field, it was, like, hard enough to get the dependencies installed for TensorFlow and just be able to run any model. Just that, in and of itself, was like...

Chris (43:49):
Are you trying to give us PTSD? Is that the goal? I mean...

Ramin (43:55):
TensorFlow and CUDA. Oh.

Daniel (43:57):
Yes. Yeah. It's like, regardless, that was the hardest problem. And now you can have a whole digital twin of a robot and do all that. It is pretty spectacular. Yeah. Well, I'm also excited for that. I think we do have one coming here to our offices as well. So I'm excited to see

(44:21):
what that's like. I've never done any robotics, really, other than maybe those, like, what are those? Lego robotics sort of things. But, yeah, excited to see where things are going. Thanks for sharing some of your insights with us, Ramin. It's been a real pleasure, and we hope to have you on the show a third time to let

(44:42):
us know how the robotics went.

Ramin (44:44):
Yeah. I appreciate that. Thanks for having me again, and it was great.

Jerod (44:55):
Alright. That's our show for this week. If you haven't checked out our website, head to practicalai.fm, and be sure to connect with us on LinkedIn, X, or Bluesky. You'll see us posting insights related to the latest AI developments, and we would love for you to join the conversation. Thanks to our partner, Prediction Guard, for providing operational support for the show. Check them out at predictionguard.com. Also, to

(45:18):
Breakmaster Cylinder for the beats, and to you for listening. That's all for now, but you'll hear from us again next week.