Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
How'd you like to listen to .NET Rocks with no ads? Easy!
Become a patron for just five dollars a month. You
get access to a private RSS feed where all the
shows have no ads. Twenty dollars a month will get
you that and a special .NET Rocks patron mug. Sign
up now at patreon.dotnetrocks.com.
Speaker 2 (00:34):
Hey Philly, it's .NET Rocks!
Speaker 3 (00:46):
Wow.
Speaker 1 (00:47):
Yeah, so long since we've been in front of a
real crowd.
Speaker 3 (00:50):
That's a live show, man.
Speaker 1 (00:51):
Well, I drove down here from New London, it's about
four and a half hours, and I had road trip flashbacks.
Speaker 3 (00:58):
Yeah, yeah, no kidding. I think we stopped here on every tour.
Yes, every single one, from the very first one in 2005,
which was entirely your idea because I was brand new,
then 2008, 2010, 2012, 2013. Yeah, every one. We
always came to Philly. Sometimes we got to do a
whole day, like we'd do a Day of .NET thing.
(01:20):
Sometimes it would just be the evening show. But you know,
you could always count on Philly folks to come out
and have a great time. So this is a special one.
It's a one off, it's one of a kind, and
we're gonna have some fun.
Speaker 1 (01:30):
We are going to have some fun. Before I introduce
the other two people who are here with us, let's
talk about.
Speaker 3 (01:39):
1981.
Speaker 1 (01:40):
1981. All right, so do you know what
happened in 1981?
Speaker 3 (01:44):
A few things. Tell us! But I focus purely on
space and technology topics. Well, go ahead. Well, it's
1981, so it is the beginning of the Space
Shuttle program. Well, the successful launch after years of delay. Remember,
Shuttle was supposed to rescue Skylab, and they lost
that in '79. This is 1981; they
missed it by a few years. So STS-1 flies
in April with Young and Crippen on board. They flew for fifty-
(02:07):
four hours, thirty-seven orbits, and a totally successful flight. Really,
no drama, a great test flight. Famously, it was Crippen.
Young was the guy who flew everything: he was in Gemini,
he was in Apollo, but Crippen was the new guy.
And as they were bringing it in for a landing,
Young asks how it flies, and he said, flies like a
(02:27):
brick shithouse. Because it did, right? This is a
hundred-ton vehicle with stubby little wings doing two hundred miles
an hour. You know, it's a tough thing to land. Anyway,
they had to edit that, so if you ever see
the quote from Robert Crippen, it's "flies like a brick."
That is not what he said. Later that year,
STS-2: they turned Columbia around in about six
(02:50):
months and flew it again in November. This time it
was Engle and Truly. Engle was supposed to be on Apollo 17;
he got bumped by Harrison Schmitt, the only scientist
to ever walk on the Moon. And so they flew
Columbia again, and the main job of the second flight
was to test out the Canadarm. Because at first,
(03:10):
this was back when they thought they were going to
fly these things every two weeks, and they were going
to deploy satellites from them, repair satellites, do all
this cool stuff, right? It was a space transportation system,
except for the part where that didn't work.
And so this was the trials for the arm. It
was going to be five days up there working on
the Canadarm, and on the first day they had
a fuel cell failure, and so they were low on power.
(03:33):
They couldn't operate the arm, so they cut the mission
down to two days, turned around, and came back.
Speaker 4 (03:37):
Was that the flight with the white-painted external tank?
Speaker 1 (03:40):
This is when the external tank was still white. Another
voice? What's going on here? Oh, that's Jeff Fritz.
Speaker 3 (03:46):
That's Jeff Fritz, okay. Yeah, so that's back when
the tanks were painted white, and that paint weighed something
like fifteen hundred pounds. So that's why they dropped it eventually.
Speaker 4 (03:54):
It just weighs that much?
Speaker 3 (03:56):
When you get that much of it on there.
Speaker 1 (03:58):
Well, you can't dump the paint overboard once you get to space.
And Columbia was always overweight, right? She was
just the first flying vehicle, and that's why she
never went to the Space Station. She couldn't get to it.
With all that mass, she didn't have enough energy
to get there.
Speaker 5 (04:11):
That was the Space Shuttle that had...
Speaker 3 (04:13):
Space Shuttle Columbia.
Speaker 5 (04:14):
Yeah, and that's the one with the bricks underneath it.
Speaker 3 (04:16):
Well, and those would crumble in your hands,
like they would crush, right? They're very delicate. It was
ridiculously delicate.
Speaker 1 (04:23):
And another voice you heard was Bill Wolf.
Speaker 3 (04:29):
And the geekouts started because of me ranting
at Carl in 2011 when Atlantis landed for the
last time. We were in a bar with some
of this, yes, about how screwed up it was.
You know, the Space Shuttle never really did what it was supposed
to do, all the missions that it could never actually fly.
Speaker 1 (04:45):
And and Richard will never tell you this, but he
had the Space Shuttle manual and read it cover to
cover so he could tell you everything about the Space Shuttle.
Speaker 3 (04:55):
Uh, not entirely true, but okay.
Speaker 1 (04:58):
Uh.
Speaker 3 (04:58):
Anyway, he talked me into, let's do that as a show.
And I thought you guys would crucify us for that,
because it was not exactly software. And I was wrong:
the geekout stuck, and I kept doing them. And this is
'81, and this is the end-of-November,
end-of-December show. The next two shows will be
the annual geekouts, so I've been working on all my scripts
(05:19):
for an update on space and an update on energy generation.
So that's coming. Last space topic and then we'll talk computers:
Voyager 2 does its flyby of Saturn in August.
One of the Pioneers had already come and gone, but this is
the second time we got to look at Saturn; Voyager 1
had gotten there a couple of years before, and we
got some better images, and that sets us up for
(05:40):
eventually Cassini. And this is when they figure out that
Titan very likely has a prototypical atmosphere, and it increases
the excitement for building bigger missions around it. All right, on
the computer side: 1981 is the year that
Sony introduces the 3.5-inch floppy disk.
The beginning of all of that. There were competing sizes: there
(06:03):
were 3-inch versions and 3.25-inch as well
as the 3.5, and the 3.5 would
ultimately be the one that was adopted everywhere.
Not so floppy, though; not a flop at all. Quite
rigid, actually, which is part of the reason it worked so well.
They used the Bernoulli effect inside that rigid housing so
they could increase the density. The original disks were only
360K, then they went to 720,
and then 1.44 megabytes. So much storage.
(06:24):
But by far the most important story of 1981:
the release of the IBM 5150, which you
know as the IBM PC. A 4.77 MHz
processor, the 8088, by default 16K
of RAM, but you could expand it to 256. Later
we would come up with an expansion board to take it to 640.
Speaker 4 (06:43):
But who needs more than 256?
Speaker 3 (06:45):
Well, really 16K, right, the basic configuration, but you could
run it; that's all there was. Of course, it shipped with
CP/M-86, or you could run this brand new,
shiny MS-DOS 1.0. Why did they do
the 8088 instead of the 8086?
Because it was cheaper and more readily available.
(07:06):
The 8086s were in high demand, and they wanted to
build a lot of these machines. They were trying to
get to market faster, and the 8088
was more readily available. It also made the machine
cheaper, because it only had an 8-bit external bus,
and so all of the bus structures were 8-bit
instead of 16-bit. That would be changed with the
later models, with the XT and so on.
Speaker 1 (07:24):
The turbo! Remember the turbo button?
Speaker 3 (07:25):
Turbo button: go up to 8 megahertz, and Donkey Kong goes like... yes,
do not be processor-cycle synchronized. That's not the right
idea. The PC was always intended to be licensable, and
so within a year Columbia Data Products makes the first
of the PC compatibles, and all of our careers. Yeah,
(07:47):
also in the same year, and I never realized this
happened in the same year; I always thought this was
older: the Osborne 1. The Osborne 1 came out in
1981. This was what we call a luggable.
It was literally like a suitcase: keyboard attached onto the bottom,
a five-inch display in the middle, a pair
of 5.25-inch floppy drives on each side,
weighed about twenty-five pounds. Yeah, I remember having it.
I had a Compaq, but not an Osborne.
And there's a good reason for that, because Osborne was
quickly out of business; they Osborned themselves. Yes,
just as they started selling their first machine, they
started talking about what was going to be the second machine,
the Osborne 2, and that stopped sales of the Osborne
(08:27):
1, because everybody was waiting for that, and that bankrupted
the company. So they never built the Osborne 2, and
they became a verb: do not Osborne yourself. Don't talk
about your new product till you're ready to sell it.
Speaker 1 (08:39):
It's okay that we talk about the geek outs because
our show is free.
Speaker 3 (08:43):
Yeah, there you go. One last thing on computing, and we'll
move on: the IETF, that's the Internet Engineering Task Force, publishes
RFC 791, which is the definition
of IPv4.
Speaker 1 (08:56):
Oh there you go.
Speaker 3 (08:57):
Oh yeah.
Speaker 1 (08:58):
A lot of things happened in nineteen eighty one.
Speaker 3 (09:00):
That's the beginning.
Speaker 1 (09:01):
And on the cultural side, I have three things. Ronald
Reagan: inaugurated in January, shot in March. Nice. Yeah, sorry, sorry. No, no,
it wasn't nice. I mean, yeah, it wasn't nice. MTV debuts.
I want my MTV! Gary Numan, "here in my car," right?
(09:24):
"Video Killed the Radio Star" was the first video. And also,
speaking of Donkey Kong, that came out in 1981. Awesome.
But if you played it in turbo mode... and that's it.
So I guess we should do a Better Know a Framework, right,
all right? Play the crazy music! All right, buddy, what
(09:52):
do you got? So, I've got an interesting GitHub repo
that's trending, and I thought it was topical for what
we're going to talk about today. This is from
the dot Mac, and it's called claude-mem. So this
is a persistent memory compression system built for Claude Code. So
(10:12):
it's a plugin. It's a plugin for Claude Code that
preserves context across sessions by automatically capturing tool-usage observations,
generating semantic summaries, and making them available to future sessions.
So it's like client-side context, right? It squishes it
all down, and then on every other prompt it sends
(10:35):
it back. So it allows Claude to maintain continuity of
knowledge about projects even after sessions end or reconnect.
I kind of like that, because, you know, keeping all
that context is kind of expensive these days. But we're
going to talk about all that stuff coming up here.
But first, I guess the next thing we do is
(10:56):
talk about comments.
Speaker 3 (10:57):
Who's talking to us, Richard? I grabbed a comment off
show 1979, so that's brand new, and
that's the show we did with Callum Simpson, talking a
little bit about YakShaver and our ongoing exploration of
apps that are actually out in the market using AI technologies.
This comment comes from Blackweb, and I'm going to summarize
it just a bit, but he says: on a recent project,
(11:19):
I gave some of the AI tools that you discussed a try.
We were trying to build a .NET app using Claude and
Aspire, a native application, in VS 2026 and the brand-new
VS Code. The AI tools included Copilot for Windows 11,
GitHub Copilot with GPT-5, and Claude 4.5.
Copilot was helpful in all aspects of the project, for
coding and research, but it had trouble fixing errors and bugs.
(11:40):
It often gets stuck in a loop and can't get out
of it without some human help. When the Claude 4.5
plugin is added, it gets even faster and
better at all aspects of coding, including script generation, which
it excels at. And now for the bad stuff: GitHub's
pricing model and billing is aggressive and predatory. It
counts premium requests, which is basically anything you send
in VS and VS Code. Your starting free-trial budget
(12:02):
is 25 requests, which most developers will use up
in about an hour. You can set a dollar limit
to buy additional requests for various purposes, including GitHub Actions and
so on. For Copilot Pro, you get 300 requests
for ten bucks, but of course they're all premium requests,
and you're shown a progress bar. I used up 3.7
of my requests, so that would be like
ten, in a few hours. I quickly became paranoid about
(12:25):
exceeding my request budget and running up a big bill.
Claude's pricing model is even more aggressive than GitHub's: starting
at seventeen dollars a month, it must be paid
in advance, about two hundred bucks a year. I
was impressed by GitHub Copilot's and Claude's coding ability, but
I immediately felt like it was aggressively trying to run
up the request score as fast as possible. It is
able to edit my code window, generate new files, enter
(12:48):
and execute CLI commands, and rapidly generate tests, all of
which eats into that budget. GitHub Copilot did generate
new code that worked, but it did it so fast
that I had no idea what it had done, and I
had essentially no knowledge or understanding of the code it generated,
making me dependent on the AI for fixing any bugs
it created, which of course consumes more credits. After a
(13:09):
few hours with GitHub Copilot and Claude, I felt like
I'd been replaced by it and had become little
more than a human moderator or a monitor. I watched it
work and gave minimal input and suggestions while running up
AI-tool-generated bills of hundreds or maybe thousands of
dollars per week. I also wondered if it was being
used to train AI tools to replace me and other developers.
AI coding tools are not creating better developers; they're creating
(13:31):
unemployed or underemployed AI coding tool addicts, who are stuck
with exponentially increasing bills for those tools if they
are independent coders. And having listened to the many podcasts on
AI coding on .NET Rocks, I'm increasingly concerned and
baffled by the endless fanboyism for AI by the two
people I like and respect very much, Carl and Richard.
You guys need to stop drinking the AI Kool-Aid and
(13:53):
start talking to real developers whose workflows and coding skills
are being adversely affected by the AI coding tools that you're
providing with glowing and biased reviews.
Speaker 1 (14:01):
Yeah, you know, I've been waiting for a comment like
this, because there is that sentiment out there, and it's
probably underrepresented in our audience. So, first of all,
Blackweb, thank you very much for being brave enough
to send that to us. And yeah, I mean, I
feel your pain. However, I don't share your experience, and
I shared my experience, which is, you know, mostly positive.
(14:26):
There are times when I don't understand the code that
the AI has written, but I usually ask it to comment
any code that it writes so that I can understand it.
I also develop in pieces. If I, say, need a
full-stack editor for some classes, I'll start with the models,
then the data layer, then any kind of manager, then
finally the UI. I might break all these down into
(14:50):
different prompts if they're too complex. I didn't always think this way,
but now I believe in baby steps, and it works
for me. My agent of choice is the Copilot
CLI, and Claude Sonnet 4.5. I never
felt like it was taking over my job; it's only
made me more productive. And he basically came back by saying,
can we hear a balanced view of AI coding tools
from actual developers, not AI apologists who are trying to
sell a very expensive product? And I said, well, I
am a real developer with real customers. The money they
pay me and the productivity gains far outweigh the
costs of Copilot. I'm not an apologist for any technology
or any company; I'm an independent developer and a podcaster.
And he basically said, you know, you make a good point.
(15:35):
Maybe I just need to figure out more efficient ways
to use this.
Speaker 5 (15:39):
Well, time is money, yep. And if this stuff
saves you time, that's worth a lot. And you could
easily justify paying charges for things like Claude if it's
saving you hours of time.
Speaker 1 (15:53):
Yeah, and therefore making you money.
Speaker 3 (15:55):
Yes. I'm working with teams that are spending thousands of dollars
a month. Yeah, but they're cranking through six-week sprints
in days; they're moving code that fast. And
then my problem is, there are two things that really
hit me with Blackweb's comment, and they're both about billing.
The first is this feels like telcos: we're only
(16:16):
going to find out after the fact how much it
cost you. And the fact that it's learning on the
back of it and fixing itself means it makes
a mess, then cleans it up for you, and gets
paid for it. That is pretty annoying. Yeah, right.
But the bigger thing to me is we still don't
have real pricing, right? This is the blue ocean phase
(16:36):
of this technology, where they're lowballing the figures just
to get us on board. It will be very interesting
to see what the real economics of this look like when
the craziness ends and everybody actually has to pay the piper. Like,
is this actually going to be viable? I don't know
the answer to that.
Speaker 1 (16:53):
So before we have a discussion, we
need to wrap up the comment, because that's basically
what this whole show is about.
Speaker 3 (17:01):
Blackweb, your timing is impeccable, having provided us with
a comment that literally serves as a foundation for the
show about what AI is going to do to
development in these next few years. So
thank you so much for your comment, and a copy
of Music to Code By is on its way to you.
And if you'd like a copy of Music to Code By,
write a comment on the website at dotnetrocks.com,
or on the Facebooks; we publish every show there, and
if you comment there and I read it on the show, we'll send
(17:22):
you a copy of Music to Code By.
Speaker 1 (17:23):
And if you don't want to do that, you
can always just go buy it at musictocodeby.net.
They're 25-minute tracks; there are 22
of them, I'm working on 23 now, and
you can get the collection in MP3, WAV, or FLAC.
All right, now, let me formally introduce the other two
guys that are here with us tonight, starting with Jeff Fritz.
Jeff is a principal program manager in Microsoft's Developer Division
(17:46):
on the .NET Community Team, where he leads development
of live video and online content. Jeff is the executive
producer of the .NET Conf series of online events.
Heard of that? He is also a Twitch and YouTube
partner, as well as the founder of the Live Coders
stream team. You can catch Jeff writing .NET code
(18:07):
with GitHub, Visual Studio, and Azure on his video stream,
Fritz and Friends, at twitch.tv/csharpfritz. Bill Wolf is here, and this is an
important date for Bill, isn't it?
important date for Bill, isn't it?
Speaker 3 (18:23):
Bill?
Speaker 5 (18:24):
It is. So, tonight: I've been running Philly.NET,
which is one of the largest and certainly one of
the oldest .NET communities on the planet, possibly in
the galaxy, and I've been running it for twenty-four years.
I started it in 2001.
Speaker 1 (18:41):
And also, you had something to do with this whole
INETA thing. Do you remember?
Speaker 5 (18:44):
I did. I was a VP at
INETA and I ran the Speakers Bureau, and my job
was to send famous people all over the country to
various user group meetings.
Speaker 3 (18:55):
And that was all.
Speaker 5 (18:56):
funded through Microsoft at that time, and so that was
an interesting project. But I started doing user groups in
1978, so some of you may not have
been around then, but it's been a long time. And
Rob Kaiser, who's been by my side much of this time.
(19:20):
We were very instrumental in something called PACS, which was
one of the first user groups in the country, the Philadelphia
Area Computer Society. And I'll tell just one quick story about one
of my favorite meetings I ever ran at PACS. We
were at La Salle University. This was in the early eighties,
(19:42):
and, because I'm very shy, I was on stage
sort of moderating. And on the panel
there was the team from ENIAC. And because some
of them still lived in the Philadelphia area, we actually
got, remember that, like six people in,
(20:03):
and they explained to us how they built ENIAC, and
how they made it run and how they tested it
and stuff. It was fascinating.
Speaker 1 (20:09):
Was that related to the Jetsons' Uniblab? Because that's the
one I remember from when I was that old. Well, Bill,
this is your last hurrah, isn't it? Yeah?
Speaker 5 (20:20):
Yeah, I'm gonna step down as the Philly.NET leader.
I'll still be involved in the community. I still do speaking
here and there, but certainly not what I used to;
I can't keep up with these guys. They do a
dozen conferences a year. I used to do that, you know.
I used to do the VBITS and, you know,
Connect and all of those things. But I'm, you know,
(20:44):
on my way out.
Speaker 1 (20:45):
Well, we're going to give you a good send off.
That's why we're here. So how about.
Speaker 2 (20:48):
Hands straight down?
Speaker 5 (20:51):
Yeah?
Speaker 3 (20:55):
I want to go back to the comments
from Blackweb, because I think there's this fear
element that I think is pervasive right now. If
you think about what he said, and I paraphrased it
mildly; the original email was longer. It's like the tool got
away from him. Yeah. Claude's especially kind of notorious for that, right?
(21:15):
You get this sort of agentic mode where it's starting
to make changes in all kinds of places. And if
you're really thinking about tokens all the time,
now you're asking it questions about what it's done, which
is burning more tokens, just to get a picture of
what's going on. Like, I think you've got to get
away from the token-trap part.
Speaker 1 (21:32):
I would look into a good system prompt that puts,
you know, a cage around what it can and
can't do, and what it should and shouldn't do.
And I've learned from Jeff and other people that
there's this great Awesome Copilot repository that has
all sorts of prompts for different things. We
(21:53):
just learned about this: there's a C# pro developer
prompt that has all sorts of, you know, guidelines
about how it should write code and all that stuff.
If you're not using those things, you're missing out.
Speaker 5 (22:06):
And running up your charges. Yeah.
Speaker 1 (22:09):
Yeah.
Speaker 5 (22:09):
So, I'm an enterprise architect. I actually have a real
job these days. And one of the things that fascinates me:
I'm at a company that has six or seven thousand people,
and part of my job is to try and keep
track of Azure cost and usage.
Speaker 3 (22:25):
Oh wow, yeah, of course.
Speaker 5 (22:27):
Very few people know how to do that; you have
to have a fintech degree to actually figure out how
much it costs to run a website connected to a
database, with all the security and all the monitoring and
all that stuff. But I can see the same problems
are going to apply in the AI space, until they have some better
(22:51):
tools that help you understand that.
Speaker 3 (22:53):
Part of the problem here is all this stuff
is so immature. Like, we don't have... it's like Agile
at the beginning, right, where nobody really knew, and
you kind of got to do what you wanted because
there was no plan.
Speaker 4 (23:04):
So there are tools built into Visual Studio and
Visual Studio Code, where there are gauges that you'll see
down in the footer of the tool. And I'm sure
everybody here is using Visual Studio 2026, and
yes, I'm even talking to you, dear listeners: you do
want to be using the new Visual Studio. But
(23:25):
in the footer there's a little GitHub Copilot icon, and
when you mouse over it or you click on it,
it'll bring up a little set of progress bars and
show you just how much you're using your Copilot resources.
Now, to a little bit of our commenter's
request requirements: there are different models that you can
(23:47):
use with Copilot that have different, we call them,
multipliers. Right. There are some
more energy-hungry models that you'll see a higher multiplier
run with. And right now, at the time of
this recording, Claude Opus is up there as a...
(24:11):
I think it's a 3x multiplier right now. But
there are other ones that you can use that have
a zero multiplier, like GPT Mini.
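The multiplier scheme described here is simple arithmetic: each request consumes its model's multiplier worth of premium requests from the monthly allowance. A back-of-the-envelope sketch, where the multiplier values are illustrative assumptions rather than GitHub's actual current pricing:

```python
# Back-of-the-envelope math for Copilot premium-request multipliers.
# The multiplier values below are illustrative assumptions, not real pricing.
EXAMPLE_MULTIPLIERS = {
    "mini": 0.0,    # a "zero multiplier" model doesn't draw down the allowance
    "sonnet": 1.0,  # a standard premium request
    "opus": 3.0,    # heavier models burn the allowance faster
}

def premium_requests_used(usage: dict[str, int]) -> float:
    """usage maps a model name to the number of chat requests made with it."""
    return sum(count * EXAMPLE_MULTIPLIERS[model] for model, count in usage.items())

# 300 premium requests a month was the allowance cited in the comment earlier.
allowance = 300
used = premium_requests_used({"mini": 200, "sonnet": 40, "opus": 20})
remaining = allowance - used  # 200*0 + 40*1 + 20*3 = 100 used, 200 remaining
```

The point of the sketch: at rates like these, routing routine work to a zero-multiplier model stretches the same monthly allowance several times further.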
Speaker 1 (24:19):
It's kind of like playing slots.
Speaker 4 (24:22):
Exactly. So what you want to do is you want to
choose a model that's going to help you appropriately for
the types of tasks you're going after. If you're doing
some text summarization, or a little bit of small
code generation, using that mini model is going to really
help you and not cost you a thing. But if
you're trying to one-shot an ASP.NET Core controller
(24:44):
with some great views to go along with it that
use Bootstrap, break out that Claude Sonnet 4.5
and go to town.
Speaker 1 (24:50):
Right.
Speaker 5 (24:51):
So if I ask Copilot in Visual Studio, is it
going to tell me a way to save money
by changing my model?
Speaker 1 (25:00):
I would.
Speaker 4 (25:03):
So, the models sometimes know about each other, sometimes
they don't, and you can help them. There's a fetch
command that you can give GitHub Copilot: hashtag-fetch,
a space, and then you can give it a URL,
and it will go out and analyze that URL. So
you can point it, to your point, Bill, you
(25:23):
can point it to that pricing page on the GitHub
website and have it report back the relative pricing
of the models, and it might be able to give
you some advice as to, hey, let's use this model
instead of that for certain types of tasks.
Speaker 4 (25:38):
There is also auto mode, which is the real slot machine.
Speaker 1 (25:44):
Right, Yeah, that's a weird one, that's right.
Speaker 4 (25:46):
Hey, choose the model appropriate for this task that
I'm asking you about, and it will go and figure
it out. And the best part is, it's not going to
tell you which one it picked.
Speaker 3 (25:55):
And you haven't even turned it all the way up.
Auto mode is one thing; auto mode with unsafe...
Speaker 4 (26:01):
So, Carl and I might have some experience
Speaker 3 (26:05):
Doing a little bit of that.
Speaker 5 (26:07):
Please don't do this at home.
Speaker 3 (26:08):
No, no, no. But again, I have seen folks dialed
in like that, yes, and the burn rate is spectacular.
But they've written their prompts well, and these are not
small things; these are multi-page prompts. Yeah, yeah,
the burn is significant, but the code generation is astonishing.
Speaker 5 (26:26):
So having a very explicit set of instructions in the prompt
can reduce the token burn.
Speaker 1 (26:32):
Yeah, especially with Claude, because, like Richard said, Claude
is kind of like Scooter the intern that's over-eager
to please the boss. So it goes out,
and, you know: I thought you might need some more paper,
so I went to Staples and I bought fifty reams
of paper. That kind of stuff. And like, I didn't
(26:55):
ask you to do that. So in the system prompt
you have to say: don't do anything except exactly what
I tell you to do, right?
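For GitHub Copilot, that kind of cage usually lives in a repository-level instructions file, `.github/copilot-instructions.md`, which Copilot reads automatically. A minimal sketch; the specific rules below are just examples, not a canonical recommended set:

```markdown
<!-- .github/copilot-instructions.md (example constraints only) -->
# Project instructions for Copilot

- Do only what the prompt explicitly asks; do not refactor unrelated files.
- Ask before creating new files or running CLI commands.
- Comment any non-obvious code you generate.
- Follow the existing C# style in this repo (file-scoped namespaces, nullable enabled).
```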
Speaker 4 (27:03):
And this is one of the reasons why,
when I give talks that get into
AI topics like this, I like to come back to this:
I believe that the best folks at writing those prompts
are going to be the elementary school teachers.
Speaker 1 (27:17):
Right, because they're used to talking to kids. You have
to be really explicit.
Speaker 4 (27:22):
They've got a room full of kids that have ADHD,
they've got oppositional defiant disorder, and you don't know what
they're going to do.
Speaker 1 (27:29):
Get in the tub! But the water...
Speaker 5 (27:37):
We forgot that step.
Speaker 4 (27:39):
Right. So it is one thing to be very explicit.
You mentioned the Awesome Copilot repository: our friend Burke Holland
did an amazing job putting together what he called Beast Mode,
which was, right, a series of system prompts that help
the GPT models really get you towards your solution.
But there are also agent descriptions out there, so that you
(28:03):
can bring down those markdown files, load them in, and
get that persona.
Speaker 3 (28:08):
Right.
Speaker 4 (28:08):
When we talk about talking to an LLM, one of
the first things we refer to, before you even get
into defining the context of the problem that it needs
to solve, is defining that persona: who are you going
to behave as, what are you going to do? And
you mentioned the expert C# one: you are an expert
C# developer and you know how to do this,
(28:30):
that, and the other. I saw one of
the agents in that repository that was talking about:
you are fantastic at writing unit tests. And it
even referenced personalities in the tech space and said, you
have the ability to write tests like this person, and
you know object-oriented programming like Uncle Bob Martin, and
you're going to be able to define and refine and
(28:53):
do these things.
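Persona files in that style are plain markdown with a short description and a role statement. A hypothetical example in the shape described here, not an actual file from the Awesome Copilot repository:

```markdown
<!-- Illustrative persona/agent file, invented for this example -->
---
description: 'Unit-testing specialist'
---
You are an expert C# developer who is fantastic at writing unit tests.
You favor small, isolated tests with a clear arrange/act/assert structure,
and you apply object-oriented design principles when refactoring for testability.
```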
Speaker 1 (28:54):
What would Uncle Bob do?
Speaker 4 (28:55):
I know?
Speaker 1 (28:56):
So we talked about this on Code with AI episode
ten, which came out this week: writing agents. And
we just sort of tried to differentiate between an LLM,
an MCP, which is a model control protocol... Model
Context Protocol, I always get that messed up... and an
(29:17):
agent. So the agent is at the highest level,
and the LLM, you would give the system prompt, right,
but the agent has a prompt. That's just like Jeff
was saying: what is your persona, what is your area
of focus? Are you
only going to work on testing? Are you going to
work on code? Are you working on VB code? Are you
(29:38):
working on assembler code? Right, those are the kinds of
things that an agent can help with, but at a
much higher level, and it will do
things on your behalf, not just code, right?
And you just have to be careful that you don't
give it permission to go, you know, sell your house
or something.
Speaker 4 (29:58):
Don't give it that SA password into your
SQL Server.
Speaker 3 (30:02):
No, not the production one anyway. Oh, they're supposed to
be different? That's right, in production. I just told you
guys I still use SA. Like, who uses SA anymore? Right? Yeah?
Speaker 4 (30:17):
Managed identity?
Speaker 3 (30:18):
Managed identities now, yeah. And we should take a break
for these very important messages.
Speaker 1 (30:22):
Yeah, we should. We'll be right back. Mission.
Speaker 4 (30:25):
What a show, it's intermission.
Speaker 1 (30:27):
What do you know?
Speaker 3 (30:28):
Okay?
Speaker 1 (30:29):
I think that was inquisition?
Speaker 3 (30:31):
Yeah, a good time to get popcorn?
Speaker 1 (30:34):
Did you know? There's a dot net on aws community.
Follow the social media blogs, YouTube influencers, and open source
projects and add your own voice. Get plugged into the
dot net on AWS community at aws dot Amazon dot com,
slash dot net.
Speaker 3 (30:54):
And we're back. It's dot net Rocks. I'm Richard
Campbell, that's Carl Franklin. Hey, sitting with our friends Jeff Fritz and
Bill Wolff. We're here at Philly dot net.
Speaker 1 (31:06):
Ten thousand people in the audience on our first
live show in quite a while, and really fun to
be sitting in front of everybody and having a
little fun, making a show about, I think, a pretty
serious topic coming into the end of the year here.
Speaker 4 (31:18):
But in all seriousness, is this the first time you've
done a show where there's a court downstairs in the building,
just in case any of these folks get rowdy.
Speaker 1 (31:27):
That's true.
Speaker 3 (31:28):
Go Birds.
Speaker 4 (31:29):
Yeah, that'll tell you we're in Philly.
Speaker 3 (31:32):
You know, I've been working with a couple of software
development consulting companies now that are not only going all in,
but they're trying to get all of their developers not
only using the tools, but they're coming up with sets
of standards for how they want to make software as
a whole for their customers. And so they're building out
templates for deployment, they're building templates for infrastructure, they're building
(31:57):
out templates for UX, all focused on educating the code generators.
And that's all these things are, code generators, right, to
stay within the lines, so that each developer doesn't have
to get those prompts right; they're just pre-configured for them,
of like, this is the way we do things. There's
(32:18):
some really great things to be said about those agents
and being able to supply to them, or being able
to pass in, a pre-structured prompt that
you can have written down and out there on disk.
Have those templates built out and sitting in markdown format,
so that even when you browse your repository and you
look at the markdown, you can look at the template
and you know.
Speaker 4 (32:38):
That it looks the right way that you expect it to.
And the ability for the AI, the LLM, to generate
and stick to that is really, really good. But I
think Carl made a really good point: don't just stick
to the template, you've got to tell it what
not to do about the template as well.
Speaker 3 (32:55):
Right, so it doesn't get outside of the lines.
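That checked-in, pre-structured template idea can be sketched as a markdown file living in the repository. The file name and contents here are illustrative (GitHub Copilot, for instance, reads a `.github/copilot-instructions.md` file, but your tool's convention may differ), including both the "stay in the lines" rules and the explicit what-not-to-do section Carl's point calls for:

```markdown
# Team conventions for code generators

## Persona
You are a senior developer at our consultancy. Follow these standards for all generated code.

## Deployment
- Use the standard deployment templates in `/infra`; do not invent new resources.

## UX
- Follow the component library in `/src/components`; no ad-hoc styling.

## What NOT to do
- Do not use the `sa` account or put connection strings in source control.
- Do not emit SQL statements inside views.
- Do not introduce dependencies that are not already in the project.
```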
Speaker 4 (32:57):
There's an example I like to show where I gave
an LLM the ability to summarize the weather
scenario for a weather forecast, and I gave it four
options: sunny, cloudy, rainy, and snowy. And I would send
into the LLM a bunch of different forecasts, and inevitably
(33:17):
it would come back and say clear. Nice. Clear
isn't one of them, right? So you have to tell
it what to do and what not to do.
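The failure mode Jeff describes, the model answering "clear" when that isn't one of the four options, is why you validate the reply instead of trusting it. A minimal sketch in Python (the synonym table is an illustrative assumption; a real version might retry the model instead):

```python
# Constrain the model's weather summary to a fixed label set, and
# validate or repair its answer rather than trusting it blindly.
ALLOWED = {"sunny", "cloudy", "rainy", "snowy"}

def normalize_label(raw: str) -> str:
    """Map a model reply onto the allowed labels, or raise so the caller can retry."""
    label = raw.strip().lower().rstrip(".")
    if label in ALLOWED:
        return label
    # The model said something like "clear" -- not one of our four options.
    synonyms = {"clear": "sunny", "showers": "rainy", "overcast": "cloudy"}
    if label in synonyms:
        return synonyms[label]
    raise ValueError(f"{raw!r} is not an allowed forecast label")

print(normalize_label("Clear"))   # maps the off-list answer back to "sunny"
```

Telling the model the allowed options in the prompt reduces how often this fires, but the guard is what actually keeps "clear" out of your data.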
Speaker 1 (33:27):
This is a topic that we've talked about a lot
recently on dot net Rocks, probably much to the chagrin
of many of our listeners: can you guys please
talk about something other than AI? But the fact of the
matter is, it's fundamentally changed the way we write code.
For me it certainly has, and I know for a lot of
our listeners it has. Black Web may be one of
(33:49):
the exceptions, but it really has changed the way I
write code. So here's a really good example, and I
love using my personal little stupid projects. I'm sitting there.
I have a forty nine inch screen, and my wife
and I play this game we call Sherlock, and it's
a logic puzzle game. It's from the nineties, I think,
(34:11):
but it's been ported to a bunch of things anyway,
So it has to live on one side of the
screen or the other because my wife and I sit
together and usually I'm watching something on a streaming service
she's playing, but then when she goes out, I want
to switch them around, right, So I got to switch
them like this. So I basically had Copilot generate a
little Windows app that says, hey, I want to put
(34:33):
this window here and this window here, and swap them.
Just swap them. Yeah. And I literally did it in
fifteen minutes. And I know the code. Do I know
it all? God, no, but you know,
I know how to do it. It's not
a matter of me being lazy. It was just
a fifteen-minute thing, and I was like, wow,
(34:56):
that's so cool.
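The window-swap app Carl describes is small enough to sketch. This is a hedged Python illustration rather than the actual app he generated (which would be a Windows app, likely C#): the swap itself is pure data, and the Win32 `MoveWindow` call that would apply it is shown guarded and commented, since the window handles (`hwnd_a`, `hwnd_b`) are hypothetical and would come from something like `FindWindow` in a real version:

```python
# Sketch of the window-swap idea: compute the exchanged positions as pure
# data, then (on Windows only) apply them with the Win32 MoveWindow call.
import sys

def swap_rects(rect_a, rect_b):
    """Each rect is (x, y, width, height); return the two exchanged."""
    return rect_b, rect_a

left  = (0,    0, 1920, 1080)   # streaming service on the left half of the 49" screen
right = (1920, 0, 1920, 1080)   # the Sherlock puzzle on the right half
left, right = swap_rects(left, right)

if sys.platform == "win32":
    import ctypes
    user32 = ctypes.windll.user32
    # hwnd_a / hwnd_b are hypothetical handles you'd look up first, e.g. via
    # user32.FindWindowW(None, "Sherlock"):
    # user32.MoveWindow(hwnd_a, *left, True)
    # user32.MoveWindow(hwnd_b, *right, True)
print(left)
```

Exactly the kind of fifteen-minute utility the discussion is about: the logic is trivial, and the AI's value is wiring up the platform calls you don't want to re-look-up.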
Speaker 4 (34:57):
But vibe coding is a very powerful thing that I
think is valuable for project managers and program managers, when they
get the idea for a user interface update, for a patch,
and they're able to take some code, they're able to
take some screenshots, some ideas, talk to the AI, generate
those concepts. And for those of us that are in
(35:17):
the industry that are experts, like you, dear listener with
the headphones on, you can do this, and you
know those edge cases that you want it to handle,
and we also know how to tell the AI: hey,
there's a plan here you need to build, because I
am a project planner, I am a project manager, and
I need you to build that project plan, that spec,
(35:39):
so that we can walk through it. And Copilot is
very good at following a document and executing on those plans.
Speaker 1 (35:48):
Well, have you done any vibe coding with AI?
Speaker 3 (35:50):
Yeah?
Speaker 1 (35:50):
What do you think?
Speaker 3 (35:53):
It helps a lot?
Speaker 1 (35:54):
Yeah?
Speaker 5 (35:54):
I just, you know, the old days of, you weren't
sure how to do something, you'd go to the web,
you'd do, you know, a Stack Overflow search,
and copy stuff. And a lot of
programmers just copy exactly what's on the page. That leads
to all sorts of pain.
Speaker 3 (36:10):
Take the error message and copy that into Google.
Speaker 5 (36:15):
But having AI sort of guide me through things that
I haven't done in a while, it really is
a time saver.
Speaker 1 (36:23):
And you know, to the point that I
was making, Bill, you can ask it to
explain itself: I don't understand this, what's it doing? Comment
it, or give me a summary of what you just did,
you know.
Speaker 5 (36:34):
But we also use it for QA, generating tests, DevOps,
you know, figuring out you know, sort of scripts and
recipes for deployments and you know resources. There's a lot
of places that it applies beyond just the C sharp coder.
Speaker 3 (36:55):
You know, if you're billing by the hour as a developer,
you really don't want to...
Speaker 1 (37:03):
Don't want to tell anybody that you even have them installed.
Speaker 3 (37:05):
Yeah, I think if you are using these tools, you've
got to lie about your hours, or, no matter what, I
think you have to restructure how you bill.
Speaker 5 (37:13):
The other side of that, Richard, is I think you
have contractors that work remotely that really understand these tools,
and they can actually juggle multiple clients and bill them concurrently.
Not that that's a good thing, but I think you're
seeing stuff like that going on too.
Speaker 3 (37:32):
Yeah.
Speaker 5 (37:32):
And one of my favorite parts: how many of you
ever have to interview people? Yeah, don't you love when
you ask them a question and their eyes are going
back and forth, and you know that they're talking to
Claude or some other model and saying, give me a
good... come on, give me something.
Speaker 1 (37:51):
Yeah, there's no such thing as a radio quiz anymore.
Speaker 5 (37:54):
No.
Speaker 3 (37:55):
I talked to one interviewer. He said, here's how I
ask the question: close your laptop, now answer this question.
That's good.
Speaker 4 (38:03):
Yeah. Well, you made a good point earlier, Richard, about
how project teams are able to get through a six-week
sprint in days.
Speaker 3 (38:13):
What was a project team for six weeks became a
person with six or seven agents, burning credits like crazy, sure,
you know, running hard. But the results were, again, astonishing.
Speaker 4 (38:25):
What we really can get into now is those
things that were in the parking lot on the Kanban board, right?
They're now in play.
Speaker 3 (38:33):
Well, you're exactly right. I keep telling everybody who
asks about this, everybody's talking about how
we're all going to lose our jobs, and I'm like,
it just doesn't look like it. A, we need the shepherd,
but B, it's not like any of us
were getting to the bottom of our to-do list, right? Yeah.
There's always more. And how many other projects don't
even get on the board, exactly, because the backlog is
(38:55):
so far back. Exactly. You go back to the Luddites
and industrialized cloth production. While it was disruptive at
the time, it also lowered the cost of cloth enough
that people started owning more than one set of clothes.
Imagine, right? How many software projects have just never been
written because they couldn't even get to the table?
Speaker 1 (39:18):
It's interesting that the Luddites were the technologists of the day.
They had the knowledge of how to make cloth
on their looms, and it was suddenly threatened by machines.
But if they were us, now, like
us, we have the knowledge of how to build software,
so we can interact with the AI. They didn't have AI, right,
(39:39):
but they could probably be more productive with the machines then.
Speaker 3 (39:42):
Well, many of them could follow that story.
Over time, they got retrained on the new machines. Not
all of them; lots of people didn't want to play,
but a lot of them did, and it did change
things around. It's hard to be in the disruption, but
we're in the disruption.
Speaker 1 (40:00):
So it reminds me of the CAD revolution, you remember that.
So my mother was actually a draftsman at Electric Boat.
Electric Boat, Yeah, and she she did everything by hand
and she was very good at it. And her friend
got on the CAD system. She's like, I don't want
to do that, right, But they left her alone, and
you know, she listened to her music and did her
drawings and stuff, and when anybody wanted any real detail
(40:23):
work done by hand, they called her.
Speaker 4 (40:25):
I was one of the last classes at my university
that had a drafting class required in engineering. I
had to learn how to use.
Speaker 1 (40:36):
All the tools and the slide rule.
Speaker 4 (40:38):
Yeah, a little bit of slide rule.
Speaker 5 (40:41):
So that really makes me remember: my father was an
electrical draftsman. Yeah, and he did the routing for electrical
circuits in nuclear power plants. He worked on Three Mile Island.
You can imagine... well, you know, it is
something that happened, yes, and, you know, I
(41:01):
doubt if his circuit caused the problem, but you
can imagine all the drawings it takes, every little wire,
back then, because there was no
TCP/IP.
Speaker 1 (41:13):
Well, Hoover Dam was built with slide rules. Yep, no calculators,
no computers. Still standing.
Speaker 4 (41:18):
So can we pivot the discussion a little?
We've been talking about Copilot, Claude, some of those tools
that help us as developers. But I'd like to get
outside that box a little bit, and something that
I want to make sure that our developers
here in Philadelphia and the listeners
are thinking about is not
(41:40):
just how to use the AI to get your job done.
Speaker 3 (41:42):
To build a.
Speaker 4 (41:42):
Website, but how can you help your customers use the
AI to deliver their requirements, their needs, to their customers?
There's more to that than just slapping a textbox
into the middle of your applications so they can ask
questions about their reports. There's a lot more that we
(42:04):
can do with that. That's
a great feature. Oh, without a doubt, it's a great feature.
I can recommend a podcast, yes. But there's
things that you can do to return value to those folks,
and I think that's someplace that we need to help
the enterprises understand and use the AI better, so that
the developers can help their end users,
(42:28):
whether it's somebody working in a call center, somebody who's
working in accounting or doing financial analysis, help those folks
get that same multiplier that we're realizing as developers.
Speaker 5 (42:38):
Some of the work I'm doing is squarely there, and
it relates to CAD, because I work for a company
that manufactures construction components to build buildings, and part of
what we're looking at is AI. How do you, as
an architect, how do you design a building for the
(43:00):
coast of Florida so it can handle hurricanes? And
if you ever see those pictures after
the storm, you know, a whole bunch of buildings are flat,
and there's one or two cinder block buildings that
are standing straight. And that's probably
the company I'm at. It has to do with
(43:22):
how you tie all the pieces together with.
Speaker 3 (43:24):
Metal, so the roof doesn't get pulled off.
Speaker 5 (43:25):
Yes, yes. And we do all those calculations, but
we're trying to figure out how to get AI
to help do that kind of work.
Speaker 3 (43:33):
That's tricky.
Speaker 1 (43:34):
That's that's cool.
Speaker 3 (43:35):
Yeah. One of the recurring shows
for us has been Vishwas's show, where he's now left,
gone to a startup, to build LLM tools for
generating RFP responses to government contracts. Oh, and it's been
fascinating to listen to him as they've learned
more and hit the challenges, and sort of that progress.
(43:56):
It's been a couple of years of this now. Like,
we're just trying to pull a narrative together. He's
taking on a tough problem, and he's learning more about
it as he goes. These different cases are part
of us trying to find stories about what's working and what isn't.
Speaker 1 (44:08):
But a lot of these...
Speaker 5 (44:09):
The biggest problem I have with AI is dealing with
friends and family, oh god, yes, and holiday parties and
they're like.
Speaker 1 (44:18):
Hey, AI is ruining everything.
Speaker 3 (44:20):
I'm like, oh, please don't.
Speaker 5 (44:21):
And they come up to me at the party and
specifically like, you are responsible, You're doing this.
Speaker 1 (44:27):
Right, it's your fault.
Speaker 3 (44:29):
Well, AI seems to be the current scapegoat for whatever
is happening. It is. But listen, AI didn't lay
anybody off. That was people. People laid folks off,
and they may have used AI as their current excuse,
but we're also seeing lots of them walk that back afterwards,
because it doesn't get the results they wanted, or
it actually wasn't what they were trying to do in the
first place. Right.
Speaker 1 (44:50):
I know that, you know, I have a young Gen
Z daughter, and her generation feels hopeless, you know,
about the future, that, you know, AI is going to take
everybody's jobs and all that stuff that you were talking about, Bill.
And I just have to constantly express to her
that, you know, this is no excuse to not do
(45:11):
your best at what you want to do in life.
You know, if you want to learn something for the
sake of learning it, learn it. It doesn't matter
if there's an AI that knows it too. You go
do what you want to do and be the best
at whatever you can.
Speaker 3 (45:26):
And AI can possibly help you.
Speaker 4 (45:29):
And yeah, that's a great point, Bill. I've
talked to some folks, some artists, who really don't
like, and I agree with them, really don't like
using AI to generate cartoons, images, photos. You're generating those
images because you can't get a photographer or the personalities
(45:50):
that you want to appear in those pictures;
you're assembling them for some reason, because you can't
get that to happen, or you can't get an artist to
draw a cartoon. But what you can use the AI
for is to analyze something that you drew or something
that you made. Give it a picture and say, you
know what, take a look at this, review it, and
(46:13):
tell me how I can make this better. And then,
to your point, I can learn how to improve my
skills and do more.
Speaker 3 (46:20):
On the consulting side, I'm now talking to folks who
are making money off of cleaning up vibe-coded
projects that don't work. Oh yes. Well, and I feel
like we're in a stupid period, right? This is
the early days of this tech. We're still in the
AI bubble, which is a very stupid period, and people
(46:41):
who don't know how to use these tools are getting
in over their heads. Yeah, like we saw with Access
back in the day. Lots of people spent the
weekend and got themselves in over their heads. Like, there's money
to be made cleaning up messes, and building out the
kit to get good at cleaning up the messes. This
will pass in only a few years.
Speaker 1 (47:00):
One of the best customer comments I ever got is:
why is it taking so long? My brother could do
this in Access.
Speaker 3 (47:05):
Yeah, like, you know what? You should have your
brother do it.
Speaker 1 (47:10):
I'm done, But.
Speaker 4 (47:11):
Richard, you're hinting at using AI to fix the
mistakes that some human implemented with AI. Like, the solution
to AI being more AI feels like we're
kicking the can down the road here a little bit.
Speaker 3 (47:28):
It's also a normal escalation, but tools get misused. You know,
we've often misused tools; inexperienced people misuse
tools where experienced people can get results, and then you
start getting real costs. You know, it is possible with
minimal skill to get to a certain point in building
an application with these tools, but finishing it is hard.
Speaker 4 (47:47):
How many of us saw ASP dot net MVC applications
with SQL statements sitting in a view, right,
our dot net listeners?
Speaker 3 (47:56):
You put them in the parameter.
Speaker 1 (48:01):
Oh, just to change tack here. Anybody a musician or
an amateur musician, play an instrument, write songs? Hands up
if you write songs. A couple of you, okay. I
want to know what you think of Suno. And Suno
is essentially something that can build a completely professional-sounding song,
(48:24):
complete with vocals and solos, just from a prompt, and
it sounds amazing. Now, I'm a musician; it's very hard
for me not to take offense at this, but I don't.
I kind of look at this as like the Casio
keyboard of twenty twenty-five, you know what I mean?
People used to have organs in the home where they
(48:46):
could play Take Me Out to the Ballgame and learn
their things. But that doesn't mean they're going to take
a musician's job, right? This thing, however, is pretty awesome.
So, who raised their hand? What do you think about Suno?
Hands down? Thumbs down?
Speaker 3 (49:02):
What's that?
Speaker 1 (49:04):
I can hear you, and I'll relay your... yeah?
Speaker 3 (49:08):
Right?
Speaker 1 (49:08):
Why have a famous painter paint something when
you can...?
Speaker 3 (49:11):
Yeah? Right?
Speaker 1 (49:12):
Have you listened to a Suno-generated song? Actually,
Rick Beato did a video where he made one in
five minutes, and it sounds like it was done in
a professional studio, with a voice full of
emotion and everything. Who else raised their hand? Over here.
You, you play piano? Oh, you've never heard of Suno?
Speaker 3 (49:30):
All right?
Speaker 1 (49:31):
Somebody else?
Speaker 3 (49:31):
Yeah? Go ahead?
Speaker 1 (49:32):
It feels gross to you? Yeah? Yeah, synthetic feels grotesque.
Speaker 3 (49:35):
Yeah.
Speaker 1 (49:36):
As a tool to create and refine art, it's kind
of grotesque. Do you think you could tell
the difference between a Suno song and a professionally recorded song?
Because I'm a musician and a producer, and I record
bands and everything and make albums. I couldn't tell the difference.
It was really, really good. A step? Yeah, it's really
(49:56):
quite a step. But here's my opinion. I think that
this is only going to make people want to go
out and see live music that much more, and I
think that this is a wave, you know. And after
people have been bamboozled or whatever by, wait, that's
AI, whatever, you know, then they're going to want to
see real people performing with real talent.
Speaker 5 (50:19):
Are we approaching a point where all elevator music is
going to be generated by AI?
Speaker 1 (50:23):
Probably is already.
Speaker 3 (50:24):
Yeah. Spotify is basically admitting they're headed down that path.
You know how you can turn on a Spotify
playlist of songs, you know, and then it just starts
adding stuff to it? Well, they don't even want to
pay those royalties anymore, so they're going to start using,
they are using, these tools. If you're not paying attention to the playlist,
it's just going to start synthesizing music on you.
And the question is, will you notice? Yeah. Okay, a lot of artists don't
(50:48):
really perform their own songs.
Speaker 1 (50:49):
Barry Manilow, who sang I Write the Songs, didn't write it.
And I don't want to make the case that it's just like
paying someone to write the lyrics for me, or paying someone
to play the guitar on the track, because they're real people.
(51:12):
So that's the difference. Someone would make that argument.
Speaker 4 (51:16):
I think there's a point, when we
talk about generating art, whether it's music, video, even generating stories,
generating, right, fiction, drama: it isn't just coming from nowhere, right?
We are feeding it information. We are shaping the direction
(51:38):
of those prompts and sending them down a path, and.
Speaker 1 (51:41):
And for some, using pieces from real art.
Speaker 4 (51:44):
It is. So, right, there's copyright concerns there. But when
we think about that as creators, when you're looking
to get that ball rolling, and you're able to
have a conversation with an AI, with a
language model, and start to tease out these things, then,
(52:04):
if you're talking through, I want to write
a song, right, and I'm just writing the lyrics, and
then, I don't know, help
me understand what the instruments are
that I might want to put with this. It's still
me that's saying, hey, you know what, let me
hear what a jazz piano sounds like.
Speaker 3 (52:21):
With this?
Speaker 4 (52:22):
Personally, I want to hear a human play that. But
when I'm writing the song, I don't know how to
play jazz piano. I don't know how to compose.
Speaker 1 (52:30):
You might want to get some ideas.
Speaker 4 (52:31):
I want to get some ideas, and to be able
to get that first-level expert to give me those
ideas, to get a demo, to.
Speaker 1 (52:39):
Hire some real musicians to put their own stamp on it.
Speaker 4 (52:41):
Because, to the point that I've heard a
number of folks make, there's a human feel and emotion
that you get, not just when
you listen to music, but when you look at an
art piece and you see the paint strokes in the painting.
When you see how an illustration is put together,
you can see and feel how that was done
(53:04):
by a human, even if it is a human that's
drawing in Adobe Photoshop.
Speaker 1 (53:09):
And conversely, when you read something that's been generated by
an AI, it has that certain, I don't know,
sycophantic feel to it, doesn't it?
Speaker 4 (53:18):
It's got an extra emoji and an em dash in everything.
Speaker 1 (53:20):
It's just the way that it sounds in your
mind when you read it. I had this experience. Yeah,
I don't know if this is true, because
I haven't confronted my friend about it. But I did
a post on Facebook, and I don't remember
what it was about, but one of my
Facebook friends commented, and I swear to god,
(53:42):
this person just took my post, put it into chat
GPT, and said, give me a positive reply to this,
and then they pasted it in. That's what it sounded
like to me, because it kind of summarized everything,
you know, and said, oh, it's so good that you
blah blah blah, you know, as if to prove that
they understood what I was saying. Your real friends don't
(54:04):
do that. No, right? No, they don't summarize and bullet-point
everything you said. Unless they want
to make fun of it.
Speaker 5 (54:14):
Right. So, to move this in a different direction, sort
of positive to me: I think a good area for
AI is healthcare. And if I have something going on,
yes, I want a doctor to read the X-ray
or the MRI, but I sure as hell want Claude
(54:35):
to also look at it, in case it finds something
that the doctor missed.
Speaker 1 (54:41):
Claude is so positive. Even if you were going to
die, it would say, oh, just take a couple
of shots and...
Speaker 3 (54:49):
That's one of the best lung tumors I've ever seen.
Speaker 4 (54:52):
Right, Claude starts every response with you're absolutely.
Speaker 1 (54:55):
Right, right, that's what it says when you tell it it's wrong:
but your expected lifespan is less than six... Bill.
Speaker 4 (55:01):
You actually make a good point there, because I'm one
of those people who will wait until the last minute
to go to the doctor. Right, I've got... don't do
that... symptoms.
Speaker 3 (55:09):
Thank you.
Speaker 4 (55:10):
My wife, Missus C Sharp Fritz, has been keeping me
honest with that. So, right, I've got symptoms
X, Y, and Z. You know, I've got this weird
pain in my leg that happens at some time in
the afternoon. And she's like, you really should talk to
the doctor about that. And I was like, I don't know, but.
Speaker 1 (55:27):
Okay, honestly, I wouldn't trust an AI to give me any
kind of... anything.
Speaker 4 (55:31):
I mentioned, here's the medications that I'm taking, here's the
weird symptoms that I'm feeling. And this happened to me
while I was in Portugal for a conference we were
speaking at. And it came back and said, no, you
might want to talk to your doctor about that,
because you might have a wrong dose on this medication
you're taking. Sure enough, talked to the doctor, and yeah,
let's dial that back. And I don't have the problem.
Speaker 1 (55:52):
But I wouldn't.
Speaker 4 (55:53):
I would have, right, because I'm a middle-aged guy
in America. I would have been like, walk it off.
It could be a brain tumor, but it's probably a headache.
Speaker 3 (56:09):
The radiology story is an interesting one, because there was
this whole point made by, like, Geoff Hinton ten years
ago saying, you know, AI has now figured out radiology.
It's better at analyzing images than humans are. Radiologists
are done, they're totally unnecessary. He was one hundred percent incorrect.
(56:33):
The demand for radiologists has only gone up, and part
of that is that there was such a huge unmet
demand for imaging, and the software accelerated the ability for
radiologists to do a good job. And so now they're
doing three times as much imaging, and most radiologists now
use them; it's like seven hundred models certified by the FDA
(56:56):
for imaging.
Speaker 1 (56:57):
So this is where I totally agree with Bill, that
you have these specialized models that are trained on their
particular, like, radiology data, right, that are so narrowly focused
that they're going to give you a better outcome, a better diagnosis,
than if you just, you know, type it into chat GPT:
you know, I've got it...
Speaker 3 (57:16):
Yeah. I wonder if we're going to move away,
as the dumb winds down on this. You know, if you
think how much better the web got after, what was
it, 2001, you know, after the dot com boom ended.
After a year or two, we built
better websites. Like, we stopped racing and started to think about
what actually made sense. Right now, the business is good
(57:41):
for making the models bigger, right? Those companies are all
incentivized to make models bigger because they're going for these
outrageous valuations. That means they have to constantly show they
need to spend more money, counter to what's
actually needed, what actually benefits the customer. So as this ends,
and it must, yeah, I'm wondering if we're not going
to go to local models, right? Software development is
(58:04):
actually a pretty narrow domain space. It's pretty tight. So I
wonder, when the dumb winds down and we are
focused on the most efficient ways to do things, are
those big models going to become irrelevant? You're going to own
that gear.
Speaker 1 (58:15):
And also, it becomes more attractive to your customers, because
you're not sharing screenshots and code of theirs. Yeah, you know.
Speaker 3 (58:23):
That's how you get rid of the whole sovereignty problem.
It never leaves the building.
Speaker 4 (58:27):
Yeah, Foundry Local is absolutely a thing. Ollama
running locally is absolutely a thing that we see folks
using more and more. Right, we get
the Phi models that are running locally on Windows. You
can run Qwen on Windows. And if you're running with
the Surface laptop with the NPU on it, you
can do some amazing things. If you're playing Fortnite and
(58:47):
you've got a fantastic GPU on your system at home, well, yeah.
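The local-model point about nothing leaving the building can be made concrete with a small sketch against a locally running Ollama server. This is illustrative Python; the default port (11434) and the `/api/generate` endpoint are Ollama's, but the model name here is an assumption, whatever you've actually pulled locally:

```python
# Sketch of calling a locally running Ollama server: the prompt, code, and
# completion never leave your machine. No cloud API key involved.
import json
import urllib.request

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build the POST request Ollama's generate endpoint expects."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("qwen2.5", "Explain this C# method in one sentence.")
# urllib.request.urlopen(req) would return the completion when Ollama is running.
print(req.full_url)
```

The sovereignty argument above falls out of the URL: it's localhost, so screenshots, code, and prompts stay on your own gear.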
Speaker 3 (58:50):
And when the bubble implodes, GPUs are only going to get cheaper. Yeah,
and we're probably going to be, you know, racking up
a few of those things in an office. Somebody's got
to be running them nonstop.
Speaker 4 (59:01):
We are still seeing that growth of data centers, right?
There's a number of properties here in the
southeast Pennsylvania region around Philadelphia where we are seeing
old warehouses, old factories that are being bought out and.
Speaker 5 (59:17):
Being turned into nuclear power plants.
Speaker 4 (59:19):
Let's come back to that in a second. But we're
seeing these older, right, industrial facilities that are being bought
out and turned into data centers. And there's people protesting,
because it's higher demand on electricity. Bill, you
made an excellent point. We brought up Three Mile...
Speaker 3 (59:35):
Island. Here in Philadelphia.
Speaker 1 (59:36):
When you say we... Microsoft? Yeah, correct. Yeah, his blue
badge is showing. Yeah.
Speaker 3 (59:43):
So Microsoft has kicked in something like two billion dollars to Constellation,
which is the company that had been operating Three Mile Island.
They turned it off in twenty nineteen because it cost
more to operate than the natural gas combined-cycle plants
that have been built all over Pennsylvania now. But
Microsoft doesn't care. Eight hundred megawatts of electricity is eight
(01:00:04):
hundred megawatts of electricity, and it was a way to try
and put more power online. It's probably going to take
a few more years before they get it done. Twenty
twenty-seven, that's what they're shooting for, and it'll
probably take longer than that. There's a dozen plants like that
around the US. So now Google has approached some
folks in Cedar Springs and Idaho, same kind of situation,
(01:00:26):
reactors shut down for five years. So, I mean,
it's going to help, but we're going to need more power than that.
It's interesting to think, in rather dystopic terms, of
tech companies now going to advance technologies on power generation.
Of course, I'm about to record the energy geek out,
so here you go, you get a preview. Amazon's now
(01:00:49):
committed to a new reactor design. Like, all the tech
giants are looking for other power sources, and.
Speaker 5 (01:00:54):
Another place AI has got to factor in is grid management. Yeah,
without it, you're not getting the electricity.
Speaker 3 (01:01:00):
But I also think we're overbuilding, right? We're again in a bubble situation. What was the overbuild during the dot-com boom? It was fiber optic cable, right? There was a ton of fiber optic cable laid, and many of those companies went broke shortly thereafter, and the fiber and the cable were bought up at ten cents on the dollar. Yeah. The only thing I
(01:01:21):
would say is safer this time around is that the tech giants, for the most part, have been spending cash they already have, money they would have used on stock buybacks or something, and instead they're turning it into land.
Speaker 4 (01:01:33):
But having that extra electric power on the grid is not only going to facilitate AI-focused data centers. We've been trying to do electric cars for how long? We are doing electric cars, right, but I mean at scale, with significant delivery. Think of the amount of draw that puts on the grid when we start thinking about
(01:01:53):
higher percentages of electric-based vehicles on the road.
Speaker 3 (01:01:57):
They're not even close to the same league as what the data centers are trying to consume.
Speaker 4 (01:02:01):
No, no, not at all. But having that extra power on.
Speaker 3 (01:02:04):
The grid. Having more power on the grid is not going to hurt; you could stand to have more electricity. I think it's going to be an overbuild. I think a lot of them are not going to get finished. I think it's interesting to think in terms of Microsoft not being a software company anymore. Yeah, they're a utility company. They're a utility.
Speaker 1 (01:02:19):
So should we call this not the future of software development, but just the future?
Speaker 3 (01:02:24):
No, I've renamed it the role of AI in software development, because that's clearly where we ended up, for better or worse. It's the only future. I've been doing this AI hype keynote for a while now, very much focused on what the end of this bubble looks like, because we've been through this before. Lots of people are going to be hurt. Is it going to be us? No.
(01:02:48):
We're a scarce resource, and our ability to learn these new tools isn't going to go away. This tool is not going to go away any more than the Web went away after the dot-com boom ended. In fact, efficiency is going to be the word, and most people in software actually like being efficient. They don't like the rampant inefficiency.
(01:03:09):
And so in some ways the knuckle-down times of a downturn are where we thrive, because we do make our companies more efficient with what we do, and these tools can do that. It's just that right now there are a lot of people incentivized not to focus on that. That will end. The tools won't.
Speaker 4 (01:03:27):
So if we're on AI one point oh: Web one point oh was everybody building their HTML, and Web two point oh was social media. Now?
Speaker 3 (01:03:35):
It was mash-ups too, right. I would argue that the post-bubble AI will be hybrid: a lot more client-side, and a lot more focused model types. Think about how constrained you are in version three of a piece of software, where adding something new to the stack
(01:03:55):
is anathema. Right? This is our data store, these are our client libraries; you're kind of in a box. That's a pretty tight little model. Here's an interesting thought for you: the more mature your software gets, the less it's going to cost to maintain, because the simpler the model is, because the constraints are so clear. And your most experienced developers, who don't want to maintain
(01:04:16):
that software, move on to the new things. The juniors come in, and these tools will protect them from making mistakes. Your code coverage gets better, your rules are clearly delineated. Weirdly, we might actually get more reliable software out of this for relatively low cost, based on the way models behave when they're well constrained.
Speaker 4 (01:04:37):
The push for better models that are smaller and use less power means that we don't need to deploy and run as many data centers or consume
Speaker 3 (01:04:47):
As much energy. That stuff's going to be on my desk. Yeah, right, I don't want to go out to the cloud at all.
Speaker 4 (01:04:53):
Yeah, I want my Raspberry Pi tower running.
Speaker 1 (01:04:55):
It. Hey, I think we've got to wrap it up, but let's give one more round of applause for Jeff Fritz.
Speaker 2 (01:05:05):
And our friend walking out the door, Bill Wolff.
Speaker 1 (01:05:10):
Congratulations, what do I hear?
Speaker 3 (01:05:14):
One, two, three!
Speaker 5 (01:05:16):
All right, that's our rally cry at Philly.
Speaker 1 (01:05:19):
Awesome and thank you all for coming to dot net.
Speaker 2 (01:05:22):
Rocks dot net.
Speaker 1 (01:05:45):
Rocks is brought to you by Franklin's Net and produced by PWOP Studios, a full-service audio, video, and post-production facility located physically in New London, Connecticut, and of course in the cloud online at pwop dot com.
Speaker 2 (01:06:01):
Visit our website at D O T N E T R O
Speaker 1 (01:06:04):
C K S dot com for RSS feeds, downloads, mobile apps, comments,
and access to the full archives going back to show
number one, recorded in September two.
Speaker 3 (01:06:14):
Thousand and two.
Speaker 1 (01:06:15):
And make sure you check out our sponsors. They keep
us in business. Now go write some code. See you
next time.