
November 5, 2025 53 mins

Welcome to Radio Better Offline, a tech talk radio show recorded out of iHeartRadio's studio in New York City. Ed is joined in-studio by founder and writer Charlie Meyer to talk about the BS of vibe coding, why the valley went so crazy about scaling laws, and the realities of AI coding.

https://blog.charliemeyer.co/
https://csmeyer.substack.com/

Code Doesn’t Happen To You - https://csmeyer.substack.com/p/code-doesnt-happen-to-you
The Trillion Dollar Chart (scaling laws piece) - https://blog.charliemeyer.co/the-trillion-dollar-chart/
Replit’s Existential Problem - https://blog.charliemeyer.co/replits-existential-problem/

Want to support me? Get $10 off a year’s subscription to my premium newsletter: https://edzitronswheresyouredatghostio.outpost.pub/public/promo-subscription/w08jbm4jwg it would mean a lot!

YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

Email Me: ez@betteroffline.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
As Media.

Speaker 2 (00:06):
Hello and welcome to Better Offline. I'm your host, Ed
Zitron. We're here in the beautiful iHeartRadio studios
in New York City, and I've got a guest, of course,
Charlie Meyer, the esteemed blogger and CEO of Pickcode. Charlie,

(00:28):
thank you for joining me. Yeah, thanks for having me. So, yeah,
you've gained some, I would say, notoriety recently by
writing blogs that go against the oinking of the hogs
of the Valley. And I think your scaling laws piece
was the one that really got going. Yeah.

Speaker 3 (00:45):
So my blog gets some love and mostly hate
on Hacker News. That's my distribution channel, right, and so
I'm trying to get off of that. We're gonna try
and build it into like a newsletter type thing. But yeah,
I'll post on Hacker News, and every once in a
while I'll post something that blows up and I'll get
my, get my haters in there.

Speaker 2 (01:03):
So, so what is it that's pissing them off?

Speaker 3 (01:05):
So, like, I had a post a few weeks back
that was on... I called it LLMs, the ultimate
demoware, right. And so I defined demoware as
software that works well
in the thirty minutes that you're showing it off to executives
or whoever's going to buy it, right, and then it
doesn't do the thing, right. It doesn't do the thing
day to day. And so I listed some examples, and

(01:26):
my startup's in edtech, right, so we do, you know...
and so, like, there's always something that I pick on,
which is I really hate AI tutors, and we can get
into that and how that all works. But
so I said, oh, I listed out a few things
that I thought were demoware. So it's like, oh, vibe
coding that makes dashboards, right, now that's an easy thing
to pick on. And I said, you know, AI tutors.
And I said, well, maybe the kid won't want to

(01:47):
talk to an AI tutor. Yeah, that was the critique
I made, right. It is like, maybe they just don't
want to talk to it, yeah, won't want to talk
to a bot. They want to... maybe they want to
have, like, a teacher who is, like, in the classroom.

Speaker 2 (01:57):
Crazy idea, maybe, yeah. But did people not like that?

Speaker 3 (02:00):
Well, yeah, so some people, uh, you know, we
could name names for me too, but I actually
don't know how to pronounce them. So, uh, but anyway,
so people are in there and they're like, you have
no idea. Like, if you think that AI can't tutor calculus,
like, you have never even tried. It's like, it's a classic,
like, you're missing out, like, you somehow are completely missing

(02:22):
the point.

Speaker 2 (02:23):
And you know something's really good and innovative when
the only defense of it is: you're a moron. You,
well, you have never tried it.

Speaker 3 (02:31):
And it's like, well, what if I like have like
what if this is like actually the thing I spend
my time on is thinking about this.

Speaker 2 (02:37):
You run a non-AI IDE, right? So, a coding...

Speaker 3 (02:41):
Environment, yeah, yeah. Well, and that, yeah, we got, yeah,
lots of that, lots of that. But, like, so the
software that I make, we're batting around a little bit here,
but Replit was a competitor. And so Replit was an
online IDE very loved by teachers, and their whole thing
was, like, we teach... like, we help you teach coding
online, because it's a way for you to run Python

(03:03):
and Java and all your code online, and you can
do it, and you can do it on Chromebooks,
and you can collaborate, and they had, like, teacher tools,
and they sold the software to schools. I was a
teacher for a couple of years, so that's kind of
like my background, as an engineer and then a teacher,
and I used Replit, and it's awesome. Well, this was
before they used AI. So Replit, just for the listeners right now:

(03:24):
Replit is an AI-powered coding environment that claims to
be able to vibe code software, but doesn't really.

Speaker 2 (03:28):
But what did it used to be?

Speaker 3 (03:30):
So it used to be an excellent tool, just an
absolutely fantastic tool. It was it was just you go on,
you log on like Google Docs for coding, right, So,
like, you think, okay, well, back in the day
you'd have to download Microsoft Word and whatever, and that
sucks, and, you know, it's great to bring that online
into the cloud. And they did that and they were
like very innovative. They were kind of the first to

(03:51):
market of having, like, a very fully featured online IDE,
and that is useful for exactly one thing, and that
is teaching in schools, right. Because, like, you
have kids, and they have two-hundred-dollar Chromebooks that
the school bought them, and so you get Replit, and,
like, boom, I have a great way to teach computer science
now. That is fantastic, and that's what it used to
be used for, yeah. And now it's just, and now it's, yeah...

Speaker 2 (04:15):
So listen, you've probably heard me mention Replit
in the past. It's one of my least favorite, most
favorite companies. If you go on the Replit Reddit...

Speaker 4 (04:22):
It's just the wallet inspector. And so now, that's kind
of like... I've gotten rid of most of my, like,
doomscrolling places, but, like, this is... I don't know
what type of scrolling it is, but, like, I go

Speaker 2 (04:33):
on the Replit Reddit. Yeah, yeah, yeah, exactly. And it's just
so funny. It's just gold, being like, yeah, I spent fifteen
hundred dollars, it doesn't really work. Yeah, but I think
if I spend five thousand more dollars, it might. Well...

Speaker 3 (04:47):
I mean, people are like, okay, well, should I spend
five thousand dollars? I mean, we can be
reasonable, we can bring the numbers down to reality.
It's, I spent fifty dollars, which, for a person who
just got bad software, yeah, that's a big
waste of money. And they're like, okay, well, it seems
like I might need to spend two hundred and fifty
dollars more, or should I go on Fiverr? Yeah,
and then it's just people on Reddit. I mean, it's Reddit,

(05:09):
so they're just like skill issue.

Speaker 2 (05:11):
Well, that, and also the people who were like, I
am running into this problem.

Speaker 3 (05:14):
And then a lot of that, yeah, where it's like...
but Replit, just for anyone on there
on Replit right now, yeah: what did they do
to teachers? So, teachers. They had a product for teachers
that worked, that was great, that was well loved by teachers. And
on this November something, twenty twenty three, this is a
big day for my business, because this is the only

(05:35):
reason I have my business, is to replace Replit. Because
they turned on AI autocomplete for kids, with no way to
shut it off.

Speaker 2 (05:43):
Doesn't that defeat the purpose of learning? Yeah?

Speaker 3 (05:45):
So I had... I have a small YouTube channel,
not, you know, a million subscribers, but, you know,
I talk to teachers on there. And, you know, we
had a customer of mine on there, and they were like, yeah,
you know, that year, it just seemed like... the guy,
he just, like, he missed the fact the AI got
turned on. No one sent him an announcement or an email or
a warning.

Speaker 2 (06:03):
All of his students were just amazing. They were.

Speaker 3 (06:05):
He was like, yeah, everybody, everybody got an A
that semester. Like, I wonder, you know, did that actually...

Speaker 2 (06:10):
So did students actually end up getting great grades, because
no one knew the AI was on?

Speaker 3 (06:14):
Yeah, I mean, it depends on who you ask. If
you're wandering around the classroom looking at students and you
see them all tab-completing, like, because AI just...

Speaker 2 (06:23):
Just for the listeners as well, spelled out: so with
these AI platforms, you hit tab to complete, because it's basically
like autocorrect for coding.

Speaker 3 (06:30):
Yeah, and so, like, AI, for what it's worth,
you know, we can be a really balanced podcast, but AI
can really... well, it can solve intro-to-computer-science
problems for ninth graders with incredible accuracy.

Speaker 2 (06:46):
Carl Brown from Internet of Bugs said it
makes the easy things easier and the hard things harder.

Speaker 3 (06:50):
Yes, and so, yeah, if you're
in ninth grade and you're writing your first program,
yeah, you can tab complete the whole thing
in one go. It'll one-shot it, that's incredible.
It'll one-shot the program.

Speaker 2 (07:03):
That term, just for listeners: it
means that you just give it a problem and it
solves it correctly on the first try. Yeah.

Speaker 3 (07:10):
So it's like count to ten and the AI can
count to ten, which is incredible.
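For reference, the kind of ninth-grade assignment being one-shotted here really is this small (a generic Python sketch, not code from the show):

```python
# The classic intro assignment: count from 1 to 10.
numbers = list(range(1, 11))
for n in numbers:
    print(n)
```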

Speaker 2 (07:15):
That's revolutionary. But fun fact: if you try and
make ChatGPT count to a million, it freaks out
if you do the voice mode. Adam Kinna told me
this one. If you go, like, count to a million, it
stops around nine or ten and then says, should I continue,
and it just won't do it. It's very funny. I
love living in the future. Yeah.

Speaker 3 (07:33):
So they, though, they turned on the AI, and
then they were like, we're not doing education. And companies
have deprecated things.

Speaker 2 (07:40):
Yeah, it happens. It's actually not their main product, whatever.

Speaker 3 (07:44):
So, I mean, I'll tell you, you know, I'm
an indie developer, whatever. Like, our software does not make
a ton of money, because there isn't that much money
selling an online IDE to schools. There's money, but it's
not, it's not a billion-dollar business.

Speaker 2 (07:55):
It's fine. It's no Replit, but yeah,
you can't justify billions in valuations with it.

Speaker 3 (08:01):
No, no. Saying, hey, you know, kids have Chromebooks
and we're gonna, you know, charge ten dollars a student
or whatever... like, of course you're not
going to raise a billion on that.

Speaker 2 (08:08):
You're not.

Speaker 3 (08:09):
That's not a billion-dollar business, no. But the AI
thing seems magical. And then, you know, the vibe coding
thing happened. And, you know, as soon as the vibe
coding stuff started happening, they were like, we're all in
on this, and they deleted, they deleted everything. So
not just deprecated, right. It's one thing to deprecate software.

Speaker 2 (08:26):
So, and deprecate is when you stop supporting it altogether.

Speaker 3 (08:28):
You say it's no longer supported, when you put
up a big red scary banner on the top saying
your work is read-only, you cannot create anymore, right.
That is a really mean thing to do. But
it happens. Software changes, you know. Replit, for what
it's worth, to be nice and fair to them, like,
they have investors and they're under the gun to provide
some returns, and so, whatever. The teacher thing isn't gonna

(08:48):
make them a ton of money. But they deleted the stuff.

Speaker 2 (08:51):
Just... why? And when you say "the stuff," was this,
like, projects that schools had been working on?

Speaker 3 (08:56):
It was. So a teacher says, I'm going to spend
two, three years putting in all my curriculum, all these
markdown files, all this stuff, all these tests, I'm going
to configure all this stuff. Nope. Deleted, gone.

Speaker 2 (09:06):
Monsters. Deleted. Actual monsters. And now... but they sent the
warning email, Ed. Wow. In July. When are teachers online
looking at their work email in July? Yeah, classic, classic.
Big month for teachers to

Speaker 3 (09:18):
be online. Well, for American teachers, yeah, it's not
a huge month. So yeah, in July we say we're
going to delete all your stuff, and then it's gone.

Speaker 2 (09:25):
Was there any way to back it up? Well, there
was, until they deleted it

Speaker 4 (09:29):
All.

Speaker 2 (09:29):
That's so cool. It's awesome.

Speaker 3 (09:31):
So that's awesome. Like, if you're a Replit developer,
you know, when the next big thing comes
up, Replit may decide to delete all your stuff.

Speaker 2 (09:40):
Well, Replit, they launched Agent three. Oh yeah, that was
my favorite launch of a product I've ever seen. Because I've
mentioned this on the show before, but it's like, oh,
it's an autonomous coding thing, and it's just the digital
Mister Bean. It's just like, why don't you go off
and build me a software thing? And it just fucking
spends one hundred dollars and goes, here, I don't
know, you like this? I don't fucking care. And then

(10:02):
they had to release the thing where you could make
it think less. They had to, like, add tweaks to
it because it was so bad. It's... I actually feel like,
and I'm not putting words in your mouth here, I
feel like vibe coding may be just fraud. I think it's
a fraud. It should not be legal to
lie like that, because it is a fucking lie.

Speaker 3 (10:19):
So, I will... I'll defend the vibe coder platforms.
But... the defense... no, I mean, it is.
It's fraudulent, right? I mean, like, if you say, hey,
you don't know how to code at all, and, uh, yeah,
just sign on to this website... and, I mean, look
at their marketing page.

Speaker 2 (10:38):
That's exactly what I'm doing, loading it now, with a nice
blue iPhone Air. Oh yeah, beautiful. I have the, I
have the Space Black. Hell yeah, the iPhone Air, rise up.
It's a great phone. It's a great phone. I'm not sponsored,
Apple made me pay for it. Yeah: turn your ideas
into apps? What will you create? The possibilities are endless.
And then it's a fake prompt that says, make me
a business tool for marketing teams that helps generate professional

(11:00):
business proposals, and then add automated backup and recovery.
I think if you asked Replit to do that,
it would cost three hundred dollars and nothing would happen.
I think it would just, yeah, spit out, like,
barely functional code.

Speaker 3 (11:14):
So I wrote a post on this, and I was excited. Yeah,
I was, I was excited to end the post
saying there has never been a successful thing.

Speaker 2 (11:21):
Ever.

Speaker 3 (11:23):
Unfortunately, Replit has added a set of case studies, and
I think that they use them. And so the case
studies are: we sold to enterprises, and we're going to
do prototypes of internal dev tools. Not dev tools, internal,
like, you know, management software for inside, your back office software.

Speaker 2 (11:42):
So they haven't had a case study since what looks
like August, and one of them is how Zinus saved
one hundred and forty thousand dollars with Replit, but
it also cut development time by fifty percent.

Speaker 3 (11:56):
But so then, the question is: did the person typing
stuff into Replit, did they know how to code?

Speaker 2 (12:01):
Exactly. See, that's... if they know how to code, whatever
the stuff is, it's not vibe coding. You could
have used Cursor, you could have
used VS Code, or... what's the free
one Amazon's doing now? Kiro? Maybe Kiro. And then there's
the... I like the one that came out from China,
and everyone was like, that's gonna send information to the
Chinese government. It's like, will it? I don't know, but I

(12:24):
don't think they're going to. Your clone of Flappy Bird
has got to be top of their list. Yeah.

Speaker 3 (12:30):
The way that the word vibe coding has the meaning
that it has today, I believe, is: you do not
know how to code, you type a prompt, and you get
an app out. And I'm not going to dox this person,
because they were nice to me once, but, like, there's
a person online, like, they just do, like, oh, here's
one hundred days of AI, and I'm going to make
a fully functional software-as-a-service company, fully, and

(12:52):
I don't know how to code. And then you look
at this person, and they're typing in the prompts, and it's
like, they clearly have, like, a pretty strong technical background.
And then the thing still doesn't work, by the way.
That's, like...

Speaker 2 (13:02):
Cool, Like they know how things work and it's still broken.

Speaker 3 (13:05):
Yeah, so I mean whatever, this person was like a
product manager or something, so like they know what an
API is, and they know what a web server is,
and they know the names of the different technologies, and
like that's going to get them part of the way there.
But the idea that you can end to end create
a software product that has some value.

Speaker 2 (13:20):
Yeah, it's crazy. We would have heard about it. No,
that's kind of... your demoware post was really good about
this, because it was kind of like, look, you can
do the proof of concept, you can do this, but
we've never seen the next stage. And someone else did
a really good one, it was, like, shovelware. They said,
where's the shovelware? Where's the crap software? Like, I remember
the first times I was on the internet, the amount
of weird shareware shit, there was just, like, different forms

(13:41):
of IRC clients and shit, there were people making weird software.
Why isn't that happening?

Speaker 3 (13:46):
Yeah, I mean, so you'll see: okay, I made Flappy Bird,
I made a weird thing, I made... like, you can
make little small pieces of software for yourself that maybe
have a little bit of value. It is fun, it
is novel, yes, if you know, like...

Speaker 2 (14:02):
But then it then it doesn't work like it's so,
I do not know.

Speaker 3 (14:05):
I'm a web developer, so I know how to do
web apps. I code for shits, though, yeah, whatever,
but, like, that's what I know. That's
what I've been doing, I've been coding for whatever, and
I know how to do that. I do not know
how to make iPhone apps. So I was like, okay,
you know what I'm gonna do? They just announced Claude
whatever, because I'm interested in this stuff, I'm an early adopter.

Speaker 2 (14:24):
I don't... and also, however I
may feel about AI, if it actually did... well, if
vibe coding was real, that would actually be a huge deal.
That would be a huge fucking deal. I would still
have all my ethical concerns, but if I could actually
build software without knowing anything, wow, that would be great.
Never been the case.

Speaker 3 (14:42):
But you tried, though. Well, so I tried. So
my idea was, like, okay: I use my phone too much,
I'm gonna make an app called AppSnooze.
So you say, I want Gmail, I want it snoozed
for half an hour. Got it, so that when I
open up the Gmail app, it uses this Screen Time
thing and it says blocked, and then when thirty minutes is up,
I get it back, right. That is essentially impossible to make
using iOS without, like, a substantial amount of work.

(15:05):
It's based on, like, the limitations of how Apple
does their stuff with Screen Time. It just cannot be done.
So I type this stuff in, and Claude is, like, sitting there,
it's like, oh yeah, you're awesome. You are killing it, dude,
this is a great idea. You got this, you got this, yeah.
Which it actually does say, on and off there. I've
been watching, like, the World Series or whatever, and a
lot of NFL, and, like, the ChatGPT ads. We can

(15:27):
hopefully talk about those.

Speaker 2 (15:28):
I haven't seen any of those. Okay, well... oh no, no,
I'd love to hear about this, because I'm a Raiders fan, yeah,
and I try not to watch. If I needed to
watch a poorly conceived product, I could just use my
season tickets. But I saw them, so wait, but keep going.

Speaker 3 (15:46):
Though, okay, well, but see, so it's like, you got this.
But then it's... this is the whatever. So they have Haiku,
and they have Sonnet, and they have Opus. Yes, so
whatever, awesome names. But so they have Sonnet, which
is the really good one. You know, it's very well respected.
It's supposed to be... Claude
is supposed to be a good one for coding. And
so I was like, I'm gonna pay, I'm gonna pay
twenty dollars, I'm just gonna see what happens. If I

(16:08):
can get this thing on the App Store, that'll be great.
I'm gonna charge ninety-nine cents, let's see if I
make a hundred bucks. Sure, see if I make my
Apple Developer account fee back, yeah, dump a hundred bucks into
the Apple Developer account. Good on Apple, by the way:
you can't actually do the app coding that you
need to without paying them a hundred bucks. So good,
that's a business, that's Apple, bang. But anyways, so
I do that, I want to make my hundred bucks back,

(16:28):
but it cannot be done.

Speaker 2 (16:30):
Well, you couldn't build the app, though, it sounds...

Speaker 3 (16:32):
But because of, like, literal limitations in how iOS
works. In terms of, like, you can't have a timer
that goes off and messes with Screen Time. That's just
not a thing that Apple allows.

Speaker 2 (16:42):
There's this thing called Brick, which is a physical device as well,
but that feels like a Bluetooth thing. Something's going on with Brick,
I don't know. But here's the thing as well,
with all of this: you just made me think, it
is weird that the app doesn't just go, yeah, I
can't build that.

Speaker 3 (16:56):
It'd be nice if it did say that. And this
was weird, I had never observed this behavior before.
And again, I've posted online, like, okay, you know,
the three Bs in blueberry, that thing?

Speaker 2 (17:08):
Oh yeah, yeah.

Speaker 3 (17:09):
Whatever. And people... I posted that on LinkedIn, and someone
was like, you are lying. And I posted my link, yeah,
I posted the link to the chat, and they're like,
you had a secret prompt that told it to be stupid.

Speaker 2 (17:20):
Yeah, it's prompt injection. Yeah.

Speaker 3 (17:22):
Yeah, like, you have a system prompt that says,
like, be stupid as shit.

Speaker 2 (17:26):
Yeah, be a piece of shit, dude.

Speaker 3 (17:28):
I messed up. I put "be stupid" in my system prompt.
I should have put, I should have put "be smart."
If I'd put "be smart," it would have worked.

Speaker 2 (17:45):
So, on this, you just reminded me of when I was
dicking around with Claude Code. So I did the story
a few months ago about how, you know... I don't
know if you've seen, like, Viberank, where it's got, like,
people on Claude spending fifty thousand dollars. I love those people,
I think that's awesome. Well, to try this myself, I
went on, I was like, what is the most token-intensive
software you could build me? It's like, oh yeah,
an autonomous car and a bunch of stuff. I'm like, cool,
build all of that. It just sat there, and

(18:08):
I don't even know what it spat out at the end.

Speaker 3 (18:10):
Well, I mean, it certainly... well, you could have a
trillion-dollar startup on your hands, but
you should check out that code.

Speaker 2 (18:15):
It's so sick that these things don't even go, like, yeah,
we can't do that. Like, I can't do an autonomous
car startup, I don't have any training data. Very basic.

Speaker 3 (18:24):
But if it was... if that thing
was smart or useful, right, it has the ability to
look things up online. Yeah, it should have looked through
the documentation, and it sort of said, well, what can
we do with timers? What can we do with Screen Time?
Can you hook up a timer to Screen Time? Will it
let you do this in the background and fire
the half-hour timer correctly? No. And it's like, it's
demoware, and it allows you to build demoware.

Speaker 2 (18:43):
But it didn't even build a demo of this. Well,
so now it.

Speaker 3 (18:45):
Built me something I was kind of excited about because
it it let me pick the app. So I picked Gmail,
and I picked thirty minutes and then it it it worked.
Gmail's turned off, right, thirty minutes elapse?

Speaker 2 (18:55):
Right? Gmail is not back on. Oh, so you just
cut Gmail? Well, I mean, you've not been in your
email since.

Speaker 3 (19:02):
No, exactly. Sorry, customers, if you have been emailing me,
it's messed up. No, but it... it was like, it
just lied, right? I mean, and so, like, that's... imagine
being someone... I'm a software developer, okay, whatever.
I don't know iOS, but, like, I'm gonna go on
the Apple pages and see what's up, and I'm gonna
ask some meaningful follow-ups and determine that this didn't work.

(19:22):
And okay, I lost my hundred bucks on the developer
account. But if you don't know how to code,
you're going to be like... what are you going to do?

Speaker 2 (19:29):
Well, there's nothing you can do, because... the reason I
read the Replit pages and the Cursor pages, and the
Cursor one, it's people that can code a little
bit, at least a little bit. But Replit is just...
it's fifty percent, and same with Lovable's Reddit as well.
Lovable, for the listeners, is another AI coding platform sold
as a vibe coding thing. And it's all... it's fifty percent

(19:49):
people being like, I've spent three hundred dollars. And then,
like, ten percent people just lying, people being like, I
just reached twelve thousand MRR, monthly recurring revenue. It's
all good fun, with everyone being like, can I see it?
And they never respond. And then there's the, there's the
people who are, like, looking for a Replit developer, and
it's like, so you're looking for someone that can write software,

(20:10):
because, write and build software... interesting. Like a software developer,
you might say. And I don't know where to find one. Yeah.

Speaker 3 (20:18):
It's almost like there are, like, hundreds of thousands, millions
of people trying to do that. But yeah, we don't
need them, we can just... you talk into the thing
and turn your app into reality. Except you can't.

Speaker 2 (20:29):
It's just... it is really crazy how much vibe coding
has proliferated, considering it's fucking nothing.

Speaker 3 (20:37):
Well, so... but so, if you need, like, if you
need a prototype... so if, like, this whole thing boils
down to: if the expectations were real, if it
was like, yeah, turn your app, turn your sentence into
a prototype of an application in minutes. Okay, yeah, like
an MVP. An MVP is like, oh yeah... it used to work,

(20:58):
but, well, okay... oh, sorry, I thought you were saying...

Speaker 2 (21:01):
Hypothetical world where it works.

Speaker 3 (21:02):
Well, no, yeah, sorry. Well, now, in the hypothetical world
where it does what it does today, you can
get, like, a mockup. It can build a
semi-functional wireframe mockup of your application that you could
show, to kind of validate your idea to your friends,
in minutes. Yeah, for thirty dollars or whatever the
credits end up being. That's fine. But does that happen?

Speaker 2 (21:22):
Doesn't it? You could kind of do that, if you're lucky.
You roll the dice, that's the thing. It's always
if you're lucky. There's enough asterisks on this. It's just
insane that it's got this far, because I've read a
lot of vibe coding articles, and if you read, like,
Kevin Roose, of course, in the Times, and people like that,
you read these articles and you'd think, wow, you can
just do this, you can just go and do this.

(21:43):
This is... the future is today. But it's not really,
not really the case at all.

Speaker 3 (21:49):
But I think that the thing that's so like pernicious
about it is that it's so easy to just say
skill issue.

Speaker 2 (21:57):
You just... two words: you're prompting it wrong. It's
the "you're holding it wrong" thing of old. But now it's, yeah, you're
prompting it wrong.

Speaker 3 (22:03):
You're prompting it wrong. And so... and there's no real
way to disprove that, because can we go back in
time? And, like... because it's all this probabilistic stuff. And so,
so I have a post that I've put up, and
it's "code doesn't happen to you." That's my thing.
So it's... because, you know, I taught programming
for a while, and so, if you're teaching a new programmer,

(22:25):
sometimes, if they have, like... if they've kind of gotten
unlucky and they have a bad attitude, and, you know,
it's not their fault, but they might think, like,
coding is really mysterious and it's really weird, and I
type code in and I press run, and it doesn't
always do what I want, and so I'm just gonna,
like, mess around, right? And, like, vibe coding is, like,
a productionized version of code happens to you. It's like,
you press button, code pops out, it does a mysterious thing,

(22:48):
and then, like, you know... so it's like that idea,
which was the wrong way to program,
but, like, that's the way we're doing it. Like, and
what is the right way, though? The
right way would be: you operate
a computer. You turn it on, you open the coding
software that you're going to use, whether it's an online
software like my wonderful software, or, you know, something like
VS Code, like, something for professionals, whatever, and you type

(23:10):
in code and you run it, and then the computer
executes it. It runs the code according to the...

Speaker 2 (23:17):
Programming like code is instructions.

Speaker 3 (23:19):
Code is instructions, and the code, like, happens deterministically.
And maybe if you're, you know, developing a game, maybe
there are some random elements to the software that you're developing,
but there's no randomness... like, the randomness is under your control.
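A minimal Python sketch of what "the randomness is under your control" means (an illustrative example, not code discussed on the show): even a game's "random" elements replay identically once you fix the seed.

```python
import random

# "Random" loot drops for a hypothetical game, made reproducible
# by seeding the generator: same seed, same sequence, every run.
random.seed(42)
drops = [random.choice(["sword", "shield", "potion"]) for _ in range(5)]

# Re-seeding replays the exact same "random" sequence.
random.seed(42)
replay = [random.choice(["sword", "shield", "potion"]) for _ in range(5)]
print(drops == replay)  # True: the randomness was under our control
```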

Speaker 2 (23:30):
Yeah, so it's true. It's the difference between treating it
as this mystic force that you pull together versus instructions.

Speaker 3 (23:36):
It's instructions. And so, like, if you're, if you're a
really good programmer, and maybe, whatever, maybe you use
AI to save you some typing, and you still have
that good attitude, whatever... like, you can use it to save typing,
that's fine.

Speaker 2 (23:46):
I mean, that's the only real value. Like, that
feels like the only consistent thing: just filling in
blanks, in code that you know yourself. Like, it fills
in, and it's also correct, which may
be useful. I'm not gonna lie, like, I use it.

Speaker 3 (23:59):
Yeah, yeah. Like, I... but, I used to have a
paid GPT account, but I don't trust it to do...
like, the models... and this is one of the things
that I brought up in my post, is the models,
like, aren't better now, right? Well, GPT-3 was
okay, and GPT-4 was, like, much better, and then
GPT-5 is trash. Yeah, I mean, right, relatively speaking,

(24:19):
maybe it's a little bit better, and maybe it costs
OpenAI more, which is a big development.

Speaker 2 (24:23):
It was very, very great for them. And whatever, it was
supposed to cost them less, but it costs more. It costs...

Speaker 3 (24:28):
More, but so stop getting better. So that there was
a time where I was like, I'm gonna I'm gonna
buy into this. IM an earlier after.

Speaker 2 (24:34):
I'm kind of, I'm kind of a booster. Like, I...
I was cured going to Where's Your Ed At...
that's... is it dot com? It's just wheresyoured dot at. Oh,
where's your... okay, well, Where's Your Ed At

Speaker 3 (24:45):
Cured me a little bit of this because I'm like,
and I've just had some situations where it's just failed
me so poorly. Like there was a confluence of events
this summer where I was just like, no, what happened?
I'm done with this. First of all, I got a
strong recommendation from uh GPT to buy a software called
a script, which is like editing, because I have I

(25:08):
have a YouTube channel, and I want to... like, I say a lot of ums and uhs, and maybe I'm saying some ums and uhs right now, whatever, who cares. Yeah. But I'm like, okay, I'm gonna make this YouTube channel, I want the production quality to be decent, and if there's a shortcut for me, it seems like AI might be able to do this. So I'm like, to GPT: what is good AI get-rid-of-ums-and-uhs software? Just...

Speaker 2 (25:29):
Descript. Kind of like a Google search.

Speaker 3 (25:31):
Yeah, yeah, or whatever. It's an okay Google search, whatever. And so it said you've got to use Descript, and I was like, cool. And so I put in my credit card, twenty bucks or thirty bucks or whatever it was. I put in a recording, and it just completely mangled it.

Speaker 2 (25:43):
Yep.

Speaker 3 (25:43):
And the audio was unusable. It was off by an eighth of a second from my voice, and it's just like... I have no recourse, and I'm not an audio engineer. And so, okay, I vibe edited my video and just ruined it.

Speaker 2 (25:59):
It's almost like every promise they make is, it's going to automate everything, and then it's like, ah, not really, as long as you know what you're doing.

Speaker 3 (26:04):
But this was like a meta-level thing, where the AI recommended me an AI that also screwed me over. And so I'm like, okay... because I was using it as a search, right? And it failed me as a search, because it just emphatically recommended it. And then you look, because then I was like, well, what's wrong?

Speaker 2 (26:22):
Am I... is it a skill issue? Am I stupid? And so I look, and Reddit is just filled with, like, this is the worst software. I used Descript very briefly, and all I wanted to do was take a bit of audio and turn it into a video with the text happening. The way you read their marketing material, you would look at

(26:42):
it and think it would take two seconds. Took me about forty-five minutes, and by the end of it, I'm like, I don't even want to fucking... I'm so angry. Because this should be a button press. The whole point of this AI bullshit is it's meant to be a button press, and it never is. But wait, there were other events, though.

Speaker 3 (26:59):
Well, so there's that. And then, I'm also a web developer, right? And so... I'm not very good, I can't program mobile apps, that's a thing that I can't do, don't know how. I'm also not a very good, like, infrastructure systems programmer, whatever, that's, you know, cloud stuff. I'm not great at that, but that is an aspect of my job that I have to do. Our website requires some infrastructure

(27:20):
difficult stuff. Over the years, I've actually gotten quite a bit better at that, and so that used to be a use case for me for GPT: I'll ask it some infrastructure-related questions, like, I know how to code, I can put the puzzle together, and this is actually going to save me a little bit of time. But I have outpaced GPT's abilities in infrastructure development, so it's like, okay, well, I'm doing this project and it's not helping. Yeah, it's

(27:42):
just wasting my time, okay, no need for that. The Descript thing is BS. And then I learned from Ed Zitron that this stuff is horrendously expensive. So it's like, if this was just regular software as a service and it cost pennies to operate, and it's kind of helpful...

Speaker 2 (27:57):
Whatever, yeah, it'd be inoffensive. It would be like...

Speaker 3 (28:00):
It's fine, whatever. There's a company, they offered me this thing, and it didn't work, and, you know, it happens. Yeah. But in the context of a world in which this is the future, this is magic, in a click of a button you get perfect audio out... if that's the promise, in the midst of all this, and an AI is recommending it, this

(28:20):
is like a meta-level, like, dog-shit situation. And then it's disastrously expensive on top of that. Yeah, like, what is the point? What is the point of all of this?

Speaker 2 (28:31):
The point is we need to sell GPUs. And literally, in the car here, they announced a seven-year, thirty-eight-billion-dollar deal between OpenAI and Amazon Web Services. It's just like, why? So that they can do Sora 2 more, so they can generate more copyright infringement. And have... do

(28:52):
you, have you in the past used these coding models a lot, or is it just kind of on the side?

Speaker 3 (28:56):
I have. So, I will say, even like two weeks ago, I had a very discrete task where, in this one situation, I wanted to do this one little thing. And I knew exactly what it was, and I was lazy, and so I said, right, write the code. And then, and this is a joke comment, and people should do this more often, I pasted in

(29:17):
the code, and I said: this code comes courtesy of ChatGPT. If you have any issues with this software, please contact OpenAI. That's what I wrote in my code, and...

Speaker 2 (29:24):
It did, and it worked great, whatever. Okay, that's cool, it saved me twenty minutes, that's good. And that's the thing, that's the whole AI bubble. It's like... but I'm not a paying subscriber anymore. Oh, that's even worse for them.

Speaker 3 (29:37):
Yeah, I know. Because the stupidest model of theirs could have come up with that code, because it was so easy. You know, it was finicky, it was annoying. There are situations where I say, oh, there's a bug in this code, I paste it in, and it looks it over, and it's saved me in aggregate tens of hours in the last three years.

Speaker 2 (29:58):
That's fine. It's like, if it was regular SaaS, I'd be like, cool, yeah.

Speaker 3 (30:02):
I think that's kind of part of why I canceled the subscription. Because if it was... you know, whatever you need to value your time at... if it was twenty bucks a month and it saved me an hour a month, hey, you know? Yeah, it's like...

Speaker 2 (30:15):
Like TripIt or Flighty. Like, it's a little bit of software, we pay for it, it does a thing. And if it wasn't stealing from everyone and burning everything down... it only makes sense if it's cheap, and it's the literal opposite. If this was, like, cheap CPU-driven shit, then fine, sure. But one day, I think we're going

(30:35):
to find out how expensive this is, and it's going to scare the shit out of people. But you know what, that actually makes me want to move to a specific post you made, your scaling laws post. Let's talk about this. So you were a booster at one point, but you wrote a very eloquent piece about the scaling laws, about how... and I've tried to work this into my own work... we can have, I don't know if I'd call it empathy,

(30:57):
but some understanding of how we got here with the AI bubble. Because when GPT four came out, it does seem like tech people had a reason to be excited. I was so excited. What was exciting about it? It was awesome.

Speaker 3 (31:10):
You talked to it, and it was just like... I would ask it coding problems that I found. So I was still a teacher at the time, and I was like, oh man, I have the AP Computer Science exam coming up, and I need to come up with practice problems. And I was like, generate thirty practice problems. I obviously read them over, did my due diligence, and, you know, it did a good job putting them together. But

(31:31):
it's like, these are decent practice problems, and this is useful software. I did not understand how expensive it was, but there were a number of things that would, like...

Speaker 2 (31:43):
I'll also give you, so that the listeners don't get mad, to be clear: GPT four was twenty twenty-three. Yeah, we were very early in understanding. I mean, the environmental damage was there, sure, but they were also promising effects. It took a full year, until June twenty twenty-four, when it came out that OpenAI would burn five billion dollars. So early on, we didn't really know the costs either. And I'm

(32:05):
sure someone will find a fucking link. Anyway, keep going.

Speaker 4 (32:08):
Well.

Speaker 3 (32:08):
So I was pumped up, because I saw GPT three... you know, I'm a tech person, and I remember seeing early demos of GPT three, and it was an interesting novelty. It would say stupid things, and it was kind of cool that it could even generate sentences. That was awesome. Then three point five came out, and ChatGPT, and it's like, that's pretty cool. Yeah,

(32:29):
I can use it as a search thing, and it says that I'm good, which... I like when people say I'm good. Do you like when people say you're good?

Speaker 2 (32:36):
It doesn't happen very much. But you know what, I'll be honest, I think there's something mentally about me where all of the anthropomorphization... it doesn't even piss me off. I'm just like, okay, shut up, shut up, shut up. Yeah, I was bullied too much as a kid, so, like, compliments don't work on me anymore.

Speaker 3 (32:53):
I do want to write a thing at some point about how if it wasn't ChatGPT, if it was, like, a box that gets you text, yeah, and there was no anthropomorphization, if it was just, this is a thing that can generate usable or interesting code for you, but there's no chat element to it, that would actually make it a lot better to me. Like, the anthropomorphization of, oh, you're talking to a person, that

(33:14):
really makes me mad.

Speaker 2 (33:15):
Yes. And also, I find every time it goes, "You got it, you got this," I just... shut up, shut up, shut up.

Speaker 3 (33:20):
So in one of these NFL ads, literally, I don't know if they're doing a nod to the haters or what, but, like... we're going, we're going a...

Speaker 2 (33:28):
Couple... make sure the link is in the episode notes.

Speaker 3 (33:30):
Well, yeah, no, there's like four of them. Hopefully they're not, you know... But, uh, it's like a guy trying to do pull-ups, and it's like, here's your pull-up plan: you need to do one pull-up, and then you should do two pull-ups, and then you should do four or five pull-ups, and eventually you will be able to do several pull-ups. And then at the end, it's like, you got this.

Speaker 2 (33:50):
Okay. So the plan is you do more pull-ups over time. You could probably just work that out by doing pull-ups. With text in it, mate. Nobody said "you got this." But yeah, exactly. Actually, no, my friend Mac, when I text him about pull-ups, he says as much. He's like, you fucking got this. I think he may have, literally. It's just... that's the commercial.

(34:12):
That's the commercial. That's the commercial. But it's the commercial: people watching the NFL, and they're like, oh shit, why should I use ChatGPT? Oh, it's gonna tell me a pull-up plan where I increase from one to several, one to seven. You got this. You got this. I mean, I didn't pause...

Speaker 3 (34:26):
I mean, maybe I did not pause the commercial, and it could have said some really interesting stuff in the middle, I don't know. But the bulleted list, because it has to have a bulleted list... I am pretty sure, and I might be lying, so whatever, you know, send me some hate mail, but I'm pretty sure it said, like, do a couple, wait a week, you know, drink a protein shake, and, you know, do a couple more.

Speaker 2 (34:47):
It's just Google Search, except it makes up the results. That's all. Three fucking years.

Speaker 3 (34:52):
So I have a new idea, which is that it's Yahoo Answers, but the person has had a lobotomy and, like, just did cocaine.

Speaker 2 (34:59):
That's Yahoo Answers, yeah. It's just Yahoo or Quora, but at, like, light speed, like, fast. Well, Quora now is GPT. They did that because Adam D'Angelo is on the board of OpenAI, so it's just got GPT answers and GPT questions now. So cool. But early on it was exciting, and there

(35:32):
were these scaling laws. Walk me through them, for the listeners who might not understand.

Speaker 3 (35:36):
So yeah, the post that I wrote, which you very nicely called eloquent... If I could pay you twenty bucks a month to kind of just send me stuff like "you've got this"...

Speaker 2 (35:45):
I'll just email you those. You got...

Speaker 3 (35:46):
This. Okay, that's great, I'll put it on a schedule. Yeah, if you could just do that, I'll pay you twenty bucks a month. No, but so there was an idea that if you increase the size of the models... I'm not an AI scientist, right, and so in this post I said, I'm not an AI expert or an economist. But you look at this chart, this chart that they had, and you can link the thing. And I actually think I cited my sources,

(36:08):
the original paper, basically, about the scaling laws. They have this chart that is incredible. It is like: make the model ten times bigger, get a nice jump in performance. Make the model ten times bigger, get a nice jump in performance. And then the idea is, okay, well, if we just keep making it ten times bigger... who knows how good they can get? Yeah, and it

(36:29):
kind of... and it did work. Like, it was working for a minute. That's how, to the best of my knowledge, that's how they went from three point five to four.

Speaker 2 (36:36):
I mean there's a.

Speaker 3 (36:37):
Number of... they have smart people over there, yeah. I mean, we can be honest that they're doing clever stuff.

Speaker 2 (36:42):
Smart is also very subjective. Sure. These are people who are experts in mathematics.

Speaker 3 (36:47):
Yeah, they're doing hard, real math and they're getting results.
Like the fact that it can do what it does
is incredible.

Speaker 2 (36:52):
Yeah, it's kind of crazy that they can do it.

Speaker 3 (36:55):
If that was all they were saying... if they were just like, we did research, we've created this incredible piece of technology that feels almost alien, I mean, at the point when we discovered it. Now we take it for granted that it's kind of this trash thing, but at the moment when it was released, it was like, oh my gosh, this is actually crazy. Yeah. And the idea was,

(37:16):
we make it ten times bigger and we will get a similar jump in performance. And that is GPT four point five. It's just a footnote in history.

Speaker 2 (37:24):
Oh, that was released?

Speaker 3 (37:26):
And Sam Altman was just like, uh, yeah, well, we made a big model.

Speaker 2 (37:30):
It was the best announcement ever. I'm actually gonna... well, from what I remember, Mister Clammy Sammy was like... I'm just gonna do it from memory. It was like: yeah, good news, it's really good for writing; bad news, it's really compute-intensive. And I was like, yay.

Speaker 3 (37:48):
And in that announcement, and I did quote this in the post, because I don't want to make stuff up, they literally said: with each ten-x, with each order-of-magnitude increase in model size, we will get an improvement in performance.

Speaker 2 (38:02):
Yes, but, like, where's the big improvement? It's gone. I don't...

Speaker 3 (38:07):
I think that was it. I think that was the moment. I don't know what day they announced four point five, but I think early in twenty twenty-five, that was game over. Yeah. And then they did the reasoning stuff, and the reasoning stuff was... the reasoning

Speaker 2 (38:19):
thing was September twenty twenty-four. And my favorite thing about that was reading all the tech press writing about it and being like, can any of you tell me what this does? Can any of you tell me why this matters? To this day. And, by the way, it took a minute for me to work out what the fuck it was. And it's just a hat-on-a-hat thing. It's like, instead of spitting out an output, it goes, what would the output be? Oh,

(38:40):
I will go through it and choose these steps. Which is test-time compute, and it's...
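The "what would the output be, go through the steps" idea can be sketched with a toy best-of-n sampler. This is only an illustration of the general principle of spending extra compute at inference time; the function names and the random stand-in "model" below are made up for the sketch, not anything OpenAI actually runs:

```python
import random

def sample_answer(rng: random.Random) -> float:
    # One "forward pass": a candidate answer whose quality is a
    # random score in [0, 1). A stand-in for a real model's output.
    return rng.random()

def best_of_n(n: int, rng: random.Random) -> float:
    # Test-time compute in its simplest form: spend n times the
    # inference cost, keep the best-scoring candidate.
    return max(sample_answer(rng) for _ in range(n))

rng = random.Random(0)
trials = 2000
one_shot = sum(best_of_n(1, rng) for _ in range(trials)) / trials
eight_shot = sum(best_of_n(8, rng) for _ in range(trials)) / trials
# Average answer quality rises with n even though the "model" never
# changes: better answers, bought with more compute per question.
print(f"1 sample: {one_shot:.2f}, 8 samples: {eight_shot:.2f}")
```

The point of the sketch is the trade: the underlying model is frozen, and the only lever being pulled is how much compute you burn per answer at inference time.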

Speaker 3 (38:45):
Man, and I could have had a moment of reflection when the reasoning models came out, because I was...

Speaker 2 (38:51):
Like, okay, it was still the height of the fever, though.

Speaker 3 (38:53):
I know, but so I asked it a hard question. I did a math degree and a computer science degree, right? So I was like, take this topic from sophomore-year abstract algebra and do this visualization of the thing. And I took one of the early reasoning models, which everyone was like, ooh, o1, because it's like, you have a PhD-level thing in your pocket. Okay, so a PhD-level thing should

(39:16):
be able to take a real sophomore-in-college math concept and visualize it.

Speaker 2 (39:21):
Yeah, you should be able to do that. And then it didn't. And then I kind of just... I was just like, oh, I guess... hmm. And then I just didn't think about it.

Speaker 3 (39:32):
And then I just kept on kind of hoping that
something exciting would happen.

Speaker 2 (39:36):
Yeah, and I can have a bit of empathy here. At that time, it was September twenty twenty-four; a month later, they'd raised six point six billion dollars and gotten a credit facility of four billion dollars. It looked like OpenAI was going places, unless you're like me and you've read every single possible financial thing you can get your hands on

(39:56):
and you're obsessed with the numbers. But I can get why someone stuck within the booster ring might not immediately be like, fuck. Because, yeah, people hadn't built things with reasoning yet, and it did actually take a few months for people to work out products with reasoning.

Speaker 3 (40:11):
Yeah, and whatever. I mean, I don't know what the improvements were. And they improved on the benchmarks.

Speaker 2 (40:16):
That's fine.

Speaker 3 (40:16):
And I'm sure that the coding results are marginally better.

Speaker 2 (40:21):
That's the thing. Though marginally.

Speaker 3 (40:23):
Yeah, it's always marginal now. But that's the thing: three to four was sick. Huge jump.

Speaker 2 (40:29):
That was sick.

Speaker 3 (40:29):
That was not marginal. If your lights were on, if you were paying attention, and you typed a thing into three and you typed a thing into four, you should be impressed.

Speaker 2 (40:37):
Oh, I remember the jump. I wasn't doing Better Offline at the time, didn't do that until February twenty twenty-four. But I remember just being like, wow, we made a computer do this, and this is cool.

Speaker 3 (40:49):
Okay, now, yeah. And so I was vaguely aware of the line chart that I mentioned in that post, and so I was like, oh, it's a freight train toward an actual really cool thing, because it's like: just make it bigger. And if we just need to make it bigger, then we do need more compute. And the reasoning models...

Speaker 2 (41:08):
With which you finally got another way to throw compute at it, because there's the training compute, and the compute to generate, and now test-time compute. Wow.

Speaker 3 (41:15):
Yeah, so that's where the freight train's over. Yeah. And I just... four point five came out, didn't really think about it that hard. They started doing the reasoning stuff, and it's like, okay, well, they have marginal improvements, and they say it did really well on a math olympiad or whatever, and that's interesting. But then it's like, another whole year goes by,

(41:36):
and then GPT five comes out, and, like, what was that?

Speaker 2 (41:40):
It was nothing and it was so strange.

Speaker 3 (41:42):
And so that was the final... So when I'm talking about the confluence of events that cured me of my boosterism: I started reading your stuff about it being expensive, and then I was like, this is interesting, I've started reading this guy Ed's posts. GPT five is coming out next week. I wonder if this guy is going to have an extraordinary amount of egg on his face. Yeah, like, you might have been scared.

Speaker 2 (42:02):
I wasn't, because I have the stone will of the Buddha. But I was also just like... when reasoning was coming up, going back to twenty twenty-three, they did some real shit with the rumors around Q-star. It was like, the reason Sam Altman got fired was they found a terrifying new AI. They kind of drummed it up. There were leaks about it.

(42:24):
There were leaks about levels of intelligence. There were all of these big leaks. There was really no leaking around GPT five, other than a Wall Street Journal story towards the end of twenty twenty-four where it was like, yeah, it's costing a shit-ton of money, it isn't working very well. Because the thing is, and I mean this to this day, if I am wrong about all this, and I don't think I am, I will admit it, I will explain why. But GPT

(42:46):
five I wasn't particularly worried about, because I could not fucking tell you what it was going to be. Like, no one really... if you go back to twenty twenty-three and you look up GPT five stuff, the shit that people were saying is insane. There was someone saying it would be completely autonomous and it would turn weapons systems against people. There's bonkers shit.

(43:06):
But in the lead-up to it, yeah, it was kind of a proving point, and it was just another fucking model.

Speaker 3 (43:11):
Well, so you and I had started exchanging emails, because, whatever, I saw your podcast and I sent you my posts, and it was interesting talking to you. And when that announcement was going on, I emailed you, and you've got a lot of emails coming through, but I said: Ed, Ed, Ed, Ed, Ed. They announced paywalled chat colors.

Speaker 2 (43:30):
Yes. No, no, I remember this, but go through it. In the announcement of GPT five, the biggest thing ever, they're like: for our paid subscribers, you can turn your chat yellow. Which they still haven't released.

Speaker 3 (43:43):
They still haven't released that. I'm not a paying subscriber, so I've never seen a yellow chat.

Speaker 2 (43:49):
I'm paying for ChatGPT Plus. See if you can turn it yellow. I want to see if I can do this live. Yellow, pink, green... change my window color... GPT... no, I'm going to ask it, because it's fucking insane if this doesn't... It's searching the web. Yes, you can change... can you, though? On

(44:13):
some platforms you can change the accent color. God, this fucking stinks. The fact that you can't ask a product what it does.

Speaker 3 (44:21):
Well, if you can't ask... like, I don't know, what is it good for? I mean, if you couldn't type into a Google Doc in two thousand and seven "what does Google Docs do," that's unsurprising.

Speaker 2 (44:31):
But that's because Google Docs is a place to write words. This is meant to answer the thing.

Speaker 3 (44:34):
Well, no, that's what I'm saying. They've claimed that it's this all-knowing, omniscient thing, and it cannot tell you how to turn... Shouldn't it have just done it for you?

Speaker 1 (44:43):
Yeah?

Speaker 2 (44:44):
Huh. Like, give me the...

Speaker 3 (44:45):
It should have said: Ed, great question, would you like it to be blue?

Speaker 2 (44:49):
Yellow?

Speaker 3 (44:49):
Pink, or chartreuse, or orange? And, like... but where was the... That's what it should have done.

Speaker 2 (44:57):
And also, the idea that that was one of the announcements is very, very cool. I love the idea that it's the biggest moment ever and you can now make ChatGPT brown. It's insane. It's insane.

Speaker 3 (45:08):
Brown is one of the supported colors. Probably not until next year, though. Yeah, that relies on the compute, actually, the Oracle deal.

Speaker 2 (45:15):
Yeah, it's brown. I'm brown. Oh my god. It's so cool that we've built our entire economy on top of this as well. But the GPT five thing, it was such a weird moment, because watching everyone try and be excited about it was really good. There was the whole Theo... wait, not Theo from The Information... there's this fucking guy. Now I'm gonna...

(45:39):
I really shouldn't have blanked, I've mentioned him. He did a whole thing about GPT five. I'm gonna look this up live on this professional show. He did a whole thing saying GPT five is the most amazing thing ever, and then had to walk it back. Yeah, actually, yes: Theo Brown. There we go, Theo Brown. He did the thing saying, I'm scared of how good GPT five is.

(45:59):
Then a week later he was like, actually, it's not the same as when I used it. Which is crazy. That should have been a scandal, like, why was that the case? But everyone just kind of moved on. And I don't know what we're meant to be excited about next. GPT six? Well, yeah. But also, what's that meant to be? Because GPT

(46:20):
five was this weird kind of myth in the future. It's like, when we reach this, everything will get better. But now it's like, we're gonna get Claude Sonnet 5.

Speaker 3 (46:28):
I guess, yeah, whatever. I mean, you're gonna get the next one. But that's the thing: if it's just continued marginal improvement, what are we doing? It doesn't make me... that did not make me excited. And yes, it can save software engineers typing time, and whatever. I mean, if you know what you're doing, you can get a lot done, I guess, and that's fine. If that's the way you like to work, if you like to type stuff in and

(46:49):
wait on loading screens and get your code out and review it, that's a way to do programming.

Speaker 2 (46:54):
That's fine. Yeah, and it's literally fine.

Speaker 3 (46:58):
I'm actually saying like it sounds super sarcastic, but like
that's literally fine.

Speaker 2 (47:01):
No, but "that's literally fine" would be if this was a ten-billion-dollar industry, if they were selling it as, like, the equivalent of virtualization, or some side thing to the greater cloud compute business, not the entire future revenue engine. Because it isn't. I mean...

Speaker 3 (47:19):
And so, you know, I run a business, and a thing that startup people say is, you have product-market fit, which is like, oh, your product is good. And one of the criteria is: if it went away today, would your users throw a fit?

Speaker 2 (47:35):
Yeah?

Speaker 3 (47:36):
Would you throw a fit if ChatGPT got uninstalled from your phone? You would not. But, you know, would the general person be that upset? I don't think they would.

Speaker 2 (47:43):
I think there would be a contingent of people who'd be very upset if you have, like, a parasocial... exactly the right word. Yeah, if you're, like, in love with your GPT, then that would be like a death in your family, and that's very sad, which would be horrible. But I've been saying this... like, I say it to boosters: if this disappeared, would your life change? Would it really change

(48:04):
that much? And it's like, well, I used it for baby names. I've used it for...

Speaker 3 (48:08):
Like you named your baby after No. I'm just saying, like,
if you like, if.

Speaker 2 (48:12):
You've got a baby name from chat GBT, that's that's tough. Yeah,
that's really bad. The more that was said to me
by a booster and the more I think about, the
more I'm like, brother, one day, your child is going
to hear this, because all they do is they.

Speaker 3 (48:24):
Sell a book called like the Baby Name Book, and
it has like a list of names in it.

Speaker 2 (48:29):
I don't fucking know, read some books, just think about it. Yeah. One of the most important choices, the identity of a future human, and I'm going to send it to an incorrect Google search. Yeah, it's depressing, but it's also quite funny, because I feel like this era has really revealed who just doesn't know anything about

(48:50):
fucking anything. The people who will just believe anything, or will believe that they are smart at something because a machine told them they are. That they got this. You got this.

Speaker 3 (49:00):
So someone online posted: I can't wait for the day when there's an AI agent that'll tell me when my friends' birthdays are.

Speaker 2 (49:09):
Fuck, there's no other way to do that. There's no way to do that. It's not like I have some kind of calendar.

Speaker 3 (49:15):
No, no, it's going to be a reminder. And that's what's happening now in the startup space, or with people building technology generally. Or if you watch the ads on the NFL or whatever, it's like: an agent is going to do the thing that software is supposed to do. Like software. So I got sold accounting software at one point, right, that was AI. Right,

(49:38):
the AI is going to categorize your transactions. Sure. It can't do it. I bought it. I was at a conference, and it was, you know, whatever, May twentieth, May twenty-first, May twenty-second. I go to Starbucks, I go to pizza, I go to a thing. They're all travel-related expenses. Yeah, but one is travel, and then the Starbucks is client conversation.

Speaker 2 (49:59):
That's what it decided: client conversation. And so I had a meeting with the founder of the thing, and I was like, dude, what is this? What did they say? Well, they were just like, you know, sometimes it makes mistakes, we should get on that. Fuck. Yeah, that's my accountant. My whole thing is, I think with

(50:19):
all of this AI coding stuff in the big tech realm, something's gonna break. Really. Someone's gonna do something stupid. Yeah.

Speaker 3 (50:28):
Well, so back to Replit. I kind of hope that they're... I don't know if I hope that they're first to go. I mean, whatever, there are nice people working there. That's the unfortunate thing: there are nice people working at these places, people with jobs. Like, it will affect people.

Speaker 2 (50:43):
I don't I wouldn't want.

Speaker 3 (50:45):
to ask for people to get laid off who are hardworking people. And some of them are, like, cool scientists who have studied hard, and they're nice people.

Speaker 2 (50:52):
And the grim part of all this is that people are gonna lose their fucking jobs.

Speaker 3 (50:55):
But it's the executives who, you know, obviously piss me off, for just lying through their teeth, right? I mean, those people deserve it, but you know, they're never actually going to have a bad outcome happen...

Speaker 2 (51:05):
To them. Which is why we need to write things and put them out there, because at some point there needs to be a record of this, of course. Yeah. So I'm going to wrap it there. Charlie, where can people find you?

Speaker 3 (51:15):
So I have a blog, blog dot charliemeyer dot co, which is kind of where my writings go. But I'm also trying to set up a newsletter, so that's csmeyer dot substack dot com. And my name is spelled M-E-Y-E-R.

Speaker 2 (51:30):
Hell yeah. And I, of course, am Ed Zitron. You can
find me on the internet at Google dot com. That's
where I live. I will put all the links to
Charlie's stuff, of course, in the episode notes, but it's
good for you to hear it now. And yes, I should
have a monologue coming up this week. I know I
did an announcement where I said I was going to
have a big story. That is on hold, not because
anything went wrong, but because the scale of the information

(51:51):
I got has changed dramatically. When I eventually talk about this,
it will be a lot of fun. Otherwise, catch you
soon. Thank you for listening to Better Offline.

Speaker 5 (52:07):
The editor and composer of the Better Offline theme song
is Matt Osowski. You can check out more of his music
and audio projects at mattosowski dot com, M A T
T O S O W S K I dot com. You
can email me at ez at better offline dot com,
or visit better offline dot com to find more podcast
links and of course my newsletter. I also really recommend

(52:29):
you go to chat dot wheresyoured dot at to
visit the Discord, and go to r slash Better Offline
to check out our subreddit.

Speaker 2 (52:36):
Thank you so much for listening.

Speaker 1 (52:38):
Better Offline is a production of Cool Zone Media. For
more from Cool Zone Media, visit our website, coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.
