Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Time.
Speaker 2 (00:00):
It's like a clown. No, don't this a little page.
Speaker 3 (00:02):
He's bagging boarding batman and the gut or like a
maze story tellers me some fellas, we some felons.
Speaker 1 (00:06):
Isn't amazing.
Speaker 2 (00:06):
It's like Appella bear sellad because this shit is so contagious.
Speaker 3 (00:09):
Mouths on the summer Reason pilot got the show while
the cycle spinning knowledge on the getty like a pro
beat the babo be the rabbit. Don't step to the squad,
we get activic and hate. It's like a stepla parts.
You don't like fish talk?
Speaker 1 (00:20):
Do you hate? It's a batl with.
Speaker 3 (00:21):
The cuttle fish killers tender pools on the taping Greatest
five of Stars. If you cherish your life, fucky barneshit
squad spraying leg and your pipe.
Speaker 2 (00:33):
Hey, everybody, welcome to another edition of Is This Just Bad?
Is This Just Bad?, the best podcast you've ever heard
of. I'm your host, Professor Mons, joined as always by
the Cebes Cosmologist and Teddy Hell.
Speaker 4 (00:44):
Hello folks, good to be back. What's…
Speaker 2 (00:50):
Going down? Like the Dow right now, for fuck's sake.
We were just talking poli sci, politics, before we started rolling.
Speaker 5 (01:07):
I like poly science better; you can choose all of your
various romantic polygons.
Speaker 1 (01:13):
It is sociology in some way.
Speaker 5 (01:15):
But yeah, that's not what we're talking about, unfortunately.
Less interesting to me, but go ahead.
Speaker 2 (01:21):
The stock market is just, like, so bad.
I feel like they just kind of stopped reporting about it.
It's just, like, going down the toilet, and it's the
kind of, like, well, what can we do? There's nothing
to do. There's, like, all these rug pulls happening. The
tax bill, which is insane, the tax bill that's, like,
circulating, that Congress might pass. It's like, we're going to
(01:45):
be paying more taxes. Everyone on this call is going
to be paying more taxes if this shit passes. It's like, wow,
I pay too many, I pay too many.
Speaker 1 (01:57):
I would have no problem with, like, I have no
problem paying taxes if it was, like, food programs. Yeah,
I don't mind paying, what I think it's a dollar
seventy, no, it's for where I live, in a mysterious
county that I didn't just say. I mean, there's a
(02:20):
certain amount of tax dollars that go to food programs. Yeah,
I will pay for SNAP benefits if somebody's hungry. Sure,
I will give you the money. But for my tax
dollars to literally go to parts of
this administration, to their personal companies and bank accounts, that
(02:41):
makes me a little mad.
Speaker 2 (02:43):
The problem is the United States. That's the fucking
problem. Because, like, when you look at your check in
the US, for global listeners, they take so much out
of it, because every part of our lives is privatized.
So it's like, if we're gonna retire,
(03:04):
we gotta have money coming out of our check. If
we want healthcare, you gotta have money coming out of
your check. Like, you have to have taxes coming out
of your check, otherwise you got to pay them on
April fifteenth, and it's gonna be a lot of money.
You have Social Security coming out of your goddamn check,
and that shit's going into a pot that you might
not use. So your income is so desecrated
(03:26):
by the time all that shit comes out so that
you can, like, continue to live. And that's the problem.
If all of that was just taxes and they went out,
and, God, like, we had
universal healthcare, fucking universal retirement and all that shit,
and fucking universal public school, like,
(03:48):
just, like, the normal shit that all other countries have,
I would be fine with it. But we're paying taxes
into just more defense spending.
Speaker 5 (03:59):
And shit, yeah, absolutely. The rug pull, I think,
is an interesting point: tanking the markets and then
bringing them back up and then tanking them again to
just transfer wealth to five guys.
Speaker 2 (04:18):
Not the Burger joint.
Speaker 4 (04:20):
No, I don't know if they.
Speaker 5 (04:21):
If they did an IPO and there's a
Five Guys stock, that's probably doing okay. That shit.
Speaker 4 (04:27):
Yeah, but no.
Speaker 5 (04:29):
The Zuckerbergs, Bezos, and, like, three other dudes.
Speaker 2 (04:35):
That sucks.
Speaker 5 (04:37):
It's so frustrating. And yeah, absolutely, like, I'm happy to,
you know, if you could pay taxes so that the only
money that goes into political campaigns is tax-funded, as
opposed to private campaign finance, and my library system, all
of that is awesome. But yes, all of the additional
bullshit that is functionally a tax but is not going
(05:00):
to the government, is going to private insurance and retirement,
and then, as you said, Social Security, which may not
be around by the time we're ready to pull back
out of it.
Speaker 4 (05:13):
Yeah.
Speaker 5 (05:14):
Kind of a bummer. So, did you have something specific
you want to follow up on there, or are we
just starting this off with a big bummer?
Speaker 2 (05:25):
No, it was just that weird thing. Yeah, the
tax bill crashed the Dow, and it's like every
economic policy is a way to crash the market so
that people can invest. And, low-key, I think it's
juicing crypto as well. Yeah, probably, because this shit, like,
(05:47):
whenever, like, the traditional speculative markets
are having trouble.
Speaker 5 (05:56):
People run away to an even more speculative, fake market,
an even faker market. Yeah, no, line not go up on
imaginary graph? Let me go to a digital, imaginary other
graph that burns forests in order to exist somehow.
Speaker 2 (06:13):
Yeah, well, at the very least, like, with the Dow,
when you see a Coca-Cola stock, you go, like,
I can at least understand. I don't understand this abstraction,
but I've had Coke, right?
Speaker 5 (06:26):
And that number, that line going up or down is
somehow theoretically attached to the process of producing and consuming
that soda that.
Speaker 4 (06:37):
I have tasted.
Speaker 5 (06:39):
Yes, yes, like there is still kind of a physical
product attached sometimes.
Speaker 2 (06:44):
Yeah, I've never eaten a bitcoin.
Speaker 4 (06:47):
I've never, never bitten down on a coin.
Speaker 2 (06:49):
I've never seen a doge in the wild, like, uh, yeah,
so I don't know what the fuck is going on.
But yeah, Teddy was saying before we started recording
that he hadn't been following the news. But, I mean,
when you open up the news, every headline
(07:10):
is horrid. Like, uh, the tariffs are hitting. I'm getting
emails from companies that I have subscriptions to. I have
a dice subscription, it's a great dice
company called Dice Envy, and their subscription was, like, super
fucking cheap. I remember canceling a couple of streaming services,
(07:34):
meaning we have, like, forty-five extra dollars, and
then this dice subscription was twelve bucks a month. So
I'm like, okay, you know, we're shaving some off,
let me indulge a little bit. And then they sent
an email, like a very, uh, you know, upfront email,
being like, you know, the tariffs have already impacted our business.
(07:55):
We're gonna have to increase our monthly subscription to twenty,
it was like twenty-two dollars or something. Like, I
canceled it. I'm like, that's fucking double. That's double.
I can't pay for that.
Speaker 5 (08:06):
Yeah, I've been tracking, they're called the Wand Company, and
they do fancy prop replicas, and they have been hard
at work for years on a really nice original series
tricorder replica, where they went through and they scrubbed all
(08:30):
of the communicator logs from every episode of the original
series and put together a bunch of, like, semi-working
parts, and there's a bunch of cool stuff that this
tricorder does. And they finally got it
into production, and then the tariffs hit, and they're
based in the UK, and they're like, uh, we
(08:52):
can't sell this to you for a while. And then, oh, well,
now the tariffs have gone down, maybe we can, but
it's gonna cost, like, a hundred extra.
Speaker 2 (09:00):
Yeah, we can't sell this to you for four years, uh,
maybe eight.
Speaker 5 (09:06):
Yep. And, like, please don't scalp it and buy a
bunch of extras. But also, it's too expensive to even
buy one. I'm like, well, here's the thing: you can go
online and listen to the communicator logs if you want to.
Speaker 2 (09:18):
Sorry, god damn, oh my god. Yeah, it
is weird how, I mean, it's only been five months,
and you see the impact was immediate and
it's continuous.
Speaker 5 (09:33):
Meanwhile, the tariffs themselves are rug pulls, and it was
just like, oh, you know, two hundred percent tariff or whatever. Oh, well,
just kidding, we're not doing that, we'll delay it for ninety days.
Well, for another ninety days. We don't know what we're doing.
But by that time it's too late, because the
supply chains, the dreaded supply chains, work so far
(09:55):
in advance that that uncertainty hits everybody. And when you
flip-flop back and forth, lower tariffs, higher tariffs,
nobody can predict what they should be ordering. Yeah, and
obviously, like, we're talking about, like, luxury bullshit, but you
know it's going to seep into.
Speaker 4 (10:16):
Every aspect of our lives.
Speaker 5 (10:18):
Yeah, trying to figure out, like, cosplay stuff for the
next con I want to go to, and, like, I
don't know if I can get any of this stuff
shipped to me, or if it's going to cost two
hundred more.
Speaker 2 (10:30):
Well, that's the thing, it's the raw materials. Like, raw
materials are increasing, so consumer goods are increasing. I,
like, I thought twice with, like, a dice company,
because, I mean, this is not a huge company. This is, like, the Internet equivalent of,
like, your local gaming store, yeah, or whatever, which those have,
(10:52):
like, disappeared. But then the fucking, the excuse
for, like, Wendy's to make the Baconator fifteen
dollars is bullshit to me. What is that?
Speaker 1 (11:03):
Yeah, one of the biggest issues that we've seen
anytime any of these supply chain issues
hit is, for all of that talk about
the market self-regulating, the second these supply chain issues hit,
(11:24):
there are very few abilities for either
us as the American people or, at the time, the government
to stop price gouging when it hits, and even now it's being encouraged more.
We saw it with eggs, we saw it
with milk. We've seen it with, weirdly, there was
(11:44):
a potato antitrust one that came up, I think, between
December and January. And we're just seeing, like, now that
it's been a bit, and that was just from, like,
the Suez Canal being blocked, and another, like, the two
canals that got blocked. That's when we were like, oh no,
(12:05):
this is the supply chain issue. Oh wow, globally this
all went up. Now that we have actual policy that's
only been in effect for a month and a half,
we haven't gotten through twenty twenty-five yet. It's not
June at the time of recording this. We haven't hit,
we haven't hit it yet. Yep.
Speaker 2 (12:31):
Yeah, it's a fucking massive bummer. And yeah, I don't know,
like, all of the free-market bullshit. I'm
reading a lot of, like, labor history stuff, and it's
shocking how many recessions there have been since the
Civil War, like, the sort of onset of the Gilded
Age and laissez-faire capitalism and free-market capitalism and shit,
(12:56):
all of that fucking shit up into the Depression. There
was almost immediately a panic in eighteen
seventy-three, and it was all because of, like,
overcapitalized railroads. Like, I mean, they were getting away
with fucking highway robbery back in the day. They were
(13:17):
selling, like, bonds to railroads that didn't exist, type of shit,
what the fuck's going on here? And then massive overproduction,
like, that was always the thing. You know, from
the labor side, scientific management was, here's what we're
gonna do. The thing that unions do is they try
(13:41):
to cap labor productivity. They try to say, you can't
kill us in the factory, like, we're not gonna work
for twelve fourteen hours. We're only gonna work for eight
and in those eight hours we're gonna have this many breaks.
And also in the eight hours, we're not going to
exceed this level of productivity. And there's a
(14:06):
quote from, like, a union organizer, and
this is back in the nineteenth century, who was like,
the only people who I know who've stayed in
any industry long enough are the people who drag their
feet and who, you know, don't give it their all.
And then the people who come in as scabs trying
(14:28):
to steal jobs away from people, they're out in a
couple of years anyway. And scientific management and Taylorization came in,
and they were like, the only objective is to kill
labor because labor caps production, and what we're looking for
is unlimited production. And when every business that's making fucking
(14:53):
soda or whatever is unlimitedly producing, then it floods the market.
People don't have money to buy this shit, and so
the shit just stays there. Overproduction, inflation, and then recession.
And it was just happening constantly: in eighteen seventy-three
and then in eighteen ninety-three. Arguably, some economic historians
(15:16):
don't think the eighteen seventy-three recession ended until eighteen
ninety-nine. So if you take that, then you just
have one sustained recession. And then, after eighteen ninety-nine, in
the beginning of the twentieth century, there was this massive
boom period, which was then killed again: recession in nineteen
oh seven, recession in nineteen ten, recession in nineteen twenty
right after World War One, and then the Great Depression,
(15:38):
and then that's when the New Deal built all the
fucking regulatory frameworks. And then you started to have, like,
just after those were eliminated, continuous recession-era experiences since
two thousand and seven. And it's like, it's just history.
History is undefeated at holding a mirror up to society
(16:01):
and going, like, you know, the latter. Like, you don't
have to guess or speculate or have hypotheticals or have
people, like, going in and being like, well, you know what,
if you disincentivize people, then they're not gonna work. It's like,
we have history to tell you that when you let
markets go fucking wild, people are gonna be disincentivized from
(16:23):
working for nothing to buy nothing or to be able
to afford nothing, and the fucking economy is gonna do worse.
It just absolutely is gonna do worse.
Speaker 1 (16:34):
Yeah.
Speaker 5 (16:34):
The one thing is that people want to ignore
history, like, oh, but the technology is new, and,
you know, we don't know what it'll mean
when we, like, inject AI into anything, or whatever the
fad of the week is. And the big difference, and
it's not even different, because you can look at various
other periods of history for this too, is when you
(16:57):
flood the markets and have unlimited production, you also burn the planet. Yeah,
like, I don't have anywhere to store all this stuff
I made that nobody can buy, and also I've run
out of resources to make more things.
Speaker 2 (17:11):
Yeah, no, it's insane. And then, yeah, and then,
like, that's always so overstated too, because automation during this
sort of post-Civil War period didn't lead to increased
or more efficient productivity; creating skilled laborers did. Yeah,
(17:31):
like, the fucking machine doesn't run itself. I mean, and
that's why there's a very sort of real way that
AI is a sort of world-historic shift. But even then,
we fucking have to program AI until it starts programming itself,
and then we just, you know, I guess we just die.
Speaker 1 (17:51):
One thing with the AI, because I'm having to deal
with it a lot in my day job as of now,
as of this recording: right here, there are some things
AI can do spectacularly well, in horrifying circumstances, but the
idea that it can actually fully replace people in a
(18:14):
meaningful way, the way folks have been talking about
it ubiquitously, it's not there at all.
Speaker 5 (18:23):
It's capped. Like, it can't get there the way
it's built currently, because the large language models are done eating.
There's no more data to eat. They're eating each other
now and cannibalizing and getting dumber and worse. And
the generative models are just predictively throwing stuff
at a wall. It doesn't do basic calculations the way you
(18:47):
would need it to, because it doesn't actually read anything.
It's just predicting what usually goes next in this pattern, and.
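For the curious, the cannibalization the hosts describe is what researchers call model collapse, and a toy version of it fits in a few lines. This is a minimal sketch of the general idea, not anyone's production training loop; the Gaussian stand-in for a generative model is purely an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Human data": the wide distribution the first model gets to train on.
data = rng.normal(loc=0.0, scale=1.0, size=50)

for generation in range(1, 21):
    # "Train": this stand-in model just estimates the mean and spread
    # of whatever it was fed.
    mu, sigma = data.mean(), data.std()
    # "Deploy": it floods the web with its own samples, and the next
    # generation trains on that synthetic output instead of fresh data.
    data = rng.normal(mu, sigma, size=50)
    if generation % 5 == 0:
        print(f"gen {generation:2d}: mean={mu:+.3f} std={sigma:.3f}")

# With no fresh human data entering the loop, the estimated spread tends
# to shrink and the mean drifts: the photocopy-of-a-photocopy effect
# described later in the episode.
```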
Speaker 1 (18:56):
That does. And that's where the, oh, this is existentially
unfortunate and horrifying to look at, comes in. What AI is really
good at is looking at massive amounts of data and
being able to spot those trends, as you said, and
one of the largest applications of that has been, and
(19:17):
is, in the dragnet scenario for security. So, I mean,
just flat out, and it's all public, I like to
couch stuff like this by saying this is not conspiracy theory,
Pepe Silvia. These are things that have been openly
(19:39):
talked about at conventions, free conventions where people just talk about
these industry things. Like, this is just on-the-record
stuff that they have stated. And by "they" I mean
literally the CIA, MI5, Mossad, and, like, a
few other folks. I think the Journe Intelligence also, but
(20:02):
I don't fully remember if they were at that conference.
But essentially, the biggest problem with the dragnet methodology, when
it came to scraping data and getting it, is that
law enforcement are still people. They are still human beings.
There was no way to be able to sort between
(20:22):
noise and what actually is chatter, what is
coded language versus what is not. Collecting all of that,
even if it's through Google Ads, it just was not feasible
to truly understand what's going on with people's movements and
all of the data that they can get through your phone,
your web browser, Fitbits, all of these, like, the Internet
(20:45):
of Things, how often you use your internet-connected oven, stove, fridge.
Put all of that together: yes, literal surveillance state. But
the problem was, they had so much they couldn't do
anything with it.
Speaker 4 (20:58):
The issue there was.
Speaker 1 (21:00):
Literally too much junk, and it was like, well, none
of this is meaningful. The scary thing about AI is,
a lot of the machine learning algorithms that have processed language,
large language models, as well as some of the calculations
that they're able to make, to look at the regression
algorithms for behavior, now are actually pretty good at efficiently
(21:24):
going through all of that and saying, these are outliers,
this is not. Yeah. And unfortunately, I think that has
been, for many governments, worth the investment. I say it
that way because, in the last year, of the four
hundred and ninety billion dollars of investments that various companies have
(21:47):
put into things like OpenAI data centers, all of
this, largely, a lot of these AI companies and these
models have not actually found the gold, or the panacea,
that they were promising with AI. It just has not happened.
Microsoft halted its current funding for one of its, uh,
(22:10):
larger data centers and has lost a large portion of
its investment into AI, just, period. It's so expensive for
them to run that they are pulling the investments, because
there's no way for them to recoup it except for
what we've been talking about when it comes to rent-seeking
behavior. Now that all of these AI models have
(22:34):
been pushed onto most current, a lot of, operating systems,
they're like, well, we're just going to raise the prices
every time. So it's no longer, oh, you pay a
little bit more and you get this for free. Oh, okay,
this part has ended. This
is now Teddy's future sight, so, future sight corner:
(23:02):
from what I'm looking at, sometime in twenty twenty-six,
maybe mid-year or end of year twenty twenty-six,
the amount of money being charged for all of these
AI agents that run on ChatGPT is going to
skyrocket exponentially.
Speaker 5 (23:22):
And frankly, that can't come soon enough for me,
because that will kill a lot of investment. Like, oh,
this is now too expensive, maybe it is actually cheaper
to just use people, because this big fancy dream we
were promised about it replacing all kinds of support tools
(23:42):
or support teams doesn't actually work the way they expect.
Speaker 1 (23:46):
I want that to be true, but also, like,
things like Duolingo, parts of Microsoft, parts of Amazon,
parts of Facebook have been laying off people in favor
of AI support and AI agents. And the problem with
that is, by the time they start getting
(24:09):
hit with these costs, people are just going to
have been out of work for a year.
Speaker 4 (24:14):
Yeah, No, it sucks.
Speaker 5 (24:15):
They have been laying people off really aggressively, assuming that
they'll make it back. Like, one, it's just enshittified,
and it doesn't matter that the product and the service get
worse while they're waiting for AI to somehow get smart
enough to fill in the gaps. And then, when they
realize that it can't, everything's already broken, and I don't
(24:39):
know that there's a big incentive to go back and
fix it, as long as there's this race to the
bottom where everything's just getting worse and we're scraping out
the quality and we're left with hollow shells of products
that used to work.
Speaker 2 (24:53):
Yeah. And it's also, I mean, what's happening with AI
happened with the Internet in general when the dot-com
bubble burst. And it's what happens when any new
technology is introduced into labor: it becomes an object
of intense focus, it becomes potential ROI. So there's a
(25:17):
ton of investment. And whenever there's a ton of investment,
you know, fucking people lose. Like, what was it? However
many people invested in HD DVD instead of Blu-ray, they
just lost their shirts. However many people invested in, like,
certain types of electricity in the fucking late nineteenth century,
(25:38):
they just lost, because they invested in the wrong thing.
And what it also creates is a bunch of
fucking charlatans. There were a bunch of,
like, quote-unquote scientific managers who emerged to be like,
you know, I'm an expert in creating the ideal efficient
(25:58):
conditions of factory labor, no one else but me is
good at it. And there were thousands of guys saying that.
Speaker 5 (26:07):
Yeah. And, like, there are thousands of guys being, you know,
expert AI prompt engineers or whatever, and, like, yeah, this,
this AI tool is the silver bullet. Like, bro,
you've just reskinned ChatGPT 4.0, put
your white-label name on it, and you're
(26:27):
pretending it's some kind of magic box when it's a
mechanical turk.
Speaker 4 (26:31):
And yeah, there's.
Speaker 2 (26:32):
Enough people with enough money are going to invest
for a little bit. I mean, it's going to burst somehow.
A lot of people are going to lose a lot
of money on this shit. And it's, yeah, I mean, this is
the fucking free market. The free market is a hellscape.
To be like, this is good,
(26:53):
is to be, more often than not, the winner rather than the loser,
because the losers often lose everything, in a very sort
of, you know, not, like, super unsympathetic way. I
don't mind when a billion companies emerge to fucking
(27:19):
formalize AI systems and, like, ninety percent of them go
bankrupt and all of those guys lose all of their money.
That sounds great. That sounds like an ideal America.
Speaker 5 (27:29):
Yeah, if they end up bankrupting a couple of venture
capital firms on their way out, cool.
Speaker 2 (27:35):
Yes, yeah. But, like, in the American way, we'll probably
bail them out and then try to use
that shit. The other thing, too, is, like, AI does,
and now I'm speaking from the
student perspective, AI does need to get better if it's
going to continue to be useful for students, because now
(27:59):
there's, like, a recognition as soon as you read an
AI paper that it is an AI paper. Like, it
has qualities that are unmistakable. Like, there's a sterile way
that all AI communicates, where when you're reading it,
you're like, this isn't a human being, this is a
(28:19):
machine that's writing this. And also, AI never, ever, ever,
ever cites anything that's ever been assigned in any class
ever taught. And a lot of, a lot of times,
when you look at somebody's bibliography and you do a
little investigating, you go, oh, this fucking book doesn't exist,
(28:42):
or, like, this was a fake book that was mentioned
in a novel. One time, like, the data scraping was crazy,
where you email a student and
you go, like, it's interesting. First of all, your paper
didn't cite any of the material that was assigned in class,
so you do get a zero. Also, I was looking
at your bibliography, and, uh, three of the four sources
(29:05):
that you cited don't exist on Earth.
Speaker 1 (29:10):
Could you?
Speaker 2 (29:11):
Could you please? But here's the thing you can't do:
you can't be like, you get an F because AI
wrote this, because you can't prove it.
Speaker 5 (29:20):
You can't prove it, but you can break it down,
like, if you did this yourself, you just hallucinated these sources,
so I'm fairly certain a machine hallucinated these sources. But
it doesn't matter, the end result is the same: these
don't exist.
Speaker 2 (29:37):
Or just citing shit that has nothing to do with it.
Or you're like, this was a paper about
racial capitalism, and, like, fucking Jane Austen is in this shit?
Like, what? Make the case for me,
Speaker 1 (29:50):
Please?
Speaker 2 (29:52):
So yeah, AI also does just kind of need
to get better, because it's kind of shitty. If you do Google,
my wife and I were doing this, we were
cracking up at any question that you ask the Google
AI chatbot, because now, every time you Google something, the
fucking AI Overview is the first thing you see.
(30:12):
So we were asking it questions about babies, and, like,
every time we asked any question about babies, because freaky
shit happens with babies all the time, like, outlier stuff,
Google's answer is always yes. It's always yes. Yes, babies eat babies.
Speaker 1 (30:35):
Yeah. No, that.
Speaker 2 (30:36):
Happens. Like, because the AI Overview, like, scrapes all
the data and goes, like, well, yeah, there was a
case where a baby ate a baby once, so yes,
it happens. And then it goes, well, infrequently, babies eat babies.
Like, the only way AI is interesting
(30:56):
now is as a fun cultural exercise. We've done AI scripts
that we've read on this podcast before. They suck ass.
They're horrible. And that is, I don't know, Teddy, why
isn't AI getting better? Why does it keep being
kind of bad?
Speaker 1 (31:13):
I mean, it's what Cause was saying. Well, okay.
So, to drill down and be more specific: generative AI,
specifically large language models, and models when it comes to
(31:37):
generating images and videos, have hit a cap on what
they are able to do, and that cap is quite
literally the extent of human knowledge that has been digitized.
Speaker 4 (31:52):
What a thing to say. I love that. It's absolutely true.
Speaker 1 (31:55):
It is. Pretty much what they've said is that
anything that has been digitized has been read, at
least from what has been publicly stated,
by any of the four big ones: OpenAI, the one from China,
then there's two other ones, one in the US and
(32:16):
two overseas. It's already read everything. And the problem
with that is, once it started reading everything that humans
have produced intellectually, now they're trying to grab more stuff
and continually grab what people are producing, and
what people are able to produce just can't keep
(32:40):
up with the needs of what they are trying, what
the models are trying to do. And now the other
part of this is, there is so much generated content
that it's cannibalizing itself. So it's doing, for our
older listeners, the photocopy of a photocopy; for
(33:02):
our millennial video-editing listeners, it's like you're trying to
re-encode a video file over and over and over again.
The quality is beginning to degrade because it's only copying
copied pixels and not taking from the raw images that it
used to. So that's why those two things are happening. In
(33:24):
terms of other types of, like, AI agents and machine modeling,
some of that is actually getting better. There is a
type of modeling called reward-based modeling, and it essentially
says, all right, here's the best analogy. One of the
(33:46):
applications, you could say, is: we have a track,
and you have this virtual car. This car is supposed
to go around this track in the most efficient way.
We're going to give it a reward. Essentially,
this thing gets a point every time it falls within x, y,
and z thresholds. You have to write
(34:07):
a little algorithm to say, within these parameters, you
get a point, and every time it deviates, it doesn't.
And it can get fractions, it can get fractions of
a point every time it succeeds. And the better the
reward algorithm, the more precise. So Amazon, with
some of their mechanical, some of their robot moves, for,
(34:33):
like, moving packages and sorting them, they've been using that
and actually having pretty good success when it comes to
some of the warehouse operations. Now, Amazon still employs huge
amounts of people, so even at the pinnacle of how
good these are getting at sorting, they still need labor,
(34:53):
human labor, in order to move stuff around. They're
doing their darnedest. Mons, what you were saying about the
folks studying, trying to get to the pinnacle of
people working: Taylorism is what was studied, where
somebody said, and I believe this is
in the twenties, I believe it's twenties, maybe thirties, this
(35:17):
guy was like, all right, if I time my workers' movements,
I can maximize efficiency. Taylorism has been largely debunked for
the last eighty, for the last sixty years. Like, people
basically went, no, that's not how human beings function, like,
that is an inherently flawed set of logic. His idea was, oh,
(35:38):
if we time how long it takes them to go
to the bathroom, how fast they move, how long they
get up, how quickly it takes them to do x,
y, and z tasks... And the problem I'm having with
some of these AI models of productivity is they're just saying, well,
if we had more data, Taylorism would work, when we've
pretty much proven that, no, Taylorism just isn't,
(36:01):
foundationally, a good measure of productivity,
and you can't actually increase efficiency this way.
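For readers who want to see the shape of the car-on-a-track reward idea above in code, here is a minimal sketch. The thresholds and the random stand-in driver are made up for illustration; the actual learning step (adjusting behavior to push the score up) is deliberately omitted:

```python
import random

# Hypothetical thresholds for the virtual car, per the episode's analogy:
# stay near the racing line, keep speed in a band, don't oversteer.
MAX_OFFSET = 0.5            # meters from the ideal line
SPEED_BAND = (20.0, 35.0)   # meters per second
MAX_STEER = 0.3             # radians

def reward(offset: float, speed: float, steer: float) -> float:
    """Score one timestep: a full point when every parameter is within
    its threshold, fractional credit when only some are."""
    checks = [
        abs(offset) <= MAX_OFFSET,
        SPEED_BAND[0] <= speed <= SPEED_BAND[1],
        abs(steer) <= MAX_STEER,
    ]
    return sum(checks) / len(checks)  # 1.0 if all pass, fractions otherwise

# A stand-in "policy": random driving, just to show the scoring loop.
total = 0.0
for step in range(100):
    offset = random.uniform(-1.0, 1.0)
    speed = random.uniform(10.0, 40.0)
    steer = random.uniform(-0.6, 0.6)
    total += reward(offset, speed, steer)

print(f"episode reward: {total:.1f} / 100")
# A learner would adjust its driving to raise this number; the better
# shaped the reward function, the more precise the trained behavior.
```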
Speaker 5 (36:08):
And people tried to reinvent it during the pandemic. Oh no,
everybody's remote, we can't, like, micromanage them physically
in the office, so let's do time tracking of everyone's
mouse clicks and keyboard strokes and whatnot. And that drove
everyone totally insane, and it didn't get you anything better
(36:28):
than what we already had. Yeah, now, that's a really
interesting point about the large language model, ChatGPT.
It's just fundamentally flawed in how it is designed to work,
like, it's a predictive algorithm rather than, like,
a thinking machine, which, obviously, a positronic brain is
(36:53):
harder to build. And certainly the technology that went into it,
predicting with any degree of accuracy,
Speaker 4 (37:00):
Is still fundamentally kind of cool.
Speaker 5 (37:04):
But when it's just like, oh, I'll just ingest the
entire world's information in order to create a bad calculator,
why did you build that?
Speaker 4 (37:15):
Exactly?
Speaker 1 (37:15):
And I will say, there are some really cool things
that it can do. There's one smaller LLM that I've used.
So, all of the LLMs that I've used, just, again, transparency
for everything: the ones that I've downloaded and started
training and using all run locally. I try not to, well,
(37:36):
except when I need to for, anyway, doesn't matter. The
ones that I use for personal stuff have all been
trained locally on my machines. So there are some really
cool models when it comes to understanding tone and being
able to analyze, when you've written something, how it's going
to come across.
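For a sense of what that kind of local tone analysis can look like in practice, here is a minimal sketch using the open-source Hugging Face transformers library. The model name is just a common public sentiment checkpoint, assumed here for illustration and not necessarily what Teddy runs; after the initial download, inference happens entirely on your own machine:

```python
# pip install transformers torch
from transformers import pipeline

# A small public sentiment model; downloaded once, then runs locally.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

draft = "Per my last email, I already sent you those numbers."
result = classifier(draft)[0]
print(f"{result['label']} ({result['score']:.2f})")
# Something like: NEGATIVE (0.9x) -- a hint the wording may read as curt
# before you hit send. Tone is richer than positive/negative, but this is
# the general shape of a local analyze-my-draft loop.
```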
Speaker 2 (37:56):
Mm hmm.
Speaker 1 (37:57):
Yeah.
Speaker 5 (37:57):
Grammarly has stuff in there. I've got one that's
still based on the ChatGPT framework, but it's only
trained on some of the data that we have internally.
So when, you know, hey, a new product
or feature just rolled out, I'll feed it, like,
how would you write a value statement for this for
a client? And it spits out something that's close-ish.
(38:20):
I can use that as a draft, and it's, you know,
it's kind of like a sanity check. I'm sanity-checking
it as much as it's sanity-checking me, of, that's okay, yeah.
And that actually helps me think through, like, you know,
as opposed to me staring at a blank screen and,
like, kind of getting writer's block. Yeah, and, like,
I don't have to, like, sit in a meeting with
(38:41):
two other people and bounce ideas back and forth. I
can just type in this prompt, it spits out something
that's, like, as we've read on the show, like a
C-plus draft of a script. I'm like, cool, now
I have something to work with, where I can immediately
go, I don't like this, but I can use bits
and pieces and fit it back together. And that absolutely
(39:01):
has saved me time. Those are fine. That's about as
far as it can go, yes, the way it's built.
Speaker 1 (39:10):
Yeah. And there are some cool things that it can
do with audio. So, not just, oh, I'm creating
these huge amounts of audio tracks for an
artist that doesn't exist, and I'm going to spam Spotify with,
(39:32):
right, music generation. But there are some interesting things
you can do with voice modulation. You can create some
interesting characters that human beings, well, not many of them,
cannot do, and non-trained voice actors
can do some interesting vocoding with it. There are
some very cool things when it comes to video enhancements.
(39:55):
And I saw this one artist who was doing fractal
and Mandelbrot set art, which, again, was
only possible with a generative algorithm, because they were able
to do this high-level math that this person just
wouldn't have been able to do otherwise. Really beautiful, very cool.
(40:19):
Needing to see, like, the Ghiblified Trump stuff? Nah, like,
not needed. Also, like, you've taken
the beautiful works of somebody and made them soulless.
No thank you. But again, there are some cool
(40:40):
things that AI can do, and it's very cool,
I believe. So, sorry, I have not
finished the thought. One of my favorite use cases of
the reward generation model was how these Chinese engineers reconstructed
(41:01):
OpenAI's entire model through this method. They basically had
these bots communicate with OpenAI for huge amounts of
time, and were able to reconstruct how that worked, and
then they released one that was able to run more
efficiently on worse processors. So that means, like, they can run.
Speaker 5 (41:23):
Asked the AI, ChatGPT,
to rebuild itself.
Speaker 1 (41:28):
Pretty much. And that's an insanely cool thing.
Now, again, while there are some cool things,
it still is not, from what I've seen,
it is not there. It is not to the point where
(41:49):
it can be justified to be called an actual
thinking machine, because it's not. It is a series of
regression algorithms done really fast on huge data centers. Like,
these things are not thinking, and they are not producing
original thoughts.
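The trick described a moment ago, querying a big model at scale and training a cheaper one to mimic its outputs, is generally known as knowledge distillation. Below is a minimal, generic sketch of the student-training step in PyTorch; the tiny toy networks and random inputs are illustrative assumptions, not a reconstruction of what any particular lab actually did:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-ins: a big frozen "teacher" and a small "student" to be trained.
teacher = nn.Sequential(nn.Linear(16, 256), nn.ReLU(), nn.Linear(256, 4))
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
teacher.eval()

opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens the teacher's output distribution

for step in range(200):
    x = torch.randn(64, 16)  # in practice: the queries sent at scale
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(x) / T, dim=-1)
    student_logprobs = F.log_softmax(student(x) / T, dim=-1)
    # Train the student to match the teacher's answer distribution.
    loss = F.kl_div(student_logprobs, teacher_probs, reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final imitation loss: {loss.item():.4f}")
# The student ends up far smaller (and cheaper to run on worse processors)
# while approximating the teacher's input-output behavior.
```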
Speaker 5 (42:08):
You know, I love that point. Also, AI
has become this, like, big brush term that of course
came from a bunch of sci-fi nerds to begin with,
and as sci-fi nerds ourselves, it's important to be
specific about our terminology: that generative, predictive models are
a separate thing. And also, none of these are passing
(42:32):
a Turing test. These are not real thinking machines.
And all the snake oil salesmen who are
bankrupting venture capital firms in the next year, we hope,
are making their living off of being intentionally vague
about those distinctions.
Speaker 2 (42:54):
Yeah. And I will also say, because this
is interesting context: so, Teddy mentioned Taylorism. Taylorism predated
the nineteen twenties. It did, yeah. Frederick Taylor died in
nineteen fifteen. Taylorization was introduced, he was one of the
(43:17):
first management consultants, in the late nineteenth century. And why
is that important? That is important because there was a
significant socialist and communist movement in the United States, and
communists and socialists were extremely critical of Taylorization. So the
(43:38):
labor movement was very much aware that scientific management was happening,
and so whenever somebody would introduce a stopwatch, which is
what he was most commonly associated with, into a factory,
they would do work stoppages. So, like, as soon as
they saw the fucking watch, they were out. Amazing, awesome.
(44:01):
And the reason it's important is because in the nineteen twenties
and the nineteen thirties you had depression, and you had
a systematic dismantling of the labor movement in the United States.
And so it was much more difficult during the Depression,
and during another depression that isn't written about a lot, the
(44:22):
Depression of nineteen twenty, which lasted for two years,
and this was, like, when the Industrial Workers of the World
was destroyed, basically. And so during those decades you didn't
have a cohesive enough labor movement. People were much less
likely to do, like, sympathy strikes and things like that,
or, like, to advocate for job control, like,
(44:48):
all of the stuff, the shorter working hours, closed shop,
job control, union membership required. All of that shit, after
World War One, kind of went out of the window.
And that was not helped by Bolshevism.
And so the Soviet Union and the rise of, like,
(45:10):
communism in Russia was one of the ways that a
lot of the anti-scientific-management, anti-capitalist sentiment
was discredited, because they're like, no, well, now you
guys are Bolsheviks, which is, like, the worst thing you
could be called. There was an interesting co-optation of that
(45:32):
term in Baltimore City, random example, where there was a
concept called the Baltimore Soviet. Basically, they had communized YMCAs
in the city, and it was this very,
like, Paris Commune kind of flash-in-the-pan
moment, where it was like, oh, there's a way
(45:55):
to live such that everybody can take care of each
other, and, like, the tyranny of the stopwatch doesn't exist.
And it lasted for, like, a couple of years. But yeah,
it's interesting that the kind of reintegration of democratic
socialism into American politics is happening during what I consider
(46:19):
to be a sustained recession since, like, two
thousand and seven, like, just this fucking out-of-control
capitalism. And then you have a guy like, if
Bernie Sanders ran as a democratic socialist in nineteen sixty,
(46:39):
he might be imprisoned. You know what I mean? Like,
those words were so ostracized in American consciousness because
of the sort of, you know, Russian scapegoat that
was created in the American imaginary. It sustained over so
many years that people forget that there was a very,
(47:03):
very impactful socialism movement that actually, like, systematized bureaucratic efficiency
in cities and made things like universal kindergarten happen. The thing
where socialists were like, you know what we have to do?
We have to win elections. If we win elections, we
can take control of government. If we take control of government,
(47:23):
then we can have our kids go to fucking schools
that don't fall on their heads when they're there. Like,
we can, like, fix shit. And we see
that happening right now. Like, I have a, not a
friend necessarily, somebody, because she's like ten years younger than me,
but she went to the college that
(47:44):
we're all familiar with, and she's entering into one
of these programs that is specifically gearing, like, people who
have, like, socialist and communist politics to be able to run
for office. And that is, I think, just
(48:06):
like the cyclical reemergence of a lot of the things
that were happening when this kind of unbridled capitalism, this,
like, Taylorized, efficiency-driven, productivity-driven society was trying
to be pushed down people's throats. It's like
(48:29):
the image of a dude working at a massive machine,
and he's just making one part of the car, he's
just stamping one part of the car, and then a
motherfucker manager coming in with a stopwatch, and him going,
like, oh hell no, and leaving, and everybody else following
him. It's very similar to, here's an AI script,
(48:51):
and all of the writers leaving the fucking writers' rooms.
Like, it's shockingly similar in a lot
of ways. And I do think that, like the late
nineteenth-century municipal socialists, a lot of people
with socialist politics need to be like, you know what
we can do? We can actually take control of government,
(49:14):
and if we do that, we can do some real
serious damage.
Speaker 5 (49:19):
Yeah, well, I think, or we can undo a bunch
of damage and prevent additional damage. Because, to your point
of, like, Bernie Sanders would have gotten arrested, and we
know there's that old black-and-white picture of him
getting pulled away by cops: this keeps re-emerging,
and Reaganism keeps trying to smash it
(49:44):
and break it down and prevent it, because it is
so dangerous to the line-must-go-up-forever, infinite-growth
capitalism that is destroying the planet. So yeah, those
are deeply opposed forces, and it's about damn time. That's
very exciting. Good for her, for that next generation.
(50:07):
If they can, if we can all make it that
far, that they can get into office, that would be
really, really nice, and is very inspiring and hopeful.
Speaker 2 (50:20):
The soft launch of that was when, uh, Cenk Uygur
and Kyle Kulinski started, what the fuck was that group
called, like, Action Democrats or something? Oh yeah. And that
was the movement that brought in the
(50:41):
AOCs. I think Ayanna Pressley was a part of that.
Speaker 5 (50:47):
Worked briefly, and what you see is the same.
The most powerful thing that our current system does is absorb,
uh, defang, and then repackage and resell revolutionary movements, and
(51:07):
then everything gets filed down. The corners get whittled off,
everyone gets corrupted and bought in, and then their meat
puppets get paraded around and siphon a bunch of really important,
really valuable energy away from the causes they were meant
to advance in the first place.
Speaker 2 (51:29):
Yeah. And also, that kind of
panic around socialism and those ideas is, like,
very... There was that, do you remember the anchor Chris Matthews?
Speaker 5 (51:43):
Yeah, yeah, convinced he was gonna get lined up in
the street and shot by socialists somehow.
Speaker 2 (51:51):
It's insane. And that dude is, like, a
liberal anchor or whatever. And he was just on air
having this, like, moment outside of his own body, where
he was like, man, these people, and he was old as
shit too. So it was just, like,
it was like a boomer being
like, and these people, you all are just okay
(52:13):
with this dude just coming out here being like, he's
a socialist? Like, don't you remember Stalin? Just, like, the
leaps, the leaps in his mind. It was just, the
synapses were not firing in a way that made sense
to him. Where it's like, there needs to
be, like, a mass re-education of what the fuck
that is and what they actually accomplished, because,
(52:37):
like, a lot of communists just ended up being, like,
New Dealers. Yeah, and they were like, yeah, this is
fucking good enough.
Speaker 4 (52:45):
Right, we can stop there.
Speaker 5 (52:48):
It's fine.
Speaker 4 (52:49):
Uh, people got jobs. We built some cool public works.
Speaker 5 (52:52):
There's like a public library and art program and some
murals and some decent like social security.
Speaker 4 (52:58):
It's fine.
Speaker 2 (52:59):
A government that isn't afraid to be like, you actually aren't allowed
to just scam people out of all of their money.
Speaker 5 (53:05):
Yeah, you do actually have to put up railings and
not let people fall into pits of acid and get
Joker-ized or whatever.
Speaker 1 (53:12):
Please don't allow people to lick radium. Don't do it.
Speaker 2 (53:16):
You can't, you can't become Joker-ized. Not in this
society that we live in.
Speaker 4 (53:25):
That was the dream.
Speaker 5 (53:28):
We'd settle for that, truly. And seeing that get ripped
away piece by piece, and then, inevitably, it generating people like, well,
I guess we've got to go and teach these fundamental
lessons again about worker safety. It just keeps going
around and around.
Speaker 2 (53:48):
Yeah, it's so crazy. Yeah, the kinds of things
that we have to contend with are just
getting sillier. There's a substantive difference, like,
in the progression of history, where it does seem
like things are getting sillier and worse. There were bad
(54:09):
presidents before Donald Trump, but, like, this level of ineptitude
and incompetence is unseen, and there are elements of AI
and its relationship to labor that are just different.
Like, everything is intensifying. Yeah, no, I saw
some dude talking, I think it was Tim Miller, talking
(54:30):
about, like, what could happen to us, you know, like,
if the United States loses its, uh,
its global dominance. And he was like, you know, the
dollar's gonna get devalued, and then we're gonna be looking
at something like, we're gonna be like France. Oh no!
(54:51):
I'm like, it sounds fucking awesome to not be the
world's superpower.
Speaker 1 (54:57):
Yeah.
Speaker 5 (54:58):
There's, like, a bunch of, like, secondary
powers in Europe who are contending with their own fascism
and having to deal with it, but also don't have the same
kind of pressure of being a massive empire spread thin
across the globe, being super paranoid all the time, and
they, like, can get some shit done.
Speaker 2 (55:21):
Yeah, they can actually do something in their country. It
would be great, it'd be great if the billionaires all
ran away to, I don't know what the next one's
gonna be, but, like, if the billionaires all ran away
to China and are doing business there. It's like, okay, we
got rid of those fucking guys. Like, I guess we're
not gonna have hyperloops here, but, like.
Speaker 4 (55:41):
Maybe we can have healthcare.
Speaker 2 (55:42):
If we can have some healthcare, maybe we have some
fucking roads, Maybe we could have some paved streets.
Speaker 5 (55:47):
Finally. Yeah, it's so frustrating. Like, that was supposed to
be the one upshot of being the empire: you
get your paved roads, you get your infrastructure. Like, this
is the whole, like, are we Rome? No, worse: we
don't get the nice infrastructure.
Speaker 2 (56:05):
Have you seen, have either of you seen this Ezra Klein
shit that's going on? No? Yeah, it's so wild.
He wrote a book called Abundance, which I have not read,
and so I'm purely responding to the debates that he's
been having with people in the press, and he's going
(56:29):
on podcasts and stuff promoting this book. And the thrust
of what seems to be his argument is that leftists
have, like, created so many barriers to development
and municipal action, and that it's time to deregulate some
(56:50):
elements of local government if there is going to be
the kind of development that we need in blue cities.
And so he points to a lot of failed infrastructure projects,
a lot of failed, like, transit projects in California, and
then points to Texas, the red state that's able to
build housing and to build it at a more rapid pace.
(57:14):
And the critiques of him are just like, he's saying
he's sort of, like, center-left, like, liberal-leaning, but
this is neoliberalism, this is neoconservatism.
Speaker 5 (57:23):
Yeah, this is the same answer of we'll just privatize
everything and then it'll all work.
Speaker 4 (57:28):
No, that's literally never worked.
Speaker 1 (57:33):
Oh, it's worked, just not in helpful ways for
the consumer, or people.
Speaker 5 (57:40):
Yes, it's never given us the thing it was promoted
to give. It has always worked exactly the way it
was intended, which is to funnel money into the hands of
those same five guys.
Speaker 1 (57:51):
Yeah.
Speaker 2 (57:52):
But the sort of, like, question that
he poses is an interesting question; the answers that
he's giving are not satisfying. The questions of, like, why
do blue cities, why can they not solve the unhoused crisis,
why can they not solve the housing crisis, like, why
(58:12):
can't these things happen, or why haven't they happened, are
all very interesting questions. The answers that he arrives at
are not answers that I think hold up to scrutiny
when you're looking at the sort of actual reality of
doing nonprofit affordable housing, which I have some experience
(58:34):
with. The types of regulations that stymie that are not
the types of regulations that he's pointing out. Like, the
environmental regulations are not the things that are stymying public housing.
The zoning code is stymying public housing, but it is
the outdatedness of the zoning code that is doing that.
(58:55):
It is the zoning code, like, prescribing that you have
to develop this type of structure in this R-8
residential classification, whereas a lot of people can't afford that
kind of structure. Or it's the kind of zoning code
or regulation that says that you can't build multi-unit
(59:17):
housing here, and that is a holdover from, like, segregationist America,
where it's like, we don't want renters moving into these
neighborhoods, and shit like that. And so it's just, like, a
weird thing. It's caused this sort of conversation. But there's
also, I think, a real disconnect between
people who talk about politics and then people who, like,
(59:39):
try to do shit, and the people who talk about
politics very often. And I'm not questioning his bona fides.
He's a great journalist, and he's talked to a lot
of people, and he talked to nonprofit developers who are like,
these are the problems we're having, and blah blah blah,
and I'm like, I see none of this. What I
see is actually a tepidness from local government to engage
(01:00:00):
more, because they are terrified. They are trying to indemnify
themselves constantly, and they don't want their credit rating
to decrease, and so they need to balance their budget
in ways that disallow them from releasing funds more quickly
(01:00:21):
to nonprofits. And this neoliberalization of government, like, deputizing nonprofit
developers rather than doing it their fucking selves, is also
a product of neoliberalization and deregulation. Because local government refuses
to do anything, they are only in the business of
(01:00:44):
creating public-private partnerships, and those things are fucking hard
to do, and oftentimes that is the thing that the
fucking moneyed interests, these sort of, like, scary moneyed interests,
are the ones who actually get to take advantage of.
Because what happens all the time is, it's like, okay,
(01:01:07):
so you're gonna farm this out
to the private sector, and you're going to, in good
faith, enter into relationships with grassroots organizers and things like that.
They don't have capacity, they're not developers, they don't build things,
so they're gonna have to develop that. So you're gonna
have to give us some fucking time here. And then,
when they're not showing results, because they have decided to
(01:01:29):
work with a different group of people than the ones
that they're used to, then they start getting like, oh fuck,
I have to get reelected, I have to show something,
I have to show my work. And then it's like,
affordable housing project sold to hedge fund, blah blah blah,
six hundred million dollars, it's gonna be great. You look
(01:01:50):
at the specifics, they're building houses that they're gonna sell
for two fifty. The qualifications are, like, one hundred and
twenty percent of area median income. So these people are fucking rich,
and so it's just leaving everybody out in the dust.
And this is the fucking problem. The problem is not
the fucking fact that you have to get lead abatement tests.
That is not the problem. The problem is not the
(01:02:11):
fact that you have to go through a permitting process
that is only put in so that you don't have,
like, deathly black mold in your house, or that you
have to check for certain things, or that you have
to ensure the joists are nailed, like, that you have to
do things to make the house habitable for people. Those
are good and they have to exist. We can't get
(01:02:33):
rid of those. And zoning codes, for the most part,
are good too. Like, there's some shit, like, you shouldn't
have a stable next to a school. Like, you shouldn't.
You shouldn't, and if you do, because you're teaching kids
how to ride horses, then there have to be, like,
sanitary regulations that make sure that those kids don't,
like, get giardia and shit like that. It's just
(01:02:56):
such a weird conversation to be like, the problem
is regulation, this is why things are happening in Texas
so quickly and not happening elsewhere. Where Texas
is very famously kicking every Texan out of Texas.
Speaker 1 (01:03:12):
Yep.
Speaker 5 (01:03:12):
Yeah, re-regulation, not deregulation. Like, you want to update those
codes so they make sense for, you know, twenty
twenty-five. Sure, but that doesn't mean rip them all down,
so worse housing can get built faster, or more expensive
housing can get built to kick people out so nobody
can afford it. Like that is the same bullshit that's
(01:03:35):
been happening since the eighties that's never going to get
better and will only definitely get worse if you deregulate.
Speaker 2 (01:03:42):
Yeah. The one surefire way to know that
this is, like, a bad idea is that, uh,
there are congressional representatives investing in the quote-unquote abundance agenda,
and it's like, none of those motherfuckers are good people.
Speaker 5 (01:03:57):
No, that's, you know, that's a bad, bad sign. If
they're into it and excited about it, it's sketchy.
Speaker 2 (01:04:06):
Oh, did you say deregulation? Oh yeah, here, have
one hundred and twenty million dollars. Take it, do
what you will. Yeah. But I guess it's time to
go get wrecked.
Speaker 5 (01:04:22):
I will recommend diving back into the classics, and what
I mean by this is Crouching Tiger, Hidden Dragon.
Rewatched that. Calling that a classic, thinking about movies at
the, what do we call it, the turn of the century.
(01:04:44):
Truly, there's this, you know, interesting period of a
little bit of CGI, but it's mostly practical effects, and
we don't have the budget to do fancy CG that's
going to look terrible in four years. So
we're just doing a little bit of, like, color correction
and, you know, scrubbing out the wires for the wire
(01:05:06):
fu, and then leaving the rest of it to stunt
people and good actors with martial arts backgrounds and beautiful
cinematography. And it's really beautiful and it's very sad,
and it really is. It's funny watching that and thinking about, uh,
(01:05:26):
Brokeback Mountain also. Like, while some of these
shots look really similar, and some of the, like, long...
it's just Ang Lee doing Ang Lee things. And it's
a really gorgeous film. And this, you know, this
slow burn of Michelle Yeoh and Chow Yun-fat being unable
to communicate with each other is, you know, now that
(01:05:50):
I'm a little bit older and I'm able to kind
of come back to it, of, oh yeah, I have
seen this happen, and I have friends that this has
happened to. This sort of, like, beginning to lean
into the middle of your life without being able to
(01:06:10):
talk clearly about things. It's just a gorgeous film.
So I highly recommend if you haven't seen it for
whatever reason, dear listener, seek it out, and if you have,
it's worth going back and revisiting.
Speaker 2 (01:06:22):
It's so... that is, it's wild to call something that came out in two thousand a classic, but that's one of those "want to feel old?" fucking situations. The other funny thing about, uh, that too is what Ang Lee would become. Uh, it's like the CGI is sort
(01:06:44):
of like king, with like him and Bob Zemeckis trying to make movies without people in them. M hm.
And also funny about Ang Lee too is that he did Sense and Sensibility and he worked with sheep, and the fucking sheep, for whatever reason, gave him just fits, like
(01:07:05):
he couldn't fucking handle the goddamn sheep. They were just like, they were too chaotic, and so in Brokeback Mountain, most of those sheep are CG.
Speaker 4 (01:07:17):
Because, wow, he was like, can't trust real sheep.
Speaker 2 (01:07:20):
He's like, I don't trust those motherfuckers. I don't fuck, I don't fuck with them. Like, they're...
Speaker 4 (01:07:25):
Not pulling the wool over my eyes.
Speaker 2 (01:07:27):
Again, he had like some kind of like breakdown on the set of Sense and Sensibility. So what they did was they had like a couple sheep, and then in the shots where there's a bunch of them, he like copy pasted them, basically. And then in the big, sort of like, when they're having the big, like those beautiful mountain shots where
(01:07:48):
there's like a ton of them, that's just all CG, because the man just couldn't handle sheep. And every time, if you kind of like look at a picture of Ang Lee, he looks like a man who's about on the verge of a breakdown that is perhaps sheep related.
Speaker 4 (01:08:04):
Sheep related. I did not know that. That's a delightful story.
Speaker 5 (01:08:10):
Then Netflix released a sequel to Crouching Tiger, Hidden Dragon, which is based on the next book in the series. So these are all... these are both based on wuxia, as I guess I probably have horribly mispronounced that, but that genre of Chinese historical fantasy, martial arts fiction. And there's
(01:08:32):
a long series of books that these come from. And so they actually got the rights to the next book in the series, the final book, and Ang Lee did not come back to direct it. Michelle Yeoh's in it, Donnie Yen is in it, and by all accounts, it is terrible. Yeah, yeah, yeah.
(01:08:55):
This is the tragedy of Michelle Yeoh to me: she is a beautiful human and a fantastic actress, and her filmography mostly sucks. Like, it's not worth doing, like, oh, I'm gonna do a Michelle Yeoh career retrospective, let me go and see everything she's in, because most of it's just terrible.
Speaker 2 (01:09:16):
Oh man, yeah, yeah, you're right. Like, it is weird too, because whenever people are like, Michelle Yeoh is a legend... Like, I remember when Crazy Rich Asians came out and they were like, the legend Michelle Yeoh. Then you go back and you look at her filmography and you're like, wait, what are they talking about? And
(01:09:39):
it's like Crouching Tiger, Hidden Dragon, and it's like, there are very few. I liked her in Sunshine, but that's also a movie that people don't like. It's like the Danny Boyle, like, space movie. I like that.
Speaker 5 (01:09:49):
Yeah, that's one of our, like, yeah, sci-fi nerd picks that we really liked that nobody else cares about. Well, it's, you know, it's her legacy of the same, like, Fred Astaire Ginger Rogers thing, like doing everything Jackie Chan can do backwards and in heels. But a lot of those movies that he was in are
(01:10:11):
also terrible.
Speaker 2 (01:10:12):
Yeah, yeah, I mean, for sure, but with him, it's like, it's the output too. He's just making, yeah, every movie. Her, I'm looking at her filmography, it's not even that long.
Speaker 5 (01:10:24):
She, to me, has the quality of, like, oh, this is an actress who will just be in anything. But she's not in that many things.
Speaker 2 (01:10:35):
What's unfortunately also interesting is that, like, her filmography gets way better, like, post-legend. People are like, I mean, she's a legend, we got to put her in these movies, and...
Speaker 5 (01:10:48):
Then they start putting her in actually good movies.
Speaker 2 (01:10:50):
Occasionally good movies, yeah. Like, Crazy Rich Asians was a delightful little rom com. I like Shang-Chi. Yeah, I liked her in that, and she does a good job. I like Gunpowder Milkshake. That's very fun.
Speaker 1 (01:11:05):
Aaron Gill was into Everything Everywhere All at...
Speaker 2 (01:11:09):
Once. Damn. Yeah, yeah, these are all like very recent things, though. She was in Wicked. I really liked Wicked. I liked her in Wicked. She'll be in Wicked: For Good as well.
Speaker 5 (01:11:19):
She's in Star Trek: Discovery. She's very good in Star Trek: Discovery, despite Discovery being uneven. But then, like, I haven't even seen her Section Thirty-One movie, because it's one of those movies that everyone hated. Like, it wasn't for anybody. Yeah, and that's unfortunately the sort
(01:11:40):
of the legacy of her as being fantastic in mostly
bad things.
Speaker 2 (01:11:46):
Yeah, she's in Police Story three, when the franchise famously
goes off the rails.
Speaker 5 (01:11:51):
Right. That would be the only other one I would be interested in actually watching, because she's in it. But just because she's in it is no guarantee, is the problem.
Speaker 2 (01:12:01):
At least those two rule. And she's gonna be in a fucking Avatar. That makes sense, dude. Avatar just... Avatar sucks, dude, and everybody
Speaker 4 (01:12:12):
Needs to know. Wait, are we hating the Blue People? The Blue People?
Speaker 2 (01:12:16):
Avatar fucking sucks, dude. She's in Avatar four, which is coming out twenty twenty nine. Like, be careful what you wish for, because, like, everybody was like, we want franchise-less sci-fi original stories, and the one we got was fucking Avatar. It's like this weird thing.
Avatar is like cilantro, where if it tastes like soap,
(01:12:39):
you just can't rock with it. And I'm one of
those people. There are people who are like, I wish I lived on Pandora, with like a fervor of wanting to be like, I want to be a Na'vi. And
it's like a lot of people love this shit. And
then I saw Avatar two, The Way of Water, which was like, for my money, seven and a half hours
(01:13:02):
long, and it's just shots of water, and it's fucking awful.
And I couldn't believe how fucking bored I was for
all of that time. But Avatar three is coming out
this year. Avatar four is already, like, in production, coming out in twenty twenty nine, and it's going to be
(01:13:22):
I think ten movies, something crazy like that. And
Jim Cameron's going to be dead in the ground before
he finishes this. This is going to be like the
Wheel of Time.
Speaker 5 (01:13:33):
Yeah, but by nine, Jim Cameron will have uploaded his drafts and an AI model of himself into a production software to keep editing these. Yeah, and the tenth one
is going to be available on USB drives that you
plug directly into your own head to watch.
Speaker 2 (01:13:53):
Who's the Brandon Sanderson? Like, Brandon Sanderson is to Robert Jordan as who is to Jim Cameron? Like, who finishes the...
Speaker 5 (01:14:08):
Interesting. Unfortunately, it's Jon Favreau. Uh, I think you might be right, which is not a good thing.
Speaker 1 (01:14:18):
Or like the Russos. Or Jon Favreau, better than it being JJ Abrams.
Speaker 4 (01:14:25):
Yeah, that's true.
Speaker 5 (01:14:25):
That's, that's actually the scariest possible choice, and likely.
Speaker 2 (01:14:32):
Well, JJ, yeah, go ahead. He loves to put, like, the Mystery Box into his thing, where it's like, there's something mysterious here that nobody knows. And Jim Cameron is like, no, it's called unobtanium because it's fucking hard to get. This is not a mystery box. Like, JJ would have named it a fucking "tryajagga" or some shit, and you're like, yeah,
(01:14:55):
what does it do?
Speaker 5 (01:14:57):
Like, no, no, don't worry about it. James Cameron's like, this is the MacGuffin.
Speaker 2 (01:15:03):
Uh, okay, we're getting wrecked right now. I'm just gonna recommend what I've... the only thing I've been watching is just Top Chef. So Top Chef fucking rules. The new season fucking rules. Like, my wife and I
are insane watching these, and we do wonder if we're shaping our daughter's personality in a bad way
(01:15:26):
while we watch Top Chef and she watches us watch Top Chef, because it's just a lot of cursing and being, like, so overly critical of people about something that we know about. Like, we cook, but not at that level. Like, us watching Top Chef is like bad, rude sports fans, and it's going on in our
(01:15:48):
home as we're watching this and being like, oh, you fucking broke the sauce, you piece of shit. Fuck this person, they spent too much money at Whole Foods, they used up half the budget. This restaurant wars is not gonna be good because she's a fucking bad host. Like, it's
just all way too parasocial and screamy. But I guess
(01:16:10):
that's the way everybody watches television. But yeah, that's what I recommend. The new season of Top Chef is fucking awesome. Teddy, what you got?
Speaker 1 (01:16:21):
I am going to say a show that I have been very much enjoying. I'd recommend Poker Face. They are on season two. It is Natasha Lyonne and Rian Johnson's love letter to Columbo and the original Murder on the
(01:16:42):
Orient Express, like a lot of these cozy mystery shows, but it's Natasha Lyonne being in a cozy mystery show. So if you don't like her, you're not gonna like it. If you didn't like Knives Out, you're not gonna like this.
Speaker 4 (01:16:56):
But I like all of those things.
Speaker 1 (01:16:58):
This sounds awesome. It's great. There is one episode that feels like it's copaganda, but as the show progresses... actually, there's a later episode in a later season that makes it go, oh no, this
(01:17:19):
is just ineptitude. Great, okay, good to know. So overall, I'd recommend Poker Face. Very fun. Season two, the next episode, episode five, comes out tomorrow, and by tomorrow, I mean Thursday.
Speaker 2 (01:17:36):
It comes out on Thursdays, several weeks ago.
Speaker 1 (01:17:40):
Yes. And the other thing that I would recommend is Nightshade and Webglaze, for any of our tech-focused listeners. I know we had a lot of discussion.
Speaker 5 (01:17:53):
It sounds like, like a superhero duo, Nightshade and Whip.
Speaker 1 (01:18:00):
Oh, we got to make this superhero. So Nightshade was developed by MIT in, I believe it was twenty twenty one. I'd have to look at when they launched it. They initially launched it in twenty twenty one, and it is a generative AI poisoning software. It essentially re-encodes data
(01:18:20):
with garbage, so as models are working on it, it messes it up. Now, to answer an issue that came up with style mimicry, for any of our artists, they created Webglaze, which basically poisons the entire weighted working set whenever they use digital image generation. So, technically, just
(01:18:47):
in case anybody who knows me from work ends up listening to this, I cannot technically recommend that one would use anything to disrupt any large language model usage. However,
these two things exist. If you look up Webglaze or Nightshade, and you just so happen to put any
(01:19:09):
of your digital art through it and tag it on Instagram or any of these things, it may or may not do a lot to poison large amounts of data. I'm not recommending, I'm just allowing you to know the information.
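(For the curious: the rough idea behind cloaking tools like these is to add a perturbation to an image, small enough to be invisible to a person, that drags the image's representation inside a model's feature space toward some decoy. What follows is a loose PGD-style sketch of that idea, not the actual Nightshade or Glaze algorithm; the ResNet stand-in encoder, hyperparameters, and file names are all illustrative assumptions.)

# Loose conceptual sketch of feature-space "cloaking" -- NOT the actual
# Nightshade/Glaze algorithm. resnet18 stands in for whatever encoder a
# real tool would target; file names below are hypothetical.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()        # keep penultimate-layer features
encoder.eval().to(device)
for p in encoder.parameters():
    p.requires_grad_(False)             # only the perturbation gets gradients

MEAN = torch.tensor([0.485, 0.456, 0.406], device=device).view(1, 3, 1, 1)
STD = torch.tensor([0.229, 0.224, 0.225], device=device).view(1, 3, 1, 1)

def feats(x):
    return encoder((x - MEAN) / STD)    # ImageNet normalization, then embed

def load(path):
    img = Image.open(path).convert("RGB").resize((224, 224))
    return TF.to_tensor(img).unsqueeze(0).to(device)

def cloak(art_path, decoy_path, eps=4 / 255, steps=100, lr=1 / 255):
    """PGD-style: pull the artwork's features toward a decoy image's
    features while keeping every pixel within +/- eps of the original."""
    x = load(art_path)
    with torch.no_grad():
        target = feats(load(decoy_path))
    delta = torch.zeros_like(x, requires_grad=True)
    for _ in range(steps):
        loss = torch.nn.functional.mse_loss(feats(x + delta), target)
        loss.backward()
        with torch.no_grad():
            delta -= lr * delta.grad.sign()           # step toward the decoy
            delta.clamp_(-eps, eps)                   # invisibility budget
            delta.copy_((x + delta).clamp(0, 1) - x)  # keep pixels in [0, 1]
        delta.grad.zero_()
    return (x + delta).detach().clamp(0, 1)

# e.g. TF.to_pil_image(cloak("my_art.png", "decoy.png")[0].cpu()).save("cloaked.png")

(The published tools optimize far more carefully, and against the specific encoders real generative models use, so the art still looks untouched to a human; this only shows the shape of the trick.)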
Speaker 4 (01:19:24):
Yeah, it's important to be informed, of course.
Speaker 2 (01:19:27):
It just fucks shit up. There's like a write-up about it that's like, that's like, uh, "Nightshade is a tool designed to counterbalance the power asymmetry in the realm of generative AI models and their trainers." Like, just say it's to fuck shit up. Just be like, this shit fucks shit up, and it is.
Speaker 1 (01:19:47):
It is a research tool in seeing how some of
these structures may or may not have systematized things that
may not be stable. So, like I said, do with
that as you will, but definitely watch Poker Face. I
(01:20:07):
will explicitly recommend that.
Speaker 2 (01:20:11):
Yeah. I didn't realize that there's like a whole large debate about, uh, like, whether Knives Out is good or not.
Speaker 1 (01:20:21):
That discourse is wild. Some folks are saying, oh, we've had enough of Benoit Blanc after these two movies, can't we just let it die? And I'm like, there was twenty years of Columbo. No, no, I don't think so.
Speaker 2 (01:20:39):
Just the fact, just the fact that the discourse isn't either "I love this" or "I don't think about it." It's crazy to me to be like, you know what, fucking, I'm gonna write something about Knives Out, and it needs to be said that his accent is not believable. Just like, what are we talking about?
Speaker 1 (01:20:55):
And Glass Onion? It was just like, how could he? How could he? How could they make that assumption about these characters?
Speaker 2 (01:21:02):
Yeah, yeah, yeah. Uh, that'll do it for this episode of Is This Just Bad? We'll see you on the next one. Bye. It's just a...
Speaker 4 (01:21:21):
It's like a pirates bort your brain.
Speaker 5 (01:21:23):
Robin Nala's no choking opened in your mind with the
probots as you woken hitting hydra halen hairs had for
a time for a head of reasons, for more than
with the soldiers with them and for all seasons.
Speaker 4 (01:21:31):
Listen closely while.
Speaker 5 (01:21:33):
We share our expert teas and custolic comments called Sardine
Street tuition to the multiversity, not it'syco teaching perfect balance.
Speaker 4 (01:21:39):
When we snap in venit jens into
Speaker 5 (01:21:40):
Your ears does the shoulders when we speak Purple men
versusive feet for Randy Savage Randals with their mortal technique