All Episodes

May 8, 2025 65 mins

In episode 1860, Jack and guest co-host Andrew Ti are joined by host of Worse Than You, Mo Fry Pasic, to discuss…REAL ID Isn’t Real, Cyber Trucks Just Totally Stop Selling, This AI Expert Thinks The AI Bubble’s About to Pop and more!

  1. What you need to know about the REAL ID requirements for air travel
  2. The Racist Origins of the Real ID Act
  3. Top Trump agency reveals key reason why REAL ID will be enforced
  4. 'Mass surveillance': Conservatives sound alarm over Trump admin's REAL ID rollout
  5. Trump’s Insistence on Real ID Has Become a Flashpoint for His Tinfoil Hat Fans
  6. You can get a free Krispy Kreme doughnut on May 7 for Real ID deadline: Here's how
  7. Homeland Security chief says travelers with no REAL ID can fly for now, but with likely extra steps
  8. Flying out of Indianapolis without REAL ID? Don't fret — the airport isn't turning people away
  9. Tesla’s Inventory of Unsold Cybertrucks Skyrockets, Despite Offering $10K Discounts and Concealing Listings
  10. The Silicon Valley sceptic warning tech’s new bubble is about to burst
  11. Deep Learning Is Hitting a Wall
  12. Microsoft’s £2.5bn investment in Britain at risk from creaking power grid
  13. Chess helped me win the Nobel Prize, says Google’s AI genius
  14. OpenAI overrode concerns of expert testers to release sycophantic GPT-4o
  15. The next British boom could be in the offing – if Starmer abandons net zero
  16. Finance worker pays out $25 million after video call with deepfake ‘chief financial officer’



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Glad I wore the sickest shirt all time.

Speaker 2 (00:09):
Cha... Chip and Dale dabbing ring.

Speaker 1 (00:12):
I'm also on my Darkwing Duck shit. Dark. Wow,
he's not quite he's missing it. He's missing it a
little bit, but it's right there.

Speaker 2 (00:20):
Yeah, he's gotta be Yeah, all the wow, you kids
have fun. I'm not wearing shit. I'm wearing a black tee, man, like I'm from New Jersey or something.

Speaker 1 (00:36):
Dephney, Disney merchant at all, beat.

Speaker 2 (00:38):
Up the beat beat beat Uh yeah, man, I got
a that's Disney merch right there.

Speaker 1 (00:43):
Oh yeah.

Speaker 2 (00:43):
Trooper helmet, boom.

Speaker 1 (00:45):
The First Order Trooper, First Order.

Speaker 2 (00:48):
Your stormtrooper helmet of the First Order.

Speaker 1 (00:54):
I like that.

Speaker 2 (00:56):
No good, no good reason how the New Yorker writes
about things.

Speaker 1 (01:05):
Right, that's like... that's a headline from, like, the croissant version of the New Yorker. Yeah, like The Croissanter, and it's just like, whoa. Well.

Speaker 2 (01:16):
They call themselves the First Order. Hello the Internet, and
welcome to season three eighty seven, episode four.

Speaker 1 (01:31):
Of The Daily Zeitgeist.

Speaker 2 (01:33):
Yes. Yes, it's a production of iHeartRadio. It's a podcast
where we take a deep dive into America's shared consciousness. Yes,
it's Thursday, May eighth, twenty twenty five, our first full
day in the new real ID regime. I hope everybody's

(01:54):
hanging in there. We're actually going to talk about this
new REAL ID world, because the deadline that they kept pushing back finally came for all of our asses yesterday, May seventh. REAL ID Judgment Day. Anyways, we'll get to that. My name is

(02:14):
Jack O'Brien aka Potatoes O'Brien, and I'm thrilled to be
joined in our second seat by a hilarious and brilliant
producer and TV writer. You know him from the Yo, Is This Racist? podcast.

Speaker 1 (02:27):
It's Andrew Ti. And for the record, this one looks better on paper than it is gonna sound. Okay: when there's something bad in the zeit-borhood, who you gonna call? It gets worse! Zeit-trusters? Oh no, when the beef is weird and the memes look good, who

(02:49):
you gonna call? Zeit-trusters! And this is the worst of it gets worse.

Speaker 2 (02:55):
Okay? Yeah, I mean a poster no hosts, no post
hosts would be better.

Speaker 1 (03:02):
Listen, workshopping this, I got inspired by Zeit-trusters, and... I lost faith in it as soon as we.

Speaker 2 (03:11):
Loved it, even though... what does it mean? But I do like it. I like the pun for it.

Speaker 1 (03:16):
Very stupid.

Speaker 2 (03:16):
I'm so sorry. That's appropriately stupid. Andrew, how are you doing? It's great to have you here. Fucking great, man. Wearing your Chip and Dale T-shirt? What are they doing on that T-shirt?

Speaker 1 (03:31):
This is, this is my favorite shirt. It is Chip and Dale dabbing. Yeah,
my only regret with this shirt I got. This is
a real shirt, official Disney merch. I got it. Yeah,
this is real. This is not This is from when
I I was working on a show that was shooting
on the Disney lot and for some reason they gave
us the employee discount at the Disney store on the lot.

(03:53):
And this was already in, like... this is so clearly, again, it's Chip and Dale dabbing, for anyone who's not seeing video. And this was in, obviously, like
the fucking clearance bin, and it just struck me as
the sort of thing that went right off the printing
press in the fucking evil T shirt factory straight into
the clearance bin, like it had like a shelf life

(04:15):
of... it was relevant while it was flying in the
air between the end of the factory and the clearance.

Speaker 2 (04:21):
They just pulled the clearance bin directly up to the T-shirt.

Speaker 1 (04:24):
Shoot, my biggest regret is not buying the whole stack of shirts, just like as a gift you give out
to people from now on. Yeah, yeah, I love it.

Speaker 2 (04:35):
That is a timeless classic. Okay, they fucked up by putting that in the clearance bin; that thing rules. What
year was that?

Speaker 1 (04:44):
Twenty nineteen is when I bought this shirt,
presumably from.

Speaker 2 (04:49):
Dabbing was a thing in, when, like, the early teens?

Speaker 1 (04:55):
Yeah yeah, yeah, maybe. I, again... this went from "Boss, I have a great idea" to clearance bin in the shortest possible amount of time.

Speaker 2 (05:06):
Whatever, Hey Boss, Yeah yeah, I got a great idea,
Chip and Dale.

Speaker 1 (05:11):
Every time I go to Disneyland, I try to find
Chip and/or Dale and ask them to dab with this
as evidence that this is a thing, that their character
is new and they have never once done it.

Speaker 2 (05:23):
Super producer Victor has informed us that dabbing became popular
in the year twenty fifteen. I'm assuming you got.

Speaker 1 (05:30):
That information from Straight to the Trash.

Speaker 2 (05:32):
Anyway, this is my... "straight," put a dash in the middle, so I have to assume it came from Andrew. We're
thrilled to be joined in our third seat by a
hilarious comedian actor writer came up in the New York
improv in Theater Scenes Now in Los Angeles had some
nice reviews in regional publications like The New Yorker Ever

(05:56):
Heard of It, who called their acting virtuosic. The New
York Times called them relentless. They have a new podcast,
Worse than You. Please, Welcome to the show.

Speaker 1 (06:07):
Mo Fry Pasic!

Speaker 2 (06:12):
Hi, Yes, Hello, what's going on? How are you on?

Speaker 3 (06:16):
This is so fun. I also forget when you send
in bios.

Speaker 4 (06:19):
It's like you forget when they're gonna be read aloud
and you just have to sit there and it's, like, humiliating.

Speaker 2 (06:23):
Be embarrassed. Yeah, anytime there's a New Yorker rave about
somebody on our show, like that's that is the most
meaningful thing to me.

Speaker 4 (06:33):
It's a calling card, for sure. I'm not I'm not
shy about that. It's like I could probably get bigger awards,
and there's certain ones that are here just kind of
like but did you hear?

Speaker 2 (06:42):
But I don't know if you've heard of New Yorker.
I also like to call The New Yorker a regional
publication because.

Speaker 3 (06:48):
It's not not it's not.

Speaker 4 (06:52):
Also, do you guys remember when it was it Paul
Ryan's, like, stepson or, like, nephew dabbed in front
of like the Wife or something like over the Bible
in like twenty sixteen.

Speaker 2 (07:04):
Yeah, lord, I show you the ultimate respect.

Speaker 3 (07:09):
I also wish that you had, like no one's really
cornered the market on a niche Steve Jobs outfit, because
like, that's like a sort of innocuous just black tee and, you know, jeans. But someone, to have every single day to wear the Chip and Dale dabbing shirt, right, is so good.

Speaker 1 (07:24):
Yeah.

Speaker 2 (07:24):
I assumed you were going to get the whole clearance
bin just for it to like give out as gifts,
But yeah, that's probably the better option, is just you
have a closet that every single day hang it. Yeah, yeah,
just a series of.

Speaker 1 (07:38):
I fucked up. My life could be so much better, efficient.

Speaker 2 (07:44):
Oh man, what a shirt. And by the way, Andrew
joined the record and showed us his awesome shirt, and
then super producer Justin had a Darkwing Duck T-shirt. So that's, that's the vibe, that's who we are.

Speaker 3 (07:59):
I got to go find some Scrooge McDuck material. I'm like,
what are we doing.

Speaker 2 (08:03):
I'm trying to think of what I was watching around
that time after school, because I definitely, like way too
late in life, was watching Darkwing Duck, Chip 'n Dale Rescue Rangers.
I think there was probably another cartoon. I don't think
Gargoyles was still on, and then like Conan reruns for
some reason I think came on right after that. So
I associate Conan, like early Conan with that time. What

(08:27):
a time to be alive.

Speaker 1 (08:28):
Syndication, the business model that made weird kids and.

Speaker 4 (08:33):
Now so much Sanford and Son at like three and
it's like thanks for a really weird little white girl.

Speaker 2 (08:38):
Yea, all right, we are thrilled to have you here.
We're going to get to know you a little bit
better in a moment. First, a couple of things that
we're going to be talking about later, A couple of
news stories that are in the zeitgeist. Obviously, the REAL ID is in the zeitgeist, so we're gonna talk about the history of this REAL ID Act, where it's

(08:59):
like, we need to make sure that our IDs are even realer, unfakeable, unfuck-withable. And it's terrible. The history is terrible of just, like, ID in America, of this particular... the REAL ID Act. It was passed back in two thousand and five. See if you can guess

(09:20):
what was going on back then. And then we're going
to talk about how it's like not even real. It's
that you don't actually need a REAL ID to travel, apparently. Oh god, yeah, so we'll talk
about that. Talk about my favorite news story I've read
this week, which is Tesla's inventory of unsold cyber trucks

(09:40):
skyrockets despite offering huge discounts. I just this article just
made me happy, like in a way, just like the hater,
my inner hater. I'm just like fuck yeah, oh yeah.
Sarah Rumpf for Mediaite, just write it directly into my veins. Shoot,
it really is.

Speaker 1 (10:00):
It's delightful. Everything about this it just goes. It goes
from like grin to.

Speaker 4 (10:04):
Grid as you read, like cyber truck people also like
want to sort of be combative and want you to
hate them. But I'm like, no, it's just, like, schadenfreude. I'm so embarrassed for you guys.

Speaker 1 (10:13):
Yeah.

Speaker 2 (10:14):
Yeah, it's just so conspicuous. It doesn't look like anything else.
Like I just I can hear like a dumb horn
playing in my head, like a dumb yeah. Like every
time it's driving down the street, I'm just like, uh,
you look like shit.

Speaker 1 (10:31):
I don't know if it's it was on this podcast,
but I did find the ideal. I did see the
ideal use for a cyber truck in Echo Park, which
was there's one in my neighborhood. Is a couple of
bucks away that is wrapped in a vinyl wrap promotion
for a vape juice store. That's the first perfect use of a Cybertruck.

Speaker 2 (10:51):
That is the final form of the cyber truck.

Speaker 3 (10:54):
I feel like that's radioactive, Like don't touch that.

Speaker 1 (10:56):
Oh my god. No, everything does wilt in its wake.
But it is like, yeah, there's like that's my neighborhood.

Speaker 2 (11:04):
That is, I think it's in my neighborhood. I'd like
see it all the time, and it's got a it's
covered in a shadow from Sonic three, like the character
shadow from Sonic three, like decal. It was just like again,
it's like that's kind of like it should be designed

(11:24):
like us.

Speaker 1 (11:25):
You should come with that stopping job.

Speaker 4 (11:32):
Tell these people they can put that on a Honda.
It's like, we are spending this much money? And, like, at my most generous, I'm like, sure, car shapes and
colors have been so homogenized and that's terrifying. Most generously,
I can say, you want to stand out again, put
that decal on a Honda, Like what are we doing?

Speaker 2 (11:49):
They just the timing wasn't great for them. They had
it on back order. They were ordering their cyber trucks.
And then the guy came out and went full Strangelove on the global media center stage, just a
full Nazi salute. Oh you gotta feel for them, not

(12:09):
at all, you have to feel something I do. It
is we'll figure out what and we'll talk about AI
if we get to it. There's an expert who thinks
who's, like... he used to be Uber's head
of AI, so he has all the bona fides of
being like a shitty AI dude, and everything he says

(12:31):
about AI makes sense. He's like this bubble is about
to pop, like this whole thing is a disaster and
it all just like makes sense. He's like, it's good
at mimicry, and that's it. It's not good at the
things it claims to be good at, and it's like
actually not getting better, It's actually getting worse as they
continue to develop it down this road. So maybe we'll
talk about that plenty more. But first mo, we do

(12:54):
like to ask our guests, what is something from your
search history that's revealing about who you are?

Speaker 4 (13:01):
Okay, well here's my thing. I'm a Googler, That's what
I'll say. We start there, like anything. If I have
a thought, I'm eight hundred Google's in you know what
I mean. Yeah, And so already we're I'm going to
tell you that I had to like go on my
extra history to find it because I I delete everything
I Google. It's like, I'm not I know, I'm not

(13:25):
hiding anything. I know the data is tracked, but emotionally
it feels better to just like no one knows I
wrote it down. Oh I had a couple of good
ones I had.

Speaker 2 (13:33):
By the way. Uh, you know, if you're a person
who likes to go from thought to Google really quickly,
Elon Musk has a new product that you're gonna love.

Speaker 3 (13:42):
Oh yeah, it's beta, it's in my brain right now,
you haven't already.

Speaker 1 (13:47):
Amazing.

Speaker 3 (13:48):
Absolutely.

Speaker 4 (13:50):
My two ones I think that are most revealing were one,
what essential oils smell like leather? Because I really like
this one perfume, but it's not non toxic and it's
really expensive.

Speaker 2 (14:04):
It's not non toxic.

Speaker 1 (14:05):
Yeah, much toxic toxic, it's not toxic.

Speaker 4 (14:14):
My favorite Indian restaurant back in Madison says on the
window excellent vegetarian and non vegetarian food.

Speaker 1 (14:26):
It is covering the bases.

Speaker 2 (14:27):
Yes, So you like a perfume that smells like leather,
A scent that smells like leather that is toxic, and
so you're wondering if you can mimic it via essential oil.

Speaker 3 (14:40):
Yes, exactly.

Speaker 4 (14:42):
And then my other google I thought was revealing was
symptoms of hepatitis A because I saw there was an
outbreak in LA and it's it's like I think it's
like fecal and oral transmission and everyone's nasty. So I
was kind of like, oh God, I have to be ready,
like we have to look for symptoms in myself
and others.

Speaker 2 (15:00):
I don't know how I can't. I can't keep track
of all the heps. Which hep is fecal? It's like, I think the

Speaker 1 (15:09):
Food one, if I recall. Okay, the craziest hepatitis that I dodged one time: there's a place in, uh, kind of like South LA that serves blood clam ceviche. And I did not know at the time that the blood clam, the little blood part of the blood clam, is close enough to human blood that hepatitis

(15:30):
could live in it.

Speaker 3 (15:32):
Oh oh my god.

Speaker 2 (15:35):
I mean, Andrew, I gotta say this would be one
of these situations that, if you had gone out that way, I would have been like, on the one hand, I love you. On the other hand, like, you ate something called blood clam ceviche.

Speaker 1 (15:50):
Yeah. Yeah, they also serve them on the half shell.

Speaker 2 (15:54):
And was it delicious?

Speaker 1 (15:56):
I wouldn't be. I'm a pretty like just fucking do it.
It was a little much for me. It mostly tastes
like a big old... it's the blood, bloody clams, dawg. Does it taste metallic? Yeah, yeah, it's like it's not iron.
I mean it's it's like iron a proto hemoglobin sack.

(16:16):
It's the thing. It's like the miracle of evolution where
you're like, oh, this is when it was clam and
then probably this became blood. That's fucking anyway, Sorry, sorry
for that's important. Yeah it was clams.

Speaker 2 (16:31):
Is that how they described it to you when they
when they were like.

Speaker 1 (16:34):
Not there, this was this was this was a Wikipedia Yeah.

Speaker 3 (16:40):
Yeah, this is after you're like, maybe I have hep C.

Speaker 2 (16:43):
Yeah yeah, yeah, you know you got to do some
some wild shit to feel alive these days and feel
dead and and then actually be dead, you know.

Speaker 3 (16:54):
So important.

Speaker 2 (16:55):
Anyways, Well, we hope the googling on the symptoms of
hep A... any surprising symptoms there? Is that the one
that turned your eyes yellow?

Speaker 3 (17:05):
Yeah, because it's a liver disease, and jaundice.

Speaker 4 (17:07):
Here's the tough part though, It kind of like plays
out on its own, and I feel like there's such
like a prevalence of liver issues just in like our
generation because of toxic load that I'm kind of like,
I will it play out, like yeah.

Speaker 1 (17:22):
We'll just kind of and mix in with all the
other shit, and then it's, it's part of a balance
rather than a disease on its own.

Speaker 3 (17:28):
Say that that's probably true.

Speaker 1 (17:33):
I just do.

Speaker 2 (17:34):
That's what we're here for.

Speaker 1 (17:36):
Not only not a doctor on the opposite of a doctor.
Whatever that is.

Speaker 2 (17:40):
Vibes based medicine is what we practice here.

Speaker 1 (17:43):
Mo.

Speaker 2 (17:43):
What is something you think is underrated?

Speaker 1 (17:45):
Oh?

Speaker 3 (17:45):
Non lucrative hobbies?

Speaker 2 (17:48):
Love it?

Speaker 3 (17:48):
Yeah, hugely underrated. I am.

Speaker 4 (17:50):
There was an injured baby hummingbird in my backyard last
week and this I texted this woman who like runs
a hummingbird sanctuary, and she happened to have a woman
who volunteers for her that was near my apartment, and
it was like, I go to her place and it's
her hobby. She has a full time job, but she
just like rehabilitates little hummingbirds in her house. And it's

(18:13):
not like she's doing it on the gram. She's not
doing it. She just really enjoys it.

Speaker 3 (18:16):
It was awesome.

Speaker 2 (18:18):
How is she coming across the all right? Sorry, my
brain just went to, like, a dark place where it's
like she's injuring those fucking hummingbirds.

Speaker 3 (18:25):
Oh, Munchausen's by proxy. Yeah, yeah, how do you find so
many injured hummingbirds?

Speaker 1 (18:31):
Can I tell you?

Speaker 4 (18:31):
I mean, the little guy is a little barreling through
the air like a snitch, like you know, like I'm just.

Speaker 2 (18:36):
Like I love hummingbirds. I notice hummingbirds, and I've never
seen an injured one, but I mean you found one,
so there you go.

Speaker 1 (18:43):
My mind went straight to like like the hummingbird version
of like a World War One like battle hospital, so
like a little like hummingbird frush under one arm and
a teeny tiny hummingbird cigarette out of the end of
the of course. Yeah, flashback, hummy.

Speaker 2 (19:01):
I love hummingbirds so much. They're so they just seem
to be little like droplets of like something operating at
a different like time space continuum than the rest of us.
It's just like that thing is moving like a fucking UFO.
There's no way remarkable. And then you look at the
weight of their little barrel chest and you go, hell, yeah,

(19:26):
why are you so proud this little one?

Speaker 3 (19:31):
Can I talk about pride? This little one?

Speaker 4 (19:33):
It was like a baby it had fallen from its nest.
And the way it looked at me when I was
like holding it was like, hey, let me down, this
is humiliating, And it was like they're so proud.

Speaker 2 (19:41):
It was crazy, the ones so tall, Oh you think
you're so cool? It really was. How big is a baby hummingbird? Like, fingertips?

Speaker 3 (19:54):
Like bizarrely, it was like about this big like like maybe.

Speaker 2 (20:00):
Around yeah intro, wow, I love this. That is amazing,
And I do think we need to bury ourselves in
non lucrative hobbies. I've been talking about the trends in
mundane like people doing mundane ship just for the sake
of doing mundane stuff on like TikTok, and like the
video I keep coming back to is like these people
who made chocolate chip cookies with but like without using

(20:23):
their hands. They just used the trash grabber things.

Speaker 3 (20:26):
And see, this is that old school shit that rules. Exactly.

Speaker 2 (20:30):
It's just like that. I feel like we need that
right now, just like it cut off from any ideological
content and just like the stuff that people used to do,
and like when they stuffed themselves into fucking foam booths
and like sat on flag poles for days.

Speaker 4 (20:47):
My friends and I it was before the garage door
opener sensor was a thing. We'd try and roll under it like Indiana Jones.

Speaker 2 (20:54):
Yeah, so dangerous, but it was.

Speaker 3 (20:56):
We spent hours just trying to roll.

Speaker 2 (20:58):
Under its door that way.

Speaker 1 (21:01):
When I was five, you're sort of talking about what
you want is like if there was a way to
make a fraternity that was not wildly misogynist exactly.

Speaker 2 (21:10):
Yeah. I just think there are like some things that
we can take from them, like just pick around the misogyny,
you know, and there's like delicious little bites in there that you get. Cake.

Speaker 3 (21:21):
Those bites are the sixteen goldfish they make you eat.

Speaker 2 (21:26):
Still, don't be mean to goldfish. But I thought there's like some non-mean stuff in there real quick.

Speaker 1 (21:31):
Just speaking of chocolate chip cookies, I just want to
tell you guys about a product that I had recently.
I visited my sister in Atlanta and she had cinnamon chips,
which was I mean, that's ultimately was cocoa, butter, sugar
and cinnamon. But they were the best thing I've ever
had in a pancake. Holy fuck, oh pancake. Yeah good.

Speaker 4 (21:51):
Can I tell you, I am actually huge on a cinnamon chip, because to me, it is the
epitome of a nineties coffee house.

Speaker 3 (21:57):
As a cinnamon chip scone?

Speaker 2 (21:59):
Yeah?

Speaker 1 (22:00):
I had never heard of it. Is this a thing that
you knew about?

Speaker 4 (22:02):
I was shocked it's been forgotten. It's like, à la butterscotch, it's been... But it was like, I think all the Barnes and Noble Starbucks cafés still have the scones. Okay, but it's not popular and it's good.

Speaker 1 (22:16):
It's really good. It is mostly cocoa oil or palm
oil or whatever.

Speaker 3 (22:20):
But oh yeah, whatever, no problem.

Speaker 2 (22:24):
What is something more that you think is overrated?

Speaker 3 (22:27):
Keeping in touch?

Speaker 2 (22:31):
I hate you, Mo. This is... I am so mad
at this. I hate it too.

Speaker 4 (22:36):
It's also so overrated because to me, like texting, keeping
in touch with this certain like there isn't depth to it,
and so it kind of feels like a performance, like what's.

Speaker 1 (22:44):
Up you doing? Good?

Speaker 3 (22:45):
Okay? Well, was get in touch?

Speaker 1 (22:46):
But I go.

Speaker 4 (22:49):
I love yeah, you love me. I'll see you when
I see you. If you need me reach out, I'll
reach out if I need you. Yeah, the whole keeping
it No, I'm not gonna do it. I'm not gonna
do it like a.

Speaker 2 (23:01):
Good you know, leave leave it for like an annual
nice conversation as opposed to a "hey, just checking in," right? Like, you already know what you
want the answer to be as you're checking in, and
you want it to be short.

Speaker 4 (23:15):
And it's specific. It's not, like, an emotional check-in. It's
specifically the check in because you feel like you should
because you care about that person, so you're giving that signal,
and it's like, no, cut them out.

Speaker 3 (23:24):
They know you love.

Speaker 1 (23:25):
Them, They should know if they love you, they would know.

Speaker 3 (23:29):
Okay, I love us getting a little toxic.

Speaker 2 (23:34):
Actually it's your fault that I don't check in on you.

Speaker 1 (23:38):
My my core group of friends from high school, at
least four of us have like birthdays a week apart,
and by the time it's time to text the fourth person,
it's very tiresome because I am out of shit to say.
And we have already talked three times.

Speaker 3 (23:53):
He's been dealing with this since he was fourteen. That's
so funny.

Speaker 1 (23:57):
Yeah, last last dude on the block. It's like I
already said happy birthday too early twice. Right, all right,
let's take a quick break.

Speaker 2 (24:09):
We'll come back. We'll talk about real IDs, and we're back,
and I mean, I gotta ask the group, how are
we doing on our real IDs? Folks? Mo Andrew, are

(24:31):
you real id up?

Speaker 1 (24:33):
I let me just make one comment to you, Jack. Yeah,
I feel like it's more like real id's nuts.

Speaker 2 (24:41):
Thank you.

Speaker 1 (24:42):
There you go.

Speaker 2 (24:43):
I do appreciate you just just dabbed like the Chipmunk
T shirt. I think I have mine.

Speaker 1 (24:51):
I don't know. I had to get my license for
renewed last year.

Speaker 4 (24:54):
So that's it, right, That's exactly how I feel. I
think I have mine, I got it renewed. I don't
know if I said yes, right.

Speaker 2 (25:00):
Yeah, fuck, I don't know mine. So the story, the
story should be read in the context that I'm just
like, mad that REAL IDs are a thing, and that
I haven't done it yet, but I do think they're stupid. Yeah,
we've been hearing about them since this The Real ID Act,
passed back in two thousand and five as a response

(25:22):
to the 9/11 Commission's recommendation that the federal government set standards for the issuance of sources of identification, such as driver's licenses. Basically just a.

Speaker 1 (25:33):
How's, how's their track record on recommendations? Swish. One word: swish.

Speaker 2 (25:38):
They nailed it.

Speaker 1 (25:39):
Kobe, Kobe from the logo.

Speaker 2 (25:43):
They Yeah, it is bad. It was, you know, war
on terror, horrible travesty of human rights violations. But yeah,
they so they put that out there and then kept
pushing it back, you know, just kept getting pushed back,
and now they expect us to believe that it wasn't pushed.

Speaker 3 (26:01):
Back I mean, oh what, oh a gaslight?

Speaker 2 (26:05):
We all will? I just feel like I was so
used to it getting pushed back that I shouldn't have
to then go out and get it.

Speaker 4 (26:12):
You know.

Speaker 1 (26:13):
Okay, that just feels unfair to.

Speaker 2 (26:17):
So many times I'll just passively get it.

Speaker 4 (26:21):
What makes me laugh is that if you don't get
it right, you just have to have an extra measure
of security. I guess, which makes me laugh because it
reminds me of that remember that early days snapchat meme
in like twenty fifteen of that Sikh man with the
turban, and it just, he wrote across the screen, and
he's in front of security and goes about to get
randomly selected.

Speaker 2 (26:42):
Yeah, exactly. It's just a pretext for them to choose
to give you extra security attention, which I'm assuming they're
still going to do even if you have the REAL ID.
But it's been pushed by Trump recently, specifically as a
way to target immigrants and give them an extra way

(27:02):
to you know, stop people, deport people, all all the
stuff they seem to like to do to people who
weren't born here, or were but their parents weren't born here,
or you know, they just protested on behalf of Palestinian people.

Speaker 3 (27:17):
But there's more.

Speaker 2 (27:21):
I mean, and this goes back, like... America has
a long history of identification documents being used to control
and surveil black people, and it's rooted in the violence
of slavery and its aftereffects, and you know, we
we've covered this before on the show. But it's just

(27:43):
a way for the government to impose its will on
you and cast doubt on you and you know, do
whatever the fuck they want to you. Basically, and even
Sarah Palin. So this is, this was just kind of
a nice like blast from the past where like conservatives
are bad about this too. They're like, this is just
big Brother forcing us through more hoops, which like brought

(28:06):
me back to the era where like conservative people were
just like annoyed by everything. They were just like anti homework,
you know, they're just like, I don't want to fucking
do that.

Speaker 4 (28:18):
Well that's the moment and when these things come about.
The only kind of hope I have is because you
can't appeal to empathy or ethics or morals when it
comes to this, like to crossing whatever aisle quote unquote,
But when it comes to like bureaucracy and folks having
to you know, take days off work and go into
the DMV and go, and you just have to hope

(28:40):
that money then makes them complain and you're like, Okay,
maybe there's some some shift that will happen.

Speaker 2 (28:45):
Yeah, I feel like just things are going to get
more annoying and maybe that will help. Like this makes
things more annoying, Like obviously the fucking economy like is
gonna be shitty and make things I don't know annoying
is the right word, but just harder for people. Tedious, yeah,
tedious and yeah, just having to over and over just

(29:07):
like go through jump through bureaucratic hoops to as a
as a wise philosopher once said aka Sarah Palin a
few sentences ago.

Speaker 1 (29:17):
It is it is super cute that she like clearly
didn't get like multiple conservative like rhetoric upgrades. She's still
doing this like Big Brother shit. Like, what are you, fucking, in nineteen eighty-seven? What are you talking about?
Like you guys are.

Speaker 2 (29:30):
Past that. You guys are Big Brother.

Speaker 1 (29:33):
You've been Big Brother for twenty years. What are you talking about? Funny.

Speaker 4 (29:37):
I like, I actually like her doing it passionately, being
like Hey, you guys, this is pretty scary.

Speaker 2 (29:42):
Huh. But Alex Jones, also like old school you know,
conservative in some ways, has suggested that Trump's fascist government
overreach is really just because he's being manipulated to support
REAL ID. So, like, REAL ID is actually the thing
that he's all right, like you've been trying to get

(30:04):
to so he can like actually they can get like
more control over you, which I don't know sounds plausible
at this point. Oh god, but yeah, it just seems.

Speaker 4 (30:16):
It seems tough in these situations because he's right, it's bad,
but it's like it's because it's barring like you know,
ESL folks from, like, having the right, like, test ability
at the DMV or the right paperwork for people who
don't have, you know, birth certificates, like all these different things.
But it's that Onion thing where it's kind of like
worst guy you know agrees with you, and it's.

Speaker 2 (30:37):
Like no, well yeah, And all of their fantasies of
like conspiracy theories are just them imagining what it would
what it actually is like for people of color in
this country. Yeah, like if that happened to them, you know,
so they're gonna start pulling you over and they're not
gonna let you have guns, and they're gonna, you know,
like it's like that is already the case for many

(30:58):
of the people in the country. It's not gonna happened
to you.

Speaker 4 (31:01):
But yeah, yeah, and they can't fathom Trump like doing it.
What's so weird is every time they're like, and they
tricked them, they tricked old boy, and you're like, what.

Speaker 1 (31:14):
I mean, I think they're they're just kind of leaning
into what they have, right, like he he because Trump
now seems more trickable than ever just even it's just evident.
So I think now they just had to add a
little a little zag to their zig maybe, right. Can
I just say I I flew recently at UH and
at least at Atlanta Airport. The video of Kristi Noem

(31:36):
Like doing the little like you're about to go through
TSA speech was the most I've ever felt like, oh,
I'm in the first act of Starship Troopers. Like it
was the production value I think was just a little
Verhoeven-ish on her video that I was like, oh no.

Speaker 2 (31:52):
It's really... there's so many ways that it feels Verhoeven-ish, like we're in RoboCop and fucking Starship Troopers, it's like just not right. Yeah, yeah, no, not as funny at all.

Speaker 1 (32:03):
Just, yeah, it's a gritty Verhoeven reboot by, like, you know, your Zack Snyder types.

Speaker 2 (32:13):
Right, Yeah, it sucks. Yeah, it's like if a good
director was reimagined by someone who sucks at this.

Speaker 3 (32:21):
Because they are... and, like, we're all like, oh, it's dystopian. It's like this movie. It's like this movie.

Speaker 4 (32:25):
It's like because those people watch these movies, right, It's
like it's created because of that, and so then it
feels super next to reality, like.

Speaker 1 (32:33):
And they're like they they watched Starship Troopers and like
seems good.

Speaker 2 (32:37):
Yeah. Those bugs, those bugs gross. Also, you can get
free donuts, and you know, you might assume that you
have to get your real ID to get free donuts,
like giving blood.

Speaker 1 (32:51):
But you don't.

Speaker 2 (32:52):
Krispy Kreme is offering free donuts as like a way
of taking the edge off of the REAL ID process, which... maybe this will be the time. I've been told by our listeners that I need to try
fresh Krispy Kremes at some point. Have you not done
I still haven't done it.

Speaker 3 (33:11):
I mean you've had twenty years since the phenomenon started.

Speaker 2 (33:14):
What are you doing here being bad at life? I guess.
But yeah, as we've talked about, you don't actually like
you can still fly without the REAL ID. You will just get put through extra security, which is the same
security people are already subjected to quote unquote randomly by
TSA agents. So yeah, it just seems like people that

(33:39):
the government is banking on people not making the extra
trip to the DMV and then using this dusty law
to further do security checks and just do it, do
whatever the fuck they want to.

Speaker 1 (33:53):
People, essentially. And Jack, as someone who, again, just was like, whatever, I'm already at the DMV.
What what information did I give them? If I have
a real ID that I wouldn't have wanted to do,
you know, I don't.

Speaker 2 (34:09):
I don't have that because it's.

Speaker 1 (34:11):
Like a fingerprint or some ship. I guess I just
assume they have all that already, but I don't genuinely don't.

Speaker 2 (34:16):
Blood that was part of it. Here's the thing.

Speaker 1 (34:21):
Here's the thing about me and the DMV.
I was just spraying blood all over anyways.

Speaker 2 (34:25):
So one thing, you got me walking into a DMV... I've started to hear about that, that line goes fast.

Speaker 1 (34:36):
Yeah, I will say I managed to lose two pairs
of prescription sunglasses this last trip to the DMV or
two trips to get this ID, which fucking sucks. You
lost two? Over the two trips, I left one
there and then I got a replacement, and then when
I went to pick up my ID card, which I

(34:58):
again assume was REAL ID, because I feel like
I would have remembered if I took a stand and
then I just left those there. Also, don't let me
have sunglasses?

Speaker 3 (35:08):
Yeah, everyone listening, We just got to get you a string.
You know those little.

Speaker 1 (35:13):
Or just like pitch black contact lenses like the guy
from Korn.

Speaker 2 (35:18):
Yeah, or Transitions. They make Transitions contact lenses. You just have to stare directly at the sun, and then your eyes turn.

Speaker 4 (35:25):
Can I tell you... I'm such a literal princess, my friend calls me, because I didn't see it.
I didn't look at the sun or see a sunset
until I was twelve because I was told not to
look at the sun, and nobody said there's a time
constraint on it, like you're allowed to look when it's setting.

Speaker 2 (35:40):
That's true. Nobody does tell you that. We just all
are like, this is probably.

Speaker 4 (35:44):
Fine, right, yeah, he said, okay, all right, and then
I got to see a sunset at twelve.

Speaker 1 (35:48):
But it was incredible being conscious of it the first time.

Speaker 2 (35:52):
Yeah, I see a sci fi future where you're the
only person who isn't blind. Yeah yeah, you know.

Speaker 1 (35:58):
Yeah, but you're also like a vampire that got cured.
That's very like poignant to remember seeing a sunset for
the first time.

Speaker 4 (36:05):
Yeah, well it is, because, to your sci-fi point, the
reason we found out is because my parents and my
sister were all driving the car. My parents were like,
look at that beautiful sunset. I start screaming because I
was like, I'm going to be the only one to
see for the family, and I like, stop.

Speaker 2 (36:21):
Throwing your body in between their face and the rays. Yeah, all right. Uh, moving on to just pure schadenfreude. So it seems like Cybertrucks have
just totally stopped selling, Like and again, I just have
to Sarah Rumpf, R-U-M-P-F, for Mediaite. This is

(36:44):
the most I've enjoyed. It's not like there's nothing that
jumped out to me like stylistically, It's just an article.
God damn, did I enjoy it. I was like, I think I need a cigarette after reading this. You know, I
have my suspicions the cyber trucks were not going to
sell that well. Like we'd heard that Tesla's were becoming
less popular in the aftermath of Again, the mainstream media

(37:06):
loves to be like it's his association with Trump, who
is a polarizing figure, and DOGE, which people have their
questions about, and it's not the like blaring almost involuntary
Nazi salute he threw up in front of the global media at Trump's inauguration,
and then immediately followed that by like speeches in Germany

(37:29):
telling them to like stop making a big deal about
the Holocaust or whatever he said there. Anyways, since that day,
Tesla sales have dropped pretty precipitously, while overall electric vehicle
sales have gone up. So it's like against the grain
of like how everything's moving. And so my assumption was,

(37:51):
like he made it very uncool to own a Tesla
and the like. But Tesla's like at least look like
other cars for the most part, Like you have to
spend a second to make sure you're looking at a
Tesla before you start to judge the person who's driving it,
like there's no mistaking a cyber truck, like they they
just might as well come with like a full brass

(38:13):
band playing the Imperial March, like they're just so fucking
conspicuous from like three blocks away. And so you know,
they're just a statement. And the statement is like, I
stan this. It came out after he did that, and I stan, I'm down with that. I think that was
cool what he did with the uh, with saying his

(38:35):
heart goes out to people. So it felt like, man,
that's gotta be tough for the popularity. And this article.
So this article, in addition to being like so first
Tesla sales going down as EV sales going up.
Next cyber truck quote. A Forbes editor who covers the
electric vehicle market dubbed the Cybertruck Elon's Edsel

(38:58):
and the auto industry's biggest flop in decades at
the beginning of April. It's accelerated on a downward spiral
since then. So, yeah, already a flop. And then what
we're about to hear happened. A new article by Electrek editor-in-chief Fred Lambert spells out how brutal the numbers have become for the Cybertruck. And by

(39:19):
the way, it should be noted that Electrek and Lambert
personally have faced sharp criticism over the years for perceived
bias in favor of Tesla and Musk. So this is
like one of his you know, one of the people
who like writes about him glowingly in a way that's
like annoying and has created the problem in the first place.

(39:39):
It was reported in April that about twenty four hundred
new cyber trucks, representing about two hundred million dollars in inventory,
were languishing unsold, leading the company to refuse to accept
them as trade ins, and other used car dealers finding
them unappealing and making only low ball offers. So like
they just stopped taking them back even though the cars
are like broken and like causing all sorts of recalls

(40:02):
and shit.

Speaker 4 (40:03):
Someone had a sticker on their car that was like,
I know, I can't afford to get a new car
like to trade Yeah.

Speaker 2 (40:09):
Yeah, exactly, yeah, I mean that's I mean, that's happening
across Teslas, but like this one is that's the only
acceptable decal, other than vape juice as a giant decal.
That's like I know, so I did not I don't
pay attention to the news, but I'm.

Speaker 1 (40:24):
Like, how dumb can you be? Honestly, Like, it's yeah.

Speaker 2 (40:29):
Now that inventory. So they were already calling it like
the biggest flop of all time when there were like
just twenty-four hundred Cybertrucks sitting on the lot, unmovable.
That inventory has skyrocketed to a new record high of
more than ten thousand cyber trucks. So it's like four
times where it was when they were like this is

(40:52):
the biggest disaster in the history of cars. Maybe it's
like it's four times that, and like they just can't
even like they can't even explain like what's happening. They
don't there's no like counter narrative. A lot of like
one of the counter narratives they've tried to say is like, yeah,
we're just like ramping up production because there's so much demand,

(41:15):
and like studies have shown that they've actually like slowed
production to almost to a halt because there is no demand.
It's just that, like, zero people are buying them,
and so they're just like they couldn't you couldn't like
slam on the brakes at the factory fast enough, and
so they're just like piling up at the you know,

(41:37):
like the T shirt you're talking about just into the
discount bin and like they're offering huge discounts now and
they can't sell shit. Yeah, it's like the I Love Lucy scene. Really? Yeah.

Speaker 1 (41:53):
I mean the honestly, like last year, the only tiny
ray of hope I had for the election was when
Elon joined. I was like, listen, literally everything he touches
does turn to absolute failure. So the problem was it
just the timeline. We just didn't give him enough time
to cook. If he joined, like you know, mid election cycle,

(42:16):
I do think Elon could have ruined Trump's.

Speaker 2 (42:18):
Chances, right. Well.

Speaker 4 (42:20):
I also think like the reason there's no sort of
counter narrative, right is because then we'd have to talk
about billionaires and it's like, Musk, it's like three hundred
and thirty three billion, what is it? He is?

Speaker 3 (42:31):
It's like or is not? Eight hundred million is a
drop in the bucket.

Speaker 4 (42:34):
And that's what the Cybertrucks are costing to sit there, right. And so it's like, it's just
kind of because there's no discernment or accountability within billionaires.
He's like, yeah, it's not a flop to me. It's
like it's whatever. And that's the hard part, and I
think that should be more the focus, is that a
company is able to use these resources like and like

(42:55):
and just no problem, no accountability.

Speaker 1 (42:57):
Yeah, yeah, Well, wealth has been concentrated to such a
degree that one of the worst auto decisions of all time can be absorbed by a moron.

Speaker 2 (43:08):
Yeah, and he can act to the degree that he
can be like, actually I did that on purpose. It's
actually for tax purposes. Can I tell you for me?

Speaker 1 (43:17):
Your impression is eternally troubling every time. It's actually a win. Akshually. Akshually.

Speaker 2 (43:26):
Oh yeah, I also I didn't you know, I have
heard like people are talking about how he and Trump
have gone their separate ways. This article also gets into
like how Trump's policies are just directly and vigorously fucking
Elon Musk, like.

Speaker 1 (43:46):
God, speaking of essential oils that smell leathery and are
extremely toxic. Imagine being in the room with those two
fucking dickheads, just the smell coming off their bodies.

Speaker 2 (43:58):
He's just right. He bought an election, like put all
his time and energy into like buying an election for
this guy, and got... just got what everyone who
thinks they can like play Trump gets Like everyone who
at like thinks they're going to get something in return
from Trump, they always get completely fucked over. And yeah,

(44:22):
I don't. It does for a moment make you be like, wow,
he's like really doesn't give a fuck. He's like pissing
off the oligarchs, and like, will they ever get together
and admit they were wrong? And then that scenario just
stops being plausible because obviously they would never admit they
were wrong or dumb about anything.

Speaker 1 (44:43):
It's it's like watching it's the perfect con Like Trump
knows they can't do shit about him, right, And it
is in that way sort of glorious. And if we
didn't all have to collectively foot the emotional, financial, and
possibly physical safety bill, it would be funny.

Speaker 2 (45:01):
Right, Let's take a let's take a quick break so
that we can all just laugh quietly to ourselves about that,
and then we'll be back. Let's talk about AI will
be right back, and we're back. And yeah, so there's

(45:32):
an academic slash, like, capitalism guy. He used to be the head of AI at Uber. So he's, like, not just an academic who's, like, standing on the sidelines being like, you guys are dumb. He was one of the guys.

Speaker 1 (45:51):
Jack, I just want to interrupt to say, like, any version of the job title financial analyst or commentator blah blah blah is much better described by capitalism guy. They all fucking are.

Speaker 2 (46:03):
Yeah, capitalism. Yeah, that's just gonna be a segment of our show: Capitalism Guys, What Are They Up To?
But he like he doesn't even think like AI is
bad necessarily, He's just like they've chosen the wrong path.
But it did feel like for the first time I
was reading somebody talking about AI in a way that

(46:26):
actually made sense alongside, like, what we've been seeing.
And basically he's just saying that it's kind of fucked,
like it's it's going to continue, They're going to keep
investing in it, and it specifically these like large language
models and the gains that they're making are going to

(46:49):
run out, and the problems like hallucinations where the AI
makes up facts are going to be impossible to solve. Essentially,
and we've also seen recent reporting that like because they're
now drawing on training themselves with an internet that is
partially AI, it's getting worse. Like it's just going to

(47:13):
get worse from there. I had kind of missed this story.
But Open... so OpenAI, obviously, like, the known, you know, like, global leader in AI. Supposedly the global leader in AI. They had, like, just released their big... like, the new GPT-4o, whatever. Yeah, and they had to, like,
pull it back after a few days because it was

(47:34):
returning misguided, incorrect and downright harmful ideas. It was also
too sycophantic, the company acknowledged. That part is amazing. Crazy.
Isn't that so funny?

Speaker 4 (47:47):
But you know what's crazy to me is it's like
we're like forgetting the fact that it is a pure
mirror and aggregate of us and what we're feeding it
and the algorithms we're feeding it and the personal data.
And I'm going like, of course, of course, course, like
we also have sycophants as our biggest political touchstone and
news it's like we're talking about of course it's that.

Speaker 1 (48:07):
Yeah, Oh it's so funny.

Speaker 2 (48:10):
Yeah, I mean this feels like again and this is
the one hope for this version of AI is. I
guess that the whole like media apparatus is bought in
because so much of the market is propped up on
like AI hype that they're just not going to report
this because like that feels like a massive story that
like chat GPT releases their newest version and it's a

(48:32):
step backward, like in this thing that's supposed to be
like the thing it's supposed to be. The whole pitch
on AI is like this thing's advancing so fast, Yeah
that like we're actually concerned with how fast it's advancing,
Like it's gonna leave us in the dust and like
go fuck off to space like at the end of
Her, or, you know, like Sam Altman when he was

(48:54):
first like coming on the scene. There's a profile of
him from that regional publication, The New Yorker, where he
like talks about how he like keeps a cyanide capsule
on him because he's like worried about the AI like
coming to life and killing him and his family. Like
he's just playing up like terminator ideas, like because he

(49:16):
knows that it makes it a good pitch. That it's
like this thing's so damn smart, it's it actually scares
me sometimes. I'm so good at my job, I get
scared sometimes and instead of like rocketing along that growth curve,
it's getting worse and like more prone to making shit up.

(49:37):
And they're trying to The sycophantic thing is so funny
to me because they're just, like, trying to cover up for it by having the product just flatter the user.
That's so sad, that's so pathetic, Like, I mean, I'm
happy that it didn't work, I'll say.

Speaker 3 (49:55):
But here's my my thing.

Speaker 1 (49:57):
Though.

Speaker 4 (49:57):
I always get, like, half, it's like Burn After Reading, where it's like there's more idiots than you think at
the top, and half sort of like maybe this is
part of a bigger thing where it's like nobody has
promised accuracy with AI.

Speaker 3 (50:11):
Nobody has promised truth or accurate data.

Speaker 4 (50:15):
So when data is given to an individual and they
can see so apparently that it is not accurate data,
then there's the issue, right, But that to me doesn't
necessarily say that. That's not the kind of point like
that's not like the idea that the data can be
chosen and picked by people doesn't speak to accuracy of data.

Speaker 3 (50:36):
It's just who's giving the data.

Speaker 4 (50:38):
And that's where I kind of go, oh, then that's
showing a hand too fast and not effective in the
way it is being titrated to people to use
more actively but not necessarily ineffective as a medium or
thing itself right as.

Speaker 2 (50:54):
A product, right, like they're and the trick that they
did is they never said it was like fully accurate,
and they like claimed sometimes it hallucinates.

Speaker 1 (51:04):
Well they.

Speaker 2 (51:07):
But yeah, the implication is by giving you a product
that it says can like write your legal briefs for you,
and like right, like they're implying that like it can
do this. Like it's like it's like Tesla being like
we have a self driving mode that like legally they
can't call it, like they can't say it can self drive.

(51:29):
So they, like... but everything around it suggests that, well, they wouldn't, like, put this in a car if it wasn't, so, like, obviously. So, like, it's just, yeah,
they they just imply it it is not good at
what it does.

Speaker 1 (51:45):
It's like, it's good at what it does; it's not good at what the bosses want you to think it can

Speaker 2 (51:52):
do. That's right. So that's the big thing. The
one thing that he points out, is it's very good
at mimicry. And that is the one thing that I'm like, yeah,
that is true, Like that's all the stuff I've been
impressed by, like the you know images, like the studio
Ghibli thing is like that's fun. You know, that's like

(52:12):
I see that as, like, a fun little, like, app.
Like we this used to just be a thing that
like got introduced to like Snap and like everybody was like,
well look at this thing that Snap can do, and
people will be like that's fun. For now, the really
like mind blowing shit that it can do is like
it's making fraud like much more powerful, like the people

(52:35):
are able to change their race, facial hair, voice, and
more during live video calls with no effort so that they can fool elderly people. In one case in this
article that we'll link off to, it did a deep
fake on like an employee and just made that employee
think that they were in a video call with their

(52:57):
CFO and their CFO was like, I'm gonna need you
to like transfer five million dollars here, five million dollars here,
five million dollars here, and they did it, you know,
like so it it's good at mimicking things. It's not
good at thinking or doing research for you because it's
going to find incorrect research.

Speaker 4 (53:19):
I can't believe we're not talking more about John Anderton though,
and pre-crime. Like, why aren't we bringing up Minority Report?

Speaker 3 (53:25):
All the time when we're debating?

Speaker 4 (53:27):
I mean like that to me is like the extrapolation
of data to say that somebody is going to do
something is just going to serve to support deporting people
to venement, like you know, all these different things.

Speaker 2 (53:39):
Oh yeah, they're going to claim that it can do that.

Speaker 1 (53:43):
It seems like they may have deployed versions of that
in, like, Gaza. It does sound like they're
already doing that pretty big time.

Speaker 2 (53:52):
Yeah, yeah, I will.

Speaker 1 (53:53):
I think this is a quote from Ed Zitron. But
if I'm wrong, I'm sorry about that. But I I
some writer pointed out that like a this all this
large language model stuff is like billion or trillion dollar
solutions for at best million dollar problems, right, like really good.
It's it's just like, yeah, like this stuff is like

(54:16):
kind of helpful for some things, but it is not
close to what it takes to run this shit and
develop this this thing that is like only okay, Yeah, so.

Speaker 2 (54:26):
This guy's prediction for what it because I do think
like the thing that's creating all the momentum and why
this is like a thing that we can't stop fucking
hearing about constantly in the mainstream media is because there's
like a lot of money invested in it. And the
thing that this guy's saying is like, it's not gonna
go away. It's not just gonna like one day people
are gonna be like there's no use for this, because

(54:48):
there are, like, cool things it can do with regards
to images and mimicry and like creating a C plus
paper that like has a couple factual errors in it,
like that stuff will probably continue to be useful some
some of the teachers grading those papers will probably be
using AI. So just you know, it'll just be like
AI on AI, like, bullshit. That's what Tron was about,

(55:12):
I think. Yeah, it's just gonna... instead of spinning off into, like, a hyper-intelligent race like a Terminator,
it's gonna like spin off into just mediocrity, just like
a world of mediocrity that like mediocre robots talking to
each other essentially.

Speaker 1 (55:30):
But yeah, losers who have been told they're the smartest
thing ever by GPT, which, yes, exactly, losers like.

Speaker 2 (55:39):
But his idea for where this ends up is just, like... it's going to get less and less expensive. And, like, because nobody, like, owns the information of how to do the thing that all these AI companies can do. Like, everybody knows it now. It's not secret information, as we saw with

(56:00):
the... what was it, the Chinese DeepSeek.

Speaker 1 (56:04):
Yeah. I'd say, to the extent I ever use large language model stuff, I always just go to DeepSeek.

Speaker 2 (56:10):
Just use DeepSeek. It's not the worst. Who cares? It's cheaper, it kills less, like, it uses fewer resources. But, like, he keeps comparing it to spell check in the article. He's like, look, spell check was a thing that was, like, cool when we first

(56:31):
got it, but nobody, like, made a billion dollars selling spell check. It's just, like, a thing that's everywhere now, and, like, everybody knows how to program it into whatever they're doing. And so, like, everybody's going to have access to AI. It's going to be recognized as, like, not very impressive technology, and then we'll

(56:53):
keep developing along the path to, like, where AI will eventually, like, be more powerful and something that we'll have to reckon with. But, like, this version is not it, and it's eventually just going to be, like, something that's kind of everywhere. But also, this path is not it. The large language model business, that's his whole thing. It's not, like, a type of consciousness that's

(57:17):
just going to improve with more processing power. It is simply going down the actual wrong road of—

Speaker 3 (57:24):
Wait, but not it for what and who?

Speaker 1 (57:26):
I mean "it" as in, if these dumb fucks think they can build, like, the equivalent of a conscious mind out of silicon chips... yeah, this is not how you even do that. Okay? Yeah. Which—

Speaker 2 (57:39):
Is what we also understand, because—

Speaker 1 (57:42):
Yeah, but that's what I mean.

Speaker 4 (57:44):
With these... like, with these articles, though, I get confused, because I'm kind of like, it's not it for the tech industry? It's not it for art and technology? Like, what is it not it for? That's where I... So I get what you're saying, because it is confusing sometimes.

Speaker 1 (57:57):
Yeah, well, because the other problem with the media about AI, I will say, I think, is that, like, everyone has different incentives. Like, all the tech people, their goal is a world where they don't have to employ human beings. Yes. People, like, are being pitched this idea that, oh, you can, I don't know, fucking write a novel just by asking this, or whatever the fuck,

(58:20):
you know what I mean, or this is, like, a little assistant for you. And the reality is, it's, like, just none of those things, and those things are not consonant with each other. So, like, that's why I think the, like, the "it" of what AI's "it" is, is such a confusing moving target. Because at best they're, like, deluded about what it is, but more realistically

(58:41):
they're just lying about what the goals

Speaker 2 (58:42):
Are. To you, right? I think so that we use it and help train it. I think even the thing—

Speaker 1 (58:50):
They didn't count on is how bad that training was
because people are fucking stupid.

Speaker 3 (58:54):
That's so funny, but that's also so sweet, you know? Like, I love that.

Speaker 2 (58:59):
Yeah. It is just going to be used to try and advertise to us. We talked earlier this week about, like, the model that they're developing in the background, where they're like, we're moving from the attention economy to the intention economy, where now we're going to sell your intention. We're going to sell your free will to marketers. So basically,

(59:19):
like as you interact with an AI assistant in the background,
that AI assistant is going to be attached to a
market where people are buying and selling, like what it
suggests that you do.

Speaker 4 (59:31):
Yeah, you know, I listened to that, and my whole thing the whole time was, like, I felt better, because it's sort of this behemoth and it sounds scared.

Speaker 3 (59:38):
But then I'm going like, oh, but it's for ads, yeah.

Speaker 2 (59:41):
For ads. And also, it'll suck. Like, that sounds terrible the second people know that's what's happening, I feel like.

Speaker 4 (59:46):
I don't know, maybe disconnect your value from things for a second. If we, like, help people have, I don't know, grounded emotional bases as individuals, like, then I'm not so worried.

Speaker 2 (59:56):
Yeah, that's true, if it's just ads. Well—

Speaker 1 (59:59):
And also, it's specifically designed... or not specifically designed, but, like, the way the market and the culture around it has fallen into place, the worst people are by far the most susceptible to it. So I'm kind of not hating it. I'm just like, yeah, that's right, you should. I actually would not mind you

(01:00:20):
reaping what you sow, because good fucking god.

Speaker 2 (01:00:23):
Yeah. Well, Mo, it's been a true pleasure having you on The Daily Zeitgeist.

Speaker 3 (01:00:28):
A joy. I could gab all day.

Speaker 2 (01:00:30):
Where, where can people find you, follow you, hear you, all that good stuff?

Speaker 4 (01:00:35):
Please join me on my podcast, Worse Than You with Mo Fry Pasic. We break down sort of people's creative process, usually comedians, but also different artists: why they make their art, why it means something to them, and also just the logistical process of making things. Because, again, I'm very literal; I need people to break it down. You can also

(01:00:55):
follow me at @meauxpas, that's M-E-A-U-X P-A-S, and I just, you know... touch a flower today. Have a good day.

Speaker 2 (01:01:04):
Yeah, that's good advice. It's the best thing anyone said.

Speaker 1 (01:01:06):
On this... Fly over to the flower and just drink the whole fucking thing down. This is a hummingbird pod.

Speaker 2 (01:01:15):
We do have a large hummingbird listenership.

Speaker 3 (01:01:19):
So I heard that.

Speaker 2 (01:01:21):
Yeah. Is there a work of media you've been enjoying?

Speaker 4 (01:01:24):
Okay. So I vacillate between, like, you know, niche artists of, like, nineteen twenty, and Summer House, and I love sort of the connectivity of it. So I was trying to think of different media. But the one that prompted me was a tweet, and I just wanted to share a tweet that I think of all the time. It's from twenty twenty and it's my favorite. When everyone's talking

(01:01:46):
about, like, male-female body dynamics, or all of those. Rachel Sennott

Speaker 3 (01:01:51):
Tweeted it.

Speaker 4 (01:01:52):
Sometimes guys are like, wow, I'm intimidated by how organized you are. And I'm like, okay, well, I'm intimidated by the fact that you could kill me with your bare hands. And that comes up in my head all the time. I enjoy it every time I think of it. I feel so seen. I love it so much.

Speaker 2 (01:02:11):
That's great. Andrew, thank you so much for joining. Where can people find you? Is there a work of media

Speaker 1 (01:02:16):
You've been enjoying? Don't find me. I don't know. Andrew Ti... podcast is... leave me alone. The... I mean, the two works of media: I'm going to see Sinners a third time this week. I love that movie. But I guess the more obscure piece of media is, there is a YouTube DJ named Jolzo. She's Chinese. Let me see... she's Chinese.

(01:02:40):
Z-H-O-U, I think, but it's kind of Chinese for "ramble around," and in one of her sets she plays a bunch of Chinese city pop, which is way less well known than Japanese city pop. I'll just say, as a Chinese person, I can acknowledge it's less good, but it's there. But her sets are great. She's very, very, like, you know, chill vibes to groove out to

(01:03:03):
kind of business. And, I don't know, I've been enjoying it. So... I'm fucking old, so this was a real revelation for me. I was like, oh, people are doing

Speaker 2 (01:03:11):
Cool DJ sets on... Oh my goodness, this cool DJ set.

Speaker 1 (01:03:17):
I remember these songs.

Speaker 2 (01:03:21):
You can find me on Twitter at Jack underscore O'Brien, on Blue Sky at Jack O B the number one. Yeah.

Speaker 1 (01:03:31):
DC

Speaker 2 (01:03:31):
Pierson took a picture of a rack at the grocery store, and on one shelf was National Geographic: Pope Francis, A Life of Service, and the next one was Time magazine: Hello Kitty, The Power of Cuteness. And he said, sucks they never got

Speaker 1 (01:03:49):
To collab.

Speaker 3 (01:03:52):
That's so, that's so true.

Speaker 2 (01:03:55):
It still would have been good. You can find us on Twitter and Blue Sky at Daily Zeitgeist. We're at
the Daily Zeitgeist on Instagram. You can click on this
episode wherever you're listening to it and go to the description,
and there you will find the footnotes, which is where
we link off to the information that we talked about

(01:04:16):
in today's episode. We also link off to a song that we think you might enjoy. Super producer Justin, is there a song that you think the people might enjoy?

Speaker 5 (01:04:25):
Yeah, I recently came across this track that I think I can only describe as, like, a warm summer romance between your ears. It's so relaxing. There's this nice little pleasant whistle in between these guitar notes, and the vocals are so smooth.

Speaker 3 (01:04:45):
I just... it really sends me into a relaxing vibe.

Speaker 5 (01:04:49):
So this is "Indeed" by Cruza, and you can find that in the footnotes.

Speaker 2 (01:04:55):
The Daily Zeitgeist is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. That's going to do it for us this morning. We're back this afternoon to tell you what's trending, and we'll talk to you all then. Bye.

Speaker 4 (01:05:10):
The Daily Zeitgeist is executive produced by Catherine Law, co-produced by Bee Wang, co-

Speaker 2 (01:05:15):
produced by Victor Wright, co-written by J.M. McNabb, edited and engineered by Justin Connor.
