Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hello the Internet, and welcome to this episode of the
Weekly Zeitgeist. These are some of our favorite segments from
this week, all edited together into one nonstop infotainment laughstravaganza.
Uh yeah, So, without further ado, here is the Weekly Zeitgeist.
Speaker 2 (00:25):
We have one of our favorite guests of all time,
one of the listener's favorite guests of all time.
Speaker 1 (00:31):
This is a man who you.
Speaker 2 (00:33):
Know, has has really inspired the imaginations of people across
the country with his YouTube searches and his niche interests
and obviously his love of cold brew. The Poetry Window
will consider that shit open because we are welcoming in our third seat, our guest today, mister Chris Crofton.
Speaker 3 (00:54):
What's up, Chris? Welcome, Chris Crofton, to the Daily Zeitgeist. Okay, such a lovely place, such a lovely place. I've been through the desert on a horse named Chris Crofton. Good to be on the Zeitgeist. Love that. Those AKAs are brought to you by me.
Speaker 2 (01:15):
Wow. What, the Eagles and then Neil Young?
Speaker 3 (01:20):
I know, two champions, two champions of the white community.
Speaker 4 (01:26):
Time.
Speaker 1 (01:27):
I did not know that.
Speaker 2 (01:29):
I didn't know what he did. I didn't know. I thought Horse with No Name was a Neil Young song, but it's not. It's some dudes who were trying to be like Neil, I guess. What, what dudes?
Speaker 3 (01:39):
Yeah, some dudes trying to sound like Neil Young. America.
Speaker 2 (01:44):
And then you recognize, obviously, part of Jack's AKA, Not Like Us, because, like, I was quizzing you before, you have intersected with the Kendrick Lamar beef-ending track.
Speaker 1 (01:55):
Not Like Us.
Speaker 3 (01:56):
Yeah, and I cannot I can't believe. I'm just I'm
just humble as a white man to know about it.
Speaker 1 (02:02):
Yeah, well, you know, and I and I.
Speaker 3 (02:04):
And it's just a testament to how far that has gone that a man my age of my complexion knows about this fucking beef.
Speaker 2 (02:13):
You, Chris, was it like a thing where you're like, what? Why does everybody keep talking about minors and stuff? What the fuck is going on? And then you just kind of naturally figured it out, or you were on YouTube enough that you figured it out.
Speaker 3 (02:23):
I just fucking I'm online, yeah, way too much. And
so I know about a little bit about Drake being
accused of that sort of stuff, and then I but
I didn't know, I mean obviously, and I know Kendrick
Lamar a little bit, but so I don't exactly know.
Speaker 2 (02:40):
I don't know that. I mean, I know I've heard
the song.
Speaker 1 (02:42):
I've heard the song.
Speaker 3 (02:43):
I've heard the song. I just don't know exactly what
is going on or Yeah that's fine. But I think
that as far as I can tell, it was a
big win for Kendrick yeah yeah, yeah.
Speaker 2 (02:53):
And a big loss for Drake. Yeah yeah, but I
think he'll be back because Drake's like Avengers movies for
the music indust like they need they need him to
generate hits. But we'll see. The brand is pretty fucked
up though at the moment. So I mean I say this,
like Drake is gonna sell records, yeah, yeah.
Speaker 5 (03:12):
When he makes a song, and especially if it's a hit,
it's gonna give radio play.
Speaker 1 (03:16):
Yeah.
Speaker 2 (03:17):
But he's getting the rap community, Yeah, that nigga is.
Speaker 1 (03:21):
Yeah he is.
Speaker 2 (03:24):
People will like forever.
Speaker 1 (03:25):
Yeah.
Speaker 2 (03:25):
People are not looking at him. He will be viewed
as like a right to like hip hop.
Speaker 1 (03:31):
Fans right for sure.
Speaker 2 (03:32):
For sure, capital age Drake fans, capitol age hip hop fans. Exactly.
Speaker 1 (03:37):
What is something from your search histories that's revealing about
who you are? Alex? You want to kick us off.
Speaker 6 (03:44):
Oh gosh, okay, I don't... The thing is, I don't think so. I use DuckDuckGo, and so it doesn't actually keep the search history. And if I actually look at my Google history, it's actually going to be really shameful. It's gonna be me, like, searching my own name to see, like, if people are, like, shit talking to me online.
Speaker 1 (04:04):
Now, this is just how we tell if someone's honest, is if they actually give that answer. We're like, okay.
Speaker 6 (04:09):
So yeah, you actually search yourself.
Speaker 1 (04:11):
Yeah.
Speaker 6 (04:12):
Actually, but I think the last thing I actually searched was, like, queer barbers in the Bay Area, because I haven't had a haircut in like a year, and I think I need to trim up or, you know, air out the sides of my head for Pride Month. So that's, yeah, that's the last thing I searched.
Speaker 1 (04:32):
What, are you going full shaved on the sides?
Speaker 6 (04:35):
Or maybe trim it a little bit and up the
back and bring out the curls a little bit.
Speaker 1 (04:40):
So okay, love it? Yeah on board.
Speaker 2 (04:43):
I wish I could get a few more days in.
Speaker 6 (04:49):
Look, in July, are there, like, discounts? Like, after Valentine's Day, do I get an undercut at fifty percent off?
Speaker 1 (04:58):
Now?
Speaker 2 (04:58):
Right?
Speaker 1 (04:59):
Exactly? Emily, how about you? What's something from your search history?
Speaker 7 (05:03):
So forgive the poor pronunciation of this and the rest
of the story because Spanish is not one of my languages.
But champurrado. Oh yeah, it's something I searched in stress.
Speaker 2 (05:12):
Yeah.
Speaker 7 (05:12):
So I was in Mexico City for a conference last
week and at one of the coffee breaks, they had
coffee and decaf coffee, and then they had.
Speaker 1 (05:24):
And the Spanish pronunciation? Give us a bit of that.
Speaker 2 (05:29):
What do you see when you see that?
Speaker 1 (05:30):
Champurrado? Mexican hot chocolate.
Speaker 2 (05:34):
So yeah, your literal Google results.
Speaker 7 (05:39):
So the labels all had, like, translations into English, and so it was champurrado with Oaxacan chocolate. Like, yeah, I got that. What's champurrado? And so I look it
up because I want to know what I'm consuming before
I consume it. And it's basically a corn flour based
thick drink like chocolate corn soup.
Speaker 2 (05:57):
It was amazing, chocolate corn soup.
Speaker 1 (06:00):
You had me until chocolate corn soup.
Speaker 6 (06:04):
The corn is just a thickener, yeah, in a chocolate drink.
Speaker 7 (06:09):
Yeah, slight corn flavor, like think corn tortilla, not on
the cob.
Speaker 2 (06:14):
Yeah yeah, oh yeah, yeah, yeah that sounds amazing.
Speaker 8 (06:17):
Yeah, it was really good.
Speaker 1 (06:19):
I love some corn flakes in a chocolate bar. Yeah,
corn chocolate. There you go, Yeah, you got it.
Speaker 2 (06:25):
You gotta arrive in your own way back with the
corn chocolate.
Speaker 7 (06:31):
It was really good and just awesome that it was there.
Like you know, the the coffee breaks had like the
Mexican sweetbreads and stuff like that, but otherwise it was
pretty standard like coffee break stuff, and all of a sudden,
there's this wonderful mystery drink.
Speaker 8 (06:42):
The big urns. It was lovely.
Speaker 1 (06:43):
That sounds great. What is something you think is underrated? Emily?
Speaker 8 (06:48):
I think Seattle's weather is underrated.
Speaker 7 (06:51):
Oh yeah, everyone makes fun of our weather, and like, you know, fine, believe that. We don't need lots of people coming here. And it's true it gets dark in
the winter, but like almost any day, you can be
outside and you are not in physical danger because you
are outside.
Speaker 6 (07:05):
I guess that's that's I mean, if you're going for yeah,
that's interesting, but I mean it's I mean, the winters
are just so punishing though it's so gray.
Speaker 8 (07:15):
It's dark.
Speaker 2 (07:16):
But the weather... it's dark, it looks like shit, but experientially it's not bad for you. I mean, yeah, I know. It doesn't get all gloomy in the summer, I imagine, right? You have wonderful blue skies and you can enjoy.
Speaker 8 (07:34):
The gorgeous fire season aside.
Speaker 7 (07:37):
But yeah, from sort of mid October to early January,
it can be pretty like it's gray and so like
when the sun is technically above the horizon, it's.
Speaker 8 (07:46):
A little hard to tell.
Speaker 7 (07:47):
Yeah, right, right. So, but, you know, compared to, like, Chicago, where you have maybe four livable weeks a year.
Speaker 8 (07:55):
Between the too hot and the too cold.
Speaker 6 (07:56):
Wow, wow, I'll do that. Because my thing was going
to be because I was just there and I was
going to say, my answer was going to be, Chicago
is the best American city. I stand on this, like one, no,
absolutely not true. No, I'll even deal. I'll even deal
(08:19):
with I'll deal with the winter. I mean if I okay,
I'll be honest, if I didn't, you know, if the
weather in Chicago, if if I could bring Bay Area
weather to Chicago, I would live in Chicago. I mean
there's other reasons, but I mean it's it's look, the vibes,
immaculate street festivals, the neighborhoods. It's the one place that's
(08:43):
probably the food. It's still comparatively affordable compared to the coasts.
Radical history, you know, just you know, some of the
best politics. Yeah, you know, I would say they.
Speaker 1 (08:59):
They shot The Fugitive there, oh.
Speaker 6 (09:03):
I did, that's a deep cut. Yeah, I mean, I think they've shot a lot of Batman movies there, because, you know, the iconic kind of Lower Wacker Drive, as they call it. And it's, yeah, yeah, that's a pretty great city.
Speaker 7 (09:18):
Crappy weather, right? If you're gonna dump on weather somewhere.
Everyone makes fun of Seattle's weather.
Speaker 6 (09:24):
Honestly, Emily, this is a hot take. I'd rather take
Chicago's weather than Seattle's weather. I can't I can't do gray.
I can do I.
Speaker 1 (09:34):
Feel like I'm on crossfire.
Speaker 6 (09:36):
I can do frigid, I cannot do gray. It's super... Well.
Speaker 7 (09:41):
This is why I say, like, don't move to Seattle
if you can't handle our weather. Like the people who
move here and then complain about the weather, what.
Speaker 1 (09:46):
Do you expect all of this?
Speaker 2 (09:48):
What they say is true about it being gray. Like, I'd expect it to be that gray, right? I think people talk about it like that.
Speaker 1 (09:58):
What is something you think is overrated?
Speaker 2 (10:01):
I'm going Tesla right now, y'all. I just know you
guys driving I finally.
Speaker 9 (10:07):
Like rode in a Tesla and really paid attention to
the feeling of riding in a Tesla car. And I
gotta say, if you're gonna pay that much money for
a car, it's got to not feel like a weird
golf cart that doesn't have any smooth ride.
Speaker 1 (10:21):
It sucks.
Speaker 2 (10:22):
It's a... it is not a smooth ride. I would much rather be in, like, a nineteen ninety-eight Buick LeSabre if I want a smooth ride. But those cars suck to ride in, I'm saying. I
remember the first time I got in one, I was
so underwhelmed, like it was weird. I had built up
Tesla's in my mind like fucking crazy, and I remember
(10:44):
someone I knew, like, their partner drove one and, like, picked us up to go somewhere. And first I fucking embarrassed myself because I didn't know how the fucking door handle worked. Yeah, you can't get it. I was, like, rubbing it. The guy's like, you gotta push it and then it comes out. I was like, all right, and then I was immediately, like, fuck this door handle. And then
I got in and then like everything kind of felt
(11:04):
like like not substantial, Like when I pulled the door thing,
I was like, is this like just PVC pipe? Like
the field and synthetic leather. Everything feels very just yeah,
not substantial, transitory something like they're just you feel like, yes,
it's fast, and if it goes too fast, the car
might just kind of like fall apart around you, like
(11:25):
a cybertruck where the paneling will just turn it like
into an air fin and bend backwards on it. And
I feel like we don't even need to talk about those.
It's just like, you know, if you drive a cyber truck.
That's my honestly, that's like my favorite new Like I
was talking about this on the show the other day.
My new favorite, like, form of schadenfreude is watching the
people with their cyber trucks, like, I can't... the fuck
(11:45):
is wrong with my car? My steering wheel looks like a little Batmobile thing. Like, my insurance company won't insure it.
Speaker 9 (11:52):
I think it's actually fully worthwhile to like. My new
thing with cyber trucks is if I see one, I
actually turn and point and laugh and see and see
if I can ever get the person to like be like.
Speaker 1 (12:02):
Man, man, we're leaving at Yeah, well yeah it is
pretty cool. Uh wow, why are you convulsing?
Speaker 2 (12:09):
I don't know, but yeah, I don't know.
Speaker 9 (12:10):
And that you know, obviously the elon thing is is
is hard to swallow, and I did. There was a
time when I was like, yeah, ev is so cool.
And now I'm just like, give us a train. Somebody,
give us a train. Somebody give us a fast, cool
train like they have in Europe or Japan or whatever,
so that I can go somewhere and not have to drive,
and also not have my car drive me. I don't
(12:31):
really want that to you.
Speaker 1 (12:32):
I see your train, and I raise you a train tunnel that you can drive in, see, Jack.
Speaker 2 (12:39):
This is what we don't want.
Speaker 1 (12:41):
In this house. We believe the Boring Company is the future.
And I'm just picture.
Speaker 9 (12:46):
I'm just picturing your yard with all of your missile signs.
It's so many signs.
Speaker 1 (12:51):
So many signs, and that guy keeps shooting lost in yeah,
what, what's something you think is underrated?
Speaker 2 (13:00):
I don't know.
Speaker 5 (13:02):
Can I ask my daughter?
Speaker 2 (13:03):
She don't give the Yeah, okay, she don't give from
I won't even okay.
Speaker 5 (13:09):
Give me something that's overrated? Kids?
Speaker 1 (13:13):
Kids? Okay. What's underrated? Abortions. There you go.
Speaker 2 (13:23):
A juxtaposition: abortions underrated, kids overrated. Oh wow, is your daughter a comedian? She came.
Speaker 1 (13:40):
She was quick with that, she she.
Speaker 3 (13:43):
Writes on the show.
Speaker 5 (13:43):
I tell her all the time she needs to be a damn comedian.
Speaker 1 (13:46):
But she's okay, yeah, right when I.
Speaker 5 (13:52):
I can, I can, let me tell you why I say they're overrated: because you don't get the tax break you used to get for kids.
Speaker 1 (14:02):
No?
Speaker 5 (14:02):
What's his name, Trump, changed that. Yeah, we don't get those great tax breaks, the tax break you used to get for being poor. You don't get them anymore. So they're so overrated. You used to get earned income credit. It takes a lot to get earned income credit. Y'all don't know what the hell I'm talking about, because you ain't never needed a tax break. So, but they used to give you a
(14:25):
shit ton of money per child. They don't do that anymore. Right? So, yeah, I used to tell my kids back in the day, I said, when you're eighteen, I don't get an income tax return for you, so that means we're done.
Speaker 2 (14:37):
Right, you have no monetary value. No, I mean, yeah, I just had my first child, and I was like, I can't wait to see my taxes. And I was like, what? It's like I didn't even have... Yeah, I was like, this was the reality I was promised. But no, no, no, you said it was purely an investment. You said, exactly, it
Speaker 5 (14:59):
Was a on an investment, as you put it in
the Mama.
Speaker 1 (15:02):
That was it, very businesslike. He's very business.
Speaker 5 (15:07):
And the crazy part is you don't know how they
go turn out.
Speaker 2 (15:10):
Right now, I got a few right here.
Speaker 5 (15:13):
I have a few kids right here, and be like
I kept you.
Speaker 1 (15:20):
Yeah, but you gotta love him the same. You gotta
love him the same. Yeah.
Speaker 5 (15:23):
I love a lot of my kids. But let me say this as a parent, because a lot of parents live the same life: everybody got a favorite kid. Now, that doesn't mean, that doesn't mean that a mother or father don't love everybody. I have a favorite kid. I love the rest of y'all, but this right here, it's my favorite.
Speaker 1 (15:39):
And do you tell them though, yeah, you two my favorite. Wait,
why why is you bug your favorite?
Speaker 5 (15:49):
I think I think because it was my last week,
it was ten to two. He's just so sweet. He's
my baby. He's twenty eight three. But like that, when
I'm just talking.
Speaker 10 (15:59):
To I'm curious.
Speaker 2 (16:12):
Yeah. See, I'm contemplating another child, maybe down the road. And that's my fear, is that I would immediately, like, start comparing them, be like, oh man, this one ain't shit compared to the other one. Not like in an
aggressive way, but that just merely by having multiple kids,
you have the ability to sort of compare and contrast
and like, and then from there you are kind of like, yeah,
(16:33):
maybe I like the other one better, or maybe I
like this one better.
Speaker 5 (16:36):
Everybody likes one, man, everybody. But you get fake-ass parents that say, oh my god, I love all of my kids.
Speaker 2 (16:41):
You don't.
Speaker 5 (16:42):
One of them probably smokes dope. You can't tell me if you got a crackhead kid that that's your favorite. I ain't gonna say you don't love him, and you gonna do anything you can to get him off dope, but that ain't your fucking favorite.
Speaker 1 (16:54):
Yeah, well, people won't tell the truth.
Speaker 5 (16:57):
My oldest used to be a handful and she's straightening up.
Speaker 3 (16:59):
Now.
Speaker 1 (17:00):
Yeah, I have two kids, but they're like really close
in age, so it's like back and forth and sometimes
I can't tell them apart, but they're yeah, back and
forth between who the favorite is? Yeah, back and forth
between who the favorite is. But the younger one really, you know, he's still, like, sweet most of the time, whereas the older one's starting to...
(17:21):
he knows what rolling his eyes means now, and that's... you never forget your first time. You're like, what the fuck is that? All right, let's, uh, let's take a quick
break and we'll come back and we'll talk about all
that cocaine that the President of the United States is
going to be snorting tonight. We'll be right back and
(17:55):
we're back and all right. So Donald Trump seems worried
about the debate that he agreed to after like saying
he wasn't gonna do a debate. He yeah, he said
he was going to do it.
Speaker 2 (18:11):
What happened? I mean, the whole thing was, these were
rules you agreed to. It's like they're gonna cut the
mics off when you're not talking. They're like, yeah, you
agreed to.
Speaker 1 (18:20):
That yeah, things as you agreed to them.
Speaker 2 (18:23):
They can't I can't bring up I can only have
a water and paper and pen up there. I can't
bring my big old books up there and other crap.
They're like, yeah, yeah, you can't have any of that.
But yeah, It's like, over the last couple of weeks
we've heard increased speculation from the right about how Joe
Biden is going to be higher than Method Man and
(18:43):
Redman on the Blackout album, and on Thursday, this
is gonna happen on Thursday night when they're scheduled to
debate for the first time since obviously twenty twenty. The excuses, though,
really began trickling in over the last week, as like,
you know, Trump surrogates and people going out there doing
the news shows, and Trump himself began like lamenting again
the rules he agreed to or how mean Jake Tapper
(19:03):
has been to him in the past. And Trump has
good reason, I think, to not be excited about sharing
the stage because you know, not that poll numbers really matter,
but every time he debates a Democrat, like in twenty
sixteen and twenty twenty, his numbers dip, when people were just like, that juxtaposition's a little odd for some people. Like, this person's speaking complete sentences and this guy was
(19:23):
stalking a woman around the stage. And while, again, polling isn't the whole game, it's just like the last time,
twenty twenty, that first debate with Biden and Trump, it
just wasn't a good look because Trump came off looking
like a fucking freak next to Joe Biden, who was
merely just an old man, like standing on.
Speaker 1 (19:41):
Stage like he did when there were like primary debates
for the Democratic primary, like in twenty twenty. Biden was
not like good, No, he's not like a good debater, No, no, yeah, no.
Speaker 9 (19:54):
My question is like who comes to these debates going like,
you know what, I've I've stayed pretty independent up to
this point, and I'm just going to see what these
two gentlemen.
Speaker 1 (20:03):
Have to say.
Speaker 9 (20:04):
Man, Trump guys all about it, and I'm just going
to base it on the issues. You read their websites and they seem like they have some differences.
Speaker 2 (20:12):
I'm a Democrat, but I don't know, man, I just didn't.
I didn't really see Trump do his thing like that before.
I'm kind of into it now.
Speaker 9 (20:19):
I'm kind of I'm kind of interested in now, and
I think it is like disingenuous, right, Like, I mean,
do you know people that are like, I don't know, man,
I just don't know if I can hold my nose
and vote for Biden. And it's like this, this is
a conversation where I'm immediately like, Okay, well then I
just don't. I don't have a lot of patience for it.
Speaker 1 (20:36):
You know.
Speaker 9 (20:36):
It's like turning that thing into, like, a single-issue vote, where, you know, I'm like, whatever your issue with Biden is, I beg you to show me how Trump is a better version of your beliefs.
Speaker 1 (20:51):
Yeah.
Speaker 2 (20:51):
I mean, it's just so hard because you're like, especially
with the Biden stuff, so many people are contending with
the anger of how the two party system just like
forces you to be like, obviously I don't want Trump
to be president totally, but Biden is completely unresponsive to
anything that like matters, and what the fuck is this?
But again, both parties just trade off being the bad guy so then the other one can raise funds
(21:13):
and then, you know, they do the merry-go-round. But
it's clear Trump is still fucking hooked on doing freestyle jazz,
talking up there, just flowing on some stream of consciousness,
word association shit. And on Saturday in Philadelphia, I don't
know if you saw that like epic rant he had
about water and the sinks and shit like that. A
lot of like he's he's definitely in his water phase
(21:35):
between like the batteries and the boats and the sharks
and like dishwashers, which he also, period. Yeah, this is his water era.
Speaker 1 (21:44):
And that's something that, like, young children go through. Like, age, like, three, you go through your water period, where, like, water is the coolest thing in the world, and you have, like, your little water tables and you keep playing.
Speaker 2 (21:54):
With water exactly. He's in his water period at the moment.
And this is... a lot of you were talking about this, but just to give you a taste, like, this is how the guy is talking when he's just talking. So try and, like, imagine this on a debate stage.
Speaker 4 (22:09):
No water in your faucets. You ever try buying a new home? You turn it on, they have restrictors in there. You want to wash your hair or you want to wash your hands, you turn on the water and it goes drip, drip. The soap, you can't get it off your hands, so you keep it running for about ten times.
Speaker 1 (22:24):
We'll get you try. The worst is your hair.
Speaker 11 (22:28):
I have this beautiful, luxuriant.
Speaker 1 (22:30):
Hair, luxuriant, and I put stuff on. I put it
in hair.
Speaker 4 (22:36):
I like lots of lather because I like it to
come out extremely dry, because it seems to be slightly
thicker that way.
Speaker 2 (22:44):
What the... anyway. So this is, like, a lot of people... like, he's rambling, he's talking. He's trying to talk about, like, water restrictors and shower heads, like this is the thing he's talked about before. But again, he starts off trying to make some point about, like, what about our water, and it turns into, I like my hair real dry,
(23:04):
take care of that. He just freestyles, man. He riffs. He's the king of riffs.
Speaker 1 (23:11):
And a rant about how our toilets can't choke down his giant shits can't be far behind. It's just a prediction.
Speaker 2 (23:17):
Oh yeah, yeah, yeah, we're talking soon, we got to
be talking toilets. Like again, we're in the water phase,
so yeah, something with something, something aquatic will turn up.
But prior to that performance of ranting, he had an
interview with some right wing blogger and said that they're like, uh,
you got this. The guy was asking, like, you got
this debate coming up? It's pretty intense, and he's like, yeah,
(23:37):
I'm getting ready. Just listen. This is what Trump was saying,
like how he's fucking preparing for the debate.
Speaker 1 (23:43):
Being interviewed by a guy completely bald with a beard.
Just in case that wasn't that's probably clear. I probably
don't need to.
Speaker 2 (23:50):
Yeah, that's what a right-wing blogger dude looks like.
Speaker 12 (23:52):
Yeah, Joe Biden at Camp David, as you and I
stand here, your debate is Thursday with him, no audience,
CNN controls the mics, Dana Bash, Jake Tapper.
Speaker 2 (24:03):
How do you feel about that matchup?
Speaker 11 (24:05):
Well, it's probably one on three, and I've been doing this for a long time, though. We'll handle that. And people say, how are you preparing? I'm preparing by taking questions from you and others, if you think about it. So, but I'm prepared by dealing with you. You're tougher than all of them.
Speaker 12 (24:22):
Well, it is a real pleasure to be here, sir. I know you've got a lot of fans waiting, so we welcome you to town, sir, and thank you so much for your time.
Speaker 11 (24:29):
You've been a great friend. Thank you very much, Chris
appreciate it.
Speaker 1 (24:31):
Thank you, thank you much for that.
Speaker 2 (24:33):
Guy has, like, tears in his eyes. Yeah, I always love a name check. Yeah, he got his name right.
Speaker 1 (24:39):
But yeah, Chris. Yeah, my best friend in this world.
Speaker 2 (24:44):
Now, this is where you know, I think most people
become very skeptical, because if you go off the answer of what he said his debate prep was like, it's, like, talking to people like you. I'm taking questions, so in a way, that's preparing, isn't it? And that
sounds like you're not preparing at all because you're not
going to go on the debate stage. And if you
are going on the debate stage, and your version of
(25:05):
preparing is just like completely whiffing on softballs from sycophants,
and that's your preparation for the debate of your life.
I'm... again, it'll just be straight-up chaos, because he's obviously gonna be getting a ton of questions about all his bullshit, from felonies to January sixth, to asking about E. Jean Carroll, where he may fucking owe her,
(25:26):
like, millions of dollars again by opening his mouth, to the RICO charges, his fucking classified documents. And he's been talking to a guy who would never be like, you know, why did you really have those classified documents? That's not gonna happen. He's not preparing anyway. That's why I think now we see that there's this, like, reason that's emerging
(25:48):
from the right, which is coming from a lot of people,
including Trump, which is Joe Biden is on fucking crank
and there's no way he can debate a guy who's
on fucking speed, even though Trump, Joe.
Speaker 9 (26:00):
Biden is on speed. I mean, this is my prayers
that Joe Biden does some speed before this debate, because
I mean, like, as long as we get Joe Biden
like talking fast and walking quick, I think we win
this thing.
Speaker 2 (26:12):
And like, hey, hand me that computer monitor, Jack, and a screwdriver. Yeah, see what's going on. I'm kind of fucking amped, man. We put Joe Biden, we put Joe Biden on some meth, and he will win the debate and he'll steal a bunch of copper piping out of the... I
Speaker 1 (26:30):
Took the bike apart and I took the spokes out
of the wheels, but he used to put the spokes back.
Speaker 9 (26:37):
Look, Jack, anybody can do this. Yeah, it is weird, too, how Trump always stages it. Like, I mean, even that clip was like a wrestling thing. Like, it did feel like, well, it's three on one this weekend.
Speaker 1 (26:47):
They were standing in front of a giant American flag,
like doing the standing interviews.
Speaker 2 (26:51):
Yeah, camera and something, take the mic and go direct
a camera. But yeah, Ronny Jackson, aka fucking Doctor Feelgood, the old White House doctor who has had everybody pilled up in.
Speaker 1 (27:06):
Both administrations by the way, also yeah, the Obama administration. Yeah.
Speaker 2 (27:09):
He submitted a letter as a congressman that said, quote (this is to Joe Biden), I demand that you submit to a clinically validated drug test in order to reassure the American people that you are mentally fit to serve as president and not relying on performance-enhancing drugs to help you with your debate performance, unquote.
Speaker 1 (27:29):
Demand? Who is he, the Queen of England?
Speaker 2 (27:31):
Yeah, no. And again, like you're saying, Zach, this is a guy whose time in the White House was described as, quote, awash in speed. Yes, yeah, like everything they said. Apparently this was staffers popping pills and washing them down with alcohol, in large part due to Jackson's leadership as chief medical advisor. Common pill requests included
(27:51):
modafinil, Adderall, fentanyl, morphine, and ketamine, according to a Pentagon report released in January, but other unlisted drugs such as Xanax were equally easy to come by from the White House Medical Unit.
Speaker 1 (28:02):
It really takes a step up at the fentanyl and morphine. It's like, yeah, wait, what? For what? The other ones are like, yeah, that's what I'd expect the White House running on. But, like, fentanyl? Right? Yeah, it's interesting.
I mean we always talk about how his instinct is
(28:23):
always to accuse the other people of doing the thing
he's doing. And he seems so high when he's up
on the stage, like just the way he's just rambling
from one thing to the other and just talking about
how luxuriant his hair is. Like, it feels like he's on, like, ecstasy or something.
Speaker 2 (28:41):
Yeah, and it just feels like a guy who knows, like, I guess I gotta talk for an hour straight, so I'm just gonna talk about whatever the fuck I want.
Speaker 1 (28:49):
Like, you know, he doesn't have to, but he has to. Exactly, the only thing that fills the sucking void.
Speaker 2 (28:57):
A good campaign ad for Joe Biden should just be
taking Trump transcripts and having someone read back transcripts of
Trump to potential voters and being like, so, just a
quick thing when you hear Trump say, you know, because
if there's a star in the crowd, you know, their
cameras on my head, the back of the whole time, cameras.
(29:18):
They're the best. Think about the seats, this is a
beautiful crowd, and how we're going to get the water,
and then just be like, so what do you think
about that? What do you mean? Just and just get
the reaction. That should be the whole thing.
Speaker 1 (29:31):
I think it's great. I think it's I think it's awesome.
Speaker 2 (29:34):
Uh-huh. And what was he saying?
Speaker 1 (29:36):
I don't know, man, I fucking love cameras. Man, They're
like magic. Just don't get them wet or near a magnet,
you know.
Speaker 2 (29:42):
Yeah, yeah, shark'll be by that camera.
Speaker 1 (29:45):
I do think though, like this is this happens every
debate where especially the Republicans. This seems to be like
a piece of accepted wisdom among Republicans, that you really
need to aggressively manage expectations, and that that does tend
(30:06):
to work. That's why I'm just like, is the
mainstream media just falling for the same bullshit
that they fall for every time? Where, like, Trump's like,
Biden's one of the great debaters of our time, and
he killed. Like at the time, I guess he said,
like remember when Biden debated Paul Ryan, and everyone was like,
Biden's gonna get fucking killed, and then Biden like did fine,
(30:29):
held his own against Paul Ryan, which in retrospect not
that impressive. Paul Ryan's a fucking dipshit, but that like
he did better than expectations. So now Trump's like, this
guy's one of the great debaters of all time and
he's gonna be so good, he's gonna be flying on PEDs
up there and then he's gonna show up and like
have the expectations set where he wants them. So I'm
(30:50):
a little, I'm a little like, I don't know, he'll
probably show up. Like, it would be such a bad
look for him not to show up. I don't
know, he's giving, you know, because this is one
of those, "but he's gonna be on drugs."
Speaker 2 (31:01):
How could I debate somebody? And then he can just be
like, I'm not talking to that speed freak.
Speaker 1 (31:05):
Yeah he won't.
Speaker 2 (31:06):
He won't take a drug test, and I'm not gonna
play like you know, it would.
Speaker 1 (31:09):
Be a real bad look. It was like, I hope
he doesn't show up because that seems like a terrible look.
Speaker 2 (31:15):
Is there even? Like because even in this version, right,
even if he shows up and completely shits the bed,
figuratively or literally, yeah, no one's gonna, whatever, it's
gonna be like, whatever, you know what I mean? It's
hard to know, because he's
trying to be like, I'm not losing anybody, you know
what I mean? So like, what do I have to
lose if I don't even go up there? But again,
(31:37):
I know he wants to start windmilling about the
fucking, like, immigrants-killing-people angle, and that's
gonna be a moment for him to sort of, you know,
try and press Biden on something like that. But I
don't know. At the end of the day based on
like how I don't know, just he just seems very
like he's just not into it. But look, we don't
(31:58):
fucking know.
Speaker 13 (31:58):
But I feel like maybe the debate polls
might be like a thing with the Pepsi
taste tests, where, you know, Pepsi would win taste tests
when it was like a little sip of Pepsi versus
a little sip of Coke, but like you can't drink
a whole glass of Pepsi without your teeth falling out, feeling.
Speaker 1 (32:21):
Like they're vibrating. Like I just feel like you're testing
for different things. And like, he always successfully, like,
makes it horribly ugly in any debate he's in, Like
I'd never leave the debate being like, well he just
got his ass kicked, you know, it's always so I
just feel like some of this is like people like
(32:44):
wishful thinking that he's not going to show up, or that
he's going to show up and just, like, suck. I
don't know, I feel like it could go the other
direction pretty easily. Not that, like, this is just
also me, like this is the same comportment I take
into my sports fandom, where I'm like, we suck, we're
gonna lose by forty points. But it does feel like, I
(33:06):
don't know, it could go badly for Biden, given
what we've seen of him speaking extemporaneously.
Speaker 2 (33:13):
Yeah, no mistake, they're both. No, I don't know who
the favorite is going into this, because just as easily
Trump can just suck all the fucking air out of
the room and just keep saying these same things.
And then Biden's probably like, I need a nap. Who knows
like what the fuck's gonna happen.
Speaker 1 (33:30):
What if he asks for an actual nap?
Speaker 2 (33:34):
Oh god, Okay, he's like time out, man, time out?
Speaker 1 (33:38):
Can we get like it?
Speaker 2 (33:40):
I need a nap and a caramel?
Speaker 1 (33:41):
Yeah?
Speaker 9 (33:42):
Yeah, yeah, I mean maybe Biden, maybe the
best plan for Biden is, uh, yeah, let Trump
talk more, and also get on some performance
enhancing drugs that make him like super ripped. I mean, like,
can we get him on HGH at this point or
something so that, yeah, he looks like his best self?
Speaker 2 (34:03):
How quickly can he look like a light heavyweight MMA
fighter physically?
Speaker 1 (34:07):
Yeah?
Speaker 2 (34:07):
And I'm sure Joe Rogan's got some tips.
So like, let's get him just shredded
for this one.
Speaker 1 (34:14):
Get his organs to grow, and that would actually be
the one thing that Trump would respond to because as we.
Speaker 2 (34:21):
Know, like he's terrified. Oh my god, guy.
Speaker 1 (34:24):
Came out there with arms like Christmas hams.
Speaker 2 (34:27):
He's wearing a smaller suit jacket, isn't he? They're bulging
out of the sleeves. No, no, no, he
looks like right out of Central Casting. Trump loves
Central Casting. He does. If you get a super strong president,
Trump's gonna like it.
Speaker 1 (34:40):
And these guys, they have big muscles, maybe not so
much down here and here, but up.
Speaker 2 (34:45):
Here up here, Yeah, huge brains.
Speaker 1 (34:48):
All right, let's uh, let's take a quick break. We'll
come back. We'll talk a little pop culture. We'll be
right back. And we're back. We're back, We're back. And
(35:08):
on your show, you've done some good stuff on just
the surveillance side of AI, which, I mean, it turns
out a lot of the technology that we initially thought
was promising was just eventually used for the purposes of
marketing and surveillance in the end. And it seems like
(35:29):
AI skipped all the promising stuff, and it's just like,
what if we just went right to
the harming people?
Speaker 6 (35:39):
Yeah, I will say that, kind of. I
mean, you had mentioned that this term AI is kind
of being used loosey-goosey, and you know, I mean,
now AI is kind of synonymous with large language
models and image generators. But you know, things that have
been called AI also encompass things like biometrics, surveillance,
(36:04):
like different systems which use this technology called quote-unquote
machine learning, which is kind of this large-scale
pattern recognition.
Speaker 2 (36:15):
So a lot of.
Speaker 6 (36:16):
It's being used, especially at the border, doing things
like trying to detect and verify identities by voices or by faces.
You probably see this if you've been in the airport.
The TSA has been using this and you can still
voluntarily opt out for now, but they're really incentivizing it.
(36:36):
I saw that TSA has this touchless thing now, which
is this facial recognition, so you don't have to present
your ID, you can just scan your face and go and.
Speaker 7 (36:46):
Like, don't do that. Take every option to opt out.
And the fact that those signs are there saying
that this is optional, was it Penny? Somebody? Actually, Petty? Yeah, yeah.
The only reason we have those signs is because of
her activism, saying like, this has to be clear to
the travelers, that it's actually optional and you can opt out,
so it's posted there that you.
Speaker 8 (37:05):
Don't have to do this.
Speaker 1 (37:07):
Yeah, all right, then I'm gonna feel you up. Sorry,
those are just the rules.
Speaker 6 (37:11):
Yeah, it's absolutely... but I mean, it gets,
you know, leveraged against people who fly to
a lesser degree. But folks who are refugees
or asylees, you know, I mean, people on the move, really
encounter this stuff in incredibly violent ways. You know, they do
things like, they take their blood and say,
(37:34):
well, we're gonna, you know,
sequence your genome and see if you're actually from the country
you say you're from. Which, first, it's pseudoscience. I
mean, basically all biologists have been like, you can't use
this to determine if someone is X, Y, Z nationality,
because nationalities are, one, political entities, they're not biological ones,
(37:58):
and so like, we can sort of pinpoint you to
a region, but it says nothing
about the political borders of a country. There's a great
book I started reading by Petra Molnar, which is called
The Walls Have Eyes, which is about this kind of
intense surveillance state, or intense surveillance architecture. You know, it's
(38:22):
being used, you know, typically at the border, the
US-Mexico border, but also, you know, the various
points of entry in Europe where African migrants are fleeing to,
you know, fleeing places like Sudan and Congo
and the Tigray region of Ethiopia. And
(38:44):
this is just some of the most violent kind of
stuff you can imagine, and it's way
far away from, you know, this kind of, oh, here's
like a fake little child, you know, or a Jesus
holding twelve thousand babies riding in an American truck with the
American flag on it, you know, I mean, right? So
the reality is much more stark.
Speaker 7 (39:08):
And you see that with one-to-many
image matching, so you get all these false arrests of
people, because the AI said that they matched the image
from the grainy surveillance video. And it's one of these
things where it's bad if it works, because you have
this like increased surveillance power of the state, and it's
bad if it doesn't work, because you get all these
(39:28):
false arrests. Like, it's just a bad idea, it's
just a don't. And it's not just image stuff. So
we read a while back about a situation in Germany,
I think, where asylum seekers were being vetted as to
whether or not they spoke the right language. You know,
one of the things you can do with pattern
matching is language identification: this string, what language does
(39:52):
it come from? But it was being done based on
completely inadequate data sets, by people who don't speak the languages,
who are not in a position to actually vet the
output of the machine. And so you have these folks
who are in the worst imaginable situation, like, you don't
go seeking asylum on a lark, right?
Speaker 2 (40:10):
I broke at home?
Speaker 1 (40:11):
Yeah?
Speaker 7 (40:11):
Yeah, And then they're getting denied because some algorithms said, oh,
you don't speak the language from the place you claim
to be coming from, your accent is
wrong or your variety is wrong or whatever. And the
person who's running this computer system has no way
of actually checking its output, but they believe it, and
then they get these asylum seekers turned away.
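The kind of character-level pattern matching being described here can be sketched in a few lines. This is a toy illustration only; the language names and training strings below are made up and far too small for real use, which is exactly the inadequate-data-set failure mode being discussed:

```python
from collections import Counter

def char_ngrams(text, n=3):
    """Extract overlapping character n-grams, padded with spaces."""
    text = f" {text.lower()} "
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def train_profiles(samples):
    """Build a trigram frequency profile per language.

    `samples` maps language name -> training text. Real identifiers
    train on large corpora; these tiny texts are illustrative only.
    """
    return {lang: Counter(char_ngrams(text)) for lang, text in samples.items()}

def identify(text, profiles):
    """Score text against each profile by summed trigram frequency."""
    grams = char_ngrams(text)
    scores = {
        lang: sum(profile[g] for g in grams)
        for lang, profile in profiles.items()
    }
    return max(scores, key=scores.get)

# Hypothetical miniature training data.
profiles = train_profiles({
    "english": "the quick brown fox jumps over the lazy dog and then the cat",
    "german": "der schnelle braune fuchs springt ueber den faulen hund und dann",
})

print(identify("the dog and the cat", profiles))  # -> english (with this toy data)
```

With data this sparse, a short input in any third language, or an unusual dialect of a covered one, still gets confidently assigned to one of the two profiles, which is the danger when such output gates an asylum decision.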
Speaker 2 (40:31):
Yeah, so, you know, with everything you said,
how should we feel that OpenAI recently welcomed to
their board the eighteenth director of the NSA, Paul Nakasone?
Is that bad? Or what should we take from that one?
Speaker 8 (40:49):
How should we feel? Not at all surprised.
Speaker 1 (40:51):
Right?
Speaker 7 (40:52):
How should we feel when OpenAI... it's like, okay,
whatever the rest of that sentence is, is bad.
Speaker 1 (40:56):
Yeah?
Speaker 6 (40:56):
It seems bad, man, Yeah.
Speaker 2 (41:00):
It seems like, again, we're talking this technology-
to-mass-surveillance pipeline, and who better than someone who
ran the fucking NSA? Like, and I know the way
it's being spun. It's like, you know, he was
part of Cyber Command, like he inherently knows
what the guardrails need to be in terms
of keeping us safe. But to me, it just feels like, no,
you brought in a surveillance pro, not someone who understands
(41:22):
inherently what this specific technology is, but more someone
who's learned how to harness technology for this other
specific aim.
Speaker 7 (41:30):
Yeah, so surveillance is not synonymous with safety. The
one use case for the
word surveillance that I think actually was pro public safety
is a long-term study in Seattle
called the Seattle Flu Study, and they are doing what
they call surveillance testing for flu viruses. So they get
volunteers to come in and get swabbed, and they are
(41:51):
keeping track of what viruses are circulating in our community. Right,
I'm all for surveilling the viruses, especially if you can
keep the people out of it.
Speaker 1 (41:58):
Yeah, I would. I would.
Speaker 6 (41:59):
I would add a caveat to that, just because I
think, I mean, there's a lot of surveillance. I mean,
that's the kind of terminology
they use, health surveillance, to detect kind of virus
rates and whatnot. I would also add the wrinkle that
like a lot of those, you know, organizations are really
distrusted by marginalized people. Like, what are you.
Speaker 2 (42:18):
Going to do?
Speaker 6 (42:18):
What, to be, you know, like, especially thinking, you know,
like lots of trans folks, and especially
underhoused or unhoused trans folks, just like, you're going
to do what? You want this data from me for
what, you know?
Speaker 1 (42:31):
Right?
Speaker 7 (42:31):
So, yeah, understanding, especially because surveillance in general, like,
is not a safety thing, right? It's not. It is
maybe a, like, safety-for-people-within-the-walls-of-
the-walled-garden thing, but that's not safety, right? That's
the other thing about this, is that what we call
AI these days is predicated on enormous data collection, right?
(42:53):
And so to one extent, it's sort of an
excuse to go about claiming access to all that data.
And once you have access to all that data, you can
do things with it that have nothing to do with
the large language models. And so there's, you know, this
is, I think, typically less immediately threatening to
life and limb than the applications that Alex was starting with.
But there's a lot of stuff where it's like, actually,
(43:15):
we would be better off without all that information about
us being out there. And there's an example that came
out recently. So did you see this thing about the
system called Recall that came out with Windows eleven? So
this thing, oh god, this is such a mess. So
initially it was going to be by default turned on.
Speaker 2 (43:32):
Oh yes, yeah, this is kind of like the Adobe
story too. Yeah.
Speaker 7 (43:36):
Yeah, every five seconds it takes a picture of your
screen, and then you can, like, using
AI, search for stuff that you've seen. And their
example is something stupid. It's like, yeah, I saw a recipe
but I don't remember where I saw it, so you
want to be able to search back through your activity.
And like, zero thought to what this means for people
who are victims of intimate partner violence, right, that they
(43:56):
have this surveillance going on in their computer. Eventually
it ended up being shipped as off by default, because
the cybersecurity folks pushed back really hard.
Speaker 8 (44:06):
And by folks, I don't mean the people at Microsoft.
Speaker 7 (44:07):
I mean the people out in the world who saw
this coming. Yeah, but that's another example of like surveillance
in the name of AI that's supposed to be the
sort of, you know, helpful little thing for you, but
like no thought to what that means for people. And
it's like, yeah, we're just going to turn this on
by default because everybody wants this obviously, right.
Speaker 2 (44:24):
It's like, no, I know how to look through my history.
Actually, I've developed that skill. Yeah, I don't need you
to take snapshots of my desktop every few seconds.
Speaker 1 (44:33):
But your show has covered so many kind of upsetting
ways that, it doesn't seem like it's people implementing AI,
it's companies implementing AI, in a lot of cases to
do jobs that it's not capable of doing. There's
been incorrect obituaries. Grok, the Elon Musk one, the Twitter
one, made up fake headlines about Iran attacking Israel and
(44:57):
like, publicly put them out as like a major
trending story. You have this great anecdote about a
Facebook chatbot AI, like, responding to someone who had this
very specific question. They have like a gifted disabled child.
They were like, hey, does anybody have experience with a
gifted disabled, like, twice-exceptional child with this specific
(45:19):
New York public school program? And the chatbot responds, yes,
I have experience with that, and just like made it up,
because it knew that's what they wanted to hear.
And fortunately it was clearly labeled as an AI chatbot.
So the person was like, what, what the Black Mirror? Yeah,
but World Health Organization, you know, eating disorder institutions replacing
(45:43):
therapists with AI. Like, you just have all these examples
of this being used where it shouldn't be and
things going badly. There's a detail that
I think we talked about last time about Duolingo,
where they let AI take over some
(46:08):
of the stuff that like human teachers and translators were
doing before. And you made the point that people who
are learning the language, who are beginners, are not in
a position to notice that the quality has dropped. Yeah,
And I feel like that's what we're seeing basically everywhere
now is just the Internet is so big, they're just
(46:28):
using it in so many different places that it's hard to
catch them all, and then there's not an appetite to
report on all the ways it's fucking up, and so
everything is kind of getting slightly to drastically
shittier at once. Yeah, and I don't know what to
(46:50):
do with that.
Speaker 6 (46:52):
I would say, yeah, well go ahead, Emily.
Speaker 8 (46:55):
What you do with that is you make fun
of it.
Speaker 7 (46:57):
That's one of our things, is ridicule as praxis, to like,
you know, try to try to keep the mood up,
but also just show it for.
Speaker 8 (47:03):
How ridiculous it is.
Speaker 7 (47:05):
And then the other thing is to really seek out
the good journalism on this topic, because so much of
it is either fake journalism output by a large language
model these days, or journalists who are basically practicing access journalism,
who are doing the gee-whiz thing, who are reproducing
press releases. And so finding the people who are doing
really good critical work and like supporting them, I think
is super important.
Speaker 6 (47:26):
You were going to say? Well, though,
you just teed me up really well, because I was
actually going to say, you know, some of the people
who are doing some of the best work on it
are like 404 Media, and you know, I
want to give a shout out to them, because
you know, these folks were basically
at Motherboard, and Motherboard, you know, or the whole Vice
(47:49):
empire, was basically, you know, sunset, and so they laid
off a bunch of people. So they started this kind
of journalist-owned-and-operated place that
focuses specifically on tech and AI, and these folks have
been kind of in the game for so long, they
know how to talk about this stuff without really
(48:12):
being bowled over. You know, there's people
who play that access journalism, like Kara Swisher, who
kind of poses herself as this person who was
very antagonistic, but like, you know, right.
Speaker 2 (48:26):
Just like fawning over like AI people and.
Speaker 6 (48:29):
Yeah, like, all the time. "Well, I trusted Elon Musk
until..." And I was like, well, why did you trust
this man in the first place? Like, did you know,
I was reading the Peter Thiel biography, The Contrarian, and
you know, it's a
very harrowing read. I mean, it was fascinating, but it
(48:50):
was very harrowing. It was pretty critical.
But like, you know, they discuss the PayPal days, you know,
twenty-four years ago, when, you know, Elon Musk was like, well,
I want to rename PayPal to X, and
then everybody was like, why the fuck would you do that?
People are already using PayPal as a verb.
(49:14):
You know, that's effectively the same thing he did with Twitter.
Like, people are talking about tweet as a verb. Why
would you, you know? He's just been like an
absolutely vapid human being with no business sense. Anyways,
that was a very long way of saying Kara Swisher
sucks, and also
(49:35):
saying that there's lots of folks, there's a number of
folks doing great stuff. So, I mean, folks at 404
Media, Karen Hao, who's independent, had been at
The Atlantic and MIT Tech Review and the Wall
Street Journal. Khari Johnson, who was at Wired, is now
at CalMatters. There's a lot of people that really
report on AI from the perspective of like the people
(49:57):
who it's harming, rather than starting from, well, this tool
can do X, Y, and Z, right, you know, we
really shouldn't just take these groups at their claims. But yeah,
I mean, the larger part of it is, I mean,
there's just so much stuff out there, you know, and
it's so hard, and it is like whack-a-mole.
And I mean, we're not journalists by training. I mean,
(50:18):
we're sort of doing a journalistic thing right now.
I would not say we are journalists.
I always say we are doing a journalistic thing.
Speaker 1 (50:32):
We're doing journalism.
Speaker 6 (50:33):
We are not doing original reporting. But, well,
you know, I don't know,
I don't know
who decides this, the court of journalism. But, you know,
reporting insofar as looking at original papers and
effectively being like, okay, this is marketing.
Speaker 1 (50:52):
This is why it's marketing. Right there. Yeah.
Speaker 6 (50:55):
Rather than, you know, a whiz-bang CNET article
or something that comes out of a content mill and
says Google just published this tool that says you can,
you know, find eighteen million materials, you know, complete...
it's like, okay, well, let's look at those claims and
(51:16):
upon what grounds those claims stand, and, you know,
how that's... that's a pretty, pretty I.
Speaker 7 (51:22):
Think what we're doing is, first of all, sharing
our expertise in our specific fields, but also like modeling
for people how to be critical consumers of journalism. So,
journalism-adjacent, but yeah, definitely without training in journalism, totally.
Speaker 1 (51:36):
Yeah. But I think we want
to do the M&M's article. I mean, oh
my gosh, there's this article that has, like, broken our
brains, because it just has this series of sentences.
Speaker 2 (51:49):
That, I don't know, because everything is degrading, like journalism.
You know, there's that story about how the Daily Mail
was like, Natalie Portman was hooked on cocaine when she
was at Harvard. You're like, no, that was from that
rap she did on SNL, and it was a
bit, but because this thing took it as straight, and then the
Daily Mail had to be like, at the end, they
corrected it. They're like, she was not, that was
(52:10):
obviously satirical and that was due to human error. Like,
they really leaned into that. Of course.
Speaker 7 (52:16):
Did I tell you about the time that a fabricated quote of mine came out
of one of these things and was printed as news?
Speaker 1 (52:20):
No? No.
Speaker 7 (52:21):
So I also, like Alex, search my own name,
because I talk to journalists, and I like
to see what's happening. And there was something in an
outfit called Bihar Prabha that attributed this quote to me,
which was not something I'd ever said, and not anybody
I ever remember talking to. So I emailed the editor
and I said, please take down this fabricated quote and
print a retraction, because I never said that. And they
(52:42):
did, so the article got updated, removed the thing attributed
to me, and then there was a thing at the
bottom saying, we've retracted this. But what they didn't put publicly,
but they told me over email, is that the whole
thing came out of Gemini.
Speaker 1 (52:53):
Oh wow, and.
Speaker 8 (52:54):
They posted it as a news article.
Speaker 7 (52:56):
And you know, the only reason I discovered it was
it was my own name, and like, I never said
that thing.
Speaker 2 (53:02):
Well, I need your expertise here to decipher this Food
and Wine article that was talking about how M&M's was
coming out with a pumpkin pie flavored M&M, but very early.
Normally pumpkin pie flavored things don't enter the market till
around August, like, around when the fall comes. But M&M's is.
Speaker 1 (53:19):
Which is why we were covering it, because we are journalists.
Speaker 2 (53:21):
Yes, we're like, in May? Pumpkin spice already? No, but
again, they were saying this is because apparently Gen Z
and millennial consumers are celebrating Halloween earlier. But this is
this one section that completely, wait, wait, yeah, I don't
know, that's what they're saying, according to their analysis. Okay,
(53:44):
so let me read this
for you. Quote: "The pre-seasonal launch of the milk
chocolate pumpkin pie M&M's is a strategic move that taps
into Mars' market research. This research indicates that Gen Z
and millennials plan to celebrate Halloween by dressing up
and planning for the holiday about six point eight weeks beforehand.
Well, six point eight weeks from Memorial Day is the
(54:07):
Fourth of July, so you still have plenty of time
to latch onto a pop culture trend and turn it
into a creative costume."
Speaker 1 (54:15):
I don't, that's, right, it doesn't. It doesn't make
any sense. I know.
Speaker 6 (54:24):
Yeah, I'm fixating on six point eight.
Speaker 2 (54:29):
What does that even mean?
Speaker 1 (54:30):
The fuck does that mean? And where did Memorial Day come from?
Speaker 5 (54:33):
And that?
Speaker 1 (54:33):
And what is six point eight weeks from Memorial Day?
Because it's not any of the days that they said
it was.
Speaker 8 (54:39):
They said July fourth. And also.
Speaker 2 (54:42):
Six point eight weeks isn't a real amount of time.
That's forty-seven point six days. Yeah, what is,
what is even a six point eight week?
Speaker 7 (54:51):
So if this were real, it's possible that they surveyed
a bunch of people and they said when do you
start planning your Halloween costume? And those people gave dates
and then they averaged that, and that's how you could.
Speaker 1 (55:02):
Get to get that.
Speaker 2 (55:03):
I get that that's fair, but also.
Speaker 7 (55:05):
It totally sounds like someone put into a large language
model write an article about why millennials and gen z
are planning their Halloween costumes earlier.
Speaker 8 (55:15):
Like it sounds like that.
Speaker 2 (55:17):
But also just so odd to say, well, six point
eight weeks from Memorial Day is the Fourth of July.
This article didn't even come out then, it came out
after Memorial Day, and yeah. It's just, nothing made sense.
And I was like, I don't fucking understand what they're
doing to me right now. But again, this is
like the insidious part for me.
Speaker 8 (55:35):
But this appeared in Food and Wine.
Speaker 2 (55:38):
This is in Food and Wine magazine, with a human,
like, in the byline. And I actually DM'd this person
on Instagram, and I said, do you mind just clarifying
this part? Like, I'm a little bit confused. And
I've gotten no response.
Speaker 6 (55:51):
I've got no... I'm wondering if it's because, I know that,
I mean, there was some good coverage in Futurism, and
they were talking about this company called AdVon Commerce,
and the way that basically this company has
been making AI-generated articles for a lot of
different publications, usually on products, like product placement, right? And
(56:16):
so it makes me think it's sort of like, this
Food and Wine, you know, may have been one of
theirs. I forgot the article, but they had, like, you know,
Better Homes and Gardens, and kind of these legacy outlets like that.
So I don't know if it's something of that, or
this journalist kind of said, write me this thing, and
I'm just going to drop it and then go with God,
(56:38):
you know.
Speaker 1 (56:39):
Yeah.
Speaker 2 (56:41):
Yeah.
Speaker 1 (56:41):
My other favorite example is this headline I
saw somewhere, "It's no big secret why Van Vaught isn't
around anymore," with a picture of Vince Vaughn, but
they just, like, got his name completely wrong. It's no
secret why Van Vaught isn't around anymore.
Speaker 6 (57:03):
I'm like, you know, if I was just scrolling,
I'd say, like, yeah,
you know, I liked Van Vaught in The Internship, and
then, but then I would
have looked at it, and then I would have done a double take.
I'm like, wait, wait, wait, did he co-star with,
oh, Owen Wilson or something?
Speaker 2 (57:23):
Yeah, yeah, yeah, exactly, Russell Wilson was in that.
Speaker 7 (57:26):
I think it was the Adweek reporting that you're
thinking of, Alex. Futurism did a bunch of it, but
then Adweek had the whole thing about AdVon, and I
can't quite.
Speaker 6 (57:33):
No, no, no, it was Futurism, yeah,
because Adweek had the thing on this program that
Google was offering, and it didn't have a name.
Speaker 8 (57:43):
Oh right, yeah, it was Futurism. Yeah, but it totally sounds.
Speaker 1 (57:47):
Like what is happening. Yeah, yeah, right. I.
Speaker 7 (57:49):
Thought you were going to talk about the surveillance by
Eminem thing. You said, Eminem's So this was somewhere in Canada.
There was an Eminem vending machine that was like taking
pictures of the students while they were making their purchases.
And I forget what the like a sensible purpose was,
but the students found out and I got it removed.
Speaker 1 (58:06):
Wow, probably freaked out and made a big deal about it.
All right, all right, that's gonna do it for this
week's weekly Zeitgeist. Please like and review the show if
you like the show. It means the world to Miles. He
needs your validation, folks. I hope you're having a
(58:29):
great weekend and I will talk to you Monday. Bye.