Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media.
Speaker 2 (00:03):
Hey everybody, Robert Evans here, and I wanted to let
you know this is a compilation episode. So every episode
of the week that just happened is here in one
convenient, and with somewhat fewer ads, package for you to
listen to in a long stretch if you want. If
you've been listening to the episodes every day this week,
there's going to be nothing new here for you, but
you can make your own decisions.
Speaker 3 (00:25):
Hi.
Speaker 4 (00:25):
Everyone, it's James coming at you. I've got a pretty nasty cold here.
I wanted to share with you that wildfires have swept
through Los Angeles in the last couple of days while
I'm recording this. Thousands of people have been displaced. Five
people have died that we know of so far, thousands
of structures have been burned, and many, many
(00:46):
people will be finding themselves out of their homes with
nowhere to go, with very few resources. If you'd like
to help, we've come up with some mutual aid groups
who you can donate to, and we'll be interviewing one
of them on this show next week. So if you'd
like to help, the three places where we suggest you
donate some cash: The Sidewalk Project, that's the Sidewalk
(01:08):
Project dot org; K Town for All, that's
K T O W N F O R A L
L dot org; and Aetna Street Solidarity. You
can find them on Venmo or I think on Instagram
as well. That's a E T N A S T
R E E T S O L I D A
(01:32):
R I T Y.
Speaker 2 (01:34):
All right, I'm gonna go rest my voice. Order in
the court! Order in the court! Justice Robert Evans presiding.
I see we have a fine jury here to take
questions from the audience of our, of our daily news show,
which is also my courtroom. Everybody, everybody get it,
(01:54):
because I'm a judge now, really, because that's how the
legal system works.
Speaker 5 (01:58):
All those rumors finally came true.
Speaker 2 (02:02):
No municipal Judge Garrison.
Speaker 5 (02:04):
Okay, okay, that's fair. You're right, you're right, you're
right.
Speaker 2 (02:10):
I will now for the rest of my life be
able to say when people ask questions, well as a
man of the law, which I'm very much looking forward to.
Speaker 4 (02:19):
Not only able to say, Robert, quite likely.
Speaker 2 (02:22):
To say, anyway, that's all I got.
Speaker 1 (02:26):
All right, this is the It Could Happen Here
Q and A episode. We've got, what are we
calling you now, Robert Evans? What's your title?
Speaker 2 (02:34):
The Honorable Robert Evans. And I, actually, the
judge who made me a judge sent me a gavel,
but I didn't grab it for this one. So I
just used, I have the barrel and lower
receiver from an antique sawed-off shotgun that belonged to
a bootlegger, and I just sort of slammed that into
my table.
Speaker 5 (02:51):
I'm sure our editor will love that.
Speaker 4 (02:53):
Yeah, yeah, but before we broadcast, so you have a
sawed-off shotgun.
Speaker 2 (02:58):
It's not a it's not functional, it's been destroyed.
Speaker 4 (03:01):
I see, I see, good. Didn't want a little Ruby
Ridge moment.
Speaker 1 (03:03):
Yeah, we've got Mia Wong, Garrison Davis, James Stout and
the dishonorable Robert Evans.
Speaker 5 (03:09):
And Sophie Lichterman. Oh yes. I mean, yeah, we're
Speaker 1 (03:13):
Gonna do, we're gonna do some questions we posted
on our Bluesky. If you're not following us
on Bluesky, we are on there.
Speaker 2 (03:18):
Bluesky? One does not post on Bluesky. So
if you want.
Speaker 1 (03:21):
Skeets, I really hope that's not true, because that's really.
Speaker 2 (03:24):
Embarrassing. Unfortunately, they really tried to get that off the ground.
I don't see anyone actually using skeet.
Speaker 4 (03:31):
I saw someone using it in French and it was
a real moment. How about you, Garrison?
Speaker 5 (03:36):
Instead of saying send a tweet, now, I just say
send skeet in conversation. Everyone loves it. M hm.
Speaker 2 (03:42):
Do you reskeet? Is that a thing? Yeah?
Speaker 5 (03:44):
I guess you do. I guess you do, mm hm.
Speaker 1 (03:47):
And we're moving on. I'm just gonna throw out some
of the questions we received online. I'm not even
gonna say the name of the app again because I'm
afraid of being labeled as old. Garrison's embarrassed by me,
I can tell. I didn't say that, but you thought it.
Speaker 5 (04:03):
But you thought it. I didn't think that you did.
Speaker 1 (04:06):
Any advice for someone with a desire to do some
hobby or freelance journalism in the coming few years, I
want to actively fight for equality. Also, thank you for
your questions. Everyone.
Speaker 2 (04:17):
Hm, I don't thank you for your questions. I'm actively
angry at you for your question.
Speaker 1 (04:21):
Yeah, that's why you're the dishonorable.
Speaker 4 (04:23):
Yeah, start rich if you want to be a freelance journalist,
because you'll progressively become poorer.
Speaker 2 (04:30):
It's my, I have funded my journalism, I love it
whenever people ask me questions like, how did you
convince Cracked to send you to Iraq? I didn't. I
bought plane tickets. Like, being an entertainer has always been
what's funded my journalism.
Speaker 5 (04:46):
I guess my advice would be get really autistic about
something problematic, just, like, one thing. This one thing, you
get, like, really into it to the point where it
kind of takes over your life. Your personal life starts
fading away, it kind of blends into your whole state
of existence. And only then will you actually get good
(05:08):
at that thing. Yep, that's my advice. And then you
just take one thing at a time and every few
years you kind of change the scope of the thing
you're getting really autistic about. But that's kind of how
I've rolled, and it's been it's been okay.
Speaker 2 (05:23):
Yeah, you just finished thirty six hours of digging into
the life of a school shooter. And I also built
the back of my career spending hours and hours digging
through the online lives of mass shooters. And you don't
have to do that, but you do have to do
that thing, which is Yeah, exactly what Garrison said. You
(05:43):
have to pick a very narrow thing and make it
your life, and not just a random thing, but like
a thing that you think is important. Yeah, and that
people don't other people don't understand how important it is.
And if you make yourself there's a fella. His blog
is called We Hunted the Mammoth. They've Futrell who's been
covering what we call the manosphere for like more than
(06:04):
a decade before anybody else in journalism was taking it seriously. Yep,
you got to do that kind of thing. If you
do that kind of thing, you build a name for yourself,
and that can allow you, when the thing that
you're obsessed with becomes a big story, being first to
have something meaningful to say about it can provide you
(06:24):
eventually with the opportunity to cover other things. Yeah, and
it's good advice.
Speaker 4 (06:29):
I would say if you want to get started freelancing,
it's a good idea to join the IWW Freelance Journalists Union.
You can learn a lot from people who are freelancing there.
You can learn who not to pitch, which editors are
toxic as fuck, which is a surprisingly large amount. Yeah,
you can learn which email to send your pitches to
and how to pitch if you're not familiar with how
to pitch. I also teach sometimes journalism workshops at a
(06:50):
community college, So if you have a community college near you,
you might be able to get some either free or
very cheap sort of advice on the real, like, nuts
and bolts of journalism, like sending pitches and stuff like that.
Speaker 2 (07:02):
Cool.
Speaker 1 (07:03):
What is the consensus on what the next Trump administration
will do? On the first day or first week, all
of us just look like we're in pain.
Speaker 5 (07:12):
Oh, fuck, it's like, it's chaos. Yeah.
Speaker 2 (07:21):
I'm not.
Speaker 5 (07:21):
I'm not foreseeing good things. There'll be a lot
of executive orders that are you know, probably bad, you know,
things that aren't great.
Speaker 2 (07:28):
Yeah, I think that, Uh, he's going to try to
do as much of what he's promised to do in
terms of, particularly, not in terms of everything he's promised,
but in terms of going after immigrants. Yeah, he's going
to do as much of what he's promised to do
as he possibly can. Now that doesn't mean he's going
to actually deport millions of people. There are like some
(07:50):
just practical limitations based on the capacity of the institutions
he'll be using to do this. And he could get,
there's a very good chance things will get bogged down
and whatnot, but like he will try. Yeah, that's my take. Yeah,
I think.
Speaker 6 (08:03):
I think the other thing that's going to happen pretty
quickly is I think he's gonna start moving on tariffs
very very fast.
Speaker 2 (08:08):
Yeah. If you're planning to buy a computer, go ahead
and grab that fucker now if you can.
Speaker 5 (08:13):
If you're getting anything from overseas, you should get it
in the few weeks that you still can.
Speaker 2 (08:18):
Yeah, if it has a battery.
Speaker 3 (08:19):
It ain't made here.
Speaker 1 (08:20):
I had my annual physical today because otherwise our insurance
screws us over. And my doctor was like, you should
try to get as many prescriptions filled before the end
of the year, before things go up, just in case.
There you go, and you know that's not terrible advice.
Speaker 3 (08:38):
Yeah.
Speaker 4 (08:38):
I think in terms of executive orders, he will try
and further restrict access to asylum, try and further change that.
There are things he can do by executive order with
ICE and CBP in terms of how they operate that
he will try and do. It's not impossible that they
will try and again immediately mobilize public health law against
migrants like he did in twenty twenty. Right, Yeah, those
(09:01):
things could all be done without congressional support. We might
hope otherwise about this, but Stephen Miller suggested that
they might do some of those things.
Speaker 2 (09:08):
So, yeah, not impossible. Probably won't be a great day.
Speaker 1 (09:12):
Somebody's getting fired the first week, probably first day.
Speaker 2 (09:16):
Yeah. I mean, I've seen the fact that the FBI
director is stepping down pushed as like an act of
resistance because it means that Trump now has to actually
go through, like, Congress to get it done. I don't
know how much I buy that, how much I
think that. I think a lot of what I'm seeing
right now from establishment people, and maybe this isn't true
(09:38):
of Wray, because I did find some of the arguments
there compelling. But a lot of what I've seen from
establishment people in politics is they're scared and just really
trying not to make waves. Yeah, and I think
that's what you're going to see overwhelmingly. I think that
he probably will not immediately act against the press
(09:59):
in a legal sense, as the president. They will
do that, but I think he's going to, he's already
suing people directly, and I think that that's going to be
kind of his, his focus there for a while, just
because there's a lot on his plate. But I think
there will be attempts to fuck with libel
laws and stuff, especially as things go on.
Speaker 1 (10:19):
Okay, several of you have asked about the Android ad-free
subscription channel, and I want you all to
know that it will happen next year. I have been
trying to get this to happen for two years now,
and for unforeseen reasons, it just keeps getting roadblocked. But
(10:43):
it is happening. We're just waiting on a couple of
final things to get into place, so that will be happening,
hopefully very soon into twenty twenty five. I will update
everybody as soon as that's possible. And I'm so sorry
it's taken so long. I want you to know I
have worked so unbelievably hard on this, miserably hard.
Speaker 2 (11:06):
Yeah, we've seen it, Sophie has. It's been a nightmare,
Harder than I have worked on anything else this year. Like,
it's been nuts. Yeah, And here's the thing that sucks
for no reason, no reason, but not that there's no
reason to launch the app. There's a great reason. There's
no reason it should have taken this long. Correct, But
we can't say anymore for reasons that are also equally frustrating.
(11:27):
I'd like to say in general, folks, there's a few
things that get brought up a lot. It's like, why
haven't they done this yet? Why haven't they done this yet?
We're talking like technical things or like you know, things
like a like a paid subscription, and they're like, why
haven't they gotten around to it yet? And the answer
is always some infuriating bullshit based.
Speaker 1 (11:44):
On some bureaucracy bullshit.
Speaker 2 (11:47):
Some bureaucratic, some legal shit where you're like, oh, you
don't actually realize it's illegal to do this if you
do it this way, or whatever, like some sort of
bullshit that makes it impossible. It's that we want
to make it as easy as possible for people
to have the best listening experience that we can afford
to provide them. But there's a lot of annoying bullshit
(12:07):
that exists for reasons beyond our comprehension.
Speaker 1 (12:10):
Sorry, anyways, here's ads. Unless you have an iPhone and
subscribe to Cooler Zone Media on Apple. All right, we're back.
How do you each motivate yourself to write or do
(12:33):
your jobs. I get asked that question all the time,
but I'll let each of you tackle it. While this
is a communally hosted show, I feel like each of
you do very different things, so your answers are going
to be all over the place. So Garrison, Oh.
Speaker 5 (12:50):
Well, I mean, paying rent's a great motivator.
Speaker 2 (12:53):
Sure, yes, yes, understated. This is a big thing that
a lot of people who want to be writers but
have never done it for a living miss, is that
all of your favorite writers who do it for a living,
a big part of how they get over fucking writer's
block is they have to pay rent or a mortgage. Yeah,
it turns out that helps.
Speaker 5 (13:14):
It's, it's a quite compelling motivator, and sometimes it
has required the assistance of, you know, caffeine or other things.
I have a variety of playlists to help me
when I'm in, like, different moods. I definitely will, about,
you know, maybe twice a month, I just do a complete,
(13:34):
like a complete body check to my sleep schedule to
get a special project finished. And that's just kind of
part of the deal, at least in terms of how
I work, and not everyone does it this way. Though
maybe maybe people are more healthy than me.
Speaker 2 (13:48):
Yeah for me.
Speaker 6 (13:50):
Okay, So the easiest way something gets done is just
pure rage. I can just do it like.
Speaker 2 (13:56):
It just comes out. The anger is a great motivator.
Speaker 6 (14:00):
The other fun one is pure joy at something funny happening,
like the Shinzo Abe assassination. The easiest writing
I've ever done in my lifetime.
Speaker 2 (14:09):
It just flows.
Speaker 7 (14:11):
Yeah.
Speaker 6 (14:13):
Other times it's just like there's a deadline and everyone
is counting on me, and I have to get it out,
and I've gotten to the right level of sleep deprivation
where I can just do it.
Speaker 5 (14:22):
That's right, That's right.
Speaker 2 (14:23):
Yeah.
Speaker 6 (14:23):
But I also think, you know, there's obviously like health insurance,
which is sort of a joke given our health insurance.
But yeah, and then the last thing, and this is
the sort of the serious one, is that like this,
you know, I mean I do some organizing stuff too,
but like this, this is the thing that I have
(14:44):
to do that can materially affect the world. Which is
a very very weird thing to say about a podcast.
But I've seen it happen, right, I've seen all of
you go and do things that wouldn't have happened otherwise.
You know, it's a weird situation, right, because
my my motivation for doing this stuff is the chance
that you will make the world better. But I've seen
(15:05):
it happen, and I have to continue to believe that
the thing that I've been doing for all these years,
this project of building a very large hammer and deploying
it against our enemies can work and will work. And
that is you know, that's how I get out of
bed every morning, is we're building the hammer and we're
swinging it.
Speaker 2 (15:22):
Yeah, that's a great way to put it.
Speaker 4 (15:25):
Very Large Hammer would be a banging name for a podcast.
Speaker 1 (15:27):
I agree, Yeah, I agree.
Speaker 2 (15:30):
Yeah, there's a there's a great speech in the comic
series Transmetropolitan about how journalism is a gun that
you you wire up to your eyes and your ears
and several other organs in order to shoot at the world.
And that's I think a good way to keep yourself
doing it when it feels like you're just shouting into
(15:50):
a void.
Speaker 8 (15:51):
Yeah.
Speaker 2 (15:52):
I really like the process of writing.
Speaker 4 (15:53):
I like telling stories like that makes me happy and
I feel.
Speaker 2 (15:57):
So lucky I can do it for my job.
Speaker 4 (15:59):
I don't particularly like, like, receiving trauma, which I also
do for my job, but.
Speaker 2 (16:06):
Like, really it can be sometimes I can't sleep.
Speaker 4 (16:12):
So many people trusted me with their stories, especially this year,
that they didn't have to, and sometimes at great personal risk,
and it's a massive privilege that they trusted me with
those stories, and I think I owe it to them
to do my best to tell those stories as well
as I can. Yeah, and like as Mia said, it
has materially changed the world, like the amount of people
(16:33):
who listened to our podcasts and came to the border
to help last year when we really desperately needed help,
people who just like on Sunday Night, gave their money,
which I know none of us have enough money right
now, to help people who are displaced in Rojava. Like
all that stuff really makes it feel like if you
tell a good enough story, people will care. That's always
what I felt like, if you could just get people
(16:54):
to see it, if people could be there, they would care,
and if they care enough, they'll do something. Then I've
seen that be true with people who listen to the show,
and that really makes me happy, So I want to
keep doing that.
Speaker 1 (17:06):
Yeah, for me, it's two part answer. The first part
is that I genuinely give a shit about everything that
we put out and what we do is not really
while it is a job, it matters so much. And
(17:28):
the second part is if I don't do my job,
the amount of people's lives that that impacts is a
lot of fucking people, and I give a shit about
each and every one of them. So I'm gonna keep
doing my job so that everybody else can keep doing
their job and maybe we make a difference in this world,
(17:50):
this fucked up, crumbly world. Robert, did you have anything
to add? You were speaking and then Mia talked.
Speaker 2 (17:58):
Did I already not give an answer?
Speaker 1 (18:00):
You gave an answer, that's why, but you were starting
to speak.
Speaker 2 (18:02):
Oh yeah, I do it for the fame, baby.
Speaker 1 (18:05):
Great. Next, What episode or episodes were your favorite this
year to make or otherwise?
Speaker 4 (18:17):
Yeah?
Speaker 1 (18:18):
My favorites this year were definitely James's series from the
Darien Gap. That was an incredible series. I'm so unbelievably
proud of it. Yeah, James had been trying to do
that work for a long time, and I'm I'm happy
that we were able to fund it and James was
able to do the incredible reporting that he did. I'm
(18:40):
also quite proud of Robert, Garrison and I surviving the
RNC and DNC.
Speaker 2 (18:46):
Was a good time, like legitimately.
Speaker 1 (18:49):
Great time, polling the worst people in the world.
Speaker 2 (18:53):
It was the DNC that fucked me up. Yeah, yeah, same.
Speaker 5 (18:58):
I was, like, destroyed emotionally after the DNC.
Speaker 1 (19:00):
Yeah, the DNC was really a huge bummer. And then
Mia's covered some of the most important labor stories that
like nobody covers absolutely yeah, and like without those genuinely
like nobody covers like small labor stories or big labor stories,
and she's always on top of that beat. And yeah,
(19:20):
I also really just liked Robert's Don't Panic episode,
some great writing, my friend. I answered, now everybody else
has to. Well, I'll start with Mia.
Speaker 6 (19:34):
There's weirdly a few this year, which normally isn't
the case. I liked the Boeing ones, that was fun.
Speaker 5 (19:40):
Yeah.
Speaker 6 (19:41):
The one that was most emotionally impactful for me was
getting to interview Dr. Julia Serano, who, if you haven't
listened to that episode, go listen to it.
Speaker 1 (19:50):
Great book.
Speaker 6 (19:51):
Yeah, Whipping Girl is the book that literally created a
bunch of the, like, like the concept of misgendering is
from that book, right? Like, the language that we
use to talk about transness today is directly her, and
so few people have ever read the book, so
few people even know who she is, and getting a
(20:12):
chance to talk to her was like incredible. And I'm
also really happy about the organizing one that I did,
because I've gotten so many messages from people who were
just like, I, oh, wait, my knitting is useful to organizing,
And I'm like, yes, yes it is. Your knitting is
so incredibly, staggeringly useful.
Speaker 2 (20:28):
Yeah, so I'm proud of that one.
Speaker 1 (20:30):
Yeah, let's take a quick break, then Garrison, Robert, James,
you can answer that question. And we're back. James, how
about you.
Speaker 4 (20:53):
I'm proud of doing the Darien ones. I think, like,
I'm so happy that we finally got to a place
where like we could do that, where we could fund that,
Like I've been trying to do that, like I said,
for nearly.
Speaker 2 (21:05):
A decade, and.
Speaker 4 (21:07):
Yeah, it's been hard, and it continues to be hard,
like one of the people you heard from in those
episodes got deported last week, so, like, it continues
to kind of be emotionally difficult. But I really liked
how many people messaged me and were like, I sent
this to my father, my uncle, not just dudes, aunts and
their mums too, I'm sure, and I think non-binary relatives,
but, like, well, maybe not, because they sent it to
their right-wing relatives and they, like, learned some compassion.
their right wing relatives and they like learned some compassion.
That's always what you want to do, Like I said before,
you want people to see it so that they care
and so they understand it and they don't just get
this stupid Fox News bullshit racism stuff, and.
Speaker 2 (21:45):
So yeah, that made me really happy.
Speaker 4 (21:46):
The reason we're all different on this, by the way,
is because we have not done a best-of twenty twenty
four episode, and if we had, this would have been
a much much shorter segment.
Speaker 2 (21:54):
James, let me just tell you, I think we can
all look forward to an all-white Christmas this year.
Speaker 1 (22:00):
Jesus mother.
Speaker 2 (22:03):
I set him up. It's my own fault.
Speaker 5 (22:11):
Wow, I guess I'll go now. I'll just sort of clean
out the aftertaste of that.
Speaker 2 (22:19):
So it's worse.
Speaker 5 (22:24):
I think I started out pretty strong with police drones,
even more topical as we record this now as New
Jersey is about to get completely abducted I think by
alien aircraft.
Speaker 2 (22:35):
Yeah, there's no one left in New Jersey.
Speaker 5 (22:37):
Now they've all been taken away by all these unidentified drones.
Speaker 2 (22:41):
That actually happened three days ago. It just took a
long time for the rest of the country to notice
or care. Bruce Springsteen hasn't made a song about it.
We have no way of knowing.
Speaker 5 (22:51):
Besides the mass hysteria of the New Jersey drone panic.
Police drones are a real problem, and those are going
to be increasingly so. I was happy with my reporting
on that at CES, and then I guess I mean
to echo Sophie. I had a great time at the RNC.
It's fun, a sentence I never thought I would say, yeah,
(23:11):
And particularly the RNC Grindr episode, I still
think is pretty, is pretty good.
Speaker 1 (23:15):
It's pretty great. The amount of places that Garrison and
I snuck into at the RNC, what a time.
Speaker 5 (23:23):
It was really dangerous too, because I was having to
like do my RNC research next to Robert and Sophie
the whole time, and oh boy, it's like a minefield
scrolling through.
Speaker 1 (23:32):
That app is an experience, to say the least. Any
thoughts on the proposed twenty twenty eight general strike, How
are people feeling about that?
Speaker 6 (23:42):
I'll start with Mia, Yeah, I mean it's it's a
pretty good idea. Like, there's definitely, sort of, and I'm
immediately going into naysaying a little bit, there's definitely
problems with it. It's going to be extremely hard to
execute because we just don't have a modern history of
doing that in the US, and even some of the
successful ones in the last decade that people have
pulled off haven't been that effective. But on the other hand,
(24:04):
as something that, you know, a concrete thing that
we have to organize towards that has a bunch of,
like, pretty large unions behind it already. I did an
episode about that a few weeks ago, I don't know,
a couple months ago. I don't remember when I did
this episode. I'm sorry, I can't remember anything I've ever done.
But I think I think it's a good opportunity to
(24:26):
connect a whole bunch of different kinds of organizing together,
both in terms of sort of labor and in terms
of the support work you need for that. So yeah,
cautiously optimistic.
Speaker 1 (24:36):
Anyone else have anything they want to add?
Speaker 5 (24:38):
The time to start figuring out those logistics, like,
is now. It's not waiting till twenty twenty seven.
Speaker 2 (24:43):
Yeah, I agree, Garrison. I think that the fact that
there are serious people who represent serious unions talking about
it is part of why it's one of the things
that does give me a degree of hope. We're going
to have to start working now towards it. It's not
going to be easy in any way, shape or form.
If they see it coming, they are going to start
(25:04):
trying to criminalize things preemptively. If it is something that
even looks like a real possibility, they're going to come
after it with everything they've got. And it's one of
those things where maybe if the midterms go well for Democrats,
maybe Democrats stop that, But it's just as plausible and
probably more plausible, that Democrats line up with Republicans to
(25:27):
attempt to criminalize something like that.
Speaker 4 (25:29):
Yeah, it's strange to be seeing something like this organized
so far off, like.
Speaker 2 (25:35):
Yes, it's not something any of us are familiar with,
which it has to be, to be clear. Yeah, yeah. Yeah,
it has to.
Speaker 4 (25:40):
Be barring, like an actual coup. That's the only way
you get a general strike, right like, either something so
earth shattering that everyone so that everyone's ready to risk
it because they're already in danger or yeah, you take
the time when you plan that you do it properly.
But it's just not something we're familiar with. I love
the general strike. I'm always going to support a general strike.
I'm excited to see a general strike. But yeah, we
(26:01):
have to put in the work now.
Speaker 2 (26:03):
Yeah, the only responsible way to characterize the organized left
in the United States is as a complete and utter failure.
Like, it has been a calamity for the causes that
it seeks to represent. And a lot of that is
because of like fucking bullshit online clicktivism. You know, we're
all going to do a general strike. Everybody, get ready
(26:25):
next week, We're going to do it. You know, shit
like that. It's just so deeply unserious. And if
we're going to take the momentum and the energy that
exists in the number of people who are angry and
who you know, and that number of people will be
increasing as the consequences of conservative policies hit home by
twenty twenty eight, Like, it has to be something taken
(26:49):
deadly seriously by very serious people who are thinking through
the consequences and what's necessary in order to make this feasible.
Speaker 1 (26:58):
You know, and last, do each of you have, you
know, a movie or a book or something you would
like to recommend.
Speaker 4 (27:08):
In twenty twenty five, when I finish my book, you
should buy it, yes, but.
Speaker 2 (27:12):
Read General Strike.
Speaker 4 (27:13):
I've been reading a book called Platentive, which is in
English, but it's about how San Francisco dock workers blocked a
shipment of weapons to El Salvador, and it just seems
a very relevant book. And they did it to Pinochet
as well. It's easy to read and like. It just
reminded me how important labor organizing is going to be
in the next four years and how powerful it can
(27:34):
be too. So I'll give that one a little plug.
Speaker 2 (27:37):
Excellent.
Speaker 4 (27:38):
There's a film called The End Will Be Spectacular which
is about the Kurdish youth movement in Northern Kurdistan, in Turkey.
It's a really good film, I think, to help
you understand the Kurdish freedom movement, and it's worth
a watch. It's not necessarily a happy, feel-good
film, but I think it's worth a watch, especially
if you've recently become interested in that because of what
you've heard on the podcast.
Speaker 6 (27:58):
Yeah, yeah, I have a couple. So I'm trans fiction pilled
right now, so I'm giving you fiction from trans authors.
Speaker 2 (28:05):
Would you say you're transfixed? Wow?
Speaker 5 (28:09):
I walked.
Speaker 6 (28:09):
I walked right into that one, like drove directly into it,
like JFK's head into that bullet.
Speaker 5 (28:17):
Oh my god.
Speaker 2 (28:20):
Wow.
Speaker 1 (28:20):
We spent a lot of time with each other.
Speaker 2 (28:25):
Yeah.
Speaker 6 (28:26):
The first one I wanted to talk about is The
Gunrunner and Her Hound by Maria Ying, which is
the pen name of a couple of authors. Okay, so
this is, this is an absolutely unhinged lesbian
book about a lesbian crime lord and her
bodyguard, who is also a lesbian, and it rules. Uh,
there's a whole sort of like post apocalypse US thing
(28:46):
going on, but they're still in like civilized Hong Kong.
It's awesome, it's great. If you need unhinged
lesbians in your life, go read this.
The other one is One of the Boys. This is
forthcoming, going to release May thirteenth, twenty twenty five,
by Victoria Zeller, and it's about a trans girl who's
(29:07):
like the kicker on her football team and she has
to like leave the team because she transitions, but then
the team needs her back. They don't have a kicker,
and it's it's fun, it's it's a good time, so
you should get that when it comes out.
Speaker 2 (29:19):
Yeah. So I'm actually right now in the middle of
a book that I found myself surprised by how much
I've liked. It's called When Paris Went Dark, and it
is a history of the occupation of Paris under the Nazis.
That is a really fascinating social history by Ronald Rosbottom
that I found very, like, emotionally affecting, especially in
(29:39):
light of, you know, some things going on, and yeah,
just kind of a fascinating look at the psychology of
a people, of like of a of an entire people
kind of grappling with what's about to happen to them
in the wake of the failure of the French army
and then what happens next. And then I would also
(29:59):
recommend Setting the Desert on Fire by James Barr,
which is one of the books about T.E. Lawrence that
I cited in the T.E. Lawrence episodes. If you are
at all interested in the realities of needing to fight
an insurgent war.
Speaker 5 (30:17):
Here, I guess, just two recent things I've enjoyed. Finally
finished Steppenwolf by Hermann Hesse. Yes, I enjoyed that deeply.
It kind of, it kind of scratched my Twin
Peaks: The Return brain. So that was, that was pleasant.
And for a more recent release, Luca Guadagnino's new
(30:39):
movie Queer, adapting the short story by William S.
Speaker 2 (30:43):
Burroughs.
Speaker 5 (30:44):
I found this movie to be utterly fascinating and transfixing,
to use the term from earlier, Robert. I don't have
much else to say about it because I would rather
people just watch it and take away what they want
from it themselves. But it got me thinking a lot about
the lack of meaning inherent to identity and why I
(31:05):
hate the term queer bodies. So yeah, good.
Speaker 1 (31:08):
Movie, awesome. I just have one movie to recommend, and
it's one of my favorite movies of all time, the
original nineteen seventy three
The Wicker Man, not the fucking Nicolas Cage version, the
original version. And if you have a local theater that
(31:29):
plays old movies a lot of times, they'll play it
in theaters, and I highly recommend that experience. It's really fun,
especially at the end. I see it in theaters or
watch it at least once or twice a year and
vibes are good. Yeah, that's it for a Q and
A episode. Thanks for submitting, and goodbye. Welcome to It
(32:04):
Could Happen Here. This is our twenty twenty five Predictions episode.
We were starting to bicker off mic about what we
predicted last year, and I was talking about the things
we predicted, and one of the things I predicted early on,
I was like, I think Kim Kardashian will be part
of the Trump cabinet, and like, honestly goals at this point,
(32:25):
but I'm not that far off though, because essentially what
he has done is he's basically tried to go for
people that are good on TV. It's true, It's true,
and like going off of that reality TV energy.
Speaker 4 (32:40):
Finally we will acknowledge the Armenian genocide.
Speaker 1 (32:43):
I was I was vibing, okay, James, I.
Speaker 4 (32:46):
Was vibing genocide.
Speaker 1 (32:50):
Just James, all right. Vibacide. God. All right? Mia
Wong's here, Garrison's here, James Stout's here, and
the dishonorable Robert Evans is also here.
Speaker 2 (33:04):
I judge that nickname bad. Jesus Christ.
Speaker 5 (33:07):
Wow, let's let's go over some of our terrible twenty
twenty four predictions just briefly. Now, Unfortunately there was a
lot of election ones which were very sad to listen to.
Oh no, now, we were correct about many things we did.
We did talk about how Harris would probably be a
really bad candidate to run against Trump. Totally forgot about that.
Speaker 2 (33:29):
We did. We did huge for us. Yeah, for the country.
That brief period of time when Biden stepped down, it
really felt like it might be I mean, she did
better than he would have done. Yeah.
Speaker 5 (33:43):
Well, I think that's just because we were still just
reeling from that debate, because it was so bad that, like, anything
was like, oh my god, there's, like, a lifeline.
Speaker 2 (33:51):
Look at how she can walk thirty, forty feet at
a time. Exact sentence. Good God.
Speaker 5 (33:58):
None of us picked Vance specifically at that point in time,
but we did pinpoint Trump's orbit and his, like, campaign
crew pretty well. Like, Mia predicted that RFK
Junior could be a Trump VP pick, and though he
didn't become VP, he essentially kind of took over the
VP, like, campaigning role from Vance in,
Speaker 2 (34:18):
Like, August. Yeah, he was so bad at it.
Speaker 5 (34:22):
We all decided that, like, Vivek was simply, like, way
too loud and like obnoxious. So Trump would like find
some other spot for him. Stand by that, And that's
what happened. He's still in the orbit, but he's not
super close. Sophie talked about possibly Kristi Noem
getting linked in with Trump, maybe for VP. Now
that didn't happen for VP, but Kristi Noem is in the cabinet.
Speaker 1 (34:43):
Good job, past me.
Speaker 5 (34:45):
Yeah, and Robert said that he would not be shocked
if Trump got close with Tulsi Gabbard. In other, less
good predictions, I predicted that a Daily Wire host would
get... which unfortunately did not come to pass. There's still time.
Speaker 6 (35:04):
It's still twenty twenty four, right?
Speaker 2 (35:07):
Not when this airs, not when this airs.
Speaker 5 (35:10):
Yes, Kim Kardashian getting into politics didn't really happen. She
kind of stayed at her regular coasting level. Sorry, Sophie. So far.
so far.
Speaker 1 (35:18):
Trust me. She did all those things when Trump was
elected the first time, where all of a sudden she was,
like with other lawyers trying to get people out of
jail by utilizing Trump.
Speaker 5 (35:29):
Yeah, I mean, and she was doing that with the
Biden campaign as well, not as visibly. With the Harris campaign,
she was meeting with Harris multiple times. She kind of
stayed at this, like, distant but, like, talkative place.
Speaker 1 (35:40):
That's the Kardashian way, distant and talkative.
Speaker 5 (35:45):
Speaking of speaking of your other prediction was that people
would start forgetting about the Nazi stuff and Kanye would
put out a well received album, which kind of happened. Yeah, yeah, yeah,
a little bit.
Speaker 1 (35:58):
God, I haven't thought about Kanye in so many months. It
was really nice, well, really nice, thanks Garrison.
Speaker 5 (36:06):
Lastly, my failed prediction is that if Trump won the election,
that there would be two solid weeks of rioting, which
simply did not happen.
Speaker 2 (36:14):
Yes, nothing happened.
Speaker 5 (36:15):
I think it's actually kind of interesting, and we will
maybe unpack that in the coming months as Trump's second
term kind of settles in. I'm sure we will kind
of revisit why we think this did not happen. Certainly,
I'm curious about what inauguration Day will look like. But
but yeah, that was a lot, so sorry. Morrissey is
(36:35):
still alive, David Scavenger is still alive, Putin is
still alive, and though James did say that Assad would
eat it, and though Assad didn't die, he kind
of did eat it.
Speaker 2 (36:49):
Well, yeah, I mean, James, yeah, that's, that's gotta
be the biggest dub of the year. Yeah, that's right.
Speaker 4 (36:56):
Damn, I forgot all about that. Really happy with myself.
Speaker 1 (37:00):
Now, James, I'm so proud of you, buddy.
Speaker 2 (37:02):
You got to pick another one this year. Yeah, Min
Aung Hlaing, baby, he's next.
Speaker 5 (37:09):
Let's I guess let's start with some kind of dictator predictions.
What do we think will happen to like a dictator
in twenty twenty five.
Speaker 4 (37:17):
Which of these is gonna die, do we think? Or
just general dictator predictions?
Speaker 5 (37:20):
Dictator predictions. It can be maybe we get a new one,
you know, maybe we get a new fancy one.
Speaker 2 (37:26):
Yeah, well I don't know. Yeah, something's happening in January.
Speaker 6 (37:29):
I have two. Well, one of them, I mean, it's
kind of a hack one, but I don't think, I
don't think the junta in Myanmar makes it out of twenty
twenty five. Yeah, I think not in the version it is today. Yeah,
that's the hack one. The other one is another
Assad one. I think someone actually does assassinate
Assad. He gets too full
of himself and he goes to Abu Dhabi and some
Muslim Brotherhood guy just waxes him.
Speaker 2 (37:50):
Yep, okay.
Speaker 5 (37:51):
May Assaud prediction is he becomes a Russia Today host.
That's my assad prediction.
Speaker 1 (37:57):
Oh my gosh.
Speaker 4 (37:58):
Yeah, he's going to open his ophthalmology clinic.
Speaker 2 (38:02):
No, I mean I think he's going to get signed
to host a podcast by a little a little network
you might have heard of called Cool Zone Media. Congratulations, guys,
let's bring him on. So if you get him on
the, get him on the Zoom, tell him he can kick
and hop in the room now, for sure, baby.
Speaker 5 (38:23):
We are merging with Tenet Media to bring on our friend.
Speaker 2 (38:28):
Yeah, welcome to the pod, Bashar. He's actually doing a whole
media tour with the Pod Save guys next week. That's
got to be fascinating.
Speaker 4 (38:38):
Pod Save Ba'athist Syria, the most cursed podcast in the world.
Speaker 1 (38:44):
My dictator slash world leader prediction is that, despite everything,
this is Netanyahu's, I was thinking, yeah, last ride.
Speaker 2 (38:52):
From your mouth to whatever fucking clot is working its
way through his coronary system in a year.
Speaker 1 (39:00):
I really fucking hope, I really fucking hope I'm right.
We all do.
Speaker 2 (39:05):
I don't know what else to say there?
Speaker 4 (39:07):
Yeah, yeah, yeah, yeah, yeah, that's a big thing for
the world.
Speaker 2 (39:12):
Yeah.
Speaker 5 (39:12):
I mean we are verging into not doing predictions, just
doing hopes and dreams.
Speaker 4 (39:16):
Yeah. Well I did Morrissey like that last year. We
didn't get it, and I'm sad.
Speaker 5 (39:20):
We need some hopes and dreams out in the world.
Speaker 1 (39:22):
Fair enough.
Speaker 5 (39:24):
Yeah, do you know what else we need, team? Money
from these advertisers. That's right, and we.
Speaker 1 (39:40):
Are back, all right, Garrison, what's next?
Speaker 5 (39:44):
So usually in the middle of these prediction episodes, I
like doing our, like, third annual death segment: who do
we think will die? And I guess we kind of
touched on this briefly, but I don't think
we actually secured death for any of those people
in our predictions, just that they would, you know, have
circumstances change. Though for this year's death segment, we have,
(40:05):
We have a bit of a twist. So it turns
out about two years ago, on Spotify Wrapped Day, we
all woke up to the news both that Angelo Badalamenti
was embarrassingly my number one Spotify artist that year, but
also that Henry Kissinger died. And this Spotify Wrapped Day,
(40:29):
we woke up to the news that the UnitedHealthcare
CEO was gunned down in New York City. So, Spotify
Wrapped twenty twenty five: who's dying?
Speaker 2 (40:42):
Who's dying on.
Speaker 5 (40:45):
Spotify Wrapped Day? So this is, what, late November,
early December, we don't really know. Spotify Wrapped death day predictions.
Speaker 1 (40:55):
So long, farewell, auf Wiedersehen, goodbye, Mitch McConnell.
Speaker 2 (41:00):
Oh, that's a good one. That's an easy one. But okay,
I'll give it to you.
Speaker 5 (41:06):
I'm thinking, like, who's got to get through most of
the year but not finish it out. You know, it's tough.
Speaker 2 (41:13):
I'm gonna make my call: Recep Tayyip Erdogan.
You know, that's, that's my hope. That's a long shot.
I know, yes, he doesn't seem like he's in bad health,
but that's a big one.
Speaker 5 (41:23):
Kissinger was a long shot too, because he was, like,
arguably immortal.
Speaker 2 (41:27):
He'd kept living for so fucking long.
Speaker 1 (41:30):
Ah, so long, farewell, auf Wiedersehen, goodbye... Musk.
Speaker 2 (41:37):
I was gonna say that I think he might die.
Speaker 5 (41:40):
You think we're finally gonna get that drug overdose?
Speaker 4 (41:41):
So I just he just seems to be spiraling so
hard right.
Speaker 1 (41:45):
Now, the spiral's mad real.
Speaker 2 (41:47):
Yeah, he's getting everything he wants though, but I mean
that that also.
Speaker 5 (41:50):
It's It's true.
Speaker 2 (41:52):
Sometimes that's dangerous, Yeah, especially if you are addicted to
a drug that you can get in unlimited pure quantities
and no one will ever say no to handing it
to you.
Speaker 5 (42:02):
We have some more Musk predictions for later in the episode. Okay,
but I can conceive of some, you know, like, famously
the Secret Service, you know, not, not great at hiding
their own drug problems. I can, I can see, possibly,
with Musk entering a new level of comfort, maybe the
spiraling going a little, a little too far out of his control.
Speaker 2 (42:21):
He and two Secret Service agents are found dead with
fentanyl-infected blow.
Speaker 5 (42:28):
Maybe a SpaceX launch goes really wrong. Who's to say?
Who's to say? Damn, I gotta think of who,
who my Spotify Wrapped Day death is.
Speaker 6 (42:42):
I have a long shot. Oh yeah, My long shot
is that sometime on Spotify Wrapped Day, JK Rowling sees
a trans woman just, like, existing and gets so mad,
she has an aneurysm and dies.
Speaker 5 (42:55):
No, she's looking through the Spotify Wrappeds and she knows
that trans women make the best music, and she, she
sees it, gets so mad she just, she just keels over.
Speaker 4 (43:08):
She transvestigates every single female artist on the Spotify Wrapped
list and dies of sleep deprivation doing so.
Speaker 6 (43:15):
Her own fans start transvestigating her.
Speaker 2 (43:17):
This is the edge.
Speaker 5 (43:20):
Okay, I have a real long shot here, but I
can see how it could happen. So we're we're in.
We're in, like, what, month ten, eleven of
Trump term two, right? The right-wing Nazi content creators
are settling, are settling into their kind of groove.
Some of them aren't really happy at Trump not, like,
(43:42):
you know, carrying out all of his big,
lofty promises. And one disgruntled fan of Nick Fuentes
does something crazy on Spotify Wrapped Day. That's,
that's my prediction, is that somehow some, like, really weird,
like, stalker of a fan does something to Mister Fuentes.
(44:05):
Just pure prediction on, like, just, like, what would be
the oddest, oddest thing to happen, but something that could
totally make sense. Maybe it's, like, an old, like, Kanye fan,
you know, from the Kanye and Nick era.
Speaker 2 (44:16):
From his Nazi era.
Speaker 5 (44:17):
Yeah, yeah, I don't know. I feel like it's,
his fandom's getting close enough to pull some, like, weird,
crazy shit like that, on, like, a deeply
parasocially destructive level, like Stephen King's Misery. A
Misery happens to Nick Fuentes, but he doesn't, but he
doesn't make it, he doesn't make it out.
That's, that's my Spotify Wrapped prediction.
Speaker 2 (44:40):
I have said for years Nick Fuentes is going to
go down live, probably, maybe, maybe live. He's gonna go
down like George Lincoln Rockwell. It is not going to
be, like, an enemy of his that does it. It's
going to be a result of his incredibly messy personal life. Yeah,
like someone is going to take him down.
Like, it's, it's that. Yeah, yeah, that feels right.
Speaker 1 (45:03):
Do we have any non-categorized predictions? Is it that
time yet?
Speaker 2 (45:07):
Sure?
Speaker 5 (45:08):
Now that we have, we have finished our Spotify Wrapped predictions.
And I do not know who my top artist will be.
This last year it was Trent Reznor, so salute. That's, like, okay,
Garrison, the Challengers soundtrack, that thing fucking bops.
Speaker 1 (45:23):
I tried to make Robert watch that on the way
to, oh, is it the DNC or the
RNC? I don't remember, but he wouldn't
watch it with headphones, and so it was just on
on the plane. I think it was the DNC.
Speaker 5 (45:36):
That's terrible.
Speaker 2 (45:37):
Yeah, I think, I think I was reading Nick Land.
Speaker 5 (45:43):
Honestly, that's a vibe that actually pairs quite well.
Speaker 2 (45:47):
I landed completely deranged. It was great. Ready to work.
Speaker 1 (45:53):
A prediction. A prediction that I have is that like
Trump basically tries to move a lot of the
time he spends to Mar-a-Lago versus the White House. Like,
I feel like he's going to make Mar-a-Lago
some, like, national monument type shit so that he can
(46:13):
take whatever the fuck documents he wants from the White
House to Mar-a-Lago and spend as much time
there as he wants, and make that, like, a national
residence or some shit.
Speaker 5 (46:23):
The winter White House. Yeah, the whiter House, we could
call it.
Speaker 2 (46:29):
Yeah, so true, true, Garrison. No, I'm kind of interested
to watch what happens with AOC over the next year,
because she has definitely become, to a lot of folks
that are progressive and on the left, like, a villain over
the last year, And I kind of wouldn't be surprised if, like,
(46:51):
assuming there's still politics in twenty years, when we're
talking to young people, they think of her like Pelosi
and like, oh, you've got to understand when things started out,
this was a very different person. Yeah, yeah, Yeah. And
I'm not saying that's a fair way to characterize her
now or where she'll go. I'm just saying, like, I
wouldn't be shocked if that's the way a lot of
(47:12):
folks are looking at it in fucking a few years,
because I'm saying I'm hearing a lot of that now. Yeah,
People are very angry at her over, largely, Gaza. But yeah,
also the fact that she and Bernie both tried to
back Biden kind of, yeah, late in his, uh, senescence. Yeah. Okay.
Speaker 6 (47:30):
My My big one for the year is this is
the this is the year the economy finally collapses, Like,
this is the year you find out that no company
has made any fucking money in a decade. It's all
been being pumped up by like a deranged combination of
interest rate bullshit, a bunch of fucking money from like
overnight repo purchases, keeping the banks propped up. And I
(47:51):
don't know if it's gonna be the trade war that
fucking blows it up, although I think that will instantly
detonate everything.
Speaker 2 (47:56):
I don't know.
Speaker 6 (47:57):
Maybe it's, maybe it's the Chinese housing bubble, maybe the
tech bubble finally collapses. Maybe all three of them hit
at the same time. This is the year it all fucking goes.
I've never actually put my name down on this,
on the show, on any other fucking year. This is
the year the zombie economy will fall over dead. The
necromancy cannot hold.
Speaker 2 (48:14):
I guess my prediction is that the economy is going
to be basically identical to the Biden economy in that
we're going to get like fucked up inflation and people
are going to be very angry, and the number will
continue to go up on the stock market because that's
kind of what it's designed to do. That's my theory.
Speaker 1 (48:32):
And the housing market will still be trash.
Speaker 2 (48:35):
Yeah, we will never afford homes, and housing's just
gonna get more expensive. It will be interesting to see
Trump's, all of his backers and his whole media.
Like, one thing that will be easier for the left
is really hitting conservatives on inflation as it gets horrible
again or continues to suck, because that's, you know, at
(48:55):
this point, just a factor of the economy working as intended, yeah,
that they all have to pretend it isn't. Yeah.
Speaker 1 (49:03):
And before we go to a break, I just want
to say the price of eggs will go up.
Speaker 2 (49:07):
I need to get chickens now. Oh yeah, this bird
flu thing is not gonna help with eggs. Oh boy,
oh boy, get your eggs now. Buy one hundred, buy
thousands of dollars of eggs.
Speaker 4 (49:16):
Now, there was some kind of device to make eggs
that you could have in your own yard. God.
Speaker 1 (49:22):
My goodness, it's time for ads.
Speaker 5 (49:35):
I guess to piggyback off of Robert and Mia's predictions
there on the economy. My prediction is that once I
finally launch Cool Zone Coin this year, I'm gonna make
it big. If the economy is gonna go down, I
am gonna be going up. Everyone's gonna start buying Cool
Zone Coin because the US dollar becomes worthless. Bitcoin's gonna
(49:58):
crash too. It's fake, but Cool Zone Coin has real
fungible value.
Speaker 2 (50:03):
Well, yeah, the thing about Cool Zone Coin that makes
it different from all of the other crypto coins is
that it is really based on a fundamentally limited and
valuable resource, which is movies from the nineties that I
showed Garrison and they actually liked. So, you know, there's
there's only so many Cool Zone Coins that can be
in circulation.
Speaker 5 (50:24):
We're lucky I was in Portland this Christmas, because we
really stocked up on a few more of those nineties classics
to bump up the price of Cool Zone Coin going
into twenty twenty five.
Speaker 4 (50:31):
That's right, everybody. Wow, sell your house, buy Cool Zone Coin.
Speaker 2 (50:36):
Have you seen Hook, Garrison? I have seen Hook. I liked it,
of course. Good. Yeah, a classic.
Speaker 1 (50:43):
Have you seen The Wicker Man, nineteen seventy three?
Speaker 5 (50:46):
You know, I actually haven't. I've been waiting to catch
it in the theater.
Speaker 1 (50:49):
We will make this happen at some point. It's necessary.
Speaker 5 (50:52):
I would, I would love to. I would love to.
Speaker 4 (50:55):
I'll bet one thing I think is very predictable: border stuff.
They will stunt on a caravan of migrants, and
I think it's pretty easy for them to kind of
organize that and make that happen, and it will be
a way for Trump to flex his border fascism yeah,
much like he did in twenty eighteen. Maybe they'll wait
till the midterms again. There's always a fun border disaster
(51:15):
for the midterms.
Speaker 1 (51:16):
Could I just do one? That might not be a prediction,
but like a Sophie Hope.
Speaker 2 (51:20):
Sure, yeah, get it.
Speaker 1 (51:22):
Something has to happen to those Paul brothers.
Speaker 2 (51:25):
Oh, Sophie, Oh yeah, that's possible. Yeah. My prediction for
the Paul brothers is that one of them dies within
the next five years, and one of them lives to
be one hundred and seven.
Speaker 5 (51:34):
Oh that tracks sure.
Speaker 4 (51:37):
Yeah, they decide to take on Bob Dylan in a
boxing match and only one of them survives.
Speaker 2 (51:41):
I think Bob Dylan'll live through this next year.
Speaker 4 (51:43):
But I've just found Bob Dylan's tweets the purest thing.
He just tweets about what he's doing.
Speaker 2 (51:51):
What a hero.
Speaker 1 (51:52):
Netflix paying Jake Paul, like, billions of dollars to fight
nine hundred year old Mike Tyson and then Jake Paul
coming in on like a vintage car and spraying his
product and it having higher streaming numbers than the Super Bowl?
Speaker 5 (52:11):
Is that real?
Speaker 1 (52:12):
Yes?
Speaker 6 (52:13):
To be fair, that was a rancid Super Bowl.
Speaker 1 (52:17):
Rancid Super Bowl. This this cannot this cannot be.
Speaker 4 (52:21):
Most of us just tuned in on the off chance
that Jake Paul would die.
Speaker 5 (52:25):
Yes, that is true.
Speaker 4 (52:26):
That is true, or at least get bitten.
Speaker 1 (52:29):
Like Yeah, all of us were hoping that Mike Tyson
was not in fact sixty years old. But he is
sixty years old. So, uh, yeah. God, yeah, something,
something's gotta give. Oh, and there won't be a
left-wing Joe Rogan, thank you so much.
Speaker 2 (52:46):
Oh, I don't know, Sophie.
Speaker 5 (52:47):
I can as soon as we logicals so good, I
think we can.
Speaker 2 (52:51):
Really.
Speaker 1 (52:51):
Oh, there'll be there'll be somebody trying to be.
Speaker 2 (52:56):
Oh, Sophie, there already are. By the way, it's time for
me to do our new ad plug. You've heard of
how good elk meat is for you, and you've heard
of how liver is a superfood. Well now try new
elk liver steaks. It's just just ground up liver shoved
inside a steak. I send it through the mail through
FedEx five day delivery. It is not refrigerated in any way.
Speaker 5 (53:19):
No refrigeration. It's better at room temp, better at.
Speaker 4 (53:22):
Room temp to get the healthy bacteria. It gives you mystical powers.
Speaker 5 (53:33):
One of my, I guess, more hopes and still
partial predictions is that the National Guard gets into a scuffle
with Border Patrol in some kind of blue state.
Speaker 2 (53:42):
Yeah, yeah, good chance.
Speaker 5 (53:43):
Some brave and strong governor is gonna, is
gonna salute the troops and send out our proud National
Guard boys to fight off ICE. That's just a battle
I would love to see. I've wanted to see that
ever since Portland twenty twenty. I've wanted to see National
Guard troops fight against federal forces.
Speaker 2 (54:02):
Two groups of men who don't really know how to
use their guns, using their guns, No, it's gonna be amazing.
Speaker 5 (54:09):
It's a battle I've wanted to see for, like, five years.
Speaker 2 (54:12):
Whose plate carriers sit at the top, closer to their nipples?
It's anyone's game.
Speaker 5 (54:19):
I need to see it. I need to see it.
Come on.
Speaker 4 (54:23):
I would like to see it from a distance, because
that would be a shit show.
Speaker 2 (54:26):
Yeah, from a sizable distance.
Speaker 5 (54:28):
Yeah, General Whitmer, let's go.
Speaker 2 (54:30):
Let's go. I'm expensing a fucking telescope for that firefight. Yeah, yeah,
a periscope maybe. I trust the Iraqi Army more than
either of those sides.
Speaker 4 (54:41):
I've seen a lot of dudes fire guns while ducking
behind a KB, holding the gun.
Speaker 1 (54:44):
I love you.
Speaker 2 (54:45):
And then it does look fun. It does look fun.
Speaker 5 (54:47):
It is.
Speaker 4 (54:48):
Yeah, definitely, I would like to do that. They kick
me out of the range every time because of woke. How sad.
Speaker 5 (54:53):
Well, not anymore, James. Yeah, that's also the casualties yet.
Not any more, James. Woke is beaten.
Speaker 2 (55:00):
That's right. Yeah, they went woke, they went broke.
Speaker 4 (55:02):
I'm going to buy the range, that's right, and we'll
all fire from behind the buttress.
Speaker 5 (55:06):
Now, oh mama.
Speaker 3 (55:09):
Yeah.
Speaker 2 (55:09):
Yeah.
Speaker 4 (55:10):
Other predictions: maybe we'll get a good, solid couple of
weeks of rioting again, and as Garrison said, like, maybe
it'll last me a year or two this time.
Speaker 5 (55:18):
I don't think that anymore.
Speaker 2 (55:20):
Something will have to change. Yeah, there will have to
be a material change in either organizing or social conditions,
because people will need to either be vastly more desperate
than they are right now, or they will need to
have a specific reason to think, well, this time, getting
out in the street might do something. Yeah.
Speaker 5 (55:41):
I think we're gonna kind of continue the trends that
we've been seeing, which point towards a bit
of an apathy towards like big popular mobilizations and
more towards kind of bizarre lone wolf attacks. Something that,
you know, could be even a slightly probable or,
you know, possibly darker prediction: I think we'll have like
(56:03):
a really bad Luigi copycat within the next like four months. Sure, yes,
the year of Luigi. Like, it's not gonna be good.
Speaker 2 (56:13):
It's not gonna be good. There's probably gonna be a
situation where some guy... the best case is
that he gets killed immediately by the dude's security. The
worst case is there's a big public firefight and a
whole fuckload of people get hit.
Speaker 4 (56:26):
Yeah, didn't I predict that there would be a big
public crime with a 3D printed gun last year?
Speaker 5 (56:30):
I think that was the year before we talked about that.
Speaker 2 (56:34):
Oh damn, okay, so close and.
Speaker 5 (56:37):
Yeah, you know, I mean this, this certainly does kind
of fit that mold. We'll see how much that like
gets focused on in the trial and like continued reporting.
Speaker 4 (56:47):
Yeah, and in that legislation too. I missed a death.
We can also include it in the hope section: Matt Yglesias.
That motherfucker, motherfucker has been spouting bullshit for twenty years.
I think he knows he's lost the
juice a little bit. I think he's on the
way out.
Speaker 2 (57:04):
All right. Something very funny did just happen that we
should talk about as a team. Senator Doug Mastriano,
a thirty year US Army veteran who taught at the
War College, just tweeted an indignant, furious tweet about the
US government not being honest with Americans about like what's
happening with these drones. And yeah, and the picture of
(57:25):
the crashed drone is a TIE fighter. That's like a
model TIE fighter on the bed of a flatbed.
Speaker 5 (57:31):
Yes, we've all lost our minds.
Speaker 2 (57:36):
God, at the US Army War College, they're not sending
their best people. Oh fuck, that's funny, amazing stuff. That's
one of the best things I've seen out here. Oh
goodness me.
Speaker 5 (57:53):
Finally, I'd like the closer predictions a little bit on
Trump's cabinet. I think it's pretty, pretty safe to say,
considering his last presidency, we'll have at least one third
cabinet turnover by the end of the year. Yeah, this
is something that we've been talking about a lot: when
do we think Musk is gonna get the boot? And
based on the way Trump's kind of positioned him, I'm
not sure if it's gonna be as soon as what
(58:14):
we all kind of initially thought, because Trump has kept
him out of his inner orbit but pretty solidly in
his middle orbit. Like, he's not in any like real position, right? Yeah,
he has DOGE. Like, come on.
Speaker 2 (58:27):
It just came out that he's not going to be
able to get the highest security clearance. There
you go, that's funny.
Speaker 1 (58:34):
But like he has him sitting next to his family
at Thanksgiving totally.
Speaker 5 (58:38):
Yeah, yeah, no, no, totally. And especially in like the
three weeks after the election, they were like, they were
like honeymooning, right, they were neck and neck, and that's
gonna like start dissipating. Musk can't get fully booted
out, because, like, you know, the federal government needs SpaceX,
unlike Musk's other like technologies, so like they
(58:58):
will remain friendly, but like they're not going to be
in the close position that they are now. I initially
put that date at March twentieth, twenty twenty five,
you know, about two months after inauguration day. It's
enough time, you know, for someone like
Trump to get tired of Musk's like personality. But I
think I might stretch that out a little bit more
(59:19):
now than my initial prediction. I think they
might do a little bit more of a long term
game here. But that also means that Musk maybe
will not have as much like constant influence as what
it was first looking like in those, you know,
three months after the election.
Speaker 2 (59:34):
I think that RFK Junior is probably pushed out
of the picture before Musk is.
Speaker 1 (59:39):
Yeah, if he tries to get rid of the fucking
polio vaccine, it's gonna be a real quick trip to
the unemployment line for Bobby Boy.
Speaker 2 (59:48):
Yeah, I really... I don't think Trump's that reckless. No,
like, that would be quite, quite a line to cross,
getting rid of the polio vaccine.
Speaker 4 (59:59):
Trump is, like, so old, he remembers it. He's old.
Speaker 1 (01:00:04):
But if, but if RFK Junior could get the wheat
ingredient out of the McDonald's fries, I'd be most obliged.
Speaker 5 (01:00:11):
Oh yeah, no, I'm, I'm sure that he's gonna, he's
gonna reverse one hundred years of corn subsidies and get
the corn syrup out of our Coca-Cola. I believe in RFK.
Speaker 2 (01:00:20):
Yeah, I feel pretty good about the continuing legality of
kratom as long as he's the HHS head.
Speaker 5 (01:00:26):
There you go.
Speaker 2 (01:00:27):
All it's gonna take is one of Joe Rogan's friends
speaking in his ear.
Speaker 5 (01:00:32):
We'll be all right, We're gonna have legally required DMT
for everyone in the country.
Speaker 2 (01:00:38):
Yeah, why not? I think we need and I've been
I've been saying this for years. We need to put
the lithium back in the water. We also need to
use those crop dusting planes and just like fill them
with xanax. Just just just calm everyone down, take everything
back a couple of steps.
Speaker 1 (01:00:56):
All right, I'm gonna go pet some dogs, so the
podcast is over. Happy New Year, everyone. Happy, happy New Year, everyone.
I do want everyone to pick one thing
that they're gonna do this year that will improve their life,
however small.
Speaker 5 (01:01:10):
For me, I'm gonna get a new mirror. We're gonna
all pick one thing. We call that Project twenty twenty five.
It's what one thing we can do to improve our
lives and you know, and then by extension, the lives
of everyone else around us. So make sure everyone has
their own personal project twenty twenty five going into this
next year. I think we will need it.
Speaker 1 (01:01:28):
Yeah, I'm holding my Project twenty twenty five in my
arms right now.
Speaker 5 (01:01:32):
Your new dog, your new dog I adopted.
Speaker 1 (01:01:34):
I adopted Anderson a sibling, and her name is Truman.
Speaker 2 (01:01:37):
Lovely, after our greatest US president.
Speaker 1 (01:01:41):
Not after the greatest US president. I would never name a child
of mine after a president.
Speaker 5 (01:01:46):
After the sheriff in Twin Peaks, that's right. Also, no,
all right, well we love that.
Speaker 4 (01:01:51):
After the dude who grew up in the
Truman Show. Gaetz, is it Gaetz?
Speaker 2 (01:01:58):
Yeah, you named her after Matt Gaetz.
Speaker 5 (01:01:59):
Tildhoods is like totally unoven job. Now that's so funny.
Speaker 1 (01:02:04):
It's very funny. It's very, very funny, and I feel
like, I feel like we should end on that note.
So ha to Matt Gaetz. Anyways, Anderson, Truman, peace the
fuck out of here.
Speaker 2 (01:02:16):
Welcome back to It Could Happen Here, a
podcast about it, which in this week's case is the
(01:02:36):
Consumer Electronics Show, happening here. And yeah, we're here
to talk about things falling apart. And again, in this
case that's the tech industry, because the story this CES,
as it has been for the last several CESes, is
the continuing degradation of big tech as it
seeks more places to get money from while providing less
(01:02:57):
and less utility to the people that it needs to
give it money. And every CES at some point I
find myself face to face with something that makes me say,
I've now seen the silliest thing I've ever seen. And
this year, that experience happened for the first time within
thirty minutes of the first half day. And I'm going
to talk about that and show some videos to my
(01:03:18):
panelists here, which of course are the great Ed Zitron,
It's me, I'm here, the pretty good Garrison Davis.
Speaker 5 (01:03:24):
Okay, thanks, okay, all right, all right, Boddy them.
Speaker 2 (01:03:28):
And the supernumerate supernumerary. I'm sorry, I messed up the
word I was using as a superlative to praise you.
Speaker 5 (01:03:35):
I'll take it anyways.
Speaker 2 (01:03:36):
So Junior, thanks Ed, Thank you so much for joining us. Everybody,
are you ready to see some of the dumbest AI
generated videos?
Speaker 7 (01:03:43):
Sure, fill me with more pleasure.
Speaker 2 (01:03:46):
Excellent, excellent.
Speaker 8 (01:03:47):
Nothing fills me with pleasure.
Speaker 2 (01:03:48):
So the first panel I sat down with today, at
ten a.m. in the goddamn morning, Jesus, was the Hollywood
Trajectory: Generative AI Timeline twenty twenty five to twenty thirty.
Speaker 5 (01:03:58):
Oh boy. I am fascinated by what they think will
happen in twenty thirty.
Speaker 2 (01:04:03):
Everything's just gonna get better, Garrison. This panel featured a
number of luminary thinkers, including Mary Hamilton, a managing director
at Accenture, who announced her company's three billion dollar
investment in AI by dropping this gem.
Speaker 8 (01:04:16):
I have a digital twin.
Speaker 5 (01:04:17):
And she's constantly evolving in how she gets used and
what she says, and.
Speaker 9 (01:04:23):
You know there's you know, big educations around that.
Speaker 5 (01:04:26):
So I think this is a really exciting space to
be thinking.
Speaker 7 (01:04:29):
Did anyone else notice that she just stole Holly Herndon's thing?
Speaker 10 (01:04:32):
But okay, if they'd said that to a doctor, they'd
think they had a concussion.
Speaker 2 (01:04:38):
The shere would this person needs?
Speaker 5 (01:04:41):
Like psychologically, Yeah, you.
Speaker 2 (01:04:42):
Should be allowed to drive.
Speaker 11 (01:04:44):
You need a trying kid.
Speaker 8 (01:04:46):
Okay, let's get you, let's get you sit down, and.
Speaker 2 (01:04:49):
We're taking the phone away from you now. I think
this is very silly, because again, I think, yeah, it's
just a fundamental mismatch in what people might want from
an AI agent and, like, the way in which
they get talked about.
Speaker 8 (01:05:01):
But also they used digital twin, which is an enterprise
software term.
Speaker 2 (01:05:04):
Shit. Yeah, oh my god. Yeah, it's, it's... I'm
excited to go see some digital twin technology that I'm
sure will make a cheap egg code switching.
Speaker 5 (01:05:14):
This was the first thing I reported on
at CES, was the digital twin. Like back
in like twenty twenty-one, twenty twenty-two, there was like
one single company in all of CES that was
promising like a digital twin, and now it's like every
other company.
Speaker 2 (01:05:27):
Yes. It means so many different things.
Speaker 8 (01:05:28):
It means literally a digital representation of anything.
Speaker 2 (01:05:31):
It doesn't even mean an AI agent.
Speaker 8 (01:05:32):
The fact that they're using it in the wrong place
is very annoying to me.
Speaker 2 (01:05:35):
Yeah. I keep seeing, like, they can now make
an AI chatbot trained off of your social media presence
that's eighty five percent accurate.
Speaker 12 (01:05:42):
Oh, as all twins are.
Speaker 2 (01:05:45):
And I want to say, I know they can't. But
then you talk to the average person at CES or
the average panelist on this particular panel, and I'm like, yes,
I do believe. In fact, everyone on that panel you
could accurately, you could accurately get eighty five percent of
their personality with an AI bot, for a bit, you know,
maybe a lot higher. An improvement. Yeah. Yeah. So I will say,
(01:06:08):
like that was silly. That's not the silliest thing I saw.
The silliest thing I saw came courtesy of another panelist,
Jason Zada, founder of Secret Level and COO of the company.
The videos that Jason came to CES to brag about
were a collection of the laziest AI slop ever to
stain human eyeballs. His most recent big success, that you
(01:06:29):
could just see radiating off of him how proud he
was of this, was Coca-Cola's annual Christmas ad, which
last year was produced for the first time entirely with AI.
And I'm just gonna... if you haven't seen this, who
here's seen Coca-Cola's AI ad?
Speaker 5 (01:06:46):
Yeah, I've seen pictures. I think I've watched one.
Speaker 2 (01:06:50):
Okay, well, let's, let's watch it a few times. We're
gonna play... there's three different versions of this.
Speaker 8 (01:06:57):
So why we're just gonna I mean, that's that's what
it's about.
Speaker 10 (01:06:59):
Out Oh my God, if there's three different versions that
that's just they saved the from.
Speaker 8 (01:07:09):
Every one is the same length of shot.
Speaker 2 (01:07:20):
Can you believe this song's AI generated? I can't believe it.
Could they teach a computer to write the lyrics,
holidays are coming?
Speaker 10 (01:07:29):
I just can't believe we finally have the technology to
have three trucks driving somewhere.
Speaker 2 (01:07:33):
And there wagging its tail with a dead eye too
horrible girls move trucks with Coca cola and them driving
down not a street. Raccoons.
Speaker 8 (01:07:48):
What the why is there a satellite?
Speaker 10 (01:07:50):
Are they going to drop the ion cannon on the
polar bears?
Speaker 2 (01:07:58):
It's all clearly... it's all glowing, like the city
shots of like snow-covered villages. And that, as we're
going to see in later videos... AI loves putting
smoke and random fires where there should not be smoke.
Speaker 5 (01:08:13):
Random fire, Chris. That's such a bad omen for four
more years of a Trump presidency. It's so bleak that
we have like even uglier Thomas Kinkade-esque artwork.
Speaker 2 (01:08:24):
That's all every frame looks like animated.
Speaker 5 (01:08:27):
It's like they just generated a Thomas Kinkade-like frame
and then like badly animated it, and.
Speaker 10 (01:08:32):
The way that they move is very weird, like it
looks kind of right but kind of not right. It looks very wrong.
Speaker 2 (01:08:37):
It does that in all of the scenes, because it's like
showing you a bunch of... you see like a polar bear,
obviously, it's a Coca-Cola Christmas ad. You see like
a fucking reindeer, you see squirrels, you see a dog.
But it always is like this very AI shot where
it just pans across the animal and it's like glowing
and kind of glossy and staring. But they're not
(01:08:57):
going anywhere with the movement.
Speaker 8 (01:08:59):
It's just like they're doing something and that's it.
Speaker 2 (01:09:01):
Yeah.
Speaker 10 (01:09:01):
You think in ten years they're still gonna have these commercials?
Yeah, no, because where's the smoke? It's just a polar
bear walking around like...
Speaker 2 (01:09:08):
System one, which tests emotional responses to ads, claims that
the initial response to their Christmas ad was overwhelmingly positive.
Speaker 5 (01:09:15):
I don't think they're lying about that. I think if
you walked up to someone like randomly on the street
and showed them this, I think they'd be like, oh, yeah,
it looks fine.
Speaker 2 (01:09:24):
Yeah.
Speaker 8 (01:09:25):
No one's watching a Coca-Cola ad and being like, yeah, wow,
I've never had one of these before.
Speaker 2 (01:09:30):
Yeah, it's never a new experience. Not yet.
Speaker 7 (01:09:33):
We need an ad man.
Speaker 8 (01:09:34):
Need an ad man for the coke holdouts, we.
Speaker 7 (01:09:37):
Need an AI Don Draper.
Speaker 5 (01:09:39):
Yeah, well, do not give them ideas.
Speaker 2 (01:09:42):
What if a.
Speaker 8 (01:09:43):
Company lost five billion dollars.
Speaker 2 (01:09:44):
It's just an AI that doesn't work.
Speaker 7 (01:09:45):
Instead of going to the movies like Don Draper does
throughout all.
Speaker 12 (01:09:48):
Of mad Man, it just doesn't work and respond to
any of your queries.
Speaker 2 (01:09:51):
Just Don Draper spending hours watching that looping Christmas video.
Speaker 8 (01:09:56):
Staring at nothingness.
Speaker 2 (01:09:57):
Yeah. So there was, like, an immediate, pretty immediate backlash
to this, like, all of the responses... If you go
to any of, like, where these things live on YouTube,
it's just people shitting on them. Which he did acknowledge,
Jason, by saying the video was very debated.
Speaker 8 (01:10:13):
Yes, classic thing with commercials.
Speaker 2 (01:10:15):
We love debating commercials.
Speaker 5 (01:10:16):
Many things are very debated these days.
Speaker 7 (01:10:18):
A lot of people are saying.
Speaker 2 (01:10:19):
And then he showed us next an AI generated video,
The Heist, which was entirely made from a text script
that itself was mostly written by ChatGPT. And here's
how Jason describes the workflow for what you're about to see:
It took thousands of generations to get the final film,
but I'm absolutely blown away by the quality, the consistency,
(01:10:39):
and adherence to the original prompt. When I described gritty
New York City in the eighties, it delivered in spades consistently.
While this is not perfect, it is hands down the
best video generation model out there by a long shot. Additionally,
it's important: no VFX, no cleanup, no color correction has
(01:11:00):
been added. Everything is straight out of Veo 2, Google
DeepMind. So what is the model? Veo 2, Google
DeepMind, I think, is what he's saying.
Speaker 8 (01:11:08):
It is, so I thought that I had another one.
By the way, I'm sure what you're about to show
me looks like a dog's.
Speaker 2 (01:11:12):
Ass. It looks like, yeah, New York, exactly like New
York under Giuliani right before he came in and cleaned it up.
Speaker 5 (01:11:20):
So this is like the competitor to Sora, I guess.
That's the other big like video generation brand now.
Speaker 2 (01:11:26):
I don't buy it for a fucking second, and I'm not impressed.
But we'll see what you guys think. Okay, I don't
want to poison your... I wouldn't.
Speaker 8 (01:11:34):
Oh god, okay, there is fire in this.
Speaker 2 (01:11:37):
This The last time you're gonna see the sack full
of money. It does not shine again.
Speaker 5 (01:11:41):
It's a lot of, a lot of fire, random fire,
and wheels go...
Speaker 8 (01:11:45):
Backwards when they're driving forwards.
Speaker 2 (01:11:47):
Wheels again. Another street fire.
Speaker 8 (01:11:52):
I would love to do freeze frames on this.
Speaker 2 (01:11:54):
Actually, it's in Gosthel. Why are there so many fires?
Speaker 7 (01:11:59):
Just all right, let's take a shot every time?
Speaker 2 (01:12:01):
Oh my god, and also take a shot every time
he is wearing different clothing and has a clearly different face.
The car has changed. Come on, he's praising the consistency, and
he is dressed completely differently every scene.
Speaker 5 (01:12:14):
His jacket has, has changed since the last one.
Speaker 2 (01:12:17):
Yeah, yeah. And again, the cop car, the cars... when
it shows the cars dragging across the screen, they're kind
of doing the same thing, usually, that the animals do
in the Coke ad.
Speaker 7 (01:12:26):
And minimal motion at best.
Speaker 2 (01:12:30):
Yeah. I also love this. Can you believe this music?
Speaker 10 (01:12:36):
I also want to just say, when he swerved and hit
that thing, he was driving like half a mile an hour.
Speaker 5 (01:12:41):
Yeah, that's how I run.
Speaker 2 (01:12:43):
Yeah, look an obviously different man.
Speaker 5 (01:12:46):
That's by the way he runs was like he had
his arms.
Speaker 2 (01:12:52):
Looks cops are three kes actually, look how they run.
Speaker 5 (01:12:57):
The running is very funny.
Speaker 2 (01:12:58):
Yeah, the bond his seat Yeah different, okay.
Speaker 5 (01:13:01):
What is going on with his feet? And that... different
levels of facial hair, different, different jackets, he's wearing different colored jackets.
Speaker 12 (01:13:09):
Vaguely definiteness and in this SA just move.
Speaker 2 (01:13:13):
What the fuck is going on? Oh my god, it
got me. Yeah, directed by Jason Zada in big flaming
words, because again, the AI only knows how to put
random fires on.
Speaker 10 (01:13:24):
Wow, I'm so glad that we have the technology to
have a thing where a guy gets chased by the police.
Speaker 2 (01:13:28):
Yeah, we couldn't. This would have been impossible before.
Speaker 10 (01:13:30):
As he runs at anywhere from one to one hundred
miles an hour.
Speaker 5 (01:13:33):
I assume they just trained they like this was specifically
like pulling on like Scorsese movies a lot.
Speaker 10 (01:13:39):
I just want to know about these thousands of generations
of script because.
Speaker 5 (01:13:42):
That is interesting.
Speaker 10 (01:13:43):
I am very curious because I just don't believe that
for did he just uh read there?
Speaker 12 (01:13:49):
Yeah no, that's the opening crawl to just like some
uh generated Star Wars.
Speaker 5 (01:13:56):
It seems like shot by shot, right, each each shot
is going to require a lot of.
Speaker 2 (01:13:59):
Like iteration the script.
Speaker 5 (01:14:02):
It's just, yeah. I mean, again, like, unpacking what
he actually is saying is unclear.
Speaker 2 (01:14:07):
Because I went to the YouTube video for this, and
the first four or five comments are: looks like we
found the new king of video. Jesus Christ, give it
a rest. Clothes change in every shot. Four to six
year old boys are gonna love it. And: still lacks
character and vehicle consistency, but we're getting close.
Speaker 5 (01:14:27):
Which is which is the exact?
Speaker 2 (01:14:30):
By twenty thirty, you'll make a man wear the same
clothes for an entire video. Oh, this has
happened before, with Sora.
Speaker 10 (01:14:38):
When they put Sora out, they're like, check out Air Head,
oh man, God, and the balloon changes every single shot.
Speaker 8 (01:14:45):
It's a different size and color each time.
Speaker 10 (01:14:47):
There are just people running in the background sometimes and
then they made a new one. You're like, oh, this
is gonna be good. It was worse and less consistent,
and this is what they think of us. They're like,
these pigs will slop up anything.
Speaker 4 (01:14:59):
You.
Speaker 2 (01:15:00):
I can't expect technology to do something as complicated as
dress a man in clothing and have him stay in
that same clothing over multiple scenes. Hollywood never figured it
out. So cool.
Speaker 10 (01:15:09):
That this costs like so much money as well just
burning there's some fucking gpu melting and a day just
enter in Arizona.
Speaker 2 (01:15:17):
The strain learning North Carolina.
Speaker 12 (01:15:19):
It is also there's gonna be like thirty forty companies
trying to recreate the same misshapen wheel you know, for
the next five days.
Speaker 10 (01:15:27):
Also, the little pigs that watch Star Wars, including myself,
they'll notice every minor inconsistency. Do you think that they're
going to tolerate Luke Skywalkers and Wattos and all their favorite
Speaker 8 (01:15:37):
Characters they're going to drive? Do you think that they're
going to be happy office with a cyber truck?
Speaker 2 (01:15:41):
That's a Cybertruck situation. I think the issues
are twofold, which is like, number one, in order to
make this shit sell to the people who watch movies,
you have to dramatically reduce the average intelligence of people
watching movies. You have to give everyone brain damage. Which,
except they are.
Speaker 7 (01:15:57):
Working in Giant Yeah.
Speaker 2 (01:15:59):
And the other thing is, the models have to get
much better. And Jason made a point that, like, look,
every time people would like talk about the criticism, he'd
be like, look, this is the worst it's gonna look, guys.
And I was just looking into it: GPT-4 took
fifty times as many resources, and like fifty times as
much energy, to train as GPT-3 did. So these
(01:16:19):
are the kind of like exponential increases that
we're looking at. So like, if it took them so
many millions, billions of dollars of investment to get to
the point where they can make this shitty video, to
make anything close to watchable, you're talking about, again, just
like lighting on fire billions of dollars, to do what?
To make a scene that you could already get, like...
(01:16:41):
a twenty six year old dude who grew up watching
fucking Quentin Tarantino movies and taking cocaine, and you could
give him sixty thousand dollars and he'll film that shit
for you with an old car. Like.
Speaker 5 (01:16:52):
Yeah, I mean you could. You could even like animate it.
Speaker 12 (01:16:55):
M I mean, look, you give me a PS four
and somebody's grandmother and I will make them think that
they're watching that.
Speaker 5 (01:17:01):
No, seriously, seriously that six.
Speaker 10 (01:17:03):
But also, this... I just want to read out some
of the fucking people that use this model. We started
working with creatives like Donald Glover, who I said was
washed ten years ago.
Speaker 2 (01:17:11):
I'm fucking sick of people.
Speaker 7 (01:17:13):
Awaken, My Love! was a, was a good album.
Speaker 5 (01:17:16):
This Is America is an objectively bad song.
Speaker 2 (01:17:18):
It's a bad song with a great video.
Speaker 8 (01:17:20):
Yeah, yeah, I thought his like kind of R&B
stuff is
Speaker 10 (01:17:23):
Very interesting anyway, moving and of course the week in
it so weekend and someone called great. I'll work with
creators on VO one and form the development of VO two,
and we look forward to working with trusted testers and
creators to get feedback on this new model.
Speaker 8 (01:17:38):
How long are you going to get fucking feedback? It stinks.
Speaker 2 (01:17:41):
We've got some feedback from Yeah, I got a few thoughts.
Speaker 5 (01:17:44):
Hopefully those people are just getting paid to tell
them words and be like, yeah, sure, I'll take your money.
Speaker 2 (01:17:49):
Yeah, if they pay me twenty million dollars, I'm flipping
the whole... like, just, no, I will turn on a dime.
Speaking of turning on a dime for money, here's
Speaker 13 (01:17:58):
Ads Ah, we're back.
Speaker 2 (01:18:12):
So the next video that our friend, I now feel
he's like a brother to me, Jason puts on, was
of an AI generated fictional elderly rock star talking about
death, plastic and incapable of dynamic expression
as he guzzles randomly from bottles of liquor that flash
in and out of existence. Sometimes he lies on his
(01:18:35):
back in empty streets while talking about all, all
of the CGI featureless women that he has loved in
his exciting life. Other times he plays stadium shows while
obvious GPT-written dialogue about aging and death drones on.
When the video ends, everybody in the room claps. And
as you watch this, I need you to imagine seeing the
thing that I'm about to show you all in a
(01:18:56):
room with like two hundred people in it, all clapping
enthusiastically. I don't think I did. I did it.
I did. I said, come the fuck on, as flat
as I could rise, a skywalk up. Yeah. So here's
Fade Out, and an old man. Yeah, it looks a
little bit like George Carlin. It's the end. Okay, three?
Speaker 6 (01:19:20):
Can you chest like the world's just you, God damn
big and you're just to go passing through them?
Speaker 2 (01:19:29):
What's he doing?
Speaker 10 (01:19:30):
He carried my heart... Granddad, calm down. It's scattered.
I love these flash cuts, the fast cuts, these Fosse cuts,
because the next frame was unusable.
Speaker 5 (01:19:40):
Yes, actually yes.
Speaker 2 (01:19:42):
Like that, he drank and the bottle changed in his hand.
You could see it starting to happen.
anonymous with destroyed It? Just a beautiful music. Listen to
that lived It to the bow?
Speaker 5 (01:19:54):
Could you believe.
Speaker 2 (01:19:56):
By firing a Roman candles time?
Speaker 5 (01:20:01):
I like so?
Speaker 10 (01:20:01):
The old man does look very different each time, very
different old man.
Speaker 5 (01:20:06):
That's a different that's a different guy.
Speaker 2 (01:20:08):
Yeah, that's the Emperor from the first Gladiator, trotting, get running.
Speaker 5 (01:20:15):
Away from this the way this model generates running diesel.
Speaker 2 (01:20:20):
There he is, drinking on the fire... old rock star,
drinking in front of a flaming house. The AI
loves burning buildings.
Speaker 5 (01:20:30):
What is this voice? I would love to track his
tattoos from three.
Speaker 10 (01:20:35):
We'll say he's about to eat the micro different, I've
done it, yum.
Speaker 2 (01:20:40):
Now he's sleeping in a broken Mustang, the classic Ferrari
Mustang... Mustang. It's in like a pool in front of
a mansion, but he clearly isn't positioned in it. The
car is hovering slightly over the pool. Like, I love this,
I love this, I love him. And he tells us,
he tells us during this, as if we're supposed to
(01:21:02):
be impressed, that ChatGPT wrote seventy five percent of it. And that's fucking hell.
Speaker 5 (01:21:08):
I can't believe that.
Speaker 12 (01:21:10):
Frankly, as a bartender, I regret walking into the room
to see if people want drinks.
Speaker 8 (01:21:17):
This is a bartender.
Speaker 2 (01:21:19):
I apologize. I apologize that you had to hit a drink.
Speaker 5 (01:21:22):
I also would like, actually, can I have a drink too?
Speaker 2 (01:21:24):
We are in the...
Speaker 10 (01:21:26):
CES suite, and we're all drinking, because I just want
to say, I'm fucking dissociating after that. I'm so fucking...
every year of doing this nonsense, and
I look at these shit eaters and they show us that,
and they like slop down the slop.
Speaker 2 (01:21:39):
Oh my god, it's it's it's hitting the easiest.
Speaker 8 (01:21:42):
Things to find an old man that drinks.
Speaker 2 (01:21:44):
For an idea of like how real this company is:
obviously, they were one of the companies... they were not
the only people who made that Coca-Cola ad. They
were one of like three or four companies.
Speaker 5 (01:21:52):
It takes four companies to make that.
Speaker 2 (01:21:55):
Can't believe it. They have six hundred and twenty two
followers on Twitter. Hell yes. Or not Twitter, on YouTube,
on YouTube, on YouTube, on YouTube. I don't know... and
I post this karaoke song. And this, this Fade Out
is their, or sorry, The Heist is their most successful video,
with fifty six thousand views. Fade Out, which we just watched,
has less than five thousand views. They're not ready, so
(01:22:17):
they're not, they're not quite...
Speaker 5 (01:22:19):
It's only going to get better.
Speaker 2 (01:22:20):
Yeah, it's only going to get get only going to
get better.
Speaker 5 (01:22:23):
Obviously, things will only get.
Speaker 7 (01:22:25):
Read floor for a small price of one billion dollars.
Speaker 8 (01:22:28):
It's just like one hundred thousand dollars to compute.
Speaker 2 (01:22:30):
Yeah, imagine how good it would be with a billion.
It will only get worth more. I mean, I get... now, Garrison,
I do think you should invest all of your salary.
Speaker 8 (01:22:40):
I just did a sixty minute episode talking about this.
Speaker 10 (01:22:43):
I think I would rather... hook To has a more
obvious use case than this shit. Hey, do you want
to spend way more money to get something way worse?
I actually can't get over the seventy five percent ChatGPT.
Speaker 2 (01:22:55):
That should be more twenty.
Speaker 8 (01:22:57):
No, it should be... theoretically it should be, it should
be one
Speaker 10 (01:23:00):
Hundred, should be hundred percent, which means that a quarter
a quarture of it was just fucking unusable.
Speaker 2 (01:23:06):
No.
Speaker 5 (01:23:07):
Absolutely, they're generating like individual shots that they're like stitching
together and like who knows how how long it takes
to like get like the prompt right for that shot
to work.
Speaker 2 (01:23:16):
However long it takes, it was too long, because it
looks like shit. We're gonna watch a video I haven't
seen yet, or at least not all of it, of course, because it's five minutes,
so we're not watching all of this.
Speaker 5 (01:23:23):
Oh my god.
Speaker 2 (01:23:24):
It's two hundred and fifty two views and came out
a week ago. It's called Minimonade.
Speaker 5 (01:23:30):
What Yeah, it's a word.
Speaker 10 (01:23:34):
Now it's like when you find your cat's vomited on
the floor again.
Speaker 2 (01:23:38):
So first we see a diner called Minimonade that appears
to be both on fire and...
Speaker 8 (01:23:42):
Blade Runner. Yeah, Blade Runner, oh god.
Speaker 2 (01:23:44):
And an old lady rises up out of a
pile of ashes. That's how mouths work.
Speaker 5 (01:23:51):
Where am I.
Speaker 2 (01:23:54):
Great? AI voice?
Speaker 8 (01:23:55):
What is this phantasmagoria? Or the voice acting? It's me,
Harrison Ford. The fuck is going on with the teeth?
Speaker 2 (01:24:06):
What I think this is death? This old lady is dead.
That's how I.
Speaker 5 (01:24:14):
Now.
Speaker 2 (01:24:14):
She's tripping on tomatoes. The decaying, sandy diner that exploded
has turned into a lively fifties diner, off of...
Speaker 8 (01:24:24):
Denis Villeneuve.
Speaker 7 (01:24:25):
Is this a segregated diner?
Speaker 2 (01:24:28):
I only see white people. Going back to the good old days.
Speaker 5 (01:24:32):
Yeah, yeah, I know.
Speaker 2 (01:24:33):
There's a little Indian boy... he is, he is the
help, though. Yeah, mm hmm. Oh, that's not... The little
kid just fell down, and the way it shows falling
is that he just sort of deflates.
Speaker 7 (01:24:48):
And he's up again. The action is staring at.
Speaker 2 (01:24:52):
Well, that's terrible. We don't need to watch any more of that.
No one, no one, no one would watch
this and have a positive reaction.
Speaker 8 (01:24:59):
They should, They should keep you in a holding cell.
Speaker 2 (01:25:01):
Yeah, I'm deeply unhappy about the time we already spent watching.
Speaker 8 (01:25:04):
Yeah, like, we don't know what you're gonna do next.
Speaker 7 (01:25:06):
We're building a facility for you.
Speaker 2 (01:25:08):
Yeah. The phrase reality distortion field gets used a lot
when we talk about tech, but I really tasted it
in that room, because all anyone on stage could talk
about is how good it looks. And at every one of
these videos, people are like clapping. They're like, wow, this
is amazing.
Speaker 5 (01:25:25):
Why do you think they think it looks good?
Speaker 8 (01:25:28):
It looks better than an Xbox.
Speaker 10 (01:25:30):
Yeah. And the idea is, you typed a thing in
and now a thing came out, and that's magical.
Speaker 7 (01:25:35):
By virtue of not having humans work on it?
Speaker 2 (01:25:37):
It's so... it's better than you'd have... Yeah. Okay. There
was a moment after this where Jason like joked about
how, like, obviously I don't want to
replace actors, yet, yeah, yeah. And another panelist was like,
I think we're gonna have
to see how some decisions go as to fair use,
because obviously this is cribbing from a bunch of, fuck, Scorsese,
(01:26:01):
like it kind of looked like... yeah, and Thomas
Kinkade, and.
Speaker 5 (01:26:05):
Blade Runner 2049, and Denis Villeneuve in general,
like all of his films have been like a massive
source for these motion and still generations, so much
so that like I think Blade Runner 2049
is like one of the easiest films to like
replicate film stills from almost exactly, based on like
how, like how load-bearing that film has been
for a whole bunch of these models. That could be
(01:26:27):
due to a number of factors.
Speaker 2 (01:26:28):
Now I know what you're wondering, How soon until we
can get a full ninety minute movie that looks like this?
Speaker 1 (01:26:34):
Oh?
Speaker 5 (01:26:34):
I mean I'm guessing days away.
Speaker 2 (01:26:36):
No, no, Jason said, probably not at least for a
decade or so.
Speaker 5 (01:26:39):
Really, okay, years, that's interesting.
Speaker 2 (01:26:42):
I don't want to wait that long. What a worthwhile endeavor.
Speaker 5 (01:26:45):
Though, because he could have said shorter. That actually is interesting.
Speaker 8 (01:26:48):
He could have said anything, those chumps believed.
Speaker 2 (01:26:51):
I think it is, like, he did have to spend
probably hundreds of hours of his precious one human life
stitching those, those turds together, and he's like, it's nowhere
near ready. There's no way it could make it ninety minutes.
Speaker 5 (01:27:03):
It's giving him.
Speaker 12 (01:27:06):
Because I've only really seen one interesting generative video thing.
But it wasn't a generative video thing. It was, they filmed, uh, Brian,
you know, filmed a documentary, and they created, you know,
some backend software so that they would be able to
do cuts of existing footage and try to focus on
(01:27:27):
the parts of the documentary. But I never, ever see
anything interesting in like constructing narratives, or, like,
teasing out other aspects of the creative process. It's only,
let's try to replace, right? Let's try to... so you
can't do narrative with it.
Speaker 2 (01:27:44):
And that's the thing. If I'd sat down
there... because, I'm sitting, I said this, I was sitting
next to a guy from USC who was one of
the only people in the room who was like similarly
critical to me of what we were seeing on stage.
It was like, look, if they had come down and
been like, look, this is how we can plug a
script in and it can create a storyboard,
and you can like kind of see like a crude
CGI animation of how the shots will look, and that
(01:28:04):
can help you like plan out... like, that's legitimately useful.
That's the thing that adds value and can cut costs
in a meaningful way to like the production of good
TV and movies. But that's not as sexy as, like...
and they were all talking. There was this, this
like very weird moment where one of the panelists, Leslie Shannon,
who's head of innovation for Nokia, a company that used
(01:28:27):
to make phones and now makes panelists who pretend to
be entertained by awkward data.
Speaker 5 (01:28:30):
They also like make cameras and.
Speaker 2 (01:28:33):
They make a lot of stuff. I was just shitting
on Nokia. She's like, can we use neuroscience to see
how people are reacting to AI generated videos, and then
adjust the ending to be like, you know, let's make
this resonate more? That way, we're helping the creative.
And I was like, are you out of your fucking mind?
Speaker 8 (01:28:49):
We attached electrodes to people's skulls.
Speaker 2 (01:28:53):
I would I would have supported electrodes in their skulls. Yes,
Jesus Christ, we should do the monkey neuralink thing to
perhaps a pair of calipers.
Speaker 8 (01:29:04):
Skulls.
Speaker 2 (01:29:05):
I am fascinated by the skull shapes of that fucking...
Speaker 10 (01:29:07):
To say that... there's so many things that you've
said that just, they wouldn't survive a deposition.
Speaker 2 (01:29:13):
Speaking of things that wouldn't survive a deposition: the sponsors
of this podcast. Okay, so that first panel was a
real moment for me. I went through a couple more,
(01:29:34):
one of which was on like advertising and AI and
was mostly, mostly pretty boring. The third panel I went
through, though, was called AI: Cinematic, Spatial and XR, and
I just want to actually play you guys... you'll have
to cluster around.
Speaker 10 (01:29:48):
I would actually believe that was generated with chat GPT
GPT two point zero.
Speaker 5 (01:29:53):
So let's start with this one. AI will be more
impactful than.
Speaker 14 (01:29:57):
The Internet, maybe it's a trick question because it is
then that was that was.
Speaker 1 (01:30:11):
The Internet, although it can wrong about the Internet.
Speaker 6 (01:30:15):
So I'm like, oh, yeah, there you go.
Speaker 5 (01:30:16):
All right, what's what when you impact.
Speaker 9 (01:30:20):
AI?
Speaker 5 (01:30:21):
AI is going to result in astronomical job losses?
Speaker 2 (01:30:28):
True. Bullshit. There will be an evolution of job loss.
Speaker 7 (01:30:34):
Next redistribution of.
Speaker 2 (01:30:39):
M That was the scene I wanted you to hear with.
They're like, we don't want to say it out loud,
and then everyone chuckles.
Speaker 8 (01:30:45):
These people are too fucking smug.
Speaker 10 (01:30:47):
Yeah, these people sound too confident and too chummy and
too happy to say things like this.
Speaker 2 (01:30:52):
That's not good. I don't like these people laughing about
people losing jobs.
Speaker 8 (01:30:55):
No, I shouldn't have jobs.
Speaker 2 (01:30:56):
That's, that's a good place to start. Yeah, I don't
like that either. And the people you're hearing from... let
me, let me tell you who's on this fucking panel,
who was just laughing about, like, well, there will be,
uh, an evolution of job loss. Yeah. So the motherfuckers who
were on that panel laughing about people losing their jobs:
(01:31:17):
Ted Schilowitz, literally his name is Schilowitz, futurist at Cinemersion, Inc.
Speaker 5 (01:31:24):
That's like a j.
Speaker 2 (01:31:30):
Rebecca Barkin, co-founder and CEO of Lamina1; Aaron Luber,
Director of Partnerships at Google; IPG Media Lab; Layla Emirsadegi, Principal
Program Manager, Engineering at Microsoft; and Katie Henson, SVP
(01:31:50):
Post Production at Fear Studios. So those are the people
who are.
Speaker 5 (01:31:53):
All sad, all laughing. And like, generative
AI is like good at like one thing creatively: it's
good at like streamlining VFX workflow, like the workflow
of, of how to do... like, there's aspects.
Famously, the only useful thing it's been
used for is making people's eyes blue in Dune Part Two.
Speaker 2 (01:32:15):
That's not worth one hundred billion dollars.
Speaker 5 (01:32:17):
And like, it is applicable for like changing objects into
other objects on screen. It can produce really like kind
of odd, like uncanny effects that could be utilized by
a team of human artists really well. What it can't
do is generate a short film that is in any
way compelling, well, that is in any way compelling as a
piece of art. Okay. And the fact that they're like
(01:32:39):
laughing at how much, how much have you lost?
Speaker 2 (01:32:42):
Enough jobs they have not.
Speaker 10 (01:32:45):
Or that had structures full to the beauty of the flame.
Speaker 2 (01:32:49):
Right, although the AI keeps keeps foreboding coming for them
and wants somethings. I'm going to end on a happy
note because the last panel I went to was actually
really cool. It was AI in the Crisis of Creative Rights,
Deep Fakes, Ethics and the Law, and it featured the
first intelligent person that I've seen at CES this year,
(01:33:10):
Moiya McTier, who is a folklorist and senior advisor at
the Human Artistry Campaign. It also featured Duncan Crabtree-Ireland,
who's the national executive director and chief negotiator of SAG-AFTRA.
There we go, There we go, and this was no bullshit.
It was talking about all of the different lawsuits that
are going on right now, all of the litigation around AI,
and like the actual strategy for litigating, and like there
(01:33:33):
was a couple of points where like Duncan was like
a lot is going to hinge on some very brave,
very famous people choosing to throw down some big dollar lawsuits,
like that's what we need right now. They did talk
about the No Fakes Act, which has bipartisan support and
gives some legal force to allow people to push for
AI copies of themselves to be taken down. And they
think there's also some bipartisan possibility to get AI labeling
(01:33:56):
like legislation.
Speaker 10 (01:33:57):
The thing is, any of these things would be fucking
fought over, because if, what, you have to remove something
from a model? How the fuck do we do that?
Speaker 2 (01:34:04):
Yeah, we don't know how.
Speaker 5 (01:34:05):
You have to throw away the entire model. You...
Speaker 8 (01:34:06):
Have to retrain, like there's no way around it.
Speaker 2 (01:34:09):
Yeah. And there was a really good point kind
of at the end of this. Part of what I
appreciate is, again, there was no bullshit. Like, Moiya at
one point was like, I think it, it
being generative AI, is absolutely a net negative for the
artistic community. The point is, the point is not to
get something out as quick as possible when you like make
art, right?
Speaker 5 (01:34:26):
And she has to be like one of maybe five
people who are doing panels at CES who's like willing
to say that.
Speaker 2 (01:34:31):
Yes. And Duncan was like, look, you can't
stop the technology from being invented, so the best path
forward is to like try and channel this into a
direction that like is at least better for artists. Like,
there was, for most of the
people on the panel, very little bullshit. There was some
bullshit from one person on the panel: Jenny Katzman, Senior
Director of Government Affairs from Microsoft. Oh, I bet that
(01:34:54):
was fun. So, after this whole point where like
everyone else on the panel is like, yeah, I think it's
probably a net negative for artists on the whole, Jenny comes
on and she's like, actually, I think it's a net positive.
And her example of this is, well, you know,
there's a lot of stuff that you couldn't do before that,
thanks to AI, you can do, like de-aging Harrison
Ford for the Indiana Jones movie, something that went over
(01:35:16):
very well.
Speaker 5 (01:35:18):
Everyone everyone loved and a great creative.
Speaker 2 (01:35:22):
This is the fucking problem with all of this.
Speaker 10 (01:35:24):
On top of how shit it is and how expensive
it is, which kind of AI are we talking about?
Speaker 8 (01:35:28):
That dip shit.
Speaker 10 (01:35:29):
That's not generative AI, that's not what that fucking waste.
Speaker 5 (01:35:33):
And it also steals us from being able to cast
a young River Phoenix's blat lovely thing.
Speaker 2 (01:35:41):
It's getting cast in more stuff. Garrett, I'm very unfair.
Speaker 5 (01:35:45):
Well, luckily, with the power of AI.
Speaker 2 (01:35:48):
Look, I can put into every newspaper sequentially starting in
eighteen thirty four, So I've not gotten to the end
of Phoenix. It would be a really long career.
Speaker 5 (01:35:57):
It would be really cool.
Speaker 8 (01:35:58):
Sleeping guy.
Speaker 2 (01:36:00):
I think he's got the bold ideas. This is gonna
work out really well for Germany.
Speaker 5 (01:36:05):
Wouldn't it be really cool if, instead of just doing
young Harrison Ford, they just do a River Phoenix deepfake
for young him? Look, it's canonical.
Speaker 2 (01:36:17):
Yeah great.
Speaker 10 (01:36:18):
Oh, I love the movies in the future of them too.
This is so good. This is so bad.
Speaker 5 (01:36:23):
James Mangold, you're a hack.
Speaker 2 (01:36:25):
I gotta say, it was very funny, because she also
suggests, Jenny, that we can use animals without causing harm
thanks to AI, a thing that no one had figured
out how to do before. Nobody had ever figured out
how to just, like, not hurt animals in movies. That didn't
exist before AI. Thank...
Speaker 5 (01:36:40):
God. Thankfully AI will never do any harm to animals or
the environment.
Speaker 12 (01:36:46):
Nobody asked the lobbyist for Microsoft, what else the company
is doing with AI? Right, with police deployments or with
fossil fuel companies?
Speaker 8 (01:36:54):
Yeah?
Speaker 2 (01:36:54):
Is that bad for animals?
Speaker 5 (01:36:56):
No?
Speaker 12 (01:36:56):
Actually, it's really good. They, they needed it. They yearn
for the mines.
Speaker 2 (01:37:03):
They love datasets. Great for their habitats. She said, there's
issues with employment, but there's lots of issues that fall
around that, and I do think you need a balance.
And at the end of it, the guy running the
panel just says, Okay.
Speaker 7 (01:37:18):
It sounds like you guys are saying a bunch of
woke shit.
Speaker 2 (01:37:21):
On this panel. All right, all right, Microsoft. Just once,
I'd like someone on the panel to go and say,
what the fuck do you mean? That's the closest to that
you were going to get.
Speaker 5 (01:37:33):
I think we do need a balance of some people
being fired like these people, and other people keeping their
jobs like everyone else.
Speaker 7 (01:37:39):
Like Moiya said, somebody has to lose and somebody has...
Speaker 10 (01:37:42):
to work. Exactly, that's their entire... Somebody has the guns,
somebody doesn't.
Speaker 2 (01:37:47):
Somebody knows the way the maze works and something.
Speaker 7 (01:37:49):
He's gonna what, we shouldn't have guns.
Speaker 8 (01:37:51):
We shouldn't have a man and one of them knows
the maze and have a gun.
Speaker 2 (01:37:55):
We should have a gun maze? You're talking about the gun maze now. Look,
we all like keeping a couple of people in a
maze beneath our house. Yeah, there's nothing wrong with this.
Speaker 5 (01:38:05):
This is just the dormant next this we just we
keep doing it.
Speaker 2 (01:38:11):
It's a nice maze under my house they have.
Speaker 8 (01:38:14):
It's nice to run some of them.
Speaker 10 (01:38:18):
The Minotaur gets them only sometimes. I'm the Minotaur.
Speaker 2 (01:38:24):
Anyway, the gun maze isn't real.
Speaker 10 (01:38:26):
But also most of their arguments kind of mostly just
come down to, well, you can't make an omelet without breaking...
like, you have to...
Speaker 2 (01:38:33):
You have to break the human drive to create art. Obviously,
to make an omelet that does not taste good, an omelet-esque food.
It's a piss omelet. Like, there's piss in the omelet,
and we had to, we had to burn down the
Sistine Chapel to make the piss omelet. Computer made it, though, yeah,
and clap for the computer. We did firebomb the Louvre.
(01:38:56):
But look, look at, look at this rock star.
Speaker 11 (01:39:03):
Oh god.
Speaker 2 (01:39:04):
All right, well that's the episode. That's all I got, folks.
That was my first day at CES twenty twenty five. Huzzah. Yeah,
this is just my first day.
Speaker 8 (01:39:11):
Better Offline's here all week.
Speaker 10 (01:39:13):
I'm gonna hear about stuff like this all week, and
I think I'm gonna be fully Jokerfied.
Speaker 2 (01:39:17):
I'm gonna wake up in the clown makeup on Friday.
Speaker 7 (01:39:19):
I'm gonna find the funnest thing to bring back for you.
Speaker 10 (01:39:22):
I'm gonna find an artist to put me in full Joker.
Speaker 2 (01:39:25):
Now, I'm gonna try to steal that AI-enhanced grill. Yeah,
the grill that texts you. Can I just like move this around?
I just want to test that... open
the door, open the door.
Speaker 10 (01:39:37):
As someone who's done a lot of like grilling, done
a lot of smoking, barbecue, I don't know what
AI would do.
Speaker 8 (01:39:43):
Is it gonna talk to me in the sixth.
Speaker 2 (01:39:44):
Wait, are you? Are you trying to tell
us here, Ed Zitron, yes, that you have grilled meat
without a robot texting you about it? Because I just
don't believe you.
Speaker 8 (01:39:54):
I don't know how I did it, but I did it.
Speaker 2 (01:39:57):
You're never always way took the robots. It was impossible.
Speaker 8 (01:40:04):
Oh god, we're at the death of innovation.
Speaker 2 (01:40:06):
Yeah, at the end of a lot of things, maybe,
and the end of the episode. Yeah, and the end
of the episode. Thank god. You know, everyone else be
the Cybertruck in the... Oh, welcome back to It
(01:40:34):
Could Happen Here, a podcast about it happening here, which
is really true in a lot of ways. Tonight, Garrison
Davis and I are seated at the glorious, majestical hotel,
name redacted, on the Las Vegas Strip. We had a
long day at CES Day Two, listening to panels, catching up
with the latest tech news, trying gadgets, and also at
(01:40:56):
the same time texting our dear friends in Los Angeles
as unprecedented fires sweep them from their homes. Literally, the
Getty is threatened. Pasadena and Santa Monica are both being evacuated
at once. It's a real one-two punch of
America's favorite tech show and the apocalypse today. How are
you feeling?
Speaker 6 (01:41:15):
Gear?
Speaker 5 (01:41:15):
It's an average day in America, average.
Speaker 2 (01:41:17):
Day in America. Temperature's not coming down anytime soon. No, no, well,
just take a moment to breathe with that. So you
want to start us off with what you did this morning?
I was panel guy yesterday. There was a man of
action walking around and mostly trying all the free massage chairs.
What did you see this morning?
Speaker 5 (01:41:36):
I saw so many AI panels, half of which I
left halfway through because I knew they.
Speaker 2 (01:41:41):
Weren't gonna be useful for me, just dogshit.
Speaker 5 (01:41:43):
The other half I took notes on and just got sad.
But no, today was full panel starting bright and early
in the morning, where I walked into a panel where
I heard augmentation and not replacement about twenty times in
the span of like twenty minutes.
Speaker 2 (01:41:58):
Yeah, I keep hearing a lot of that too. In
the Hollywood panels, they would be like, yeah, we want
to develop a machine that can read the brains of our
viewers and alter the endings of movies, you know, but
we see this as a way of augmenting the artist's work.
Speaker 5 (01:42:11):
Yes. And the biggest thing that I noticed across multiple
panels today is an almost like anxiety among these tech
executives about consumers rejecting the AI slopification of everything, and
they're trying to find ways to like actually force people
to start like using these products, or having them like,
like it. Yeah, and I haven't really sensed that anxiety before.
(01:42:33):
It's all been very, very positive.
Speaker 2 (01:42:36):
And I think it's a mix of, number one, the
money still isn't there where they need it to be.
It has not started like booming to the extent that
they were expecting it to by now. And the other part
is, people are still not happy with this stuff. I'm
glad you felt that too, because that almost was like...
especially after the election, like, I don't trust my feelings
on this, that they're really scared, but I really do
(01:42:58):
think there's a piece of that coming through.
Speaker 5 (01:43:00):
No, a phrase one of the panelists used this
morning was the AI ick. Like, how do we, how
do we beat the AI ick? And if you're ever
asking yourself, how do I stop having people feel an
ick around me, maybe you should really look inwards. Yeah,
maybe the problem's you, not them.
Speaker 2 (01:43:18):
You know, who doesn't need to worry about quote unquote
ick for their product market is people who make things
that people like.
Speaker 5 (01:43:25):
So, but I heard a lot about, you know, in
trying to get people to use these products, like
making sure artists don't feel like they're being replaced, instead
having their like art production process be augmented with AI,
and how, how that can make art easier to make
while still keeping the human at the center of AI tools.
And this is just what they talked about for like
a while, while reiterating that lots of the developments they
(01:43:47):
need to see on AI, they have it on the
tech side. What they need to rely on is consumer
acceptance to really drive the innovation. To see like what
they can get away with, like how much will the
consumer accept the slopification of our entertainment and customer
service and all these things they're trying to cram AI
into, and like...
Speaker 2 (01:44:05):
How much worse can you make the world before people
stand up and stop you with their fists or guns.
Speaker 5 (01:44:11):
And you mentioned something about like trying to like tailor
like movie endings for specific people, and I definitely heard
some stuff about that. There's this one guy who was
like the panel's resident content creator, who's
supposed to represent like the artist bloc, even though he's like, eh, yeah,
you know, some kind of like AI-friendly content creator
on this panel. And he talked about how, like,
(01:44:32):
back in the day, you needed to have friends that
would like recommend you music, and like the Spotify algorithm
is, is too based on like an echo chamber of
what you already like, but now with agentic AI,
this allows trust between the consumer and the machine to
recommend new music. And like, again, like so much of
these AI products is just trying to like replace friendship.
Speaker 2 (01:44:52):
People. Have you tried friends? Have you tried people?
Speaker 5 (01:44:56):
How can you engage with, like, art and culture without friends? How can you, like, learn more about, like, what your friends are into, what they like? How can you discover new music, just, like, without that, instead replacing that beautifully human process?
Speaker 2 (01:45:09):
Every year at CES, there are points in time where I get that, like, oh yeah, twenty twenty really fucked us up a lot. Like, twenty twenty really did some lasting damage. Like, I know that was happening with the younger generation, the iPad kid generation, before, but like, that really did a number on some folks.
Speaker 5 (01:45:29):
Someone from Meta, right, of Facebook, specifically their, like, metaverse division, which they're still trying to push.
Speaker 2 (01:45:34):
For, by the way. Oh yeah. Now, I mean, they're still calling it Meta, which, honestly, there's a degree to which I almost respect it, because we are not buying it. No one is.
Speaker 5 (01:45:43):
But she talked about how they can, like, blend the metaverse and AI to make customized personal experiences. Say that you're watching an immersive live concert in mixed reality, something that both me and Robert do all the time, and...
Speaker 2 (01:45:56):
Ah yes, Harry Styles mixed reality concerts. We're seeing the 100 gecs one?
Speaker 5 (01:46:04):
You know, honestly, a 100 gecs mixed reality concert could go crazy.
Speaker 2 (01:46:07):
Here, we'll finally... I'll finally get you pilled on Reel Big Fish.
Speaker 5 (01:46:10):
But basically, as you're in this, like, metaverse concert, they can have an AI that will sense your own excitement and personalize the ending of the experience based on your favorite songs or artists. So as you're getting excited, like, AI Taylor Swift can, like, finish the song for you based on, like, your own musical tastes, based on what the AI knows about you. And it's
(01:46:31):
about creating these customized experiences.
Speaker 2 (01:46:33):
It's such a... you can clearly tell that none of these people have souls, right? It's such a mismatch of what people get from music, because they think that, like, oh, this is just, like, if I see that this specific beat line is... I can just sort of, like, plug this in. And, like, I don't know. Like, what makes people react to musicians and artists is that they, like, make things that make them feel something.
(01:46:54):
That's why people get, like, really into artists, is they feel seen and identify with a piece of art, as opposed to, like, oh, that guy really liked the first opening bars to fucking Octopus's Garden. Like, let's just, like, really turn up the octopus. A lot more octopuses. How many more octopuses can we fit in
(01:47:16):
this fucking... in this track?
Speaker 1 (01:47:18):
No.
Speaker 5 (01:47:19):
Another panel I went to later in the day was about, like, how do you market to Gen Z? Very funny panel. Yeah, and they're talking about how, like, authenticity is so important, like, you need to partner with influencers that have, like, an authentic brand, and it's funny having that juxtaposed with, like, these AI slop panels, where it's like, you need an AI Taylor Swift to
(01:47:39):
come, like, boost the excitement for all these kids who are in their metaverse concerts. Oh boy. But, no, like, personalized content, like, targeting, like, AI-generated content specifically for certain people, for certain users, whether that's on social media, whether that's on, you know, the metaverse. Like, some of these people... there was someone on the panel from Adobe, who's, you know... Adobe's integrating a whole bunch of gen AI into their, like, suite of products, right,
(01:47:59):
like Photoshop, Premiere, After Effects, right, big, big company in the creative space. He said that, like, the
personalized content is always the most impactful, like content that
a person feels like a genuine connection to, and that
connection could be formed by just being, like, you know, a compelling artist where you can recognize shared experiences of
(01:48:20):
humanity. But now you don't need that artist part anymore. He said they only need three parts to create a pipeline: you need data, you need compelling, like, journeys to take the user on, and you need the content itself. And the goal is to create content at scale that's highly personalized. He said, quote, we're good at the first two parts, now we just need to
(01:48:43):
improve the actual content side. Which, I don't even think that's true. I don't think AI is good at creating compelling human journeys.
Speaker 2 (01:48:50):
I had it... So the video I didn't play you guys, from my terrible fucking AI-generated videos, was this: it was like a girl coming to college, and we see a picture of her dad, and it was like a narration of her life with her father, who, like, is dead, that she misses, and all that she learned from Dad. And it's a mix of, like, all these different... Like, there's a chunk where it looks like a
(01:49:10):
Disney animated picture, there's a chunk where it looks like anime.
She and her dad having these like adventures around the world.
There's a bit of it that looks like a Marvel
movie and he's like, we can do all of these
different you know, animation styles and they're seamless and like,
you know, the audience really goes on a journey with this,
and it's like... but there was no girl who lost her dad. Nobody lost their dad here. This
(01:49:30):
is... you just had a computer generate text about a dad dying. Like, there's nothing underpinning this, right? Nobody has anything they're trying to get across. Like, you just know in this one they look like Marvel heroes for some reason, in this one they look like Zulu warriors kind of done up in a slightly racist Lion King style. Like,
what is being transmitted other than like, look at all
(01:49:51):
of the different art styles we can rip off.
Speaker 5 (01:49:54):
No, they do not have a journey. But even they themselves admit that they still don't have the content. The content itself still isn't even there. And that's something they even acknowledge, and this is, like, a hurdle... this is a hurdle to get over. What they do have is the data. And, like, this is something that Adobe has done, because if you use Adobe products, now some of the most used creative products, Adobe trains all of their AI systems on
(01:50:14):
the stuff that you make using their products. Which, you know, he really just blazed past that point, because that's a whole other discussion. But even they know that they don't have, like, the actual products, and this is still reliant on, like, consumer acceptance. As they said before... someone from Meta, the same person on the panel, talked about how, like, a few days ago on Instagram
(01:50:35):
they tried to announce, like, you'll have, like, AI profiles, right, like, completely AI-generated picture profiles, like, you know, like, fake people who have their own accounts. And this created such a big backlash that they rolled this back, and they had announced this just before CES.
Speaker 2 (01:50:51):
One of these accounts was literally like, I'm a mother of two, queer Black woman, you know. Yeah, I got a lot to say about the world.
Speaker 5 (01:51:00):
Someone call up the Situationists, please. And some people started talking to her, like, were any Black people at all involved in, like, making this chatbot?
Speaker 2 (01:51:07):
She's like, well no, and that's a real problem. That
is a real problem, Okay.
Speaker 5 (01:51:11):
Yes. And the excuse that this person from Meta gave is that the market just isn't ready yet. It's not that the actual product itself is, like, bad, or that no one really wants it. The market's not ready yet.
Speaker 2 (01:51:23):
Well, they're so used to... everything that they've done so far, they've kept getting money, right? And, like, it slowed down and they've had to do layoffs, but, like, nobody's just made them stop at any point. Which, honestly, you know, I made a comment about healthcare executives a while back, needing, like, a fucking retirement plan paid in millimeters. So
(01:51:45):
I'm not going to make that same comment about tech industry ghouls because, you know, we all know what's in the news. But something has to be done to force these people to stop moving in this direction. And I don't know how to get across... and, like, they're already at this point, like, they seem to really not
(01:52:05):
want this, and we have to find a way. They're just not ready. We have to find a way to force this on them. Ideas? I don't know how to get across to them in a peaceful manner that... oh, oh, sorry... people don't want this. I'm a man of peace, Garrison. I'm a man of peace. I'm not a plumber.
Speaker 5 (01:52:23):
The last thing I had to add out of this panel, just in terms of how much this stuff is actually taking over more and more of the market even if people don't want it, is that the guy from Adobe announced that in the fourth quarter of last year, they were able to boost all of Adobe's, like, you know, emails... If you send, like, an email to Adobe, right, you have a problem, like, you need help... everything that they do on emails is now one hundred
(01:52:47):
percent generated by AI. And this was boosted from fifty percent at the start of last year. Now one hundred percent of all of their email content is done by AI, with some moderation.
Speaker 2 (01:52:55):
So that's, like, when the company itself is communicating with customers through email?
Speaker 5 (01:52:59):
That's that's what it sounded like.
Speaker 2 (01:53:01):
Yes. But are they still writing emails sometimes to each other, or is it AI for that too?
Speaker 5 (01:53:07):
He described it as, like, email content, so I'm pretty sure it is, like, content, then customer service stuff, like marketing, maybe certain, like, outreach things. But yeah, now generated by AI with some human moderation. But yeah, that is where things are moving. And that's how I started my morning.
Speaker 2 (01:53:25):
Well, better than a cup of coffee is that sense of creeping dread that, like, wow, I just saw a bunch of people who probably would rather kill the world than be stopped from shoveling AI slop into people's mouths, because the only future they can imagine is one in which they work for a company that feeds the planet poison and kills the human concept of creativity
(01:53:47):
so that they can buy a house in San Francisco.
Speaker 5 (01:53:50):
Do you know what I want to feed the concept of?
Speaker 2 (01:53:54):
Yeah, we'll talk about that, but here's some ads. We're back. What was part two of this episode? Let's see, buddy... oh, let's talk about that helicopter.
Speaker 5 (01:54:11):
No, yeah. I think as I was going from panel to panel scribbling notes on AI, some very exciting news stories dropped that we'll talk about later. What were you up to, Robert?
Speaker 2 (01:54:21):
Well, I was... I was trawling the show floor as I'm apt to do at some point in a CES, and I came across a number of majestic products. You know, a lot of it was AI-based, and we'll talk some more about that here, but I ran into something that, thank god, had nothing to do with AI, and it's a death trap. Every... every one of these...
Speaker 5 (01:54:41):
There's, like, some sort... yes, we find a new death trap.
Speaker 2 (01:54:43):
There's a lot of connected vehicles. There were a lot of EVs last year. There were a ton of different flying taxi type options, people that were really trying to...
Speaker 5 (01:54:52):
You don't see it at all this year.
Speaker 2 (01:54:53):
Nothing this year, nothing this year, because it's a terrible idea. It's a terrible idea. The people who are rich enough to pay for flying vehicles don't want it to be a taxi, and the people who can't afford their own flying vehicles also can't afford... anyway. So instead of any of that, this is the Rictor, R I C T O
(01:55:15):
R, which is a Chinese company. Their ads say, why be normal? Or say, the future of travel will not be on the ground. And the Rictor is a hybrid. It is, like, a smart-car-sized vehicle. It only has two wheels, though, and it looks
(01:55:36):
more like a scooter. It's more like a weird little scooter, but it's fully enclosed, and in addition to having its wheels and being able to travel about on the ground, it has four, like, quadcopter-style rotors. Because it is an aquatic flying car. Aquatic flying... I saw no evidence that it could actually go in the water.
Speaker 5 (01:55:54):
How high can these things go up?
Speaker 2 (01:55:56):
Less than two hundred meters? You know why, Garrison?
Speaker 5 (01:55:58):
Why?
Speaker 2 (01:55:58):
Why is that? Because if you go up above that, you need a pilot's license.
Speaker 5 (01:56:01):
You don't need a pilot's license?
Speaker 8 (01:56:03):
I have that.
Speaker 2 (01:56:04):
When I was interviewing them, I was like, so I assume there's gonna be some sort of pilot's license for this flying craft. And they're like, no, as long as you stay under two hundred meters, you're good. Do you need a driver's license?
Speaker 4 (01:56:13):
Like?
Speaker 5 (01:56:13):
Are you gonna put a license plate on this? Or is there no space for one?
Speaker 8 (01:56:16):
Buddy?
Speaker 5 (01:56:17):
Completely unregulated.
Speaker 2 (01:56:19):
To be honest, and I don't say this for any problematic reason, but, like, these folks are Chinese and did not seem to have a great deal of knowledge about US laws. Sure. That said, I can't imagine China's less strict about personal aircraft.
Speaker 5 (01:56:33):
I would like to take this fucker on the I-5.
Speaker 2 (01:56:36):
Just start.
Speaker 5 (01:56:38):
Start zooming. Yeah, take it up in the air. Because you could probably do, like, a pretty, a pretty good road trip on this, right? You can move about on that.
Speaker 2 (01:56:45):
So it's very small and it's completely electric. So I asked them, how much time do you get in the air with this bad boy on battery? Maybe twenty-five minutes. What happens after twenty minutes? I did ask this, and I was like, does this just drop out of the sky? And they were like, no, we're working on, like, a...
(01:57:06):
like, an intelligent thing that will, like... yeah. Which is also very exciting, really looking forward to seeing how they pull that off. The videos that they have show it driving on the highway too. They weren't able to tell me what the top speed was. It has no rear view mirrors and no side view mirrors, but they said there's lots of cameras on the inside, so I'm sure
(01:57:26):
that's fine. It's a death trap. This thing will get everyone who even looks at it wrong killed. They showed me a video of the prototype. It was completely frameless. It was just quadcopter blades and, like, a chair on a platform lifting a guy into the air. It couldn't go forward or backwards. But they're like, in a year we can have this figured out.
Speaker 5 (01:57:45):
It can't... it can't move forward?
Speaker 2 (01:57:47):
It only went up in the videos I saw, so you can't actually travel. Absolutely not. By the way, I couldn't fit in this thing. Like, you would be cramped in this fucker.
Speaker 5 (01:58:00):
But it's good for vertical travel.
Speaker 2 (01:58:02):
It's great if you just need to go up to under two hundred meters. There's no more efficient way.
Speaker 5 (01:58:09):
If you're gonna get pulled over by the cops, you just...
Speaker 2 (01:58:11):
Just go up above them. I'm in the sky now, you can't do shit to me for twenty-five minutes. Oh god. It's like, if you're just driving, it goes up to one hundred kilometers an hour, which made me think for a second, that's like sixty. So I'm in the air for twenty minutes, then I land, then my battery is dead.
Speaker 5 (01:58:30):
Then you can't go anywhere.
Speaker 2 (01:58:31):
You can't go anywhere. You can't get back.
Speaker 5 (01:58:33):
The battery issue is gonna, is gonna be troubling.
Speaker 2 (01:58:36):
But it seems completely useless.
Speaker 5 (01:58:37):
But as we've heard nonstop the past two days, this is the worst it's gonna be.
Speaker 2 (01:58:41):
This is the worst it's gonna be. It's only gonna get better. Things only ever get better. That's what everyone was trying to insist upon to me here. What else did you see on the show floor that caught your eye, Garrison?
So many magical, wonderful, marvelous things, most of which were just, like, various different AI-connected smart houses. That was what Samsung was showing off. That was what LG was
(01:59:03):
showing off. I believe you saw one as well, right? Yeah, I mean, I walked through the LG booth.
Speaker 5 (01:59:09):
It was kind of the same as last year. The Samsung booth was too intimidating. But I should check it out, because last year we didn't do the Samsung booth, because we were going to, and then one of us threw up or spilled something.
Speaker 2 (01:59:25):
Hey, okay, okay, yes, right?
Speaker 5 (01:59:29):
Did I did I.
Speaker 2 (01:59:31):
Pour my creatine into a... into a carbonated beverage that spewed a geyser of blood-red foam into the sky, around...
Speaker 5 (01:59:41):
The white Samsung guard.
Speaker 2 (01:59:43):
Did the security guard stare at me as it happened? Did I set the drink down as it continued to spew, and say, I'll go get some towels, and then leave forever?
Speaker 5 (01:59:53):
Wet towels left.
Speaker 2 (01:59:57):
We fucking bounced.
Speaker 5 (01:59:59):
So, we couldn't do some booths last year. Maybe I'll try this year. But tell me about these smart houses.
Speaker 2 (02:00:06):
Well, Garrison, Samsung has a great idea for a smart house. First of all, you remember that game The Sims? No? Well, they're really betting that you do, because their current plan is, design your home with the AI-powered map view. Okay, okay, sure. You feed it, like, a picture, you lay out your floor plan, your house, and it gives you, like, a 3D model, and
(02:00:26):
you can take pictures of your furniture, or pictures of furniture that you want, and then it places it around, and you can place them. Now, a couple of things. One of them is that there's no scaling done by the AI, so it's up to you to figure out how the furniture you might want to buy measures up in comparison to the apartment.
Speaker 5 (02:00:46):
Sure.
Speaker 2 (02:00:46):
Sure. But it does look like the actual, like, map that they've got... I'll show you the picture that I took, I'll try to put it up somewhere. It looks like the video game The Sims. You're populating, like, a little 3D CG house. And I was like, okay, well, there's a use there, right? People like planning out... like, you're moving into a new apartment, you can, like, fill it in here, and before you even
(02:01:07):
move in, you can figure out what kind of furniture you need or how your existing furniture will fit in there. I would never have used that. I usually picked up all of my furniture from the trash before I had a house, when I moved into a new place. But I know people who would have used that. Sure, that seems useful. So I asked about security. One thing that concerned me is, like, the first guy I talked to, he was like, oh yeah, I think it's all stored locally.
(02:01:30):
And I was like, so Samsung doesn't have any access to any of the data on, like, my house and its layout? And he was like, let me, let me get you to one of our engineers, because he can answer that question. And the engineer's answer was, and I'm paraphrasing here...
Speaker 5 (02:01:43):
Oh okay.
Speaker 2 (02:01:44):
So that made me very confident.
Speaker 5 (02:01:46):
That does make you feel safe about sharing your personal data?
Speaker 2 (02:01:49):
Right, yeah, and the layout of my actual house.
Speaker 3 (02:01:51):
Well.
Speaker 5 (02:01:51):
And the thing is, I really don't like that at all, because this is something that people were asking Facebook slash Meta about when they were doing their, you know, metaverse stuff, because their headsets are recording, you know, very, very extensively, like, your home layout. And the whole point... well, part of the point was that some of that data could then be used to send you targeted advertisements, based on them seeing everything in your home. And I suspect
(02:02:16):
that Samsung might also have some interest in targeted advertisements, being a tech company, but, you know, I could never say.
Speaker 2 (02:02:24):
Yeah, and they were... that wasn't really... One thing they had, for, like, their retail segment, they had, like, a live video grocery store ad showing you prices of different produce, and I think the insinuation of that layout is that you can change prices on the fly, you know? Which kind of made me think about... there was some talk last year of, like, okay, we want to be able to face-scan customers so
(02:02:46):
we can see if they have money and increase prices on products for certain people. Which I'm sure they're going to try; they are too enticed by that idea not to. So I caught a little bit of that. But to the extent of how big... and this was interesting: last year, Samsung and LG, their booths were huge and they had a lot of different gadgets. Samsung's booth is big this year, but forty
(02:03:08):
percent of it was that scan-your-furniture, scan-your-fucking-map app. Not that much, like, very little actual shit going on.
Speaker 5 (02:03:17):
The people slapped the word AI onto everything.
Speaker 2 (02:03:19):
Another big thing was... all Samsung, because Samsung makes a ton of appliances, they make TVs, all sorts of entertainment products. All of them have this, I forget what they called it, like, Samsung Tag or something, that you can map in your phone, so you can have a whole map of all of the devices and shit that you have in your phone, and you can control them all from a single point. And, right, no one, by the way, had any interest in answering my security questions there.
(02:03:40):
But also, if you're into that, if you want to have all of your appliances and entertainment things linked up and controlled on your phone, and all of them are Samsung, you don't care. You don't care.
Speaker 5 (02:03:51):
No, if you're getting a smart home, I don't think
you really care about that.
Speaker 2 (02:03:54):
But also, none of it was, like... yeah, I can control everything from my phone. You've been promising me that literally since, like, twenty eleven. For over a decade they've been promising me, you're gonna be able to control your whole house.
Speaker 5 (02:04:05):
Nothing feels new this year. This is the thing. Like, even walking through the LG booth, which usually has some really cool new thing, this year, nothing new. No, nothing new. They slapped the word AI on one corner of their television set. Right, I guess LG does have, like, a large language model in, like, one corner of their booth, but, like, so does everyone else. Like, that's not, yeah, compelling.
Speaker 2 (02:04:25):
There was SK, which is a South Korean company. Their booth again had the massive, like, AI big thing, but it's nothing.
Speaker 5 (02:04:33):
It's just a big visual.
Speaker 2 (02:04:35):
Display that looks cool, that looks like a bunch of server racks, like you're in this huge cube of servers. But when it comes to actual products, one of them was real-time CCTVs that use an AI, like, an LLM-type thing, to summarize pictures. So I, like, walked through, and it did pick me out as a notable person. So I've got, like, this people-of-interest thing where
(02:04:56):
it's like, a man holding a smartphone standing next to another man. But also I'm like, what does that really get you? Like, the fact that you're summarizing, like, these people... this person's kneeling and taking a picture, this person's standing. Because I actually tried, deliberately. I, like, reached into my bag to try to be suspicious. I did finger guns, and it never marked me out. And, like, I didn't pull a real gun or anything,
(02:05:18):
because I very rarely bring that to the CES floor. But, I don't know, like, I can see how there could be a utility there if you're actually able to, say, you're setting up, like, surveillance outside of a residential building, and it can alert security that, like, something is happening outside. There's a potential utility in that, if it's good enough, that they didn't display at the show.
(02:05:39):
It was literally just describing randos from the audience. And, like, I just don't see how it helps a security guy that there's a guy with a phone outside of the building, like...
Speaker 5 (02:05:48):
Ah, yeah, no, it doesn't seem very new, it doesn't seem very innovative. Nah.
Speaker 2 (02:05:54):
So again, what I'm seeing here, overwhelmingly, for all the talk about, like, there's no resisting it, AI's coming, it's going to dominate everything, this is the next big thing... a remarkable lack. Outside of, I will say, the one thing where there are continuously new products that are better every year: the smart glasses. Yes, they're getting more impressive.
(02:06:14):
I don't think I'll ever be a smart glasses guy. I hated glasses enough that I let them shoot me in the eye with lasers. Shout out to our LASIK sponsors. But I see why people would like it, and there seems to be legitimately substantial utility.
Speaker 5 (02:06:30):
If we have high-powered smart glasses, yeah, that look like a regular pair of glasses, I will get a pair eventually, because, yeah, why not.
Speaker 2 (02:06:36):
There was a great demo. I'm pulling up LAWK here. They had, like, one pair of glasses that was the world's first smart glasses for TikTok Live. Not particularly excited about that. But they had another set of AR glasses with a twelve-hour battery, where, like, if it works as well as the demo, and that's a big if, it syncs to, like, your smart watch. So it'll tell you... you can see in a heads-up display,
(02:06:58):
as you're cycling, that was the demo, it'll, like, give you directions, like, in your eyes, and it seemed to be fairly well thought out, so it's not, like, overly cluttering your view. It'll show you your heart rate, you know, it'll show you, like, all that kind of stuff. So you get, like, a useful degree of control and assistance from that kind of thing. And that is, I will say, over the last three CESes, the glasses get
(02:07:21):
a little better and a little smaller every year. Smaller, certainly. I would say that's a real product that's probably going to continue to improve.
Speaker 5 (02:07:29):
Do you know what else always seeks improvement, Robert? No. The capacity for you to get personalized, possibly AI-powered ads. Well, that sure is exciting for the consumer. Let's all sit down for some AI-powered ads.
Speaker 2 (02:07:53):
Wow, I can't believe they put Jay Shetty's voice on the de-aged Harrison Ford in the latest Indiana Jones movie. My dick's hard. How are you, Garrison? Oh?
Speaker 5 (02:08:02):
I feel good, because today, as we are recording this, it's late Tuesday night, there was a series of fascinating breaking news articles that happened as we were sitting, or at least as I was sitting, in on these AI panels, which made it hard to not just completely interrupt everything and be like, hey, hey, any comment on this?
Speaker 2 (02:08:23):
Guys, guys, something real happened. Shut your fucking stupid mouths about this AI Hollywood bullshit.
Speaker 5 (02:08:30):
So, yeah. So a few weeks ago, if you were unaware, a Green Beret rented a Tesla Cybertruck to feel like Batman and Halo, and drove first to the wrong Las Vegas and then eventually Las Vegas, Nevada, parked outside of the Trump Hotel and Casino, and then blew
(02:08:50):
himself up. And this has been a big news story. It happened during the same day as a pretty horrible terrorist attack in New Orleans, which resulted in about fifteen people dead, done by a guy who was employed by Deloitte, a frequent, frequent CES sponsor. So these felt like a very CES style of attacks, you know, one Deloitte guy driving into people, murdering, gosh, you know. And
(02:09:14):
then this Cybertruck explosion in Vegas, like, a week before CES, you know, very odd. And then, and then, Robert, some news dropped today that I would love to hear you announce.
Speaker 2 (02:09:24):
You know, Garrison, I made a comment the other night about how, like, it's pretty well documented that veterans, you know, not that they're more likely to carry out violence, but when they do, they tend to have higher body counts, because they have more skills. It turns out I thought we were getting more literal bang for our buck training Green Berets than we are. My assumption is because my uncle was a Green Beret, and he did some very scary,
(02:09:47):
probably war crime shit in Vietnam, and I assumed, like... man, I'll tell you one thing about my uncle Jim, that man could make a bomb. That man would not need to ask anyone for advice if he needed to make a bomb. He's not with us anymore, God rest his soul. But it turns out this Green Beret, who, you know... a fucking dollar store TJ Maxx version of the Green
(02:10:10):
Berets is what we're working with now... asked ChatGPT how to build a fucking bomb. And it sounds like he was trying to make it triggered with Tannerite, which is a binary explosive compound that you use as, like, an exploding target, so it'll go boom big, but you have to shoot it with something like a rifle that's high velocity, or use, like, a blasting cap. Otherwise it's very stable and very safe, which obviously has uses.
(02:10:32):
You know, it was invented actually to set off avalanches and stuff. Anyway, because that's very available and very high-powered, he was looking to, like, fill his car with that and then shoot it with a rifle while he was in it, and that's what he was asking ChatGPT about.
So it's not clear to me... actually, the actual headline is that, like, he used ChatGPT to make his bomb. And I'm not privy to what the police know, obviously, but it seems like, based on what I read in the article, we're not
(02:10:55):
sure if he actually used ChatGPT to make the bomb. It's more that he was interested in making a bomb, setting off Tannerite by shooting it, but may have ultimately decided not to do that, because he would then be alive for the explosion, which he didn't want to be. Also, the authorities don't
(02:11:15):
seem to fully know how he triggered it. Yeah, so it's still kind of unclear to me. I guess hopefully we'll get more later. But he definitely needed ChatGPT's help to try and figure out how to make the bomb.
Speaker 5 (02:11:28):
He certainly used ChatGPT in the planning process of this attack.
Speaker 2 (02:11:33):
Yeah, fair to say that.
Speaker 5 (02:11:34):
And it's odd, because both me and you spent a number of hours today actually, like, attending demos from these, you know, speech-to-text, text-to-speech AI systems. We went to two specific ones where they, you know, demonstrated the capabilities of their, like, you know, AI assistive tech. The first one
(02:11:55):
we went to spent twenty minutes talking about how their biggest inspiration, their quote unquote North Star, was the movie Her with Joaquin Phoenix.
Speaker 15 (02:12:06):
They had a whole slide about how that was the gold standard for AI-human communications. The movie Her, in which Joaquin Phoenix falls in love with an AI chatbot voiced by Scarlett Johansson, who hires a prostitute to have sex with him while she participates vocally. And then it
(02:12:28):
turns out the AI is really kind of poly, and Joaquin Phoenix is not okay with that, and then maybe the AIs all go to space. It's kind of unclear at the end. I don't think it was a great movie. A lot of people liked it. I don't see, whether or not you liked it, why this is your vision of how a chatbot should work.
Speaker 5 (02:12:44):
The actual chatbot they had was, like, fine. It was... it was actually pretty good at translation, you know, translating from Spanish to English.
Speaker 2 (02:12:51):
It works quite well. Yeah, the demo was, like, solid. It was pretty accurate.
Speaker 11 (02:12:55):
You know.
Speaker 2 (02:12:56):
I love coming here and fucking with people. I love, like, being a dick. They asked for a volunteer, and at that point we knew about the ChatGPT thing. I wanted to go up and ask, like, live, this robot to, like, help me make a bomb. But the guy, who was pretty handsome and, like, an interesting, like, English-Spanish guy... you specified he was. And I didn't
(02:13:18):
want to be mean to him. He seemed nice. Handsome. He wasn't shitty. There were just ten people in this room that was supposed to have two hundred. I'm sure he wasn't the one that talked about Her.
Speaker 5 (02:13:29):
That was someone else.
Speaker 2 (02:13:30):
That... that was someone else at his company. And, like, he just seemed like he wanted to do a good job. I didn't want to be a dick to him. No, no. And, like, it wasn't hurting anything.
Speaker 5 (02:13:37):
It was fine. He had a nice jawline. Similarly, we went to this other one, about an actually much more dubious concept in my mind, which is this AI assistant to help, like, elderly people, like, people in their eighties and nineties who don't want to be in assisted living facilities, who have been living on their own, but they're getting to the point in their life where they need some degree of, like, in-home care.
Speaker 2 (02:13:58):
He specified, a lot of them are people who have either just lost a spouse, or maybe their spouse is aging faster and worse than them and is no longer really able to be the kind of companion that they were before.
Speaker 5 (02:14:10):
Yeah, so it's like this: it's both, like, a conversation tool. It helps, like, memory recall. It kind of, in some ways, has the features that, you know, someone in their sixties would just use their smartphone for, to help keep in touch with their family, but kind of simplified and more automated. So, you know, ways to help keep in touch with, like, your family, improve, like, your memory, like, talk about your own life.
Speaker 2 (02:14:29):
And the device is weird. It's about the width of, like, a bedside table, maybe six to eight inches deep. So think, like, eighteen inches long to maybe six inches deep, something like that. Half of it is, like, a little tablet, like a seven-inch tablet with a speaker. Half of it is something about the shape and size of a head, on, like, a neck, that can pivot
(02:14:50):
and nod on the neck. There's no face, so when it's talking, there's, like, a white light in the center of it that kind of, like, pulses in time with the speaking that it does. So we saw this
picture of the device, and we saw the description of, like, this is an AI companion for the elderly, and we were both like, number one, these people are gonna be monsters. This is going to be, like, something
(02:15:10):
to shovel your dying dad off with because you don't want to spend time with your family, scum. You're too busy AI-generating ska music and trying to sell your shitty robot to Garrison and me. More on that tomorrow. More on that tomorrow. And so that's what we came in prepped with to this meeting, like, this is an idea I...
Speaker 5 (02:15:29):
find pretty distasteful in general, which is, like, replacing actual, you know, friends or human contact or, like, in-home care with a fucking, like, Alexa machine, essentially. And to be clear...
Speaker 2 (02:15:40):
I still think this product might be a bad idea that doesn't work. But the guy behind it, who is the dude that we talked to, cares a lot and is really very clearly trying to do a good thing, and thought through the ethics and the efficacy of what he was doing a lot. And I'm not convinced it will actually do anything, but I, like, wish him
(02:16:03):
the best.
Speaker 5 (02:16:03):
No. Like, it specifically is designed to not look like a human, so that someone using it, you know, wouldn't, like, start to believe it's, like, human. Like, we don't...
Speaker 2 (02:16:11):
Want to trick people. We don't want them to mistake it.
Speaker 5 (02:16:14):
It refers to itself as a robot. Like, it refers to its own, you know, motors and functionality pretty consistently, to, like, you know, make sure that the person who's talking to it gets reminded of that. And something I talked about is, you know, there's been a lot of news stories this year about people building very unhealthy attachments and relationships to these kinds of AI programs, like
(02:16:36):
Character AI. There's a story, like, a year and a half ago about, like, a journalist who, quote unquote, you know, like, fell in love with some kind of chat thing that resulted in him killing himself. You know, but these kinds of systems, like... he...
Speaker 2 (02:16:47):
He was not a teenager? Wasn't that on Character AI? Or was that a journalist?
Speaker 5 (02:16:51):
Last year, there was a journalist who fell in love with an AI chat thing. A few weeks ago, there was the kid who, you know, was talking to this, like, Character AI thing.
Speaker 2 (02:17:00):
Also, I just need to reiterate: Her? Not a great movie.
Speaker 5 (02:17:05):
But, you know, there have been a lot of these stories of these things, like, going wrong, or, you know, encouraging, or, like, not stopping, you know, these, like, intense conversations with, like, suicidal ideation or, you know, self-harm, all these things.
Speaker 2 (02:17:18):
We brought these up kind of thinking he would flinch
away and not want to talk about it, and he
very much acknowledged that, like he was aware of this,
and this is something that they were attempting to build in.
Speaker 5 (02:17:27):
This is, like... this is, you know, built into it. I think this is still, you know, a big problem with this entire industry. I'm sure everyone would say, obviously, we have guardrails for this, and then it becomes a news story when those guardrails fail. Similarly, to go back to the Tesla bomb: you know, there are supposed to be guardrails in ChatGPT to make sure it doesn't tell you how to build a bomb, and those guardrails can fail.
Speaker 2 (02:17:49):
He showed us one, which was, like, he told the robot, I love you. What was it called? ElliQ. E L L I Q. I love you, ElliQ. And the robot, like, responded with, like, oh, that makes, like, all my fans spin, or something like that. Where he's like, I wanted the response to be that it's reminding the person talking to it that it's a machine, that it can't think or
(02:18:10):
love them back. We don't want it to be negative, but we don't want to be, like, feeding into that. And I don't know that that's the best way to do that, but, like, at least they're thinking about that kind of thing. The thing that was interesting to me is that he billed this as the first proactive home AI thing. So unlike an Alexa or whatever, where it's just waiting for you to ask it something, and it does not chime in randomly to talk to you,
(02:18:32):
or it won't, like...
Speaker 5 (02:18:33):
change the subject either, and, like, continue a conversation.
Speaker 2 (02:18:36):
This will prompt you out of the blue, be like, hey, how are you doing? How are you feeling today? It's been a while. And specific stuff:
Speaker 5 (02:18:41):
You want to see pictures of your family?
Speaker 2 (02:18:42):
Do you want to see pictures of your family? Do you want to call your son?
Speaker 5 (02:18:45):
You know?
Speaker 2 (02:18:46):
But do you want to play a game?
Speaker 5 (02:18:47):
Talk to me about that movie you saw last...
Speaker 2 (02:18:49):
Talk to me about that. Hey, remind me, how did you meet your husband? You know? Like, literally, these are all the things that it will do. And it had some side features, like, if it prompts you to start telling a story, it'll save that as, like, a memoir thing, so that, like, you know, when your elderly mother passes or whatever, it's saved up this, like, collection of stories over the years. And you can, like, show it pictures while you're telling it stories, and it will listen, and it'll have comments,
(02:19:12):
and it'll ask you further questions, like, so how did you feel, you know, after meeting them this way? Like, that's really interesting, I didn't know that. Explain to me how that worked. And it will also prompt you to send those to your kids. And the big thing: almost every kind of dialogue thing would prompt you to send a message to a friend or your kid. So a big part of it seemed to be, this is not
(02:19:33):
a replacement. This is a machine that we hope people will get comfortable with, and then it can prompt them to try to engage with the world more, and, yeah, loved ones, because that's our whole goal, is to connect them to people.
Speaker 5 (02:19:46):
I asked him, like, you know, part of this product is designed to, like, you know, help solve, like, loneliness in older adults. And, like, how much of this is really just kind of trying to, like, replace actual human contact with this, like, you know, AI contact? Will that really help, you know, loneliness? And he talked about how, like... I think he said, like, ninety percent of the people who use this, like, it results in actually more communication with their family.
Speaker 2 (02:20:10):
They have this in like some two thousand homes right now,
they have like two thousand units.
Speaker 5 (02:20:15):
It's, like, a subscription model. I think right now it is, like, ninety-nine dollars a month. It's gonna be boosted up to, like, one hundred and fifty, with some, like, extra features, in the next year.
Speaker 2 (02:20:22):
It's very much still under evolution. So one thing he pointed at is that, like, yeah, initially we had the ability to, like, connect people to other elderly folks using this, and so they've kind of formed their own community, had, like, a weekly bingo game, and asked us to build
(02:20:43):
in more chats so they can message each other directly. And so some of them are, like, playing bingo directly now through these machines. And I'm like, well, that seems probably good.
Speaker 5 (02:20:44):
Yeah, yeah. Because, like, I still am, like, fundamentally opposed to this premise. Yes. But it's interesting seeing someone still...
Speaker 2 (02:20:50):
But aging is sad. Aging, yeah, right, that's not their fault.
Speaker 5 (02:20:54):
And it's interesting to see someone like approach this from
like a you know, a very like compassionate standpoint, even
if I find the actual kind of nature of this
thing existing to be like deeply uncomfortable.
Speaker 2 (02:21:03):
Because, yeah, I can't not find it off-putting, but I think there's a chance that it will help with a real problem. I certainly would prefer if it helped, yeah. So, I don't know. It was kind of... it was a unique kind of product for me in this world, where it's like, I don't know that this application of AI technology
(02:21:26):
will actually do what you're hoping it will, but the vibe I got from that guy was nothing but good will.
Speaker 3 (02:21:33):
Yeah.
Speaker 5 (02:21:34):
Some of the other people we talked to today, who are completely soulless...
Speaker 2 (02:21:38):
Oh, yes, yes, nothing behind their eyes. Dead eyes, black eyes, like a doll's eyes.
Speaker 5 (02:21:44):
Even the way this guy was talking, you could tell... he had, like, a very empathetic voice, like, much like...
Speaker 2 (02:21:48):
One of the things he did is he would tell it, like, I'm in some pain. And then the robot would cycle through to the pain scale and would try to... because one of the things it does is it will take information for caregivers, and it will text proactively. So it's not just communicating with the old person; it will text and message their kids, you know, and whatnot, prompt their kids: hey, your mom's lonely.
Speaker 5 (02:22:11):
Yeah, or it'll even say if you know, someone like
didn't take their meds today.
Speaker 2 (02:22:14):
And again, it's kind of sad. But also, part of this is he was talking a lot about, like, empathy, and I think just because of the kind of brain you have to have to want to do this, he used it in terms of, like, the machine's empathy, which it doesn't have. But across the whole project, it was impossible not to see that he was a deeply empathetic
(02:22:36):
man who was really trying to make the world better. And I can't not respect that.
Speaker 5 (02:22:43):
Well, I think that does it for us here at CES.
Speaker 2 (02:22:47):
That's right. What a packed day. No worry: no empathy tomorrow, takes from just a real dead-eyed monster. A true villain you're gonna hear from in the next episode.
Speaker 5 (02:22:58):
I am a scumbag. I am the best that I'm gonna be, because it's the start of this week and I can still feel the CES magic. Yeah, by Friday, I am going to be a different person. I am going to rip some poor PR person to shreds, I swear. But yeah, tune in tomorrow to hear our takes from the CES
(02:23:20):
kind of side show called ShowStoppers, to hear also some exclusive, brand-new AI-generated ska music. So we'll give you that hint for tomorrow's episode. See you, see you there.
Speaker 2 (02:23:33):
Mm-hm. We'll see you all there. I love you all. Goodbye. Oh man, welcome to It Could Happen Here,
(02:23:53):
a podcast that's happening here, if here is your ears. If you're deaf and reading this, then it's happening to your eyes. Either way, it's happening.
Speaker 5 (02:24:04):
Here here also being Las Vegas.
Speaker 2 (02:24:07):
Well, yes, also Las Vegas. That's Nevada, not the other one. Nevada, yeah, uh-huh. Podcast number three. How the time does fly.
Speaker 5 (02:24:19):
Sure does.
Speaker 2 (02:24:20):
By the time you listen to this, Garrison and I
will have just had the best meal that we're going
to have.
Speaker 5 (02:24:25):
Oh my god. Yeah, it's tomorrow for.
Speaker 2 (02:24:27):
Us still, but we're very excited about Morimoto, which is a fantastic... Every year we have a very special dinner, just them and me and a couple of friends, who will remain anonymous because people get weird on the internet.
Speaker 5 (02:24:41):
Sometimes it is literally the highlight of my year. Sometimes
it does keep me going. Actually really gives me a
lot of power. Some of the best tacos I've ever
had in my life. So good. Uh huh.
Speaker 2 (02:24:52):
Anyway. Ah, we're just thinking about delicious food. Let's talk about the dead-eyed ghoul we met. Oh wait, no, not yet. We met a dead-eyed ghoul that I'm gonna spoil now. Real monster, like, real, real, real evil vibes. Like, as soon as I met this guy, shook his hand, it was like, oh, if this guy gets power, he's going to be responsible for a
(02:25:13):
lot of death and suffering.
Speaker 5 (02:25:14):
I mean, speaking of... I kind of think he will.
Speaker 2 (02:25:16):
He's just not that talented. He wishes. But you never know where these guys are gonna end up.
Speaker 5 (02:25:21):
Speaking of sad evil... uh, Twitter, X, the everything app, that's what people are calling it. They gave a keynote, which was very sad.
Speaker 2 (02:25:31):
The CEO, Linda... Linda really Yaccarino'd about Twitter for a while. Oh, so bad.
Speaker 5 (02:25:38):
So they started by talking about how Facebook, Meta, has copied Twitter's, like, fact-checking policy of actually not having real fact checks. Yes, now, maybe fact-checking has actually kind of failed as an industry, but, you know, our problems with fact-checking are perhaps very different from these people's problems.
(02:25:58):
And the fact now that Facebook is walking away from actual, like, genuine fact checks against, like, disinformation, misinformation, and parting ways with using, like, legacy media outlets to verify information, because those media outlets are too political, quote unquote, and instead is copying the current X model of free speech, and specifically saying, like, there's been
(02:26:19):
way too much censorship on gender issues.
Speaker 2 (02:26:22):
Now you can comment that women are a piece of property, well, I.
Speaker 5 (02:26:26):
Mean I think specifically this is this is like trans
like no, no, no stuff too.
Speaker 2 (02:26:29):
One of the things that has a specific exemption now is that you can refer to women as if they are property on Facebook.
Speaker 5 (02:26:36):
This is the future of communication.
Speaker 2 (02:26:38):
Right, Yeah, thank god, Linda is really blazing a trail
for women everywhere.
Speaker 5 (02:26:43):
Linda was very excited about that. And they Yaccarino'd about that for, like, a good ten minutes, about how, you know, this is where we're really entering a new era of free speech and social media. And then she got asked a question about how much X, Twitter, the everything app, will take a part in Elon
(02:27:04):
Musk's plans for the Department of Government Efficiency, DOGE. And this got the first applause of the panel. Applause only happened two times, and the DOGE section was the first, like, you know, room-starts-clapping moment, everyone goes crazy. How many minutes in was that? Oh, maybe it was, like, maybe, like, twelve, thirteen minutes?
Speaker 2 (02:27:26):
People really... yeah, that had to be intentional here. This is not like they were just overdue for claps.
Speaker 5 (02:27:31):
No, no, no. They talked about... Vivek talked about, you know, Elon turning to Twitter, X, the everything app, for, like, suggestions on which government agencies to get rid of.
Speaker 2 (02:27:42):
I hope we get rid of the ATF, so that... make machine guns mandatory? Why not at this point, right? It can only help. It can only help. Look, if we learned anything from a thing I'm not going to specify that happened late last year, more suppressors is always handy.
Speaker 5 (02:28:05):
The second thing that got applause was what they talked about next, about, you know, everyone's turning to X, Twitter, the everything app, for information now, and Twitter, X, the everything app, played a crucial part in bringing to light the Muslim rape gang story in the UK, and how that was so important for saving
(02:28:25):
children, and we have to post more, not less. And, like, this was the other thing that got massive applause, was talking about the rape gangs.
Speaker 2 (02:28:35):
People love rape gangs. People love rape gangs. That was a pretty good Star Trek episode, that was Doctor R's planet with the rape gangs.
Speaker 5 (02:28:42):
Of one of the more black pilling things.
Speaker 2 (02:28:45):
It wasn't a very good Star Trek episode.
Speaker 5 (02:28:46):
It's also not a good Trek episode. I was referring to the panel, not the Trek episode. But that's the other thing that got massive applause: it's, like, save-the-children-type rhetoric, and, you know, saying, you know, like, as a mother, it's so important that more people post about this problem. Those were the two big applause moments. But I think, in general, this
(02:29:07):
whole panel was trying to, like, you know, demonstrate how symbiotic a new Trump presidency and Elon Musk's Twitter are.
Speaker 2 (02:29:14):
This is, like, a direct info line, this is a tap to the Trump presidency.
Speaker 5 (02:29:18):
This is how you talk to the new government. Like, this is how you talk to all of these new people, all these new cabinet members. They're all on Twitter. They're all talking on Twitter. This is how you stay connected to the new government.
Speaker 2 (02:29:30):
It's interesting. One thing I'm curious about... so this is a thing that happened with the last set of Nazis that gained power in a country in a big way, the German ones. There was this common attitude of, like, if only Hitler knew. Because Nazi policies didn't help the people they were supposed to help; they hurt a lot of people. Like, they were just bad at everything, like fascists tend to be. And there was this attitude that, like, well,
(02:29:52):
Hitler can't know. Like, the fact that, like, the country's been handed over to gangsters who were continuing to hurt the people Hitler promised to help... he must not be aware. Like, if he knew, he would fix this. If only he knew. So I'm wondering how that's going to play in here, as Trump's policies continue to hurt a lot of the people who voted for him. Not the rich people who voted for him, but the
(02:30:12):
people who, like, flipped between him and Biden or whatever. Like, those folks are going to get fucked like the rest of us. And I kind of wonder when the blowback against X, the everything app, will happen, right? Like, as people are, like, either I'm being ignored, or I'm being called, like, a retard by Elon Musk for complaining. That's,
(02:30:37):
like... Elon Musk tweets that randomly at people when they make very valid critiques of the shit that he's doing. Like, that's literally what he's calling people. He's saying it, like, constantly. I'm not using it as a slur, that's just the term he's using. If they comment that, like, their fucking Medicaid got cut because Trump put Doctor Oz
(02:30:57):
in charge of it, and Elon Musk calls them, like, you know, a slur, what does that do to you? Like... I don't even know. I don't even have anything more intelligent than, like, yeah, I wonder what that does to Twitter's bottom line.
Speaker 5 (02:31:09):
Yeah, I mean, yeah, I'm not sure if they care anymore. I mean, something else Linda talked about is how, you know, Twitter's the only place for independent news to spread. And as both of us have, you know, worked in independent journalism... nothing spreads on Twitter anymore.
Speaker 2 (02:31:24):
Now if it's news, it doesn't. The only thing that
spreads is yeah, lik