Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to episode two hundred and sixty three of the
Death of Journalism podcast. My name is John Zigler. I'm
your host on today's show. President Trump gets unprecedented and
nearly universal praise for the peace deal between Israel and Hamas.
But is it really a great deal? RFK Junior makes
perhaps his craziest autism claim yet. I'll tell you something
(00:22):
you did not know regarding the Katie Porter meltdown in
the California governor's race. I'll also provide a very deep
dive into the artificial intelligence issue that you will not
get anywhere else. And Penn State University makes its second
worst football coach firing in its storied history. I have
(00:46):
been promising for several weeks that I would do a
deep dive into the entire artificial intelligence issue that I've
been studying quite extensively, and I keep waiting for there
to be a slow news week for it to be
an appropriate time to spend most, if not all, of
a podcast episode on that topic, and it just never
(01:07):
seems to happen. So at the risk of having another
long episode, I'm going to do it in this week's episode,
regardless of what other news transpired, and there has been
a lot that has actually taken place over the last week.
It seems to happen every single week, So I'm going
to try to go through the biggest stories as succinctly
(01:31):
and as quickly as I can so that this isn't
some sort of four hour podcast episode. But obviously there
is a lot to talk about in the news that
has occurred over the last week. The biggest story by
far is that there has been a ceasefire and an
apparent peace deal between Israel and Hamas that was negotiated
largely by the United States, specifically by Trump's son in
(01:56):
law Jared Kushner, among others, and Donald Trump himself. The
hostages that were taken as a result of the horrendous
terrorist attack just over two years ago on October seventh,
by Hamas were released to great fanfare, and there's obviously
a lot of joy surrounding the release of those hostages.
(02:17):
Trump has been getting an insane and truly unprecedented amount
of praise, so much praise that Trump even thanked the media,
which you've never seen Trump do. He thanked the media
profusely for their overwhelming praise of what he has been taking
(02:38):
credit for in brokering this peace deal, this ceasefire,
this hostage release. And there's no question that both on
the right and the left, the media coverage of this
has been extremely positive, extremely positive, and I can understand
why that would have a lot of impact on people.
(03:01):
In a weird way, this peace deal has been kind
of a litmus test for Trump derangement syndrome. Right. I
see this a lot from people on the left. They're
actually hesitant to criticize this at all because it's so
overwhelmingly popular. It's being praised by so many different varieties
of people, so many different countries, that you don't want
(03:24):
to look like you have TDS by standing up and saying, hey,
wait a minute. Now, to me, that's not the definition
of TDS. I get accused of having TDS all the time.
To me, the definition of TDS is or ought to be,
that you say something incorrect or you come to an
incorrect conclusion about a topic because of your derangement regarding
(03:48):
Donald Trump. And that's something that I have tried exceedingly
hard to avoid over the last ten years, and I
think, for the most part, I've succeeded in not
succumbing to those temptations as someone who does not like
Donald Trump personally and has opposed him politically on most
but not all things. To me, the standard here is,
(04:10):
are you saying something inaccurate? Are you analyzing something in
a faulty way because you're allowing your view of Trump
to alter your ability to perceive reality. And I have
to tell you, folks, I think there's another side of
this story. And I think the best way to understand
(04:33):
that there is another side of the story is how
this deal got made. Right now, there's been so much
praise over the negotiation skills of Kushner and Trump and Rubio,
and everyone is very happy right now. And I hope
it all turns out great. I really do. Everybody wants
(04:55):
peace in the Middle East. It's fantastic that the hostages
are returned. But somebody's like, okay, so how did this happen?
Why did this happen? What was different about the way
that the Trump team went about this to get this
deal done, and for instance, the Biden team, because obviously
(05:16):
this happened under Biden's watch. Well, to me, this is
not that difficult to understand what really transpired here, and
that is that Trump, who is a master at understanding
leverage as a real estate guy, this has always been
his strength. He understands leverage very very very well. And
(05:38):
the traditional way of trying to broker a peace deal
or a hostage release or a ceasefire when Israel is involved,
Israel obviously being the ally of the United States, the
traditional strategy would be to support Israel full throttle and
to try to force Hamas into making a good deal
(06:03):
for Israel. That would be the traditional way of doing things.
I don't think that's the way the Trump team did
this at all. I think they used a reverse leverage strategy.
And I think you're gonna realize as I describe this,
that this is something Trump has already tried, either consciously
(06:25):
or subconsciously when it came to ending the war with
Ukraine and Russia, which he promised he would end on
day one and has not been able to do so.
In the Ukraine Russia situation, he publicly castigated Zelensky, effectively
cut his balls off, told the whole world he has
no cards, and to the extent there was a strategy there,
(06:48):
it was to force him into taking a bad deal
with Putin. Well, Zelensky hasn't done that, and there hasn't
been a deal yet between Russia and Ukraine. But I
think the same basic strategy is at play when it
came to Israel and Hamas. Trump understood that Israel is
(07:12):
completely lost without the United States, maybe even more so
than Ukraine is without the United States, and so therefore
Trump used and those around Trump used our leverage on Israel,
in my opinion, to force Israel into a deal that
really is not all that great at all. And then
(07:35):
they used the leverage that they have, specifically the relationships
that Kushner has with the allies of Hamas, to incentivize
them into forcing Hamas to finally be semi rational and
agree to a cease fire and hopefully a peace deal
(07:56):
that lasts. And by the way, part of this equation
that I've been very baffled by is how is it
that we are celebrating universally this peace deal when peace
deals in the Middle East are notoriously difficult to enforce
and maintain. This seems like an obvious premature celebration. Again,
(08:16):
I'm as hopeful as anybody this holds that it works,
that there's peace, people stop dying. It's great that the
hostages are released. Nobody rational is denying any of that.
But when you look at what actually transpired here,
I think this is a lousy deal for Israel. I mean,
(08:38):
Israel was the one that was attacked horrendously, horrifically, and
so what does Israel get. They get twenty civilians that
have been held in probably horrific conditions against every moral
and legal principle known to man for the last two years.
(09:00):
They get them back. They still don't have even the
bodies of most of the people that were killed in
that horrendous terrorist attack. They get that. But Hamas gets
back almost two thousand soldiers. I mean, right there, you're wondering, Okay,
how does that equate? And then the security measures to
(09:21):
maintain this thing long term don't appear to me to
be in Israel's self interest. So how did this happen?
Netanyahu appears to be very much in favor of
this deal, and you know, no one could ever accuse
him of being soft on Hamas. Well, my sense of
what really transpired here is that over two years Israel,
(09:44):
at least some elements of the political factions within Israel
got tired of this war. Politically, it was becoming a burden.
And then you have Trump basically placing a knife on
Israel's balls and saying, look, I'm done with this. I
need you to make a deal. Then what's Netanyahu gonna do?
(10:08):
And he certainly will once the deal is made, unless and
until Hamas breaks the peace deal, which could be a
very interesting scenario should that happen down the road. But
at least for now, Netanyahu has to play nice.
He has no choice because Trump has all the leverage. Now,
of course, everybody is praising this, and so how do
(10:28):
you explain zig that everybody in the media. How could
all the media that's so critical of Trump be wrong
or at least somewhat wrong when it comes to universally
praising Trump and his team for what transpired here. Well,
to me, it's not that difficult to understand from a
(10:49):
media perspective. The right wing media is going to go
with whatever Trump says is the truth. So if Trump
is making the deal and it's creating, at least in
the short run, peace, then that's inherently good. So the
entire right wing media is going to go with Trump
(11:09):
no matter what. Even Ben Shapiro, who has been cheerleading
this whole thing. He actually made the trip, like on
the team bus to Israel to celebrate this cease fire.
I would love to know what Ben Shapiro's true opinions
are on this. I find it hard to believe that
the Ben Shapiro that I've been watching for many, many
(11:31):
years really thinks this is a fantastic deal. Here's what
I am pretty darn sure of when it comes to
Ben Shapiro and other people like him in the right
wing media. If Joe Biden made this exact same deal,
or Kamala Harris made this exact same deal, they would
be furious, furious, and I'm very sure of that because
(11:56):
they would be free to speak their true beliefs. But
because of Trump's hold over the right wing maga media,
it's far easier for them. They are deeply incentivized to
not only go along with this, but to cheerlead it,
to literally be part of it. And by the way,
when I say, you know what, if Biden made the deal,
(12:19):
this deal seems like it's almost exactly the same deal
that the Biden team tried to make. They just didn't
go about it in as effective or as
cynical a fashion as the Trump team did, and the
circumstances were a little bit different because I think Israel's
gotten tired, both physically as well as politically. So the
(12:44):
right wing media is completely beholden to whatever Trump wants,
and for the left wing media, inherently, peace is always good,
hostage release is always good. But let's face it, not
all but a lot of the left wing legacy media
is not a big fan of Israel, despite a lot of perceptions,
the way, you know, the New York based media
(13:06):
used to be in this country. It's not that way anymore.
So I think they see this as a fantastic development
that they're begrudgingly giving Trump credit for. Hey, we got
a peace deal, the hostages are released, and it looks
like Israel didn't win here. That's my perception of why
we have this remarkable universal nature of the reaction of
(13:29):
the media coverage to this being in a positive direction.
And I have to tell you, I was pretty confident
of my take on this until the one major media
member that I saw basically saying the same thing happened
to be Chris Cuomo on NewsNation. Chris Cuomo,
who I really have a huge problem with going all
the way back to the old Penn State things, since
(13:50):
he was the only guy to ever interview Aaron Fisher
for a major television outlet when he was with ABC.
So I'm not a Chris Cuomo fan, but I thought
Chris Cuomo did a hell of a job articulating pretty
much exactly what I just said without the analysis of
how the strategy was different and what I'm calling the
reverse leverage strategy, using your leverage against your ally as
(14:13):
opposed to your opponent. And part of that equation is
this unique relationship that Trump and Jared Kushner have, some
might say not unique but deeply financially invested, relationship that
Trump and Kushner have with the Middle East and specifically
(14:33):
Cutter or Qatar. It's said both ways, I'll go with
Qatar, whatever. And so we had a remarkable moment and
a very controversial moment in the aftermath of this deal
that I thought was very very telling, but for reasons
that were different than the way the story ended up
getting perceived. You're probably aware that Qatar, along with other
(14:56):
Middle Eastern countries, was very critical, in fact, in putting
pressure on Hamas to go along with this deal. And
it was very clear that Secretary of War Pete Hegseth
was connecting that cooperation that Qatar provided with a benefit,
(15:19):
that Qatar was getting a benefit that, when Hegseth articulated it,
sounded off the wall bizarre, because frankly, Hegseth misspoke
and made it sound as if part of this deal
was that Qatar was going to have its own military
base in the United States of America. Well, we would
(15:40):
later find out that that's not really what's going down,
that they're just going to have some pilots trained by
the US military at a base in the United States. But
this sounded horrific. Lots of people reacted to this very
very strongly, and understandably so, because Hegseth at best totally misspoke.
(16:00):
But to me, the most important part of this was
the obvious and direct connection the quid pro quo between
Qatar putting pressure on Hamas. By the way, Qatar,
I screwed up the pronunciation again. Qatar was cited by Trump
himself in twenty seventeen as a major sponsor of terrorism
throughout the world and a bad actor. So here, all
(16:23):
of a sudden, Qatar is a good actor. Gee. I
wonder if part of that's because they just signed a
massive deal for the Trump Organization to build a golf
course there. And of course, Qatar is also the country
that gave Trump a four hundred million dollar airplane to
replace Air Force One. Gee, what an amazing coincidence. Well,
(16:44):
here we have Qatar playing a significant role in all
this and being rewarded with Pete Hegseth announcing, in very
screwed up fashion, this new deal to train the
Qatari Air Force in the United States. And for the record,
here's what it sounded like when Pete Hegseth really screwed the pooch,
(17:07):
no one.
Speaker 2 (17:07):
Other than President Trump could have achieved the peace that
we believe will be a lasting peace in Gaza,
and Qatar played a substantial role from the beginning working
with our folks to ensure that came about.
Speaker 1 (17:19):
So I want to thank you for that historic peace
and look forward to.
Speaker 2 (17:23):
Joining the President as that gets... it's already been delivered,
but as that's formally signed as well. And I'm also
proud that today we're announcing or signing a letter of
acceptance to build a Qatari Emiri Air Force facility at
the Mountain Home Air Base in Idaho. The location will host
Speaker 3 (17:43):
a contingent of Qatari
Speaker 2 (17:44):
F-15s and pilots to enhance our combined training, increase
Speaker 1 (17:48):
lethality and interoperability. It's just another example of
Speaker 2 (17:52):
Our partnership and I hope, hope, you know, your excellency
that you can.
Speaker 1 (17:56):
Count on us. Now when you hear Hegseth say that,
any rational person is going to go, what the fuck
is that? I was definitely in that camp, but it
was one of those things that was so crazy. I'm like,
how can this possibly be? There's no way this is true.
And it turns out that it's not true, that Hegseth
(18:16):
greatly exaggerated and misspoke about what the apparent reality of
this is. That did not stop someone like Laura Lumer,
who has been a very close advisor to President Trump
and has even gotten him to fire very significant people
in his administration. She's a MAGA influencer, and she went
(18:37):
absolutely bananas. She posted like forty times on social media
how outrageous this was, and even suggested that she might
not vote in the twenty twenty six midterm elections. Now
I don't know whether or not she's finally come to
her senses and realizes that this is all much ado
about very very little. But that was a very very
(18:58):
telling moment, but not for the reasons that it got
all the publicity. To me, this was an insight into
how this deal got done. And you might be saying, zig,
who cares the deal got done, Let's hope that it
sticks and let's hope for the best. I'm in that camp.
I'm fine with that idea. But to me, I think
(19:21):
Hamas won here. When I look at what Hamas got
and what Israel got, I'm like, okay, you know,
other than having to endure two years of bombardment from Israel,
what exactly was the negative in the long run that
Hamas suffered because of this outrageous, horrific nine to eleven
(19:44):
style October seventh attack from two years ago? What was it?
To me, I don't think that Israel won here, certainly,
not to the degree. I don't think Hamas lost to
the degree that they deserve to lose. But Trump wanted
a deal, and he was willing and able to use
his leverage on his ally and then use relationships with
(20:06):
the allies of Hamas to pressure both sides. That doesn't
mean it's gonna hold. I mean, these things are notoriously
difficult to keep together, and you know, I think there's
a very good chance that it does not hold. Hopefully
it will. I'm definitely in the camp of
people who are like, this is fantastic, if it holds, great.
(20:29):
I don't like the precedent and I get a little
nervous when the entire media industrial complex is praising Trump
for something that, Okay, maybe in the short run deserves
some praise. But there's another side to this. There's
a reason why this deal was able to get done when,
for instance, the Biden team couldn't get it done. You
(20:51):
know the old only Nixon can go to China thing,
which I guess is now so old that many people
probably listening to this don't even understand what that means.
But that's an old adage in politics, that only
Nixon had the credibility to go to China without fearing
being called, you know, a sell out to China. Well,
(21:11):
only Trump can sell out Israel, partially because of his
relationship with Netanyahu. And I did find it interesting
that when Trump went to Israel, he made a plea
publicly to the president of Israel to pardon the Prime
Minister Benjamin Netanyahu. That felt like a little bit
(21:32):
of a quid pro quo too, Like, knowing what I
know of Trump's psychology, I don't think Trump does that
unless he feels like Netanyahu deserves something a
little extra, like, you know, a little something extra for
the effort. That felt very much like, maybe quid pro
quo was the wrong phrase. But knowing what I know
(21:54):
of Trump's psychology, for him to publicly make a plea
for a pardon for the criminal charges that Netanyahu is
facing in Israel, feels like, you know, Benji
really did me a solid here. He's kind of getting
the raw end of this deal. Maybe I could help him
out here. That's just my instinct based upon
(22:15):
what I understand of Trump's psychology. But I am
not somebody who thinks that this is an amazing
bit of diplomacy and a miracle situation here. It may
not hold, and I don't think it's all that great
of a deal. Again, to be clear, this
is not Trump derangement syndrome, because one, I think
(22:36):
my analysis is dead on, it's factually based, and two
I'm hoping it's successful. It would be fantastic if this
was somehow successful. It'll be very interesting to see what
the media reaction is if it turns out that it doesn't hold,
because a lot of the things that have been said
may be taken back, and you know, the intensity of
(22:56):
that reaction might be quite strong. I mean,
Trump has praised the hell out of Marco Rubio as
the greatest Secretary of State of all time. I think
we ought to remember that. We'll see how long that holds,
depending on what happens with regard to the peace deal
between Hamas and Israel. But obviously, you know, this is
a story, one of many, that we'll keep an eye
(23:17):
on at the Death of Journalism podcast. And I hope, I
hope actually we never talk about this again, because I
hope that means that the peace deal has somehow held.
And moving on to other topics, I've got to at
least mention the latest when it comes to RFK Junior,
the head of Health and Human Services, and his crusade
(23:39):
against autism and his desperate attempt to try to come
up with some explanation for why autism rates have exploded
in the last two decades in America and around some
of the world, although not all of the world. And he's
really said a lot of crazy things, so it's difficult
to discern which is the nuttiest. But this whole idea
(24:01):
that acetaminophen, or Tylenol, is a primary
cause of autism, to me is just so utterly ridiculous
for a number of reasons. But it doesn't to me
even get out of the batter's box as a theory,
one, because the studies on which that theory is based
are incredibly weak and not definitive at all.
(24:24):
And the number one thing you have to remember in
these types of situations is that correlation does not equal causation.
This is something that apparently RFK Jr. Was never taught,
or if he was, he's forgotten about it, because he
does believe, at least when it fits his agenda, that
correlation does equal causation. And to me, the reason why
(24:46):
the Tylenol theory doesn't get out of the
batter's box is, one, autism existed before Tylenol, but
that to me is not nearly as definitive as the
other element of this, the other side of the timeline issue,
which is that Tylenol was being used widely for
about half a century before there was a massive spike
(25:09):
in the autism diagnosis rates in the United States of America.
So how does that make any goddamn sense. It doesn't.
It's absurd on its face, and so you have incredibly
weak studies, but it doesn't even pass any logic test.
And there was another public event where Trump was in attendance,
(25:30):
and RFK Jr. was posing another new theory or another
alleged piece of evidence to show that, I guess, Tylenol
causes autism. I'm playing this mostly because
it just shows the weakness of RFK Junior's case, as
well as the bizarre lengths he's willing to go to
(25:53):
try to prove something that is not provable because it's
not true. And specifically, I'm referring to the idea that
RFK Junior, in a public event in the presence
of the President of the
United States of America, actually tried to claim that, because allegedly,
and I'm just going to presume he's telling the truth
(26:13):
here because I don't have the data in front of me,
but presumably autism rates among men who have been circumcised
are twice as high as among men who have not
been circumcised. And just for the record, here is what
RFK Junior had to say about this, a comment, by
(26:36):
the way, that Donald Trump approved of in this publicly
televised meeting.
Speaker 4 (26:42):
Yeah, there's also, just, there's many, many other confirmation studies.
There's two studies that show children who are circumcised early
have double the rate of autism. That's highly likely because
they're given Tylenol. You know, none of this is dispositive,
but all of it we should be paying attention to.
Speaker 1 (27:04):
Yeah, but you.
Speaker 4 (27:05):
Know, there's a tremendous amount of a proof for evidence.
I would say it's a non doctor, but I've studied
this a long time.
Speaker 1 (27:13):
Ago, you know what I mean. All right. Now, the
idea here in theory is that because boys get part
of their penis chopped off during circumcision as babies, and
therefore they're given tilot all, I guess, which is not
a fact, not an evidence that somehow this must be
(27:36):
must be. There's no other explanation for this other than
the idea that uhha, a sedam in event or tilt
all causes uh autism. Because we have this alleged fact
that boys that are circumcised are twice as likely to
later be diagnosed with autism than boys that are not. Well,
(27:56):
there's so many problems with this, but I'm just
going to go through a couple of them real quick. First,
is that being twice as likely in this kind of
a situation is not that much of a correlation. I mean,
it really isn't. I mean, if it was ten times
or twenty times, okay, wow, that's a pretty strong correlation.
(28:16):
Let's find out, you know, what the causation for that is.
But twice as much, well, it sounds like a big correlation, but it really isn't.
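Since the point here is statistical, a quick back-of-the-envelope model may help. This is a minimal sketch with entirely invented numbers (the group split, circumcision rates, and diagnosis rates are all assumptions for illustration, not real data); it shows how a cultural confounder alone can manufacture a roughly two-to-one relative risk even when circumcision has zero causal effect:

```python
# Toy confounding model with entirely invented numbers: diagnosis rates
# depend only on cultural group, never on circumcision itself, yet the
# circumcised population still shows roughly double the diagnosis rate.

def relative_risk(p_group_a, p_circ_a, p_circ_b, p_diag_a, p_diag_b):
    """Relative risk of diagnosis (circumcised vs. not) when diagnosis
    probability is set purely by group membership (the confounder)."""
    p_group_b = 1.0 - p_group_a
    # Overall share of the population that is circumcised / not.
    p_circ = p_group_a * p_circ_a + p_group_b * p_circ_b
    p_not = 1.0 - p_circ
    # P(diagnosis | circumcised): weight each group's diagnosis rate by
    # how much of the circumcised population it contributes.
    p_diag_circ = (p_group_a * p_circ_a * p_diag_a
                   + p_group_b * p_circ_b * p_diag_b) / p_circ
    p_diag_not = (p_group_a * (1.0 - p_circ_a) * p_diag_a
                  + p_group_b * (1.0 - p_circ_b) * p_diag_b) / p_not
    return p_diag_circ / p_diag_not

# Invented inputs: group A is 60% of the population, circumcises 90% of
# boys, and seeks diagnoses at 3%; group B circumcises 10% of boys and
# seeks diagnoses at 1.2%. Circumcision itself does nothing in this model.
rr = relative_risk(p_group_a=0.6, p_circ_a=0.9, p_circ_b=0.1,
                   p_diag_a=0.030, p_diag_b=0.012)
print(f"relative risk with zero causal effect: {rr:.2f}")
```

In this toy model the "twice as likely" figure appears even though circumcision does nothing at all, which is exactly the correlation-versus-causation trap being described.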
Number two, we don't even know whether or not you know,
boys that are circumcised are given Tylenol as
a baby, you know, generally circumcision, you know, the baby's
over it in a very very short period of time.
(28:40):
And so that part of the equation isn't even
based in any sort of data or facts. It's just
supposition or theory. But the biggest problem is the idea
that you're using circumcision, which is an inherently social phenomenon, right,
It's a cultural phenomenon. There's nothing, you know,
(29:06):
other than the social and cultural principles and mores and
religious beliefs and traditions of a particular family that causes
people to get circumcised. So it's inherently a situation where
you have a group of people. The boys that are
circumcised are growing up in a culture that is you
(29:31):
it's not really unique because it's incredibly popular, but it's
different from the culture of boys who are not circumcised.
That's just so obvious. I feel stupid even mentioning it.
So clearly there's a difference in culture, because
it is a cultural phenomenon, circumcision. Well, once you have
(29:51):
a group that is clearly different culturally, then inherently you've
lost any chance of proving the causation argument, because there
could be all sorts of explanations that are far more
logical than the fact that babies who get circumcised might
(30:14):
be given Tylenol for the pain, which to me
doesn't even make any sense, especially since it's a one
shot deal. I mean, off the top
of my head, I can think of several far more
plausible scenarios to explain this rather minor correlation. The first
one that comes to mind is, Okay, so you have
(30:36):
a culture that believes in circumcision, is it not possible
that that culture might also marginally be more likely to
want to have their boy diagnosed with autism as being
on the spectrum to explain odd behavior, which I think
(30:58):
is driving a huge part of the spike in autism
rates over the last couple of decades. It is a
social contagion. So you have a social construct of circumcision
playing into this social contagion of autism diagnosis. That makes
(31:18):
a lot of sense to me. There's also an economic issue.
You could have completely different economics among the circumcised group
as opposed to the uncircumcised group, but it's mostly cultural,
and to me, in a weird way, this actually proves
my argument that this is in fact the social contagion,
(31:39):
this autism spike over the last couple of decades. To me,
that is the only explanation that makes total sense. And
I've explained in my own experience, both through my wife
who's a special ed teacher, and my own experiences with
older people who would have been diagnosed with autism if
they had been born later in life that clearly were
(32:00):
on the spectrum that weren't because we had a situation
that was where autism was underdiagnosed and now it's way
over diagnosed because it's actually become cooler. It's not just accepted,
but something that a lot of parents want to have
their kids diagnosed as being on the spectrum
(32:21):
for a lot of reasons, some of which are economic.
According to my wife, the special ed teacher, I've said
before that she's had parents beg her to help get
their kid put on the autism spectrum. So if
you have a social construct, a cultural construct like circumcision
that has a correlation with autism, that to me actually goes
(32:41):
against your entire theory. But RFK Jr. is deeply invested
in this. In fact, he even said at one point,
now a lot of people made a lot of this,
and I understand why they did. He might have kind
of misspoken, but he did actually say, we don't have
the proof that, you know, acetaminophen causes autism. We're
(33:02):
doing the studies to get the proof. That's almost a
direct quote. Well, that doesn't sound very scientific to me.
That sounds like you have a conclusion that you're trying
to come up with enough data to plausibly support and
you know it's not just circumcision, by the way. Maybe
an even worse example that RFK Junior tries
to use all the time when it comes to
(33:24):
acetaminophen is Cuba. Apparently, according to RFK Junior, and again
I don't know whether or not this is even true,
but I'll just accept it for the sake of argument,
Apparently Cuba has a very low rate of Tylenol
use and a very, very low rate of autism. Well,
who fucking cares. There's a million things about Cuba that
are completely different that could be the cause of both
(33:48):
of those, including a really horrendous economy and terrible healthcare
and no incentive to create an autism diagnosis. So
to me, that just completely destroys RFK
Junior's credibility. And I've said from the very beginning he's
(34:09):
a nut job and he's dangerous in the position that
he's in. Occasionally he'll say things that I like.
It's a bit like Tucker Carlson.
Occasionally he'll say things that are really smart, and you're like, yeah,
that's great, and I wish more people would say that.
He did that, you know, with regard to the COVID
(34:30):
reaction in his congressional testimony recently, and I applauded that.
But for all the good things that he occasionally says,
he says a lot of crazy things that are not true,
and I think he further exposed that when it came
to his most recent comments on the issue of autism. Now,
I want to turn to a couple of developments here
in California that made big news over the last week.
(34:54):
The first is that they have arrested a suspect in
the Palisades fire, the horrendous wildfire that destroyed Pacific
Palisades at the beginning of this year. And I found
this to be rather remarkable on a number of levels,
(35:14):
mainly from a media perspective. I'm just astonished as to
how relatively little news coverage this development got and how
little outrage I have sensed or perceived exists, not just
in the media coverage but among the public here in California.
(35:36):
I mean, this is a guy who, if he is
indeed guilty, and it appears as if he is. Although
I do think they're going to have some problems proving
the case in court, because their allegation is that he
set a fire on New Year's Eve or early on
New Year's Day, and then it's smoldered for seven days,
(35:58):
it was not properly put out, and then eventually, because
of the winds that picked up and the dry conditions,
that fire re-sparked and ended up destroying Pacific Palisades.
And you know, twelve people died, and tens of,
if not hundreds of, billions of dollars in property
(36:19):
damage was created, and all sorts of lives were destroyed.
I mean, you could make a very strong argument that
this caused way more damage than the worst serial killer
in American history, and yet very little outrage, even in
the media coverage. We got a lot of media
(36:40):
coverage for about half a day, and even here in California,
you hear almost nothing about this. There is no
real anger about it, and it's more of just a curiosity. Oh,
they finally got a guy. And I don't have a
really great explanation for it. It doesn't make any sense to me.
I mean, it's not like this was ten
years after the fact, we're still within the same calendar year.
(37:03):
And it's just a weird psychological phenomenon that people, when it comes to setting a fire, I don't think that they think of it the same way as if, you know, someone set off a nuclear bomb, but that's effectively
what happened here, and so I just found that to
(37:23):
be incredibly odd. And of course, you know, most of
Pacific Palisades is nowhere near even getting started in the
rebuilding process. Even I have been surprised by how slow
everything is going. I thought that with a mayoral election
and a gubernatorial election and Olympics coming up here in
(37:44):
Los Angeles, that everybody would be greatly incentivized, especially Gavin
Newsom since he's going to be running for president in
twenty twenty eight, and right now, unfortunately, it looks like he could be the leading contender for the Democratic nomination in twenty twenty eight. With all those factors, I thought, even
Democrats are going to realize it is very much in
their self interest to make sure they fix this goddamn
(38:07):
thing and cut some of this red tape and allow
them to be rebuilding, because this is not something you can really spin. I mean, if four years from now, when we've had the Olympics and the presidential election going on, you know, in three years, and nothing has been done to rebuild, that's gonna look really terrible. But they don't seem to give a shit.
(38:30):
So I'm just totally baffled by the lack of Democratic urgency to make sure that there's some semblance of rebuilding,
as well as the rather muted reaction to the arrest
of this suspect. And I'm the first person who, you know, I don't like to see people railroaded, and I don't
like to come to conclusions without evidence. I don't like
(38:52):
the rush to judgment. I have some concerns about the
nature of the evidence against this guy. But it's just
it's just baffling to me how it's really not been
much of a story at all, considering the impact of
the Palisades fire, less than a year ago now, probably getting more coverage nationwide than the arrest of the suspect
(39:15):
and the arsonist. By the way, it should be pointed out it wasn't climate change or global warming. Let's make that clear. Gavin Newsom claimed numerous times in the aftermath of both the Pacific Palisades Fire and the Altadena fire that this was evidence of climate change and global warming. Well, there's no evidence of that. We now know that this
(39:35):
was, even though there were environmental circumstances that allowed it to happen, facilitated by a human being, a criminal arsonist. So Newsom was completely and totally wrong
about that. And don't let me forget to mention that
element of it. But with Newsom being termed out as governor,
there's a race for governor that is starting here in
California, and that's going to be not this November. Obviously, Newsom
(40:00):
has his Proposition fifty, the redistricting proposition, which I think is going to be very, very close. I'm still predicting it will probably win by about fifty-one to forty-nine, maybe fifty-two to forty-eight, but it is very close and it could lose. But that's this November. Next November will be the next gubernatorial election to replace Gavin Newsom,
(40:22):
and up until this past week, the leading contender to
replace Gavin Newsom has unfortunately been a woman by the name of Katie Porter, who's a former congresswoman who ran for US Senate and lost to Adam Schiff here in California,
and it was always presumed, at least by me, that
(40:45):
she would be the next governor, even as horrible as
she is, mainly because there was no way that she
could not be in the top two vote getters out
of the primary. And it's important to point out the
way that California does this, which is totally screwed up
and is not working out as intended. Boy, there's a
rarity in twenty twenty five life, and especially in our
(41:07):
political realm. But the way this works in California is
the top two primary vote-getters go to the general election
regardless of which party they're in, So there is no
party primary, and in a statewide race, there's a very
very good chance that means there's going to be two
Democrats who will run in the general election. Now, sometimes
(41:32):
it doesn't work out that way. And Katie Porter learned
an important lesson which is going to be relevant in
just a second when I get to what transpired here
during that Senate campaign, because Adam Schiff brilliantly, brilliantly used
his own money to promote former Los Angeles dodger Steve
Garvey as the Republican candidate. Because he knew that if
(41:52):
Garvey got in the top two vote-getters, that Schiff
would not have to really run a general election campaign
because there's no chance for a Republican, even like Steve Garvey,
to beat a Democrat in a general election. It's just
not possible in California. Regardless of what somebody might tell
you that's overly optimistic. I can share with you, with
(42:14):
one thousand percent certitude, barring massive scandal, the likes of which would be hard to even comprehend, there is no chance for an even remotely credible Democrat to lose to a Republican in a general election. And Schiff understood, if I help
Garvey get into the general election, I've won. Well, that's
(42:38):
important because of what ended up transpiring with Katie Porter
doing an interview with a local CBS female news reporter
here in Los Angeles. And I have a unique take
on this, and I'm going to tell you something that
I have not seen reported anywhere else, which is as
(42:59):
lead at least as baffling to me as the lack
of reaction to the Pacific Palisades artist being arrested by
the federal government, and that is that you probably have
heard that Katie Porter essentially walked out on an interview with the CBS affiliate because she was
upset about questions that she was getting related to her
(43:24):
answer involving proposition fifty. That's how this all began. She
was asked about proposition fifty, she of course supports it,
and then began a conversation with the reporter about whether
or not Republican votes are needed in a general election,
and Porter scoffed at the reporter, the questioner, the interviewer
(43:47):
for even suggesting that there's any chance that if she
made the general election against the Republican that she could
possibly lose. By the way, again, she's one hundred percent right, Porter is, in that assessment. But here's where
things get really odd, and I don't have an explanation
for this. I have tried to find out why this
(44:10):
transpired the way that it did. But I'm gonna play
you the full clip that went viral. The important part, from my perspective, of this clip, which is about three minutes long, is that the first minute or so was
already made public almost a month ago. Now I know
(44:35):
this because when I saw it, I'm like, I can't
believe Democrats are gonna let Katie Porter get away with
what she said, because she tells the reporter that essentially
she's gonna use the Steve Garvey strategy. She doesn't say this,
but if you read between the lines, she's gonna use
the Steve Garvey strategy to prevent any other Democrat from
(44:58):
making it into the general campaign, which means that she's
going to win, which if you're a Democrat, if I'm
a Democrat, that's to me outrageous because she's going to
run a campaign that sabotages the chance to have two
Democrats in the general election. And she says it in
a very smug and arrogant fashion. And if you don't
(45:20):
know who Katie Porter is, I mean she's very heavy,
very unattractive. I mean, this is a woman who, to
me is very incredibly unlikable. Even before all that's been
made public in the last week. She was once accused
by her ex husband of having poured hot potatoes over
(45:43):
his head. To me, the most remarkable part of that
story was, one, that Katie Porter was once married, and, two, that she was once married to a male. Both of those things were shocking to me, much more so than the hot potatoes being poured on the head. But
if you've seen Katie Porter, you know what I'm talking about.
But here's the part that, from a media perspective,
(46:06):
just drives me crazy. So the first minute
or so of this video was out there, made public
by CBS in Los Angeles about a month ago.
I tried to give it attention. I even sent it to Alex Michaelson, who at the time was
the news anchor at the Fox affiliate here in Los Angeles.
(46:28):
This thing has been going on so long. He's now
changed jobs. He's now at CNN; it will be interesting to see how he works out there. He's been a
guest on this podcast before, and he and I have
had many many communications over the years as I try
to keep him in line, and I think he is
at least fair enough to appreciate where I'm coming from.
But I sent him the clip a month ago about
(46:49):
the whole issue of Porter trying to do the Steve
Garvey thing, and I don't even think he responded. No
one cared. And then a week ago, just after I got done taping the podcast. And to be clear, when I tape this podcast, especially recently since we've been going long, it is exhausting. I mean, I can barely spell my name at the end of almost three hours
(47:11):
of talking without a script. And I happened to see
that the Katie Porter interview, what I thought was the
Katie Porter interview from three or four weeks ago, was
now going viral, and I was very confused. I'm like, what, what are you people doing with this thing? I was trying
to focus attention on this weeks ago, what's going on?
(47:33):
And it took me a while to realize that there
was a new portion of the very same clip that
had not been made public three or four weeks ago,
that now suddenly had been made public, and that was
the part that was causing it to go viral. And
that was that after the initial exchange about Katie Porter
(47:55):
arrogantly saying she's going to win and she doesn't need
Republican votes, and that she's going to make sure that there's not another Democrat running against her in the general election so she can't lose. After that, she and the
reporter get into a tiff where Porter essentially ends the
interview because she doesn't like all the follow up questions,
none of which were remotely inappropriate, and all of which
(48:17):
showed Porter to be incredibly sensitive, have incredibly thin skin,
and come across as extremely unlikable. So here is the
full clip, and I'll explain you know, or at least
to the extent that I can, after you listen to this,
what transpired from a media perspective. But again, the first
(48:38):
portion of this I had seen almost a month ago.
The second portion is the part that ended up causing
this to go viral because it showed Katie Porter to
be frankly a bitch. And here's what that sounded like.
Speaker 5 (48:51):
What do you say to the forty percent of California
voters who you'll need in order to win, who voted
for Trump?
Speaker 3 (48:59):
How would I need them to win?
Speaker 1 (49:00):
Man?
Speaker 5 (49:01):
Well, unless you think you're going to get sixty percent
of the vote. You think you'll get sixty percent? That everybody who did not vote for Trump will vote for you?
Speaker 3 (49:08):
That's what you're in a general election. Yes, if it
is me versus a Republican, I think that I will
win the people who did not vote for Trump.
Speaker 5 (49:16):
What if it's you versus another Democrat?
Speaker 3 (49:18):
I don't intend that to be the case.
Speaker 5 (49:20):
So how do you not intend that to be the case? Do you, are you going to ask them not
to run?
Speaker 3 (49:25):
No, No, I'm saying I'm going to build the support.
I have the support already in terms of name recognition,
and so I'm going to do the very best I
can to make sure that we get through this primary
in a really strong position.
Speaker 1 (49:34):
But let me be clear with you.
Speaker 3 (49:35):
I represented Orange County. I represented a purple area. I
have stood on my own two feet and won Republican votes before. That's not something every candidate in this race can say. If you're from a deep blue area, if you're from LA or you're from Oakland, you don't have that experience.
Speaker 5 (49:50):
Is it that you don't need those Trump voters?
Speaker 3 (49:51):
Well, you asked me if I needed them to win,
so you don't. I feel like this is unnecessarily argumentative.
What is your question?
Speaker 5 (49:58):
The question is the same thing I asked everybody. This is being called the "empowering voters to stop Trump's power grab." Every other candidate has answered this question, this is not, and I said I support it. So the question is, what do you say to the forty percent of voters who voted for Trump?
Speaker 3 (50:15):
Oh, I'm happy to say that. It's the "do you need them to win" part that I don't understand. I'm happy to answer the question as you have it written, and I'll answer.
Speaker 5 (50:22):
And we've also asked the other candidates do you think
you need any of those forty percent of California voters
to win? And you're saying no, you don't.
Speaker 3 (50:28):
No, I'm saying I'm going to try to win every
vote I can. And what I'm saying to you.
Speaker 1 (50:32):
Is that well to those voters.
Speaker 5 (50:35):
Okay, so you...
Speaker 3 (50:36):
I don't want to keep doing this. I'm going to call it.
Thank you.
Speaker 5 (50:41):
You're not going to do the interview, then?
Speaker 3 (50:43):
Nope, not like this. I'm not, not with seven follow-ups to every single question you ask.
Speaker 5 (50:46):
Every other candidate has.
Speaker 1 (50:48):
I don't care.
Speaker 3 (50:49):
I don't care. I want to have a pleasant, positive conversation in which you ask me about every issue on this list. And if for every question you're going to make up a follow-up question, then we're never going to get there and we're just going to circle around. I've never had to do this before, ever.
Speaker 5 (51:05):
You've never had to have a conversation? Okay, but every other candidate has done this.
Speaker 3 (51:12):
What part of I'm me? I'm running for governor because
I'm a leader, so I am going to make so.
Speaker 5 (51:17):
You're not going to answer questions from reporters? Okay, why don't we go through? I will continue to ask follow-up questions because that's my job as a journalist. But I will go through and ask these, and if you don't want to answer, you don't want to answer. So, nearly every legislative...
Speaker 3 (51:32):
I don't want to have an unhappy experience with you,
and I don't want this all on camera.
Speaker 5 (51:36):
I don't want to have an unhappy experience with you either.
I would love to continue to ask these questions so
that we can show our viewers what every candidate feels
about every one of these issues that they care about,
and redistricting, it's a massive issue. We're going to
do an entire story just on the responses to that question.
And I have asked everybody the same follow up questions.
Speaker 1 (51:55):
Now, how is it that that second portion of the interview,
and to be clear, after what you just heard there,
Porter essentially leaves the interview. How that portion was withheld
by the CBS affiliate in Los Angeles for at least
three weeks is to me still a mystery, maybe even
(52:19):
a bigger mystery is, how is it that I'm the
only one that I'm aware of that's even mentioned this.
I mean, I have tried everything I can. I responded
to the reporter who did the interview on Twitter, asking
her for an explanation. I got no response. I revisited
the issue with Alex Michaelson, who then did a report.
(52:41):
He was fully aware of what I was talking about,
because I said, hey, Alex, why is no one mentioning
that CBS withheld this interview or this portion of the interview,
the viral part of this for three weeks? And he
acknowledged that that's what happened. But even he's never mentioned
it on CNN or any of the posts he's
made on social media. I mean, so I have not
(53:03):
followed up with Alex because I don't even understand. I don't get what's going on here. To me,
this is not difficult. There needs to be an explanation.
This interview was done almost a month ago now, and
so why is it that the juicy portion is just now being released? Effectively, my question is, who ordered
(53:26):
the code red on Katie Porter, or who was trying
to protect Katie Porter and that protection got revoked. I'm
not sure which it is, but from a media standpoint,
this is completely inexcusable because it certainly appears as if
the CBS affiliate in Los Angeles was trying to protect
(53:48):
Katie Porter and then they destroyed her. Here's my guess.
This is purely a guess. I don't have any inside
information on this at all. I think that they originally
put out the interview, deciding that the portion where Katie
Porter and the interviewer get into this argument about questions was somehow off the record or maybe inappropriate
(54:12):
to air, and maybe somebody at a lower level, maybe
the reporter herself, decided not to air it. And then
somehow somebody else, maybe higher up in the food chain,
found out about that portion of the interview, and they
decided, come on, we've got to release this. This
part's fantastic. I think something like that probably transpired. I'm
(54:37):
an anti conspiracy person, so unless proven otherwise, I rarely
ever go in the conspiracy direction. But that from a
media perspective I found to be really quite fascinating, very telling.
But it unfortunately remains at this moment rather mysterious, because I don't
have a great explanation despite my efforts for what the
(54:57):
hell actually transpired here. As far as how this is
going to impact the race for governor, I think she's
toast because on the heels of this, all sorts of
other terrible videos about Katie Porter have come out, and
now basically it's a feeding frenzy, and Porter is in
a particularly vulnerable position because she's not in office right now. See,
(55:20):
not being in office when you're in the middle of
a feeding frenzy is basically death, especially when you're not
real likable at all. You don't have a huge fan base,
your name recognition isn't that large to begin with. You know,
very few people are deeply invested in you. So when
you have no reason for people to fear you, and
(55:42):
it's very easy for your donors to flee, and no
one's going to rush to your protection or your defense
because you're not in office, I don't see how. It's a weird analogy, but I see her as almost as dead in the water as Bill Belichick is at the University of North Carolina. I don't see how she turns this around
(56:05):
because there's gonna be plenty of other candidates and she's so unlikable. Yes, California is incredibly liberal to an absurd degree,
but being that unlikable pretty much trumps everything, and I
think she's reached that level. I've not seen a credible
poll about the race since this transpired. In the betting markets,
(56:26):
she has plummeted from being the favorite to you know,
now barely being in contention, and I think that makes
a lot of sense. I don't know who is going
to emerge from the Democratic Party to be the next governor,
but I'm glad it's not going to be Katie Porter.
At least as of right now, it doesn't look like
it's going to be Katie Porter. She could still, I guess,
in theory, make the top two, and there could be
(56:50):
two Democrats in the general election, but I don't see
how she would win as an unemployed, you know, bitchy
woman who looks like a... but I guess she isn't.
You know, that's not going to appeal to a majority
of Californians. And you know, depending on who the other
candidate is, they're gonna have a very very good chance
(57:11):
to beat her, especially if they're both of the same party.
So my guess is I had been predicting previously. In fact,
when I posted the original portion of that interview about
a month ago, I said, you know, Katie Porter is
probably going to be the next governor. I can't believe this.
This statement she made is not going to create any
controversy or something to that effect. But I no longer
(57:33):
believe she's going to be the next governor. And that's great. Now.
Of course, she might be replaced by someone even worse,
as hard as that might be to believe, but I
don't think Katie Porter is going to be the next
governor of California based upon what we currently know. All right,
As I have promised for many weeks, I have been
studying this issue of artificial intelligence, probably far more than
(57:56):
I should have been. I know my wife is getting
a bit annoyed by all of the different things that
I have brought up regarding AI and the future. And
I am gonna you know, I've been promising I would
do a major segment on the podcast, and I'm going
to finally fulfill my promise here. And I'm doing this
(58:18):
for a couple of different reasons, one of which is
I expect that as this podcast continues, to whatever extent
it does, into the future, this is going to be
a major issue, and so I want to lay the
foundation for all future AI topics on the Death of Journalism podcast, because this thing is so large. It is
(58:42):
so huge, so expansive, so complex, so unpredictable, that I
feel like you can't deal with this issue in short segments.
So I need to lay this foundation for what this
topic is, how I see it, and where I think
we might be headed in the future, so that in
(59:04):
future episodes of the Death of Journalism podcast, I can
just do a short bit on AI and not have
it completely take over an entire episode of the podcast.
So I hope that makes sense. That's at least my
philosophy here. Plus I've been promising to do this. Plus
I frankly don't think there's been enough deep dive analysis
(59:28):
of this topic considering how important it may be to
the future of the world. I mean, really, part of me almost feels stupid that this isn't the only thing we're talking about. That's how big it could, underline could, be. And as I get into this topic,
(59:49):
I've thought a lot about how in the world you tackle a topic as large as artificial intelligence. Well,
let me start by taking a big picture view here,
and I realize that I am perceived as a pessimist
at heart. I get it. I fight that. I dispute
it to some degree. I actually think sometimes I'm a
(01:00:11):
delusional optimist who ends up, you know... The only time I'm usually wrong is when I'm too optimistic, and when I'm pessimistic, I tend to be right more often than not.
But I get that people perceive me as a pessimist.
I'm certainly a cynic, there's no question about that. But
when I look at the topic of things that are
(01:00:35):
going to doom us, right, I am very very much
aware that you need to be very very skeptical of
any effort to claim that this new thing is going
to doom us to destruction. One, because there's an inherent
(01:00:55):
market for that. There always has been since the beginning
of humanity. Oh my god, we're all gonna die. Whatever
the new thing is. That is a very popular reaction.
It's human in nature, it's evolutionary in nature. Part of it,
I think is if we react like that, we might
(01:01:16):
be able to prevent it from actually dooming us. But
there's no question that it's sometimes even a lucrative position
to take. And we've seen this in talk radio for many,
many years, and I obviously come from a talk radio background.
We saw it with Y two K, which is probably
the most dramatic example of this, where for years, you know,
(01:01:39):
people at talk radio made lots and lots of money
scaring the hell out of you that Y two K
was going to cause an apocalyptic reaction when we you know,
change the clocks and the calendars to the year two
thousand and everything was going to melt down. I never
believed that. I thought it was a scam. It turned
(01:02:00):
out to be a bigger scam than I even perceived.
And frankly, I don't know how anyone ever took talk
radio seriously after Y two K ended up being a
big nothing burger. But I'm saying this to give some
perspective and context for how I view these types of issues.
I was a Y two K skeptic and turned out
(01:02:21):
to be right. After Y two K, you know, the
next big thing that was going to destroy us all
was global warming climate change. I was a global warming
climate change skeptic, and I think I'm going to be
vindicated on that. As far as the thing that was
going to doom us or kill us all, the COVID panic.
I was a skeptic on that. You know, we were
(01:02:42):
told that that was going to kill us all, and
that was going to completely change life. The Great Reset. Now,
I was fearful that the Great Reset might actually happen,
even if the COVID panic was not based in anywhere
near the reality that we were told that it was,
and I was somewhat pleasantly surprised. Although I still think
the reset has had more effect than most people think.
(01:03:04):
I think that Trump getting elected in twenty twenty four
has kind of covered over a lot of the elements
of the Great Reset from COVID that actually are still
very much in play and where the left really has won, even though the perception is that they lost because
Trump got reelected. That's another story for another day. But
(01:03:26):
by and large, I think we dodged the bullet on
the COVID reset. The COVID panic was not what was
going to kill us all. So just in the last
twenty five years, those are three major topics that a
lot of people thought, Okay, this is it, this is
the big one, you know, Y two K, global warming,
(01:03:47):
the COVID panic. I was a skeptic on all of
them to various degrees, and I believe I've been vindicated
on all of them. And I have an anti panic bias.
I mean, I think you've seen that in virtually every
story I've ever talked about, the COVID panic being a
great example, the Penn State Joe Paterno Jerry Sandusky story
(01:04:08):
being another great example. Those were both moral panics. People
lost their fucking minds and it caused a perfect storm
of really bad circumstances that created really horrendous results. Well,
I have a very strong bias against panic. I inherently
believe that panic is bad. And so therefore the context
(01:04:30):
here when it comes to artificial intelligence is that I
am not somebody who is prone, I don't think, to coming to a panic reaction that oh my god, this
is the big one. We're all gonna die, this is
gonna doom us. That being said, I'm really struggling with
(01:04:52):
this one. I am really really struggling with how artificial
intelligence is going to impact American life in the not
too distant future. Now, let me start with my own
experiences with artificial intelligence, which frankly, have had a great
disparity between some of them being amazing and mind blowing
(01:05:19):
and others not being very impressive at all. I mean,
for instance, I spend a lot of time on Twitter
or X. Grok is the AI agent on X. Sometimes Grok is very, very good. Other times Grok is totally wrong and has put out headlines that are totally false.
(01:05:42):
They get sports results wrong all the time. I say they, I should say it, I guess, but Grok gets sports results wrong all the time. It had one major headline on X that Charlie Kirk had survived an assassination attempt. I mean, it is wrong constantly.
(01:06:04):
On the other hand, some of the things when you
ask Grok a question or you ask ChatGPT a question,
the responses you get are quite amazing and mind blowing.
Some of the things in the realm of video, like
with Sora 2 just coming out, which is one of
the reasons why I decided to do this topic now,
because a lot of people are having a very dramatic
(01:06:25):
reaction to the incredibly realistic nature of Sora 2, put out by OpenAI, where you can basically create AI videos instantaneously that are almost impossible to discern between what is real and what is fake, and some of
that is really remarkable, scary, mind blowing. But I want
(01:06:48):
to make it clear that I am not, as of
this moment, totally and completely sold that this is going
to reach the potential that the pro-AI people say. I think it could. I'm just saying it's not there yet,
because there are a lot of situations that I have
(01:07:10):
experienced with AI where I am not all that impressed.
That doesn't mean that it's not going to end up
like some of the most optimistic predictions, that it's not
going to reach over the threshold where AI suddenly becomes
like magic. When you look at Sora 2, you go,
oh my gosh, that's magical. Some of the other things
(01:07:32):
that I've already mentioned, I'm not very impressed by. Like, for instance, a great example of this is X has a feature called Imagine, where you can put
photographs into the AI and it can bring them to life.
And sometimes when I do that, I'll take my daughter
(01:07:55):
Grace's artwork and I'll put that in there because she
likes to see what Imagine would do with it. Because
sometimes it's amazing, sometimes it's spectacular, and other times it's
just really terrible and awful. Now, that could just be
the kinks. I mean, we're still in the infancy here,
so I am not one hundred percent sold that this
(01:08:19):
thing is going to be magic, but it could be magic.
And if it is magic, then really, frankly, this is
all we ought to be talking about. And I was very,
very hesitant and frankly slow to come to the conclusion
that this topic was so large that this is really
all we ought to be talking about until a couple
(01:08:42):
of months ago. And it's funny to me how people
that are in the same sphere, maybe are like-minded even though they don't agree on everything, but whose brains tend to think alike, can come to similar conclusions at
exactly the same time without any direct communication about this.
(01:09:04):
And what I'm referring to is there was a kind
of a Shazam moment for me a couple of months ago.
And I'm not name dropping here because there's a reason
why I'm going to mention this person's name, but this
would have been a couple of months ago, and I
was speaking to former NBC Today show host Matt Lauer
(01:09:24):
about something else. I don't remember what it was. I
think it might have been my black Swan situation that
I've referenced several times before on this podcast. And I
don't even remember how we got into it, but I
said to Matt, you know, just recently, I have started
to really take a deep dive into artificial intelligence, and
(01:09:46):
I'm starting to get freaked out. And Matt said to me,
almost word for word, I'm in the exact same boat, buddy,
And we had come to almost the exact same conclusion
at almost exactly the same time, that this thing had
evolved to the point where it was no longer theoretical,
(01:10:07):
that it was now in the realm of, if not reality,
something that could very easily become a reality in the
not too distant future. And we both had our own
analogies or metaphors for describing what we think is currently
happening here. And my analogy is that we have essentially
(01:10:34):
learned that in the middle of the Pacific Ocean there
was a nine point eight, you know, unprecedented earthquake somewhere,
you know, at the bottom of the Pacific Ocean. And
of course whenever there's a massive earthquake in the Pacific,
we are on tsunami watch, especially here on the West Coast,
(01:10:57):
and most of the time the tsunamis do not end
up coming to fruition, and if they do, they're, you know, they're minuscule. You know, they're a couple of inches or a couple of feet, whatever, and there's no damage. But as we saw, you know, in Indonesia many years ago, I guess about twenty years ago. Uh, sometimes
(01:11:18):
it turns into a total disaster. Under my analogy, when it comes to artificial intelligence, we now know that this earthquake has happened.
It's unprecedentedly large. We're now just waiting to find out
are we going to get a trickle at the shore
or are we just going to get completely overwhelmed and
(01:11:42):
is everything going to get destroyed by the resulting tsunami that occurs because of this unprecedented AI earthquake.
And Matt said, uh, yeah, I get where you're coming from,
but I look at it a little bit differently. Matt said,
I think we just have a new nuclear weapon, that this is the nuclear weapon all over again, only in
(01:12:06):
this particular situation, there are basically no restrictions over who
can own the nuclear bomb. And I thought, oh man, that might actually be even better than my tsunami metaphor. I still go back and forth over which is the better analogy. But I can understand where Matt
was coming from when he referenced the nuclear bomb and
(01:12:30):
the danger here. And he didn't mention this, but I
think it was implied. There's a couple of different dangers here.
Not only is it far less restrictive and seemingly far
easier for people to get their hands on this nuclear bomb,
but also there's much more of a financial incentive for
people to get the nuclear bomb. Now, of course, you
(01:12:52):
could always argue there's financial incentives to get a nuclear bomb,
but by and large, we've done a pretty remarkably good job,
at least so far. And we saw what happened with
the Iranian nuclear facilities earlier this year. We've done a
pretty decent job so far of keeping the ability to
create nuclear weapons in a significant fashion out of the
(01:13:14):
hands of the worst actors. Not perfect, you know, but thankfully, apparently it's a lot more difficult than what we know so far about AI, because North Korea has not been able to do it successfully. But there's not an inherent financial incentive driving, you know, the ability
(01:13:37):
to create a nuclear weapon or to get a nuclear weapon.
And also there's the whole idea of mutually assured destruction,
which has really been at the heart of why things
have not gotten so far out of control when it
comes to nuclear weapons post Hiroshima and Nagasaki. That moment really kind of forever said, okay, this
(01:13:59):
is something that you better not fuck around with, and
if you do, the price is going to be extremely steep.
And so I said, okay, Matt, all right, So if
you think that this is a situation where we have
a nuclear bomb that's being essentially obtained by all sorts
of different entities, you know, what are we going to
(01:14:22):
do about this? And you know, one of the reasons
why Matt is a relevant person here is he is
still very, very well connected within the Hamptons elite. I often refer to it as the Hamptons media mafia. Also,
you know, the billionaire class. He knows a lot of
very elite people, and he also happens to have a
(01:14:44):
very prominent ranch in New Zealand. In fact, if you
are a tourist in New Zealand and you're taking a boat ride around the coast, there's a very good chance that you're going to have his ranch pointed out as kind of a tourist attraction.
And he indicated to me that a lot of people
are asking him about buying land in New Zealand because
(01:15:06):
of their fears about what's going to happen in the
not too distant future with regard to the world changing
thanks to AI. Now, I don't want to overstate it, and, you know, I don't speak for Matt. I'm just giving you this story because I think
it's relevant to how people are responding to this and
(01:15:32):
just how big a deal it is in the minds
of many people who have studied this and who are
in circles of people who have studied this. And to me,
this is part of why I was like, okay, my instincts here, that this is not a Y2K, this is not a global warming, this is not a
(01:15:54):
COVID panic, this is not a test, that this underlying thing could really be the real deal, have just been verified,
not because of one person, but because I think you'll understand,
you know, my reaction with someone like Matt telling me
that that's the reaction of people in elite circles. That's
concerning. Not that elite people are always right. Oftentimes they're wrong,
(01:16:18):
and groupthink can be very, very powerful and lead people in the wrong direction. And there's definitely a phenomenon among the elites: once it becomes a popular thought within their group, they all think that they must be right because they're all elites, the elites are inherently right, and
this has gotten them in trouble in a lot of
different ways, many of which we've talked about on this
(01:16:40):
podcast before, you know, Donald Trump being a great example
of this. So I'm not saying that they can't be wrong.
I'm just saying that this had an influence over my
thinking that I was going down a path that wasn't crazy.
This is not just doomsayers. This is not just the Y2K people revisited. This is a situation where, holy shit,
(01:17:02):
some very serious people are having the very same concerns
that I am. Now the way that I look at this,
I'm thinking about this in the short run and then
in the longer term. And when I say longer term,
I'm probably thinking five, fifteen, twenty years. But in the
short run, where we are today, there are several elements
(01:17:26):
of the AI issue that are extremely problematic to me.
And this is not in the realm of the apocalyptic
dystopian culture and universe and country that we may be
headed towards. That's more of a longer term issue. But
a more immediate concern I have is the incredibly obvious
(01:17:49):
economic bubble that has been created by AI stocks. Now
this is going to sound contradictory, right? Zig, what do you mean an AI stock bubble? I thought you said
that you think this might be the biggest thing that's
ever happened. If it really is the biggest thing that's
ever happened in human history, then how could it be
(01:18:12):
an economic bubble? I actually think both things could be
true here, and I actually think we already have an
example from which to learn about this. If you look at the history of the Internet bubble, this, I think, could end up going very much along the same lines,
(01:18:34):
only in my opinion, in a more dramatic fashion. If
you recall, you know, we had in the nineties this massive Internet boom, a bubble that burst. But the Internet didn't
go away. The Internet did, in fact end up taking
over our lives to a certain degree, and there were
(01:18:56):
several companies that survived that turned out to be humongous,
you know, Google probably being foremost among them. So while many, many, many companies did not make it and the stock bubble burst, it didn't throw us into complete catastrophe.
(01:19:20):
It was frankly a fairly minor blip in retrospect. But
I think we have a model here, a model where the Internet turned out to be pretty much exactly as advertised, but there was an economic bubble
attached to it, and I think what we're seeing with
(01:19:41):
AI could be that on steroids, because anybody with any
understanding of economics has to be extremely concerned with the
fact that we basically have a stock market circle jerk going on among, like, six or seven companies that
(01:20:06):
are all having their stock prices increase based upon deals
with each other. And not to get into all the complexities,
but basically OpenAI is paying for all of these
deals by getting stock in exchange for a chip deal,
(01:20:29):
for instance, knowing that as soon as that is made public,
the issue of AI is so hot that the stock
price for everyone is going to go up. So the
stock price increase is going to pay for the deal itself.
So it's all fake money. I mean, it's as fake as you could possibly imagine. It's literally
(01:20:51):
the definition of a bubble. Now, if AI turns out
to be as powerful and as influential as they are
expecting or hoping for, and as economically viable as they're
hoping for, then you know what, maybe that bubble never bursts.
(01:21:15):
But actually, in the short run, I don't even know what to root for here. This is
so typical of so many elements of artificial intelligence that
there's a double edged sword everywhere. I don't know whether
or not it's better for it to be as advertised
and there not to be a bursting of the bubble,
(01:21:36):
or whether or not it turns out to not be
as advertised and our economy goes to complete shit, because
I don't think people fully understand, at least with regard to the stock market and with regard to our economic growth, how much of it right now is
directly attached to this AI bubble. I mean, you know,
(01:21:59):
OpenAI, Nvidia, Oracle, all of them. It's just a big fucking circle jerk, not based on anything, I mean anything real, as far as, you know, generating revenue. But
here's the part of this business model that is so insidious,
and I don't think gets talked about enough. It's not
(01:22:22):
just that it's a circle jerk, and it's all based
upon the theory that this is going to be so
big that it drives an increase in the stock price
that pays for the deal itself. It's what the business
model of AI generally is. Throughout the history of enormous
(01:22:44):
advances in technology, generally, with some exceptions, any brand new invention from an economic perspective has been about increasing revenue.
I mean, think about all the major inventions and developments
(01:23:05):
technologically of the last, you know, one hundred or so years, almost all of them: we have a new product that
people will buy, like a car, or people will fly
on a plane, or people will buy a phone, or
buy a radio, or buy a television. I mean, you
(01:23:26):
use your own imagination. There are hundreds of things that fit into this category that create revenue. By the way, it
also generally creates jobs. Now, with automation, jobs have always been lost, but sometimes efficiencies are increased and, you know, sometimes other jobs are created. This is
(01:23:47):
not a universal truth I'm talking about, but this has
generally been the case that massive technological advances are designed
to increase revenue and inherently naturally also generate jobs. Not always,
(01:24:08):
but generally. AI is fundamentally different. The AI business model, as I understand it, having watched way more videos on TikTok and X and read way more about this than I ever thought I would over the last couple of months,
(01:24:29):
the business model for AI appears to be almost entirely
based on the idea that it will kill jobs. That's
not a bug, that's a feature. That this is the
opportunity for major corporations to cut massive portions of their
(01:24:54):
workforce and to gain enormous leverage over the rest
of their workforce, so that salaries generally will have downward pressure.
In other words, by and large, with some exceptions, this isn't about generating revenue; this is about cutting costs. The revenue
(01:25:18):
generation part of this equation has essentially, you know, reached its maximum. So now, in order to, you know, keep the shareholders happy and to keep growth of profits going into the future, we need to cut costs, and employment costs are a huge
(01:25:41):
part of that equation. AI is inherently intended to fix
that problem. So, I mean, think about this, folks, from, again, a short-term economic perspective: here you have an inherent, obvious economic stock bubble being created by a new invention
(01:26:03):
whose entire purpose is to create massive unemployment. And it is humongous. Some people are claiming that well over fifty percent of all the economic growth that has occurred over the past year is either directly
(01:26:26):
or indirectly related to AI. So we have an almost completely stagnant economy, except for all of the money being put into AI, all these data centers being created, the fact that jobs haven't been completely eliminated as of yet, all this innovation, all the money that's
(01:26:47):
being spent on this. And the part of this that makes me most nervous is that, unlike Internet version one point zero, if this is going to be Internet version two point zero on steroids, the Internet companies of the nineties, when the bubble burst, they were not too
(01:27:11):
big to fail. The entire economy did not ride on them not failing, by and large. With AI, given the size of these companies, specifically Nvidia and the ones I've already mentioned, Oracle, OpenAI, and others, they are now
(01:27:31):
too big to fail. If they fail, we are headed
for an economic catastrophe by any standard in the short run. Now,
I'm not saying this is gonna happen. I don't know
that it's gonna happen. Some people are predicting that.
Other people think, you know, this giant wave of
(01:27:54):
AI is like the tsunami, I guess, only in their world it's a positive tsunami that is going to carry all these stock deals to actually come to fruition by making these companies more profitable, not by generating revenue but by cutting their costs, because they can now have AI
(01:28:15):
fulfill the jobs that used to be held by human beings.
And we're not talking about just fringe jobs here. We're
talking about enormous numbers of jobs. And maybe the currently most underrated part of the short-term AI equation, which
(01:28:39):
I find to be both hilarious and telling, is the
amount of energy and water consumption that AI is causing. I mean, it's hilarious that, you know, most of these companies are generally run by liberals. And, you know,
(01:29:01):
of course, to liberals, somehow this is fine. If a conservative movement were creating this amount of energy consumption and water consumption, it would be an environmental catastrophe. But we've heard nothing
about this. This reminds me a lot of how to
use a second football analogy in here that's going to
(01:29:23):
be bizarre. It reminds me a lot of the Pac-12 disbanding and having all of these liberal West Coast schools join the Big Ten and dramatically increase their carbon footprint,
completely exposing themselves as a bunch of hypocrites, and no
one gives a shit because it's not a real issue.
They don't really believe in the carbon footprint. Well, this
(01:29:45):
to me is way, way worse, because energy costs are going through the roof right now. Hardly anybody is talking about this, and a huge portion of why is the increased energy being used by AI,
not to mention massive amounts of water consumption that the
(01:30:06):
environmentalists are strangely silent about. And so this is to
me both interesting and telling. But it's also another example
of the double-edged sword of AI, because every time I think, okay, this is a really bad situation involving AI, this is a bad outcome, there's a
(01:30:29):
flip side of the coin or the sword, if you will,
Because, since AI requires so much electricity and so much water consumption, and the costs are on the rise, and our electrical grid is so shitty to begin with and so outdated, is it possible that
(01:30:51):
it will actually force us, because AI is too big to fail, to upgrade our entire electrical system and specifically, finally, push for what we should have been doing for decades, which is to properly use nuclear power? I think that's possible,
And I think this is a really good example of
(01:31:11):
why the AI issue is so difficult to wrap your
brain around and to predict, because at almost every turn there are two possible outcomes: the outcome where our electrical
grid completely collapses and we're paying insane amounts of money
(01:31:32):
for electricity that's no longer reliable and we have water
shortages and all sorts of other things if we're in
a drought, what have you? Or does this cause us
to finally get serious and fix what is a massive
problem with a solution that's just been sitting there for decades,
and we've been too afraid to use it for reasons
that are idiotic, which is to use nuclear power. I
(01:31:56):
don't have an answer for that. My guess is, of
all the elements of AI that I'm pessimistic about, that
might be the one I'm most optimistic about. I do think that there is a chance. And, you know, maybe I'm repeating
my mistake when it came to the Pacific Palisades and
Altadena fires, where I thought the incentive structure was there
(01:32:17):
for Democrats to fix the problem. It's a similar dynamic
here where we've got a massive electricity problem. By the way,
it should be pointed out, we are currently all paying
an AI tax on our electricity rates that nobody is mentioning.
I mean, we're all paying for this indirectly, not obviously
in full, but at least in part, because this is
(01:32:39):
driving a significant portion of the increase in electricity costs, and energy costs and water costs. And of
course you know we're not getting any of the benefit
of this. In fact, you could argue that we're paying
a tax which will eventually help get rid of a
lot of our jobs, which is just totally insane, but
(01:33:01):
I do think if I had to bet on this,
that we're going to be forced to actually finally do
the right thing when it comes to our grid and
the use of nuclear power. So that could actually be
a good thing. And I want to make it clear
that AI, as it appears to be coming to fruition,
is going to do a lot of really good things.
(01:33:23):
There's no question about that. If it comes to full fruition, if it is
as advertised, it is going to have a lot of
positive influences on society. Efficiencies are going to increase dramatically. Specifically,
I think in the realm of medicine, it could be
(01:33:47):
a huge game changer for the positive I can see
even in the short run when it comes to minor
medical situations and minor legal situations where you used to
either have to just grin and bear it or you
had to go through the hassle of having a doctor's
appointment or meeting with a lawyer. Instead of doing that,
(01:34:10):
you can just ask ChatGPT or Grok about it and you can usually get a pretty darn good answer
that doesn't require you going to a doctor or going
to a lawyer. I've done both, and so far, even today in its infancy, I think the results are actually pretty good. So in that way, life is
(01:34:33):
going to get even easier than it currently is, maybe better,
certainly more efficient. You know, there are other elements of
life where similar things are going to happen. Technical questions,
you know, ChatGPT and Grok and other AI tools are really good at giving answers to. So I don't
(01:34:53):
want to make this sound like it's all bad. This
is very much a double edged sword all of this.
It's just a matter of which side of the
sword is going to end up winning. And I don't
have a full understanding of that yet, but I'm very
concerned that it's not going to be the positive one
in the long run. But we're still talking here about
the short term element of this, and so I want
(01:35:16):
to emphasize that a lot of this, if it comes to fruition as is currently expected, as is the conventional wisdom, is going to be very, very positive, with a lot of really tremendous tools to make life easier and more efficient. And the mind boggles as to the positive things that could theoretically come should AI be exactly as advertised.
(01:35:42):
I also want to emphasize, and I've already implied this once, but I'll do it again just to make it clear: I'm still not one hundred percent
convinced AI is going to get over the magic threshold.
I think it's possible, even though we're seeing a remarkable pace at which AI is improving, in all elements
(01:36:03):
faster than I think even the experts expected. So you know,
when the experts are being surprised by how fast things
are improving, the natural inclination is, oh my gosh, this
is going to come to fruition, just like everyone's predicting.
If you're thinking about this as an athlete, right, using all sorts of sports analogies here that I didn't
(01:36:23):
even intend to, but like, if you have an athlete who clearly at the age of, you know, twelve, you say, that kid's an athlete. He's going to be a great basketball player or a baseball player, football player, and, you know, they're going to grow into being this fantastic athlete, a Division I player, maybe even pro. And they get even better than you expected at thirteen,
(01:36:47):
and even better than you expected at fourteen, and at fifteen they're even better than you thought they'd be.
The natural inclination is to think, oh my god, by the time they're a full adult at twenty-two, they are going to be a world beater. And occasionally that is the case; often, though, it's not. Sometimes they hit a wall,
(01:37:10):
whether it's in growth or in their ultimate potential. And
to use that analogy to AI, there's no question that
this is happening faster than most of the experts thought
was possible, and that it is remarkable in so many ways,
mind blowing in so many ways. It could still hit
a wall. There are still people who are credible who
(01:37:31):
think that in certain areas, you know what, AI just will never be able to do the things that
a lot of the experts are predicting, and there's some
evidence to support that, like, for instance, there's reports that
some movie studios have been trying to dabble in making full-on AI movies kind of as an experiment,
(01:37:54):
and that they've been having a lot of problems doing
it for lots of different reasons. One of the reasons
is that it seems to be very difficult to get
AI to duplicate itself or to repeat the exact same process.
This is just one of a million issues that people
are dealing with. So I am not yet one hundred
(01:38:15):
percent convinced that it's going to reach that magic threshold.
I'll use that word magic, you know, probably a lot
in this discussion for the ease of understanding. I don't
think we're gonna know for sure that it's reached magic
quite yet, but my gosh, using that athlete analogy again, it certainly looks at these very early stages
(01:38:39):
as if it could easily get there and maybe even exceed that,
and a lot of experts believe that that's where we're heading.
So that's where we are currently. I'm very concerned about
the economic bubble in the short run, I'm very concerned
you know, the too big to fail element. I'm very
(01:39:01):
concerned about the timing of this from a political perspective,
and the idea that we could have a president Gavin
Newsom in twenty twenty nine, just when this whole thing
is coming to full fruition and we saw what he
did during the COVID panic. I mean, those are
the types of things in the very short run that
have me very concerned. But there's also a social, cultural
(01:39:25):
and interpersonal relationship aspect that I think we're already seeing
the impact of. And this comes in the form of
the idea that human beings are already starting to have
very strange and dysfunctional relationships with AI. Thanks for
(01:39:47):
listening to today's free drop of the abbreviated show. If
you're interested in listening to the entire show, you must
become a patron. Please go to Patreon. That's p a t r e o n dot com, patreon dot com slash The Death of Journalism with John Ziegler. My name is J O H
(01:40:09):
N, Z I E G L E R. That's patreon dot com slash The Death of Journalism with John Ziegler.
Good luck to you on that. But that's how you
can subscribe.