Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, guys, Saagar and Krystal here.
Speaker 2 (00:01):
Independent media just played a truly massive role in this election,
and we are so excited about what that means for
the future of the show.
Speaker 1 (00:08):
This is the only place where you can find honest
perspectives from the left and the right that simply do
not exist anywhere else.
Speaker 2 (00:14):
So if that is something that's important to you, please
go to Breakingpoints dot com. Become a member today and
you'll get access to our full shows, unedited, ad free, and
all put together for you every morning in your inbox.
Speaker 1 (00:25):
We need your help to build the future of independent
news media and we hope to see you at Breakingpoints
dot com.
Speaker 2 (00:33):
Let's go ahead and talk about Jon Stewart and Stephen
A. here, because this is a little more fun than
nuclear war. Death in nuclear war and creeping fascism is.
Speaker 3 (00:44):
Fleeing.
Speaker 2 (00:46):
It's on par, I'd say, it's likely roughly equivalent. So
in any case, Stephen A. goes on with Jon Stewart.
I watched the whole interview. You know, they were bro-ing out,
they were like bonding over the Knicks' tragic result, which,
you know, I have some of that going on in
my household as well. In any case, the part of
the conversation people were really interested in, though, is Stephen
A. is this like very centrist-y figure. He also just,
(01:07):
like, he's not a, I know he's, like, floated this
idea of running for president, but his political takes are
pretty surface level. He's not like deep in the weeds
with this kind of stuff, you know. And Jon Stewart
obviously is like deeply in the weeds with this kind
of stuff. So they get into a bit of a
battle over what is the reason why the Democratic Party,
with Kamala Harris, lost to Trump in this last election cycle.
(01:30):
Let's go ahead and listen to how that went down.
Speaker 4 (01:32):
The left definitely blew it. I thought that, I thought
that the progressive left, the extreme left, really, really ruined
the election for the Democrats. And that's, that's how I
view this, is the truth. And I think that, I
think that because that happened, it positioned him to get
away with a lot of things. Then you're putting forth
some of these policies, whether it's transgender issues or other
stuff that they would bring up. You had a lot
of people say, that's not where our focus should lie.
Speaker 5 (01:53):
Whose? Don't you think that's their focus?
Speaker 3 (01:57):
The right?
Speaker 5 (01:58):
I feel like the right is far more focused on
where you go to the bathroom. They allowed what people
would consider to be a centrist old guard Democrat to
run again for president, and then you say the progressive
left wing is the one that blew it? They didn't choose
Joe Biden. They don't have the power in that party.
Chuck Schumer is out there, you know, writing eight strongly
(02:20):
worded questions to Donald Trump. The progressives are
the ones going, people in this country want Medicare for All,
do something. They're the ones that have a principled agenda
that they are going to put forth, that all the
Democrats have run from under the guise of, we
have to be moderate and we have to be centrist.
(02:42):
But that doesn't appeal to changing the culture and dynamic
of how this government should work: by the people, for
the people, of the people.
Speaker 4 (02:51):
That's a great argument, and I don't disagree.
Speaker 6 (02:53):
What I'm saying to you is that we're done here.
Speaker 2 (02:55):
So Jon Stewart then goes on, Emily, to make a case
that I think you'll maybe be more sympathetic to than
that one.
Speaker 7 (03:03):
I mean that one.
Speaker 2 (03:04):
I totally agree with. The right is, like, utterly obsessed with gender
identity and genitals at this point. But, well, we can
pause that debate at
Speaker 7 (03:12):
the moment.
Speaker 3 (03:14):
On that, people can look it up. I fought with
Speaker 7 (03:16):
Piers Morgan on this, you can, oh, much better than me.
Speaker 2 (03:20):
But in any case, he goes on and makes this
case of, like, don't you think the real problem is
that Democrats didn't actually, like, deliver or promise to deliver
anything real? And so when you've got Bernie Sanders and
AOC and they're doing this, like, Fighting Oligarchy tour thing,
they're saying, like, we're going to help you with your wages,
we're going to get you healthcare. Like, isn't that
really the problem, is that Democrats had no credibility on
(03:43):
actually addressing people's, he didn't put it this way, but
addressing people's material concerns? And Stephen A. couldn't really,
couldn't really respond to that. He goes, well, that's a
good point, and then went on to say, you know,
some other stuff. But, you know, I thought Jon did
an effective job of positioning this, because to just frame
it in terms of left-right, I think, is deeply confusing.
(04:06):
And this gets to this other piece of news that
just came out, like, Josh Hawley is now introducing a fifteen-dollar
minimum wage bill in the Senate. I don't know
if it'll make it to the floor. I'm sure there's
like literally no other Senate Republicans who would vote for it.
But really getting to the left of some of the
Democrats on an issue that should be a core Democratic issue,
(04:29):
and no one would look at that and be like, oh,
this is going to cause, like, Josh Hawley to lose,
or this is going to be bad for Republicans. No,
there is a broad recognition that any of these sort
of more populist pieces, whether it's Josh Hawley or
Trump who has pushed them, with the no tax on tips or whatever,
that these are good ideas that are also really popular.
And so Jon Stewart's point is basically like, Democrats should
(04:53):
do more of that, and that would be, you know,
an effective way of winning people over, if they
actually feel like you have credibility delivering on something real
to them.
Speaker 3 (05:05):
Yeah. And I mean, I think this is one of
the biggest problems but also opportunities for the left. Opportunities
in the sense that Bernie Sanders and Alexandria Ocasio-Cortez,
on the Fighting Oligarchy tour, have leaned into, I think,
the opportunities presented by that challenge. So it's actually a
challenge that I think forces Democrats to confront those questions.
On the other hand, I think part of the problem
(05:27):
here is that politically, the left wing of the Democratic
Party has the most attractive, appealing, palatable economic agenda. You know,
a lot of my conservative friends will, like, blanch at that
and be like, what the hell are you talking about?
But it's actually much easier to sell voters on Medicare for All
(05:47):
than it is to sell voters on the status quo,
or some tweak to the status quo. Yeah.
Speaker 2 (05:53):
Like, we're going to negotiate drug prices for Medicare on,
like, five drugs. It's not a, don't you feel like
your life is transformed by that? I mean, it's like, okay,
well, that's better, I guess, right?
Speaker 3 (06:06):
But the problem is, when the left wing's economic solutions
are the most palatable and appealing, then you also have
the people who have signed the ACLU pledges, like Kamala
Harris did, that allowed the Republicans to run that ad
on supporting taxpayer-funded, I think it was, transition surgeries
(06:26):
for incarcerated migrants, the most specific thing possible. But I
think that's where you see Bernie Sanders and Alexandria Ocasio-Cortez,
and Hawley is trying to do this from the right,
by the way, because it's also a challenge slash opportunity
for the right, where you have the most MAGA people
having economics that are easier to sell, more palatable to
(06:47):
sell to general election populations, statewide populations, and all of that,
but they're also the ones that are now super, super
tied down to the MAGA brand. If you look at Hawley,
for example, a lot of people remember Hawley because of the
image of him with the raised fist outside of the
Capitol on the morning of January sixth. So it's not
as though any side has, like, the upper hand politically
(07:11):
on the question. It's actually something that they're both
confronting in their own ways, which is quite interesting.
Speaker 2 (07:17):
I think there's a lot to that, because what always
irritates me is, first of all, number one, it is
Republicans who bring up transgender issues more regularly than
Democrats, because they feel like it's a good issue for them.
Speaker 7 (07:30):
There's a.
Speaker 2 (07:33):
Go ahead, try me. Um, so there's an obsession.
Speaker 7 (07:38):
There is an obsession there. I mean, there's no doubt
about it.
Speaker 2 (07:40):
But what drives me crazy is, I'm like, okay, so
let's accept that, like, Democratic positioning on the specific issue,
not even transgender issues in general, the specific issue of
transgender girls in sports, let's accept that Democratic positioning on
that is not in sync with public opinion. Look at
how many wildly unpopular things Trump says, does, and embraces:
(08:05):
pardoning the violent January Sixers, some of whom beat
up cops, polls at about, like, ten percent, okay, wildly unpopular.
A bunch of the things that were done at DOGE,
wildly unpopular. Insane, across-the-board, haphazard tariffs, wildly unpopular.
(08:26):
So then you ask the question, okay, so if
both sides have, and I would say that Trump has
specifically more issues that poll way worse than the overall
broad, like, Democratic program, or what Kamala Harris specifically ran
on, where, you know, she intentionally in her campaign didn't
talk about a lot of the things that were less popular.
Speaker 7 (08:45):
Even if Trump has more, let's just call it even.
Speaker 2 (08:48):
And they both have unpopular issues that, you know, they
support and that they have said things about in
the past. Why does it matter for one and not
the other? And to me, the reason is, number one,
Trump is just a talented figure. Number two, he has
a story that I think is wrong and bad and
evil and all of those things, but makes some sense.
(09:11):
The reason your life is bad is because of immigrants,
trans people, cultural elites. So if people sort of broadly
get on board with that narrative, they're willing to forgive
and forget some of the bullshit that is, like, wildly indefensible.
Democrats, first of all, don't seem like they stand for
literally anything at all, which is why this whole, like,
poll testing, let me put my finger in the wind,
let me try to reposition on trans issues, let me
(09:33):
like, focus-test some response on girls in sports that's
gonna not piss off this person but is gonna appeal
to a moderate, etc. People can smell that a mile away.
So if you don't have a coherent story to tell
and there aren't things that you are demonstrably willing to
fight for, then, yeah, they're going to have a lot
(09:53):
more room to tar you with your least popular positions
and frame them in the way that they want to
frame them. So to me, that's the big difference between
why the right's, why Trump's specifically unpopular issues don't seem
to matter as much as Democrats': because Democrats frequently
(10:15):
don't have a spine, don't stand up for themselves, don't
have principles they're willing to stand on, aren't willing to
fight for anything, and don't have a coherent story about
what has gone wrong, or honesty, really, about the struggles
that people are facing. Bernie Sanders is the most popular
politician in the country right now. Most popular politician in
the country. Does he have a different position on trans issues?
Speaker 3 (10:34):
Now?
Speaker 7 (10:34):
No, he doesn't.
Speaker 2 (10:35):
The reason is because he has credibility on fighting for
things that people care about, and he has a story
that makes sense to people about who is screwing them over,
why things have gone sideways in this country, and how,
critically, it could be put back on track.
Speaker 3 (10:51):
Right, people believe him. Speaking of the opportunities created by
all of this, let's put D3 on the screen
and talk about Zomentum. So Zohran Mamdani, he's actually, he's
now trailing, according to this new poll, Andrew Cuomo in
a one-on-one race. So keep that in mind.
(11:13):
He's now trailing Cuomo by only two points. My suspicion
is that that's at least pretty close to the margin
of error. I'll have to go look at the poll here.
But that was also conducted before the Alexandria Ocasio-Cortez
endorsement of Zohran Mamdani, and we can put D4
on the screen as well.
Speaker 7 (11:34):
And before the debate, which went well for Zohran.
Speaker 3 (11:36):
Yes, yes, and went very poorly for Cuomo. So, big
thing for Zohran here, across the board. The places where
he has lower net favorability are just the places where fewer
voters know enough about him to have an opinion. No demo
has particularly high unfavorables for him overall. And then another post,
a tale of two candidates: Zohran, plus forty-three net
favorability, twenty-eight percent haven't heard enough; Cuomo, minus one
(11:58):
net favorability, three percent haven't heard enough. So.
Speaker 7 (12:02):
Krystal, room to grow for Zohran.
Speaker 3 (12:04):
Well, also, you heard it here first, truly, you heard
it here first. Zomentum is absolutely real, it is absolutely real.
Speaker 7 (12:12):
Yeah, no, there's no doubt about it.
Speaker 2 (12:14):
I mean, listen, I want to be fair and say
there are other polls out, Salem's rejoined the chat. There
are other polls out that show a larger gap in
favor of Cuomo. This particular polling outfit, though, was extremely
accurate in the Eric Adams mayor's race. It's not a
crazy poll, right? No, they do have a track record.
(12:35):
I don't think anyone denies that Zohran has significant momentum
right now. And this poll, coming before the debate, before
AOC endorsed, is significant, because, interestingly, in terms of the
gender divide, Zohran is actually doing better with men. So Democrats,
if you're looking for a candidate who can appeal to men,
(12:55):
and specifically, actually, white men, there's something to learn from
Zohran's candidacy, because he's doing really well with young white
men, who are probably his, I'm quite confident are his best demographic.
man conversation. But you know, the point about the groups
(13:16):
where he's faring the poorest are also the ones that
know the least about him, I think is a really
important one and may help to explain some of why
women are voting more for Cuomo than Zoran at this point,
which disturbing and really a little bit head scratching, but
women also knew less about Zoran at this point, so
(13:38):
there may be rooms to grow there. You know, he
still has some of the demographic challenges that have plagued
the left. I think, you know, older black voters are
the weakest demographic for him. Makes sense because you have
a larger number of conservatives who you know, older black
voters tend to as a group be more conservative than
some of the other parts of the Democratic He's he's
(14:00):
doing pretty well with Latinos though, so he's gained a
lot of ground there, and you know, we'll see they're
hitting him quite hard. They're really trying to go after
him in all sorts of ways, and it's it's going
to be continue to be a difficult hill to climb.
But I think if you're looking at who has room
to grow, obviously he is way more favorable, way higher
(14:24):
favorability than Cuomo. I think, you know, he's really
got a shot in this thing, which is pretty wild
and pretty extraordinary commentary on the way that the Democratic
base has moved in Trump two point zero as well.
Speaker 3 (14:35):
Yeah, that favorability gap is insane, and it'll be really
sad if New Yorkers are stuck with Andrew Cuomo, because
I don't think anybody wants that. Even people who are
saying they're maybe favorable to Andrew Cuomo are probably favorable
to him in the respect that they don't like the
other candidates, and they're like, okay, he's fine, I know him.
But really, nobody wants Andrew Cuomo to be the mayor
(14:57):
of New York City. I mean, maybe, like, five people
want Andrew Cuomo to be the mayor of New York City.
They're all named Cuomo. So if voters are stuck, maybe
they're uncomfortable with Zohran's full democratic socialist platform. And honestly,
that's fully understandable, especially after the city was mismanaged under
Bill de Blasio. Like, I get it, I get it.
(15:18):
So if you're forced to choose between Andrew Cuomo and
somebody who's, by his own admission, on the left flank,
that just sucks for voters. So if it ends up Cuomo,
I mean, good luck, everyone. Like, that's just, I'm sorry
that those are your, those are your choices.
Speaker 2 (15:37):
Let me, let me say something about the way Zohran's
run his campaign, though, too. Yeah, because it, you
know, circles back to the, like, Jon Stewart Stephen A.
Smith conversation and positioning. So Zohran has, you know,
all the lefty positions, no doubt about it. His tagline
is Zohran for a New York you can afford. And every
time we've played his clips, you guys talk to him,
(16:00):
like, every time you talk to him, you know what
his policy platform is. Freeze the rent. You know, government-owned
grocery stores to compete, so that you can, you know,
in food deserts, so that you have some fresh
food options. He is laser focused on housing in particular.
Free busing, you know, so that people can ride public transit,
(16:21):
and additional investments in public transit. People know what his
platform is, they know what he stands for. It's really clear.
He has a very clear message, a very clear story about
what has gone wrong in New York, the way it's
become, intentionally, a luxury good under mayors like Bloomberg
and Giuliani and Eric Adams. And he wants to make
(16:44):
it so that you can live in New York and
it doesn't have to be such a struggle. Now, will
one person be able to accomplish that in one term? No.
But he's put forward some achievable, I think, concrete goals
that people can really wrap their heads around. And that's
why his campaign has taken off. And it's not just against Cuomo.
Like, there are a bunch of longtime established New
(17:05):
York City pols in this race who were getting no traction.
He has sucked up all of the oxygen, all of
the, like, not-Cuomo oxygen. And it's a real testament
to the strength of his very straightforward proposition and his
direct pitch to people's material concerns and his story about,
(17:26):
you know, how, how we got
Speaker 7 (17:27):
Here and what went wrong as well.
Speaker 2 (17:29):
Yeah, and, you know, the white, the white bros, the
young men, Emily, they're eating it up. Again, lessons to
be learned here.
Speaker 3 (17:36):
Yeah, it's how AOC won, and it's not, like, esoteric
DSA navel-gazing. It's, yeah, a huge lesson, I think,
for national Democrats. So, Krystal, should we move on
to updates in the case of Greta Thunberg?
Speaker 7 (17:48):
Yeah, let's do that.
Speaker 2 (17:52):
We've been covering the Gaza aid Freedom Flotilla, which was
intercepted in international waters, illegally, by Israel. The members of
that vessel, including Greta Thunberg, were arrested, or, in President
Trump's words and mine as well, kidnapped by Israel, taken
to an Israeli prison, and then they, in order to
(18:14):
effectuate their release, were made to sign these forms that
would consent to their deportation but also would confess to
basically illegally entering Israel, which is not what happened whatsoever.
So some of the activists have refused to sign those forms.
Greta did sign the form and has been released. She
(18:35):
was flown back to Sweden. Reporters spoke to her there
about the ordeal and why she did what she did.
Let's go ahead and take a listen to a little
bit of what she had to say.
Speaker 3 (18:43):
Thunberg is speaking to reporters. Let's take a listen to
what she has to say.
Speaker 8 (18:47):
Yeah, on international waters, we were illegally attacked and kidnapped
by Israel, and then taken against our will to Israel,
where we were detained and then some of us deported.
Speaker 3 (19:01):
Some are still there.
Speaker 4 (19:04):
Uh, there are very.
Speaker 8 (19:06):
big uncertainties, because it was quite chaotic and uncertain, so
I don't really know what's going on. I haven't had
a phone for many days. And the last few
hours, they were absolutely nothing compared to what people
are going through in Palestine, especially Gaza, right now. And
this is yet another violation of
(19:30):
international rights, adding to the list of countless such violations,
especially towards Palestinians, that Israel is committing by blocking and
preventing humanitarian aid from entering Gaza.
Speaker 3 (19:43):
That is, that is illegal.
Speaker 8 (19:49):
Sandwiches? Yeah, they probably have posted lots of PR stunt videos.
As I said, I have not seen anything. But they
did an illegal act by kidnapping us on international waters
and, against our will, bringing us to Israel, keeping us
in the bottom of the boat, not letting us get
out, and so on. But that is not the real
story here.
Speaker 3 (20:09):
The real story is.
Speaker 8 (20:10):
That there's a genocide going on in Gaza and a
systematic starvation following the seas and blockade now which is
leading to food, medicine, water that are desperately needed to
get into Gaza is prevented from doing so. But of
course there are many attempts like this mission, both by
(20:30):
seeing and land to break that siege and open up
a humanitarian corridor. And this was a mission of attempting
to once again bring aid to Gaza, which is desperately needed,
but also to send solidarity and say that we see you,
we see what is happening, and we cannot accept just
(20:52):
see witnessing all this and doing nothing.
Speaker 3 (20:55):
That can never be an option.
Speaker 2 (20:57):
She also spoke more about how these horrors in Gaza
have been allowed to persist. Let's go ahead and take
a listen to her reaction there.
Speaker 9 (21:06):
Why do you think so many countries' governments around the
world are just ignoring what is happening in Gaza? Because
of racism. That's the simple answer.
Speaker 8 (21:16):
I would say racism, and basically desperately trying to defend
a destructive, deadly system that systematically puts short-term
economic profit
Speaker 9 (21:28):
and to maximize geopolitical power over the well-being
of humans and the planet.
Speaker 3 (21:35):
And right now.
Speaker 9 (21:36):
it's very, very difficult to morally defend that. It is impossible,
but still they are desperately trying, which, absurd is
not the word, there are no words to describe it.
Speaker 2 (21:48):
And Emily, I'm curious for your reaction to this. You know,
there's a lot, I think, that she said there that
is interesting, but in particular I want to highlight that
she mentions this isn't part of just one effort. And
in fact, the organization that she's involved with, we spoke
with the spokesperson for the Freedom Flotilla group, they
(22:09):
plan to attempt this many more times, and there are
two separate additional efforts that they're in coordination with to
also try to break the siege. So there's been a
lot of discussion about Greta in particular. What do you
make of what's gone down here?
Speaker 3 (22:24):
Yeah, I mean, I don't have a, like, super, I
think it kind of went exactly as I expected that
it would, and, you know, it's better, actually, than it
could have gone. It's a little interesting.
Speaker 6 (22:39):
The bar was low.
Speaker 3 (22:40):
The bar was really low. The bar was really low. And,
you know, it's interesting that people are still over there.
I also think, hey, this is an example of people
putting their money where their mouth is and actually doing
the thing, walking the walk. So, you know, even if
(23:00):
I disagree with her, I'm not mad about it.
Speaker 7 (23:03):
To be honest, it was genuinely courageous. I mean, that's
to me.
Speaker 2 (23:07):
And the other thing is, look, she is a high-profile
person. People paid much more attention to this effort
because she was on board. I think Israel's reaction, which
I don't want to underplay, I mean, it was illegal,
like, they acted as pirates, kidnapping, even, like I said
before, Trump called it that, so they kidnapped her, right? But
(23:27):
they didn't kill her, which they've done before with, you know,
humanitarian activists previously, back in twenty ten. And in fact,
the spokesperson we talked to was on board that ship
that they attacked, when they killed ten people who were on board.
So they didn't do that. And, you know, I think
that level of quote unquote restraint also is partly attributable
to the fact that they realized, like, shit, we can't
(23:48):
kill Greta Thunberg, like, there's gonna be
a lot of international attention to that. So I think
she very effectively used the large platform that she has,
undertook an action that was genuinely courageous, that entailed genuine
personal discomfort and extreme risk, in order to try to
do something. And so, you know, I
(24:10):
saw Omar Baddar, friend of the show, tweeting, like, if
you had one thousand people who did the same thing
and also sent ships trying to break the siege, it
would become, you know, intercepting all of those people and
trying to hold on to this blockade preventing starving people
from getting food, like, that would become unsustainable with even,
(24:31):
you know, a comparatively small level of organization, which Greta
has, you know, made much more likely and much more
possible here. So I think what she did was extraordinary.
I don't know, I don't really understand why
the right really fixates on her and really hates her
in the way that they do. But I think she's
also proved a lot of the haters wrong who felt
(24:52):
like she had this sort of, like, corporate-friendly level
of, you know, approved activism. This was certainly not in
that vein, both because of the danger of the action
and also because there is nothing corporate-approved about standing
up for Palestinian rights.
Speaker 3 (25:11):
Yeah, I mean, I think she had, like, a youthful
naivete that was easy to make fun of on the
climate issue. I genuinely feel badly for anybody who gets
famous as a child, whether it was their choice, their parents' choice,
or nobody's choice. So I feel like she's growing up
in front of cameras, in a panopticon, and that just sucks.
(25:32):
But much more, this is a version
of Greta Thunberg that has definitely grown up compared to then.
There's no question about it. Big W for Greta on
this one, not for Bibi.
Speaker 2 (25:45):
Yeah, I would say so. Let's go ahead and put this
image up on the screen. This is the latest video
of quote unquote aid and the way that this
operation looks. I know Drop Site is reporting that you
had another massacre in the context of hungry Palestinians
attempting to obtain aid from this Gaza Humanitarian Foundation.
(26:06):
You know, hundreds of people now who have been killed
because of the setup of this aid, effectively, trap at
this point. And at the same time, we also have
the ambassador to Israel, Mike Huckabee, who is, oh
my God.
Speaker 3 (26:23):
The ambassador is at a breaking point.
Speaker 2 (26:24):
Shalom, shalom. Yes, Mike Huckabee significantly changing US policy and
just, I mean, making it plain now: we don't, we
don't support the previous contours of any idea of a
two-state solution.
Speaker 7 (26:36):
Let's go ahead and take a listen to that.
Speaker 10 (26:37):
I think the question is, what does that Palestinian state
look like? Okay, where is it? Where is it going
to be? Well, does it have to be in
Judea and Samaria? Does it need to be somewhere different?
Does it need to be an opportunity for people to
have a true place that is completely their own, or
is it going to be in the existing areas that
(26:59):
are currently under the dominion of the PA? So there's
a lot of questions. That's why I'm saying, I don't
believe anybody can say it's impossible, it will never happen.
But if someone wants to declare that this is the
exact strip of geography that is going to be the
future Palestinian state, that's where the complication comes from.
Speaker 3 (27:16):
Well, maybe the word is exact, that's the problem.
Speaker 5 (27:19):
Are you suggesting that somewhere other than the Mandatory Palestine area,
that it could be in Saudi Arabia or something?
Speaker 10 (27:25):
I'm just saying that I think every option should be
and could be on the table.
Speaker 2 (27:30):
Wow. What did you, yeah, what did you make of that,
Emily? So he's saying a Palestine, but not in Palestine.
Speaker 3 (27:35):
Well, yeah, but he's also, like, being clever and winking
and nodding and saying, basically, Palestine is impossible, because none
of that land belongs to Palestine from his perspective, and
Speaker 7 (27:46):
His stance on the area.
Speaker 3 (27:48):
Yeah. It reminds me of when the Biden administration, Joe
Biden, mister two-state solution, is funding a war
for Netanyahu, who is explicitly against the idea of
a two-state solution. Here you have Mike Huckabee actually
saying something very different than the president himself. And Trump
is hard to pin down on this issue, obviously. Like,
(28:09):
is his Mar-a-Gaza idea technically under the umbrella of
a Palestinian state as pro-Palestine activists would define it? No,
absolutely not. But he also would not be comfortable with
just saying, hey, screw it, give it all to Israel,
because he has all of these relationships with other Arab
(28:30):
states that he thinks are important and contingent upon finding
a fair solution, or a solution that they see as
fair, for the people of Palestine. So Mike Huckabee in
that clip, it's not at all surprising. He's saying exactly
what, you know, people from the dispensationalist evangelical movement believe
(28:52):
about that land, and it happens to be what a
lot of Israelis believe about that land. But it's not
what the quote unquote America First movement believes about
the solution.
Speaker 2 (29:05):
Yeah, well, I mean, it is quite consistent with Trump's,
like, we're going to ethnically cleanse Gaza and push them
somewhere else. Like, it certainly fits with that. And we
know there have been discussions that have been ongoing with
a variety of different countries to try to push them
to accept these Palestinian refugees that Trump wants to create.
(29:28):
And so, you know, I think it's, I think it's
part and parcel with that.
Speaker 3 (29:32):
You know.
Speaker 2 (29:32):
The easy thing to say here is, like, okay, well,
if you've made it impossible to have a two-state
solution with the ongoing settlement project, illegal settlement project, which all
Israeli prime ministers have engaged in, by the way, but
which has really ramped up in recent years under Netanyahu,
and specifically post-October seventh, if you've made that impossible, okay,
(29:52):
then how about everybody gets equal rights?
Speaker 7 (29:55):
How about that?
Speaker 2 (29:57):
How about you just live up to your rhetoric about
how, you know, you don't discriminate in your democracy, and
make it an actual democracy where everybody's actually treated equally? Because
if a two-state solution is not going to be possible,
and you're putting that off the table, and you're doing
everything you can, which is a stated goal of Netanyahu,
to make it so that's impossible, well, that's the
(30:19):
other solution that's available to you.
Speaker 3 (30:21):
But in a sense, I don't think he's being honest
about it. In a sense, the, like, sadly realistic
situation on the ground is that that area, I mean,
he's in Jerusalem right there, but, you know, the Temple
Mount, I mean, the Al-Aqsa, like, nobody wants that to
(30:42):
be given to the other side, least of all
people who are of the, like, dispensationalist ideology, as people
like Mike Huckabee are. You know, that's, so.
Speaker 4 (30:57):
Yeah?
Speaker 3 (30:57):
I mean is it impossible?
Speaker 5 (30:59):
Sure?
Speaker 3 (31:00):
Or should it be impossible? No. And I think that's
where the just depressing reality sets in.
Speaker 7 (31:07):
Yeah.
Speaker 2 (31:08):
Indeed. We've got a great guest standing by to talk
about some extraordinary developments with regard to AI, so let's
go ahead and get to that. There have been a
bunch of very significant developments with regard to AI
that we did not want to lose sight of this week.
So really excited to be joined this morning by Taren
Stinebrickner-Kauffman. She's the CEO of the Golden Gate Institute
(31:31):
for AI, and she's going to take us through some
of these developments. Taren, it's great to have you, great
to meet you.
Speaker 6 (31:36):
Thanks so much. It's lovely to be here.
Speaker 2 (31:39):
So, first of all, in the wake of the protests
in LA, which at times turned violent, there has been
an effort to spread all kinds of propaganda, as is
quite typical in these news cycles these days. But added
to the mix now, we have these just outright AI-generated
scenes, like this one in particular I'm about to show.
(32:01):
It's kind of crazy to me that some people even
thought this was real, because it takes a very surreal
turn quite quickly, but some people did just watch a
piece of it and actually think that it was real. So,
you know, that's putting aside more sort of sophisticated efforts
to create these images and shape people's understanding of events
(32:22):
that are happening in this country. So let's go ahead
and take a look at this one video in particular,
AI-generated, that went viral.
Speaker 6 (32:29):
Why are you rioting? I don't know.
Speaker 3 (32:33):
I was paid to be here and I just want
to destroy stuff. Is what you are doing illegal?
Speaker 7 (32:40):
Well, these are peaceful protests. Even that guy over there
says so.
Speaker 2 (32:48):
And so, Taren, this really speaks to one of the
concerns people have with regard to AI development, which is
just how difficult it makes it to sort through what
even is reality.
Speaker 11 (33:01):
Yeah, this is a really important issue. Uh, one thing
to know is that these are going to get much
more realistic.
Speaker 6 (33:07):
Like these are.
Speaker 11 (33:08):
You know, if you're a discerning eye and you sort
of know what you're looking for, it's pretty obvious that
this is AI generated. Still it kind of has like
a sheene, it doesn't quite kind of look exactly photo realistic.
And they've gotten much more realistic in the last year,
and they're going to keep getting much more realistic. I
would assume that a year from now I won't be
able to tell easily whether something like that is real
(33:30):
or not based on the actual images and audio. The
second thing, you know is this is a really hard
problem to solve. I've heard, you know, lots of solutions
floating around, like we should you know, require AI generated
video to have a water mark, or we should you know,
It's just really hard. It's not really clear how any
(33:51):
policy solution at scale could work to to you know,
to stop bad actors from making videos and circular them
that are not real. And the third thing which you
pointed to, I think is this is sort of what
I call the truth fog, which is it goes beyond
the effect of the actual negative videos, the false videos,
(34:14):
which is to say that like now, I can't you know,
in a year, I don't think I'm going to be
able to tell whether a video is real, which is
going to make me discredit true videos evidence as well. Right,
And that's a problem too, because anything, you know, I'll
be able to share. If I share a video of
you know, a candidate for president saying something and I'm like,
you should vote for them because they said this, people
(34:34):
are going to be like, how do I know they
actually said that?
Speaker 6 (34:37):
Right?
Speaker 2 (34:38):
And we are, I think, Emily, already seeing some of that.
I know there was an incident with regard to a
doctor in Gaza whose kids were killed by the Israelis,
and there was, you know, video of the aftermath, the
horrific aftermath of that, and there were all these claims
that the videos weren't real, that these were just AI-generated.
(35:00):
I think we're already seeing some of that fallout
that Taren's talking about.
Speaker 3 (35:03):
Yeah, it's a good point. I mean, it just becomes
harder to trust anything at all. I wanted to ask
about this news, both from The New York Times and
Bloomberg, that Mark Zuckerberg is announcing a nine-figure buy
to bring top talent on for quote unquote superintelligence.
You know, that year timescale, I'm glad you mentioned it,
because it was one of the things I was going to ask
(35:24):
you about. At what point are we going to have
a situation where even experts have to exert a huge
amount of effort to tell what's real and what's fake
on social media, and even then maybe have a hard
time doing it? What does superintelligence mean, as defined
by Mark Zuckerberg and the industry, and is this, you know,
(35:44):
hurtling us even more quickly along that timeline?
Speaker 11 (35:52):
Everybody has their own term. There's superintelligence, there's advanced
general or artificial general intelligence, AGI. There's all
these terms floating around, and a lot of people don't
define what they mean by them. So I don't know
exactly what Mark Zuckerberg means. Like, when I think about this,
what I'm thinking about is an artificial intelligence that can
(36:14):
do most or almost all of the things that humans
can do on a computer.
Speaker 6 (36:20):
And that includes, therefore, many, many jobs.
Speaker 11 (36:24):
There are many, many jobs that people can do from
their computer, and when we have an artificial intelligence that
can do all of those tasks remotely, that's obviously going
to have very huge economic impacts. It's going to have
a lot of other impacts on society as well. So
that's my own definition, and I think it's probably sort
of roughly aligned with many people in the industry's definition.
But as I said, different people mean different things. I mean,
(36:46):
I think the number one thing to take away, for
there's still a large segment of people, I think, who
dismiss AI as hype and, you know, sort of like, oh,
this is kind of like crypto, like this thing these
tech bros in Silicon Valley are doing. Mark Zuckerberg has
never paid anyone nine figures for crypto. AI is
in a different category, and everybody needs to be taking
(37:09):
it very seriously. It's going to have extremely large impacts
on the economy, on the way our democracy works, and
on the rest of our society in many, many ways.
And I think that's my number one takeaway
from this: like, look, the world's largest corporations are
taking this extremely seriously.
Speaker 6 (37:29):
And you need to as well.
Speaker 2 (37:31):
What does it look like to you to take this
extremely seriously? Because I feel like that's where I get
a little lost. I am deeply concerned. I think the
job loss is going to be extraordinary. I don't put
off the table some of the most maximalist dystopian possibilities
here as well. But I'm not really sure what to
(37:53):
do about it at this point, because you have this
administration that is totally, like, no brakes on
Speaker 7 (37:59):
the car, Wild Wild West.
Speaker 2 (38:01):
We've got to, we're in this race versus China, and
we've got to be the first to get to it,
and they, you know, have this very aggressive no-regulations
approach to it. You have people like Sam Altman and
others who are overtly out there, like, yeah, we want
to replace as much of human labor, and possibly all
of human labor, as possible, and we're probably going
(38:22):
to have to completely upend the social contract in order
for this to all work out.
Speaker 7 (38:26):
And yet, you know, I don't see any.
Speaker 2 (38:28):
Real large scale conversation, which is where you come in,
about what that new social contract is going to look
like and how we're going to make sure that people
outside of trillionaires are okay in that new world. So
when you say something like we need to be taken seriously,
like what specific things do you have in mind in particular?
Speaker 11 (38:47):
Yeah, it's hard, because the field is moving so fast
that we haven't had time for a real, like, civil
society ecosystem to evolve around AI.
Speaker 6 (38:55):
Yeah.
Speaker 11 (38:56):
So if you're somebody who might have the, like, interest and
wherewithal in, like, starting a new organization around AI, there
are gaps everywhere. Like, here's an example: I don't
know of any concerted programs to educate state legislators about AI.
Speaker 6 (39:10):
That's just like one example.
Speaker 11 (39:12):
Another example is, I've heard many experts in the field,
like Ezra Klein and Kevin Roose and think tank people
in DC, say we don't have nearly enough economic frameworks
for thinking about the labor dislocations that we
expect are going to happen from AI.
Speaker 6 (39:28):
It's not, like, there's just gaps everywhere.
Speaker 11 (39:30):
If you're familiar with this field, if you think that
somebody is doing something about AI and you've got an
idea that somebody should be doing something, probably nobody is
yet, or certainly not enough people. So you should go,
like, investigate, and, like, maybe think about starting an effort
in that field. And I don't think that there are
a lot of pathways yet for people to, like, engage,
(39:52):
but I think, but I think you should call your legislators.
I think you should talk to your legislators about how
big of an issue this is, how much you're worried
about it. Right now, AI is still not showing up
as a big issue in public opinion polling, and legislators
and politicians aren't hearing that much about it. I mean,
when I say it's not showing up as this big issue,
I mean it's not showing up, like, if you ask
people what their top issues are, what they're most worried about,
(40:14):
AI is not showing up on those lists.
Speaker 6 (40:16):
And we need to shift politicians' perceptions of that.
Speaker 11 (40:20):
So if you're not going to go start a new organization,
you can at least, like, talk to your legislators
and your representatives about why, you know, they need
to be paying attention to this.
Speaker 3 (40:29):
And I want to ask, oh, go ahead, go ahead.
I want to ask about this next element, we can
put it on the screen, the Apple paper that went
viral and got a lot of reactions to it. This
post from Ruben Hassid says Apple just proved AI quote
unquote reasoning models like Claude, DeepSeek-R1, and o3-mini
don't actually reason at all. They just memorize patterns
(40:50):
really well. Here's what Apple discovered, parentheses: hint, we're not
as close to AGI as the hype suggests. I read
that post and I think, okay, yeah, I mean, no,
they don't reason, they just memorize patterns really well, they're
freaking computers. Yes, that's exactly right. But then, on the
other hand, to use that as a way to dismiss
(41:14):
our current proximity to AGI, and to say that
this is all overhyped and it's really moving
very slowly, seems ill-advised. And I just kind of
wanted to get your reaction to, I guess, your reaction
to the reaction from that paper. Yeah.
Speaker 11 (41:32):
Look, there's a set of people out there who do
make their money off of, and their fame
off of, dismissing AI, and, you know, they'll sort of
seize onto papers like this to do that. I don't
think that this paper is actually that big
Speaker 6 (41:45):
of a deal.
Speaker 11 (41:46):
In some sense, it's not saying anything that's particularly surprising.
As you pointed out, it's asking the models to
do a set of very hard problems, and at
some point, basically, the models are like, this
problem is too hard, I'm going to stop now. And
then, well, I don't even think the authors
(42:08):
are necessarily interpreting that as, these models aren't good at
what they do. It's then there's a sort of a
whole, you know, Twitter ecosystem that's like, well, now we've
proved that these models aren't good at what they do.
Guess what, if you give me one of these hard
problems that they gave these models, I'm also going to
be like, yeah, sorry, I don't have time for this.
This is, like, because they'd said I have
to solve this. They give the models constraints, like you
(42:30):
have to solve this in a certain context window, which
is maybe roughly equivalent to telling me I have an
hour to solve it, and I'm like, I can't solve
that problem in an hour, so I'm going to stop
now. But I hope you all still think that
I'm, sorry, a general intelligence, right?
I'm not artificial intelligence.
Speaker 6 (42:48):
And so they're sort of actually in.
Speaker 11 (42:50):
Some ways behaving a lot like humans in these situations,
and we're interpreting. Some people are interpreting that as saying
that they aren't smart, which I think is just not true.
Speaker 2 (42:58):
Another thing that I've wondered is, there's this sort of,
like, line that's drawn. We were talking about the Meta project
to develop AGI, Artificial General Intelligence, and you were saying
there's all kinds of different terminologies for, you know,
whatever this milestone is. Like, how much of it is just
sort of, like, a line that you cross, and that's AGI, and
(43:21):
how much of it is more of a sense, and
it's a little bit hard to tell whether you're there
or you're not? Like, help me understand, when people talk
about AGI, how much it will be clear when we've
achieved this, whatever this benchmark is. Because I also see
things like, you know, I see that particular study
that was going around. I
(43:42):
also see these other studies where AI is scheming to
make sure it's not getting shut off, and trying to,
like, blackmail an engineer with an affair that they think
the engineer is engaged in, to avoid having their programming changed.
You know, I see things that are deeply disturbing in
terms of what AI is engaging in. I mean, those are very,
like, human-type behaviors, trying to defy the wishes
of their programmers.
Speaker 7 (44:07):
So where are we in that development, and will we
Speaker 2 (44:11):
really, is it like this hard, fast line of, we
are not at AGI, and now we are at AGI,
and it's very clear cut?
Speaker 11 (44:19):
Yeah, this is a great question. I think, well,
I think most people in the industry would say it's
not a hard line. And there's a term that's
called the jagged frontier. It's like, the models can be
very good at some things and very bad at other things
that humans can do. By the same token, you
could say that humans have a jagged frontier, like, humans
are very good at some things compared to AI, and
(44:40):
very bad at other things.
Speaker 6 (44:41):
Compared to current AI.
Speaker 11 (44:43):
So I don't think we're going to immediately cross the
Rubicon of, like, suddenly the models are better
than humans at everything, including, for instance, operating a physical
body in physical space. Right, that's not going to happen
literally overnight, probably. But you always have to qualify everything
(45:05):
you say about AI with, like, I think so, probably.
Anybody who's not qualifying their statements, I think you have
to be worried about how much
you can trust what they're saying. But, you know,
are we at AGI now? I mean,
it's an interesting question because for a long time, the
test that everybody thought we were going to use for
(45:27):
this was called the Turing test. And for people who haven't
heard of it, the Turing test is, you put a
human behind a computer and an AI behind a computer,
or in the computer, and then you have another human
interview both of them, right? And if the human who's
interviewing can't tell whether they're talking to the other human
or to the AI, then the AI has passed the
(45:48):
Turing test of intelligence. And these models pass that test now,
the cutting-edge models, right? And so by that definition,
the definition that we had all used for literally decades,
invented by Alan Turing,
Speaker 6 (46:00):
We've already gotten there.
Speaker 11 (46:02):
And yet these models are very bad at a lot
of things that humans are good at. And I don't
think anybody of serious repute is saying
we've actually hit AGI. So the goalposts are moving over time,
and I think the goalposts will keep moving.
Speaker 3 (46:18):
Yeah, the rate of acceleration is just unbelievable. I was
listening to an old Throughline episode on Ralph Nader
and thinking about this, actually, because deaths from car accidents
used to be like five times as high as they
are now. It took us actually a really long time
and a lot of tragedy to adapt our policies to
(46:39):
make our roads relatively safe. I mean, relatively is an
important word there. And I guess I'm just wondering how
optimistic you are about the policy evolution meeting the moment,
because right now it looks very bleak, for some understandable reasons.
I mean, it's hard to define, it's hard to see
what the future looks like exactly. Things are moving very quickly,
(47:01):
but it just seems like the reaction is completely
nonexistent here in DC.
Speaker 11 (47:08):
Yeah, I mean, there's a lot of reasons to be pessimistic.
I think the reason to be optimistic is sort of, like,
if you take a step back, humanity has faced a
lot of threats in the past, and so far we've
survived them all, right? And so, you know, humans are resilient,
and we do live in an era when lots of
people have agency.
Speaker 3 (47:31):
And.
Speaker 11 (47:33):
I hope, I mean, these are choices we are making, right?
These are not inevitable conclusions. These are choices that we
as individuals and as a society are making about how to
respond to AI. And I hope we can rise to
this challenge. I don't think it's impossible, but it is
a huge challenge, there's no doubt about that.
Speaker 7 (47:51):
Taren.
Speaker 2 (47:51):
Thank you so much for joining us this morning. It's
been great to chat with you. Let people know where
they can find you and follow the work that you
guys are doing.
Speaker 11 (48:00):
Thanks so much. Golden Gate Institute dot org. And we've
got a newsletter that you can subscribe to where we
try to make sense of what's happening in AI for
folks who are outside of the Silicon Valley bubble.
Speaker 2 (48:11):
All right, well, I'm definitely going to subscribe to that.
Thank you so much again, and we'll talk to you
again soon.
Speaker 6 (48:16):
Thank you.
Speaker 3 (48:17):
All right.
Speaker 7 (48:17):
Really interesting talking to Taren there.
Speaker 3 (48:19):
Huh, Emily? Krystal, I feel like we are at a
breaking point with regard to the end of the show. Okay,
see, you got me. I walked into it and
I didn't expect it. But both of us should go
to prison for what we just did.
Speaker 2 (48:36):
Indeed. All right, I'm sure Trump's stormtroopers will be here
to arrest us shortly.
Speaker 7 (48:41):
Emily.
Speaker 2 (48:41):
Always great to see you guys. Thank you so much
for watching the show. Let's see, today is Wednesday.
Tomorrow I will be in with Saagar, so we'll have
all of the latest for you guys then. Thank you
for subscribing over at Breaking Points. Make sure you use
that promo code, BPFREE, and there it is
up on the screen, BPFREE, Breakingpoints dot com. And
(49:01):
we will see you guys tomorrow.
Speaker 3 (49:03):
See you then.