Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is riddled with unexplained events. You can turn back now or learn the stuff they don't want you to know. A production of iHeartRadio.
Speaker 2 (00:24):
Hello, welcome back to the show. My name is Matt, my name is Noel.
Speaker 3 (00:28):
They call me Ben. We're joined as always with our
super producer Alexis code named Doc Holliday Jackson. Most importantly,
you are here. That makes this the stuff they don't
want you to know. We are recording this on the
evening of July third. As humans reckon the calendar here
(00:48):
in the United States, there's a birthday coming up, several
birthdays statistically, if you're listening to this, it might be
your birthday. So happy birthday. We are going to do a thing we like to do here in the States, which is celebrate July fourth, Independence Day. A lot of other countries have their own independence days, but this one would be
(01:09):
all us.
Speaker 4 (01:10):
Baby. I don't know if this is the case in cities other than Atlanta, but have you guys noticed that people start shooting off fireworks like days before the Fourth of July? Yeah, it's a little... it's aggressive.
Speaker 3 (01:23):
It's a sunk cost fallacy, I would say, because you
buy the fireworks and then you kind of want to
test them, and then fireworks are objectively cool.
Speaker 2 (01:31):
And it's a deal, you got like double the fireworks you thought you were gonna get. What are you gonna do with all the fireworks?
Speaker 4 (01:36):
I love that.
Speaker 3 (01:36):
I told you guys, I do have a shady uncle whose name is literally Uncle Sam. And no, stop it, true story. No one in my family knows what
he actually does, and he won't tell us. But he
did spend some time in China working in the fireworks industry.
Speaker 4 (01:53):
Hmm, okay, yeah, I thought his name might be Wacky Wayne. You know those fireworks stores around? There was...
Speaker 2 (02:00):
Wacky, Crazy Steve's Firework Emporium.
Speaker 3 (02:04):
I swear I saw one many years ago on the Gulf Coast called like Four-Finger Billy's or something. That's because he blew off his fingers.
Speaker 4 (02:13):
It can happen. Those, what are they, the Black Cats, the ones that are like M-80s, I think, are the ones that can actually literally explode your flesh.
Speaker 3 (02:22):
Well, props to you, Billy, if you're still around, because he leaned into it, you know what I mean. Yeah, he made some real...
Speaker 4 (02:28):
Lean? You'll go blind too.
Speaker 2 (02:31):
Over here, they've got Cautionary Kyle's fireworks stands.
Speaker 3 (02:34):
The cautionary made that.
Speaker 2 (02:39):
It's just sparklers. They're really tiny, those little snakes, like on that South Park episode.
Speaker 3 (02:46):
Sparklers are like the CBD of fireworks. Let's be honest.
You know, yeah, they're fun. CBD is fun, but it's
not the same thing.
Speaker 4 (02:55):
Does it work, though? Sparklers at least sparkle. That's true.
Speaker 3 (02:59):
We're also going to have an update on a trial that we've been following. We're going to have a discussion about artificial intelligence that I think we're all excited for. We're going to learn more about mRNA, because we did mention it for a reason in a previous episode. But before we do any
(03:20):
of that, we're going to pause for a second and get to our kind of breaking story for this evening. And we're back. We've talked about it before, and as we get into this, Doc, apologies to you. When we
(03:43):
were talking about this.
Speaker 4 (03:43):
Off air, our.
Speaker 3 (03:48):
Long-suffering, that's the word. She just shook her head and said, I had managed to forget about this for just a couple of minutes.
Speaker 4 (03:57):
I believe the expression was I've managed to forget that
we live in hell.
Speaker 3 (04:02):
Right, yeah, yeah, yeah, And this is in reference to
something called the Supreme Court of the United States, long
a controversial institution. In the most recent sessions of the
Supreme Court, they made rulings that seem, depending on your perspective,
(04:24):
to be either quite reasonable or quite problematic.
Speaker 4 (04:28):
Now, is there no middle ground?
Speaker 2 (04:29):
Ben Well?
Speaker 3 (04:31):
Now, yeah, I don't think there's anybody who goes, that's fine, that's good. Yeah, amidst this, here is the thing. Okay. So we've talked about it to many of our friends who are not in the US. We know it sounds weird. Ostensibly a democracy, the United States: the laws of the land are ultimately decided or adjudicated by nine kind of
(04:56):
Ringwraiths. They are not elected officials, and the public doesn't get to vote for them. They're confirmed by Congress. They have tremendous legal power, and they are meant to be a check on the power of the other two branches of government, the executive and the legislative. It is
(05:17):
weird that they're never up for election, kind of like the Pope. They have the job for as long as they want.
Speaker 4 (05:23):
And I guess we're in sort of a weird situation because, you know, one long-sitting justice passed away, so the previous president was able to appoint way more than usually get appointed during a sitting president's term.
Speaker 3 (05:38):
Yeah, yeah, and that is one of the big, big powers of the executive branch in this respect. Now, we know that talking about anything that touches on the domestic political sphere can be divisive, especially in these times. It can be a sensitive subject. However, what we're going to do
(06:00):
at the top of this evening's Strange News is explain how this is a conspiracy, right? Please check out our earlier episodes on things like Project twenty twenty five. And please understand that as we're talking about this, we are not, I think it's fair to say, we are not
(06:22):
championing some polemic agenda. We're not up for election, you know what I mean. We don't even have a Webby, so we're not the folks in charge of this. But we are going to show you objectively why you should be concerned about this if you live in the United States, absolutely. But guess what, if you don't live in the United States,
(06:45):
this stuff is going to reach out and touch you too. How would you guys, as we're getting into it, how would you guys describe your reactions to recent Supreme Court rulings? I mean, obviously the one we'll get to in a second, but let's start with rolling back federal regulations, or how those are decided. Not good? Nobody
(07:10):
feels good about it?
Speaker 4 (07:11):
Not great?
Speaker 2 (07:12):
Is this the Chevron decision?
Speaker 3 (07:13):
Yes, this is the Chevron decision, the Chevron decision with SCOTUS. You can already tell by the name what it is. It is about companies, about what a company or private entity functioning in the United States must do to remain legal. What happened
(07:37):
is, back in nineteen eighty four, there was a landmark decision called Chevron versus Natural Resources Defense Council, and they found against Chevron. This gave rise to what we call the Chevron doctrine. Without getting too into the weeds, you can read a great piece on this by Amy Howe
(07:59):
at SCOTUSblog. What we need to know about this is the Chevron doctrine, which again has stood for decades, says that if Congress has not directly addressed a specific question at the center of a dispute, then a court has to defer to the agency in charge. So, for example,
(08:22):
just keeping it really hypothetical here, if the FDA, the Food and Drug Administration, had an interpretation of something about food safety, or, you know, how much rat feces should be in Raisin Bran, the answer is two scoops. If, you
(08:42):
know, they have these rules, right, like what degree, what thresholds are there for contamination such that the safety of the public is not endangered, under the Chevron doctrine the courts would have to defer to the FDA's decision about this, which kind of makes sense because
(09:03):
they're the people whose job it is to know these things.
Speaker 4 (09:07):
And we know there are problems with those agencies as well, in terms of the revolving door of folks moving from the industries that are regulated by these agencies to the agencies themselves. So, you know, in the best of situations, there's already conflicts of interest, and this just, I mean, nukes any semblance of, like, expertise being, you
(09:32):
know, utilized by these agencies, to your point, Ben, which is their job to do.
Speaker 3 (09:36):
Yeah.
Speaker 4 (09:37):
Yeah.
Speaker 3 (09:38):
A better example would probably be the EPA, the Environmental Protection Agency, had an interpretation of the Clean Air Act, right, regarding how emissions, fossil fuel emissions in particular, should be regulated. When you roll back stuff like the Chevron doctrine,
(09:59):
it means that there's going to be legal chaos, because we no longer have an order of operations for decision making. Now, this sounds really boring until your water turns brown, right? This sounds stupid and academic until you realize that you can find fingers in cans of chili. Right? This opens
(10:22):
the door to a return to the days of Upton Sinclair's The Jungle.
Speaker 1 (10:27):
It is.
Speaker 3 (10:27):
It is dangerous, and it should be an apolitical concern. It doesn't matter who you vote for. You need stuff at a certain level of cleanliness.
Speaker 4 (10:36):
But it benefits the polluters. It benefits the companies who are responsible for these emissions or potentially dangerous drugs or food, whatever. It just opens it up for them to eff around and find out, kind of. I'm sorry, Matt, what were you saying?
Speaker 2 (10:52):
It's okay. I just had to say I've read through a lot of the writing on this, guys, and I'm still a bit confused, and I'm trying to put it in, like, more simple terms. I think you've kind of done it already, Ben, like saying that before, if there was a court case with a company versus the federal government, the law, whatever it is, if it's vaguely written, you
(11:14):
default to whatever the federal regulating body says, like what their rules are.
Speaker 3 (11:20):
Basically it's kind of allowing them to make jurisprudence. Yeah.
Speaker 2 (11:24):
Yeah, the entity that's trying to either protect the environment or the consumer or whatever it is, the federal entity that's doing some kind of regulation on corporations, on private entities, you go with whatever that regulator says. But now it's saying every court gets to make its own decision about each one of those things.
Speaker 3 (11:42):
Now, yeah, now it's saying, remember nineteen eighty four? Psych. Because SCOTUS is famously behind on their street slang, so they still say stuff like cowabunga and psych. They did roll it back. Yes, to answer, your understanding is astute here. Now, we are not ourselves legal experts. We
(12:06):
are incredibly fortunate to have many, many of our fellow listeners in the crowd who are legal experts, and we would love to hear some explanations about this now. The current... well, the thing is, for the entirety of its history, the Supreme Court has always been fertile soil for corruption
(12:28):
and influence. And these people do try to do the right things, and they've made laws that have made the world and the United States a better place, objectively, but they are continually accused of political machinations, especially because they recently ruled, and I was complaining about this in a previous recording, they recently ruled that ethics, the ethical regulations
(12:52):
of the justices themselves, should be overseen by the justices themselves. Thanks, I know, and I can't get out of a late fee at the library. This is ridiculous.
Speaker 4 (13:04):
Before we maybe move on, we could probably talk about this for the whole episode, these decisions, but it's not to say that there will always be a legal kerfuffle around simple things. But does it necessarily also mean that anyone could question something or file a suit or create a legal kerfuffle and therefore throw everything into disarray? Like,
(13:24):
if there are bad actors, and there are, like maybe lobbyists or whomever, lawyers on behalf of these corporations, they could challenge these laws, and then it could potentially cause chaos or have them changed without the oversight of a supposedly benevolent regulatory body.
Speaker 3 (13:42):
That's... yeah, no one knows. That's the issue.
Speaker 4 (13:45):
But isn't that... is that potentially on the table? I mean, that seems like a thing that could now happen.
Speaker 2 (13:50):
Doesn't it endanger some of these federal agencies? Because it may end up costing them too much money in the short to long term to fight all of the legal battles that will, you know, be presented to them. Because there are constant legal battles. If we think about the court cases where it's, oh, well, let's just
(14:11):
use Chevron again, Chevron, BP, whichever gas company, versus the federal government, because they're trying to settle some dispute about regulations that are being imposed on them by the federal government. Right? Yeah. So then those, you know, the organization like the EPA or whatever it is, has to go to court and eat all those.
Speaker 3 (14:30):
Costs, right, every time. Death by a thousand cuts. That's the legal strategy of a lot of these powerful entities. And, like I was saying on Twitter earlier, I feel like every law student should get a full refund on their tuition, because the stuff they were taught may not apply right at this point.
Speaker 4 (14:51):
But isn't Project twenty twenty five also kind of all about eroding these agencies? This further contributes to the overall, like, big picture erosion of the powers of these agencies. And to Matt's point, potentially, you know, eating them out of house and home, to the point where they can't even afford to exist anymore.
Speaker 3 (15:09):
Let me put it this way, in the most simple terms, the most simple apolitical terms. Corruption is a team sport, right? You don't get a lot of successful lone wolves, or if you do, they don't last for as long as they could. And this brings us to the big fish, the big issue that people have, and if you are American,
(15:31):
you should have an issue with this too. Recently, just a few days ago, in a mic drop moment, the Supreme Court finally issued their opinion on a case called Trump versus the United States. That's right, folks, the name of the case is officially a former president versus the country that he served as commander in chief of
(15:54):
for a while. And what they found in this decision, which was divided clearly along the ideological lines of the court, what they found was that, in their opinion, any US president, just like any Russian president, will have broad immunity from criminal prosecution, a license to commit crimes, so long as
(16:18):
they have, as they put it, used the official powers of their office to do so. So the Overton window has shifted. Now the question is not, was insert-here a crime? The question is, was it official? You know what I mean?
Speaker 4 (16:34):
Doesn't this mean that Nixon was not a crook?
Speaker 3 (16:38):
It's saying Watergate, you know, would have been totally chill.
Speaker 4 (16:42):
This is wrong.
Speaker 2 (16:43):
It's so weird. It is weird to me because it does seem to draw a sharp line between those things that are done as a private action versus, as you said, an official action, right? Right. So it doesn't necessarily mean on the surface that the president has
(17:04):
complete immunity, but it does mean if the president does something that is, let's say, illegal, it could be argued by that president and everybody around that president that whatever the action was, it was taken as an official duty, right?
Speaker 4 (17:20):
What's the criteria for that though? Is it laid out
or is it pretty vague?
Speaker 3 (17:24):
No, no, it's pretty vague. Yeah, because if you're trying to pull a grift, you know, the thing about lying to a large group of people is you've got to swing big. You know, you want to minimize specifics, you want to access emotion. Right? So a divided, tribalized, or dare I even say near-Balkanized public like the
(17:45):
United States, then they're going to be strongly pushed toward voting the way you would root for a football team, right, instead of looking objectively at the data here. And make no mistake, we talked about this in Project twenty twenty five. It's too much power. It's un-American, and it's very difficult
(18:09):
to imagine any president that would not in practice take advantage of this. I want to go to the specific dissent here from, I was going to say, yeah, Justice Ketanji Brown Jackson wrote a quote that really stood out, I think, to a lot of us, and said, quote,
(18:30):
From this day forward, presidents of tomorrow will be free to exercise the commander-in-chief powers, the foreign affairs powers, and all the vast law enforcement powers enshrined in Article Two however they please, including in ways that Congress has deemed criminal and that have potentially grave consequences for the rights and liberties of Americans. The Justice here is
(18:54):
not being hyperbolic. It is crazy that our Project twenty twenty five episode came out and then this followed almost directly on the heels of it. I'm not saying they listen to our show, you know. If you guys do listen, though, you know, buy us off. We're corruptible. We want to be on the team.
Speaker 4 (19:13):
Give us a Webby, maybe. Just chill. And then another dissenting opinion from Sonia Sotomayor said, in an excerpt, Today's decision to grant former presidents criminal immunity reshapes the institution of the presidency, which is kind of inarguable.
Speaker 3 (19:31):
And also, with fear for our democracy, I dissent.
Speaker 2 (19:36):
Yeah, this is my opinion, guys. It has nothing to do with potentially next former president Donald Trump or Joe Biden or anyone in the next, what, twelve years. It's what's coming down the road. Can we quickly read just a couple of quotes? Because this is absolutely... it's scary when you see it.
Speaker 3 (19:57):
Yeah, and there's a big point I want to end with.
Speaker 2 (20:00):
Yeah, just some of the majority opinion written by Justice Roberts.
Speaker 3 (20:03):
Here, right, that's Chief Justice John Roberts.
Speaker 2 (20:06):
Something about courts may not inquire into the president's motives. Like,
that's scary to me.
Speaker 3 (20:11):
Yeah, quote in dividing official from unofficial conduct, courts may
not inquire into the president's motives, which is insane.
Speaker 4 (20:20):
Yeah, but it is up to the lower courts to decide what constitutes a presidential act. So isn't that sort of, like, I don't know, that seems like, what's the word I'm looking for, a contradiction in terms, where it's up to them to decide what is a presidential act, but they also can't take into account whether it was self-serving and not in the service of the office,
(20:44):
or of the people, or of the country.
Speaker 3 (20:46):
And here's the issue. Also, it goes further to say,
people who do not support this idea of a president
above the law, they would naturally wonder, well, surely it'll,
like you're saying, go to the lower courts. Surely prosecution
and appeals can wend their way through the glacial legal system.
Speaker 4 (21:11):
Doubtful.
Speaker 3 (21:12):
Well, they cut it off at the pass. You could say that they already hamstrung possible prosecution. Roberts continues in his statement, and he says a prosecutor cannot, quote, admit testimony or private records of the president or his advisors, end quote, in figuring out if within an official act there is a crime. That means the president, and this can be
(21:36):
whomever they may be, can be prosecuted for an unofficial act, but the prosecutors cannot prove whether this president committed a crime using evidence from the president's official actions. It is... so there's an umami to it. I can't help but admire the cognitive parkour.
Speaker 4 (21:56):
Pretty sure I've seen this meme already, but if not, it belongs in this format: Joe Biden has the chance to do the most hilarious thing ever with this new power, because it essentially means you could have a political rival assassinated if you see or judge that it is a threat to democracy.
Speaker 2 (22:16):
But you would never have to say that. That's exactly right. It couldn't be proven, any of the motivations. You can't provide testimony that would call into question the motivations of any sitting president or their advisors.
Speaker 3 (22:32):
Exactly. This could be... this does include, this does encompass things up to and including murder, or for some of us, even worse, treason, right? And this leads us to here is why I posit this isn't a political concern. This is an everyone-is-in-this-together concern. It does not
(22:53):
matter who you vote for. It does not matter; your political ideology is your own. Your spiritual ideology, it's your own. The beautiful thing about this country is you don't get murdered for disagreeing with people, so far. This is a direct contradiction of one of
(23:14):
the big things the imperfect founding fathers were all about. You know, we know what they loved. They loved silly wigs and stockings. They were problematically into enslaving people. And they also hated monarchies. That's like one of the main reasons they started this whole weird experiment. No kings, right?
Speaker 4 (23:35):
This feels weirdly monarchical.
Speaker 3 (23:39):
It's like somebody saying, hey, as the new managers of
the fire department, we looked into, you know, our bylaws,
and we really should be setting more houses on fire.
Speaker 2 (23:51):
Yeah, but you know, those same guys also thought we shouldn't let the peons really decide who the next president's going to be. We should come up with some form of institution, a college of sorts, that would actually make the informed decision, while, you know, the peons get to vote in their silly little election, and...
Speaker 3 (24:12):
A couple of those, a couple of those slick bastards, tried to get George Washington to be a king, which is dumb. Not everybody agreed. They had crazy ideas. Ben Franklin looked at the alphabet and said, I've got a better take. Thomas Jefferson looked at the Bible and said, I have some notes, you know what I mean. These were real hold-my-beer guys. Plus they were often drunk,
(24:34):
if we're being honest.
Speaker 2 (24:35):
But they had a cool idea, including no more kings.
Speaker 3 (24:38):
Including no more kings. And thank you for bringing us back to that, because that's the point we need to stay on. This is literal stuff they don't want you to know: not necessarily what's happening now, that's all in the public sphere, but the consequences, right, the long term plan, not next year, but the next five years, the next ten, the next twelve. And we want to hear your thoughts,
(25:01):
whether or not you are a legal scholar. What do you foresee as the possible fallout of this recent Supreme Court decision? Conspiracy at iHeartRadio dot com. We'll be back after a word from our sponsors.
Speaker 2 (25:20):
And we've returned. We're going to stay in court for a moment. Not the Supreme one, the Court of the Crimson King perhaps? No, the court of Massachusetts.
Speaker 3 (25:32):
Yes, technically the sexiest court.
Speaker 2 (25:35):
Oh my god, I tell you what. Actually, the specific court here is the Norfolk Superior Court in Dedham, Massachusetts.
Speaker 3 (25:47):
Hot and bothered. Wow, those smokeshows.
Speaker 2 (25:50):
Yes, we're jumping back to the Karen Read murder trial that we've been following since it was brought to our attention almost exactly one year ago.
Speaker 3 (25:59):
Guys, I retract my sex jokes. This is a murder trial.
Speaker 2 (26:03):
This is a murder trial. Yeah, very true.
Speaker 3 (26:05):
I've buttoned my shirt back up. Sorry, didn't mean to take it too far.
Speaker 2 (26:09):
But maxtout brought it to our attention almost exactly a year ago, in July twenty twenty three, and we're gonna give you yet another refresher, as this is another update from that trial. Karen Read is a person, a woman who was accused of killing her boyfriend, a Boston police officer named John O'Keefe. It is alleged by the people
(26:31):
bringing those charges against her that she hit him with her car in Canton, that is, in Canton, Massachusetts, on January twenty ninth, twenty twenty two. She has, ever since charges were brought against her, ever since the incident, denied any involvement, and claims that her boyfriend was in fact killed at a house party where his body was found
(26:54):
right outside of it. The house party that had lots and lots of police and other law enforcement officers on hand.
I guess they were also attending that party. There's also an alleged dog attack. All kinds of stuff, very, very complicated. So Karen Read was charged with second degree murder and manslaughter while operating under the influence and leaving the scene
(27:16):
of personal injury and death. If she had been convicted of these charges, she would have spent the rest of her life in jail. And it should be noted the defense, the attorneys that were working for Karen Read, attempted to paint a picture of an elaborate conspiracy by the law enforcement officers who were there at that party, as well as others that were working in official capacities
(27:40):
and other people who were witnesses for the court in this case. Well, after all the evidence was laid out, after all the testimony, after all the closing arguments were completed, Karen Read's fate was in the hands of the jury, and after twenty seven hours of deliberation, they failed to reach a verdict. You guys, a hung jury. Yes, the trial's over,
(28:03):
and Judge Beverly Cannone, not sure how you say your name, C-A-N-N-O-N-E. She declared it a mistrial
and set July twenty second as the next day for
the court to get back in session to decide what's
going to happen next. It's a pretty big deal. There's
a lot of writing happening right now about it and
what's going on, and also what's specifically going on with
(28:26):
this guy, Michael Proctor, who was the lead investigator. Lots
of weird stuff in there. If anyone was following the case,
this is the guy who was, you know, leading the
investigation and then also texting with personal friends and colleagues
about Karen Reid and saying some pretty messed up things
and very strange, very very strange trial. But I guess
(28:48):
that's the update. If anyone has any specific things they
want to talk about, I guess just send your thoughts
our way. You can email us or we'll tell you
how to call us at the end here. But yeah,
mistrial, guys. It's weird how that can work, where basically, with defense attorneys, all they have to do is create enough doubt about the official charges, right, and the investigation, to
(29:14):
make some jurors, even just a few, maybe just one, say no, I absolutely will not come to a unanimous decision with you about guilt or non-guilt.
Speaker 3 (29:23):
Here, we've all seen Twelve Angry Men, which is just a masterwork, and...
Speaker 4 (29:28):
I've actually not seen it. And I mean to, thank you very much.
Speaker 3 (29:32):
I sometimes look around our cohort and wonder who I
would cast as like members of a jury and what
kind of conversations we would have, because over the years
we've had some really intense and strange conversations. But to
that point, however, about a case being tainted, it's something that prosecutors take very seriously, because in this judicial system,
(29:56):
if something goes wrong, or there's an unethical actor, or something outside of procedure for the quote-unquote good guys, then that can ruin the case, regardless of whether the criminal action is pretty certain.
Speaker 2 (30:14):
Oh yeah, you've heard that before. And that's a huge trope in American media, shoddy police work ruining a case, right? Yeah. But hey, in this case, right after the mistrial was declared, we're talking hours, this trooper Michael Proctor, he's a Massachusetts State Police officer, he was relieved
(30:38):
of duty and reassigned from his role as an investigator with the Norfolk District Attorney's office. They were like, you did so bad in this whole thing, you're gone.
Speaker 3 (30:50):
And doesn't he have an IA investigation?
Speaker 2 (30:53):
Internal? Yeah, internal affairs investigation going on with that guy
right now?
Speaker 3 (30:57):
Good?
Speaker 2 (30:58):
Not great, not great. But it makes you wonder about, you know, what don't we know, and what won't we ever know, because of, you know, what they find in that internal affairs investigation.
Speaker 4 (31:08):
Well, and the hung jury is the ultimate tool of things like jury tampering as well. Like, that's what you hope for, is, you know, some plant that's like the lone holdout that keeps a unanimous decision from being rendered.
Speaker 2 (31:19):
But in this case, it doesn't mean that's what happened in this trial, right? No.
Speaker 4 (31:23):
No, no, of course it does, but it's like an ultimate kind of hang-up, and it seeds doubt as well.
Speaker 3 (31:30):
You know, is American jury duty so bad?
Speaker 4 (31:33):
Have you guys both done it.
Speaker 3 (31:34):
I never get cast. No, I'm sorry, I shouldn't describe
it as an audition, but it does feel like an audition.
They can tell I'm too on board with civic duty.
Speaker 4 (31:43):
I think I've mentioned I was cast once, and I do feel like I swayed some of the jurors in what I would argue is a positive direction. It was a civil case, and it was about awarding somebody some damages. But then the second time I did the audition, I was not cast, and it was because, you know,
(32:04):
they basically present to you the basics of what the case is, to the point where you kind of get the gist of what's going on, how the injured party was injured, and it involved a kidnapping-type situation and a holdup. And I had experienced a situation where a neighbor of mine was home invaded and shot, and
(32:25):
the neighbor, a friend of mine, came to my house bloodied and tied up, and I expressed this story, and I think they counted me out because they thought I would be maybe too triggered by my experience.
Speaker 3 (32:37):
On my end, I think it's just that it's the same thing that happens when I'm in an exit row on a plane and I have to give the verbal yes. I'm just like a little too on board. I got some Manson eyes and I'm like, yeah, let's go to court.
Speaker 2 (32:54):
I don't know about juror number seven.
Speaker 3 (32:58):
There's something in the eyes. But this also, you know, I appreciate the point you're making about how this can be kind of a black box, right, in terms of what options will be pursued, how things will shake out. But one thing I love about this is it returns us to the point we always make, which is, just because something has left the big headlines doesn't mean
(33:21):
the story is concluded, kind of like vaccines.
Speaker 2 (33:24):
Thank you, Ben, I do. For the second part of this segment, we are talking bird flu, boys. H5N1 avian flu. We've talked about it on the show before, talked about how it's getting in the cows. It's transferred to the cows. Bird to cow, not good. Now we're watching humanity, that is, whether or not it's
(33:45):
going to go from cow to human, because mammal to mammal is a little easier than avian to mammal, and it already did that. Zoonotic.
Speaker 4 (33:53):
Is that right?
Speaker 3 (33:54):
It jumped, right? Yeah.
Speaker 2 (33:56):
Yeah, But it's in the cattle, it's in some of
the milk. According to the people who watch the milk.
Speaker 3 (34:02):
I thought you were going to say, according to people
who drink milk, as though there's some weird edgy demographic.
Speaker 4 (34:10):
It's the ones that like the raw stuff. They're the edgy milk drinkers, if you will.
Speaker 3 (34:17):
They're just out there. The McPoyles are out there, just raw-dogging milk.
Speaker 2 (34:21):
Yeah, always. But no, nothing, no, no aspersions cast on the milks, as you will. Yeah, yeah, yeah. Besides,
you know some of the barbaric ways they get the
milk out of the cows, you know, and they have
to make them have babies and then take the babies
away from them so they can make it.
Speaker 3 (34:39):
Oh God, have you guys ever milked a cow?
Speaker 4 (34:42):
No. You really squeeze those nips? I've seen videos of it.
Speaker 3 (34:47):
It put me off milk for a while. Jeez. I
try not to think about it. But yeah, do you guys?
Speaker 2 (34:52):
I just watched an old Tom Green video where he yes,
he does, and I think he might have been the
first one to do it in that capacity on camera.
Speaker 4 (35:03):
Respect. My bum is on the step. He was immortalized in an Eminem song. It occurred to me the other day that a lot of people forgot Tom Green, except in that Eminem thing, where in history people will be like, what is he talking about?
Speaker 3 (35:19):
What is he talking about in this line? It was, expect them not to know what a woman's clitoris is. Intercourse is... yeah, yeah, yeah, wow. Wow.
Speaker 2 (35:33):
What are we talking about? Guys?
Speaker 3 (35:34):
Yes, vaccines. We're delivering on our earlier tease about mRNA conversations.
Speaker 2 (35:44):
I think... oh, okay, we're gonna get to it. We're gonna get to mRNA vaccines. Okay. So the reason we're talking about this avian flu is because we've been chatting about it for a while. Scientists have been warning, raising little signs everywhere across the world, saying, guys, this is kind of weird. This is not good. And now, at least according to a Reuters article that
(36:06):
I read this week, I think we all read perhaps, titled Scientists wary of bird flu pandemic, quote, unfolding in slow motion. Now, could that be scare tactics? Yes, it could be. Could it be scientists actually saying, hey, the public and, you know, journalists, a lot of us don't really notice pandemics until they're really on that upward swing,
(36:29):
you know. And what these scientists are trying to say is, we're noticing that we might be on that low, low bell curve right at the beginning. It feels like that to us. We want to raise a bigger flag now. And since that is occurring, as you were saying, Ben, that mRNA vaccine that we all know, because that's a new,
(36:51):
brand new technology for vaccines that was introduced during the COVID nineteen pandemic that we all went through, where Moderna and Pfizer and a bunch of other companies attempted to use this new technology as a way to fight viruses via vaccine.
Speaker 3 (37:08):
Right.
Speaker 2 (37:09):
The mRNA thing, well, the real kicker here is that Moderna is being given, or granted, let's say, is that how you would say it? They've been given a grant by the government, or at least they've been tapped by the government, to develop an mRNA vaccine for the next
(37:31):
flu pandemic, the next outbreak. In this case, it would be avian.
Speaker 3 (37:35):
Flu, which has always, I think for many years now, avian flu has been a, I don't want to call it a back-burner concern, but it's been a consistent part of what scientists are warning us about. Like, as you mentioned, Nol, the zoonotic aspect of an avian flu.
Speaker 4 (37:55):
That could I mean.
Speaker 3 (37:57):
They already have. We already have instances of this. A lot of scientists will tell you it's similar to the idea of the Yellowstone eruption, right? It is geologically certain, we're not exactly sure when. So when scientists are talking about a possible bird flu pandemic, they have been,
(38:17):
at least we know over the past few decades, increasingly talking not in terms of if, but in terms of when.
Speaker 4 (38:25):
Haven't we had minor bird flu pandemics? Yes, they were quite...
Speaker 2 (38:29):
Deadly. Well, were they pandemics? Whatever, outbreaks.
Speaker 4 (38:34):
But they were quite, quite deadly. Couldn't you argue more deadly even than COVID, because people were killed not necessarily because of byproducts of the condition, but because of the condition itself? Guys, it's been a minute. I'm sorry. I'm just reaching back into the memory banks. But for some reason, it occurs to me that when we had instances of bird flu, it was real bad.
Speaker 2 (38:57):
I would say when these types of zoonotic diseases jump to humans, with their zoonotic behavior, they are more dangerous than stuff that humans have encountered before, because we have no immunity. There's no herd immunity. That's why the vaccines were so important during the COVID nineteen pandemic. Right? When a bunch of people really quickly get sick with the same thing and nobody has immunity, you can get
(39:20):
everybody sick, and then it just becomes, what is, you know, what is the mortality rate of this thing?
Speaker 4 (39:26):
Right? It's fifty percent is what I'm reading online right.
Speaker 2 (39:30):
Now... perhaps. Of this new one? No, this was in the past.
Speaker 3 (39:35):
Okay, so that would be like maybe the H7N9 in China. Yeah, and what we're talking about actually is the new one.
Speaker 4 (39:45):
That's right. But sorry, the mortality rate of the bird flu, according to Yale Medicine, that we've experienced thus far was fifty percent of the nine hundred people around the world that got
Speaker 2 (39:55):
infected. Well, all the more reason to get Moderna, this huge corporation that's already had a home run with their COVID nineteen vaccines, to create a new one. Right, they're being paid one hundred and seventy six million dollars to, quote, accelerate development of a pandemic influenza vaccine that could be used to treat bird flu in people. That
(40:15):
is from AP News, posted on July second, twenty twenty four.
Why is this a weird thing? Well, that's because there
is still so much skepticism and fear about this specific
vaccine technology out there. If you go on TikTok, Instagram,
any of these places, you will see countless humans saying
(40:37):
this very thing out loud, how dangerous this vaccine technology is,
how it is doing various things, and how you know,
rumors of various world leaders and billionaires who are attempting
to get stuff into all of us through these vaccines.
Much of this, if not most of it, is just hearsay, rumors,
(40:57):
conspiracy theories that are unfounded. The one thing that is
founded is that this is a relatively new form of technology,
a new way to treat you know, diseases via vaccine,
which you know, all of us were kind of the
test subjects in the first round from twenty nineteen onwards.
So it is a bit weird that we're going this
(41:19):
way again. I don't know. I just want to put
that out there that it gives me pause thinking about it.
Speaker 3 (41:24):
Yeah, because, again, it's the... and we're not necessarily co-signing some of the more out-there conspiracies, not at all.
Speaker 4 (41:32):
We yeah, but we.
Speaker 3 (41:33):
Are arguing it's a matter of longitudinal data, right, what happens over time. And the problem is, when stuff comes out without the public being prepped for it, without scooting that perspective and Overton window toward normalization, then it sounds like a new crazy thing, which, kind of, the science
(41:54):
is amazing. It is sort of a new crazy thing, if we're being honest. But because of that, and because of all the other stuff people have heard associated with these scientific concepts like DNA and RNA, for most people, when you hear mRNA, then you're immediately thinking of, what,
(42:14):
like Ancestry dot com, copaganda, cold cases, true crime podcasts, you know what I mean. It's got... it's a weird melange of things, right, all soaked up in there. So maybe, I don't know. It's always right to ask questions, but it is also true that with any new technology
(42:35):
there is potential for unforeseen consequences. I'm trying to be very careful about saying that. We just... we haven't spoken to, no one has come back from the future in linear time and told us, you know, I'm from one hundred and fifty years after you guys, and mRNA has wrecked the planet, or it...
Speaker 2 (42:55):
Was the best man. It really fixed everything.
Speaker 3 (42:58):
Guys. The future is so hi, do you have any spirit?
Speaker 2 (43:04):
MODERNA is the new president this year. It's intense.
Speaker 3 (43:08):
The presidents are all now companies.
Speaker 2 (43:09):
Yeah, yeah, yeah, yeah, yeah, but that's gonna happen no matter what the outcome of this is.
Speaker 3 (43:13):
Citizens United. Oh my god, America Premium.
Speaker 2 (43:17):
Yes, if you're interested in this stuff, look up the Biomedical Advanced Research and Development Authority, or BARDA. That's the program that awarded Moderna this one hundred and seventy six million dollars to accelerate the development of a pandemic influenza vaccine.
All right, tell us what you know about this stuff,
what your thoughts are, and we will be right back
(43:37):
with more strange news.
Speaker 4 (43:45):
And we've returned with one more piece of strange news. Ben, it was oddly prescient that you mentioned, with new technology often comes unforeseen consequences. We've talked a lot about the use of AI, of generative AI, of machine learning, whatever your, you know, expression du jour is to describe
(44:07):
this new technology that is advancing rapidly. The funny thing is, a lot of what we're seeing is a little clownish. We're seeing it used for stuff like making cute anime portraits of ourselves or generating goofy songs or whatever. And we, you know, we certainly have seen interesting, you know, kind of effective, positive uses of it, but a lot
(44:30):
of it has been stuff that gives us pause in terms of, like, trying to automate people's jobs into oblivion, you know, using AI things like ChatGPT, which is still, while impressive, not great. I've been watching YouTube videos over the last few days with a buddy of mine who pointed out that the AI generated subtitles on YouTube, while
(44:50):
often okay when you watch, like, old Monty Python episodes, leave a lot to be desired. They get real jammed up on British slang and, you know, certain colloquialisms and things. So it's actually kind of almost an extra layer of comedy to watch the ridiculous things that the AI subtitles say.
Speaker 3 (45:09):
Anything with an accent, also a lot of spoken word or hip hop. It's kind of, you know what it reminds me of, Nol? It reminds me of being in a different country and finding or watching a foreign film that has, like, the English subtitles. And the pro move,
Speaker 4 (45:30):
If you want to get real weird.
Speaker 3 (45:31):
With it, find a very popular foreign film like Kung
Fu Hustle, get the DVD that already has some subtitles,
and then watch it with another set of subtitles on
top of it, and then you'll see how tricky it
is for humans to translate things.
Speaker 4 (45:47):
Those kinds of colloquial expressions that are very dependent on meter and rhythm of speech and cadence and all of that stuff. But what we're really talking about today, the unintended consequence that I was hinting at, is power consumption. You know, despite a lot of the things that we're seeing AI being used for seeming somewhat frivolous,
(46:08):
we know beneath that it's being developed relentlessly by a lot of giant companies to use for God knows what. That's the part that we don't fully understand yet. And with that comes these, I guess they're chips, called AI accelerators, and also just the processing power that
(46:31):
it takes to run all of the computations, you know, to generate these goofy videos or, you know, whatever anime portraits. And it is kind of insane, the increase in power demand on the grid that we're seeing with these next generation chips, with these AI accelerators, and with the
(46:55):
increased demand on just AI in general. A conversation we were having a handful of years ago around this was people who were using graphics cards and processors, GPUs, to run crypto mining operations, and how much power was being sucked from the grid to do these crypto mining situations, and how people would just stack these, you know, almost
(47:16):
like private server rooms full of these graphics cards and just constantly be running them, and it was generating insane amounts of demand on the grid. Well, think about that, and, you know, multiply it by an order of magnitude. There is a great article on Forbes, AI Power Consumption: Rapidly Becoming Mission Critical, and just to quote the
(47:37):
first paragraph here from Beth Kindig: Big Tech is spending tens of billions quarterly on AI accelerators, which has led to an exponential increase in power consumption. Over the past few months, multiple forecasts and data points reveal soaring data center electricity demand and surging power consumption. The rise of generative AI and surging GPU shipments is causing data
(47:59):
centers to scale from tens of thousands to one hundred thousand plus accelerators, shifting the emphasis to power as a mission critical problem to solve.
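For a rough sense of the scale that quote is describing, here is a back-of-the-envelope sketch. The per-chip wattage and overhead figures are illustrative assumptions, not numbers from the Forbes article:

```python
# Rough, illustrative estimate of one data center's power draw from AI accelerators.
# All figures below are assumptions for this sketch, not numbers from the article.

ACCELERATORS = 100_000          # "one hundred thousand plus accelerators" per the quote
WATTS_PER_ACCELERATOR = 700     # assumed per-chip draw, order of a modern high-end GPU
PUE = 1.3                       # assumed power usage effectiveness (cooling and overhead multiplier)

it_load_mw = ACCELERATORS * WATTS_PER_ACCELERATOR / 1_000_000   # megawatts of IT load
facility_mw = it_load_mw * PUE                                  # total draw at the wall
annual_gwh = facility_mw * 24 * 365 / 1_000                     # gigawatt-hours per year

print(f"IT load:       {it_load_mw:.0f} MW")
print(f"Facility draw: {facility_mw:.0f} MW")
print(f"Annual energy: {annual_gwh:.0f} GWh")
# Roughly 70 MW of chips becomes about 91 MW at the wall, on the order of 0.8 TWh per year.
```

Under those assumptions, a single such site draws about as much power as a small city, which is the sense in which power becomes a "mission critical" problem.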
Speaker 3 (48:10):
Yeah, it makes sense. I mean, I love that you're mentioning the crypto, the crypto-carbon consequence, I want to be alliterative. To my understanding here, one of the issues is that the public overall still doesn't have a full grasp on, like, we know some stats for Google, right,
(48:32):
or Alphabet, I should say, but we don't know the full global consequences of this power demand. In terms... like, I don't know. I think it'd be smart to have AGI immediately figure out improvements in solar technology. That's probably the easiest way around it.
Speaker 4 (48:52):
It's interesting that you would say that, Ben. Another article that I wanted to bring up is by Chuck DeVore, who I believe is with the Texas Public Policy Foundation, and he wrote an op-ed for The Federalist, the title being AI's Insatiable Appetite For Energy Can't Be Satisfied
(49:14):
By Renewables.
Speaker 3 (49:15):
So... oh boy. Well, it's also, I'll say it, it's also The Federalist.
Speaker 2 (49:20):
Yeah, fair enough.
Speaker 4 (49:21):
But the pull quote here is, AI is bringing an unprecedented surge in energy consumption, whether policymakers understand the energy implications or not. And he says, in the realm of artificial intelligence, where data crunching and machine learning algorithms reign supreme, the demand for energy has emerged as a critical concern.
Speaker 3 (49:39):
Mark P.
Speaker 4 (49:40):
Mills, the executive director of the National Center for Energy Analytics at the Texas Public Policy Foundation, which this gentleman Chuck DeVore oversees, argues that the energy requirements for AI systems are far more substantial than most of us know. His insights paint a sobering picture of the energy landscape that awaits us. As AI continues its relentless advance into every
(50:02):
facet of modern life, he says, the global, or actually, according to the International Energy Agency, the global electricity consumption by AI alone could reach one thousand terawatt hours annually by twenty twenty six, slightly more than the total electricity consumption of Japan.
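To put that terawatt-hour figure in more familiar units, a quick conversion; this is just arithmetic on the quoted projection, not an additional claim from the article or the agency:

```python
# Convert the projection quoted above (one thousand terawatt hours per year)
# into average continuous power, just to put it in more familiar units.

ANNUAL_TWH = 1_000            # projected AI electricity use by 2026, per the quote
HOURS_PER_YEAR = 24 * 365     # 8,760 hours

average_gw = ANNUAL_TWH * 1_000 / HOURS_PER_YEAR   # TWh -> GWh, then divide by hours in a year

print(f"Average continuous draw: {average_gw:.0f} GW")
# About 114 GW running around the clock, roughly the output of a hundred large power plants.
```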
Speaker 3 (50:21):
Wow.
Speaker 2 (50:22):
Uh, I was just looking at that Forbes article, Noel. It's laying out all of the wattage used per, they call them accelerators, the GPUs, basically, that are being put out by all these companies, and talking about how some of them have a fifty percent, seventy five percent, three hundred percent increase in power usage
(50:45):
over each generation, and just trying to track that and figure out, like, where... what is that thing called, guys, where you hit the peak? So there's two things in my mind that are going on here: how technology tends to have that acceleration that is exponential, but we keep hitting these, like, blocks where
(51:09):
current technology will only allow us to get to a certain speed or size or whatever. Like Moore's law. That's it, that's it. Yeah, like we're hitting both of those. But as we get into this, like, some of the quantum stuff and all of that, like, I don't
Speaker 4 (51:24):
Know, there's no longer those built in kind of caps, right.
Speaker 2 (51:29):
Yeah, but who knows how far off that actually is,
but it is coming.
Speaker 3 (51:34):
You're absolutely right, and these are just proven, proven trends of human-made technology that we're all aware of, right? They have been proven at this point so far, and it's going to take a real groundbreaking shift to change some of those realities. Quantum, uh, sorry, quantum technology, I
(51:56):
should say, may be one of those game changers. One thing for sure, however, the big thing that is going to stymie progress in this situation is the secrecy, because until we can understand... if you don't know the depth and breadth and specifics of a problem, you have vastly
(52:17):
reduced your ability to solve for that problem.
Speaker 4 (52:20):
And I feel like that's part of why the stuff
that the public is seeing that's being trotted out feels
kind of frivolous, because it's sort of like a fun
new shiny toy that gives us agency to play around
with this new tech in a way that's like entertaining.
You know, Oh, it's no big deal. Look, it's just
like a weird new Instagram filter TikTok or whatever. But
(52:42):
actually there's much more going on beneath the surface, and
much more energy being expended that we don't have a
view into in terms of like the transparency of how
much is this actually affecting our resources?
Speaker 3 (52:56):
Yeah, I think that's a good point, because we also have to realize that this is emergent technology, which means that any numbers people do pull are going to be pretty perishable and perhaps less relevant within the span of just a month, which is nuts to think about.
(53:16):
It's bonkers, but it is true. And also, you know, Pandora's jar has been unscrewed. You can't put AI back into the box. This is not Dune, where they have the Butlerian Jihad against thinking machines. They're on the way. This is overall a good thing if we can figure it out. But my uninformed hope, I'm trying to be optimistic nowadays,
(53:42):
is that, you know, we've seen machine learning models make tremendous improvements in engineering, right, both in the space of weapons but also in the space of civilian tech and really arcane problems that maybe a couple thousand people in the world would even understand. Right? And
(54:07):
this stuff, these spells, that's what I mean. Programming is basically casting a spell.
Speaker 4 (54:14):
Well, we thought, yeah, it's like an incantation, and the best people that understand the language can do the best version of the spell.
Speaker 3 (54:22):
So these algorithms, these things, these creatures conjured through these spells, seem to have a clear advantage on improving concepts of engineering. They're better at that than they are at having a human-like conversation. So maybe, this is my hope, maybe we can, while we're still heading toward that inflection
(54:45):
point like you were talking about, Matt, maybe there's time enough before the car hits the tree or goes over the cliff, Thelma and Louise style. Maybe there is enough time to get some of these machine learning projects onto renewable energy and maybe, hopefully, disprove the statements of
(55:06):
Chuck DeVore, who's also, you know, a politician.
Speaker 4 (55:09):
Fair enough, I would ask you, when have we ever
done that? And it does feel to me a lot
like the folks that are wielding the stuff you know,
at the highest levels are big time putting the cart
before the ecological horse.
Speaker 2 (55:24):
Yeah, guys, I think there's time for all of us
to decide social media is no longer worth it, and
we can just rise above and we will no longer
need memes and AI generated cartoons, and there's a world
where we go outside and we take tree limbs and
we play with them like swords. Again, touch grass.
Speaker 4 (55:47):
You should run for office, Matt. That's a very stirring stump speech there. But only at night.
Speaker 3 (55:53):
Yes, this is why you see the problem with coalition politics, right? Matt pitched the idea. One of us is on board, right? The other one was, I agree, asterisk, we're only doing that at night, right?
Speaker 2 (56:08):
Now, that's fine. Dusk stick swordsmanship, that's what we'll call it.
Speaker 4 (56:14):
That's our whole platform.
Speaker 2 (56:15):
It is.
Speaker 4 (56:16):
Uh, this is something probably to look at in a bigger deep dive, agreed. So I'm gonna keep this one short. That's it, basically, that's the gist of it. And I don't know that it's necessarily that surprising to anybody, but here it is from some, you know, folks that more or less know what they're talking about.
Speaker 3 (56:36):
And we have to wonder, too, what our timeline looks like for that, you know, for that possible inflection point. You brought a statistic that I really want to underline, Nol, which is the idea that these machine learning models and these infrastructures could become countries of their own in
(56:58):
terms of power consumption. And the question then is, you know, we're not the experts, obviously, we don't pretend to be, but we would love to know what you think that timeline might be. Is it going to be in our lifetime? You know, are we going to run into it at that point? Or, I don't know. It feels like things are accelerating. There's a snowball.
Speaker 2 (57:20):
Okay. So, guys, these GPUs, they give off a tremendous amount of heat. That's one of the big problems, right? Right. Do you think there's a way we can design a system that captures the heat from the GPUs to then turn that into power for the GPUs?
Speaker 4 (57:36):
It's interesting, there are certain home appliances, like, you know, heaters and things, that actually harness heat from
Speaker 3 (57:46):
The ground and energy.
Speaker 4 (57:48):
That's exactly right. So I mean, it's certainly possible. And I, for example, have a dehumidifier that runs down in my studio all the time, and it pulls moisture out of the air. Like, you just got to wonder if this is a thing that should be being discussed, like, how do we take a byproduct and turn it into a net gain, in that classic human way? Yeah.
Speaker 2 (58:07):
It would be like a hybrid engine the way they
use the wheel movement to then recharge the battery. You
would use the heat from the GPUs as it's generated
to recharge the power system that's supplying the GPUs. Guys,
we can do this basically.
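For what it's worth, basic thermodynamics puts a fairly hard ceiling on how much of that GPU heat could ever come back as electricity, because the heat is low-grade. A minimal sketch of the Carnot bound, with assumed temperatures and an assumed recovery fraction rather than measured figures:

```python
# Why turning GPU waste heat back into electricity is hard: the heat is low-grade.
# The Carnot limit caps how much of it any heat engine could convert back to work.
# Temperatures and the recovery fraction below are assumptions for illustration only.

T_HOT_C = 70     # assumed coolant/exhaust temperature coming off the GPUs, in Celsius
T_COLD_C = 25    # assumed ambient temperature, in Celsius

t_hot_k = T_HOT_C + 273.15
t_cold_k = T_COLD_C + 273.15

carnot_limit = 1 - t_cold_k / t_hot_k      # theoretical ceiling for heat-to-work conversion
realistic = 0.4 * carnot_limit             # real engines capture well under the ideal limit

print(f"Carnot ceiling:     {carnot_limit:.1%}")
print(f"Plausible recovery: {realistic:.1%}")
# Around 13 percent in theory and only a few percent in practice, which is why most
# schemes reuse the heat directly (for example, district heating) instead of regenerating power.
```

That is the physics behind the objections that come up next in the conversation: direct reuse of the heat tends to beat trying to turn it back into electricity.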
Speaker 3 (58:22):
You just need to figure out how to take something we already have and use that to spin something to create more power. So we could also link it to tidal power, which I'm quite bullish on, because it's free, right?
Speaker 2 (58:38):
Right, What if we had hundreds of thousands of people
locked in rooms with an electric bike of some sort
where they would pedal on the bike.
Speaker 4 (58:50):
And also, like, what you're talking about too would only work in the largest of the arrays in one location, right? And that would also require large-scale buy-in from these giant corporations. And unless they were absolutely, you know, forced to do this, or it was some sort of marketing move because things had already gotten so bad they
(59:12):
needed a win in terms of, like, public opinion, only then would they do it. They're not going to just do it on their own. They're not going to benevolently say, here, take our AI GPU heat and, like, do something, you know, let's figure out how to funnel it. It's the same reason that we can't desalinate the ocean, or, like, it's way too expensive and requires a lot
(59:33):
of energy.
Speaker 3 (59:34):
Right, and there's a lot of... yeah, there's a lot of really fascinating technology that's going on here. And folks, we want to tell you, if you are a physicist and you clearly know why, like, you clearly know why these projects that we're spitballing haven't been done yet, write in and let us know. We want to hear from you.
Speaker 2 (59:52):
Oh god no.
Speaker 3 (59:55):
Also, wait, wait, wait, this is a double-edged sword here: give us the facts and then give us the crazy part. We want to hear you pitch a weird idea, and we want to hear... you know what I mean. Like, it's a big thing for us. We don't like to show up just with problems. We like to bring solutions.
Speaker 4 (01:00:14):
And I'm not suggesting anyone wants to desalinate the ocean.
I'm just talking about another source of.
Speaker 2 (01:00:25):
Salinate.
Speaker 3 (01:00:29):
Yeah, look at the way human civilization is. They've been on that ocean thing for a while.
Speaker 4 (01:00:33):
Now we're going to keep going.
Speaker 3 (01:00:36):
I'm sorry I cursed so much, but folks, we hope you enjoyed this evening's Strange News. Uh, we are going to return with even more explorations. Uh, we want to hear from you: legal experts, non-legal experts, people who are worried about things in general. Let us know. Also, well,
(01:00:57):
a million other things. We'll catch up next time. We try to be easy to find online.
Speaker 4 (01:01:01):
Find us online at the handle Conspiracy Stuff, where we exist on Facebook with our Facebook group Here's Where It Gets Crazy, on X, FKA Twitter, and on YouTube, with video content coming at you on the regular. On Instagram and TikTok, you can find us at the handle Conspiracy Stuff Show.
Speaker 2 (01:01:18):
The heat sinks on the GPU, rather than just taking that heat away and accepting it... exactly. Yes, they accept it and they transfer that heat. I can see it, you guys, I can see it. If you want to call us, call one eight three three STDWYTK. That's our voicemail system. When you call in, you've got three minutes. Say whatever
(01:01:38):
you'd like. Do give yourself a cool nickname. We don't care what it is, and we're excited to hear what you choose. Let us know if we can use that name and your voice on one of our listener mail episodes. And if you've got more to say than can fit in one of those tiny, little three minute maximum voicemails, why not instead send us a good old fashioned email.
Speaker 3 (01:01:57):
We are the entities that read every single email we get. We also received, just now, an interesting piece of correspondence from ChatGPT. Hey, ChatGPT. We asked, how do you think we can fix the problem of AI creating excess carbon emissions? ChatGPT said it's a multifaceted approach,
(01:02:17):
integrating technological, operational, and policy measures: improved energy efficiency in AI models, green data centers, sustainable practices, distributed and edge computing, awareness and collaboration. Got a little TED-talky at the end.
Speaker 4 (01:02:33):
I'm wondering if green data centers are a thing we should look at in the future too. Does that involve some kind of harnessing of the byproduct? I wonder if what you're describing, if that is a thing, you know? It'd be good to know.
Speaker 3 (01:02:45):
So green data centers, this is exactly what I was talking about, powered by renewable energy sources such as solar. Sorry, Chuck DeVore, I respectfully disagree, and we'll also ask some questions about your campaign funding.
Speaker 4 (01:02:59):
So anyway, I don't know about this guy. You seem to know a bit more about this guy than I do. So I tried to at least have two perspectives, but yeah.
Speaker 3 (01:03:08):
Yeah, they're right. Obviously, this is a conversation we're going to pick up again, because we're all heated up about it. I've even turned my screen red, I think. Look, he...
Speaker 4 (01:03:17):
Heated? You're really hot under the collar. Ben, walk with
Speaker 3 (01:03:20):
Us into the dark. Conspiracy at iHeartRadio dot com.
Speaker 2 (01:03:42):
Stuff They Don't Want You to Know is a production
of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.