
May 14, 2025 • 28 mins

The rise of AI data centers is reshaping the outlook for US power markets. Forecast to account for nearly a tenth of all US electricity demand by 2035, data centers are gobbling up power more quickly than electric vehicles, hydrogen or any other demand class this decade. A profound concentration of capital has allowed for this rapid expansion, which is now exerting influence over energy infrastructure planning and investment. But what forms do data centers take, and what are the factors and strategies that influence associated decision making? On today’s show, Tom Rowlands-Rees is joined by BloombergNEF’s Head of US Power, Helen Kou, and Senior Associate Nathalie Limandibhratha to discuss their recent note “US Data Center Market Outlook: The Age of AI”.

Complementary BNEF research on the trends driving the transition to a lower-carbon economy can be found at BNEF<GO> on the Bloomberg Terminal or on bnef.com

Links to research notes from this episode:

US Data Center Market Outlook: The Age of AI - https://www.bnef.com/insights/36281

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Tom Rowlands-Rees and you're listening to Switched
On, the podcast brought to you by BNEF. The rapid
rise of energy intensive AI data centers is reshaping the
near term outlook for US power markets. Outpacing the energy
demand growth of EVs, hydrogen and all other demand classes
out to twenty thirty, data centers will account for eight
point six percent of US electricity demand by twenty thirty five.

(00:23):
That's almost twice as much as today. Largely owned and
operated by a few highly consolidated companies with very deep pockets,
this concentration of capital allows for rapid expansion and a
significant influence over future energy infrastructure investment. So what strategies
are these companies employing to optimize their data center rollout?
On today's show, I'm joined by BNF's head of US

(00:43):
Power, Helen Kou, and US Power Senior Associate Nathalie Limandibhratha,
and together we discuss findings from their note US Data
Center Market Outlook: The Age of AI, which BNEF
clients can find at BNEF<GO> on the Bloomberg
Terminal or on bnef.com. All right, let's get
to talking about the outlook for AI data centers with Helen
and Natalie. Helen, thank you for being here. Thanks Tom

(01:16):
and Natalie thanks for being here as well.

Speaker 2 (01:18):
Thank you Tom.

Speaker 1 (01:19):
So Natalie reports up to Helen, and Helen reports up
to me, and I'm not saying that to flex. I'm
saying it for a couple of reasons. One is, I'm
like super proud to have such smart people on my team,
and I'm also particularly proud of the work they've done
around data centers. But also this situation of this reporting

(01:39):
line means that I get to have catch ups with
them regularly, which has been pretty useful to me personally
because this question around AI and data centers has had
a lot of people talking, a lot of people have opinions,
and so I have often found myself in situations where
people are expressing their opinions, and in those situations, I've
developed this tactic to differentiate myself from the pack, which

(02:01):
is that in the situation, I just regurgitate whatever Helen
and Natalie last told me about data centers, and everyone
thinks that I'm really smart. So first off, let's start
just with the headline numbers. How much data center build
are we expecting in the US, according to the report
that we just published?

Speaker 2 (02:17):
BNEF's latest outlook has data center demand more than
doubling from thirty five gigawatts today to close to eighty
gigawatts in twenty thirty five. This would account for close
to nine percent of total US electricity demand.

Speaker 1 (02:32):
Wow, so we're expecting, let me just do the math,
something like forty five gigawatts ish, which for those of
you who are you know, maybe new to the energy space,
that's like twenty to thirty nuclear plants, and nuclear plants
are really really big and take a long time to build.
That's a lot of demand. So we are forecasting some
astronomical amount of data center build. How do we compare

(02:53):
to everyone else that has an opinion on this topic?
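A quick back-of-envelope sketch of Tom's arithmetic here; note the nuclear plant size range is an illustrative assumption, not a figure from the episode:

```python
# Back-of-envelope: how many nuclear plants' worth of new demand?
# Figures from the episode: ~35 GW of data center demand today,
# ~80 GW by 2035. The per-plant size range (~1.5-2.2 GW) is an
# assumption for a large US nuclear station, not from the report.
today_gw = 35
in_2035_gw = 80
growth_gw = in_2035_gw - today_gw  # 45 GW of new demand

plant_small_gw, plant_large_gw = 1.5, 2.2
plants_high = growth_gw / plant_small_gw  # ~30 plants
plants_low = growth_gw / plant_large_gw   # ~20 plants
print(f"~{growth_gw} GW of growth, roughly {plants_low:.0f}-{plants_high:.0f} nuclear plants")
```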

Speaker 3 (02:57):
We're relatively conservative. Yes, yeah, our overall
demand build is fairly low in terms of uptake relative
to other third parties.

Speaker 1 (03:09):
Okay, so how come they are forecasting something so much
more aggressive than us, or how come we are so
much more conservative than them?

Speaker 3 (03:18):
Well, we don't really know what our third party counterparts
do in terms of their forecast, but what we do
know is like how we forecast data centers, and our
focus was really to look at like how data centers
move from one stage to the next. So in our
project database, what we know is that we can
see data center stages. So we have like early stage,

(03:39):
which is basically anything that kind of just got announced.
We have projects that are committed, which is anything that
has some type of like land or permitting agreement, things
that are under construction and then live. And what we
did was we tracked how these data centers moved from
one stage to the next, and we looked at like
the probability of how these data centers moved.
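The stage-by-stage pipeline Nathalie describes can be sketched as a simple probability chain. This is an illustrative sketch only: the stage names follow the episode, but the transition probabilities are invented placeholders, not BNEF's actual model or numbers:

```python
# Illustrative sketch (not BNEF's actual methodology): discount a
# project's capacity by the chance it advances through each stage
# to "live". All probabilities below are made-up placeholders.
stages = ["early", "committed", "construction", "live"]
transition_prob = {           # P(advance to the next stage)
    "early": 0.4,             # announced -> committed
    "committed": 0.7,         # committed -> under construction
    "construction": 0.9,      # construction -> live
}

def expected_live_gw(capacity_gw: float, current_stage: str) -> float:
    """Expected capacity (GW) that eventually reaches the 'live' stage."""
    p = 1.0
    for stage in stages[stages.index(current_stage):-1]:
        p *= transition_prob[stage]
    return capacity_gw * p

# A 1 GW early-stage announcement counts far less than 1 GW under construction.
print(round(expected_live_gw(1.0, "early"), 3))         # 0.4*0.7*0.9 ≈ 0.252
print(round(expected_live_gw(1.0, "construction"), 3))  # 0.9
```

Summing this expectation over every project in a database gives a probability-weighted build forecast rather than a raw pipeline total.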

Speaker 1 (03:59):
From one stage to the next. So typically, how long
does it take for a data center, you know, from
early on in this pipeline to being commissioned, how long
would that take?

Speaker 3 (04:10):
Yeah, So what we've found based on data between twenty
twenty and twenty twenty four is that it takes seven
years to build a data center, which is a really,
really long time.

Speaker 1 (04:19):
Okay, So that's really interesting. So in a way, our
forecast we're saying with a fair amount of confidence because
you know, twenty thirty five is only just over a
decade away, and we have data on everything that's getting built,
and we know that it takes most of that decade
for it to get built. So anyone who's forecasting something
more aggressive than us either has a different view on
how long it takes to build data centers, or they

(04:41):
must have different data, or maybe they're using a completely
different methodology. But it's good to know that we're the
ones doing it right. So who's building all of these
data centers this sort of colossal volume of new demand.

Speaker 2 (04:54):
Yeah, so the data center market is pretty concentrated. There's
two main types of owners. There's colo data centers who
have buildings with multiple tenants, and you have these self
build companies, which are typically your large tech companies or
hyperscalers is what they're usually referred to. And the hyperscalers

(05:14):
of Google, Amazon, Microsoft, and Meta are close to fifty
percent of total operating capacity today and this is only
set to grow. They're building much larger campuses close to
gigawatt scale. Amazon has multiple gigawatt data center campuses in
development in Virginia. Meta has another two gigawatts in Louisiana,

(05:36):
and just as a point of reference, in the last decade,
data centers have typically been in the tens of megawatts.
So really, as we're pushing through these hundreds of megawatts
and gigawatt size, the rise of uptake will be much
faster and larger.

Speaker 1 (05:52):
Okay, And so just just to make sure I've understood
overall correctly, those four companies are fifty percent of the
data center build today, but we think that there's going
to be even more because they're the companies that are
building these really big data centers that are so much
of what we're expecting. So you've already alluded to these
data centers are maybe bigger than the ones we've seen

(06:12):
in the past, that's the trend. But in what other
ways are the data centers that we're expecting different to
the ones that we've seen in the past.

Speaker 3 (06:20):
Yeah, before we jump into that, I think it's important
to understand how BNEF categorizes data centers.
So we categorize data centers in three different ways. First, size,
which Natalie had alluded to: retail, wholesale and hyperscale.
So that's just based on the project size of a
data center. And then there's operator type, which is the
ownership so either self build or co location, which Natalie

(06:45):
had already explained. And then there's workload, So workload is
based on just the computing process of a data center,
and there are many different types of workload from cloud
or enterprise, telecom, crypto mining, and that determines a lot
about the data center's overall infrastructure and then also their

(07:07):
overall load and power consumption.

Speaker 1 (07:09):
So then, I mean one of the other things that
you've spoken to me about is when we're talking about
AI data centers, that there's two main flavors, So can
you just talk me through those as well.

Speaker 2 (07:19):
Within AI, we mainly branch it out in two main workloads,
AI training and AI inference. AI training is processing a
large amount of data in order to train these large
language models, and AI inference is taking those already trained
models for real time applications in use, like when you're
querying ChatGPT. And those two workloads also have

(07:40):
different location constraints for the data center itself. AI inference,
since they're theoretically interacting with the end user in real time,
they'll care more about latency and location parameters to that
end user. AI training a lot of that processing happens
on site, so theoretically they could be more flexible on

(08:01):
where they locate, and they could follow where there's available
power or other constraints.

Speaker 1 (08:06):
Okay, I mean you use the word theoretically there, which
maybe is doing a lot of work. And we'll come
back to where people are actually building data centers later on,
because I do have a question about that. But roughly,
do we have an idea of what proportion
of the data center building the pipeline is for AI
and what proportion of it is training and what proportion
of it is inference.

Speaker 2 (08:27):
That's actually very tough to ascertain exactly what the split is.
If we look at a large Gigawat campus, some of
their buildings could be used for training today, but it
could be used for inference in the future. Similarly, we
talked about different owner types a co location building, they
could have, you know, multiple tenants, and unless you know
exactly who the tenant is and what type of workloads

(08:50):
they're running, it's also difficult to know if it is
for AI training or inference. But we can infer based
on where they're getting built.

Speaker 1 (08:57):
But these distinctions, I mean, what I'm hearing here is
that it's not just that nobody tells us which it
is that makes it a little bit of a gray area.
It's that actually even a data center itself might sometimes
at one point in its life be doing one thing
and at a different point in its life be doing
something else.

Speaker 3 (09:13):
I think for power folks, I often try to like
frame it like a data center is very similar to
a battery. Like batteries can do multiple different types of
energy services or ancillary services, a data center can do
multiple different types of workload. It can be doing AI
training or cloud, as long as the configuration is correct
for it to do those types of workload.

Speaker 1 (09:34):
Got it. So just because you've optimized one to one
thing doesn't mean that it can't do the other thing.

Speaker 3 (09:40):
Yeah, And particularly in colocation data centers where basically these
companies are renting out their IT servers to tenants, those
tenants are probably doing different types of workload. There can
be multiple applications in a data center.

Speaker 2 (09:54):
Yeah, any workload, which is often why you see in
colocation that they're optimizing for everything because they don't know
who their tenant is. I will add, though, in terms
of AI data centers that they are quite different from
data centers today. We already talked that these data centers
are getting larger. A lot of that has to do
with the GPUs in those servers being much more so.

Speaker 1 (10:15):
Can you just, what are GPUs?

Speaker 2 (10:17):
Graphics processing units. They're similar to CPUs, central
processing units, but they're very specialized, and they're cooler.

Speaker 1 (10:28):
I think that we've got to the level of understanding
I'm good with. So it's like, it's like a chip,
but it's a different kind of chip.

Speaker 2 (10:34):
Yeah, it's very good at parallel processing, which is optimized
for training large language models. I'm sure lots of people
have heard of Nvidia and their GPUs. A lot of
tech companies are also building their own AI accelerators, which are
specialized chips for these models. So a lot of the
design of the data center today in order to accommodate

(10:54):
AI training workloads are different. A lot of that has
to do with these GPUs and the rack that they're
on are going to be much more power dense, which
means they need a lot more sophisticated cooling technologies to
accommodate those sorts of rack densities that could be ten times
what typical data centers are today. We've seen, for example,
Meta scrapping data centers that were not AI-ready, so

(11:17):
it's quite difficult to retrofit a data center from five
years ago to an AI workload.

Speaker 1 (11:22):
So while a.

Speaker 2 (11:23):
Data center in the future could have multiple uses, it's
also very specialized to what they're going to be running.

Speaker 1 (11:29):
Got it. Is it, would it be right for me
to say that, like, a data center designed for AI
can be used for other things too, but a data
center not designed for AI probably can't do AI? Is
that a fair statement?

Speaker 3 (11:42):
I think that's a pretty fair statement. We're basically seeing
new data centers being designed in a way that allows
for AI training, and so there's an influence of like
AI and data center design to be larger and more
power dense.

Speaker 1 (11:56):
So these training data centers are building these models.
And one of the charts in the note that I
really loved but haven't quite fully digested is one showing
the amount of, and you don't have to explain this
unit, the amount of teraflops required to. A teraflop, that's just
the unit of, like, computer work, isn't it?

Speaker 2 (12:17):
Yeah, it's just a basic unit of computation.

Speaker 1 (12:20):
Yeah, the amount of teraflops needed to design different
AI models as they've become more and more sophisticated,
and so obviously there's been an expectation that trend is
going to continue, and everyone was, you know, freaking out
in both good ways and bad ways about all the
power demand that this will entail. And then I remember
DeepSeek came along and a lot of the people

(12:41):
were like, oh, this changes everything. Can you just provide
a bit of clarity on all of this.

Speaker 3 (12:46):
There was a couple of things I really tried to
understand about all of this, just in terms of like
power market fundamentals. I think when it comes to like
forecasting for power demand, all things come down to some
very basic constructs. It's usually like how much quantity of
something and then the energy intensity of that something. And

(13:08):
for data centers it's a very similar process. And when
we think about like the energy efficiency of large language
model AI training data centers, there's like a couple things
to think through. First is around the energy intensity of
power consumption from a chips perspective, and chip innovation is
often like confused with like increasing energy efficiency. That's not

(13:34):
necessarily the case. Typically when we think about chip innovation,
it's often optimized for like operations per second, which means
that, like, typically every generation of chips tends
to draw more power and therefore, like, it increases
the energy intensity of a data center. So there are

(13:54):
new chips that are getting invented, like the Nvidia Blackwell
that focuses on energy efficiency. But in general what we've
seen is like the more advanced chips tend to draw
on more power. The other thing that is really important
for AI training is then like the number of parameters
that an AI training model focuses on. So parameters are

(14:15):
just like points of active information for a model to
think through.
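The fundamentals construct Nathalie describes, demand as some quantity of activity times the energy intensity of that activity, can be sketched like so; all numbers below are illustrative assumptions, not BNEF figures:

```python
# Power-demand fundamentals: demand = quantity x energy intensity.
# The unit counts and kWh-per-unit value are invented placeholders.
def annual_demand_twh(activity_units: float, kwh_per_unit: float) -> float:
    """Demand (TWh) = quantity of compute/data activity x energy intensity."""
    return activity_units * kwh_per_unit / 1e9  # kWh -> TWh

# e.g. 1e12 hypothetical 'units' of training compute at 0.05 kWh each
print(round(annual_demand_twh(1e12, 0.05), 3))  # ≈ 50.0 TWh
```

The same construct drives both directions of the forecast: more activity pushes demand up, falling energy intensity pushes it down.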

Speaker 1 (14:21):
So it's just the number of different things it thinks
about sort of like so if I was thinking, like
should I go to work today or should I walk
to work today? Like, if I had one parameter, it
might be what's the weather like outside? And if I had
two parameters, it might be what's the weather like outside and
what day of the week is it? Yeah, you might
determine whether or not I go to work. So that's

(14:42):
like what each of those is a parameter.

Speaker 3 (14:44):
Exactly, exactly precisely and in general, like in the large
language model community historically, or at least in the power
industry historically, what people had thought through was that like
more sophisticated models required more parameters. So with every generation
of, like, ChatGPT or Gemini or Claude. If you look

(15:05):
at their like technical reports, what you'll see is that
there's increasing amount of parameters that is used to like
train these large language models, and so in general, like
there was this overall consensus that the energy intensity of
AI training is an upward trend. You use more powerful chips,
and you're using your training on more and more parameters,

(15:26):
and so that's all driving more and more electricity consumption.
DeepSeek came out in December twenty twenty four, and
in their technical report, what was really cool was that
they had a different training process. So it uses something
called a mixture of experts training process, where instead of
just like plugging in all of their parameters all at once,

(15:49):
what they did was they pre categorized their parameters into experts.
So like certain parameters that specialize in math or like colors,
they would, like, pre-categorize them so that when you
put in a query of a question to ask the
model to train, it would know which specialization to pull
the parameters from, which then drew on less power. So

(16:12):
the other thing that the technical report published was that
DeepSeek performed at a very high level compared with ChatGPT
or, like, other types of large language models that
used a lot more parameters, and so it kind of
broke the assumption that you needed a whole bunch of
parameters to make really sophisticated large language models.
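The mixture-of-experts idea can be made concrete with a toy calculation: only a few "expert" parameter groups are active per query, instead of every parameter in a dense model. The expert counts and parameter sizes below are invented for illustration, not DeepSeek's actual architecture:

```python
# Toy mixture-of-experts arithmetic (illustrative numbers only):
# with top-k routing, a query exercises k of n experts rather than
# the whole parameter set of a dense model.
def active_params(total_expert_params: float, n_experts: int,
                  experts_per_query: int, shared_params: float = 0.0) -> float:
    """Parameters actually exercised for one query under top-k routing."""
    return shared_params + total_expert_params * experts_per_query / n_experts

dense = active_params(600e9, n_experts=1, experts_per_query=1)  # dense: all 600B active
moe = active_params(600e9, n_experts=64, experts_per_query=2)   # top-2 of 64 experts
print(f"MoE activates {moe / dense:.1%} of the dense model's parameters per query")
```

Fewer active parameters per query means fewer operations, which is the mechanism behind the lower power draw discussed here.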

Speaker 1 (16:34):
And so do you think that that will massively change
the outlook for power demand from AI?

Speaker 3 (16:40):
So to answer your question from like a data center
demand perspective, not necessarily in our near term forecast
in our data center outlook, we use like a project
by project level forecasts, right. But in our new energy outlook,
which focused more on like long term forecasts for data
center demand, it focused on that fundamentals based way of forecasting,

(17:02):
which looks at, long term in any given, like, market,
what data generation and data usage looks like relative to
the energy intensity of that data generation.

Speaker 2 (17:14):
Right.

Speaker 1 (17:15):
And there's a point you make in the report I
recall, and I don't think you were talking about DeepSeek
here. I think you were talking about data center efficiency.
But you bring up Jevons paradox, which is this idea that
if you make something more efficient, it doesn't mean necessarily
that we save energy, it's that we just do more
with what we were going to consume anyway, And could
the same logic be applied for deep seek is if

(17:35):
it is more efficient with parameters and therefore energy and
also compute usage, then that just opens the door to
do cooler things with AI than would have previously been possible,
rather than to save energy.

Speaker 3 (17:46):
Yeah, the energy intensity curve of AI training data centers
like instead of it being like an upward swing, it
goes down. But we also then know it opens up
a lot more opportunities for a lot of different types
of business to maybe do AI training right, which means
that you have more companies that may be doing this.

(18:07):
So, Jevons paradox.
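Jevons paradox in toy numbers, to make the mechanism explicit; all figures below are invented for illustration:

```python
# Jevons paradox, numerically: a big efficiency gain can still mean
# more total energy if usage grows even faster. Invented numbers.
energy_per_query_before = 1.0   # arbitrary energy units per query
queries_before = 100

energy_per_query_after = 0.1    # 10x more efficient...
queries_after = 2000            # ...but 20x more usage as AI gets cheaper

before = energy_per_query_before * queries_before  # 100 units
after = energy_per_query_after * queries_after     # 200 units
print(f"Total energy went from {before:.0f} to {after:.0f} despite the 10x efficiency gain")
```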

Speaker 1 (18:09):
Very interesting and actually I think that brings some real
clarity into what this whole DeepSeek thing means for
power demand, which my main takeaway is like not that much. Ultimately,
it's very difficult to say, but we shouldn't be saying, oh,
this means all this data center demand growth isn't going
to happen. Yes, So pulling out again, there's all of
this data center build that's going to happen in the US,

(18:32):
four companies are going to be behind a little bit
more than half of it. Where are they going to
be building all of this? And why are they going
to be building in those places?

Speaker 2 (18:41):
Yeah? I'll also add on those four companies that are
most of the data center market, they're also the companies
that, you know, are building these AI training models and
have the capacity to train large scale models because it's
a very costly exercise that is only set to grow.
They're also forty percent of the data center market, but they're

(19:02):
also the ones training AI models because actually, not many
companies can train AI models, right. Right.

Speaker 1 (19:07):
So a lot of the new big demand is coming
from these four companies. So a lot of the demand
that we're talking about.

Speaker 2 (19:14):
Yeah, I guess today when we look at the data
center fleet, they're mostly for cloud. But going forward, the
companies that can actually train AI models, it's less than
ten companies training, like, frontier models, and those are
going to be the big tech companies.

Speaker 1 (19:28):
And so then where is this happening and why in
those locations.

Speaker 2 (19:33):
So in our forecast we see three main markets emerge
through twenty thirty five. We break them down by power
region for power forecasting purposes.

Speaker 1 (19:42):
And that's just because that's how we think. I don't
see states, I see power regions.

Speaker 2 (19:47):
Yeah, so we see PJM, ERCOT and the Southeast, and
within PJM, which spans fourteen states, we see Virginia continuing
to be one of the biggest markets. Northern Virginia has been
the data center capital for the last decade. If Virginia
was a country, it would follow the US and China

(20:09):
as having the largest data center market.

Speaker 4 (20:11):
In the world.

Speaker 1 (20:12):
Wow. Can I say that again? Wow?

Speaker 3 (20:14):
Yeah.

Speaker 2 (20:15):
So Northern Virginia has been kind of at the center
of data center build out. A lot of this has
to do with a bit of history. They had the
first Internet exchange point in the nineties, which was kind
of the beginning of small data center build out, and
data centers typically continue to cluster in existing markets, so

(20:36):
as data centers grow, they'll have supporting infrastructure like fiber optics,
utility relationships, and workforce availability that allows more data centers
to continue to grow. So over time, Northern Virginia just
has gotten hotter and hotter, and we see in our
project Pipeline a lot of that continuing to grow. We

(20:57):
do know that a lot of this could be more
AI inference rather than AI training, but still a lot
of data center demand happening. There. Another state in PJM
is Ohio, which is emerging as one of the main
hubs in the Midwest. Google is building data centers in
New Albany, which is just outside Columbus, one of the

(21:19):
largest cities in Ohio, as well as other colocation
and hyperscaler companies.

Speaker 1 (21:24):
You know, you've highlighted a couple of major markets
within PJM. One thing I thought was really interesting
and maybe slightly paradoxical in the report you wrote
was there's this chart that has a survey of
data center developers saying what do you prioritize when you're
thinking about where to build a data center? And I
think we said that the top three survey results, it
wasn't our survey, it was a third party's. The top

(21:45):
three survey results all related to energy. It was something
like security of supply, how cheap the energy is, how
green the energy is. I can't remember, don't quote me
on them, but it was energy related. When you look
at the data of where they're currently building and have been, well,
it kind of completely contradicts that thesis. You know, PJM
doesn't have the cheapest or cleanest electricity in the US.

(22:07):
I mean you could say it has good security of supply,
So what is behind this paradox?

Speaker 2 (22:12):
I think a lot of the hype right now
in data center built out is AI related, and we
did talk about how AI training could be more flexible.
And we do see a lot of emerging innovations
of siting near stranded renewable assets and going against the
grain of traditional siting. But most, as we said, most

(22:35):
of them are continuing to build out in existing data
center markets, so markets pre-AI. One theory is basically,
they're investing billions of dollars in a data center for
the next decade or so. While in the near term
they can plan for AI training. In the future, it
could be AI inference, or they could you know, retrofit
and sell it to a co location company altogether, and

(22:57):
they need to plan for those latency and redundancy requirements today.
So even though in the near term they could site
it in you know, West Texas where there's a lot
of renewables, we still see like Dallas Fort Worth and
like northern Virginia as being hotspots, although we are seeing
a trend of going a bit outside of urban locations

(23:18):
and going more to where there is power supply.

Speaker 1 (23:21):
Got it. I remember earlier in the podcast you said
in theory about you know where you could build training
data centers, and this is, this is what you're
saying is in practice, it's like this, but we are
seeing a bit of a trend, but just maybe not
as much as you would think to the sort of
energy ideal locations.

Speaker 4 (23:38):
Yeah.

Speaker 3 (23:38):
I guess one thing that we did also notice, like
to your point on that little contradiction, is that hyperscalers
do take a dual strategy. They're both building in existing
regions and also trialing in new locations. If you look
in our report, you're seeing that the large four companies

(23:59):
they're building within their own data center clusters and
also like looking for new markets at the same time.
And they're doing that because for them, speed and scale
of development is really critical, particularly because like their AI
business requires them to kind of take a like a.

Speaker 1 (24:18):
Winner takes all? It's like there's a computing power
arms race happening right now, and so it's all very
well saying, oh, we'd ideally build it here, but it's
like we just need this right now, and we're going
to do what's tried.

Speaker 4 (24:29):
Yeah.

Speaker 3 (24:30):
Like, I guess it's like a winner takes all game
in the AI business. So they want to build data
centers as quickly as possible, So they're taking all options.

Speaker 1 (24:39):
They want their particular AI model to be the
Coca Cola of AI. So what does all this mean
for the power sector? I mean, how is this going
to affect the supply mix just keeping up with all
this demand? How's it going to affect the regulatory model?
You know, it's not designed for just suddenly having loads
of new demand dropped on it in very concentrated regions,

(24:59):
as I've just learned. And then, you know, what
innovations are we seeing to try and cope with all
of this new demand in the power sector.

Speaker 3 (25:06):
Yeah, So what we're currently seeing is a very strong
reaction towards all of this new data center demand, particularly
from utilities. So in Ohio, where I guess there's a
lot of new investment in data centers, what we're seeing
is AEP Ohio, which is the utility in the region.
They've proposed a new tariff which in that tariff they've

(25:29):
required or requested that new data centers pay for up to
eighty five percent of their projected energy usage, which makes
it less attractive of a market for these data centers
to want to build in that region. In a way,
it's to prevent increasing retail rates from a utilities perspective,
but there's been a lot of pushback on the proposal.

Speaker 1 (25:50):
Yeah, that's I mean, because that's controversial because the entire
basis of the regulated monopoly is providing equal access to
all consumers, even if the consumers being discriminated
against are massive corporations, it still does undermine the philosophical basis.
So I'm kind of interested to see what's going to
happen there. What do you think's going to mean for

(26:11):
the supply mix, you know, wind, solar, gas, what's going
to do it?

Speaker 2 (26:16):
So I think one of the emerging innovations in moving
from you know, this hyperload growth environment and there's not
much supply available is this trend of colocation or having
on site generation. A lot of traditional grid planning was
for that peak power, and I think we saw this
a couple of years ago, and you know a lot

(26:36):
of Texas and how they integrated crypto mining, having
these as flexible loads and curtailing during hours of peak demand.
We know that non crypto workloads like AI training or
inference may not necessarily curtail, but we do see one
model in which they'll have some sort of on site

(26:56):
generation or colocation supply where they could as a whole
campus interact with grid needs in terms of the total
supply mix. A lot of the short term needs means
that they'll build whatever technology is fastest, and in our
data most of that is wind, solar and batteries, but reliability

(27:17):
is also a huge part of data centers. They're known
for having five nines, which is ninety nine point nine
nine nine percent uptime, which means that you know, firm
capacity like natural gas generation or diesel gensets that
they've used traditionally will also be a large part of
the solution in the short term.
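The "five nines" figure Helen mentions translates to surprisingly little allowed downtime; a quick calculation:

```python
# "Five nines" reliability: 99.999% uptime. How much downtime does
# that actually allow per year?
uptime = 0.99999
minutes_per_year = 365.25 * 24 * 60            # ~525,960 minutes
downtime_minutes = (1 - uptime) * minutes_per_year
print(f"Allowed downtime: about {downtime_minutes:.1f} minutes per year")
```

A budget of roughly five minutes a year is why data centers lean on firm, on-site backup rather than variable supply alone.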

Speaker 1 (27:36):
So it's a real bit of an open question there.
It's how quickly things can get built and what
truly is the fastest solution versus the long term needs. Helen,
thank you very much for joining us today.

Speaker 4 (27:47):
Thank you, Tom. And Natalie, thank you so much for
joining. Thanks, Tom.

Speaker 1 (28:00):
Today's episode of Switched On was produced by Cam Gray
with production assistance from Kamala Shelling.

Speaker 4 (28:05):
BloombergNEF is a service provided by Bloomberg Finance LP
and its affiliates.

Speaker 1 (28:09):
This recording does not constitute, nor should it be construed
as investment advice, investment recommendations, or a recommendation as
to an investment or other strategy. BloombergNEF should not
be considered as information sufficient upon which to base an
investment decision. Neither Bloomberg Finance LP nor any of its
affiliates makes any representation or warranty as to the accuracy
or completeness of the information contained in this recording, and

(28:32):
any liability as a result of this recording is expressly disclaimed.
