
July 23, 2025 · 6 mins

On today’s show we are looking at the impact of data centres on the local economies where they are located. We are looking at the energy component and the technology life cycle.

You have no doubt heard the statement that “energy is the economy.” It is true that for every unit of GDP there is an equivalent unit of energy consumed somewhere in the world. 

But in this instance, energy is the lifeblood of the AI industry. In addition to the gold rush in AI, there is a recognition that the world is not generating enough electricity to satisfy the incremental demand from data centres. 

What caught my attention this past week was not the plethora of announcements of new power generating capacity. Although that is impressive, there is something even more important that we will explore. 

Let’s take a look at the aggregate power generation announcements in just the past month alone. Several of the announcements have been in the range of 2 GW. We need to put this in perspective. 2 GW of power generation is enough to power 1.5M homes. This is enough to power a city like Phoenix or San Diego or San Antonio. So when we measure the power consumption of a large data centre complex, we are comparing it to the GDP of an entire city of 1.5M homes, or about 3 million people.
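
As a rough back-of-the-envelope check of that "2 GW ≈ 1.5M homes" comparison, here is a minimal sketch. The assumed average household consumption of about 10,500 kWh per year is an illustrative assumption, not a figure from the episode.

```python
# Back-of-the-envelope check of the "2 GW powers ~1.5M homes" comparison.
# Assumption (not from the episode): an average home uses ~10,500 kWh/year,
# i.e. roughly 1.2 kW of continuous demand.

AVG_HOME_KWH_PER_YEAR = 10_500
HOURS_PER_YEAR = 8_760

avg_home_kw = AVG_HOME_KWH_PER_YEAR / HOURS_PER_YEAR  # ~1.2 kW per home

plant_kw = 2.0 * 1_000_000  # 2 GW expressed in kW
homes_served = plant_kw / avg_home_kw

print(f"2 GW ~ {homes_served / 1e6:.1f} million homes")  # ~1.7 million
```

Under that assumption the arithmetic lands in the same range as the 1.5M-home figure quoted above.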

The primary driver of obsolescence is the relentless pace of Nvidia's (and other manufacturers') innovation cycle. New generations of AI chips are released every 12 to 18 months, bringing significant leaps in computational power, efficiency, and new features.

Meta's studies, for example, have shown significant GPU failures during the training of large models like Llama 3, with an estimated annualized failure rate of around 9%, potentially reaching 27% over three years for H100 GPUs. These chips consume a lot of power (700W for H100s, over 1000W for future chips) and generate intense heat, putting immense stress on the hardware.
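
For context on how a roughly 9% annualized failure rate relates to the 27% three-year figure, here is a minimal sketch; the compounded reading is an illustrative interpretation, not part of Meta's study.

```python
# How a ~9% annualized GPU failure rate relates to the quoted 27% over 3 years.

annual_failure_rate = 0.09
years = 3

# Simple linear reading: 3 x 9% = 27%, the figure cited above.
linear_3yr = annual_failure_rate * years

# Compounded reading: probability a given GPU fails at least once in 3 years.
compounded_3yr = 1 - (1 - annual_failure_rate) ** years

print(f"linear reading:     {linear_3yr:.0%}")      # 27%
print(f"compounded reading: {compounded_3yr:.0%}")  # ~25%
```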

------------

**Real Estate Espresso Podcast:**
 Spotify: [The Real Estate Espresso Podcast](https://open.spotify.com/show/3GvtwRmTq4r3es8cbw8jW0?si=c75ea506a6694ef1)  
 iTunes: [The Real Estate Espresso Podcast](https://podcasts.apple.com/ca/podcast/the-real-estate-espresso-podcast/id1340482613)  
 Website: [www.victorjm.com](http://www.victorjm.com)  
 LinkedIn: [Victor Menasce](http://www.linkedin.com/in/vmenasce)  
 YouTube: [The Real Estate Espresso Podcast](http://www.youtube.com/@victorjmenasce6734)  
 Facebook: [www.facebook.com/realestateespresso](http://www.facebook.com/realestateespresso)  
 Email: [podcast@victorjm.com](mailto:podcast@victorjm.com)  
**Y Street Capital:**
 Website: [www.ystreetcapital.com](http://www.ystreetcapital.com)  
 Facebook: [www.facebook.com/YStreetCapital](https://www.facebook.com/YStreetCapital)  
 Instagram: [@ystreetcapital](http://www.instagram.com/ystreetcapital)  


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome to the Real Estate Espresso Podcast, your morning shot of what's new in the world of real estate investing. I'm your host, Victor Menasce. On today's show, we're looking at the impact of data centers on the local economy where these are located. We're looking at the energy component and the technology life cycle. This will affect employment in the local area. You've no doubt heard the statement that energy is the

(00:21):
economy, and it's true that for every unit of GDP, there's an equivalent unit of energy consumed somewhere in the world. But in this instance, with data centers, energy is the lifeblood of the AI industry. In addition to the gold rush in AI is a recognition that the world is not generating anywhere near enough electricity to satisfy the incremental demand from

(00:43):
these new data centers. What caught my attention this past week was not just the plethora of announcements of new power generating capacity. Although that is impressive, there's something even more important that we're going to explore. Now, let's take a look at the aggregate power generation announcements in just the past month alone. Several of these announcements have been in the range of two gigawatts each. We need to put this in

(01:05):
perspective. Two gigawatts of power is enough to supply a million and a half homes. This is enough power for a city like Phoenix or San Diego or San Antonio. So when we measure the power consumption of a large data center complex, we're comparing it to the GDP of an entire city of a million and a half homes, or about 3 million people. Well, yesterday, Oracle and OpenAI

(01:28):
announced an additional 4.5 gigawatts of data center capacity. Much of that's going to be located in their Stargate facility in Abilene, Texas. Construction of Stargate is already underway, and parts of that facility are already up and running, with racks of Nvidia's GB200 being delivered to the facility last month. Meta announced a month ago a 20-year supply agreement for

(01:51):
electricity in Illinois with Constellation Energy's nuclear power plant. That plant was scheduled to close down, but beginning in June of 2027, the agreement supports the relicensing and continued operations of Constellation's high-performing Clinton nuclear power plant for another two decades after the state's ratepayer-funded zero-emissions credit program expires.

(02:14):
What's shocking is that Meta is going to pay nearly twice the going rate for electricity compared with today's rates for industrial electricity. Constellation has historically been charging around $80 per megawatt hour. There are some unconfirmed reports that Meta has agreed to pay something in the range of $90 per megawatt hour for the 20-year agreement, presumably with escalation clauses.

(02:36):
Now, once the initial capital cost has been made, the number one cost of continuing to operate these things is electricity. $90 per megawatt hour is double the cost of electricity compared with a modern natural gas plant, which runs somewhere between $30 and $40 per megawatt hour.
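
As a rough illustration of what that price gap means at data-center scale, the sketch below assumes a 1 GW continuous load and a $35 per megawatt hour gas benchmark; both are assumptions for illustration, while the ~$90 figure is the reported rate.

```python
# Rough annual electricity bill for a large, continuously loaded facility
# at the two prices discussed above. The 1 GW load and the $35/MWh gas
# benchmark are assumptions for illustration.

load_mw = 1_000                  # assumed continuous load: 1 GW
mwh_per_year = load_mw * 8_760   # ~8.76 million MWh per year

prices = {"nuclear PPA (~$90/MWh)": 90, "natural gas (~$35/MWh)": 35}
for label, price in prices.items():
    annual_cost = mwh_per_year * price
    print(f"{label}: ${annual_cost / 1e9:.2f}B per year")
# nuclear PPA (~$90/MWh): $0.79B per year
# natural gas (~$35/MWh): $0.31B per year
```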
The Stargate AI data center is a massive undertaking.

(02:57):
A joint venture between OpenAI, Oracle, and SoftBank plans to secure its electric supply through a multi-faceted approach. The strategy involves a mix of traditional as well as newer energy sources. Number one: natural gas. A significant portion of Stargate's initial and ongoing power is going to rely on natural gas turbines. These facilities provide stable baseload power, and there's

(03:21):
already availability of very cheap natural gas in the area. Next, they're planning to deploy small modular reactors, or SMRs as they're called, for stable, carbon-free baseload power. These SMRs are a critical component for long-term sustainability. They've got consistent power generation, they're very efficient, and they've got enhanced safety features.

(03:42):
The modular design allows for much faster deployment than the old custom-designed nuclear reactors that we're used to seeing from the 1970s. And then, of course, there will be other green energy sources, including wind and solar. Now, we're seeing extremely high capital expenditures on data centers and, in fact, on the AI hardware that goes inside them. The revenue at NVIDIA, which currently has about 90% market

(04:04):
share, is absolutely eye-watering, and no doubt companies like AMD will eventually chip away at Nvidia's market share. But what's truly shocking is the technology obsolescence curve. If you think about telecom technologies, these technologies often have a 20-year lifespan. But in AI, we're talking about functional obsolescence and physical obsolescence on a much shorter time cycle.

(04:26):
We talked about functional obsolescence. The primary driver is the relentless pace of innovation. New AI chips are released every 12 to 18 months, and those bring significant leaps in performance, power efficiency, and so on. An AI chip that's two to three years old might be considered painfully slow compared with the cutting-edge new products. By five to six years,

(04:47):
a chip can be considered ancient history in this context. So the median lifespan from release to final use for frontier training requirements is about 3.9 years, and the newer NVIDIA chips are ranging anywhere from 2.3 to 4.5 years. Now, the other piece is physical obsolescence. These AI subsystems run hot.
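
A quick sketch of what that 3.9-year median lifespan implies for fleet turnover; the steady-state, replace-at-end-of-life assumption is an illustrative simplification, not a figure from the episode.

```python
# If the median useful life of an AI accelerator is ~3.9 years, a steady-state
# fleet replaced at end of life turns over roughly 1/3.9 of its GPUs per year.
# The steady-state assumption is illustrative, not a figure from the episode.

median_lifespan_years = 3.9
annual_turnover = 1 / median_lifespan_years
print(f"Approximate share of the fleet replaced each year: {annual_turnover:.0%}")  # ~26%
```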

(05:08):
They're running continuously at very high power consumption, so the failure rate is extremely high as well. The physical lifespan, due to the high utilization, is anywhere from one to three years; these chips just wear out. The physical lifespans of these chips can be incredibly short because of the high workloads. Some of the folks that I speak with in the industry are

(05:30):
suggesting that data center GPUs under heavy load might survive anywhere from one to three years before their performance and reliability start to degrade significantly, or they fail outright. Meta has published a study showing significant GPU failures during the training of large language models like Llama 3, with an estimated annual failure rate of around 9%,

(05:54):
potentially reaching 27% over three years for the H100 GPUs. These are the newest GPUs coming out of NVIDIA. These chips consume a lot of power, 700 watts for the H100s, and over a kilowatt, over 1,000 watts, for future chips. They generate intense amounts of heat, and they put immense strain on the hardware.
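
To connect the power figures in this episode, here is a rough sketch of how many accelerators a 2 GW campus could feed. The 2 GW, 700 W, and ~1,000 W figures come from the episode; the PUE of 1.3 and the decision to ignore CPU, networking, and storage draw are assumptions for illustration.

```python
# Roughly how many accelerators could a 2 GW campus feed? The 2 GW, 700 W,
# and ~1,000 W figures come from the episode; the PUE of 1.3 and ignoring
# CPU, networking, and storage draw are assumptions for illustration.

site_power_w = 2e9          # 2 GW campus
pue = 1.3                   # assumed power usage effectiveness (cooling, etc.)
it_power_w = site_power_w / pue

for label, gpu_watts in [("H100-class (700 W)", 700), ("next-gen (~1,000 W)", 1_000)]:
    gpus = it_power_w / gpu_watts
    print(f"{label}: ~{gpus / 1e6:.1f} million GPUs")
# H100-class (700 W): ~2.2 million GPUs
# next-gen (~1,000 W): ~1.5 million GPUs
```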
So we can expect a tremendous amount of regeneration within

(06:18):
the physical data centers. That's going to require a lot of people, a lot of rewiring, a lot of ongoing maintenance and labor just to keep those data centers up and running. When we think of traditional data centers, we think about buildings that are full of equipment, lots of air conditioning, lots of fans running, but not very many people. That's going to be quite different in AI data centers, because the lifespan of these

(06:40):
products is so much shorter and the maintenance requirements are
going to be much, much higher. This represents a significant
business opportunity for those that are located close to those
brand new data centers. As you think about that, have an
awesome rest of your day. Go make some great things
happen. We'll talk to you again about.