
November 8, 2024 · 13 mins

In this eye-opening episode of the AiSultana podcast, we delve into the transformative power of nuclear energy, particularly Small Modular Reactors (SMRs), as tech giants like Google, Amazon, and Microsoft invest heavily to fuel their AI data centers.

As global energy demands surge with the growth of AI, companies are navigating the fine line between innovation and environmental responsibility.

Join us as we unpack the economic and community impacts of SMR development, the daunting regulatory challenges, and the call for "digital sobriety" – a mindful approach to managing AI’s energy consumption.

From job creation to cutting-edge safety features, discover how SMRs are shaping a sustainable future for AI and beyond, while balancing the ethics of technology's energy footprint.

Brought to you by AiSultana, a consultancy specializing in AI solutions for industry.

Join us daily for concise updates on crucial developments in AI, and why they matter to you.

Available via YouTube, Apple, and Spotify.

Don't forget to like and subscribe, and explore our free wine consumer app at www.aisultana.com.

Tune in to stay informed about the pivotal topics shaping the future of AI in industry.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Okay, so imagine like all the energy a whole country needs,

(00:04):
you know, like the Netherlands or Sweden.
Mm-hmm.
Well, get this, by 2027, AI alone
could be using that much electricity.
Wow.
Yeah, so it's a bit of a puzzle, right?
We talk about AI being this climate hero.
Yeah.
But at the same time,
it might be like the start of a whole new energy crisis.

(00:24):
Yeah, it really is a fascinating paradox,
especially when you think about how big tech companies,
the ones who've been all about renewable energy, you know,
they're now looking at something completely different
to keep up with AI's, you know, hunger for energy.
And that something is nuclear power.
Mm-hmm.
It's a twist I don't think anybody saw coming.

(00:45):
That's pretty wild.
Google's partnering up with this company called
Kairos Power to get energy from these small modular reactors.
Yeah.
SMRs, they call them.
SMRs, yeah.
And their goal is to have them up and running by 2030.
Okay.
And it's not just them.
Amazon is investing in SMRs too over in Washington state.
And then there's Microsoft.
They're actually reviving a unit
at the old Three Mile Island plant.

(01:07):
Wow.
Yeah, talk about a comeback story.
Really?
But to really understand this shift,
you have to think about what's causing this huge need for energy.
Yeah.
And it's these super complex AI models.
Okay.
Specifically, large language models, LLMs.
Right, right.
They're the brains behind things like chatbots,
which you've probably used recently.
Oh yeah, absolutely.

(01:29):
I feel like I'm talking to a chatbot every other day now.
Exactly.
Ordering food, getting customer service.
Yeah.
Even just messing around online.
Exactly.
And every single one of those interactions,
there's an LLM somewhere crunching data.
Right.
And using up energy.
So they're really power hungry, these LLMs.
These models are incredibly powerful.

(01:49):
Yeah.
But yeah, that power comes at a cost.
So that's why big tech is looking at nuclear power,
specifically these SMRs, as the solution to this energy problem.
Well, in a way, yes.
They see SMRs as a possible bridge solution.
So the idea is they can give reliable carbon-free power,

(02:10):
that always-on energy supply we need.
Right.
And they might actually be quicker to get up and running
than big renewable energy projects.
Like what kind of projects?
Like massive solar farms or wind farms.
OK, so that's the why behind this sort of nuclear comeback.
But the big question is, is it really the right move?

(02:33):
I mean, nuclear energy has always been kind of controversial.
Oh, definitely.
There are strong arguments both for and against it.
I mean, on the one hand, you have those benefits
we just talked about.
No greenhouse gases when it's running,
a smaller footprint than those traditional plants.
And maybe even the potential to help revitalize communities
near old power plants.

(02:55):
Oh, interesting.
Think about it.
Jobs, investment, new life for areas
that have been struggling.
Now that makes sense.
Yeah.
Those are some pretty compelling arguments.
But then there's the other side of things, right?
Of course.
And that's where you get into those concerns about safety.
The whole issue of long-term waste disposal,
and just how the public feels about nuclear power

(03:16):
in general.
Yeah, right.
There's like a perception issue.
Yeah.
And then we can't forget about possible cost overruns
and those complicated regulatory hurdles that
come with any new nuclear project.
Right.
It's never straightforward, is it?
And SMRs, they're still a pretty new technology.
So there's still a lot of unknowns.
It's like we're weighing this shiny new solution

(03:38):
with a lot of potential against all the baggage
that nuclear power carries with it.
It's a tough call for sure.
It is.
It really shows just how complicated all this is.
And it makes you wonder, are there
other ways to deal with this?
Things we could be trying that don't involve building even
more power plants.
Right, nuclear or otherwise.
Yeah, exactly.

(03:58):
It can't just be about generating
more and more electricity, right?
Right.
What about making AI itself more energy efficient?
Is that even a possibility?
Oh, absolutely.
There's a ton of research happening right now
on what they call green AI.
Green AI, OK.
Yeah, it's kind of like designing a car engine
to use less gas.
OK, makes sense.
Instead of just guzzling it, it's

(04:19):
about coming up with algorithms and hardware for AI that just
aren't as energy intensive.
So instead of focusing on how to make more energy
to feed these models, we're trying
to teach them to be less hungry.
Exactly.
OK.
If we can train AI to recognize patterns
and optimize complex systems, think like traffic flow

(04:40):
in a city or even global supply chains,
well, we should be able to use that same idea to make AI
itself more sustainable.
OK, I'm starting to see the bigger picture now.
Yeah.
It's not just about swapping out fossil fuels for nuclear.
It's about taking a step back and thinking
about how we're designing and even deploying
AI from the very beginning.
And that means some tough decisions.

(05:02):
Like what?
Well, we have to ask ourselves, are there some AI applications
that we just don't need?
Oh, interesting.
Do we really need AI to generate every single piece of content
we see, or can we be a little more choosy?
It's like that idea of digital sobriety
we talked about earlier, right?
It's like applying reduce, reuse, recycle
to the digital world.

(05:23):
Exactly.
Digital sobriety is becoming more and more important
as AI becomes more widespread.
And it's not just about what we do as individuals.
We need policymakers and regulators
to step in and help create a more sustainable AI ecosystem.
What would that look like?
Well, imagine regulations that encourage companies

(05:45):
to develop energy efficient AI, or even
energy consumption limits for certain types of AI
applications.
Those are some pretty bold ideas.
Feels like we're in uncharted territory.
How do you even regulate something as complicated
and fast changing as AI?
It's a challenge, for sure.
There are no easy answers.
But that's what makes this whole conversation so interesting.

(06:06):
We're dealing with really fundamental questions
about the role of technology in society,
the balance between innovation and sustainability,
and even the ethical side of these increasingly powerful AI
systems.
And these aren't just abstract questions, right?
They actually have real world effects.
No, absolutely.
The choices we make about AI and energy here,

(06:27):
in the developed world, are going
to have a ripple effect across the whole globe.
It's a big responsibility.
Think about developing countries where
having reliable energy is already a huge problem.
If AI becomes this massive energy hog,
could it make those inequalities even worse?
That's a good question.

(06:47):
And create a digital divide?
Could it actually hinder their progress instead of helping?
It's almost like we're standing at a crossroads.
We've got this incredible technology
that could solve so many problems.
But if we're not careful, it could just as easily
create a whole new set of challenges.

(07:07):
That's why these conversations are so important.
We need to bring together experts
from all different fields.
Like who?
Computer scientists, engineers, policymakers, ethicists,
and of course, the public, to figure out
how to move towards a more sustainable and fair AI future.
It sounds like a pretty daunting task.

(07:28):
Yeah.
But also like an incredibly exciting one.
Oh, absolutely.
We're talking about shaping the future of technology
and its impact on all of us.
And it all comes back to that original question.
The one about AI being a climate hero.
Yeah.
Can AI be a climate hero while also facing a potential energy
crisis?
Right.
The answer, like with most things in life, is complicated.

(07:50):
Yeah.
There are no easy answers.
But it's definitely clear that we
need to be asking these hard questions
and having these sometimes difficult conversations.
Yeah, we can't just assume technology
is going to fix everything without thinking
about the possible downsides.
And it's important to remember that technology
is ultimately just a tool.
It is.
Like any tool, it can be used for good or for bad.

(08:12):
It's up to us as a society to decide
how we want to use this incredible power of AI.
That's a great way to put it.
It's a reminder that we're not just
watching this technological revolution happen.
We have a responsibility to shape it.
Right.
To steer it in a direction that benefits everyone.
Yeah.
So where do we go from here?

(08:34):
What are some real steps we can take
to make AI more sustainable and make
sure it's being used for good?
I think one of the most important things
is pushing for more transparency from the companies that are
actually developing and deploying AI.
OK.
We need to know how much energy these systems are using,
where that energy is coming from,
and what the real impact is on the environment.

(08:56):
That makes sense.
We need to be able to hold these companies accountable
and see the whole picture.
Exactly.
And on a more individual level.
We as consumers, we can make smarter choices
about how we use AI.
OK.
We can ask ourselves, do I really
need to use this AI-powered tool?
Or is there a simpler, less energy-intensive way

(09:17):
to get the same result?
Yeah.
It's like asking, do I really need to drive,
or can I walk, or bike?
Not exactly.
It's about being more aware of the impact of our choices,
even online.
Yeah.
But it goes even deeper than that, doesn't it?
It does.
OK.
We can also be more thoughtful about the types
of AI applications that we're supporting.
So what do we want AI to be used for?
Yeah.

(09:38):
Do we want to invest in AI that's
being used to optimize energy grids
and come up with new renewable energy solutions?
Or do we want AI that's mainly used for things
like targeted advertising?
It's about matching our actions with our values
and supporting the kind of future we want to see.
Right.
But how do we actually get there?
How do we get everyone on board with this vision?

(09:59):
That's the million-dollar question.
Well, I think education and awareness are key.
We need to make sure that everyone understands
the potential benefits and the risks of AI
and the big choices we're facing.
It's about bringing AI out of the realm of sci-fi.
Yeah.
And into something that everyone understands.
Right.

(10:20):
And the education needs to start early.
Oh, interesting.
We should be teaching kids about AI in school,
not just the technical stuff, but the ethical and social side
of it, too.
It's about giving them the tools
to use this powerful technology responsibly.
And it's about encouraging critical thinking
and questioning.
So not just accepting things at face value.

(10:40):
We need to be asking those tough questions.
Like what?
About the real purpose of AI.
Who is actually benefiting from it?
And who might be hurt by it?
It's about having a deeper conversation.
Yeah.
About the future we're creating with AI.
A conversation that everyone's a part of.
This has been a really eye-opening deep dive.

(11:01):
We started talking about megawatts and data centers.
And we ended up talking about philosophy
and the future of humanity.
It's true.
AI really does make you think bigger.
It reminds us that technology is never
just about the technical stuff.
Right.
It's connected to our values, our social structures,
how we see ourselves in the world.
And this whole discussion about AI and nuclear power,

(11:23):
it highlights how complicated our relationship with nature
really is.
It does.
We're talking about using the power of the atom
to fuel our digital creations.
Yeah.
It makes you realize how much power we have.
And what could happen if we don't use it carefully?
So who is ultimately responsible for making sure
that AI is developed and used safely and ethically?

(11:46):
Is it governments, companies, researchers?
All of the above.
That's a tough question.
Yeah.
There's no easy answer.
But one thing's for sure.
We can't just leave it up to the market.
We need rules, ethical guidelines, some kind
of checks and balances.
To keep things in check.
Yeah, to make sure AI is a force for good.
It sounds like we need a new set of rules for the age of AI.

(12:09):
I think so.
Something that recognizes its potential, but also its risks.
And then those rules, they need to be global.
Yeah, because AI is a global thing.
Exactly.
It affects everyone.
We have to find ways to work together across borders
to make sure AI benefits all of us.
This has been a really thought-provoking conversation.
It has.

(12:29):
It's clear that this isn't just about technology.
It's about society.
It's about philosophy.
It's about us.
It's about what it means to be human in this new world.
Exactly, and that's what makes it so fascinating.
So before we wrap up, I want to leave our listeners with this.
Imagine a world where AI is being used

(12:49):
to solve the climate crisis, to create a more
fair and sustainable world, and to unlock our full potential
as humans.
I love that.
That's the future we can create if we make the right choices
today.
And those choices start with awareness, education,
and action.
Right, it starts with asking those tough questions,

(13:10):
challenging what we think we know,
and expecting more from our leaders.
And from ourselves.
It starts with understanding that we have the power
to shape the future of AI.
It's not predetermined.
It's up to us.
Absolutely.
Well, this has been an amazing conversation.
It has.
Thank you for joining us for this deep dive.
Thanks for having me.
And until next time, keep exploring,

(13:33):
keep asking those questions, and keep diving deep.